Dataset schema: content (string, 1 to 103k chars) | path (string, 8 to 216 chars) | filename (string, 2 to 179 chars) | language (string, 15 classes) | size_bytes (int64, 2 to 189k) | quality_score (float64, 0.5 to 0.95) | complexity (float64, 0 to 1) | documentation_ratio (float64, 0 to 1) | repository (string, 5 classes) | stars (int64, 0 to 1k) | created_date (date, 2023-07-10 19:21:08 to 2025-07-09 19:11:45) | license (string, 4 classes) | is_test (bool, 2 values) | file_hash (string, 32 chars)

| content | path | filename | language | size_bytes | quality_score | complexity | documentation_ratio | repository | stars | created_date | license | is_test | file_hash |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
# Generic component-level labels and high-level groupings.\n"domain: sources":\n - changed-files:\n - any-glob-to-any-file: ["src/sources/**/*"]\n\n"domain: sinks":\n - changed-files:\n - any-glob-to-any-file: ["src/sinks/**/*"]\n\n"domain: transforms":\n - changed-files:\n - any-glob-to-any-file: ["src/transforms/**/*"]\n\n"domain: topology":\n - changed-files:\n - any-glob-to-any-file: ["src/topology/**/*"]\n\n"domain: codecs":\n - changed-files:\n - any-glob-to-any-file: ["src/codecs/**/*"]\n\n"domain: core":\n - changed-files:\n - any-glob-to-any-file: ["lib/vector-core/**/*"]\n\n"domain: vrl":\n - changed-files:\n - any-glob-to-any-file: ["lib/vrl/**/*"]\n\n"domain: ci":\n - changed-files:\n - any-glob-to-any-file: ["scripts/**/*"]\n\n"domain: vdev":\n - changed-files:\n - any-glob-to-any-file: ["vdev/**/*"]\n\n"domain: releasing":\n - changed-files:\n - any-glob-to-any-file: ["distribution/**/*", "scripts/package-*", "scripts/release-*"]\n\n"domain: rfc":\n - changed-files:\n - any-glob-to-any-file: ["rfcs/**/*"]\n\n"domain: external docs":\n - changed-files:\n - any-glob-to-any-file: ["website/cue/**/*"]\n | dataset_sample\yaml\rust\labeler.yml | labeler.yml | YAML | 1,130 | 0.8 | 0 | 0.027027 | awesome-app | 584 | 2025-06-26T11:30:44.412843 | BSD-3-Clause | false | 560aa69fbba9478deca1e4cd2fcc1278 |
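Each `changed-files` entry in the labeler.yml row above pairs a label with one or more globs; a PR receives the label when any changed file matches any of that label's globs. A minimal Python sketch of the matching idea (not the actions/labeler implementation; `fnmatch`'s `*` crosses `/`, which only loosely approximates `**`):

```python
# Sketch of any-glob-to-any-file matching; label/glob pairs taken from
# the labeler.yml sample above (subset only).
from fnmatch import fnmatch

LABEL_GLOBS = {
    "domain: sources": ["src/sources/**/*"],
    "domain: sinks": ["src/sinks/**/*"],
    "domain: ci": ["scripts/**/*"],
}

def labels_for(changed_files):
    # A label applies if any changed file matches any of its globs.
    return sorted(
        label
        for label, globs in LABEL_GLOBS.items()
        if any(fnmatch(f, g) for f in changed_files for g in globs)
    )

print(labels_for(["src/sources/kafka/mod.rs", "scripts/ci/test.sh"]))
# → ['domain: ci', 'domain: sources']
```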
commit-msg:\n commands:\n lint-commit-msg:\n run: npx commitlint --edit\n\npre-commit:\n commands:\n js-linter:\n priority: 1\n glob: "**/*.{js,jsx,ts,tsx}"\n run: npm run lint\n rust-linter:\n priority: 2\n glob: "**/*.rs"\n run: cargo fmt -- --check\n ts-type-check:\n priority: 3\n glob: "**/*.{js,jsx,ts,tsx}"\n run: npm run type-check\n rust-code-check:\n priority: 4\n glob: "**/*.rs"\n run: cargo clippy -- -D warnings\n\npre-push:\n commands:\n js-test:\n priority: 1\n glob: "**/*.{js,jsx,ts,tsx}"\n run: npm run test\n rust-test:\n priority: 3\n glob: "**/*.rs"\n run: cargo test\n | dataset_sample\yaml\rust\lefthook.yml | lefthook.yml | YAML | 676 | 0.8 | 0 | 0 | python-kit | 495 | 2024-12-21T09:28:26.860431 | GPL-3.0 | false | 3365362156ab1b213a99cc58dcaa8349 |
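Lefthook's `priority` field orders commands within a hook stage; as I understand its semantics, lower numbers run first. A hypothetical sketch of that ordering using the pre-commit commands from the row above:

```python
# Illustrative only: reproduce lefthook's priority-based ordering
# (assumption: ascending priority) for the pre-commit commands above.
PRE_COMMIT = {
    "js-linter": {"priority": 1},
    "rust-linter": {"priority": 2},
    "ts-type-check": {"priority": 3},
    "rust-code-check": {"priority": 4},
}

run_order = sorted(PRE_COMMIT, key=lambda name: PRE_COMMIT[name]["priority"])
print(run_order)
# → ['js-linter', 'rust-linter', 'ts-type-check', 'rust-code-check']
```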
version: 1\ndisable_existing_loggers: False\nformatters:\n default:\n "()": uvicorn.logging.DefaultFormatter\n format: '%(levelprefix)s [%(asctime)s] %(message)s'\n use_colors: null\n datefmt: '%d-%m-%Y %H:%M:%S'\n access:\n "()": uvicorn.logging.AccessFormatter\n format: '%(levelprefix)s [%(asctime)s] %(client_addr)s - "%(request_line)s" %(status_code)s'\n datefmt: '%d-%m-%Y %H:%M:%S'\nhandlers:\n default:\n formatter: default\n class: logging.StreamHandler\n stream: ext://sys.stderr\n access:\n formatter: access\n class: logging.StreamHandler\n stream: ext://sys.stdout\n console:\n class: logging.StreamHandler\n stream: ext://sys.stdout\n formatter: default\n file:\n class : logging.handlers.RotatingFileHandler\n filename: chroma.log\n formatter: default\nloggers:\n root:\n level: WARN\n handlers: [console, file]\n chromadb:\n level: DEBUG\n uvicorn:\n level: INFO\n | dataset_sample\yaml\rust\log_config.yml | log_config.yml | YAML | 921 | 0.95 | 0.108108 | 0 | react-lib | 890 | 2024-06-29T05:25:02.471251 | Apache-2.0 | false | c818feebf73e80d28651c647729e8840 |
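The log_config.yml row above follows the standard `logging.config` dictionary schema, serialized as YAML. A minimal runnable sketch of applying such a config in Python (values simplified from the sample; in practice the YAML file would be parsed first, and the root logger is configured via the top-level `root` key):

```python
# Apply a dictConfig-style logging configuration, mirroring the
# log_config.yml sample above in simplified form.
import logging
import logging.config

config = {
    "version": 1,
    "disable_existing_loggers": False,
    "formatters": {
        "default": {
            "format": "%(levelname)s [%(asctime)s] %(message)s",
            "datefmt": "%d-%m-%Y %H:%M:%S",
        },
    },
    "handlers": {
        "console": {
            "class": "logging.StreamHandler",
            "formatter": "default",
            "stream": "ext://sys.stdout",
        },
    },
    # Top-level "root" is the documented way to configure the root logger.
    "root": {"level": "WARNING", "handlers": ["console"]},
    "loggers": {
        "chromadb": {"level": "DEBUG"},
    },
}

logging.config.dictConfig(config)
# chromadb messages at DEBUG propagate to the root handler and print;
# everything else is filtered below WARNING.
logging.getLogger("chromadb").debug("verbose chromadb message")
```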
copyright: Copyright © 2018-Present Wez Furlong\nuse_directory_urls: false\nsite_name: Wez's Terminal Emulator\nsite_url: https://wezterm.org/\nsite_description: Wez's Terminal Emulator\nrepo_url: https://github.com/wezterm/wezterm\nrepo_name: wezterm/wezterm\nedit_uri: edit/main/docs/\ndocs_dir: docs\nsite_dir: gh_pages\nstrict: true\ntheme:\n name: material\n logo: favicon.svg\n favicon: favicon.png\n custom_dir: docs/overrides\n palette:\n - media: "(prefers-color-scheme: light)"\n scheme: default\n primary: deep purple\n accent: deep purple\n toggle:\n icon: material/weather-sunny\n name: Switch to dark mode\n\n # Palette toggle for dark mode\n - media: "(prefers-color-scheme: dark)"\n scheme: slate\n primary: deep purple\n accent: purple\n toggle:\n icon: material/weather-night\n name: Switch to light mode\n features:\n - content.action.edit\n - content.action.view\n - content.code.copy\n - content.tabs.link\n - navigation.footer\n - navigation.indexes\n #- navigation.instant - breaks colorscheme preview\n #- navigation.expand\n - navigation.tabs\n - navigation.tabs.sticky\n - navigation.tracking\n - navigation.top\n - search.highlight\n - search.share\n - search.suggest\n - toc.follow\nplugins:\n - include-markdown\n - macros:\n module_name: mkdocs_macros\n - search:\n separator: '[\s\-,:!=\[\]()"/]+|(?!\b)(?=[A-Z][a-z])'\n - social:\n cards: !ENV [CARDS, false]\n #- git-revision-date-localized:\n # enable_creation_date: true\n # type: timeago\n - exclude:\n glob:\n - "**/_index.md"\n - "**/*.markdown"\n - "generate_toc.py"\n - "README.markdown"\n - "build.sh"\n - "SUMMARY.md"\n - "book.toml"\n - "overrides/**"\n\nexclude_docs: |\n **/_index.md\n **/*.markdown\n generate_toc.py\n README.markdown\n build.sh\n SUMMARY.md\n book.toml\n overrides/**\n\nnot_in_nav: |\n tags.md\n\nextra_css:\n - style.css\n - colorschemes/scheme.css\n - asciinema-player.css\nextra_javascript:\n - asciinema-player.min.js\n - javascript/fix-codeblock-console-copy-button.js\n - 
colorschemes/scheme.js\n\nmarkdown_extensions:\n - admonition\n - pymdownx.saneheaders\n - pymdownx.magiclink:\n repo_url_shorthand: true\n user: wez\n repo: wezterm\n - pymdownx.details\n - attr_list\n - md_in_html\n - def_list\n - toc:\n permalink: true\n - pymdownx.highlight:\n anchor_linenums: true\n line_spans: __span\n pygments_lang_class: true\n - pymdownx.pathconverter:\n base_path: "./docs"\n - pymdownx.inlinehilite\n - pymdownx.emoji:\n emoji_index: !!python/name:material.extensions.emoji.twemoji\n emoji_generator: !!python/name:material.extensions.emoji.to_svg\n - pymdownx.tasklist:\n custom_checkbox: true\n - pymdownx.tabbed:\n alternate_style: true\n - pymdownx.superfences:\n custom_fences:\n - name: mermaid\n class: mermaid\n format: !!python/name:pymdownx.superfences.fence_code_format\n\nextra:\n social:\n - icon: fontawesome/brands/github\n link: https://github.com/wezterm/wezterm\n - icon: fontawesome/brands/mastodon\n link: https://fosstodon.org/@wez\n - icon: fontawesome/brands/twitter\n link: https://twitter.com/wezfurlong\n | dataset_sample\yaml\rust\mkdocs-base.yml | mkdocs-base.yml | YAML | 3,289 | 0.8 | 0.014815 | 0.062016 | awesome-app | 664 | 2024-08-03T17:52:43.848289 | MIT | false | 975a3ca1fc3ff07b8a9095649d3b7338 |
# NOTE: Usually, you should edit the template instead.\n# This file is used for forks and contributors; production uses `mkdocs.insiders.yml`.\nINHERIT: mkdocs.template.yml\n\nwatch:\n - mkdocs.template.yml\n | dataset_sample\yaml\rust\mkdocs.public.yml | mkdocs.public.yml | YAML | 203 | 0.8 | 0.166667 | 0.4 | awesome-app | 726 | 2025-01-15T09:35:31.185617 | GPL-3.0 | false | d850c7e5fa2453ce165c81a8efc8d338 |
# mkdocs.yml\n# Top-level config for mkdocs\n# See: https://www.mkdocs.org/user-guide/configuration/\nsite_name: Rerun Python APIs\nsite_url: https://ref.rerun.io/docs/python/\nrepo_url: https://github.com/rerun-io/rerun/\n\n# Use the material theme\n# Override some options for nav: https://squidfunk.github.io/mkdocs-material/setup/setting-up-navigation/\ntheme:\n name: "material"\n features:\n - navigation.indexes\n - navigation.instant\n - navigation.tabs\n - navigation.tabs.sticky\n - navigation.tracking\n\nplugins:\n - search # https://squidfunk.github.io/mkdocs-material/setup/setting-up-site-search/\n - mkdocstrings: # https://mkdocstrings.github.io/usage/#global-options\n custom_templates: docs/templates # Override the function template.\n handlers:\n python:\n paths: ["rerun_sdk", "."] # Lookup python modules relative to this path\n import: # Cross-references for python and numpy\n - https://arrow.apache.org/docs/objects.inv\n - https://docs.python.org/3/objects.inv\n - https://numpy.org/doc/stable/objects.inv\n - https://ipython.readthedocs.io/en/stable/objects.inv\n options: # https://mkdocstrings.github.io/python/usage/#globallocal-options\n docstring_section_style: spacy # list spacy table\n docstring_style: numpy\n heading_level: 3\n filters: [\n "!__attrs_clear__", # For internal use\n "!^_[^_]", # Hide things starting with a single underscore\n "!as_component_batches", # Inherited from AsComponents\n "!indicator", # Inherited from Archetype\n "!num_instances", # Inherited from AsComponents\n ]\n inherited_members: true\n members_order: source # The order of class members\n merge_init_into_class: false # Not compatible with `inherited_members`\n show_if_no_docstring: false # We intentionally hide archetype fields\n show_source: no\n load_external_modules: true\n preload_modules:\n - rerun_bindings\n annotations_path: brief\n signature_crossrefs: true\n\n - gen-files: # https://oprypin.github.io/mkdocs-gen-files\n scripts:\n - docs/gen_common_index.py\n 
- literate-nav: # https://oprypin.github.io/mkdocs-literate-nav\n nav_file: SUMMARY.txt\n - redirects: # https://github.com/mkdocs/mkdocs-redirects\n redirect_maps:\n "index.md": "common/index.md"\n\n# https://www.mkdocs.org/user-guide/configuration/#markdown_extensions\n# https://squidfunk.github.io/mkdocs-material/setup/extensions/python-markdown-extensions/\nmarkdown_extensions:\n - admonition # https://squidfunk.github.io/mkdocs-material/reference/admonitions/\n - pymdownx.highlight # https://mkdocstrings.github.io/theming/#syntax-highlighting\n - pymdownx.superfences\n - toc:\n toc_depth: 4\n\n# Some extra styling\nextra_css:\n - css/mkdocstrings.css\n\n# https://squidfunk.github.io/mkdocs-material/setup/setting-up-versioning/\nextra:\n version:\n provider: mike\n default: latest\n | dataset_sample\yaml\rust\mkdocs.yml | mkdocs.yml | YAML | 3,118 | 0.95 | 0.063291 | 0.123288 | python-kit | 371 | 2023-11-30T17:19:02.042512 | BSD-3-Clause | false | b2ff9650c491c2c9529a1992c6e7bf0c |
id: only-dao-can-depend-tabby-db\nmessage: Only dao can depend on tabby-db\nseverity: error\nlanguage: rust\nfiles:\n- ./ee/tabby-schema/src/**\nignores:\n- ./ee/tabby-schema/src/dao.rs\n- ./ee/tabby-schema/src/policy.rs\nrule:\n pattern: tabby_db\n | dataset_sample\yaml\rust\only-dao-and-policy-can-depend-tabby-db.yml | only-dao-and-policy-can-depend-tabby-db.yml | YAML | 239 | 0.8 | 0 | 0 | vue-tools | 321 | 2024-03-05T08:22:58.823364 | BSD-3-Clause | false | 4092a2a98a7218e64a7d94db545b3c32 |
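The ast-grep rule above flags `tabby_db` anywhere under `ee/tabby-schema/src` except `dao.rs` and `policy.rs`. A rough path-level sketch of the same policy (the real tool matches AST nodes via its `pattern`, not raw substrings):

```python
# Approximate the "only dao/policy may depend on tabby_db" rule at the
# file-path level; substring matching stands in for ast-grep's pattern.
ALLOWED = {
    "ee/tabby-schema/src/dao.rs",
    "ee/tabby-schema/src/policy.rs",
}

def violates(path, source):
    # A file violates the rule if it mentions tabby_db and is not exempt.
    return "tabby_db" in source and path not in ALLOWED

print(violates("ee/tabby-schema/src/query.rs", "use tabby_db::DbConn;"))  # → True
print(violates("ee/tabby-schema/src/dao.rs", "use tabby_db::DbConn;"))    # → False
```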
# 🔒 PROTECTED: Changes to locks-review-team should be approved by the current locks-review-team\nlocks-review-team: locks-review\nteam-leads-team: polkadot-review\naction-review-team: ci\n\nrules:\n - name: Core developers\n check_type: changed_files\n condition:\n include: .*\n # excluding files from 'CI team' and 'FRAME coders' rules\n exclude: ^\.gitlab-ci\.yml|^scripts/ci/.*|^\.github/.*|^\.config/nextest.toml|^frame/(?!.*(nfts/.*|uniques/.*|babe/.*|grandpa/.*|beefy|merkle-mountain-range/.*|contracts/.*|election|nomination-pools/.*|staking/.*|aura/.*))\n min_approvals: 2\n teams:\n - core-devs\n\n - name: FRAME coders\n check_type: changed_files\n condition:\n include: ^frame/(?!.*(nfts/.*|uniques/.*|babe/.*|grandpa/.*|beefy|merkle-mountain-range/.*|contracts/.*|election|nomination-pools/.*|staking/.*|aura/.*))\n all:\n - min_approvals: 2\n teams:\n - core-devs\n - min_approvals: 1\n teams:\n - frame-coders\n\n - name: CI team\n check_type: changed_files\n condition:\n include: ^\.gitlab-ci\.yml|^scripts/ci/.*|^\.github/.*|^\.config/nextest.toml\n min_approvals: 2\n teams:\n - ci\n\nprevent-review-request:\n teams:\n - core-devs\n | dataset_sample\yaml\rust\pr-custom-review.yml | pr-custom-review.yml | YAML | 1,233 | 0.8 | 0 | 0.057143 | react-lib | 266 | 2024-12-04T09:03:13.085303 | GPL-3.0 | false | 805a101b7f6a65b1396b7bfd34b7dc5c |
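Each rule in the pr-custom-review config above applies when some changed file matches the `include` regex and is not filtered by `exclude`. A hedged sketch of that check (regexes shortened from the sample; not the pr-custom-review implementation):

```python
# Decide whether a changed-files rule applies: at least one changed file
# must hit `include` and survive `exclude`. Patterns abbreviated from
# the "Core developers" rule above.
import re

rule = {
    "include": r".*",
    "exclude": r"^\.gitlab-ci\.yml|^scripts/ci/.*|^\.github/.*",
}

def rule_applies(rule, changed_files):
    inc = re.compile(rule["include"])
    exc = re.compile(rule["exclude"]) if rule.get("exclude") else None
    return any(
        inc.search(f) and not (exc and exc.search(f))
        for f in changed_files
    )

print(rule_applies(rule, [".github/workflows/ci.yml"]))  # → False (all excluded)
print(rule_applies(rule, ["frame/system/src/lib.rs"]))   # → True
```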
# See context at https://github.com/web-infra-dev/rspack/discussions/2760\n\n"release: feature":\n - "/^feat/"\n"release: performance":\n - "/^perf/"\n"release: bug fix":\n - "/^fix/"\n"release: document":\n - "/^docs/"\n | dataset_sample\yaml\rust\pr-labeler.yml | pr-labeler.yml | YAML | 215 | 0.8 | 0 | 0.111111 | react-lib | 444 | 2024-10-24T12:33:43.246751 | BSD-3-Clause | false | 5c493204ecd2c696b92baa6503dd7b17 |
global:\n scrape_interval: 15s # By default, scrape targets every 15 seconds.\n\n # Attach these labels to any time series or alerts when communicating with\n # external systems (federation, remote storage, Alertmanager).\n external_labels:\n monitor: 'codelab-monitor'\n\n# A scrape configuration containing exactly one endpoint to scrape:\n# Here it's Prometheus itself.\nscrape_configs:\n # The job name is added as a label `job=<job_name>` to any timeseries scraped from this config.\n - job_name: 'meilisearch'\n\n # Override the global default and scrape targets from this job every 5 seconds.\n scrape_interval: 5s\n\n static_configs:\n - targets: ['localhost:7700'] | dataset_sample\yaml\rust\prometheus-basic-scraper.yml | prometheus-basic-scraper.yml | YAML | 682 | 0.8 | 0 | 0.4 | node-utils | 427 | 2023-07-13T08:19:59.375880 | BSD-3-Clause | false | 1df5c02f3c5d8dd8801ab1edccb0100d |
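In Prometheus, a job-level `scrape_interval` overrides the global default, so `meilisearch` above is scraped every 5 seconds while jobs without an override fall back to 15 seconds. That precedence can be sketched as follows (the second job is hypothetical, added only to show the fallback):

```python
# Job-level scrape_interval takes precedence over the global default.
GLOBAL = {"scrape_interval": "15s"}
JOBS = [
    {"job_name": "meilisearch", "scrape_interval": "5s"},
    {"job_name": "node"},  # hypothetical job with no override
]

def effective_interval(job):
    return job.get("scrape_interval", GLOBAL["scrape_interval"])

print([effective_interval(j) for j in JOBS])  # → ['5s', '15s']
```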
global:\n scrape_interval: 15s\n evaluation_interval: 60s\n external_labels:\n rw_cluster: 20240506-185437\n\n\nscrape_configs:\n - job_name: prometheus\n static_configs:\n - targets: ["127.0.0.1:9500"]\n\n - job_name: compute\n static_configs:\n - targets: ["127.0.0.1:1222"]\n\n - job_name: meta\n static_configs:\n - targets: ["127.0.0.1:1250"]\n\n - job_name: minio\n metrics_path: /minio/v2/metrics/cluster\n static_configs:\n - targets: ["127.0.0.1:9301"]\n\n - job_name: compactor\n static_configs:\n - targets: ["127.0.0.1:1260"]\n\n - job_name: etcd\n static_configs:\n - targets: ["127.0.0.1:2379"]\n\n - job_name: frontend\n static_configs:\n - targets: ["127.0.0.1:2222"]\n\n - job_name: redpanda\n static_configs:\n - targets: []\n | dataset_sample\yaml\rust\prometheus.yml | prometheus.yml | YAML | 783 | 0.7 | 0 | 0 | python-kit | 590 | 2024-01-17T13:23:42.394330 | Apache-2.0 | false | d3b94dcdf92f86b4fe66d49a61b99bdb |
# .github/release.yml\n\nchangelog:\n exclude:\n labels:\n - "release: ignore"\n authors:\n # Ignore the release PR created by github-actions\n - github-actions\n categories:\n - title: Breaking Changes 🛠\n labels:\n - "release: breaking change"\n - title: Performance Improvements ⚡\n labels:\n - "release: performance"\n - title: Exciting New Features 🎉\n labels:\n - "release: feature"\n - title: Bug Fixes 🐞\n labels:\n - "release: bug fix"\n - title: Document Updates 📖\n labels:\n - "release: document"\n - title: Other Changes\n labels:\n - "*"\n | dataset_sample\yaml\rust\release.yml | release.yml | YAML | 652 | 0.8 | 0 | 0.074074 | vue-tools | 841 | 2023-12-07T08:15:18.616633 | BSD-3-Clause | false | 06da0d234b45f82cacbd5bc0017c80aa |
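GitHub's release-notes generator assigns each PR to the first category whose label set intersects the PR's labels, with `*` acting as a catch-all. A simplified sketch of that categorisation using labels from the release.yml row above:

```python
# First-match categorisation over an ordered category list, as in
# GitHub's generated release notes; subset of the config above.
CATEGORIES = [
    ("Breaking Changes", {"release: breaking change"}),
    ("Bug Fixes", {"release: bug fix"}),
    ("Other Changes", {"*"}),
]

def categorize(pr_labels):
    for title, labels in CATEGORIES:
        if "*" in labels or labels & set(pr_labels):
            return title
    return None

print(categorize(["release: bug fix"]))  # → Bug Fixes
print(categorize(["docs"]))              # → Other Changes
```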
reviewers:\n - sunshowers\n - vgao1996\n - Gauri-Agarwal\n - bothra90\n - dmitri-perelman\n - huitseeker\n - bmwill\n - wqfish\n - bschwab\n - 0-o-0\n - phlip9\n - gdanezis\n - fbandersnatch\n - phoenix-antigravity\n - andll\n - weilei0107\n - zekun000\n - archjayas\n - darioncassel\n - zrt95\n - tzakian\n - msmouse\n - ankushdas\n - t-rasmud\n - kchalkias\n - opsguy\n - kevinlewi\n - pmohassel\n - dahliamalkhi\n - kphfb\n - mimoo\n - Andyyhope\n - sheharbano\n - asonnino\n - mgorven\n - shazqadeer\n - davidldill\n - valerini\n - emmazzz\n - zamsden\n - sunmilee\n - davidiw\n - matbd\n - metajack\n - hermanventer\n - dpim\n - zoep\n - kkmc\n - huyle46\n - dariorussi\n - bob-wilson\n - sausagee\n | dataset_sample\yaml\rust\review-assign-bot.yml | review-assign-bot.yml | YAML | 703 | 0.7 | 0 | 0 | python-kit | 446 | 2024-03-06T14:00:43.282079 | BSD-3-Clause | false | 303f80a0e5667639bae50e9399b19555 |
# The schema for RiseDev configuration files is defined under `src/risedevtool/schemas`.\n#\n# You can add the following section to `.vscode/settings.json` to get hover support in VS Code:\n#\n# ```\n# "yaml.schemas": {\n# "src/risedevtool/schemas/risedev.json": "risedev.yml",\n# "src/risedevtool/schemas/risedev-profiles.user.json": "risedev-profiles.user.yml"\n# }\n# ```\n\nprofile:\n #################################################\n ### Configuration profiles used by developers ###\n #################################################\n\n # The default configuration will start 1 compute node, 1 meta node and 1 frontend.\n default:\n # # Specify a configuration file to override the default settings\n # config-path: src/config/example.toml\n # # Specify custom environment variables\n # env:\n # RUST_LOG: "info,risingwave_storage::hummock=off"\n # ENABLE_PRETTY_LOG: "true"\n steps:\n # If you want to use the local s3 storage, enable the following line\n # - use: minio\n\n # If you want to use aws-s3, configure AK and SK in env var and enable the following lines:\n # - use: aws-s3\n # bucket: test-bucket\n\n # By default, the meta-backend is sqlite.\n # To enable postgres backend, uncomment the following lines and set the meta-backend to postgres in 'meta-node'\n # - use: postgres\n # port: 8432\n # user: postgres\n # database: metadata\n\n # If you want to enable metrics or tracing, uncomment the following lines.\n # - use: prometheus # metrics\n # - use: tempo # tracing\n # - use: grafana # visualization\n\n - use: meta-node\n # meta-backend: postgres\n - use: compute-node\n - use: frontend\n\n # If you want to enable compactor, uncomment the following line, and enable either minio or aws-s3 as well.\n # - use: compactor\n\n # If you want to create source from Kafka, uncomment the following lines\n # - use: kafka\n # persist-data: true\n\n # To enable Confluent schema registry, uncomment the following line\n # - use: schema-registry\n\n default-v6:\n steps:\n - 
use: meta-node\n address: "[::1]"\n listen-address: "[::]"\n - use: compute-node\n address: "[::1]"\n listen-address: "[::]"\n - use: frontend\n address: "[::1]"\n listen-address: "[::]"\n\n # The minimum config to use with risectl.\n for-ctl:\n steps:\n - use: minio\n - use: meta-node\n - use: compute-node\n - use: frontend\n - use: compactor\n\n # `dev-compute-node` has the same settings as default, except the compute node will be started by the user.\n dev-compute-node:\n steps:\n - use: meta-node\n - use: compute-node\n user-managed: true\n - use: frontend\n\n dev-frontend:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n user-managed: true\n\n dev-meta:\n steps:\n - use: meta-node\n user-managed: true\n - use: compute-node\n - use: frontend\n\n # You can use this in combination with the virtual compactor\n # provided in https://github.com/risingwavelabs/risingwave-extensions\n dev-compactor:\n steps:\n - use: minio\n - use: meta-node\n - use: compute-node\n - use: frontend\n - use: compactor\n user-managed: true\n\n full:\n steps:\n - use: minio\n - use: postgres\n port: 8432\n user: postgres\n password: postgres\n database: metadata\n - use: meta-node\n meta-backend: postgres\n - use: compute-node\n - use: frontend\n - use: compactor\n - use: prometheus\n - use: grafana\n - use: kafka\n persist-data: true\n\n standalone-full-peripherals:\n steps:\n - use: minio\n - use: postgres\n port: 8432\n user: postgres\n database: metadata\n - use: meta-node\n user-managed: true\n meta-backend: postgres\n - use: compute-node\n user-managed: true\n - use: frontend\n user-managed: true\n - use: compactor\n user-managed: true\n - use: prometheus\n - use: grafana\n - use: kafka\n persist-data: true\n\n standalone-minio-sqlite:\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n user-managed: true\n meta-backend: sqlite\n - use: compute-node\n user-managed: true\n - use: frontend\n user-managed: true\n - use: compactor\n user-managed: true\n\n 
standalone-minio-sqlite-compactor:\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n user-managed: true\n meta-backend: sqlite\n - use: compute-node\n user-managed: true\n - use: frontend\n user-managed: true\n - use: compactor\n\n hdfs:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n # If you want to use hdfs as storage backend, configure hdfs namenode:\n - use: opendal\n engine: hdfs\n namenode: "127.0.0.1:9000"\n - use: compactor\n # - use: prometheus\n # - use: grafana\n fs:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n - use: opendal\n engine: fs\n - use: compactor\n # - use: prometheus\n # - use: grafana\n webhdfs:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n # If you want to use webhdfs as storage backend, configure hdfs namenode:\n - use: opendal\n engine: webhdfs\n namenode: "127.0.0.1:9870"\n - use: compactor\n # - use: prometheus\n # - use: grafana\n\n gcs:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n # If you want to use google cloud storage as storage backend, configure bucket name:\n - use: opendal\n engine: gcs\n bucket: bucket-name\n - use: compactor\n # - use: prometheus\n # - use: grafana\n obs:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n # If you want to use obs as storage backend, configure bucket name:\n - use: opendal\n engine: obs\n bucket: bucket-name\n - use: compactor\n # - use: prometheus\n # - use: grafana\n\n oss:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n # If you want to use oss as storage backend, configure bucket name:\n - use: opendal\n engine: oss\n bucket: bucket-name\n - use: compactor\n # - use: prometheus\n # - use: grafana\n\n azblob:\n steps:\n - use: meta-node\n - use: compute-node\n - use: frontend\n # If you want to use azblob as storage backend, configure bucket(container) name:\n - use: opendal\n engine: azblob\n bucket: test-bucket\n - use: compactor\n # - use: 
prometheus\n # - use: grafana\n\n full-benchmark:\n steps:\n - use: minio\n - use: postgres\n - use: meta-node\n meta-backend: postgres\n - use: compute-node\n - use: frontend\n - use: compactor\n - use: prometheus\n remote-write: true\n remote-write-region: "ap-southeast-1"\n remote-write-url: "https://aps-workspaces.ap-southeast-1.amazonaws.com/workspaces/ws-f3841dad-6a5c-420f-8f62-8f66487f512a/api/v1/remote_write"\n - use: grafana\n - use: kafka\n persist-data: true\n\n kafka:\n steps:\n - use: kafka\n\n meta-1cn-1fe-sqlite:\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: sqlite\n - use: compactor\n - use: compute-node\n - use: frontend\n\n # Start 4 CNs with resource groups rg1, rg2, and default\n multiple-resource-groups:\n steps:\n - use: minio\n - use: meta-node\n - use: compactor\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n resource-group: "rg1"\n - use: compute-node\n port: 5688\n exporter-port: 1223\n resource-group: "rg2"\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n\n ci-time-travel:\n config-path: src/config/ci-time-travel.toml\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: sqlite\n - use: compactor\n - use: compute-node\n - use: frontend\n\n ci-iceberg-test:\n steps:\n - use: minio\n - use: mysql\n port: 3306\n address: mysql\n user: root\n password: 123456\n user-managed: true\n - use: postgres\n port: 5432\n address: db\n database: metadata\n user: postgres\n password: postgres\n user-managed: true\n application: metastore\n - use: meta-node\n meta-backend: postgres\n - use: compute-node\n - use: frontend\n - use: compactor\n\n meta-1cn-1fe-sqlite-with-recovery:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: sqlite\n - use: 
meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: sqlite\n - use: compactor\n - use: compute-node\n - use: frontend\n\n meta-1cn-1fe-pg-backend:\n steps:\n - use: minio\n - use: postgres\n port: 8432\n user: postgres\n database: metadata\n - use: meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: postgres\n - use: compactor\n - use: compute-node\n - use: frontend\n\n meta-1cn-1fe-pg-backend-with-recovery:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: postgres\n port: 8432\n user: postgres\n database: metadata\n - use: meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: postgres\n - use: compactor\n - use: compute-node\n - use: frontend\n\n meta-1cn-1fe-mysql-backend:\n steps:\n - use: minio\n - use: mysql\n port: 4306\n user: root\n database: metadata\n application: metastore\n - use: meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: mysql\n - use: compactor\n - use: compute-node\n - use: frontend\n\n meta-1cn-1fe-mysql-backend-with-recovery:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: mysql\n port: 4306\n user: root\n database: metadata\n application: metastore\n - use: meta-node\n port: 5690\n dashboard-port: 5691\n exporter-port: 1250\n meta-backend: mysql\n - use: compactor\n - use: compute-node\n - use: frontend\n\n java-binding-demo:\n steps:\n - use: minio\n address: "127.0.0.1"\n port: 9301\n root-user: hummockadmin\n root-password: hummockadmin\n hummock-bucket: hummock001\n - use: meta-node\n address: "127.0.0.1"\n port: 5690\n - use: compute-node\n - use: frontend\n - use: compactor\n\n ci-gen-cpu-flamegraph:\n steps:\n # NOTE(kwannoel): We do not use aws-s3 here, to avoid\n # contention over the s3 bucket when multiple benchmarks run at once.\n - use: minio\n - use: sqlite\n - use: meta-node\n meta-backend: sqlite\n - use: compute-node\n parallelism: 8\n - use: 
frontend\n - use: compactor\n # - use: prometheus\n # - use: grafana\n # Do not use kafka here, we will spawn it separately,\n # so we don't have to re-generate data each time.\n # RW will still be able to talk to it.\n # - use: kafka\n # port: 9092\n # persist-data: true\n\n ######################################\n ### Configurations used in Compose ###\n ######################################\n\n compose:\n steps:\n - use: minio\n id: minio-0\n address: ${id}\n listen-address: "0.0.0.0"\n console-address: "0.0.0.0"\n\n - use: meta-node\n # Id must start with `meta-node` in order to be picked up by other\n # components.\n id: meta-node-0\n\n # Advertise address can be `id`, so as to use docker's DNS. If running\n # in host network mode, we should use IP directly in this field.\n address: ${id}\n\n listen-address: "0.0.0.0"\n\n - use: compute-node\n id: compute-node-0\n listen-address: "0.0.0.0"\n address: ${id}\n\n - use: frontend\n id: frontend-node-0\n listen-address: "0.0.0.0"\n address: ${id}\n\n - use: compactor\n id: compactor-0\n listen-address: "0.0.0.0"\n address: ${id}\n\n - use: redpanda\n\n - use: prometheus\n id: prometheus-0\n listen-address: "0.0.0.0"\n address: ${id}\n\n - use: grafana\n listen-address: "0.0.0.0"\n address: ${id}\n id: grafana-0\n\n - use: tempo\n listen-address: "0.0.0.0"\n address: ${id}\n id: tempo-0\n\n # special config for deployment, see related PR for more information\n compose-3node-deploy:\n steps:\n # - use: minio\n # id: minio-0\n # address: ${dns-host:rw-source-0}\n # listen-address: "0.0.0.0"\n # console-address: "0.0.0.0"\n\n - use: aws-s3\n bucket: ${terraform:s3-bucket}\n\n - use: meta-node\n # Id must start with `meta-node` in order to be picked up by other\n # components.\n id: meta-node-0\n\n # Advertise address can be `id`, so as to use docker's DNS. 
If running\n # in host network mode, we should use IP directly in this field.\n address: ${dns-host:rw-meta-0}\n listen-address: "0.0.0.0"\n\n - use: compute-node\n id: compute-node-0\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-compute-0}\n async-stack-trace: verbose\n enable-tiered-cache: true\n\n - use: compute-node\n id: compute-node-1\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-compute-1}\n async-stack-trace: verbose\n enable-tiered-cache: true\n\n - use: compute-node\n id: compute-node-2\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-compute-2}\n async-stack-trace: verbose\n enable-tiered-cache: true\n\n - use: frontend\n id: frontend-node-0\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-meta-0}\n\n - use: compactor\n id: compactor-0\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-source-0}\n compaction-worker-threads-number: 15\n\n - use: redpanda\n address: ${dns-host:rw-source-0}\n\n - use: prometheus\n id: prometheus-0\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-meta-0}\n\n - use: grafana\n listen-address: "0.0.0.0"\n address: ${dns-host:rw-meta-0}\n id: grafana-0\n\n #################################\n ### Configurations used on CI ###\n #################################\n\n ci-1cn-1fe:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-1cn-1fe-jdbc-to-native:\n config-path: src/config/ci-jdbc-to-native.toml\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n meta-backend: sqlite\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-3cn-1fe:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n 
port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-backfill-3cn-1fe:\n config-path: src/config/ci-longer-streaming-upload-timeout.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-backfill-3cn-1fe-with-monitoring:\n config-path: src/config/ci-longer-streaming-upload-timeout.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n - use: prometheus\n - use: grafana\n\n ci-backfill-3cn-1fe-with-minio-rate-limit:\n config-path: src/config/ci-longer-streaming-upload-timeout.toml\n steps:\n - use: minio\n # Set the rate limit for MinIO to N requests per second\n api-requests-max: 1000\n # Set the deadline for API requests to N seconds\n api-requests-deadline: 20s\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-backfill-3cn-1fe-with-monitoring-and-minio-rate-limit:\n config-path: src/config/ci-longer-streaming-upload-timeout.toml\n steps:\n - use: minio\n api-requests-max: 30\n api-requests-deadline: 2s\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: 
true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n - use: prometheus\n - use: grafana\n\n ci-3cn-3fe:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n port: 4565\n exporter-port: 2222\n health-check-port: 6786\n - use: frontend\n port: 4566\n exporter-port: 2223\n health-check-port: 6787\n - use: frontend\n port: 4567\n exporter-port: 2224\n health-check-port: 6788\n - use: compactor\n\n ci-3cn-3fe-opendal-fs-backend:\n config-path: src/config/ci.toml\n steps:\n - use: meta-node\n meta-backend: env\n - use: opendal\n engine: fs\n bucket: "/tmp/rw_ci"\n - use: compute-node\n port: 5687\n exporter-port: 1222\n - use: compute-node\n port: 5688\n exporter-port: 1223\n - use: compute-node\n port: 5689\n exporter-port: 1224\n - use: frontend\n port: 4565\n exporter-port: 2222\n health-check-port: 6786\n - use: frontend\n port: 4566\n exporter-port: 2223\n health-check-port: 6787\n - use: frontend\n port: 4567\n exporter-port: 2224\n health-check-port: 6788\n - use: compactor\n\n ci-3streaming-2serving-3fe:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n role: streaming\n parallelism: 4\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n role: streaming\n parallelism: 4\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n role: streaming\n parallelism: 4\n - use: compute-node\n port: 5685\n 
exporter-port: 1225\n enable-tiered-cache: true\n role: serving\n parallelism: 4\n - use: compute-node\n port: 5686\n exporter-port: 1226\n enable-tiered-cache: true\n role: serving\n parallelism: 8\n - use: frontend\n port: 4565\n exporter-port: 2222\n health-check-port: 6786\n - use: frontend\n port: 4566\n exporter-port: 2223\n health-check-port: 6787\n - use: frontend\n port: 4567\n exporter-port: 2224\n health-check-port: 6788\n - use: compactor\n\n ci-kafka:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n - use: kafka\n user-managed: true\n address: message_queue\n port: 29092\n - use: schema-registry\n user-managed: true\n address: schemaregistry\n port: 8082\n\n local-inline-source-test:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n meta-backend: sqlite\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n - use: pubsub\n persist-data: true\n - use: kafka\n persist-data: true\n - use: schema-registry\n - use: mysql\n - use: postgres\n\n ci-inline-source-test:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n - use: pubsub\n persist-data: true\n - use: kafka\n user-managed: true\n address: message_queue\n port: 29092\n - use: schema-registry\n user-managed: true\n address: schemaregistry\n port: 8082\n - use: mysql\n port: 3306\n address: mysql\n user: root\n password: 123456\n user-managed: true\n - use: postgres\n port: 5432\n address: db\n user: postgres\n password: postgres\n user-managed: true\n\n ci-redis:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: 
compactor\n - use: redis\n\n ci-compaction-test:\n config-path: src/config/ci-compaction-test.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n total-memory-bytes: 17179869184\n - use: frontend\n - use: compactor\n\n ci-1cn-1fe-with-recovery:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-3cn-1fe-with-recovery:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n port: 5687\n exporter-port: 1222\n enable-tiered-cache: true\n - use: compute-node\n port: 5688\n exporter-port: 1223\n enable-tiered-cache: true\n - use: compute-node\n port: 5689\n exporter-port: 1224\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n ci-1cn-1fe-user-kafka-with-recovery:\n config-path: src/config/ci-recovery.toml\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n - use: kafka\n user-managed: true\n address: message_queue\n port: 29092\n\n ci-meta-backup-test-sql:\n config-path: src/config/ci-meta-backup-test.toml\n steps:\n - use: sqlite\n - use: minio\n - use: meta-node\n meta-backend: sqlite\n - use: compute-node\n - use: frontend\n - use: compactor\n\n ci-meta-backup-test-restore-sql:\n config-path: src/config/ci-meta-backup-test.toml\n steps:\n - use: sqlite\n - use: minio\n\n ci-sink-test:\n config-path: src/config/ci.toml\n steps:\n - use: minio\n - use: meta-node\n - use: compute-node\n enable-tiered-cache: true\n - use: frontend\n - use: compactor\n\n hummock-trace:\n config-path: src/config/hummock-trace.toml\n steps:\n - use: minio\n - use: meta-node\n - use: compute-node\n - use: frontend\n - use: compactor\n\n ci-backfill:\n config-path: 
"src/config/ci-backfill.toml"\n steps:\n - use: minio\n - use: meta-node\n meta-backend: env\n - use: compute-node\n - use: frontend\n - use: compactor\n\n full-with-batch-query-limit:\n config-path: src/config/full-with-batch-query-limit.toml\n steps:\n - use: minio\n - use: sqlite\n - use: meta-node\n meta-backend: sqlite\n - use: compute-node\n - use: frontend\n - use: compactor\n - use: prometheus\n - use: grafana\n\ncompose:\n risingwave: "ghcr.io/risingwavelabs/risingwave:latest"\n prometheus: "prom/prometheus:latest"\n minio: "quay.io/minio/minio:latest"\n redpanda: "redpandadata/redpanda:latest"\n grafana: "grafana/grafana-oss:latest"\n tempo: "grafana/tempo:latest"\n\n# The `use` field specified in the above `risedev` section will refer to the templates below.\ntemplate:\n minio:\n # Advertise address of MinIO s3 endpoint\n address: "127.0.0.1"\n\n # Advertise port of MinIO s3 endpoint\n port: 9301\n\n # Listen address of MinIO endpoint\n listen-address: ${address}\n\n # Console address of MinIO s3 endpoint\n console-address: "127.0.0.1"\n\n # Console port of MinIO s3 endpoint\n console-port: 9400\n\n # Root username (can be used to login to MinIO console)\n root-user: hummockadmin\n\n # Root password (can be used to login to MinIO console)\n root-password: hummockadmin\n\n # Bucket name to store hummock information\n hummock-bucket: hummock001\n\n # Id of this instance\n id: minio\n\n # Prometheus nodes used by this MinIO\n provide-prometheus: "prometheus*"\n\n # Max concurrent api requests.\n # see: https://github.com/minio/minio/blob/master/docs/throttle/README.md.\n # '0' means this env var will use the default of minio.\n api-requests-max: 0\n\n # Deadline for api requests.\n # Empty string means this env var will use the default of minio.\n api-requests-deadline: ""\n\n sqlite:\n # Id of this instance\n id: sqlite\n\n # File name of the sqlite database\n file: metadata.db\n\n compute-node:\n # Compute-node advertise address\n address: "127.0.0.1"\n\n 
# Listen address\n listen-address: ${address}\n\n # Compute-node listen port\n port: 5688\n\n # Prometheus exporter listen port\n exporter-port: 1222\n\n # Id of this instance\n id: compute-node-${port}\n\n # Whether to enable async stack trace for this compute node, `off`, `on`, or `verbose`.\n # For performance reasons, `verbose` mode only takes effect under the `release` profile with `debug_assertions` off.\n async-stack-trace: verbose\n\n # If `enable-tiered-cache` is true, hummock will use data directory as file cache.\n enable-tiered-cache: false\n\n # Minio instances used by this compute node\n provide-minio: "minio*"\n\n # OpenDAL storage backend used by this compute node\n provide-opendal: "opendal*"\n\n # AWS s3 bucket used by this compute node\n provide-aws-s3: "aws-s3*"\n\n # Meta-nodes used by this compute node\n provide-meta-node: "meta-node*"\n\n # Tempo used by this compute node\n provide-tempo: "tempo*"\n\n # If `user-managed` is true, this service will be started by the user with the above config\n user-managed: false\n\n # Total available memory for the compute node in bytes\n total-memory-bytes: 8589934592\n\n # Parallelism of tasks per compute node\n parallelism: 4\n\n role: both\n\n # Resource group for scheduling, default value is "default"\n resource-group: "default"\n\n meta-node:\n # Meta-node advertise address\n address: "127.0.0.1"\n\n # Meta-node listen port\n port: 5690\n\n # Listen address\n listen-address: ${address}\n\n # Dashboard listen port\n dashboard-port: 5691\n\n # Prometheus exporter listen port\n exporter-port: 1250\n\n # Id of this instance\n id: meta-node-${port}\n\n # If `user-managed` is true, this service will be started by the user with the above config\n user-managed: false\n\n # meta backend type, requires extra config for provided backend\n meta-backend: "memory"\n\n # Sqlite backend config\n provide-sqlite-backend: "sqlite*"\n\n # Postgres backend config\n provide-postgres-backend: "postgres*"\n\n # Mysql backend config\n 
provide-mysql-backend: "mysql*"\n\n # Prometheus nodes used by dashboard service\n provide-prometheus: "prometheus*"\n\n # Sanity check: should use shared storage if there are multiple compute nodes\n provide-compute-node: "compute-node*"\n\n # Sanity check: should start at least one compactor if using shared object store\n provide-compactor: "compactor*"\n\n # Minio instances used by the cluster\n provide-minio: "minio*"\n\n # OpenDAL storage backend used by the cluster\n provide-opendal: "opendal*"\n\n # AWS s3 bucket used by the cluster\n provide-aws-s3: "aws-s3*"\n\n # Tempo used by this meta node\n provide-tempo: "tempo*"\n\n prometheus:\n # Advertise address of Prometheus\n address: "127.0.0.1"\n\n # Listen port of Prometheus\n port: 9500\n\n # Listen address\n listen-address: ${address}\n\n # Id of this instance\n id: prometheus\n\n # If `remote_write` is true, this Prometheus instance will push metrics to a remote instance\n remote-write: false\n\n # AWS region of remote write\n remote-write-region: ""\n\n # Remote write url of this instance\n remote-write-url: ""\n\n # Compute-nodes used by this Prometheus instance\n provide-compute-node: "compute-node*"\n\n # Meta-nodes used by this Prometheus instance\n provide-meta-node: "meta-node*"\n\n # Minio instances used by this Prometheus instance\n provide-minio: "minio*"\n\n # Compactors used by this Prometheus instance\n provide-compactor: "compactor*"\n\n # Redpanda used by this Prometheus instance\n provide-redpanda: "redpanda*"\n\n # Frontend used by this Prometheus instance\n provide-frontend: "frontend*"\n\n # How frequently Prometheus scrapes targets (collects metrics)\n scrape-interval: 15s\n\n frontend:\n # Advertise address of frontend\n address: "127.0.0.1"\n\n # Listen port of frontend\n port: 4566\n\n # Listen address\n listen-address: ${address}\n\n # Prometheus exporter listen port\n exporter-port: 2222\n\n # Health check listen port\n health-check-port: 6786\n\n # Id of this instance\n id: 
frontend-${port}\n\n # Meta-nodes used by this frontend instance\n provide-meta-node: "meta-node*"\n\n # Tempo used by this frontend instance\n provide-tempo: "tempo*"\n\n # If `user-managed` is true, this service will be started by the user with the above config\n user-managed: false\n\n compactor:\n # Compactor advertise address\n address: "127.0.0.1"\n\n # Compactor listen port\n port: 6660\n\n # Listen address\n listen-address: ${address}\n\n # Prometheus exporter listen port\n exporter-port: 1260\n\n # Id of this instance\n id: compactor-${port}\n\n # Minio instances used by this compactor\n provide-minio: "minio*"\n\n # Meta-nodes used by this compactor\n provide-meta-node: "meta-node*"\n\n # Tempo used by this compactor\n provide-tempo: "tempo*"\n\n # If `user-managed` is true, this service will be started by the user with the above config\n user-managed: false\n\n grafana:\n # Listen address of Grafana\n listen-address: ${address}\n\n # Advertise address of Grafana\n address: "127.0.0.1"\n\n # Listen port of Grafana\n port: 3001\n\n # Id of this instance\n id: grafana\n\n # Prometheus used by this Grafana instance\n provide-prometheus: "prometheus*"\n\n # Tempo used by this Grafana instance\n provide-tempo: "tempo*"\n\n tempo:\n # Id of this instance\n id: tempo\n\n # Listen address of HTTP server and OTLP gRPC collector\n listen-address: "127.0.0.1"\n\n # Advertise address of Tempo\n address: "127.0.0.1"\n\n # HTTP server listen port\n port: 3200\n\n # gRPC listen port of the OTLP collector\n otlp-port: 4317\n\n max-bytes-per-trace: 5000000\n\n opendal:\n id: opendal\n\n engine: hdfs\n\n namenode: 127.0.0.1:9000\n\n bucket: risingwave-test\n\n # aws-s3 is a placeholder service to provide configurations\n aws-s3:\n # Id to be picked-up by services\n id: aws-s3\n\n # The bucket to be used for AWS S3\n bucket: test-bucket\n\n # access key, secret key and region should be set in aws config (either by env var or .aws/config)\n\n # Apache Kafka service backed by 
docker.\n kafka:\n # Id to be picked-up by services\n id: kafka-${port}\n\n # Advertise address of Kafka\n address: "127.0.0.1"\n\n # Listen port of Kafka\n port: 29092\n\n # Listen port of KRaft controller\n controller-port: 29093\n # Listen port for other services in docker (schema-registry)\n docker-port: 29094\n\n # The docker image. Can be overridden to use a different version.\n image: "confluentinc/cp-kafka:7.6.1"\n\n # If set to true, data will be persisted at data/{id}.\n persist-data: true\n\n # Kafka node id. If there are multiple instances of Kafka, we will need to set this.\n node-id: 0\n\n user-managed: false\n\n schema-registry:\n # Id to be picked-up by services\n id: schema-registry-${port}\n\n # Advertise address\n address: "127.0.0.1"\n\n # Listen port of Schema Registry\n port: 8081\n\n # The docker image. Can be overridden to use a different version.\n image: "confluentinc/cp-schema-registry:7.6.1"\n\n user-managed: false\n\n provide-kafka: "kafka*"\n\n # Google pubsub emulator service\n pubsub:\n id: pubsub-${port}\n\n address: "127.0.0.1"\n\n port: 5980\n\n persist-data: true\n\n # Only supported in RiseDev compose\n redpanda:\n # Id to be picked-up by services\n id: redpanda\n\n # Port used inside docker-compose cluster (e.g. create MV)\n internal-port: 29092\n\n # Port used on host (e.g. 
import data, connecting using kafkacat)\n outside-port: 9092\n\n # Connect address\n address: ${id}\n\n # Number of CPUs to use\n cpus: 8\n\n # Memory limit for Redpanda\n memory: 16G\n\n # redis service\n redis:\n # Id to be picked-up by services\n id: redis\n\n # listen port of redis\n port: 6379\n\n # address of redis\n address: "127.0.0.1"\n\n # MySQL service backed by docker.\n mysql:\n # Id to be picked-up by services\n id: mysql-${port}\n\n # address of mysql\n address: "127.0.0.1"\n\n # listen port of mysql\n port: 8306\n\n # Note:\n # - This will be used to initialize the MySQL instance.\n # * If the user is "root", the password will be used as the root password.\n # * Otherwise, a regular user will be created with the given password. The root password will be empty.\n # Note that this only applies to fresh instances, i.e., the data directory is empty.\n # - These configs will be passed as-is to risedev-env default user for MySQL operations.\n # - This is not used in RISEDEV_MYSQL_WITH_OPTIONS_COMMON.\n user: root\n password: ""\n database: "risedev"\n\n # Which application to use. Can be overridden for metastore purpose.\n application: "connector"\n\n # The docker image. Can be overridden to use a different version.\n image: "mysql:8.0"\n\n # If set to true, data will be persisted at data/{id}.\n persist-data: true\n\n # If `user-managed` is true, user is responsible for starting the service\n # to serve at the above address and port in any way they see fit.\n user-managed: false\n\n # PostgreSQL service backed by docker.\n postgres:\n # Id to be picked-up by services\n id: postgres-${port}\n\n # address of pg\n address: "127.0.0.1"\n\n # listen port of pg\n port: 8432\n\n # Note:\n # - This will be used to initialize the PostgreSQL instance if it's fresh.\n # - These configs will be passed as-is to risedev-env default user for PostgreSQL operations.\n user: postgres\n password: ""\n database: "postgres"\n\n # Which application to use. 
Can be overridden for connector purpose.\n application: "metastore"\n\n # The docker image. Can be overridden to use a different version.\n image: "postgres:17-alpine"\n\n # If set to true, data will be persisted at data/{id}.\n persist-data: true\n\n # If `user-managed` is true, user is responsible for starting the service\n # to serve at the above address and port in any way they see fit.\n user-managed: false\n\n # Sql Server service backed by docker.\n sqlserver:\n # Note: Sql Server is now only for connector purpose.\n # Id to be picked-up by services\n id: sqlserver-${port}\n\n # address of mssql\n address: "127.0.0.1"\n\n # listen port of mssql\n port: 1433\n\n # Note:\n # - This will be used to initialize the Sql Server instance if it's fresh.\n # - In user-managed mode, these configs are not validated by risedev.\n # They are passed as-is to risedev-env default user for Sql Server operations.\n user: SA\n password: "YourPassword123"\n database: "master"\n\n # The docker image. Can be overridden to use a different version.\n image: "mcr.microsoft.com/mssql/server:2022-latest"\n\n # If set to true, data will be persisted at data/{id}.\n persist-data: true\n\n # If `user-managed` is true, user is responsible for starting the service\n # to serve at the above address and port in any way they see fit.\n user-managed: false\n | dataset_sample\yaml\rust\risedev.yml | risedev.yml | YAML | 38,979 | 0.95 | 0.017386 | 0.201974 | vue-tools | 889 | 2025-07-07T07:50:30.800128 | MIT | false | 10f85c30a3ca5dab005c5682cf88fff2 |
# Inherit from the internal-ops repo\n# https://runs-on.com/configuration/repo-config/\n_extends: internal-ops\n | dataset_sample\yaml\rust\runs-on.yml | runs-on.yml | YAML | 109 | 0.8 | 0 | 0.666667 | vue-tools | 244 | 2023-10-22T07:13:28.941411 | BSD-3-Clause | false | 0d80da488b3cbfdfabb4c7f7cf3ca7ef |
openapi: "3.0.2"\ninfo:\n title: Tartarus - OpenAPI 3.0\n description: |-\n This is the OpenAPI 3.0 specification for the card locker.\n This is used by the [hyperswitch](https://github.com/juspay/hyperswitch) for storing card information securely.\n version: "1.0"\ntags:\n - name: Key Custodian\n description: API used to initialize the locker after deployment.\n - name: Data\n description: CRUD APIs for working with data to be stored in the locker\n - name: Cards\n description: CRUD APIs for working with cards data to be stored in the locker (deprecated)\npaths:\n /custodian/key1:\n post:\n tags:\n - Key Custodian\n summary: Provide Key 1\n description: Provide the first key to unlock the locker\n operationId: setKey1\n requestBody:\n description: Provide key 1 to unlock the locker\n content:\n application/json:\n schema:\n $ref: "#/components/schemas/Key"\n required: true\n responses:\n "200":\n description: Key 1 provided\n content:\n text/plain:\n schema:\n $ref: "#/components/schemas/Key1Set"\n /custodian/key2:\n post:\n tags:\n - Key Custodian\n summary: Provide Key 2\n description: Provide the second key to unlock the locker\n operationId: setKey2\n requestBody:\n description: Provide key 2 to unlock the locker\n content:\n application/json:\n schema:\n $ref: "#/components/schemas/Key"\n required: true\n responses:\n "200":\n description: Key 2 provided\n content:\n text/plain:\n schema:\n $ref: "#/components/schemas/Key2Set"\n /custodian/decrypt:\n post:\n tags:\n - Key Custodian\n summary: Unlock the locker\n description: Unlock the locker with the key1 and key2 provided\n responses:\n "200":\n description: Successfully Unlocked\n content:\n text/plain:\n schema:\n $ref: "#/components/schemas/Decrypt200"\n /health:\n get:\n summary: Get Health\n description: To check whether the application is up\n responses:\n "200":\n description: Health is good\n content:\n text/plain:\n schema:\n $ref: "#/components/schemas/Health"\n /data/add:\n post:\n tags:\n - 
Cards\n - Data\n summary: Add Data in Locker\n description: Add sensitive data in the locker\n requestBody:\n description: The request body might be JWE + JWS encrypted when using middleware\n content:\n application/json:\n schema:\n oneOf:\n - $ref: "#/components/schemas/StoreDataReq"\n - $ref: "#/components/schemas/JWEReq"\n required: true\n responses:\n "200":\n description: Store Data Response\n content:\n application/json:\n schema:\n oneOf:\n - $ref: "#/components/schemas/StoreDataRes"\n - $ref: "#/components/schemas/JWERes"\n /data/delete:\n post:\n tags:\n - Cards\n - Data\n summary: Delete Data from Locker\n description: Delete sensitive data from the locker\n requestBody:\n description: The request body might be JWE + JWS encrypted when using middleware\n content:\n application/json:\n schema:\n oneOf:\n - $ref: "#/components/schemas/DeleteDataReq"\n - $ref: "#/components/schemas/JWEReq"\n required: true\n responses:\n "200":\n description: Delete Data Response\n content:\n application/json:\n schema:\n oneOf:\n - $ref: "#/components/schemas/DeleteDataRes"\n - $ref: "#/components/schemas/JWERes"\n /data/retrieve:\n post:\n tags:\n - Cards\n - Data\n summary: Retrieve Data from Locker\n description: Retrieve sensitive data from the locker\n requestBody:\n description: The request body might be JWE + JWS encrypted when using middleware\n content:\n application/json:\n schema:\n oneOf:\n - $ref: "#/components/schemas/RetrieveDataReq"\n - $ref: "#/components/schemas/JWEReq"\n required: true\n responses:\n "200":\n description: Retrieve Data Response\n content:\n application/json:\n schema:\n oneOf:\n - $ref: "#/components/schemas/RetrieveDataRes"\n - $ref: "#/components/schemas/JWERes"\n /data/fingerprint:\n post:\n tags:\n - Cards\n - Data\n summary: Get or insert the card fingerprint\n description: Get or insert the card fingerprint\n requestBody:\n description: Provide card number and hash key\n content:\n application/json:\n schema:\n $ref: 
"#/components/schemas/FingerprintReq"\n required: true\n responses:\n "200":\n description: Fingerprint Response\n content:\n application/json:\n schema:\n $ref: "#/components/schemas/FingerprintRes"\ncomponents:\n schemas:\n Key:\n type: object\n properties:\n key:\n type: string\n example: 801bb63c1bd51820acbc8ac20c674675\n required:\n - key\n StoreDataReq:\n title: StoreDataReq\n type: object\n properties:\n merchant_id:\n type: string\n example: m0100\n merchant_customer_id:\n type: string\n example: HsCustomer1\n requester_card_reference:\n type: string\n example: 3ffdf1e5-7f38-4f26-936f-c66a6f4296fa\n card:\n $ref: "#/components/schemas/Card"\n enc_card_data:\n type: string\n example: "qwe4tyusdfg"\n RetrieveDataReq:\n title: RetrieveDataReq\n type: object\n properties:\n merchant_id:\n type: string\n example: m0100\n merchant_customer_id:\n type: string\n example: HsCustomer1\n card_reference:\n type: string\n example: 3ffdf1e5-7f38-4f26-936f-c66a6f4296fa\n DeleteDataReq:\n title: DeleteDataReq\n type: object\n properties:\n merchant_id:\n type: string\n example: m0100\n merchant_customer_id:\n type: string\n example: HsCustomer1\n card_reference:\n type: string\n example: 3ffdf1e5-7f38-4f26-936f-c66a6f4296fa\n FingerprintReq:\n type: object\n properties:\n card:\n $ref: "#/components/schemas/FingerprintCardData"\n hash_key:\n type: string\n example: Hash1\n JWEReq:\n title: JWEReq\n type: object\n properties:\n header:\n type: string\n iv:\n type: string\n encrypted_payload:\n type: string\n tag:\n type: string\n encrypted_key:\n type: string\n RetrieveRes:\n title: RetrieveRes\n oneOf:\n - type: object\n properties:\n card:\n $ref: "#/components/schemas/Card"\n - type: object\n properties:\n enc_card_data:\n type: string\n Card:\n title: Card\n type: object\n required:\n - card_number\n properties:\n card_number:\n type: string\n name_on_card:\n type: string\n card_exp_month:\n type: string\n card_exp_year:\n type: string\n card_brand:\n type: string\n 
card_isin:\n type: string\n nick_name:\n type: string\n FingerprintCardData:\n type: object\n properties:\n card_number:\n type: string\n example: 4242424242424242\n Key1Set:\n title: Key1Set\n type: string\n # summary: Response after setting key1\n description: Received Key1\n example: Received Key1\n Key2Set:\n title: Key2Set\n type: string\n # description: Response after setting key2\n description: Received Key2\n example: Received Key2\n Decrypt200:\n title: Decrypt200\n type: string\n # description: Response if the locker key custodian decryption was successful\n description: Decryption successful\n example: Decryption successful\n Health:\n title: Health\n type: string\n # description: Response when the health is good\n description: health is good\n example: health is good\n StoreDataRes:\n title: StoreDataRes\n type: object\n description: Response received if the data was stored successfully\n properties:\n status:\n type: string\n enum: [Ok]\n payload:\n type: object\n properties:\n card_reference:\n type: string\n RetrieveDataRes:\n title: RetrieveDataRes\n type: object\n description: Response received with the sensitive data, associated to the card reference\n properties:\n status:\n type: string\n enum: [Ok]\n payload:\n $ref: "#/components/schemas/RetrieveRes"\n DeleteDataRes:\n title: DeleteDataRes\n type: object\n description: Response received if the data deletion was successful\n properties:\n status:\n type: string\n enum: [Ok]\n FingerprintRes:\n type: object\n description: Response received if the fingerprint insertion or retrieval was successful\n properties:\n status:\n type: string\n enum: [Ok]\n payload:\n type: object\n properties:\n fingerprint:\n type: string\n JWERes:\n title: JWERes\n type: object\n description: JWE encrypted response equivalent\n properties:\n header:\n type: string\n iv:\n type: string\n encrypted_payload:\n type: string\n tag:\n type: string\n encrypted_key:\n type: string | 
dataset_sample\yaml\rust\rust_locker_open_api_spec.yml | rust_locker_open_api_spec.yml | YAML | 10,223 | 0.95 | 0.021563 | 0.010753 | awesome-app | 349 | 2025-01-13T20:51:31.184053 | Apache-2.0 | false | 13b623bfc568c5431f01ed2fea2767d2 |
# It is considered harmful to close issues or PRs, so our stale bot will never close an issue or PR.\n# To remove the stale label, just leave a new comment.\n# Adapted from https://github.com/NixOS/nixpkgs/blob/master/.github/STALE-BOT.md\n\n# Configuration for probot-stale - https://github.com/probot/stale\nstaleLabel: "stale"\ndaysUntilStale: 60\ndaysUntilClose: false\ncloseComment: false\nexemptLabels:\n - "tracking issue"\n - "awaiting more feedback"\nissues:\n markComment: >\n This issue has been automatically marked as stale because it has not had recent activity.\n **If this issue is still affecting you, please leave any comment** (for example, "bump").\n We are sorry that we haven't been able to prioritize it yet. If you have any new additional information, please include it with your comment!\npulls:\n markComment: >\n This pull request has been automatically marked as stale because it has not had recent activity.\n **If this pull request is still relevant, please leave any comment** (for example, "bump").\n | dataset_sample\yaml\rust\stale.yml | stale.yml | YAML | 1,030 | 0.8 | 0.142857 | 0.3 | react-lib | 819 | 2025-03-04T02:36:55.422946 | Apache-2.0 | false | 466a525a0f297e25ae31c7bd3db3da2d |
name: tabby\nroot: ./\n\nwindows:\n - caddy:\n panes:\n - caddy run --watch --config ee/tabby-webserver/development/Caddyfile\n - server:\n layout: even-horizontal\n panes:\n - cargo run serve --port 8081\n - cd ee/tabby-ui && pnpm dev\n | dataset_sample\yaml\rust\tabby.yml | tabby.yml | YAML | 263 | 0.7 | 0 | 0 | awesome-app | 913 | 2023-11-27T01:30:33.670907 | MIT | false | 10a6dc1e3eb2147163290304b5f6ef36 |
team:\n - "@hardfist"\n - "@h-a-n-a"\n - "@bvanjoi"\n - "@ahabhgk"\n - "@jerrykingxyz"\n - "@chenjiahan"\n - "@JSerFeng"\n - "@9aoy"\n - "@caohuilin"\n - "@KyrieLii"\n - "@sanyuan0704"\n - "@jkzing"\n - "@zhoushaw"\n - "@ulivz"\n - "@LingyuCoder"\n - "@yuyutaotao"\n - "@SyMind"\n - "@SoonIter"\n - "@fi3ework"\n - "@Timeless0911"\n - "@GiveMe-A-Name"\n | dataset_sample\yaml\rust\teams.yml | teams.yml | YAML | 352 | 0.7 | 0 | 0 | python-kit | 216 | 2025-02-19T07:37:41.714391 | BSD-3-Clause | false | a485523311011b5e8ea37c5486b26220 |
# Docs\n\n# Label Config\n\n# labeler:\n# - settings:\n# - codeOwnersPath: {PATH TO CODEOWNERS FILE (defaults to .github/CODEOWNERS)}\n# - labels:\n# - label: {YOUR LABEL NAME}\n# condition: {AND (default) | OR}\n# when:\n# {TEST_FUNCTION}: {REGEX}\n# ...\n# ...\n\n#| Function Name | Description |\n#| --------------------------- | -------------------------------------------------------------------------- |\n#| `isAnyFilePathMatch` | Returns true if any filename in the PR diff matches the given regex |\n#| `isPRBodyMatch` | Returns true if the PR description matches the given regex |\n#| `isPRTitleMatch` | Returns true if the PR title matches the given regex |\n#| `isPRAuthorMatch` | Returns true if the PR author matches the given regex |\n#| `isPRAuthorCompanyMatch` | Returns true if the PR author's company matches the given regex |\n#| `isAnyFileOwnedByMatch` | Returns true if any owner of a file in the PR diff matches the given regex |\n#| `isNotAnyFilePathMatch` | The negation of `isAnyFilePathMatch` |\n#| `isNotPRBodyMatch` | The negation of `isPRBodyMatch` |\n#| `isNotPRTitleMatch` | The negation of `isPRTitleMatch` |\n#| `isNotPRAuthorMatch` | The negation of `isPRAuthorMatch` |\n#| `isNotPRAuthorCompanyMatch` | The negation of `isPRAuthorCompanyMatch` |\n#| `isNotAnyFileOwnerByMatch` | The negation of `isAnyFileOwnedByMatch` |\n\nlabeler:\n labels:\n # Apply "needs: triage" when the author is not on the turborepo team\n - label: "needs: triage"\n when:\n isNotPRAuthorMatch: "^(padmaia|anthonyshew|dimitropoulos|tknickman|chris-olszewski|NicholasLYang)$"\n # Removes the PR from release notes when it's chore or ci\n - label: "release-notes-ignore"\n when:\n isPRTitleMatch: "(\bchore\b|\bci\b).*?:"\n\n # areas\n - label: "area: ci"\n when:\n isAnyFilePathMatch: '^\.github\/(workflows|actions).*$'\n - label: "area: examples"\n when:\n isAnyFilePathMatch: '^examples\/.*$'\n - label: "area: docs"\n when:\n isAnyFilePathMatch: '^docs\/.*\.mdx$'\n - label: "area: site"\n when:\n 
isAnyFilePathMatch: '^docs\/.*\.(?!mdx).*$'\n\n # packages\n - label: "pkg: turbo-eslint"\n when:\n isAnyFilePathMatch: '^packages\/eslint-(plugin|config)-turbo\/.*$'\n - label: "pkg: turbo-ignore"\n when:\n isAnyFilePathMatch: '^packages\/turbo-ignore\/.*$'\n - label: "pkg: turbo-codemod"\n when:\n isAnyFilePathMatch: '^packages\/turbo-codemod\/.*$'\n - label: "pkg: create-turbo"\n when:\n isAnyFilePathMatch: '^packages\/create-turbo\/.*$'\n - label: "pkg: turbo-gen"\n when:\n isAnyFilePathMatch: '^packages\/turbo-gen\/.*$'\n - label: "pkg: turbo-workspaces"\n when:\n isAnyFilePathMatch: '^packages\/turbo-workspaces\/.*$'\n - label: "pkg: turbo-repository"\n when:\n isAnyFilePathMatch: '^packages\/turbo-repository\/.*$'\n - label: "pkg: turbo-telemetry"\n when:\n isAnyFilePathMatch: '^packages\/turbo-telemetry\/.*$'\n\n # release\n - label: "release: turborepo"\n when:\n isAnyFilePathMatch: '^version\.txt$'\n isPRTitleMatch: '^release\(turborepo\):.*$'\nevents:\n onPublish:\n turbo:\n - runWorkflow: bench-turborepo.yml\n when: any\n - runWorkflow: update-examples-on-release.yml\n when: latest\n | dataset_sample\yaml\rust\turbo-orchestrator.yml | turbo-orchestrator.yml | YAML | 3,795 | 0.8 | 0.064516 | 0.360465 | react-lib | 825 | 2023-07-20T00:36:39.783442 | Apache-2.0 | false | ba83d88fecbcab3994cee12eb9db2403 |
id: use-basic-job\nmessage: Use BasicJob / CronJob for worker creation.\nseverity: error\nlanguage: rust\nfiles:\n- ./ee/tabby-webserver/src/service/background_job/**\nignores:\n- ./ee/tabby-webserver/src/service/background_job/helper/mod.rs\nrule:\n pattern: WorkerBuilder | dataset_sample\yaml\rust\use-basic-job.yml | use-basic-job.yml | YAML | 265 | 0.8 | 0.111111 | 0 | python-kit | 117 | 2025-06-11T14:47:18.893813 | Apache-2.0 | false | fd7eff8a1a81c0777f4fdbd785345e08 |
id: use-schema-result\nmessage: Use schema::Result as API interface\nseverity: error\nlanguage: rust\nfiles:\n- ./ee/tabby-schema/src/**\nignores:\n- ./ee/tabby-schema/src/lib.rs\n- ./ee/tabby-schema/src/dao.rs\nrule:\n any:\n - pattern: anyhow\n not:\n inside:\n kind: enum_variant\n stopBy: end\n - pattern: FieldResult | dataset_sample\yaml\rust\use-schema-result.yml | use-schema-result.yml | YAML | 342 | 0.8 | 0 | 0 | react-lib | 542 | 2025-03-20T02:58:49.789156 | Apache-2.0 | false | 793ceaf1657eb3a7f7e90c0187cf80c7 |
id: validate-requires-code\nmessage: Validations requires code / message being set for frontend error display\nseverity: error\nlanguage: rust\nfiles:\n - ./ee/tabby-webserver/src/**\n - ./ee/tabby-schema/src/**\nrule:\n all:\n - pattern: "#[validate]"\n - not:\n all:\n - has:\n stopBy: end\n pattern: code\n - has:\n stopBy: end\n pattern: message\n - not:\n any:\n - has:\n stopBy: end\n pattern: custom\n - has:\n stopBy: end\n pattern: nested\n - has:\n stopBy: end\n pattern: schema\n | dataset_sample\yaml\rust\validate-requires-code.yml | validate-requires-code.yml | YAML | 662 | 0.95 | 0.034483 | 0 | react-lib | 759 | 2024-10-12T06:55:30.969768 | MIT | false | bebd9319a3819a3d46b3ed0c6d7e83bd |
# Configuration for the zizmor static analysis tool, run via pre-commit in CI\n# https://woodruffw.github.io/zizmor/configuration/\n#\n# TODO: can we remove the ignores here so that our workflows are more secure?\nrules:\n dangerous-triggers:\n ignore:\n - pr-comment.yaml\n cache-poisoning:\n ignore:\n - build-docker.yml\n - publish-playground.yml\n excessive-permissions:\n # it's hard to test what the impact of removing these ignores would be\n # without actually running the release workflow...\n ignore:\n - build-docker.yml\n - publish-playground.yml\n - publish-docs.yml\n | dataset_sample\yaml\rust\zizmor.yml | zizmor.yml | YAML | 610 | 0.8 | 0.052632 | 0.315789 | vue-tools | 469 | 2024-05-03T19:58:06.397718 | BSD-3-Clause | false | 889cc2b681ee9c2f8aa932e18a17e336 |
name: CI\non: [push, pull_request]\n\njobs:\n test:\n name: Run tests\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@master\n - name: Update rustup\n run: rustup self update\n - name: Install Rust\n run: |\n rustup set profile minimal\n rustup toolchain install 1.85 -c rust-docs\n rustup default 1.85\n - name: Install mdbook\n run: |\n mkdir bin\n curl -sSL https://github.com/rust-lang/mdBook/releases/download/v0.4.45/mdbook-v0.4.45-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=bin\n echo "$(pwd)/bin" >> "${GITHUB_PATH}"\n - name: Report versions\n run: |\n rustup --version\n rustc -Vv\n mdbook --version\n\n # mdBook does not currently have particularly good support for “external”\n # crates. To make the test suite work correctly with `trpl`, we must first\n # build `trpl` itself (`mdbook` will not do it), and then explicitly pass\n # its `deps` path as a library search path for `mdbook test`. That will make\n # sure all the crates can be resolved when running the tests.\n - name: Build `trpl` crate\n run: |\n cd packages/trpl\n cargo build\n - name: Run tests\n run:\n mdbook test --library-path packages/trpl/target/debug/deps\n package_tests:\n name: Run package tests\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@master\n - name: Update rustup\n run: rustup self update\n - name: Install Rust\n run: |\n rustup set profile minimal\n rustup toolchain install 1.85 -c rust-docs\n rustup default 1.85\n - name: Run `tools` package tests\n run: |\n cargo test\n - name: Run `mdbook-trpl` package tests\n working-directory: packages/mdbook-trpl\n run: |\n cargo test\n lint:\n name: Run lints\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@master\n - name: Update rustup\n run: rustup self update\n - name: Install Rust\n run: |\n rustup set profile minimal\n rustup toolchain install nightly -c rust-docs\n rustup override set nightly\n - name: Install mdbook\n run: |\n mkdir bin\n curl -sSL 
https://github.com/rust-lang/mdBook/releases/download/v0.4.45/mdbook-v0.4.45-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=bin\n echo "$(pwd)/bin" >> "${GITHUB_PATH}"\n - name: Install mdbook-trpl binaries\n run: cargo install --path packages/mdbook-trpl\n - name: Install aspell\n run: sudo apt-get install aspell\n - name: Install shellcheck\n run: sudo apt-get install shellcheck\n - name: Report versions\n run: |\n rustup --version\n rustc -Vv\n mdbook --version\n aspell --version\n shellcheck --version\n - name: Shellcheck\n run: find . -name '*.sh' -print0 | xargs -0 shellcheck\n - name: Spellcheck\n run: bash ci/spellcheck.sh list\n - name: Lint for local file paths\n run: |\n mdbook build\n cargo run --bin lfp src\n - name: Validate references\n run: bash ci/validate.sh\n - name: Check for broken links\n run: |\n curl -sSLo linkcheck.sh \\n https://raw.githubusercontent.com/rust-lang/rust/master/src/tools/linkchecker/linkcheck.sh\n # Cannot use --all here because the generated redirect pages aren't available.\n sh linkcheck.sh book\n | dataset_sample\yaml\rust-lang_book\.github\workflows\main.yml | main.yml | YAML | 3,401 | 0.8 | 0.038462 | 0.058824 | python-kit | 298 | 2024-11-07T05:30:54.006929 | GPL-3.0 | false | a19fee964e5596fda55f44695510fe04 |
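The Shellcheck step in the workflow above pipes `find . -name '*.sh' -print0` into `xargs -0` so that paths containing spaces or newlines survive the hand-off. The same NUL-delimited split, sketched in Python with made-up paths:

```python
# Simulate consuming `find -print0` output: entries are separated by
# NUL bytes, so whitespace in file names is never a delimiter.
raw = b"ci/spellcheck.sh\x00ci/validate.sh\x00tools/my script.sh\x00"
paths = [p.decode() for p in raw.split(b"\x00") if p]
assert paths == ["ci/spellcheck.sh", "ci/validate.sh", "tools/my script.sh"]
```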
name: Bug Report\ndescription: Create a report to help us improve\nlabels: ["C-bug", "S-triage"]\nbody:\n - type: markdown\n attributes:\n value: Thanks for filing a 🐛 bug report 😄!\n - type: textarea\n id: problem\n attributes:\n label: Problem\n description: >\n Please provide a clear and concise description of what the bug is,\n including what currently happens and what you expected to happen.\n validations:\n required: true\n - type: textarea\n id: steps\n attributes:\n label: Steps\n description: Please list the steps to reproduce the bug.\n placeholder: |\n 1.\n 2.\n 3.\n - type: textarea\n id: possible-solutions\n attributes:\n label: Possible Solution(s)\n description: >\n Not obligatory, but suggest a fix/reason for the bug,\n or ideas how to implement the addition or change.\n - type: textarea\n id: notes\n attributes:\n label: Notes\n description: Provide any additional notes that might be helpful.\n - type: textarea\n id: version\n attributes:\n label: Version\n description: Please paste the output of running `cargo version --verbose`.\n render: text\n | dataset_sample\yaml\rust-lang_cargo\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 1,204 | 0.85 | 0.046512 | 0 | awesome-app | 665 | 2024-04-03T05:42:48.186371 | BSD-3-Clause | false | 91dd1fac99655a7a64d836e5de2234d7 |
contact_links:\n - name: Question\n url: https://users.rust-lang.org\n about: >\n Got a question about Cargo? Ask the community on the user forum.\n - name: Inspiring Idea\n url: https://internals.rust-lang.org/c/tools-and-infrastructure/cargo\n about: >\n Need more discussions with your next big idea?\n Reach out to the community on the internals forum.\n | dataset_sample\yaml\rust-lang_cargo\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 375 | 0.8 | 0 | 0 | node-utils | 622 | 2023-10-20T14:57:46.019264 | BSD-3-Clause | false | 196cac2f07b48e48b4b8f370b64179bb |
name: Feature Request\ndescription: Suggest an idea for enhancing Cargo\nlabels: ["C-feature-request", "S-triage"]\nbody:\n - type: markdown\n attributes:\n value: |\n Thanks for filing a 🙋 feature request 😄!\n\n If the feature request is relatively small and already with a possible solution, this might be the place for you.\n\n If you are brewing a big feature that needs feedback from the community, [the internal forum] is the best fit, especially for pre-RFC. You can also talk the idea over with other developers in [#t-cargo Zulip stream].\n\n [the internal forum]: https://internals.rust-lang.org/c/tools-and-infrastructure/cargo/15\n [#t-cargo Zulip stream]: https://rust-lang.zulipchat.com/#narrow/stream/246057-t-cargo\n - type: textarea\n id: problem\n attributes:\n label: Problem\n description: >\n Please provide a clear description of your use case and the problem\n this feature request is trying to solve.\n validations:\n required: true\n - type: textarea\n id: solution\n attributes:\n label: Proposed Solution\n description: >\n Please provide a clear and concise description of what you want to happen.\n - type: textarea\n id: notes\n attributes:\n label: Notes\n description: Provide any additional context or information that might be helpful.\n | dataset_sample\yaml\rust-lang_cargo\.github\ISSUE_TEMPLATE\feature_request.yml | feature_request.yml | YAML | 1,366 | 0.95 | 0.114286 | 0 | node-utils | 682 | 2025-01-08T20:15:21.057326 | GPL-3.0 | false | 60c9cb3fde9d3de2c85c017945e23820 |
name: Tracking Issue\ndescription: A tracking issue for an accepted feature or RFC in Cargo.\ntitle: "Tracking Issue for _FEATURE_NAME_"\nlabels: ["C-tracking-issue"]\nbody:\n - type: markdown\n attributes:\n value: >\n Thank you for creating a tracking issue! Tracking issues are for tracking an\n accepted feature or RFC from implementation to stabilization. Please do not\n file a tracking issue until the feature or RFC has been approved.\n - type: textarea\n id: summary\n attributes:\n label: Summary\n description: Please provide a very brief summary of the feature.\n value: |\n RFC: [#NNNN](https://github.com/rust-lang/rfcs/pull/NNNN) <!-- If this is an RFC -->\n Original issue: #NNNN <!-- if there is a related issue that spawned this feature -->\n Implementation: #NNNN <!-- link to the PR that implemented this feature if applicable -->\n Documentation: https://doc.rust-lang.org/nightly/cargo/reference/unstable.html#my-feature\n\n Please enter a short, one-sentence description here.\n validations:\n required: true\n - type: textarea\n id: unresolved\n attributes:\n label: Unresolved Issues\n description: List issues that have not yet been resolved.\n placeholder: |\n * [ ] Make a list of any known implementation or design issues.\n - type: textarea\n id: future\n attributes:\n label: Future Extensions\n description: >\n An optional section where you can mention where the feature may be\n extended in the future, but which the feature is explicitly not\n intended to address.\n - type: textarea\n id: about\n attributes:\n label: About tracking issues\n description: Please include this notice in the issue.\n value: |\n Tracking issues are used to record the overall progress of implementation.\n They are also used as hubs connecting to other relevant issues, e.g., bugs or open design questions.\n A tracking issue is however *not* meant for large scale discussion, questions, or bug reports about a feature.\n Instead, open a dedicated issue for the specific matter and add the relevant 
feature gate label.\n | dataset_sample\yaml\rust-lang_cargo\.github\ISSUE_TEMPLATE\tracking_issue.yml | tracking_issue.yml | YAML | 2,180 | 0.95 | 0.16 | 0.020408 | vue-tools | 329 | 2025-06-29T23:22:49.619513 | MIT | false | f4cb0e5abc36f3d5b364813e5c92cc6e |
name: Security audit\n\npermissions:\n contents: read\n\non:\n pull_request:\n paths:\n - '**/Cargo.toml'\n - '**/Cargo.lock'\n push:\n branches:\n - master\n\njobs:\n cargo_deny:\n runs-on: ubuntu-latest\n strategy:\n matrix:\n checks:\n - advisories\n - bans licenses sources\n steps:\n - uses: actions/checkout@v4\n - uses: EmbarkStudios/cargo-deny-action@v2\n # Prevent sudden announcement of a new advisory from failing ci:\n continue-on-error: ${{ matrix.checks == 'advisories' }}\n with:\n command: check ${{ matrix.checks }}\n rust-version: stable\n | dataset_sample\yaml\rust-lang_cargo\.github\workflows\audit.yml | audit.yml | YAML | 624 | 0.8 | 0 | 0.037037 | awesome-app | 627 | 2024-08-29T08:51:04.287518 | GPL-3.0 | false | 2c53b00a879fcbaa5e37e7dd0683738f |
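The audit workflow above marks only the `advisories` check as non-fatal (`continue-on-error: ${{ matrix.checks == 'advisories' }}`), because a newly published advisory could otherwise break unrelated CI runs. A sketch of how that matrix expression evaluates per entry:

```python
# Matrix values from the audit.yml sample above; only the advisories
# check is allowed to fail without failing the workflow.
checks = ["advisories", "bans licenses sources"]
continue_on_error = {c: c == "advisories" for c in checks}

assert continue_on_error["advisories"] is True
assert continue_on_error["bans licenses sources"] is False
```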
name: Contrib Deploy\non:\n push:\n branches:\n - master\n\nconcurrency:\n cancel-in-progress: false\n group: "gh-pages"\n\npermissions:\n contents: read\n\njobs:\n deploy:\n permissions:\n contents: write # for Git to git push\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n with:\n fetch-depth: 0\n - name: Install mdbook\n run: |\n mkdir mdbook\n curl -Lf https://github.com/rust-lang/mdBook/releases/download/v0.4.44/mdbook-v0.4.44-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=./mdbook\n echo `pwd`/mdbook >> $GITHUB_PATH\n - name: Deploy docs\n run: |\n GENERATE_PY="$(pwd)/ci/generate.py"\n\n cd src/doc/contrib\n mdbook build\n\n # Override previous ref to avoid keeping history.\n git worktree add --orphan -B gh-pages gh-pages\n git config user.name "Deploy from CI"\n git config user.email ""\n cd gh-pages\n mv ../book contrib\n git add contrib\n\n # Generate HTML for link redirections.\n python3 "$GENERATE_PY"\n git add *.html\n # WARN: The CNAME file is for GitHub to redirect requests to the custom domain.\n # Missing this may entail a security hazard and domain takeover.\n # See <https://docs.github.com/en/pages/configuring-a-custom-domain-for-your-github-pages-site/managing-a-custom-domain-for-your-github-pages-site#securing-your-custom-domain>\n git add CNAME\n\n git commit -m "Deploy $GITHUB_SHA to gh-pages"\n git push origin +gh-pages\n | dataset_sample\yaml\rust-lang_cargo\.github\workflows\contrib.yml | contrib.yml | YAML | 1,550 | 0.8 | 0.096154 | 0.111111 | awesome-app | 380 | 2024-12-19T13:44:48.742980 | MIT | false | 401aa82e5495b893b7873bc158698d7a |
name: CI\non:\n merge_group:\n pull_request:\n branches:\n - "**"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: read\n\nconcurrency:\n group: "${{ github.workflow }}-${{ github.ref }}"\n cancel-in-progress: true\n\njobs:\n conclusion:\n needs:\n - build_std\n - clippy\n - msrv\n - docs\n - lint-docs\n - lockfile\n - resolver\n - rustfmt\n - test\n - test_gitoxide\n permissions:\n contents: none\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n\n # Check Code style quickly by running `rustfmt` over all code\n rustfmt:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update stable && rustup default stable\n - run: rustup component add rustfmt\n - run: cargo fmt --all --check\n\n # Ensure there are no clippy warnings\n clippy:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update stable && rustup default stable\n - run: rustup component add clippy\n - run: cargo clippy --workspace --all-targets --no-deps -- -D warnings\n\n stale-label:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update stable && rustup default stable\n - run: cargo stale-label\n\n lint-docs:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup 
update stable && rustup default stable\n - run: cargo lint-docs --check\n\n # Ensure Cargo.lock is up-to-date\n lockfile:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update stable && rustup default stable\n - run: cargo update -p cargo --locked\n\n check-version-bump:\n runs-on: ubuntu-latest\n env:\n BASE_SHA: ${{ github.event.pull_request.base.sha }}\n HEAD_SHA: ${{ github.event.pull_request.head.sha != '' && github.event.pull_request.head.sha || github.sha }}\n steps:\n - uses: actions/checkout@v4\n with:\n fetch-depth: 0\n - run: rustup update stable && rustup default stable\n - name: Install cargo-semver-checks\n run: |\n mkdir installed-bins\n curl -Lf https://github.com/obi1kenobi/cargo-semver-checks/releases/download/v0.41.0/cargo-semver-checks-x86_64-unknown-linux-gnu.tar.gz \\n | tar -xz --directory=./installed-bins\n echo `pwd`/installed-bins >> $GITHUB_PATH\n - run: ci/validate-version-bump.sh\n\n test:\n runs-on: ${{ matrix.os }}\n env:\n CARGO_PROFILE_DEV_DEBUG: 1\n CARGO_PROFILE_TEST_DEBUG: 1\n CARGO_INCREMENTAL: 0\n CARGO_PUBLIC_NETWORK_TESTS: 1\n # Workaround for https://github.com/rust-lang/rustup/issues/3036\n RUSTUP_WINDOWS_PATH_ADD_BIN: 0\n strategy:\n matrix:\n include:\n - name: Linux x86_64 stable\n os: ubuntu-latest\n rust: stable\n other: i686-unknown-linux-gnu\n - name: Linux x86_64 beta\n os: ubuntu-latest\n rust: beta\n other: i686-unknown-linux-gnu\n - name: Linux x86_64 nightly\n os: ubuntu-latest\n rust: nightly\n other: i686-unknown-linux-gnu\n - name: Linux aarch64 stable\n os: ubuntu-24.04-arm\n rust: stable\n other: TODO # cross-compile tests are disabled, this shouldn't matter.\n - name: Linux aarch64 nightly\n os: ubuntu-24.04-arm\n rust: nightly\n other: TODO # cross-compile tests are disabled, this shouldn't matter.\n - name: macOS aarch64 stable\n os: macos-14\n rust: stable\n other: x86_64-apple-darwin\n - name: macOS x86_64 nightly\n os: macos-13\n rust: nightly\n other: 
x86_64-apple-ios\n - name: macOS aarch64 nightly\n os: macos-14\n rust: nightly\n other: x86_64-apple-darwin\n - name: Windows x86_64 MSVC stable\n os: windows-latest\n rust: stable-msvc\n other: i686-pc-windows-msvc\n - name: Windows x86_64 gnu nightly # runs out of space while trying to link the test suite\n os: windows-latest\n rust: nightly-gnu\n other: i686-pc-windows-gnu\n name: Tests ${{ matrix.name }}\n steps:\n - uses: actions/checkout@v4\n - name: Dump Environment\n run: ci/dump-environment.sh\n # Some tests require stable. Make sure it is set to the most recent stable\n # so that we can predictably handle updates if necessary (and not randomly\n # when GitHub updates its image).\n - run: rustup update --no-self-update stable\n - run: rustup update --no-self-update ${{ matrix.rust }} && rustup default ${{ matrix.rust }}\n - run: rustup target add ${{ matrix.other }}\n if: matrix.os != 'ubuntu-24.04-arm' # cross-compile tests are disabled on ARM machines\n - run: rustup target add wasm32-unknown-unknown\n - run: rustup target add aarch64-unknown-none # need this for build-std mock tests\n if: startsWith(matrix.rust, 'nightly')\n - run: rustup component add rustc-dev llvm-tools-preview rust-docs\n if: startsWith(matrix.rust, 'nightly')\n - run: sudo apt update -y && sudo apt install lldb gcc-multilib libsecret-1-0 libsecret-1-dev -y\n if: matrix.os == 'ubuntu-latest'\n - run: rustup component add rustfmt || echo "rustfmt not available"\n - name: Configure extra test environment\n run: echo CARGO_CONTAINER_TESTS=1 >> $GITHUB_ENV\n if: matrix.os == 'ubuntu-latest'\n - run: cargo test -p cargo\n - name: Clear intermediate test output\n run: ci/clean-test-output.sh\n\n - name: gitoxide tests (all git-related tests)\n run: cargo test -p cargo git\n env:\n __CARGO_USE_GITOXIDE_INSTEAD_OF_GIT2: 1\n # The testsuite generates a huge amount of data, and fetch-smoke-test was\n # running out of disk space.\n - name: Clear test output\n run: ci/clean-test-output.sh\n\n 
# This only tests `cargo fix` because fix-proxy-mode involves some of the most\n # complicated subprocess management in Cargo.\n - name: Check operability of rustc invocation with argfile\n run: 'cargo test -p cargo --test testsuite -- fix::'\n env:\n __CARGO_TEST_FORCE_ARGFILE: 1\n - run: cargo test --workspace --exclude cargo --exclude benchsuite --exclude resolver-tests\n - name: Check benchmarks\n run: |\n # This only tests one benchmark since it can take over 10 minutes to\n # download all workspaces.\n cargo test -p benchsuite --all-targets -- cargo\n cargo check -p capture\n # The testsuite generates a huge amount of data, and fetch-smoke-test was\n # running out of disk space.\n - name: Clear benchmark output\n run: ci/clean-test-output.sh\n\n - name: Fetch smoke test\n run: ci/fetch-smoke-test.sh\n\n schema:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update stable && rustup default stable\n - run: cargo test -p cargo-util-schemas -F unstable-schema\n\n resolver:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update stable && rustup default stable\n - run: cargo test -p resolver-tests\n\n test_gitoxide:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update --no-self-update stable && rustup default stable\n - run: rustup target add i686-unknown-linux-gnu\n - run: rustup target add wasm32-unknown-unknown\n - run: sudo apt update -y && sudo apt install gcc-multilib libsecret-1-0 libsecret-1-dev -y\n - run: rustup component add rustfmt || echo "rustfmt not available"\n - run: cargo test -p cargo\n env:\n __CARGO_USE_GITOXIDE_INSTEAD_OF_GIT2: 1\n\n build_std:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: rustup update nightly && rustup default nightly\n - run: rustup component add rust-src\n - run: cargo build\n - run: cargo test -p cargo --test build-std\n env:\n CARGO_RUN_BUILD_STD_TESTS: 1\n docs:\n runs-on: ubuntu-latest\n steps:\n - 
uses: actions/checkout@v4\n - run: rustup update nightly && rustup default nightly\n - run: rustup update stable\n - run: rustup component add rust-docs\n - run: ci/validate-man.sh\n # This requires rustfmt, use stable.\n - name: Run semver-check\n run: cargo +stable run -p semver-check\n - name: Ensure intradoc links are valid\n run: cargo doc --workspace --document-private-items --no-deps\n env:\n RUSTDOCFLAGS: -D warnings\n - name: Install mdbook\n run: |\n mkdir mdbook\n curl -Lf https://github.com/rust-lang/mdBook/releases/download/v0.4.44/mdbook-v0.4.44-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=./mdbook\n echo `pwd`/mdbook >> $GITHUB_PATH\n - run: cd src/doc && mdbook build --dest-dir ../../target/doc\n - name: Run linkchecker.sh\n run: |\n cd target\n curl -sSLO https://raw.githubusercontent.com/rust-lang/rust/master/src/tools/linkchecker/linkcheck.sh\n sh linkcheck.sh --all --path ../src/doc cargo\n\n msrv:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: taiki-e/install-action@cargo-hack\n - run: cargo hack check --all-targets --rust-version --workspace --ignore-private --locked\n | dataset_sample\yaml\rust-lang_cargo\.github\workflows\main.yml | main.yml | YAML | 10,073 | 0.95 | 0.04878 | 0.093284 | vue-tools | 825 | 2025-07-03T15:32:09.990687 | Apache-2.0 | false | 2d7c703c47a6bce5c7c575c50963d3de |
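The `conclusion` job in the cargo workflow above validates its dependency results with `jq --exit-status 'all(.result == "success")'` over the serialized `needs` context. The same check, sketched in Python with hypothetical job results:

```python
import json

# Hypothetical toJson(needs) payload, mirroring the jq check
#   all(.result == "success")
needs = json.loads('{"rustfmt": {"result": "success"}, "test": {"result": "failure"}}')
all_green = all(job["result"] == "success" for job in needs.values())
assert all_green is False  # one failed dependency fails the conclusion
```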
# Publish Cargo to crates.io whenever a new tag is pushed. Tags are pushed by\n# the Rust release process (https://github.com/rust-lang/promote-release),\n# which will cause this workflow to run.\n\nname: Release\non:\n push:\n tags:\n - "0.*"\n\n# Prevent multiple releases from starting at the same time.\nconcurrency:\n group: release\n\njobs:\n crates-io:\n name: Publish on crates.io\n runs-on: ubuntu-latest\n permissions:\n contents: read\n\n # Gain access to the crates.io publishing token.\n environment:\n name: release\n\n steps:\n - name: Checkout the source code\n uses: actions/checkout@v4\n\n - name: Publish Cargo to crates.io\n run: ./publish.py\n env:\n CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_REGISTRY_TOKEN }}\n | dataset_sample\yaml\rust-lang_cargo\.github\workflows\release.yml | release.yml | YAML | 777 | 0.8 | 0 | 0.185185 | awesome-app | 333 | 2025-06-17T10:43:43.993752 | GPL-3.0 | false | fbb30c59140360e4031f4f57dcceee8e |
blank_issues_enabled: true\ncontact_links:\n - name: Question\n url: https://users.rust-lang.org\n about: Please ask and answer questions about Rust on the user forum.\n - name: Feature Request\n url: https://internals.rust-lang.org/\n about: Please discuss language feature requests on the internals forum.\n - name: Clippy Bug\n url: https://github.com/rust-lang/rust-clippy/issues/new/choose\n about: Please report Clippy bugs such as false positives in the Clippy repo.\n | dataset_sample\yaml\rust-lang_rust\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 485 | 0.8 | 0 | 0 | python-kit | 373 | 2024-09-30T10:22:07.445596 | Apache-2.0 | false | daef651f1aeb8100b33ba07259e29e2e |
# This file defines our primary CI workflow that runs on pull requests\n# and also on pushes to special branches (auto, try).\n#\n# The actual definition of the executed jobs is calculated by the\n# `src/ci/citool` crate, which\n# uses job definition data from src/ci/github-actions/jobs.yml.\n# You should primarily modify the `jobs.yml` file if you want to modify\n# what jobs are executed in CI.\n\nname: CI\non:\n push:\n branches:\n - auto\n - try\n - try-perf\n - automation/bors/try\n pull_request:\n branches:\n - "**"\n\npermissions:\n contents: read\n packages: write\n\ndefaults:\n run:\n # On Linux, macOS, and Windows, use the system-provided bash as the default\n # shell. (This should only make a difference on Windows, where the default\n # shell is PowerShell.)\n shell: bash\n\nconcurrency:\n # For a given workflow, if we push to the same branch, cancel all previous builds on that branch.\n # We add an exception for try builds (try branch) and unrolled rollup builds (try-perf), which\n # are all triggered on the same branch, but which should be able to run concurrently.\n group: ${{ github.workflow }}-${{ ((github.ref == 'refs/heads/try' || github.ref == 'refs/heads/try-perf') && github.sha) || github.ref }}\n cancel-in-progress: true\nenv:\n TOOLSTATE_REPO: "https://github.com/rust-lang-nursery/rust-toolstate"\n # This will be empty in PR jobs.\n TOOLSTATE_REPO_ACCESS_TOKEN: ${{ secrets.TOOLSTATE_REPO_ACCESS_TOKEN }}\njobs:\n # The job matrix for `calculate_matrix` is defined in src/ci/github-actions/jobs.yml.\n # It calculates which jobs should be executed, based on the data of the ${{ github }} context.\n # If you want to modify CI jobs, take a look at src/ci/github-actions/jobs.yml.\n calculate_matrix:\n name: Calculate job matrix\n runs-on: ubuntu-24.04\n outputs:\n jobs: ${{ steps.jobs.outputs.jobs }}\n run_type: ${{ steps.jobs.outputs.run_type }}\n steps:\n - name: Checkout the source code\n uses: actions/checkout@v4\n # Cache citool to make its build 
faster, as it's in the critical path.\n # The rust-cache doesn't bleed into the main `job`, so it should not affect any other\n # Rust compilation.\n - name: Cache citool\n uses: Swatinem/rust-cache@9d47c6ad4b02e050fd481d890b2ea34778fd09d6 # v2.7.8\n with:\n workspaces: src/ci/citool\n - name: Calculate the CI job matrix\n env:\n COMMIT_MESSAGE: ${{ github.event.head_commit.message }}\n run: |\n cd src/ci/citool\n CARGO_INCREMENTAL=0 cargo test\n CARGO_INCREMENTAL=0 cargo run calculate-job-matrix >> $GITHUB_OUTPUT\n id: jobs\n job:\n name: ${{ matrix.full_name }}\n needs: [ calculate_matrix ]\n runs-on: "${{ matrix.os }}"\n timeout-minutes: 360\n env:\n CI_JOB_NAME: ${{ matrix.name }}\n CI_JOB_DOC_URL: ${{ matrix.doc_url }}\n GITHUB_WORKFLOW_RUN_ID: ${{ github.run_id }}\n GITHUB_REPOSITORY: ${{ github.repository }}\n CARGO_REGISTRIES_CRATES_IO_PROTOCOL: sparse\n # commit of PR sha or commit sha. `GITHUB_SHA` is not accurate for PRs.\n HEAD_SHA: ${{ github.event.pull_request.head.sha || github.sha }}\n DOCKER_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n SCCACHE_BUCKET: rust-lang-ci-sccache2\n SCCACHE_REGION: us-west-1\n CACHE_DOMAIN: ci-caches.rust-lang.org\n continue-on-error: ${{ matrix.continue_on_error || false }}\n strategy:\n matrix:\n # Check the `calculate_matrix` job to see how is the matrix defined.\n include: ${{ fromJSON(needs.calculate_matrix.outputs.jobs) }}\n steps:\n - name: disable git crlf conversion\n run: git config --global core.autocrlf false\n\n - name: checkout the source code\n uses: actions/checkout@v4\n with:\n fetch-depth: 2\n\n # Free up disk space on Linux by removing preinstalled components that\n # we do not need. 
We do this to enable some of the less resource\n # intensive jobs to run on free runners, which however also have\n # less disk space.\n - name: free up disk space\n run: src/ci/scripts/free-disk-space.sh\n if: matrix.free_disk\n\n # Rust Log Analyzer can't currently detect the PR number of a GitHub\n # Actions build on its own, so a hint in the log message is needed to\n # point it in the right direction.\n - name: configure the PR in which the error message will be posted\n run: echo "[CI_PR_NUMBER=$num]"\n env:\n num: ${{ github.event.number }}\n if: needs.calculate_matrix.outputs.run_type == 'pr'\n\n - name: add extra environment variables\n run: src/ci/scripts/setup-environment.sh\n env:\n # Since it's not possible to merge `${{ matrix.env }}` with the other\n # variables in `job.<name>.env`, the variables defined in the matrix\n # are passed to the `setup-environment.sh` script encoded in JSON,\n # which then uses log commands to actually set them.\n EXTRA_VARIABLES: ${{ toJson(matrix.env) }}\n\n - name: ensure the channel matches the target branch\n run: src/ci/scripts/verify-channel.sh\n\n - name: collect CPU statistics\n run: src/ci/scripts/collect-cpu-stats.sh\n\n - name: show the current environment\n run: src/ci/scripts/dump-environment.sh\n\n - name: install awscli\n run: src/ci/scripts/install-awscli.sh\n\n - name: install sccache\n run: src/ci/scripts/install-sccache.sh\n\n - name: select Xcode\n run: src/ci/scripts/select-xcode.sh\n\n - name: install clang\n run: src/ci/scripts/install-clang.sh\n\n - name: install tidy\n run: src/ci/scripts/install-tidy.sh\n\n - name: install WIX\n run: src/ci/scripts/install-wix.sh\n\n - name: disable git crlf conversion\n run: src/ci/scripts/disable-git-crlf-conversion.sh\n\n - name: checkout submodules\n run: src/ci/scripts/checkout-submodules.sh\n\n - name: install MinGW\n run: src/ci/scripts/install-mingw.sh\n\n - name: install ninja\n run: src/ci/scripts/install-ninja.sh\n\n - name: enable ipv6 on Docker\n 
run: src/ci/scripts/enable-docker-ipv6.sh\n\n # Disable automatic line ending conversion (again). On Windows, when we're\n # installing dependencies, something switches the git configuration directory or\n # re-enables autocrlf. We've not tracked down the exact cause -- and there may\n # be multiple -- but this should ensure submodules are checked out with the\n # appropriate line endings.\n - name: disable git crlf conversion\n run: src/ci/scripts/disable-git-crlf-conversion.sh\n\n - name: ensure line endings are correct\n run: src/ci/scripts/verify-line-endings.sh\n\n - name: ensure backported commits are in upstream branches\n run: src/ci/scripts/verify-backported-commits.sh\n\n - name: ensure the stable version number is correct\n run: src/ci/scripts/verify-stable-version-number.sh\n\n # Show the environment just before we run the build\n # This makes it easier to diagnose problems with the above install scripts.\n - name: show the current environment\n run: src/ci/scripts/dump-environment.sh\n\n # Pre-build citool before the following step uninstalls rustup\n # Build it into the build directory, to avoid modifying sources\n - name: build citool\n run: |\n cd src/ci/citool\n CARGO_INCREMENTAL=0 CARGO_TARGET_DIR=../../../build/citool cargo build\n\n - name: run the build\n run: |\n set +e\n # Redirect stderr to stdout to avoid reordering the two streams in the GHA logs.\n src/ci/scripts/run-build-from-ci.sh 2>&1\n STATUS=$?\n set -e\n\n if [[ "$STATUS" -ne 0 && -n "$CI_JOB_DOC_URL" ]]; then\n echo "****************************************************************************"\n echo "To find more information about this job, visit the following URL:"\n echo "$CI_JOB_DOC_URL"\n echo "****************************************************************************"\n fi\n exit ${STATUS}\n env:\n AWS_ACCESS_KEY_ID: ${{ env.CACHES_AWS_ACCESS_KEY_ID }}\n AWS_SECRET_ACCESS_KEY: ${{ secrets[format('AWS_SECRET_ACCESS_KEY_{0}', env.CACHES_AWS_ACCESS_KEY_ID)] }}\n\n - name: 
create github artifacts\n run: src/ci/scripts/create-doc-artifacts.sh\n\n - name: print disk usage\n run: |\n echo "disk usage:"\n df -h\n\n - name: upload artifacts to github\n uses: actions/upload-artifact@v4\n with:\n # name is set in previous step\n name: ${{ env.DOC_ARTIFACT_NAME }}\n path: obj/artifacts/doc\n if-no-files-found: ignore\n retention-days: 5\n\n - name: upload artifacts to S3\n run: src/ci/scripts/upload-artifacts.sh\n env:\n AWS_ACCESS_KEY_ID: ${{ env.ARTIFACTS_AWS_ACCESS_KEY_ID }}\n AWS_SECRET_ACCESS_KEY: ${{ secrets[format('AWS_SECRET_ACCESS_KEY_{0}', env.ARTIFACTS_AWS_ACCESS_KEY_ID)] }}\n # Adding a condition on DEPLOY=1 or DEPLOY_ALT=1 is not needed as all deploy\n # builders *should* have the AWS credentials available. Still, explicitly\n # adding the condition is helpful as this way CI will not silently skip\n # deploying artifacts from a dist builder if the variables are misconfigured,\n # erroring about invalid credentials instead.\n if: github.event_name == 'push' || env.DEPLOY == '1' || env.DEPLOY_ALT == '1'\n\n - name: postprocess metrics into the summary\n # This step is not critical, and if some I/O problem happens, we don't want\n # to cancel the build.\n continue-on-error: true\n run: |\n if [ -f build/metrics.json ]; then\n METRICS=build/metrics.json\n elif [ -f obj/build/metrics.json ]; then\n METRICS=obj/build/metrics.json\n else\n echo "No metrics.json found"\n exit 0\n fi\n\n # Get closest bors merge commit\n PARENT_COMMIT=`git rev-list --author='bors <bors@rust-lang.org>' -n1 --first-parent HEAD^1`\n\n ./build/citool/debug/citool postprocess-metrics \\n --job-name ${CI_JOB_NAME} \\n --parent ${PARENT_COMMIT} \\n ${METRICS} >> ${GITHUB_STEP_SUMMARY}\n\n - name: upload job metrics to DataDog\n # This step is not critical, and if some I/O problem happens, we don't want\n # to cancel the build.\n continue-on-error: true\n if: needs.calculate_matrix.outputs.run_type != 'pr'\n env:\n DATADOG_API_KEY: ${{ secrets.DATADOG_API_KEY 
}}\n DD_GITHUB_JOB_NAME: ${{ matrix.full_name }}\n run: ./build/citool/debug/citool upload-build-metrics build/cpu-usage.csv\n\n # This job is used to tell bors the final status of the build, as there is no practical way to detect\n # when a workflow is successful by listening to webhooks only in our current bors implementation (homu).\n outcome:\n name: bors build finished\n runs-on: ubuntu-24.04\n needs: [ calculate_matrix, job ]\n # !cancelled() executes the job regardless of whether the previous jobs passed or failed\n if: ${{ !cancelled() && contains(fromJSON('["auto", "try"]'), needs.calculate_matrix.outputs.run_type) }}\n steps:\n - name: checkout the source code\n uses: actions/checkout@v4\n with:\n fetch-depth: 2\n # Calculate the exit status of the whole CI workflow.\n # If all dependent jobs were successful, this exits with 0 (and the outcome job continues successfully).\n # If some dependent job has failed, this exits with 1.\n - name: calculate the correct exit status\n run: jq --exit-status 'all(.result == "success" or .result == "skipped")' <<< '${{ toJson(needs) }}'\n # Publish the toolstate if an auto build succeeds (just before push to master)\n - name: publish toolstate\n run: src/ci/publish_toolstate.sh\n shell: bash\n if: needs.calculate_matrix.outputs.run_type == 'auto'\n env:\n TOOLSTATE_ISSUES_API_URL: https://api.github.com/repos/rust-lang/rust/issues\n TOOLSTATE_PUBLISH: 1\n | dataset_sample\yaml\rust-lang_rust\.github\workflows\ci.yml | ci.yml | YAML | 12,279 | 0.8 | 0.092105 | 0.233962 | vue-tools | 231 | 2024-12-16T16:51:09.877287 | GPL-3.0 | false | 28c840ffd3ec5b7ebe733abc4392fc04
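The `outcome` job in the ci.yml row above condenses the status of all dependent jobs into one jq expression over `toJson(needs)`. A minimal standalone sketch of how that check behaves (the sample `needs` JSON below is invented; in Actions it is substituted by the runner):

```shell
# Simulated value of toJson(needs): one object per dependent job.
needs='{"calculate_matrix":{"result":"success"},"job":{"result":"skipped"}}'

# --exit-status makes jq exit 0 only when the expression yields a truthy
# value, so the step fails (non-zero exit) as soon as any job's result is
# neither "success" nor "skipped". all(cond) iterates the object's values.
printf '%s' "$needs" | jq --exit-status 'all(.result == "success" or .result == "skipped")'
```

The exit code, not the printed `true`/`false`, is what makes or breaks the workflow step.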
# Automatically run `cargo update` periodically\n\n---\nname: Bump dependencies in Cargo.lock\non:\n schedule:\n # Run weekly\n - cron: '0 0 * * Sun'\n workflow_dispatch:\n # Needed so we can run it manually\npermissions:\n contents: read\ndefaults:\n run:\n shell: bash\nenv:\n # So cargo doesn't complain about unstable features\n RUSTC_BOOTSTRAP: 1\n PR_TITLE: Weekly `cargo update`\n PR_MESSAGE: |\n Automation to keep dependencies in `Cargo.lock` current.\n\n The following is the output from `cargo update`:\n COMMIT_MESSAGE: "cargo update \n\n"\n\njobs:\n not-waiting-on-bors:\n if: github.repository_owner == 'rust-lang'\n name: skip if S-waiting-on-bors\n runs-on: ubuntu-24.04\n steps:\n - env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n run: |\n # Fetch state and labels of PR\n # Or exit successfully if PR does not exist\n JSON=$(gh pr view cargo_update --repo $GITHUB_REPOSITORY --json labels,state || exit 0)\n STATE=$(echo "$JSON" | jq -r '.state')\n WAITING_ON_BORS=$(echo "$JSON" | jq '.labels[] | any(.name == "S-waiting-on-bors"; .)')\n\n # Exit with error if open and S-waiting-on-bors\n if [[ "$STATE" == "OPEN" && "$WAITING_ON_BORS" == "true" ]]; then\n exit 1\n fi\n\n update:\n if: github.repository_owner == 'rust-lang'\n name: update dependencies\n needs: not-waiting-on-bors\n runs-on: ubuntu-24.04\n steps:\n - name: checkout the source code\n uses: actions/checkout@v4\n with:\n submodules: recursive\n - name: install the bootstrap toolchain\n run: |\n # Extract the stage0 version\n TOOLCHAIN=$(awk -F= '{a[$1]=$2} END {print(a["compiler_version"] "-" a["compiler_date"])}' src/stage0)\n # Install and set as default\n rustup toolchain install --no-self-update --profile minimal $TOOLCHAIN\n rustup default $TOOLCHAIN\n\n - name: cargo update compiler & tools\n # Remove first line that always just says "Updating crates.io index"\n run: |\n echo -e "\ncompiler & tools dependencies:" >> cargo_update.log\n cargo update 2>&1 | sed '/crates.io index/d' | tee -a 
cargo_update.log\n - name: cargo update library\n run: |\n echo -e "\nlibrary dependencies:" >> cargo_update.log\n cargo update --manifest-path library/Cargo.toml 2>&1 | sed '/crates.io index/d' | tee -a cargo_update.log\n - name: cargo update rustbook\n run: |\n echo -e "\nrustbook dependencies:" >> cargo_update.log\n cargo update --manifest-path src/tools/rustbook/Cargo.toml 2>&1 | sed '/crates.io index/d' | tee -a cargo_update.log\n - name: upload Cargo.lock artifact for use in PR\n uses: actions/upload-artifact@v4\n with:\n name: Cargo-lock\n path: |\n Cargo.lock\n library/Cargo.lock\n src/tools/rustbook/Cargo.lock\n retention-days: 1\n - name: upload cargo-update log artifact for use in PR\n uses: actions/upload-artifact@v4\n with:\n name: cargo-updates\n path: cargo_update.log\n retention-days: 1\n\n pr:\n if: github.repository_owner == 'rust-lang'\n name: amend PR\n needs: update\n runs-on: ubuntu-24.04\n permissions:\n contents: write\n pull-requests: write\n steps:\n - name: checkout the source code\n uses: actions/checkout@v4\n\n - name: download Cargo.lock from update job\n uses: actions/download-artifact@v4\n with:\n name: Cargo-lock\n - name: download cargo-update log from update job\n uses: actions/download-artifact@v4\n with:\n name: cargo-updates\n\n - name: craft PR body and commit message\n run: |\n echo "${COMMIT_MESSAGE}" > commit.txt\n cat cargo_update.log >> commit.txt\n\n echo "${PR_MESSAGE}" > body.md\n echo '```txt' >> body.md\n cat cargo_update.log >> body.md\n echo '```' >> body.md\n\n - name: commit\n run: |\n git config user.name github-actions\n git config user.email github-actions@github.com\n git switch --force-create cargo_update\n git add ./Cargo.lock ./library/Cargo.lock ./src/tools/rustbook/Cargo.lock\n git commit --no-verify --file=commit.txt\n\n - name: push\n run: git push --no-verify --force --set-upstream origin cargo_update\n\n - name: edit existing open pull request\n id: edit\n # Don't fail job if we need to open new PR\n 
continue-on-error: true\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n run: |\n # Exit with error if PR is closed\n STATE=$(gh pr view cargo_update --repo $GITHUB_REPOSITORY --json state --jq '.state')\n if [[ "$STATE" != "OPEN" ]]; then\n exit 1\n fi\n\n gh pr edit cargo_update --title "${PR_TITLE}" --body-file body.md --repo $GITHUB_REPOSITORY\n\n - name: open new pull request\n # Only run if there wasn't an existing PR\n if: steps.edit.outcome != 'success'\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n run: gh pr create --title "${PR_TITLE}" --body-file body.md --repo $GITHUB_REPOSITORY\n | dataset_sample\yaml\rust-lang_rust\.github\workflows\dependencies.yml | dependencies.yml | YAML | 5,297 | 0.8 | 0.096774 | 0.092857 | react-lib | 759 | 2024-01-26T07:29:23.867982 | MIT | false | d7a90635ed14d9ad4138511d3a2b6d2a |
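The toolchain-install step in the dependencies.yml row derives the bootstrap toolchain name from the key=value pairs in `src/stage0` with a single awk pass. A rough standalone sketch (the sample file contents are made up; the real `src/stage0` carries more keys):

```shell
# Throwaway stage0-style file of key=value lines.
cat > /tmp/stage0.sample <<'EOF'
compiler_date=2024-01-01
compiler_version=1.75.0
rustfmt_version=1.75.0
EOF

# -F= splits each line on '='; array a[] collects every key, and the END
# block prints the two relevant fields joined as "<version>-<date>".
awk -F= '{a[$1]=$2} END {print(a["compiler_version"] "-" a["compiler_date"])}' /tmp/stage0.sample
# -> 1.75.0-2024-01-01
```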
# Mirror DockerHub images used by the Rust project to ghcr.io.\n# Images are available at https://github.com/orgs/rust-lang/packages.\n#\n# In some CI jobs, we pull images from ghcr.io instead of Docker Hub because\n# Docker Hub has a rate limit, while ghcr.io doesn't.\n# Those images are pushed to ghcr.io by this job.\n#\n# While Docker Hub rate limit *shouldn't* be an issue on GitHub Actions,\n# it certainly is for AWS codebuild.\n#\n# Note that authenticating to DockerHub or other registries isn't possible\n# for PR jobs, because forks can't access secrets.\n# That's why we use ghcr.io: it has no rate limit and it doesn't require authentication.\n\nname: GHCR image mirroring\n\non:\n workflow_dispatch:\n schedule:\n # Run daily at midnight UTC\n - cron: '0 0 * * *'\n\njobs:\n mirror:\n name: DockerHub mirror\n runs-on: ubuntu-24.04\n if: github.repository == 'rust-lang/rust'\n permissions:\n # Needed to write to the ghcr.io registry\n packages: write\n steps:\n - uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Log in to registry\n run: echo "${{ secrets.GITHUB_TOKEN }}" | docker login ghcr.io -u ${{ github.repository_owner }} --password-stdin\n\n # Download crane in the current directory.\n # We use crane because it copies the docker image for all the architectures available in\n # DockerHub for the image.\n # Learn more about crane at\n # https://github.com/google/go-containerregistry/blob/main/cmd/crane/README.md\n - name: Download crane\n run: |\n curl -sL "https://github.com/google/go-containerregistry/releases/download/${VERSION}/go-containerregistry_${OS}_${ARCH}.tar.gz" | tar -xzf -\n env:\n VERSION: v0.20.2\n OS: Linux\n ARCH: x86_64\n\n - name: Mirror DockerHub\n run: |\n # List of DockerHub images to mirror to ghcr.io\n images=(\n # Mirrored because used by the mingw-check-tidy, which doesn't cache Docker images\n "ubuntu:22.04"\n # Mirrored because used by all linux CI jobs, including mingw-check-tidy\n 
"moby/buildkit:buildx-stable-1"\n # Mirrored because used when CI is running inside a Docker container\n "alpine:3.4"\n # Mirrored because used by dist-x86_64-linux\n "centos:7"\n )\n\n # Mirror each image from DockerHub to ghcr.io\n for img in "${images[@]}"; do\n echo "Mirroring ${img}..."\n # Remove namespace from the image if any.\n # E.g. "moby/buildkit:buildx-stable-1" becomes "buildkit:buildx-stable-1"\n dest_image=$(echo "${img}" | cut -d'/' -f2-)\n ./crane copy \\n "docker.io/${img}" \\n "ghcr.io/${{ github.repository_owner }}/${dest_image}"\n done\n | dataset_sample\yaml\rust-lang_rust\.github\workflows\ghcr.yml | ghcr.yml | YAML | 2,850 | 0.95 | 0.106667 | 0.411765 | node-utils | 624 | 2024-11-02T23:23:08.871784 | GPL-3.0 | false | bfb0182948f745b8d45b63c6dc715b18 |
# Workflow that runs after a merge to master, analyses changes in test executions\n# and posts the result to the merged PR.\n\nname: Post merge analysis\n\non:\n push:\n branches:\n - master\n\njobs:\n analysis:\n runs-on: ubuntu-24.04\n if: github.repository == 'rust-lang/rust'\n permissions:\n pull-requests: write\n steps:\n - uses: actions/checkout@v4\n with:\n # Make sure that we have enough commits to find the parent merge commit.\n # Since all merges should be through merge commits, fetching two commits\n # should be enough to get the parent bors merge commit.\n fetch-depth: 2\n - name: Perform analysis and send PR\n env:\n GH_TOKEN: ${{ github.token }}\n run: |\n # Get closest bors merge commit\n PARENT_COMMIT=`git rev-list --author='bors <bors@rust-lang.org>' -n1 --first-parent HEAD^1`\n echo "Parent: ${PARENT_COMMIT}"\n\n # Find PR for the current commit\n HEAD_PR=`gh pr list --search "${{ github.sha }}" --state merged --json number --jq '.[0].number'`\n echo "HEAD: ${{ github.sha }} (#${HEAD_PR})"\n\n cd src/ci/citool\n\n printf "<details>\n<summary>What is this?</summary>\n" >> output.log\n printf "This is an experimental post-merge analysis report that shows differences in test outcomes between the merged PR and its parent PR.\n" >> output.log\n printf "</details>\n\n" >> output.log\n\n cargo run --release post-merge-report ${PARENT_COMMIT} ${{ github.sha }} >> output.log\n\n cat output.log\n\n gh pr comment ${HEAD_PR} -F output.log\n | dataset_sample\yaml\rust-lang_rust\.github\workflows\post-merge.yml | post-merge.yml | YAML | 1,644 | 0.8 | 0.043478 | 0.189189 | awesome-app | 470 | 2025-06-18T12:03:55.502388 | Apache-2.0 | false | c603c8a9e4373aa3d0bc05d1304e37b8 |
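The post-merge.yml row locates "the closest bors merge commit" by walking first-parent history from `HEAD^1` and stopping at the first commit authored by bors. A self-contained sketch against a throwaway repository (the commit history below is invented to stand in for the real one):

```shell
set -e
repo=$(mktemp -d); cd "$repo"; git init -q

# Fabricate: bors merge, a regular commit, then the current bors merge (HEAD).
git -c user.name=bors -c user.email=bors@rust-lang.org commit -q --allow-empty -m "merge 1"
git -c user.name=dev  -c user.email=dev@example.com   commit -q --allow-empty -m "work"
git -c user.name=bors -c user.email=bors@rust-lang.org commit -q --allow-empty -m "merge 2"

# HEAD^1 is the first parent of HEAD; --first-parent restricts the walk to
# the mainline, and -n1 --author picks the nearest bors commit on it.
git rev-list --author='bors <bors@rust-lang.org>' -n1 --first-parent HEAD^1
```

Here that prints the hash of "merge 1", the parent bors merge the analysis compares against.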
# FIXME re-enable once https://github.com/rust-lang/rust/issues/134863 is fixed.\n# task:\n# name: freebsd\n# freebsd_instance:\n# image: freebsd-13-2-release-amd64\n# setup_rust_script:\n# - pkg install -y git-tiny binutils\n# - curl https://sh.rustup.rs -sSf --output rustup.sh\n# - sh rustup.sh --default-toolchain none -y --profile=minimal\n# target_cache:\n# folder: build/cg_clif\n# prepare_script:\n# - . $HOME/.cargo/env\n# - ./y.sh prepare\n# test_script:\n# - . $HOME/.cargo/env\n# # Disabling incr comp reduces cache size and incr comp doesn't save as much\n# # on CI anyway.\n# - export CARGO_BUILD_INCREMENTAL=false\n# # Skip rand as it fails on FreeBSD due to rust-random/rand#1355\n# - ./y.sh test --skip-test test.rust-random/rand\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_cranelift\.cirrus.yml | .cirrus.yml | YAML | 791 | 0.8 | 0 | 1 | node-utils | 700 | 2024-04-11T09:54:16.349085 | Apache-2.0 | false | 26830effd6799205c3951660412d504d |
name: Abi-cafe\n\non:\n - push\n - pull_request\n\npermissions: {}\n\njobs:\n abi_cafe:\n runs-on: ${{ matrix.os }}\n timeout-minutes: 30\n concurrency:\n group: ${{ github.workflow }}-${{ github.ref }}-${{ matrix.os }}-${{ matrix.env.TARGET_TRIPLE }}\n cancel-in-progress: true\n\n defaults:\n run:\n shell: bash\n\n strategy:\n fail-fast: true\n matrix:\n include:\n - os: ubuntu-latest\n env:\n TARGET_TRIPLE: x86_64-unknown-linux-gnu\n - os: ubuntu-24.04-arm\n env:\n TARGET_TRIPLE: aarch64-unknown-linux-gnu\n - os: macos-13\n env:\n TARGET_TRIPLE: x86_64-apple-darwin\n - os: macos-latest\n env:\n TARGET_TRIPLE: aarch64-apple-darwin\n - os: windows-latest\n env:\n TARGET_TRIPLE: x86_64-pc-windows-msvc\n # FIXME Currently hangs. Re-enable once this is fixed or abi-cafe adds a timeout.\n #- os: windows-latest\n # env:\n # TARGET_TRIPLE: x86_64-pc-windows-gnu\n\n steps:\n - uses: actions/checkout@v4\n\n - name: CPU features\n if: matrix.os == 'ubuntu-latest'\n run: cat /proc/cpuinfo\n\n - name: Cache cargo target dir\n uses: actions/cache@v4\n with:\n path: build/cg_clif\n key: ${{ runner.os }}-${{ matrix.env.TARGET_TRIPLE }}-cargo-build-target-${{ hashFiles('rust-toolchain', '**/Cargo.lock') }}\n\n - name: Set MinGW as the default toolchain\n if: matrix.env.TARGET_TRIPLE == 'x86_64-pc-windows-gnu'\n run: rustup set default-host x86_64-pc-windows-gnu\n\n - name: Build\n run: ./y.sh build --sysroot none\n\n - name: Test abi-cafe\n env:\n TARGET_TRIPLE: ${{ matrix.env.TARGET_TRIPLE }}\n run: ./y.sh abi-cafe\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_cranelift\.github\workflows\abi-cafe.yml | abi-cafe.yml | YAML | 1,809 | 0.8 | 0.029412 | 0.070175 | python-kit | 179 | 2024-06-26T05:35:59.051035 | BSD-3-Clause | false | d8b5add4e81c68b59f952b1adb5764ad |
name: Security audit\non:\n workflow_dispatch:\n schedule:\n - cron: '0 10 * * 1' # every monday at 10:00 UTC\npermissions:\n issues: write\n checks: write\njobs:\n audit:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - run: |\n sed -i 's/components.*/components = []/' rust-toolchain\n - uses: rustsec/audit-check@v1.4.1\n with:\n token: ${{ secrets.GITHUB_TOKEN }}\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_cranelift\.github\workflows\audit.yml | audit.yml | YAML | 424 | 0.8 | 0 | 0 | node-utils | 629 | 2025-04-15T11:01:22.355215 | Apache-2.0 | false | 28ca7968c60157938cbe1ed7af1d1220 |
name: CI\n\non:\n - push\n - pull_request\n\ndefaults:\n run:\n shell: bash\n\npermissions: {}\n\nenv:\n # Disabling incr comp reduces cache size and incr comp doesn't save as much\n # on CI anyway.\n CARGO_BUILD_INCREMENTAL: false\n # Rust's CI denies warnings. Deny them here too to ensure subtree syncs don't\n # fail because of warnings.\n RUSTFLAGS: "-Dwarnings"\n\njobs:\n rustfmt:\n runs-on: ubuntu-latest\n timeout-minutes: 10\n\n steps:\n - uses: actions/checkout@v4\n\n - name: Avoid installing rustc-dev\n run: |\n sed -i 's/components.*/components = ["rustfmt"]/' rust-toolchain\n rustfmt -v\n\n - name: Rustfmt\n run: |\n cargo fmt --check\n rustfmt --check build_system/main.rs\n rustfmt --check example/*\n rustfmt --check scripts/*.rs\n\n\n test:\n runs-on: ${{ matrix.os }}\n timeout-minutes: 60\n\n env:\n CG_CLIF_EXPENSIVE_CHECKS: 1\n\n strategy:\n fail-fast: false\n matrix:\n include:\n - os: ubuntu-latest\n env:\n TARGET_TRIPLE: x86_64-unknown-linux-gnu\n - os: ubuntu-24.04-arm\n env:\n TARGET_TRIPLE: aarch64-unknown-linux-gnu\n - os: macos-13\n env:\n TARGET_TRIPLE: x86_64-apple-darwin\n - os: macos-latest\n env:\n TARGET_TRIPLE: aarch64-apple-darwin\n - os: ubuntu-latest\n env:\n TARGET_TRIPLE: s390x-unknown-linux-gnu\n apt_deps: gcc-s390x-linux-gnu qemu-user\n - os: ubuntu-latest\n env:\n TARGET_TRIPLE: riscv64gc-unknown-linux-gnu\n apt_deps: gcc-riscv64-linux-gnu qemu-user\n - os: windows-latest\n env:\n TARGET_TRIPLE: x86_64-pc-windows-msvc\n - os: windows-latest\n env:\n TARGET_TRIPLE: x86_64-pc-windows-gnu\n\n steps:\n - uses: actions/checkout@v4\n\n - name: CPU features\n if: matrix.os == 'ubuntu-latest'\n run: cat /proc/cpuinfo\n\n - name: Cache cargo target dir\n uses: actions/cache@v4\n with:\n path: build/cg_clif\n key: ${{ runner.os }}-${{ matrix.env.TARGET_TRIPLE }}-cargo-build-target-${{ hashFiles('rust-toolchain', '**/Cargo.lock') }}\n\n - name: Set MinGW as the default toolchain\n if: matrix.os == 'windows-latest' && 
matrix.env.TARGET_TRIPLE == 'x86_64-pc-windows-gnu'\n run: rustup set default-host x86_64-pc-windows-gnu\n\n - name: Install toolchain and emulator\n if: matrix.apt_deps != null\n run: |\n sudo apt-get update\n sudo apt-get install -y ${{ matrix.apt_deps }}\n\n - name: Prepare dependencies\n run: ./y.sh prepare\n\n - name: Build\n run: ./y.sh build --sysroot none\n\n - name: Test\n env:\n TARGET_TRIPLE: ${{ matrix.env.TARGET_TRIPLE }}\n run: ./y.sh test\n\n - name: Install LLVM standard library\n run: rustup target add ${{ matrix.env.TARGET_TRIPLE }}\n\n # This is roughly config rust-lang/rust uses for testing\n - name: Test with LLVM sysroot\n env:\n TARGET_TRIPLE: ${{ matrix.env.TARGET_TRIPLE }}\n run: ./y.sh test --sysroot llvm --no-unstable-features\n\n\n # This job doesn't use cg_clif in any way. It checks that all cg_clif tests work with cg_llvm too.\n test_llvm:\n runs-on: ubuntu-latest\n timeout-minutes: 60\n\n steps:\n - uses: actions/checkout@v4\n\n - name: CPU features\n run: cat /proc/cpuinfo\n\n - name: Prepare dependencies\n run: ./y.sh prepare\n\n - name: Disable JIT tests\n run: |\n sed -i 's/jit./#jit./' config.txt\n\n - name: Test\n env:\n TARGET_TRIPLE: x86_64-unknown-linux-gnu\n run: ./y.sh test --use-backend llvm\n\n bench:\n runs-on: ubuntu-latest\n timeout-minutes: 60\n\n steps:\n - uses: actions/checkout@v4\n\n - name: CPU features\n run: cat /proc/cpuinfo\n\n - name: Cache cargo target dir\n uses: actions/cache@v4\n with:\n path: build/cg_clif\n key: ${{ runner.os }}-x86_64-unknown-linux-gnu-cargo-build-target-${{ hashFiles('rust-toolchain', '**/Cargo.lock') }}\n\n - name: Install hyperfine\n run: |\n sudo apt update\n sudo apt install -y hyperfine\n\n - name: Build\n run: ./y.sh build --sysroot none\n\n - name: Benchmark\n run: ./y.sh bench\n\n\n dist:\n runs-on: ${{ matrix.os }}\n timeout-minutes: 60\n\n strategy:\n fail-fast: false\n matrix:\n include:\n # Intentionally using an older ubuntu version to lower the glibc requirements of the 
distributed cg_clif\n - os: ubuntu-22.04\n env:\n TARGET_TRIPLE: x86_64-unknown-linux-gnu\n - os: ubuntu-24.04-arm\n env:\n TARGET_TRIPLE: aarch64-unknown-linux-gnu\n - os: macos-13\n env:\n TARGET_TRIPLE: x86_64-apple-darwin\n - os: macos-latest\n env:\n TARGET_TRIPLE: aarch64-apple-darwin\n - os: windows-latest\n env:\n TARGET_TRIPLE: x86_64-pc-windows-msvc\n - os: windows-latest\n env:\n TARGET_TRIPLE: x86_64-pc-windows-gnu\n\n steps:\n - uses: actions/checkout@v4\n\n - name: Cache cargo target dir\n uses: actions/cache@v4\n with:\n path: build/cg_clif\n key: ${{ runner.os }}-${{ matrix.env.TARGET_TRIPLE }}-dist-cargo-build-target-${{ hashFiles('rust-toolchain', '**/Cargo.lock') }}\n\n - name: Set MinGW as the default toolchain\n if: matrix.os == 'windows-latest' && matrix.env.TARGET_TRIPLE == 'x86_64-pc-windows-gnu'\n run: rustup set default-host x86_64-pc-windows-gnu\n\n - name: Build backend\n run: ./y.sh build --sysroot none\n\n - name: Build sysroot\n run: ./y.sh build\n\n - name: Package prebuilt cg_clif\n run: tar cvfJ cg_clif.tar.xz dist\n\n - name: Upload prebuilt cg_clif\n uses: actions/upload-artifact@v4\n with:\n name: cg_clif-${{ matrix.env.TARGET_TRIPLE }}\n path: cg_clif.tar.xz\n\n release:\n runs-on: ubuntu-latest\n timeout-minutes: 10\n if: ${{ github.ref == 'refs/heads/master' }}\n needs: [rustfmt, test, bench, dist]\n\n permissions:\n contents: write # for creating the dev tag and release\n\n concurrency:\n group: release-dev\n cancel-in-progress: true\n\n steps:\n - uses: actions/checkout@v4\n\n - name: Download all built artifacts\n uses: actions/download-artifact@v4\n with:\n path: artifacts/\n\n - run: |\n ls -R artifacts/\n mkdir release/\n pushd artifacts/\n for dir in *; do\n mv $dir/cg_clif.tar.xz ../release/$dir.tar.xz\n rmdir $dir/ # verify $dir is empty\n done\n popd\n rmdir artifacts/ # verify all artifacts are represented in release/\n ls -R release/\n\n - name: Publish release\n env:\n GH_TOKEN: ${{ github.token }}\n run: |\n gh 
release delete --cleanup-tag -y dev || true\n gh release create --target $GITHUB_SHA --prerelease dev release/*\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_cranelift\.github\workflows\main.yml | main.yml | YAML | 7,074 | 0.95 | 0.02974 | 0.032407 | vue-tools | 713 | 2025-06-03T11:48:39.468306 | BSD-3-Clause | false | b0355b777c1b934c077a499d55128229 |
name: Various rustc tests\n\non:\n - push\n\npermissions: {}\n\njobs:\n bootstrap_rustc:\n runs-on: ubuntu-latest\n timeout-minutes: 60\n\n steps:\n - uses: actions/checkout@v4\n\n - name: CPU features\n run: cat /proc/cpuinfo\n\n - name: Cache cargo target dir\n uses: actions/cache@v4\n with:\n path: build/cg_clif\n key: ${{ runner.os }}-rustc-test-cargo-build-target-${{ hashFiles('rust-toolchain', 'Cargo.lock') }}\n\n - name: Test\n run: ./scripts/test_bootstrap.sh\n\n\n rustc_test_suite:\n runs-on: ubuntu-latest\n timeout-minutes: 60\n\n steps:\n - uses: actions/checkout@v4\n\n - name: CPU features\n run: cat /proc/cpuinfo\n\n - name: Cache cargo target dir\n uses: actions/cache@v4\n with:\n path: build/cg_clif\n key: ${{ runner.os }}-rustc-test-cargo-build-target-${{ hashFiles('rust-toolchain', 'Cargo.lock') }}\n\n - name: Install ripgrep\n run: |\n sudo apt update\n sudo apt install -y ripgrep\n\n - name: Test\n run: ./scripts/test_rustc_tests.sh\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_cranelift\.github\workflows\rustc.yml | rustc.yml | YAML | 1,051 | 0.7 | 0 | 0 | vue-tools | 338 | 2024-08-22T13:58:16.951086 | BSD-3-Clause | false | e571c961e6f175b59d0dd2bd68ae28f6 |
name: CI\n\non:\n push:\n branches:\n - master\n pull_request:\n\npermissions:\n contents: read\n\nenv:\n # Enable backtraces for easier debugging\n RUST_BACKTRACE: 1\n\njobs:\n build:\n runs-on: ubuntu-24.04\n\n strategy:\n fail-fast: false\n matrix:\n libgccjit_version:\n - { gcc: "gcc-15.deb" }\n - { gcc: "gcc-15-without-int128.deb" }\n commands: [\n "--std-tests",\n # FIXME: re-enable asm tests when GCC can emit in the right syntax.\n # "--asm-tests",\n "--test-libcore",\n "--extended-rand-tests",\n "--extended-regex-example-tests",\n "--extended-regex-tests",\n "--test-successful-rustc --nb-parts 2 --current-part 0",\n "--test-successful-rustc --nb-parts 2 --current-part 1",\n "--projects",\n ]\n\n steps:\n - uses: actions/checkout@v4\n\n # `rustup show` installs from rust-toolchain.toml\n - name: Setup rust toolchain\n run: rustup show\n\n - name: Setup rust cache\n uses: Swatinem/rust-cache@v2\n\n - name: Install packages\n # `llvm-14-tools` is needed to install the `FileCheck` binary which is used for asm tests.\n run: sudo apt-get install ninja-build ripgrep llvm-14-tools\n\n - name: Install rustfmt & clippy\n run: rustup component add rustfmt clippy\n\n - name: Download artifact\n run: curl -LO https://github.com/rust-lang/gcc/releases/latest/download/${{ matrix.libgccjit_version.gcc }}\n\n - name: Setup path to libgccjit\n run: |\n sudo dpkg --force-overwrite -i ${{ matrix.libgccjit_version.gcc }}\n echo 'gcc-path = "/usr/lib/"' > config.toml\n\n - name: Set env\n run: |\n echo "workspace="$GITHUB_WORKSPACE >> $GITHUB_ENV\n echo "LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n echo "LD_LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n\n #- name: Cache rust repository\n ## We only clone the rust repository for rustc tests\n #if: ${{ contains(matrix.commands, 'rustc') }}\n #uses: actions/cache@v3\n #id: cache-rust-repository\n #with:\n #path: rust\n #key: ${{ runner.os }}-packages-${{ hashFiles('rust/.git/HEAD') }}\n\n - name: Build\n run: |\n ./y.sh prepare 
--only-libcore\n ./y.sh build --sysroot\n ./y.sh test --mini-tests\n cargo test\n\n - name: Run y.sh cargo build\n run: |\n ./y.sh cargo build --manifest-path tests/hello-world/Cargo.toml\n\n - name: Clean\n run: |\n ./y.sh clean all\n\n - name: Prepare dependencies\n run: |\n git config --global user.email "user@example.com"\n git config --global user.name "User"\n ./y.sh prepare\n\n - name: Run tests\n run: |\n ./y.sh test --release --clean --build-sysroot ${{ matrix.commands }}\n\n - name: Check formatting\n run: ./y.sh fmt --check\n\n - name: clippy\n run: |\n cargo clippy --all-targets -- -D warnings\n cargo clippy --all-targets --features master -- -D warnings\n\n duplicates:\n runs-on: ubuntu-24.04\n steps:\n - uses: actions/checkout@v4\n - run: python tools/check_intrinsics_duplicates.py\n\n build_system:\n runs-on: ubuntu-24.04\n steps:\n - uses: actions/checkout@v4\n - name: Test build system\n run: |\n cd build_system\n cargo test\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n success:\n needs: [build, duplicates, build_system]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. 
`if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_gcc\.github\workflows\ci.yml | ci.yml | YAML | 4,255 | 0.8 | 0.069444 | 0.183333 | awesome-app | 160 | 2024-04-18T22:22:35.101470 | BSD-3-Clause | false | b492b37cc79a536c00e3e21c3508150d |
# TODO: refactor to avoid duplication with the ci.yml file.\nname: Failures\n\non:\n push:\n branches:\n - master\n pull_request:\n\npermissions:\n contents: read\n\nenv:\n # Enable backtraces for easier debugging\n RUST_BACKTRACE: 1\n\njobs:\n build:\n runs-on: ubuntu-24.04\n\n strategy:\n fail-fast: false\n matrix:\n libgccjit_version:\n - gcc: "libgccjit.so"\n artifacts_branch: "master"\n - gcc: "libgccjit_without_int128.so"\n artifacts_branch: "master-without-128bit-integers"\n - gcc: "libgccjit12.so"\n artifacts_branch: "gcc12"\n extra: "--no-default-features"\n # FIXME(antoyo): we need to set GCC_EXEC_PREFIX so that the linker can find the linker plugin.\n # Not sure why it's not found otherwise.\n env_extra: "TEST_FLAGS='-Cpanic=abort -Zpanic-abort-tests' GCC_EXEC_PREFIX=/usr/lib/gcc/"\n\n steps:\n - uses: actions/checkout@v4\n\n # `rustup show` installs from rust-toolchain.toml\n - name: Setup rust toolchain\n run: rustup show\n\n - name: Setup rust cache\n uses: Swatinem/rust-cache@v2\n\n - name: Install packages\n run: sudo apt-get install ninja-build ripgrep\n\n - name: Install libgccjit12\n if: matrix.libgccjit_version.gcc == 'libgccjit12.so'\n run: sudo apt-get install libgccjit-12-dev\n\n - name: Setup path to libgccjit\n if: matrix.libgccjit_version.gcc == 'libgccjit12.so'\n run: |\n echo 'gcc-path = "/usr/lib/gcc/x86_64-linux-gnu/12"' > config.toml\n echo "LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/12" >> $GITHUB_ENV\n echo "LD_LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/12" >> $GITHUB_ENV\n\n - name: Download artifact\n if: matrix.libgccjit_version.gcc != 'libgccjit12.so'\n run: curl -LO https://github.com/rust-lang/gcc/releases/latest/download/gcc-15.deb\n\n - name: Setup path to libgccjit\n if: matrix.libgccjit_version.gcc != 'libgccjit12.so'\n run: |\n sudo dpkg --force-overwrite -i gcc-15.deb\n echo 'gcc-path = "/usr/lib"' > config.toml\n echo "LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n echo "LD_LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n\n - name: Set 
env\n run: |\n echo "workspace="$GITHUB_WORKSPACE >> $GITHUB_ENV\n\n #- name: Cache rust repository\n #uses: actions/cache@v3\n #id: cache-rust-repository\n #with:\n #path: rust\n #key: ${{ runner.os }}-packages-${{ hashFiles('rust/.git/HEAD') }}\n\n - name: Git config\n run: |\n git config --global user.email "user@example.com"\n git config --global user.name "User"\n\n - name: Prepare dependencies\n if: matrix.libgccjit_version.gcc == 'libgccjit12.so'\n run: ./y.sh prepare --libgccjit12-patches\n\n - name: Prepare dependencies\n if: matrix.libgccjit_version.gcc != 'libgccjit12.so'\n run: ./y.sh prepare\n\n - name: Run tests\n # TODO: re-enable those tests for libgccjit 12.\n if: matrix.libgccjit_version.gcc != 'libgccjit12.so'\n id: tests\n run: |\n ${{ matrix.libgccjit_version.env_extra }} ./y.sh test --release --clean --build-sysroot --test-failing-rustc ${{ matrix.libgccjit_version.extra }} 2>&1 | tee output_log\n rg --text "test result" output_log >> $GITHUB_STEP_SUMMARY\n\n - name: Run failing ui pattern tests for ICE\n # TODO: re-enable those tests for libgccjit 12.\n if: matrix.libgccjit_version.gcc != 'libgccjit12.so'\n id: ui-tests\n run: |\n ${{ matrix.libgccjit_version.env_extra }} ./y.sh test --release --test-failing-ui-pattern-tests ${{ matrix.libgccjit_version.extra }} 2>&1 | tee output_log_ui\n if grep -q "the compiler unexpectedly panicked" output_log_ui; then\n echo "Error: 'the compiler unexpectedly panicked' found in output logs. CI Error!!"\n exit 1\n fi\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n success_failures:\n needs: [build]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. 
We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_gcc\.github\workflows\failures.yml | failures.yml | YAML | 4,785 | 0.8 | 0.143939 | 0.198198 | python-kit | 864 | 2025-05-28T04:47:35.999309 | MIT | false | 9e73070d6ab273caf61c7aaca9fc31e9 |
name: CI libgccjit 12\n\non:\n push:\n branches:\n - master\n pull_request:\n\npermissions:\n contents: read\n\nenv:\n # Enable backtraces for easier debugging\n RUST_BACKTRACE: 1\n TEST_FLAGS: "-Cpanic=abort -Zpanic-abort-tests"\n # FIXME(antoyo): we need to set GCC_EXEC_PREFIX so that the linker can find the linker plugin.\n # Not sure why it's not found otherwise.\n GCC_EXEC_PREFIX: /usr/lib/gcc/\n\njobs:\n build:\n runs-on: ubuntu-24.04\n\n strategy:\n fail-fast: false\n matrix:\n commands: [\n "--mini-tests",\n "--std-tests",\n # FIXME: re-enable asm tests when GCC can emit in the right syntax.\n # "--asm-tests",\n "--test-libcore",\n "--test-successful-rustc --nb-parts 2 --current-part 0",\n "--test-successful-rustc --nb-parts 2 --current-part 1",\n ]\n\n steps:\n - uses: actions/checkout@v4\n\n # `rustup show` installs from rust-toolchain.toml\n - name: Setup rust toolchain\n run: rustup show\n\n - name: Setup rust cache\n uses: Swatinem/rust-cache@v2\n\n - name: Install packages\n # `llvm-14-tools` is needed to install the `FileCheck` binary which is used for asm tests.\n run: sudo apt-get install ninja-build ripgrep llvm-14-tools libgccjit-12-dev\n\n - name: Setup path to libgccjit\n run: echo 'gcc-path = "/usr/lib/gcc/x86_64-linux-gnu/12"' > config.toml\n\n - name: Set env\n run: |\n echo "workspace="$GITHUB_WORKSPACE >> $GITHUB_ENV\n echo "LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/12" >> $GITHUB_ENV\n echo "LD_LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/12" >> $GITHUB_ENV\n\n #- name: Cache rust repository\n ## We only clone the rust repository for rustc tests\n #if: ${{ contains(matrix.commands, 'rustc') }}\n #uses: actions/cache@v3\n #id: cache-rust-repository\n #with:\n #path: rust\n #key: ${{ runner.os }}-packages-${{ hashFiles('rust/.git/HEAD') }}\n\n - name: Build\n run: |\n ./y.sh prepare --only-libcore --libgccjit12-patches\n ./y.sh build --no-default-features --sysroot-panic-abort\n # Uncomment when we no longer need to remove global 
variables.\n #./y.sh build --sysroot --no-default-features --sysroot-panic-abort\n #cargo test --no-default-features\n #./y.sh clean all\n\n #- name: Prepare dependencies\n #run: |\n #git config --global user.email "user@example.com"\n #git config --global user.name "User"\n #./y.sh prepare --libgccjit12-patches\n\n #- name: Add more failing tests for GCC 12\n #run: cat tests/failing-ui-tests12.txt >> tests/failing-ui-tests.txt\n\n #- name: Run tests\n #run: |\n #./y.sh test --release --clean --build-sysroot ${{ matrix.commands }} --no-default-features\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n success_gcc12:\n needs: [build]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_gcc\.github\workflows\gcc12.yml | gcc12.yml | YAML | 3,699 | 0.8 | 0.101852 | 0.417582 | vue-tools | 102 | 2024-06-10T14:24:26.197310 | MIT | false | 473390f24c69bc845a83bbf8b1777a4e |
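Several "Set env" steps in the workflows above persist variables across steps by appending `KEY=value` lines to the file named by `$GITHUB_ENV`; the runner re-reads that file before each subsequent step. A Python sketch of that accumulation (the file handling here is our illustration of the mechanism, not the runner's actual implementation):

```python
import os
import tempfile

# Emulate the $GITHUB_ENV mechanism: each appended KEY=value line
# becomes an environment variable for later steps.
env_file = tempfile.NamedTemporaryFile("w+", delete=False)
env_file.write("LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/12\n")
env_file.write("LD_LIBRARY_PATH=/usr/lib/gcc/x86_64-linux-gnu/12\n")
env_file.flush()

# A later "step" sees the accumulated variables.
with open(env_file.name) as f:
    for line in f:
        key, _, value = line.strip().partition("=")
        os.environ[key] = value

print(os.environ["LIBRARY_PATH"])  # /usr/lib/gcc/x86_64-linux-gnu/12
```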
# TODO: check if qemu-user-static-binfmt is needed (perhaps to run some tests since it probably calls exec).\n\nname: m68k CI\n\non:\n push:\n branches:\n - master\n pull_request:\n\npermissions:\n contents: read\n\nenv:\n # Enable backtraces for easier debugging\n RUST_BACKTRACE: 1\n # TODO: remove when confish.sh is removed.\n OVERWRITE_TARGET_TRIPLE: m68k-unknown-linux-gnu\n\njobs:\n build:\n runs-on: ubuntu-24.04\n\n strategy:\n fail-fast: false\n matrix:\n commands: [\n "--std-tests",\n # TODO(antoyo): fix those on m68k.\n #"--test-libcore",\n #"--extended-rand-tests",\n #"--extended-regex-example-tests",\n #"--extended-regex-tests",\n #"--test-successful-rustc --nb-parts 2 --current-part 0",\n #"--test-successful-rustc --nb-parts 2 --current-part 1",\n #"--test-failing-rustc",\n ]\n\n steps:\n - uses: actions/checkout@v4\n\n # `rustup show` installs from rust-toolchain.toml\n - name: Setup rust toolchain\n run: rustup show\n\n - name: Setup rust cache\n uses: Swatinem/rust-cache@v2\n\n - name: Install packages\n run: |\n sudo apt-get update\n sudo apt-get install qemu-system qemu-user-static\n\n - name: Download artifact\n run: curl -LO https://github.com/cross-cg-gcc-tools/cross-gcc/releases/latest/download/gcc-m68k-15.deb\n\n - name: Download VM artifact\n run: curl -LO https://github.com/cross-cg-gcc-tools/vms/releases/latest/download/debian-m68k.img\n\n - name: Setup path to libgccjit\n run: |\n sudo dpkg -i gcc-m68k-15.deb\n echo 'gcc-path = "/usr/lib/"' > config.toml\n\n - name: Set env\n run: |\n echo "workspace="$GITHUB_WORKSPACE >> $GITHUB_ENV\n echo "LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n echo "LD_LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n\n #- name: Cache rust repository\n ## We only clone the rust repository for rustc tests\n #if: ${{ contains(matrix.commands, 'rustc') }}\n #uses: actions/cache@v3\n #id: cache-rust-repository\n #with:\n #path: rust\n #key: ${{ runner.os }}-packages-${{ hashFiles('rust/.git/HEAD') }}\n\n - name: Prepare VM\n run: |\n 
mkdir vm\n sudo mount debian-m68k.img vm\n sudo cp $(which qemu-m68k-static) vm/usr/bin/\n\n - name: Build sample project with target defined as JSON spec\n run: |\n ./y.sh prepare --only-libcore --cross\n ./y.sh build --sysroot --features compiler_builtins/no-f16-f128 --target-triple m68k-unknown-linux-gnu --target ${{ github.workspace }}/target_specs/m68k-unknown-linux-gnu.json\n ./y.sh cargo build --manifest-path=./tests/hello-world/Cargo.toml --target ${{ github.workspace }}/target_specs/m68k-unknown-linux-gnu.json\n ./y.sh clean all\n\n - name: Build\n run: |\n ./y.sh prepare --only-libcore --cross\n ./y.sh build --sysroot --features compiler_builtins/no-f16-f128 --target-triple m68k-unknown-linux-gnu\n ./y.sh test --mini-tests\n CG_GCC_TEST_TARGET=m68k-unknown-linux-gnu cargo test\n ./y.sh clean all\n\n - name: Prepare dependencies\n run: |\n git config --global user.email "user@example.com"\n git config --global user.name "User"\n ./y.sh prepare --cross\n\n - name: Run tests\n run: |\n ./y.sh test --release --clean --build-sysroot --sysroot-features compiler_builtins/no-f16-f128 ${{ matrix.commands }}\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n success_m68k:\n needs: [build]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. 
`if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_gcc\.github\workflows\m68k.yml | m68k.yml | YAML | 4,366 | 0.8 | 0.078125 | 0.271028 | node-utils | 734 | 2025-04-18T10:58:08.605816 | MIT | false | 70a8ae2c60b1fc512de294685c65c793 |
name: CI with sysroot compiled in release mode\n\non:\n push:\n branches:\n - master\n pull_request:\n\npermissions:\n contents: read\n\nenv:\n # Enable backtraces for easier debugging\n RUST_BACKTRACE: 1\n\njobs:\n build:\n runs-on: ubuntu-24.04\n\n strategy:\n fail-fast: false\n matrix:\n commands: [\n "--test-successful-rustc --nb-parts 2 --current-part 0",\n "--test-successful-rustc --nb-parts 2 --current-part 1",\n ]\n\n steps:\n - uses: actions/checkout@v4\n\n # `rustup show` installs from rust-toolchain.toml\n - name: Setup rust toolchain\n run: rustup show\n\n - name: Setup rust cache\n uses: Swatinem/rust-cache@v2\n\n - name: Install packages\n run: sudo apt-get install ninja-build ripgrep\n\n - name: Download artifact\n run: curl -LO https://github.com/rust-lang/gcc/releases/latest/download/gcc-15.deb\n\n - name: Setup path to libgccjit\n run: |\n sudo dpkg --force-overwrite -i gcc-15.deb\n echo 'gcc-path = "/usr/lib/"' > config.toml\n\n - name: Set env\n run: |\n echo "workspace="$GITHUB_WORKSPACE >> $GITHUB_ENV\n echo "LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n echo "LD_LIBRARY_PATH=/usr/lib" >> $GITHUB_ENV\n\n - name: Build\n run: |\n ./y.sh prepare --only-libcore\n EMBED_LTO_BITCODE=1 ./y.sh build --sysroot --release --release-sysroot\n ./y.sh test --mini-tests\n cargo test\n ./y.sh clean all\n\n - name: Prepare dependencies\n run: |\n git config --global user.email "user@example.com"\n git config --global user.name "User"\n ./y.sh prepare\n\n - name: Add more failing tests because of undefined symbol errors (FIXME)\n run: cat tests/failing-lto-tests.txt >> tests/failing-ui-tests.txt\n\n - name: Run tests\n run: |\n # FIXME(antoyo): we cannot enable LTO for stdarch tests currently because of some failing LTO tests using proc-macros.\n echo -n 'lto = "fat"' >> build_system/build_sysroot/Cargo.toml\n EMBED_LTO_BITCODE=1 ./y.sh test --release --clean --release-sysroot --build-sysroot --keep-lto-tests ${{ matrix.commands }}\n\n - name: Run y.sh cargo build\n 
run: |\n EMBED_LTO_BITCODE=1 CHANNEL="release" ./y.sh cargo build --release --manifest-path tests/hello-world/Cargo.toml\n call_found=$(objdump -dj .text tests/hello-world/target/release/hello_world | grep -c "call .*mylib.*my_func" ) ||:\n if [ $call_found -gt 0 ]; then\n echo "ERROR: call my_func found in asm"\n echo "Test is done with LTO enabled, hence inlining should occur across crates"\n exit 1\n fi\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n success_release:\n needs: [build]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_gcc\.github\workflows\release.yml | release.yml | YAML | 3,567 | 0.8 | 0.085714 | 0.137931 | react-lib | 938 | 2023-09-10T12:59:34.368253 | BSD-3-Clause | false | 65afff09f9e0dccf8adc58249b04c0b7 |
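The last build step in release.yml asserts cross-crate inlining under fat LTO: it counts `objdump` disassembly lines matching `call .*mylib.*my_func` and fails the job if any survive, since the call should have been inlined across crates. The grep-and-count logic can be sketched in Python (the disassembly strings below are invented for illustration):

```python
import re

def count_calls(disasm: str, pattern: str = r"call .*mylib.*my_func") -> int:
    """Mirror of `objdump -dj .text ... | grep -c "call .*mylib.*my_func"`."""
    return sum(1 for line in disasm.splitlines() if re.search(pattern, line))

# Invented disassembly snippets: an LTO-inlined build has no such call left.
inlined = "mov rax, 1\nret\n"
not_inlined = "call 0x1234 <mylib::my_func>\nret\n"
print(count_calls(inlined))      # 0 -> CI passes
print(count_calls(not_inlined))  # 1 -> CI would exit 1 here
```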
name: stdarch tests with sysroot compiled in release mode\n\non:\n push:\n branches:\n - master\n pull_request:\n\npermissions:\n contents: read\n\nenv:\n # Enable backtraces for easier debugging\n RUST_BACKTRACE: 1\n\njobs:\n build:\n runs-on: ubuntu-24.04\n\n strategy:\n fail-fast: false\n matrix:\n cargo_runner: [\n "sde -future -rtm_mode full --",\n "",\n ]\n\n steps:\n - uses: actions/checkout@v4\n\n # `rustup show` installs from rust-toolchain.toml\n - name: Setup rust toolchain\n run: rustup show\n\n - name: Setup rust cache\n uses: Swatinem/rust-cache@v2\n\n - name: Install packages\n run: sudo apt-get install ninja-build ripgrep\n\n # TODO: remove when we have binutils version 2.43 in the repo.\n - name: Install more recent binutils\n run: |\n echo "deb http://archive.ubuntu.com/ubuntu oracular main universe" | sudo tee /etc/apt/sources.list.d/oracular-copies.list\n sudo apt-get update\n sudo apt-get install binutils\n\n - name: Install Intel Software Development Emulator\n if: ${{ matrix.cargo_runner }}\n run: |\n mkdir intel-sde\n cd intel-sde\n dir=sde-external-9.33.0-2024-01-07-lin\n file=$dir.tar.xz\n wget https://downloadmirror.intel.com/813591/$file\n tar xvf $file\n sudo mkdir /usr/share/intel-sde\n sudo cp -r $dir/* /usr/share/intel-sde\n sudo ln -s /usr/share/intel-sde/sde /usr/bin/sde\n sudo ln -s /usr/share/intel-sde/sde64 /usr/bin/sde64\n\n - name: Set env\n run: |\n echo "workspace="$GITHUB_WORKSPACE >> $GITHUB_ENV\n echo 'download-gccjit = true' > config.toml\n\n - name: Build\n run: |\n ./y.sh prepare --only-libcore\n ./y.sh build --sysroot --release --release-sysroot\n\n - name: Set env (part 2)\n run: |\n # Set the `LD_LIBRARY_PATH` and `LIBRARY_PATH` env variables...\n echo "LD_LIBRARY_PATH="$(./y.sh info | grep -v Using) >> $GITHUB_ENV\n echo "LIBRARY_PATH="$(./y.sh info | grep -v Using) >> $GITHUB_ENV\n\n - name: Clean\n if: ${{ !matrix.cargo_runner }}\n run: |\n ./y.sh clean all\n\n - name: Prepare dependencies\n run: |\n git config 
--global user.email "user@example.com"\n git config --global user.name "User"\n ./y.sh prepare\n\n - name: Run tests\n if: ${{ !matrix.cargo_runner }}\n run: |\n ./y.sh test --release --clean --release-sysroot --build-sysroot --mini-tests --std-tests --test-libcore\n cargo test\n\n - name: Run stdarch tests\n if: ${{ !matrix.cargo_runner }}\n run: |\n CHANNEL=release TARGET=x86_64-unknown-linux-gnu CG_RUSTFLAGS="-Ainternal_features" ./y.sh cargo test --manifest-path build/build_sysroot/sysroot_src/library/stdarch/Cargo.toml\n\n - name: Run stdarch tests\n if: ${{ matrix.cargo_runner }}\n run: |\n # FIXME: these tests fail when the sysroot is compiled with LTO because of a missing symbol in proc-macro.\n # TODO: remove --skip test_mm512_stream_ps when stdarch is updated in rustc.\n # TODO: remove --skip test_tile_ when it's implemented.\n STDARCH_TEST_EVERYTHING=1 CHANNEL=release CARGO_TARGET_X86_64_UNKNOWN_LINUX_GNU_RUNNER="${{ matrix.cargo_runner }}" TARGET=x86_64-unknown-linux-gnu CG_RUSTFLAGS="-Ainternal_features --cfg stdarch_intel_sde" ./y.sh cargo test --manifest-path build/build_sysroot/sysroot_src/library/stdarch/Cargo.toml -- --skip rtm --skip tbm --skip sse4a --skip test_mm512_stream_ps --skip test_tile_\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n success_stdarch:\n needs: [build]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. 
`if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\compiler\rustc_codegen_gcc\.github\workflows\stdarch.yml | stdarch.yml | YAML | 4,479 | 0.8 | 0.096 | 0.152381 | python-kit | 868 | 2024-05-31T02:21:58.479266 | Apache-2.0 | false | 4a002799221512b050579776b4c85ac2 |
# This only controls whether a tiny, hard-to-find "open a blank issue" link appears at the end of\n# the template list.\nblank_issues_enabled: true\ncontact_links:\n - name: Intrinsic Support\n url: https://github.com/rust-lang/stdarch/issues\n about: Please direct issues about Rust's support for vendor intrinsics to core::arch\n - name: Internal Compiler Error\n url: https://github.com/rust-lang/rust/issues\n about: Please report ICEs to the rustc repository\n | dataset_sample\yaml\rust-lang_rust\library\portable-simd\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 469 | 0.8 | 0.1 | 0.2 | react-lib | 791 | 2023-10-28T17:50:11.327721 | BSD-3-Clause | false | 14ad040bbaca3b3eb6fa902dbc84fd83 |
name: CI\n\non:\n pull_request:\n push:\n branches:\n - master\n\nenv:\n CARGO_NET_RETRY: 10\n RUSTUP_MAX_RETRIES: 10\n PROPTEST_CASES: 64\n\njobs:\n rustfmt:\n name: "rustfmt"\n runs-on: ubuntu-latest\n\n steps:\n - uses: actions/checkout@v4\n - name: Run rustfmt\n run: cargo fmt --all -- --check\n\n clippy:\n name: "clippy on ${{ matrix.target }}"\n runs-on: ubuntu-latest\n strategy:\n fail-fast: false\n matrix:\n target:\n # We shouldn't really have any OS-specific code, so think of this as a list of architectures\n - x86_64-unknown-linux-gnu\n - i686-unknown-linux-gnu\n - i586-unknown-linux-gnu\n - aarch64-unknown-linux-gnu\n - arm64ec-pc-windows-msvc\n - armv7-unknown-linux-gnueabihf\n - loongarch64-unknown-linux-gnu\n # non-nightly since https://github.com/rust-lang/rust/pull/113274\n # - mips-unknown-linux-gnu\n # - mips64-unknown-linux-gnuabi64\n - powerpc-unknown-linux-gnu\n - powerpc64-unknown-linux-gnu\n - riscv64gc-unknown-linux-gnu\n - s390x-unknown-linux-gnu\n - sparc64-unknown-linux-gnu\n - wasm32-unknown-unknown\n\n steps:\n - uses: actions/checkout@v4\n - name: Setup Rust\n run: rustup target add ${{ matrix.target }}\n - name: Run Clippy\n run: cargo clippy --all-targets --target ${{ matrix.target }}\n\n x86-tests:\n name: "${{ matrix.target_feature }} on ${{ matrix.target }}"\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n target: [x86_64-pc-windows-msvc, i686-pc-windows-msvc, i586-pc-windows-msvc, x86_64-unknown-linux-gnu]\n # `default` means we use the default target config for the target,\n # `native` means we run with `-Ctarget-cpu=native`, and anything else is\n # an arg to `-Ctarget-feature`\n target_feature: [default, native, +sse3, +ssse3, +sse4.1, +sse4.2, +avx, +avx2]\n\n exclude:\n # -Ctarget-cpu=native sounds like bad-news if target != host\n - { target: i686-pc-windows-msvc, target_feature: native }\n - { target: i586-pc-windows-msvc, target_feature: native }\n\n include:\n # Populate the `matrix.os` field\n - { 
target: x86_64-unknown-linux-gnu, os: ubuntu-latest }\n - { target: x86_64-pc-windows-msvc, os: windows-latest }\n - { target: i686-pc-windows-msvc, os: windows-latest }\n - { target: i586-pc-windows-msvc, os: windows-latest }\n\n # These are globally available on all the other targets.\n - { target: i586-pc-windows-msvc, target_feature: +sse, os: windows-latest }\n - { target: i586-pc-windows-msvc, target_feature: +sse2, os: windows-latest }\n\n # Annoyingly, the x86_64-unknown-linux-gnu runner *almost* always has\n # avx512vl, but occasionally doesn't. Maybe one day we can enable it.\n\n steps:\n - uses: actions/checkout@v4\n - name: Setup Rust\n run: rustup target add ${{ matrix.target }}\n\n - name: Configure RUSTFLAGS\n shell: bash\n run: |\n case "${{ matrix.target_feature }}" in\n default)\n echo "RUSTFLAGS=-Dwarnings" >> $GITHUB_ENV;;\n native)\n echo "RUSTFLAGS=-Dwarnings -Ctarget-cpu=native" >> $GITHUB_ENV\n ;;\n *)\n echo "RUSTFLAGS=-Dwarnings -Ctarget-feature=${{ matrix.target_feature }}" >> $GITHUB_ENV\n ;;\n esac\n\n # Super useful for debugging why a SIGILL occurred.\n - name: Dump target configuration and support\n run: |\n rustc -Vv\n\n echo "Caveat: not all target features are expected to be logged"\n\n echo "## Requested target configuration (RUSTFLAGS=$RUSTFLAGS)"\n rustc --print=cfg --target=${{ matrix.target }} $RUSTFLAGS\n\n echo "## Supported target configuration for --target=${{ matrix.target }}"\n rustc --print=cfg --target=${{ matrix.target }} -Ctarget-cpu=native\n\n echo "## Natively supported target configuration"\n rustc --print=cfg -Ctarget-cpu=native\n\n - name: Test (debug)\n run: cargo test --verbose --target=${{ matrix.target }}\n\n - name: Test (release)\n run: cargo test --verbose --target=${{ matrix.target }} --release\n\n - name: Generate docs\n run: cargo doc --verbose --target=${{ matrix.target }}\n env:\n RUSTDOCFLAGS: -Dwarnings\n \n macos-tests:\n name: ${{ matrix.target }}\n runs-on: macos-latest\n strategy:\n fail-fast: 
false\n matrix:\n target:\n - aarch64-apple-darwin\n - x86_64-apple-darwin\n steps:\n - uses: actions/checkout@v4\n - name: Setup Rust\n run: rustup target add ${{ matrix.target }}\n\n - name: Configure RUSTFLAGS\n shell: bash\n run: echo "RUSTFLAGS=-Dwarnings" >> $GITHUB_ENV\n\n - name: Test (debug)\n run: cargo test --verbose --target=${{ matrix.target }}\n\n - name: Test (release)\n run: cargo test --verbose --target=${{ matrix.target }} --release\n\n - name: Generate docs\n run: cargo doc --verbose --target=${{ matrix.target }}\n env:\n RUSTDOCFLAGS: -Dwarnings\n\n wasm-tests:\n name: "wasm (firefox, ${{ matrix.name }})"\n runs-on: ubuntu-latest\n strategy:\n matrix:\n include:\n - { name: default, RUSTFLAGS: "" }\n - { name: simd128, RUSTFLAGS: "-C target-feature=+simd128" }\n steps:\n - uses: actions/checkout@v4\n - name: Install wasm-pack\n run: curl https://rustwasm.github.io/wasm-pack/installer/init.sh -sSf | sh\n - name: Test (debug)\n run: wasm-pack test --firefox --headless crates/core_simd\n env:\n RUSTFLAGS: ${{ matrix.rustflags }}\n - name: Test (release)\n run: wasm-pack test --firefox --headless crates/core_simd --release\n env:\n RUSTFLAGS: ${{ matrix.rustflags }}\n\n cross-tests:\n name: "${{ matrix.target_feature }} on ${{ matrix.target }} (via cross)"\n runs-on: ubuntu-latest\n env:\n PROPTEST_CASES: 16\n strategy:\n fail-fast: false\n\n matrix:\n target:\n - armv7-unknown-linux-gnueabihf\n - thumbv7neon-unknown-linux-gnueabihf # includes neon by default\n - aarch64-unknown-linux-gnu # includes neon by default\n - powerpc-unknown-linux-gnu\n - powerpc64le-unknown-linux-gnu # includes altivec by default\n - riscv64gc-unknown-linux-gnu\n - loongarch64-unknown-linux-gnu\n # MIPS uses a nonstandard binary representation for NaNs which makes it worth testing\n # non-nightly since https://github.com/rust-lang/rust/pull/113274\n # - mips-unknown-linux-gnu\n # - mips64-unknown-linux-gnuabi64\n # Lots of errors in QEMU and no real hardware to test on. 
Not clear if it's QEMU or bad codegen.\n # - powerpc64-unknown-linux-gnu\n target_feature: [default]\n include:\n - { target: powerpc64le-unknown-linux-gnu, target_feature: "+vsx" }\n # Fails due to QEMU floating point errors, probably handling subnormals incorrectly.\n # This target is somewhat redundant, since ppc64le has altivec as well.\n # - { target: powerpc-unknown-linux-gnu, target_feature: "+altivec" }\n # We should test this, but cross currently can't run it\n # - { target: riscv64gc-unknown-linux-gnu, target_feature: "+v,+zvl128b" }\n\n steps:\n - uses: actions/checkout@v4\n - name: Setup Rust\n run: rustup target add ${{ matrix.target }}\n\n - name: Install Cross\n # Install the latest git version for newer targets.\n run: |\n cargo install cross --git https://github.com/cross-rs/cross --rev 4090beca3cfffa44371a5bba524de3a578aa46c3\n\n - name: Configure Emulated CPUs\n run: |\n echo "CARGO_TARGET_POWERPC_UNKNOWN_LINUX_GNU_RUNNER=qemu-ppc -cpu e600" >> $GITHUB_ENV\n # echo "CARGO_TARGET_RISCV64GC_UNKNOWN_LINUX_GNU_RUNNER=qemu-riscv64 -cpu rv64,zba=true,zbb=true,v=true,vlen=256,vext_spec=v1.0" >> $GITHUB_ENV\n\n - name: Configure RUSTFLAGS\n shell: bash\n run: |\n case "${{ matrix.target_feature }}" in\n default)\n echo "RUSTFLAGS=" >> $GITHUB_ENV;;\n *)\n echo "RUSTFLAGS=-Ctarget-feature=${{ matrix.target_feature }}" >> $GITHUB_ENV\n ;;\n esac\n\n - name: Test (debug)\n run: cross test --verbose --target=${{ matrix.target }}\n\n - name: Test (release)\n run: cross test --verbose --target=${{ matrix.target }} --release\n\n miri:\n runs-on: ubuntu-latest\n env:\n PROPTEST_CASES: 16\n steps:\n - uses: actions/checkout@v4\n - name: Test (Miri)\n run: cargo miri test\n | dataset_sample\yaml\rust-lang_rust\library\portable-simd\.github\workflows\ci.yml | ci.yml | YAML | 8,919 | 0.8 | 0.027559 | 0.12844 | python-kit | 613 | 2024-07-20T15:54:48.178324 | GPL-3.0 | false | d0ed69a4c70cc825d246c53b96fe0a19 |
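The `Configure RUSTFLAGS` steps in ci.yml all follow the same three-way `case`: `default` gets only warnings-as-errors, `native` adds `-Ctarget-cpu=native`, and any other value is passed through as `-Ctarget-feature=...`. A small Python rendering of that mapping (the function name is ours, not part of the workflow):

```python
def rustflags_for(target_feature: str) -> str:
    """Mirror the shell `case` used by the x86 test jobs to build RUSTFLAGS."""
    if target_feature == "default":
        return "-Dwarnings"
    if target_feature == "native":
        # -Ctarget-cpu=native is excluded when target != host in the matrix.
        return "-Dwarnings -Ctarget-cpu=native"
    return f"-Dwarnings -Ctarget-feature={target_feature}"

print(rustflags_for("+avx2"))  # -Dwarnings -Ctarget-feature=+avx2
```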
name: Documentation\n\non:\n push:\n branches:\n - master\n\njobs:\n release:\n name: Deploy Documentation\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout Repository\n uses: actions/checkout@v4\n\n - name: Setup Rust\n run: |\n rustup update nightly --no-self-update\n rustup default nightly\n\n - name: Build Documentation\n run: cargo doc --no-deps\n \n - name: Deploy Documentation\n uses: peaceiris/actions-gh-pages@v3\n with:\n github_token: ${{ secrets.GITHUB_TOKEN }}\n publish_branch: gh-pages\n publish_dir: ./target/doc\n | dataset_sample\yaml\rust-lang_rust\library\portable-simd\.github\workflows\doc.yml | doc.yml | YAML | 632 | 0.7 | 0 | 0 | awesome-app | 369 | 2025-06-18T15:38:10.316854 | Apache-2.0 | false | 8874d9041e8e021c192146b094e33e64 |
# This file contains definitions of CI job parameters that are loaded\n# dynamically in CI from ci.yml.\nrunners:\n - &base-job\n env: { }\n\n - &job-linux-4c\n os: ubuntu-24.04\n # Free some disk space to avoid running out of space during the build.\n free_disk: true\n <<: *base-job\n\n # Large runner used mainly for its bigger disk capacity\n - &job-linux-4c-largedisk\n os: ubuntu-24.04-4core-16gb\n <<: *base-job\n\n - &job-linux-8c\n os: ubuntu-24.04-8core-32gb\n <<: *base-job\n\n - &job-linux-16c\n os: ubuntu-24.04-16core-64gb\n <<: *base-job\n\n - &job-macos\n os: macos-13\n <<: *base-job\n\n - &job-macos-m1\n os: macos-14\n <<: *base-job\n\n - &job-windows\n os: windows-2022\n <<: *base-job\n\n - &job-windows-25\n os: windows-2025\n <<: *base-job\n\n - &job-windows-8c\n os: windows-2022-8core-32gb\n <<: *base-job\n\n - &job-windows-25-8c\n os: windows-2025-8core-32gb\n <<: *base-job\n\n - &job-aarch64-linux\n # Free some disk space to avoid running out of space during the build.\n free_disk: true\n os: ubuntu-24.04-arm\n <<: *base-job\n\n - &job-aarch64-linux-8c\n os: ubuntu-24.04-arm64-8core-32gb\n <<: *base-job\nenvs:\n env-x86_64-apple-tests: &env-x86_64-apple-tests\n SCRIPT: ./x.py check compiletest --set build.compiletest-use-stage0-libtest=true && ./x.py --stage 2 test --skip tests/ui --skip tests/rustdoc -- --exact\n RUST_CONFIGURE_ARGS: --build=x86_64-apple-darwin --enable-sanitizers --enable-profiler --set rust.jemalloc\n RUSTC_RETRY_LINKER_ON_SEGFAULT: 1\n # Ensure that host tooling is tested on our minimum supported macOS version.\n MACOSX_DEPLOYMENT_TARGET: 10.12\n MACOSX_STD_DEPLOYMENT_TARGET: 10.12\n SELECT_XCODE: /Applications/Xcode_15.2.app\n NO_LLVM_ASSERTIONS: 1\n NO_DEBUG_ASSERTIONS: 1\n NO_OVERFLOW_CHECKS: 1\n\n production:\n &production\n DEPLOY_BUCKET: rust-lang-ci2\n # AWS_SECRET_ACCESS_KEYs are stored in GitHub's secrets storage, named\n # AWS_SECRET_ACCESS_KEY_<keyid>. 
Including the key id in the name allows to\n # rotate them in a single branch while keeping the old key in another\n # branch, which wouldn't be possible if the key was named with the kind\n # (caches, artifacts...).\n CACHES_AWS_ACCESS_KEY_ID: AKIA46X5W6CZI5DHEBFL\n ARTIFACTS_AWS_ACCESS_KEY_ID: AKIA46X5W6CZN24CBO55\n AWS_REGION: us-west-1\n TOOLSTATE_PUBLISH: 1\n\n # Try builds started through `@bors try` (without specifying custom jobs\n # in the PR description) will be passed the `DIST_TRY_BUILD` environment\n # variable by citool.\n # This tells the `opt-dist` tool to skip building certain components\n # and skip running tests, so that the try build finishes faster.\n try:\n <<: *production\n\n auto:\n <<: *production\n\n pr:\n PR_CI_JOB: 1\n\n# Jobs that run on each push to a pull request (PR)\n# These jobs automatically inherit envs.pr, to avoid repeating\n# it in each job definition.\npr:\n - name: mingw-check\n <<: *job-linux-4c\n - name: mingw-check-tidy\n continue_on_error: true\n <<: *job-linux-4c\n - name: x86_64-gnu-llvm-19\n env:\n ENABLE_GCC_CODEGEN: "1"\n DOCKER_SCRIPT: x86_64-gnu-llvm.sh\n <<: *job-linux-16c\n - name: x86_64-gnu-tools\n <<: *job-linux-16c\n\n# Jobs that run when you perform a try build (@bors try)\n# These jobs automatically inherit envs.try, to avoid repeating\n# it in each job definition.\ntry:\n - name: dist-x86_64-linux\n env:\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-linux-16c\n\n# Main CI jobs that have to be green to merge a commit into master\n# These jobs automatically inherit envs.auto, to avoid repeating\n# it in each job definition.\nauto:\n #############################\n # Linux/Docker builders #\n #############################\n\n - name: aarch64-gnu\n <<: *job-aarch64-linux\n\n - name: aarch64-gnu-debug\n <<: *job-aarch64-linux\n\n - name: arm-android\n <<: *job-linux-4c\n\n - name: armhf-gnu\n <<: *job-linux-4c\n\n - name: dist-aarch64-linux\n env:\n CODEGEN_BACKENDS: llvm,cranelift\n <<: 
*job-aarch64-linux-8c\n\n - name: dist-android\n <<: *job-linux-4c\n\n - name: dist-arm-linux\n <<: *job-linux-8c\n\n - name: dist-armhf-linux\n <<: *job-linux-4c\n\n - name: dist-armv7-linux\n <<: *job-linux-4c\n\n - name: dist-i586-gnu-i586-i686-musl\n <<: *job-linux-4c\n\n - name: dist-i686-linux\n <<: *job-linux-4c\n\n - name: dist-loongarch64-linux\n <<: *job-linux-4c\n\n - name: dist-loongarch64-musl\n <<: *job-linux-4c\n\n - name: dist-ohos\n <<: *job-linux-4c-largedisk\n\n - name: dist-powerpc-linux\n <<: *job-linux-4c\n\n - name: dist-powerpc64-linux\n <<: *job-linux-4c\n\n - name: dist-powerpc64le-linux\n <<: *job-linux-4c-largedisk\n\n - name: dist-riscv64-linux\n <<: *job-linux-4c\n\n - name: dist-s390x-linux\n <<: *job-linux-4c\n\n - name: dist-various-1\n <<: *job-linux-4c\n\n - name: dist-various-2\n <<: *job-linux-4c\n\n - name: dist-x86_64-freebsd\n <<: *job-linux-4c\n\n - name: dist-x86_64-illumos\n <<: *job-linux-4c\n\n - name: dist-x86_64-linux\n env:\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-linux-16c\n\n - name: dist-x86_64-linux-alt\n env:\n IMAGE: dist-x86_64-linux\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-linux-16c\n\n - name: dist-x86_64-musl\n env:\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-linux-4c\n\n - name: dist-x86_64-netbsd\n <<: *job-linux-4c\n\n # The i686-gnu job is split into multiple jobs to run tests in parallel.\n # i686-gnu-1 skips tests that run in i686-gnu-2.\n - name: i686-gnu-1\n env:\n IMAGE: i686-gnu\n DOCKER_SCRIPT: stage_2_test_set1.sh\n <<: *job-linux-4c\n\n # Skip tests that run in i686-gnu-1\n - name: i686-gnu-2\n env:\n IMAGE: i686-gnu\n DOCKER_SCRIPT: stage_2_test_set2.sh\n <<: *job-linux-4c\n\n # The i686-gnu-nopt job is split into multiple jobs to run tests in parallel.\n # i686-gnu-nopt-1 skips tests that run in i686-gnu-nopt-2\n - name: i686-gnu-nopt-1\n env:\n IMAGE: i686-gnu-nopt\n DOCKER_SCRIPT: /scripts/stage_2_test_set1.sh\n <<: *job-linux-4c\n\n # Skip tests that run in i686-gnu-nopt-1\n - 
name: i686-gnu-nopt-2\n env:\n IMAGE: i686-gnu-nopt\n DOCKER_SCRIPT: >-\n python3 ../x.py test --stage 0 --config /config/nopt-std-config.toml library/std &&\n /scripts/stage_2_test_set2.sh\n <<: *job-linux-4c\n\n - name: mingw-check\n <<: *job-linux-4c\n\n - name: test-various\n <<: *job-linux-4c\n\n # FIXME: temporarily disabled due to fuchsia server rate limits. See\n # <https://rust-lang.zulipchat.com/#narrow/channel/242791-t-infra/topic/fuchsia.20failure/with/506637259>.\n #\n #- name: x86_64-fuchsia\n # # Only run this job on the nightly channel. Fuchsia requires\n # # nightly features to compile, and this job would fail if\n # # executed on beta and stable.\n # only_on_channel: nightly\n # doc_url: https://rustc-dev-guide.rust-lang.org/tests/fuchsia.html\n # <<: *job-linux-8c\n\n # Tests integration with Rust for Linux.\n # Builds stage 1 compiler and tries to compile a few RfL examples with it.\n - name: x86_64-rust-for-linux\n doc_url: https://rustc-dev-guide.rust-lang.org/tests/rust-for-linux.html\n <<: *job-linux-4c\n\n - name: x86_64-gnu\n <<: *job-linux-4c\n\n # This job ensures commits landing on nightly still pass the full\n # test suite on the stable channel. There are some UI tests that\n # depend on the channel being built (for example if they include the\n # channel name on the output), and this builder prevents landing\n # changes that would result in broken builds after a promotion.\n - name: x86_64-gnu-stable\n # Only run this job on the nightly channel. 
Running this on beta\n # could cause failures when `dev: 1` in `stage0.txt`, and running\n # this on stable is useless.\n only_on_channel: nightly\n env:\n IMAGE: x86_64-gnu\n RUST_CI_OVERRIDE_RELEASE_CHANNEL: stable\n <<: *job-linux-4c\n\n - name: x86_64-gnu-aux\n <<: *job-linux-4c\n\n - name: x86_64-gnu-debug\n <<: *job-linux-4c\n\n - name: x86_64-gnu-distcheck\n <<: *job-linux-8c\n\n # The x86_64-gnu-llvm-20 job is split into multiple jobs to run tests in parallel.\n # x86_64-gnu-llvm-20-1 skips tests that run in x86_64-gnu-llvm-20-{2,3}.\n - name: x86_64-gnu-llvm-20-1\n env:\n RUST_BACKTRACE: 1\n IMAGE: x86_64-gnu-llvm-20\n DOCKER_SCRIPT: stage_2_test_set1.sh\n <<: *job-linux-4c\n\n # Skip tests that run in x86_64-gnu-llvm-20-{1,3}\n - name: x86_64-gnu-llvm-20-2\n env:\n RUST_BACKTRACE: 1\n IMAGE: x86_64-gnu-llvm-20\n DOCKER_SCRIPT: x86_64-gnu-llvm2.sh\n <<: *job-linux-4c\n\n # Skip tests that run in x86_64-gnu-llvm-20-{1,2}\n - name: x86_64-gnu-llvm-20-3\n env:\n RUST_BACKTRACE: 1\n IMAGE: x86_64-gnu-llvm-20\n DOCKER_SCRIPT: x86_64-gnu-llvm3.sh\n <<: *job-linux-4c\n\n # The x86_64-gnu-llvm-19 job is split into multiple jobs to run tests in parallel.\n # x86_64-gnu-llvm-19-1 skips tests that run in x86_64-gnu-llvm-19-{2,3}.\n - name: x86_64-gnu-llvm-19-1\n env:\n RUST_BACKTRACE: 1\n IMAGE: x86_64-gnu-llvm-19\n DOCKER_SCRIPT: stage_2_test_set1.sh\n <<: *job-linux-4c\n\n # Skip tests that run in x86_64-gnu-llvm-19-{1,3}\n - name: x86_64-gnu-llvm-19-2\n env:\n RUST_BACKTRACE: 1\n IMAGE: x86_64-gnu-llvm-19\n DOCKER_SCRIPT: x86_64-gnu-llvm2.sh\n <<: *job-linux-4c\n\n # Skip tests that run in x86_64-gnu-llvm-19-{1,2}\n - name: x86_64-gnu-llvm-19-3\n env:\n RUST_BACKTRACE: 1\n IMAGE: x86_64-gnu-llvm-19\n DOCKER_SCRIPT: x86_64-gnu-llvm3.sh\n <<: *job-linux-4c\n\n - name: x86_64-gnu-nopt\n <<: *job-linux-4c\n\n - name: x86_64-gnu-tools\n env:\n DEPLOY_TOOLSTATES_JSON: toolstates-linux.json\n <<: *job-linux-4c\n\n ####################\n # macOS Builders #\n 
####################\n\n - name: dist-x86_64-apple\n env:\n SCRIPT: ./x.py dist bootstrap --include-default-paths --host=x86_64-apple-darwin --target=x86_64-apple-darwin\n RUST_CONFIGURE_ARGS: --enable-full-tools --enable-sanitizers --enable-profiler --set rust.jemalloc --set rust.lto=thin --set rust.codegen-units=1\n RUSTC_RETRY_LINKER_ON_SEGFAULT: 1\n # Ensure that host tooling is built to support our minimum supported macOS version.\n MACOSX_DEPLOYMENT_TARGET: 10.12\n MACOSX_STD_DEPLOYMENT_TARGET: 10.12\n SELECT_XCODE: /Applications/Xcode_15.2.app\n NO_LLVM_ASSERTIONS: 1\n NO_DEBUG_ASSERTIONS: 1\n NO_OVERFLOW_CHECKS: 1\n DIST_REQUIRE_ALL_TOOLS: 1\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-macos\n\n - name: dist-apple-various\n env:\n SCRIPT: ./x.py dist bootstrap --include-default-paths --host='' --target=aarch64-apple-ios,x86_64-apple-ios,aarch64-apple-ios-sim,aarch64-apple-ios-macabi,x86_64-apple-ios-macabi\n # Mac Catalyst cannot currently compile the sanitizer:\n # https://github.com/rust-lang/rust/issues/129069\n RUST_CONFIGURE_ARGS: --enable-sanitizers --enable-profiler --set rust.jemalloc --set target.aarch64-apple-ios-macabi.sanitizers=false --set target.x86_64-apple-ios-macabi.sanitizers=false\n RUSTC_RETRY_LINKER_ON_SEGFAULT: 1\n # Ensure that host tooling is built to support our minimum supported macOS version.\n # FIXME(madsmtm): This might be redundant, as we're not building host tooling here (?)\n MACOSX_DEPLOYMENT_TARGET: 10.12\n MACOSX_STD_DEPLOYMENT_TARGET: 10.12\n SELECT_XCODE: /Applications/Xcode_15.2.app\n NO_LLVM_ASSERTIONS: 1\n NO_DEBUG_ASSERTIONS: 1\n NO_OVERFLOW_CHECKS: 1\n <<: *job-macos\n\n - name: x86_64-apple-1\n env:\n <<: *env-x86_64-apple-tests\n <<: *job-macos\n\n - name: x86_64-apple-2\n env:\n SCRIPT: ./x.py --stage 2 test tests/ui tests/rustdoc\n <<: *env-x86_64-apple-tests\n <<: *job-macos\n\n - name: dist-aarch64-apple\n env:\n SCRIPT: ./x.py dist bootstrap --include-default-paths --host=aarch64-apple-darwin 
--target=aarch64-apple-darwin\n RUST_CONFIGURE_ARGS: >-\n --enable-full-tools\n --enable-sanitizers\n --enable-profiler\n --set rust.jemalloc\n --set llvm.ninja=false\n --set rust.lto=thin\n --set rust.codegen-units=1\n RUSTC_RETRY_LINKER_ON_SEGFAULT: 1\n SELECT_XCODE: /Applications/Xcode_15.4.app\n USE_XCODE_CLANG: 1\n # Aarch64 tooling only needs to support macOS 11.0 and up as nothing else\n # supports the hardware.\n MACOSX_DEPLOYMENT_TARGET: 11.0\n MACOSX_STD_DEPLOYMENT_TARGET: 11.0\n NO_LLVM_ASSERTIONS: 1\n NO_DEBUG_ASSERTIONS: 1\n NO_OVERFLOW_CHECKS: 1\n DIST_REQUIRE_ALL_TOOLS: 1\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-macos-m1\n\n - name: aarch64-apple\n env:\n SCRIPT: ./x.py --stage 2 test --host=aarch64-apple-darwin --target=aarch64-apple-darwin\n RUST_CONFIGURE_ARGS: >-\n --enable-sanitizers\n --enable-profiler\n --set rust.jemalloc\n RUSTC_RETRY_LINKER_ON_SEGFAULT: 1\n SELECT_XCODE: /Applications/Xcode_15.4.app\n USE_XCODE_CLANG: 1\n # Aarch64 tooling only needs to support macOS 11.0 and up as nothing else\n # supports the hardware, so only need to test it there.\n MACOSX_DEPLOYMENT_TARGET: 11.0\n MACOSX_STD_DEPLOYMENT_TARGET: 11.0\n NO_LLVM_ASSERTIONS: 1\n NO_DEBUG_ASSERTIONS: 1\n NO_OVERFLOW_CHECKS: 1\n <<: *job-macos-m1\n\n ######################\n # Windows Builders #\n ######################\n\n # x86_64-msvc is split into two jobs to run tests in parallel.\n - name: x86_64-msvc-1\n env:\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-sanitizers --enable-profiler\n SCRIPT: make ci-msvc-py\n <<: *job-windows-25\n\n - name: x86_64-msvc-2\n env:\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-sanitizers --enable-profiler\n SCRIPT: make ci-msvc-ps1\n <<: *job-windows-25\n\n # i686-msvc is split into two jobs to run tests in parallel.\n - name: i686-msvc-1\n env:\n RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc --enable-sanitizers\n SCRIPT: make ci-msvc-py\n <<: *job-windows\n\n - name: i686-msvc-2\n env:\n 
RUST_CONFIGURE_ARGS: --build=i686-pc-windows-msvc --enable-sanitizers\n SCRIPT: make ci-msvc-ps1\n <<: *job-windows\n\n # x86_64-msvc-ext is split into multiple jobs to run tests in parallel.\n - name: x86_64-msvc-ext1\n env:\n SCRIPT: python x.py --stage 2 test src/tools/cargotest src/tools/cargo\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-lld\n <<: *job-windows\n\n # Temporary builder to workaround CI issues\n # See <https://github.com/rust-lang/rust/issues/127883>\n #FIXME: Remove this, and re-enable the same tests in `checktools.sh`, once CI issues are fixed.\n - name: x86_64-msvc-ext2\n env:\n SCRIPT: >\n python x.py test --stage 2 src/tools/miri --target aarch64-apple-darwin --test-args pass &&\n python x.py test --stage 2 src/tools/miri --target x86_64-pc-windows-gnu --test-args pass &&\n python x.py miri --stage 2 library/core --test-args notest &&\n python x.py miri --stage 2 library/alloc --test-args notest &&\n python x.py miri --stage 2 library/std --test-args notest\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-lld\n <<: *job-windows\n\n # Run `checktools.sh` and upload the toolstate file.\n - name: x86_64-msvc-ext3\n env:\n SCRIPT: src/ci/docker/host-x86_64/x86_64-gnu-tools/checktools.sh x.py /tmp/toolstate/toolstates.json windows\n HOST_TARGET: x86_64-pc-windows-msvc\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-lld --save-toolstates=/tmp/toolstate/toolstates.json\n DEPLOY_TOOLSTATES_JSON: toolstates-windows.json\n <<: *job-windows\n\n # 32/64-bit MinGW builds.\n #\n # We are using MinGW with POSIX threads since LLVM requires\n # C++'s std::thread which is disabled in libstdc++ with win32 threads.\n # FIXME: Libc++ doesn't have this limitation so we can avoid\n # winpthreads if we switch to it.\n #\n # Instead of relying on the MinGW version installed on CI we download\n # and install one ourselves so we won't be surprised by changes to CI's\n # build image.\n #\n # Finally, note that the downloads 
below are all in the `rust-lang-ci` S3\n # bucket, but they clearly didn't originate there! The downloads originally\n # came from the mingw-w64 SourceForge download site. Unfortunately\n # SourceForge is notoriously flaky, so we mirror it on our own infrastructure.\n\n # x86_64-mingw is split into two jobs to run tests in parallel.\n - name: x86_64-mingw-1\n env:\n SCRIPT: make ci-mingw-x\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu\n # There is no dist-x86_64-mingw-alt, so there is no prebuilt LLVM with assertions\n NO_DOWNLOAD_CI_LLVM: 1\n <<: *job-windows\n\n - name: x86_64-mingw-2\n env:\n SCRIPT: make ci-mingw-bootstrap\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-gnu\n # There is no dist-x86_64-mingw-alt, so there is no prebuilt LLVM with assertions\n NO_DOWNLOAD_CI_LLVM: 1\n <<: *job-windows\n\n - name: dist-x86_64-msvc\n env:\n RUST_CONFIGURE_ARGS: >-\n --build=x86_64-pc-windows-msvc\n --host=x86_64-pc-windows-msvc\n --target=x86_64-pc-windows-msvc\n --enable-full-tools\n --enable-profiler\n --set rust.codegen-units=1\n SCRIPT: python x.py build --set rust.debug=true opt-dist && PGO_HOST=x86_64-pc-windows-msvc ./build/x86_64-pc-windows-msvc/stage0-tools-bin/opt-dist windows-ci -- python x.py dist bootstrap --include-default-paths\n DIST_REQUIRE_ALL_TOOLS: 1\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-windows-25-8c\n\n - name: dist-i686-msvc\n env:\n RUST_CONFIGURE_ARGS: >-\n --build=i686-pc-windows-msvc\n --host=i686-pc-windows-msvc\n --target=i686-pc-windows-msvc\n --enable-full-tools\n --enable-profiler\n SCRIPT: python x.py dist bootstrap --include-default-paths\n DIST_REQUIRE_ALL_TOOLS: 1\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-windows\n\n - name: dist-aarch64-msvc\n env:\n RUST_CONFIGURE_ARGS: >-\n --build=x86_64-pc-windows-msvc\n --host=aarch64-pc-windows-msvc\n --target=aarch64-pc-windows-msvc,arm64ec-pc-windows-msvc\n --enable-full-tools\n --enable-profiler\n SCRIPT: python x.py dist bootstrap --include-default-paths\n 
DIST_REQUIRE_ALL_TOOLS: 1\n <<: *job-windows\n\n - name: dist-i686-mingw\n env:\n RUST_CONFIGURE_ARGS: >-\n --build=i686-pc-windows-gnu\n --enable-full-tools\n SCRIPT: python x.py dist bootstrap --include-default-paths\n DIST_REQUIRE_ALL_TOOLS: 1\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-windows\n\n - name: dist-x86_64-mingw\n env:\n SCRIPT: python x.py dist bootstrap --include-default-paths\n RUST_CONFIGURE_ARGS: >-\n --build=x86_64-pc-windows-gnu\n --enable-full-tools\n DIST_REQUIRE_ALL_TOOLS: 1\n CODEGEN_BACKENDS: llvm,cranelift\n <<: *job-windows\n\n - name: dist-x86_64-msvc-alt\n env:\n RUST_CONFIGURE_ARGS: --build=x86_64-pc-windows-msvc --enable-extended --enable-profiler\n SCRIPT: python x.py dist bootstrap --include-default-paths\n <<: *job-windows\n | dataset_sample\yaml\rust-lang_rust\src\ci\github-actions\jobs.yml | jobs.yml | YAML | 19,146 | 0.95 | 0.02946 | 0.196154 | awesome-app | 14 | 2025-01-31T15:50:40.636276 | MIT | false | 2e51c22799174cee76315192d8504382 |
name: CI\n\non:\n push:\n branches:\n - master\n pull_request:\n schedule:\n # Run multiple times a day as the successfully cached links are not checked every time.\n - cron: '0 */8 * * *'\n\njobs:\n ci:\n if: github.repository == 'rust-lang/rustc-dev-guide'\n runs-on: ubuntu-latest\n env:\n MDBOOK_VERSION: 0.4.21\n MDBOOK_LINKCHECK2_VERSION: 0.9.1\n MDBOOK_MERMAID_VERSION: 0.12.6\n MDBOOK_TOC_VERSION: 0.11.2\n MDBOOK_OUTPUT__LINKCHECK__FOLLOW_WEB_LINKS: ${{ github.event_name != 'pull_request' }}\n DEPLOY_DIR: book/html\n BASE_SHA: ${{ github.event.pull_request.base.sha }}\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n steps:\n - uses: actions/checkout@v4\n with:\n # linkcheck needs the base commit.\n fetch-depth: 0\n\n - name: Cache binaries\n id: mdbook-cache\n uses: actions/cache@v4\n with:\n path: |\n ~/.cargo/bin\n key: ${{ runner.os }}-${{ env.MDBOOK_VERSION }}--${{ env.MDBOOK_LINKCHECK2_VERSION }}--${{ env.MDBOOK_TOC_VERSION }}--${{ env.MDBOOK_MERMAID_VERSION }}\n\n - name: Restore cached Linkcheck\n if: github.event_name == 'schedule'\n id: cache-linkcheck-restore\n uses: actions/cache/restore@v4\n with:\n path: book/linkcheck/cache.json\n key: linkcheck--${{ env.MDBOOK_LINKCHECK2_VERSION }}--${{ github.run_id }}\n restore-keys: |\n linkcheck--${{ env.MDBOOK_LINKCHECK2_VERSION }}--\n\n - name: Install latest nightly Rust toolchain\n if: steps.mdbook-cache.outputs.cache-hit != 'true'\n run: |\n rustup update nightly\n rustup override set nightly\n\n - name: Install Dependencies\n if: steps.mdbook-cache.outputs.cache-hit != 'true'\n run: |\n cargo install mdbook --version ${{ env.MDBOOK_VERSION }}\n cargo install mdbook-linkcheck2 --version ${{ env.MDBOOK_LINKCHECK2_VERSION }}\n cargo install mdbook-toc --version ${{ env.MDBOOK_TOC_VERSION }}\n cargo install mdbook-mermaid --version ${{ env.MDBOOK_MERMAID_VERSION }}\n\n - name: Check build\n run: ENABLE_LINKCHECK=1 mdbook build\n\n - name: Save cached Linkcheck\n id: cache-linkcheck-save\n if: ${{ !cancelled() 
&& github.event_name == 'schedule' }}\n uses: actions/cache/save@v4\n with:\n path: book/linkcheck/cache.json\n key: linkcheck--${{ env.MDBOOK_LINKCHECK2_VERSION }}--${{ github.run_id }}\n\n - name: Deploy to gh-pages\n if: github.event_name == 'push'\n run: |\n touch "${{ env.DEPLOY_DIR }}/.nojekyll"\n cp CNAME "${{ env.DEPLOY_DIR }}"\n cd "${{ env.DEPLOY_DIR }}"\n rm -rf .git\n git init\n git config user.name "Deploy from CI"\n git config user.email ""\n git add .\n git commit -m "Deploy ${GITHUB_SHA} to gh-pages"\n git push --quiet -f "https://x-token:${{ secrets.GITHUB_TOKEN }}@github.com/${GITHUB_REPOSITORY}" HEAD:gh-pages\n | dataset_sample\yaml\rust-lang_rust\src\doc\rustc-dev-guide\.github\workflows\ci.yml | ci.yml | YAML | 3,040 | 0.8 | 0.069767 | 0.025974 | python-kit | 194 | 2024-05-11T22:46:18.932820 | GPL-3.0 | false | c3e3ed7536b2e9a0337503edac692ff2 |
name: Date-Check\n\non:\n schedule:\n # Run at noon UTC every 1st of the month\n - cron: '00 12 01 * *'\n\n # Allow manually starting the workflow\n workflow_dispatch:\n\njobs:\n date-check:\n if: github.repository == 'rust-lang/rustc-dev-guide'\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout repo\n uses: actions/checkout@v4\n\n - name: Ensure Rust is up-to-date\n run: |\n rustup update stable\n\n - name: Run `date-check`\n working-directory: ci/date-check\n run: |\n cargo run -- ../../src/ > ../../date-check-output.txt\n\n - name: Open issue\n uses: actions/github-script@v7\n with:\n script: |\n const fs = require('fs');\n\n const rawText = fs.readFileSync('date-check-output.txt', { encoding: 'utf8' });\n const title = rawText.split('\n')[0];\n if (title != 'empty') {\n const body = rawText.split('\n').slice(1).join('\n');\n github.rest.issues.create({\n owner: context.repo.owner,\n repo: context.repo.repo,\n title,\n body,\n });\n console.log('Opened issue.');\n } else {\n console.log('No dates to triage.');\n }\n | dataset_sample\yaml\rust-lang_rust\src\doc\rustc-dev-guide\.github\workflows\date-check.yml | date-check.yml | YAML | 1,307 | 0.95 | 0.041667 | 0.05 | python-kit | 846 | 2024-11-07T20:34:46.799385 | GPL-3.0 | false | ce397f9fb78db1ebbcc3981f5635cb9e |
name: rustc-pull\n\non:\n workflow_dispatch:\n schedule:\n # Run at 04:00 UTC every Monday and Thursday\n - cron: '0 4 * * 1,4'\n\njobs:\n pull:\n if: github.repository == 'rust-lang/rustc-dev-guide'\n runs-on: ubuntu-latest\n outputs:\n pr_url: ${{ steps.update-pr.outputs.pr_url }}\n permissions:\n contents: write\n pull-requests: write\n steps:\n - uses: actions/checkout@v4\n with:\n # We need the full history for josh to work\n fetch-depth: '0'\n - name: Install stable Rust toolchain\n run: rustup update stable\n - uses: Swatinem/rust-cache@v2\n with:\n workspaces: "josh-sync"\n # Cache the josh directory with checked out rustc\n cache-directories: "/home/runner/.cache/rustc-dev-guide-josh"\n - name: Install josh\n run: RUSTFLAGS="--cap-lints warn" cargo +stable install josh-proxy --git https://github.com/josh-project/josh --tag r24.10.04\n - name: Setup bot git name and email\n run: |\n git config --global user.name 'The rustc-dev-guide Cronjob Bot'\n git config --global user.email 'github-actions@github.com'\n - name: Perform rustc-pull\n id: rustc-pull\n # Turn off -e to disable early exit\n shell: bash {0}\n run: |\n cargo run --manifest-path josh-sync/Cargo.toml -- rustc-pull\n exitcode=$?\n\n # If no pull was performed, we want to mark this job as successful,\n # but we do not want to perform the follow-up steps.\n if [ $exitcode -eq 0 ]; then\n echo "pull_result=pull-finished" >> $GITHUB_OUTPUT\n elif [ $exitcode -eq 2 ]; then\n echo "pull_result=skipped" >> $GITHUB_OUTPUT\n exitcode=0\n fi\n\n exit ${exitcode}\n - name: Push changes to a branch\n if: ${{ steps.rustc-pull.outputs.pull_result == 'pull-finished' }}\n run: |\n # Update a sticky branch that is used only for rustc pulls\n BRANCH="rustc-pull"\n git switch -c $BRANCH\n git push -u origin $BRANCH --force\n - name: Create pull request\n id: update-pr\n if: ${{ steps.rustc-pull.outputs.pull_result == 'pull-finished' }}\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n run: |\n # Check if an open 
pull request for a rustc pull update already exists\n # If it does, the previous push has just updated it\n # If not, we create it now\n RESULT=`gh pr list --author github-actions[bot] --state open -q 'map(select(.title=="Rustc pull update")) | length' --json title`\n if [[ "$RESULT" -eq 0 ]]; then\n echo "Creating new pull request"\n PR_URL=`gh pr create -B master --title 'Rustc pull update' --body 'Latest update from rustc.'`\n echo "pr_url=$PR_URL" >> $GITHUB_OUTPUT\n else\n PR_URL=`gh pr list --author github-actions[bot] --state open -q 'map(select(.title=="Rustc pull update")) | .[0].url' --json url,title`\n echo "Updating pull request ${PR_URL}"\n echo "pr_url=$PR_URL" >> $GITHUB_OUTPUT\n fi\n send-zulip-message:\n needs: [pull]\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Compute message\n id: create-message\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n run: |\n if [ "${{ needs.pull.result }}" == "failure" ]; then\n WORKFLOW_URL="${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }}"\n echo "message=Rustc pull sync failed. Check out the [workflow URL]($WORKFLOW_URL)." >> $GITHUB_OUTPUT\n else\n CREATED_AT=`gh pr list --author github-actions[bot] --state open -q 'map(select(.title=="Rustc pull update")) | .[0].createdAt' --json createdAt,title`\n PR_URL=`gh pr list --author github-actions[bot] --state open -q 'map(select(.title=="Rustc pull update")) | .[0].url' --json url,title`\n week_ago=$(date +%F -d '7 days ago')\n\n # If there is an open PR that is at least a week old, post a message about it\n if [[ -n $CREATED_AT && $CREATED_AT < $week_ago ]]; then\n echo "message=A PR with a Rustc pull has been opened for more than a week. Check out the [PR](${PR_URL})." 
>> $GITHUB_OUTPUT\n fi\n fi\n - name: Send a Zulip message about updated PR\n if: ${{ steps.create-message.outputs.message != '' }}\n uses: zulip/github-actions-zulip/send-message@e4c8f27c732ba9bd98ac6be0583096dea82feea5\n with:\n api-key: ${{ secrets.ZULIP_API_TOKEN }}\n email: "rustc-dev-guide-gha-notif-bot@rust-lang.zulipchat.com"\n organization-url: "https://rust-lang.zulipchat.com"\n to: 196385\n type: "stream"\n topic: "Subtree sync automation"\n content: ${{ steps.create-message.outputs.message }}\n | dataset_sample\yaml\rust-lang_rust\src\doc\rustc-dev-guide\.github\workflows\rustc-pull.yml | rustc-pull.yml | YAML | 4,973 | 0.8 | 0.131579 | 0.100917 | react-lib | 863 | 2024-12-13T06:38:33.660618 | Apache-2.0 | false | e77d0218327365b1f5e562fc3a37c583 |
name: Blank Issue\ndescription: Create a blank issue.\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing an issue!\n - type: textarea\n id: problem\n attributes:\n label: Description\n description: >\n Please provide a description of the issue, along with any information\n you feel relevant to replicate it.\n validations:\n required: true\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: labels\n attributes:\n label: Additional Labels\n description: >\n Additional labels can be added to this issue by including the following\n command\n placeholder: |\n @rustbot label +<label>\n\n Common labels for this issue type are:\n * C-an-interesting-project\n * C-enhancement\n * C-question\n * C-tracking-issue\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\blank_issue.yml | blank_issue.yml | YAML | 1,226 | 0.85 | 0.045455 | 0.093023 | awesome-app | 878 | 2025-03-15T20:13:03.427703 | BSD-3-Clause | false | 5eea6ed6bf92186f5e6db27a437f6d07 |
name: Bug Report\ndescription: Create a bug report for Clippy\nlabels: ["C-bug"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing a bug report! 🐛\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel relevant to replicate the bug.\n validations:\n required: true\n - type: textarea\n id: reproducer\n attributes:\n label: Reproducer\n description: Please provide the code and steps to reproduce the bug\n value: |\n I tried this code:\n\n ```rust\n <code>\n ```\n\n I expected to see this happen:\n\n Instead, this happened:\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: labels\n attributes:\n label: Additional Labels\n description: >\n Additional labels can be added to this issue by including the following\n command\n placeholder: |\n @rustbot label +<label>\n\n Common labels for this issue type are:\n * `I-suggestion-causes-error`\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 1,504 | 0.85 | 0.052632 | 0.018868 | awesome-app | 821 | 2023-09-29T19:22:40.886677 | MIT | false | 2b6f795a563b363ed821556dcbcb5f79 |
blank_issues_enabled: true\ncontact_links:\n - name: Rust Programming Language Forum\n url: https://users.rust-lang.org\n about: Please ask and answer questions about Rust here.\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 181 | 0.8 | 0 | 0 | python-kit | 881 | 2025-02-07T10:05:53.891014 | BSD-3-Clause | false | f75886be4d74fc4879e4c3fb4eb91a89 |
name: Bug Report (False Negative)\ndescription: Create a bug report about missing warnings from a lint\nlabels: ["C-bug", "I-false-negative"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing a bug report! 🐛\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel relevant to replicate the bug.\n validations:\n required: true\n - type: input\n id: lint-name\n attributes:\n label: Lint Name\n description: Please provide the lint name.\n - type: textarea\n id: reproducer\n attributes:\n label: Reproducer\n description: Please provide the code and steps to reproduce the bug\n value: |\n I tried this code:\n\n ```rust\n <code>\n ```\n\n I expected to see this happen:\n\n Instead, this happened:\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\false_negative.yml | false_negative.yml | YAML | 1,350 | 0.85 | 0.02 | 0 | node-utils | 638 | 2023-08-27T04:02:25.228894 | BSD-3-Clause | false | e6a5817b77c846baf276de85f9647532 |
name: Bug Report (False Positive)\ndescription: Create a bug report about a wrongly emitted lint warning\nlabels: ["C-bug", "I-false-positive"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing a bug report! 🐛\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel relevant to replicate the bug.\n validations:\n required: true\n - type: input\n id: lint-name\n attributes:\n label: Lint Name\n description: Please provide the lint name.\n - type: textarea\n id: reproducer\n attributes:\n label: Reproducer\n description: >\n Please provide the code and steps to reproduce the bug together with the\n output from Clippy.\n value: |\n I tried this code:\n\n ```rust\n <code>\n ```\n\n I saw this happen:\n\n ```\n <output>\n ```\n\n I expected to see this happen:\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: labels\n attributes:\n label: Additional Labels\n description: >\n Additional labels can be added to this issue by including the following\n command\n placeholder: |\n @rustbot label +<label>\n\n Common labels for this issue type are:\n * `I-suggestion-causes-error`\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\false_positive.yml | false_positive.yml | YAML | 1,782 | 0.85 | 0.029412 | 0.015873 | node-utils | 570 | 2025-01-07T02:23:01.446587 | GPL-3.0 | false | 5faf10accbc0afed30dff11e00fe27ae |
name: Internal Compiler Error\ndescription: Create a report for an internal compiler error (ICE) in Clippy.\nlabels: ["C-bug", "I-ICE"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for finding an Internal Compiler Error! 🧊\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: |\n If possible, try to provide a minimal verifiable example. You can read ["Rust Bug Minimization Patterns"][mve] for how to create smaller examples. Otherwise, provide the crate where the ICE occurred.\n\n [mve]: http://blog.pnkfx.org/blog/2019/11/18/rust-bug-minimization-patterns/\n validations:\n required: true\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: error\n attributes:\n label: Error output\n description: >\n Include a backtrace in the code block by setting `RUST_BACKTRACE=1` in\n your environment. E.g. `RUST_BACKTRACE=1 cargo clippy`.\n value: |\n <details><summary>Backtrace</summary>\n <p>\n\n ```\n <backtrace>\n ```\n\n </p>\n </details>\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\ice.yml | ice.yml | YAML | 1,494 | 0.95 | 0.083333 | 0 | react-lib | 6 | 2024-11-16T12:28:49.094371 | GPL-3.0 | false | 2582343b97d8346d729a75228f83bb37 |
name: New lint suggestion\ndescription: Suggest a new Clippy lint.\nlabels: ["A-lint"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for your lint idea!\n - type: textarea\n id: what\n attributes:\n label: What it does\n description: What does this lint do?\n validations:\n required: true\n - type: textarea\n id: advantage\n attributes:\n label: Advantage\n description: >\n What is the advantage of the recommended code over the original code?\n placeholder: |\n - Remove bounds check inserted by ...\n - Remove the need to duplicate/store ...\n - Remove typo ...\n - type: textarea\n id: drawbacks\n attributes:\n label: Drawbacks\n description: What might be possible drawbacks of such a lint?\n - type: textarea\n id: example\n attributes:\n label: Example\n description: >\n Include a short example showing when the lint should trigger together\n with the improved code.\n value: |\n ```rust\n <code>\n ```\n\n Could be written as:\n\n ```rust\n <code>\n ```\n validations:\n required: true\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\ISSUE_TEMPLATE\new_lint.yml | new_lint.yml | YAML | 1,158 | 0.85 | 0.020833 | 0 | awesome-app | 40 | 2023-10-23T19:48:35.165966 | BSD-3-Clause | false | 57a1ff48a772a5bfaf866faa4bb9aed6 |
name: Clippy changelog check\n\non:\n merge_group:\n pull_request:\n types: [opened, reopened, synchronize, edited]\n\nconcurrency:\n # For a given workflow, if we push to the same PR, cancel all previous builds on that PR.\n # If the push is not attached to a PR, we will cancel all builds on the same branch.\n group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"\n cancel-in-progress: true\n\njobs:\n changelog:\n runs-on: ubuntu-latest\n\n defaults:\n run:\n shell: bash\n\n steps:\n # Run\n - name: Check Changelog\n if: ${{ github.event_name == 'pull_request' }}\n run: |\n body=$(curl -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -s "https://api.github.com/repos/${{ github.repository }}/pulls/$PR_NUMBER" | \\n python -c "import sys, json; print(json.load(sys.stdin)['body'])")\n output=$(awk '/^changelog:\s*\S/ && !/changelog: \[.*\]: your change/' <<< "$body" | sed "s/changelog:\s*//g")\n if [ -z "$output" ]; then\n echo "ERROR: pull request message must contain 'changelog: ...' with your changelog. Please add it."\n exit 1\n else\n echo "changelog: $output"\n fi\n env:\n PYTHONIOENCODING: 'utf-8'\n PR_NUMBER: '${{ github.event.number }}'\n\n # We need to have the "conclusion" job also on PR CI, to make it possible\n # to add PRs to a merge queue.\n conclusion_changelog:\n needs: [ changelog ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. 
`if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\clippy_changelog.yml | clippy_changelog.yml | YAML | 2,301 | 0.95 | 0.135593 | 0.264151 | node-utils | 616 | 2025-05-28T13:22:52.613058 | GPL-3.0 | false | 162a74503cde178f21309e4a03cbfb60 |
name: Clippy Dev Test\n\non:\n merge_group:\n pull_request:\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_INCREMENTAL: 0\n RUSTFLAGS: -D warnings\n\njobs:\n clippy_dev:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make it so that any malicious package could get our GitHub token\n persist-credentials: false\n\n # Run\n - name: Build\n run: cargo build\n working-directory: clippy_dev\n\n - name: Test update_lints\n run: cargo dev update_lints --check\n\n - name: Test fmt\n run: cargo dev fmt --check\n\n - name: Test cargo dev new lint\n env:\n RUSTFLAGS: -A unused-imports\n run: |\n cargo dev new_lint --name new_early_pass --pass early\n cargo dev new_lint --name new_late_pass --pass late\n cargo check\n git reset --hard HEAD\n\n conclusion_dev:\n needs: [ clippy_dev ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\clippy_dev.yml | clippy_dev.yml | YAML | 1,772 | 0.95 | 0.081967 | 0.230769 | node-utils | 985 | 2025-06-16T04:03:12.828531 | GPL-3.0 | false | 8d3d3057879902e0585281bb0e5c62a9 |
name: Clippy Test (merge queue)\n\non:\n merge_group:\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_TARGET_DIR: '${{ github.workspace }}/target'\n NO_FMT_TEST: 1\n CARGO_INCREMENTAL: 0\n RUSTFLAGS: -D warnings\n\ndefaults:\n run:\n shell: bash\n\njobs:\n base:\n strategy:\n matrix:\n include:\n - os: ubuntu-latest\n host: x86_64-unknown-linux-gnu\n - os: ubuntu-latest\n host: i686-unknown-linux-gnu\n - os: windows-latest\n host: x86_64-pc-windows-msvc\n - os: macos-13\n host: x86_64-apple-darwin\n - os: macos-latest\n host: aarch64-apple-darwin\n\n runs-on: ${{ matrix.os }}\n\n # NOTE: If you modify this job, make sure you copy the changes to clippy.yml\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install i686 dependencies\n if: matrix.host == 'i686-unknown-linux-gnu'\n run: |\n sudo dpkg --add-architecture i386\n sudo apt-get update\n sudo apt-get install gcc-multilib zlib1g-dev:i386\n\n - name: Install toolchain\n run: |\n rustup set default-host ${{ matrix.host }}\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Run\n - name: Build\n run: cargo build --tests --features internal\n\n - name: Test\n if: matrix.host == 'x86_64-unknown-linux-gnu'\n run: cargo test --features internal\n\n - name: Test\n if: matrix.host != 'x86_64-unknown-linux-gnu'\n run: cargo test --features internal -- --skip dogfood\n\n - name: Test clippy_lints\n run: cargo test\n working-directory: clippy_lints\n\n - name: Test clippy_utils\n run: cargo test\n working-directory: clippy_utils\n\n - name: Test clippy_config\n run: cargo test\n working-directory: clippy_config\n\n - name: Test rustc_tools_util\n run: cargo test\n working-directory: rustc_tools_util\n\n - name: Test clippy_dev\n run: cargo test\n working-directory: clippy_dev\n\n - name: Test clippy-driver\n run: .github/driver.sh\n env:\n OS: ${{ runner.os }}\n\n metadata_collection:\n 
runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n - name: Test metadata collection\n run: cargo collect-metadata\n\n integration_build:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Run\n - name: Build Integration Test\n env:\n CARGO_PROFILE_DEV_SPLIT_DEBUGINFO: off\n run: cargo test --test integration --features integration --no-run\n\n # Upload\n - name: Extract Binaries\n run: |\n DIR=$CARGO_TARGET_DIR/debug\n find $DIR/deps/integration-* -executable ! -type d | xargs -I {} mv {} $DIR/integration\n find $DIR ! -executable -o -type d ! 
-path $DIR | xargs rm -rf\n\n - name: Upload Binaries\n uses: actions/upload-artifact@v4\n with:\n name: binaries\n path: target/debug\n\n integration:\n needs: integration_build\n strategy:\n fail-fast: false\n max-parallel: 6\n matrix:\n integration:\n - 'matthiaskrgr/clippy_ci_panic_test'\n - 'rust-lang/cargo'\n - 'rust-lang/chalk'\n - 'rust-lang/rustfmt'\n - 'Marwes/combine'\n - 'Geal/nom'\n - 'rust-lang/stdarch'\n - 'serde-rs/serde'\n - 'chronotope/chrono'\n - 'hyperium/hyper'\n - 'rust-random/rand'\n - 'rust-lang/futures-rs'\n - 'rust-itertools/itertools'\n - 'rust-lang-nursery/failure'\n - 'rust-lang/log'\n\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Download\n - name: Download target dir\n uses: actions/download-artifact@v4\n with:\n name: binaries\n path: target/debug\n\n - name: Make Binaries Executable\n run: chmod +x $CARGO_TARGET_DIR/debug/*\n\n # Run\n - name: Test ${{ matrix.integration }}\n run: |\n TOOLCHAIN=$(rustup show active-toolchain | head -n 1 | cut -f1 -d' ')\n rustup run $TOOLCHAIN $CARGO_TARGET_DIR/debug/integration --show-output\n env:\n INTEGRATION: ${{ matrix.integration }}\n\n conclusion:\n needs: [ base, metadata_collection, integration_build, integration ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job still does not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. 
`if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\clippy_mq.yml | clippy_mq.yml | YAML | 5,931 | 0.8 | 0.036866 | 0.126374 | vue-tools | 325 | 2024-01-17T08:00:54.194861 | BSD-3-Clause | false | 8c43afad331aac7c814bdcdfb9ecb9b5 |
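The integration job above extracts the active toolchain name with `rustup show active-toolchain | head -n 1 | cut -f1 -d' '`. The parsing itself is plain text handling: keep the first line, then keep the first space-separated field. A small sketch, using a hypothetical sample of rustup's output:

```shell
# Stand-in for the output of `rustup show active-toolchain`, which prints
# something like "<toolchain-name> (default)" on its first line.
sample='nightly-x86_64-unknown-linux-gnu (default)'

# First line only, then the first space-delimited field.
toolchain=$(head -n 1 <<< "$sample" | cut -f1 -d' ')
echo "$toolchain"   # nightly-x86_64-unknown-linux-gnu
```

The extracted name is what gets passed to `rustup run $TOOLCHAIN …` so the prebuilt integration binary runs under the same toolchain it was built with.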
name: Clippy Test\n\non:\n pull_request:\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_TARGET_DIR: '${{ github.workspace }}/target'\n NO_FMT_TEST: 1\n CARGO_INCREMENTAL: 0\n RUSTFLAGS: -D warnings\n\nconcurrency:\n # For a given workflow, if we push to the same PR, cancel all previous builds on that PR.\n # If the push is not attached to a PR, we will cancel all builds on the same branch.\n group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"\n cancel-in-progress: true\n\njobs:\n base:\n # NOTE: If you modify this job, make sure you copy the changes to clippy_mq.yml\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Run\n - name: Build\n run: cargo build --tests --features internal\n\n - name: Test\n run: cargo test --features internal\n\n - name: Test clippy_lints\n run: cargo test\n working-directory: clippy_lints\n\n - name: Test clippy_utils\n run: cargo test\n working-directory: clippy_utils\n\n - name: Test rustc_tools_util\n run: cargo test\n working-directory: rustc_tools_util\n\n - name: Test clippy_dev\n run: cargo test\n working-directory: clippy_dev\n\n - name: Test clippy-driver\n run: .github/driver.sh\n env:\n OS: ${{ runner.os }}\n\n # We need to have the "conclusion" job also on PR CI, to make it possible\n # to add PRs to a merge queue.\n conclusion:\n needs: [ base ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. 
We use `!cancelled()` to ensure the job still does not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\clippy_pr.yml | clippy_pr.yml | YAML | 2,612 | 0.8 | 0.071429 | 0.257143 | python-kit | 887 | 2024-06-11T07:54:57.922759 | MIT | false | 1c27b57b58207cd9593a84d7112d9a50
name: Deploy\n\non:\n push:\n branches:\n - master\n - beta\n tags:\n - rust-1.**\n\nconcurrency:\n group: ${{ github.workflow }}\n cancel-in-progress: false\n\nenv:\n TARGET_BRANCH: 'gh-pages'\n SHA: '${{ github.sha }}'\n SSH_REPO: 'git@github.com:${{ github.repository }}.git'\n\njobs:\n deploy:\n runs-on: ubuntu-latest\n if: github.repository == 'rust-lang/rust-clippy'\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n - name: Checkout\n uses: actions/checkout@v4\n with:\n ref: ${{ env.TARGET_BRANCH }}\n path: 'out'\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n # Run\n - name: Set tag name\n if: startswith(github.ref, 'refs/tags/')\n run: |\n TAG=$(basename "${TAGNAME}")\n echo "TAG_NAME=$TAG" >> $GITHUB_ENV\n env:\n # Make sure that the reference gets expanded before injecting it\n TAGNAME: ${{ github.ref }}\n - name: Set beta to true\n if: github.ref == 'refs/heads/beta'\n run: echo "BETA=true" >> $GITHUB_ENV\n\n # We need to check out all files that (transitively) depend on the\n # structure of the gh-pages branch, so that we're able to change that\n # structure without breaking the deployment.\n - name: Use deploy files from master branch\n run: |\n git fetch --no-tags --prune --depth=1 origin master\n git checkout origin/master -- .github/deploy.sh util/versions.py util/gh-pages/versions.html\n\n # Generate lockfile for caching to avoid build problems with cached deps\n - name: cargo generate-lockfile\n run: cargo generate-lockfile\n\n - name: Cache\n uses: Swatinem/rust-cache@v2\n with:\n save-if: ${{ github.ref == 'refs/heads/master' }}\n\n - name: cargo collect-metadata\n run: cargo collect-metadata\n\n - name: Deploy\n run: |\n eval "$(ssh-agent -s)"\n ssh-add - <<< "${{ secrets.DEPLOY_KEY }}"\n bash .github/deploy.sh\n | 
dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\deploy.yml | deploy.yml | YAML | 2,170 | 0.8 | 0.064103 | 0.136364 | vue-tools | 207 | 2023-08-05T04:19:33.986030 | GPL-3.0 | false | 510c9bfdabc5f093b031897a829ed2e6 |
name: Lintcheck\n\non:\n pull_request:\n paths-ignore:\n - 'book/**'\n - 'util/**'\n - 'tests/**'\n - '*.md'\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_INCREMENTAL: 0\n\nconcurrency:\n # For a given workflow, if we push to the same PR, cancel all previous builds on that PR.\n group: "${{ github.workflow }}-${{ github.event.pull_request.number }}"\n cancel-in-progress: true\n\njobs:\n # Runs lintcheck on the PR's target branch and stores the results as an artifact\n base:\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n fetch-depth: 2\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n # HEAD is the generated merge commit `refs/pull/N/merge` between the PR and `master`, `HEAD^`\n # being the commit from `master` that is the base of the merge\n - name: Checkout base\n run: git checkout HEAD^\n\n # Use the lintcheck from the PR to generate the JSON in case the PR modifies lintcheck in some\n # way\n - name: Checkout current lintcheck\n run: |\n rm -rf lintcheck\n git checkout ${{ github.sha }} -- lintcheck\n\n - name: Cache lintcheck bin\n id: cache-lintcheck-bin\n uses: actions/cache@v4\n with:\n path: target/debug/lintcheck\n key: lintcheck-bin-${{ hashfiles('lintcheck/**') }}\n\n - name: Build lintcheck\n if: steps.cache-lintcheck-bin.outputs.cache-hit != 'true'\n run: cargo build --manifest-path=lintcheck/Cargo.toml\n\n - name: Create cache key\n id: key\n run: echo "key=lintcheck-base-${{ hashfiles('lintcheck/**') }}-$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"\n\n - name: Cache results JSON\n id: cache-json\n uses: actions/cache@v4\n with:\n path: lintcheck-logs/ci_crates_logs.json\n key: ${{ steps.key.outputs.key }}\n\n - name: Run lintcheck\n if: steps.cache-json.outputs.cache-hit != 'true'\n run: env CLIPPY_CONF_DIR="$PWD/lintcheck/ci-config" ./target/debug/lintcheck --format json --all-lints --crates-toml ./lintcheck/ci_crates.toml\n\n - name: Upload base 
JSON\n uses: actions/upload-artifact@v4\n with:\n name: base\n path: lintcheck-logs/ci_crates_logs.json\n\n # Runs lintcheck on the PR and stores the results as an artifact\n head:\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n - name: Cache lintcheck bin\n id: cache-lintcheck-bin\n uses: actions/cache@v4\n with:\n path: target/debug/lintcheck\n key: lintcheck-bin-${{ hashfiles('lintcheck/**') }}\n\n - name: Build lintcheck\n if: steps.cache-lintcheck-bin.outputs.cache-hit != 'true'\n run: cargo build --manifest-path=lintcheck/Cargo.toml\n\n - name: Run lintcheck\n run: env CLIPPY_CONF_DIR="$PWD/lintcheck/ci-config" ./target/debug/lintcheck --format json --all-lints --crates-toml ./lintcheck/ci_crates.toml\n\n - name: Upload head JSON\n uses: actions/upload-artifact@v4\n with:\n name: head\n path: lintcheck-logs/ci_crates_logs.json\n\n # Retrieves the head and base JSON results and prints the diff to the GH actions step summary\n diff:\n runs-on: ubuntu-latest\n\n needs: [base, head]\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n - name: Restore lintcheck bin\n uses: actions/cache/restore@v4\n with:\n path: target/debug/lintcheck\n key: lintcheck-bin-${{ hashfiles('lintcheck/**') }}\n fail-on-cache-miss: true\n\n - name: Download JSON\n uses: actions/download-artifact@v4\n\n - name: Diff results\n # GH's summary has a maximum size of 1024k:\n # https://docs.github.com/actions/using-workflows/workflow-commands-for-github-actions#adding-a-markdown-summary\n # That's why we first log to file and then to the summary and logs\n run: |\n ./target/debug/lintcheck diff {base,head}/ci_crates_logs.json --truncate >> truncated_diff.md\n head -c 1024000 truncated_diff.md >> 
$GITHUB_STEP_SUMMARY\n cat truncated_diff.md\n ./target/debug/lintcheck diff {base,head}/ci_crates_logs.json >> full_diff.md\n\n - name: Upload full diff\n uses: actions/upload-artifact@v4\n with:\n name: diff\n if-no-files-found: ignore\n path: |\n full_diff.md\n truncated_diff.md\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\lintcheck.yml | lintcheck.yml | YAML | 4,668 | 0.8 | 0.040541 | 0.114754 | react-lib | 853 | 2025-04-21T13:57:25.248019 | GPL-3.0 | false | caa77c80e6c248ee3ab35ea0fee1f1eb |
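The `diff` job above works around GitHub's step-summary size limit by writing the diff to a file first and appending only its first 1024000 bytes to `$GITHUB_STEP_SUMMARY` with `head -c`. The byte-capping idiom itself is easy to demonstrate; the filenames below are hypothetical stand-ins:

```shell
limit=1024000

# Generate ~2 MB of filler as a stand-in for an oversized diff file.
head -c 2000000 /dev/zero | tr '\0' 'x' > big_diff.md

# Keep only the first $limit bytes, as the workflow does for the summary.
head -c "$limit" big_diff.md > capped_diff.md
wc -c < capped_diff.md   # 1024000

rm -f big_diff.md capped_diff.md
```

Note that `head -c` cuts at a byte boundary, which can split a multi-byte character or a markdown construct; that is acceptable here because the full diff is also uploaded as an artifact.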
name: Remark\n\non:\n merge_group:\n pull_request:\n\njobs:\n remark:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make it so that any malicious package could get our GitHub Token\n persist-credentials: false\n\n - name: Setup Node.js\n uses: actions/setup-node@v4\n with:\n node-version: '18.x'\n\n - name: Install remark\n run: npm install remark-cli remark-lint remark-lint-maximum-line-length@^3.1.3 remark-preset-lint-recommended remark-gfm\n\n - name: Install mdbook\n run: |\n mkdir mdbook\n curl -Lf https://github.com/rust-lang/mdBook/releases/download/v0.4.43/mdbook-v0.4.43-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=./mdbook\n echo `pwd`/mdbook >> $GITHUB_PATH\n\n # Run\n - name: Check *.md files\n run: ./node_modules/.bin/remark -u lint -f .\n\n - name: Linkcheck book\n run: |\n rustup toolchain install nightly --component rust-docs\n rustup override set nightly\n curl https://raw.githubusercontent.com/rust-lang/rust/master/src/tools/linkchecker/linkcheck.sh -o linkcheck.sh\n sh linkcheck.sh clippy --path ./book\n\n - name: Build mdbook\n run: mdbook build book\n\n conclusion_remark:\n needs: [ remark ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job still does not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. 
`if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust\src\tools\clippy\.github\workflows\remark.yml | remark.yml | YAML | 2,167 | 0.8 | 0.078125 | 0.222222 | react-lib | 881 | 2024-05-24T22:00:17.137984 | BSD-3-Clause | false | aebaa38d4033346ae30c55601fb1ed47 |
image: ubuntu:latest\n\ntasks:\n - before: echo "..."\n init: |\n cargo install rustup-toolchain-install-master\n ./miri toolchain\n ./miri build\n command: echo "Run tests with ./miri test"\n | dataset_sample\yaml\rust-lang_rust\src\tools\miri\.gitpod.yml | .gitpod.yml | YAML | 205 | 0.7 | 0 | 0 | awesome-app | 257 | 2024-06-18T22:08:39.377979 | MIT | false | 94b9792587d0a2b47db68b1c0eb434f2 |
name: CI\n\non:\n merge_group:\n pull_request:\n branches:\n - 'master'\n schedule:\n - cron: '44 4 * * *' # At 4:44 UTC every day.\n\ndefaults:\n run:\n shell: bash\n\njobs:\n build:\n strategy:\n fail-fast: false\n matrix:\n include:\n - os: ubuntu-latest\n host_target: x86_64-unknown-linux-gnu\n - os: macos-14\n host_target: aarch64-apple-darwin\n - os: windows-latest\n host_target: i686-pc-windows-msvc\n runs-on: ${{ matrix.os }}\n env:\n HOST_TARGET: ${{ matrix.host_target }}\n steps:\n - uses: actions/checkout@v4\n - uses: ./.github/workflows/setup\n with:\n toolchain_flags: "--host ${{ matrix.host_target }}"\n\n # The `style` job only runs on Linux; this makes sure the Windows-host-specific\n # code is also covered by clippy.\n - name: Check clippy\n if: ${{ matrix.os == 'windows-latest' }}\n run: ./miri clippy -- -D warnings\n\n - name: Test Miri\n run: ./ci/ci.sh\n\n style:\n name: style checks\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: ./.github/workflows/setup\n\n - name: rustfmt\n run: ./miri fmt --check\n - name: clippy\n run: ./miri clippy -- -D warnings\n - name: clippy (no features)\n run: ./miri clippy --no-default-features -- -D warnings\n - name: clippy (all features)\n run: ./miri clippy --all-features -- -D warnings\n - name: rustdoc\n run: RUSTDOCFLAGS="-Dwarnings" ./miri doc --document-private-items\n\n coverage:\n name: coverage report\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: ./.github/workflows/setup\n - name: coverage\n run: ./miri test --coverage\n\n # Summary job for the merge queue.\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n # And they should be added below in `cron-fail-notify` as well.\n conclusion:\n needs: [build, style, coverage]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. 
We use `!cancelled()` to ensure the job still does not get run\n # when the workflow is canceled manually.\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n\n cron-fail-notify:\n name: cronjob failure notification\n runs-on: ubuntu-latest\n permissions:\n # The cronjob needs to be able to push to the repo...\n contents: write\n # ... and create a PR.\n pull-requests: write\n needs: [build, style, coverage]\n if: ${{ github.event_name == 'schedule' && failure() }}\n steps:\n # Send a Zulip notification\n - name: Install zulip-send\n run: pip3 install zulip\n - name: Send Zulip notification\n env:\n ZULIP_BOT_EMAIL: ${{ secrets.ZULIP_BOT_EMAIL }}\n ZULIP_API_TOKEN: ${{ secrets.ZULIP_API_TOKEN }}\n run: |\n ~/.local/bin/zulip-send --user $ZULIP_BOT_EMAIL --api-key $ZULIP_API_TOKEN --site https://rust-lang.zulipchat.com \\n --stream miri --subject "Miri Build Failure ($(date -u +%Y-%m))" \\n --message 'Dear @*T-miri*,\n\n It would appear that the [Miri cron job build]('"https://github.com/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID"') failed.\n\n This likely means that rustc changed the miri directory and\n we now need to do a [`./miri rustc-pull`](https://github.com/rust-lang/miri/blob/master/CONTRIBUTING.md#importing-changes-from-the-rustc-repo).\n\n Would you mind investigating this issue?\n\n Thanks in advance!\n Sincerely,\n The Miri Cronjobs Bot'\n\n # Attempt to auto-sync with rustc\n - uses: actions/checkout@v4\n with:\n fetch-depth: 256 # get a bit more of the history\n - name: install josh-proxy\n run: cargo +stable install josh-proxy --git https://github.com/josh-project/josh --tag r24.10.04\n - 
name: setup bot git name and email\n run: |\n git config --global user.name 'The Miri Cronjob Bot'\n git config --global user.email 'miri@cron.bot'\n - name: Install nightly toolchain\n run: rustup toolchain install nightly --profile minimal\n - name: get changes from rustc\n run: ./miri rustc-pull\n - name: Install rustup-toolchain-install-master\n run: cargo install -f rustup-toolchain-install-master\n - name: format changes (if any)\n run: |\n ./miri toolchain\n ./miri fmt --check || (./miri fmt && git commit -am "fmt")\n - name: Push changes to a branch\n run: |\n BRANCH="rustup-$(date -u +%Y-%m-%d)"\n git switch -c $BRANCH\n git push -u origin $BRANCH\n - name: Create Pull Request\n run: |\n PR=$(gh pr create -B master --title 'Automatic Rustup' --body 'Please close and re-open this PR to trigger CI, then enable auto-merge.')\n ~/.local/bin/zulip-send --user $ZULIP_BOT_EMAIL --api-key $ZULIP_API_TOKEN --site https://rust-lang.zulipchat.com \\n --stream miri --subject "Miri Build Failure ($(date -u +%Y-%m))" \\n --message "A PR doing a rustc-pull [has been automatically created]($PR) for your convenience."\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n ZULIP_BOT_EMAIL: ${{ secrets.ZULIP_BOT_EMAIL }}\n ZULIP_API_TOKEN: ${{ secrets.ZULIP_API_TOKEN }}\n | dataset_sample\yaml\rust-lang_rust\src\tools\miri\.github\workflows\ci.yml | ci.yml | YAML | 5,924 | 0.95 | 0.06875 | 0.110345 | vue-tools | 628 | 2025-02-03T08:33:03.199469 | MIT | false | f5d34467479ad6ba4f85d0c46f4a5c1f |
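The cron job above names its rustup-pull branch with `BRANCH="rustup-$(date -u +%Y-%m-%d)"`. Using the UTC date keeps the name independent of the runner's timezone, so at most one such branch is created per day regardless of where the job runs. A minimal sketch of the naming scheme:

```shell
# Date-stamped branch name, as used by the Miri rustup-pull cron job.
branch="rustup-$(date -u +%Y-%m-%d)"
echo "$branch"   # e.g. rustup-2024-06-18

# Sanity-check the shape: "rustup-" followed by an ISO date.
grep -Eq '^rustup-[0-9]{4}-[0-9]{2}-[0-9]{2}$' <<< "$branch" && echo "well-formed"
```

A side effect worth noting: if the cron job fails twice on the same UTC day, the second run's `git push -u origin $BRANCH` targets the same branch name, which prevents duplicate daily PRs.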
name: Tier 2 sysroots\n\non:\n schedule:\n - cron: '44 4 * * *' # At 4:44 UTC every day.\n\ndefaults:\n run:\n shell: bash\n\njobs:\n sysroots:\n name: Build the sysroots\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Build the sysroots\n run: |\n rustup toolchain install nightly\n cargo install -f rustup-toolchain-install-master\n ./miri toolchain -c rust-docs # Docs are the only place targets are separated by tier\n ./miri install\n python3 -m pip install beautifulsoup4\n ./ci/build-all-targets.sh\n - name: Upload build errors\n # We don't want to skip this step on failure\n if: always()\n uses: actions/upload-artifact@v4\n with:\n name: failures\n path: failures.tar.gz\n\n sysroots-cron-fail-notify:\n name: sysroots cronjob failure notification\n runs-on: ubuntu-latest\n needs: [sysroots]\n if: failure() || cancelled()\n steps:\n # Download our build error logs\n - name: Download build errors\n uses: actions/download-artifact@v4\n with:\n name: failures\n # Send a Zulip notification\n - name: Install zulip-send\n run: pip3 install zulip\n - name: Send Zulip notification\n env:\n ZULIP_BOT_EMAIL: ${{ secrets.ZULIP_BOT_EMAIL }}\n ZULIP_API_TOKEN: ${{ secrets.ZULIP_API_TOKEN }}\n run: |\n tar xf failures.tar.gz\n ls failures\n ~/.local/bin/zulip-send --user $ZULIP_BOT_EMAIL --api-key $ZULIP_API_TOKEN --site https://rust-lang.zulipchat.com \\n --stream miri --subject "Sysroot Build Errors ($(date -u +%Y-%m))" \\n --message 'It would appear that the [Miri sysroots cron job build]('"https://github.com/$GITHUB_REPOSITORY/actions/runs/$GITHUB_RUN_ID"') failed to build these targets:\n '"$(ls failures)"'\n\n Would you mind investigating this issue?\n\n Thanks in advance!\n Sincerely,\n The Miri Cronjobs Bot'\n | dataset_sample\yaml\rust-lang_rust\src\tools\miri\.github\workflows\sysroots.yml | sysroots.yml | YAML | 2,034 | 0.8 | 0.031746 | 0.052632 | awesome-app | 528 | 2025-04-01T21:11:11.050357 | GPL-3.0 | false | 
c058fb49482c93a38ea050b07941baf5 |
name: "Miri CI setup"\ndescription: "Sets up Miri CI"\ninputs:\n toolchain_flags:\n required: false\n default: ''\nruns:\n using: "composite"\n steps:\n - name: Show Rust version (stable toolchain)\n run: |\n rustup show\n rustc -Vv\n cargo -V\n shell: bash\n\n # Cache the global cargo directory, but NOT the local `target` directory which\n # we cannot reuse anyway when the nightly changes (and it grows quite large\n # over time).\n - name: Add cache for cargo\n id: cache\n uses: actions/cache@v4\n with:\n path: |\n # Taken from <https://doc.rust-lang.org/nightly/cargo/guide/cargo-home.html#caching-the-cargo-home-in-ci>.\n # Cache package/registry information\n ~/.cargo/registry/index\n ~/.cargo/registry/cache\n ~/.cargo/git/db\n # Cache installed binaries\n ~/.cargo/bin\n ~/.cargo/.crates.toml\n ~/.cargo/.crates2.json\n key: cargo-${{ runner.os }}-${{ hashFiles('**/Cargo.lock', '.github/workflows/**/*.yml') }}\n restore-keys: cargo-${{ runner.os }}\n\n - name: Install rustup-toolchain-install-master\n if: steps.cache.outputs.cache-hit != 'true'\n run: cargo install -f rustup-toolchain-install-master hyperfine\n shell: bash\n\n - name: Install nightly toolchain\n run: rustup toolchain install nightly --profile minimal\n shell: bash\n\n - name: Install "master" toolchain\n run: |\n if [[ ${{ github.event_name }} == 'schedule' ]]; then\n echo "Building against latest rustc git version"\n git ls-remote https://github.com/rust-lang/rust/ HEAD | cut -f 1 > rust-version\n fi\n ./miri toolchain ${{ inputs.toolchain_flags }}\n shell: bash\n\n - name: Show Rust version (miri toolchain)\n run: |\n rustup show\n rustc -Vv\n cargo -V\n shell: bash\n | dataset_sample\yaml\rust-lang_rust\src\tools\miri\.github\workflows\setup\action.yml | action.yml | YAML | 1,986 | 0.95 | 0.05 | 0.109091 | react-lib | 678 | 2024-09-03T00:17:09.776855 | BSD-3-Clause | false | 2a80771408fb7298d7989769c11ba627 |
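On scheduled runs, the setup action above resolves the latest rustc commit with `git ls-remote https://github.com/rust-lang/rust/ HEAD | cut -f 1`. `git ls-remote` prints one `<sha>\t<ref>` pair per line, and `cut`'s default delimiter is a tab, so `-f 1` keeps just the hash. A sketch of that extraction, with a hypothetical sample line standing in for real `ls-remote` output:

```shell
# Stand-in for one line of `git ls-remote` output: "<sha>\t<ref>".
sample=$(printf '1234abcd5678ef901234abcd5678ef901234abcd\tHEAD')

# cut splits on tabs by default; field 1 is the commit hash.
cut -f 1 <<< "$sample"   # 1234abcd5678ef901234abcd5678ef901234abcd
```

The resulting hash is written to the `rust-version` file, which `./miri toolchain` then uses to install that exact rustc master commit.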
name: 'wasmtime github releases'\ndescription: 'wasmtime github releases'\ninputs:\n token:\n description: ''\n required: true\n name:\n description: ''\n required: true\n files:\n description: ''\n required: true\nruns:\n using: 'docker'\n image: 'Dockerfile'\n | dataset_sample\yaml\rust-lang_rust\src\tools\rust-analyzer\.github\actions\github-release\action.yml | action.yml | YAML | 270 | 0.85 | 0 | 0 | react-lib | 522 | 2025-05-20T12:37:20.616809 | Apache-2.0 | false | 32d269cf6b2d86480951130aa97deb9e |
name: Fuzz\non:\n schedule:\n # Once a week\n - cron: '0 0 * * 0'\n push:\n paths:\n - '.github/workflows/fuzz.yml'\n # Allow manual trigger\n workflow_dispatch:\n\nenv:\n CARGO_INCREMENTAL: 0\n CARGO_NET_RETRY: 10\n CI: 1\n RUST_BACKTRACE: short\n RUSTFLAGS: "-D warnings -W unreachable-pub -W bare-trait-objects"\n RUSTUP_MAX_RETRIES: 10\n\njobs:\n rust:\n if: ${{ github.repository == 'rust-lang/rust-analyzer' || github.event_name == 'workflow_dispatch' }}\n name: Rust\n runs-on: ubuntu-latest\n env:\n CC: deny_c\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n ref: ${{ github.event.pull_request.head.sha }}\n fetch-depth: 1\n\n - name: Install Rust toolchain\n run: |\n rustup install --profile minimal nightly\n\n - name: Build fuzzers\n run: |\n cargo install cargo-fuzz\n cd crates/syntax\n cargo +nightly fuzz build\n | dataset_sample\yaml\rust-lang_rust\src\tools\rust-analyzer\.github\workflows\fuzz.yml | fuzz.yml | YAML | 956 | 0.8 | 0.023256 | 0.052632 | awesome-app | 580 | 2023-10-16T06:16:05.848777 | GPL-3.0 | false | c638c6a5a236622b5c234ce6d18d85be |
name: Diff Check\non:\n workflow_dispatch:\n inputs:\n clone_url:\n description: 'Git url of a rustfmt fork to compare against the latest master rustfmt'\n required: true\n branch_name:\n description: 'Name of the feature branch on the forked repo'\n required: true\n commit_hash:\n description: 'Optional commit hash from the feature branch'\n required: false\n rustfmt_configs:\n description: 'Optional comma separated list of rustfmt config options to pass when running the feature branch'\n required: false\n\njobs:\n diff_check:\n runs-on: ubuntu-latest\n\n steps:\n - name: checkout\n uses: actions/checkout@v4\n\n - name: install rustup\n run: |\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > rustup-init.sh\n sh rustup-init.sh -y --default-toolchain none\n rustup target add x86_64-unknown-linux-gnu\n\n - name: check diff\n run: bash ${GITHUB_WORKSPACE}/ci/check_diff.sh ${{ github.event.inputs.clone_url }} ${{ github.event.inputs.branch_name }} ${{ github.event.inputs.commit_hash || github.event.inputs.branch_name }} ${{ github.event.inputs.rustfmt_configs }}\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\check_diff.yml | check_diff.yml | YAML | 1,188 | 0.95 | 0 | 0 | awesome-app | 713 | 2024-03-29T05:45:43.336476 | MIT | false | 57109e82aa81d7b0280f673a75934a24 |
name: integration\non:\n push:\n branches:\n - master\n pull_request:\n\njobs:\n integration-tests:\n runs-on: ubuntu-latest\n name: ${{ matrix.integration }}\n strategy:\n # https://help.github.com/en/actions/getting-started-with-github-actions/about-github-actions#usage-limits\n # There's a limit of 60 concurrent jobs across all repos in the rust-lang organization.\n # In order to prevent overusing too much of that 60 limit, we throttle the\n # number of rustfmt jobs that will run concurrently.\n max-parallel: 4\n fail-fast: false\n matrix:\n integration: [\n bitflags,\n log,\n mdbook,\n packed_simd,\n rust-semverver,\n tempdir,\n futures-rs,\n rust-clippy,\n ]\n include:\n # Allowed Failures\n # Actions doesn't yet support explicitly marking matrix legs as allowed failures\n # https://github.community/t5/GitHub-Actions/continue-on-error-allow-failure-UI-indication/td-p/37033\n # https://github.community/t5/GitHub-Actions/Why-a-matrix-step-will-be-canceled-if-another-one-failed/td-p/30920\n # Instead, leverage `continue-on-error`\n # https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions#jobsjob_idstepscontinue-on-error\n #\n # Failing due to breaking changes in rustfmt 2.0 where empty\n # match blocks have trailing commas removed\n # https://github.com/rust-lang/rustfmt/pull/4226\n - integration: chalk\n allow-failure: true\n - integration: crater\n allow-failure: true\n - integration: glob\n allow-failure: true\n - integration: stdsimd\n allow-failure: true\n # Using old rustfmt configuration option\n - integration: rand\n allow-failure: true\n # Keep this as an allowed failure as it's fragile to breaking changes of rustc.\n - integration: rust-clippy\n allow-failure: true\n # Using old rustfmt configuration option\n - integration: packed_simd\n allow-failure: true\n # calebcartwright (2019-12-24)\n # Keeping this as an allowed failure since it was flagged as such in the TravisCI config, even though\n 
# it appears to have been passing for quite some time.\n # Original comment was: temporal build failure due to breaking changes in the nightly compiler\n - integration: rust-semverver\n allow-failure: true\n\n steps:\n - name: checkout\n uses: actions/checkout@v4\n\n # Run build\n - name: install rustup\n run: |\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > rustup-init.sh\n sh rustup-init.sh -y --default-toolchain none\n\n - name: run integration tests\n env:\n INTEGRATION: ${{ matrix.integration }}\n TARGET: x86_64-unknown-linux-gnu\n run: ./ci/integration.sh\n continue-on-error: ${{ matrix.allow-failure == true }}\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\integration.yml | integration.yml | YAML | 3,091 | 0.8 | 0.0375 | 0.289474 | vue-tools | 232 | 2024-03-30T14:04:19.451349 | BSD-3-Clause | false | 598311159b3fec1e069e68f2170e2306 |
name: linux\non:\n push:\n branches:\n - master\n pull_request:\n\njobs:\n test:\n runs-on: ubuntu-latest\n name: (${{ matrix.target }}, ${{ matrix.cfg_release_channel }})\n env:\n CFG_RELEASE_CHANNEL: ${{ matrix.cfg_release_channel }}\n strategy:\n # https://help.github.com/en/actions/getting-started-with-github-actions/about-github-actions#usage-limits\n # There's a limit of 60 concurrent jobs across all repos in the rust-lang organization.\n # In order to prevent overusing too much of that 60 limit, we throttle the\n # number of rustfmt jobs that will run concurrently.\n max-parallel: 1\n fail-fast: false\n matrix:\n target: [\n x86_64-unknown-linux-gnu,\n ]\n cfg_release_channel: [nightly, stable]\n\n steps:\n - name: checkout\n uses: actions/checkout@v4\n\n # Run build\n - name: install rustup\n run: |\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > rustup-init.sh\n sh rustup-init.sh -y --default-toolchain none\n rustup target add ${{ matrix.target }}\n\n - name: Build and Test\n run: ./ci/build_and_test.sh\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\linux.yml | linux.yml | YAML | 1,150 | 0.8 | 0 | 0.142857 | node-utils | 541 | 2023-11-22T23:57:18.242478 | MIT | false | 9c4aa2ff5447cb7ecbe29c3a4ab8dbc7 |
name: mac\non:\n push:\n branches:\n - master\n pull_request:\n\njobs:\n test:\n # https://help.github.com/en/actions/automating-your-workflow-with-github-actions/virtual-environments-for-github-hosted-runners#supported-runners-and-hardware-resources\n runs-on: macos-latest\n name: (${{ matrix.target }}, ${{ matrix.cfg_release_channel }})\n env:\n CFG_RELEASE_CHANNEL: ${{ matrix.cfg_release_channel }}\n strategy:\n fail-fast: false\n matrix:\n target: [\n x86_64-apple-darwin,\n ]\n cfg_release_channel: [nightly, stable]\n\n steps:\n - name: checkout\n uses: actions/checkout@v4\n\n # Run build\n - name: install rustup\n run: |\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > rustup-init.sh\n sh rustup-init.sh -y --default-toolchain none\n rustup target add ${{ matrix.target }}\n\n - name: Build and Test\n run: ./ci/build_and_test.sh\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\mac.yml | mac.yml | YAML | 947 | 0.8 | 0.028571 | 0.064516 | react-lib | 446 | 2023-08-24T13:20:48.117735 | MIT | false | 02e2e76497807e00d3f1be95953f3850 |
name: rustdoc check\non:\n push:\n branches:\n - master\n pull_request:\n\njobs:\n rustdoc_check:\n runs-on: ubuntu-latest\n name: rustdoc check\n steps:\n - name: checkout\n uses: actions/checkout@v4\n\n - name: install rustup\n run: |\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > rustup-init.sh\n sh rustup-init.sh -y --default-toolchain none\n rustup target add x86_64-unknown-linux-gnu\n\n - name: document rustfmt\n env:\n RUSTDOCFLAGS: --document-private-items --enable-index-page --show-type-layout --generate-link-to-definition -Zunstable-options -Dwarnings\n run: cargo doc -Zskip-rustdoc-fingerprint --no-deps -p rustfmt-nightly -p rustfmt-config_proc_macro\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\rustdoc_check.yml | rustdoc_check.yml | YAML | 738 | 0.8 | 0 | 0 | vue-tools | 35 | 2024-03-18T23:48:47.680265 | Apache-2.0 | false | d3ace331f3f7cf7bba05b60572270940 |
name: upload\n\non:\n push:\n release:\n types: [created]\n workflow_dispatch:\n\njobs:\n build-release:\n name: build-release\n strategy:\n matrix:\n build: [linux-x86_64, macos-x86_64, windows-x86_64-gnu, windows-x86_64-msvc]\n include:\n - build: linux-x86_64\n os: ubuntu-latest\n rust: nightly\n target: x86_64-unknown-linux-gnu\n - build: macos-x86_64\n os: macos-latest\n rust: nightly\n target: x86_64-apple-darwin\n - build: windows-x86_64-gnu\n os: windows-latest\n rust: nightly-x86_64-gnu\n target: x86_64-pc-windows-gnu\n - build: windows-x86_64-msvc\n os: windows-latest\n rust: nightly-x86_64-msvc\n target: x86_64-pc-windows-msvc\n runs-on: ${{ matrix.os }}\n steps:\n - uses: actions/checkout@v4\n\n # Run build\n - name: install rustup\n run: |\n curl --proto '=https' --tlsv1.2 -sSf https://sh.rustup.rs > rustup-init.sh\n sh rustup-init.sh -y --default-toolchain none\n rustup target add ${{ matrix.target }}\n\n - name: Add mingw64 to path for x86_64-gnu\n run: echo "C:\msys64\mingw64\bin" >> $GITHUB_PATH\n if: matrix.rust == 'nightly-x86_64-gnu'\n shell: bash\n\n - name: Build release binaries\n run: cargo build --release\n\n - name: Build archive\n shell: bash\n run: |\n staging="rustfmt_${{ matrix.build }}_${{ github.event.release.tag_name }}"\n mkdir -p "$staging"\n\n cp {README.md,Configurations.md,CHANGELOG.md,LICENSE-MIT,LICENSE-APACHE} "$staging/"\n\n if [ "${{ matrix.os }}" = "windows-latest" ]; then\n cp target/release/{rustfmt.exe,cargo-fmt.exe,rustfmt-format-diff.exe,git-rustfmt.exe} "$staging/"\n 7z a "$staging.zip" "$staging"\n echo "ASSET=$staging.zip" >> $GITHUB_ENV\n else\n cp target/release/{rustfmt,cargo-fmt,rustfmt-format-diff,git-rustfmt} "$staging/"\n tar czf "$staging.tar.gz" "$staging"\n echo "ASSET=$staging.tar.gz" >> $GITHUB_ENV\n fi\n\n - name: Upload Release Asset\n if: github.event_name == 'release'\n uses: actions/upload-release-asset@v1\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n with:\n upload_url: ${{ github.event.release.upload_url }}\n asset_path: ${{ env.ASSET }}\n asset_name: ${{ env.ASSET }}\n asset_content_type: application/octet-stream\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\upload-assets.yml | upload-assets.yml | YAML | 2,563 | 0.8 | 0.051282 | 0.014493 | python-kit | 547 | 2023-08-25T18:36:23.943853 | Apache-2.0 | false | 2fb5a959c8b3e550145ce9d40f63b59e |
name: windows\non:\n push:\n branches:\n - master\n pull_request:\n\njobs:\n test:\n runs-on: windows-latest\n name: (${{ matrix.target }}, ${{ matrix.cfg_release_channel }})\n env:\n CFG_RELEASE_CHANNEL: ${{ matrix.cfg_release_channel }}\n strategy:\n # https://help.github.com/en/actions/getting-started-with-github-actions/about-github-actions#usage-limits\n # There's a limit of 60 concurrent jobs across all repos in the rust-lang organization.\n # In order to prevent overusing too much of that 60 limit, we throttle the\n # number of rustfmt jobs that will run concurrently.\n max-parallel: 2\n fail-fast: false\n matrix:\n target: [\n i686-pc-windows-gnu,\n i686-pc-windows-msvc,\n x86_64-pc-windows-gnu,\n x86_64-pc-windows-msvc,\n ]\n cfg_release_channel: [nightly, stable]\n\n steps:\n # The Windows runners have autocrlf enabled by default\n # which causes failures for some of rustfmt's line-ending sensitive tests\n - name: disable git eol translation\n run: git config --global core.autocrlf false\n - name: checkout\n uses: actions/checkout@v4\n\n # Run build\n - name: Install Rustup using win.rustup.rs\n run: |\n # Disable the download progress bar which can cause perf issues\n $ProgressPreference = "SilentlyContinue"\n Invoke-WebRequest https://win.rustup.rs/ -OutFile rustup-init.exe\n .\rustup-init.exe -y --default-host=x86_64-pc-windows-msvc --default-toolchain=none\n del rustup-init.exe\n rustup target add ${{ matrix.target }}\n shell: powershell\n\n - name: Add mingw32 to path for i686-gnu\n run: |\n echo "C:\msys64\mingw32\bin" >> $GITHUB_PATH\n if: matrix.target == 'i686-pc-windows-gnu' && matrix.channel == 'nightly'\n shell: bash\n\n - name: Add mingw64 to path for x86_64-gnu\n run: echo "C:\msys64\mingw64\bin" >> $GITHUB_PATH\n if: matrix.target == 'x86_64-pc-windows-gnu' && matrix.channel == 'nightly'\n shell: bash\n\n - name: Build and Test\n shell: cmd\n run: ci\build_and_test.bat\n | dataset_sample\yaml\rust-lang_rust\src\tools\rustfmt\.github\workflows\windows.yml | windows.yml | YAML | 2,127 | 0.8 | 0.080645 | 0.142857 | python-kit | 866 | 2023-12-04T01:29:52.659834 | Apache-2.0 | false | 45e4862a69df04b819f31c0ff91ed8d5 |
name: 'wasmtime github releases'\ndescription: 'wasmtime github releases'\ninputs:\n token:\n description: ''\n required: true\n name:\n description: ''\n required: true\n files:\n description: ''\n required: true\nruns:\n using: 'docker'\n image: 'Dockerfile'\n | dataset_sample\yaml\rust-lang_rust-analyzer\.github\actions\github-release\action.yml | action.yml | YAML | 270 | 0.85 | 0 | 0 | node-utils | 482 | 2024-12-17T16:35:12.342413 | Apache-2.0 | false | 32d269cf6b2d86480951130aa97deb9e |
name: Fuzz\non:\n schedule:\n # Once a week\n - cron: '0 0 * * 0'\n push:\n paths:\n - '.github/workflows/fuzz.yml'\n # Allow manual trigger\n workflow_dispatch:\n\nenv:\n CARGO_INCREMENTAL: 0\n CARGO_NET_RETRY: 10\n CI: 1\n RUST_BACKTRACE: short\n RUSTFLAGS: "-D warnings -W unreachable-pub -W bare-trait-objects"\n RUSTUP_MAX_RETRIES: 10\n\njobs:\n rust:\n if: ${{ github.repository == 'rust-lang/rust-analyzer' || github.event_name == 'workflow_dispatch' }}\n name: Rust\n runs-on: ubuntu-latest\n env:\n CC: deny_c\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n ref: ${{ github.event.pull_request.head.sha }}\n fetch-depth: 1\n\n - name: Install Rust toolchain\n run: |\n rustup install --profile minimal nightly\n\n - name: Build fuzzers\n run: |\n cargo install cargo-fuzz\n cd crates/syntax\n cargo +nightly fuzz build\n | dataset_sample\yaml\rust-lang_rust-analyzer\.github\workflows\fuzz.yml | fuzz.yml | YAML | 956 | 0.8 | 0.023256 | 0.052632 | react-lib | 580 | 2024-01-10T10:12:40.173388 | GPL-3.0 | false | c638c6a5a236622b5c234ce6d18d85be |
name: Blank Issue\ndescription: Create a blank issue.\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing an issue!\n - type: textarea\n id: problem\n attributes:\n label: Description\n description: >\n Please provide a description of the issue, along with any information\n you feel relevant to replicate it.\n validations:\n required: true\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: labels\n attributes:\n label: Additional Labels\n description: >\n Additional labels can be added to this issue by including the following\n command\n placeholder: |\n @rustbot label +<label>\n\n Common labels for this issue type are:\n * C-an-interesting-project\n * C-enhancement\n * C-question\n * C-tracking-issue\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\blank_issue.yml | blank_issue.yml | YAML | 1,226 | 0.85 | 0.045455 | 0.093023 | react-lib | 447 | 2023-12-07T11:49:36.390324 | MIT | false | 5eea6ed6bf92186f5e6db27a437f6d07 |
name: Bug Report\ndescription: Create a bug report for Clippy\nlabels: ["C-bug"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing a bug report! 🐛\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel relevant to replicate the bug.\n validations:\n required: true\n - type: textarea\n id: reproducer\n attributes:\n label: Reproducer\n description: Please provide the code and steps to reproduce the bug\n value: |\n I tried this code:\n\n ```rust\n <code>\n ```\n\n I expected to see this happen:\n\n Instead, this happened:\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: labels\n attributes:\n label: Additional Labels\n description: >\n Additional labels can be added to this issue by including the following\n command\n placeholder: |\n @rustbot label +<label>\n\n Common labels for this issue type are:\n * `I-suggestion-causes-error`\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 1,504 | 0.85 | 0.052632 | 0.018868 | awesome-app | 215 | 2024-03-31T12:05:11.844014 | BSD-3-Clause | false | 2b6f795a563b363ed821556dcbcb5f79 |
blank_issues_enabled: true\ncontact_links:\n - name: Rust Programming Language Forum\n url: https://users.rust-lang.org\n about: Please ask and answer questions about Rust here.\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 181 | 0.8 | 0 | 0 | node-utils | 333 | 2024-12-28T11:57:09.443392 | GPL-3.0 | false | f75886be4d74fc4879e4c3fb4eb91a89 |
name: Bug Report (False Negative)\ndescription: Create a bug report about missing warnings from a lint\nlabels: ["C-bug", "I-false-negative"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing a bug report! 🐛\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel relevant to replicate the bug.\n validations:\n required: true\n - type: input\n id: lint-name\n attributes:\n label: Lint Name\n description: Please provide the lint name.\n - type: textarea\n id: reproducer\n attributes:\n label: Reproducer\n description: Please provide the code and steps to reproduce the bug\n value: |\n I tried this code:\n\n ```rust\n <code>\n ```\n\n I expected to see this happen:\n\n Instead, this happened:\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\false_negative.yml | false_negative.yml | YAML | 1,350 | 0.85 | 0.02 | 0 | awesome-app | 843 | 2023-11-16T09:06:02.493757 | Apache-2.0 | false | e6a5817b77c846baf276de85f9647532 |
name: Bug Report (False Positive)\ndescription: Create a bug report about a wrongly emitted lint warning\nlabels: ["C-bug", "I-false-positive"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for filing a bug report! 🐛\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: >\n Please provide a short summary of the bug, along with any information\n you feel relevant to replicate the bug.\n validations:\n required: true\n - type: input\n id: lint-name\n attributes:\n label: Lint Name\n description: Please provide the lint name.\n - type: textarea\n id: reproducer\n attributes:\n label: Reproducer\n description: >\n Please provide the code and steps to reproduce the bug together with the\n output from Clippy.\n value: |\n I tried this code:\n\n ```rust\n <code>\n ```\n\n I saw this happen:\n\n ```\n <output>\n ```\n\n I expected to see this happen:\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: labels\n attributes:\n label: Additional Labels\n description: >\n Additional labels can be added to this issue by including the following\n command\n placeholder: |\n @rustbot label +<label>\n\n Common labels for this issue type are:\n * `I-suggestion-causes-error`\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\false_positive.yml | false_positive.yml | YAML | 1,782 | 0.85 | 0.029412 | 0.015873 | python-kit | 225 | 2023-07-30T18:20:59.061790 | BSD-3-Clause | false | 5faf10accbc0afed30dff11e00fe27ae |
name: Internal Compiler Error\ndescription: Create a report for an internal compiler error (ICE) in Clippy.\nlabels: ["C-bug", "I-ICE"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for finding an Internal Compiler Error! 🧊\n - type: textarea\n id: problem\n attributes:\n label: Summary\n description: |\n If possible, try to provide a minimal verifiable example. You can read ["Rust Bug Minimization Patterns"][mve] for how to create smaller examples. Otherwise, provide the crate where the ICE occurred.\n\n [mve]: http://blog.pnkfx.org/blog/2019/11/18/rust-bug-minimization-patterns/\n validations:\n required: true\n - type: textarea\n id: version\n attributes:\n label: Version\n description: "Rust version (`rustc -Vv`)"\n placeholder: |\n rustc 1.46.0-nightly (f455e46ea 2020-06-20)\n binary: rustc\n commit-hash: f455e46eae1a227d735091091144601b467e1565\n commit-date: 2020-06-20\n host: x86_64-unknown-linux-gnu\n release: 1.46.0-nightly\n LLVM version: 10.0\n render: text\n - type: textarea\n id: error\n attributes:\n label: Error output\n description: >\n Include a backtrace in the code block by setting `RUST_BACKTRACE=1` in\n your environment. E.g. `RUST_BACKTRACE=1 cargo clippy`.\n value: |\n <details><summary>Backtrace</summary>\n <p>\n\n ```\n <backtrace>\n ```\n\n </p>\n </details>\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\ice.yml | ice.yml | YAML | 1,494 | 0.95 | 0.083333 | 0 | vue-tools | 708 | 2024-09-09T01:02:44.415979 | BSD-3-Clause | false | 2582343b97d8346d729a75228f83bb37 |
name: New lint suggestion\ndescription: Suggest a new Clippy lint.\nlabels: ["A-lint"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for your lint idea!\n - type: textarea\n id: what\n attributes:\n label: What it does\n description: What does this lint do?\n validations:\n required: true\n - type: textarea\n id: advantage\n attributes:\n label: Advantage\n description: >\n What is the advantage of the recommended code over the original code?\n placeholder: |\n - Remove bounds check inserted by ...\n - Remove the need to duplicate/store ...\n - Remove typo ...\n - type: textarea\n id: drawbacks\n attributes:\n label: Drawbacks\n description: What might be possible drawbacks of such a lint?\n - type: textarea\n id: example\n attributes:\n label: Example\n description: >\n Include a short example showing when the lint should trigger together\n with the improved code.\n value: |\n ```rust\n <code>\n ```\n\n Could be written as:\n\n ```rust\n <code>\n ```\n validations:\n required: true\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\ISSUE_TEMPLATE\new_lint.yml | new_lint.yml | YAML | 1,158 | 0.85 | 0.020833 | 0 | vue-tools | 961 | 2025-02-14T04:43:03.419894 | GPL-3.0 | false | 57a1ff48a772a5bfaf866faa4bb9aed6 |
name: Clippy changelog check\n\non:\n merge_group:\n pull_request:\n types: [opened, reopened, synchronize, edited]\n\nconcurrency:\n # For a given workflow, if we push to the same PR, cancel all previous builds on that PR.\n # If the push is not attached to a PR, we will cancel all builds on the same branch.\n group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"\n cancel-in-progress: true\n\njobs:\n changelog:\n runs-on: ubuntu-latest\n\n defaults:\n run:\n shell: bash\n\n steps:\n # Run\n - name: Check Changelog\n if: ${{ github.event_name == 'pull_request' }}\n run: |\n body=$(curl -H "Authorization: token ${{ secrets.GITHUB_TOKEN }}" -s "https://api.github.com/repos/${{ github.repository }}/pulls/$PR_NUMBER" | \\n python -c "import sys, json; print(json.load(sys.stdin)['body'])")\n output=$(awk '/^changelog:\s*\S/ && !/changelog: \[.*\]: your change/' <<< "$body" | sed "s/changelog:\s*//g")\n if [ -z "$output" ]; then\n echo "ERROR: pull request message must contain 'changelog: ...' with your changelog. Please add it."\n exit 1\n else\n echo "changelog: $output"\n fi\n env:\n PYTHONIOENCODING: 'utf-8'\n PR_NUMBER: '${{ github.event.number }}'\n\n # We need to have the "conclusion" job also on PR CI, to make it possible\n # to add PRs to a merge queue.\n conclusion_changelog:\n needs: [ changelog ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\clippy_changelog.yml | clippy_changelog.yml | YAML | 2,301 | 0.95 | 0.135593 | 0.264151 | react-lib | 431 | 2025-01-21T21:55:24.624772 | MIT | false | 162a74503cde178f21309e4a03cbfb60 |
name: Clippy Dev Test\n\non:\n merge_group:\n pull_request:\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_INCREMENTAL: 0\n RUSTFLAGS: -D warnings\n\njobs:\n clippy_dev:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n # Run\n - name: Build\n run: cargo build\n working-directory: clippy_dev\n\n - name: Test update_lints\n run: cargo dev update_lints --check\n\n - name: Test fmt\n run: cargo dev fmt --check\n\n - name: Test cargo dev new lint\n env:\n RUSTFLAGS: -A unused-imports\n run: |\n cargo dev new_lint --name new_early_pass --pass early\n cargo dev new_lint --name new_late_pass --pass late\n cargo check\n git reset --hard HEAD\n\n conclusion_dev:\n needs: [ clippy_dev ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\clippy_dev.yml | clippy_dev.yml | YAML | 1,772 | 0.95 | 0.081967 | 0.230769 | awesome-app | 88 | 2023-07-19T19:04:47.852227 | Apache-2.0 | false | 8d3d3057879902e0585281bb0e5c62a9 |
name: Clippy Test (merge queue)\n\non:\n merge_group:\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_TARGET_DIR: '${{ github.workspace }}/target'\n NO_FMT_TEST: 1\n CARGO_INCREMENTAL: 0\n RUSTFLAGS: -D warnings\n\ndefaults:\n run:\n shell: bash\n\njobs:\n base:\n strategy:\n matrix:\n include:\n - os: ubuntu-latest\n host: x86_64-unknown-linux-gnu\n - os: ubuntu-latest\n host: i686-unknown-linux-gnu\n - os: windows-latest\n host: x86_64-pc-windows-msvc\n - os: macos-13\n host: x86_64-apple-darwin\n - os: macos-latest\n host: aarch64-apple-darwin\n\n runs-on: ${{ matrix.os }}\n\n # NOTE: If you modify this job, make sure you copy the changes to clippy.yml\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install i686 dependencies\n if: matrix.host == 'i686-unknown-linux-gnu'\n run: |\n sudo dpkg --add-architecture i386\n sudo apt-get update\n sudo apt-get install gcc-multilib zlib1g-dev:i386\n\n - name: Install toolchain\n run: |\n rustup set default-host ${{ matrix.host }}\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Run\n - name: Build\n run: cargo build --tests --features internal\n\n - name: Test\n if: matrix.host == 'x86_64-unknown-linux-gnu'\n run: cargo test --features internal\n\n - name: Test\n if: matrix.host != 'x86_64-unknown-linux-gnu'\n run: cargo test --features internal -- --skip dogfood\n\n - name: Test clippy_lints\n run: cargo test\n working-directory: clippy_lints\n\n - name: Test clippy_utils\n run: cargo test\n working-directory: clippy_utils\n\n - name: Test clippy_config\n run: cargo test\n working-directory: clippy_config\n\n - name: Test rustc_tools_util\n run: cargo test\n working-directory: rustc_tools_util\n\n - name: Test clippy_dev\n run: cargo test\n working-directory: clippy_dev\n\n - name: Test clippy-driver\n run: .github/driver.sh\n env:\n OS: ${{ runner.os }}\n\n metadata_collection:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n - name: Test metadata collection\n run: cargo collect-metadata\n\n integration_build:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Run\n - name: Build Integration Test\n env:\n CARGO_PROFILE_DEV_SPLIT_DEBUGINFO: off\n run: cargo test --test integration --features integration --no-run\n\n # Upload\n - name: Extract Binaries\n run: |\n DIR=$CARGO_TARGET_DIR/debug\n find $DIR/deps/integration-* -executable ! -type d | xargs -I {} mv {} $DIR/integration\n find $DIR ! -executable -o -type d ! -path $DIR | xargs rm -rf\n\n - name: Upload Binaries\n uses: actions/upload-artifact@v4\n with:\n name: binaries\n path: target/debug\n\n integration:\n needs: integration_build\n strategy:\n fail-fast: false\n max-parallel: 6\n matrix:\n integration:\n - 'matthiaskrgr/clippy_ci_panic_test'\n - 'rust-lang/cargo'\n - 'rust-lang/chalk'\n - 'rust-lang/rustfmt'\n - 'Marwes/combine'\n - 'Geal/nom'\n - 'rust-lang/stdarch'\n - 'serde-rs/serde'\n - 'chronotope/chrono'\n - 'hyperium/hyper'\n - 'rust-random/rand'\n - 'rust-lang/futures-rs'\n - 'rust-itertools/itertools'\n - 'rust-lang-nursery/failure'\n - 'rust-lang/log'\n\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Download\n - name: Download target dir\n uses: actions/download-artifact@v4\n with:\n name: binaries\n path: target/debug\n\n - name: Make Binaries Executable\n run: chmod +x $CARGO_TARGET_DIR/debug/*\n\n # Run\n - name: Test ${{ matrix.integration }}\n run: |\n TOOLCHAIN=$(rustup show active-toolchain | head -n 1 | cut -f1 -d' ')\n rustup run $TOOLCHAIN $CARGO_TARGET_DIR/debug/integration --show-output\n env:\n INTEGRATION: ${{ matrix.integration }}\n\n conclusion:\n needs: [ base, metadata_collection, integration_build, integration ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\clippy_mq.yml | clippy_mq.yml | YAML | 5,931 | 0.8 | 0.036866 | 0.126374 | react-lib | 152 | 2025-01-05T14:51:53.001697 | GPL-3.0 | false | 8c43afad331aac7c814bdcdfb9ecb9b5 |
name: Clippy Test\n\non:\n pull_request:\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_TARGET_DIR: '${{ github.workspace }}/target'\n NO_FMT_TEST: 1\n CARGO_INCREMENTAL: 0\n RUSTFLAGS: -D warnings\n\nconcurrency:\n # For a given workflow, if we push to the same PR, cancel all previous builds on that PR.\n # If the push is not attached to a PR, we will cancel all builds on the same branch.\n group: "${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}"\n cancel-in-progress: true\n\njobs:\n base:\n # NOTE: If you modify this job, make sure you copy the changes to clippy_mq.yml\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n - name: Install toolchain\n run: |\n # Use a way compatible with Rustup pre-1.28.0 and Rustup 1.28.0\n rustup show active-toolchain || rustup toolchain install\n\n # Run\n - name: Build\n run: cargo build --tests --features internal\n\n - name: Test\n run: cargo test --features internal\n\n - name: Test clippy_lints\n run: cargo test\n working-directory: clippy_lints\n\n - name: Test clippy_utils\n run: cargo test\n working-directory: clippy_utils\n\n - name: Test rustc_tools_util\n run: cargo test\n working-directory: rustc_tools_util\n\n - name: Test clippy_dev\n run: cargo test\n working-directory: clippy_dev\n\n - name: Test clippy-driver\n run: .github/driver.sh\n env:\n OS: ${{ runner.os }}\n\n # We need to have the "conclusion" job also on PR CI, to make it possible\n # to add PRs to a merge queue.\n conclusion:\n needs: [ base ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\clippy_pr.yml | clippy_pr.yml | YAML | 2,612 | 0.8 | 0.071429 | 0.257143 | awesome-app | 86 | 2025-06-06T01:28:32.222021 | GPL-3.0 | false | 1c27b57b58207cd9593a84d7112d9a50 |
name: Deploy\n\non:\n push:\n branches:\n - master\n - beta\n tags:\n - rust-1.**\n\nconcurrency:\n group: ${{ github.workflow }}\n cancel-in-progress: false\n\nenv:\n TARGET_BRANCH: 'gh-pages'\n SHA: '${{ github.sha }}'\n SSH_REPO: 'git@github.com:${{ github.repository }}.git'\n\njobs:\n deploy:\n runs-on: ubuntu-latest\n if: github.repository == 'rust-lang/rust-clippy'\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n - name: Checkout\n uses: actions/checkout@v4\n with:\n ref: ${{ env.TARGET_BRANCH }}\n path: 'out'\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n # Run\n - name: Set tag name\n if: startswith(github.ref, 'refs/tags/')\n run: |\n TAG=$(basename "${TAGNAME}")\n echo "TAG_NAME=$TAG" >> $GITHUB_ENV\n env:\n # Make sure that the reference gets expanded before injecting it\n TAGNAME: ${{ github.ref }}\n - name: Set beta to true\n if: github.ref == 'refs/heads/beta'\n run: echo "BETA=true" >> $GITHUB_ENV\n\n # We need to check out all files that (transitively) depend on the\n # structure of the gh-pages branch, so that we're able to change that\n # structure without breaking the deployment.\n - name: Use deploy files from master branch\n run: |\n git fetch --no-tags --prune --depth=1 origin master\n git checkout origin/master -- .github/deploy.sh util/versions.py util/gh-pages/versions.html\n\n # Generate lockfile for caching to avoid build problems with cached deps\n - name: cargo generate-lockfile\n run: cargo generate-lockfile\n\n - name: Cache\n uses: Swatinem/rust-cache@v2\n with:\n save-if: ${{ github.ref == 'refs/heads/master' }}\n\n - name: cargo collect-metadata\n run: cargo collect-metadata\n\n - name: Deploy\n run: |\n eval "$(ssh-agent -s)"\n ssh-add - <<< "${{ secrets.DEPLOY_KEY }}"\n bash .github/deploy.sh\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\deploy.yml | deploy.yml | YAML | 2,170 | 0.8 | 0.064103 | 0.136364 | awesome-app | 438 | 2023-11-19T07:15:12.362963 | BSD-3-Clause | false | 510c9bfdabc5f093b031897a829ed2e6 |
name: Lintcheck\n\non:\n pull_request:\n paths-ignore:\n - 'book/**'\n - 'util/**'\n - 'tests/**'\n - '*.md'\n\nenv:\n RUST_BACKTRACE: 1\n CARGO_INCREMENTAL: 0\n\nconcurrency:\n # For a given workflow, if we push to the same PR, cancel all previous builds on that PR.\n group: "${{ github.workflow }}-${{ github.event.pull_request.number}}"\n cancel-in-progress: true\n\njobs:\n # Runs lintcheck on the PR's target branch and stores the results as an artifact\n base:\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n fetch-depth: 2\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n # HEAD is the generated merge commit `refs/pull/N/merge` between the PR and `master`, `HEAD^`\n # being the commit from `master` that is the base of the merge\n - name: Checkout base\n run: git checkout HEAD^\n\n # Use the lintcheck from the PR to generate the JSON in case the PR modifies lintcheck in some\n # way\n - name: Checkout current lintcheck\n run: |\n rm -rf lintcheck\n git checkout ${{ github.sha }} -- lintcheck\n\n - name: Cache lintcheck bin\n id: cache-lintcheck-bin\n uses: actions/cache@v4\n with:\n path: target/debug/lintcheck\n key: lintcheck-bin-${{ hashfiles('lintcheck/**') }}\n\n - name: Build lintcheck\n if: steps.cache-lintcheck-bin.outputs.cache-hit != 'true'\n run: cargo build --manifest-path=lintcheck/Cargo.toml\n\n - name: Create cache key\n id: key\n run: echo "key=lintcheck-base-${{ hashfiles('lintcheck/**') }}-$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"\n\n - name: Cache results JSON\n id: cache-json\n uses: actions/cache@v4\n with:\n path: lintcheck-logs/ci_crates_logs.json\n key: ${{ steps.key.outputs.key }}\n\n - name: Run lintcheck\n if: steps.cache-json.outputs.cache-hit != 'true'\n run: env CLIPPY_CONF_DIR="$PWD/lintcheck/ci-config" ./target/debug/lintcheck --format json --all-lints --crates-toml ./lintcheck/ci_crates.toml\n\n - name: Upload base JSON\n uses: actions/upload-artifact@v4\n with:\n name: base\n path: lintcheck-logs/ci_crates_logs.json\n\n # Runs lintcheck on the PR and stores the results as an artifact\n head:\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n - name: Cache lintcheck bin\n id: cache-lintcheck-bin\n uses: actions/cache@v4\n with:\n path: target/debug/lintcheck\n key: lintcheck-bin-${{ hashfiles('lintcheck/**') }}\n\n - name: Build lintcheck\n if: steps.cache-lintcheck-bin.outputs.cache-hit != 'true'\n run: cargo build --manifest-path=lintcheck/Cargo.toml\n\n - name: Run lintcheck\n run: env CLIPPY_CONF_DIR="$PWD/lintcheck/ci-config" ./target/debug/lintcheck --format json --all-lints --crates-toml ./lintcheck/ci_crates.toml\n\n - name: Upload head JSON\n uses: actions/upload-artifact@v4\n with:\n name: head\n path: lintcheck-logs/ci_crates_logs.json\n\n # Retrieves the head and base JSON results and prints the diff to the GH actions step summary\n diff:\n runs-on: ubuntu-latest\n\n needs: [base, head]\n\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n - name: Restore lintcheck bin\n uses: actions/cache/restore@v4\n with:\n path: target/debug/lintcheck\n key: lintcheck-bin-${{ hashfiles('lintcheck/**') }}\n fail-on-cache-miss: true\n\n - name: Download JSON\n uses: actions/download-artifact@v4\n\n - name: Diff results\n # GH's summery has a maximum size of 1024k:\n # https://docs.github.com/actions/using-workflows/workflow-commands-for-github-actions#adding-a-markdown-summary\n # That's why we first log to file and then to the summary and logs\n run: |\n ./target/debug/lintcheck diff {base,head}/ci_crates_logs.json --truncate >> truncated_diff.md\n head -c 1024000 truncated_diff.md >> $GITHUB_STEP_SUMMARY\n cat truncated_diff.md\n ./target/debug/lintcheck diff {base,head}/ci_crates_logs.json >> full_diff.md\n\n - name: Upload full diff\n uses: actions/upload-artifact@v4\n with:\n name: diff\n if-no-files-found: ignore\n path: |\n full_diff.md\n truncated_diff.md\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\lintcheck.yml | lintcheck.yml | YAML | 4,668 | 0.8 | 0.040541 | 0.114754 | node-utils | 941 | 2023-09-04T17:20:36.604446 | Apache-2.0 | false | caa77c80e6c248ee3ab35ea0fee1f1eb
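The `Diff results` step in the lintcheck.yml sample works around GitHub's 1024k limit on step summaries: it writes the full diff to a file first, then copies at most 1024000 bytes into the summary with `head -c`. A runnable sketch of that truncation pattern, where the diff content and the summary path are hypothetical stand-ins:

```shell
# Stand-in for the real $GITHUB_STEP_SUMMARY file (hypothetical).
GITHUB_STEP_SUMMARY=$(mktemp)

# Produce a stand-in diff file, then truncate it on the way into
# the summary so GitHub never rejects an oversized summary.
printf 'lint diff line %d\n' 1 2 3 > truncated_diff.md
head -c 1024000 truncated_diff.md >> "$GITHUB_STEP_SUMMARY"
```

Since the small stand-in file is well under the limit, it is copied whole; a real multi-megabyte diff would be cut at the byte limit while the untruncated copy survives as an uploaded artifact.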
name: Remark\n\non:\n merge_group:\n pull_request:\n\njobs:\n remark:\n runs-on: ubuntu-latest\n\n steps:\n # Setup\n - name: Checkout\n uses: actions/checkout@v4\n with:\n # Unsetting this would make so that any malicious package could get our Github Token\n persist-credentials: false\n\n - name: Setup Node.js\n uses: actions/setup-node@v4\n with:\n node-version: '18.x'\n\n - name: Install remark\n run: npm install remark-cli remark-lint remark-lint-maximum-line-length@^3.1.3 remark-preset-lint-recommended remark-gfm\n\n - name: Install mdbook\n run: |\n mkdir mdbook\n curl -Lf https://github.com/rust-lang/mdBook/releases/download/v0.4.43/mdbook-v0.4.43-x86_64-unknown-linux-gnu.tar.gz | tar -xz --directory=./mdbook\n echo `pwd`/mdbook >> $GITHUB_PATH\n\n # Run\n - name: Check *.md files\n run: ./node_modules/.bin/remark -u lint -f .\n\n - name: Linkcheck book\n run: |\n rustup toolchain install nightly --component rust-docs\n rustup override set nightly\n curl https://raw.githubusercontent.com/rust-lang/rust/master/src/tools/linkchecker/linkcheck.sh -o linkcheck.sh\n sh linkcheck.sh clippy --path ./book\n\n - name: Build mdbook\n run: mdbook build book\n\n conclusion_remark:\n needs: [ remark ]\n # We need to ensure this job does *not* get skipped if its dependencies fail,\n # because a skipped job is considered a success by GitHub. So we have to\n # overwrite `if:`. We use `!cancelled()` to ensure the job does still not get run\n # when the workflow is canceled manually.\n #\n # ALL THE PREVIOUS JOBS NEED TO BE ADDED TO THE `needs` SECTION OF THIS JOB!\n if: ${{ !cancelled() }}\n runs-on: ubuntu-latest\n steps:\n # Manually check the status of all dependencies. `if: failure()` does not work.\n - name: Conclusion\n run: |\n # Print the dependent jobs to see them in the CI log\n jq -C <<< '${{ toJson(needs) }}'\n # Check if all jobs that we depend on (in the needs array) were successful.\n jq --exit-status 'all(.result == "success")' <<< '${{ toJson(needs) }}'\n | dataset_sample\yaml\rust-lang_rust-clippy\.github\workflows\remark.yml | remark.yml | YAML | 2,167 | 0.8 | 0.078125 | 0.222222 | react-lib | 35 | 2024-08-01T07:39:31.370102 | Apache-2.0 | false | aebaa38d4033346ae30c55601fb1ed47
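The `Conclusion` step in the remark.yml sample hinges on `jq`'s `--exit-status` flag: `all(.result == "success")` over the `needs` object produces `false` when any dependency did not succeed, and `--exit-status` turns that `false` into a non-zero exit code, failing the job. A minimal reproduction with a hypothetical `needs` payload standing in for `${{ toJson(needs) }}`:

```shell
# Hypothetical stand-ins for GitHub's toJson(needs) output.
needs_ok='{"remark": {"result": "success"}}'
needs_bad='{"remark": {"result": "failure"}}'

# --exit-status: exit 0 when the last output is true,
# non-zero when it is false or null.
jq --exit-status 'all(.result == "success")' <<< "$needs_ok"
ok_rc=$?
jq --exit-status 'all(.result == "success")' <<< "$needs_bad" || bad_rc=$?
```

`all(f)` iterates the values of the object, so adding more jobs to `needs` extends the check with no script changes, which is why the workflow comment insists every prior job be listed there.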
version: 2\nupdates:\n - package-ecosystem: "gitsubmodule"\n directory: "/"\n target-branch: "master"\n schedule:\n interval: "daily"\n commit-message:\n prefix: "Git submodule"\n labels:\n - "dependencies"\n | dataset_sample\yaml\rustdesk_rustdesk\.github\dependabot.yml | dependabot.yml | YAML | 228 | 0.7 | 0 | 0 | awesome-app | 703 | 2023-08-07T15:17:00.320150 | GPL-3.0 | false | 7a18a09cda3ae1655b5892007b992a3b |