Column types: content string (1-103k chars, nullable); path string (8-216 chars); filename string (2-179 chars); language string (15 classes); size_bytes int64 (2-189k); quality_score float64 (0.5-0.95); complexity float64 (0-1); documentation_ratio float64 (0-1); repository string (5 classes); stars int64 (0-1k); created_date string date (2023-07-10T19:21:08 to 2025-07-09T19:11:45); license string (4 classes); is_test bool (2 classes); file_hash string (32 chars).

content | path | filename | language | size_bytes | quality_score | complexity | documentation_ratio | repository | stars | created_date | license | is_test | file_hash |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n mysql:\n image: mysql:latest\n ports:\n - "3306:3306"\n environment:\n - MYSQL_ROOT_PASSWORD=123456\n - MYSQL_USER=mysqluser\n - MYSQL_PASSWORD=mysqlpw\n - MYSQL_DATABASE=mydb\n volumes:\n - "./mysql_prepare.sql:/mysql_prepare.sql"\n healthcheck:\n test: [ "CMD-SHELL", "mysqladmin ping -h 127.0.0.1 -u root -p123456" ]\n interval: 5s\n timeout: 5s\n retries: 5\n container_name: mysql\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\mysql-sink\docker-compose.yml | docker-compose.yml | YAML | 1,207 | 0.7 | 0 | 0 | awesome-app | 507 | 2025-07-02T16:41:07.999923 | Apache-2.0 | true | 8fe48071d5f64e278816cab1c451aed4 |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n nats-server:\n image: nats:latest\n ports:\n - "4222:4222"\n command: -js\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n subject1:\n image: bitnami/natscli:0.1.4\n depends_on: [nats-server]\n command: sub subject1 -s nats://nats-server:4222\n subject2:\n image: bitnami/natscli:0.1.4\n depends_on: [nats-server]\n command: sub subject2 -s nats://nats-server:4222\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\n datagen:\n build: ../datagen\n depends_on: [message_queue]\n command:\n - /bin/sh\n - -c\n - /datagen --mode livestream --qps 10 nats --url nats-server:4222 --jetstream\n restart: always\n container_name: datagen\nvolumes:\n compute-node-0:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\nats\docker-compose.yml | docker-compose.yml | YAML | 1,442 | 0.8 | 0 | 0 | python-kit | 284 | 2024-10-04T10:52:22.635886 | BSD-3-Clause | true | 8f2422488ee6b51fce08a1a0f2cf6797 |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n kafka:\n image: confluentinc/cp-kafka:latest\n hostname: kafka\n container_name: kafka\n ports:\n - "29092:29092"\n environment:\n KAFKA_BROKER_ID: 1\n KAFKA_ZOOKEEPER_CONNECT: 'zookeeper:2181'\n KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT\n KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:9092,PLAINTEXT_HOST://localhost:29092\n KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1\n KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0\n KAFKA_TOOLS_LOG4J_LOGLEVEL: ERROR\n depends_on:\n [ zookeeper ]\n healthcheck: { test: nc -z localhost 9092, interval: 1s, start_period: 120s }\n pinot-controller:\n image: apachepinot/pinot:latest\n command: "StartController -zkAddress zookeeper:2181"\n container_name: "pinot-controller"\n volumes:\n - ./config:/config\n restart: unless-stopped\n ports:\n - "9000:9000"\n depends_on:\n - zookeeper\n pinot-broker:\n image: apachepinot/pinot:latest\n command: "StartBroker -zkAddress zookeeper:2181"\n restart: unless-stopped\n container_name: "pinot-broker"\n ports:\n - "8099:8099"\n depends_on:\n - pinot-controller\n pinot-server:\n image: apachepinot/pinot:latest\n container_name: "pinot-server"\n command: "StartServer -zkAddress zookeeper:2181"\n restart: unless-stopped\n depends_on:\n - pinot-broker\n zookeeper:\n image: confluentinc/cp-zookeeper:latest\n hostname: zookeeper\n container_name: zookeeper\n ports:\n - "2181:2181"\n environment:\n ZOOKEEPER_CLIENT_PORT: 2181\n ZOOKEEPER_TICK_TIME: 2000\n postgres:\n image: postgres:latest\n command: tail -f /dev/null\n volumes:\n - "./update.sql:/update.sql"\n restart: on-failure\n container_name: postgres\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\pinot-sink\docker-compose.yml | docker-compose.yml | YAML | 2,542 | 0.8 | 0 | 0 | awesome-app | 129 | 2025-02-25T16:04:21.160581 | BSD-3-Clause | true | a295997e5094b24be78848c433b50faf |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n # Use this command to connect to the DB from outside the container:\n # docker exec postgres psql --username=myuser --dbname=mydb\n postgres:\n image: postgres:17-alpine\n environment:\n - POSTGRES_USER=myuser\n - POSTGRES_PASSWORD=123456\n - POSTGRES_DB=mydb\n ports:\n - 5432:5432\n healthcheck:\n test: [ "CMD-SHELL", "pg_isready --username=myuser --dbname=mydb" ]\n interval: 5s\n timeout: 5s\n retries: 5\n command: [ "postgres", "-c", "wal_level=logical" ]\n restart: always\n container_name: postgres\n postgres_prepare:\n image: postgres\n depends_on:\n - postgres\n command:\n - /bin/sh\n - -c\n - "psql postgresql://myuser:123456@postgres:5432/mydb < postgres_prepare.sql &&\n psql postgresql://myuser:123456@postgres:5432/mydb < compatibility-pg.sql &&\n sleep 5 &&\n psql postgresql://root:@risingwave-standalone:4566/dev < compatibility-rw.sql"\n volumes:\n - "./postgres_prepare.sql:/postgres_prepare.sql"\n - "./compatibility-pg.sql:/compatibility-pg.sql"\n - "./compatibility-rw.sql:/compatibility-rw.sql"\n container_name: postgres_prepare\n restart: on-failure\n datagen_tpch:\n image: ghcr.io/risingwavelabs/go-tpc:v0.1\n depends_on: [ postgres ]\n command: tpch prepare --sf 1 --threads 4 -d postgres -U myuser -p '123456' -H postgres -D mydb -P 5432 --conn-params sslmode=disable\n container_name: datagen_tpch\n restart: on-failure\n datagen_kafka:\n build: ../datagen\n depends_on: [ message_queue ]\n command:\n - /bin/sh\n - -c\n - /datagen --mode nexmark --qps 2 kafka --brokers message_queue:29092\n restart: always\n container_name: datagen_kafka\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\postgres-cdc\docker-compose.yml | docker-compose.yml | YAML | 2,588 | 0.8 | 0 | 0.022222 | react-lib | 745 | 2024-08-19T01:48:47.488357 | GPL-3.0 | true | 62acac68d81f663aa9b15d2e0cba5051 |
pk_types:\n - boolean\n - bigint\n - date\ndatatypes:\n - name: boolean\n aliases:\n - bool\n zero: false\n minimum: false\n maximum: true\n rw_type: boolean\n - name: smallint\n zero: 0\n minimum: -32767\n maximum: 32767\n - name: integer\n aliases:\n - int\n zero: 0\n minimum: -2147483647\n maximum: 2147483647\n - name: bigint\n zero: 0\n minimum: -9223372036854775807\n maximum: 9223372036854775807\n - name: decimal\n aliases:\n - numeric\n zero: 0\n minimum: -9.9999999999999999999999999999999\n maximum: 9.9999999999999999999999999999999\n - name: real\n zero: 0\n minimum: -9999.999999\n maximum: 9999.999999\n - name: double precision\n zero: 0\n minimum: -9999.99999999999999\n maximum: 9999.99999999999999\n - name: varchar\n aliases:\n - character varying\n - string\n zero: "''"\n minimum: "''"\n maximum_gen_py: "\"'{}'\".format('z'*65535)"\n - name: bytea\n zero: "'\\x00'"\n minimum: "'\\x00'"\n maximum_gen_py: "\"'{}'\".format('\\\\x'+'f'*65534)"\n - name: date\n zero: "'0001-01-01'"\n minimum: "'0001-01-01'"\n maximum: "'9999-12-31'"\n - name: time\n aliases:\n - time without time zone\n zero: "'00:00:00'"\n minimum: "'00:00:00'"\n maximum: "'23:59:59'"\n - name: timestamp\n aliases:\n - timestamp without time zone\n zero: "'0001-01-01 00:00:00'::timestamp"\n minimum: "'0001-01-01 00:00:00'::timestamp"\n maximum: "'9999-12-31 23:59:59'::timestamp"\n - name: timestamptz\n aliases:\n - timestamp with time zone\n zero: "'0001-01-01 00:00:00'::timestamptz"\n minimum: "'0001-01-01 00:00:00'::timestamptz"\n maximum: "'9999-12-31 23:59:59'::timestamptz"\n - name: interval\n zero: "interval '0 second'"\n minimum: "interval '0 second'"\n maximum: "interval '9990 year'"\n - name: jsonb\n zero: "'{}'"\n minimum: "'{}'"\n maximum: "'{\"whatever\":\"meaningless\"}'"\n\n\n\n\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\postgres-cdc\postgresql-datatypes.yml | postgresql-datatypes.yml | YAML | 1,932 | 0.7 | 0 | 0 | python-kit | 794 | 2023-11-16T16:22:00.296930 | BSD-3-Clause | true | 21e92ec14adf69a1ae79bde2e90b26db |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n # Use this command to connect to the DB from outside the container:\n # docker exec postgres psql --username=myuser --dbname=mydb\n postgres:\n image: postgres:latest\n environment:\n - POSTGRES_USER=myuser\n - POSTGRES_PASSWORD=123456\n - POSTGRES_DB=mydb\n ports:\n - 5432:5432\n healthcheck:\n test: [ "CMD-SHELL", "pg_isready --username=myuser --dbname=mydb" ]\n interval: 5s\n timeout: 5s\n retries: 5\n command: [ "postgres", "-c", "wal_level=logical" ]\n volumes:\n - "./postgres_prepare.sql:/postgres_prepare.sql"\n restart: always\n container_name: postgres\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\postgres-sink\docker-compose.yml | docker-compose.yml | YAML | 1,390 | 0.8 | 0 | 0.035714 | python-kit | 282 | 2025-01-13T11:43:35.079990 | GPL-3.0 | true | 9506475aa272375f515fefff7225b31c |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n trino:\n image: trinodb/trino:418\n container_name: trino\n volumes:\n - ./etc/risingwave.properties:/etc/trino/catalog/risingwave.properties\n ports:\n - "48080:8080"\n trino-client:\n image: trinodb/trino:418\n profiles: [ "client" ]\n entrypoint: [ "trino", "--server", "trino:8080", "--catalog", "risingwave", "--schema", "public" ]\n presto:\n image: prestodb/presto:0.284\n container_name: presto\n volumes:\n - ./etc/risingwave.properties:/opt/presto-server/etc/catalog/risingwave.properties\n - ./etc/config.properties:/opt/presto-server/etc/config.properties\n - ./etc/jvm.config:/opt/presto-server/etc/jvm.config\n ports:\n - "8080:8080"\n presto-client:\n image: prestodb/presto:0.284\n profiles: [ "client" ]\n entrypoint: [ "presto-cli", "--server", "presto:8080", "--catalog", "risingwave", "--schema", "public" ]\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\presto-trino\docker-compose.yml | docker-compose.yml | YAML | 1,612 | 0.7 | 0 | 0 | python-kit | 641 | 2025-01-03T13:42:19.248877 | BSD-3-Clause | true | fe511312490ce72fd8554e14f55c3c62 |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n image: "prom/prometheus:latest"\n command:\n - "--config.file=/etc/prometheus/prometheus.yml"\n - "--storage.tsdb.path=/prometheus"\n - "--web.console.libraries=/usr/share/prometheus/console_libraries"\n - "--web.console.templates=/usr/share/prometheus/consoles"\n - "--web.listen-address=0.0.0.0:9500"\n - "--storage.tsdb.retention.time=5m" # Use prometheus for short-term storage.\n expose:\n - "9500"\n ports:\n - "9500:9500"\n depends_on: []\n volumes:\n - "prometheus-0:/prometheus"\n - "./prometheus.yaml:/etc/prometheus/prometheus.yml"\n environment: {}\n container_name: prometheus-0\n healthcheck:\n test:\n - CMD-SHELL\n - sh -c 'printf "GET /-/healthy HTTP/1.0\n\n" | nc localhost 9500; exit $?;'\n interval: 1s\n timeout: 5s\n retries: 5\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\n prometheus-kafka-adaptor:\n image: "telefonica/prometheus-kafka-adapter:1.8.0"\n expose:\n - "9501"\n ports:\n - "9501:9501"\n environment:\n - KAFKA_BROKER_LIST=message_queue:29092\n - KAFKA_TOPIC=prometheus\n - PORT=9501\n - GIN_MODE=release\n - LOG_LEVEL=info\n - SERIALIZATION_FORMAT=json\n container_name: prometheus-kafka-adaptor\n depends_on:\n - prometheus-0\n - message_queue\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\prometheus\docker-compose.yml | docker-compose.yml | YAML | 2,062 | 0.8 | 0.012658 | 0 | react-lib | 846 | 2024-12-03T21:12:51.827733 | Apache-2.0 | true | 46160104e00b348d09bfeddc07387219 |
---\nversion: "3"\nservices:\n pubsub-emulator:\n image: google/cloud-sdk:latest\n command: >\n gcloud beta emulators pubsub start --project=demo --host-port=0.0.0.0:8900\n ports:\n - "8900:8900"\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\pubsub\docker-compose.yml | docker-compose.yml | YAML | 1,064 | 0.7 | 0 | 0 | vue-tools | 349 | 2025-06-29T12:07:11.288991 | MIT | true | 46022f4f99014c40f77975965d0ad929 |
---\nservices:\n redis:\n image: 'redis:latest'\n expose:\n - 6379\n ports:\n - 6379:6379\n healthcheck:\n test: ["CMD", "redis-cli", "ping"]\n interval: 5s\n timeout: 30s\n retries: 50\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\redis-sink\docker-compose.yml | docker-compose.yml | YAML | 1,073 | 0.7 | 0 | 0 | python-kit | 589 | 2023-08-12T03:39:34.746078 | BSD-3-Clause | true | 93c1e09a29ac6b38d2de1906ae9748d2 |
---\nservices:\n sqlserver-server:\n container_name: sqlserver-server\n image: mcr.microsoft.com/mssql/server:2022-latest\n hostname: sqlserver-server\n ports:\n - 1433:1433\n environment:\n ACCEPT_EULA: 'Y'\n SA_PASSWORD: 'SomeTestOnly@SA'\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n container_name: postgres-0\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\sqlserver-sink\docker-compose.yml | docker-compose.yml | YAML | 1,046 | 0.7 | 0 | 0 | react-lib | 100 | 2024-10-15T16:26:58.016081 | MIT | true | ca3d161b040a9f35e7e238b6679162c3 |
---\nservices:\n starrocks-fe:\n image: starrocks/fe-ubuntu:3.1.7\n hostname: starrocks-fe\n container_name: starrocks-fe\n volumes:\n - "./starrocks_prepare.sql:/starrocks_prepare.sql"\n command:\n /opt/starrocks/fe/bin/start_fe.sh\n ports:\n - 8030:8030\n - 9020:9020\n - 9030:9030\n healthcheck:\n test: ["CMD", "curl", "-f", "http://localhost:9030"]\n interval: 5s\n timeout: 5s\n retries: 30\n starrocks-be:\n image: starrocks/be-ubuntu:3.1.7\n command:\n - /bin/bash\n - -c\n - |\n sleep 15s; mysql --connect-timeout 2 -h starrocks-fe -P9030 -uroot -e "alter system add backend \"starrocks-be:9050\";"\n /opt/starrocks/be/bin/start_be.sh\n ports:\n - 9050:9050\n - 8040:8040\n hostname: starrocks-be\n container_name: starrocks-be\n depends_on:\n - starrocks-fe\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n postgres:\n image: postgres:latest\n command: tail -f /dev/null\n volumes:\n - "./update_delete.sql:/update_delete.sql"\n restart: on-failure\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\starrocks-sink\docker-compose.yml | docker-compose.yml | YAML | 1,774 | 0.8 | 0 | 0 | python-kit | 260 | 2024-02-21T22:41:30.243698 | GPL-3.0 | true | c1433db118664d20372f25a0b75c1d51 |
x-superset-image: &superset-image apache/superset:${TAG:-latest-dev}\nx-superset-depends-on:\n &superset-depends-on\n - db\n - redis\nx-superset-volumes:\n # /app/pythonpath_docker will be appended to the PYTHONPATH in the final container\n &superset-volumes\n - ./docker:/app/docker\n - superset_home:/app/superset_home\n\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\n datagen:\n build: ../datagen\n depends_on: [message_queue]\n command:\n - /bin/sh\n - -c\n - /datagen --mode livestream --qps 2 kafka --brokers message_queue:29092\n restart: always\n container_name: datagen\n\n # Superset-related services #\n\n redis:\n image: redis:latest\n container_name: superset_cache\n restart: unless-stopped\n volumes:\n - redis:/data\n\n db:\n env_file: docker/.env-non-dev\n image: postgres:10\n container_name: superset_db\n restart: unless-stopped\n volumes:\n - db_home:/var/lib/postgresql/data\n\n superset:\n env_file: docker/.env-non-dev\n image: *superset-image\n container_name: superset_app\n command: [ "/app/docker/docker-bootstrap.sh", "app-gunicorn" ]\n user: "root"\n restart: unless-stopped\n ports:\n - 8088:8088\n depends_on: *superset-depends-on\n volumes: *superset-volumes\n\n superset-init:\n image: *superset-image\n container_name: superset_init\n command: [ "/app/docker/docker-init.sh" ]\n env_file: docker/.env-non-dev\n depends_on: *superset-depends-on\n user: "root"\n volumes: *superset-volumes\n\n superset-worker:\n image: *superset-image\n container_name: superset_worker\n command: [ "/app/docker/docker-bootstrap.sh", "worker" ]\n env_file: docker/.env-non-dev\n restart: unless-stopped\n depends_on: *superset-depends-on\n user: "root"\n volumes: *superset-volumes\n\n superset-worker-beat:\n image: *superset-image\n container_name: superset_worker_beat\n command: [ "/app/docker/docker-bootstrap.sh", "beat" ]\n env_file: docker/.env-non-dev\n restart: unless-stopped\n depends_on: *superset-depends-on\n user: "root"\n volumes: *superset-volumes\n\nvolumes:\n superset_home:\n external: false\n db_home:\n external: false\n redis:\n external: false\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\superset\docker-compose.yml | docker-compose.yml | YAML | 3,007 | 0.8 | 0 | 0.017391 | python-kit | 889 | 2024-07-02T21:44:37.045796 | GPL-3.0 | true | c8f400b24c679cc773d0e36b1a9b9068 |
---\nversion: '3'\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n\n #=================== TiDB & TiCDC components ==================\n ticdc-controller:\n image: pingcap/ticdc:v6.6.0\n container_name: ticdc-controller\n entrypoint: "/cdc cli"\n command:\n - changefeed\n - create\n - --server\n - http://ticdc-capturer0:8300\n - --sink-uri\n - "kafka://kafka:9092/ticdc_default?protocol=canal-json&kafka-version=2.4.0&partition-num=3&max-message-bytes=67108864&replication-factor=1"\n - --changefeed-id\n - "ticdc-replication-task"\n - --config\n - "/changefeed.toml"\n volumes:\n - ./config/changefeed.toml:/changefeed.toml:ro\n depends_on:\n - "pd"\n - "kafka"\n - "ticdc-capturer0"\n restart: on-failure\n\n ticdc-capturer0:\n image: pingcap/ticdc:v6.6.0\n entrypoint: "/cdc server"\n ports:\n - "8300:8300"\n command:\n - --addr=0.0.0.0:8300\n - --pd=http://pd:2379\n - --advertise-addr=ticdc-capturer0:8300\n - --tz=UTC\n depends_on:\n - pd\n - "tidb"\n - "kafka"\n restart: on-failure\n\n pd:\n image: pingcap/pd:v6.6.0\n ports:\n - "2379:2379"\n volumes:\n - ./config/pd.toml:/pd.toml:ro\n command:\n - --name=pd\n - --client-urls=http://0.0.0.0:2379\n - --peer-urls=http://0.0.0.0:2380\n - --advertise-client-urls=http://pd:2379\n - --advertise-peer-urls=http://pd:2380\n - --initial-cluster=pd=http://pd:2380\n - --data-dir=/data/pd\n - --config=/pd.toml\n restart: on-failure\n\n tikv0:\n image: pingcap/tikv:v6.6.0\n volumes:\n - ./config/tikv.toml:/tikv.toml:ro\n - /data\n command:\n - --addr=0.0.0.0:20160\n - --advertise-addr=tikv0:20160\n - --data-dir=/data/tikv0\n - --pd=pd:2379\n - --config=/tikv.toml\n depends_on:\n - "pd"\n restart: on-failure\n\n tikv1:\n image: pingcap/tikv:v6.6.0\n volumes:\n - ./config/tikv.toml:/tikv.toml:ro\n - /data\n command:\n - --addr=0.0.0.0:20160\n - --advertise-addr=tikv1:20160\n - --data-dir=/data/tikv1\n - --pd=pd:2379\n - --config=/tikv.toml\n depends_on:\n - "pd"\n restart: on-failure\n\n tikv2:\n image: pingcap/tikv:v6.6.0\n volumes:\n - ./config/tikv.toml:/tikv.toml:ro\n - /data\n command:\n - --addr=0.0.0.0:20160\n - --advertise-addr=tikv2:20160\n - --data-dir=/data/tikv2\n - --pd=pd:2379\n - --config=/tikv.toml\n depends_on:\n - "pd"\n restart: on-failure\n\n tidb:\n image: pingcap/tidb:v6.6.0\n ports:\n - "4000:4000"\n - "10080:10080"\n volumes:\n - ./config/tidb.toml:/tidb.toml:ro\n command:\n - --store=tikv\n - --path=pd:2379\n - --config=/tidb.toml\n - --advertise-address=tidb\n depends_on:\n - "tikv0"\n - "tikv1"\n - "tikv2"\n restart: on-failure\n\n #=================== Kafka ==================\n\n # Adapted from https://github.com/confluentinc/demo-scene/blob/master/connect-jdbc/docker-compose.yml\n zookeeper:\n image: confluentinc/cp-zookeeper:5.5.1\n platform: linux/amd64\n container_name: zookeeper\n environment:\n ZOOKEEPER_CLIENT_PORT: 2181\n ZOOKEEPER_TICK_TIME: 2000\n\n kafka:\n image: confluentinc/cp-enterprise-kafka:5.5.1\n platform: linux/amd64\n container_name: kafka\n depends_on:\n - "zookeeper"\n ports:\n # Exposes 9092 for external connections to the broker\n # Use kafka:29092 for connections internal on the docker network\n # See https://rmoff.net/2018/08/02/kafka-listeners-explained/ for details\n - 9092:9092\n environment:\n KAFKA_BROKER_ID: 1\n KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181\n KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT\n KAFKA_INTER_BROKER_LISTENER_NAME: PLAINTEXT\n KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka:29092,PLAINTEXT_HOST://kafka:9092\n KAFKA_AUTO_CREATE_TOPICS_ENABLE: "true"\n KAFKA_METRIC_REPORTERS: io.confluent.metrics.reporter.ConfluentMetricsReporter\n KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1\n KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 100\n CONFLUENT_METRICS_REPORTER_BOOTSTRAP_SERVERS: kafka:29092\n CONFLUENT_METRICS_REPORTER_ZOOKEEPER_CONNECT: zookeeper:2181\n CONFLUENT_METRICS_REPORTER_TOPIC_REPLICAS: 1\n CONFLUENT_METRICS_ENABLE: 'true'\n CONFLUENT_SUPPORT_CUSTOMER_ID: 'anonymous'\n\n #===================== Others ===================\n datagen:\n build: ../datagen\n depends_on:\n - tidb\n command:\n - /bin/sh\n - -c\n - /datagen --mode twitter --qps 2 mysql --host tidb --db test --port 4000 --user root --password ""\n restart: always\n container_name: datagen\n\n mysql:\n image: mysql:latest\n command: tail -f /dev/null\n volumes:\n - "./tidb_create_tables.sql:/tidb_create_tables.sql"\n - "./tidb_prepare.sql:/tidb_prepare.sql"\n container_name: mysql\n restart: on-failure\n\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\tidb-cdc-sink\docker-compose.yml | docker-compose.yml | YAML | 5,671 | 0.8 | 0.013825 | 0.034314 | awesome-app | 941 | 2024-07-05T02:49:29.430911 | GPL-3.0 | true | fcb886385e929842376b4806b032181e |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\n datagen:\n build: ../datagen\n depends_on: [message_queue]\n command:\n - /bin/sh\n - -c\n - /datagen --mode twitter --qps 2 kafka --brokers message_queue:29092\n restart: always\n container_name: datagen\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\twitter\docker-compose.yml | docker-compose.yml | YAML | 1,099 | 0.7 | 0 | 0 | node-utils | 968 | 2025-03-12T17:15:18.150970 | BSD-3-Clause | true | 6b38bbb86116af87a01ec0e738c85ca9 |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n message_queue:\n image: "apachepulsar/pulsar:2.11.1"\n command: bin/pulsar standalone\n ports:\n - 8080:8080\n - 6650:6650\n hostname: message_queue\n container_name: message_queue\n stop_grace_period: 2s\n datagen:\n build: ../datagen\n depends_on: [message_queue]\n command:\n - /bin/sh\n - -c\n - /datagen --mode twitter --qps 2 pulsar --brokers message_queue:6650\n restart: always\n container_name: datagen\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\twitter-pulsar\docker-compose.yml | docker-compose.yml | YAML | 1,223 | 0.7 | 0 | 0 | vue-tools | 807 | 2024-01-07T14:57:06.469844 | MIT | true | bd0cae000d318f58c357b886bd3df062 |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose.yml\n service: prometheus-0\n message_queue:\n extends:\n file: ../../docker/docker-compose.yml\n service: message_queue\n datagen:\n image: python:3.10\n depends_on: [ message_queue ]\n volumes:\n - type: bind\n source: ./datagen.py\n target: /datagen.py\n command:\n - /bin/sh\n - -c\n - |\n pip install requests fastavro confluent_kafka\n python /datagen.py message_queue:29092 http://message_queue:8081 ua1\n restart: always\n container_name: datagen\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\n message_queue:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\upsert-avro\docker-compose.yml | docker-compose.yml | YAML | 1,256 | 0.8 | 0 | 0 | awesome-app | 820 | 2025-02-03T22:42:13.872134 | Apache-2.0 | true | d03405477526ee57befd9d531d5b97c4 |
---\nservices:\n risingwave-standalone:\n extends:\n file: ../../docker/docker-compose.yml\n service: risingwave-standalone\n postgres-0:\n extends:\n file: ../../docker/docker-compose-distributed.yml\n service: postgres-0\n grafana-0:\n extends:\n file: ../../docker/docker-compose-distributed.yml\n service: grafana-0\n minio-0:\n extends:\n file: ../../docker/docker-compose-distributed.yml\n service: minio-0\n prometheus-0:\n extends:\n file: ../../docker/docker-compose-distributed.yml\n service: prometheus-0\n\n vector:\n image: "ghcr.io/risingwavelabs/vector:0.33.1.custom.7e244e087-debian"\n volumes:\n - "./vector.yaml:/etc/vector/vector.yaml:ro"\n depends_on:\n - "risingwave-standalone"\n container_name: vector\n restart: always\n\nvolumes:\n risingwave-standalone:\n external: false\n postgres-0:\n external: false\n grafana-0:\n external: false\n minio-0:\n external: false\n prometheus-0:\n external: false\nname: risingwave-compose\n | dataset_sample\yaml\risingwavelabs_risingwave\integration_tests\vector\docker-compose.yml | docker-compose.yml | YAML | 1,023 | 0.7 | 0 | 0 | python-kit | 109 | 2025-03-03T10:06:45.093700 | Apache-2.0 | true | ce40b49bf8e70072dcb417355de4fac4 |
table_name: "sink_bench"\npk_indices:\n - 0\ncolumns:\n - name: "v1"\n type: "int"\n - name: "v2"\n type: "bigint"\n - name: "v3"\n type: "varchar"\n - name: "v4"\n type: "varchar" | dataset_sample\yaml\risingwavelabs_risingwave\src\bench\sink_bench\schema.yml | schema.yml | YAML | 186 | 0.7 | 0 | 0 | python-kit | 735 | 2023-08-21T14:42:14.989012 | Apache-2.0 | false | bdd8a4b4031bc3b58e47044c943b5f8f |
Clickhouse:\n connector: "clickhouse"\n type: "append-only"\n force_append_only: "true"\n clickhouse.url: "http://127.0.0.1:8123"\n clickhouse.user: "default"\n clickhouse.password: ""\n clickhouse.database: "default"\n clickhouse.table: "sink_bench"\nRedis:\n connector: 'redis'\n primary_key: 'v1'\n type: 'append-only'\n redis.url: 'redis://127.0.0.1:6379/'\nKafka:\n connector: 'kafka'\n properties.bootstrap.server: '127.0.0.1:9092'\n topic : 'counts'\n primary_key: 'v1'\nPulsar:\n connector: 'pulsar'\n pulsar.topic: 'twitter'\n pulsar.service.url: 'pulsar://127.0.0.1:6650'\nIceberg:\n connector: 'iceberg'\n type: 'append-only'\n warehouse.path: 's3a://iceberg-data'\n s3.endpoint: 'http://127.0.0.1:9301'\n s3.access.key: 'hummockadmin'\n s3.secret.key: 'hummockadmin'\n database.name: 'demo_db'\n s3.region: '1'\n table.name: 'demo_db.demo_table'\n force_append_only: 'true'\n# RemoteIceberg:\n# connector: 'iceberg_java'\n# type: 'append-only'\n# force_append_only: 'true'\n# warehouse.path: 's3://iceberg-data'\n# s3.endpoint: 'http://127.0.0.1:9301'\n# s3.access.key: 'hummockadmin'\n# s3.secret.key: 'hummockadmin'\n# database.name: 'demo_db'\n# table.name: 'demo_table'\nMysql:\n connector: 'jdbc'\n type: 'append-only'\n force_append_only: 'true'\n jdbc.url: 'jdbc:mysql://127.0.0.1:3306/mydb?user=username&password=123456'\n table.name: 'bench_table'\nPostgres:\n connector: 'jdbc'\n type: 'append-only'\n force_append_only: 'true'\n jdbc.url: 'jdbc:postgresql://127.0.0.1:5432/mydb?user=postgres&password=123'\n table.name: 'bench_table'\nDeltaLake:\n connector: 'deltalake'\n type: 'append-only'\n force_append_only: 'true'\n location: 's3a://deltalake-bench/deltalake'\n s3.access.key: 'hummockadmin'\n s3.secret.key: 'hummockadmin'\n s3.endpoint: 'http://127.0.0.1:9301'\nElasticSearch:\n connector: 'elasticsearch'\n index: 'test'\n url: 'http://127.0.0.1:9200'\n username: 'elastic'\n password: 'risingwave'\n delimiter: '_'\nCassandra:\n connector: 'cassandra'\n type: 'append-only'\n force_append_only: 'true'\n cassandra.url: '127.0.0.1:9042'\n cassandra.keyspace: 'demo'\n cassandra.table: 'table_bench'\n cassandra.datacenter: 'datacenter1'\nDoris:\n connector: 'doris'\n type: 'append-only'\n doris.url: 'http://127.0.0.1:8030'\n doris.user: 'users'\n doris.password: '123456'\n doris.database: 'demo'\n doris.table: 'table_bench'\n force_append_only: 'true'\nStarrocks:\n connector: 'starrocks'\n type: 'append-only'\n starrocks.host: '127.0.0.1'\n starrocks.mysqlport: '9030'\n starrocks.httpport: '8030'\n starrocks.user: 'users'\n starrocks.password: '123456'\n starrocks.database: 'demo'\n starrocks.table: 'table_bench'\n force_append_only: 'true'\nBigQuery:\n connector: 'bigquery'\n type: 'append-only'\n bigquery.local.path: 'xxx.json'\n bigquery.project: 'xxx'\n bigquery.dataset: 'test_bigquery_sink'\n bigquery.table: 'table_bench'\n force_append_only: 'true' | dataset_sample\yaml\risingwavelabs_risingwave\src\bench\sink_bench\sink_option.yml | sink_option.yml | YAML | 2,897 | 0.8 | 0 | 0.093458 | node-utils | 840 | 2025-03-27T16:21:07.671560 | Apache-2.0 | false | 44bac967dfcd59ff4b2af53e2e80ea8e |
- id: create_source\n sql: |\n CREATE SOURCE s(x int,y int)\n WITH(\n connector='kafka',\n topic = 'mytopic',\n properties.bootstrap.server = '6',\n scan.startup.mode = 'earliest',\n ) FORMAT PLAIN ENCODE JSON;\n expected_outputs: []\n- with_config_map:\n streaming_use_shared_source: true\n sql: |\n /* The shared source config doesn't affect table with connector. */\n EXPLAIN CREATE TABLE s(x int,y int)\n WITH(\n connector='kafka',\n topic = 'mytopic',\n properties.bootstrap.server = '6',\n scan.startup.mode = 'earliest',\n ) FORMAT PLAIN ENCODE JSON;\n expected_outputs:\n - explain_output\n# Note: The execution order is first apply config, then execute "before", then execute "sql"\n# We use with_config_map to control the config when CREATE SOURCE, and use another SET statement to change the config for CREATE MV\n#\n# batch: All 4 plans should be the same.\n# stream: StreamSourceScan (with backfill) should be used only for the last 2. The first 2 use StreamSource. streaming_use_shared_source changes the behavior of CREATE SOURCE, but not CREATE MATERIALIZED VIEW\n- with_config_map:\n streaming_use_shared_source: false\n before:\n - create_source\n sql: |\n SET streaming_use_shared_source = false;\n select * from s;\n expected_outputs:\n - batch_plan\n - stream_plan\n- with_config_map:\n streaming_use_shared_source: false\n before:\n - create_source\n sql: |\n SET streaming_use_shared_source = true;\n select * from s;\n expected_outputs:\n - batch_plan\n - stream_plan\n- with_config_map:\n streaming_use_shared_source: true\n before:\n - create_source\n sql: |\n SET streaming_use_shared_source = false;\n select * from s;\n expected_outputs:\n - batch_plan\n - stream_plan\n- with_config_map:\n streaming_use_shared_source: true\n before:\n - create_source\n sql: |\n SET streaming_use_shared_source = true;\n select * from s;\n expected_outputs:\n - batch_plan\n - stream_plan\n | dataset_sample\yaml\risingwavelabs_risingwave\src\frontend\planner_test\tests\testdata\input\shared_source.yml | shared_source.yml | YAML | 2,007 | 0.8 | 0.029412 | 0.088235 | vue-tools | 422 | 2025-01-23T22:35:53.653476 | MIT | true | 2b90833a695965bb3c8e3c48d68218fb |
# This file is automatically generated. See `src/frontend/planner_test/README.md` for more information.\n- id: create_source\n sql: |\n CREATE SOURCE s(x int,y int)\n WITH(\n connector='kafka',\n topic = 'mytopic',\n properties.bootstrap.server = '6',\n scan.startup.mode = 'earliest',\n ) FORMAT PLAIN ENCODE JSON;\n- sql: |\n /* The shared source config doesn't affect table with connector. */\n EXPLAIN CREATE TABLE s(x int,y int)\n WITH(\n connector='kafka',\n topic = 'mytopic',\n properties.bootstrap.server = '6',\n scan.startup.mode = 'earliest',\n ) FORMAT PLAIN ENCODE JSON;\n explain_output: |\n StreamMaterialize { columns: [x, y, _row_id(hidden)], stream_key: [_row_id], pk_columns: [_row_id], pk_conflict: Overwrite }\n └─StreamRowIdGen { row_id_index: 2 }\n └─StreamUnion { all: true }\n ├─StreamExchange [no_shuffle] { dist: SomeShard }\n │ └─StreamSource { source: s, columns: [x, y, _row_id] }\n └─StreamExchange { dist: HashShard(_row_id) }\n └─StreamDml { columns: [x, y, _row_id] }\n └─StreamSource\n with_config_map:\n streaming_use_shared_source: 'true'\n- before:\n - create_source\n sql: |\n SET streaming_use_shared_source = false;\n select * from s;\n batch_plan: |-\n BatchExchange { order: [], dist: Single }\n └─BatchProject { exprs: [x, y] }\n └─BatchKafkaScan { source: s, columns: [x, y, _rw_kafka_timestamp, _row_id], filter: (None, None) }\n stream_plan: |-\n StreamMaterialize { columns: [x, y, _row_id(hidden)], stream_key: [_row_id], pk_columns: [_row_id], pk_conflict: NoCheck }\n └─StreamProject { exprs: [x, y, _row_id] }\n └─StreamRowIdGen { row_id_index: 3 }\n └─StreamSource { source: s, columns: [x, y, _rw_kafka_timestamp, _row_id] }\n with_config_map:\n streaming_use_shared_source: 'false'\n- before:\n - create_source\n sql: |\n SET streaming_use_shared_source = true;\n select * from s;\n batch_plan: |-\n BatchExchange { order: [], dist: Single }\n └─BatchProject { exprs: [x, y] }\n └─BatchKafkaScan { source: s, columns: [x, y, _rw_kafka_timestamp, _row_id], filter: (None, None) }\n stream_plan: |-\n StreamMaterialize { columns: [x, y, _row_id(hidden)], stream_key: [_row_id], pk_columns: [_row_id], pk_conflict: NoCheck }\n └─StreamProject { exprs: [x, y, _row_id] }\n └─StreamRowIdGen { row_id_index: 3 }\n └─StreamSource { source: s, columns: [x, y, _rw_kafka_timestamp, _row_id] }\n with_config_map:\n streaming_use_shared_source: 'false'\n- before:\n - create_source\n sql: |\n SET streaming_use_shared_source = false;\n select * from s;\n batch_plan: |-\n BatchExchange { order: [], dist: Single }\n └─BatchProject { exprs: [x, y] }\n └─BatchKafkaScan { source: s, columns: [x, y, _rw_kafka_timestamp, _rw_kafka_partition, _rw_kafka_offset, _row_id], filter: (None, None) }\n stream_plan: |-\n StreamMaterialize { columns: [x, y, _row_id(hidden)], stream_key: [_row_id], pk_columns: [_row_id], pk_conflict: NoCheck }\n └─StreamProject { exprs: [x, y, _row_id] }\n └─StreamRowIdGen { row_id_index: 5 }\n └─StreamSourceScan { columns: [x, y, _rw_kafka_timestamp, _rw_kafka_partition, _rw_kafka_offset, _row_id] }\n with_config_map:\n streaming_use_shared_source: 'true'\n- before:\n - create_source\n sql: |\n SET streaming_use_shared_source = true;\n select * from s;\n batch_plan: |-\n BatchExchange { order: [], dist: Single }\n └─BatchProject { exprs: [x, y] }\n └─BatchKafkaScan { source: s, columns: [x, y, _rw_kafka_timestamp, _rw_kafka_partition, _rw_kafka_offset, _row_id], filter: (None, None) }\n stream_plan: |-\n StreamMaterialize { columns: [x, y, _row_id(hidden)], stream_key: [_row_id], pk_columns: [_row_id], pk_conflict: NoCheck }\n └─StreamProject { exprs: [x, y, _row_id] }\n └─StreamRowIdGen { row_id_index: 5 }\n └─StreamSourceScan { columns: [x, y, _rw_kafka_timestamp, _rw_kafka_partition, _rw_kafka_offset, _row_id] }\n with_config_map:\n streaming_use_shared_source: 'true'\n | dataset_sample\yaml\risingwavelabs_risingwave\src\frontend\planner_test\tests\testdata\output\shared_source.yml | shared_source.yml | YAML | 4,183 | 0.8 | 0.010638 | 0.021277 | python-kit | 373 | 2025-05-21T09:03:54.524786 | GPL-3.0 | true | 4d07302094e7a0d1ada1049cd4213f54 |
- input: SELECT 'a''b'\n formatted_sql: SELECT 'a''b'\n\n- input: SELECT e'a\'b'\n formatted_sql: SELECT E'a\'b'\n\n- input: SELECT e'a''b'\n formatted_sql: SELECT E'a\'b'\n\n- input: SELECT '\\'\n formatted_sql: SELECT '\\'\n\n- input: SELECT e'\\'\n formatted_sql: SELECT E'\\'\n | dataset_sample\yaml\risingwavelabs_risingwave\src\sqlparser\tests\testdata\escape_string.yml | escape_string.yml | YAML | 272 | 0.7 | 0 | 0 | python-kit | 208 | 2023-12-13T12:27:34.591888 | MIT | true | 29d05326d746ca4e2c1ca71d47494b4d |
global:\n scrape_interval: 15s\n evaluation_interval: 60s\n external_labels:\n rw_cluster: 20240506-185437\n\n\nscrape_configs:\n - job_name: prometheus\n static_configs:\n - targets: ["127.0.0.1:9500"]\n\n - job_name: compute\n static_configs:\n - targets: ["127.0.0.1:1222"]\n\n - job_name: meta\n static_configs:\n - targets: ["127.0.0.1:1250"]\n\n - job_name: minio\n metrics_path: /minio/v2/metrics/cluster\n static_configs:\n - targets: ["127.0.0.1:9301"]\n\n - job_name: compactor\n static_configs:\n - targets: ["127.0.0.1:1260"]\n\n - job_name: etcd\n static_configs:\n - targets: ["127.0.0.1:2379"]\n\n - job_name: frontend\n static_configs:\n - targets: ["127.0.0.1:2222"]\n\n - job_name: redpanda\n static_configs:\n - targets: []\n | dataset_sample\yaml\risingwavelabs_risingwave\standalone\prometheus.yml | prometheus.yml | YAML | 783 | 0.7 | 0 | 0 | awesome-app | 897 | 2024-10-01T19:21:48.825936 | GPL-3.0 | false | d3b94dcdf92f86b4fe66d49a61b99bdb |
# These are supported funding model platforms\n\ngithub: # Replace with up to 4 GitHub Sponsors-enabled usernames e.g., [user1, user2]\npatreon: rocksdanister\nopen_collective: # Replace with a single Open Collective username\nko_fi: # Replace with kofi username\ntidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel\ncommunity_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry\nliberapay: # Replace with a single Liberapay username\nissuehunt: # Replace with a single IssueHunt username\notechie: # Replace with a single Otechie username\ncustom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']\n | dataset_sample\yaml\rocksdanister_lively\.github\FUNDING.yml | FUNDING.yml | YAML | 674 | 0.8 | 0 | 0.090909 | awesome-app | 193 | 2024-11-13T19:38:54.845779 | BSD-3-Clause | false | a93e15589af6a54c9d138f546e5e4cfa |
# See https://github.com/actions/labeler\n\nA-Cli:\n - crates/rome_cli/**\n - crates/rome_fs/**\n - crates/rome_console/**\n\nA-Core:\n - crates/rome_deserializer/**\n - crates/rome_fs/**\n - crates/rome_rowan/**\n - crates/rome_text_edit/**\n - crates/rome_text_size/**\n - crates/rome_suppression/**\n - crates/rome_markup/**\n\nA-Project:\n- crates/rome_service/**\n\nA-Linter:\n - crates/rome_analyze/**\n - crates/rome_js_analyze/**\n - crates/rome_json_analyze/**\n\nA-Parser:\n - crates/rome_parser/**\n - crates/rome_js_parser/**\n - crates/rome_js_syntax/**\n - crates/rome_json_parser/**\n - crates/rome_json_syntax/**\n - crates/rome_css_parser/**\n - crates/rome_css_syntax/**\n\nA-Formatter:\n - crates/rome_formatter/**\n - crates/rome_formatter_test/**\n - crates/rome_js_formatter/**\n - crates/rome_json_formatter/**\n - crates/rome_css_formatter/**\n\nA-Editors:\n - crates/editors/**\n\nA-Diagnostic:\n - crates/rome_diagnostics/**\n - crates/rome_diagnostics_categories/**\n - crates/rome_diagnostics_macros/**\n\nA-Tooling:\n - xtask/**\n\nA-Website:\n - website/**\n\nA-LSP:\n - crates/rome_lsp/**\n\nL-Javascript:\n - crates/rome_js_*/**\n\nL-Json:\n - crates/rome_json_*/**\n\nL-CSS:\n - crates/rome_css_*/**\n | dataset_sample\yaml\rome_tools\.github\labeler.yml | labeler.yml | YAML | 1,235 | 0.8 | 0 | 0.019231 | react-lib | 97 | 2024-01-02T14:27:57.679651 | Apache-2.0 | false | bbf726af4c0d149d3f33b91061a6423f |
name: 🐛 Bug Report\ndescription: Report a possible bug or regression\ntitle: "🐛 <TITLE>"\nlabels: ["S-To triage"]\nbody:\n - type: markdown\n attributes:\n value: Thank you for submitting the bug! We'll try to triage it ASAP!\n - type: markdown\n attributes:\n value: |\n Bug reports that don't follow this template will be closed.\n Please provide a clear and concise description of what the bug is.\n - type: textarea\n id: environment\n attributes:\n label: Environment information\n description: Run the command `rome rage` and paste its output here. Please review it in case there is sensitive information you don't want to share.\n render: block\n validations:\n required: true\n - type: textarea\n id: steps-to-reproduce\n attributes:\n label: What happened?\n description: |\n Provide a detailed list of steps that reproduce the issue.\n The more information and included steps, the quicker your report can be triaged and addressed!\n \n You can also use the [playground](https://play.rome.tools) to share code snippets.\n This is useful to reproduce the issue.\n placeholder: |\n 1.\n 2.\n validations:\n required: true\n - type: textarea\n id: expected-result\n attributes:\n label: Expected result\n description: Describe what you expected to happen.\n placeholder: It should not throw an error.\n validations:\n required: true\n - type: checkboxes\n id: terms\n attributes:\n label: Code of Conduct\n description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/rome/tools/blob/main/CODE_OF_CONDUCT.md)\n options:\n - label: I agree to follow Rome's Code of Conduct\n required: true\n | dataset_sample\yaml\rome_tools\.github\ISSUE_TEMPLATE\01_bug.yml | 01_bug.yml | YAML | 1,799 | 0.95 | 0.038462 | 0 | python-kit | 780 | 2024-10-30T06:35:41.482213 | Apache-2.0 | false | 5f609429b177b2b1da5c4238833ea45d |
name: 📎 Task\ndescription: Specific actionable task. Typically opened by existing contributors. If you aren't sure, then create a discussion.\ntitle: "📎 <TITLE> "\nlabels: ["task"]\nbody:\n - type: textarea\n id: description\n attributes:\n label: Description\n description: A summary of the task\n validations:\n required: true\n | dataset_sample\yaml\rome_tools\.github\ISSUE_TEMPLATE\02_task.yml | 02_task.yml | YAML | 349 | 0.85 | 0 | 0 | awesome-app | 613 | 2025-01-21T15:07:00.753192 | Apache-2.0 | false | ef9493ac374d0a5047e656bc79137885 |
name: ☂️ Umbrella\ndescription: Discuss and track high level goals for a collection of other issues. Should only be opened with prior discussion.\ntitle: "☂️ <TITLE> "\nlabels: ["umbrella"]\nbody:\n - type: textarea\n id: description\n attributes:\n label: Description\n description: Summary of the umbrella. Describe the scope and list the action items needed\n validations:\n required: true\n | dataset_sample\yaml\rome_tools\.github\ISSUE_TEMPLATE\03_umbrella.yml | 03_umbrella.yml | YAML | 415 | 0.85 | 0.083333 | 0 | python-kit | 573 | 2025-05-23T06:23:12.552921 | Apache-2.0 | false | 27be184867791dda857711effa0f6423 |
blank_issues_enabled: false\ncontact_links:\n - name: 🗣️ Chat\n url: https://discord.gg/rome\n about: Our Discord server is active and is used for real-time discussions including contribution collaboration, questions, and more!\n - name: 🤔 Support\n url: https://discord.gg/rome\n about: "This issue tracker is not for support questions. Visit our community Discord and the #support channel for assistance!"\n - name: 🆘 Code of Conduct Reports\n url: https://github.com/rome/tools/blob/main/CODE_OF_CONDUCT.md#contributor-covenant-code-of-conduct\n about: Please use the contact information in Code of Conduct as issues created here are public\n - name: 💡 Feature requests\n url: https://github.com/rome/tools/discussions/new\n about: Please use a new Github discussion to propose ideas of feature requests\n | dataset_sample\yaml\rome_tools\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 836 | 0.8 | 0.214286 | 0 | node-utils | 894 | 2023-08-20T15:42:59.551875 | BSD-3-Clause | false | 0eec8431c0cd35992d2dd09cf90ca076 |
# Analyzer benchmark, compares main and PR branch with Criterion.\n# Comment with text containing `!bench_analyzer`, a new result will be commented at the bottom of this PR.\n\nname: Analyzer Benchmark\n\non:\n issue_comment:\n types: [ created ]\n\nenv:\n RUST_LOG: info\n\njobs:\n bench:\n name: Bench\n if: github.event.issue.pull_request && contains(github.event.comment.body, '!bench_analyzer')\n runs-on: ubuntu-latest\n\n steps:\n - name: Get PR SHA\n id: sha\n uses: actions/github-script@v6\n with:\n result-encoding: string\n script: |\n const response = await github.request(context.payload.issue.pull_request.url);\n return response.data.head.sha;\n\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n ref: ${{ steps.sha.outputs.result }}\n\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n\n - name: Install critcmp\n run: cargo install critcmp\n\n - name: Compile\n run: cargo build --release --locked -p xtask_bench\n\n - name: Run Bench on PR Branch\n run: cargo bench_analyzer --save-baseline pr\n\n - name: Checkout Main Branch\n uses: actions/checkout@v3\n with:\n clean: false\n ref: main\n\n - name: Run Bench on Main Branch\n run: cargo bench_analyzer --save-baseline main\n\n - name: Compare Bench Results\n id: bench_comparison\n shell: bash\n run: |\n echo "### Analyzer Benchmark Results" > output\n echo "\`\`\`" >> output\n critcmp main pr >> output\n echo "\`\`\`" >> output\n cat output\n echo "comment<<EOF" >> $GITHUB_OUTPUT\n cat output >> $GITHUB_OUTPUT\n echo "EOF" >> $GITHUB_OUTPUT\n\n - name: Write a new comment\n uses: peter-evans/create-or-update-comment@v1.4.5\n continue-on-error: true\n with:\n issue-number: ${{ github.event.issue.number }}\n body: |\n ${{ steps.bench_comparison.outputs.comment }}\n\n - name: Remove Criterion Artifact\n run: rm -rf ./target/criterion\n | dataset_sample\yaml\rome_tools\.github\workflows\bench_analyzer.yml | bench_analyzer.yml | YAML | 2,157 | 0.8 | 0.012821 | 0.031746 | node-utils | 673 | 2023-09-18T15:28:05.406322 | GPL-3.0 | false | bb0ce8ca7b490efe0f822254dcfeb1fe |
# CLI benchmark, compares main and PR branch with Hyperfine.\n# Comment with text containing `!bench_cli`, a new result will be commented at the bottom of this PR.\n\nname: CLI Benchmark\n\non:\n issue_comment:\n types: [ created ]\n\nenv:\n RUST_LOG: info\n\njobs:\n bench:\n name: Bench\n if: github.event.issue.pull_request && contains(github.event.comment.body, '!bench_cli')\n runs-on: ubuntu-latest\n\n steps:\n - name: Get PR SHA\n id: sha\n uses: actions/github-script@v6\n with:\n result-encoding: string\n script: |\n const response = await github.request(context.payload.issue.pull_request.url);\n return response.data.head.sha;\n\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n ref: ${{ steps.sha.outputs.result }}\n\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n\n - name: Install hyperfine\n run: cargo install hyperfine\n\n - name: Compile on PR Branch\n run: |\n cargo build --release --bin rome\n mkdir -p benchmark/target\n cp target/release/rome benchmark/target/rome_pr\n\n - name: Checkout Main Branch\n uses: actions/checkout@v3\n with:\n clean: false\n ref: main\n\n - name: Compile on Main Branch\n run: |\n cargo build --release --bin rome\n cp target/release/rome benchmark/target/rome_main\n\n - name: Checkout webpack\n uses: actions/checkout@v3\n with:\n repository: webpack/webpack\n path: benchmark/target/webpack\n - name: Checkout prettier\n uses: actions/checkout@v3\n with:\n repository: prettier/prettier\n path: benchmark/target/prettier\n - name: Checkout eslint\n uses: actions/checkout@v3\n with:\n repository: eslint/eslint\n path: benchmark/target/eslint\n\n - name: Run Benchmarks\n id: benchmarks\n working-directory: benchmark/target\n env:\n FORMAT_BENCH_COMMAND: "format webpack/lib webpack/examples webpack/declarations webpack/benchmark prettier/src prettier/scripts --write"\n CHECK_BENCH_COMMAND: "--max-diagnostics=0 eslint/lib eslint/messages eslint/tests/lib eslint/tests/performance eslint/tools webpack/lib"\n run: |\n hyperfine -w 2 --export-markdown benchmark_format.md \\n -n "rome format (main)" "./rome_main $FORMAT_BENCH_COMMAND" \\n -n "rome format (pr)" "./rome_pr $FORMAT_BENCH_COMMAND"\n hyperfine -w 2 --export-markdown benchmark_check.md \\n -n "rome check (main)" "./rome_main check $CHECK_BENCH_COMMAND" \\n -n "rome check (pr)" "./rome_pr check $CHECK_BENCH_COMMAND"\n cat benchmark_format.md >> benchmark.md\n echo $'\n' >> benchmark.md\n cat benchmark_check.md >> benchmark.md\n\n - name: Write a new comment\n uses: peter-evans/create-or-update-comment@v2\n continue-on-error: true\n with:\n issue-number: ${{ github.event.issue.number }}\n body-file: benchmark/target/benchmark.md\n | dataset_sample\yaml\rome_tools\.github\workflows\bench_cli.yml | bench_cli.yml | YAML | 3,160 | 0.8 | 0.010417 | 0.02439 | react-lib | 81 | 2024-07-02T22:33:51.279925 | BSD-3-Clause | false | c2d47983a23c29a83dfe9b3f0b589aad |
# Formatter benchmark, compares main and PR branch with Criterion.\n# Comment with text containing `!bench_formatter`, a new result will be commented at the bottom of this PR.\n\nname: Formatter Benchmark\n\non:\n issue_comment:\n types: [ created ]\n\nenv:\n RUST_LOG: info\n\njobs:\n bench:\n name: Bench\n if: github.event.issue.pull_request && contains(github.event.comment.body, '!bench_formatter')\n runs-on: ubuntu-latest\n\n steps:\n - name: Get PR SHA\n id: sha\n uses: actions/github-script@v6\n with:\n result-encoding: string\n script: |\n const response = await github.request(context.payload.issue.pull_request.url);\n return response.data.head.sha;\n\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n ref: ${{ steps.sha.outputs.result }}\n\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n\n - name: Install critcmp\n run: cargo install critcmp\n\n - name: Compile\n run: cargo build --release --locked -p xtask_bench\n\n - name: Run Bench on PR Branch\n run: cargo bench_formatter --save-baseline pr\n\n - name: Checkout Main Branch\n uses: actions/checkout@v3\n with:\n clean: false\n ref: main\n\n - name: Run Bench on Main Branch\n run: cargo bench_formatter --save-baseline main\n\n - name: Compare Bench Results\n id: bench_comparison\n shell: bash\n run: |\n echo "### Formatter Benchmark Results" > output\n echo "\`\`\`" >> output\n critcmp main pr >> output\n echo "\`\`\`" >> output\n cat output\n echo "comment<<EOF" >> $GITHUB_OUTPUT\n cat output >> $GITHUB_OUTPUT\n echo "EOF" >> $GITHUB_OUTPUT\n\n - name: Write a new comment\n uses: peter-evans/create-or-update-comment@v1.4.5\n continue-on-error: true\n with:\n issue-number: ${{ github.event.issue.number }}\n body: |\n ${{ steps.bench_comparison.outputs.comment }}\n\n - name: Remove Criterion Artifact\n run: rm -rf ./target/criterion\n | dataset_sample\yaml\rome_tools\.github\workflows\bench_formatter.yml | bench_formatter.yml | YAML | 2,162 | 0.8 | 0.012821 | 0.031746 | awesome-app | 457 | 2024-08-23T20:34:11.205128 | GPL-3.0 | false | d93bef715735c33d3c0d662936036e6f |
# Parser benchmark, compares main and PR branch with Criterion.\n# Comment with text containing `!bench_parser`, a new result will be commented at the bottom of this PR.\n\nname: Parser Benchmark\n\non:\n issue_comment:\n types: [created]\n\nenv:\n RUST_LOG: info\n\njobs:\n bench:\n name: Bench\n if: github.event.issue.pull_request && contains(github.event.comment.body, '!bench_parser')\n runs-on: ubuntu-latest\n\n steps:\n - name: Get PR SHA\n id: sha\n uses: actions/github-script@v6\n with:\n result-encoding: string\n script: |\n const response = await github.request(context.payload.issue.pull_request.url);\n return response.data.head.sha;\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n ref: ${{ steps.sha.outputs.result }}\n\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n\n - name: Install critcmp\n run: cargo install critcmp\n\n - name: Compile\n run: cargo build --release --locked -p xtask_bench\n\n - name: Run Bench on PR Branch\n run: cargo bench_parser --save-baseline pr\n\n - name: Checkout Main Branch\n uses: actions/checkout@v3\n with:\n clean: false\n ref: main\n\n - name: Run Bench on Main Branch\n run: cargo bench_parser --save-baseline main\n\n - name: Compare Bench Results\n id: bench_comparison\n shell: bash\n run: |\n echo "### Parser Benchmark Results" > output\n echo "\`\`\`" >> output\n critcmp main pr >> output\n echo "\`\`\`" >> output\n cat output\n echo "comment<<EOF" >> $GITHUB_OUTPUT\n cat output >> $GITHUB_OUTPUT\n echo "EOF" >> $GITHUB_OUTPUT\n\n - name: Write a new comment\n uses: peter-evans/create-or-update-comment@v1.4.5\n continue-on-error: true\n with:\n issue-number: ${{ github.event.issue.number }}\n body: |\n ${{ steps.bench_comparison.outputs.comment }}\n\n - name: Remove Criterion Artifact\n run: rm -rf ./target/criterion\n | dataset_sample\yaml\rome_tools\.github\workflows\bench_parser.yml | bench_parser.yml | YAML | 2,144 | 0.8 | 0.012987 | 0.031746 | node-utils | 192 | 2025-04-12T07:23:44.983871 | Apache-2.0 | false | 4bedfa450f4c906682f4a5800ccff9f3 |
# Automatically labels PRs based on the configuration file\n# you are probably looking for 👉 `.github/labeler.yml`\nname: Label PRs\n\non:\n - pull_request_target\n\njobs:\n triage:\n runs-on: ubuntu-latest\n if: github.repository_owner == 'rome'\n steps:\n - uses: actions/labeler@v4\n with:\n repo-token: "${{ secrets.GITHUB_TOKEN }}"\n sync-labels: true\n | dataset_sample\yaml\rome_tools\.github\workflows\labels.yml | labels.yml | YAML | 386 | 0.8 | 0.125 | 0.142857 | react-lib | 449 | 2024-04-20T15:38:16.596601 | MIT | false | a65c6c2d3936e9df803481ef6d16a1ec |
\n# This workflow runs when something is pushed on main and it does:\n # - normal checks like in the normal PRs\n # - expand the test suite to be run on multiple OS\n # - runs the coverage and prints in the command line\nname: CI on main\non:\n workflow_dispatch:\n push:\n branches:\n - main\n\nenv:\n RUST_LOG: info\n RUST_BACKTRACE: 1\n\njobs:\n format:\n name: Format Rust Files\n runs-on: ubuntu-latest\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n - name: Support longpaths\n run: git config core.longpaths true\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n components: rustfmt\n - name: Run rustfmt\n run: cargo fmt --all --verbose -- --check\n\n lint:\n name: Lint Rust Files\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n components: clippy\n - name: Run cargo check\n run: cargo check --workspace --all-targets --release\n - name: Run clippy\n run: cargo lint\n\n check-dependencies:\n name: Check Dependencies\n runs-on: ubuntu-latest\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n channel: nightly\n - name: Install udeps\n run: cargo install cargo-udeps --locked\n - name: Run udeps\n run: cargo +nightly udeps\n\n\n test:\n strategy:\n matrix:\n include:\n - os: windows-latest\n - os: ubuntu-latest\n - os: macos-latest\n\n name: Test\n runs-on: ${{ matrix.os }}\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n bins: cargo-nextest\n - name: Run tests on ${{ matrix.os }}\n run: cargo nextest run --workspace --verbose\n - name: Clean cache\n run: cargo cache --autoclean\n\n documentation:\n name: Documentation\n environment: netlify-rustdocs\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install netlify-cli\n run: npm i -g netlify-cli\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Run doc command\n run: cargo documentation\n - name: Write index.html\n run: echo '<meta http-equiv="refresh" content="0; url=/rome/index.html">' >target/doc/index.html\n - name: Write _redirects\n run: echo '/ /rome/index.html' >target/doc/_redirects\n - name: Deploy documentation\n run: |\n netlify deploy --dir=./target/doc --prod --site rustdocs-rometools --auth ${{ secrets.NETLIFY_AUTH_TOKEN }}\n\n coverage:\n name: Test262 Coverage\n runs-on: ${{ matrix.os }}\n\n strategy:\n fail-fast: false\n matrix:\n os: [windows-latest, ubuntu-latest]\n\n steps:\n # ref: https://github.com/orgs/community/discussions/26952\n - name: Support longpaths\n if: matrix.os == 'windows-latest'\n run: git config --system core.longpaths true\n - name: Checkout repository\n uses: actions/checkout@v3\n with:\n submodules: recursive\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Compile\n run: cargo build --release --locked -p xtask_coverage\n - name: Run Test262 suite\n continue-on-error: true\n run: cargo coverage\n - name: Clean cache\n run: cargo cache --autoclean\n | dataset_sample\yaml\rome_tools\.github\workflows\main.yml | main.yml | YAML | 3,813 | 0.8 | 0.007194 | 0.03937 | python-kit | 281 | 2025-01-02T15:40:37.827071 | GPL-3.0 | false | fab30e20d7b4db21a46adb818d00ddc4 |
# Test coverage job. It is run on pull request because it prints the results via comment\nname: Parser conformance and comparison\non:\n pull_request:\n branches:\n - main\n paths:\n - 'crates/rome_js_syntax/**'\n - 'crates/rome_js_factory/**'\n - 'crates/rome_js_parser/**'\n - 'crates/rome_rowan/**'\n - 'xtask/**'\n\nenv:\n RUST_LOG: info\n RUST_BACKTRACE: 1\n\njobs:\n coverage:\n name: Parser conformance\n runs-on: ${{ matrix.os }}\n\n strategy:\n fail-fast: false\n matrix:\n os: [ ubuntu-latest ]\n\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n\n - name: Support longpaths\n run: git config core.longpaths true\n\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: recursive\n\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n\n - name: Compile\n run: cargo build --release --locked -p xtask_coverage\n\n - name: Run Test suites\n continue-on-error: true\n run: cargo coverage --json > new_results.json\n\n - name: Checkout main Branch\n uses: actions/checkout@v3\n with:\n clean: false\n ref: main\n submodules: recursive\n\n - name: Run Test suites on main branch\n continue-on-error: true\n run: cargo coverage --json > base_results.json\n\n - name: Compare results on ${{ matrix.os }}\n if: github.event_name == 'pull_request'\n id: comparison\n shell: bash\n run: |\n echo "## Parser conformance results on ${{ matrix.os }}" > output\n cargo coverage compare ./base_results.json ./new_results.json --markdown >> output\n cat output\n echo "comment<<EOF" >> $GITHUB_OUTPUT\n cat output >> $GITHUB_OUTPUT\n echo "EOF" >> $GITHUB_OUTPUT\n\n\n - name: Get the PR number\n if: github.event_name == 'pull_request'\n id: pr-number\n uses: kkak10/pr-number-action@v1.3\n\n - name: Find Previous Comment\n if: github.event_name == 'pull_request'\n uses: peter-evans/find-comment@v1.3.0\n id: previous-comment\n with:\n issue-number: ${{ steps.pr-number.outputs.pr }}\n body-includes: Parser conformance results on ${{ matrix.os }}\n\n - name: Update existing comment\n if: github.event_name == 'pull_request' && steps.previous-comment.outputs.comment-id\n uses: peter-evans/create-or-update-comment@v1.4.5\n continue-on-error: true\n with:\n comment-id: ${{ steps.previous-comment.outputs.comment-id }}\n body: |\n ${{ steps.comparison.outputs.comment }}\n edit-mode: replace\n\n - name: Write a new comment\n if: github.event_name == 'pull_request' && !steps.previous-comment.outputs.comment-id\n uses: peter-evans/create-or-update-comment@v1.4.5\n continue-on-error: true\n with:\n issue-number: ${{ steps.pr-number.outputs.pr }}\n body: |\n ${{ steps.comparison.outputs.comment }}\n\n - name: Clean cache\n run: cargo cache --autoclean\n | dataset_sample\yaml\rome_tools\.github\workflows\parser_conformance.yml | parser_conformance.yml | YAML | 3,177 | 0.8 | 0.045872 | 0.010989 | node-utils | 185 | 2024-01-13T15:44:22.493213 | MIT | false | f8bb74da9c4a8fb03213cc44c013022f |
# Jobs run on pull request\nname: Pull request\non:\n pull_request:\n branches:\n - main\n paths: # Only run when changes are made to rust code or root Cargo\n - 'crates/**'\n - 'fuzz/**'\n - 'xtask/**'\n - 'Cargo.toml'\n - 'Cargo.lock'\n - 'rust-toolchain.toml'\n - 'rustfmt.toml'\n\nenv:\n RUST_LOG: info\n RUST_BACKTRACE: 1\n\njobs:\n format:\n name: Format Rust Files\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n components: rustfmt\n bins: taplo-cli\n - name: Run format\n run: |\n cargo fmt --all --check\n taplo fmt -- --locked\n\n lint:\n name: Lint Rust Files\n runs-on: ubuntu-latest\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n components: clippy\n - name: Run cargo check\n run: cargo lint\n\n check-dependencies:\n name: Check Dependencies\n runs-on: ubuntu-latest\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n - name: Install toolchain\n run: rustup toolchain install nightly\n - name: Install udeps\n run: cargo install cargo-udeps --locked\n - name: Run udeps\n run: cargo +nightly udeps\n\n\n test:\n strategy:\n matrix:\n include:\n - os: windows-latest\n - os: ubuntu-latest\n\n name: Test\n runs-on: ${{ matrix.os }}\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n bins: cargo-nextest\n - name: Run tests\n run: cargo nextest run --workspace --verbose\n - name: Run doctests\n run: cargo test --doc\n\n fuzz-all:\n name: Build and init fuzzers\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n bins: cargo-fuzz\n - name: Run init-fuzzer\n run: bash fuzz/init-fuzzer.sh\n\n test-node-api:\n name: Test node.js API\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n with:\n fetch-depth: 1\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Build main binary\n run: cargo build -p rome_cli --release\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n - name: Cache pnpm modules\n uses: actions/cache@v3\n with:\n path: ~/.pnpm-store\n key: ${{ runner.os }}-${{ hashFiles('**/pnpm-lock.yaml') }}\n restore-keys: |\n ${{ runner.os }}-\n - uses: pnpm/action-setup@v2.2.4\n with:\n version: 8\n - name: Install wasm-pack\n uses: jetli/wasm-pack-action@v0.4.0\n with:\n version: 'latest'\n - name: Build TypeScript code\n run: |\n pnpm --prefix npm/backend-jsonrpc i\n pnpm --prefix npm/backend-jsonrpc run build\n pnpm --prefix npm/js-api run build:wasm-bundler\n pnpm --prefix npm/js-api run build:wasm-node\n pnpm --prefix npm/js-api run build:wasm-web\n pnpm --prefix npm/js-api i\n pnpm --prefix npm/js-api run build\n - name: Run JS tests\n run: |\n pnpm --prefix npm/backend-jsonrpc test:ci\n pnpm --prefix npm/js-api test:ci\n\n documentation:\n name: Documentation\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Run doc command\n run: cargo documentation\n\n codegen:\n name: Codegen\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Run the grammar codegen\n run: cargo codegen grammar\n - name: Run the analyzer codegen\n run: cargo codegen analyzer\n - name: Run the configuration codegen\n run: cargo codegen-configuration\n - name: Run the schema codegen\n run: cargo codegen-schema\n - name: Run the bindings codegen\n run: cargo codegen-bindings\n - name: Run the website codegen\n run: |\n cargo lintdoc\n cargo codegen-website\n - name: Check for git diff\n run: |\n if [[ `git status --porcelain` ]]; then\n git status\n exit 1\n fi\n | dataset_sample\yaml\rome_tools\.github\workflows\pull_request.yml | pull_request.yml | YAML | 4,872 | 0.8 | 0.010582 | 0.00578 | vue-tools | 406 | 2023-09-03T02:12:46.482673 | Apache-2.0 | false | 48b8813d712f312d2b9ce7033ec0868b |
\n# Jobs run on pull request in js folders\nname: Pull request JS\non:\n pull_request:\n branches:\n - main\n paths: # Only run when changes are made to js code\n - 'website/**'\n - 'editors/**'\n - 'crates/**'\n - 'npm/js-api/**'\n\nenv:\n RUST_LOG: info\n RUST_BACKTRACE: 1\n\njobs:\n format-js:\n name: Check JS Files\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n - name: Cache\n uses: Swatinem/rust-cache@v2\n with:\n shared-key: "cli-release"\n\n - name: Run Rome Format\n run: cargo rome-cli ci editors website npm/js-api\n type-check:\n name: Type-check JS Files\n runs-on: ubuntu-latest\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n - uses: jetli/wasm-pack-action@v0.4.0\n - name: Cache pnpm modules\n uses: actions/cache@v3\n with:\n path: ~/.pnpm-store\n key: ${{ runner.os }}-${{ hashFiles('**/pnpm-lock.yaml') }}\n restore-keys: |\n ${{ runner.os }}-\n - uses: pnpm/action-setup@v2.2.4\n with:\n version: 8\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Build WASM module for the web\n run: wasm-pack build --out-dir ../../npm/wasm-web --target web --scope rometools crates/rome_wasm\n - name: Install libraries\n working-directory: website\n run: pnpm i\n - name: Build JS\n working-directory: website\n run: pnpm build:js\n - name: Type check\n working-directory: website\n run: pnpm tsc\n | dataset_sample\yaml\rome_tools\.github\workflows\pull_request_js.yml | pull_request_js.yml | YAML | 1,658 | 0.8 | 0.015873 | 0.016949 | node-utils | 385 | 2024-03-26T06:31:15.902370 | BSD-3-Clause | false | 2ad4f6fa37db2d00916a7ad6c5572dc2 |
name: Release CLI\non:\n workflow_dispatch:\n schedule:\n - cron: '0 0 * * 2-6'\n push:\n branches:\n - main\n paths:\n - npm/rome/package.json\n\njobs:\n check:\n name: Check version\n runs-on: ubuntu-latest\n outputs:\n version: ${{ env.version }}\n prerelease: ${{ env.prerelease }}\n nightly: ${{ env.nightly }}\n version_changed: ${{ steps.version.outputs.changed }}\n steps:\n - uses: actions/checkout@v3\n\n - name: Check nightly status\n id: nightly\n if: github.event_name == 'schedule' || github.event_name == 'workflow_dispatch'\n run: echo "nightly=true" >> $GITHUB_ENV\n\n - name: Check version changes\n uses: EndBug/version-check@v1\n if: env.nightly != 'true'\n id: version\n with:\n diff-search: true\n file-name: npm/rome/package.json\n\n - name: Set version name\n if: steps.version.outputs.changed == 'true'\n run: |\n echo "Version change found! New version: ${{ steps.version.outputs.version }} (${{ steps.version.outputs.version_type }})"\n echo "version=${{ steps.version.outputs.version }}" >> $GITHUB_ENV\n\n - name: Set prerelease status\n if: env.nightly == 'true'\n run: |\n echo "prerelease=true" >> $GITHUB_ENV\n echo "version=$(node npm/rome/scripts/update-nightly-version.mjs)" >> $GITHUB_ENV\n\n build:\n strategy:\n matrix:\n include:\n - os: windows-2022\n target: x86_64-pc-windows-msvc\n code-target: win32-x64\n - os: windows-2022\n target: aarch64-pc-windows-msvc\n code-target: win32-arm64\n - os: ubuntu-20.04\n target: x86_64-unknown-linux-gnu\n code-target: linux-x64\n - os: ubuntu-20.04\n target: aarch64-unknown-linux-gnu\n code-target: linux-arm64\n - os: macos-11\n target: x86_64-apple-darwin\n code-target: darwin-x64\n - os: macos-11\n target: aarch64-apple-darwin\n code-target: darwin-arm64\n\n name: Package ${{ matrix.code-target }}\n runs-on: ${{ matrix.os }}\n\n needs: check\n if: needs.check.outputs.version_changed == 'true' || needs.check.outputs.nightly == 'true'\n env:\n version: ${{ needs.check.outputs.version }}\n outputs:\n version: ${{ env.version }}\n prerelease: ${{ needs.check.outputs.prerelease }}\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n with:\n fetch-depth: 1\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n\n - name: Install Rust toolchain\n run: rustup target add ${{ matrix.target }}\n\n - name: Install arm64 toolchain\n if: matrix.code-target == 'linux-arm64'\n run: |\n sudo apt-get update\n sudo apt-get install -y gcc-aarch64-linux-gnu\n\n - name: Audit crates.io dependencies\n if: matrix.code-target == 'linux-x64'\n run: cargo audit\n\n # Build the CLI binary\n - name: Build binaries\n run: cargo build -p rome_cli --release --target ${{ matrix.target }}\n env:\n CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER: aarch64-linux-gnu-gcc\n # Strip all debug symbols from the resulting binaries\n RUSTFLAGS: "-C strip=symbols"\n # Inline the version of the npm package in the CLI binary\n ROME_VERSION: ${{ env.version }}\n\n # Copy the CLI binary and rename it to include the name of the target platform\n - name: Copy CLI binary\n if: matrix.os == 'windows-2022'\n run: |\n mkdir dist\n cp target/${{ matrix.target }}/release/rome.exe ./dist/rome-${{ matrix.code-target }}.exe\n - name: Copy CLI binary\n if: matrix.os != 'windows-2022'\n run: |\n mkdir dist\n cp target/${{ matrix.target }}/release/rome ./dist/rome-${{ matrix.code-target }}\n\n # Upload the CLI binary as a build artifact\n - name: Upload CLI artifact\n uses: actions/upload-artifact@v3\n with:\n name: cli\n path: ./dist/rome-*\n if-no-files-found: error\n\n build-wasm:\n name: Build WASM\n runs-on: ubuntu-latest\n needs: check\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n with:\n fetch-depth: 1\n\n - name: Install wasm-pack\n uses: jetli/wasm-pack-action@v0.4.0\n\n - name: Build WASM module for bundlers\n run: wasm-pack build --out-dir ../../npm/wasm-bundler --target bundler --release --scope rometools crates/rome_wasm\n - name: Build WASM module for node.js\n run: wasm-pack build --out-dir ../../npm/wasm-nodejs --target nodejs --release --scope rometools crates/rome_wasm\n - name: Build WASM module for the web\n run: wasm-pack build --out-dir ../../npm/wasm-web --target web --release --scope rometools crates/rome_wasm\n\n - name: Upload WASM artifact\n uses: actions/upload-artifact@v3\n with:\n name: wasm\n path: |\n ./npm/wasm-bundler\n ./npm/wasm-nodejs\n ./npm/wasm-web\n if-no-files-found: error\n\n publish:\n name: Publish\n runs-on: ubuntu-latest\n needs:\n - build\n - build-wasm\n environment: npm-publish\n steps:\n - uses: actions/checkout@v3\n\n - name: Download CLI artifacts\n uses: actions/download-artifact@v3\n with:\n name: cli\n - name: Download WASM artifacts\n uses: actions/download-artifact@v3\n with:\n name: wasm\n path: npm\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n registry-url: 'https://registry.npmjs.org'\n\n - name: Set release infos\n if: needs.build.outputs.prerelease == 'true'\n run: node npm/rome/scripts/update-nightly-version.mjs\n - name: Generate npm packages\n run: node npm/rome/scripts/generate-packages.mjs\n\n - name: Publish npm packages as latest\n run: for package in npm/*; do if [ $package != "npm/js-api" ]; then npm publish $package --tag latest --access public; fi; done\n if: needs.build.outputs.prerelease != 'true'\n env:\n NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}\n - name: Publish npm packages as nightly\n run: for package in npm/*; do if [ $package != "npm/js-api" ]; then npm publish $package --tag nightly --access public; fi; done\n if: needs.build.outputs.prerelease == 'true'\n env:\n NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}\n\n - name: Create GitHub release and tag\n uses: softprops/action-gh-release@v1\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n with:\n name: CLI v${{ needs.build.outputs.version }}\n tag_name: cli/v${{ needs.build.outputs.version }}\n draft: false\n prerelease: ${{ needs.build.outputs.prerelease == 'true' }}\n files: |\n rome-*\n fail_on_unmatched_files: true\n generate_release_notes: true\n | dataset_sample\yaml\rome_tools\.github\workflows\release_cli.yml | release_cli.yml | YAML | 7,167 | 0.8 | 0.094595 | 0.02551 | awesome-app | 210 | 2024-11-02T06:43:09.576100 | MIT | false | 1bf0d2e19f2d9dbc9103626e4be6baaa |
name: Release JavaScript API\non:\n workflow_dispatch:\n schedule:\n - cron: '0 0 * * 2-6'\n push:\n branches:\n - main\n paths:\n - npm/js-api/package.json\n\njobs:\n check:\n name: Check version\n runs-on: ubuntu-latest\n outputs:\n version: ${{ env.version }}\n prerelease: ${{ env.prerelease }}\n nightly: ${{ env.nightly }}\n version_changed: ${{ steps.version.outputs.changed }}\n steps:\n - uses: actions/checkout@v3\n\n - name: Check nightly status\n id: nightly\n if: github.event_name == 'schedule' || github.event_name == 'workflow_dispatch'\n run: echo "nightly=true" >> $GITHUB_ENV\n\n - name: Check version changes\n uses: EndBug/version-check@v1\n if: env.nightly != 'true'\n id: version\n with:\n diff-search: true\n file-name: npm/js-api/package.json\n\n - name: Set version name\n run: echo "version=${{ steps.version.outputs.version }}" >> $GITHUB_ENV\n\n - name: Check prerelease status\n id: prerelease\n if: env.nightly == 'true'\n run: echo "prerelease=true" >> $GITHUB_ENV\n\n - name: Check version status\n if: steps.version.outputs.changed == 'true'\n run: 'echo "Version change found! New version: ${{ steps.version.outputs.version }} (${{ steps.version.outputs.version_type }})"'\n\n build:\n name: Package JavaScript APIs\n runs-on: ubuntu-latest\n\n needs: check\n if: needs.check.outputs.version_changed == 'true' || needs.check.outputs.nightly == 'true'\n outputs:\n version: ${{ env.version }}\n prerelease: ${{ env.prerelease }}\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n with:\n fetch-depth: 1\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n\n - name: Install wasm-pack\n uses: jetli/wasm-pack-action@v0.4.0\n with:\n version: 'latest'\n\n - name: Cache pnpm modules\n uses: actions/cache@v3\n with:\n path: ~/.pnpm-store\n key: ${{ runner.os }}-${{ hashFiles('**/pnpm-lock.yaml') }}\n restore-keys: |\n ${{ runner.os }}-\n - uses: pnpm/action-setup@v2.2.4\n with:\n version: 8\n\n - name: Set release infos\n if: needs.check.outputs.prerelease == 'true'\n run: |\n echo "prerelease=true" >> $GITHUB_ENV\n node npm/js-api/scripts/update-nightly-version.mjs >> $GITHUB_ENV\n - name: Set release infos\n if: needs.check.outputs.prerelease != 'true'\n run: |\n echo "prerelease=false" >> $GITHUB_ENV\n echo "version=${{ needs.check.outputs.version }}" >> $GITHUB_ENV\n\n - name: Compile backends\n run: |\n pnpm --prefix npm/js-api build:wasm-bundler\n pnpm --prefix npm/js-api build:wasm-node\n pnpm --prefix npm/js-api build:wasm-web\n pnpm --prefix npm/backend-jsonrpc i\n pnpm --prefix npm/backend-jsonrpc run build\n\n - name: Build package\n working-directory: npm/js-api\n run: |\n pnpm i\n pnpm build\n\n - name: Upload JS API artifact\n uses: actions/upload-artifact@v3\n with:\n name: js-api\n path: |\n ./npm/js-api/dist\n if-no-files-found: error\n\n publish:\n name: Publish\n runs-on: ubuntu-latest\n needs: build\n environment: npm-publish\n steps:\n - uses: actions/checkout@v3\n\n - name: Download package artifact\n uses: actions/download-artifact@v3\n with:\n name: js-api\n path: npm/js-api/dist\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n registry-url: 'https://registry.npmjs.org'\n\n - name: Set release infos\n if: needs.build.outputs.prerelease == 'true'\n run: node npm/js-api/scripts/update-nightly-version.mjs\n\n - name: Publish npm package as latest\n run: npm publish npm/js-api --tag latest --access public\n if: needs.build.outputs.prerelease != 'true'\n env:\n NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}\n - name: Publish npm package as nightly\n run: npm publish npm/js-api --tag nightly --access public\n if: needs.build.outputs.prerelease == 'true'\n env:\n NODE_AUTH_TOKEN: ${{ secrets.NPM_TOKEN }}\n\n - name: Create GitHub release and tag\n uses: softprops/action-gh-release@v1\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n with:\n name: JavaScript APIs v${{ needs.build.outputs.version }}\n tag_name: js-api/v${{ needs.build.outputs.version }}\n draft: false\n prerelease: ${{ needs.build.outputs.prerelease == 'true' }}\n generate_release_notes: true\n | dataset_sample\yaml\rome_tools\.github\workflows\release_js_api.yml | release_js_api.yml | YAML | 4,853 | 0.8 | 0.067485 | 0 | awesome-app | 343 | 2024-10-20T20:54:20.426424 | Apache-2.0 | false | 11ffc341f186b7f3553462442462b8aa |
name: Release LSP\non:\n workflow_dispatch:\n schedule:\n - cron: '0 0 * * 2-6'\n push:\n branches:\n - main\n paths:\n - editors/vscode/package.json\n\njobs:\n check:\n name: Check version\n runs-on: ubuntu-latest\n outputs:\n # LSP Extension Version\n version: ${{ env.version }}\n\n # Version of the Rome binary\n rome_version: ${{ env.rome_version }}\n prerelease: ${{ env.prerelease }}\n nightly: ${{ env.nightly }}\n version_changed: ${{ steps.version.outputs.changed }}\n steps:\n - uses: actions/checkout@v3\n\n - name: Check nightly status\n id: nightly\n if: github.event_name == 'schedule' || github.event_name == 'workflow_dispatch'\n run: echo "nightly=true" >> $GITHUB_ENV\n\n - name: Check version changes\n uses: EndBug/version-check@v1\n if: env.nightly != 'true'\n id: version\n with:\n diff-search: true\n file-name: editors/vscode/package.json\n\n - name: Check Rome version changes\n uses: EndBug/version-check@v1\n if: env.nightly != 'true'\n id: rome_version\n with:\n diff-search: true\n file-name: npm/rome/package.json\n\n - name: Set version name\n run: |\n echo "version=${{ steps.version.outputs.version }}" >> $GITHUB_ENV\n echo "rome_version=${{ steps.rome_version.outputs.version }}" >> $GITHUB_ENV\n\n - name: Check prerelease status\n id: prerelease\n if: env.nightly == 'true' || steps.version.outputs.type == 'prerelease' || steps.version.outputs.type == 'prepatch' || steps.version.outputs.type == 'premajor' || steps.version.outputs.type == 'preminor'\n run: |\n echo "prerelease=true" >> $GITHUB_ENV\n node ./editors/vscode/scripts/update-nightly-version.mjs >> $GITHUB_ENV\n echo "rome_version=$(node ./npm/rome/scripts/update-nightly-version.mjs)" >> $GITHUB_ENV\n\n - name: Check version status\n if: steps.version.outputs.changed == 'true'\n run: 'echo "Version change found! New version: ${{ steps.version.outputs.version }} (${{ steps.version.outputs.version_type }})"'\n\n - name: Rome Check version status\n if: steps.rome_version.outputs.changed == 'true'\n run: 'echo "Rome Version change found! New version: ${{ steps.rome_version.outputs.version }} (${{ steps.rome_version.outputs.version_type }})"'\n\n build:\n strategy:\n matrix:\n include:\n - os: windows-2022\n target: x86_64-pc-windows-msvc\n code-target: win32-x64\n - os: windows-2022\n target: aarch64-pc-windows-msvc\n code-target: win32-arm64\n - os: ubuntu-20.04\n target: x86_64-unknown-linux-gnu\n code-target: linux-x64\n - os: ubuntu-20.04\n target: aarch64-unknown-linux-gnu\n code-target: linux-arm64\n - os: macos-11\n target: x86_64-apple-darwin\n code-target: darwin-x64\n - os: macos-11\n target: aarch64-apple-darwin\n code-target: darwin-arm64\n\n name: Package ${{ matrix.code-target }}\n runs-on: ${{ matrix.os }}\n\n needs: check\n env:\n version: ${{ needs.check.outputs.version }}\n prerelease: ${{ needs.check.outputs.prerelease }}\n\n if: needs.check.outputs.version_changed == 'true' || needs.check.outputs.nightly == 'true'\n outputs:\n version: ${{ env.version }}\n prerelease: ${{ env.prerelease }}\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v3\n with:\n fetch-depth: 1\n\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n with:\n targets: ${{ matrix.target }}\n\n - name: Install arm64 toolchain\n if: matrix.code-target == 'linux-arm64'\n run: |\n sudo apt-get update\n sudo apt-get install -y gcc-aarch64-linux-gnu\n\n - name: Audit crates.io dependencies\n if: matrix.code-target == 'linux-x64'\n run: cargo audit\n\n # Build the LSP binary\n - name: Build binaries\n run: cargo build -p rome_cli --release --target ${{ matrix.target }}\n env:\n CARGO_TARGET_AARCH64_UNKNOWN_LINUX_GNU_LINKER: aarch64-linux-gnu-gcc\n # Strip all debug symbols from the resulting binaries\n RUSTFLAGS: "-C strip=symbols"\n ROME_VERSION: ${{ needs.check.outputs.rome_version }}\n\n - name: Copy LSP binary\n if: matrix.os == 'windows-2022'\n run: |\n mkdir dist\n mkdir editors/vscode/server\n cp target/${{ matrix.target }}/release/rome.exe editors/vscode/server/rome.exe\n - name: Copy LSP binary\n if: matrix.os != 'windows-2022'\n run: |\n mkdir dist\n mkdir editors/vscode/server\n cp target/${{ matrix.target }}/release/rome editors/vscode/server/rome\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n\n - name: Update package.json version\n if: needs.check.outputs.prerelease == 'true'\n working-directory: editors/vscode\n run: |\n node ./scripts/update-nightly-version.mjs >> $GITHUB_ENV\n\n - name: Package extension\n run: |\n npm ci\n npm run compile\n npx vsce package -o "../../dist/rome_lsp-${{ matrix.code-target }}.vsix" --target ${{ matrix.code-target }}\n working-directory: editors/vscode\n if: needs.check.outputs.prerelease != 'true'\n - name: Package extension (pre-release)\n run: |\n npm ci\n npm run compile\n npx vsce package --pre-release -o "../../dist/rome_lsp-${{ matrix.code-target }}.vsix" --target ${{ matrix.code-target }}\n working-directory: editors/vscode\n if: needs.check.outputs.prerelease == 'true'\n\n - name: Upload VSCode extension artifact\n uses: actions/upload-artifact@v3\n with:\n name: vscode_packages\n path: ./dist/rome_lsp-${{ matrix.code-target }}.vsix\n if-no-files-found: error\n\n publish:\n name: Publish\n runs-on: ubuntu-latest\n needs: build\n environment: marketplace\n steps:\n - uses: actions/checkout@v3\n\n - name: Download extension artifacts\n uses: actions/download-artifact@v3\n with:\n name: vscode_packages\n\n - name: Install Node.js\n uses: actions/setup-node@v3\n with:\n node-version: 18\n registry-url: 'https://registry.npmjs.org'\n\n - name: Publish extension to Microsoft Marketplace (pre-release)\n run: npx vsce publish --pre-release --packagePath rome_lsp-*.vsix\n if: needs.build.outputs.prerelease == 'true'\n env:\n VSCE_PAT: ${{ secrets.VSCE_PAT }}\n - name: Publish extension to Microsoft Marketplace\n run: npx vsce publish --packagePath rome_lsp-*.vsix\n if: needs.build.outputs.prerelease != 'true'\n env:\n VSCE_PAT: ${{ secrets.VSCE_PAT }}\n\n - name: Publish extension to Open VSX (pre-release)\n run: npx ovsx publish --pre-release --packagePath rome_lsp-*.vsix\n if: needs.build.outputs.prerelease == 'true'\n env:\n OVSX_PAT: ${{ secrets.OVSX_PAT }}\n - name: Publish extension to Open VSX\n run: npx ovsx publish --packagePath rome_lsp-*.vsix\n if: needs.build.outputs.prerelease != 'true'\n env:\n OVSX_PAT: ${{ secrets.OVSX_PAT }}\n\n - name: Create GitHub release and tag\n uses: softprops/action-gh-release@v1\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n with:\n name: VSCode Extension v${{ needs.build.outputs.version }}\n tag_name: lsp/v${{ needs.build.outputs.version }}\n draft: false\n prerelease: ${{ needs.build.outputs.prerelease == 'true' }}\n files: |\n rome_lsp-*.vsix\n fail_on_unmatched_files: true\n generate_release_notes: true\n | dataset_sample\yaml\rome_tools\.github\workflows\release_lsp.yml | release_lsp.yml | YAML | 7,993 | 0.8 | 0.080851 | 0.019417 | vue-tools | 678 | 2024-05-10T22:56:29.445407 | Apache-2.0 | false | ff765d6cf04a93fe6e6875c9a1f34cce |
# We have code that relies on Rust code AND JS code, we want to run this job when the relevant code changes\nname: Checks for our runtimes\non:\n push:\n branches:\n - main\n paths:\n - 'npm/**'\n - 'crates/**'\n pull_request:\n branches:\n - main\n paths:\n - 'npm/**'\n - 'crates/**'\n\nenv:\n RUST_LOG: info\n RUST_BACKTRACE: 1\n\njobs:\n apis-check:\n name: Checks on APIs project\n runs-on: ubuntu-latest\n steps:\n - name: Checkout PR Branch\n uses: actions/checkout@v3\n with:\n submodules: false\n - name: Install wasm-pack\n uses: jetli/wasm-pack-action@v0.4.0\n with:\n version: 'latest'\n - name: Cache pnpm modules\n uses: actions/cache@v3\n with:\n path: ~/.pnpm-store\n key: ${{ runner.os }}-${{ hashFiles('**/pnpm-lock.yaml') }}\n restore-keys: |\n ${{ runner.os }}-\n - uses: pnpm/action-setup@v2.2.4\n with:\n version: 8\n - name: Install toolchain\n uses: moonrepo/setup-rust@v0\n - name: Install libraries\n working-directory: npm/js-api\n run: pnpm i\n - name: Compile backends\n run: |\n pnpm --prefix npm/js-api build:wasm-bundler\n pnpm --prefix npm/js-api build:wasm-node\n pnpm --prefix npm/js-api build:wasm-web\n pnpm --prefix npm/backend-jsonrpc i\n pnpm --prefix npm/backend-jsonrpc run build\n - name: CI checks\n working-directory: npm/js-api\n run: pnpm run ci\n | dataset_sample\yaml\rome_tools\.github\workflows\runtime.yml | runtime.yml | YAML | 1,533 | 0.8 | 0.017241 | 0.017857 | python-kit | 520 | 2025-06-28T04:40:51.833228 | Apache-2.0 | false | 6178e69c59c93fb4b2d3f05fb765cc42 |
pull_request_rules:\n # if there is a conflict in a backport PR, ping the author to send a proper backport PR\n - name: ping author on conflicts\n conditions:\n - conflict\n actions:\n comment:\n message: This pull request has merge conflicts that must be resolved before it can be merged. @{{author}} please rebase it. https://rook.io/docs/rook/latest/Contributing/development-flow/#updating-your-fork\n\n - name: ping author on direct push to release branch\n conditions:\n - base~=^release-\n - author!=mergify[bot]\n actions:\n comment:\n message: Hi @{{author}}, this pull request was opened against a release branch, is it expected? Normally patches should go in the master branch first and then be backported to release branches.\n\n # release-1.14 branch\n - name: automerge backport release-1.14\n conditions:\n - author=mergify[bot]\n - base=release-1.14\n - label!=do-not-merge\n - "status-success=DCO"\n - "check-success=linux-build-all (1.21)"\n - "check-success=linux-build-all (1.22)"\n - "check-success=unittests"\n - "check-success=golangci-lint"\n - "check-success=codegen"\n - "check-success=codespell"\n - "check-success=lint"\n - "check-success=modcheck"\n - "check-success=Shellcheck"\n - "check-success=yaml-linter"\n - "check-success=lint-test"\n - "check-success=gen-rbac"\n - "check-success=crds-gen"\n - "check-success=docs-check"\n - "check-success=pylint"\n - "check-success=canary-tests / canary (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / raw-disk-with-object (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / two-osds-in-device (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / osd-with-metadata-partition-device (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / osd-with-metadata-device (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / lvm (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / pvc (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / pvc-db (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / pvc-db-wal (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-db (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-db-wal (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-token-auth (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-k8s-auth (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / lvm-pvc (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / multi-cluster-mirroring (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / rgw-multisite-testing (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-kms-ibm-kp (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / multus-cluster-network (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / csi-hostnetwork-disabled (quay.io/ceph/ceph:v18)"\n - "check-success=TestCephSmokeSuite (v1.25.16)"\n - "check-success=TestCephSmokeSuite (v1.30.0)"\n - "check-success=TestCephHelmSuite (v1.25.16)"\n - "check-success=TestCephHelmSuite (v1.30.0)"\n - "check-success=TestCephMultiClusterDeploySuite (v1.30.0)"\n - "check-success=TestCephObjectSuite (v1.30.0)"\n - "check-success=TestCephUpgradeSuite (v1.25.16)"\n - "check-success=TestCephUpgradeSuite (v1.30.0)"\n - "check-success=TestHelmUpgradeSuite (v1.25.16)"\n - "check-success=TestHelmUpgradeSuite (v1.30.0)"\n actions:\n merge:\n method: merge\n dismiss_reviews: {}\n delete_head_branch: {}\n\n # release-1.15 branch\n - name: automerge backport release-1.15\n conditions:\n - author=mergify[bot]\n - base=release-1.15\n - label!=do-not-merge\n - "status-success=DCO"\n - "check-success=linux-build-all (1.22)"\n - "check-success=unittests"\n - "check-success=golangci-lint"\n - "check-success=codegen"\n - "check-success=codespell"\n - "check-success=lint"\n - "check-success=modcheck"\n - "check-success=Shellcheck"\n - "check-success=yaml-linter"\n - "check-success=lint-test"\n - "check-success=gen-rbac"\n - "check-success=crds-gen"\n - "check-success=docs-check"\n - "check-success=pylint"\n - "check-success=canary-tests / canary (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / raw-disk-with-object (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / two-osds-in-device (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / osd-with-metadata-partition-device (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / osd-with-metadata-device (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / lvm (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / pvc (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / pvc-db (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / pvc-db-wal (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-db (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-db-wal (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-token-auth (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-k8s-auth (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / lvm-pvc (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / multi-cluster-mirroring (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / rgw-multisite-testing (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / encryption-pvc-kms-ibm-kp (quay.io/ceph/ceph:v18)"\n - "check-success=canary-tests / multus-cluster-network (quay.io/ceph/ceph:v18)" # note: changed name for 1.16 (multus-public-and-cluster)\n - "check-success=canary-tests / csi-hostnetwork-disabled (quay.io/ceph/ceph:v18)"\n - "check-success=TestCephSmokeSuite (v1.26.15)"\n - "check-success=TestCephSmokeSuite (v1.31.0)"\n - "check-success=TestCephHelmSuite (v1.26.15)"\n - "check-success=TestCephHelmSuite (v1.31.0)"\n - "check-success=TestCephMultiClusterDeploySuite (v1.31.0)"\n - "check-success=TestCephObjectSuite (v1.26.15)"\n - "check-success=TestCephObjectSuite (v1.31.0)"\n - "check-success=TestCephUpgradeSuite (v1.26.15)"\n - "check-success=TestCephUpgradeSuite (v1.31.0)"\n - "check-success=TestHelmUpgradeSuite (v1.26.15)"\n - "check-success=TestHelmUpgradeSuite (v1.31.0)"\n actions:\n merge:\n method: merge\n dismiss_reviews: {}\n delete_head_branch: {}\n\n # release-1.16 branch\n - name: automerge backport release-1.16\n conditions:\n - author=mergify[bot]\n - base=release-1.16\n - label!=do-not-merge\n - "status-success=DCO"\n - "check-success=linux-build-all (1.22)"\n - "check-success=linux-build-all (1.23)"\n - "check-success=unittests"\n - "check-success=golangci-lint"\n - "check-success=codegen"\n - "check-success=codespell"\n - "check-success=lint"\n - "check-success=modcheck"\n - "check-success=Shellcheck"\n - "check-success=yaml-linter"\n - "check-success=lint-test"\n - "check-success=gen-rbac"\n - "check-success=crds-gen"\n - "check-success=docs-check"\n - "check-success=pylint"\n - "check-success=canary-tests / canary (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / raw-disk-with-object (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / two-osds-in-device (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / osd-with-metadata-partition-device (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / osd-with-metadata-device (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / lvm (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / pvc (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / pvc-db (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / pvc-db-wal (quay.io/ceph/ceph:v19)"\n - 
"check-success=canary-tests / encryption-pvc (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-db (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-db-wal (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-token-auth (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-k8s-auth (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / lvm-pvc (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / multi-cluster-mirroring (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-kms-ibm-kp (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / multus-public-and-cluster (quay.io/ceph/ceph:v19)"\n - "check-success=TestCephSmokeSuite (v1.27.16)"\n - "check-success=TestCephSmokeSuite (v1.32.0)"\n - "check-success=TestCephHelmSuite (v1.27.16)"\n - "check-success=TestCephHelmSuite (v1.32.0)"\n - "check-success=TestCephMultiClusterDeploySuite (v1.32.0)"\n - "check-success=TestCephObjectSuite (v1.27.16)"\n - "check-success=TestCephObjectSuite (v1.32.0)"\n - "check-success=TestCephUpgradeSuite (v1.27.16)"\n - "check-success=TestCephUpgradeSuite (v1.32.0)"\n - "check-success=TestHelmUpgradeSuite (v1.27.16)"\n - "check-success=TestHelmUpgradeSuite (v1.32.0)"\n actions:\n merge:\n method: merge\n dismiss_reviews: {}\n delete_head_branch: {}\n\n # release-1.17 branch\n - name: automerge backport release-1.17\n conditions:\n - author=mergify[bot]\n - base=release-1.17\n - label!=do-not-merge\n - "status-success=DCO"\n - "check-success=codegen"\n - "check-success=codespell"\n - "check-success=crds-gen"\n - "check-success=docs-check"\n - "check-success=gen-rbac"\n - "check-success=golangci-lint"\n - "check-success=govulncheck"\n - "check-success=lint"\n - "check-success=lint-test"\n - "check-success=linux-build-all (1.23)"\n - "check-success=linux-build-all (1.24)"\n - "check-success=misspell"\n - "check-success=modcheck"\n - 
"check-success=pylint"\n - "check-success=Shellcheck"\n - "check-success=unittests"\n - "check-success=yaml-linter"\n - "check-success=canary-tests / canary (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-db (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-db-wal (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-kms-ibm-kp (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-k8s-auth (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / encryption-pvc-kms-vault-token-auth (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / lvm (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / lvm-pvc (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / multi-cluster-mirroring (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / multus-public-and-cluster (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / osd-with-metadata-device (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / osd-with-metadata-partition-device (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / pvc (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / pvc-db (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / pvc-db-wal (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / raw-disk-with-object (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / two-object-one-zone (quay.io/ceph/ceph:v19)"\n - "check-success=canary-tests / two-osds-in-device (quay.io/ceph/ceph:v19)"\n - "check-success=TestCephHelmSuite (v1.28.15)"\n - "check-success=TestCephHelmSuite (v1.32.3)"\n - "check-success=TestCephMultiClusterDeploySuite (v1.32.3)"\n - "check-success=TestCephObjectSuite (v1.28.15)"\n - "check-success=TestCephObjectSuite (v1.32.3)"\n - "check-success=TestCephSmokeSuite (v1.28.15)"\n - 
"check-success=TestCephSmokeSuite (v1.32.3)"\n - "check-success=TestCephUpgradeSuite (v1.28.15)"\n - "check-success=TestCephUpgradeSuite (v1.32.3)"\n - "check-success=TestHelmUpgradeSuite (v1.28.15)"\n - "check-success=TestHelmUpgradeSuite (v1.32.3)"\n actions:\n merge:\n method: merge\n dismiss_reviews: {}\n delete_head_branch: {}\n\n # release-1.14 branch\n - actions:\n backport:\n branches:\n - release-1.14\n conditions:\n - label=backport-release-1.14\n name: backport release-1.14\n\n # release-1.15 branch\n - actions:\n backport:\n branches:\n - release-1.15\n conditions:\n - label=backport-release-1.15\n name: backport release-1.15\n\n # release-1.16 branch\n - actions:\n backport:\n branches:\n - release-1.16\n conditions:\n - label=backport-release-1.16\n name: backport release-1.16\n\n # release-1.17 branch\n - actions:\n backport:\n branches:\n - release-1.17\n conditions:\n - label=backport-release-1.17\n name: backport release-1.17\n | dataset_sample\yaml\rook_rook\.mergify.yml | .mergify.yml | YAML | 13,896 | 0.8 | 0.00692 | 0.032143 | react-lib | 854 | 2025-07-03T13:03:13.416584 | BSD-3-Clause | false | 26f058cfa0a8f8f5c4ae317025481fc6 |
site_name: Rook Ceph Documentation\ndocs_dir: Documentation/\nsite_url: "https://rook.io"\nrepo_url: https://github.com/rook/rook\nedit_uri: edit/master/Documentation/\nsite_author: Rook Authors\nsite_description: "Rook Ceph Documentation"\nuse_directory_urls: true\ncopyright: |\n <a class="logo" href="/">\n <img src="https://rook.io/images/rook-logo-small.svg" alt="rook.io logo" />\n </a>\n <p>\n © Rook Authors 2022. Documentation distributed under\n <a href="https://creativecommons.org/licenses/by/4.0">CC-BY-4.0</a>.\n </p>\n <p>\n © 2022 The Linux Foundation. All rights reserved. The Linux Foundation has\n registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see our\n <a href="https://www.linuxfoundation.org/trademark-usage/">Trademark Usage</a> page.\n </p>\ntheme:\n name: material\n custom_dir: .docs/overrides/\n font: false\n favicon: https://rook.io/images/favicon_192x192.png\n logo: https://rook.io/images/rook-logo.svg\n palette:\n - scheme: "default"\n primary: "rook-blue"\n accent: "deep orange"\n toggle:\n icon: material/toggle-switch-off-outline\n name: Switch to dark mode\n - scheme: "slate"\n primary: "rook-blue"\n accent: "red"\n toggle:\n icon: material/toggle-switch\n name: Switch to light mode\n icon:\n repo: fontawesome/brands/github\n features:\n - content.tabs.link\n - instant\n - navigation.expand\n - navigation.tabs\n - navigation.tabs.sticky\n - navigation.top\n - navigation.tracking\n - search.highlight\n - search.share\n - search.suggest\n - tabs\nextra_css:\n - stylesheets/extra.css\nplugins:\n - search\n - exclude:\n glob:\n - README.md\n - "*.gotmpl"\n - "*.gotmpl.md"\n - awesome-pages\n - macros:\n module_name: .docs/macros/includes/main\n - minify:\n minify_html: true\n minify_js: true\n htmlmin_opts:\n remove_comments: true\n #js_files: []\n - redirects:\n redirect_maps:\n README.md: Getting-Started/intro.md\n - mike:\n # these fields are all optional; the defaults are as below...\n 
version_selector: true # set to false to leave out the version selector\n css_dir: css # the directory to put the version selector's CSS\n javascript_dir: js # the directory to put the version selector's JS\n canonical_version:\n null # the version for <link rel="canonical">; `null`\n # uses the version specified via `mike deploy`\nmarkdown_extensions:\n - admonition\n - attr_list\n - def_list\n - footnotes\n - meta\n - toc:\n permalink: true\n - tables\n - pymdownx.details\n - pymdownx.emoji:\n emoji_index: !!python/name:material.extensions.emoji.twemoji\n emoji_generator: !!python/name:material.extensions.emoji.to_svg\n - pymdownx.highlight:\n anchor_linenums: true\n use_pygments: true\n linenums: true\n - pymdownx.inlinehilite\n - pymdownx.keys\n - pymdownx.magiclink\n - pymdownx.mark\n - pymdownx.snippets\n - pymdownx.tasklist:\n custom_checkbox: true\n - pymdownx.superfences\n - pymdownx.tabbed\nextra:\n version:\n provider: mike\n default: latest-release\n social:\n - icon: fontawesome/brands/slack\n link: https://slack.rook.io/\n - icon: fontawesome/brands/twitter\n link: https://twitter.com/rook_io\n - icon: fontawesome/solid/envelopes-bulk\n link: "https://groups.google.com/forum/#!forum/rook-dev"\n - icon: fontawesome/brands/medium\n link: https://blog.rook.io/\n | dataset_sample\yaml\rook_rook\mkdocs.yml | mkdocs.yml | YAML | 3,518 | 0.8 | 0.032787 | 0.02459 | vue-tools | 344 | 2024-09-23T14:32:06.024044 | GPL-3.0 | false | 39a7dcf669077afc3a1eba634a929434 |
version: 2\nenable-beta-ecosystems: true\nupdates:\n # Dependencies listed in go.mod\n - package-ecosystem: "gomod"\n directory: "/" # Location of package manifests\n schedule:\n interval: "weekly"\n groups:\n golang-dependencies:\n patterns:\n - "github.com/golang*"\n k8s-dependencies:\n patterns:\n - "k8s.io*"\n - "sigs.k8s.io*"\n github-dependencies:\n patterns:\n - "github.com*"\n\n # Dependencies listed in .github/workflows/*.yml\n - package-ecosystem: "github-actions"\n directory: "/"\n schedule:\n interval: "weekly"\n | dataset_sample\yaml\rook_rook\.github\dependabot.yml | dependabot.yml | YAML | 606 | 0.8 | 0 | 0.083333 | vue-tools | 20 | 2025-03-02T17:45:13.641255 | MIT | false | 9ba8c7289974ae8d5bcb4f3401eb0c3a |
name: Builds\non:\n pull_request:\n\ndefaults:\n run:\n # reference: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#using-a-specific-shell\n shell: bash --noprofile --norc -eo pipefail -x {0}\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n macos-build:\n runs-on: macos-latest\n if: "!contains(github.event.pull_request.labels.*.name, 'skip-ci')"\n steps:\n - name: checkout\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n\n - uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0\n with:\n go-version: "1.23"\n\n - name: Set up Helm\n uses: azure/setup-helm@b9e51907a09c216f16ebe8536097933489208112 # v4.3.0\n with:\n version: v3.17.1\n\n - name: build rook\n run: |\n GOPATH=$(go env GOPATH) make clean && make -j$nproc BUILD_CONTAINER_IMAGE=false build\n\n - name: validate build\n run: tests/scripts/validate_modified_files.sh build\n\n - name: run codegen\n run: GOPATH=$(go env GOPATH) make codegen\n\n - name: validate codegen\n run: tests/scripts/validate_modified_files.sh codegen\n\n - name: run mod check\n run: GOPATH=$(go env GOPATH) make -j $(nproc) mod.check\n\n - name: validate modcheck\n run: tests/scripts/validate_modified_files.sh modcheck\n\n - name: run crds-gen\n run: GOPATH=$(go env GOPATH) make crds\n\n - name: validate crds-gen\n run: tests/scripts/validate_modified_files.sh crd\n\n - name: run gen-rbac\n run: GOPATH=$(go env GOPATH) make gen-rbac\n\n - name: validate gen-rbac\n run: tests/scripts/validate_modified_files.sh gen-rbac\n\n linux-build-all:\n runs-on: ubuntu-22.04\n if: "!contains(github.event.pull_request.labels.*.name, 'skip-ci')"\n strategy:\n fail-fast: false\n matrix:\n go-version: ["1.23", "1.24"]\n steps:\n - name: checkout\n uses: 
actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n\n - name: setup golang ${{ matrix.go-version }}\n uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0\n with:\n go-version: ${{ matrix.go-version }}\n\n - name: set up QEMU\n uses: docker/setup-qemu-action@29109295f81e9208d7d86ff1c6c12d2833863392 # master\n with:\n platforms: all\n\n - name: build.all rook with go ${{ matrix.go-version }}\n run: |\n tests/scripts/github-action-helper.sh build_rook_all\n | dataset_sample\yaml\rook_rook\.github\workflows\build.yml | build.yml | YAML | 2,789 | 0.8 | 0.032258 | 0.027397 | react-lib | 84 | 2023-11-01T01:01:09.597993 | BSD-3-Clause | false | f02a1b4ceaca92c7618e8250c721a89a |
name: Canary integration tests\non:\n push:\n tags:\n - v*\n branches:\n - master\n - release-*\n pull_request:\n branches:\n - master\n - release-*\n paths-ignore:\n - "Documentation/**"\n - "design/**"\n\ndefaults:\n run:\n # reference: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#using-a-specific-shell\n shell: bash --noprofile --norc -eo pipefail -x {0}\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n canary-tests:\n uses: ./.github/workflows/canary-integration-test.yml\n with:\n ceph_images: '["quay.io/ceph/ceph:v19"]'\n secrets: inherit\n | dataset_sample\yaml\rook_rook\.github\workflows\canary-integration-suite.yml | canary-integration-suite.yml | YAML | 827 | 0.8 | 0.028571 | 0.064516 | awesome-app | 753 | 2024-02-26T20:24:03.455522 | BSD-3-Clause | false | 35bf0f49cb12fc6b34b392107243edc6 |
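The `canary-tests` job above invokes a reusable workflow via `uses:` and forwards `ceph_images` plus `secrets: inherit`. For that call to resolve, the callee must declare a matching `workflow_call` trigger; a sketch of the expected callee side (only the `ceph_images` input name is confirmed by the caller, the rest is assumed):

```yaml
# Assumed shape of the trigger in canary-integration-test.yml.
on:
  workflow_call:
    inputs:
      ceph_images:
        description: "JSON array of Ceph images to run the canary matrix against"
        required: true
        type: string
```

Passing the image list as a JSON string lets the callee expand it into a job matrix with `fromJson(inputs.ceph_images)`.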
name: Codegen\non:\n push:\n tags:\n - v*\n branches:\n - master\n - release-*\n pull_request:\n branches:\n - master\n - release-*\n\ndefaults:\n run:\n # reference: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#using-a-specific-shell\n shell: bash --noprofile --norc -eo pipefail -x {0}\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n codegen:\n runs-on: ubuntu-22.04\n if: "!contains(github.event.pull_request.labels.*.name, 'skip-ci')"\n steps:\n - name: checkout\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n\n - uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0\n with:\n go-version: "1.23"\n\n - name: run codegen\n run: GOPATH=$(go env GOPATH) make codegen\n\n - name: validate codegen\n run: tests/scripts/validate_modified_files.sh codegen\n | dataset_sample\yaml\rook_rook\.github\workflows\codegen.yml | codegen.yml | YAML | 1,152 | 0.8 | 0.044444 | 0.052632 | awesome-app | 369 | 2023-09-19T22:16:56.848669 | GPL-3.0 | false | 22cf8a96b5d5ccf2c700211f659d0732 |
name: Commitlint\non:\n push:\n tags:\n - v*\n branches:\n - master\n - release-*\n pull_request:\n branches:\n - master\n - release-*\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n lint:\n permissions:\n contents: read # for actions/checkout to fetch code\n pull-requests: read # for wagoid/commitlint-github-action to get commits in PR\n runs-on: ubuntu-22.04\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n steps:\n - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n - uses: wagoid/commitlint-github-action@b948419dd99f3fd78a6548d48f94e3df7f6bf3ed # v6.2.1\n with:\n configFile: "./.commitlintrc.json"\n helpURL: https://rook.io/docs/rook/latest/Contributing/development-flow/#commit-structure\n | dataset_sample\yaml\rook_rook\.github\workflows\commitlint.yml | commitlint.yml | YAML | 1,042 | 0.8 | 0.054054 | 0.029412 | awesome-app | 239 | 2023-12-15T14:38:32.209560 | Apache-2.0 | false | 694f6186d24576e68cfddae46ac9d73d |
name: CRDs gen\non:\n push:\n tags:\n - v*\n branches:\n - master\n - release-*\n pull_request:\n branches:\n - master\n - release-*\n\ndefaults:\n run:\n # reference: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#using-a-specific-shell\n shell: bash --noprofile --norc -eo pipefail -x {0}\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n crds-gen:\n runs-on: ubuntu-22.04\n steps:\n - name: checkout\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n\n - uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0\n with:\n go-version: "1.23"\n\n - name: run crds-gen\n run: GOPATH=$(go env GOPATH) make crds\n\n - name: validate crds-gen\n run: tests/scripts/validate_modified_files.sh crd\n | dataset_sample\yaml\rook_rook\.github\workflows\crds-gen.yml | crds-gen.yml | YAML | 1,077 | 0.8 | 0.022727 | 0.054054 | awesome-app | 521 | 2024-12-29T07:40:44.200238 | BSD-3-Clause | false | 987c592e626b9f07688825abf23398c7 |
name: docs-check\non:\n push:\n tags:\n - v*\n branches:\n - master\n - release-*\n pull_request:\n branches:\n - master\n - release-*\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\npermissions:\n contents: read\n\njobs:\n docs-check:\n name: docs-check\n runs-on: ubuntu-22.04\n steps:\n - uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n\n - uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0\n with:\n go-version: "1.23"\n\n - uses: actions/setup-python@8d9ed9ac5c53483de85588cdf95a591a75ab9f55 # v5.5.0\n with:\n python-version: 3.9\n\n - uses: DavidAnson/markdownlint-cli2-action@05f32210e84442804257b2a6f20b273450ec8265 # v19.1.0\n with:\n globs: |\n Documentation/**/*.md\n !Documentation/Helm-Charts\n\n - name: Check docs\n run: |\n make gen.docs\n tests/scripts/validate_modified_files.sh docs\n - name: Install mkdocs and dependencies\n run: cd build/release/ && make deps.docs\n\n - name: Check documentation for CRDs\n run: |\n make generate-docs-crds\n DIFF_ON_DOCS=$(git diff --ignore-matching-lines='on git commit')\n if [ ! -z "$DIFF_ON_DOCS" ]; then\n echo "Please run 'make generate-docs-crds' locally, commit the updated crds docs, and push the change"\n fi\n git diff --ignore-matching-lines='on git commit' --exit-code\n\n - name: Build documentation using mkdocs\n run: make docs-build\n | dataset_sample\yaml\rook_rook\.github\workflows\docs-check.yml | docs-check.yml | YAML | 1,760 | 0.8 | 0.032258 | 0.018868 | react-lib | 378 | 2025-04-30T23:28:33.193982 | Apache-2.0 | false | 70ddc79f3b5647c048545bee48523ef7 |
name: Mod check\non:\n push:\n tags:\n - v*\n branches:\n - master\n - release-*\n pull_request:\n branches:\n - master\n - release-*\n\n# cancel the in-progress workflow when PR is refreshed.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.sha }}\n cancel-in-progress: true\n\ndefaults:\n run:\n # reference: https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#using-a-specific-shell\n shell: bash --noprofile --norc -eo pipefail -x {0}\n\npermissions:\n contents: read\n\njobs:\n modcheck:\n runs-on: ubuntu-22.04\n steps:\n - name: checkout\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n fetch-depth: 0\n\n - uses: actions/setup-go@0aaccfd150d50ccaeb58ebd88d36e91967a5f35b # v5.4.0\n with:\n go-version: "1.23"\n\n - name: run mod check\n run: GOPATH=$(go env GOPATH) make -j $(nproc) mod.check\n\n - name: validate modcheck\n run: tests/scripts/validate_modified_files.sh modcheck\n | dataset_sample\yaml\rook_rook\.github\workflows\mod-check.yml | mod-check.yml | YAML | 1,101 | 0.8 | 0.022727 | 0.054054 | node-utils | 67 | 2023-08-29T00:32:57.979510 | BSD-3-Clause | false | 14349d2382c6202f6418585a7834a4e1 |
name: Scorecard supply-chain security\non:\n # For Branch-Protection check. Only the default branch is supported. See\n # https://github.com/ossf/scorecard/blob/main/docs/checks.md#branch-protection\n branch_protection_rule:\n # To guarantee Maintained check is occasionally updated. See\n # https://github.com/ossf/scorecard/blob/main/docs/checks.md#maintained\n schedule:\n - cron: "28 23 * * 3"\n push:\n branches: ["master"]\n\n# Declare default permissions as read only.\npermissions: read-all\n\njobs:\n analysis:\n name: Scorecard analysis\n runs-on: ubuntu-latest\n permissions:\n # Needed to upload the results to code-scanning dashboard.\n security-events: write\n # Needed to publish results and get a badge (see publish_results below).\n id-token: write\n # Uncomment the permissions below if installing in a private repository.\n # contents: read\n # actions: read\n\n steps:\n - name: "Checkout code"\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n with:\n persist-credentials: false\n\n - name: "Run analysis"\n uses: ossf/scorecard-action@f49aabe0b5af0936a0987cfb85d86b75731b0186 # v2.4.1\n with:\n results_file: results.sarif\n results_format: sarif\n # (Optional) "write" PAT token. 
Uncomment the `repo_token` line below if:\n # - you want to enable the Branch-Protection check on a *public* repository, or\n # - you are installing Scorecard on a *private* repository\n # To create the PAT, follow the steps in https://github.com/ossf/scorecard-action?tab=readme-ov-file#authentication-with-fine-grained-pat-optional.\n # repo_token: ${{ secrets.SCORECARD_TOKEN }}\n\n # Public repositories:\n # - Publish results to OpenSSF REST API for easy access by consumers\n # - Allows the repository to include the Scorecard badge.\n # - See https://github.com/ossf/scorecard-action#publishing-results.\n # For private repositories:\n # - `publish_results` will always be set to `false`, regardless\n # of the value entered here.\n publish_results: true\n\n # Upload the results as artifacts (optional). Commenting out will disable uploads of run results in SARIF\n # format to the repository Actions tab.\n - name: "Upload artifact"\n uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2\n with:\n name: SARIF file\n path: results.sarif\n retention-days: 5\n\n # Upload the results to GitHub's code scanning dashboard (optional).\n # Commenting out will disable upload of results to your repo's Code Scanning dashboard\n - name: "Upload to code-scanning"\n uses: github/codeql-action/upload-sarif@45775bd8235c68ba998cffa5171334d58593da47 # v3.28.15\n with:\n sarif_file: results.sarif\n | dataset_sample\yaml\rook_rook\.github\workflows\scorecards.yml | scorecards.yml | YAML | 2,932 | 0.8 | 0.043478 | 0.419355 | react-lib | 51 | 2023-07-31T23:59:19.396476 | GPL-3.0 | false | ea46e7aafb916b5fa8b0dec163aa6da4 |
name: Encryption KMS IBM Key Protect\ndescription: Reusable workflow to test Encryption KMS IBM Key Protect\ninputs:\n ibm-instance-id:\n description: IBM_KP_SERVICE_INSTANCE_ID from the calling workflow\n required: true\n ibm-service-api-key:\n description: IBM_KP_SERVICE_API_KEY from the calling workflow\n required: true\n artifact-name:\n description: the name of the artifact where logs will be stored\n required: true\n ceph-image:\n description: the name of ceph image\n required: true\n\nruns:\n using: "composite"\n steps:\n - name: fail if env not present\n if: "env.IBM_KP_SERVICE_INSTANCE_ID == '' || env.IBM_KP_SERVICE_API_KEY == ''"\n env:\n IBM_KP_SERVICE_INSTANCE_ID: ${{ inputs.ibm-instance-id }}\n IBM_KP_SERVICE_API_KEY: ${{ inputs.ibm-service-api-key }}\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: echo "IBM_KP_SERVICE_INSTANCE_ID and IBM_KP_SERVICE_API_KEY must be set in the environment" && exit 0\n\n - name: setup cluster resources\n uses: ./.github/workflows/canary-test-config\n\n - name: set Ceph version in CephCluster manifest\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: tests/scripts/github-action-helper.sh replace_ceph_image "deploy/examples/cluster-test.yaml" "${{ inputs.ceph-image }}"\n\n - name: use local disk and create partitions for osds\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: |\n tests/scripts/github-action-helper.sh use_local_disk\n tests/scripts/github-action-helper.sh create_partitions_for_osds\n\n - name: create cluster prerequisites\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: |\n export BLOCK="/dev/$(tests/scripts/github-action-helper.sh find_extra_block_dev)"\n tests/scripts/localPathPV.sh "$BLOCK"\n tests/scripts/github-action-helper.sh create_cluster_prerequisites\n\n - name: deploy cluster\n shell: bash --noprofile --norc -eo pipefail -x {0}\n env:\n IBM_KP_SERVICE_INSTANCE_ID: ${{ inputs.ibm-instance-id }}\n IBM_KP_SERVICE_API_KEY: ${{ inputs.ibm-service-api-key }}\n 
run: |\n tests/scripts/github-action-helper.sh deploy_manifest_with_local_build deploy/examples/operator.yaml\n envsubst < "tests/manifests/test-kms-ibm-kp-secret.in" > "tests/manifests/test-kms-ibm-kp-secret.yaml"\n envsubst < "tests/manifests/test-kms-ibm-kp-spec.in" > "tests/manifests/test-kms-ibm-kp-spec.yaml"\n cat tests/manifests/test-kms-ibm-kp-secret.yaml >> tests/manifests/test-cluster-on-pvc-encrypted.yaml\n yq merge --inplace --arrays append tests/manifests/test-cluster-on-pvc-encrypted.yaml tests/manifests/test-kms-ibm-kp-spec.yaml\n yq write -i tests/manifests/test-cluster-on-pvc-encrypted.yaml "spec.storage.storageClassDeviceSets[0].count" 2\n yq write -i tests/manifests/test-cluster-on-pvc-encrypted.yaml "spec.storage.storageClassDeviceSets[0].volumeClaimTemplates[0].spec.resources.requests.storage" 6Gi\n kubectl create -f tests/manifests/test-cluster-on-pvc-encrypted.yaml\n tests/scripts/github-action-helper.sh deploy_manifest_with_local_build deploy/examples/toolbox.yaml\n\n - name: wait for prepare pod\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: tests/scripts/github-action-helper.sh wait_for_prepare_pod 2\n\n - name: wait for ceph to be ready\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: |\n tests/scripts/github-action-helper.sh wait_for_ceph_to_be_ready osd 2\n kubectl -n rook-ceph get pods\n kubectl -n rook-ceph get secrets\n\n - name: validate encrypted osd\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: |\n sudo lsblk\n\n - name: collect common logs\n if: always()\n uses: ./.github/workflows/collect-logs\n with:\n name: ${{ inputs.artifact-name }}\n\n - name: teardown cluster so that keys are removed from the KMS\n shell: bash --noprofile --norc -eo pipefail -x {0}\n run: kubectl -n rook-ceph delete cephcluster rook-ceph --wait\n | dataset_sample\yaml\rook_rook\.github\workflows\encryption-pvc-kms-ibm-kp\action.yml | action.yml | YAML | 4,084 | 0.85 | 0.068182 | 0 | node-utils | 94 | 
2024-01-30T13:19:00.491761 | BSD-3-Clause | false | fe96992bbe89bc1900fcb68d8b70ded1 |
name: "Tmate debugging tests"\ndescription: "Setup tmate session if the test fails"\ninputs:\n use-tmate:\n description: "boolean for enabling TMATE"\n required: true\n debug-ci:\n description: "boolean for debug-ci label in PR"\n requnred: false\nruns:\n using: "composite"\n steps:\n - name: consider debugging\n shell: bash --noprofile --norc -eo pipefail -x {0}\n if: runner.debug || inputs.debug-ci == 'true'\n run: |\n # Enable tmate only in the Rook fork, where the USE_TMATE secret is set in the repo, or if the action is re-run\n if [ "$GITHUB_REPOSITORY_OWNER" = "rook" ] || [ -n "${{ inputs.use-tmate }}" ] || [ "$GITHUB_RUN_ATTEMPT" -gt 1 ]; then\n echo USE_TMATE=1 >> $GITHUB_ENV\n fi\n\n - name: set up tmate session for debugging\n if: env.USE_TMATE\n uses: mxschmitt/action-tmate@v3\n with:\n limit-access-to-actor: false\n detached: true\n | dataset_sample\yaml\rook_rook\.github\workflows\tmate_debug\action.yml | action.yml | YAML | 925 | 0.95 | 0.296296 | 0.038462 | vue-tools | 835 | 2025-04-29T22:17:49.020187 | GPL-3.0 | false | f3c6f0a2570719845c32f11996df732f |
version: 2\nupdates:\n - package-ecosystem: "github-actions"\n directory: "/"\n schedule:\n interval: "weekly"\n - package-ecosystem: "composer"\n directory: "/"\n schedule:\n interval: "weekly"\n versioning-strategy: "widen"\n - package-ecosystem: "npm"\n directory: "/"\n schedule:\n interval: "weekly"\n versioning-strategy: "widen"\n | dataset_sample\yaml\roundcube_roundcubemail\.github\dependabot.yml | dependabot.yml | YAML | 363 | 0.7 | 0 | 0 | node-utils | 16 | 2023-08-20T13:02:52.521225 | BSD-3-Clause | false | a2c1276c1ddd2e736d75d59d6e8978e5 |
name: Report a bug\ndescription: Describe a bug or issue you may have identified in Roundcube.\ntitle: "Provide a general summary of the issue"\nlabels: []\nassignees: []\nbody:\n - type: markdown\n attributes:\n value: |\n **IMPORTANT!** If you have problems with your email account (e.g. cannot log in, emails got lost, etc.) or if you have questions how to configure your Outlook or mobile phone to get email, this isn't the right place to ask. **Roundcube is not a service but free software which somebody installed for you.**\n\n Please contact your internet hosting provider or IT responsible instead. If you don't know who this might be, please review your bills and find out who you're paying for email and webhosting services.\n - type: checkboxes\n attributes:\n label: Prerequisites\n options:\n - label: I have [searched](https://github.com/roundcube/roundcubemail/issues?q=is%3Aissue) for duplicate or closed issues\n required: true\n - label: I can recreate the issue with all plugins disabled\n required: false\n - type: textarea\n id: what-happened\n attributes:\n label: Describe the issue\n description: Provide a summary of the issue and what you expected to happen, including specific steps to reproduce.\n validations:\n required: true\n - type: markdown\n attributes:\n value: |\n ## Environment\n - type: dropdown\n id: browser\n attributes:\n label: What browser(s) are you seeing the problem on?\n multiple: true\n options:\n - Chrome\n - Edge\n - Firefox\n - Safari\n - Other\n - type: input\n id: php\n attributes:\n label: What version of PHP are you using?\n placeholder: "e.g., v7.2 or v8.1"\n - type: input\n id: version\n attributes:\n label: What version of Roundcube are you using?\n placeholder: "e.g., v1.5.2 or v1.6.6"\n validations:\n required: true\n - type: markdown\n attributes:\n value: |\n ## Logs\n - type: textarea\n id: js-errors\n attributes:\n label: JavaScript errors\n description: Provide any relevant entries from the browser's JavaScript 
console.\n - type: textarea\n id: logs\n attributes:\n label: PHP errors\n description: Provide any relevant entries from the Roundcube error log.\n | dataset_sample\yaml\roundcube_roundcubemail\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 2,331 | 0.95 | 0.058824 | 0.044776 | node-utils | 455 | 2025-04-08T18:14:01.693009 | MIT | false | 4b59918424b597ab8e24a3d887b2bb98 |
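GitHub issue forms like the one above mark individual blocks as mandatory through a `validations.required` key. A small sketch of how such a parsed form body could be checked for required fields; the plain dicts stand in for parsed YAML and the helper name is hypothetical:

```python
def required_field_ids(body: list[dict]) -> list[str]:
    """Collect the ids of form elements that declare
    validations.required: true; markdown blocks carry no validations,
    so they fall through. (Checkbox options use a separate per-option
    `required` key not covered by this sketch.)"""
    ids = []
    for element in body:
        if element.get("validations", {}).get("required"):
            ids.append(element.get("id", element["type"]))
    return ids

# Plain-dict stand-in for part of the parsed bug_report.yml body:
body = [
    {"type": "markdown", "attributes": {"value": "## Environment"}},
    {"type": "textarea", "id": "what-happened", "validations": {"required": True}},
    {"type": "input", "id": "php"},
    {"type": "input", "id": "version", "validations": {"required": True}},
]
print(required_field_ids(body))  # ['what-happened', 'version']
```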
contact_links:\n - name: Ask for help\n url: https://www.roundcubeforum.net/\n about: Ask and discuss questions with other Roundcube community members.\n | dataset_sample\yaml\roundcube_roundcubemail\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 156 | 0.8 | 0.25 | 0 | awesome-app | 909 | 2025-02-06T06:02:13.783220 | BSD-3-Clause | false | 73959b4cdaaca6471bd7058e7ab5063c |
name: Suggest a new feature\ndescription: Suggest a new feature to be developed for Roundcube.\ntitle: "Suggest a new feature"\nlabels: []\nassignees: []\nbody:\n - type: checkboxes\n attributes:\n label: Prerequisites\n options:\n - label: I have [searched](https://github.com/roundcube/roundcubemail/issues?q=is%3Aissue) for duplicate or closed feature requests\n required: true\n - label: I have [searched](https://plugins.roundcube.net/) for plugins that already provide the feature\n required: true\n - type: textarea\n id: proposal\n attributes:\n label: Proposal\n description: Describe the new feature you would like to see in Roundcube.\n validations:\n required: true\n - type: textarea\n id: motivation\n attributes:\n label: Motivation and context\n description: Tell us why this change is needed or helpful, and what problems it may help solve.\n validations:\n required: true\n | dataset_sample\yaml\roundcube_roundcubemail\.github\ISSUE_TEMPLATE\feature_request.yml | feature_request.yml | YAML | 967 | 0.95 | 0.107143 | 0 | python-kit | 852 | 2024-12-09T16:59:54.921508 | BSD-3-Clause | false | 77a94bdcb4baab3b679dc000e09d82dc |
name: 'Create reminder from comment'\n\npermissions:\n issues: write\n pull-requests: write\n\non:\n issue_comment:\n types: [created, edited]\n\njobs:\n reminder:\n if: github.repository == 'roundcube/roundcubemail'\n runs-on: ubuntu-latest\n\n steps:\n - name: 👀 check for reminder\n uses: agrc/create-reminder-action@9ff30cde74284045941af16a04362938957253b1 # v1.1.17\n | dataset_sample\yaml\roundcube_roundcubemail\.github\workflows\bot-create-manual-reminder.yml | bot-create-manual-reminder.yml | YAML | 385 | 0.8 | 0.111111 | 0 | python-kit | 406 | 2024-08-31T12:57:50.553391 | GPL-3.0 | false | 87c02c9f5ee338ef9c9409079ce25a72 |
name: 'Notify manually requested reminders'\n\non:\n schedule:\n - cron: '0 * * * *'\n\npermissions:\n issues: write\n pull-requests: write\n\njobs:\n reminder:\n if: github.repository == 'roundcube/roundcubemail'\n runs-on: ubuntu-latest\n\n steps:\n - name: check reminders and notify\n uses: agrc/reminder-action@96f2ec2e1a7a53ead156504922e9bc36d64f49ee # v1.0.16\n | dataset_sample\yaml\roundcube_roundcubemail\.github\workflows\bot-manual-reminder.yml | bot-manual-reminder.yml | YAML | 378 | 0.8 | 0.055556 | 0 | react-lib | 598 | 2023-12-27T18:49:47.138440 | GPL-3.0 | false | 514ac6a41cf0fcae65b14181a9efab84 |
name: CI\n\non:\n push:\n pull_request:\n\npermissions:\n contents: read\n\njobs:\n cs:\n runs-on: ubuntu-latest\n if: "!contains(github.event.head_commit.message, '[skip ci]') && !contains(github.event.head_commit.message, '[ci skip]')"\n\n name: Coding Style\n\n steps:\n - name: Checkout code\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n\n - name: Setup PHP\n uses: shivammathur/setup-php@9e72090525849c5e82e596468b86eb55e9cc5401 # v2.32.0\n with:\n php-version: "8.3"\n extensions: mbstring\n tools: composer:v2\n coverage: none\n\n - name: Install dependencies\n run: |\n composer install --prefer-dist --no-interaction --no-progress\n npm install --omit=optional\n\n - name: Check Coding Style - PHP\n run: vendor/bin/php-cs-fixer fix --dry-run --using-cache=no --diff --verbose\n\n - name: Check composer.json format\n run: composer validate --strict --no-check-lock && composer normalize --dry-run --no-check-lock\n\n - name: Check Coding Style - JS\n run: npx eslint --ext .js .\n\n - name: Plugins - Check composer.json format\n run: |\n for plugin_dir in plugins/*/; do (\n echo "========== $plugin_dir =========="\n cd "$plugin_dir"\n composer validate --strict --no-check-lock && composer normalize --dry-run --no-check-lock "$plugin_dir/composer.json"\n echo " "\n ); done\n\n phpstan:\n runs-on: ubuntu-latest\n if: "!contains(github.event.head_commit.message, '[skip ci]') && !contains(github.event.head_commit.message, '[ci skip]')"\n\n name: Static Analysis\n\n steps:\n - name: Checkout code\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n\n - name: Setup PHP\n uses: shivammathur/setup-php@9e72090525849c5e82e596468b86eb55e9cc5401 # v2.32.0\n with:\n php-version: "8.4"\n extensions: mbstring\n tools: composer:v2\n coverage: none\n\n - name: Setup composer\n run: |\n composer require "kolab/net_ldap3:~1.1.1" --no-update\n composer require "laravel/dusk:^8.3" --no-update\n\n - name: Install dependencies\n run: 
composer install --prefer-dist --no-interaction --no-progress\n\n - name: Run Static Analysis\n run: vendor/bin/phpstan analyse -v\n | dataset_sample\yaml\roundcube_roundcubemail\.github\workflows\ci.yml | ci.yml | YAML | 2,409 | 0.95 | 0.037975 | 0 | awesome-app | 167 | 2024-02-01T05:04:34.005197 | BSD-3-Clause | false | fdd059180aa2f8438dc628ee1267c987 |
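Both jobs in the CI workflow above gate on the same head-commit message check before running. A sketch of that guard (the function name is hypothetical; the markers are the ones used in the `if:` expressions):

```python
def skip_ci(head_commit_message: str) -> bool:
    """Mirror the job-level `if:` guard: skip when the head commit
    message contains either conventional skip marker."""
    return ("[skip ci]" in head_commit_message
            or "[ci skip]" in head_commit_message)
```

In the workflow itself this is expressed with `contains(github.event.head_commit.message, ...)`, negated so the job runs only when neither marker is present.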
name: roundcubemail-testrunner image\n\non:\n push:\n paths:\n - '.ci/docker-images/*'\n - '.github/workflows/docker_image.yml'\n schedule:\n # Rebuild automatically each monday early morning.\n - cron: "42 5 * * 1"\n\njobs:\n build_and_push:\n strategy:\n fail-fast: false\n # Set up a matrix so we can add more versions to build with in the future.\n matrix:\n php: ["8.3"]\n\n name: build and push with PHP ${{ matrix.php }}\n runs-on: ubuntu-latest\n # Set the permissions granted to the GITHUB_TOKEN for the actions in this job.\n permissions:\n contents: read\n packages: write\n attestations: write\n id-token: write\n steps:\n - name: Check actor permission\n uses: skjnldsv/check-actor-permission@69e92a3c4711150929bca9fcf34448c5bf5526e7 # v3.0\n with:\n require: admin\n - name: Check out the repo\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n - name: Log in to the Container registry\n uses: docker/login-action@74a5d142397b4f367a81961eba4e8cd7edddf772 # v3.4.0\n with:\n registry: ghcr.io\n username: ${{ github.actor }}\n password: ${{ secrets.GITHUB_TOKEN }}\n - name: Build and push Docker image\n id: push\n uses: docker/build-push-action@471d1dc4e07e5cdedd4c2171150001c434f0b7a4 # v6.15.0\n with:\n context: .\n file: ./.ci/docker-images/Dockerfile\n push: true\n #platforms: linux/amd64,linux/arm64\n build-args: PHPVERSION=${{ matrix.php }}\n tags: "ghcr.io/roundcube/roundcubemail-testrunner:php${{ matrix.php }}"\n\n | dataset_sample\yaml\roundcube_roundcubemail\.github\workflows\docker_image.yml | docker_image.yml | YAML | 1,665 | 0.95 | 0.019608 | 0.085106 | node-utils | 600 | 2024-09-19T21:22:08.284101 | MIT | false | d7b72365022ab64c25bd1e893aab11cd |
name: Message Rendering\n\non:\n push:\n pull_request:\n\npermissions:\n contents: read\n\njobs:\n message_rendering:\n runs-on: ubuntu-latest\n if: "!contains(github.event.head_commit.message, '[skip ci]') && !contains(github.event.head_commit.message, '[ci skip]')"\n\n strategy:\n fail-fast: false\n\n name: Linux / PHP 8.3\n\n steps:\n - name: Checkout code\n uses: actions/checkout@11bd71901bbe5b1630ceea73d27597364c9af683 # v4.2.2\n\n # Run via docker compose because we can't run greenmail in a server here\n # (it requires the testing emails to be present when starting but\n # services are started before the repo is cloned). And instead of\n # re-building what our compose-file contains we can just use it.\n - name: Run tests via docker compose\n run: docker compose -f .ci/compose.yaml run test_message_rendering\n\n - name: Upload artifacts\n uses: actions/upload-artifact@4cec3d8aa04e39d1a68397de0c4cd6fb9dce8ec1 # v4.6.1\n if: failure()\n with:\n name: Logs\n path: logs/errors.log\n\n | dataset_sample\yaml\roundcube_roundcubemail\.github\workflows\message_rendering.yml | message_rendering.yml | YAML | 1,075 | 0.95 | 0.054054 | 0.142857 | node-utils | 148 | 2024-10-25T04:08:29.208611 | MIT | false | bc96f0ced3ab4764a3a62f8a176ff4e0 |
---\n:position: before\n:position_in_additional_file_patterns: before\n:position_in_class: before\n:position_in_factory: before\n:position_in_fixture: before\n:position_in_routes: before\n:position_in_serializer: before\n:position_in_test: before\n:classified_sort: true\n:exclude_controllers: true\n:exclude_factories: true\n:exclude_fixtures: true\n:exclude_helpers: true\n:exclude_scaffolds: true\n:exclude_serializers: true\n:exclude_sti_subclasses: true\n:exclude_tests: true\n:force: false\n:format_markdown: false\n:format_rdoc: false\n:format_yard: false\n:frozen: false\n:ignore_model_sub_dir: false\n:ignore_unknown_models: false\n:include_version: false\n:show_complete_foreign_keys: false\n:show_foreign_keys: false\n:show_indexes: false\n:simple_indexes: false\n:sort: false\n:timestamp: false\n:trace: false\n:with_comment: true\n:with_column_comments: true\n:with_table_comments: true\n:active_admin: false\n:command:\n:debug: false\n:hide_default_column_types: ''\n:hide_limit_column_types: 'integer,boolean'\n:ignore_columns:\n:ignore_routes:\n:models: true\n:routes: false\n:skip_on_db_migrate: false\n:target_action: :do_annotations\n:wrapper:\n:wrapper_close:\n:wrapper_open:\n:classes_default_to_s: []\n:additional_file_patterns: []\n:model_dir:\n - app/models\n:require: []\n:root_dir:\n - ''\n\n:show_check_constraints: false\n | dataset_sample\yaml\ruby\.annotaterb.yml | .annotaterb.yml | YAML | 1,293 | 0.85 | 0 | 0 | awesome-app | 154 | 2024-11-30T17:44:25.662239 | MIT | false | f49688771487470ea35afe1812e20fc2 |
---\nignore:\n - CVE-2021-41098 # https://github.com/chatwoot/chatwoot/issues/3097 (update once azure blob storage is updated)\n | dataset_sample\yaml\ruby\.bundler-audit.yml | .bundler-audit.yml | YAML | 126 | 0.8 | 0 | 0 | vue-tools | 231 | 2024-11-22T15:37:49.492667 | GPL-3.0 | false | 9ebbc1a57b88d41d672dd179fd2cc926 |
version: "2"\nplugins:\n rubocop:\n enabled: true\n checks:\n Rubocop/Naming/RescuedExceptionsVariableName:\n enabled: false\n Rubocop/Layout/AlignArguments:\n enabled: false\n Rubocop/Style/MultilineWhenThen:\n enabled: false\n Rubocop/Layout/SpaceAroundOperators:\n enabled: false\n Rubocop/Style/FormatStringToken:\n enabled: false\n config:\n file: .rubocop.yml\n channel: "rubocop-1-50-3"\nexclude_patterns:\n - "**/bin/"\n - "**/script/"\n - "**/vendor/"\n - "**/spec/"\n - "public/"\n - "sample/db/samples"\n | dataset_sample\yaml\ruby\.codeclimate.yml | .codeclimate.yml | YAML | 574 | 0.7 | 0 | 0 | python-kit | 134 | 2024-11-09T13:43:25.285182 | GPL-3.0 | false | 09813d55ff2408c9ee173492e40b11f5 |
linters:\n ErbSafety:\n # TODO: [@rhymes] re-enable this and fix the violations,\n # see https://github.com/Shopify/erb-lint#ErbSafety\n enabled: false\n SelfClosingTag:\n enabled: false\n ParserErrors:\n exclude:\n - '**/app/views/pages/_editor_guide_text.html.erb'\n SpaceInHtmlTag:\n exclude:\n - '**/app/views/pages/_editor_guide_text.html.erb'\n AllowedScriptType:\n enabled: true\n allowed_types:\n - 'text/javascript'\n - 'text/x-tmpl'\n - 'application/ld+json'\n allow_blank: true\n disallow_inline_scripts: false\n Rubocop:\n enabled: true\n rubocop_config:\n inherit_from:\n - .rubocop.yml\n Layout/InitialIndentation:\n Enabled: false\n Layout/LineLength:\n Max: 289\n Exclude:\n - '**/app/views/comments/_comment_proper.html.erb'\n Layout/TrailingEmptyLines:\n Enabled: false\n Lint/UselessAssignment:\n Enabled: false\n Rails/OutputSafety:\n Enabled: false\n Exclude:\n - '**/app/views/feedback_messages/index.html.erb'\n - '**/app/views/layouts/_styles.html.erb'\n - '**/app/views/liquid_embeds/show.html.erb'\n - '**/app/views/moderations/mod.html.erb'\n - '**/app/views/pages/onboarding.html.erb'\n - '**/app/views/partnerships/show.html.erb'\n - '**/app/views/partnerships/index.html.erb'\n | dataset_sample\yaml\ruby\.erb-lint.yml | .erb-lint.yml | YAML | 1,386 | 0.8 | 0 | 0.043478 | node-utils | 176 | 2024-04-20T19:52:07.910857 | GPL-3.0 | false | f08495a9e0418f6febaa5f09a12cd6ec |
stages:\n - sync\n - preflight\n - prepare\n - build-images\n - fixtures\n - lint\n - test-frontend\n - test\n - post-test\n - review\n - qa\n - post-qa\n - pre-merge\n - pages\n - notify\n - release-environments\n - benchmark\n - ai-gateway\n\n# always use `gitlab-org` runners, however\n# in cases where jobs require Docker-in-Docker, the job\n# definition must be extended with `.use-docker-in-docker`\ndefault:\n image: $DEFAULT_CI_IMAGE\n tags:\n - $DEFAULT_JOB_TAG\n # All jobs are interruptible by default\n interruptible: true\n # Default job timeout doesn't work: https://gitlab.com/gitlab-org/gitlab/-/issues/387528\n timeout: 90m\n\n.default-ruby-variables: &default-ruby-variables\n RUBY_VERSION: "${RUBY_VERSION_DEFAULT}"\n OMNIBUS_GITLAB_CACHE_EDITION: "GITLAB_RUBY3_2"\n\n.next-ruby-variables: &next-ruby-variables\n RUBY_VERSION: "${RUBY_VERSION_NEXT}"\n OMNIBUS_GITLAB_CACHE_EDITION: "GITLAB_RUBY3_3"\n\n.default-branch-pipeline-failure-variables: &default-branch-pipeline-failure-variables\n CREATE_RAILS_FLAKY_TEST_ISSUES: "true"\n CREATE_RAILS_SLOW_TEST_ISSUES: "true"\n CREATE_RAILS_TEST_FAILURE_ISSUES: "true"\n GLCI_PUSH_RAILS_TEST_FAILURE_ISSUES_TO_GCS: "true"\n\n.default-merge-request-variables: &default-merge-request-variables\n ADD_SLOW_TEST_NOTE_TO_MERGE_REQUEST: "true"\n CREATE_RAILS_TEST_FAILURE_ISSUES: "false"\n FF_NETWORK_PER_BUILD: "true"\n FF_TIMESTAMPS: "true"\n FF_USE_FASTZIP: "true"\n NO_SOURCEMAPS: "true"\n GLCI_PUSH_RAILS_TEST_FAILURE_ISSUES_TO_GCS: "true"\n\n.if-merge-request-security-canonical-sync: &if-merge-request-security-canonical-sync\n if: '$CI_MERGE_REQUEST_SOURCE_PROJECT_PATH == "gitlab-org/security/gitlab" && $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME == $CI_DEFAULT_BRANCH && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME == $CI_DEFAULT_BRANCH'\n\n.if-not-security-canonical-sync: &if-not-security-canonical-sync\n if: '$CI_MERGE_REQUEST_SOURCE_PROJECT_PATH != "gitlab-org/security/gitlab" || $CI_MERGE_REQUEST_SOURCE_BRANCH_NAME != 
$CI_DEFAULT_BRANCH'\n\n.if-merge-request-labels-run-with-rails-next: &if-merge-request-labels-run-with-rails-next\n if: '($CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_EVENT_TYPE != "merge_train") && $CI_MERGE_REQUEST_LABELS =~ /pipeline:run-with-rails-next/'\n\nworkflow:\n name: '$PIPELINE_NAME'\n rules:\n # https://gitlab.com/gitlab-org/gitlab/-/issues/514740\n - if: '$GITLAB_USER_LOGIN == "gitlab-crowdin-bot" && $CI_PIPELINE_SOURCE == "merge_request_event" && $CI_MERGE_REQUEST_APPROVED != "true"'\n when: never\n - if: '$CI_PIPELINE_SOURCE == "pipeline" && $GITALY_TEST'\n variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Gitaly Rails Test Pipeline'\n # If `$FORCE_GITLAB_CI` is set, create a pipeline.\n - if: '$FORCE_GITLAB_CI'\n variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Ruby $RUBY_VERSION forced'\n - if: '$START_AS_IF_FOSS'\n variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Ruby $RUBY_VERSION as-if-foss'\n # As part of the process of creating RCs automatically, we update stable\n # branches with the changes of the most recent production deployment. The\n # merge requests used for this merge a branch release-tools/X into a stable\n # branch. 
For these merge requests we don't want to run any pipelines, as\n # they serve no purpose and will run anyway when the changes are merged.\n - if: '$CI_MERGE_REQUEST_SOURCE_BRANCH_NAME =~ /^release-tools\/\d+\.\d+\.\d+-rc\d+$/ && $CI_MERGE_REQUEST_TARGET_BRANCH_NAME =~ /^[\d-]+-stable(-ee)?$/ && $CI_PROJECT_PATH == "gitlab-org/gitlab"'\n when: never\n - if: '$CI_MERGE_REQUEST_LABELS =~ /pipeline:run-in-ruby3_2/ || $CI_MERGE_REQUEST_TARGET_BRANCH_NAME =~ /^[\d-]+-stable(-ee)?$/'\n variables:\n <<: [*default-ruby-variables, *default-merge-request-variables]\n PIPELINE_NAME: 'Ruby $RUBY_VERSION MR'\n - if: '$CI_MERGE_REQUEST_LABELS =~ /Community contribution/'\n variables:\n <<: [*next-ruby-variables, *default-merge-request-variables]\n GITLAB_DEPENDENCY_PROXY_ADDRESS: ""\n PIPELINE_NAME: 'Ruby $RUBY_VERSION MR (community contribution)'\n # This works around https://gitlab.com/gitlab-org/gitlab/-/issues/332411 which prevents usage of dependency proxy\n # when pipeline is triggered by a project access token.\n # Example of project bot usernames (the format changed over time):\n # - project_278964_bot2\n # - project_278964_bot_7fb4d1cca8242cb399a0b8f483783120\n #\n # We set the DOCKER_AUTH_CONFIG variable to authenticate to Docker Hub to not be impacted by rate limitations\n # We don't set DOCKER_AUTH_CONFIG as a CI/CD group/project variable, because it would then have higher precedence than .gitlab-ci.yml,\n # and we would always reconfigure the docker daemon for any CI/CD pipeline.\n - if: '$CI_MERGE_REQUEST_IID && $GITLAB_USER_LOGIN =~ /project_\d+_bot/'\n variables:\n <<: [*next-ruby-variables, *default-merge-request-variables]\n GITLAB_DEPENDENCY_PROXY_ADDRESS: ""\n DOCKER_AUTH_CONFIG: "${DOCKER_AUTH_CONFIG_OVERRIDE}"\n PIPELINE_NAME: 'Ruby $RUBY_VERSION MR (triggered by a project token)'\n - <<: *if-merge-request-security-canonical-sync\n variables:\n <<: [*next-ruby-variables, *default-merge-request-variables]\n PIPELINE_NAME: '$CI_DEFAULT_BRANCH security->canonical sync'\n
SKIP_MESSAGE: 'MR only contains changes from the security mirror, which have already been reviewed, tested and deployed.'\n - <<: *if-merge-request-labels-run-with-rails-next\n variables:\n <<: [*next-ruby-variables, *default-merge-request-variables]\n BUNDLE_GEMFILE: Gemfile.next\n # This variable can be specified in manual or scheduled pipelines to run CI with the next Rails version.\n # Specifying BUNDLE_GEMFILE: Gemfile.next wouldn't work because QA jobs require\n # BUNDLE_GEMFILE to be Gemfile even with the next Rails version pipelines.\n - if: '$CI_COMMIT_BRANCH == "rails-next" && $CI_PIPELINE_SOURCE == "schedule"'\n variables:\n <<: *next-ruby-variables\n BUNDLE_GEMFILE: Gemfile.next\n PIPELINE_NAME: 'Rails next version (Gemfile.next)'\n # For (detached) merge request pipelines.\n - if: '$CI_MERGE_REQUEST_IID'\n variables:\n <<: [*next-ruby-variables, *default-merge-request-variables]\n PIPELINE_NAME: 'Ruby $RUBY_VERSION MR'\n # For the scheduled pipelines, we set specific variables.\n - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH && $CI_PIPELINE_SOURCE == "schedule" && $BUILD_WITH_NEXT_RUBY_VERSION == "true"'\n variables:\n <<: *next-ruby-variables\n PIPELINE_NAME: 'Scheduled Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH && $CI_PIPELINE_SOURCE == "schedule"'\n variables:\n <<: [*default-ruby-variables, *default-branch-pipeline-failure-variables]\n CRYSTALBALL: "true"\n PIPELINE_NAME: 'Scheduled Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n - if: '$CI_COMMIT_BRANCH == "ruby-next" && $CI_PIPELINE_SOURCE == "schedule"'\n variables:\n <<: *next-ruby-variables\n PIPELINE_NAME: 'Scheduled Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n # This works around https://gitlab.com/gitlab-org/gitlab/-/issues/332411 which prevents usage of dependency proxy\n # when pipeline is triggered by a project access token.\n # Example of project bot usernames (the format changed over time):\n # - project_278964_bot2\n
# - project_278964_bot_7fb4d1cca8242cb399a0b8f483783120\n #\n # We set the DOCKER_AUTH_CONFIG variable to authenticate to Docker Hub to not be impacted by rate limitations\n # We don't set DOCKER_AUTH_CONFIG as a CI/CD group/project variable, because it would then have higher precedence than .gitlab-ci.yml,\n # and we would always reconfigure the docker daemon for any CI/CD pipeline.\n - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH && $GITLAB_USER_LOGIN =~ /project_\d+_bot/'\n variables:\n <<: [*default-ruby-variables, *default-branch-pipeline-failure-variables]\n GITLAB_DEPENDENCY_PROXY_ADDRESS: ""\n DOCKER_AUTH_CONFIG: "${DOCKER_AUTH_CONFIG_OVERRIDE}"\n PIPELINE_NAME: 'Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch (triggered by a project token)'\n # For `$CI_DEFAULT_BRANCH` from wider community contributors, we don't want to run any pipelines on pushes,\n # because normally we want to run merge request pipelines and scheduled pipelines, not for repository synchronization.\n # This can avoid accidentally using up pipeline minutes quota while synchronizing the repository for wider community contributors.\n - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH && $CI_PIPELINE_SOURCE == "push" && $CI_PROJECT_NAMESPACE !~ /^gitlab(-org|-cn)?($|\/)/'\n when: never\n # For `$CI_DEFAULT_BRANCH` branch, create a pipeline (this includes on schedules, pushes, merges, etc.).\n - if: '$CI_COMMIT_BRANCH == $CI_DEFAULT_BRANCH'\n variables:\n <<: [*default-ruby-variables, *default-branch-pipeline-failure-variables]\n PIPELINE_NAME: 'Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n # For tags, create a pipeline.\n - if: '$CI_COMMIT_TAG'\n variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Ruby $RUBY_VERSION $CI_COMMIT_TAG tag'\n # If `$GITLAB_INTERNAL` isn't set, don't create a pipeline.\n - if: '$GITLAB_INTERNAL == null'\n when: never\n # For stable, auto-deploy, and security branches, create a pipeline.\n - if: '$CI_COMMIT_BRANCH =~ /^[\d-]+-stable(-ee)?$/'\n
variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n - if: '$CI_COMMIT_BRANCH =~ /^\d+-\d+-auto-deploy-\d+$/'\n variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n - if: '$CI_COMMIT_BRANCH =~ /^security\//'\n variables:\n <<: *default-ruby-variables\n PIPELINE_NAME: 'Ruby $RUBY_VERSION $CI_COMMIT_BRANCH branch'\n\nvariables:\n PG_VERSION: "14"\n DEFAULT_CI_IMAGE: "${REGISTRY_HOST}/${REGISTRY_GROUP}/gitlab-build-images/ci/${BUILD_OS}-${OS_VERSION}-slim-ruby-${RUBY_VERSION}-golang-${GO_VERSION}-node-${NODE_VERSION}-postgresql-${PG_VERSION}:rubygems-${RUBYGEMS_VERSION}-git-${GIT_VERSION}-lfs-${LFS_VERSION}-chrome-${CHROME_VERSION}-yarn-${YARN_VERSION}-graphicsmagick-${GRAPHICSMAGICK_VERSION}"\n DEFAULT_JOB_TAG: "gitlab-org"\n GITLAB_LARGE_RUNNER_OPTIONAL: "gitlab-org" # overridden just in gitlab-org/gitlab\n GLCI_PRODUCTION_ASSETS_RUNNER_OPTIONAL: "gitlab-org-docker" # this is set to GITLAB_LARGE_RUNNER_OPTIONAL on gitlab-org/gitlab and defaults to docker runners for dind to work correctly\n DEFAULT_RSPEC_PREDICTIVE_JOB_TAGS: "${DEFAULT_JOB_TAG}" # Separated by commas, overridden in JiHu\n # We set $GITLAB_DEPENDENCY_PROXY to another variable (since it's set at the group level and has higher precedence than .gitlab-ci.yml)\n # so that we can override $GITLAB_DEPENDENCY_PROXY_ADDRESS in workflow rules.\n GITLAB_DEPENDENCY_PROXY_ADDRESS: "${GITLAB_DEPENDENCY_PROXY}"\n RAILS_ENV: "test"\n NODE_ENV: "test"\n BUNDLE_WITHOUT: "production:development"\n BUNDLE_INSTALL_FLAGS: "--jobs=$(nproc) --retry=3"\n BUNDLE_FROZEN: "true"\n BUNDLE_GEMFILE: Gemfile\n # we override the max_old_space_size to prevent OOM errors\n NODE_OPTIONS: --max-old-space-size=10240\n GIT_DEPTH: "20"\n # 'GIT_STRATEGY: clone' optimizes the pack-objects cache hit ratio\n GIT_STRATEGY: "clone"\n GIT_SUBMODULE_STRATEGY: "none"\n GET_SOURCES_ATTEMPTS: "3"\n # CI_FETCH_REPO_GIT_STRATEGY: "none" is from 
artifacts. "clone" is from cloning\n CI_FETCH_REPO_GIT_STRATEGY: "none"\n FORCE_COLOR: 1\n CLICOLOR_FORCE: 1\n\n FLAKY_RSPEC_SUITE_REPORT_PATH: rspec/flaky/report-suite.json\n FRONTEND_FIXTURES_MAPPING_PATH: crystalball/frontend_fixtures_mapping.json\n GITLAB_WORKHORSE_FOLDER: "gitlab-workhorse"\n JOB_METRICS_FILE_PATH: "${CI_PROJECT_DIR}/tmp/job-metrics.json"\n KNAPSACK_RSPEC_SUITE_REPORT_PATH: knapsack/report-master.json\n RSPEC_CHANGED_FILES_PATH: rspec/changed_files.txt\n RSPEC_FAIL_FAST_THRESHOLD: 20\n RSPEC_FAST_QUARANTINE_PATH: rspec/fast_quarantine-gitlab.txt\n RSPEC_LAST_RUN_RESULTS_FILE: rspec/rspec_last_run_results.txt\n RSPEC_MATCHING_JS_FILES_PATH: rspec/js_matching_files.txt\n RSPEC_MATCHING_TESTS_EE_PATH: rspec/matching_tests-ee.txt\n RSPEC_MATCHING_TESTS_FOSS_PATH: rspec/matching_tests-foss.txt\n RSPEC_MATCHING_TESTS_PATH: rspec/matching_tests.txt\n RSPEC_PACKED_TESTS_MAPPING_PATH: crystalball/packed-mapping.json\n RSPEC_PACKED_TESTS_MAPPING_ALT_PATH: crystalball/packed-mapping-alt.json\n RSPEC_PREDICTIVE_PIPELINE_TEMPLATE_YML: .gitlab/ci/rails/rspec-predictive.gitlab-ci.yml.erb\n RSPEC_PROFILING_FOLDER_PATH: rspec/profiling\n RSPEC_TESTS_MAPPING_PATH: crystalball/mapping.json\n RSPEC_TESTS_MAPPING_ALT_PATH: crystalball/mapping-alt.json\n RSPEC_VIEWS_INCLUDING_PARTIALS_PATH: rspec/views_including_partials.txt\n RSPEC_AUTO_EXPLAIN_LOG_PATH: auto_explain/auto_explain.ndjson.gz\n RUBYOPT: "--yjit"\n TMP_TEST_FOLDER: "${CI_PROJECT_DIR}/tmp/tests"\n TMP_TEST_GITLAB_WORKHORSE_PATH: "${TMP_TEST_FOLDER}/${GITLAB_WORKHORSE_FOLDER}"\n\n ES_JAVA_OPTS: "-Xms256m -Xmx256m"\n ELASTIC_URL: "http://elastic:changeme@elasticsearch:9200"\n BUNDLER_CHECKSUM_VERIFICATION_OPT_IN: "1"\n CACHE_CLASSES: "true"\n CHECK_PRECOMPILED_ASSETS: "true"\n RETRY_FAILED_TESTS_IN_NEW_PROCESS: "true"\n # Run with decomposed databases by default\n DECOMPOSED_DB: "true"\n SEC_DECOMPOSED_DB: "true"\n\n DOCS_GITLAB_REPO_SUFFIX: "ee"\n\n REVIEW_APPS_IMAGE: 
"${REGISTRY_HOST}/${REGISTRY_GROUP}/gitlab-build-images/${BUILD_OS}-${OS_VERSION}-ruby-${RUBY_VERSION}:gcloud-${GCLOUD_VERSION}-kubectl-1.30-helm-${HELM_VERSION}"\n REVIEW_APPS_DOMAIN: "gitlab-review.app"\n REVIEW_APPS_GCP_PROJECT: "gitlab-review-apps"\n REVIEW_APPS_GCP_REGION: "us-central1"\n\n REGISTRY_HOST: "registry.gitlab.com"\n REGISTRY_GROUP: "gitlab-org"\n\n # Disable useless network connections when installing some NPM packages.\n # See https://gitlab.com/gitlab-com/gl-security/engineering-and-research/inventory/-/issues/827#note_1203181407\n DISABLE_OPENCOLLECTIVE: "true"\n\n # This is set at the gitlab-org level, but we set it here for forks\n DANGER_DO_NOT_POST_INVALID_DANGERFILE_ERROR: "1"\n\n # Workaround for https://gitlab.com/gitlab-org/gitlab/-/issues/390313. This can be dropped whenever\n # https://github.com/ruby/ruby/pull/7663 lands in the Ruby interpreter.\n NOKOGIRI_LIBXML_MEMORY_MANAGEMENT: default\n\n # CI jobs behavior can be changed by changing the value of these variables in the project's CI/CD variables\n AVERAGE_KNAPSACK_REPORT: "true"\n ENABLE_DEPSCORE: "true"\n CACHE_ASSETS_AS_PACKAGE: "true"\n REUSE_FRONTEND_FIXTURES_ENABLED: "true"\n SIMPLECOV: "true"\n\ninclude:\n - local: .gitlab/ci/_skip.yml\n rules:\n - <<: *if-merge-request-security-canonical-sync\n - local: .gitlab/ci/version.yml\n - local: .gitlab/ci/*.gitlab-ci.yml\n rules:\n - <<: *if-not-security-canonical-sync\n - remote: 'https://gitlab.com/gitlab-org/frontend/untamper-my-lockfile/-/raw/main/templates/merge_request_pipelines.yml'\n rules:\n - <<: *if-not-security-canonical-sync\n - local: .gitlab/ci/includes/gitlab-com/*.gitlab-ci.yml\n rules:\n - if: '$CI_SERVER_HOST == "gitlab.com"'\n - if: '$CI_SERVER_HOST == "jihulab.com"'\n - local: .gitlab/ci/includes/as-if-jh.gitlab-ci.yml\n rules:\n # Only run as-if-jh triggerred pipelines for gitlab.com/gitlab-org/gitlab MRs that don't target stable branches\n # and that don't have the quarantine or pipeline::expedited labels.\n 
- if: '$CI_PROJECT_URL != "https://gitlab.com/gitlab-org/gitlab"'\n when: never\n - if: '$CI_MERGE_REQUEST_ID == null'\n when: never\n - if: '$CI_MERGE_REQUEST_TARGET_BRANCH_NAME =~ /^[\d-]+-stable(-ee|-jh)?$/'\n when: never\n - if: '$CI_MERGE_REQUEST_LABELS =~ /quarantine/ || $CI_MERGE_REQUEST_LABELS =~ /pipeline::expedited/ || $CI_MERGE_REQUEST_LABELS =~ /pipeline:expedite/'\n when: never\n - when: always\n | dataset_sample\yaml\ruby\.gitlab-ci.yml | .gitlab-ci.yml | YAML | 16,080 | 0.95 | 0.170886 | 0.189189 | awesome-app | 993 | 2023-12-10T10:10:49.370059 | MIT | false | d3c975b14e9fec15e3780cafeae07ac6 |
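The long `workflow:rules:` list in this row is evaluated top to bottom and the first matching rule decides whether a pipeline is created and which variables it gets. A heavily simplified Python sketch of that first-match-wins evaluation; the predicates are hypothetical stand-ins, not the real GitLab evaluator:

```python
from typing import Callable, Optional

# A rule pairs a predicate over the pipeline context with the pipeline
# name it selects; None stands in for `when: never` (no pipeline).
Rule = tuple[Callable[[dict], bool], Optional[str]]

def evaluate_workflow_rules(rules: list[Rule], ctx: dict) -> Optional[str]:
    """Return the outcome of the first rule whose predicate matches,
    like GitLab's workflow:rules evaluation order."""
    for predicate, pipeline_name in rules:
        if predicate(ctx):
            return pipeline_name
    return None

# Simplified stand-ins for a few of the rules above:
rules: list[Rule] = [
    (lambda c: bool(c.get("FORCE_GITLAB_CI")), "Ruby forced"),
    (lambda c: c.get("source") == "schedule", "Scheduled branch pipeline"),
    (lambda c: bool(c.get("merge_request_iid")), "Ruby MR"),
]

print(evaluate_workflow_rules(rules, {"merge_request_iid": 42}))  # Ruby MR
```

Because order decides, the real file places its `when: never` guards (release-tools MRs, forks pushing to the default branch) before the broad catch-all rules.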
# Gitpod file reference\n# https://www.gitpod.io/docs/configure/workspaces/tasks\n\nimage: registry.gitlab.com/gitlab-org/gitlab-development-kit/gitpod-workspace:stable\ncheckoutLocation: gitlab-development-kit/gitlab\n\ntasks:\n\n  - name: GDK\n    # "command:" emits gitpod-start\n    before: |\n      START_UNIXTIME="$(date +%s)"\n      echo START_UNIXTIME="$(date +%s)" > /workspace/gitpod_start_time.sh\n    command: |\n      # send signal to other tasks that Gitpod started\n      gp sync-done gitpod-start\n      echo "Waiting for other task to copy GDK.."\n      gp sync-await gdk-copied && cd /workspace/gitlab-development-kit && gdk help\n\n  - name: GitLab\n    # "command:" emits gdk-copied\n    init: |\n      (\n      set -e\n      echo "$(date) – Copying GDK" | tee -a /workspace/startup.log\n      cp -r $HOME/gitlab-development-kit /workspace/\n      cd /workspace/gitlab-development-kit\n      mv -v /workspace/gitlab-development-kit/secrets.yml /workspace/gitlab-development-kit/gitlab/config\n      # ensure gdk.yml has correct instance settings\n      gdk config set gitlab.rails.port 443 |& tee -a /workspace/startup.log\n      gdk config set gitlab.rails.https.enabled true |& tee -a /workspace/startup.log\n      gdk config set webpack.host 127.0.0.1 |& tee -a /workspace/startup.log\n      gdk config set webpack.static false |& tee -a /workspace/startup.log\n      gdk config set webpack.live_reload false |& tee -a /workspace/startup.log\n      # reconfigure GDK\n      echo "$(date) – Reconfiguring GDK" | tee -a /workspace/startup.log\n      gdk reconfigure\n      # run DB migrations\n      echo "$(date) – Running DB migrations" | tee -a /workspace/startup.log\n      make gitlab-db-migrate\n      # stop GDK\n      echo "$(date) – Stopping GDK" | tee -a /workspace/startup.log\n      gdk stop\n      echo "$(date) – GDK stopped" | tee -a /workspace/startup.log\n      )\n    command: |\n      (\n      set -e\n      gp sync-done gdk-copied\n      gp sync-await gitpod-start\n      [[ -f /workspace/gitpod_start_time.sh ]] && source /workspace/gitpod_start_time.sh\n      SECONDS=0\n      cd /workspace/gitlab-development-kit\n      # update GDK\n      echo "$(date) – Updating GDK" | tee -a /workspace/startup.log\n      export DEFAULT_BRANCH=$(git --git-dir=gitlab/.git branch --show-current)\n      gdk config set gitlab.default_branch "$DEFAULT_BRANCH"\n      gdk update\n      # ensure gdk.yml has correct instance settings\n      gdk config set gitlab.rails.hostname $(gp url 3000 | sed -e 's+^http[s]*://++')\n      gdk config set gitlab.rails.port 443\n      gdk config set gitlab.rails.https.enabled true\n      gdk config set webpack.host 127.0.0.1\n      gdk config set webpack.static false\n      gdk config set webpack.live_reload false\n      # reconfigure GDK\n      echo "$(date) – Reconfiguring GDK" | tee -a /workspace/startup.log\n      gdk reconfigure\n      # start GDK\n      echo "$(date) – Starting GDK" | tee -a /workspace/startup.log\n      export DEV_SERVER_PUBLIC_ADDR=$(gp url 3808)\n      export RAILS_HOSTS=$(gp url 3000 | sed -e 's+^http[s]*://++')\n      gdk start\n      # Run DB migrations\n      if [ "$GITLAB_RUN_DB_MIGRATIONS" == true ]; then\n        make gitlab-db-migrate\n      fi\n      cd /workspace/gitlab-development-kit/gitlab\n      echo "--- on branch: $DEFAULT_BRANCH"\n      echo "--- installing lefthook"\n      bundle exec lefthook install\n      echo "--- resetting db/structure.sql"\n      git checkout db/structure.sql\n      echo "--- waiting for GitLab"\n      gp ports await 3000\n      printf "Awaiting /-/readiness on $(gp url 3000) ..."\n      # Check /-/readiness which returns JSON, but we're only interested in the exit code\n      #\n      # We use http://localhost:3000 instead of the public hostname because\n      # it's no longer possible to access as specific cookies are required\n      until curl --silent --no-buffer --fail http://localhost:3000/-/readiness > /dev/null 2>&1; do printf '.'; sleep 5; done && echo ""\n      # Give Gitpod a few more seconds to set up everything ...\n      sleep 5\n      printf "$(date) – GitLab is up (took ~%.1f minutes)\n" "$((10*$SECONDS/60))e-1" | tee -a /workspace/startup.log\n      gp preview $(gp url 3000) --external || true\n      PREBUILD_LOG=(/workspace/.gitpod/prebuild-log-*)\n      [[ -f /workspace/gitpod_start_time.sh ]] && printf "Took %.1f minutes from https://gitlab.com/gitlab-org/gitlab/-/blob/master/.gitpod.yml being executed through to completion %s\n" "$((10*(($(date +%s)-${START_UNIXTIME}))/60))e-1" "$([[ -f "$PREBUILD_LOG" ]] && echo "With Prebuilds")"\n      )\n\nports:\n  - port: 2222 # sshd\n    onOpen: ignore\n  - port: 3000 # rails-web\n    onOpen: notify\n  - port: 3005 # gitlab-docs\n    onOpen: notify\n  - port: 3010 # gitlab-pages\n    onOpen: ignore\n  - port: 3808 # webpack\n    onOpen: ignore\n  - port: 5000 # auto_devops\n    onOpen: ignore\n  - port: 5778 # jaeger\n    onOpen: ignore\n  - port: 9000 # object_store / minio\n    onOpen: ignore\n  - port: 9122 # gitlab-shell\n    onOpen: ignore\n\nvscode:\n  extensions:\n    - GitLab.gitlab-workflow\n    - rebornix.ruby@0.28.1\n    - wingrunr21.vscode-ruby@0.28.0\n    - karunamurti.haml@1.4.1\n    - octref.vetur@0.37.3\n    - dbaeumer.vscode-eslint@2.4.0\n    - DavidAnson.vscode-markdownlint@0.49.0\n    - esbenp.prettier-vscode\n | dataset_sample\yaml\ruby\.gitpod.yml | .gitpod.yml | YAML | 5,418 | 0.95 | 0.023256 | 0.154472 | python-kit | 8 | 2024-09-21T14:06:54.188275 | Apache-2.0 | false | d75b3b37a8b04167e7ffe0d28a446ed4 |
exclude:\n - 'vendor/**/*'\n\nrequire:\n - ./lib/linter/haml_middle_dot.rb\n\nlinters:\n AltText:\n enabled: true\n MiddleDot:\n enabled: true\n LineLength:\n max: 300\n ViewLength:\n max: 200 # Override default value of 100 inherited from rubocop\n | dataset_sample\yaml\ruby\.haml-lint.yml | .haml-lint.yml | YAML | 252 | 0.95 | 0 | 0 | react-lib | 324 | 2023-08-19T20:21:03.852690 | GPL-3.0 | false | 2b657376df12a1b4de3babf963ada22c |
sources:\n  bundler: true\n\nallowed:\n  - 0bsd\n  - apache-2.0\n  - bsd-2-clause\n  - bsd-3-clause\n  - cc0-1.0\n  - isc\n  - mit\n  - ruby\n\nignored:\n  bundler:\n    - abbrev # Ruby (default gem)\n    - base64 # Ruby (default gem)\n    - bigdecimal # Ruby (default gem)\n    - cgi # Ruby (default gem)\n    - date # Ruby (default gem)\n    - digest # Ruby (default gem)\n    - io-console # Ruby (default gem)\n    - io-wait # Ruby (default gem)\n    - irb # Ruby (default gem)\n    - json # Ruby (default gem)\n    - logger # Ruby (default gem)\n    - mutex_m # BSD-2-Clause\n    - net-http # Ruby (default gem)\n    - net-protocol # Ruby (default gem)\n    - openssl # Ruby (default gem)\n    - ostruct # Ruby (default gem)\n    - pp # Ruby (default gem)\n    - prettyprint # Ruby (default gem)\n    - racc # Ruby (default gem)\n    - rchardet # LGPL\n    - rdoc # Ruby (default gem)\n    - reline # Ruby (default gem)\n    - ruby2_keywords # Ruby (default gem)\n    - stringio # Ruby (default gem)\n    - strscan # Ruby (default gem)\n    - timeout # Ruby (default gem)\n    - uri # Ruby (default gem)\n\nreviewed:\n  bundler:\n    - activerecord # MIT\n    - benchmark # Ruby\n    - coderay # MIT\n    - concurrent-ruby # MIT\n    - css_parser # MIT\n    - drb # BSD-2-Clause\n    - dry-initializer # MIT\n    - excon # MIT\n    - faraday-em_http # MIT\n    - faraday-em_synchrony # MIT\n    - faraday-excon # MIT\n    - faraday-httpclient # MIT\n    - faraday-net_http # MIT\n    - faraday-patron # MIT\n    - faraday-rack # MIT\n    - highline # Ruby or GPL-2.0\n    - htmlentities # MIT\n    - image_size # MIT\n    - jwt # MIT\n    - kgio # LGPL-2.1+\n    - logstash-event # Apache-2.0\n    - net-imap # Ruby (bundled gem)\n    - net-pop # Ruby (bundled gem)\n    - net-smtp # Ruby (bundled gem)\n    - nio4r # MIT + BSD\n    - omniauth # MIT\n    - pg # Ruby\n    - r2 # Apache-2.0 (Twitter)\n    - raindrops # LGPL-2.1+\n    - rubyzip # Ruby\n    - securerandom # Ruby\n    - sidekiq # LGPL (Sidekiq)\n    - tilt # MIT\n    - unf # BSD-2-Clause\n    - unicorn # Ruby or GPLv2/GPLv3\n | dataset_sample\yaml\ruby\.licensed.yml | .licensed.yml | YAML | 2,016 | 0.8 | 0 | 0 | python-kit | 650 | 2024-12-09T01:46:16.591622 | BSD-3-Clause | false | 991194f244713e29d30c30f570f2f088 |
require: rubocop-rspec\n\nAllCops:\n  TargetRubyVersion: 3.0\n  Exclude:\n    - "**/sandbox/**/*"\n    - "**/db/migrate/*"\n    - "**/Gemfile"\n    - "**/Gemfile.lock"\n    - "**/Rakefile"\n    - "**/rails"\n    - "**/*.gemspec"\n    - "**/dummy/**/*"\n    - "**/vendor/**/*"\n    - "**/spec_helper.rb"\n    - "**/templates/**/*"\n\nLayout/MultilineOperationIndentation:\n  EnforcedStyle: indented\n\nLayout/ParameterAlignment:\n  Enabled: false\n\nMetrics/ClassLength:\n  CountComments: false\n  Max: 150\n\nMetrics/ModuleLength:\n  CountComments: false\n  Max: 250\n  Exclude:\n    - "**/spec/**/*"\n\nStyle/Documentation:\n  Enabled: false\n\nLayout/LineLength:\n  Max: 150\n  Exclude:\n    - "**/spec/**/*"\n\nMetrics/MethodLength:\n  CountComments: false\n  Max: 50\n\nMetrics/BlockLength:\n  CountComments: false\n  Max: 50\n  Exclude:\n    - "**/spec/**/*"\n    - "**/*.rake"\n    - "**/factories/**/*"\n    - "**/config/routes.rb"\n    - "**/lib/**/testing_support/**/*"\n\nMetrics/AbcSize:\n  Max: 45\n\nStyle/StringLiterals:\n  EnforcedStyle: single_quotes\n\nLayout/DotPosition:\n  EnforcedStyle: trailing\n  Enabled: true\n\nLayout/SpaceInsideArrayLiteralBrackets:\n  Exclude:\n    - "api/spec/integration/**/*.rb"\n    - "api/lib/spree/api/testing_support/v2/platform_contexts.rb"\n\nStyle/FrozenStringLiteralComment:\n  Enabled: false\n\nStyle/ClassVars:\n  Exclude:\n    - "core/lib/spree/permitted_attributes.rb"\n\nStyle/RegexpLiteral:\n  Enabled: false\n\nStyle/WordArray:\n  Enabled: false\n\nStyle/SymbolArray:\n  Enabled: false\n\nStyle/SymbolProc:\n  Exclude:\n    - "**/app/serializers/**/*"\n\nStyle/GuardClause:\n  Enabled: false\n\nStyle/TrailingCommaInArrayLiteral:\n  Enabled: false\n\nStyle/TrailingCommaInHashLiteral:\n  Enabled: false\n\nStyle/BarePercentLiterals:\n  Enabled: false\n\nStyle/MutableConstant:\n  Enabled: false\n\nStyle/PercentLiteralDelimiters:\n  Enabled: false\n\nStyle/IfUnlessModifier:\n  Enabled: false\n\nNaming/VariableNumber:\n  Enabled: false\n\nStyle/RedundantPercentQ:\n  Enabled: false\n\nLint/ParenthesesAsGroupedExpression:\n  Enabled: false\n\nStyle/NumericPredicate:\n  Enabled: false\n\nMetrics/PerceivedComplexity:\n  Max: 10\n\nMetrics/CyclomaticComplexity:\n  Max: 10\n\nStyle/ClassAndModuleChildren:\n  Enabled: false\n\nStyle/AndOr:\n  Exclude:\n    - "**/*controller.rb"\n\nStyle/HashEachMethods:\n  Enabled: false\n\nStyle/HashTransformKeys:\n  Enabled: false\n\nStyle/HashTransformValues:\n  Enabled: false\n\nStyle/Alias:\n  Enabled: false\n\nRSpec/NestedGroups:\n  Max: 7\n\nLint/AmbiguousBlockAssociation:\n  Exclude:\n    - "**/spec/**/*"\n\nStyle/NumericLiterals:\n  Enabled: false\n\nRSpec/DescribeClass:\n  Enabled: false\n\nRSpec/VerifiedDoubles:\n  Enabled: false\n\nRSpec/MessageChain:\n  Enabled: false\n\nRSpec/AnyInstance:\n  Enabled: false\n\nRSpec/InstanceVariable:\n  Enabled: false\n\nRSpec/ContextWording:\n  Enabled: false\n\nRSpec/ExpectInHook:\n  Enabled: false\n\nRSpec/ExampleLength:\n  Enabled: false\n\nRSpec/MessageSpies:\n  Enabled: false\n\nRSpec/NamedSubject:\n  Enabled: false\n\nRSpec/MultipleExpectations:\n  Enabled: false\n\nRSpec/LetSetup:\n  Enabled: false\n\nRSpec/SubjectStub:\n  Enabled: false\n\nRSpec/VoidExpect:\n  Enabled: false\n\nRSpec/BeforeAfterAll:\n  Enabled: false\n | dataset_sample\yaml\ruby\.rubocop.yml | .rubocop.yml | YAML | 3,087 | 0.95 | 0 | 0 | vue-tools | 213 | 2025-02-15T20:28:42.612444 | Apache-2.0 | false | cec1fe19a6b720ecb6dcb9d6d09a72cb |
# This configuration was generated by\n# `rubocop --auto-gen-config --auto-gen-only-exclude --no-offense-counts --no-auto-gen-timestamp`\n# using RuboCop version 1.75.2.\n# The point is for the user to remove these configuration records\n# one by one as the offenses are removed from the code base.\n# Note that changes in the inspected code, or installation of new\n# versions of RuboCop, may require this file to be generated again.\n\nLint/NonLocalExitFromIterator:\n  Exclude:\n    - 'app/helpers/json_ld_helper.rb'\n\n# Configuration parameters: AllowedMethods, AllowedPatterns, CountRepeatedAttributes.\nMetrics/AbcSize:\n  Max: 82\n\n# Configuration parameters: CountBlocks, CountModifierForms, Max.\nMetrics/BlockNesting:\n  Exclude:\n    - 'lib/tasks/mastodon.rake'\n\n# Configuration parameters: AllowedMethods, AllowedPatterns.\nMetrics/CyclomaticComplexity:\n  Max: 25\n\n# Configuration parameters: AllowedMethods, AllowedPatterns.\nMetrics/PerceivedComplexity:\n  Max: 27\n\nRails/OutputSafety:\n  Exclude:\n    - 'config/initializers/simple_form.rb'\n\n# This cop supports safe autocorrection (--autocorrect).\n# Configuration parameters: AllowedVars.\nStyle/FetchEnvVar:\n  Exclude:\n    - 'config/environments/production.rb'\n    - 'config/initializers/2_limited_federation_mode.rb'\n    - 'config/initializers/3_omniauth.rb'\n    - 'config/initializers/cache_buster.rb'\n    - 'config/initializers/devise.rb'\n    - 'config/initializers/paperclip.rb'\n    - 'config/initializers/vapid.rb'\n    - 'lib/tasks/repo.rake'\n\n# This cop supports safe autocorrection (--autocorrect).\n# Configuration parameters: EnforcedStyle, MaxUnannotatedPlaceholdersAllowed, Mode, AllowedMethods, AllowedPatterns.\n# SupportedStyles: annotated, template, unannotated\n# AllowedMethods: redirect\nStyle/FormatStringToken:\n  Exclude:\n    - 'config/initializers/devise.rb'\n    - 'lib/paperclip/color_extractor.rb'\n\n# This cop supports safe autocorrection (--autocorrect).\n# Configuration parameters: MinBodyLength, AllowConsecutiveConditionals.\nStyle/GuardClause:\n  Enabled: false\n\n# Configuration parameters: AllowedMethods.\n# AllowedMethods: respond_to_missing?\nStyle/OptionalBooleanParameter:\n  Exclude:\n    - 'app/lib/admin/system_check/message.rb'\n    - 'app/lib/request.rb'\n    - 'app/lib/webfinger.rb'\n    - 'app/services/block_domain_service.rb'\n    - 'app/services/fetch_resource_service.rb'\n    - 'app/workers/domain_block_worker.rb'\n    - 'app/workers/unfollow_follow_worker.rb'\n\n# This cop supports unsafe autocorrection (--autocorrect-all).\n# Configuration parameters: EnforcedStyle.\n# SupportedStyles: short, verbose\nStyle/PreferredHashMethods:\n  Exclude:\n    - 'config/initializers/paperclip.rb'\n\n# This cop supports safe autocorrection (--autocorrect).\nStyle/RedundantConstantBase:\n  Exclude:\n    - 'config/environments/production.rb'\n    - 'config/initializers/sidekiq.rb'\n | dataset_sample\yaml\ruby\.rubocop_todo.yml | .rubocop_todo.yml | YAML | 2,837 | 0.95 | 0.011905 | 0.347222 | node-utils | 293 | 2025-06-07T16:57:44.739182 | MIT | false | 9b6f3a7ddffdb4780ec3fc72531d0745 |
scss_files: 'app/assets/stylesheets/**/*.scss'\nplugin_directories: ['.scss-linters']\n\n# List of gem names to load custom linters from (make sure they are already\n# installed)\nplugin_gems: []\n\nlinters:\n  BangFormat:\n    enabled: true\n    space_before_bang: true\n    space_after_bang: false\n\n  BemDepth:\n    enabled: false\n    max_elements: 1\n\n  BorderZero:\n    enabled: true\n    convention: zero # or `none`\n\n  ColorKeyword:\n    enabled: true\n\n  ColorVariable:\n    enabled: true\n\n  Comment:\n    enabled: true\n\n  DebugStatement:\n    enabled: true\n\n  DeclarationOrder:\n    enabled: true\n\n  DisableLinterReason:\n    enabled: true\n\n  DuplicateProperty:\n    enabled: true\n\n  ElsePlacement:\n    enabled: true\n    style: same_line # or 'new_line'\n\n  EmptyLineBetweenBlocks:\n    enabled: true\n    ignore_single_line_blocks: true\n\n  EmptyRule:\n    enabled: true\n\n  ExtendDirective:\n    enabled: false\n\n  FinalNewline:\n    enabled: true\n    present: true\n\n  HexLength:\n    enabled: true\n    style: short # or 'long'\n\n  HexNotation:\n    enabled: true\n    style: lowercase # or 'uppercase'\n\n  HexValidation:\n    enabled: true\n\n  IdSelector:\n    enabled: false\n\n  ImportantRule:\n    enabled: true\n\n  ImportPath:\n    enabled: true\n    leading_underscore: false\n    filename_extension: false\n\n  Indentation:\n    enabled: true\n    allow_non_nested_indentation: false\n    character: space # or 'tab'\n    width: 2\n\n  LeadingZero:\n    enabled: true\n    style: exclude_zero # or 'include_zero'\n\n  MergeableSelector:\n    enabled: true\n    force_nesting: true\n\n  NameFormat:\n    enabled: true\n    allow_leading_underscore: true\n    convention: hyphenated_lowercase # or 'camel_case', or 'snake_case', or a regex pattern\n\n  NestingDepth:\n    enabled: true\n    max_depth: 3\n    ignore_parent_selectors: false\n\n  PlaceholderInExtend:\n    enabled: true\n\n  PropertyCount:\n    enabled: false\n    include_nested: false\n    max_properties: 10\n\n  PropertySortOrder:\n    enabled: true\n    ignore_unspecified: false\n    min_properties: 2\n    separate_groups: false\n\n  PropertySpelling:\n    enabled: true\n    extra_properties: []\n\n  PropertyUnits:\n    enabled: true\n    global: [\n      'ch', 'em', 'ex', 'rem', # Font-relative lengths\n      'cm', 'in', 'mm', 'pc', 'pt', 'px', 'q', # Absolute lengths\n      'vh', 'vw', 'vmin', 'vmax', # Viewport-percentage lengths\n      'deg', 'grad', 'rad', 'turn', # Angle\n      'ms', 's', # Duration\n      'Hz', 'kHz', # Frequency\n      'dpi', 'dpcm', 'dppx', # Resolution\n      '%'] # Other\n    properties: {}\n\n  QualifyingElement:\n    enabled: true\n    allow_element_with_attribute: false\n    allow_element_with_class: false\n    allow_element_with_id: false\n\n  SelectorDepth:\n    enabled: true\n    max_depth: 3\n\n  SelectorFormat:\n    enabled: true\n    convention: hyphenated_lowercase # or 'strict_BEM', or 'hyphenated_BEM', or 'snake_case', or 'camel_case', or a regex pattern\n\n  Shorthand:\n    enabled: true\n    allowed_shorthands: [1, 2, 3]\n\n  SingleLinePerProperty:\n    enabled: true\n    allow_single_line_rule_sets: true\n\n  SingleLinePerSelector:\n    enabled: true\n\n  SpaceAfterComma:\n    enabled: true\n\n  SpaceAfterPropertyColon:\n    enabled: true\n    style: one_space # or 'no_space', or 'at_least_one_space', or 'aligned'\n\n  SpaceAfterPropertyName:\n    enabled: true\n\n  SpaceAfterVariableName:\n    enabled: true\n\n  SpaceAroundOperator:\n    enabled: true\n    style: one_space # or 'no_space'\n\n  SpaceBeforeBrace:\n    enabled: true\n    style: space # or 'new_line'\n    allow_single_line_padding: false\n\n  SpaceBetweenParens:\n    enabled: true\n    spaces: 0\n\n  StringQuotes:\n    enabled: true\n    style: single_quotes # or double_quotes\n\n  TrailingSemicolon:\n    enabled: true\n\n  TrailingWhitespace:\n    enabled: true\n\n  TrailingZero:\n    enabled: false\n\n  TransitionAll:\n    enabled: false\n\n  UnnecessaryMantissa:\n    enabled: true\n\n  UnnecessaryParentReference:\n    enabled: true\n\n  UrlFormat:\n    enabled: true\n\n  UrlQuotes:\n    enabled: true\n\n  VariableForProperty:\n    enabled: false\n    properties: []\n\n  VendorPrefix:\n    enabled: true\n    identifier_list: base\n    additional_identifiers: []\n    excluded_identifiers: []\n\n  ZeroUnit:\n    enabled: true\n\n  Compass::*:\n    enabled: false\n | dataset_sample\yaml\ruby\.scss-lint.yml | .scss-lint.yml | YAML | 4,355 | 0.8 | 0 | 0.011561 | react-lib | 295 | 2024-02-06T22:12:41.656737 | Apache-2.0 | false | a759a353ab1db61433b739854fdf5a93 |
---\ninclude:\n- "**/*.rb"\nexclude:\n- "*/spec/**/*"\n- "*test/**/*"\n- "*/vendor/**/*"\n- vendor/**/*\n- ".bundle/**/*"\n- sample/**/*\n- sandbox/**/*\n- pkg/**/*\n- cli/**/*.gemspec\nrequire: []\ndomains: []\nreporters:\n- rubocop\n- require_not_found\nrequire_paths: []\nmax_files: 50_000\n | dataset_sample\yaml\ruby\.solargraph.yml | .solargraph.yml | YAML | 274 | 0.95 | 0 | 0 | awesome-app | 225 | 2023-07-24T18:17:53.816215 | GPL-3.0 | false | 56223ec808a848d510ccab4486252531 |
# -*- YAML -*-\n# Copyright (C) 2011 Urabe, Shyouhei. All rights reserved.\n#\n# This file is a part of the programming language Ruby. Permission is hereby\n# granted, to either redistribute or modify this file, provided that the\n# conditions mentioned in the file COPYING are met. Consult the file for\n# details.\n\n# When you see Travis CI issues, or you are interested in understanding how to\n# manage, please check the link below.\n# https://github.com/ruby/ruby/wiki/CI-Servers#travis-ci\n\n# We enable Travis on the specific branches or forked repositories here.\n# https://docs.travis-ci.com/user/conditions-v1\nif: >-\n  (fork OR branch = master OR branch =~ /^ruby_\d_\d$/)\n  AND (commit_message !~ /(\[DOC\]|Document)/)\n  AND NOT (type = 'push' AND sender =~ /\[bot\]/)\n\nlanguage: c\n\nos: linux\n\ndist: jammy\n\ngit:\n  quiet: true\n\nenv:\n  global:\n    - NPROC="$(nproc)"\n    - JOBS="-j${NPROC}"\n    # https://github.com/travis-ci/travis-build/blob/e411371dda21430a60f61b8f3f57943d2fe4d344/lib/travis/build/bash/travis_apt_get_options.bash#L7\n    - travis_apt_get_options='--allow-downgrades --allow-remove-essential --allow-change-held-packages'\n    - travis_apt_get_options="-yq --no-install-suggests --no-install-recommends $travis_apt_get_options"\n    # -g0 disables backtraces when SEGV. Do not set that.\n    - debugflags=-ggdb3\n    - RUBY_TESTOPTS="$JOBS -q --tty=no"\n\n.org.ruby-lang.ci.matrix-definitions:\n  - &gcc-11\n    compiler: gcc-11\n    before_install:\n      - tool/travis_retry.sh sudo bash -c "rm -rf '${TRAVIS_ROOT}/var/lib/apt/lists/'* && exec apt-get update -yq"\n      - >-\n        tool/travis_retry.sh sudo -E apt-get $travis_apt_get_options install\n        gcc-11\n        g++-11\n        libffi-dev\n        libncurses-dev\n        libncursesw5-dev\n        libreadline-dev\n        libssl-dev\n        libyaml-dev\n        openssl\n        zlib1g-dev\n      - gcc-11 --version\n  - &ppc64le-linux\n    name: ppc64le-linux\n    arch: ppc64le\n    <<: *gcc-11\n  - &s390x-linux\n    name: s390x-linux\n    arch: s390x\n    <<: *gcc-11\n    env:\n      # Avoid possible test failures with the zlib applying the following patch\n      # on s390x CPU architecture.\n      # https://github.com/madler/zlib/pull/410\n      - DFLTCC=0\n\nmatrix:\n  include:\n    - <<: *ppc64le-linux\n    - <<: *s390x-linux\n  allow_failures:\n    - name: ppc64le-linux\n    - name: s390x-linux\n  fast_finish: true\n\nbefore_script:\n  - lscpu\n  - ./autogen.sh\n  - mkdir build\n  - cd build\n  - ../configure -C --disable-install-doc --prefix=$(pwd)/install\n  - make -s $JOBS\n  - make -s $JOBS install\n  # Useful info to report issues to the Ruby.\n  - $(pwd)/install/bin/ruby -v\n  # Useful info To report issues to the RubyGems.\n  - $(pwd)/install/bin/gem env\n\nscript:\n  - make -s test\n  - ../tool/travis_wait.sh make -s test-all RUBYOPT="-w"\n  - ../tool/travis_wait.sh make -s test-spec\n\n# We want to be notified when something happens.\nnotifications:\n  webhooks:\n    urls:\n      # ruby-lang slack: ruby/simpler-alerts-bot (travis)\n      - secure: mRsoS/UbqDkKkW5p3AEqM27d4SZnV6Gsylo3bm8T/deltQzTsGzZwrm7OIBXZv0UFZdE68XmPlyHfZFLSP2V9QZ7apXMf9/vw0GtcSe1gchtnjpAPF6lYBn7nMCbVPPx9cS0dwL927fjdRM1vj7IKZ2bk4F0lAJ25R25S6teqdk=\n    on_success: never\n    on_failure: always\n  email:\n    recipients:\n      - jun.aruga@gmail.com\n    on_success: never\n    on_failure: always\n | dataset_sample\yaml\ruby\.travis.yml | .travis.yml | YAML | 3,337 | 0.8 | 0.018018 | 0.212121 | node-utils | 79 | 2024-01-05T00:44:58.772347 | BSD-3-Clause | false | b8a06cb0f448727de6bbc3f59a57da14 |
self-hosted-runner:\n # Labels of self-hosted runner in array of strings.\n labels: []\n# Configuration variables in array of strings defined in your repository or\n# organization. `null` means disabling configuration variables check.\n# Empty array means no configuration variable is allowed.\nconfig-variables: []\n | dataset_sample\yaml\ruby\actionlint.yaml | actionlint.yaml | YAML | 312 | 0.8 | 0 | 0.571429 | awesome-app | 847 | 2025-06-16T19:47:06.228110 | Apache-2.0 | false | 8eab440d4affc6cb2b982c01081b2675 |
shared: &shared\n version: '4.1.0'\n\ndevelopment:\n <<: *shared\n\nproduction:\n <<: *shared\n\nstaging:\n <<: *shared\n\ntest:\n <<: *shared\n | dataset_sample\yaml\ruby\app.yml | app.yml | YAML | 135 | 0.7 | 0 | 0 | node-utils | 987 | 2024-09-27T02:17:58.604192 | GPL-3.0 | false | 97a968e5e77fc0e0d3584c896b526ecf |
# https://www.appveyor.com/docs/appveyor-yml/\n\nversion: "{build}"\n\n# branches to build\nbranches:\n # allowlist\n only:\n - master\n\n# Do not build on tags (GitHub and BitBucket)\nskip_tags: true\n\n# Do not build additional commits to feature branch with open Pull Requests\nskip_branch_with_pr: true\n\n# Rolling builds\n# can only be configured via UI: https://www.appveyor.com/docs/build-configuration/#rolling-builds\n\ninit:\n - git config --global core.autocrlf true\n\n# cloning the repository happens here\n\ninstall:\n - set PATH=C:\Ruby26-x64\bin;%PATH%\n - FOR /F "usebackq delims==" %%A IN (`ruby .ci/compatible_gem_version`) DO gem update --system %%A\n - FOR /F "usebackq delims==" %%A IN (`ruby .ci/bundler_version.rb`) DO gem install bundler -v %%A\n - bundle install\n # iconv\n - appveyor DownloadFile\n https://downloads.sourceforge.net/project/gnuwin32/libiconv/1.9.2-1/libiconv-1.9.2-1.exe?download\n -FileName %TEMP%\libiconv.exe\n - "%TEMP%\\libiconv.exe /SILENT /SUPPRESSMSGBOXES"\n\nenvironment:\n LC_ALL: en_US.UTF-8\n LANG: en_US.UTF-8\n FASTLANE_ITUNES_TRANSPORTER_PATH: C:/tmp\n\nbefore_test:\n - ruby -v\n - which ruby\n - gem -v\n - bundle -v\n - which iconv\n - which git\n - git --version\n\nbuild: off\n\ntest_script:\n # Uncomment the first line and comment the second one to debug failing tests\n # on Windows (some tests might fail with this).\n # - cmd /V /C "set DEBUG=1 && bundle exec fastlane execute_tests"\n - bundle exec fastlane execute_tests\n | dataset_sample\yaml\ruby\appveyor.yml | appveyor.yml | YAML | 1,472 | 0.8 | 0 | 0.272727 | python-kit | 176 | 2024-03-24T10:40:46.701010 | GPL-3.0 | false | f7c90da3e4bc38802293462d384f4c97 |
#\n# Create many HTML strings with ERB.\n#\nprelude: |\n require 'erb'\n\n data = <<erb\n <html>\n <head> <%= title %> </head>\n <body>\n <h1> <%= title %> </h1>\n <p>\n <%= content %>\n </p>\n </body>\n </html>\n erb\n\n title = "hello world!"\n content = "hello world!\n" * 10\nbenchmark:\n app_erb: ERB.new(data).result(binding)\nloop_count: 15000\n | dataset_sample\yaml\ruby\app_erb.yml | app_erb.yml | YAML | 368 | 0.95 | 0 | 0.142857 | node-utils | 233 | 2025-01-04T05:18:28.224905 | Apache-2.0 | false | 7f37d80bc58e46d8cab256e3266ff4a2 |
prelude: |\n small_flat_ary = 5.times.to_a\n large_flat_ary = 100.times.to_a\n small_pairs_ary = [[1, 2]] * 5\n large_pairs_ary = [[1, 2]] * 100\n mostly_flat_ary = 100.times.to_a.push([101, 102])\n\nbenchmark:\n small_flat_ary.flatten: small_flat_ary.flatten\n small_flat_ary.flatten!: small_flat_ary.flatten!\n large_flat_ary.flatten: large_flat_ary.flatten\n large_flat_ary.flatten!: large_flat_ary.flatten!\n small_pairs_ary.flatten: small_pairs_ary.flatten\n small_pairs_ary.flatten!: small_pairs_ary.dup.flatten!\n large_pairs_ary.flatten: large_pairs_ary.flatten\n large_pairs_ary.flatten!: large_pairs_ary.dup.flatten!\n mostly_flat_ary.flatten: mostly_flat_ary.flatten\n mostly_flat_ary.flatten!: mostly_flat_ary.dup.flatten!\nloop_count: 10000\n | dataset_sample\yaml\ruby\array_flatten.yml | array_flatten.yml | YAML | 751 | 0.7 | 0 | 0 | awesome-app | 429 | 2024-08-02T20:24:22.374034 | BSD-3-Clause | false | e1f38ea1513f8c93926044462b68185b |
prelude: |\n small1 = [1, 2, 3]\n small2 = [1, 2, 3, 4, 5]\n small3 = [2, 3, 4, 5]\n small4 = [2]\n big1 = [1, 2, 3, 4] * 64\n big2 = [1, 2, 3] * 64\n big3 = [1, 2] * 64\n\nbenchmark:\n small-&: small1 & small2 & small3 & small4\n small-intersection: small1.intersection(small2, small3, small4)\n big-&: big1 & big2 & big3\n big-intersection: big1.intersection(big2, big3)\n | dataset_sample\yaml\ruby\array_intersection.yml | array_intersection.yml | YAML | 371 | 0.7 | 0 | 0 | node-utils | 438 | 2024-06-12T13:36:41.823042 | MIT | false | 7a77dc8750289ca2ffc65fec148c0aab |
prelude: |\n ary2 = 2.times.map(&:to_f).shuffle\n ary10 = 10.times.map(&:to_f).shuffle\n ary100 = 100.times.map(&:to_f).shuffle\n ary500 = 500.times.map(&:to_f).shuffle\n ary1000 = 1000.times.map(&:to_f).shuffle\n ary2000 = 2500.times.map(&:to_f).shuffle\n ary3000 = 2500.times.map(&:to_f).shuffle\n ary5000 = 5000.times.map(&:to_f).shuffle\n ary10000 = 10000.times.map(&:to_f).shuffle\n ary20000 = 20000.times.map(&:to_f).shuffle\n ary50000 = 50000.times.map(&:to_f).shuffle\n ary100000 = 100000.times.map(&:to_f).shuffle\n\nbenchmark:\n ary2.max: ary2.max\n ary10.max: ary10.max\n ary100.max: ary100.max\n ary500.max: ary500.max\n ary1000.max: ary1000.max\n ary2000.max: ary2000.max\n ary3000.max: ary3000.max\n ary5000.max: ary5000.max\n ary10000.max: ary10000.max\n ary20000.max: ary20000.max\n ary50000.max: ary50000.max\n ary100000.max: ary100000.max\n\nloop_count: 10000\n\n | dataset_sample\yaml\ruby\array_max_float.yml | array_max_float.yml | YAML | 875 | 0.7 | 0 | 0 | react-lib | 569 | 2025-06-12T07:58:09.146767 | MIT | false | b3c3e289551f0517a3422b93dd5e8864 |
prelude: |\n ary2 = 2.times.to_a.shuffle\n ary10 = 10.times.to_a.shuffle\n ary100 = 100.times.to_a.shuffle\n ary500 = 500.times.to_a.shuffle\n ary1000 = 1000.times.to_a.shuffle\n ary2000 = 2500.times.to_a.shuffle\n ary3000 = 2500.times.to_a.shuffle\n ary5000 = 5000.times.to_a.shuffle\n ary10000 = 10000.times.to_a.shuffle\n ary20000 = 20000.times.to_a.shuffle\n ary50000 = 50000.times.to_a.shuffle\n ary100000 = 100000.times.to_a.shuffle\n ary1000000 = 1000000.times.to_a.shuffle\n\nbenchmark:\n ary2.max: ary2.max\n ary10.max: ary10.max\n ary100.max: ary100.max\n ary500.max: ary500.max\n ary1000.max: ary1000.max\n ary2000.max: ary2000.max\n ary3000.max: ary3000.max\n ary5000.max: ary5000.max\n ary10000.max: ary10000.max\n ary20000.max: ary20000.max\n ary50000.max: ary50000.max\n ary100000.max: ary100000.max\n ary1000000.max: ary1000000.max\n\nloop_count: 10000\n | dataset_sample\yaml\ruby\array_max_int.yml | array_max_int.yml | YAML | 865 | 0.7 | 0 | 0 | node-utils | 665 | 2024-01-18T11:12:36.085708 | BSD-3-Clause | false | 652aceb99e9ec76a77819c52f7cedbef |
prelude: |\n ary2 = 2.times.map(&:to_s).shuffle\n ary10 = 10.times.map(&:to_s).shuffle\n ary100 = 100.times.map(&:to_s).shuffle\n ary500 = 500.times.map(&:to_s).shuffle\n ary1000 = 1000.times.map(&:to_s).shuffle\n ary2000 = 2500.times.map(&:to_s).shuffle\n ary3000 = 2500.times.map(&:to_s).shuffle\n ary5000 = 5000.times.map(&:to_s).shuffle\n ary10000 = 10000.times.map(&:to_s).shuffle\n ary20000 = 20000.times.map(&:to_s).shuffle\n ary50000 = 50000.times.map(&:to_s).shuffle\n ary100000 = 100000.times.map(&:to_s).shuffle\n\nbenchmark:\n ary2.max: ary2.max\n ary10.max: ary10.max\n ary100.max: ary100.max\n ary500.max: ary500.max\n ary1000.max: ary1000.max\n ary2000.max: ary2000.max\n ary3000.max: ary3000.max\n ary5000.max: ary5000.max\n ary10000.max: ary10000.max\n ary20000.max: ary20000.max\n ary50000.max: ary50000.max\n ary100000.max: ary100000.max\n\nloop_count: 10000\n\n | dataset_sample\yaml\ruby\array_max_str.yml | array_max_str.yml | YAML | 875 | 0.7 | 0 | 0 | react-lib | 748 | 2025-01-03T14:13:53.720965 | GPL-3.0 | false | a08a543e352aa7688b8109a7f50a39a8 |
prelude: |\n ary2 = 2.times.to_a.shuffle\n ary10 = 10.times.to_a.shuffle\n ary100 = 100.times.to_a.shuffle\n ary500 = 500.times.to_a.shuffle\n ary1000 = 1000.times.to_a.shuffle\n ary2000 = 2500.times.to_a.shuffle\n ary3000 = 2500.times.to_a.shuffle\n ary5000 = 5000.times.to_a.shuffle\n ary10000 = 10000.times.to_a.shuffle\n ary20000 = 20000.times.to_a.shuffle\n ary50000 = 50000.times.to_a.shuffle\n ary100000 = 100000.times.to_a.shuffle\n ary1000000 = 1000000.times.to_a.shuffle\n\nbenchmark:\n ary2.min: ary2.min\n ary10.min: ary10.min\n ary100.min: ary100.min\n ary500.min: ary500.min\n ary1000.min: ary1000.min\n ary2000.min: ary2000.min\n ary3000.min: ary3000.min\n ary5000.min: ary5000.min\n ary10000.min: ary10000.min\n ary20000.min: ary20000.min\n ary50000.min: ary50000.min\n ary100000.min: ary100000.min\n ary1000000.min: ary1000000.min\n\nloop_count: 10000\n | dataset_sample\yaml\ruby\array_min.yml | array_min.yml | YAML | 865 | 0.7 | 0 | 0 | vue-tools | 217 | 2024-04-10T23:09:47.913562 | Apache-2.0 | false | 682a5f0e72dba76820fd4f58a5e991db |
prelude: |\n ary2 = 2.times.to_a.shuffle\n ary10 = 10.times.to_a.shuffle\n ary100 = 100.times.to_a.shuffle\n ary1000 = 1000.times.to_a.shuffle\n ary10000 = 10000.times.to_a.shuffle\n\nbenchmark:\n ary2.sort: ary2.sort\n ary10.sort: ary10.sort\n ary100.sort: ary100.sort\n ary1000.sort: ary1000.sort\n ary10000.sort: ary10000.sort\n\nloop_count: 10000\n | dataset_sample\yaml\ruby\array_sort_int.yml | array_sort_int.yml | YAML | 347 | 0.7 | 0 | 0 | awesome-app | 20 | 2025-07-07T20:09:34.064720 | BSD-3-Clause | false | c6087276e45e043303ec5016e21691c1 |
prelude: |\n class C\n attr_accessor :x\n def initialize\n @x = nil\n end\n class_eval <<-END\n def ar\n #{'x;'*256}\n end\n def aw\n #{'self.x = nil;'*256}\n end\n def arm\n m = method(:x)\n #{'m.call;'*256}\n end\n def awm\n m = method(:x=)\n #{'m.call(nil);'*256}\n end\n END\n end\n obj = C.new\nbenchmark:\n attr_reader: "obj.ar"\n attr_writer: "obj.aw"\n attr_reader_method: "obj.arm"\n attr_writer_method: "obj.awm"\n | dataset_sample\yaml\ruby\attr_accessor.yml | attr_accessor.yml | YAML | 504 | 0.95 | 0.206897 | 0.137931 | node-utils | 25 | 2023-11-12T19:12:01.154127 | Apache-2.0 | false | 2cf15b01ca4165ba2a1b2b6838c04157 |
actioncable:\n - "actioncable/**/*"\nactionmailbox:\n - "actionmailbox/**/*"\nactionmailer:\n - "actionmailer/**/*"\nactionpack:\n - "actionpack/**/*"\nactiontext:\n - "actiontext/**/*"\nactionview:\n - "actionview/**/*"\nactivejob:\n - "activejob/**/*"\nactivemodel:\n - "activemodel/**/*"\nactiverecord:\n - "activerecord/**/*"\nactivestorage:\n - "activestorage/**/*"\nactivesupport:\n - "activesupport/**/*"\nrails-ujs:\n - "actionview/app/assets/javascripts/rails-ujs*/*"\nrailties:\n - "railties/**/*"\ndocs:\n - "guides/**/*"\n | dataset_sample\yaml\ruby\autolabeler.yml | autolabeler.yml | YAML | 520 | 0.8 | 0 | 0 | python-kit | 176 | 2023-12-11T15:40:09.910997 | MIT | false | 081a0a46692e56d63140852855b65e45 |
files:\n 'yjit*': [team:yjit]\n 'yjit/**/*': [team:yjit]\n 'yjit/src/cruby_bindings.inc.rs': []\n 'doc/yjit/*': [team:yjit]\n 'bootstraptest/test_yjit*': [team:yjit]\n 'test/ruby/test_yjit*': [team:yjit]\n 'zjit*': [team:yjit]\n 'zjit/**/*': [team:yjit]\n 'zjit/src/cruby_bindings.inc.rs': []\n 'doc/zjit*': [team:yjit]\n 'test/ruby/test_zjit*': [team:yjit]\noptions:\n ignore_draft: true\n # This currently doesn't work as intended. We want to skip reviews when only\n # cruby_bingings.inc.rs is modified, but this skips reviews even when other\n # yjit files are modified as well. To be enabled after fixing the behavior.\n #last_files_match_only: true\n | dataset_sample\yaml\ruby\auto_request_review.yml | auto_request_review.yml | YAML | 654 | 0.8 | 0 | 0.222222 | awesome-app | 572 | 2025-02-01T01:17:37.841133 | BSD-3-Clause | false | eb0cb65570bb532384df261a7be9632d |
# see https://github.com/ankane/blazer for more info\n\ndata_sources:\n  main:\n    url: <%= ENV["BLAZER_DATABASE_URL"] %>\n\n    # statement timeout, in seconds\n    # none by default\n    timeout: 30\n\n    # caching settings\n    # can greatly improve speed\n    # off by default\n    cache:\n      mode: slow # or all\n      expires_in: 60 # min\n      slow_threshold: 15 # sec, only used in slow mode\n\n    # wrap queries in a transaction for safety\n    # not necessary if you use a read-only user\n    # true by default\n    # use_transaction: false\n\n    smart_variables:\n      # zone_id: "SELECT id, name FROM zones ORDER BY name ASC"\n      # period: ["day", "week", "month"]\n      # status: {0: "Active", 1: "Archived"}\n      tag: "SELECT name FROM tags ORDER BY name ASC"\n\n    linked_columns:\n      # user_id: "/admin/users/{value}"\n\n    smart_columns:\n      status: {0: "enqueued", 1: "working", 2: "succeeded", 3: "failed"}\n      input_types: {0: "text_field", 1: "text_area", 2: "check_box", 3: "color_field"}\n\n# create audits\naudit: true\n\n# change the time zone\n# time_zone: "Pacific Time (US & Canada)"\n\n# class name of the user model\n# user_class: User\n\n# method name for the current user\n# user_method: current_user\n\n# method name for the display name\n# user_name: name\n\n# custom before_action to use for auth\n# before_action_method: require_admin\n\n# email to send checks from\n# from_email: blazer@example.org\n\n# webhook for Slack\n# slack_webhook_url: <%= ENV["BLAZER_SLACK_WEBHOOK_URL"] %>\n\ncheck_schedules:\n  - "1 day"\n  - "1 hour"\n  - "5 minutes"\n\n# enable anomaly detection\n# note: with trend, time series are sent to https://trendapi.org\n# anomaly_checks: trend / r\n\n# enable forecasting\n# note: with trend, time series are sent to https://trendapi.org\n# forecasting: trend\n\n# enable map\n# mapbox_access_token: <%= ENV["MAPBOX_ACCESS_TOKEN"] %>\n | dataset_sample\yaml\ruby\blazer.yml | blazer.yml | YAML | 1,847 | 0.95 | 0.106667 | 0.660714 | python-kit | 808 | 2024-04-03T00:06:00.259094 | MIT | false | 7927f24581c488329b11073c416a7869 |
prelude: |\n # frozen_string_literal: true\n Warning[:experimental] = false\n string = "The quick brown fox jumped over the lazy dog."\n array = string.bytes\n buffer = IO::Buffer.for(string)\nbenchmark:\n string.each_byte: |\n upcased = String.new\n string.each_byte do |byte|\n upcased << (byte ^ 32)\n end\n array.each: |\n upcased = String.new\n array.each do |byte|\n upcased << (byte ^ 32)\n end\n buffer.each: |\n upcased = String.new\n buffer.each(:U8) do |offset, byte|\n upcased << (byte ^ 32)\n end\n buffer.each_byte: |\n upcased = String.new\n buffer.each_byte do |byte|\n upcased << (byte ^ 32)\n end\n | dataset_sample\yaml\ruby\buffer_each.yml | buffer_each.yml | YAML | 654 | 0.8 | 0.037037 | 0.037037 | python-kit | 791 | 2023-10-13T13:57:52.220481 | BSD-3-Clause | false | 23991a1c4af4dea5b7b95d854a4120e1 |
prelude: |\n # frozen_string_literal: true\n Warning[:experimental] = false\n string = "The quick brown fox jumped over the lazy dog."\n buffer = IO::Buffer.for(string)\n format = [:U32, :U32, :U32, :U32]\nbenchmark:\n string.unpack1: |\n [\n string.unpack1("N"),\n string.unpack1("N", offset: 4),\n string.unpack1("N", offset: 8),\n string.unpack1("N", offset: 12),\n ]\n buffer.get_value: |\n [\n buffer.get_value(:U32, 0),\n buffer.get_value(:U32, 4),\n buffer.get_value(:U32, 8),\n buffer.get_value(:U32, 12),\n ]\n buffer.get_values: |\n buffer.get_values(format, 0)\n string.unpack: |\n string.unpack("NNNN")\n | dataset_sample\yaml\ruby\buffer_get.yml | buffer_get.yml | YAML | 658 | 0.8 | 0.04 | 0.04 | awesome-app | 344 | 2024-01-18T16:03:47.960965 | BSD-3-Clause | false | 498c7fe05c61ae3adb61ddf0ef125f4e |
development:\n adapter: async\n\ntest:\n adapter: async\n\nproduction:\n adapter: redis\n url: <%= ENV.fetch("REDIS_URL") { "redis://localhost:6379/1" } %>\n channel_prefix: practical_developer_production\n | dataset_sample\yaml\ruby\cable.yml | cable.yml | YAML | 201 | 0.8 | 0 | 0 | python-kit | 135 | 2023-09-22T02:11:00.531790 | GPL-3.0 | false | a68e6d9dc24044381a95d8f1a6636c27 |
shared:\n secret_key: <%= ENV.fetch('HCAPTCHA_SECRET_KEY', nil) %>\n site_key: <%= ENV.fetch('HCAPTCHA_SITE_KEY', nil) %>\n | dataset_sample\yaml\ruby\captcha.yml | captcha.yml | YAML | 122 | 0.7 | 0 | 0 | python-kit | 889 | 2024-09-03T20:39:19.957052 | GPL-3.0 | false | 1874a99c49a045ced0fe6d2a9162b5cf |
prelude: |\n # frozen_string_literal: true\n require 'cgi/escape'\nbenchmark:\n - script: CGI.escapeHTML("")\n loop_count: 20000000\n - script: CGI.escapeHTML("abcde")\n loop_count: 20000000\n - script: CGI.escapeHTML("abcd<")\n loop_count: 20000000\n - script: CGI.escapeHTML("'&\"<>")\n loop_count: 5000000\n - prelude: long_no_escape = "abcde" * 300\n script: CGI.escapeHTML(long_no_escape)\n loop_count: 1000000\n - prelude: long_all_escape = "'&\"<>" * 10\n script: CGI.escapeHTML(long_all_escape)\n loop_count: 1000000\n - prelude: | # http://example.com/\n example_html = <<~HTML\n <body>\n <div>\n <h1>Example Domain</h1>\n <p>This domain is established to be used for illustrative examples in documents. You may use this\n domain in examples without prior coordination or asking for permission.</p>\n <p><a href="http://www.iana.org/domains/example">More information...</a></p>\n </div>\n </body>\n HTML\n script: CGI.escapeHTML(example_html)\n loop_count: 1000000\n | dataset_sample\yaml\ruby\cgi_escape_html.yml | cgi_escape_html.yml | YAML | 1,065 | 0.95 | 0.064516 | 0.032258 | python-kit | 287 | 2024-10-01T12:24:46.433278 | GPL-3.0 | false | 906150abaee7c91010ccb854659953b8 |
---\n# Settings for generating changelogs using the GitLab API. See\n# https://docs.gitlab.com/ee/api/repositories.html#generate-changelog-data for\n# more information.\ncategories:\n added: Added\n fixed: Fixed\n changed: Changed\n deprecated: Deprecated\n removed: Removed\n security: Security\n performance: Performance\n other: Other\ninclude_groups:\n - gitlab-org/gitlab-core-team/community-members\ntemplate: |\n {% if categories %}\n {% each categories %}\n ### {{ title }} ({% if single_change %}1 change{% else %}{{ count }} changes{% end %})\n\n {% each entries %}\n - [{{ title }}]({{ commit.web_url }})\\n {% if author.credit %} by {{ author.reference }}{% end %}\\n {% if commit.trailers.MR %}\\n ([merge request]({{ commit.trailers.MR }}))\\n {% else %}\\n {% if merge_request %}\\n ([merge request]({{ merge_request.web_url }}))\\n {% end %}\\n {% end %}\\n {% if commit.trailers.EE %}\\n **GitLab Enterprise Edition**\\n {% end %}\n\n {% end %}\n\n {% end %}\n {% else %}\n No changes.\n {% end %}\n# The tag format for gitlab-org/gitlab is vX.Y.Z(-rcX)-ee. The -ee prefix would\n# be treated as a pre-release identifier, which can result in the wrong tag\n# being used as the starting point of a changelog commit range. The custom regex\n# here is used to ensure we find the correct tag.\ntag_regex: '^v(?P<major>\d+)\.(?P<minor>\d+)\.(?P<patch>\d+)-ee$'\n | dataset_sample\yaml\ruby\changelog_config.yml | changelog_config.yml | YAML | 1,362 | 0.8 | 0.2 | 0.214286 | awesome-app | 126 | 2024-11-10T15:44:45.944919 | MIT | false | 6c8a557a9481a82c4ea7c0d2edec3713 |
comment: false # Do not leave PR comments\ncoverage:\n status:\n project:\n default:\n # GitHub status check is not blocking\n informational: true\n patch:\n default:\n # GitHub status check is not blocking\n informational: true\ngithub_checks:\n annotations: false\n | dataset_sample\yaml\ruby\codecov.yml | codecov.yml | YAML | 300 | 0.8 | 0 | 0.153846 | python-kit | 378 | 2024-10-05T10:48:49.936210 | BSD-3-Clause | false | 05322f3e73fb240558356dee13cdac18 |