| content (string, 1-103k chars, nullable) | path (string, 8-216 chars) | filename (string, 2-179 chars) | language (15 classes) | size_bytes (int64, 2-189k) | quality_score (float64, 0.5-0.95) | complexity (float64, 0-1) | documentation_ratio (float64, 0-1) | repository (5 classes) | stars (int64, 0-1k) | created_date (2023-07-10 to 2025-07-09) | license (4 classes) | is_test (bool, 2 classes) | file_hash (string, 32 chars) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
language: php\n\nbefore_script:\n - composer self-update\n - composer install --no-interaction --prefer-source --dev\n\nphp:\n - 5.3.3\n - 5.3\n - 5.4\n - 5.5\n - 5.6\n - hhvm\n\nnotifications:\n email: false\n webhooks:\n urls:\n - https://webhooks.gitter.im/e/6668f52f3dd4e3f81960\n on_success: always\n on_failure: always\n on_start: false\n\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\sebastian\exporter\.travis.yml | .travis.yml | YAML | 350 | 0.8 | 0 | 0 | awesome-app | 839 | 2024-01-20T07:39:09.571529 | Apache-2.0 | false | c304e66ee9f79a8e983fd9cb5e88f86e |
language: php\n\nphp:\n - 5.3.3\n - 5.3\n - 5.4\n - 5.5\n - 5.6\n - hhvm\n\nsudo: false\n\nbefore_script:\n - composer self-update\n - composer install --no-interaction --prefer-source --dev\n\nscript: ./vendor/bin/phpunit\n\nnotifications:\n email: false\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\sebastian\global-state\.travis.yml | .travis.yml | YAML | 246 | 0.7 | 0 | 0 | awesome-app | 470 | 2025-07-09T03:18:55.703053 | BSD-3-Clause | false | 54683a386bf74e6ccd12040370c4db79 |
language: php\n\nphp:\n - 5.6\n - 7.0\n - 7.1\n - nightly\n\nsudo: false\n\nbefore_install:\n - composer self-update\n - composer clear-cache\n\ninstall:\n - travis_retry composer update --no-interaction --no-ansi --no-progress --no-suggest --optimize-autoloader --prefer-stable\n\nscript:\n - ./vendor/bin/phpunit\n\nnotifications:\n email: false\n\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\sebastian\object-enumerator\.travis.yml | .travis.yml | YAML | 337 | 0.7 | 0 | 0 | react-lib | 106 | 2025-05-15T03:29:08.518670 | MIT | false | 875d1bb79a2295c90e6f563369786cc3 |
language: php\n\nphp:\n - 5.3.3\n - 5.3\n - 5.4\n - 5.5\n - 5.6\n - hhvm\n\nsudo: false\n\nbefore_script:\n - composer self-update\n - composer install --no-interaction --prefer-source --dev\n\nscript: ./vendor/bin/phpunit\n\nnotifications:\n email: false\n irc: "irc.freenode.org#phpunit"\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\sebastian\recursion-context\.travis.yml | .travis.yml | YAML | 280 | 0.8 | 0 | 0 | react-lib | 478 | 2023-11-11T02:40:15.665815 | MIT | false | fa3ba0b60448f6c880a66f4bb4260a5e |
--- %YAML:1.0\ntest: Miscellaneous\nspec: 2.21\nyaml: |\n true: true\n false: false\nphp: |\n [\n 'true' => true,\n 'false' => false,\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\booleanMappingKeys.yml | booleanMappingKeys.yml | YAML | 138 | 0.7 | 0 | 0 | node-utils | 379 | 2024-07-21T11:38:48.348268 | BSD-3-Clause | true | fda820cb52422994914fb20ef8b40983 |
test: outside double quotes\nyaml: |\n \0 \ \a \b \n\nphp: |\n "\\0 \\ \\a \\b \\n"\n---\ntest: 'null'\nyaml: |\n "\0"\nphp: |\n "\x00"\n---\ntest: bell\nyaml: |\n "\a"\nphp: |\n "\x07"\n---\ntest: backspace\nyaml: |\n "\b"\nphp: |\n "\x08"\n---\ntest: horizontal tab (1)\nyaml: |\n "\t"\nphp: |\n "\x09"\n---\ntest: horizontal tab (2)\nyaml: |\n "\ "\nphp: |\n "\x09"\n---\ntest: line feed\nyaml: |\n "\n"\nphp: |\n "\x0a"\n---\ntest: vertical tab\nyaml: |\n "\v"\nphp: |\n "\x0b"\n---\ntest: form feed\nyaml: |\n "\f"\nphp: |\n "\x0c"\n---\ntest: carriage return\nyaml: |\n "\r"\nphp: |\n "\x0d"\n---\ntest: escape\nyaml: |\n "\e"\nphp: |\n "\x1b"\n---\ntest: space\nyaml: |\n "\ "\nphp: |\n "\x20"\n---\ntest: slash\nyaml: |\n "\/"\nphp: |\n "\x2f"\n---\ntest: backslash\nyaml: |\n "\\"\nphp: |\n "\\"\n---\ntest: Unicode next line\nyaml: |\n "\N"\nphp: |\n "\xc2\x85"\n---\ntest: Unicode non-breaking space\nyaml: |\n "\_"\nphp: |\n "\xc2\xa0"\n---\ntest: Unicode line separator\nyaml: |\n "\L"\nphp: |\n "\xe2\x80\xa8"\n---\ntest: Unicode paragraph separator\nyaml: |\n "\P"\nphp: |\n "\xe2\x80\xa9"\n---\ntest: Escaped 8-bit Unicode\nyaml: |\n "\x42"\nphp: |\n "B"\n---\ntest: Escaped 16-bit Unicode\nyaml: |\n "\u20ac"\nphp: |\n "\xe2\x82\xac"\n---\ntest: Escaped 32-bit Unicode\nyaml: |\n "\U00000043"\nphp: |\n "C"\n---\ntest: Example 5.13 Escaped Characters\nnote: |\n Currently throws an error parsing first line. Maybe Symfony Yaml doesn't support\n continuation of string across multiple lines? Keeping test here but disabled.\ntodo: true\nyaml: |\n "Fun with \\\n \" \a \b \e \f \\n \n \r \t \v \0 \\n \ \_ \N \L \P \\n \x41 \u0041 \U00000041"\nphp: |\n "Fun with \x5C\n\x22 \x07 \x08 \x1B \x0C\n\x0A \x0D \x09 \x0B \x00\n\x20 \xA0 \x85 \xe2\x80\xa8 \xe2\x80\xa9\nA A A"\n---\ntest: Double quotes with a line feed\nyaml: |\n { double: "some value\n \"some quoted string\" and 'some single quotes one'" }\nphp: |\n [\n 'double' => "some value\n \"some quoted string\" and 'some single quotes one'"\n ]\n---\ntest: Backslashes\nyaml: |\n { single: 'foo\Var', no-quotes: foo\Var, double: "foo\\Var" }\nphp: |\n [\n 'single' => 'foo\Var', 'no-quotes' => 'foo\Var', 'double' => 'foo\Var'\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\escapedCharacters.yml | escapedCharacters.yml | YAML | 2,228 | 0.7 | 0 | 0 | awesome-app | 223 | 2024-04-19T04:58:28.614566 | MIT | true | bf2c2c6d4cf356ddde0e0a92ed714218 |
- escapedCharacters\n- sfComments\n- sfCompact\n- sfTests\n- sfObjects\n- sfMergeKey\n- sfQuotes\n- YtsAnchorAlias\n- YtsBasicTests\n- YtsBlockMapping\n- YtsDocumentSeparator\n- YtsErrorTests\n- YtsFlowCollections\n- YtsFoldedScalars\n- YtsNullsAndEmpties\n- YtsSpecificationExamples\n- YtsTypeTransfers\n- unindentedCollections\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\index.yml | index.yml | YAML | 312 | 0.7 | 0 | 0 | python-kit | 193 | 2023-10-24T16:56:18.702525 | GPL-3.0 | true | 809db138c609075657d8d2e81fabaa5c |
--- %YAML:1.0\ntest: Miscellaneous\nspec: 2.21\nyaml: |\n true: true\n false: false\nphp: |\n [\n 1 => true,\n 0 => false,\n ]\n---\ntest: Boolean\nyaml: |\n false: used as key\n logical: true\n answer: false\nphp: |\n [\n false => 'used as key',\n 'logical' => true,\n 'answer' => false\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\legacyBooleanMappingKeys.yml | legacyBooleanMappingKeys.yml | YAML | 303 | 0.7 | 0 | 0 | react-lib | 673 | 2024-07-30T07:38:21.074633 | GPL-3.0 | true | e6e4b3a581724e2590370225e1fc9198 |
--- %YAML:1.0\ntest: Miscellaneous\nspec: 2.21\nyaml: |\n null: ~\nphp: |\n [\n '' => null,\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\legacyNullMappingKey.yml | legacyNullMappingKey.yml | YAML | 94 | 0.5 | 0 | 0 | python-kit | 284 | 2023-10-10T18:58:10.669674 | GPL-3.0 | true | 5b294cccf5da7fa8c44bff361be666f1 |
data:\n single_line: 'foo bar baz'\n multi_line: |\n foo\n line with trailing spaces:\n \n bar\n integer like line:\n 123456789\n empty line:\n \n baz\n multi_line_with_carriage_return: "foo\nbar\r\nbaz"\n nested_inlined_multi_line_string: { inlined_multi_line: "foo\nbar\r\nempty line:\n\nbaz" }\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\multiple_lines_as_literal_block.yml | multiple_lines_as_literal_block.yml | YAML | 361 | 0.7 | 0 | 0 | node-utils | 330 | 2024-10-09T00:00:34.880142 | Apache-2.0 | true | b2862c72ae733162fd9d691696050cc1 |
data:\n foo: !bar "foo\r\nline with trailing spaces:\n \nbar\ninteger like line:\n123456789\nempty line:\n\nbaz"\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\multiple_lines_as_literal_block_for_tagged_values.yml | multiple_lines_as_literal_block_for_tagged_values.yml | YAML | 116 | 0.7 | 0 | 0 | react-lib | 524 | 2023-12-30T09:58:38.326273 | GPL-3.0 | true | 7ad14c7104568052b53818111375a2f3 |
data:\n multi_line: |4\n the first line has leading spaces\n The second line does not.\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\multiple_lines_as_literal_block_leading_space_in_first_line.yml | multiple_lines_as_literal_block_leading_space_in_first_line.yml | YAML | 105 | 0.7 | 0 | 0 | react-lib | 782 | 2025-05-09T15:49:47.235357 | BSD-3-Clause | true | 3a51ade8ba35bb95e656aacb1db1521e |
- escapedCharacters\n- sfComments\n- sfCompact\n- sfTests\n- sfObjects\n- sfMergeKey\n- sfQuotes\n- YtsAnchorAlias\n- YtsBasicTests\n- YtsBlockMapping\n- YtsDocumentSeparator\n- YtsErrorTests\n- YtsFlowCollections\n- YtsFoldedScalars\n- YtsNullsAndEmpties\n- YtsSpecificationExamples\n- YtsTypeTransfers\n- unindentedCollections\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\not_readable.yml | not_readable.yml | YAML | 312 | 0.7 | 0 | 0 | node-utils | 634 | 2025-02-26T03:52:36.493712 | MIT | true | 809db138c609075657d8d2e81fabaa5c |
--- %YAML:1.0\ntest: Miscellaneous\nspec: 2.21\nyaml: |\n null: ~\nphp: |\n [\n 'null' => null,\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\nullMappingKey.yml | nullMappingKey.yml | YAML | 98 | 0.5 | 0 | 0 | react-lib | 445 | 2023-08-01T23:43:10.425675 | BSD-3-Clause | true | 8efb82a3ac6d14bab70f1a42d1724f19 |
--- %YAML:1.0\ntest: A sequence with an unordered array\nbrief: >\n A sequence with an unordered array\nyaml: |\n 1: foo\n 0: bar\nphp: |\n [1 => 'foo', 0 => 'bar']\n---\ntest: Integers as Map Keys\nbrief: >\n An integer can be used as dictionary key.\nyaml: |\n 1: one\n 2: two\n 3: three\nphp: |\n [\n 1 => 'one',\n 2 => 'two',\n 3 => 'three'\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\numericMappingKeys.yml | numericMappingKeys.yml | YAML | 370 | 0.7 | 0 | 0 | react-lib | 759 | 2024-04-06T12:44:53.767421 | GPL-3.0 | true | 47e975302d4f91c208bc58c4266f9caf |
--- %YAML:1.0\ntest: Comments at the end of a line\nbrief: >\n Comments at the end of a line\nyaml: |\n ex1: "foo # bar"\n ex2: "foo # bar" # comment\n ex3: 'foo # bar' # comment\n ex4: foo # comment\n ex5: foo # comment with tab before \n ex6: foo#foo # comment here\n ex7: foo # ignore me # and me\nphp: |\n ['ex1' => 'foo # bar', 'ex2' => 'foo # bar', 'ex3' => 'foo # bar', 'ex4' => 'foo', 'ex5' => 'foo', 'ex6' => 'foo#foo', 'ex7' => 'foo']\n---\ntest: Comments in the middle\nbrief: >\n Comments in the middle\nyaml: |\n foo:\n # some comment\n # some comment\n bar: foo\n # some comment\n # some comment\nphp: |\n ['foo' => ['bar' => 'foo']]\n---\ntest: Comments on a hash line\nbrief: >\n Comments on a hash line\nyaml: |\n foo: # a comment\n foo: bar # a comment\nphp: |\n ['foo' => ['foo' => 'bar']]\n---\ntest: 'Value starting with a #'\nbrief: >\n 'Value starting with a #'\nyaml: |\n foo: '#bar'\nphp: |\n ['foo' => '#bar']\n---\ntest: Document starting with a comment and a separator\nbrief: >\n Commenting before document start is allowed\nyaml: |\n # document comment\n ---\n foo: bar # a comment\nphp: |\n ['foo' => 'bar']\n---\ntest: Comment containing a colon on a hash line\nbrief: >\n Comment containing a colon on a scalar line\nyaml: 'foo # comment: this is also part of the comment'\nphp: |\n 'foo'\n---\ntest: 'Hash key containing a #'\nbrief: >\n 'Hash key containing a #'\nyaml: 'foo#bar: baz'\nphp: |\n ['foo#bar' => 'baz']\n---\ntest: 'Hash key ending with a space and a #'\nbrief: >\n 'Hash key ending with a space and a #'\nyaml: |\n 'foo #': baz\nphp: |\n ['foo #' => 'baz']\n---\ntest: Comment before first item in unindented collection\nbrief: >\n Comment directly before unindented collection is allowed\nyaml: |\n collection1:\n # comment\n - a\n - b\n collection2:\n - a\n - b\nphp: |\n ['collection1' => ['a', 'b'], 'collection2' => ['a', 'b']]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\sfComments.yml | sfComments.yml | YAML | 1,930 | 0.8 | 0 | 0.066667 | node-utils | 543 | 2024-12-09T15:59:09.929065 | Apache-2.0 | true | 40a50cf14b5503f6c5cea256253b5231 |
--- %YAML:1.0\ntest: Compact notation\nbrief: |\n Compact notation for sets of mappings with single element\nyaml: |\n ---\n # products purchased\n - item : Super Hoop\n - item : Basketball\n quantity: 1\n - item:\n name: Big Shoes\n nick: Biggies\n quantity: 1\nphp: |\n [\n [\n 'item' => 'Super Hoop',\n ],\n [\n 'item' => 'Basketball',\n 'quantity' => 1,\n ],\n [\n 'item' => [\n 'name' => 'Big Shoes',\n 'nick' => 'Biggies'\n ],\n 'quantity' => 1\n ]\n ]\n---\ntest: Compact notation combined with inline notation\nbrief: |\n Combinations of compact and inline notation are allowed\nyaml: |\n ---\n items:\n - { item: Super Hoop, quantity: 1 }\n - [ Basketball, Big Shoes ]\nphp: |\n [\n 'items' => [\n [\n 'item' => 'Super Hoop',\n 'quantity' => 1,\n ],\n [\n 'Basketball',\n 'Big Shoes'\n ]\n ]\n ]\n--- %YAML:1.0\ntest: Compact notation\nbrief: |\n Compact notation for sets of mappings with single element\nyaml: |\n ---\n # products purchased\n - item : Super Hoop\n - item : Basketball\n quantity: 1\n - item:\n name: Big Shoes\n nick: Biggies\n quantity: 1\nphp: |\n [\n [\n 'item' => 'Super Hoop',\n ],\n [\n 'item' => 'Basketball',\n 'quantity' => 1,\n ],\n [\n 'item' => [\n 'name' => 'Big Shoes',\n 'nick' => 'Biggies'\n ],\n 'quantity' => 1\n ]\n ]\n---\ntest: Compact notation combined with inline notation\nbrief: |\n Combinations of compact and inline notation are allowed\nyaml: |\n ---\n items:\n - { item: Super Hoop, quantity: 1 }\n - [ Basketball, Big Shoes ]\nphp: |\n [\n 'items' => [\n [\n 'item' => 'Super Hoop',\n 'quantity' => 1,\n ],\n [\n 'Basketball',\n 'Big Shoes'\n ]\n ]\n ]\n--- %YAML:1.0\ntest: Compact notation\nbrief: |\n Compact notation for sets of mappings with single element\nyaml: |\n ---\n # products purchased\n - item : Super Hoop\n - item : Basketball\n quantity: 1\n - item:\n name: Big Shoes\n nick: Biggies\n quantity: 1\nphp: |\n [\n [\n 'item' => 'Super Hoop',\n ],\n [\n 'item' => 'Basketball',\n 'quantity' => 1,\n ],\n [\n 'item' => [\n 'name' => 'Big Shoes',\n 'nick' => 'Biggies'\n ],\n 'quantity' => 1\n ]\n ]\n---\ntest: Compact notation combined with inline notation\nbrief: |\n Combinations of compact and inline notation are allowed\nyaml: |\n ---\n items:\n - { item: Super Hoop, quantity: 1 }\n - [ Basketball, Big Shoes ]\nphp: |\n [\n 'items' => [\n [\n 'item' => 'Super Hoop',\n 'quantity' => 1,\n ],\n [\n 'Basketball',\n 'Big Shoes'\n ]\n ]\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\sfCompact.yml | sfCompact.yml | YAML | 2,746 | 0.8 | 0.018868 | 0.018868 | react-lib | 803 | 2025-02-21T21:27:13.396608 | GPL-3.0 | true | e67e1ba92c5c3b590e1440e9faf7c874 |
--- %YAML:1.0\ntest: Simple In Place Substitution\nbrief: >\n If you want to reuse an entire alias, only overwriting what is different\n you can use a << in place substitution. This is not part of the official\n YAML spec, but a widely implemented extension. See the following URL for\n details: http://yaml.org/type/merge.html\nyaml: |\n foo: &foo\n a: Steve\n b: Clark\n c: Brian\n e: notnull\n bar:\n a: before\n d: other\n e: ~\n <<: *foo\n b: new\n x: Oren\n c:\n foo: bar\n bar: foo\n bar_inline: {a: before, d: other, <<: *foo, b: new, x: Oren, c: { foo: bar, bar: foo}}\n foo2: &foo2\n a: Ballmer\n ding: &dong [ fi, fei, fo, fam]\n check:\n <<:\n - *foo\n - *dong\n isit: tested\n head:\n <<: [ *foo , *dong , *foo2 ]\n taz: &taz\n a: Steve\n w:\n p: 1234\n nested:\n <<: *taz\n d: Doug\n w: &nestedref\n p: 12345\n z:\n <<: *nestedref\n head_inline: &head_inline { <<: [ *foo , *dong , *foo2 ] }\n recursive_inline: { <<: *head_inline, c: { <<: *foo2 } }\nphp: |\n [\n 'foo' => ['a' => 'Steve', 'b' => 'Clark', 'c' => 'Brian', 'e' => 'notnull'],\n 'bar' => ['a' => 'before', 'd' => 'other', 'e' => null, 'b' => 'new', 'c' => ['foo' => 'bar', 'bar' => 'foo'], 'x' => 'Oren'],\n 'bar_inline' => ['a' => 'before', 'd' => 'other', 'b' => 'new', 'c' => ['foo' => 'bar', 'bar' => 'foo'], 'e' => 'notnull', 'x' => 'Oren'],\n 'foo2' => ['a' => 'Ballmer'],\n 'ding' => ['fi', 'fei', 'fo', 'fam'],\n 'check' => ['a' => 'Steve', 'b' => 'Clark', 'c' => 'Brian', 'e' => 'notnull', 'fi', 'fei', 'fo', 'fam', 'isit' => 'tested'],\n 'head' => ['a' => 'Steve', 'b' => 'Clark', 'c' => 'Brian', 'e' => 'notnull', 'fi', 'fei', 'fo', 'fam'],\n 'taz' => ['a' => 'Steve', 'w' => ['p' => 1234]],\n 'nested' => ['a' => 'Steve', 'w' => ['p' => 12345], 'd' => 'Doug', 'z' => ['p' => 12345]],\n 'head_inline' => ['a' => 'Steve', 'b' => 'Clark', 'c' => 'Brian', 'e' => 'notnull', 'fi', 'fei', 'fo', 'fam'],\n 'recursive_inline' => ['a' => 'Steve', 'b' => 'Clark', 'c' => ['a' => 'Ballmer'], 'e' => 'notnull', 'fi', 'fei', 'fo', 'fam'],\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\sfMergeKey.yml | sfMergeKey.yml | YAML | 2,313 | 0.8 | 0.016393 | 0 | vue-tools | 885 | 2024-09-18T08:15:11.101894 | Apache-2.0 | true | 0c252f843c46634dc2349bd0f0d62c61 |
--- %YAML:1.0\ntest: Objects\nbrief: >\n Comments at the end of a line\nyaml: |\n ex1: "foo # bar"\n ex2: "foo # bar" # comment\n ex3: 'foo # bar' # comment\n ex4: foo # comment\nphp: |\n ['ex1' => 'foo # bar', 'ex2' => 'foo # bar', 'ex3' => 'foo # bar', 'ex4' => 'foo']\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\sfObjects.yml | sfObjects.yml | YAML | 279 | 0.8 | 0 | 0 | python-kit | 188 | 2024-04-06T22:31:04.183566 | Apache-2.0 | true | 1a972f1753f3d77a10cea06fb2e733b0 |
--- %YAML:1.0\ntest: Some characters at the beginning of a string must be escaped\nbrief: >\n Some characters at the beginning of a string must be escaped\nyaml: |\n foo: '| bar'\nphp: |\n ['foo' => '| bar']\n---\ntest: A key can be a quoted string\nbrief: >\n A key can be a quoted string\nyaml: |\n "foo1": bar\n 'foo2': bar\n "foo \" bar": bar\n 'foo '' bar': bar\n 'foo3: ': bar\n "foo4: ": bar\n foo5: { "foo \" bar: ": bar, 'foo '' bar: ': bar }\nphp: |\n [\n 'foo1' => 'bar',\n 'foo2' => 'bar',\n 'foo " bar' => 'bar',\n 'foo \' bar' => 'bar',\n 'foo3: ' => 'bar',\n 'foo4: ' => 'bar',\n 'foo5' => [\n 'foo " bar: ' => 'bar',\n 'foo \' bar: ' => 'bar',\n ],\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\sfQuotes.yml | sfQuotes.yml | YAML | 728 | 0.7 | 0 | 0 | react-lib | 148 | 2025-05-09T19:04:04.549718 | BSD-3-Clause | true | 052e4bad52c632cabb3540e5086a5f73 |
--- %YAML:1.0\ntest: Unindented collection\nbrief: >\n Unindented collection\nyaml: |\n collection:\n - item1\n - item2\n - item3\nphp: |\n ['collection' => ['item1', 'item2', 'item3']]\n---\ntest: Nested unindented collection (two levels)\nbrief: >\n Nested unindented collection\nyaml: |\n collection:\n key:\n - a\n - b\n - c\nphp: |\n ['collection' => ['key' => ['a', 'b', 'c']]]\n---\ntest: Nested unindented collection (three levels)\nbrief: >\n Nested unindented collection\nyaml: |\n collection:\n key:\n subkey:\n - one\n - two\n - three\nphp: |\n ['collection' => ['key' => ['subkey' => ['one', 'two', 'three']]]]\n---\ntest: Key/value after unindented collection (1)\nbrief: >\n Key/value after unindented collection (1)\nyaml: |\n collection:\n key:\n - a\n - b\n - c\n foo: bar\nphp: |\n ['collection' => ['key' => ['a', 'b', 'c']], 'foo' => 'bar']\n---\ntest: Key/value after unindented collection (at the same level)\nbrief: >\n Key/value after unindented collection\nyaml: |\n collection:\n key:\n - a\n - b\n - c\n foo: bar\nphp: |\n ['collection' => ['key' => ['a', 'b', 'c'], 'foo' => 'bar']]\n---\ntest: Shortcut Key after unindented collection\nbrief: >\n Key/value after unindented collection\nyaml: |\n collection:\n - key: foo\n foo: bar\nphp: |\n ['collection' => [['key' => 'foo', 'foo' => 'bar']]]\n---\ntest: Shortcut Key after unindented collection with custom spaces\nbrief: >\n Key/value after unindented collection\nyaml: |\n collection:\n - key: foo\n foo: bar\nphp: |\n ['collection' => [['key' => 'foo', 'foo' => 'bar']]]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\unindentedCollections.yml | unindentedCollections.yml | YAML | 1,711 | 0.7 | 0 | 0 | python-kit | 479 | 2024-11-08T02:17:36.471412 | MIT | true | 39f698b8badcf24ef84f05d9cd5f1cfc |
--- %YAML:1.0\ntest: Simple Alias Example\nbrief: >\n If you need to refer to the same item of data twice,\n you can give that item an alias. The alias is a plain\n string, starting with an ampersand. The item may then\n be referred to by the alias throughout your document\n by using an asterisk before the name of the alias.\n This is called an anchor.\nyaml: |\n - &showell Steve\n - Clark\n - Brian\n - Oren\n - *showell\nphp: |\n ['Steve', 'Clark', 'Brian', 'Oren', 'Steve']\n\n---\ntest: Alias of a Mapping\nbrief: >\n An alias can be used on any item of data, including\n sequences, mappings, and other complex data types.\nyaml: |\n - &hello\n Meat: pork\n Starch: potato\n - banana\n - *hello\nphp: |\n [['Meat'=>'pork', 'Starch'=>'potato'], 'banana', ['Meat'=>'pork', 'Starch'=>'potato']]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\YtsAnchorAlias.yml | YtsAnchorAlias.yml | YAML | 839 | 0.7 | 0 | 0 | python-kit | 437 | 2023-12-28T18:54:30.427741 | Apache-2.0 | true | 41c47d33cab7c84954aa3df39fa59a03 |
---\ntest: One Element Mapping\nbrief: |\n A mapping with one key/value pair\nyaml: |\n foo: bar\nphp: |\n ['foo' => 'bar']\n---\ntest: Multi Element Mapping\nbrief: |\n More than one key/value pair\nyaml: |\n red: baron\n white: walls\n blue: berries\nphp: |\n [\n 'red' => 'baron',\n 'white' => 'walls',\n 'blue' => 'berries',\n ]\n---\ntest: Values aligned\nbrief: |\n Often times human editors of documents will align the values even\n though YAML emitters generally don't.\nyaml: |\n red: baron\n white: walls\n blue: berries\nphp: |\n [\n 'red' => 'baron',\n 'white' => 'walls',\n 'blue' => 'berries',\n ]\n---\ntest: Colons aligned\nbrief: |\n Spaces can come before the ': ' key/value separator.\nyaml: |\n red : baron\n white : walls\n blue : berries\nphp: |\n [\n 'red' => 'baron',\n 'white' => 'walls',\n 'blue' => 'berries',\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\YtsBlockMapping.yml | YtsBlockMapping.yml | YAML | 899 | 0.7 | 0 | 0 | vue-tools | 632 | 2023-12-13T04:35:05.253665 | Apache-2.0 | true | 1988c14b94f384746ef607b7e7fee0ed |
--- %YAML:1.0\ntest: Trailing Document Separator\ntodo: true\nbrief: >\n You can separate YAML documents\n with a string of three dashes.\nyaml: |\n - foo: 1\n bar: 2\n ---\n more: stuff\npython: |\n [\n [ { 'foo': 1, 'bar': 2 } ],\n { 'more': 'stuff' }\n ]\nruby: |\n [ { 'foo' => 1, 'bar' => 2 } ]\n\n---\ntest: Leading Document Separator\ntodo: true\nbrief: >\n You can explicitly give an opening\n document separator to your YAML stream.\nyaml: |\n ---\n - foo: 1\n bar: 2\n ---\n more: stuff\npython: |\n [\n [ {'foo': 1, 'bar': 2}],\n {'more': 'stuff'}\n ]\nruby: |\n [ { 'foo' => 1, 'bar' => 2 } ]\n\n---\ntest: YAML Header\ntodo: true\nbrief: >\n The opening separator can contain directives\n to the YAML parser, such as the version\n number.\nyaml: |\n --- %YAML:1.0\n foo: 1\n bar: 2\nphp: |\n ['foo' => 1, 'bar' => 2]\ndocuments: 1\n\n---\ntest: Red Herring Document Separator\nbrief: >\n Separators included in blocks or strings\n are treated as blocks or strings, as the\n document separator should have no indentation\n preceding it.\nyaml: |\n foo: |\n ---\nphp: |\n ['foo' => "---\n"]\n\n---\ntest: Multiple Document Separators in Block\nbrief: >\n This technique allows you to embed other YAML\n documents within literal blocks.\nyaml: |\n foo: |\n ---\n foo: bar\n ---\n yo: baz\n bar: |\n fooness\nphp: |\n [\n 'foo' => "---\nfoo: bar\n---\nyo: baz\n",\n 'bar' => "fooness\n"\n ]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\YtsDocumentSeparator.yml | YtsDocumentSeparator.yml | YAML | 1,516 | 0.7 | 0 | 0 | python-kit | 922 | 2024-07-10T19:32:29.636055 | Apache-2.0 | true | f6613ca25c2af3874ad4117fe664eb59 |
---\ntest: Simple Inline Array\nbrief: >\n Sequences can be contained on a\n single line, using the inline syntax.\n Separate each entry with commas and\n enclose in square brackets.\nyaml: |\n seq: [ a, b, c ]\nphp: |\n ['seq' => ['a', 'b', 'c']]\n---\ntest: Simple Inline Hash\nbrief: >\n Mapping can also be contained on\n a single line, using the inline\n syntax. Each key-value pair is\n separated by a colon, with a comma\n between each entry in the mapping.\n Enclose with curly braces.\nyaml: |\n hash: { name: Steve, foo: bar }\nphp: |\n ['hash' => ['name' => 'Steve', 'foo' => 'bar']]\n---\ntest: Multi-line Inline Collections\ntodo: true\nbrief: >\n Both inline sequences and inline mappings\n can span multiple lines, provided that you\n indent the additional lines.\nyaml: |\n languages: [ Ruby,\n Perl,\n Python ]\n websites: { YAML: yaml.org,\n Ruby: ruby-lang.org,\n Python: python.org,\n Perl: use.perl.org }\nphp: |\n [\n 'languages' => ['Ruby', 'Perl', 'Python'],\n 'websites' => [\n 'YAML' => 'yaml.org',\n 'Ruby' => 'ruby-lang.org',\n 'Python' => 'python.org',\n 'Perl' => 'use.perl.org'\n ]\n ]\n---\ntest: Commas in Values (not in the spec!)\ntodo: true\nbrief: >\n List items in collections are delimited by commas, but\n there must be a space after each comma. This allows you\n to add numbers without quoting.\nyaml: |\n attendances: [ 45,123, 70,000, 17,222 ]\nphp: |\n ['attendances' => [45123, 70000, 17222]]\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\YtsFlowCollections.yml | YtsFlowCollections.yml | YAML | 1,579 | 0.7 | 0 | 0 | node-utils | 494 | 2025-02-24T21:43:47.421225 | BSD-3-Clause | true | 174b18d078296f3f0d7eb9f961e22ef0 |
--- %YAML:1.0\ntest: Empty Sequence\nbrief: >\n You can represent the empty sequence\n with an empty inline sequence.\nyaml: |\n empty: []\nphp: |\n ['empty' => []]\n---\ntest: Empty Mapping\nbrief: >\n You can represent the empty mapping\n with an empty inline mapping.\nyaml: |\n empty: {}\nphp: |\n ['empty' => []]\n---\ntest: Empty Sequence as Entire Document\nyaml: |\n []\nphp: |\n []\n---\ntest: Empty Mapping as Entire Document\nyaml: |\n {}\nphp: |\n []\n---\ntest: Null as Document\nyaml: |\n ~\nphp: |\n null\n---\ntest: Empty String\nbrief: >\n You can represent an empty string\n with a pair of quotes.\nyaml: |\n ''\nphp: |\n ''\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\YtsNullsAndEmpties.yml | YtsNullsAndEmpties.yml | YAML | 653 | 0.7 | 0 | 0 | react-lib | 273 | 2025-01-09T02:13:22.507599 | BSD-3-Clause | true | 530c936ebe8524f5f847fc54950bc2f8 |
--- %YAML:1.0\ntest: Strings\nbrief: >\n Any group of characters beginning with an\n alphabetic or numeric character is a string,\n unless it belongs to one of the groups below\n (such as an Integer or Time).\nyaml: |\n String\nphp: |\n 'String'\n---\ntest: String characters\nbrief: >\n A string can contain any alphabetic or\n numeric character, along with many\n punctuation characters, including the\n period, dash, space, quotes, exclamation, and\n question mark.\nyaml: |\n - What's Yaml?\n - It's for writing data structures in plain text.\n - And?\n - And what? That's not good enough for you?\n - No, I mean, "And what about Yaml?"\n - Oh, oh yeah. Uh.. Yaml for Ruby.\nphp: |\n [\n "What's Yaml?",\n "It's for writing data structures in plain text.",\n "And?",\n "And what? That's not good enough for you?",\n "No, I mean, \"And what about Yaml?\"",\n "Oh, oh yeah. Uh.. Yaml for Ruby."\n ]\n---\ntest: Indicators in Strings\nbrief: >\n Be careful using indicators in strings. In particular,\n the comma, colon, and pound sign must be used carefully.\nyaml: |\n the colon followed by space is an indicator: but is a string:right here\n same for the pound sign: here we have it#in a string\n the comma can, honestly, be used in most cases: [ but not in, inline collections ]\nphp: |\n [\n 'the colon followed by space is an indicator' => 'but is a string:right here',\n 'same for the pound sign' => 'here we have it#in a string',\n 'the comma can, honestly, be used in most cases' => ['but not in', 'inline collections']\n ]\n---\ntest: Forcing Strings\nbrief: >\n Any YAML type can be forced into a string using the\n explicit !!str method.\nyaml: |\n date string: !!str 2001-08-01\n number string: !!str 192\nphp: |\n [\n 'date string' => '2001-08-01',\n 'number string' => '192'\n ]\n---\ntest: Single-quoted Strings\nbrief: >\n You can also enclose your strings within single quotes,\n which allows use of slashes, colons, and other indicators\n freely. Inside single quotes, you can represent a single\n quote in your string by using two single quotes next to\n each other.\nyaml: |\n all my favorite symbols: '#:!/%.)'\n a few i hate: '&(*'\n why do i hate them?: 'it''s very hard to explain'\n entities: '£ me'\nphp: |\n [\n 'all my favorite symbols' => '#:!/%.)',\n 'a few i hate' => '&(*',\n 'why do i hate them?' => 'it\'s very hard to explain',\n 'entities' => '£ me'\n ]\n---\ntest: Double-quoted Strings\nbrief: >\n Enclosing strings in double quotes allows you\n to use escapings to represent ASCII and\n Unicode characters.\nyaml: |\n i know where i want my line breaks: "one here\nand another here\n"\nphp: |\n [\n 'i know where i want my line breaks' => "one here\nand another here\n"\n ]\n---\ntest: Multi-line Quoted Strings\ntodo: true\nbrief: >\n Both single- and double-quoted strings may be\n carried on to new lines in your YAML document.\n They must be indented a step and indentation\n is interpreted as a single space.\nyaml: |\n i want a long string: "so i'm going to\n let it go on and on to other lines\n until i end it with a quote."\nphp: |\n ['i want a long string' => "so i'm going to ".\n "let it go on and on to other lines ".\n "until i end it with a quote."\n ]\n\n---\ntest: Plain scalars\ntodo: true\nbrief: >\n Unquoted strings may also span multiple lines, if they\n are free of YAML space indicators and indented.\nyaml: |\n - My little toe is broken in two places;\n - I'm crazy to have skied this way;\n - I'm not the craziest he's seen, since there was always the German guy\n who skied for 3 hours on a broken shin bone (just below the kneecap);\n - Nevertheless, second place is respectable, and he doesn't\n recommend going for the record;\n - He's going to put my foot in plaster for a month;\n - This would impair my skiing ability somewhat for the\n duration, as can be imagined.\nphp: |\n [\n "My little toe is broken in two places;",\n "I'm crazy to have skied this way;",\n "I'm not the craziest he's seen, since there was always ".\n "the German guy who skied for 3 hours on a broken shin ".\n "bone (just below the kneecap);",\n "Nevertheless, second place is respectable, and he doesn't ".\n "recommend going for the record;",\n "He's going to put my foot in plaster for a month;",\n "This would impair my skiing ability somewhat for the duration, ".\n "as can be imagined."\n ]\n---\ntest: 'Null'\nbrief: >\n You can use the tilde '~' character for a null value.\nyaml: |\n name: Mr. Show\n hosted by: Bob and David\n date of next season: ~\nphp: |\n [\n 'name' => 'Mr. Show',\n 'hosted by' => 'Bob and David',\n 'date of next season' => null\n ]\n---\ntest: Boolean\nbrief: >\n You can use 'true' and 'false' for Boolean values.\nyaml: |\n Is Gus a Liar?: true\n Do I rely on Gus for Sustenance?: false\nphp: |\n [\n 'Is Gus a Liar?' => true,\n 'Do I rely on Gus for Sustenance?' => false\n ]\n---\ntest: Integers\ndump_skip: true\nbrief: >\n An integer is a series of numbers, optionally\n starting with a positive or negative sign. Integers\n may also contain commas for readability.\nyaml: |\n zero: 0\n simple: 12\nphp: |\n [\n 'zero' => 0,\n 'simple' => 12,\n ]\n---\ntest: Positive Big Integer\ndeprecated: true\ndump_skip: true\nbrief: >\n An integer is a series of numbers, optionally\n starting with a positive or negative sign. Integers\n may also contain commas for readability.\nyaml: |\n one-thousand: 1,000\nphp: |\n [\n 'one-thousand' => 1000.0,\n ]\n---\ntest: Negative Big Integer\ndeprecated: true\ndump_skip: true\nbrief: >\n An integer is a series of numbers, optionally\n starting with a positive or negative sign. Integers\n may also contain commas for readability.\nyaml: |\n negative one-thousand: -1,000\nphp: |\n [\n 'negative one-thousand' => -1000.0\n ]\n---\ntest: Floats\ndump_skip: true\nbrief: >\n Floats are represented by numbers with decimals,\n allowing for scientific notation, as well as\n positive and negative infinity and "not a number."\nyaml: |\n a simple float: 2.00\n scientific notation: 1.00009e+3\nphp: |\n [\n 'a simple float' => 2.0,\n 'scientific notation' => 1000.09\n ]\n---\ntest: Larger Float\ndump_skip: true\ndeprecated: true\nbrief: >\n Floats are represented by numbers with decimals,\n allowing for scientific notation, as well as\n positive and negative infinity and "not a number."\nyaml: |\n larger float: 1,000.09\nphp: |\n [\n 'larger float' => 1000.09,\n ]\n---\ntest: Time\ntodo: true\nbrief: >\n You can represent timestamps by using\n ISO8601 format, or a variation which\n allows spaces between the date, time and\n time zone.\nyaml: |\n iso8601: 2001-12-14t21:59:43.10-05:00\n space separated: 2001-12-14 21:59:43.10 -05:00\nphp: |\n [\n 'iso8601' => mktime( 2001, 12, 14, 21, 59, 43, 0.10, "-05:00" ),\n 'space separated' => mktime( 2001, 12, 14, 21, 59, 43, 0.10, "-05:00" )\n ]\n---\ntest: Date\ntodo: true\nbrief: >\n A date can be represented by its year,\n month and day in ISO8601 order.\nyaml: |\n 1976-07-31\nphp: |\n date( 1976, 7, 31 )\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\symfony\yaml\Tests\Fixtures\YtsTypeTransfers.yml | YtsTypeTransfers.yml | YAML | 7,371 | 0.8 | 0.097744 | 0 | python-kit | 117 | 2023-12-25T01:36:08.933218 | MIT | true | f23cc9a4a8eb94571ac2d98e31c32743 |
language: php\n\ndist: trusty\n\nsudo: false\n\ncache:\n directories:\n - vendor\n - $HOME/.composer/cache/files\n\n\nenv:\n global:\n - TWIG_EXT=no\n - SYMFONY_PHPUNIT_REMOVE_RETURN_TYPEHINT=1\n\nbefore_install:\n - phpenv config-rm xdebug.ini || return 0\n\ninstall:\n - travis_retry composer install\n\nbefore_script:\n - if [ "$TWIG_EXT" == "yes" ]; then sh -c "cd ext/twig && phpize && ./configure --enable-twig && make && make install"; fi\n - if [ "$TWIG_EXT" == "yes" ]; then echo "extension=twig.so" >> `php --ini | grep "Loaded Configuration" | sed -e "s|.*:\s*||"`; fi\n\nscript: ./vendor/bin/simple-phpunit\n\njobs:\n fast_finish: true\n include:\n - php: 5.5\n - php: 5.5\n env: TWIG_EXT=yes\n - php: 5.6\n - php: 5.6\n env: TWIG_EXT=yes\n - php: 7.0\n - php: 7.1\n - php: 7.2\n - php: 7.3\n - php: 7.4snapshot\n - stage: integration tests\n php: 7.3\n script: ./drupal_test.sh\n | dataset_sample\yaml\Qloapps_QloApps\modules\qloautoupgrade\vendor\twig\twig\.travis.yml | .travis.yml | YAML | 1,004 | 0.7 | 0.043478 | 0 | node-utils | 631 | 2024-05-14T22:11:19.959608 | MIT | false | 1e7c0c1a85ea7b4efe909fa64374dcb4 |
language: csharp\nmono: none\ndotnet: 5.0\nos: linux\ndist: focal\nbefore_install:\n - export PATH="$HOME/miniconda3/bin:$PATH"\n - export PYTHONNET_PYDLL="$HOME/miniconda3/lib/libpython3.6m.so"\n - wget -q https://cdn.quantconnect.com/miniconda/Miniconda3-4.5.12-Linux-x86_64.sh\n - bash Miniconda3-4.5.12-Linux-x86_64.sh -b\n - rm -rf Miniconda3-4.5.12-Linux-x86_64.sh\n - sudo ln -s $HOME/miniconda3/lib/libpython3.6m.so /usr/lib/libpython3.6m.so\n - conda update -y python conda pip\n - conda install -y python=3.6.8\n - conda install -y numpy=1.18.1\n - conda install -y pandas=0.25.3\n - conda install -y cython=0.29.15\n - conda install -y scipy=1.4.1\n - conda install -y wrapt=1.12.1\nscript:\n - dotnet nuget add source $TRAVIS_BUILD_DIR/LocalPackages\n - dotnet build /p:Configuration=Release /v:quiet /p:WarningLevel=1 QuantConnect.Lean.sln\n - dotnet test ./Tests/bin/Release/QuantConnect.Tests.dll --filter TestCategory!=TravisExclude -- TestRunParameters.Parameter\(name=\"log-handler\", value=\"ConsoleErrorLogHandler\"\) | dataset_sample\yaml\QuantConnect_Lean\.travis.yml | .travis.yml | YAML | 1,031 | 0.8 | 0 | 0 | react-lib | 776 | 2023-07-26T19:20:23.862441 | BSD-3-Clause | false | 02b17c675087491bc9a15ed589f4b3f2 |
name: Build, Test And Create Nuget Package\n\non:\n push:\n branches: [ main ]\n workflow_dispatch:\n\njobs:\n main:\n name: ${{ matrix.runtime.name }}\n runs-on: ${{ matrix.runtime.runs-on }}\n container: ${{ matrix.runtime.container }}\n timeout-minutes: 15\n \n strategy:\n fail-fast: false\n matrix:\n runtime: \n - name: win-x64\n runs-on: windows-latest-xlarge\n - name: win-x86\n runs-on: windows-latest-xlarge\n - name: linux-x64\n runs-on: ubuntu-latest-xlarge\n container: ubuntu:24.04\n - name: linux-arm64\n runs-on: ubuntu-latest-xlarge-arm64\n container: ubuntu:24.04\n - name: linux-musl-x64\n runs-on: ubuntu-latest-xlarge\n container: alpine:3.20\n - name: osx-x64\n runs-on: macos-latest-large\n - name: osx-arm64\n runs-on: macos-latest-xlarge\n\n steps:\n - name: Checkout sources\n uses: actions/checkout@v4\n\n\n - name: Install Build Tools (Linux)\n if: matrix.runtime.name == 'linux-x64' || matrix.runtime.name == 'linux-arm64'\n shell: sh\n run: |\n apt update --yes\n apt upgrade --yes\n\n # required by actions/setup-dotnet\n apt install bash wget --yes\n\n\n - name: Install Build Tools (Alpine)\n if: matrix.runtime.name == 'linux-musl-x64'\n shell: sh\n run: |\n apk update\n apk upgrade\n\n # required by actions/setup-dotnet\n apk add bash wget\n\n # required by dotnet build command\n apk add libstdc++ libgcc\n\n\n - name: Setup dotnet\n uses: actions/setup-dotnet@v3\n with:\n dotnet-version: '8.0.x'\n\n\n - name: Build and test solution\n shell: bash\n working-directory: ./Source\n env:\n TEST_SHOW_RESULTS: false\n DOTNET_SYSTEM_GLOBALIZATION_INVARIANT: 1\n run: |\n dotnet build --configuration Release --property WarningLevel=0\n dotnet test QuestPDF.UnitTests --configuration Release --runtime ${{ matrix.runtime.name }}\n dotnet test QuestPDF.LayoutTests --configuration Release --runtime ${{ matrix.runtime.name }}\n dotnet test QuestPDF.DocumentationExamples --configuration Release --runtime ${{ matrix.runtime.name }}\n dotnet test 
QuestPDF.ReportSample --configuration Release --runtime ${{ matrix.runtime.name }} --framework net8.0\n\n if [ "${{ matrix.runtime.name }}" != "linux-musl-x64" ]; then\n dotnet test QuestPDF.ZUGFeRD --configuration Release --runtime ${{ matrix.runtime.name }} --framework net8.0\n else\n echo "Skipping QuestPDF.ZUGFeRD tests on linux-musl-x64"\n fi\n\n dotnet build QuestPDF/QuestPDF.csproj --configuration Release --property WarningLevel=0 --property BUILD_PACKAGE=true\n\n TEST_EXECUTION_PATH='QuestPDF.ReportSample/bin/Release/net8.0/${{ matrix.runtime.name }}'\n mkdir -p testOutput/${{ matrix.runtime.name }} \n cp -r $TEST_EXECUTION_PATH/report.pdf testOutput/${{ matrix.runtime.name }} \n\n\n - name: Upload test results\n uses: actions/upload-artifact@v4\n with:\n name: questpdf-test-results-${{ matrix.runtime.name }}\n path: |\n **/*.pdf\n\n \n - name: Upload nuget artifacts\n uses: actions/upload-artifact@v4\n if: ${{ matrix.runtime.name == 'win-x64' }}\n with:\n name: questpdf-nuget-package\n path: |\n **/*.nupkg\n **/*.snupkg\n !.nuget\n\n merge:\n runs-on: ubuntu-latest\n needs: main\n steps:\n - name: Merge Artifacts\n uses: actions/upload-artifact/merge@v4\n with:\n name: questpdf-test-results\n pattern: questpdf-test-results-*\n delete-merged: true\n | dataset_sample\yaml\QuestPDF_QuestPDF\.github\workflows\main.yml | main.yml | YAML | 3,862 | 0.95 | 0.031746 | 0.058252 | react-lib | 500 | 2025-07-01T18:56:43.702051 | Apache-2.0 | false | 394d23b77bbb2f29b4bb22fa33416fe2 |
# By default, this docker compose script maps all services to localhost only.\n# If you need to make services available outside of your machine, add\n# appropriate service mappings to the .env file. See the .env.example file for\n# a configuration example.\n#\n# Notes on image versions:\n# - For the key services such as postgres and pulsar we are trying to run\n# against the oldest supported version\n# - For zookeeper and kafka we are trying to use the oldest supported\n# version that has arm64 images\n# - For everything else we are trying to run against the latest version.\n#\n# To run against the latest image versions, update the .env file. See the\n# .env.example file for configuration examples. You might first need to remove\n# old images that are already tagged latest, and remove volumes whose content\n# is incompatible with the latest version, as in the case of postgres.\n\nname: quickwit\n\nnetworks:\n default:\n name: quickwit-network\n ipam:\n config:\n - subnet: 172.16.7.0/24\n gateway: 172.16.7.1\n\nservices:\n localstack:\n image: localstack/localstack:${LOCALSTACK_VERSION:-3.5.0}\n container_name: localstack\n ports:\n - "${MAP_HOST_LOCALSTACK:-127.0.0.1}:4566:4566"\n - "${MAP_HOST_LOCALSTACK:-127.0.0.1}:4571:4571"\n - "${MAP_HOST_LOCALSTACK:-127.0.0.1}:8080:8080"\n profiles:\n - all\n - localstack\n environment:\n SERVICES: kinesis,s3,sqs\n PERSISTENCE: 1\n volumes:\n - .localstack:/etc/localstack/init/ready.d\n - localstack_data:/var/lib/localstack\n healthcheck:\n test: ["CMD", "curl", "-k", "-f", "https://localhost:4566/quickwit-integration-tests"]\n interval: 1s\n timeout: 5s\n retries: 100\n\n postgres:\n # The oldest supported version. 
EOL November 14, 2024\n image: postgres:${POSTGRES_VERSION:-12.17-alpine}\n container_name: postgres\n ports:\n - "${MAP_HOST_POSTGRES:-127.0.0.1}:5432:5432"\n profiles:\n - all\n - postgres\n environment:\n PGDATA: /var/lib/postgresql/data/pgdata\n POSTGRES_USER: ${POSTGRES_USER:-quickwit-dev}\n POSTGRES_PASSWORD: ${POSTGRES_PASSWORD:-quickwit-dev}\n POSTGRES_DB: ${POSTGRES_DB:-quickwit-metastore-dev}\n volumes:\n - postgres_data:/var/lib/postgresql/data\n healthcheck:\n test: ["CMD", "pg_isready"]\n interval: 1s\n timeout: 5s\n retries: 100\n\n pulsar-broker:\n # The oldest version with arm64 docker images. EOL May 2 2025\n image: apachepulsar/pulsar:${PULSAR_VERSION:-3.0.0}\n container_name: pulsar-broker\n command: bin/pulsar standalone\n ports:\n - "${MAP_HOST_PULSAR:-127.0.0.1}:6650:6650"\n - "${MAP_HOST_PULSAR:-127.0.0.1}:8081:8080"\n environment:\n PULSAR_MEM: "-Xms384M -Xmx384M"\n profiles:\n - all\n - pulsar\n\n kafka-broker:\n # The oldest supported version with arm64 docker images. 
EOL October 27, 2023\n image: confluentinc/cp-kafka:${CP_VERSION:-7.0.9}\n container_name: kafka-broker\n depends_on:\n - zookeeper\n ports:\n - "${MAP_HOST_KAFKA:-127.0.0.1}:9092:9092"\n - "${MAP_HOST_KAFKA:-127.0.0.1}:9101:9101"\n - "${MAP_HOST_KAFKA:-127.0.0.1}:29092:29092"\n profiles:\n - all\n - kafka\n environment:\n KAFKA_BROKER_ID: 1\n KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"\n KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT\n KAFKA_ADVERTISED_LISTENERS: PLAINTEXT_HOST://localhost:9092,PLAINTEXT://kafka-broker:29092\n KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1\n KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0\n KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1\n KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1\n KAFKA_JMX_PORT: 9101\n KAFKA_JMX_HOSTNAME: localhost\n KAFKA_HEAP_OPTS: -Xms256M -Xmx256M\n healthcheck:\n test: ["CMD", "cub", "kafka-ready", "-b", "localhost:9092", "1", "5"]\n start_period: 5s\n interval: 5s\n timeout: 10s\n retries: 100\n\n zookeeper:\n # The oldest supported version with arm64 images. 
EOL October 27, 2023\n image: confluentinc/cp-zookeeper:${CP_VERSION:-7.0.9}\n container_name: zookeeper\n ports:\n - "${MAP_HOST_ZOOKEEPER:-127.0.0.1}:2181:2181"\n profiles:\n - all\n - kafka\n environment:\n KAFKA_HEAP_OPTS: -Xms256M -Xmx256M\n ZOOKEEPER_CLIENT_PORT: 2181\n ZOOKEEPER_TICK_TIME: 2000\n healthcheck:\n test: ["CMD", "cub", "zk-ready", "localhost:2181", "5"]\n start_period: 5s\n interval: 5s\n timeout: 10s\n retries: 100\n\n azurite:\n image: mcr.microsoft.com/azure-storage/azurite:${AZURITE_VERSION:-3.24.0}\n container_name: azurite\n ports:\n - "${MAP_HOST_AZURITE:-127.0.0.1}:10000:10000" # Blob store port\n profiles:\n - all\n - azurite\n volumes:\n - azurite_data:/data\n command: azurite --blobHost 0.0.0.0 --loose\n\n fake-gcs-server:\n image: fsouza/fake-gcs-server:${FAKE_GCS_SERVER_VERSION:-1.47.7}\n container_name: fake-gcs-server\n ports:\n - "${MAP_HOST_FAKE_GCS_SERVER:-127.0.0.1}:4443:4443" # Blob store port\n profiles:\n - all\n - fake-gcs-server\n volumes:\n - fake_gcs_server_data:/data/sample-bucket\n command: -scheme http\n\n grafana:\n image: grafana/grafana-oss:${GRAFANA_VERSION:-10.4.1}\n container_name: grafana\n ports:\n - "${MAP_HOST_GRAFANA:-127.0.0.1}:3000:3000"\n profiles:\n - grafana\n - monitoring\n environment:\n GF_AUTH_DISABLE_LOGIN_FORM: "true"\n GF_AUTH_ANONYMOUS_ENABLED: "true"\n GF_AUTH_ANONYMOUS_ORG_ROLE: Admin\n volumes:\n - grafana_conf:/etc/grafana\n - grafana_data:/var/lib/grafana\n - ./monitoring/grafana/dashboards:/var/lib/grafana/dashboards\n - ./monitoring/grafana/provisioning:/etc/grafana/provisioning\n\n jaeger:\n image: jaegertracing/all-in-one:${JAEGER_VERSION:-1.48.0}\n container_name: jaeger\n ports:\n - "${MAP_HOST_JAEGER:-127.0.0.1}:16686:16686" # Frontend\n profiles:\n - jaeger\n - monitoring\n\n otel-collector:\n image: otel/opentelemetry-collector:${OTEL_VERSION:-0.84.0}\n container_name: otel-collector\n ports:\n - "${MAP_HOST_OTEL:-127.0.0.1}:1888:1888" # pprof extension\n - 
"${MAP_HOST_OTEL:-127.0.0.1}:8888:8888" # Prometheus metrics exposed by the collector\n - "${MAP_HOST_OTEL:-127.0.0.1}:8889:8889" # Prometheus exporter metrics\n - "${MAP_HOST_OTEL:-127.0.0.1}:13133:13133" # health_check extension\n - "${MAP_HOST_OTEL:-127.0.0.1}:4317:4317" # OTLP gRPC receiver\n - "${MAP_HOST_OTEL:-127.0.0.1}:4318:4318" # OTLP http receiver\n - "${MAP_HOST_OTEL:-127.0.0.1}:55679:55679" # zpages extension\n profiles:\n - otel\n - monitoring\n volumes:\n - ./monitoring/otel-collector-config.yaml:/etc/otel-collector-config.yaml\n command: ["--config=/etc/otel-collector-config.yaml"]\n\n prometheus:\n image: prom/prometheus:${PROMETHEUS_VERSION:-v2.43.0}\n container_name: prometheus\n ports:\n - "${MAP_HOST_PROMETHEUS:-127.0.0.1}:9090:9090"\n profiles:\n - prometheus\n - monitoring\n volumes:\n - ./monitoring/prometheus.yaml:/etc/prometheus/prometheus.yml\n extra_hosts:\n - "host.docker.internal:host-gateway"\n\n gcp-pubsub-emulator:\n # This is not an official docker image.\n # If preferred, we could instead build an image from the official gcloud CLI image\n # and install the Pub/Sub emulator: https://cloud.google.com/pubsub/docs/emulator\n image: thekevjames/gcloud-pubsub-emulator:${GCLOUD_EMULATOR:-455.0.0}\n container_name: gcp-pubsub-emulator\n ports:\n - "${MAP_HOST_GCLOUD_EMULATOR:-127.0.0.1}:8681:8681"\n environment:\n # Create a fake GCP project with a topic and a subscription\n - PUBSUB_PROJECT1=quickwit-emulator,emulator_topic:emulator_subscription\n profiles:\n - all\n - gcp-pubsub\n\nvolumes:\n azurite_data:\n fake_gcs_server_data:\n grafana_conf:\n grafana_data:\n localstack_data:\n postgres_data:\n | dataset_sample\yaml\quickwit-oss_quickwit\docker-compose.yml | docker-compose.yml | YAML | 7,937 | 0.8 | 0.020661 | 0.105727 | vue-tools | 411 | 2025-01-28T21:05:53.323267 | MIT | false | 94aeae57c089b5421e9f4b1ab759b865 |
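The docker-compose sample above reads its host mappings and image versions from a `.env` file, as its header comment explains. A minimal sketch of such a file, using variable names that actually appear in the compose file (`MAP_HOST_POSTGRES`, `POSTGRES_VERSION`) with illustrative values:

```env
# Expose postgres on all interfaces instead of localhost only
MAP_HOST_POSTGRES=0.0.0.0
# Override the default 12.17-alpine postgres image
POSTGRES_VERSION=16-alpine
```

Any variable left unset falls back to the `${VAR:-default}` value baked into the compose file.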
version: 2\nupdates:\n # Rust dependencies\n - package-ecosystem: cargo\n directory: "/quickwit"\n schedule:\n interval: "weekly"\n groups:\n rust-dependencies:\n patterns:\n - "*"\n open-pull-requests-limit: 10\n ignore:\n - dependency-name: "*"\n update-types: ["version-update:semver-patch"]\n\n # Docker dependencies\n - package-ecosystem: docker\n directory: "/"\n schedule:\n interval: "weekly"\n open-pull-requests-limit: 10\n\n # GitHub Actions\n - package-ecosystem: github-actions\n directory: "/"\n schedule:\n interval: "weekly"\n groups:\n github-actions:\n patterns:\n - "*"\n open-pull-requests-limit: 10\n\n # NPM dependencies\n - package-ecosystem: npm\n directory: "/"\n schedule:\n interval: "weekly"\n groups:\n npm-dependencies:\n patterns:\n - "*"\n open-pull-requests-limit: 10\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\dependabot.yml | dependabot.yml | YAML | 909 | 0.8 | 0 | 0.097561 | vue-tools | 11 | 2023-07-21T17:49:31.304286 | MIT | false | dc2693d34e03631979fbdf6ded1969ec |
name: "Build Quickwit binary for macOS"\ndescription: "Build React app and Rust binary for macOS with cargo build."\ninputs:\n target:\n description: "Target"\n required: true\n version:\n description: "Binary version"\n required: true\n token:\n description: "GitHub access token"\n required: true\nruns:\n using: "composite"\n steps:\n - run: echo "ASSET_FULL_NAME=quickwit-${{ inputs.version }}-${{ inputs.target }}" >> $GITHUB_ENV\n shell: bash\n - uses: actions/setup-node@v3\n with:\n node-version: 20\n cache: "yarn"\n cache-dependency-path: quickwit/quickwit-ui/yarn.lock\n - run: yarn global add node-gyp\n shell: bash\n - run: make build-ui\n shell: bash\n - name: Install protoc\n run: brew install protobuf\n shell: bash\n - name: Install rustup\n shell: bash\n run: curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain none -y\n - name: Add target ${{ inputs.target }}\n run: rustup target add ${{ inputs.target }}\n shell: bash\n working-directory: ./quickwit\n - name: Retrieve and export commit date, hash, and tags\n run: |\n echo "QW_COMMIT_DATE=$(TZ=UTC0 git log -1 --format=%cd --date=format-local:%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_ENV\n echo "QW_COMMIT_HASH=$(git rev-parse HEAD)" >> $GITHUB_ENV\n echo "QW_COMMIT_TAGS=$(git tag --points-at HEAD | tr '\n' ',')" >> $GITHUB_ENV\n shell: bash\n - name: Build binary\n run: cargo build --release --features release-macos-feature-vendored-set --target ${{ matrix.target }} --bin quickwit\n shell: bash\n working-directory: ./quickwit\n env:\n QW_COMMIT_DATE: ${{ env.QW_COMMIT_DATE }}\n QW_COMMIT_HASH: ${{ env.QW_COMMIT_HASH }}\n QW_COMMIT_TAGS: ${{ env.QW_COMMIT_TAGS }}\n - name: Bundle archive\n run: |\n make archive BINARY_FILE=quickwit/target/${{ inputs.target }}/release/quickwit \\n BINARY_VERSION=${{ inputs.version }} ARCHIVE_NAME=${{ env.ASSET_FULL_NAME }}\n shell: bash\n - name: Save binary archive for three days\n uses: actions/upload-artifact@v4.4.0\n with:\n name: ${{ env.ASSET_FULL_NAME }}.tar.gz\n path: 
./${{ env.ASSET_FULL_NAME }}.tar.gz\n retention-days: 3\n - name: Deploy archive to GitHub release\n uses: quickwit-inc/upload-to-github-release@v1\n env:\n GITHUB_TOKEN: ${{ inputs.token }}\n with:\n file: ${{ env.ASSET_FULL_NAME }}.tar.gz\n overwrite: true\n draft: ${{ inputs.version != 'nightly' }}\n tag_name: ${{ inputs.version }}\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\actions\cargo-build-macos-binary\action.yml | action.yml | YAML | 2,587 | 0.95 | 0.042857 | 0 | python-kit | 862 | 2024-02-01T07:47:52.687714 | GPL-3.0 | false | 8ddf4e9e3b272fa70ae9937df185ad1a |
name: "Build Quickwit binary with cargo cross"\ndescription: "Build React app and Rust binary with cargo cross."\ninputs:\n target:\n description: "Target"\n required: true\n version:\n description: "Binary version"\n required: true\n token:\n description: "GitHub access token"\n required: true\nruns:\n using: "composite"\n steps:\n - run: echo "ASSET_FULL_NAME=quickwit-${{ inputs.version }}-${{ inputs.target }}" >> $GITHUB_ENV\n shell: bash\n - uses: actions/setup-node@v3\n with:\n node-version: 20\n cache: "yarn"\n cache-dependency-path: quickwit/quickwit-ui/yarn.lock\n - run: yarn global add node-gyp\n shell: bash\n - run: make build-ui\n shell: bash\n - name: Install rustup\n shell: bash\n run: curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain none -y\n - name: Install cross\n run: cargo install cross\n shell: bash\n - name: Retrieve and export commit date, hash, and tags\n run: |\n echo "QW_COMMIT_DATE=$(TZ=UTC0 git log -1 --format=%cd --date=format-local:%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_ENV\n echo "QW_COMMIT_HASH=$(git rev-parse HEAD)" >> $GITHUB_ENV\n echo "QW_COMMIT_TAGS=$(git tag --points-at HEAD | tr '\n' ',')" >> $GITHUB_ENV\n shell: bash\n - name: Build Quickwit\n run: cross build --release --features release-feature-vendored-set --target ${{ inputs.target }} --bin quickwit\n shell: bash\n env:\n QW_COMMIT_DATE: ${{ env.QW_COMMIT_DATE }}\n QW_COMMIT_HASH: ${{ env.QW_COMMIT_HASH }}\n QW_COMMIT_TAGS: ${{ env.QW_COMMIT_TAGS }}\n working-directory: ./quickwit\n - name: Bundle archive\n run: |\n make archive BINARY_FILE=quickwit/target/${{ inputs.target }}/release/quickwit \\n BINARY_VERSION=${{ inputs.version }} ARCHIVE_NAME=${{ env.ASSET_FULL_NAME }}\n shell: bash\n - name: Save binary archive for three days\n uses: actions/upload-artifact@v4.4.0\n with:\n name: ${{ env.ASSET_FULL_NAME }}.tar.gz\n path: ./${{ env.ASSET_FULL_NAME }}.tar.gz\n retention-days: 3\n - name: Upload archive\n uses: quickwit-inc/upload-to-github-release@v1\n env:\n 
GITHUB_TOKEN: ${{ inputs.token }}\n with:\n file: ${{ env.ASSET_FULL_NAME }}.tar.gz\n overwrite: true\n draft: ${{ inputs.version != 'nightly' }}\n tag_name: ${{ inputs.version }}\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\actions\cross-build-binary\action.yml | action.yml | YAML | 2,411 | 0.95 | 0.015152 | 0 | node-utils | 433 | 2024-01-22T15:27:00.333961 | MIT | false | 1cb7c3e1e51498558bd63b5e255fa0b6 |
name: CBENCH\n\non:\n workflow_dispatch:\n push:\n branches:\n - main\n paths:\n - "quickwit/**"\n - "!quickwit/quickwit-ui/**"\n # For security reasons (to make sure the list of allowed users is\n # trusted), make sure we run the workflow definition from the base\n # commit of the pull request.\n pull_request_target:\n\n# This is required for github.rest.issues.createComment.\npermissions:\n issues: write\n pull-requests: write\n\nenv:\n RUSTFLAGS: --cfg tokio_unstable\n\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}\n cancel-in-progress: true\n\njobs:\n tests:\n name: Benchmark\n # The self-hosted runner must have the system deps installed for QW and\n # the benchmark, because we don't have root access.\n runs-on: self-hosted\n timeout-minutes: 60\n steps:\n - name: Set authorized users\n id: authorized-users\n # List of users allowed to trigger this workflow.\n # Because it executes code on a self-hosted runner, it must be restricted to trusted users.\n run: |\n echo 'users=["ddelemeny", "fmassot", "fulmicoton", "guilload", "PSeitz", "rdettai", "trinity-1686a"]' >> $GITHUB_OUTPUT\n\n - uses: actions/checkout@v4\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor) && github.event_name == 'pull_request_target'\n name: Checkout quickwit (pull request commit)\n with:\n repository: quickwit-oss/quickwit\n ref: ${{ github.event.pull_request.head.sha }}\n path: ./quickwit\n\n - uses: actions/checkout@v4\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor) && github.event_name != 'pull_request_target'\n name: Checkout quickwit\n with:\n repository: quickwit-oss/quickwit\n ref: ${{ github.sha }}\n path: ./quickwit\n\n - name: Checkout benchmarking code\n uses: actions/checkout@v4\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor)\n with:\n repository: quickwit-oss/benchmarks\n ref: main\n path: ./benchmarks\n\n - name: Install Rust\n run: rustup 
update stable\n\n - name: Install protoc\n uses: taiki-e/install-action@v2\n with:\n tool: protoc\n\n # We don't use rust-cache as it requires root access on the self-hosted runner, which we don't have.\n - name: cargo build\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor)\n run: cargo build --release --bin quickwit\n working-directory: ./quickwit/quickwit\n - name: Compile qbench\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor)\n run: cargo build --release\n working-directory: ./benchmarks/qbench\n - name: Run Benchmark on SSD\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor)\n id: bench-run-ssd\n run: python3 ./run.py --search-only --storage pd-ssd --engine quickwit --track generated-logs --tags "${{ github.event_name }}_${{ github.ref_name }}" --manage-engine --source github_workflow --binary-path ../quickwit/quickwit/target/release/quickwit --instance "{autodetect_gcp}" --export-to-endpoint=https://qw-benchmarks.104.155.161.122.nip.io --engine-data-dir "{qwdata_local}" --github-workflow-user "${{ github.actor }}" --github-workflow-run-id "${{ github.run_id }}" --comparison-reference-tag="push_main" --github-pr "${{ github.event_name == 'pull_request_target' && github.event.number || 0 }}" --comparison-reference-commit "${{ github.event_name == 'pull_request_target' && github.sha || github.event.before }}" --write-exported-run-url-to-file $GITHUB_OUTPUT\n working-directory: ./benchmarks\n - name: Run Benchmark on cloud storage\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor)\n id: bench-run-cloud-storage\n run: python3 ./run.py --search-only --storage gcs --engine quickwit --track generated-logs --tags "${{ github.event_name }}_${{ github.ref_name }}" --manage-engine --source github_workflow --binary-path ../quickwit/quickwit/target/release/quickwit --instance "{autodetect_gcp}" --export-to-endpoint=https://qw-benchmarks.104.155.161.122.nip.io 
--engine-data-dir "{qwdata_gcs}" --engine-config-file engines/quickwit/configs/cbench_quickwit_gcs.yaml --github-workflow-user "${{ github.actor }}" --github-workflow-run-id "${{ github.run_id }}" --comparison-reference-tag="push_main" --github-pr "${{ github.event_name == 'pull_request_target' && github.event.number || 0 }}" --comparison-reference-commit "${{ github.event_name == 'pull_request_target' && github.sha || github.event.before }}" --write-exported-run-url-to-file $GITHUB_OUTPUT\n working-directory: ./benchmarks\n - name: Show results links\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor)\n run: |\n echo "::notice title=Benchmark Results on SSD::${{ steps.bench-run-ssd.outputs.url }}"\n echo "::notice title=Comparison of results on SSD::${{ steps.bench-run-ssd.outputs.comparison_text }}"\n echo "::notice title=Benchmark Results on Cloud Storage::${{ steps.bench-run-cloud-storage.outputs.url }}"\n echo "::notice title=Comparison of results on Cloud Storage::${{ steps.bench-run-cloud-storage.outputs.comparison_text }}"\n - name: In case of auth error\n if: ${{ ! 
contains(fromJSON(steps.authorized-users.outputs.users), github.actor) }}\n run: |\n echo "::error title=User not allowed to run the benchmark::User must be in list ${{ steps.authorized-users.outputs.users }}"\n - name: Add a PR comment with comparison results\n uses: actions/github-script@v7\n if: contains(fromJSON(steps.authorized-users.outputs.users), github.actor) && github.event_name == 'pull_request_target'\n # Inspired from: https://github.com/actions/github-script/blob/60a0d83039c74a4aee543508d2ffcb1c3799cdea/.github/workflows/pull-request-test.yml\n with:\n script: |\n // Get the existing comments.\n const {data: comments} = await github.rest.issues.listComments({\n owner: context.repo.owner,\n repo: context.repo.repo,\n issue_number: context.payload.number,\n })\n\n // Find any comment already made by the bot to update it.\n const botComment = comments.find(comment => comment.user.id === 41898282)\n const commentBody = "### On SSD:\n${{ steps.bench-run-ssd.outputs.comparison_text }}\n### On GCS:\n${{ steps.bench-run-cloud-storage.outputs.comparison_text }}\n"\n if (botComment) {\n // Update existing comment.\n await github.rest.issues.updateComment({\n owner: context.repo.owner,\n repo: context.repo.repo,\n comment_id: botComment.id,\n body: commentBody\n })\n } else {\n // New comment.\n await github.rest.issues.createComment({\n owner: context.repo.owner,\n repo: context.repo.repo,\n issue_number: context.payload.number,\n body: commentBody\n })\n }\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\cbench.yml | cbench.yml | YAML | 7,333 | 0.95 | 0.094891 | 0.112 | react-lib | 121 | 2024-10-30T05:36:12.463715 | MIT | false | a90a51d9f4bcb35412ef1f716debb39f |
name: CI\n\non:\n workflow_dispatch:\n pull_request:\n push:\n branches:\n - main\n - trigger-ci-workflow\n paths:\n - "quickwit/**"\n - "!quickwit/quickwit-ui/**"\nenv:\n CARGO_INCREMENTAL: 0\n QW_DISABLE_TELEMETRY: 1\n QW_TEST_DATABASE_URL: postgres://quickwit-dev:quickwit-dev@localhost:5432/quickwit-metastore-dev\n RUST_BACKTRACE: 1\n RUSTDOCFLAGS: -Dwarnings -Arustdoc::private_intra_doc_links\n RUSTFLAGS: -Dwarnings --cfg tokio_unstable\n\n# Ensures that we cancel running jobs for the same PR / same workflow.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}\n cancel-in-progress: true\n\njobs:\n tests:\n name: Unit tests\n runs-on: "ubuntu-latest"\n timeout-minutes: 40\n services:\n # PostgreSQL service container\n postgres:\n image: postgres:latest\n ports:\n - 5432:5432\n env:\n POSTGRES_USER: quickwit-dev\n POSTGRES_PASSWORD: quickwit-dev\n POSTGRES_DB: quickwit-metastore-dev\n # Set health checks to wait until postgres has started\n options: >-\n --health-cmd pg_isready\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n steps:\n - uses: actions/checkout@v4\n - name: Install Ubuntu packages\n run: sudo apt-get -y install protobuf-compiler python3\n - uses: dorny/paths-filter@v3\n id: modified\n with:\n filters: |\n rust_src:\n - quickwit/**/*.rs\n - quickwit/**/*.toml\n - quickwit/**/*.proto\n - quickwit/rest-api-tests/**\n - .github/workflows/ci.yml\n # The following step is just meant to install rustup actually.\n # The next one installs the correct toolchain.\n - name: Install rustup\n if: steps.modified.outputs.rust_src == 'true'\n run: curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain none -y\n - name: Setup stable Rust Toolchain\n if: steps.modified.outputs.rust_src == 'true'\n run: rustup show active-toolchain || rustup toolchain install\n working-directory: ./quickwit\n - name: Setup cache\n uses: Swatinem/rust-cache@v2\n if: steps.modified.outputs.rust_src == 'true'\n 
with:\n workspaces: "./quickwit -> target"\n - name: Install nextest\n if: always() && steps.modified.outputs.rust_src == 'true'\n uses: taiki-e/cache-cargo-install-action@v2\n with:\n tool: cargo-nextest\n - name: cargo nextest\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: cargo nextest run --features=postgres --retries 1\n working-directory: ./quickwit\n - name: cargo build\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: cargo build --features=postgres --bin quickwit\n working-directory: ./quickwit\n - name: Install python packages\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: python3 -m venv venv && source venv/bin/activate && pip install pyaml requests\n working-directory: ./quickwit/rest-api-tests\n - name: Run REST API tests\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: source venv/bin/activate && python3 ./run_tests.py --binary ../target/debug/quickwit\n working-directory: ./quickwit/rest-api-tests\n\n lints:\n name: Lints\n runs-on: "ubuntu-latest"\n timeout-minutes: 20\n steps:\n - uses: actions/checkout@v4\n - uses: dorny/paths-filter@v3\n id: modified\n with:\n filters: |\n rust_src:\n - quickwit/**/*.rs\n - quickwit/**/*.toml\n - quickwit/**/*.proto\n - .github/workflows/ci.yml\n - name: Install Ubuntu packages\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: sudo apt-get -y install protobuf-compiler python3 python3-pip\n - name: Install rustup\n if: steps.modified.outputs.rust_src == 'true'\n run: curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain none -y\n - name: Setup nightly Rust Toolchain (for rustfmt)\n if: steps.modified.outputs.rust_src == 'true'\n run: rustup toolchain install nightly\n - name: Setup stable Rust Toolchain\n if: steps.modified.outputs.rust_src == 'true'\n run: rustup show active-toolchain || rustup toolchain install\n working-directory: ./quickwit\n - name: Setup cache\n if: steps.modified.outputs.rust_src == 'true'\n 
uses: Swatinem/rust-cache@v2\n with:\n workspaces: "./quickwit -> target"\n - name: Install cargo deny\n if: always() && steps.modified.outputs.rust_src == 'true'\n uses: taiki-e/cache-cargo-install-action@v2\n with:\n # 0.18 requires rustc 1.85\n tool: cargo-deny@0.17.0\n - name: cargo clippy\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: cargo clippy --workspace --tests --all-features\n working-directory: ./quickwit\n - name: cargo deny\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: cargo deny check licenses\n working-directory: ./quickwit\n - name: cargo doc\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: cargo doc\n working-directory: ./quickwit\n - name: License headers check\n if: always()\n run: bash scripts/check_license_headers.sh\n working-directory: ./quickwit\n - name: rustfmt\n if: always() && steps.modified.outputs.rust_src == 'true'\n run: cargo +nightly fmt --all -- --check\n working-directory: ./quickwit\n\n thirdparty-license:\n name: Check Datadog third-party license file\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Install Rust toolchain\n uses: dtolnay/rust-toolchain@stable\n\n - name: Cache cargo tools\n uses: actions/cache@v4\n with:\n path: ~/.cargo/bin\n key: ${{ runner.os }}-cargo-tools-${{ hashFiles('**/Cargo.lock') }}\n\n - name: Install dd-rust-license-tool\n run: dd-rust-license-tool --help || cargo install --git https://github.com/DataDog/rust-license-tool.git --force\n\n - name: Check Datadog third-party license file\n run: dd-rust-license-tool --config quickwit/license-tool.toml --manifest-path quickwit/Cargo.toml check\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\ci.yml | ci.yml | YAML | 6,498 | 0.95 | 0.12 | 0.035928 | node-utils | 714 | 2023-11-05T00:47:04.007497 | GPL-3.0 | false | 01538ee2e13a5b6ce848dd1d7145d085 |
name: Code coverage\n\non:\n workflow_dispatch:\n push:\n branches:\n - main\n - trigger-coverage-workflow\n paths:\n - quickwit/Cargo.toml\n - quickwit/Cargo.lock\n - quickwit/quickwit-*/**\n\nenv:\n AWS_REGION: us-east-1\n AWS_ACCESS_KEY_ID: "placeholder"\n AWS_SECRET_ACCESS_KEY: "placeholder"\n CARGO_INCREMENTAL: 0\n PUBSUB_EMULATOR_HOST: "localhost:9898"\n QW_DISABLE_TELEMETRY: 1\n QW_S3_ENDPOINT: "http://localhost:4566" # Services are exposed as localhost because we are not running coverage in a container.\n QW_S3_FORCE_PATH_STYLE_ACCESS: 1\n QW_TEST_DATABASE_URL: postgres://quickwit-dev:quickwit-dev@localhost:5432/quickwit-metastore-dev\n RUSTFLAGS: -Dwarnings --cfg tokio_unstable\n\njobs:\n test:\n name: Coverage\n runs-on: gh-ubuntu-arm64\n timeout-minutes: 40\n # Running in a container would require setting QW_S3_ENDPOINT to http://localstack:4566\n services:\n localstack:\n image: localstack/localstack:latest\n ports:\n - "4566:4566"\n - "4571:4571"\n - "8080:8080"\n env:\n SERVICES: kinesis,s3,sqs\n options: >-\n --health-cmd "curl -k https://localhost:4566"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n\n postgres:\n image: postgres:latest\n ports:\n - "5432:5432"\n env:\n POSTGRES_USER: quickwit-dev\n POSTGRES_PASSWORD: quickwit-dev\n POSTGRES_DB: quickwit-metastore-dev\n options: >-\n --health-cmd pg_isready\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n\n kafka-broker:\n image: confluentinc/cp-kafka:7.2.1\n ports:\n - "9092:9092"\n - "9101:9101"\n env:\n KAFKA_BROKER_ID: 1\n KAFKA_ZOOKEEPER_CONNECT: "zookeeper:2181"\n KAFKA_LISTENER_SECURITY_PROTOCOL_MAP: PLAINTEXT:PLAINTEXT,PLAINTEXT_HOST:PLAINTEXT\n KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://kafka-broker:29092,PLAINTEXT_HOST://localhost:9092\n KAFKA_OFFSETS_TOPIC_REPLICATION_FACTOR: 1\n KAFKA_GROUP_INITIAL_REBALANCE_DELAY_MS: 0\n KAFKA_TRANSACTION_STATE_LOG_MIN_ISR: 1\n KAFKA_TRANSACTION_STATE_LOG_REPLICATION_FACTOR: 1\n KAFKA_JMX_PORT: 9101\n 
KAFKA_JMX_HOSTNAME: localhost\n KAFKA_HEAP_OPTS: -Xms256M -Xmx256M\n options: >-\n --health-cmd "cub kafka-ready -b localhost:9092 1 5"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n\n zookeeper:\n image: confluentinc/cp-zookeeper:7.2.1\n ports:\n - "2181:2181"\n env:\n KAFKA_HEAP_OPTS: -Xms256M -Xmx256M\n ZOOKEEPER_CLIENT_PORT: 2181\n ZOOKEEPER_TICK_TIME: 2000\n options: >-\n --health-cmd "cub zk-ready localhost:2181 5"\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n\n gcp-pubsub-emulator:\n image: thekevjames/gcloud-pubsub-emulator:7555256f2c\n ports:\n - "9898:8681"\n env:\n PUBSUB_PROJECT1: "quickwit-emulator,emulator_topic:emulator_subscription"\n\n steps:\n - uses: actions/checkout@v4\n\n - name: Install lib libsasl2\n run: |\n sudo apt update\n sudo apt install libsasl2-dev\n sudo apt install libsasl2-2\n\n - uses: actions/cache@v4\n with:\n path: |\n ~/.cargo/git\n ~/.cargo/registry\n key: ${{ runner.os }}-cargo-test-${{ hashFiles('Cargo.lock') }}\n restore-keys: |\n ${{ runner.os }}-cargo-test-${{ hashFiles('Cargo.lock') }}\n ${{ runner.os }}-cargo-test\n\n - name: Install awslocal\n run: pip install awscli-local\n\n - name: Prepare LocalStack S3\n run: ./quickwit-cli/tests/prepare_tests.sh\n working-directory: ./quickwit\n\n # GitHub Actions does not allow services to be started with a custom command,\n # so we are running Azurite as a container manually.\n - name: Run Azurite service\n run: DOCKER_SERVICES=azurite make docker-compose-up\n\n # GitHub Actions does not allow services to be started with a custom command,\n # so we are running fake gcs server as a container manually.\n - name: Run Fake GCS Server service\n run: DOCKER_SERVICES=fake-gcs-server make docker-compose-up\n\n - name: Run Pulsar service\n run: DOCKER_SERVICES=pulsar make docker-compose-up\n\n - name: Install Rust\n run: rustup update stable\n\n - name: Install cargo-llvm-cov, cargo-nextest, and protoc\n uses: taiki-e/install-action@v2\n 
with:\n tool: cargo-llvm-cov,nextest,protoc\n\n # We limit the number of jobs to 4 to avoid OOM errors when linking the binary.\n - name: Generate code coverage\n run: |\n cargo llvm-cov clean --workspace\n cargo llvm-cov nextest --no-report --test failpoints --features fail/failpoints --retries 4\n # increase stack size for test_all_with_s3_localstack_cli, see quickwit#4963\n RUST_MIN_STACK=67108864 CARGO_BUILD_JOBS=4 cargo llvm-cov nextest --no-report --all-features --retries 4\n cargo llvm-cov report --lcov --output-path lcov.info\n working-directory: ./quickwit\n\n - name: Upload coverage to Codecov\n uses: codecov/codecov-action@v5\n with:\n token: ${{ secrets.CODECOV_TOKEN }} # not required for public repos\n files: ./quickwit/lcov.info\n\n on-failure:\n if: ${{ github.repository_owner == 'quickwit-oss' && failure() }}\n name: On Failure\n needs: [test]\n runs-on: ubuntu-latest\n steps:\n - name: Send Message\n uses: sarisia/actions-status-discord@v1\n with:\n webhook: ${{ secrets.DISCORD_WEBHOOK }}\n nodetail: true\n color: "#FF0000"\n title: ""\n description: |\n ### ❌ [${{ github.event.pull_request.title }}](${{ github.event.pull_request.html_url }})\n\n @${{ github.actor }} quickwit coverage CI failed on your PR.\n\n Coverage CI contains tests that do not run in the regular CI because they take too long.\n For this reason, it may break even if the tests were passing on your PR.\n This is not a catastrophe, but you are responsible for fixing it!\n\n You can run the full test suite locally with `make test-all`.\n\n Please report in this channel whether you are working on it, have fixed it, whether it is a flaky test,\n or whether you need help.\n\n **[View logs](https://github.com/${{ github.repository }}/actions/runs/${{ github.run_id }})**\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\coverage.yml | coverage.yml | YAML | 6,721 | 0.95 | 0.041026 | 0.052941 | react-lib | 87 | 2023-07-18T15:57:12.353112 | Apache-2.0 | false | 
f0ddc6811a2ef700cdbf069153d86438 |
name: "Dependency Review"\non: [pull_request]\n\npermissions:\n contents: read\n\n# Ensures that we cancel running jobs for the same PR / same workflow.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}\n cancel-in-progress: true\n\njobs:\n dependency-review:\n runs-on: ubuntu-latest\n steps:\n - name: "Checkout Repository"\n uses: actions/checkout@v4\n - name: "Dependency Review"\n uses: actions/dependency-review-action@v4\n with:\n # This is a minor vuln in the rsa crate, used for\n # Google Storage.\n allow-ghsas: GHSA-c38w-74pg-36hr,GHSA-4grx-2x9w-596c\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\dependency.yml | dependency.yml | YAML | 660 | 0.8 | 0.086957 | 0.15 | python-kit | 424 | 2025-03-11T17:55:32.077429 | MIT | false | 1de201e4f2b04a352feac02673a83955 |
name: Publish custom cross images\n\non:\n workflow_dispatch:\n push:\n branches:\n - main\n paths:\n - "build/cross-images/**"\n\njobs:\n build-cross-images:\n name: Publish cross images\n runs-on: ubuntu-latest\n steps:\n - name: Check out the repo\n uses: actions/checkout@v4\n - name: Log in to Docker Hub\n uses: docker/login-action@v3\n with:\n username: ${{ secrets.DOCKERHUB_USERNAME }}\n password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN }}\n - name: Build and push cross images\n run: make cross-images\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\publish_cross_images.yml | publish_cross_images.yml | YAML | 574 | 0.8 | 0 | 0 | awesome-app | 544 | 2023-09-17T01:23:14.058126 | GPL-3.0 | false | 1f52bfaa47c94fba10bd4a172d15766f |
name: Build and publish Docker images\n\non:\n workflow_dispatch:\n push:\n branches:\n - main\n paths:\n - "quickwit/**"\n tags:\n - airmail\n - happy-plazza\n - qw*\n - v*\nenv:\n REGISTRY_IMAGE: quickwit/quickwit\n\njobs:\n docker:\n strategy:\n matrix:\n include:\n - os: ubuntu-latest\n platform: linux/amd64\n platform_suffix: amd64\n - os: gh-ubuntu-arm64\n platform: linux/arm64\n platform_suffix: arm64\n runs-on: ${{ matrix.os }}\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - name: Login to Docker Hub\n uses: docker/login-action@v3\n with:\n username: ${{ secrets.DOCKERHUB_USERNAME }}\n password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN }}\n\n - name: Set up QEMU\n uses: docker/setup-qemu-action@v3\n\n - name: Set up Docker Buildx\n uses: docker/setup-buildx-action@v3\n\n - name: Docker meta\n id: meta\n uses: docker/metadata-action@v5\n with:\n images: |\n ${{ env.REGISTRY_IMAGE }}\n labels: |\n org.opencontainers.image.title=Quickwit\n maintainer=Quickwit, Inc. <hello@quickwit.io>\n org.opencontainers.image.vendor=Quickwit, Inc.\n org.opencontainers.image.licenses=Apache-2.0\n\n - name: Retrieve commit date, hash, and tags\n run: |\n echo "QW_COMMIT_DATE=$(TZ=UTC0 git log -1 --format=%cd --date=format-local:%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_ENV\n echo "QW_COMMIT_HASH=$(git rev-parse HEAD)" >> $GITHUB_ENV\n echo "QW_COMMIT_TAGS=$(git tag --points-at HEAD | tr '\n' ',')" >> $GITHUB_ENV\n\n - name: Build and push image\n uses: docker/build-push-action@v6\n id: build\n with:\n context: .\n platforms: ${{ matrix.platform }}\n build-args: |\n QW_COMMIT_DATE=${{ env.QW_COMMIT_DATE }}\n QW_COMMIT_HASH=${{ env.QW_COMMIT_HASH }}\n QW_COMMIT_TAGS=${{ env.QW_COMMIT_TAGS }}\n labels: ${{ steps.meta.outputs.labels }}\n outputs: type=image,name=${{ env.REGISTRY_IMAGE }},push-by-digest=true,name-canonical=true,push=true\n\n - name: Export digest\n run: |\n mkdir -p /tmp/digests\n digest="${{ steps.build.outputs.digest }}"\n touch "/tmp/digests/${digest#sha256:}"\n\n 
- name: Upload digest\n uses: actions/upload-artifact@v4.6.2\n with:\n name: digest-${{ matrix.platform_suffix }}\n path: /tmp/digests/*\n if-no-files-found: error\n retention-days: 1\n\n merge:\n runs-on: ubuntu-latest\n needs: [docker]\n steps:\n - name: Download digests\n uses: actions/download-artifact@v4.2.1\n with:\n pattern: digest-*\n path: /tmp/digests\n merge-multiple: true\n\n - name: Set up Docker Buildx\n uses: docker/setup-buildx-action@v3\n\n - name: Docker meta\n id: meta\n uses: docker/metadata-action@v5\n with:\n images: ${{ env.REGISTRY_IMAGE }}\n flavor: |\n latest=false\n tags: |\n type=edge,branch=main\n type=edge,branch=main,suffix=-slim-bookworm\n type=semver,pattern={{version}}\n type=semver,pattern={{version}},value=latest\n type=semver,pattern={{version}},suffix=-slim-bookworm\n type=ref,event=tag\n - name: Login to Docker Hub\n uses: docker/login-action@v3\n with:\n username: ${{ secrets.DOCKERHUB_USERNAME }}\n password: ${{ secrets.DOCKERHUB_ACCESS_TOKEN }}\n - name: Create manifest list and push tags\n working-directory: /tmp/digests\n run: |\n docker buildx imagetools create $(jq -cr '.tags | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \\n $(printf '${{ env.REGISTRY_IMAGE }}@sha256:%s ' *)\n - name: Inspect image\n run: |\n docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:${{ steps.meta.outputs.version }}\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\publish_docker_images.yml | publish_docker_images.yml | YAML | 4,088 | 0.8 | 0.007634 | 0 | vue-tools | 100 | 2024-06-27T06:05:38.342979 | BSD-3-Clause | false | 9bfdb2d3801b1e79a76943e46ec332ec |
name: Build and publish AWS Lambda packages\n\non:\n push:\n tags:\n - "aws-lambda-beta-*"\n\njobs:\n build-lambdas:\n name: Build Quickwit Lambdas\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Install Ubuntu packages\n run: sudo apt-get -y install protobuf-compiler python3 python3-pip\n - name: Install rustup\n run: curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain none -y\n - name: Install python dependencies\n run: |\n pip install --user pipenv\n pipenv install --system\n working-directory: ./distribution/lambda\n - name: Lint and format\n run: |\n mypy .\n black . --check\n working-directory: ./distribution/lambda\n - name: Retrieve and export commit date, hash, and tags\n run: |\n echo "QW_COMMIT_DATE=$(TZ=UTC0 git log -1 --format=%cd --date=format-local:%Y-%m-%dT%H:%M:%SZ)" >> $GITHUB_ENV\n echo "QW_COMMIT_HASH=$(git rev-parse HEAD)" >> $GITHUB_ENV\n echo "QW_COMMIT_TAGS=$(git tag --points-at HEAD | tr '\n' ',')" >> $GITHUB_ENV\n - name: Build Quickwit Lambdas\n run: make package\n env:\n QW_COMMIT_DATE: ${{ env.QW_COMMIT_DATE }}\n QW_COMMIT_HASH: ${{ env.QW_COMMIT_HASH }}\n QW_COMMIT_TAGS: ${{ env.QW_COMMIT_TAGS }}\n QW_LAMBDA_BUILD: 1\n working-directory: ./distribution/lambda\n - name: Extract package locations\n run: |\n echo "SEARCHER_PACKAGE_LOCATION=./distribution/lambda/$(make searcher-package-path)" >> $GITHUB_ENV\n echo "INDEXER_PACKAGE_LOCATION=./distribution/lambda/$(make indexer-package-path)" >> $GITHUB_ENV\n working-directory: ./distribution/lambda\n - name: Upload Lambda archives\n uses: quickwit-inc/upload-to-github-release@v1\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n with:\n file: ${{ env.SEARCHER_PACKAGE_LOCATION }};${{ env.INDEXER_PACKAGE_LOCATION }}\n overwrite: true\n draft: true\n tag_name: ${{ github.ref_name }}\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\publish_lambda_packages.yml | publish_lambda_packages.yml | YAML | 2,115 | 0.8 | 0 | 0 | awesome-app | 668 | 
2024-06-14T22:31:49.113036 | BSD-3-Clause | false | 5f59850db65802a2f1cbd5cf46815698 |
name: Build and publish nightly packages\n\non:\n workflow_dispatch:\n schedule:\n - cron: "0 5 * * *"\n\njobs:\n build-macos-binaries:\n name: Build ${{ matrix.target }}\n runs-on: macos-latest\n strategy:\n fail-fast: false\n matrix:\n target: [x86_64-apple-darwin, aarch64-apple-darwin]\n steps:\n - uses: actions/checkout@v4\n - uses: ./.github/actions/cargo-build-macos-binary\n with:\n target: ${{ matrix.target }}\n version: nightly\n token: ${{ secrets.GITHUB_TOKEN }}\n build-linux-binaries:\n strategy:\n fail-fast: false\n matrix:\n target: [x86_64-unknown-linux-gnu, aarch64-unknown-linux-gnu]\n name: Build ${{ matrix.target }}\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: ./.github/actions/cross-build-binary\n with:\n target: ${{ matrix.target }}\n version: nightly\n token: ${{ secrets.GITHUB_TOKEN }}\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\publish_nightly_packages.yml | publish_nightly_packages.yml | YAML | 965 | 0.7 | 0 | 0 | react-lib | 891 | 2024-05-22T02:22:11.570727 | GPL-3.0 | false | b80001c19d910b37be58d18147652d24 |
name: Build and publish release packages\n\non:\n push:\n tags:\n - "v*"\n\njobs:\n build-macos-binaries:\n name: Build ${{ matrix.target }}\n runs-on: macos-latest\n strategy:\n matrix:\n target: [x86_64-apple-darwin, aarch64-apple-darwin]\n\n steps:\n - uses: actions/checkout@v4\n - name: Extract asset version\n run: echo "ASSET_VERSION=${GITHUB_REF/refs\/tags\//}" >> $GITHUB_ENV\n - uses: ./.github/actions/cargo-build-macos-binary\n with:\n target: ${{ matrix.target }}\n version: ${{ env.ASSET_VERSION }}\n token: ${{ secrets.GITHUB_TOKEN }}\n\n build-linux-binaries:\n strategy:\n matrix:\n target: [x86_64-unknown-linux-gnu, aarch64-unknown-linux-gnu]\n name: Build ${{ matrix.target }}\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Extract asset version\n run: echo "ASSET_VERSION=${GITHUB_REF/refs\/tags\//}" >> $GITHUB_ENV\n - uses: ./.github/actions/cross-build-binary\n with:\n target: ${{ matrix.target }}\n version: ${{ env.ASSET_VERSION }}\n token: ${{ secrets.GITHUB_TOKEN }}\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\publish_release_packages.yml | publish_release_packages.yml | YAML | 1,155 | 0.8 | 0 | 0 | awesome-app | 189 | 2024-12-12T14:20:09.997552 | MIT | false | 58a32b365367d628980022eb8808e731 |
name: UI CI\n\non:\n workflow_dispatch:\n pull_request:\n paths:\n - "quickwit/quickwit-ui/**"\n - ".github/workflows/ui-ci.yml"\n push:\n branches:\n - main\n - trigger-ci-workflow\n paths:\n - "quickwit/quickwit-ui/**"\n - ".github/workflows/ui-ci.yml"\n\njobs:\n tests:\n name: ${{ matrix.task.name }}\n runs-on: ubuntu-latest\n strategy:\n fail-fast: false\n matrix:\n task:\n - name: Cypress run\n command: |\n sudo apt-get -y install protobuf-compiler\n rustup show active-toolchain || rustup toolchain install\n CI=false yarn --cwd quickwit-ui build\n RUSTFLAGS="--cfg tokio_unstable" cargo build --features=postgres\n mkdir qwdata\n RUSTFLAGS="--cfg tokio_unstable" cargo run --features=postgres -- run --service searcher --service metastore --config ../config/quickwit.yaml &\n yarn --cwd quickwit-ui cypress run\n - name: Lint\n command: yarn --cwd quickwit-ui lint\n - name: Unit Test\n command: yarn --cwd quickwit-ui test\n services:\n # PostgreSQL service container\n postgres:\n image: postgres:latest\n ports:\n - 5432:5432\n env:\n POSTGRES_USER: quickwit-dev\n POSTGRES_PASSWORD: quickwit-dev\n POSTGRES_DB: quickwit-metastore-dev\n # Set health checks to wait until postgres has started\n options: >-\n --health-cmd pg_isready\n --health-interval 10s\n --health-timeout 5s\n --health-retries 5\n env:\n CARGO_INCREMENTAL: 0\n RUST_BACKTRACE: 1\n RUSTFLAGS: -Dwarnings -C lto=off\n RUSTDOCFLAGS: -Dwarnings -Arustdoc::private_intra_doc_links\n QW_TEST_DATABASE_URL: postgres://quickwit-dev:quickwit-dev@postgres:5432/quickwit-metastore-dev\n steps:\n - uses: actions/checkout@v4\n - uses: actions/setup-node@v4\n with:\n node-version: 20\n cache: "yarn"\n cache-dependency-path: quickwit/quickwit-ui/yarn.lock\n - name: Install rustup\n run: curl https://sh.rustup.rs -sSf | sh -s -- --default-toolchain none -y\n - name: Install JS dependencies\n run: yarn --cwd quickwit-ui install\n working-directory: ./quickwit\n - name: Setup Rust cache\n if: matrix.task.name == 'Cypress 
run'\n uses: Swatinem/rust-cache@v2\n with:\n workspaces: "./quickwit -> target"\n - name: ${{ matrix.task.name }}\n run: ${{ matrix.task.command }}\n working-directory: ./quickwit\n | dataset_sample\yaml\quickwit-oss_quickwit\.github\workflows\ui-ci.yml | ui-ci.yml | YAML | 2,599 | 0.8 | 0.012658 | 0.025974 | node-utils | 134 | 2024-10-15T17:40:50.769649 | BSD-3-Clause | false | aac4c79aeeabff31b287ad67938c18c7 |
version: "3.9"\n\nnetworks:\n default:\n name: quickwit-grafana\n # ipam:\n # config:\n # - subnet: 172.16.7.0/24\n # gateway: 172.16.7.1\n\nservices:\n quickwit:\n image: quickwit/quickwit:${QUICKWIT_VERSION:-0.7.1}\n grafana:\n image: grafana/grafana-oss:${GRAFANA_VERSION:-9.4.7}\n container_name: grafana\n ports:\n - "${MAP_HOST_GRAFANA:-127.0.0.1}:3000:3000"\n environment:\n GF_AUTH_DISABLE_LOGIN_FORM: "true"\n GF_AUTH_ANONYMOUS_ENABLED: "true"\n GF_AUTH_ANONYMOUS_ORG_ROLE: Admin\n volumes:\n - ./monitoring/grafana/dashboards:/var/lib/grafana/dashboards\n - ./monitoring/grafana/provisioning:/etc/grafana/provisioning\n\n jaeger:\n image: jaegertracing/all-in-one:${JAEGER_VERSION:-1.48.0}\n container_name: jaeger\n ports:\n - "${MAP_HOST_JAEGER:-127.0.0.1}:16686:16686" # Frontend\n profiles:\n - jaeger\n - monitoring\n\n otel-collector:\n image: otel/opentelemetry-collector:${OTEL_VERSION:-0.84.0}\n container_name: otel-collector\n ports:\n - "${MAP_HOST_OTEL:-127.0.0.1}:1888:1888" # pprof extension\n - "${MAP_HOST_OTEL:-127.0.0.1}:8888:8888" # Prometheus metrics exposed by the collector\n - "${MAP_HOST_OTEL:-127.0.0.1}:8889:8889" # Prometheus exporter metrics\n - "${MAP_HOST_OTEL:-127.0.0.1}:13133:13133" # health_check extension\n - "${MAP_HOST_OTEL:-127.0.0.1}:4317:4317" # OTLP gRPC receiver\n - "${MAP_HOST_OTEL:-127.0.0.1}:4318:4318" # OTLP http receiver\n - "${MAP_HOST_OTEL:-127.0.0.1}:55679:55679" # zpages extension\n profiles:\n - otel\n - monitoring\n volumes:\n - ./monitoring/otel-collector-config.yaml:/etc/otel-collector-config.yaml\n command: ["--config=/etc/otel-collector-config.yaml"]\n\n prometheus:\n image: prom/prometheus:${PROMETHEUS_VERSION:-v2.43.0}\n container_name: prometheus\n ports:\n - "${MAP_HOST_PROMETHEUS:-127.0.0.1}:9090:9090"\n profiles:\n - prometheus\n - monitoring\n volumes:\n - ./monitoring/prometheus.yaml:/etc/prometheus/prometheus.yml\n extra_hosts:\n - "host.docker.internal:host-gateway"\n\n gcp-pubsub-emulator:\n # This is not an 
official Docker image.\n # If preferred, we could build one from the official image (gcloud CLI)\n # and install the Pub/Sub emulator https://cloud.google.com/pubsub/docs/emulator\n image: thekevjames/gcloud-pubsub-emulator:${GCLOUD_EMULATOR:-455.0.0}\n container_name: gcp-pubsub-emulator\n ports:\n - "${MAP_HOST_GCLOUD_EMULATOR:-127.0.0.1}:8681:8681"\n environment:\n # Create a fake GCP project and a topic/subscription.\n - PUBSUB_PROJECT1=quickwit-emulator,emulator_topic:emulator_subscription\n profiles:\n - all\n - gcp-pubsub\n\nvolumes:\n localstack_data:\n postgres_data:\n azurite_data:\n | dataset_sample\yaml\quickwit-oss_quickwit\config\tutorials\grafana\docker-compose.yml | docker-compose.yml | YAML | 2,801 | 0.8 | 0.011765 | 0.102564 | vue-tools | 373 | 2025-06-26T13:33:53.777352 | BSD-3-Clause | false | deaf1945dd019c0a56c84dfc6d28a9ed |
version: 2\nupdates:\n- package-ecosystem: cargo\n directory: "/"\n schedule:\n interval: daily\n time: "20:00"\n open-pull-requests-limit: 10\n\n- package-ecosystem: "github-actions"\n directory: "/"\n schedule:\n interval: daily\n time: "20:00"\n open-pull-requests-limit: 10\n | dataset_sample\yaml\quickwit-oss_tantivy\.github\dependabot.yml | dependabot.yml | YAML | 282 | 0.7 | 0 | 0 | awesome-app | 429 | 2025-02-03T17:57:02.008360 | Apache-2.0 | false | bd4f8b7a0aeb6fb06e740fed1393aef8 |
# These are supported funding model platforms\n\ngithub: fulmicoton\npatreon: # Replace with a single Patreon username\nopen_collective: # Replace with a single Open Collective username\nko_fi: # Replace with a single Ko-fi username\ntidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel\ncommunity_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry\nliberapay: # Replace with a single Liberapay username\nissuehunt: # Replace with a single IssueHunt username\notechie: # Replace with a single Otechie username\ncustom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']\n | dataset_sample\yaml\quickwit-oss_tantivy\.github\FUNDING.yml | FUNDING.yml | YAML | 644 | 0.8 | 0 | 0.090909 | python-kit | 884 | 2024-08-20T16:42:32.126965 | MIT | false | b327b34dc2d654780f85be2e248db237 |
name: Coverage\n\non:\n push:\n branches: [main]\n\n# Ensures that we cancel running jobs for the same PR / same workflow.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}\n cancel-in-progress: true\n\njobs:\n coverage:\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - name: Install Rust\n run: rustup toolchain install nightly-2024-07-01 --profile minimal --component llvm-tools-preview\n - uses: Swatinem/rust-cache@v2\n - uses: taiki-e/install-action@cargo-llvm-cov\n - name: Generate code coverage\n run: cargo +nightly-2024-07-01 llvm-cov --all-features --workspace --doctests --lcov --output-path lcov.info\n - name: Upload coverage to Codecov\n uses: codecov/codecov-action@v3\n continue-on-error: true\n with:\n token: ${{ secrets.CODECOV_TOKEN }} # not required for public repos\n files: lcov.info\n fail_ci_if_error: true\n | dataset_sample\yaml\quickwit-oss_tantivy\.github\workflows\coverage.yml | coverage.yml | YAML | 979 | 0.95 | 0.068966 | 0.038462 | vue-tools | 863 | 2025-04-18T04:47:37.664290 | Apache-2.0 | false | 18af113c3095b5eea9820db563c4263b |
name: Long running tests\n\non:\n push:\n branches: [ main ]\n\nenv:\n CARGO_TERM_COLOR: always\n NUM_FUNCTIONAL_TEST_ITERATIONS: 20000\n\n# Ensures that we cancel running jobs for the same PR / same workflow.\nconcurrency:\n group: ${{ github.workflow }}-${{ github.event.pull_request.number || github.ref }}\n cancel-in-progress: true\n\njobs:\n test:\n\n runs-on: ubuntu-latest\n\n steps:\n - uses: actions/checkout@v4\n - name: Install stable\n uses: actions-rs/toolchain@v1\n with:\n toolchain: stable\n profile: minimal\n override: true\n\n - name: Run indexing_unsorted\n run: cargo test indexing_unsorted -- --ignored\n - name: Run indexing_sorted\n run: cargo test indexing_sorted -- --ignored\n | dataset_sample\yaml\quickwit-oss_tantivy\.github\workflows\long_running.yml | long_running.yml | YAML | 745 | 0.8 | 0.030303 | 0.038462 | node-utils | 839 | 2024-05-23T05:41:54.121948 | Apache-2.0 | false | 5816138deb9c82f6b16ce89199f2892c |
# Starter pipeline\n# Start with a minimal pipeline that you can customize to build and deploy your code.\n# Add steps that build, run tests, deploy, and more:\n# https://aka.ms/yaml\n\nvariables:\n outputFolder: './_output'\n artifactsFolder: './_artifacts'\n testsFolder: './_tests'\n yarnCacheFolder: $(Pipeline.Workspace)/.yarn\n nugetCacheFolder: $(Pipeline.Workspace)/.nuget/packages\n majorVersion: '5.23.0'\n minorVersion: $[counter('minorVersion', 2000)]\n radarrVersion: '$(majorVersion).$(minorVersion)'\n buildName: '$(Build.SourceBranchName).$(radarrVersion)'\n sentryOrg: 'servarr'\n sentryUrl: 'https://sentry.servarr.com'\n dotnetVersion: '6.0.427'\n nodeVersion: '20.X'\n innoVersion: '6.2.2'\n windowsImage: 'windows-2022'\n linuxImage: 'ubuntu-22.04'\n macImage: 'macOS-13'\n\ntrigger:\n branches:\n include:\n - develop\n - master\n paths:\n exclude:\n - .github\n - src/Radarr.Api.*/openapi.json\n\npr:\n branches:\n include:\n - develop\n paths:\n exclude:\n - .github\n - src/NzbDrone.Core/Localization/Core\n - src/Radarr.Api.*/openapi.json\n\nstages:\n - stage: Setup\n displayName: Setup\n jobs:\n - job:\n displayName: Build Variables\n pool:\n vmImage: ${{ variables.linuxImage }}\n steps:\n # Set the build name properly. The 'name' property won't recursively expand so hack here:\n - bash: echo "##vso[build.updatebuildnumber]$RADARRVERSION"\n displayName: Set Build Name\n - bash: |\n if [[ $BUILD_REASON == "PullRequest" ]]; then\n git diff origin/develop...HEAD --name-only | grep -E "^(src/|azure-pipelines.yml)"\n echo $? 
> not_backend_update\n else\n echo 0 > not_backend_update\n fi\n cat not_backend_update\n displayName: Check for Backend File Changes\n - publish: not_backend_update\n artifact: not_backend_update\n displayName: Publish update type\n - stage: Build_Backend\n displayName: Build Backend\n dependsOn: Setup\n jobs:\n - job: Backend\n strategy:\n matrix:\n Linux:\n osName: 'Linux'\n imageName: ${{ variables.linuxImage }}\n enableAnalysis: 'true'\n Mac:\n osName: 'Mac'\n imageName: ${{ variables.macImage }}\n enableAnalysis: 'false'\n Windows:\n osName: 'Windows'\n imageName: ${{ variables.windowsImage }}\n enableAnalysis: 'false'\n\n pool:\n vmImage: $(imageName)\n variables:\n # Disable stylecop here - linting errors get caught by the analyze task\n EnableAnalyzers: $(enableAnalysis)\n steps:\n - checkout: self\n submodules: true\n fetchDepth: 1\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - bash: |\n BUNDLEDVERSIONS=${AGENT_TOOLSDIRECTORY}/dotnet/sdk/${DOTNETVERSION}/Microsoft.NETCoreSdk.BundledVersions.props\n echo $BUNDLEDVERSIONS\n if grep -q freebsd-x64 $BUNDLEDVERSIONS; then\n echo "Extra platforms already enabled"\n else\n echo "Enabling extra platform support"\n sed -i.ORI 's/osx-x64/osx-x64;freebsd-x64;linux-x86/' $BUNDLEDVERSIONS\n fi\n displayName: Enable Extra Platform Support\n - bash: ./build.sh --backend --enable-extra-platforms\n displayName: Build Radarr Backend\n - bash: |\n find ${OUTPUTFOLDER} -type f ! -path "*/publish/*" -exec rm -rf {} \;\n find ${OUTPUTFOLDER} -depth -empty -type d -exec rm -r "{}" \;\n find ${TESTSFOLDER} -type f ! 
-path "*/publish/*" -exec rm -rf {} \;\n find ${TESTSFOLDER} -depth -empty -type d -exec rm -r "{}" \;\n displayName: Clean up intermediate output\n condition: and(succeeded(), ne(variables['osName'], 'Windows'))\n - publish: $(outputFolder)\n artifact: '$(osName)Backend'\n displayName: Publish Backend\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - publish: '$(testsFolder)/net6.0/win-x64/publish'\n artifact: win-x64-tests\n displayName: Publish win-x64 Test Package\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - publish: '$(testsFolder)/net6.0/linux-x64/publish'\n artifact: linux-x64-tests\n displayName: Publish linux-x64 Test Package\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - publish: '$(testsFolder)/net6.0/linux-x86/publish'\n artifact: linux-x86-tests\n displayName: Publish linux-x86 Test Package\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - publish: '$(testsFolder)/net6.0/linux-musl-x64/publish'\n artifact: linux-musl-x64-tests\n displayName: Publish linux-musl-x64 Test Package\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - publish: '$(testsFolder)/net6.0/freebsd-x64/publish'\n artifact: freebsd-x64-tests\n displayName: Publish freebsd-x64 Test Package\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - publish: '$(testsFolder)/net6.0/osx-x64/publish'\n artifact: osx-x64-tests\n displayName: Publish osx-x64 Test Package\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n\n - stage: Build_Frontend\n displayName: Frontend\n dependsOn: Setup\n jobs:\n - job: Build\n strategy:\n matrix:\n Linux:\n osName: 'Linux'\n imageName: ${{ variables.linuxImage }}\n Mac:\n osName: 'Mac'\n imageName: ${{ variables.macImage }}\n Windows:\n osName: 'Windows'\n imageName: ${{ variables.windowsImage }}\n pool:\n vmImage: $(imageName)\n steps:\n - task: UseNode@1\n displayName: Set Node.js version\n inputs:\n version: 
$(nodeVersion)\n - checkout: self\n submodules: true\n fetchDepth: 1\n - task: Cache@2\n inputs:\n key: 'yarn | "$(osName)" | yarn.lock'\n restoreKeys: |\n yarn | "$(osName)"\n path: $(yarnCacheFolder)\n displayName: Cache Yarn packages\n - bash: ./build.sh --frontend\n displayName: Build Radarr Frontend\n env:\n FORCE_COLOR: 0\n YARN_CACHE_FOLDER: $(yarnCacheFolder)\n - publish: $(outputFolder)\n artifact: '$(osName)Frontend'\n displayName: Publish Frontend\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n \n - stage: Installer\n dependsOn:\n - Build_Backend\n - Build_Frontend\n jobs:\n - job: Windows_Installer\n displayName: Create Installer\n pool:\n vmImage: ${{ variables.windowsImage }}\n steps:\n - checkout: self\n fetchDepth: 1\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 'current'\n artifactName: WindowsBackend\n targetPath: _output\n displayName: Fetch Backend\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 'current'\n artifactName: WindowsFrontend\n targetPath: _output\n displayName: Fetch Frontend\n - bash: |\n ./build.sh --packages --installer\n cp distribution/windows/setup/output/Radarr.*win-x64.exe ${BUILD_ARTIFACTSTAGINGDIRECTORY}/Radarr.${BUILDNAME}.windows-core-x64-installer.exe\n cp distribution/windows/setup/output/Radarr.*win-x86.exe ${BUILD_ARTIFACTSTAGINGDIRECTORY}/Radarr.${BUILDNAME}.windows-core-x86-installer.exe\n displayName: Create Installers\n - publish: $(Build.ArtifactStagingDirectory)\n artifact: 'WindowsInstaller'\n displayName: Publish Installer\n\n - stage: Packages\n dependsOn:\n - Build_Backend\n - Build_Frontend\n jobs:\n - job: Other_Packages\n displayName: Create Standard Packages\n pool:\n vmImage: ${{ variables.linuxImage }}\n steps:\n - checkout: self\n fetchDepth: 1\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 'current'\n artifactName: WindowsBackend\n targetPath: _output\n displayName: Fetch Backend\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 
'current'\n artifactName: WindowsFrontend\n targetPath: _output\n displayName: Fetch Frontend\n - bash: ./build.sh --packages --enable-extra-platforms\n displayName: Create Packages\n - bash: |\n find . -name "ffprobe" -exec chmod a+x {} \;\n find . -name "Radarr" -exec chmod a+x {} \;\n find . -name "Radarr.Update" -exec chmod a+x {} \;\n displayName: Set executable bits\n - task: ArchiveFiles@2\n displayName: Create win-x64 zip\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).windows-core-x64.zip'\n archiveType: 'zip'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/win-x64/net6.0\n - task: ArchiveFiles@2\n displayName: Create win-x86 zip\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).windows-core-x86.zip'\n archiveType: 'zip'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/win-x86/net6.0\n - task: ArchiveFiles@2\n displayName: Create osx-x64 app\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).osx-app-core-x64.zip'\n archiveType: 'zip'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/osx-x64-app/net6.0\n - task: ArchiveFiles@2\n displayName: Create osx-x64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).osx-core-x64.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/osx-x64/net6.0\n - task: ArchiveFiles@2\n displayName: Create osx-arm64 app\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).osx-app-core-arm64.zip'\n archiveType: 'zip'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/osx-arm64-app/net6.0\n - task: ArchiveFiles@2\n displayName: Create osx-arm64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).osx-core-arm64.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: 
$(artifactsFolder)/osx-arm64/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-x64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-core-x64.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-x64/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-musl-x64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-musl-core-x64.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-musl-x64/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-x86 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-core-x86.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-x86/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-arm tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-core-arm.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-arm/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-musl-arm tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-musl-core-arm.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-musl-arm/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-arm64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-core-arm64.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-arm64/net6.0\n - task: ArchiveFiles@2\n displayName: Create linux-musl-arm64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).linux-musl-core-arm64.tar.gz'\n 
archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/linux-musl-arm64/net6.0\n - task: ArchiveFiles@2\n displayName: Create freebsd-x64 tar\n inputs:\n archiveFile: '$(Build.ArtifactStagingDirectory)/Radarr.$(buildName).freebsd-core-x64.tar.gz'\n archiveType: 'tar'\n tarCompression: 'gz'\n includeRootFolder: false\n rootFolderOrFile: $(artifactsFolder)/freebsd-x64/net6.0\n - publish: $(Build.ArtifactStagingDirectory)\n artifact: 'Packages'\n displayName: Publish Packages\n - bash: |\n echo "Uploading source maps to sentry"\n curl -sL https://sentry.io/get-cli/ | bash\n RELEASENAME="Radarr@${RADARRVERSION}-${BUILD_SOURCEBRANCHNAME}"\n sentry-cli releases new --finalize -p radarr -p radarr-ui -p radarr-update "${RELEASENAME}"\n sentry-cli releases -p radarr-ui files "${RELEASENAME}" upload-sourcemaps _output/UI/ --rewrite\n sentry-cli releases set-commits --auto "${RELEASENAME}"\n if [[ ${BUILD_SOURCEBRANCH} == "refs/heads/develop" ]]; then\n sentry-cli releases deploys "${RELEASENAME}" new -e nightly\n else\n sentry-cli releases deploys "${RELEASENAME}" new -e production\n fi\n if [ $? 
-gt 0 ]; then\n echo "##vso[task.logissue type=warning]Error uploading source maps."\n fi\n exit 0\n displayName: Publish Sentry Source Maps\n condition: |\n or\n (\n and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/develop')),\n and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/master'))\n )\n env:\n SENTRY_AUTH_TOKEN: $(sentryAuthTokenServarr)\n SENTRY_ORG: $(sentryOrg)\n SENTRY_URL: $(sentryUrl)\n \n - stage: Unit_Test\n displayName: Unit Tests\n dependsOn: Build_Backend\n\n jobs:\n - job: Prepare\n pool:\n vmImage: ${{ variables.linuxImage }}\n steps:\n - checkout: none\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 'current'\n artifactName: 'not_backend_update'\n targetPath: '.'\n - bash: echo "##vso[task.setvariable variable=backendNotUpdated;isOutput=true]$(cat not_backend_update)"\n name: setVar\n\n - job: Unit\n displayName: Unit Native\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n workspace:\n clean: all\n\n strategy:\n matrix:\n MacCore:\n osName: 'Mac'\n testName: 'osx-x64'\n poolName: 'Azure Pipelines'\n imageName: ${{ variables.macImage }}\n WindowsCore:\n osName: 'Windows'\n testName: 'win-x64'\n poolName: 'Azure Pipelines'\n imageName: ${{ variables.windowsImage }}\n LinuxCore:\n osName: 'Linux'\n testName: 'linux-x64'\n poolName: 'Azure Pipelines'\n imageName: ${{ variables.linuxImage }}\n FreebsdCore:\n osName: 'Linux'\n testName: 'freebsd-x64'\n poolName: 'FreeBSD'\n imageName:\n\n pool:\n name: $(poolName)\n vmImage: $(imageName)\n\n steps:\n - checkout: none\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n condition: ne(variables['poolName'], 'FreeBSD')\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: '$(testName)-tests'\n targetPath: $(testsFolder)\n - powershell: Set-Service SCardSvr -StartupType Manual\n 
displayName: Enable Windows Test Service\n condition: and(succeeded(), eq(variables['osName'], 'Windows'))\n - bash: |\n chmod a+x _tests/ffprobe\n displayName: Make ffprobe Executable\n condition: and(succeeded(), ne(variables['osName'], 'Windows'))\n - bash: find ${TESTSFOLDER} -name "Radarr.Test.Dummy" -exec chmod a+x {} \;\n displayName: Make Test Dummy Executable\n condition: and(succeeded(), ne(variables['osName'], 'Windows'))\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh ${OSNAME} Unit Test\n displayName: Run Tests\n env:\n TEST_DIR: $(Build.SourcesDirectory)/_tests\n - task: PublishTestResults@2\n displayName: Publish Test Results\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: '$(testName) Unit Tests'\n failTaskOnFailedTests: true\n\n - job: Unit_Docker\n displayName: Unit Docker\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n strategy:\n matrix:\n alpine:\n testName: 'Musl Net Core'\n artifactName: linux-musl-x64-tests\n containerImage: ghcr.io/servarr/testimages:alpine\n linux-x86:\n testName: 'linux-x86'\n artifactName: linux-x86-tests\n containerImage: ghcr.io/servarr/testimages:linux-x86\n\n pool:\n vmImage: ${{ variables.linuxImage }}\n \n container: $[ variables['containerImage'] ]\n\n timeoutInMinutes: 10\n \n steps:\n - task: UseDotNet@2\n displayName: 'Install .NET'\n inputs:\n version: $(dotnetVersion)\n condition: and(succeeded(), ne(variables['testName'], 'linux-x86'))\n - bash: |\n SDKURL=$(curl -s https://api.github.com/repos/Servarr/dotnet-linux-x86/releases | jq -rc '.[].assets[].browser_download_url' | grep sdk-${DOTNETVERSION}.*gz$)\n curl -fsSL $SDKURL | tar xzf - -C /opt/dotnet\n displayName: 'Install .NET'\n condition: and(succeeded(), eq(variables['testName'], 'linux-x86'))\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 
'current'\n artifactName: $(artifactName)\n targetPath: $(testsFolder)\n - bash: |\n chmod a+x _tests/ffprobe\n displayName: Make ffprobe Executable\n - bash: find ${TESTSFOLDER} -name "Radarr.Test.Dummy" -exec chmod a+x {} \;\n displayName: Make Test Dummy Executable\n condition: and(succeeded(), ne(variables['osName'], 'Windows'))\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ls -lR ${TESTSFOLDER}\n ${TESTSFOLDER}/test.sh Linux Unit Test\n displayName: Run Tests\n - task: PublishTestResults@2\n displayName: Publish Test Results\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: '$(testName) Unit Tests'\n failTaskOnFailedTests: true\n \n - job: Unit_LinuxCore_Postgres14\n displayName: Unit Native LinuxCore with Postgres14 Database\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n variables:\n pattern: 'Radarr.*.linux-core-x64.tar.gz'\n artifactName: linux-x64-tests\n Radarr__Postgres__Host: 'localhost'\n Radarr__Postgres__Port: '5432'\n Radarr__Postgres__User: 'radarr'\n Radarr__Postgres__Password: 'radarr'\n\n pool:\n vmImage: ${{ variables.linuxImage }}\n\n timeoutInMinutes: 10\n \n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: $(artifactName)\n targetPath: $(testsFolder)\n - bash: |\n chmod a+x _tests/ffprobe\n displayName: Make ffprobe Executable\n - bash: find ${TESTSFOLDER} -name "Radarr.Test.Dummy" -exec chmod a+x {} \;\n displayName: Make Test Dummy Executable\n condition: and(succeeded(), ne(variables['osName'], 'Windows'))\n - bash: |\n docker run -d --name=postgres14 \\n -e POSTGRES_PASSWORD=radarr \\n -e POSTGRES_USER=radarr \\n -p 5432:5432/tcp \\n -v /usr/share/zoneinfo/America/Chicago:/etc/localtime:ro \\n postgres:14\n displayName: Start 
postgres\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ls -lR ${TESTSFOLDER}\n ${TESTSFOLDER}/test.sh Linux Unit Test\n displayName: Run Tests\n - task: PublishTestResults@2\n displayName: Publish Test Results\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: 'LinuxCore Postgres14 Unit Tests'\n failTaskOnFailedTests: true\n\n - job: Unit_LinuxCore_Postgres15\n displayName: Unit Native LinuxCore with Postgres15 Database\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n variables:\n pattern: 'Radarr.*.linux-core-x64.tar.gz'\n artifactName: linux-x64-tests\n Radarr__Postgres__Host: 'localhost'\n Radarr__Postgres__Port: '5432'\n Radarr__Postgres__User: 'radarr'\n Radarr__Postgres__Password: 'radarr'\n \n pool:\n vmImage: ${{ variables.linuxImage }}\n\n timeoutInMinutes: 10\n \n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: $(artifactName)\n targetPath: $(testsFolder)\n - bash: |\n chmod a+x _tests/ffprobe\n displayName: Make ffprobe Executable\n - bash: find ${TESTSFOLDER} -name "Radarr.Test.Dummy" -exec chmod a+x {} \;\n displayName: Make Test Dummy Executable\n condition: and(succeeded(), ne(variables['osName'], 'Windows'))\n - bash: |\n docker run -d --name=postgres15 \\n -e POSTGRES_PASSWORD=radarr \\n -e POSTGRES_USER=radarr \\n -p 5432:5432/tcp \\n -v /usr/share/zoneinfo/America/Chicago:/etc/localtime:ro \\n postgres:15\n displayName: Start postgres\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ls -lR ${TESTSFOLDER}\n ${TESTSFOLDER}/test.sh Linux Unit Test\n displayName: Run Tests\n - task: PublishTestResults@2\n displayName: Publish Test Results\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: 
'LinuxCore Postgres15 Unit Tests'\n failTaskOnFailedTests: true\n\n - stage: Integration\n displayName: Integration\n dependsOn: Packages\n\n jobs:\n - job: Prepare\n pool:\n vmImage: ${{ variables.linuxImage }}\n steps:\n - checkout: none\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 'current'\n artifactName: 'not_backend_update'\n targetPath: '.'\n - bash: echo "##vso[task.setvariable variable=backendNotUpdated;isOutput=true]$(cat not_backend_update)"\n name: setVar\n\n - job: Integration_Native\n displayName: Integration Native\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n strategy:\n matrix:\n MacCore:\n osName: 'Mac'\n testName: 'osx-x64'\n imageName: ${{ variables.macImage }}\n pattern: 'Radarr.*.osx-core-x64.tar.gz'\n WindowsCore:\n osName: 'Windows'\n testName: 'win-x64'\n imageName: ${{ variables.windowsImage }}\n pattern: 'Radarr.*.windows-core-x64.zip'\n LinuxCore:\n osName: 'Linux'\n testName: 'linux-x64'\n imageName: ${{ variables.linuxImage }}\n pattern: 'Radarr.*.linux-core-x64.tar.gz'\n\n pool:\n vmImage: $(imageName)\n \n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: '$(testName)-tests'\n targetPath: $(testsFolder)\n - task: DownloadPipelineArtifact@2\n displayName: Download Build Artifact\n inputs:\n buildType: 'current'\n artifactName: Packages\n itemPattern: '**/$(pattern)'\n targetPath: $(Build.ArtifactStagingDirectory)\n - task: ExtractFiles@1\n inputs:\n archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/**/$(pattern)' \n destinationFolder: '$(Build.ArtifactStagingDirectory)/bin'\n displayName: Extract Package\n - bash: |\n mkdir -p ./bin/\n cp -r -v ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin/Radarr/. 
./bin/\n displayName: Move Package Contents\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh ${OSNAME} Integration Test\n displayName: Run Integration Tests\n - task: PublishTestResults@2\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: '$(testName) Integration Tests'\n failTaskOnFailedTests: true\n displayName: Publish Test Results\n\n - job: Integration_LinuxCore_Postgres14\n displayName: Integration Native LinuxCore with Postgres14 Database\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n variables:\n pattern: 'Radarr.*.linux-core-x64.tar.gz'\n Radarr__Postgres__Host: 'localhost'\n Radarr__Postgres__Port: '5432'\n Radarr__Postgres__User: 'radarr'\n Radarr__Postgres__Password: 'radarr'\n\n pool:\n vmImage: ${{ variables.linuxImage }}\n\n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: 'linux-x64-tests'\n targetPath: $(testsFolder)\n - task: DownloadPipelineArtifact@2\n displayName: Download Build Artifact\n inputs:\n buildType: 'current'\n artifactName: Packages\n itemPattern: '**/$(pattern)'\n targetPath: $(Build.ArtifactStagingDirectory)\n - task: ExtractFiles@1\n inputs:\n archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/**/$(pattern)' \n destinationFolder: '$(Build.ArtifactStagingDirectory)/bin'\n displayName: Extract Package\n - bash: |\n mkdir -p ./bin/\n cp -r -v ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin/Radarr/. 
./bin/\n displayName: Move Package Contents\n - bash: |\n docker run -d --name=postgres14 \\n -e POSTGRES_PASSWORD=radarr \\n -e POSTGRES_USER=radarr \\n -p 5432:5432/tcp \\n -v /usr/share/zoneinfo/America/Chicago:/etc/localtime:ro \\n postgres:14\n displayName: Start postgres\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh Linux Integration Test\n displayName: Run Integration Tests\n - task: PublishTestResults@2\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: 'Integration LinuxCore Postgres14 Database Integration Tests'\n failTaskOnFailedTests: true\n displayName: Publish Test Results\n\n\n - job: Integration_LinuxCore_Postgres15\n displayName: Integration Native LinuxCore with Postgres Database\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n variables:\n pattern: 'Radarr.*.linux-core-x64.tar.gz'\n Radarr__Postgres__Host: 'localhost'\n Radarr__Postgres__Port: '5432'\n Radarr__Postgres__User: 'radarr'\n Radarr__Postgres__Password: 'radarr'\n\n pool:\n vmImage: ${{ variables.linuxImage }}\n\n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: 'linux-x64-tests'\n targetPath: $(testsFolder)\n - task: DownloadPipelineArtifact@2\n displayName: Download Build Artifact\n inputs:\n buildType: 'current'\n artifactName: Packages\n itemPattern: '**/$(pattern)'\n targetPath: $(Build.ArtifactStagingDirectory)\n - task: ExtractFiles@1\n inputs:\n archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/**/$(pattern)' \n destinationFolder: '$(Build.ArtifactStagingDirectory)/bin'\n displayName: Extract Package\n - bash: |\n mkdir -p ./bin/\n cp -r -v ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin/Radarr/. 
./bin/\n displayName: Move Package Contents\n - bash: |\n docker run -d --name=postgres15 \\n -e POSTGRES_PASSWORD=radarr \\n -e POSTGRES_USER=radarr \\n -p 5432:5432/tcp \\n -v /usr/share/zoneinfo/America/Chicago:/etc/localtime:ro \\n postgres:15\n displayName: Start postgres\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh Linux Integration Test\n displayName: Run Integration Tests\n - task: PublishTestResults@2\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: 'Integration LinuxCore Postgres15 Database Integration Tests'\n failTaskOnFailedTests: true\n displayName: Publish Test Results\n\n - job: Integration_FreeBSD\n displayName: Integration Native FreeBSD\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n workspace:\n clean: all\n variables:\n pattern: 'Radarr.*.freebsd-core-x64.tar.gz'\n pool:\n name: 'FreeBSD'\n\n steps:\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: 'freebsd-x64-tests'\n targetPath: $(testsFolder)\n - task: DownloadPipelineArtifact@2\n displayName: Download Build Artifact\n inputs:\n buildType: 'current'\n artifactName: Packages\n itemPattern: '**/$(pattern)'\n targetPath: $(Build.ArtifactStagingDirectory)\n - bash: |\n mkdir -p ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin\n tar xf ${BUILD_ARTIFACTSTAGINGDIRECTORY}/$(pattern) -C ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin\n displayName: Extract Package\n - bash: |\n mkdir -p ./bin/\n cp -r -v ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin/Radarr/. 
./bin/\n displayName: Move Package Contents\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh Linux Integration Test\n displayName: Run Integration Tests\n - task: PublishTestResults@2\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: 'FreeBSD Integration Tests'\n failTaskOnFailedTests: true\n displayName: Publish Test Results\n\n - job: Integration_Docker\n displayName: Integration Docker\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n strategy:\n matrix:\n alpine:\n testName: 'linux-musl-x64'\n artifactName: linux-musl-x64-tests\n containerImage: ghcr.io/servarr/testimages:alpine\n pattern: 'Radarr.*.linux-musl-core-x64.tar.gz'\n linux-x86:\n testName: 'linux-x86'\n artifactName: linux-x86-tests\n containerImage: ghcr.io/servarr/testimages:linux-x86\n pattern: 'Radarr.*.linux-core-x86.tar.gz'\n pool:\n vmImage: ${{ variables.linuxImage }}\n\n container: $[ variables['containerImage'] ]\n\n timeoutInMinutes: 15\n \n steps:\n - task: UseDotNet@2\n displayName: 'Install .NET'\n inputs:\n version: $(dotnetVersion)\n condition: and(succeeded(), ne(variables['testName'], 'linux-x86'))\n - bash: |\n SDKURL=$(curl -s https://api.github.com/repos/Servarr/dotnet-linux-x86/releases | jq -rc '.[].assets[].browser_download_url' | grep sdk-${DOTNETVERSION}.*gz$)\n curl -fsSL $SDKURL | tar xzf - -C /opt/dotnet\n displayName: 'Install .NET'\n condition: and(succeeded(), eq(variables['testName'], 'linux-x86'))\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: $(artifactName)\n targetPath: $(testsFolder)\n - task: DownloadPipelineArtifact@2\n displayName: Download Build Artifact\n inputs:\n buildType: 'current'\n artifactName: Packages\n itemPattern: '**/$(pattern)'\n targetPath: $(Build.ArtifactStagingDirectory)\n - task: ExtractFiles@1\n inputs:\n 
archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/**/$(pattern)' \n destinationFolder: '$(Build.ArtifactStagingDirectory)/bin'\n displayName: Extract Package\n - bash: |\n mkdir -p ./bin/\n cp -r -v ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin/Radarr/. ./bin/\n displayName: Move Package Contents\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh Linux Integration Test\n displayName: Run Integration Tests\n - task: PublishTestResults@2\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: '$(testName) Integration Tests'\n failTaskOnFailedTests: true\n displayName: Publish Test Results\n\n - stage: Automation\n displayName: Automation\n dependsOn: Packages\n \n jobs:\n - job: Automation\n strategy:\n matrix:\n Linux:\n osName: 'Linux'\n artifactName: 'linux-x64'\n imageName: ${{ variables.linuxImage }}\n pattern: 'Radarr.*.linux-core-x64.tar.gz'\n failBuild: true\n Mac:\n osName: 'Mac'\n artifactName: 'osx-x64'\n imageName: ${{ variables.macImage }}\n pattern: 'Radarr.*.osx-core-x64.tar.gz'\n failBuild: true\n Windows:\n osName: 'Windows'\n artifactName: 'win-x64'\n imageName: ${{ variables.windowsImage }}\n pattern: 'Radarr.*.windows-core-x64.zip'\n failBuild: true\n\n pool:\n vmImage: $(imageName)\n \n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: none\n - task: DownloadPipelineArtifact@2\n displayName: Download Test Artifact\n inputs:\n buildType: 'current'\n artifactName: '$(artifactName)-tests'\n targetPath: $(testsFolder)\n - task: DownloadPipelineArtifact@2\n displayName: Download Build Artifact\n inputs:\n buildType: 'current'\n artifactName: Packages\n itemPattern: '**/$(pattern)'\n targetPath: $(Build.ArtifactStagingDirectory)\n - task: ExtractFiles@1\n inputs:\n archiveFilePatterns: '$(Build.ArtifactStagingDirectory)/**/$(pattern)' \n destinationFolder: '$(Build.ArtifactStagingDirectory)/bin'\n displayName: Extract Package\n - 
bash: |\n mkdir -p ./bin/\n cp -r -v ${BUILD_ARTIFACTSTAGINGDIRECTORY}/bin/Radarr/. ./bin/\n displayName: Move Package Contents\n - bash: |\n chmod a+x ${TESTSFOLDER}/test.sh\n ${TESTSFOLDER}/test.sh ${OSNAME} Automation Test\n displayName: Run Automation Tests\n - task: CopyFiles@2\n displayName: 'Copy Screenshot to: $(Build.ArtifactStagingDirectory)'\n inputs:\n SourceFolder: '$(Build.SourcesDirectory)'\n Contents: |\n **/*_test_screenshot.png\n TargetFolder: '$(Build.ArtifactStagingDirectory)/screenshots'\n - publish: $(Build.ArtifactStagingDirectory)/screenshots\n artifact: '$(osName)AutomationScreenshots'\n displayName: Publish Screenshot Bundle\n condition: and(succeeded(), eq(variables['System.JobAttempt'], '1'))\n - task: PublishTestResults@2\n inputs:\n testResultsFormat: 'NUnit'\n testResultsFiles: '**/TestResult.xml'\n testRunTitle: '$(osName) Automation Tests'\n failTaskOnFailedTests: $(failBuild)\n displayName: Publish Test Results\n\n - stage: Analyze\n dependsOn:\n - Setup\n displayName: Analyze\n\n jobs:\n - job: Prepare\n pool:\n vmImage: ${{ variables.linuxImage }}\n steps:\n - checkout: none\n - task: DownloadPipelineArtifact@2\n inputs:\n buildType: 'current'\n artifactName: 'not_backend_update'\n targetPath: '.'\n - bash: echo "##vso[task.setvariable variable=backendNotUpdated;isOutput=true]$(cat not_backend_update)"\n name: setVar\n\n - job: Lint_Frontend\n displayName: Lint Frontend\n strategy:\n matrix:\n Linux:\n osName: 'Linux'\n imageName: ${{ variables.linuxImage }}\n Windows:\n osName: 'Windows'\n imageName: ${{ variables.windowsImage }}\n pool:\n vmImage: $(imageName)\n steps:\n - task: UseNode@1\n displayName: Set Node.js version\n inputs:\n version: $(nodeVersion)\n - checkout: self\n submodules: true\n fetchDepth: 1\n - task: Cache@2\n inputs:\n key: 'yarn | "$(osName)" | yarn.lock'\n restoreKeys: |\n yarn | "$(osName)"\n path: $(yarnCacheFolder)\n displayName: Cache Yarn packages\n - bash: ./build.sh --lint\n displayName: Lint 
Radarr Frontend\n env:\n FORCE_COLOR: 0\n YARN_CACHE_FOLDER: $(yarnCacheFolder)\n\n - job: Analyze_Frontend\n displayName: Frontend\n condition: eq(variables['System.PullRequest.IsFork'], 'False')\n pool:\n vmImage: ${{ variables.windowsImage }}\n steps:\n - checkout: self # Need history for Sonar analysis\n - task: SonarCloudPrepare@3\n env:\n SONAR_SCANNER_OPTS: ''\n inputs:\n SonarCloud: 'SonarCloud'\n organization: 'radarr'\n scannerMode: 'cli'\n configMode: 'manual'\n cliProjectKey: 'Radarr_Radarr.UI'\n cliProjectName: 'RadarrUI'\n cliProjectVersion: '$(radarrVersion)'\n cliSources: './frontend'\n - task: SonarCloudAnalyze@3\n\n - job: Api_Docs\n displayName: API Docs\n dependsOn: Prepare\n condition: |\n and\n (\n and(succeeded(), eq(variables['Build.SourceBranch'], 'refs/heads/develop')),\n and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n )\n\n pool:\n vmImage: ${{ variables.windowsImage }}\n\n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: self\n submodules: true\n persistCredentials: true\n fetchDepth: 1 \n - bash: ./docs.sh Windows\n displayName: Create openapi.json\n - bash: |\n git config --global user.email "development@lidarr.audio"\n git config --global user.name "Servarr"\n git checkout -b api-docs\n git add .\n git status\n if git status | grep modified\n then\n git commit -am 'Automated API Docs update'\n git push -f --set-upstream origin api-docs\n curl -X POST -H "Authorization: token ${GITHUBTOKEN}" -H "Accept: application/vnd.github.v3+json" https://api.github.com/repos/radarr/radarr/pulls -d '{"head":"api-docs","base":"develop","title":"Update API docs"}'\n else\n echo "No changes since last run"\n fi\n displayName: Commit API Doc Change\n continueOnError: true\n env:\n GITHUBTOKEN: $(githubToken)\n - task: CopyFiles@2\n displayName: 'Copy openapi.json to: $(Build.ArtifactStagingDirectory)'\n inputs:\n SourceFolder: 
'$(Build.SourcesDirectory)'\n Contents: |\n **/*openapi.json\n TargetFolder: '$(Build.ArtifactStagingDirectory)/api_docs'\n - publish: $(Build.ArtifactStagingDirectory)/api_docs\n artifact: 'APIDocs'\n displayName: Publish API Docs Bundle\n condition: and(succeeded(), eq(variables['System.JobAttempt'], '1'))\n\n - job: Analyze_Backend\n displayName: Backend\n dependsOn: Prepare\n condition: and(succeeded(), eq(dependencies.Prepare.outputs['setVar.backendNotUpdated'], '0'))\n\n variables:\n disable.coverage.autogenerate: 'true'\n EnableAnalyzers: 'false'\n\n pool:\n vmImage: ${{ variables.windowsImage }}\n\n steps:\n - task: UseDotNet@2\n displayName: 'Install .net core'\n inputs:\n version: $(dotnetVersion)\n - checkout: self # Need history for Sonar analysis\n submodules: true\n - powershell: Set-Service SCardSvr -StartupType Manual\n displayName: Enable Windows Test Service\n - task: SonarCloudPrepare@3\n condition: eq(variables['System.PullRequest.IsFork'], 'False')\n inputs:\n SonarCloud: 'SonarCloud'\n organization: 'radarr'\n scannerMode: 'dotnet'\n projectKey: 'Radarr_Radarr'\n projectName: 'Radarr'\n projectVersion: '$(radarrVersion)'\n extraProperties: |\n sonar.exclusions=**/obj/**,**/*.dll,**/NzbDrone.Core.Test/Files/**/*,./frontend/**,**/ExternalModules/**,./src/Libraries/**\n sonar.coverage.exclusions=**/Radarr.Api.V3/**/*\n sonar.cs.opencover.reportsPaths=$(Build.SourcesDirectory)/CoverageResults/**/coverage.opencover.xml\n sonar.cs.nunit.reportsPaths=$(Build.SourcesDirectory)/TestResult.xml\n - bash: |\n ./build.sh --backend -f net6.0 -r win-x64\n TEST_DIR=_tests/net6.0/win-x64/publish/ ./test.sh Windows Unit Coverage\n displayName: Coverage Unit Tests\n - task: SonarCloudAnalyze@3\n condition: eq(variables['System.PullRequest.IsFork'], 'False')\n displayName: Publish SonarCloud Results\n - task: reportgenerator@5.3.11\n displayName: Generate Coverage Report\n inputs:\n reports: '$(Build.SourcesDirectory)/CoverageResults/**/coverage.opencover.xml'\n 
targetdir: '$(Build.SourcesDirectory)/CoverageResults/combined'\n reporttypes: 'HtmlInline_AzurePipelines;Cobertura;Badges'\n publishCodeCoverageResults: true\n\n - stage: Report_Out\n dependsOn:\n - Analyze\n - Installer\n - Unit_Test\n - Integration\n - Automation\n condition: eq(variables['system.pullrequest.isfork'], false)\n displayName: Build Status Report\n jobs:\n - job:\n displayName: Discord Notification\n pool:\n vmImage: ${{ variables.linuxImage }}\n steps:\n - task: DownloadPipelineArtifact@2\n continueOnError: true\n displayName: Download Screenshot Artifact\n inputs:\n buildType: 'current'\n artifactName: 'WindowsAutomationScreenshots'\n targetPath: $(Build.SourcesDirectory)\n - checkout: none\n - pwsh: |\n iex ((New-Object System.Net.WebClient).DownloadString('https://raw.githubusercontent.com/Servarr/AzureDiscordNotify/master/DiscordNotify.ps1'))\n env:\n SYSTEM_ACCESSTOKEN: $(System.AccessToken)\n DISCORDCHANNELID: $(discordChannelId)\n DISCORDWEBHOOKKEY: $(discordWebhookKey)\n DISCORDTHREADID: $(discordThreadId)\n\n | dataset_sample\yaml\Radarr_Radarr\azure-pipelines.yml | azure-pipelines.yml | YAML | 46,086 | 0.8 | 0.006314 | 0.006639 | python-kit | 245 | 2023-07-29T21:32:11.314063 | GPL-3.0 | false | a128f725b5db246824e4ed6fd1f815fe |
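The pipeline above gates most jobs on a `backendNotUpdated` flag that a Prepare job re-exposes with an `##vso[task.setvariable variable=backendNotUpdated;isOutput=true]...` logging command. As a rough illustration of that command's shape, a minimal sketch of a parser (`parse_setvariable` is a hypothetical helper, not part of Azure DevOps):

```python
import re

# Matches logging commands such as:
#   ##vso[task.setvariable variable=backendNotUpdated;isOutput=true]0
VSO_RE = re.compile(r'##vso\[task\.setvariable (?P<props>[^\]]+)\](?P<value>.*)')

def parse_setvariable(line):
    """Return (variable name, is_output flag, value) or None if not a setvariable command."""
    m = VSO_RE.match(line.strip())
    if not m:
        return None
    # Properties are semicolon-separated key=value pairs.
    props = dict(p.split('=', 1) for p in m.group('props').split(';'))
    return props.get('variable'), props.get('isOutput') == 'true', m.group('value')
```

Downstream jobs then read the flag via `dependencies.Prepare.outputs['setVar.backendNotUpdated']`, which is why the Prepare step names itself `setVar`.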
# To get started with Dependabot version updates, you'll need to specify which\n# package ecosystems to update and where the package manifests are located.\n# Please see the documentation for more information:\n# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates\n# https://containers.dev/guide/dependabot\n\nversion: 2\nupdates:\n - package-ecosystem: "devcontainers"\n directory: "/"\n schedule:\n interval: weekly\n | dataset_sample\yaml\Radarr_Radarr\.github\dependabot.yml | dependabot.yml | YAML | 467 | 0.8 | 0.166667 | 0.454545 | python-kit | 541 | 2025-03-17T03:47:17.526409 | GPL-3.0 | false | ea316d07ef2a5b1cbd4191907aa383fe |
# These are supported funding model platforms\n\ngithub: radarr\npatreon: # Replace with a single Patreon username\nopen_collective: radarr\nko_fi: # Replace with a single Ko-fi username\ntidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel\ncustom: # Replace with a single custom sponsorship URL\n | dataset_sample\yaml\Radarr_Radarr\.github\FUNDING.yml | FUNDING.yml | YAML | 323 | 0.8 | 0 | 0.142857 | node-utils | 378 | 2025-06-18T01:32:06.465829 | GPL-3.0 | false | 19fe94d08cf1647e2c87592bc8c99615 |
# Configuration for Label Actions - https://github.com/dessant/label-actions\n\n'Type: Support':\n comment: >\n :wave: @{issue-author}, we use the issue tracker exclusively\n for bug reports and feature requests. However, this issue appears\n to be a support request. Please hop over to our [Discord](https://radarr.video/discord).\n close: true\n close-reason: 'not planned'\n\n'Status: Logs Needed':\n comment: >\n :wave: @{issue-author}, in order to help you further we'll need to see logs. \n You'll need to enable trace logging and replicate the problem that you encountered. \n Guidance on how to enable trace logging can be found in \n our [troubleshooting guide](https://wiki.servarr.com/radarr/troubleshooting#logging-and-log-files). | dataset_sample\yaml\Radarr_Radarr\.github\label-actions.yml | label-actions.yml | YAML | 754 | 0.8 | 0.133333 | 0.071429 | react-lib | 122 | 2023-08-29T21:50:48.934053 | GPL-3.0 | false | 2cadac346dcfe808fdf0698f669a0dd9 |
'Area: API':\n - src/Radarr.Api.V3/**/*\n\n'Area: Db-migration':\n - src/NzbDrone.Core/Datastore/Migration/*\n\n'Area: Download Clients':\n - src/NzbDrone.Core/Download/Clients/**/*\n\n'Area: Import Lists':\n - src/NzbDrone.Core/ImportLists/**/*\n\n'Area: Indexer':\n - src/NzbDrone.Core/Indexers/**/*\n\n'Area: Notifications':\n - src/NzbDrone.Core/Notifications/**/*\n\n'Area: Organizer':\n - src/NzbDrone.Core/Organizer/**/*\n\n'Area: Parser':\n - src/NzbDrone.Core/Parser/**/*\n\n'Area: UI':\n - frontend/**/*\n - package.json\n - yarn.lock\n | dataset_sample\yaml\Radarr_Radarr\.github\labeler.yml | labeler.yml | YAML | 529 | 0.8 | 0 | 0 | vue-tools | 739 | 2023-09-18T08:15:43.024473 | Apache-2.0 | false | abf496f5aa6071f3773e540505cbb811 |
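The labeler config above maps glob patterns to area labels based on changed file paths. A hedged sketch of how such matching could work (`glob_to_regex` and `labels_for` are illustrative helpers, not the minimatch library the labeler action actually uses; only the two pattern forms in this file are handled):

```python
import re

def glob_to_regex(pattern):
    # Minimal translation for the forms used in labeler.yml:
    # 'prefix/**/*' (any file at any depth) and plain literal paths.
    out = []
    i = 0
    while i < len(pattern):
        if pattern.startswith('**/', i):
            out.append('(?:[^/]+/)*')  # zero or more directory levels
            i += 3
        elif pattern[i] == '*':
            out.append('[^/]+')        # a single path segment
            i += 1
        else:
            out.append(re.escape(pattern[i]))
            i += 1
    return re.compile('^' + ''.join(out) + '$')

def labels_for(path, rules):
    """Return the labels whose patterns match the given changed file."""
    return {label for label, patterns in rules.items()
            if any(glob_to_regex(p).match(path) for p in patterns)}
```

For example, under the rules above a change to `frontend/src/App.js` or `package.json` would receive `Area: UI`, while a file outside every pattern receives no label.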
# Number of days of inactivity before an issue becomes stale\ndaysUntilStale: 60\n# Number of days of inactivity before a stale issue is closed\ndaysUntilClose: 7\n# Issues with these labels will never be considered stale\nexemptLabels:\n - feature request #legacy\n - 'Type: Feature Request'\n - 'Status: Confirmed'\n - sonarr-pull\n - lidarr-pull\n - readarr-pull\n# Label to use when marking an issue as stale\nstaleLabel: stale\n# Comment to post when marking an issue as stale. Set to `false` to disable\nmarkComment: >\n This issue has been automatically marked as stale because it has not had recent activity. Please verify that this is still an issue with the latest version of Radarr and report back. Otherwise this issue will be closed.\n# Comment to post when closing a stale issue. Set to `false` to disable\ncloseComment: false\nonly: issues\n | dataset_sample\yaml\Radarr_Radarr\.github\stale.yml | stale.yml | YAML | 843 | 0.8 | 0 | 0.3 | vue-tools | 481 | 2024-10-28T10:52:30.591702 | GPL-3.0 | false | 02fcb48b2119fa9f844b5fe5418beeb4 |
name: Bug Report\ndescription: 'Report a new bug, if you are not 100% certain this is a bug please go to our Discord first'\nlabels: ['Type: Bug', 'Status: Needs Triage']\nbody:\n- type: checkboxes\n attributes:\n label: Is there an existing issue for this?\n description: Please search to see if an open or closed issue already exists for the bug you encountered. If a bug exists and is closed note that it may only be fixed in an unstable branch. \n options:\n - label: I have searched the existing open and closed issues\n required: true\n- type: textarea\n attributes:\n label: Current Behavior\n description: A concise description of what you're experiencing.\n validations:\n required: true\n- type: textarea\n attributes:\n label: Expected Behavior\n description: A concise description of what you expected to happen.\n validations:\n required: true\n- type: textarea\n attributes:\n label: Steps To Reproduce\n description: Steps to reproduce the behavior.\n placeholder: |\n 1. In this environment...\n 2. With this config...\n 3. Run '...'\n 4. See error...\n validations:\n required: false\n- type: textarea\n attributes:\n label: Environment\n description: |\n examples:\n - **OS**: Ubuntu 20.04\n - **Radarr**: Radarr 3.0.1.4259\n - **Docker Install**: Yes\n - **Using Reverse Proxy**: No\n - **Browser**: Firefox 90 (If UI related)\n - **Database**: Sqlite 3.36.0\n value: |\n - OS: \n - Radarr: \n - Docker Install: \n - Using Reverse Proxy: \n - Browser: \n - Database: \n render: markdown\n validations:\n required: true\n- type: dropdown\n attributes:\n label: What branch are you running?\n options:\n - Master\n - Develop\n - Nightly\n - Other (This issue will be closed)\n validations:\n required: true\n- type: textarea\n attributes:\n label: Trace Logs? **Not Optional**\n description: |\n Trace Logs (https://wiki.servarr.com/radarr/troubleshooting#logging-and-log-files) \n ***Generally speaking, all bug reports MUST have trace logs provided.***\n Tip: You can attach images or log files by clicking this area to highlight it and then dragging files in.\n Additionally, any additional info? Screenshots? References? Anything that will give us more context about the issue you are encountering!\n validations:\n required: true\n- type: checkboxes\n attributes:\n label: Trace Logs have been provided as applicable. Reports will be closed if the required logs are not provided.\n description: Trace logs are **generally required** and are not optional for all bug reports and contain `trace`. Info logs are invalid for bug reports and do not contain `debug` nor `trace`\n options:\n - label: I have read and followed the steps in the wiki link above and provided the required trace logs - the logs contain `trace` - that are relevant and show this issue.\n required: true\n | dataset_sample\yaml\Radarr_Radarr\.github\ISSUE_TEMPLATE\bug_report.yml | bug_report.yml | YAML | 2,975 | 0.95 | 0.085366 | 0.012195 | node-utils | 982 | 2024-02-07T15:02:46.013215 | MIT | false | 3ff50823e0222223c9c70ebfd091cc83 |
blank_issues_enabled: false\ncontact_links:\n - name: Support via Discord\n url: https://radarr.video/discord\n about: Chat with users and devs on support and setup related topics.\n | dataset_sample\yaml\Radarr_Radarr\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 184 | 0.8 | 0 | 0 | vue-tools | 934 | 2023-10-02T14:49:49.384429 | MIT | false | 52d221b48731cabc3b6814bb63734aa0 |
name: Feature Request\ndescription: 'Suggest an idea for Radarr'\nlabels: ['Type: Feature Request', 'Status: Needs Triage']\nbody:\n- type: checkboxes\n attributes:\n label: Is there an existing issue for this?\n description: Please search to see if an open or closed issue already exists for the feature you are requesting. If a request exists and is closed note that it may only be fixed in an unstable branch.\n options:\n - label: I have searched the existing open and closed issues\n required: true\n- type: textarea\n attributes:\n label: Is your feature request related to a problem? Please describe\n description: A clear and concise description of what the problem is.\n validations:\n required: true\n- type: textarea\n attributes:\n label: Describe the solution you'd like\n description: A clear and concise description of what you want to happen.\n validations:\n required: true\n- type: textarea\n attributes:\n label: Describe alternatives you've considered\n description: A clear and concise description of any alternative solutions or features you've considered.\n validations:\n required: true\n- type: textarea\n attributes:\n label: Anything else?\n description: |\n Links? References? Mockups? Anything that will give us more context about the feature you are encountering!\n \n Tip: You can attach images or log files by clicking this area to highlight it and then dragging files in.\n validations:\n required: true\n | dataset_sample\yaml\Radarr_Radarr\.github\ISSUE_TEMPLATE\feature_request.yml | feature_request.yml | YAML | 1,477 | 0.85 | 0.105263 | 0 | python-kit | 205 | 2025-06-05T01:17:08.778359 | Apache-2.0 | false | 083a93e82c177a78062579280c108f3e |
name: 'Label Actions'\n\non:\n issues:\n types: [labeled, unlabeled]\n\npermissions:\n contents: read\n issues: write\n\njobs:\n action:\n runs-on: ubuntu-latest\n steps:\n - uses: dessant/label-actions@v3\n with:\n process-only: 'issues'\n | dataset_sample\yaml\Radarr_Radarr\.github\workflows\label-actions.yml | label-actions.yml | YAML | 257 | 0.7 | 0 | 0 | react-lib | 482 | 2024-03-04T08:47:30.124620 | GPL-3.0 | false | f2d4ff5ff57db5e26208ef2be0d2e82d |
name: "Pull Request Labeler"\non:\n - pull_request_target\n\njobs:\n triage:\n permissions:\n contents: read\n pull-requests: write\n runs-on: ubuntu-latest\n steps:\n - uses: actions/labeler@v4\n | dataset_sample\yaml\Radarr_Radarr\.github\workflows\labeler.yml | labeler.yml | YAML | 210 | 0.7 | 0 | 0 | node-utils | 356 | 2025-06-26T22:29:56.011712 | BSD-3-Clause | false | 8fe775411ceb2d4a26a858584534f1bd |
name: 'Lock threads'\n\non:\n workflow_dispatch:\n schedule:\n - cron: '0 0 * * *'\n\npermissions: {}\njobs:\n lock:\n permissions:\n issues: write # to lock issues (dessant/lock-threads)\n pull-requests: write # to lock PRs (dessant/lock-threads)\n\n runs-on: ubuntu-latest\n steps:\n - uses: dessant/lock-threads@v4\n with:\n github-token: ${{ github.token }}\n issue-inactive-days: '90'\n exclude-issue-created-before: ''\n exclude-any-issue-labels: ''\n add-issue-labels: ''\n issue-comment: ''\n issue-lock-reason: 'resolved'\n process-only: ''\n | dataset_sample\yaml\Radarr_Radarr\.github\workflows\lock.yml | lock.yml | YAML | 634 | 0.8 | 0 | 0 | node-utils | 475 | 2024-07-12T06:44:23.563266 | BSD-3-Clause | false | 78b9c22a5d7e6b90f0b21f9dfbaa4928 |
plugins:\n - rubocop-minitest\n - rubocop-packaging\n - rubocop-performance\n - rubocop-rails\n - rubocop-md\n\nAllCops:\n # RuboCop has a bunch of cops enabled by default. This setting tells RuboCop\n # to ignore them, so only the ones explicitly set in this file are enabled.\n DisabledByDefault: true\n SuggestExtensions: false\n Exclude:\n - '**/tmp/**/*'\n - '**/templates/**/*'\n - '**/vendor/**/*'\n - 'actionmailbox/test/dummy/**/*'\n - 'activestorage/test/dummy/**/*'\n - 'actiontext/test/dummy/**/*'\n - 'tools/rail_inspector/test/fixtures/*'\n - guides/source/debugging_rails_applications.md\n - guides/source/active_support_instrumentation.md\n - '**/node_modules/**/*'\n - '**/CHANGELOG.md'\n - '**/2_*_release_notes.md'\n - '**/3_*_release_notes.md'\n - '**/4_*_release_notes.md'\n - '**/5_*_release_notes.md'\n - '**/6_*_release_notes.md'\n\n\nPerformance:\n Exclude:\n - '**/test/**/*'\n\n# Prefer assert_not over assert !\nRails/AssertNot:\n Include:\n - '**/test/**/*'\n\n# Prefer assert_not_x over refute_x\nRails/RefuteMethods:\n Include:\n - '**/test/**/*'\n\nRails/IndexBy:\n Enabled: true\n\nRails/IndexWith:\n Enabled: true\n\n# Prefer &&/|| over and/or.\nStyle/AndOr:\n Enabled: true\n\nLayout/ClosingHeredocIndentation:\n Enabled: true\n\nLayout/ClosingParenthesisIndentation:\n Enabled: true\n\n# Align comments with method definitions.\nLayout/CommentIndentation:\n Enabled: true\n\nLayout/DefEndAlignment:\n Enabled: true\n\nLayout/ElseAlignment:\n Enabled: true\n\n# Align `end` with the matching keyword or starting expression except for\n# assignments, where it should be aligned with the LHS.\nLayout/EndAlignment:\n Enabled: true\n EnforcedStyleAlignWith: variable\n AutoCorrect: true\n\nLayout/EndOfLine:\n Enabled: true\n\nLayout/EmptyLineAfterMagicComment:\n Enabled: true\n\nLayout/EmptyLinesAroundAccessModifier:\n Enabled: true\n EnforcedStyle: only_before\n\nLayout/EmptyLinesAroundBlockBody:\n Enabled: true\n\n# In a regular class definition, no empty lines around the body.\nLayout/EmptyLinesAroundClassBody:\n Enabled: true\n\n# In a regular method definition, no empty lines around the body.\nLayout/EmptyLinesAroundMethodBody:\n Enabled: true\n\n# In a regular module definition, no empty lines around the body.\nLayout/EmptyLinesAroundModuleBody:\n Enabled: true\n\n# Use Ruby >= 1.9 syntax for hashes. Prefer { a: :b } over { :a => :b }.\nStyle/HashSyntax:\n Enabled: true\n EnforcedShorthandSyntax: either\n\n# Method definitions after `private` or `protected` isolated calls need one\n# extra level of indentation.\nLayout/IndentationConsistency:\n Enabled: true\n EnforcedStyle: indented_internal_methods\n Exclude:\n - '**/*.md'\n\n# Two spaces, no tabs (for indentation).\nLayout/IndentationWidth:\n Enabled: true\n\nLayout/LeadingCommentSpace:\n Enabled: true\n\nLayout/SpaceAfterColon:\n Enabled: true\n\nLayout/SpaceAfterComma:\n Enabled: true\n\nLayout/SpaceAfterSemicolon:\n Enabled: true\n\nLayout/SpaceAroundEqualsInParameterDefault:\n Enabled: true\n\nLayout/SpaceAroundKeyword:\n Enabled: true\n\nLayout/SpaceAroundOperators:\n Enabled: true\n\nLayout/SpaceBeforeComma:\n Enabled: true\n\nLayout/SpaceBeforeComment:\n Enabled: true\n\nLayout/SpaceBeforeFirstArg:\n Enabled: true\n\nStyle/DefWithParentheses:\n Enabled: true\n\n# Defining a method with parameters needs parentheses.\nStyle/MethodDefParentheses:\n Enabled: true\n\nStyle/ExplicitBlockArgument:\n Enabled: true\n\nStyle/FrozenStringLiteralComment:\n Enabled: true\n EnforcedStyle: always\n Exclude:\n - 'actionview/test/**/*.builder'\n - 'actionview/test/**/*.ruby'\n - 'actionpack/test/**/*.builder'\n - 'actionpack/test/**/*.ruby'\n - 'activestorage/db/migrate/**/*.rb'\n - 'activestorage/db/update_migrate/**/*.rb'\n - 'actionmailbox/db/migrate/**/*.rb'\n - 'actiontext/db/migrate/**/*.rb'\n - '**/*.md'\n\nStyle/MapToHash:\n Enabled: true\n\nStyle/RedundantFreeze:\n Enabled: true\n\n# Use `foo {}` not `foo{}`.\nLayout/SpaceBeforeBlockBraces:\n Enabled: true\n\n# Use `foo { bar }` not `foo {bar}`.\nLayout/SpaceInsideBlockBraces:\n Enabled: true\n EnforcedStyleForEmptyBraces: space\n\n# Use `{ a: 1 }` not `{a:1}`.\nLayout/SpaceInsideHashLiteralBraces:\n Enabled: true\n\nLayout/SpaceInsideParens:\n Enabled: true\n\n# Check quotes usage according to lint rule below.\nStyle/StringLiterals:\n Enabled: true\n EnforcedStyle: double_quotes\n\n# Detect hard tabs, no hard tabs.\nLayout/IndentationStyle:\n Enabled: true\n\n# Empty lines should not have any spaces.\nLayout/TrailingEmptyLines:\n Enabled: true\n\n# No trailing whitespace.\nLayout/TrailingWhitespace:\n Enabled: true\n\n# Use quotes for string literals when they are enough.\nStyle/RedundantPercentQ:\n Enabled: true\n\nLint/NestedMethodDefinition:\n Enabled: true\n\nLint/AmbiguousOperator:\n Enabled: true\n\nLint/AmbiguousRegexpLiteral:\n Enabled: true\n\nLint/Debugger:\n Enabled: true\n DebuggerRequires:\n - debug\n\nLint/DuplicateRequire:\n Enabled: true\n\nLint/DuplicateMagicComment:\n Enabled: true\n\nLint/DuplicateMethods:\n Enabled: true\n\nLint/ErbNewArguments:\n Enabled: true\n\nLint/EnsureReturn:\n Enabled: true\n\nLint/MissingCopEnableDirective:\n Enabled: true\n\n# Use my_method(my_arg) not my_method( my_arg ) or my_method my_arg.\nLint/RequireParentheses:\n Enabled: true\n\nLint/RedundantCopDisableDirective:\n Enabled: true\n\nLint/RedundantCopEnableDirective:\n Enabled: true\n\nLint/RedundantRequireStatement:\n Enabled: true\n\nLint/RedundantStringCoercion:\n Enabled: true\n\nLint/RedundantSafeNavigation:\n Enabled: true\n\nLint/UriEscapeUnescape:\n Enabled: true\n\nLint/UselessAssignment:\n Enabled: true\n\nLint/DeprecatedClassMethods:\n Enabled: true\n\nLint/InterpolationCheck:\n Enabled: true\n Exclude:\n - '**/test/**/*'\n\nLint/SafeNavigationChain:\n Enabled: true\n\nStyle/EvalWithLocation:\n Enabled: true\n Exclude:\n - '**/test/**/*'\n\nStyle/ParenthesesAroundCondition:\n Enabled: true\n\nStyle/HashTransformKeys:\n Enabled: true\n\nStyle/HashTransformValues:\n Enabled: true\n\nStyle/RedundantBegin:\n Enabled: true\n\nStyle/RedundantReturn:\n Enabled: true\n AllowMultipleReturnValues: true\n\nStyle/RedundantRegexpEscape:\n Enabled: true\n\nStyle/Semicolon:\n Enabled: true\n AllowAsExpressionSeparator: true\n\n# Prefer Foo.method over Foo::method\nStyle/ColonMethodCall:\n Enabled: true\n\nStyle/TrivialAccessors:\n Enabled: true\n\n# Prefer a = b || c over a = b ? b : c\nStyle/RedundantCondition:\n Enabled: true\n\nStyle/RedundantDoubleSplatHashBraces:\n Enabled: true\n\nStyle/OpenStructUse:\n Enabled: true\n\nStyle/ArrayIntersect:\n Enabled: true\n\nStyle/KeywordArgumentsMerging:\n Enabled: true\n\nPerformance/BindCall:\n Enabled: true\n\nPerformance/FlatMap:\n Enabled: true\n\nPerformance/MapCompact:\n Enabled: true\n\nPerformance/SelectMap:\n Enabled: true\n\nPerformance/RedundantMerge:\n Enabled: true\n\nPerformance/StartWith:\n Enabled: true\n\nPerformance/EndWith:\n Enabled: true\n\nPerformance/RegexpMatch:\n Enabled: true\n\nPerformance/ReverseEach:\n Enabled: true\n\nPerformance/StringReplacement:\n Enabled: true\n\nPerformance/DeletePrefix:\n Enabled: true\n\nPerformance/DeleteSuffix:\n Enabled: true\n\nPerformance/InefficientHashSearch:\n Enabled: true\n\nPerformance/ConstantRegexp:\n Enabled: true\n\nPerformance/RedundantStringChars:\n Enabled: true\n\nPerformance/StringInclude:\n Enabled: true\n\nMinitest/AssertNil:\n Enabled: true\n\nMinitest/AssertRaisesWithRegexpArgument:\n Enabled: true\n\nMinitest/AssertWithExpectedArgument:\n Enabled: true\n\nMinitest/LiteralAsActualArgument:\n Enabled: true\n\nMinitest/NonExecutableTestMethod:\n Enabled: true\n\nMinitest/SkipEnsure:\n Enabled: true\n\nMinitest/UnreachableAssertion:\n Enabled: true\n\nMarkdown:\n # Whether to run RuboCop against non-valid snippets\n WarnInvalid: true\n # Whether to lint codeblocks without code attributes\n Autodetect: false\n | dataset_sample\yaml\rails_rails\.rubocop.yml | .rubocop.yml | YAML | 7,706 | 0.95 | 0.012195 | 0.096346 | python-kit | 610 | 2025-06-19T03:29:28.829574 | MIT | false | 4a70813d33feb85a2dbe446e818da1a0 |
# This list was intially created by analyzing the last three months (51\n# modules) committed to Metasploit Framework. Many, many older modules\n# will have offenses, but this should at least provide a baseline for\n# new modules.\n#\n# Updates to this file should include a 'Description' parameter for any\n# explanation needed.\n\n# inherit_from: .rubocop_todo.yml\n\nAllCops:\n TargetRubyVersion: 2.7\n SuggestExtensions: false\n NewCops: disable\n\nrequire:\n - ./lib/rubocop/cop/layout/module_hash_on_new_line.rb\n - ./lib/rubocop/cop/layout/module_hash_values_on_same_line.rb\n - ./lib/rubocop/cop/layout/module_description_indentation.rb\n - ./lib/rubocop/cop/layout/extra_spacing_with_bindata_ignored.rb\n - ./lib/rubocop/cop/lint/module_disclosure_date_format.rb\n - ./lib/rubocop/cop/lint/module_disclosure_date_present.rb\n - ./lib/rubocop/cop/lint/deprecated_gem_version.rb\n - ./lib/rubocop/cop/lint/module_enforce_notes.rb\n - ./lib/rubocop/cop/lint/detect_invalid_pack_directives.rb\n\nLayout/SpaceBeforeBrackets:\n Description: >-\n Disabled as it generates invalid code:\n https://github.com/rubocop-hq/rubocop/issues/9499\n Enabled: false\n\nLint/AmbiguousAssignment:\n Enabled: true\n\nLint/DeprecatedConstants:\n Enabled: true\n\nLint/DuplicateBranch:\n Description: >-\n Disabled as it causes a lot of noise around our current exception/error handling\n Enabled: false\n\nLint/DuplicateRegexpCharacterClassElement:\n Enabled: false\n\nLint/EmptyBlock:\n Enabled: false\n\nLint/EmptyClass:\n Enabled: false\n\nLint/LambdaWithoutLiteralBlock:\n Enabled: true\n\nLint/NoReturnInBeginEndBlocks:\n Enabled: true\n\nLint/NumberedParameterAssignment:\n Enabled: true\n\nLint/OrAssignmentToConstant:\n Enabled: true\n\nLint/RedundantDirGlobSort:\n Enabled: true\n\nLint/SymbolConversion:\n Enabled: true\n\nLint/ToEnumArguments:\n Enabled: true\n\nLint/TripleQuotes:\n Enabled: true\n\nLint/UnexpectedBlockArity:\n Enabled: true\n\nLint/UnmodifiedReduceAccumulator:\n Enabled: true\n\nLint/UnusedMethodArgument:\n Description: >-\n Disabled on files under the lib/ directory (aka library files)\n as this can break YARD documentation since YARD doesn't recognize\n the _ prefix before parameter names and thinks its a different argument.\n See https://github.com/rapid7/metasploit-framework/pull/17735\n Also see https://github.com/rubocop/rubocop/pull/11020\n Enabled: true\n Exclude:\n - 'lib/**/*'\n\nStyle/ArgumentsForwarding:\n Enabled: true\n\nStyle/BlockComments:\n Description: >-\n Disabled as multiline comments are great for embedded code snippets/payloads that can\n be copy/pasted directly into a terminal etc.\n Enabled: false\n\nStyle/CaseLikeIf:\n Description: >-\n This would cause a lot of noise, and potentially introduce subtly different code when\n being auto fixed. Could potentially be enabled in isolation, but would require more\n consideration.\n Enabled: false\n\nStyle/CollectionCompact:\n Enabled: true\n\nStyle/DocumentDynamicEvalDefinition:\n Enabled: false\n\nStyle/EndlessMethod:\n Enabled: true\n\nStyle/HashExcept:\n Enabled: true\n\nStyle/IfWithBooleanLiteralBranches:\n Description: >-\n Most of the time this is a valid replacement. Although it can generate subtly different\n rewrites that might break code:\n 2.7.2 :001 > foo = nil\n => nil\n 2.7.2 :002 > (foo && foo['key'] == 'foo') ? true : false\n => false\n 2.7.2 :003 > foo && foo['key'] == 'foo'\n => nil\n Enabled: false\n\nStyle/NegatedIfElseCondition:\n Enabled: false\n\nStyle/MultipleComparison:\n Description: >-\n Disabled as it generates invalid code:\n https://github.com/rubocop-hq/rubocop/issues/9520\n It may also introduce subtle semantic issues if automatically applied to the\n entire codebase without rigorous testing.\n Enabled: false\n\nStyle/NilLambda:\n Enabled: true\n\nStyle/RedundantArgument:\n Enabled: false\n\nStyle/RedundantAssignment:\n Description: >-\n Disabled as it sometimes improves the readability of code having an explicitly named\n response object, it also makes it easier to put a breakpoint between the assignment\n and return expression\n Enabled: false\n\nStyle/SwapValues:\n Enabled: false\n\nLayout/ModuleHashOnNewLine:\n Enabled: true\n\nLayout/ModuleHashValuesOnSameLine:\n Enabled: true\n\nLayout/ModuleDescriptionIndentation:\n Enabled: true\n\nLint/DetectInvalidPackDirectives:\n Enabled: true\n\nLint/ModuleDisclosureDateFormat:\n Enabled: true\n\nLint/ModuleDisclosureDatePresent:\n Include:\n # Only exploits require disclosure dates, but they can be present in auxiliary modules etc.\n - 'modules/exploits/**/*'\n\nLint/ModuleEnforceNotes:\n Include:\n # Only exploits and auxiliary modules require SideEffects to be listed.\n - 'modules/exploits/**/*'\n - 'modules/auxiliary/**/*'\n - 'modules/post/**/*'\n\nLint/DeprecatedGemVersion:\n Enabled: true\n Exclude:\n - 'metasploit-framework.gemspec'\n\nMetrics/ModuleLength:\n Description: 'Most Metasploit modules are quite large. This is ok.'\n Enabled: false\n\nMetrics/ClassLength:\n Description: 'Most Metasploit classes are quite large. This is ok.'\n Enabled: false\n\nStyle/ClassAndModuleChildren:\n Enabled: false\n Description: 'Forced nesting is harmful for grepping and general code comprehension'\n\nMetrics/AbcSize:\n Enabled: false\n Description: 'This is often a red-herring'\n\nMetrics/CyclomaticComplexity:\n Enabled: false\n Description: 'This is often a red-herring'\n\nMetrics/PerceivedComplexity:\n Enabled: false\n Description: 'This is often a red-herring'\n\nMetrics/BlockNesting:\n Description: >-\n This is a good rule to follow, but will cause a lot of overhead introducing this rule.\n Enabled: false\n\nMetrics/ParameterLists:\n Description: >-\n This is a good rule to follow, but will cause a lot of overhead introducing this rule.\n Increasing the max count for now\n Max: 8\n\nStyle/TernaryParentheses:\n Enabled: false\n Description: 'This outright produces bugs'\n\nStyle/FrozenStringLiteralComment:\n Enabled: false\n Description: 'We cannot support this yet without a lot of things breaking'\n\nStyle/MutableConstant:\n Enabled: false\n Description: 'We cannot support this yet without a lot of things breaking'\n\nStyle/RedundantReturn:\n Description: 'This often looks weird when mixed with actual returns, and hurts nothing'\n Enabled: false\n\nNaming/HeredocDelimiterNaming:\n Description: >-\n Could be enabled in isolation with additional effort.\n Enabled: false\n\nNaming/AccessorMethodName:\n Description: >-\n Disabled for now, as this naming convention is used in a lot of core library files.\n Could be enabled in isolation with additional effort.\n Enabled: false\n\nNaming/ConstantName:\n Description: >-\n Disabled for now, Metasploit is unfortunately too inconsistent with its naming to introduce\n this. Definitely possible to enforce this in the future if need be.\n\n Examples:\n ManualRanking, LowRanking, etc.\n NERR_ClientNameNotFound\n HttpFingerprint\n CachedSize\n ErrUnknownTransferId\n Enabled: false\n\nNaming/VariableNumber:\n Description: 'To make it easier to use reference code, disable this cop'\n Enabled: false\n\nStyle/NumericPredicate:\n Description: 'This adds no efficiency nor space saving'\n Enabled: false\n\nStyle/EvenOdd:\n Description: 'This adds no efficiency nor space saving'\n Enabled: false\n\nStyle/FloatDivision:\n Description: 'Not a safe rule to run on Metasploit without manual verification as the right hand side may be a string'\n Enabled: false\n\nStyle/FormatString:\n Description: 'Not a safe rule to run on Metasploit without manual verification that the format is not redefined/shadowed'\n Enabled: false\n\nStyle/Documentation:\n Enabled: true\n Description: 'Most Metasploit modules do not have class documentation.'\n Exclude:\n - 'modules/**/*'\n - 'test/modules/**/*'\n - 'spec/file_fixtures/modules/**/*'\n\nLayout/FirstArgumentIndentation:\n Enabled: true\n EnforcedStyle: consistent\n Description: 'Useful for the module hash to be indented consistently'\n\nLayout/ArgumentAlignment:\n Enabled: true\n EnforcedStyle: with_first_argument\n Description: 'Useful for the module hash to be indented consistently'\n\nLayout/FirstHashElementIndentation:\n Enabled: true\n EnforcedStyle: consistent\n Description: 'Useful for the module hash to be indented consistently'\n\nLayout/FirstHashElementLineBreak:\n Enabled: true\n Description: 'Enforce consistency by breaking hash elements on to new lines'\n\nLayout/SpaceInsideArrayLiteralBrackets:\n Enabled: false\n Description: 'Almost all module metadata have space in brackets'\n\nStyle/GuardClause:\n Enabled: false\n Description: 'This often introduces bugs in tested code'\n\nStyle/EmptyLiteral:\n Enabled: false\n Description: 'This looks awkward when you mix empty and non-empty literals'\n\nStyle/NegatedIf:\n Enabled: false\n Description: 'This often introduces bugs in tested code'\n\nStyle/ConditionalAssignment:\n Enabled: false\n Description: 'This is confusing for folks coming from other languages'\n\nStyle/Encoding:\n Description: 'We prefer binary to UTF-8.'\n Enabled: false\n\nStyle/ParenthesesAroundCondition:\n Enabled: false\n Description: 'This is used in too many places to discount, especially in ported code. Has little effect'\n\nStyle/StringConcatenation:\n Enabled: false\n Description: >-\n Disabled for now as it changes escape sequences when auto corrected:\n https://github.com/rubocop/rubocop/issues/9543\n\n Additionally seems to break with multiline string concatenation with trailing comments, example:\n payload = "\x12" + # Size\n "\x34" + # eip\n "\x56" # etc\n With `rubocop -A` this will become:\n payload = "\u00124V" # etc\n\nStyle/TrailingCommaInArrayLiteral:\n Enabled: false\n Description: 'This is often a useful pattern, and is actually required by other languages. It does not hurt.'\n\nLayout/LineLength:\n Description: >-\n Metasploit modules often pattern match against very\n long strings when identifying targets.\n Enabled: false\n\nMetrics/BlockLength:\n Enabled: true\n Description: >-\n While the style guide suggests 10 lines, exploit definitions\n often exceed 200 lines.\n Max: 300\n\nMetrics/MethodLength:\n Enabled: true\n Description: >-\n While the style guide suggests 10 lines, exploit definitions\n often exceed 200 lines.\n Max: 300\n\nNaming/MethodParameterName:\n Enabled: true\n Description: 'Whoever made this requirement never looked at crypto methods, IV'\n MinNameLength: 2\n\nNaming/PredicateName:\n Enabled: true\n # Current methods that break the rule, so that we don't add additional methods that break the convention\n AllowedMethods:\n - has_additional_info?\n - has_advanced_options?\n - has_auth\n - has_auto_target?\n - has_bad_activex?\n - has_badchars?\n - has_chars?\n - has_check?\n - has_command?\n - has_content_type_extension?\n - has_datastore_cred?\n - has_evasion_options?\n - has_fatal_errors?\n - has_fields\n - has_files?\n - has_flag?\n - has_function_name?\n - has_gcc?\n - has_h2_headings\n - has_input_name?\n - has_j_security_check?\n - has_key?\n - has_match?\n - has_module\n - has_object_ref\n - has_objects_list\n - has_options?\n - has_page?\n - has_passphrase?\n - has_pid?\n - has_pkt_line_data?\n - has_prereqs?\n - has_privacy_waiver?\n - has_privates?\n - has_protected_mode_prompt?\n - has_proxy?\n - has_read_data?\n - has_ref?\n - has_required_args\n - has_required_module_options?\n - has_requirements\n - has_rop?\n - has_s_flag?\n - has_service_cred?\n - has_subscriber?\n - has_subtree?\n - has_text\n - has_tlv?\n - has_u_flag?\n - has_users?\n - has_vuln?\n - has_waiver?\n - have_auth_error?\n - have_powershell?\n - is_accessible?\n - is_admin?\n - is_alive?\n - is_alpha_web_server?\n - is_android?\n - is_app_binom3?\n - is_app_carlogavazzi?\n - is_app_cnpilot?\n - is_app_epaduo?\n - is_app_epmp1000?\n - is_app_infovista?\n - is_app_ironport?\n - is_app_metweblog?\n - is_app_oilom?\n - is_app_openmind?\n - is_app_popad?\n - is_app_radware?\n - is_app_rfreader?\n - is_app_sentry?\n - is_app_sevone?\n - is_app_splunk?\n - is_app_ssl_vpn?\n - is_array_type?\n - is_auth_required?\n - is_author_blacklisted?\n - is_badchar\n - is_base64?\n - is_bind?\n - is_cached_size_accurate?\n - is_cgi_enabled?\n - is_cgi_exploitable?\n - is_check_interesting?\n - is_child_of?\n - is_clr_enabled\n - is_connect?\n - is_dlink?\n - is_dn?\n - is_dynamic?\n - is_error_code\n - is_exception?\n - is_exploit_module?\n - is_exploitable?\n - is_fqdn?\n - is_glob?\n - is_groupwise?\n - is_guest_mode_enabled?\n - is_hash_from_empty_pwd?\n - is_high_integrity?\n - is_hostname?\n - is_ie?\n - is_imc?\n - is_imc_som?\n - is_in_admin_group?\n - is_interface?\n - is_ip_targeted?\n - is_key_wanted?\n - is_leaf?\n - is_local?\n - is_logged_in?\n - is_loggedin\n - is_loopback_address?\n - is_mac?\n - is_match\n - is_md5_format?\n - is_module_arch?\n - is_module_platform?\n - is_module_wanted?\n - is_multi_platform_exploit?\n - is_not_null?\n - is_null_pointer\n - is_null_pointer?\n - is_num?\n - is_num_type?\n - is_numeric\n - is_online?\n - is_parseable\n - is_pass_ntlm_hash?\n - is_passwd_method?\n - is_password_required?\n - is_payload_compatible?\n - is_payload_platform_compatible?\n - is_pointer_type?\n - is_pri_key?\n - is_proficy?\n - is_rdp_up\n - is_remote_exploit?\n - is_resource_taken?\n - is_rf?\n - is_rmi?\n - is_root?\n - is_routable?\n - is_running?\n - is_scan_complete\n - is_secure_admin_disabled?\n - is_session_type?\n - is_signature_correct?\n - is_single_object?\n - is_struct_type?\n - is_supermicro?\n - is_superuser?\n - is_sws?\n - is_system?\n - is_system_user?\n - is_target?\n - is_target_suitable?\n - is_trial_enabled?\n - is_trustworthy\n - is_uac_enabled?\n - is_url_alive\n - is_usable?\n - is_uuid?\n - is_valid?\n - is_valid_bus?\n - is_valid_snmp_value\n - is_value_wanted?\n - is_version_compat?\n - is_version_tested?\n - is_vmware?\n - is_vul\n - is_vulnerable?\n - is_warbird?\n - is_windows?\n - is_writable\n - is_writable?\n - is_x86?\n - is_zigbee_hwbridge_session?\n\n# %q() is super useful for long strings split over multiple lines and\n# is very common in module constructors for things like descriptions\nStyle/RedundantPercentQ:\n Enabled: false\n\nStyle/NumericLiterals:\n Enabled: false\n Description: 'This often hurts readability for exploit-ish code.'\n\nLayout/FirstArrayElementLineBreak:\n Enabled: true\n Description: 'This cop checks for a line break before the first element in a multi-line array.'\n\nLayout/FirstArrayElementIndentation:\n Enabled: true\n EnforcedStyle: consistent\n Description: 'Useful to force values within the register_options array to have sane indentation'\n\nLayout/EmptyLinesAroundClassBody:\n Enabled: false\n Description: 'these are used to increase readability'\n\nLayout/EmptyLinesAroundMethodBody:\n Enabled: true\n\nLayout/ExtraSpacingWithBinDataIgnored:\n Description: 'Do not use unnecessary spacing.'\n Enabled: true\n # When true, allows most uses of extra spacing if the intent is to align\n # things with the previous or next line, not counting empty lines or comment\n # lines.\n AllowForAlignment: false\n # When true, allows things like 'obj.meth(arg) # comment',\n # rather than insisting on 'obj.meth(arg) # comment'.\n # If done for alignment, either this OR AllowForAlignment will allow it.\n AllowBeforeTrailingComments: true\n # When true, forces the alignment of `=` in assignments on consecutive lines.\n ForceEqualSignAlignment: false\n\nStyle/For:\n Enabled: false\n Description: 'if a module is written with a for loop, it cannot always be logically replaced with each'\n\nStyle/WordArray:\n Enabled: false\n Description: 'Metasploit prefers consistent use of []'\n\nStyle/IfUnlessModifier:\n Enabled: false\n Description: 'This style might save a couple of lines, but often makes code less clear'\n\nStyle/PercentLiteralDelimiters:\n Description: 'Use `%`-literal delimiters consistently.'\n Enabled: true\n # Specify the default preferred delimiter for all types with the 'default' key\n # Override individual delimiters (even with default specified) by specifying\n # an individual key\n PreferredDelimiters:\n default: ()\n '%i': '[]'\n '%I': '[]'\n '%r': '{}'\n '%w': '[]'\n '%W': '[]'\n '%q': '{}' # Chosen for module descriptions as () are frequently used characters, whilst {} are rarely used\n VersionChanged: '0.48.1'\n\nStyle/RedundantBegin:\n Enabled: true\n\nStyle/SafeNavigation:\n Description: >-\n This cop transforms usages of a method call safeguarded by\n a check for the existence of the object to\n safe navigation (`&.`).\n\n This has been disabled as in some scenarios it produced invalid code, and disobeyed the 'AllowedMethods'\n configuration.\n Enabled: false\n\nStyle/UnpackFirst:\n Description: >-\n Disabling to make it easier to copy/paste `unpack('h*')` expressions from code\n into a debugging REPL.\n Enabled: false\n | dataset_sample\yaml\rapid7_metasploit-framework\.rubocop.yml | .rubocop.yml | YAML | 17,239 | 0.95 | 0.039755 | 0.041441 | node-utils | 592 | 2024-03-05T19:41:19.795734 | MIT | false | b8b48571ef6b59dae4096078abd8f977 |
---\ninclude:\n- "**/*.rb"\nexclude:\n- spec/**/*\n- test/**/*\n- vendor/**/*\n- ".bundle/**/*"\n- modules/**/*\n- data/**/*\n- db/**/*\n- external/**/*\n- plugins/**/*\n- scripts/**/* # Some of this is old and may not need indexing???\nrequire: []\ndomains: []\nreporters:\n- rubocop\n- require_not_found\nformatter:\n rubocop:\n cops: safe\n except: []\n only: []\n extra_args: []\nrequire_paths: []\nplugins: []\nmax_files: 0\n | dataset_sample\yaml\rapid7_metasploit-framework\.solargraph.yml | .solargraph.yml | YAML | 415 | 0.95 | 0 | 0 | node-utils | 355 | 2024-12-29T16:49:33.539247 | GPL-3.0 | false | 26524a51de208f031a936644db96e253 |
services:\n ms:\n build:\n context: .\n dockerfile: ./Dockerfile\n args:\n BUNDLER_ARGS: --jobs=8\n image: metasploit:dev\n environment:\n DATABASE_URL: postgres://postgres@db:5432/msf_dev?pool=200&timeout=5\n volumes:\n - .:/usr/src/metasploit-framework\n | dataset_sample\yaml\rapid7_metasploit-framework\docker-compose.override.yml | docker-compose.override.yml | YAML | 289 | 0.8 | 0 | 0 | awesome-app | 786 | 2023-10-29T21:17:08.190890 | BSD-3-Clause | false | 0367231f82fded554e69c0502a74ac23 |
services:\n ms:\n image: metasploitframework/metasploit-framework:latest\n environment:\n DATABASE_URL: postgres://postgres@db:5432/msf?pool=200&timeout=5\n links:\n - db\n ports:\n - 4444:4444\n volumes:\n - $HOME/.msf4:/home/msf/.msf4\n\n db:\n image: postgres:10-alpine\n volumes:\n - pg_data:/var/lib/postgresql/data\n environment:\n POSTGRES_HOST_AUTH_METHOD: trust\n\nvolumes:\n pg_data:\n driver: local\n | dataset_sample\yaml\rapid7_metasploit-framework\docker-compose.yml | docker-compose.yml | YAML | 449 | 0.8 | 0 | 0 | react-lib | 831 | 2024-12-23T22:27:01.272484 | BSD-3-Clause | false | 21aaca9b84738aac04d3954ca08be948 |
coverage: # https://docs.codecov.com/docs/codecovyml-reference#coverage\n precision: 1 # e.g. 89.1%\n round: down\n range: 85..100 # https://docs.codecov.com/docs/coverage-configuration#section-range\n status: # https://docs.codecov.com/docs/commit-status\n project:\n default:\n threshold: 1% # Avoid false negatives\nignore:\n - "examples"\n - "benches"\ncomment: # https://docs.codecov.com/docs/pull-request-comments\n # make the comments less noisy\n require_changes: true\n | dataset_sample\yaml\ratatui_ratatui\codecov.yml | codecov.yml | YAML | 487 | 0.95 | 0 | 0.071429 | vue-tools | 910 | 2023-07-28T21:10:36.360670 | BSD-3-Clause | false | 75707c9a4bf7fc3f801d08b2f29cd2a9 |
# To get started with Dependabot version updates, you'll need to specify which\n# package ecosystems to update and where the package manifests are located.\n# Please see the documentation for all configuration options:\n# https://docs.github.com/github/administering-a-repository/configuration-options-for-dependency-updates\n\nversion: 2\nupdates:\n # Maintain dependencies for Cargo\n - package-ecosystem: "cargo"\n directory: "/" # Location of package manifests\n schedule:\n interval: "weekly"\n # Maintain dependencies for GitHub Actions\n - package-ecosystem: github-actions\n directory: "/"\n schedule:\n interval: weekly\n open-pull-requests-limit: 10\n | dataset_sample\yaml\ratatui_ratatui\.github\dependabot.yml | dependabot.yml | YAML | 672 | 0.8 | 0.222222 | 0.352941 | node-utils | 364 | 2024-03-25T13:32:15.059152 | Apache-2.0 | false | 6d003cd8c7f2c0cbc4d4097f04e593ea |
blank_issues_enabled: false\ncontact_links:\n - name: Frequently Asked Questions\n url: https://ratatui.rs/faq/\n about: Check the website FAQ section to see if your question has already been answered\n - name: Ratatui Forum\n url: https://forum.ratatui.rs\n about: Ask questions about ratatui on our Forum\n - name: Discord Chat\n url: https://discord.gg/pMCEU9hNEj\n about: Ask questions about ratatui on Discord\n - name: Matrix Chat\n url: https://matrix.to/#/#ratatui:matrix.org\n about: Ask questions about ratatui on Matrix\n | dataset_sample\yaml\ratatui_ratatui\.github\ISSUE_TEMPLATE\config.yml | config.yml | YAML | 546 | 0.8 | 0.071429 | 0 | vue-tools | 396 | 2024-07-19T02:36:14.874944 | Apache-2.0 | false | eb45b1659a32fac144c485c282529b3f |
name: Check Pull Requests\n\non:\n pull_request_target:\n types:\n - opened\n - edited\n - synchronize\n - labeled\n - unlabeled\n merge_group:\n\npermissions:\n pull-requests: write\n\njobs:\n check-title:\n runs-on: ubuntu-latest\n steps:\n - name: Check PR title\n if: github.event_name == 'pull_request_target'\n uses: amannn/action-semantic-pull-request@v5\n id: check_pr_title\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n # Add comment indicating we require pull request titles to follow conventional commits specification\n - uses: marocchino/sticky-pull-request-comment@v2\n if: always() && (steps.check_pr_title.outputs.error_message != null)\n with:\n header: pr-title-lint-error\n message: |\n Thank you for opening this pull request!\n\n We require pull request titles to follow the [Conventional Commits specification](https://www.conventionalcommits.org/en/v1.0.0/) and it looks like your proposed title needs to be adjusted.\n\n Details:\n\n > ${{ steps.check_pr_title.outputs.error_message }}\n\n # Delete a previous comment when the issue has been resolved\n - if: ${{ steps.check_pr_title.outputs.error_message == null }}\n uses: marocchino/sticky-pull-request-comment@v2\n with:\n header: pr-title-lint-error\n delete: true\n\n check-breaking-change-label:\n runs-on: ubuntu-latest\n env:\n # use an environment variable to pass untrusted input to the script\n # see https://securitylab.github.com/research/github-actions-untrusted-input/\n PR_TITLE: ${{ github.event.pull_request.title }}\n steps:\n - name: Check breaking change label\n id: check_breaking_change\n run: |\n pattern='^(build|chore|ci|docs|feat|fix|perf|refactor|revert|style|test)(\(\w+\))?!:'\n # Check if pattern matches\n if echo "${PR_TITLE}" | grep -qE "$pattern"; then\n echo "breaking_change=true" >> $GITHUB_OUTPUT\n else\n echo "breaking_change=false" >> $GITHUB_OUTPUT\n fi\n - name: Add label\n if: steps.check_breaking_change.outputs.breaking_change == 'true'\n uses: 
actions/github-script@v7\n with:\n github-token: ${{ secrets.GITHUB_TOKEN }}\n script: |\n github.rest.issues.addLabels({\n issue_number: context.issue.number,\n owner: context.repo.owner,\n repo: context.repo.repo,\n labels: ['Type: Breaking Change']\n })\n\n do-not-merge:\n if: ${{ contains(github.event.*.labels.*.name, 'do not merge') }}\n name: Prevent Merging\n runs-on: ubuntu-latest\n steps:\n - name: Check for label\n run: |\n echo "Pull request is labeled as 'do not merge'"\n echo "This workflow fails so that the pull request cannot be merged"\n exit 1\n | dataset_sample\yaml\ratatui_ratatui\.github\workflows\check-pr.yml | check-pr.yml | YAML | 2,902 | 0.95 | 0.104651 | 0.064935 | react-lib | 299 | 2024-02-09T04:21:00.446241 | GPL-3.0 | false | f0cfcfdab8fb7143c5e9b9f17177e054 |
name: Check Semver\n\non:\n pull_request:\n branches:\n - main\n\njobs:\n check-semver:\n name: Check semver\n runs-on: ubuntu-latest\n steps:\n - name: Checkout the repository\n uses: actions/checkout@v4\n - name: Check semver\n uses: obi1kenobi/cargo-semver-checks-action@v2\n | dataset_sample\yaml\ratatui_ratatui\.github\workflows\check-semver.yml | check-semver.yml | YAML | 305 | 0.7 | 0 | 0 | node-utils | 681 | 2025-02-27T10:32:06.962856 | GPL-3.0 | false | 8f945ef83d4f4e184744eab1b609ab3b |
name: Continuous Integration\n\non:\n # Allows you to run this workflow manually from the Actions tab\n workflow_dispatch:\n push:\n branches:\n - main\n pull_request:\n branches:\n - main\n\n# ensure that the workflow is only triggered once per PR, subsequent pushes to the PR will cancel\n# and restart the workflow. See https://docs.github.com/en/actions/using-jobs/using-concurrency\nconcurrency:\n group: ${{ github.workflow }}-${{ github.head_ref || github.run_id }}\n cancel-in-progress: true\n\n# lint, clippy and coverage jobs are intentionally early in the workflow to catch simple formatting,\n# typos, and missing tests as early as possible. This allows us to fix these and resubmit the PR\n# without having to wait for the comprehensive matrix of tests to complete.\njobs:\n # Lint the formatting of the codebase.\n lint-formatting:\n name: Check Formatting\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@nightly\n with: { components: rustfmt }\n - uses: Swatinem/rust-cache@v2\n - uses: taiki-e/install-action@v2\n with:\n tool: taplo-cli\n - run: cargo xtask format --check\n\n # Check for typos in the codebase.\n # See <https://github.com/crate-ci/typos/>\n lint-typos:\n name: Check Typos\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: crate-ci/typos@master\n\n # Check for any disallowed dependencies in the codebase due to license / security issues.\n # See <https://github.com/EmbarkStudios/cargo-deny>\n dependencies:\n name: Check Dependencies\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@stable\n - uses: taiki-e/install-action@cargo-deny\n - run: cargo deny --log-level info --all-features check\n\n # Check for any unused dependencies in the codebase.\n # See <https://github.com/bnjbvr/cargo-machete/>\n cargo-machete:\n name: Check Unused Dependencies\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: 
bnjbvr/cargo-machete@v0.8.0\n\n # Run cargo clippy.\n #\n # We check for clippy warnings on beta, but these are not hard failures. They should often be\n # fixed to prevent clippy failing on the next stable release, but don't block PRs on them unless\n # they are introduced by the PR.\n lint-clippy:\n name: Check Clippy\n runs-on: ubuntu-latest\n strategy:\n fail-fast: false\n matrix:\n toolchain: ["stable", "beta"]\n continue-on-error: ${{ matrix.toolchain == 'beta' }}\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@master\n with:\n toolchain: ${{ matrix.toolchain }}\n components: clippy\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask clippy\n\n # Run markdownlint on all markdown files in the repository.\n lint-markdown:\n name: Check Markdown\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: DavidAnson/markdownlint-cli2-action@v19\n with:\n globs: |\n '**/*.md'\n '!target'\n\n # Run cargo coverage. This will generate a coverage report and upload it to codecov.\n # <https://app.codecov.io/gh/ratatui/ratatui>\n coverage:\n name: Coverage Report\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@stable\n with:\n components: llvm-tools\n - uses: taiki-e/install-action@cargo-llvm-cov\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask coverage\n - uses: codecov/codecov-action@v5\n with:\n token: ${{ secrets.CODECOV_TOKEN }}\n fail_ci_if_error: true\n\n # Run cargo check. 
This is a fast way to catch any obvious errors in the code.\n check:\n name: Check ${{ matrix.os }} ${{ matrix.toolchain }}\n strategy:\n fail-fast: false\n matrix:\n os: [ubuntu-latest, windows-latest, macos-latest]\n toolchain: ["1.74.0", "stable"]\n runs-on: ${{ matrix.os }}\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@master\n with:\n toolchain: ${{ matrix.toolchain }}\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask check --all-features\n\n # Check if README.md is up-to-date with the crate's documentation.\n check-readme:\n name: Check README\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: Swatinem/rust-cache@v2\n - uses: taiki-e/install-action@cargo-rdme\n - run: cargo xtask readme --check\n\n # Run cargo rustdoc with the same options that would be used by docs.rs, taking into account the\n # package.metadata.docs.rs configured in Cargo.toml. https://github.com/dtolnay/cargo-docs-rs\n lint-docs:\n name: Check Docs\n runs-on: ubuntu-latest\n env:\n RUSTDOCFLAGS: -Dwarnings\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@nightly\n - uses: dtolnay/install@cargo-docs-rs\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask docs\n\n # Run cargo test on the documentation of the crate. 
This will catch any code examples that don't\n # compile, or any other issues in the documentation.\n test-docs:\n name: Test Docs\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@stable\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask test-docs\n\n # Run cargo test on the libraries of the crate.\n test-libs:\n name: Test Libs ${{ matrix.toolchain }}\n runs-on: ubuntu-latest\n strategy:\n fail-fast: false\n matrix:\n toolchain: ["1.74.0", "stable"]\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@master\n with:\n toolchain: ${{ matrix.toolchain }}\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask test-libs\n\n # Run cargo test on all the backends.\n test-backends:\n name: Test ${{matrix.backend}} on ${{ matrix.os }}\n runs-on: ${{ matrix.os }}\n strategy:\n fail-fast: false\n matrix:\n os: [ubuntu-latest, windows-latest, macos-latest]\n backend: [crossterm, termion, termwiz]\n exclude:\n # termion is not supported on windows\n - os: windows-latest\n backend: termion\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@stable\n - uses: Swatinem/rust-cache@v2\n - run: cargo xtask test-backend ${{ matrix.backend }}\n | dataset_sample\yaml\ratatui_ratatui\.github\workflows\ci.yml | ci.yml | YAML | 6,456 | 0.8 | 0.044776 | 0.16129 | vue-tools | 583 | 2023-10-30T19:47:49.826221 | MIT | false | c41439014f4fa00b73f09d74a04eed45 |
name: Release alpha version\n\non:\n workflow_dispatch:\n schedule:\n # At 00:00 on Saturday\n # https://crontab.guru/#0_0_*_*_6\n - cron: "0 0 * * 6"\n\ndefaults:\n run:\n shell: bash\n\njobs:\n publish-alpha:\n name: Create an alpha release\n runs-on: ubuntu-latest\n permissions:\n contents: write\n steps:\n - name: Checkout the repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n\n - name: Calculate the next release\n run: .github/workflows/calculate-alpha-release.bash\n\n - name: Install Rust stable\n uses: dtolnay/rust-toolchain@stable\n\n - name: Publish\n run: cargo publish --allow-dirty --token ${{ secrets.CARGO_TOKEN }}\n\n - name: Generate a changelog\n uses: orhun/git-cliff-action@v4\n with:\n config: cliff.toml\n args: --unreleased --tag ${{ env.NEXT_TAG }} --strip header\n env:\n OUTPUT: BODY.md\n\n - name: Publish on GitHub\n uses: ncipollo/release-action@v1\n with:\n tag: ${{ env.NEXT_TAG }}\n prerelease: true\n bodyFile: BODY.md\n | dataset_sample\yaml\ratatui_ratatui\.github\workflows\release-alpha.yml | release-alpha.yml | YAML | 1,121 | 0.8 | 0 | 0.05 | python-kit | 768 | 2024-06-13T04:03:51.576856 | BSD-3-Clause | false | a2ff5c5dfee3bce70b95607a85e57de0 |
name: Release-plz\n\npermissions:\n pull-requests: write\n contents: write\n\non:\n push:\n branches:\n - main\n workflow_dispatch:\n\njobs:\n # Release unpublished packages.\n release-plz-release:\n name: Release-plz release\n runs-on: ubuntu-latest\n if: ${{ github.repository_owner == 'ratatui' }}\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n - name: Install Rust toolchain\n uses: dtolnay/rust-toolchain@stable\n - name: Run release-plz\n uses: release-plz/action@v0.5\n with:\n command: release\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_TOKEN }}\n\n # Create a PR with the new versions and changelog, preparing the next release.\n release-plz-pr:\n name: Release-plz PR\n runs-on: ubuntu-latest\n if: ${{ github.repository_owner == 'ratatui' }}\n concurrency:\n group: release-plz-${{ github.ref }}\n cancel-in-progress: false\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n - name: Install Rust toolchain\n uses: dtolnay/rust-toolchain@stable\n - name: Run release-plz\n uses: release-plz/action@v0.5\n with:\n command: release-pr\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n CARGO_REGISTRY_TOKEN: ${{ secrets.CARGO_TOKEN }}\n | dataset_sample\yaml\ratatui_ratatui\.github\workflows\release-plz.yml | release-plz.yml | YAML | 1,470 | 0.8 | 0.036364 | 0.039216 | python-kit | 713 | 2025-02-16T05:44:47.290169 | BSD-3-Clause | false | b98ad176674698f1ea675808fcec7836 |
name: Release stable version\n\non:\n push:\n tags:\n - "v*.*.*"\n\njobs:\n publish-stable:\n name: Create a stable release\n runs-on: ubuntu-latest\n permissions:\n contents: write\n steps:\n - name: Checkout the repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n\n - name: Generate a changelog\n uses: orhun/git-cliff-action@v4\n with:\n config: cliff.toml\n args: --latest --strip header\n env:\n OUTPUT: BODY.md\n\n - name: Publish on GitHub\n uses: ncipollo/release-action@v1\n with:\n prerelease: false\n bodyFile: BODY.md\n\n publish-crate:\n name: Publish crate\n runs-on: ubuntu-latest\n steps:\n - name: Checkout the repository\n uses: actions/checkout@v4\n\n - name: Install Rust stable\n uses: dtolnay/rust-toolchain@stable\n\n - name: Publish\n run: cargo publish --token ${{ secrets.CARGO_TOKEN }}\n | dataset_sample\yaml\ratatui_ratatui\.github\workflows\release-stable.yml | release-stable.yml | YAML | 975 | 0.7 | 0 | 0 | react-lib | 468 | 2024-06-27T15:33:54.770524 | BSD-3-Clause | false | f0fc6818a0746f65e8e77c27a3823f6d |
# golangci-lint configuration options\n\nlinters:\n enable:\n - errcheck\n - goimports\n - revive\n - ineffassign\n - govet\n - unconvert\n - staticcheck\n - gosimple\n - stylecheck\n - unused\n - misspell\n - gocritic\n #- prealloc\n #- maligned\n disable-all: true\n\nissues:\n # Enable some lints excluded by default\n exclude-use-default: false\n\n # Maximum issues count per one linter. Set to 0 to disable. Default is 50.\n max-issues-per-linter: 0\n\n # Maximum count of issues with the same text. Set to 0 to disable. Default is 3.\n max-same-issues: 0\n\n exclude-rules:\n\n - linters:\n - staticcheck\n text: 'SA1019: "github.com/rclone/rclone/cmd/serve/httplib" is deprecated'\n\n # don't disable the revive messages about comments on exported functions\n include:\n - EXC0012\n - EXC0013\n - EXC0014\n - EXC0015\n\nrun:\n # timeout for analysis, e.g. 30s, 5m, default is 1m\n timeout: 10m\n\nlinters-settings:\n revive:\n # setting rules seems to disable all the rules, so re-enable them here\n rules:\n - name: blank-imports\n disabled: false\n - name: context-as-argument\n disabled: false\n - name: context-keys-type\n disabled: false\n - name: dot-imports\n disabled: false\n - name: empty-block\n disabled: true\n - name: error-naming\n disabled: false\n - name: error-return\n disabled: false\n - name: error-strings\n disabled: false\n - name: errorf\n disabled: false\n - name: exported\n disabled: false\n - name: increment-decrement\n disabled: true\n - name: indent-error-flow\n disabled: false\n - name: package-comments\n disabled: false\n - name: range\n disabled: false\n - name: receiver-naming\n disabled: false\n - name: redefines-builtin-id\n disabled: true\n - name: superfluous-else\n disabled: true\n - name: time-naming\n disabled: false\n - name: unexported-return\n disabled: false\n - name: unreachable-code\n disabled: true\n - name: unused-parameter\n disabled: true\n - name: var-declaration\n disabled: false\n - name: var-naming\n disabled: false\n 
stylecheck:\n # Only enable the checks performed by the staticcheck stand-alone tool,\n # as documented here: https://staticcheck.io/docs/configuration/options/#checks\n checks: ["all", "-ST1000", "-ST1003", "-ST1016", "-ST1020", "-ST1021", "-ST1022", "-ST1023"]\n gocritic:\n # Enable all default checks with some exceptions and some additions (commented).\n # Cannot use both enabled-checks and disabled-checks, so must specify all to be used.\n disable-all: true\n enabled-checks:\n #- appendAssign # Enabled by default\n - argOrder\n - assignOp\n - badCall\n - badCond\n #- captLocal # Enabled by default\n - caseOrder\n - codegenComment\n #- commentFormatting # Enabled by default\n - defaultCaseOrder\n - deprecatedComment\n - dupArg\n - dupBranchBody\n - dupCase\n - dupSubExpr\n - elseif\n #- exitAfterDefer # Enabled by default\n - flagDeref\n - flagName\n #- ifElseChain # Enabled by default\n - mapKey\n - newDeref\n - offBy1\n - regexpMust\n - ruleguard # Not enabled by default\n #- singleCaseSwitch # Enabled by default\n - sloppyLen\n - sloppyTypeAssert\n - switchTrue\n - typeSwitchVar\n - underef\n - unlambda\n - unslice\n - valSwap\n - wrapperFunc\n settings:\n ruleguard:\n rules: "${configDir}/bin/rules.go"\n | dataset_sample\yaml\rclone_rclone\.golangci.yml | .golangci.yml | YAML | 3,654 | 0.95 | 0.006944 | 0.140741 | awesome-app | 688 | 2023-10-14T20:56:26.877783 | BSD-3-Clause | false | e5d17fc48163d8f962cbf53a7645227f |
# This action sets up:\n# - The correct version of Rust based on the `rust-toolchain` file\n# - All components + targets specified in `rust-toolchain`\n# - Caching of individual compilation requests via `sccache` and GCS\n# - Uses our own `rerun-io/sccache-action` which supports GCS\n# - `cargo nextest`\n#\n# Note that due to the use of GCS as an sccache storage backend,\n# this action also sets up GCP credentials as a side effect.\n# There is no harm to setting up the credentials twice accidentally,\n# but care should be taken not to do that, as it's wasteful.\n\nname: "Setup Rust"\n\ninputs:\n cache_key:\n type: string\n required: true\n save_cache:\n type: boolean\n required: false\n default: false\n workload_identity_provider:\n type: string\n required: true\n service_account:\n type: string\n required: true\n toolchains:\n type: string\n required: false\n description: "Space-separated list of extra toolchains to install"\n targets:\n type: string\n required: false\n description: "One or more space separated target triplets that will be ensured to be supported."\n\nruns:\n using: "composite"\n steps:\n - name: Set up GCP credentials\n uses: google-github-actions/auth@v2\n with:\n workload_identity_provider: ${{ inputs.workload_identity_provider }}\n service_account: ${{ inputs.service_account }}\n\n - name: Ensure correct version of Rust is installed\n shell: bash\n run: |\n # This is the only way to force rustup to install the version of Rust\n # and the components/targets specified in our `rust-toolchain` file.\n # It might break at some point: https://github.com/rust-lang/rustup/issues/1397\n rustup show\n\n - name: Install additional targets\n if: ${{ inputs.targets != '' }}\n shell: bash\n run: rustup target add ${{ inputs.targets }}\n\n - name: Install additional toolchains\n if: ${{ inputs.toolchains }}\n shell: bash\n run: |\n for toolchain in ${{ inputs.toolchains }}; do\n rustup install $toolchain\n done\n\n - name: Set up sccache\n uses: 
rerun-io/sccache-action@v0.7.1\n with:\n version: "v0.7.7"\n use_gcs: true\n gcs_bucket: rerun-sccache\n gcs_read_only: false\n\n - name: Display sccache config\n shell: bash\n run: |\n cat $HOME/.config/sccache/config\n\n - name: Verify sccache\n shell: bash\n run: |\n sccache --show-stats\n\n # Recommended way to install nextest on CI.\n - name: Install latest nextest release\n uses: taiki-e/install-action@v2.48.7\n with:\n tool: nextest@0.9.89\n | dataset_sample\yaml\rerun-io_rerun\.github\actions\setup-rust\action.yml | action.yml | YAML | 2,635 | 0.95 | 0.033333 | 0.1875 | react-lib | 414 | 2023-12-24T14:25:55.001702 | MIT | false | 6cc0a237f2207294a232a08d8d82af73 |
# If `target` is set to `production`, this action handles updating the\n# target commit env variable (`RELEASE_COMMIT`) which is used as the\n# pointer for `rerun.io/docs` and `rerun.io/examples` and triggering\n# a redeploy of `rerun.io`.\n\n# If `target` is set to `preview`, then this instead deploys a fresh preview\n# with an override for `release_commit`, and sets the following outputs:\n# - `vercel_preview_deployment_id`\n# - `vercel_preview_url`\n# - `vercel_preview_inspector_url`\n#\n# The `vercel_preview_deployment_id` may be used to wait for the deployment\n# to complete with the `wait-for-deployment` command, which outputs:\n# - `vercel_preview_result`, either "success" or "failure"\n# - `vercel_preview_url`\n# - `vercel_preview_inspector_url`\n\nname: "Deploy rerun.io"\n\ndescription: "Vercel utilities"\n\n# Changing these inputs also requires changing their usage in `index.mjs`\ninputs:\n vercel_token:\n description: "Vercel access token"\n required: true\n vercel_team_name:\n description: "Vercel team name under which `vercel_project_name` can be found"\n required: true\n vercel_project_name:\n description: "Vercel project name to update and redeploy"\n required: true\n vercel_deployment_id:\n description: "Vercel deployment ID used in `wait-for-deployment`"\n required: false\n command:\n description: "one of: `deploy`, `wait-for-deployment`, `update-env`"\n required: true\n release_commit:\n description: "Release commit to update the deployment to"\n required: false\n release_version:\n description: "Which release version to update the deployment to"\n required: false\n target:\n description: "Which Vercel environment to deploy to"\n required: false\n\nruns:\n using: "node20"\n main: "index.mjs"\n | dataset_sample\yaml\rerun-io_rerun\.github\actions\vercel\action.yml | action.yml | YAML | 1,747 | 0.95 | 0.117647 | 0.347826 | python-kit | 846 | 2023-09-02T18:44:36.343912 | GPL-3.0 | false | 108a76188515493677e50380839074b1 |
name: Adhoc Wheels\n\non:\n workflow_dispatch:\n inputs:\n MODE:\n type: choice\n required: false\n options:\n - pypi\n - pr\n - extra\n description: "The build mode (`pypi` includes the web viewer, `pr` does not)"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "write"\n id-token: "write"\n deployments: "write"\n\njobs:\n # -----------------------------------------------------------------------------------\n # Build rerun-cli (rerun binaries):\n\n build-rerun-cli-and-upload-linux-arm64:\n name: "Linux-arm64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: adhoc-wheels-linux-arm64\n PLATFORM: linux-arm64\n secrets: inherit\n\n build-rerun-cli-and-upload-linux-x64:\n name: "Linux-x64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: adhoc-wheels-linux-x64\n PLATFORM: linux-x64\n secrets: inherit\n\n build-rerun-cli-and-upload-macos-x64:\n name: "Mac-x64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: adhoc-wheels-macos-x64\n PLATFORM: macos-x64\n secrets: inherit\n\n build-rerun-cli-and-upload-macos-arm64:\n name: "Mac-arm64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: adhoc-wheels-macos-arm64\n PLATFORM: macos-arm64\n secrets: inherit\n\n build-rerun-cli-and-upload-windows-x64:\n name: "Windows-x64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: adhoc-wheels-windows-x64\n PLATFORM: windows-x64\n secrets: inherit\n\n # ---------------------------------------------------------------------------\n # Build wheels:\n\n build-wheel-linux-arm64:\n needs: [build-rerun-cli-and-upload-linux-arm64]\n name: "Linux-arm64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n 
CONCURRENCY: adhoc-wheels-linux-arm64\n PLATFORM: linux-arm64\n WHEEL_ARTIFACT_NAME: linux-arm64-wheel\n MODE: ${{ inputs.MODE }}\n secrets: inherit\n\n build-wheel-linux-x64:\n needs: [build-rerun-cli-and-upload-linux-x64]\n name: "Linux-x64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: adhoc-wheels-linux-x64\n PLATFORM: linux-x64\n WHEEL_ARTIFACT_NAME: linux-x64-wheel\n MODE: ${{ inputs.MODE }}\n secrets: inherit\n\n build-wheel-macos-arm64:\n needs: [build-rerun-cli-and-upload-macos-arm64]\n name: "Macos-arm64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: adhoc-wheels-macos-arm64\n PLATFORM: macos-arm64\n WHEEL_ARTIFACT_NAME: macos-arm64-wheel\n MODE: ${{ inputs.MODE }}\n secrets: inherit\n\n build-wheel-macos-x64:\n needs: [build-rerun-cli-and-upload-macos-x64]\n name: "Macos-x64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: adhoc-wheels-macos-x64\n PLATFORM: macos-x64\n WHEEL_ARTIFACT_NAME: "macos-x64-wheel"\n MODE: ${{ inputs.MODE }}\n secrets: inherit\n\n build-wheel-windows-x64:\n needs: [build-rerun-cli-and-upload-windows-x64]\n name: "Windows-x64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: adhoc-wheels-windows-x64\n PLATFORM: windows-x64\n WHEEL_ARTIFACT_NAME: windows-x64-wheel\n MODE: ${{ inputs.MODE }}\n secrets: inherit\n\n # --------------------------------------------------------------------------\n\n generate-pip-index:\n name: "Generate Pip Index"\n needs:\n [\n build-wheel-linux-arm64,\n build-wheel-linux-x64,\n build-wheel-macos-arm64,\n build-wheel-macos-x64,\n build-wheel-windows-x64,\n ]\n uses: ./.github/workflows/reusable_pip_index.yml\n with:\n CONCURRENCY: adhoc-wheels\n secrets: inherit\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\adhoc_wheels.yml | adhoc_wheels.yml | YAML | 
4,189 | 0.95 | 0 | 0.040323 | node-utils | 381 | 2023-08-30T23:17:00.877689 | BSD-3-Clause | false | ca47a94c37da809bda2be4396052f797 |
name: "Approve Workflow Runs"\n\non:\n pull_request_target:\n issue_comment:\n types: [created, edited]\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "read"\n actions: "write"\n\njobs:\n approve-workflow-runs:\n name: "Check for approval"\n runs-on: ubuntu-latest\n if: |\n github.event.pull_request.head.repo.owner.login != 'rerun-io' &&\n (github.event_name == 'pull_request_target' || github.event.issue.pull_request)\n steps:\n - name: Checkout\n uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Wait a few seconds\n run: |\n # Give GitHub a bit of time to synchronize everything\n sleep 5s\n\n - name: Approve workflow runs\n run: |\n pixi run python scripts/ci/approve_workflow_runs.py \\n --github-token "${{ secrets.GITHUB_TOKEN }}" \\n --github-repository "rerun-io/rerun" \\n --pr-number "${{ github.event.pull_request.number || github.event.issue.number }}"\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\auto_approve.yml | auto_approve.yml | YAML | 1,058 | 0.8 | 0.04878 | 0.029412 | awesome-app | 887 | 2024-02-18T14:20:57.078122 | GPL-3.0 | false | 888e102fb45633a17136d0d1ead99546 |
name: Docs deploy\n\non:\n push:\n branches: [main]\n\npermissions:\n contents: "read"\n id-token: "write"\n\ndefaults:\n run:\n shell: bash\n\n# The lack of `concurrency` is intentional.\n# We want this job to run on every commit, even if multiple are merged in a row.\n\njobs:\n has-label:\n name: Check for PR label\n runs-on: ubuntu-latest\n outputs:\n result: ${{ steps.find-pr.outputs.result }}\n steps:\n - uses: actions/checkout@v3\n with:\n # ref - not set, because we want to end up on the merge commit\n fetch-depth: 0 # don't perform a shallow clone\n\n # Find the PR by the number in the merge commit subject line\n - name: Find PR\n id: find-pr\n env:\n GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}\n run: |\n commit_message=$(git log --pretty=format:%s -n 1 ${{ github.sha }})\n pr_number=$(echo $commit_message | grep -oP '(?<=#)\d+')\n\n result=$(gh pr view $pr_number --json labels | jq -r 'any(.labels[].name; . == "deploy docs")')\n echo "result=$result" >> $GITHUB_OUTPUT\n\n cherry-pick:\n name: Cherry-pick to docs-latest\n needs: [has-label]\n if: needs.has-label.outputs.result == 'true'\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v3\n with:\n fetch-depth: 0\n token: ${{ secrets.RERUN_BOT_TOKEN }}\n\n - name: Cherry-pick\n run: |\n # Setup git user\n git config --global user.name "rerun-bot"\n git config --global user.email "bot@rerun.io"\n\n # Cherry-pick the commit\n git checkout docs-latest\n git cherry-pick ${{ github.sha }}\n git push origin docs-latest\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\auto_docs.yml | auto_docs.yml | YAML | 1,698 | 0.8 | 0.048387 | 0.115385 | python-kit | 901 | 2024-03-28T09:06:35.180236 | GPL-3.0 | false | 633795feafe08cde4d39e7417ccc9a6c |
name: Docs deploy pre-check\n\non:\n pull_request:\n types:\n - opened\n - synchronize\n - reopened\n - labeled\n - unlabeled\n\npermissions:\n contents: "read"\n id-token: "write"\n pull-requests: "write"\n\ndefaults:\n run:\n shell: bash\n\nconcurrency:\n group: pr-${{ github.event.pull_request.number }}-auto-docs-check\n cancel-in-progress: true\n\njobs:\n check-cherry-pick:\n name: Check if merge commit can be cherry-picked\n runs-on: ubuntu-latest\n if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && contains(github.event.pull_request.labels.*.name, 'deploy docs')\n steps:\n - uses: actions/checkout@v3\n with:\n # ref - not set, because we want to end up on the merge commit\n fetch-depth: 0 # don't perform a shallow clone\n\n - name: Try cherry-pick\n env:\n GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}\n run: |\n # Setup git user\n git config --global user.name "rerun-bot"\n git config --global user.email "bot@rerun.io"\n\n git fetch origin main\n git checkout main\n git merge --squash origin/${{ github.event.pull_request.head.ref }}\n git commit -m "${{ github.event.pull_request.title }} (#${{ github.event.pull_request.number }})"\n commit=$(git rev-parse HEAD)\n git checkout docs-latest\n\n if git cherry-pick $commit; then\n echo "Cherry-pick successful"\n exit 0\n else\n echo "Cherry-pick failed"\n git diff\n exit 1\n fi\n\n - name: Add success comment\n # https://github.com/mshick/add-pr-comment\n uses: mshick/add-pr-comment@v2.8.2\n if: success()\n with:\n message-id: "cherry-pick-check"\n repo-token: ${{ secrets.GITHUB_TOKEN }}\n message: |\n Your changes can be cherry-picked to `docs-latest` and will be deployed\n immediately after merging.\n\n - name: Add failure comment\n # https://github.com/mshick/add-pr-comment\n uses: mshick/add-pr-comment@v2.8.2\n if: failure()\n with:\n message-id: "cherry-pick-check"\n repo-token: ${{ secrets.GITHUB_TOKEN }}\n message: |\n Your changes cannot be automatically cherry-picked to 
`docs-latest`.\n\n You should remove the `deploy docs` label and perform the cherry-pick manually after merging.\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\auto_docs_check.yml | auto_docs_check.yml | YAML | 2,461 | 0.8 | 0.061728 | 0.057143 | python-kit | 825 | 2024-06-15T11:30:47.222694 | GPL-3.0 | false | 39ea7a8aacf808120f31b13175f82187 |
name: Cargo Shear\n\non:\n push:\n branches:\n - "main"\n pull_request:\n types: [opened, synchronize]\n\njobs:\n cargo-shear:\n runs-on: ubuntu-latest\n steps:\n - name: Checkout\n uses: actions/checkout@v3\n\n - name: Shear\n run: |\n cargo +stable install cargo-shear@1.1.11 --locked\n cargo shear\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\cargo_shear.yml | cargo_shear.yml | YAML | 344 | 0.7 | 0 | 0 | node-utils | 391 | 2024-10-20T00:35:15.785701 | GPL-3.0 | false | 6cd683d05b8d39b0bf17abcf1fc491b4 |
# Checks that all checkboxes in a PR are checked\n\nname: Pull Request Checkboxes\n\non:\n pull_request_target:\n types:\n - opened\n - synchronize\n - reopened\n - edited\n\nconcurrency:\n group: ${{ github.event.pull_request.number }}-pr-checkboxes\n cancel-in-progress: true\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "read"\n pull-requests: "read"\n\njobs:\n pr-checkboxes:\n name: Check PR checkboxes\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Check PR checkboxes\n run: |\n pixi run ./scripts/ci/check_pr_checkboxes.py \\n --github-token ${{ secrets.GITHUB_TOKEN }} \\n --github-repository ${{ github.repository }} \\n --pr-number ${{ github.event.pull_request.number }}\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\checkboxes.yml | checkboxes.yml | YAML | 918 | 0.8 | 0 | 0.029412 | node-utils | 469 | 2024-02-23T14:36:42.110165 | BSD-3-Clause | false | 02da8911c327215309327afea1230f2c |
name: Clear cache\n\non:\n workflow_dispatch:\n\npermissions:\n actions: write\n\njobs:\n clear-cache:\n runs-on: ubuntu-latest\n steps:\n - name: Clear cache\n uses: actions/github-script@v6\n with:\n script: |\n const opts = {\n owner: context.repo.owner,\n repo: context.repo.repo,\n };\n\n const cache_list = await github.rest.actions.getActionsCacheList({\n per_page: 1,\n ...opts,\n });\n const per_page = 100;\n const pages = Math.ceil(cache_list.data.total_count / per_page);\n\n const total_count = cache_list.data.total_count;\n let deleted = 0;\n const _id = setInterval(() => {\n console.log(`${deleted}/${total_count}`);\n }, 500);\n\n let promises = [];\n for (let page = 1; page <= pages; page++) {\n const cache_page = await github.rest.actions.getActionsCacheList({\n per_page,\n page,\n ...opts,\n });\n for (const cache of cache_page.data.actions_caches) {\n promises.push(\n github.rest.actions\n .deleteActionsCacheById({\n cache_id: cache.id,\n ...opts,\n })\n .then(() => (deleted += 1))\n );\n }\n await Promise.all(promises);\n promises = [];\n\n // wait 60 seconds every 4 pages (secondary rate limit)\n if (page % 4 === 0) {\n await new Promise(f => setTimeout(f, 60_000));\n }\n }\n\n clearInterval(_id);\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\clear_cache.yml | clear_cache.yml | YAML | 1,752 | 0.8 | 0.04918 | 0.018868 | node-utils | 663 | 2024-09-21T00:19:00.184676 | GPL-3.0 | false | a211f4a48884404bba7888445dc287e1 |
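The `clear_cache.yml` script above walks the cache list 100 entries per page and waits 60 seconds after every fourth page to stay under GitHub's secondary rate limit. Its paging schedule can be sketched in Python (no API calls, just the arithmetic; the function name is made up for illustration):

```python
import math

def page_plan(total_count: int, per_page: int = 100, pause_every: int = 4):
    """Yield (page, pause_after_page) pairs matching the workflow's loop:
    pages are 1-based, and every `pause_every`-th page is followed by a
    60-second wait to dodge the secondary rate limit."""
    pages = math.ceil(total_count / per_page)
    for page in range(1, pages + 1):
        yield page, page % pause_every == 0

plan = list(page_plan(1234))
print(len(plan))                          # → 13  (13 pages for 1234 caches)
print([p for p, pause in plan if pause])  # → [4, 8, 12]
```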
name: "Checks: Lints, Tests, Docs"\n\non:\n workflow_call:\n inputs:\n CONCURRENCY:\n required: true\n type: string\n PR_NUMBER:\n required: false\n type: string\n default: ""\n\nconcurrency:\n group: ${{ inputs.CONCURRENCY }}-checks\n cancel-in-progress: true\n\nenv:\n PYTHON_VERSION: "3.9"\n # web_sys_unstable_apis is required to enable the web_sys clipboard API which egui_web uses\n # https://rustwasm.github.io/wasm-bindgen/api/web_sys/struct.Clipboard.html\n # https://rustwasm.github.io/docs/wasm-bindgen/web-sys/unstable-apis.html\n RUSTFLAGS: --cfg=web_sys_unstable_apis --deny warnings\n\n RUSTDOCFLAGS: --deny warnings\n\n # Do *not* use sccache since on contributor CI we don't have access to the gcloud stored cache.\n #RUSTC_WRAPPER: "sccache"\n\n # Not only can `sccache` not cache incremental builds, it's also counter-productive to generate all\n # these incremental artifacts when running on CI.\n CARGO_INCREMENTAL: "0"\n\n # Sourced from https://vulkan.lunarg.com/sdk/home#linux\n VULKAN_SDK_VERSION: "1.3.290.0"\n\n # ANSI color codes should be supported by default on GitHub Actions.\n CARGO_TERM_COLOR: always\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "read"\n\njobs:\n py-lints:\n name: Python lints (ruff, mypy, …)\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Python format check\n run: pixi run py-fmt-check\n\n - name: Lint Python\n run: pixi run py-lint\n\n py-test-docs:\n name: Test Python Docs\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n environments: py-docs\n\n - name: Build via mkdocs\n shell: bash\n run: |\n pixi run -e py-docs mkdocs build --strict -f rerun_py/mkdocs.yml\n\n no-codegen-changes:\n name: Check if running codegen would produce any changes\n # TODO(andreas): setup-vulkan doesn't work on 24.4 right now due to missing .so\n 
runs-on: ubuntu-22.04-large\n steps:\n # Note: We explicitly don't override `ref` here. We need to see if changes would be made\n # in a context where we have merged with main. Otherwise we might miss changes such as one\n # PR introduces a new type and another PR changes the codegen.\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Codegen check\n run: pixi run codegen --force --check\n\n - name: Codegen out-of-sync (protos)\n run: pixi run codegen-protos-check\n\n rs-lints:\n name: Rust lints (fmt, check, clippy, tests, doc)\n # TODO(andreas): setup-vulkan doesn't work on 24.4 right now due to missing .so\n runs-on: ubuntu-22.04-large\n steps:\n - uses: actions/checkout@v4\n with:\n lfs: true\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n # Install the Vulkan SDK, so we can use the software rasterizer.\n # TODO(andreas): It would be nice if `setup_software_rasterizer.py` could do that for us as well (note though that this action here is very fast when cached!)\n - name: Install Vulkan SDK\n uses: rerun-io/install-vulkan-sdk-action@v1.1.0\n with:\n vulkan_version: ${{ env.VULKAN_SDK_VERSION }}\n install_runtime: true\n cache: true\n stripdown: true\n\n - name: Setup software rasterizer\n run: pixi run python ./scripts/ci/setup_software_rasterizer.py\n\n # Recommended way to install nextest on CI.\n - name: Install latest nextest release\n uses: taiki-e/install-action@v2.48.7\n with:\n tool: nextest@0.9.89\n\n - name: Rust checks & tests\n run: pixi run rs-check --skip individual_crates docs_slow\n\n - name: Upload test results\n uses: actions/upload-artifact@v4\n if: always()\n with:\n name: test-results-ubuntu\n path: "**/tests/snapshots"\n\n rerun-lints:\n name: Rerun lints\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Set up Python\n uses: actions/setup-python@v5\n 
with:\n python-version: "3.11"\n\n - name: Rerun lints\n run: pixi run lint-rerun\n\n toml-format-check:\n name: Toml format check\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Toml format check\n run: pixi run toml-fmt-check\n\n check-too-large-files:\n name: Check for too large files\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Check for too large files\n run: pixi run check-large-files\n\n check-example-thumbnails:\n name: Check Python example thumbnails\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Check Python example thumbnails\n run: pixi run ./scripts/ci/thumbnails.py check\n\n spell-check:\n name: Spell Check\n runs-on: ubuntu-latest\n steps:\n - name: Checkout Actions Repository\n uses: actions/checkout@v4\n\n - name: Check spelling of entire workspace\n uses: crate-ci/typos@v1.18.0\n\n cpp-formatting:\n name: C++ formatting check\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - name: Run clang format on all relevant files\n uses: jidicula/clang-format-action@v4.11.0\n with:\n clang-format-version: "16"\n # Only check c/cpp/h/hpp (default checks also .proto and others)\n include-regex: ^.*\.(c|cpp|h|hpp)$\n\n cpp-tests:\n name: C++ tests\n runs-on: ubuntu-latest\n steps:\n - uses: actions/checkout@v4\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n environments: cpp\n\n # TODO(emilk): make this work somehow. 
Right now this just results in\n # > Compiler: GNU 12.3.0 (/__w/rerun/rerun/.pixi/env/bin/x86_64-conda-linux-gnu-c++)\n # 😭\n # - name: Build and run C++ tests with clang++\n # run: |\n # pixi run -e cpp cpp-clean\n # RERUN_WERROR=ON RERUN_USE_ASAN=ON CXX=clang++ pixi run -e cpp cpp-build-all\n # RERUN_WERROR=ON RERUN_USE_ASAN=ON CXX=clang++ pixi run -e cpp cpp-test\n\n - name: Build and run C++ tests with g++\n run: |\n pixi run -e cpp cpp-clean\n RERUN_WERROR=ON RERUN_USE_ASAN=ON LSAN_OPTIONS=suppressions=.github/workflows/lsan_suppressions.supp CXX=g++ pixi run -e cpp cpp-build-all\n RERUN_WERROR=ON RERUN_USE_ASAN=ON LSAN_OPTIONS=suppressions=.github/workflows/lsan_suppressions.supp CXX=g++ pixi run -e cpp cpp-test\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\contrib_checks.yml | contrib_checks.yml | YAML | 7,132 | 0.95 | 0.028455 | 0.13198 | vue-tools | 388 | 2024-07-29T07:59:43.962678 | BSD-3-Clause | false | a51b1b6381e0cf35e718870d6c86aea4 |
name: Reusable Build and Test Wheels (contrib)\n\non:\n workflow_call:\n inputs:\n CONCURRENCY:\n required: true\n type: string\n MATURIN_FEATURE_FLAGS:\n required: false\n type: string\n default: "--no-default-features --features pypi"\n\nconcurrency:\n group: ${{ inputs.CONCURRENCY }}-build-wheels\n cancel-in-progress: true\n\nenv:\n PYTHON_VERSION: "3.9"\n\n # web_sys_unstable_apis is required to enable the web_sys clipboard API which egui_web uses\n # https://rustwasm.github.io/wasm-bindgen/api/web_sys/struct.Clipboard.html\n # https://rustwasm.github.io/docs/wasm-bindgen/web-sys/unstable-apis.html\n\n # TODO(jleibs) --deny warnings causes installation of wasm-bindgen to fail on mac\n # RUSTFLAGS: --cfg=web_sys_unstable_apis --deny warnings\n RUSTFLAGS: --cfg=web_sys_unstable_apis\n\n RUSTDOCFLAGS: --deny warnings\n\n # Do *not* use sccache since on contributor CI we don't have access to the gcloud stored cache.\n #RUSTC_WRAPPER: "sccache"\n\n # Not only can `sccache` not cache incremental builds, it's also counter-productive to generate all\n # these incremental artifacts when running on CI.\n CARGO_INCREMENTAL: "0"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "read"\n\njobs:\n build-wheels:\n name: Build Wheels\n runs-on: ubuntu-latest-16-cores\n container:\n image: rerunio/ci_docker:0.15.0\n steps:\n - uses: actions/checkout@v4\n with:\n lfs: true\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n environments: wheel-test-min\n\n - name: Build rerun-cli\n run: |\n pixi run rerun-build-native-and-web-release\n\n - name: Copy rerun-cli to wheel folder\n run: |\n cp target/release/rerun rerun_py/rerun_sdk/rerun_cli\n\n - name: Build the wheel\n run: |\n pixi run python scripts/ci/build_and_upload_wheels.py \\n --mode pr \\n --target x86_64-unknown-linux-gnu \\n --dir unused \\n --compat manylinux_2_34\n\n - name: Install built wheel\n run: |\n pixi run python scripts/ci/pixi_install_wheel.py --feature python-pypi --package 
rerun-sdk --dir dist/x86_64-unknown-linux-gnu\n\n - name: Run e2e test\n run: pixi run -e wheel-test-min RUST_LOG=debug scripts/run_python_e2e_test.py --no-build # rerun-sdk is already built and installed\n\n - name: Run tests/roundtrips.py\n # --release so we can inherit from some of the artifacts that maturin has just built before\n # --target x86_64-unknown-linux-gnu because otherwise cargo loses the target cache… even though this is the target anyhow…\n # --no-py-build because rerun-sdk is already built and installed\n run: |\n pixi run -e wheel-test-min RUST_LOG=debug tests/roundtrips.py --release --target x86_64-unknown-linux-gnu --no-py-build\n\n - name: Run docs/snippets/compare_snippet_output.py\n # --release so we can inherit from some of the artifacts that maturin has just built before\n # --target x86_64-unknown-linux-gnu because otherwise cargo loses the target cache… even though this is the target anyhow…\n # --no-py-build because rerun-sdk is already built and installed\n run: |\n pixi run -e wheel-test-min RUST_LOG=debug docs/snippets/compare_snippet_output.py --release --target x86_64-unknown-linux-gnu --no-py-build\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\contrib_rerun_py.yml | contrib_rerun_py.yml | YAML | 3,423 | 0.95 | 0 | 0.194805 | vue-tools | 333 | 2023-09-28T05:59:40.913038 | BSD-3-Clause | false | bb7851e04dae58370b1bf4364f7aa171 |
name: First time contributors\n\non:\n pull_request_target:\n\npermissions:\n contents: "write"\n id-token: "write"\n pull-requests: "write"\n\njobs:\n comment:\n name: Comment on PRs\n runs-on: ubuntu-latest\n steps:\n - uses: actions/first-interaction@v1.3.0\n with:\n repo-token: ${{ secrets.GITHUB_TOKEN }}\n pr-message: |\n Hi! Thanks for opening this pull request.\n\n Because this is your first time contributing to this repository, make sure you've read our [Contributor Guide](https://github.com/rerun-io/rerun/blob/main/CONTRIBUTING.md) and [Code of Conduct](https://github.com/rerun-io/rerun/blob/main/CODE_OF_CONDUCT.md).\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\first_time_contrib.yml | first_time_contrib.yml | YAML | 679 | 0.8 | 0.045455 | 0 | python-kit | 840 | 2024-01-09T09:47:44.491161 | Apache-2.0 | false | 5b9ccd46447f4064de82aae024990e1a |
# https://github.com/marketplace/actions/require-labels\n# Check for existence of labels\n# See all our labels at https://github.com/rerun-io/rerun/issues/labels\n\nname: Pull Request Labels\n\non:\n pull_request_target:\n types:\n - opened\n - synchronize\n - reopened\n - labeled\n - unlabeled\n\n# No permissions needed here\n# permissions:\n\njobs:\n label:\n runs-on: ubuntu-latest\n steps:\n - name: Check for a "do-not-merge" label\n uses: mheap/github-action-required-labels@v3\n with:\n mode: exactly\n count: 0\n labels: "do-not-merge"\n\n - name: Require label "include in changelog" or "exclude from changelog"\n uses: mheap/github-action-required-labels@v3\n with:\n mode: minimum\n count: 1\n labels: "exclude from changelog, include in changelog"\n\n - name: Require at least one label\n uses: mheap/github-action-required-labels@v3\n with:\n mode: minimum\n count: 1\n labels: "📊 analytics, 🟦 blueprint, 🪳 bug, 🌊 C++ API, CLI, codegen/idl, 🧑💻 dev experience, dependencies, 📖 documentation, 💬 discussion, examples, exclude from changelog, 🪵 Log & send APIs, 📉 performance, 🐍 Python API, ⛃ re_datastore, 🔍 re_query, 📺 re_viewer, 🔺 re_renderer, 🚜 refactor, ⛴ release, 🦀 Rust API, 🔨 testing, ui, 🕸️ web"\n\n wasm-bindgen-check:\n name: Check wasm-bindgen version\n if: ${{ github.event_name == 'pull_request_target' }}\n runs-on: ubuntu-latest\n continue-on-error: true\n steps:\n - uses: actions/checkout@v4\n with:\n fetch-depth: 0\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Get current wasm-bindgen version\n id: current-version\n run: |\n version=$(pixi run taplo get -f crates/viewer/re_viewer/Cargo.toml "target.*.dependencies.wasm-bindgen")\n echo "current_version=$version" >> $GITHUB_OUTPUT\n\n - name: Get previous wasm-bindgen version\n id: previous-version\n run: |\n prev_ref=$(git rev-parse --abbrev-ref HEAD)\n git checkout main\n\n version=$(pixi run taplo get -f crates/viewer/re_viewer/Cargo.toml 
"target.*.dependencies.wasm-bindgen")\n echo "previous_version=$version" >> $GITHUB_OUTPUT\n\n git checkout $prev_ref\n\n - name: Require label if versions changed\n if: ${{ steps.current-version.outputs.current_version != steps.previous-version.outputs.previous_version }}\n uses: mheap/github-action-required-labels@v3\n with:\n mode: exactly\n count: 1\n labels: "wasm-bindgen version update"\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\labels.yml | labels.yml | YAML | 2,727 | 0.95 | 0.061728 | 0.073529 | vue-tools | 84 | 2023-07-29T18:36:51.051998 | Apache-2.0 | false | 28dd9c204ee66179a6b273b2a3f694ae |
name: Nightly\n\non:\n workflow_dispatch:\n schedule:\n # https://docs.github.com/en/actions/using-workflows/events-that-trigger-workflows#schedule\n # 3:15am UTC, so 4am-5am CET and evening in US Eastern time, basically after everyone's day.\n - cron: "15 3 * * *"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "write"\n id-token: "write"\n deployments: "write"\n # This is needed since the web viewer build has this permission in order to write comments in PRs\n # (not needed for nightly, but the permission is still active).\n pull-requests: "write"\n\njobs:\n checks:\n name: Checks\n uses: ./.github/workflows/reusable_checks.yml\n with:\n CONCURRENCY: nightly\n CHANNEL: nightly\n secrets: inherit\n\n checks-cpp:\n name: Checks\n uses: ./.github/workflows/reusable_checks_cpp.yml\n with:\n CONCURRENCY: nightly\n CHANNEL: nightly\n secrets: inherit\n\n checks-rust:\n name: Checks\n uses: ./.github/workflows/reusable_checks_rust.yml\n with:\n CONCURRENCY: nightly\n CHANNEL: nightly\n secrets: inherit\n\n checks-python:\n name: Checks\n uses: ./.github/workflows/reusable_checks_python.yml\n with:\n CONCURRENCY: nightly\n secrets: inherit\n\n # Check that a CLEAN container with just `cargo` on it can build rerun:\n clean-build:\n name: cargo build on clean container\n strategy:\n matrix:\n os: [ubuntu-latest-16-cores, macos-latest, windows-latest-8-cores]\n runs-on: ${{ matrix.os }}\n steps:\n - uses: actions/checkout@v4\n - uses: dtolnay/rust-toolchain@master\n with:\n toolchain: 1.84.0\n\n - run: cargo build -p rerun\n\n build-web:\n name: "Build web viewer"\n uses: ./.github/workflows/reusable_build_web.yml\n with:\n CONCURRENCY: nightly\n CHANNEL: nightly\n secrets: inherit\n\n upload-web:\n name: "Upload Web"\n needs: [build-web]\n uses: ./.github/workflows/reusable_upload_web.yml\n with:\n CONCURRENCY: nightly\n NIGHTLY: true\n secrets: inherit\n\n # -----------------------------------------------------------------------------------\n # Build rerun_c library 
binaries:\n\n build-rerun_c-and-upload-linux-arm64:\n needs: [checks]\n name: "Linux-Arm64: Build & Upload rerun_c"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_c.yml\n with:\n CONCURRENCY: nightly-linux-arm64\n PLATFORM: linux-arm64\n secrets: inherit\n\n build-rerun_c-and-upload-linux-x64:\n needs: [checks]\n name: "Linux-x64: Build & Upload rerun_c"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_c.yml\n with:\n CONCURRENCY: nightly-linux-x64\n PLATFORM: linux-x64\n secrets: inherit\n\n build-rerun_c-and-upload-macos-x64:\n needs: [checks]\n name: "Mac-Intel: Build & Upload rerun_c"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_c.yml\n with:\n CONCURRENCY: nightly-macos-x64\n PLATFORM: macos-x64\n secrets: inherit\n\n build-rerun_c-and-upload-macos-arm64:\n needs: [checks]\n name: "Mac-Arm64: Build & Upload rerun_c"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_c.yml\n with:\n CONCURRENCY: nightly-macos-arm64\n PLATFORM: macos-arm64\n secrets: inherit\n\n build-rerun_c-and-upload-windows-x64:\n needs: [checks]\n name: "Windows-x64: Build & Upload rerun_c"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_c.yml\n with:\n CONCURRENCY: nightly-windows-x64\n PLATFORM: windows-x64\n secrets: inherit\n\n # -----------------------------------------------------------------------------------\n # Build rerun-cli (rerun binaries):\n\n build-rerun-cli-and-upload-linux-arm64:\n needs: [checks]\n name: "Linux-arm64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: nightly-linux-arm64\n PLATFORM: linux-arm64\n secrets: inherit\n\n build-rerun-cli-and-upload-linux-x64:\n needs: [checks]\n name: "Linux-x64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: nightly-linux-x64\n PLATFORM: linux-x64\n secrets: inherit\n\n build-rerun-cli-and-upload-macos-x64:\n needs: [checks]\n name: 
"Mac-x64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: nightly-macos-x64\n PLATFORM: macos-x64\n secrets: inherit\n\n build-rerun-cli-and-upload-macos-arm64:\n needs: [checks]\n name: "Mac-arm64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: nightly-macos-arm64\n PLATFORM: macos-arm64\n secrets: inherit\n\n build-rerun-cli-and-upload-windows-x64:\n needs: [checks]\n name: "Windows-x64: Build & Upload rerun-cli"\n uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml\n with:\n CONCURRENCY: nightly-windows-x64\n PLATFORM: windows-x64\n secrets: inherit\n\n # ---------------------------------------------------------------------------\n # Build wheels:\n\n build-wheel-linux-arm64:\n needs: [checks, build-rerun-cli-and-upload-linux-arm64]\n name: "Linux-arm64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: nightly-linux-arm64\n PLATFORM: linux-arm64\n WHEEL_ARTIFACT_NAME: linux-arm64-wheel\n MODE: "pypi"\n secrets: inherit\n\n build-wheel-linux-x64:\n needs: [checks, build-rerun-cli-and-upload-linux-x64]\n name: "Linux-x64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: nightly-linux-x64\n PLATFORM: linux-x64\n WHEEL_ARTIFACT_NAME: linux-x64-wheel\n MODE: "pypi"\n secrets: inherit\n\n build-wheel-macos-arm64:\n needs: [checks, build-rerun-cli-and-upload-macos-arm64]\n name: "Macos-arm64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: nightly-macos-arm64\n PLATFORM: macos-arm64\n WHEEL_ARTIFACT_NAME: macos-arm64-wheel\n MODE: "pypi"\n secrets: inherit\n\n build-wheel-macos-x64:\n needs: [checks, build-rerun-cli-and-upload-macos-x64]\n name: "Macos-x64: Build & Upload Wheels"\n uses: 
./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: nightly-macos-x64\n PLATFORM: macos-x64\n WHEEL_ARTIFACT_NAME: "macos-x64-wheel"\n MODE: "pypi"\n secrets: inherit\n\n build-wheel-windows-x64:\n needs: [checks, build-rerun-cli-and-upload-windows-x64]\n name: "Windows-x64: Build & Upload Wheels"\n uses: ./.github/workflows/reusable_build_and_upload_wheels.yml\n with:\n CONCURRENCY: nightly-windows-x64\n PLATFORM: windows-x64\n WHEEL_ARTIFACT_NAME: windows-x64-wheel\n MODE: "pypi"\n secrets: inherit\n\n # ---------------------------------------------------------------------------\n # Test wheels:\n\n test-wheel-linux-arm64:\n needs: [checks, build-wheel-linux-arm64]\n name: "linux-arm64: Test Wheels"\n uses: ./.github/workflows/reusable_test_wheels.yml\n with:\n CONCURRENCY: nightly-linux-arm64\n PLATFORM: linux-arm64\n WHEEL_ARTIFACT_NAME: linux-arm64-wheel\n secrets: inherit\n\n test-wheel-linux-x64:\n needs: [checks, build-wheel-linux-x64]\n name: "Linux-x64: Test Wheels"\n uses: ./.github/workflows/reusable_test_wheels.yml\n with:\n CONCURRENCY: nightly-linux-x64\n PLATFORM: linux-x64\n WHEEL_ARTIFACT_NAME: linux-x64-wheel\n secrets: inherit\n\n test-wheel-macos-arm64:\n needs: [checks, build-wheel-macos-arm64]\n name: "macos-arm64: Test Wheels"\n uses: ./.github/workflows/reusable_test_wheels.yml\n with:\n CONCURRENCY: nightly-macos-arm64\n PLATFORM: macos-arm64\n WHEEL_ARTIFACT_NAME: macos-arm64-wheel\n secrets: inherit\n\n # TODO(#9108): Test macos wheels\n # test-wheel-macos-x64:\n # needs: [checks, build-wheel-macos-x64]\n # name: "macos-x64: Test Wheels"\n # uses: ./.github/workflows/reusable_test_wheels.yml\n # with:\n # CONCURRENCY: nightly-macos-x64\n # PLATFORM: macos-x64\n # WHEEL_ARTIFACT_NAME: macos-x64-wheel\n # secrets: inherit\n\n test-wheel-windows-x64:\n needs: [checks, build-wheel-windows-x64]\n name: "Windows-x64: Test Wheels"\n uses: ./.github/workflows/reusable_test_wheels.yml\n with:\n CONCURRENCY: 
nightly-windows-x64\n PLATFORM: windows-x64\n WHEEL_ARTIFACT_NAME: windows-x64-wheel\n secrets: inherit\n\n # ---------------------------------------------------------------------------\n\n # TODO(#9304): make the notebook export work\n # run-notebook:\n # name: "Run Notebook"\n # needs: [build-wheel-linux-x64]\n # uses: ./.github/workflows/reusable_run_notebook.yml\n # with:\n # CONCURRENCY: nightly\n # WHEEL_ARTIFACT_NAME: linux-x64-wheel\n # secrets: inherit\n\n build-examples:\n name: "Build Examples"\n needs: [build-wheel-linux-x64]\n uses: ./.github/workflows/reusable_build_examples.yml\n with:\n CONCURRENCY: nightly\n CHANNEL: nightly\n WHEEL_ARTIFACT_NAME: linux-x64-wheel\n secrets: inherit\n\n upload-examples:\n name: "Upload Examples"\n needs: [build-examples]\n uses: ./.github/workflows/reusable_upload_examples.yml\n with:\n CONCURRENCY: nightly\n NIGHTLY: true\n secrets: inherit\n\n benches:\n name: Benchmarks\n uses: ./.github/workflows/reusable_bench.yml\n with:\n CONCURRENCY: nightly\n SAVE_BENCHES: true\n BENCH_NAME: main # We currently only run benches nightly, but we used to run them on main\n COMPARE_TO: main # We currently only run benches nightly, but we used to run them on main\n secrets: inherit\n\n # --------------------------------------------------------------------------\n # Release:\n\n generate-pip-index:\n name: "Generate Pip Index"\n needs:\n [\n build-wheel-linux-arm64,\n build-wheel-linux-x64,\n build-wheel-macos-arm64,\n build-wheel-macos-x64,\n build-wheel-windows-x64,\n ]\n uses: ./.github/workflows/reusable_pip_index.yml\n with:\n CONCURRENCY: nightly\n secrets: inherit\n\n bundle-and-upload-rerun_cpp:\n name: "Bundle and upload rerun_cpp_sdk.zip"\n needs:\n [\n build-rerun_c-and-upload-linux-arm64,\n build-rerun_c-and-upload-linux-x64,\n build-rerun_c-and-upload-macos-arm64,\n build-rerun_c-and-upload-macos-x64,\n build-rerun_c-and-upload-windows-x64,\n ]\n uses: ./.github/workflows/reusable_bundle_and_upload_rerun_cpp.yml\n 
with:\n CONCURRENCY: nightly\n secrets: inherit\n\n pre-release:\n name: Pre Release\n concurrency: nightly\n needs:\n [\n build-rerun-cli-and-upload-linux-arm64,\n build-rerun-cli-and-upload-linux-x64,\n build-rerun-cli-and-upload-macos-arm64,\n build-rerun-cli-and-upload-macos-x64,\n build-rerun-cli-and-upload-windows-x64,\n build-rerun_c-and-upload-linux-arm64,\n build-rerun_c-and-upload-linux-x64,\n build-rerun_c-and-upload-macos-arm64,\n build-rerun_c-and-upload-macos-x64,\n build-rerun_c-and-upload-windows-x64,\n bundle-and-upload-rerun_cpp,\n generate-pip-index,\n upload-web,\n ]\n runs-on: "ubuntu-latest"\n steps:\n - name: Add SHORT_SHA env property with commit short sha\n run: echo "SHORT_SHA=`echo ${{github.sha}} | cut -c1-7`" >> $GITHUB_ENV\n\n # First delete the old prerelease. If we don't do this, we don't get things like\n # proper source-archives and changelog info.\n # https://github.com/dev-drprasad/delete-tag-and-release\n - uses: dev-drprasad/delete-tag-and-release@v0.2.1\n with:\n tag_name: prerelease\n delete_release: true\n env:\n GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}\n\n # Create the actual prerelease\n # https://github.com/ncipollo/release-action\n - name: GitHub Release\n uses: ncipollo/release-action@v1.12.0\n with:\n body: |\n This is a prerelease. 
It is not intended for production use.\n Please report any issues you find.\n\n ## Example Hosted App\n https://rerun.io/viewer/commit/${{ env.SHORT_SHA }}\n\n ## Wheels can be installed with:\n ```\n pip install --pre --no-index -f https://build.rerun.io/commit/${{ env.SHORT_SHA }}/wheels --upgrade rerun-sdk\n ```\n or\n ```\n pip install --pre --no-index -f https://github.com/rerun-io/rerun/releases/download/prerelease --upgrade rerun-sdk\n ```\n\n ## CMake fetch-content for C++ SDK\n ```\n include(FetchContent)\n FetchContent_Declare(rerun_sdk URL https://build.rerun.io/commit/${{ env.SHORT_SHA }}/rerun_cpp_sdk.zip)\n FetchContent_MakeAvailable(rerun_sdk)\n ```\n or\n ```\n include(FetchContent)\n FetchContent_Declare(rerun_sdk URL https://github.com/rerun-io/rerun/releases/download/prerelease/rerun_cpp_sdk.zip)\n FetchContent_MakeAvailable(rerun_sdk)\n ```\n\n prerelease: true\n # Be explicit about the commit we're releasing/tagging.\n # Otherwise it can happen that there's a discrepancy between the tag and the commit for which we uploaded & linked artifacts.\n # It seems to otherwise use the latest commit for the tag. 
From the action's docs:\n # > If the tag of the release you are creating does not yet exist, you should set both the tag and commit action inputs.\n # We just deleted the previous tag, so this is the case!\n commit: ${{github.sha}}\n name: "Development Build"\n tag: "prerelease"\n token: ${{ secrets.GITHUB_TOKEN }}\n generateReleaseNotes: false\n allowUpdates: true\n removeArtifacts: true\n replacesArtifacts: true\n\n sync-release-assets:\n needs: [pre-release]\n name: "Sync pre-release assets & build.rerun.io"\n uses: ./.github/workflows/reusable_sync_release_assets.yml\n with:\n CONCURRENCY: nightly\n RELEASE_VERSION: prerelease\n secrets: inherit\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\nightly.yml | nightly.yml | YAML | 14,300 | 0.8 | 0.010917 | 0.117936 | node-utils | 792 | 2024-02-25T09:54:35.602178 | Apache-2.0 | false | b2ea3baac2e65a780dcebf24a98d81ae
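The `SHORT_SHA` that `nightly.yml` interpolates into the pre-release body above is simply the first seven characters of the commit SHA (`cut -c1-7`). Equivalent Python, with a hypothetical SHA for illustration:

```python
def short_sha(sha: str, length: int = 7) -> str:
    """Abbreviate a full commit SHA the way `cut -c1-7` does."""
    return sha[:length]

print(short_sha("0f1e2d3c4b5a69788796a5b4c3d2e1f0deadbeef"))  # → 0f1e2d3
```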
# .github/workflows/on_gh_release.yml (rerun-io/rerun)
name: "GitHub Release"

on:
  # Triggers when the `Publish release` button is pressed
  release:
    types: [published]

  # Manual trigger with `tag` input
  workflow_dispatch:
    inputs:
      tag_name:
        description: "Release tag"
        type: string
        required: true

concurrency:
  group: "release-${{ github.event.release.tag_name }}"
  cancel-in-progress: true

defaults:
  run:
    shell: bash

permissions:
  # required for updating the release
  contents: write
  id-token: write

jobs:
  sync-release-assets:
    name: "Sync Release Assets"
    uses: ./.github/workflows/reusable_sync_release_assets.yml
    with:
      CONCURRENCY: "${{ github.event.release.tag_name || inputs.tag_name }}"
      RELEASE_VERSION: "${{ github.event.release.tag_name || inputs.tag_name }}"
    secrets: inherit

# .github/workflows/on_pr_comment.yml (rerun-io/rerun)
# This workflow is triggered on any PR comment, and tries to find a `@rerun-bot` mention in it.
# If the mention is a command, such as `@rerun-bot full-check`, then it runs the command.
#
# Available commands:
#   full-check   Triggers a run of `on_push_main.yml` on the PR.

name: "PR Comment"

on:
  issue_comment:
    types: [created]

defaults:
  run:
    shell: bash

permissions:
  contents: "read"
  id-token: "write"
  pull-requests: "write"

jobs:
  parse-command:
    if: |
      contains(github.event.comment.html_url, '/pull/') &&
      contains(github.event.comment.body, '@rerun-bot') &&
      contains(
        fromJSON('["COLLABORATOR","MEMBER","OWNER"]'),
        github.event.comment.author_association
      )
    runs-on: ubuntu-latest
    outputs:
      command: ${{ steps.parse.outputs.command }}
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Parse comment
        id: parse
        env:
          GITHUB_COMMENT_BODY: "${{ github.event.comment.body }}"
        run: python ./scripts/ci/parse_bot_pr_comment.py

  full-check:
    needs: [parse-command]
    if: needs.parse-command.outputs.command == 'full-check'
    runs-on: ubuntu-latest
    steps:
      - name: Checkout repository
        uses: actions/checkout@v4

      - name: Dispatch main workflow
        id: dispatch
        env:
          # NOTE: This uses `RERUN_BOT_TOKEN` instead of `GITHUB_TOKEN`,
          # otherwise the recursive workflow protection prevents us from
          # starting the `on_push_main` run.
          # https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#triggering-a-workflow-from-a-workflow
          GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}
        run: |
          get_latest_workflow_run () {
            local workflow_name=$1
            local ref_name=$2
            local created_after=$3
            echo $(
              # https://cli.github.com/manual/gh_run_list
              # https://docs.github.com/en/search-github/getting-started-with-searching-on-github/understanding-the-search-syntax#query-for-dates
              gh run list \
                --workflow $workflow_name \
                --event workflow_dispatch \
                --branch $ref_name \
                --created ">$created_after" \
                --json databaseId
            )
          }

          dispatch_workflow () {
            local workflow_name=$1
            local ref_name=$2
            local inputs=$3
            # https://cli.github.com/manual/gh_workflow_run
            echo $inputs | gh workflow run $workflow_name --ref "refs/heads/$ref_name" --json
          }

          workflow_name='on_push_main.yml'
          ref_name=$(gh pr view ${{ github.event.issue.number }} --json headRefName | jq -r '.headRefName')
          inputs='{"CONCURRENCY":"pr-${{ github.event.issue.number }}-full-check"}'
          now=$(date --utc --iso-8601=seconds)

          echo "Dispatching workflow $workflow_name on branch $ref_name"
          dispatch_workflow $workflow_name $ref_name $inputs

          # `gh workflow run` does NOT return the ID.
          # In fact, it returns absolutely nothing: https://github.com/cli/cli/issues/4001
          # Instead, we have to wait for the workflow to start, and hope that nobody has
          # started a workflow in parallel with us on the same branch.

          echo "Fetching workflow run id…"
          run_info=$(get_latest_workflow_run $workflow_name $ref_name $now)
          echo $run_info
          run_id=$(echo $run_info | jq -r '.[0].databaseId')
          while [ $run_id == 'null' ]
          do
            run_info=$(get_latest_workflow_run $workflow_name $ref_name $now)
            echo $run_info
            run_id=$(echo $run_info | jq -r '.[0].databaseId')
            sleep 1
          done
          echo "Workflow run: https://github.com/rerun-io/rerun/actions/runs/$run_id"

          echo "workflow_run_url=https://github.com/rerun-io/rerun/actions/runs/$run_id" >> "$GITHUB_OUTPUT"

      - name: Create PR comment
        # https://github.com/mshick/add-pr-comment
        uses: mshick/add-pr-comment@v2.8.2
        with:
          # We use `GITHUB_TOKEN` here so there is no chance that we'll trigger another run of this workflow.
          # https://docs.github.com/en/actions/using-workflows/triggering-a-workflow#triggering-a-workflow-from-a-workflow
          repo-token: ${{ secrets.GITHUB_TOKEN }}
          message-id: "pr-${{ github.event.issue.number }}-${{ github.run_id }}"
          message: |
            Started a full build: ${{ steps.dispatch.outputs.workflow_run_url }}

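The dispatch-then-poll dance above exists because `gh workflow run` prints nothing, so the workflow has to keep listing runs until the new one appears. A minimal, language-agnostic sketch of that retry loop in Python, with the `gh run list` call stubbed out as an injectable `fetch_runs` callable (a hypothetical stand-in, not part of the workflow):

```python
import time

def wait_for_run_id(fetch_runs, attempts=30, delay=0.0):
    """Poll until the freshly dispatched workflow run shows up.

    `fetch_runs` stands in for `gh run list --json databaseId` and must
    return a list of {"databaseId": ...} dicts, newest first.
    """
    for _ in range(attempts):
        runs = fetch_runs()
        if runs:  # `gh` returns an empty list until the run is registered
            return runs[0]["databaseId"]
        time.sleep(delay)
    raise TimeoutError("workflow run never appeared")

# Simulate `gh` returning nothing twice, then the new run:
responses = iter([[], [], [{"databaseId": 12345}]])
run_id = wait_for_run_id(lambda: next(responses))
print(run_id)  # -> 12345
```

As in the shell version, this is inherently racy: it assumes nobody else dispatches the same workflow on the same branch during the polling window.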
# .github/workflows/on_pull_request.yml (rerun-io/rerun)
# Jobs that only run for developers on the `rerun` team.
# We have to ensure that these jobs _only_ run for PRs inside the `rerun-io` organization.
# This is done using the following check, added to every job:
#   if: github.event.pull_request.head.repo.owner.login == 'rerun-io'
# (unfortunately this does not work on the trigger or the entire `jobs` category)

name: Pull-Request

on:
  pull_request:
    types:
      - opened
      - synchronize

permissions: write-all

# These jobs use fairly short names as they are a prefix in the display hierarchy
jobs:
  checks:
    name: Checks
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io'
    uses: ./.github/workflows/reusable_checks.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      CHANNEL: pr
    secrets: inherit

  paths-filter:
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io'
    runs-on: ubuntu-latest
    outputs:
      cpp_changes: ${{ steps.filter.outputs.cpp_changes }}
      docs_changes: ${{ steps.filter.outputs.docs_changes }}
      python_changes: ${{ steps.filter.outputs.python_changes }}
      rust_changes: ${{ steps.filter.outputs.rust_changes }}
      protobuf_changes: ${{ steps.filter.outputs.protobuf_changes }}
      web_changes: ${{ steps.filter.outputs.web_changes }}
    steps:
      - uses: actions/checkout@v4
        with:
          ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.ref || '' }}
      - uses: dorny/paths-filter@v3
        id: filter
        with:
          filters: |
            cpp_changes:
              # - .github/**/*.yml - this is tempting, but leads to constant rebuilds
              - pixi.lock # maybe our build commands have changed
              - pixi.toml # maybe our build commands have changed
              - scripts/ci/*
              - '**/*.hpp'
              - '**/*.cpp'
              - '**/CMakeLists.txt'
              - '**/*cmake'

            docs_changes:
              # - .github/**/*.yml - this is tempting, but leads to constant rebuilds
              - .github/actions/vercel/**/*
              - pixi.lock # maybe our build commands have changed
              - pixi.toml # maybe our build commands have changed
              - scripts/ci/*
              - 'docs/content/**/*.md'
              - 'examples/**/*.md'
              - 'examples/manifest.toml'

            python_changes:
              # - .github/**/*.yml - this is tempting, but leads to constant rebuilds
              - pixi.lock # maybe our build commands have changed
              - pixi.toml # maybe our build commands have changed
              - scripts/ci/*
              - '**/*.py'
              - '**/requirements.txt'
              - '**/pyproject.toml'

            rust_changes:
              # - .github/**/*.yml - this is tempting, but leads to constant rebuilds
              - Cargo.lock
              - pixi.lock # maybe our build commands have changed
              - pixi.toml # maybe our build commands have changed
              - scripts/ci/*
              - "**/*.rs"
              - "**/*.toml"

            protobuf_changes:
              # - .github/**/*.yml - this is tempting, but leads to constant rebuilds
              - "**/**.proto"
              - pixi.lock # maybe our build commands have changed
              - pixi.toml # maybe our build commands have changed
              - scripts/ci/*
              - "**/*.toml"

            web_changes:
              # - .github/**/*.yml - this is tempting, but leads to constant rebuilds
              - Cargo.lock
              - pixi.lock # maybe our build commands have changed
              - pixi.toml # maybe our build commands have changed
              - scripts/ci/*
              - "**/*.html"
              - "**/*.js"
              - "**/*.mjs"
              - "**/*.json"
              - "**/*.rs"
              - "**/*.toml"
              - "**/yarn.lock"
              - "crates/viewer/re_ui/data/**"

  protobuf-checks:
    name: "Protobuf Checks"
    needs: [paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && needs.paths-filter.outputs.protobuf_changes == 'true'
    uses: ./.github/workflows/reusable_checks_protobuf.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
    secrets: inherit

  rust-checks:
    name: "Rust Checks"
    needs: [paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && needs.paths-filter.outputs.rust_changes == 'true'
    uses: ./.github/workflows/reusable_checks_rust.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      CHANNEL: pr
    secrets: inherit

  python-checks:
    name: "Python Checks"
    needs: [paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && needs.paths-filter.outputs.python_changes == 'true'
    uses: ./.github/workflows/reusable_checks_python.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
    secrets: inherit

  cpp-tests:
    name: "C++ tests"
    needs: [paths-filter]
    if: needs.paths-filter.outputs.cpp_changes == 'true'
    uses: ./.github/workflows/reusable_checks_cpp.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      CHANNEL: pr
    secrets: inherit

  min-cli-build:
    name: "Minimum CLI Build"
    needs: [paths-filter]
    if: needs.paths-filter.outputs.python_changes == 'true' || needs.paths-filter.outputs.rust_changes == 'true'
    uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      PLATFORM: linux-x64
    secrets: inherit

  # Build and test a single wheel to limit CI cost. We use linux-x64 because it's fast.
  # linux-arm64 would also be a good choice, but reusable_test_wheels.yml is broken for
  # that target (https://github.com/rerun-io/rerun/issues/5525).
  min-wheel-build:
    name: "Minimum Wheel Build"
    needs: [min-cli-build, paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && (needs.paths-filter.outputs.python_changes == 'true' || needs.paths-filter.outputs.rust_changes == 'true')
    uses: ./.github/workflows/reusable_build_and_upload_wheels.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      MODE: "pr"
      PLATFORM: linux-x64
      WHEEL_ARTIFACT_NAME: "linux-x64-wheel-fast"
    secrets: inherit

  min-wheel-test:
    name: "Minimum Wheel Test"
    needs: [min-wheel-build, paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && (needs.paths-filter.outputs.python_changes == 'true' || needs.paths-filter.outputs.rust_changes == 'true')
    uses: ./.github/workflows/reusable_test_wheels.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      PLATFORM: linux-x64
      WHEEL_ARTIFACT_NAME: "linux-x64-wheel-fast"
      FAST: true
    secrets: inherit

  build-js:
    name: "Build rerun_js"
    needs: [paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && needs.paths-filter.outputs.web_changes == 'true'
    uses: ./.github/workflows/reusable_build_js.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
    secrets: inherit

  build-web:
    name: "Build web viewer"
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && needs.paths-filter.outputs.web_changes == 'true'
    needs: [paths-filter]
    uses: ./.github/workflows/reusable_build_web.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      CHANNEL: main
    secrets: inherit

  upload-web:
    name: "Upload Web"
    needs: [build-web]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io'
    uses: ./.github/workflows/reusable_upload_web.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      PR_NUMBER: ${{ github.event.pull_request.number }}
    secrets: inherit

  deploy-landing-preview:
    name: "Deploy Landing Preview"
    needs: [paths-filter]
    if: github.event.pull_request.head.repo.owner.login == 'rerun-io' && needs.paths-filter.outputs.docs_changes == 'true'
    uses: ./.github/workflows/reusable_deploy_landing_preview.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      PR_NUMBER: ${{ github.event.pull_request.number }}
    secrets: inherit

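The `paths-filter` job above maps changed files onto named filters so downstream jobs can be skipped. The idea can be sketched in a few lines of Python using `fnmatch`-style globs; note this is only an illustration — `dorny/paths-filter` uses minimatch semantics, which differ from `fnmatch` in edge cases such as `**/` matching zero directories:

```python
from fnmatch import fnmatch

# Trimmed-down stand-in for the filter table above (subset of patterns).
FILTERS = {
    "python_changes": ["pixi.lock", "pixi.toml", "scripts/ci/*", "**/*.py"],
    "rust_changes": ["Cargo.lock", "pixi.lock", "pixi.toml", "scripts/ci/*", "**/*.rs", "**/*.toml"],
}

def changed_filters(changed_files, filters=FILTERS):
    """Return the set of filter names hit by any changed file."""
    return {
        name
        for name, patterns in filters.items()
        if any(fnmatch(path, pat) for path in changed_files for pat in patterns)
    }

print(changed_filters(["rerun_py/rerun_sdk/rerun/archetypes/points3d.py"]))
# -> {'python_changes'}
```

Each downstream job then gates on its filter output, exactly like `needs.paths-filter.outputs.python_changes == 'true'` above.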
# .github/workflows/on_pull_request_contrib.yml (rerun-io/rerun)
# Jobs that only run for external contributors.
# These have to be carefully sanitized, since we don't want to leak secrets.
# - We can't use caching, outside of really rare scenarios, because our caching largely depends on GCS.
# - We have to ensure that these jobs _only_ run for PRs outside of the `rerun-io` organization.
#   This is done using the following check, added to every job:
#     if: github.event.pull_request.head.repo.owner.login != 'rerun-io'

name: Pull-Request (Contrib)

on:
  pull_request:
    types:
      - opened
      - synchronize

permissions:
  contents: "read"

jobs:
  checks:
    name: "Checks"
    if: github.event.pull_request.head.repo.owner.login != 'rerun-io'
    uses: ./.github/workflows/contrib_checks.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      PR_NUMBER: ${{ github.event.pull_request.number }}

  python:
    name: "Python"
    if: github.event.pull_request.head.repo.owner.login != 'rerun-io'
    uses: ./.github/workflows/contrib_rerun_py.yml
    with:
      CONCURRENCY: pr-${{ github.event.pull_request.number }}
      MATURIN_FEATURE_FLAGS: "--no-default-features --features extension-module"

# .github/workflows/on_push_docs.yml (rerun-io/rerun)
name: "Push Docs"

on:
  push:
    branches: [docs-latest]

concurrency:
  group: on-push-docs
  cancel-in-progress: true

permissions:
  contents: "read"
  id-token: "write"

defaults:
  run:
    shell: bash

jobs:
  # Get latest release version from crates.io.
  # This excludes any prerelease builds, e.g. `0.15.0-alpha.1` or `rc` or similar.
  # We get it from crates.io because it's the strongest indicator of the latest
  # fully released version we have available, and there is no better way to retrieve
  # that in the context of this branch, because we don't want to rely on the contents
  # of a local `Cargo.toml` or the git branch name.
  get-version:
    runs-on: ubuntu-latest
    outputs:
      version: ${{ steps.versioning.outputs.crate_version }}
    steps:
      - uses: actions/checkout@v4

      - uses: prefix-dev/setup-pixi@v0.8.1
        with:
          pixi-version: v0.41.4

      - name: Get version
        id: versioning
        run: |
          crate_version=$(pixi run python scripts/ci/crates.py get-version --from=cratesio --skip-prerelease)
          echo "crate_version=$crate_version" >> "$GITHUB_OUTPUT"

  build-search-index:
    runs-on: ubuntu-latest-16-cores
    strategy:
      matrix:
        toolchain: ["nightly-2025-02-05"]
    needs: [get-version]
    steps:
      - uses: actions/checkout@v4

      - name: Set up Rust
        uses: ./.github/actions/setup-rust
        with:
          cache_key: "build-linux"
          save_cache: false
          workload_identity_provider: ${{ secrets.GOOGLE_WORKLOAD_IDENTITY_PROVIDER }}
          service_account: ${{ secrets.GOOGLE_SERVICE_ACCOUNT }}
          # pinned to a specific version that happens to work with current `rustdoc-types`
          toolchains: ${{ matrix.toolchain }}

      - uses: prefix-dev/setup-pixi@v0.8.1
        with:
          pixi-version: v0.41.4
          environments: py-docs

      - name: Install rerun-sdk
        run: |

      - name: Build search index
        env:
          # Here we disable:
          # - All warnings - the nightly toolchain has a different set of warnings enabled by default.
          #   We already test with `-D warnings` elsewhere, and the output is really noisy with the nightly warnings.
          # - The web viewer server - we don't want to waste time building the web viewer,
          #   because it is not actually going to run.
          RUSTFLAGS: "-Awarnings --cfg disable_web_viewer_server"
        run: |
          pixi run search-index build \
            landing \
            --url "https://edge.meilisearch.com" \
            --master-key "${{ secrets.MEILISEARCH_TOKEN }}" \
            --release-version "${{ needs.get-version.outputs.version }}" \
            --rust-toolchain "${{ matrix.toolchain }}"

  redeploy-rerun-io:
    runs-on: ubuntu-latest
    needs: [get-version]
    steps:
      - uses: actions/checkout@v4

      - name: Re-deploy rerun.io
        uses: ./.github/actions/vercel
        with:
          command: "deploy"
          vercel_token: "${{ secrets.VERCEL_TOKEN }}"
          vercel_team_name: "${{ vars.VERCEL_TEAM_NAME }}"
          vercel_project_name: "${{ vars.VERCEL_PROJECT_NAME }}"
          release_commit: "docs-latest"
          release_version: "${{ needs.get-version.outputs.version }}"
          target: "production"

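The `get-version` job above asks crates.io for the latest version while skipping prereleases like `0.15.0-alpha.1`. The selection rule (the real logic lives in `scripts/ci/crates.py`; this assumes plain semver-style strings) can be sketched as:

```python
import re

# major.minor.patch with an optional `-prerelease` suffix
SEMVER = re.compile(r"^(\d+)\.(\d+)\.(\d+)(?:-(.+))?$")

def latest_stable(versions):
    """Pick the highest version, skipping anything with a prerelease suffix."""
    stable = []
    for v in versions:
        m = SEMVER.match(v)
        if m and m.group(4) is None:  # no `-alpha.N` / `-rc.N` suffix
            stable.append((tuple(int(x) for x in m.groups()[:3]), v))
    if not stable:
        raise ValueError("no stable releases found")
    return max(stable)[1]

print(latest_stable(["0.15.0-alpha.1", "0.14.1", "0.15.0-rc.2", "0.14.0"]))
# -> 0.14.1
```

This is why the workflow trusts crates.io over a local `Cargo.toml`: the highest *published, non-prerelease* version is the only reliable notion of "latest release" on the `docs-latest` branch.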
# .github/workflows/on_push_main.yml (rerun-io/rerun)
name: Push To Main

on:
  push:
    branches:
      - "main"

  # Can be triggered manually from within the UI or using the GH CLI,
  # e.g. `gh workflow run on_push_main.yml --ref main`
  workflow_dispatch:
    inputs:
      CONCURRENCY:
        required: true
        type: string

permissions: write-all

jobs:
  checks:
    name: Checks
    uses: ./.github/workflows/reusable_checks.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      CHANNEL: main
    secrets: inherit

  cpp_checks:
    name: Checks
    uses: ./.github/workflows/reusable_checks_cpp.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      CHANNEL: main
    secrets: inherit

  rust_checks:
    name: Checks
    uses: ./.github/workflows/reusable_checks_rust.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      CHANNEL: main
    secrets: inherit

  python_checks:
    name: Checks
    uses: ./.github/workflows/reusable_checks_python.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
    secrets: inherit

  deploy-docs:
    needs: [checks]
    name: Deploy Docs
    uses: ./.github/workflows/reusable_deploy_docs.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      PY_DOCS_VERSION_NAME: "main"
      CPP_DOCS_VERSION_NAME: "main"
      RS_DOCS_VERSION_NAME: "head"
      UPDATE_LATEST: false
    secrets: inherit

  build-web:
    name: "Build web viewer"
    uses: ./.github/workflows/reusable_build_web.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      CHANNEL: main
    secrets: inherit

  upload-web:
    name: "Upload Web"
    needs: [build-web]
    uses: ./.github/workflows/reusable_upload_web.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
    secrets: inherit

  build-examples:
    name: "Build Examples"
    needs: [build-wheel-linux-x64]
    uses: ./.github/workflows/reusable_build_examples.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      CHANNEL: main
      WHEEL_ARTIFACT_NAME: linux-x64-wheel
    secrets: inherit

  track-sizes:
    name: "Track Sizes"
    needs: [build-web, build-examples]
    uses: ./.github/workflows/reusable_track_size.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      WITH_EXAMPLES: true
    secrets: inherit

  upload-examples:
    name: "Upload Examples"
    needs: [build-examples]
    uses: ./.github/workflows/reusable_upload_examples.yml
    with:
      CONCURRENCY: push-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
    secrets: inherit

  # -----------------------------------------------------------------------------------
  # TODO(emilk): build and test one additional platform, picked at random

  build-rerun_c-and-upload-linux-x64:
    needs: [checks]
    name: "Linux-x64: Build & Upload rerun_c"
    uses: ./.github/workflows/reusable_build_and_upload_rerun_c.yml
    with:
      CONCURRENCY: push-linux-x64-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      PLATFORM: linux-x64
    secrets: inherit

  # -----------------------------------------------------------------------------------
  # TODO(emilk): build and test one additional platform, picked at random

  build-rerun-cli-and-upload-linux-x64:
    needs: [checks]
    name: "Linux-x64: Build & Upload rerun-cli"
    uses: ./.github/workflows/reusable_build_and_upload_rerun_cli.yml
    with:
      CONCURRENCY: push-linux-x64-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      PLATFORM: linux-x64
    secrets: inherit

  # -----------------------------------------------------------------------------------
  # TODO(emilk): build and test one additional platform, picked at random

  build-wheel-linux-x64:
    needs: [checks, build-rerun-cli-and-upload-linux-x64]
    name: "Linux-x64: Build & Upload Wheels"
    uses: ./.github/workflows/reusable_build_and_upload_wheels.yml
    with:
      CONCURRENCY: push-linux-x64-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      PLATFORM: linux-x64
      WHEEL_ARTIFACT_NAME: linux-x64-wheel
      MODE: "pypi"
    secrets: inherit

  test-wheel-linux-x64:
    needs: [checks, build-wheel-linux-x64]
    name: "Linux-x64: Test Wheels"
    uses: ./.github/workflows/reusable_test_wheels.yml
    with:
      CONCURRENCY: push-linux-x64-${{ github.ref_name }}-${{ inputs.CONCURRENCY }}
      PLATFORM: linux-x64
      WHEEL_ARTIFACT_NAME: linux-x64-wheel
    secrets: inherit

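The `needs:` chains in the workflow above form a dependency DAG that GitHub resolves before scheduling jobs. A quick way to sanity-check such a chain locally (a hypothetical helper, not part of the repo) is Python's standard `graphlib`:

```python
from graphlib import TopologicalSorter

# A subset of the `needs:` edges from the main-push workflow above.
needs = {
    "checks": [],
    "build-rerun-cli-and-upload-linux-x64": ["checks"],
    "build-wheel-linux-x64": ["checks", "build-rerun-cli-and-upload-linux-x64"],
    "test-wheel-linux-x64": ["checks", "build-wheel-linux-x64"],
    "build-examples": ["build-wheel-linux-x64"],
}

# static_order() raises CycleError if a `needs:` cycle sneaks in.
order = list(TopologicalSorter(needs).static_order())

# Every job appears after all of its dependencies:
for job, deps in needs.items():
    assert all(order.index(d) < order.index(job) for d in deps)
print(order[0])  # -> checks
```

GitHub performs the same validation server-side (a cyclic `needs:` graph is rejected when the workflow is parsed), but checking locally catches typos in job names before pushing.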
name: Release\n\non:\n workflow_dispatch:\n inputs:\n release-type:\n description: "What kind of release is this?"\n type: choice\n options:\n - alpha\n - rc\n - final\n required: true\n\nconcurrency:\n group: ${{ github.ref_name }}\n cancel-in-progress: true\n\ndefaults:\n run:\n shell: bash\n\n# wants to push commits and create a PR\npermissions: write-all\n\njobs:\n # Re-entrancy:\n # - `version` is re-entrant because it doesn't commit/create PR if the version doesn't change,\n # and the version doesn't change if we're already on the final version specified by the branch name.\n # - `update-docs` is re-entrant because it overwrites history of the `gh-pages` branch, so any\n # previous partial update will just be overwritten by the next successful run.\n # - `publish-crates` is re-entrant because the `crates.py` script correctly handles publish failures\n # by first checking if a crate has already been published before attempting to publish it.\n # - `build-and-publish-wheels` is re-entrant because all the uploaded artifacts will be overwritten\n # by any subsequent runs, and the final upload to PyPI has the `--skip-existing` flag, which ignores\n # any wheels already uploaded.\n # - `build-and-publish-web` is re-entrant for the same reason as `build-and-publish-wheels`,\n # except that uploads are done to GCS instead of PyPI.\n\n checks:\n name: "Checks"\n runs-on: ubuntu-latest\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n\n - name: Setup Python\n uses: actions/setup-python@v5\n with:\n python-version: 3.11\n\n - name: Check links for `?speculative-link`\n # This checks that we have no links with `?speculative-link` in its query params.\n # We use those markers to get our link checker to ignore links to unreleased docs.\n #\n # NOTE: For alpha releases, we won't fully publish all our docs,\n # so we skip the check here, because we won't be able to\n # remove the markers yet.\n run: |\n if [ ${{ inputs.release-type }} != "alpha" ]; 
then\n python3 scripts/ci/check_speculative_links.py\n fi\n\n # NOTE: When updating this job, also remember to update `post-release-version-bump`.\n version:\n name: "Versioning"\n runs-on: ubuntu-latest\n outputs:\n previous: ${{ steps.versioning.outputs.previous }}\n current: ${{ steps.versioning.outputs.current }}\n final: ${{ steps.versioning.outputs.final }}\n git_tag: ${{ steps.versioning.outputs.git_tag }}\n # will be set to `github.sha` if the pull request already exists\n # this is the last (and not merge) commit in the release branch\n release-commit: ${{ steps.commit.outputs.version_bump_commit_sha || github.sha }}\n\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n token: ${{ secrets.RERUN_BOT_TOKEN }}\n\n - uses: actions/setup-node@v4\n with:\n node-version: 18\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Update crate versions\n id: versioning\n run: |\n echo Check that the release version matches expected format…\n pixi run python scripts/ci/crates.py check-git-branch-name\n\n echo Parse the release version from the branch name…\n # `release-0.8.1-meta.N` -> `0.8.1`\n release_version=$(pixi run python scripts/ci/crates.py get-version --from git --finalize)\n\n echo "release_version: $release_version"\n\n echo Store version before the update, so we can later detect if it changed…\n previous=$(pixi run python scripts/ci/crates.py get-version)\n\n echo If the version minus prerelease/build metadata is not the same as the release version, then update it.…\n if [ $(pixi run python scripts/ci/crates.py get-version --finalize) != $release_version ]; then\n pixi run python scripts/ci/crates.py version --exact $release_version\n fi\n\n echo If this is an 'rc', additionally set add '-rc.N'. 
This will also bump the 'N' if '-rc.N' is already set…\n if [ ${{ inputs.release-type }} = "rc" ]; then\n pixi run python scripts/ci/crates.py version --bump prerelease --pre-id=rc\n fi\n\n echo If this is an 'alpha', set the version to whatever is in the git branch name.…\n if [ ${{ inputs.release-type }} = "alpha" ]; then\n pixi run python scripts/ci/crates.py version --exact $(pixi run python scripts/ci/crates.py get-version --from git)\n fi\n\n echo If this is a 'final', set the version to the final release version…\n if [ ${{ inputs.release-type }} = "final" ]; then\n pixi run python scripts/ci/crates.py version --exact $release_version\n fi\n\n echo Store version after the update, and the expected "final" release version…\n current=$(pixi run python scripts/ci/crates.py get-version)\n final=$(pixi run python scripts/ci/crates.py get-version --finalize)\n\n echo Output everything for use in other steps…\n echo "previous=$previous"\n echo "current=$current"\n echo "final=$final"\n\n echo "previous=$previous" >> "$GITHUB_OUTPUT"\n echo "current=$current" >> "$GITHUB_OUTPUT"\n echo "final=$final" >> "$GITHUB_OUTPUT"\n\n # Pick what version we use for creating a github tag.\n if [ ${{ inputs.release-type }} = "final" ]; then\n git_tag=$final\n else\n git_tag=$current\n fi\n\n # Verify that it wasn't created yet.\n if [ $(git tag -l "$git_tag") ]; then\n echo "Error: Version tag $git_tag already exists!"\n exit 1\n fi\n echo "git_tag=$git_tag" >> "$GITHUB_OUTPUT"\n\n - name: Update rerun_py & rerun_c version\n run: |\n pixi run python scripts/ci/update_rerun_py_and_c_version.py "${{ steps.versioning.outputs.current }}"\n\n - name: Update rerun_notebook package version\n run: |\n pixi run python scripts/ci/update_rerun_notebook_version.py "${{ steps.versioning.outputs.current }}"\n\n - name: Update JS package versions\n run: |\n pixi run node rerun_js/scripts/version.mjs "${{ steps.versioning.outputs.current }}"\n\n - run: pixi run toml-fmt\n\n - name: Commit new 
version\n id: commit\n if: steps.versioning.outputs.previous != steps.versioning.outputs.current\n run: |\n git pull\n git config --global user.name "rerun-bot"\n git config --global user.email "bot@rerun.io"\n git commit -am "Bump versions to ${{ steps.versioning.outputs.current }}"\n git push\n echo "version_bump_commit_sha=$(git rev-parse HEAD)" >> "$GITHUB_OUTPUT"\n\n - name: Create pull request\n env:\n GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}\n run: |\n set +e\n pr=$(gh pr view --json headRefName 2>/dev/null || echo "{}")\n if echo "$pr" | jq '. | has("headRefName")' | grep -q 'true'; then\n echo "PR already exists"\n exit 0\n fi\n set -e\n\n echo "PR does not exist, creating…"\n\n cat <<EOF > pr-body.txt\n ### Next steps\n - Test the release\n - If this is an 'alpha' release, you can just merge the pull request.\n - Otherwise:\n - For any added commits, run the release workflow in 'rc' mode again\n - After testing, _ensure that this PR is mergeable to `main`_, then run the release workflow in 'release' mode\n - Once the final release workflow finishes it will create a GitHub release for you. 
Then:\n - [ ] Sanity check the build artifacts:\n - [ ] pip install: does it install and run?\n - [ ] cargo install of cli tool: does it install and run?\n - [ ] C++ SDK zip: does it contain rerun_c for all platforms?\n - [ ] Populate the release with the changelog and a nice header video/picture, check `Set as latest release`, then click `Publish release`.\n - [ ] Update the [google colab notebooks](https://colab.research.google.com/drive/1R9I7s4o6wydQC_zkybqaSRFTtlEaked_) to install this version and re-execute the notebook.\n - [ ] Update landing's version of the web viewer (@jprochazk)\n\n A few hours after the GitHub release is created, `regro-cf-autotick-bot` will create a\n [conda feedstock PR](https://github.com/conda-forge/rerun-sdk-feedstock/pulls).\n Make sure Jeremy is on top of it!\n\n - [ ] Tests\n - [ ] Windows\n - [ ] Linux\n - [ ] MacOS\n EOF\n\n gh pr create \\n --base main \\n --head $(git branch --show-current) \\n --title "Release ${{ (inputs.release-type == 'alpha' && steps.versioning.outputs.current) || steps.versioning.outputs.final }}" \\n --label "⛴ release" \\n --label "exclude from changelog" \\n --fill \\n --body-file pr-body.txt\n\n update-docs:\n name: "Update Docs"\n needs: [version, publish-web]\n uses: ./.github/workflows/reusable_deploy_docs.yml\n with:\n CONCURRENCY: ${{ github.ref_name }}\n PY_DOCS_VERSION_NAME: ${{ inputs.release-type == 'final' && needs.version.outputs.final || 'dev' }}\n CPP_DOCS_VERSION_NAME: ${{ inputs.release-type == 'final' && 'stable' || 'dev' }}\n RS_DOCS_VERSION_NAME: ${{ inputs.release-type == 'final' && 'stable' || 'dev' }}\n RELEASE_COMMIT: ${{ needs.version.outputs.release-commit }}\n RELEASE_VERSION: ${{ needs.version.outputs.final }}\n UPDATE_LATEST: ${{ inputs.release-type == 'final' }}\n secrets: inherit\n\n publish-crates:\n name: "Publish Crates"\n needs: [version]\n uses: ./.github/workflows/reusable_release_crates.yml\n with:\n CONCURRENCY: ${{ github.ref_name }}\n RELEASE_COMMIT: ${{ 
needs.version.outputs.release-commit }}\n secrets: inherit\n\n publish-rerun_c:\n name: "Build and Publish C/C++ SDKs"\n needs: [version]\n uses: ./.github/workflows/reusable_publish_rerun_c.yml\n with:\n release-version: ${{ needs.version.outputs.current }}\n release-commit: ${{ needs.version.outputs.release-commit }}\n concurrency: ${{ github.ref_name }}\n secrets: inherit\n\n publish-rerun-cli:\n name: "Publish rerun-cli"\n needs: [version]\n uses: ./.github/workflows/reusable_publish_rerun_cli.yml\n with:\n release-version: ${{ needs.version.outputs.current }}\n release-commit: ${{ needs.version.outputs.release-commit }}\n concurrency: ${{ github.ref_name }}\n secrets: inherit\n\n publish-wheels:\n name: "Build and Publish Wheels"\n needs: [version, publish-rerun-cli]\n uses: ./.github/workflows/reusable_publish_wheels.yml\n with:\n release-version: ${{ needs.version.outputs.current }}\n concurrency: ${{ github.ref_name }}\n release-commit: ${{ needs.version.outputs.release-commit }}\n secrets: inherit\n\n publish-web:\n name: "Build and Publish Web"\n needs: [version, publish-wheels]\n uses: ./.github/workflows/reusable_publish_web.yml\n with:\n release-version: ${{ needs.version.outputs.current }}\n release-commit: ${{ needs.version.outputs.release-commit }}\n concurrency: ${{ github.ref_name }}\n wheel-artifact-name: linux-x64-wheel\n update-latest: ${{ inputs.release-type == 'final' }}\n secrets: inherit\n\n publish-js:\n name: "Publish JS"\n needs: [version]\n uses: ./.github/workflows/reusable_publish_js.yml\n with:\n release-commit: ${{ needs.version.outputs.release-commit }}\n concurrency: ${{ github.ref_name }}\n secrets: inherit\n\n # Force-pushes `latest` and `docs-latest` to the contents of the release branch.\n # The push to `docs-latest` also triggers a re-deploy of `rerun.io`.\n update-latest-branch:\n name: "Update Latest Branch"\n if: inputs.release-type == 'final'\n needs:\n [\n version,\n update-docs,\n publish-crates,\n publish-wheels,\n 
publish-web,\n publish-rerun_c,\n publish-rerun-cli,\n publish-js,\n ]\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n token: ${{ secrets.RERUN_BOT_TOKEN }}\n ref: ${{ needs.version.outputs.release-commit }}\n\n - name: Update latest branch\n run: |\n git config --global user.name "rerun-bot"\n git config --global user.email "bot@rerun.io"\n git fetch\n git checkout ${{ github.ref_name }}\n git push --force origin refs/heads/${{ github.ref_name }}:refs/heads/latest\n git push --force origin refs/heads/${{ github.ref_name }}:refs/heads/docs-latest\n\n github-release:\n name: "GitHub Release"\n if: inputs.release-type == 'rc' || inputs.release-type == 'final'\n needs:\n [\n version,\n update-docs,\n publish-crates,\n publish-wheels,\n publish-web,\n publish-rerun_c,\n publish-rerun-cli,\n publish-js,\n ]\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n token: ${{ secrets.RERUN_BOT_TOKEN }}\n\n - name: Release tag\n env:\n GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}\n run: |\n version="${{ needs.version.outputs.git_tag }}"\n commit="${{ needs.version.outputs.release-commit }}"\n\n if [ ${{ inputs.release-type }} = "final" ]; then\n pre_arg=""\n else\n pre_arg="--prerelease"\n fi\n\n git tag $version $commit\n git push origin $version\n gh release create $version --verify-tag --draft --title $version $pre_arg\n\n - name: Create comment\n env:\n GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}\n run: |\n pr_number=$(gh pr view --json number | jq '.number')\n version="${{ needs.version.outputs.final }}"\n\n cat <<EOF > comment-body.txt\n GitHub release draft: [$version](https://github.com/rerun-io/rerun/releases/tag/$version)\n\n Add a description, changelog, and a nice header video/picture, then click 'Publish release'.\n EOF\n\n gh pr comment $pr_number --body-file comment-body.txt\n\n # Bump versions to next minor+alpha after the release has 
finished,\n # so that the release PR can be merged.\n post-release-version-bump:\n name: "Post-Release Version Bump"\n # We don't need to bump versions for `rc` releases, because we don't merge those.\n if: inputs.release-type == 'alpha' || inputs.release-type == 'final'\n needs:\n [\n version,\n update-docs,\n publish-crates,\n publish-wheels,\n publish-web,\n publish-rerun_c,\n publish-rerun-cli,\n publish-js,\n ]\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: actions/checkout@v4\n with:\n fetch-depth: 0\n token: ${{ secrets.RERUN_BOT_TOKEN }}\n\n - uses: actions/setup-node@v4\n with:\n node-version: 18\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: git config\n run: |\n git config --global user.name "rerun-bot"\n git config --global user.email "bot@rerun.io"\n git checkout ${{ github.ref_name }}\n git pull --rebase\n\n - name: Update crate versions\n id: crates\n run: |\n pixi run python scripts/ci/crates.py version --bump auto\n version="$(pixi run python scripts/ci/crates.py get-version)"\n echo "version=$version" >> "$GITHUB_OUTPUT"\n\n - name: Update rerun_notebook package version\n run: |\n pixi run python scripts/ci/update_rerun_notebook_version.py "${{ steps.crates.outputs.version }}"\n\n - name: Update JS package versions\n run: |\n pixi run node rerun_js/scripts/version.mjs "${{ steps.crates.outputs.version }}"\n\n - name: Update rerun_py & rerun_c version\n run: |\n pixi run python scripts/ci/update_rerun_py_and_c_version.py "${{ steps.crates.outputs.version }}"\n\n - run: pixi run toml-fmt\n\n - name: Commit new version\n run: |\n git commit -am "Bump versions to ${{ steps.crates.outputs.version }}"\n git push\n\n comment-artifact-links:\n name: "Link to artifacts"\n needs:\n [\n version,\n update-docs,\n publish-crates,\n publish-wheels,\n publish-web,\n publish-rerun_c,\n publish-rerun-cli,\n publish-js,\n ]\n runs-on: ubuntu-latest\n steps:\n - name: Checkout repository\n uses: 
actions/checkout@v4\n with:\n fetch-depth: 0\n token: ${{ secrets.RERUN_BOT_TOKEN }}\n\n - name: Create comment\n env:\n GH_TOKEN: ${{ secrets.RERUN_BOT_TOKEN }}\n run: |\n pr_number=$(gh pr view --json number | jq '.number')\n echo "pr_number: $pr_number"\n short_commit_hash=$(echo ${{ needs.version.outputs.release-commit }} | cut -c1-7)\n\n if [ ${{ inputs.release-type }} = "final" ]; then\n web_app_link="https://rerun.io/viewer/version/${{ needs.version.outputs.final }}"\n rerun_io_docs_link="https://rerun.io/docs"\n py_docs_link="https://ref.rerun.io/docs/python/${{ needs.version.outputs.final }}"\n else\n web_app_link="https://rerun.io/viewer/commit/$short_commit_hash"\n rerun_io_docs_link="https://rerun.io/preview/$short_commit_hash/docs"\n py_docs_link="https://ref.rerun.io/docs/python/dev"\n fi\n wheels_link="https://pypi.org/project/rerun-sdk/${{ needs.version.outputs.current }}"\n crates_link="https://crates.io/crates/rerun/${{ needs.version.outputs.current }}"\n npm_link="https://www.npmjs.com/package/@rerun-io/web-viewer/v/${{ needs.version.outputs.current }}"\n rs_docs_link="https://docs.rs/rerun/${{ needs.version.outputs.current }}"\n cpp_sdk_zip_link="https://build.rerun.io/commit/$short_commit_hash/rerun_cpp_sdk.zip"\n\n pip_install="pip install rerun-sdk==${{ needs.version.outputs.current }}"\n cargo_install="cargo install rerun-cli@${{ needs.version.outputs.current }}"\n npm_install="npm install @rerun-io/web-viewer@${{ needs.version.outputs.current }}"\n\n cat <<EOF > comment-body.txt\n Version ${{ needs.version.outputs.current }} published successfully.\n\n | artifact | install |\n | --------------------------------- | -------------- |\n | [web app]($web_app_link) | |\n | [wheels]($wheels_link) | $pip_install |\n | [crates]($crates_link) | $cargo_install |\n | [npm]($npm_link) | $npm_install |\n | [docs]($rerun_io_docs_link) | |\n | [py docs]($py_docs_link) | |\n | [rs docs]($rs_docs_link) | |\n | [cpp_sdk zip]($cpp_sdk_zip_link) | |\n EOF\n\n 
gh pr comment $pr_number --body-file comment-body.txt\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\release.yml | release.yml | YAML | 19,623 | 0.95 | 0.052023 | 0.069196 | node-utils | 312 | 2024-12-05T15:39:43.969877 | Apache-2.0 | false | 2eb4f0bbb6c26998e8ee098cb5d55b33 |
name: Reusable Bench\n\non:\n workflow_call:\n inputs:\n CONCURRENCY:\n required: true\n type: string\n SAVE_BENCHES:\n required: false\n type: boolean\n default: false\n BENCH_NAME:\n required: false\n type: string\n default: ""\n COMPARE_TO:\n required: false\n type: string\n default: ""\n\nconcurrency:\n group: ${{ inputs.CONCURRENCY }}-bench\n cancel-in-progress: true\n\nenv:\n PYTHON_VERSION: "3.9"\n # web_sys_unstable_apis is required to enable the web_sys clipboard API which egui_web uses\n # https://rustwasm.github.io/wasm-bindgen/api/web_sys/struct.Clipboard.html\n # https://rustwasm.github.io/docs/wasm-bindgen/web-sys/unstable-apis.html\n RUSTFLAGS: --cfg=web_sys_unstable_apis --deny warnings\n\n RUSTDOCFLAGS: --deny warnings\n\n # Disable the GHA backend (Github's 10GB storage) since we use our own GCS backend.\n # See: https://github.com/marketplace/actions/sccache-action\n SCCACHE_GHA_ENABLED: "false"\n\n # Wrap every `rustc` invocation in `sccache`.\n RUSTC_WRAPPER: "sccache"\n\n # Not only `sccache` cannot cache incremental builds, it's counter-productive to generate all\n # these incremental artifacts when running on CI.\n CARGO_INCREMENTAL: "0"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n # contents permission to update benchmark contents in gh-pages branch\n contents: write\n id-token: "write"\n # deployments permission to deploy GitHub pages website\n deployments: write\n\njobs:\n # ---------------------------------------------------------------------------\n\n rs-benchmarks:\n name: Rust Criterion benchmarks\n runs-on: ubuntu-latest-16-cores\n steps:\n - uses: actions/checkout@v4\n with:\n fetch-depth: 0 # we need full history\n ref: ${{ github.event_name == 'pull_request' && github.event.pull_request.head.ref || '' }}\n\n - name: Set up Rust\n uses: ./.github/actions/setup-rust\n with:\n cache_key: "build-linux"\n # Cache will be produced by `reusable_checks/rs-lints`\n save_cache: false\n workload_identity_provider: ${{ 
secrets.GOOGLE_WORKLOAD_IDENTITY_PROVIDER }}\n service_account: ${{ secrets.GOOGLE_SERVICE_ACCOUNT }}\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n # default: for the rendering step\n # wheel-test-min: minimal env for roundtrips (less heavy than wheel-test/examples)\n environments: >-\n default\n wheel-test-min\n\n - name: Add SHORT_SHA env property with commit short sha\n run: echo "SHORT_SHA=`echo ${{github.sha}} | cut -c1-7`" >> $GITHUB_ENV\n\n - name: Run benchmark\n # Use bash shell so we get pipefail behavior with tee\n # Running under `pixi` so we get `nasm`\n run: |\n pixi run -e wheel-test-min \\n cargo bench \\n --all-features \\n -p re_entity_db \\n -p re_log_encoding \\n -p re_query \\n -p re_tuid \\n -p re_video \\n -- --output-format=bencher | tee /tmp/${{ env.SHORT_SHA }}\n\n - name: "Set up Cloud SDK"\n uses: "google-github-actions/setup-gcloud@v2"\n with:\n version: ">= 363.0.0"\n\n # TODO(jleibs) make this whole thing a python script\n - name: "Upload bench to GCS based on SHA"\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: /tmp/${{ env.SHORT_SHA }}\n destination: "rerun-builds/benches/"\n process_gcloudignore: false\n\n - name: Download comparison bench from GCS\n if: ${{ inputs.COMPARE_TO != '' }}\n run: |\n mkdir /tmp/compare/\n gsutil cp gs://rerun-builds/benches/${{inputs.COMPARE_TO}} /tmp/compare/${{ inputs.COMPARE_TO }}\n\n - name: Install cargo-benchcmp\n run: cargo install --quiet cargo-benchcmp\n\n - name: Compare results with benchcmp\n if: ${{ inputs.COMPARE_TO != '' }}\n run: cargo benchcmp /tmp/compare/${{ inputs.COMPARE_TO }} /tmp/${{ env.SHORT_SHA }} > /tmp/bench_results.txt\n\n - name: "Upload bench-results to GCS"\n if: ${{ inputs.COMPARE_TO != '' }}\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: /tmp/bench_results.txt\n destination: "rerun-builds/commit/${{env.SHORT_SHA}}/"\n process_gcloudignore: false\n\n - name: "Copy bench to named file"\n if: ${{ 
inputs.BENCH_NAME != '' }}\n run: cp /tmp/${{ env.SHORT_SHA }} /tmp/${{ inputs.BENCH_NAME }}\n\n # Don't upload the new named bench until the end in case the names are the same\n - name: "Upload named bench to GCS"\n if: ${{ inputs.BENCH_NAME != '' }}\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: /tmp/${{ inputs.BENCH_NAME }}\n destination: "rerun-builds/benches/"\n process_gcloudignore: false\n\n - name: Alert on regression\n # https://github.com/benchmark-action/github-action-benchmark\n uses: benchmark-action/github-action-benchmark@v1\n with:\n name: Rust Benchmark\n tool: "cargo"\n output-file-path: /tmp/${{ env.SHORT_SHA }}\n github-token: ${{ secrets.GITHUB_TOKEN }}\n\n # Show alert with commit comment on detecting possible performance regression\n comment-on-alert: true\n alert-threshold: "125%"\n fail-on-alert: false\n comment-always: false # Generates too much GitHub notification spam\n\n # Don't push to gh-pages\n save-data-file: false\n auto-push: false\n\n - name: Render benchmark result\n if: github.ref == 'refs/heads/main'\n run: |\n pixi run python scripts/ci/render_bench.py crates \\n --after $(date -d"30 days ago" +%Y-%m-%d) \\n --output "gs://rerun-builds/graphs"\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\reusable_bench.yml | reusable_bench.yml | YAML | 5,927 | 0.95 | 0.045714 | 0.141892 | react-lib | 515 | 2024-09-01T10:37:44.248362 | Apache-2.0 | false | d762c2a15bee4baa81b1114b56f920b9 |
name: Reusable Rerun-c Build\n\non:\n workflow_call:\n inputs:\n CONCURRENCY:\n required: true\n type: string\n PLATFORM:\n required: true\n type: string\n ADHOC_NAME:\n required: false\n type: string\n default: ""\n RELEASE_COMMIT:\n required: false\n type: string\n default: ""\n\n workflow_dispatch:\n inputs:\n ADHOC_NAME:\n required: false\n type: string\n description: "Name of the adhoc build, used for upload directory"\n default: ""\n PLATFORM:\n type: choice\n options:\n - linux-arm64\n - linux-x64\n - windows-x64\n - macos-arm64\n - macos-x64\n description: "Platform to build for"\n required: true\n CONCURRENCY:\n required: false\n type: string\n default: "adhoc"\n description: "Concurrency group to use"\n\nconcurrency:\n group: ${{ inputs.CONCURRENCY }}-build-rerun_c\n cancel-in-progress: true\n\nenv:\n # web_sys_unstable_apis is required to enable the web_sys clipboard API which egui_web uses\n # https://rustwasm.github.io/wasm-bindgen/api/web_sys/struct.Clipboard.html\n # https://rustwasm.github.io/docs/wasm-bindgen/web-sys/unstable-apis.html\n RUSTFLAGS: --cfg=web_sys_unstable_apis --deny warnings\n\n RUSTDOCFLAGS: --deny warnings\n\n # Disable the GHA backend (Github's 10GB storage) since we use our own GCS backend.\n # See: https://github.com/marketplace/actions/sccache-action\n SCCACHE_GHA_ENABLED: "false"\n\n # Wrap every `rustc` invocation in `sccache`.\n RUSTC_WRAPPER: "sccache"\n\n # Not only `sccache` cannot cache incremental builds, it's counter-productive to generate all\n # these incremental artifacts when running on CI.\n CARGO_INCREMENTAL: "0"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "read"\n id-token: "write"\n\njobs:\n set-config:\n name: Set Config (${{ inputs.PLATFORM }})\n runs-on: ubuntu-latest-16-cores\n outputs:\n RUNNER: ${{ steps.set-config.outputs.runner }}\n TARGET: ${{ steps.set-config.outputs.target }}\n CONTAINER: ${{ steps.set-config.outputs.container }}\n LIB_NAME: ${{ steps.set-config.outputs.lib_name 
}}\n steps:\n - name: Set runner and target based on platform\n id: set-config\n run: |\n case "${{ inputs.PLATFORM }}" in\n linux-arm64)\n runner="buildjet-8vcpu-ubuntu-2204-arm"\n target="aarch64-unknown-linux-gnu"\n container="'rerunio/ci_docker:0.15.0'"\n lib_name="librerun_c.a"\n ;;\n linux-x64)\n runner="ubuntu-latest-16-cores"\n target="x86_64-unknown-linux-gnu"\n container="'rerunio/ci_docker:0.15.0'"\n lib_name="librerun_c.a"\n ;;\n windows-x64)\n runner="windows-latest-8-cores"\n target="x86_64-pc-windows-msvc"\n container="null"\n lib_name="rerun_c.lib"\n ;;\n macos-arm64)\n runner="macos-latest" # Small runners, because building rerun_c is fast\n target="aarch64-apple-darwin"\n container="null"\n lib_name="librerun_c.a"\n ;;\n macos-x64)\n runner="macos-latest" # Small runners, because building rerun_c is fast\n target="x86_64-apple-darwin"\n container="null"\n lib_name="librerun_c.a"\n ;;\n *) echo "Invalid platform" && exit 1 ;;\n esac\n echo "runner=$runner" >> "$GITHUB_OUTPUT"\n echo "target=$target" >> "$GITHUB_OUTPUT"\n echo "container=$container" >> "$GITHUB_OUTPUT"\n echo "lib_name=$lib_name" >> "$GITHUB_OUTPUT"\n\n rs-build-rerun_c:\n name: Build rerun_c (${{ needs.set-config.outputs.RUNNER }})\n\n needs: [set-config]\n\n runs-on: ${{ needs.set-config.outputs.RUNNER }}\n container:\n image: ${{ fromJson(needs.set-config.outputs.CONTAINER) }}\n credentials:\n username: ${{ secrets.DOCKER_HUB_USER }}\n password: ${{ secrets.DOCKER_HUB_TOKEN }}\n\n steps:\n - name: Show context\n run: |\n echo "GITHUB_CONTEXT": $GITHUB_CONTEXT\n echo "JOB_CONTEXT": $JOB_CONTEXT\n echo "INPUTS_CONTEXT": $INPUTS_CONTEXT\n echo "ENV_CONTEXT": $ENV_CONTEXT\n env:\n ENV_CONTEXT: ${{ toJson(env) }}\n GITHUB_CONTEXT: ${{ toJson(github) }}\n JOB_CONTEXT: ${{ toJson(job) }}\n INPUTS_CONTEXT: ${{ toJson(inputs) }}\n\n - uses: actions/checkout@v4\n with:\n ref: ${{ inputs.RELEASE_COMMIT || ((github.event_name == 'pull_request' && github.event.pull_request.head.ref) || '') 
}}\n\n - name: Set up Rust and Authenticate to GCS\n uses: ./.github/actions/setup-rust\n with:\n cache_key: "build-${{ inputs.PLATFORM }}"\n save_cache: false\n workload_identity_provider: ${{ secrets.GOOGLE_WORKLOAD_IDENTITY_PROVIDER }}\n service_account: ${{ secrets.GOOGLE_SERVICE_ACCOUNT }}\n targets: ${{ needs.set-config.outputs.TARGET }}\n\n - name: Build rerun_c (release)\n run: cargo build --locked -p rerun_c --release --target ${{ needs.set-config.outputs.TARGET }}\n\n - name: Get sha\n id: get-sha\n run: |\n full_commit="${{ inputs.RELEASE_COMMIT || ((github.event_name == 'pull_request' && github.event.pull_request.head.sha) || github.sha) }}"\n echo "sha=$(echo $full_commit | cut -c1-7)" >> "$GITHUB_OUTPUT"\n\n - name: "Upload rerun_c (commit)"\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: "./target/${{ needs.set-config.outputs.TARGET }}/release/${{ needs.set-config.outputs.LIB_NAME }}"\n destination: "rerun-builds/commit/${{ steps.get-sha.outputs.sha }}/rerun_c/${{ inputs.PLATFORM }}"\n parent: false\n process_gcloudignore: false\n\n - name: "Upload rerun_c (adhoc)"\n if: ${{ inputs.ADHOC_NAME != '' }}\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: "./target/${{ needs.set-config.outputs.TARGET }}/release/${{ needs.set-config.outputs.LIB_NAME }}"\n destination: "rerun-builds/adhoc/${{inputs.ADHOC_NAME}}/rerun_c/${{ inputs.PLATFORM }}"\n parent: false\n process_gcloudignore: false\n | dataset_sample\yaml\rerun-io_rerun\.github\workflows\reusable_build_and_upload_rerun_c.yml | reusable_build_and_upload_rerun_c.yml | YAML | 6,411 | 0.95 | 0.015957 | 0.053892 | react-lib | 96 | 2025-06-01T17:39:16.616595 | Apache-2.0 | false | 3e62bb26a4bc6809d764b8c8b6b85943 |
name: Reusable Rerun CLI build & upload\n\non:\n workflow_call:\n inputs:\n CONCURRENCY:\n required: true\n type: string\n PLATFORM:\n required: true\n type: string\n ADHOC_NAME:\n required: false\n type: string\n default: ""\n RELEASE_COMMIT:\n required: false\n type: string\n default: ""\n\n workflow_dispatch:\n inputs:\n ADHOC_NAME:\n required: false\n type: string\n default: ""\n description: "Name of the adhoc build, used for upload directory"\n PLATFORM:\n type: choice\n options:\n - linux-arm64\n - linux-x64\n - windows-x64\n - macos-arm64\n - macos-x64\n description: "Platform to build for"\n required: true\n CONCURRENCY:\n required: false\n type: string\n default: "adhoc"\n description: "Concurrency group to use"\n\nconcurrency:\n group: ${{ inputs.CONCURRENCY }}-build-rerun-cli\n cancel-in-progress: true\n\nenv:\n PYTHON_VERSION: "3.9"\n # web_sys_unstable_apis is required to enable the web_sys clipboard API which egui_web uses\n # https://rustwasm.github.io/wasm-bindgen/api/web_sys/struct.Clipboard.html\n # https://rustwasm.github.io/docs/wasm-bindgen/web-sys/unstable-apis.html\n RUSTFLAGS: --cfg=web_sys_unstable_apis --deny warnings\n\n RUSTDOCFLAGS: --deny warnings\n\n # Disable the GHA backend (Github's 10GB storage) since we use our own GCS backend.\n # See: https://github.com/marketplace/actions/sccache-action\n SCCACHE_GHA_ENABLED: "false"\n\n # Wrap every `rustc` invocation in `sccache`.\n RUSTC_WRAPPER: "sccache"\n\n # Not only `sccache` cannot cache incremental builds, it's counter-productive to generate all\n # these incremental artifacts when running on CI.\n CARGO_INCREMENTAL: "0"\n\ndefaults:\n run:\n shell: bash\n\npermissions:\n contents: "read"\n id-token: "write"\n\njobs:\n set-config:\n name: Set Config (${{ inputs.PLATFORM }})\n runs-on: ubuntu-latest\n outputs:\n RUNNER: ${{ steps.set-config.outputs.runner }}\n TARGET: ${{ steps.set-config.outputs.target }}\n CONTAINER: ${{ steps.set-config.outputs.container }}\n BIN_NAME: ${{ 
steps.set-config.outputs.bin_name }}\n steps:\n - name: Set runner and target based on platform\n id: set-config\n run: |\n case "${{ inputs.PLATFORM }}" in\n linux-arm64)\n runner="buildjet-8vcpu-ubuntu-2204-arm"\n target="aarch64-unknown-linux-gnu"\n container="'rerunio/ci_docker:0.15.0'"\n bin_name="rerun"\n ;;\n linux-x64)\n runner="ubuntu-latest-16-cores"\n target="x86_64-unknown-linux-gnu"\n container="'rerunio/ci_docker:0.15.0'"\n bin_name="rerun"\n ;;\n windows-x64)\n runner="windows-latest-8-cores"\n target="x86_64-pc-windows-msvc"\n container="null"\n bin_name="rerun.exe"\n ;;\n macos-arm64)\n runner="macos-latest-large" # See https://github.blog/2023-10-02-introducing-the-new-apple-silicon-powered-m1-macos-larger-runner-for-github-actions/\n target="aarch64-apple-darwin"\n container="null"\n bin_name="rerun"\n ;;\n macos-x64)\n runner="macos-latest-large" # See https://github.blog/2023-10-02-introducing-the-new-apple-silicon-powered-m1-macos-larger-runner-for-github-actions/\n target="x86_64-apple-darwin"\n container="null"\n bin_name="rerun"\n ;;\n *) echo "Invalid platform" && exit 1 ;;\n esac\n echo "runner=$runner" >> "$GITHUB_OUTPUT"\n echo "target=$target" >> "$GITHUB_OUTPUT"\n echo "container=$container" >> "$GITHUB_OUTPUT"\n echo "bin_name=$bin_name" >> "$GITHUB_OUTPUT"\n\n build-rerun-cli:\n name: Build rerun-cli (${{ needs.set-config.outputs.RUNNER }})\n\n needs: [set-config]\n\n runs-on: ${{ needs.set-config.outputs.RUNNER }}\n container:\n image: ${{ fromJson(needs.set-config.outputs.CONTAINER) }}\n credentials:\n username: ${{ secrets.DOCKER_HUB_USER }}\n password: ${{ secrets.DOCKER_HUB_TOKEN }}\n\n steps:\n - name: Show context\n run: |\n echo "GITHUB_CONTEXT": $GITHUB_CONTEXT\n echo "JOB_CONTEXT": $JOB_CONTEXT\n echo "INPUTS_CONTEXT": $INPUTS_CONTEXT\n echo "ENV_CONTEXT": $ENV_CONTEXT\n env:\n ENV_CONTEXT: ${{ toJson(env) }}\n GITHUB_CONTEXT: ${{ toJson(github) }}\n JOB_CONTEXT: ${{ toJson(job) }}\n INPUTS_CONTEXT: ${{ toJson(inputs) 
}}\n\n - uses: actions/checkout@v4\n with:\n ref: ${{ inputs.RELEASE_COMMIT || ((github.event_name == 'pull_request' && github.event.pull_request.head.ref) || '') }}\n\n - name: Set up Rust and Authenticate to GCS\n uses: ./.github/actions/setup-rust\n with:\n cache_key: "build-${{ inputs.PLATFORM }}"\n save_cache: false\n workload_identity_provider: ${{ secrets.GOOGLE_WORKLOAD_IDENTITY_PROVIDER }}\n service_account: ${{ secrets.GOOGLE_SERVICE_ACCOUNT }}\n targets: ${{ needs.set-config.outputs.TARGET }}\n\n - uses: prefix-dev/setup-pixi@v0.8.1\n with:\n pixi-version: v0.41.4\n\n - name: Build web-viewer (release)\n run: pixi run rerun-build-web-release\n\n # This does not run in the pixi environment, doing so\n # causes it to select the wrong compiler on macos-arm64\n - name: Build rerun-cli\n run: |\n pixi run cargo build \\n --locked \\n -p rerun-cli \\n --no-default-features \\n --features release \\n --release \\n --target ${{ needs.set-config.outputs.TARGET }}\n\n - name: Get sha\n id: get-sha\n run: |\n full_commit="${{ inputs.RELEASE_COMMIT || ((github.event_name == 'pull_request' && github.event.pull_request.head.sha) || github.sha) }}"\n echo "sha=$(echo $full_commit | cut -c1-7)" >> "$GITHUB_OUTPUT"\n\n - name: "Upload rerun-cli (commit)"\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: "./target/${{ needs.set-config.outputs.TARGET }}/release/${{ needs.set-config.outputs.BIN_NAME }}"\n destination: "rerun-builds/commit/${{ steps.get-sha.outputs.sha }}/rerun-cli/${{ inputs.PLATFORM }}"\n parent: false\n process_gcloudignore: false\n\n - name: "Upload rerun-cli (adhoc)"\n if: ${{ inputs.ADHOC_NAME != '' }}\n uses: google-github-actions/upload-cloud-storage@v2\n with:\n path: "./target/${{ needs.set-config.outputs.TARGET }}/release/${{ needs.set-config.outputs.BIN_NAME }}"\n destination: "rerun-builds/adhoc/${{inputs.ADHOC_NAME}}/rerun-cli/${{ inputs.PLATFORM }}"\n parent: false\n process_gcloudignore: false\n | 
dataset_sample\yaml\rerun-io_rerun\.github\workflows\reusable_build_and_upload_rerun_cli.yml | reusable_build_and_upload_rerun_cli.yml | YAML | 7,014 | 0.95 | 0.02439 | 0.06044 | react-lib | 10 | 2023-07-22T20:01:17.590508 | BSD-3-Clause | false | 555e9ce17b8f98a3cca7e3dedcf1d1da |