| Column | Dtype | Values / Range |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 to 19 |
| repo | stringlengths | 4 to 112 |
| repo_url | stringlengths | 33 to 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 to 1.02k |
| labels | stringlengths | 4 to 1.54k |
| body | stringlengths | 1 to 262k |
| index | stringclasses | 17 values |
| text_combine | stringlengths | 95 to 262k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 to 252k |
| binary_label | int64 | 0 to 1 |
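A minimal sketch of working with rows in this schema, using a tiny in-memory pandas DataFrame built from values visible in the sample rows below (the underlying storage format and filename are not stated in this dump, so no file is read; the `test`/`non_test` → `binary_label` correspondence is inferred from the samples):

```python
import pandas as pd

# Tiny illustrative sample mirroring a subset of the schema above;
# values are copied from the rows shown below, not a real data load.
df = pd.DataFrame(
    {
        "id": [27433980226, 3222911082, 9310313963],
        "type": ["IssuesEvent"] * 3,
        "repo": [
            "openBackhaul/ApplicationPattern",
            "FreeCodeCamp/FreeCodeCamp",
            "commercialhaskell/stack",
        ],
        "action": ["closed"] * 3,
        "label": ["test", "test", "non_test"],
        "binary_label": [1, 1, 0],
    }
)

# In the sample rows, binary_label appears to encode label:
# "test" -> 1, "non_test" -> 0.
test_rows = df[df["binary_label"] == 1]
print(len(test_rows))                 # 2
print(sorted(df["label"].unique()))   # ['non_test', 'test']
```

This also makes the two classification columns easy to sanity-check against each other, which is how the `label`/`binary_label` pairing was inferred here.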
---

**Row 320,399**

- **id:** 27,433,980,226
- **type:** IssuesEvent
- **created_at:** 2023-03-02 05:17:49
- **repo:** openBackhaul/ApplicationPattern
- **repo_url:** https://api.github.com/repos/openBackhaul/ApplicationPattern
- **action:** closed
- **title:** Service specific update for /v1/update-operation-client
- **labels:** testsuite_to_be_changed
- **body:** Apart from generic changes, there are few updates made in /v1/update-operation-key which are specific to this request - [x] application-release-number to be updated as release-number
- **index:** 1.0
- **text_combine:** Service specific update for /v1/update-operation-client - Apart from generic changes, there are few updates made in /v1/update-operation-key which are specific to this request - [x] application-release-number to be updated as release-number
- **label:** test
- **text:** service specific update for update operation client apart from generic changes there are few updates made in update operation key which are specific to this request application release number to be updated as release number
- **binary_label:** 1
---

**Row 11,754**

- **id:** 3,222,911,082
- **type:** IssuesEvent
- **created_at:** 2015-10-09 06:13:15
- **repo:** FreeCodeCamp/FreeCodeCamp
- **repo_url:** https://api.github.com/repos/FreeCodeCamp/FreeCodeCamp
- **action:** closed
- **title:** Bonfire #20: The following code can pass all of the tests but is wrong
- **labels:** tests
- **body:**
```javascript
function diff(arr1, arr2) {
  var newArr = [];
  var flag = false;
  if (arr1.length === 0) return arr2;
  else if (arr2.length === 0) return arr1;
  if (arr1.length >= arr2.length) {
    for (i = 0, iLength = arr1.length; i < iLength; i++) {
      flag = false;
      for (p = 0, pLength = arr2.length; p < pLength; p++) {
        if (arr2.indexOf(arr1[i]) == -1) {
          flag = true;
        }
      }
      if (flag) {
        newArr.push(arr1[i]);
        console.log(newArr);
      }
    }
  }
  if (arr1.length <= arr2.length) {
    for (i = 0, iLength = arr2.length; i < iLength; i++) {
      flag = false;
      for (p = 0, pLength = arr1.length; p < pLength; p++) {
        if (arr1.indexOf(arr2[i]) == -1) {
          flag = true;
        }
      }
      if (flag) {
        newArr.push(arr2[i]);
        console.log(newArr);
      }
    }
  }
  // Same, same; but different.
  return newArr;
}
diff(['snuffleupagus', 'cookie monster', 'elmo'], [1]);
```

This should have all four elements in the final array but it does not. Have to beef up the validation on the tests.
- **index:** 1.0
- **text_combine:**
Bonfire #20: The following code can pass all of the tests but is wrong - ```javascript function diff(arr1, arr2) { var newArr = []; var flag = false; if(arr1.length === 0) return arr2; else if(arr2.length === 0) return arr1; if(arr1.length >= arr2.length) { for(i=0, iLength = arr1.length; i < iLength; i++) { flag = false; for(p=0, pLength = arr2.length; p < pLength; p++) { if(arr2.indexOf(arr1[i]) == -1) { flag = true; } } if(flag) { newArr.push(arr1[i]); console.log(newArr); } } } if(arr1.length <= arr2.length) { for(i=0, iLength = arr2.length; i < iLength; i++) { flag = false; for(p=0, pLength = arr1.length; p < pLength; p++) { if(arr1.indexOf(arr2[i]) == -1) { flag = true; } } if(flag) { newArr.push(arr2[i]); console.log(newArr); } } } // Same, same; but different. return newArr; } diff(['snuffleupagus', 'cookie monster', 'elmo'], [1]); ``` This should have all four elements in the final array but it does not. Have to beef up the validation on the tests.
- **label:** test
- **text:**
bonfire the following code can pass all of the tests but is wrong javascript function diff var newarr var flag false if length return else if length return if length length for i ilength length i ilength i flag false for p plength length p plength p if indexof flag true if flag newarr push console log newarr if length length for i ilength length i ilength i flag false for p plength length p plength p if indexof flag true if flag newarr push console log newarr same same but different return newarr diff this should have all four elements in the final array but it does not have to beef up the validation on the tests
- **binary_label:** 1
---

**Row 34,237**

- **id:** 9,310,313,963
- **type:** IssuesEvent
- **created_at:** 2019-03-25 18:28:23
- **repo:** commercialhaskell/stack
- **repo_url:** https://api.github.com/repos/commercialhaskell/stack
- **action:** closed
- **title:** Stack does not pass parameters given in command line to GHC
- **labels:** component: build, component: os x, further investigation required
- **body:**
### General summary/comments (optional) `stack` is invoked with `--ghc-options -optL=/usr/lib/libiconv.dylib`. This is **not** passed to GHC, as the log below shows. This problem screws up using `stack` on MacOS with Macports installed. ### Steps to reproduce 1. Get a Mac computer, install Macports and Haskell Platform. 2. Do `stack install intero --ghc-options -optL=/usr/lib/libiconv.dylib` or `stack install hsdev --ghc-options -optL=/usr/lib/libiconv.dylib`, or `stack install happy --ghc-options -optL=/usr/lib/libiconv.dylib` `~/.stack/config.yaml` (in case it can help): ``` #resolver: lts-13.2 system-ghc: true skip-ghc-check: true extra-path: - /usr/local/bin arch: x86_64 with-gcc: /opt/local/bin/gcc extra-include-dirs: [ /opt/local/include, /usr/local/include ] #extra-lib-dirs: [ /opt/local/lib, /usr/local/lib ] extra-lib-dirs: [ /opt/local/lib/liconv, /opt/local/lib, /usr/local/lib, /usr/lib ] #ignore-revision-mismatch: true allow-newer: true ghc-options: # All packages #"$locals": -Wall /usr/lib/libiconv.dylib "$targets": -Wall /usr/lib/libiconv.dylib "$everything": -O2 /usr/lib/libiconv.dylib apply-ghc-options: everything build: cabal-verbose: true ``` ### Expected Successful build. ### Actual ``` $ stack install hsdev --ghc-options -optL=/usr/lib/libiconv.dylib --verbose 2>&1 | tee hsdev-build.txt Version 1.9.3, Git revision 40cf7b37526b86d1676da82167ea8758a854953b (6211 commits) x86_64 hpack-0.31.1 2019-01-21 13:15:12.147091: [debug] Checking for project config at: /Users/uri/src/stack.yaml 2019-01-21 13:15:12.147617: [debug] Checking for project config at: /Users/uri/stack.yaml 2019-01-21 13:15:12.147685: [debug] Checking for project config at: /Users/stack.yaml 2019-01-21 13:15:12.147735: [debug] Checking for project config at: /stack.yaml 2019-01-21 13:15:12.147785: [debug] No project config file found, using defaults. 
2019-01-21 13:15:12.149954: [debug] Run from outside a project, using implicit global project config 2019-01-21 13:15:12.150343: [debug] Decoding build plan from: /Users/uri/.stack/build-plan/lts-13.2.yaml 2019-01-21 13:15:12.150398: [debug] Trying to decode /Users/uri/.stack/build-plan-cache/lts-13.2.cache 2019-01-21 13:15:12.156167: [debug] Success decoding /Users/uri/.stack/build-plan-cache/lts-13.2.cache 2019-01-21 13:15:12.156340: [debug] Getting system compiler version 2019-01-21 13:15:12.156619: [debug] Run process: /usr/local/bin/ghc --info 2019-01-21 13:15:12.261122: [debug] Process finished in 104ms: /usr/local/bin/ghc --info 2019-01-21 13:15:12.262655: [debug] Getting global package database location 2019-01-21 13:15:12.262895: [debug] Run process: /usr/local/bin/ghc-pkg --no-user-package-db list --global 2019-01-21 13:15:12.263590: [debug] Asking GHC for its version 2019-01-21 13:15:12.263769: [debug] Run process: /usr/local/bin/ghc --numeric-version 2019-01-21 13:15:12.264668: [debug] Getting Cabal package version 2019-01-21 13:15:12.264772: [debug] Run process: /usr/local/bin/ghc-pkg --no-user-package-db field --simple-output Cabal version 2019-01-21 13:15:12.349020: [debug] Process finished in 86ms: /usr/local/bin/ghc-pkg --no-user-package-db list --global . . . . . 
2019-01-21 13:15:29.285083: [debug] Encoding /Users/uri/.stack/global-project/.stack-work/install/x86_64-osx/lts-13.2/8.6.3/flag-cache/network-uri-2.6.1.0-K75fCYvLQE41EntOQ30cqK 2019-01-21 13:15:29.285590: [debug] Finished writing /Users/uri/.stack/global-project/.stack-work/install/x86_64-osx/lts-13.2/8.6.3/flag-cache/network-uri-2.6.1.0-K75fCYvLQE41EntOQ30cqK -- While building package happy-1.19.9 using: /usr/local/bin/ghc --make -odir /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup -hidir /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup -i -i. -clear-package-db -global-package-db -package-db=/Users/uri/.stack/snapshots/x86_64-osx/lts-13.2/8.6.3/pkgdb -package-db=/Users/uri/.stack/global-project/.stack-work/install/x86_64-osx/lts-13.2/8.6.3/pkgdb -hide-all-packages -package-id=Cabal-2.4.1.0-Df4rkGuWEtO4aZs4eesJ3r -package-id=base-4.12.0.0 -package-id=directory-1.3.3.0 -package-id=filepath-1.4.2.1 -optP-include -optP/private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/setup_macros.h /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/Setup.lhs /Users/uri/.stack/setup-exe-src/setup-shim-mPHDZzAJ.hs -main-is StackSetupShim.mainOverride -o /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/setup -threaded Process exited with code: ExitFailure 1 Logs have been written to: /Users/uri/.stack/global-project/.stack-work/logs/happy-1.19.9.log [1 of 2] Compiling Main ( /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/Setup.lhs, /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/Main.o ) 
/private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/Setup.lhs:48:22: warning: [-Wdeprecations] In the use of ‘rawSystemProgramConf’ (imported from Distribution.Simple.Program): Deprecated: "use runDbProgram instead. This symbol will be removed in Cabal-3.0 (est. Mar 2019)." | 48 | let runProgram p = rawSystemProgramConf (fromFlagOrDefault normal (buildVerbosity flags)) | ^^^^^^^^^^^^^^^^^^^^ [2 of 2] Compiling StackSetupShim ( /Users/uri/.stack/setup-exe-src/setup-shim-mPHDZzAJ.hs, /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/StackSetupShim.o ) Linking /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/setup ... Undefined symbols for architecture x86_64: "_iconv", referenced from: _hs_iconv in libHSbase-4.12.0.0.a(iconv.o) (maybe you meant: _base_GHCziIOziEncodingziIconv_iconvEncoding_closure, _base_GHCziIOziEncodingziIconv_iconvEncoding1_info , _base_GHCziIOziEncodingziIconv_iconvEncoding3_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding4_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding7_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding15_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding4_info , _base_GHCziIOziEncodingziIconv_iconvEncoding6_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding6_info , _base_GHCziIOziEncodingziIconv_iconvEncoding5_closure , _hs_iconv_open , _base_GHCziIOziEncodingziIconv_iconvEncoding9_info , _base_GHCziIOziEncodingziIconv_iconvEncoding11_info , _base_GHCziIOziEncodingziIconv_iconvEncoding12_info , _base_GHCziIOziEncodingziIconv_iconvEncoding14_bytes , _hs_iconv_close , _base_GHCziIOziEncodingziIconv_iconvEncoding2_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding2_info , _base_GHCziIOziEncodingziIconv_iconvEncoding11_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding15_info , 
_base_GHCziIOziEncodingziIconv_iconvEncoding_info , _base_GHCziIOziEncodingziIconv_iconvEncoding12_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding10_bytes , _base_GHCziIOziEncodingziIconv_iconvEncoding8_info , _base_GHCziIOziEncodingziIconv_iconvEncoding1_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding8_closure , _hs_iconv , _base_GHCziIOziEncodingziIconv_iconvEncoding13_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding13_info , _base_GHCziIOziEncodingziIconv_iconvEncoding7_info , _base_GHCziIOziEncodingziIconv_iconvEncoding9_closure ) "_iconv_open", referenced from: _hs_iconv_open in libHSbase-4.12.0.0.a(iconv.o) (maybe you meant: _hs_iconv_open) "_iconv_close", referenced from: _hs_iconv_close in libHSbase-4.12.0.0.a(iconv.o) (maybe you meant: _hs_iconv_close) "_locale_charset", referenced from: _localeEncoding in libHSbase-4.12.0.0.a(PrelIOUtils.o) ld: symbol(s) not found for architecture x86_64 clang: error: linker command failed with exit code 1 (use -v to see invocation) `clang' failed in phase `Linker'. (Exit code: 1) $ ``` Complete build log: [hsdev-build.txt](https://github.com/commercialhaskell/stack/files/2780071/hsdev-build.txt) [happy-log.txt](https://github.com/commercialhaskell/stack/files/2780072/happy-log.txt) ### Stack version ``` $ stack --version Version 1.9.3, Git revision 40cf7b37526b86d1676da82167ea8758a854953b (6211 commits) x86_64 hpack-0.31.1 ``` ### Method of installation * Official binary, downloaded from/with Haskell Platform, an updated via `stack update`
- **index:** 1.0
- **text_combine:**
Stack does not pass parameters given in command line to GHC - ### General summary/comments (optional) `stack` is invoked with `--ghc-options -optL=/usr/lib/libiconv.dylib`. This is **not** passed to GHC, as the log below shows. This problem screws up using `stack` on MacOS with Macports installed. ### Steps to reproduce 1. Get a Mac computer, install Macports and Haskell Platform. 2. Do `stack install intero --ghc-options -optL=/usr/lib/libiconv.dylib` or `stack install hsdev --ghc-options -optL=/usr/lib/libiconv.dylib`, or `stack install happy --ghc-options -optL=/usr/lib/libiconv.dylib` `~/.stack/config.yaml` (in case it can help): ``` #resolver: lts-13.2 system-ghc: true skip-ghc-check: true extra-path: - /usr/local/bin arch: x86_64 with-gcc: /opt/local/bin/gcc extra-include-dirs: [ /opt/local/include, /usr/local/include ] #extra-lib-dirs: [ /opt/local/lib, /usr/local/lib ] extra-lib-dirs: [ /opt/local/lib/liconv, /opt/local/lib, /usr/local/lib, /usr/lib ] #ignore-revision-mismatch: true allow-newer: true ghc-options: # All packages #"$locals": -Wall /usr/lib/libiconv.dylib "$targets": -Wall /usr/lib/libiconv.dylib "$everything": -O2 /usr/lib/libiconv.dylib apply-ghc-options: everything build: cabal-verbose: true ``` ### Expected Successful build. ### Actual ``` $ stack install hsdev --ghc-options -optL=/usr/lib/libiconv.dylib --verbose 2>&1 | tee hsdev-build.txt Version 1.9.3, Git revision 40cf7b37526b86d1676da82167ea8758a854953b (6211 commits) x86_64 hpack-0.31.1 2019-01-21 13:15:12.147091: [debug] Checking for project config at: /Users/uri/src/stack.yaml 2019-01-21 13:15:12.147617: [debug] Checking for project config at: /Users/uri/stack.yaml 2019-01-21 13:15:12.147685: [debug] Checking for project config at: /Users/stack.yaml 2019-01-21 13:15:12.147735: [debug] Checking for project config at: /stack.yaml 2019-01-21 13:15:12.147785: [debug] No project config file found, using defaults. 
2019-01-21 13:15:12.149954: [debug] Run from outside a project, using implicit global project config 2019-01-21 13:15:12.150343: [debug] Decoding build plan from: /Users/uri/.stack/build-plan/lts-13.2.yaml 2019-01-21 13:15:12.150398: [debug] Trying to decode /Users/uri/.stack/build-plan-cache/lts-13.2.cache 2019-01-21 13:15:12.156167: [debug] Success decoding /Users/uri/.stack/build-plan-cache/lts-13.2.cache 2019-01-21 13:15:12.156340: [debug] Getting system compiler version 2019-01-21 13:15:12.156619: [debug] Run process: /usr/local/bin/ghc --info 2019-01-21 13:15:12.261122: [debug] Process finished in 104ms: /usr/local/bin/ghc --info 2019-01-21 13:15:12.262655: [debug] Getting global package database location 2019-01-21 13:15:12.262895: [debug] Run process: /usr/local/bin/ghc-pkg --no-user-package-db list --global 2019-01-21 13:15:12.263590: [debug] Asking GHC for its version 2019-01-21 13:15:12.263769: [debug] Run process: /usr/local/bin/ghc --numeric-version 2019-01-21 13:15:12.264668: [debug] Getting Cabal package version 2019-01-21 13:15:12.264772: [debug] Run process: /usr/local/bin/ghc-pkg --no-user-package-db field --simple-output Cabal version 2019-01-21 13:15:12.349020: [debug] Process finished in 86ms: /usr/local/bin/ghc-pkg --no-user-package-db list --global . . . . . 
2019-01-21 13:15:29.285083: [debug] Encoding /Users/uri/.stack/global-project/.stack-work/install/x86_64-osx/lts-13.2/8.6.3/flag-cache/network-uri-2.6.1.0-K75fCYvLQE41EntOQ30cqK 2019-01-21 13:15:29.285590: [debug] Finished writing /Users/uri/.stack/global-project/.stack-work/install/x86_64-osx/lts-13.2/8.6.3/flag-cache/network-uri-2.6.1.0-K75fCYvLQE41EntOQ30cqK -- While building package happy-1.19.9 using: /usr/local/bin/ghc --make -odir /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup -hidir /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup -i -i. -clear-package-db -global-package-db -package-db=/Users/uri/.stack/snapshots/x86_64-osx/lts-13.2/8.6.3/pkgdb -package-db=/Users/uri/.stack/global-project/.stack-work/install/x86_64-osx/lts-13.2/8.6.3/pkgdb -hide-all-packages -package-id=Cabal-2.4.1.0-Df4rkGuWEtO4aZs4eesJ3r -package-id=base-4.12.0.0 -package-id=directory-1.3.3.0 -package-id=filepath-1.4.2.1 -optP-include -optP/private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/setup_macros.h /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/Setup.lhs /Users/uri/.stack/setup-exe-src/setup-shim-mPHDZzAJ.hs -main-is StackSetupShim.mainOverride -o /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/setup -threaded Process exited with code: ExitFailure 1 Logs have been written to: /Users/uri/.stack/global-project/.stack-work/logs/happy-1.19.9.log [1 of 2] Compiling Main ( /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/Setup.lhs, /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/Main.o ) 
/private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/Setup.lhs:48:22: warning: [-Wdeprecations] In the use of ‘rawSystemProgramConf’ (imported from Distribution.Simple.Program): Deprecated: "use runDbProgram instead. This symbol will be removed in Cabal-3.0 (est. Mar 2019)." | 48 | let runProgram p = rawSystemProgramConf (fromFlagOrDefault normal (buildVerbosity flags)) | ^^^^^^^^^^^^^^^^^^^^ [2 of 2] Compiling StackSetupShim ( /Users/uri/.stack/setup-exe-src/setup-shim-mPHDZzAJ.hs, /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/StackSetupShim.o ) Linking /private/var/folders/pd/mxn5kp_55jg23x7jjd10gtwm0000gn/T/stack3944/happy-1.19.9/.stack-work/dist/x86_64-osx/Cabal-2.4.0.1/setup/setup ... Undefined symbols for architecture x86_64: "_iconv", referenced from: _hs_iconv in libHSbase-4.12.0.0.a(iconv.o) (maybe you meant: _base_GHCziIOziEncodingziIconv_iconvEncoding_closure, _base_GHCziIOziEncodingziIconv_iconvEncoding1_info , _base_GHCziIOziEncodingziIconv_iconvEncoding3_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding4_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding7_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding15_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding4_info , _base_GHCziIOziEncodingziIconv_iconvEncoding6_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding6_info , _base_GHCziIOziEncodingziIconv_iconvEncoding5_closure , _hs_iconv_open , _base_GHCziIOziEncodingziIconv_iconvEncoding9_info , _base_GHCziIOziEncodingziIconv_iconvEncoding11_info , _base_GHCziIOziEncodingziIconv_iconvEncoding12_info , _base_GHCziIOziEncodingziIconv_iconvEncoding14_bytes , _hs_iconv_close , _base_GHCziIOziEncodingziIconv_iconvEncoding2_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding2_info , _base_GHCziIOziEncodingziIconv_iconvEncoding11_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding15_info , 
_base_GHCziIOziEncodingziIconv_iconvEncoding_info , _base_GHCziIOziEncodingziIconv_iconvEncoding12_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding10_bytes , _base_GHCziIOziEncodingziIconv_iconvEncoding8_info , _base_GHCziIOziEncodingziIconv_iconvEncoding1_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding8_closure , _hs_iconv , _base_GHCziIOziEncodingziIconv_iconvEncoding13_closure , _base_GHCziIOziEncodingziIconv_iconvEncoding13_info , _base_GHCziIOziEncodingziIconv_iconvEncoding7_info , _base_GHCziIOziEncodingziIconv_iconvEncoding9_closure ) "_iconv_open", referenced from: _hs_iconv_open in libHSbase-4.12.0.0.a(iconv.o) (maybe you meant: _hs_iconv_open) "_iconv_close", referenced from: _hs_iconv_close in libHSbase-4.12.0.0.a(iconv.o) (maybe you meant: _hs_iconv_close) "_locale_charset", referenced from: _localeEncoding in libHSbase-4.12.0.0.a(PrelIOUtils.o) ld: symbol(s) not found for architecture x86_64 clang: error: linker command failed with exit code 1 (use -v to see invocation) `clang' failed in phase `Linker'. (Exit code: 1) $ ``` Complete build log: [hsdev-build.txt](https://github.com/commercialhaskell/stack/files/2780071/hsdev-build.txt) [happy-log.txt](https://github.com/commercialhaskell/stack/files/2780072/happy-log.txt) ### Stack version ``` $ stack --version Version 1.9.3, Git revision 40cf7b37526b86d1676da82167ea8758a854953b (6211 commits) x86_64 hpack-0.31.1 ``` ### Method of installation * Official binary, downloaded from/with Haskell Platform, an updated via `stack update`
- **label:** non_test
- **text:**
stack does not pass parameters given in command line to ghc general summary comments optional stack is invoked with ghc options optl usr lib libiconv dylib this is not passed to ghc as the log below shows this problem screws up using stack on macos with macports installed steps to reproduce get a mac computer install macports and haskell platform do stack install intero ghc options optl usr lib libiconv dylib or stack install hsdev ghc options optl usr lib libiconv dylib or stack install happy ghc options optl usr lib libiconv dylib stack config yaml in case it can help resolver lts system ghc true skip ghc check true extra path usr local bin arch with gcc opt local bin gcc extra include dirs extra lib dirs extra lib dirs ignore revision mismatch true allow newer true ghc options all packages locals wall usr lib libiconv dylib targets wall usr lib libiconv dylib everything usr lib libiconv dylib apply ghc options everything build cabal verbose true expected successful build actual stack install hsdev ghc options optl usr lib libiconv dylib verbose tee hsdev build txt version git revision commits hpack checking for project config at users uri src stack yaml checking for project config at users uri stack yaml checking for project config at users stack yaml checking for project config at stack yaml no project config file found using defaults run from outside a project using implicit global project config decoding build plan from users uri stack build plan lts yaml trying to decode users uri stack build plan cache lts cache success decoding users uri stack build plan cache lts cache getting system compiler version run process usr local bin ghc info process finished in usr local bin ghc info getting global package database location run process usr local bin ghc pkg no user package db list global asking ghc for its version run process usr local bin ghc numeric version getting cabal package version run process usr local bin ghc pkg no user package db field simple output 
cabal version process finished in usr local bin ghc pkg no user package db list global encoding users uri stack global project stack work install osx lts flag cache network uri finished writing users uri stack global project stack work install osx lts flag cache network uri while building package happy using usr local bin ghc make odir private var folders pd t happy stack work dist osx cabal setup hidir private var folders pd t happy stack work dist osx cabal setup i i clear package db global package db package db users uri stack snapshots osx lts pkgdb package db users uri stack global project stack work install osx lts pkgdb hide all packages package id cabal package id base package id directory package id filepath optp include optp private var folders pd t happy stack work dist osx cabal setup setup macros h private var folders pd t happy setup lhs users uri stack setup exe src setup shim mphdzzaj hs main is stacksetupshim mainoverride o private var folders pd t happy stack work dist osx cabal setup setup threaded process exited with code exitfailure logs have been written to users uri stack global project stack work logs happy log compiling main private var folders pd t happy setup lhs private var folders pd t happy stack work dist osx cabal setup main o private var folders pd t happy setup lhs warning in the use of ‘rawsystemprogramconf’ imported from distribution simple program deprecated use rundbprogram instead this symbol will be removed in cabal est mar let runprogram p rawsystemprogramconf fromflagordefault normal buildverbosity flags compiling stacksetupshim users uri stack setup exe src setup shim mphdzzaj hs private var folders pd t happy stack work dist osx cabal setup stacksetupshim o linking private var folders pd t happy stack work dist osx cabal setup setup undefined symbols for architecture iconv referenced from hs iconv in libhsbase a iconv o maybe you meant base ghcziioziencodingziiconv iconvencoding closure base ghcziioziencodingziiconv info 
base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv info base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv info base ghcziioziencodingziiconv closure hs iconv open base ghcziioziencodingziiconv info base ghcziioziencodingziiconv info base ghcziioziencodingziiconv info base ghcziioziencodingziiconv bytes hs iconv close base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv info base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv info base ghcziioziencodingziiconv iconvencoding info base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv bytes base ghcziioziencodingziiconv info base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv closure hs iconv base ghcziioziencodingziiconv closure base ghcziioziencodingziiconv info base ghcziioziencodingziiconv info base ghcziioziencodingziiconv closure iconv open referenced from hs iconv open in libhsbase a iconv o maybe you meant hs iconv open iconv close referenced from hs iconv close in libhsbase a iconv o maybe you meant hs iconv close locale charset referenced from localeencoding in libhsbase a prelioutils o ld symbol s not found for architecture clang error linker command failed with exit code use v to see invocation clang failed in phase linker exit code complete build log stack version stack version version git revision commits hpack method of installation official binary downloaded from with haskell platform an updated via stack update
- **binary_label:** 0
---

**Row 220,693**

- **id:** 17,241,703,543
- **type:** IssuesEvent
- **created_at:** 2021-07-21 00:06:41
- **repo:** Becksteinlab/kda
- **repo_url:** https://api.github.com/repos/Becksteinlab/kda
- **action:** closed
- **title:** Move testing suite from Travis-CI to GitHub Actions
- **labels:** testing
- **body:** Travis CI has changed their testing platform to only allow 1000 minutes of testing time per month for free users, which may become intractable once the testing suite is fully developed. Can view a guide for the migration [here](https://docs.github.com/en/free-pro-team@latest/actions/learn-github-actions/migrating-from-travis-ci-to-github-actions).
- **index:** 1.0
- **text_combine:** Move testing suite from Travis-CI to GitHub Actions - Travis CI has changed their testing platform to only allow 1000 minutes of testing time per month for free users, which may become intractable once the testing suite is fully developed. Can view a guide for the migration [here](https://docs.github.com/en/free-pro-team@latest/actions/learn-github-actions/migrating-from-travis-ci-to-github-actions).
- **label:** test
- **text:** move testing suite from travis ci to github actions travis ci has changed their testing platform to only allow minutes of testing time per month for free users which may become intractable once the testing suite is fully developed can view a guide for the migration
- **binary_label:** 1
---

**Row 241,765**

- **id:** 20,160,527,453
- **type:** IssuesEvent
- **created_at:** 2022-02-09 21:00:06
- **repo:** elastic/elasticsearch
- **repo_url:** https://api.github.com/repos/elastic/elasticsearch
- **action:** closed
- **title:** [CI] CoreWithMappedRuntimeFieldsIT test {yaml=search.aggregation/450_time_series/Basic test} failing
- **labels:** >test-failure :Analytics/TSDB
- **body:**
This showed up after bumping master to 8.2.0 so not sure if that is related. **Build scan:** https://gradle-enterprise.elastic.co/s/yzxgyguzc6qns/tests/:x-pack:qa:runtime-fields:core-with-mapped:yamlRestTest/org.elasticsearch.xpack.runtimefields.test.mapped.CoreWithMappedRuntimeFieldsIT/test%20%7Byaml=search.aggregation%2F450_time_series%2FBasic%20test%7D **Reproduction line:** `./gradlew ':x-pack:qa:runtime-fields:core-with-mapped:yamlRestTest' --tests "org.elasticsearch.xpack.runtimefields.test.mapped.CoreWithMappedRuntimeFieldsIT" -Dtests.method="test {yaml=search.aggregation/450_time_series/Basic test}" -Dtests.seed=10391BEE48E763BE -Dtests.locale=ga -Dtests.timezone=America/Cancun -Druntime.java=17` **Applicable branches:** master **Reproduces locally?:** Yes **Failure history:** https://gradle-enterprise.elastic.co/scans/tests?tests.container=org.elasticsearch.xpack.runtimefields.test.mapped.CoreWithMappedRuntimeFieldsIT&tests.test=test%20%7Byaml%3Dsearch.aggregation/450_time_series/Basic%20test%7D **Failure excerpt:** ``` java.lang.AssertionError: Failure at [search.aggregation/450_time_series:6]: expected [2xx] status code but api [indices.create] returned [400 Bad Request] [{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat 
org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat 
org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\n"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","caused_by":{"type":"mapper_parsing_exception","reason":"unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat 
org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat 
org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\n"},"stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:377)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat 
org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\nCaused by: org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat 
org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\t... 22 more\n"},"status":400}] at __randomizedtesting.SeedInfo.seed([10391BEE48E763BE:986D2434E61B0E46]:0) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.executeSection(ESClientYamlSuiteTestCase.java:489) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.test(ESClientYamlSuiteTestCase.java:453) at jdk.internal.reflect.GeneratedMethodAccessor12.invoke(null:-1) at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:568) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:44) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:824) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:475) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) 
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:831) at java.lang.Thread.run(Thread.java:833) Caused by: java.lang.AssertionError: expected [2xx] status code but api [indices.create] returned [400 Bad Request] [{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat 
org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat 
java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\n"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","caused_by":{"type":"mapper_parsing_exception","reason":"unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat 
org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat 
java.base/java.lang.Thread.run(Thread.java:833)\n"},"stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:377)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat 
org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\nCaused by: org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\t... 
22 more\n"},"status":400}] at org.junit.Assert.fail(Assert.java:88) at org.elasticsearch.test.rest.yaml.section.DoSection.execute(DoSection.java:374) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.executeSection(ESClientYamlSuiteTestCase.java:478) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.test(ESClientYamlSuiteTestCase.java:453) at jdk.internal.reflect.GeneratedMethodAccessor12.invoke(null:-1) at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:568) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:44) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:824) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:475) at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:831) at java.lang.Thread.run(Thread.java:833) ```
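For context: the root cause in the trace is that `time_series_dimension` is being parsed on a *runtime* field, where it is not a supported parameter. Since the `core-with-mapped` suite rewrites mapped fields from the YAML tests into runtime fields, a mapped `keyword` dimension field from `450_time_series` presumably ends up in the `runtime` section of the generated mapping. A minimal request that should hit the same `mapper_parsing_exception` (index name is illustrative; the field name `key` is taken from the error message):

```
PUT /tsdb_test
{
  "mappings": {
    "runtime": {
      "key": {
        "type": "keyword",
        "time_series_dimension": true
      }
    }
  }
}
```

The `RuntimeField$Builder.parse(RuntimeField.java:81)` frame is where the unknown parameter is rejected, so a fix would presumably either skip rewriting dimension fields into runtime fields in this suite, or teach runtime `keyword` fields to accept (or ignore) `time_series_dimension`.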
org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\nCaused by: org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat 
org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\t... 22 more\n"},"status":400}] at __randomizedtesting.SeedInfo.seed([10391BEE48E763BE:986D2434E61B0E46]:0) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.executeSection(ESClientYamlSuiteTestCase.java:489) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.test(ESClientYamlSuiteTestCase.java:453) at jdk.internal.reflect.GeneratedMethodAccessor12.invoke(null:-1) at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:568) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:44) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at 
org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:824) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:475) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at 
org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:831) at java.lang.Thread.run(Thread.java:833) Caused by: java.lang.AssertionError: expected [2xx] status code but api [indices.create] returned [400 Bad Request] [{"error":{"root_cause":[{"type":"mapper_parsing_exception","reason":"unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat 
org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat 
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\n"}],"type":"mapper_parsing_exception","reason":"Failed to parse mapping: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","caused_by":{"type":"mapper_parsing_exception","reason":"unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]","stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat 
org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat 
java.base/java.lang.Thread.run(Thread.java:833)\n"},"stack_trace":"org.elasticsearch.index.mapper.MapperParsingException: Failed to parse mapping: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:377)\n\tat org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:352)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.updateIndexMappingsAndBuildSortOrder(MetadataCreateIndexService.java:1222)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.lambda$applyCreateIndexWithTemporaryService$3(MetadataCreateIndexService.java:428)\n\tat org.elasticsearch.indices.IndicesService.withTempIndexService(IndicesService.java:663)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexWithTemporaryService(MetadataCreateIndexService.java:426)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequestWithV2Template(MetadataCreateIndexService.java:605)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:368)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService.applyCreateIndexRequest(MetadataCreateIndexService.java:395)\n\tat org.elasticsearch.cluster.metadata.MetadataCreateIndexService$1.execute(MetadataCreateIndexService.java:293)\n\tat org.elasticsearch.cluster.ClusterStateTaskExecutor$1.execute(ClusterStateTaskExecutor.java:142)\n\tat org.elasticsearch.cluster.service.MasterService.executeTasks(MasterService.java:818)\n\tat org.elasticsearch.cluster.service.MasterService.calculateTaskOutputs(MasterService.java:397)\n\tat org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:237)\n\tat org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:162)\n\tat 
org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:152)\n\tat org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:208)\n\tat org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:717)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:260)\n\tat org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:223)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1136)\n\tat java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:635)\n\tat java.base/java.lang.Thread.run(Thread.java:833)\nCaused by: org.elasticsearch.index.mapper.MapperParsingException: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]\n\tat org.elasticsearch.index.mapper.RuntimeField$Builder.parse(RuntimeField.java:81)\n\tat org.elasticsearch.index.mapper.RuntimeField$Parser.parse(RuntimeField.java:114)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:182)\n\tat org.elasticsearch.index.mapper.RuntimeField.parseRuntimeFields(RuntimeField.java:132)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.processField(RootObjectMapper.java:231)\n\tat org.elasticsearch.index.mapper.RootObjectMapper$TypeParser.parse(RootObjectMapper.java:162)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:99)\n\tat org.elasticsearch.index.mapper.MappingParser.parse(MappingParser.java:94)\n\tat org.elasticsearch.index.mapper.MapperService.parseMapping(MapperService.java:375)\n\t... 
22 more\n"},"status":400}] at org.junit.Assert.fail(Assert.java:88) at org.elasticsearch.test.rest.yaml.section.DoSection.execute(DoSection.java:374) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.executeSection(ESClientYamlSuiteTestCase.java:478) at org.elasticsearch.test.rest.yaml.ESClientYamlSuiteTestCase.test(ESClientYamlSuiteTestCase.java:453) at jdk.internal.reflect.GeneratedMethodAccessor12.invoke(null:-1) at jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.lang.reflect.Method.invoke(Method.java:568) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1758) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:946) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:982) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:996) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:44) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:824) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:475) at 
com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:955) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:840) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:891) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:902) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:375) at com.carrotsearch.randomizedtesting.ThreadLeakControl.lambda$forkTimeoutingTask$0(ThreadLeakControl.java:831) at java.lang.Thread.run(Thread.java:833) ```
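For context, the root cause in the excerpt above is a create-index request whose `runtime` mapping section carries `time_series_dimension` on a `keyword` runtime field, which the runtime-field parser rejects. The failure can likely be reproduced outside the YAML suite with a request along these lines (index and field names here are illustrative, not the exact test body):

```
PUT /test-index
{
  "mappings": {
    "runtime": {
      "key": {
        "type": "keyword",
        "time_series_dimension": true
      }
    }
  }
}
```

On a build exhibiting this failure, the request should come back `400 Bad Request` with `mapper_parsing_exception: unknown parameter [time_series_dimension] on runtime field [key] of type [keyword]`, matching the trace above — consistent with `time_series_dimension` being accepted on mapped `keyword` fields but not (yet) on their runtime counterparts, which is what this suite exercises by re-mapping fields as runtime fields.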
parse runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper rootobjectmapper typeparser processfield rootobjectmapper java n tat org elasticsearch index mapper rootobjectmapper typeparser parse rootobjectmapper java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mapperservice parsemapping mapperservice java n t more n status at randomizedtesting seedinfo seed at org elasticsearch test rest yaml esclientyamlsuitetestcase executesection esclientyamlsuitetestcase java at org elasticsearch test rest yaml esclientyamlsuitetestcase test esclientyamlsuitetestcase java at jdk internal reflect invoke null at jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at com carrotsearch randomizedtesting rules 
statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulestoreclassname evaluate testrulestoreclassname java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testruleassertionsrequired evaluate testruleassertionsrequired java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testruleignoretestsuites evaluate testruleignoretestsuites java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol 
statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol lambda forktimeoutingtask threadleakcontrol java at java lang thread run thread java caused by java lang assertionerror expected status code but api returned on runtime field of type stack trace org elasticsearch index mapper mapperparsingexception unknown parameter on runtime field of type n tat org elasticsearch index mapper runtimefield builder parse runtimefield java n tat org elasticsearch index mapper runtimefield parser parse runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper rootobjectmapper typeparser processfield rootobjectmapper java n tat org elasticsearch index mapper rootobjectmapper typeparser parse rootobjectmapper java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mapperservice parsemapping mapperservice java n tat org elasticsearch index mapper mapperservice merge mapperservice java n tat org elasticsearch cluster metadata metadatacreateindexservice updateindexmappingsandbuildsortorder metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice lambda applycreateindexwithtemporaryservice metadatacreateindexservice java n tat org elasticsearch indices indicesservice withtempindexservice indicesservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexwithtemporaryservice metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexrequest metadatacreateindexservice java n tat org elasticsearch cluster 
metadata metadatacreateindexservice applycreateindexrequest metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice execute metadatacreateindexservice java n tat org elasticsearch cluster clusterstatetaskexecutor execute clusterstatetaskexecutor java n tat org elasticsearch cluster service masterservice executetasks masterservice java n tat org elasticsearch cluster service masterservice calculatetaskoutputs masterservice java n tat org elasticsearch cluster service masterservice runtasks masterservice java n tat org elasticsearch cluster service masterservice batcher run masterservice java n tat org elasticsearch cluster service taskbatcher runifnotprocessed taskbatcher java n tat org elasticsearch cluster service taskbatcher batchedtask run taskbatcher java n tat org elasticsearch common util concurrent threadcontext contextpreservingrunnable run threadcontext java n tat org elasticsearch common util concurrent prioritizedesthreadpoolexecutor tiebreakingprioritizedrunnable runandclean prioritizedesthreadpoolexecutor java n tat org elasticsearch common util concurrent prioritizedesthreadpoolexecutor tiebreakingprioritizedrunnable run prioritizedesthreadpoolexecutor java n tat java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java n tat java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java n tat java base java lang thread run thread java n type mapper parsing exception reason failed to parse mapping unknown parameter on runtime field of type caused by type mapper parsing exception reason unknown parameter on runtime field of type stack trace org elasticsearch index mapper mapperparsingexception unknown parameter on runtime field of type n tat org elasticsearch index mapper runtimefield builder parse runtimefield java n tat org elasticsearch index mapper runtimefield parser parse runtimefield java n tat org elasticsearch index mapper runtimefield 
parseruntimefields runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper rootobjectmapper typeparser processfield rootobjectmapper java n tat org elasticsearch index mapper rootobjectmapper typeparser parse rootobjectmapper java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mapperservice parsemapping mapperservice java n tat org elasticsearch index mapper mapperservice merge mapperservice java n tat org elasticsearch cluster metadata metadatacreateindexservice updateindexmappingsandbuildsortorder metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice lambda applycreateindexwithtemporaryservice metadatacreateindexservice java n tat org elasticsearch indices indicesservice withtempindexservice indicesservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexwithtemporaryservice metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexrequest metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexrequest metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice execute metadatacreateindexservice java n tat org elasticsearch cluster clusterstatetaskexecutor execute clusterstatetaskexecutor java n tat org elasticsearch cluster service masterservice executetasks masterservice java n tat org elasticsearch cluster service masterservice calculatetaskoutputs masterservice java n tat org elasticsearch cluster service masterservice runtasks masterservice java n tat org elasticsearch cluster service 
masterservice batcher run masterservice java n tat org elasticsearch cluster service taskbatcher runifnotprocessed taskbatcher java n tat org elasticsearch cluster service taskbatcher batchedtask run taskbatcher java n tat org elasticsearch common util concurrent threadcontext contextpreservingrunnable run threadcontext java n tat org elasticsearch common util concurrent prioritizedesthreadpoolexecutor tiebreakingprioritizedrunnable runandclean prioritizedesthreadpoolexecutor java n tat org elasticsearch common util concurrent prioritizedesthreadpoolexecutor tiebreakingprioritizedrunnable run prioritizedesthreadpoolexecutor java n tat java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java n tat java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java n tat java base java lang thread run thread java n stack trace org elasticsearch index mapper mapperparsingexception failed to parse mapping unknown parameter on runtime field of type n tat org elasticsearch index mapper mapperservice parsemapping mapperservice java n tat org elasticsearch index mapper mapperservice merge mapperservice java n tat org elasticsearch cluster metadata metadatacreateindexservice updateindexmappingsandbuildsortorder metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice lambda applycreateindexwithtemporaryservice metadatacreateindexservice java n tat org elasticsearch indices indicesservice withtempindexservice indicesservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexwithtemporaryservice metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexrequest metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice applycreateindexrequest 
metadatacreateindexservice java n tat org elasticsearch cluster metadata metadatacreateindexservice execute metadatacreateindexservice java n tat org elasticsearch cluster clusterstatetaskexecutor execute clusterstatetaskexecutor java n tat org elasticsearch cluster service masterservice executetasks masterservice java n tat org elasticsearch cluster service masterservice calculatetaskoutputs masterservice java n tat org elasticsearch cluster service masterservice runtasks masterservice java n tat org elasticsearch cluster service masterservice batcher run masterservice java n tat org elasticsearch cluster service taskbatcher runifnotprocessed taskbatcher java n tat org elasticsearch cluster service taskbatcher batchedtask run taskbatcher java n tat org elasticsearch common util concurrent threadcontext contextpreservingrunnable run threadcontext java n tat org elasticsearch common util concurrent prioritizedesthreadpoolexecutor tiebreakingprioritizedrunnable runandclean prioritizedesthreadpoolexecutor java n tat org elasticsearch common util concurrent prioritizedesthreadpoolexecutor tiebreakingprioritizedrunnable run prioritizedesthreadpoolexecutor java n tat java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java n tat java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java n tat java base java lang thread run thread java ncaused by org elasticsearch index mapper mapperparsingexception unknown parameter on runtime field of type n tat org elasticsearch index mapper runtimefield builder parse runtimefield java n tat org elasticsearch index mapper runtimefield parser parse runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper runtimefield parseruntimefields runtimefield java n tat org elasticsearch index mapper rootobjectmapper typeparser processfield rootobjectmapper java n tat org elasticsearch index mapper 
rootobjectmapper typeparser parse rootobjectmapper java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mappingparser parse mappingparser java n tat org elasticsearch index mapper mapperservice parsemapping mapperservice java n t more n status at org junit assert fail assert java at org elasticsearch test rest yaml section dosection execute dosection java at org elasticsearch test rest yaml esclientyamlsuitetestcase executesection esclientyamlsuitetestcase java at org elasticsearch test rest yaml esclientyamlsuitetestcase test esclientyamlsuitetestcase java at jdk internal reflect invoke null at jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com 
carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulestoreclassname evaluate testrulestoreclassname java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testruleassertionsrequired evaluate testruleassertionsrequired java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testruleignoretestsuites evaluate testruleignoretestsuites java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol lambda forktimeoutingtask threadleakcontrol java at java lang thread run thread java
1
78,672
7,657,515,312
IssuesEvent
2018-05-10 19:56:13
kubernetes/kubeadm
https://api.github.com/repos/kubernetes/kubeadm
closed
Create `[Feature:kubeadm]` e2e tests
active area/testing kind/feature priority/important-soon
...under `test/e2e_kubeadm/` in the core kubernetes repo. e2e tests here should for instance make sure that: - At least one node is tainted/labelled with `node-role.kubernetes.io/master=""` - `cluster-info` is exposed in a correct manner, and the right `Role` and `RoleBinding` exist - The right Node Bootstrap Token RBAC rules exist, the SAR approver is enabled, etc. - The `kubeadm-config` ConfigMap exists. - API Aggregation features are enabled - What else? @timothysc @pipejakob Ideas?
1.0
Create `[Feature:kubeadm]` e2e tests - ...under `test/e2e_kubeadm/` in the core kubernetes repo. e2e tests here should for instance make sure that: - At least one node is tainted/labelled with `node-role.kubernetes.io/master=""` - `cluster-info` is exposed in a correct manner, and the right `Role` and `RoleBinding` exist - The right Node Bootstrap Token RBAC rules exist, the SAR approver is enabled, etc. - The `kubeadm-config` ConfigMap exists. - API Aggregation features are enabled - What else? @timothysc @pipejakob Ideas?
test
create tests under test kubeadm in the core kubernetes repo tests here should for instance make sure that at least one node is tainted labelled with node role kubernetes io master cluster info is exposed in a correct manner and the right role and rolebinding exist the right node bootstrap token rbac rules exist the sar approver is enabled etc the kubeadm config configmap exists api aggregation features are enabled what else timothysc pipejakob ideas
1
309,038
26,648,631,422
IssuesEvent
2023-01-25 12:00:07
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix non_linear_activation_functions.test_torch_threshold
PyTorch Frontend Sub Task Failing Test
| | | |---|---| |torch|<a href="https://github.com/unifyai/ivy/actions/runs/3787482025/jobs/6439326637" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/3780436677/jobs/6426522537" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/3788736379/jobs/6441820526" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
1.0
Fix non_linear_activation_functions.test_torch_threshold - | | | |---|---| |torch|<a href="https://github.com/unifyai/ivy/actions/runs/3787482025/jobs/6439326637" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/3780436677/jobs/6426522537" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/3788736379/jobs/6441820526" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
test
fix non linear activation functions test torch threshold torch img src numpy img src tensorflow img src
1
32,050
12,059,783,473
IssuesEvent
2020-04-15 19:55:26
Dima2021/EUA_DEMO
https://api.github.com/repos/Dima2021/EUA_DEMO
opened
CVE-2015-9251 (Medium) detected in jquery-1.7.2.min.js
security vulnerability
## CVE-2015-9251 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js</a></p> <p>Path to vulnerable library: /EUA_DEMO/ksa/ksa-web-root/ksa-web/src/main/webapp/rs/jquery/jquery-1.7.2.min.js,/EUA_DEMO/ksa/ksa-web-root/ksa-web/src/main/webapp/rs/jquery/jquery-1.7.2.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.2.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Dima2021/EUA_DEMO/commit/f26c5c8306d8f562cf2c10bf0cc1c8e8b82efc4c">f26c5c8306d8f562cf2c10bf0cc1c8e8b82efc4c</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v3.0.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.7.2","isTransitiveDependency":false,"dependencyTree":"jquery:1.7.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2015-9251 (Medium) detected in jquery-1.7.2.min.js - ## CVE-2015-9251 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.2/jquery.min.js</a></p> <p>Path to vulnerable library: /EUA_DEMO/ksa/ksa-web-root/ksa-web/src/main/webapp/rs/jquery/jquery-1.7.2.min.js,/EUA_DEMO/ksa/ksa-web-root/ksa-web/src/main/webapp/rs/jquery/jquery-1.7.2.min.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.2.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Dima2021/EUA_DEMO/commit/f26c5c8306d8f562cf2c10bf0cc1c8e8b82efc4c">f26c5c8306d8f562cf2c10bf0cc1c8e8b82efc4c</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed. 
<p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v3.0.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.7.2","isTransitiveDependency":false,"dependencyTree":"jquery:1.7.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_test
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to vulnerable library eua demo ksa ksa web root ksa web src main webapp rs jquery jquery min js eua demo ksa ksa web root ksa web src main webapp rs jquery jquery min js dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed vulnerabilityurl
0
290,463
25,069,260,224
IssuesEvent
2022-11-07 10:47:49
zer0five/yomikaze
https://api.github.com/repos/zer0five/yomikaze
opened
[Test Case]: RegisterHasLinkToLogin
Test Tested Selenium
### Description of the test case Check whether the register page has a link to the login form ### Script ```Java // Generated by Selenium IDE import org.junit.Test; import org.junit.Before; import org.junit.After; import static org.junit.Assert.*; import static org.hamcrest.CoreMatchers.is; import static org.hamcrest.core.IsNot.not; import org.openqa.selenium.By; import org.openqa.selenium.WebDriver; import org.openqa.selenium.firefox.FirefoxDriver; import org.openqa.selenium.chrome.ChromeDriver; import org.openqa.selenium.remote.RemoteWebDriver; import org.openqa.selenium.remote.DesiredCapabilities; import org.openqa.selenium.Dimension; import org.openqa.selenium.WebElement; import org.openqa.selenium.interactions.Actions; import org.openqa.selenium.support.ui.ExpectedConditions; import org.openqa.selenium.support.ui.WebDriverWait; import org.openqa.selenium.JavascriptExecutor; import org.openqa.selenium.Alert; import org.openqa.selenium.Keys; import java.util.*; import java.net.MalformedURLException; import java.net.URL; public class RegisterHasLinkToLoginTest { private WebDriver driver; private Map<String, Object> vars; JavascriptExecutor js; @Before public void setUp() { driver = new ChromeDriver(); js = (JavascriptExecutor) driver; vars = new HashMap<String, Object>(); } @After public void tearDown() { driver.quit(); } @Test public void registerHasLinkToLogin() { driver.get("https://yomikaze.herokuapp.com/"); driver.manage().window().setSize(new Dimension(1536, 824)); driver.findElement(By.cssSelector(".btn-success")).click(); driver.findElement(By.linkText("Sign In")).click(); { List<WebElement> elements = driver.findElements(By.cssSelector(".text-5xl")); assert(elements.size() > 0); } assertThat(driver.findElement(By.cssSelector(".text-5xl")).getText(), is("Sign In")); { List<WebElement> elements = driver.findElements(By.cssSelector(".py-6")); assert(elements.size() > 0); } assertThat(driver.findElement(By.cssSelector(".py-6")).getText(), is("Get signed-in to access
member only features")); { List<WebElement> elements = driver.findElements(By.cssSelector(".form-control > #submit")); assert(elements.size() > 0); } assertThat(driver.findElement(By.cssSelector("#submit > span")).getText(), is("SIGN IN")); } } ``` ### What browsers are you seeing the problem on? Chrome
2.0
[Test Case]: RegisterHasLinkToLogin - ### Description of the test case Check whether register has a link to the login form ### Script ```Java // Generated by Selenium IDE import org.junit.Test; import org.junit.Before; import org.junit.After; import static org.junit.Assert.*; import static org.hamcrest.CoreMatchers.is; import static org.hamcrest.core.IsNot.not; import org.openqa.selenium.By; import org.openqa.selenium.WebDriver; import org.openqa.selenium.firefox.FirefoxDriver; import org.openqa.selenium.chrome.ChromeDriver; import org.openqa.selenium.remote.RemoteWebDriver; import org.openqa.selenium.remote.DesiredCapabilities; import org.openqa.selenium.Dimension; import org.openqa.selenium.WebElement; import org.openqa.selenium.interactions.Actions; import org.openqa.selenium.support.ui.ExpectedConditions; import org.openqa.selenium.support.ui.WebDriverWait; import org.openqa.selenium.JavascriptExecutor; import org.openqa.selenium.Alert; import org.openqa.selenium.Keys; import java.util.*; import java.net.MalformedURLException; import java.net.URL; public class RegisterHasLinkToLoginTest { private WebDriver driver; private Map<String, Object> vars; JavascriptExecutor js; @Before public void setUp() { driver = new ChromeDriver(); js = (JavascriptExecutor) driver; vars = new HashMap<String, Object>(); } @After public void tearDown() { driver.quit(); } @Test public void registerHasLinkToLogin() { driver.get("https://yomikaze.herokuapp.com/"); driver.manage().window().setSize(new Dimension(1536, 824)); driver.findElement(By.cssSelector(".btn-success")).click(); driver.findElement(By.linkText("Sign In")).click(); { List<WebElement> elements = driver.findElements(By.cssSelector(".text-5xl")); assert(elements.size() > 0); } assertThat(driver.findElement(By.cssSelector(".text-5xl")).getText(), is("Sign In")); { List<WebElement> elements = driver.findElements(By.cssSelector(".py-6")); assert(elements.size() > 0); }
assertThat(driver.findElement(By.cssSelector(".py-6")).getText(), is("Get signed-in to access member only features")); { List<WebElement> elements = driver.findElements(By.cssSelector(".form-control > #submit")); assert(elements.size() > 0); } assertThat(driver.findElement(By.cssSelector("#submit > span")).getText(), is("SIGN IN")); } } ``` ### What browsers are you seeing the problem on? Chrome
test
registerhaslinktologin description of the test case check whether register has a link to the login form script java generated by selenium ide import org junit test import org junit before import org junit after import static org junit assert import static org hamcrest corematchers is import static org hamcrest core isnot not import org openqa selenium by import org openqa selenium webdriver import org openqa selenium firefox firefoxdriver import org openqa selenium chrome chromedriver import org openqa selenium remote remotewebdriver import org openqa selenium remote desiredcapabilities import org openqa selenium dimension import org openqa selenium webelement import org openqa selenium interactions actions import org openqa selenium support ui expectedconditions import org openqa selenium support ui webdriverwait import org openqa selenium javascriptexecutor import org openqa selenium alert import org openqa selenium keys import java util import java net malformedurlexception import java net url public class registerhaslinktologintest private webdriver driver private map vars javascriptexecutor js before public void setup driver new chromedriver js javascriptexecutor driver vars new hashmap after public void teardown driver quit test public void registerhaslinktologin driver get driver manage window setsize new dimension driver findelement by cssselector btn success click driver findelement by linktext sign in click list elements driver findelements by cssselector text assert elements size assertthat driver findelement by cssselector text gettext is sign in list elements driver findelements by cssselector py assert elements size assertthat driver findelement by cssselector py gettext is get signed in to access member only features list elements driver findelements by cssselector form control submit assert elements size assertthat driver findelement by cssselector submit span gettext is sign in what browsers are you seeing the problem on chrome
1
104,447
16,616,824,360
IssuesEvent
2021-06-02 17:48:43
Dima2021/t-vault
https://api.github.com/repos/Dima2021/t-vault
opened
WS-2018-0072 (High) detected in https-proxy-agent-1.0.0.tgz
security vulnerability
## WS-2018-0072 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>https-proxy-agent-1.0.0.tgz</b></p></summary> <p>An HTTP(s) proxy `http.Agent` implementation for HTTPS</p> <p>Library home page: <a href="https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-1.0.0.tgz">https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-1.0.0.tgz</a></p> <p>Path to dependency file: t-vault/tvaultui/package.json</p> <p>Path to vulnerable library: t-vault/tvaultui/node_modules/https-proxy-agent/package.json</p> <p> Dependency Hierarchy: - gulp-protractor-1.0.0.tgz (Root Library) - protractor-2.5.1.tgz - saucelabs-1.0.1.tgz - :x: **https-proxy-agent-1.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Dima2021/t-vault/commit/259885b704776a5554c5d008b51b19c9b0ea9fd5">259885b704776a5554c5d008b51b19c9b0ea9fd5</a></p> <p>Found in base branch: <b>dev</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Versions of https-proxy-agent before 2.2.0 are vulnerable to a denial of service. This is due to unsanitized options (proxy.auth) being passed to Buffer(). 
<p>Publish Date: 2018-02-28 <p>URL: <a href=https://hackerone.com/reports/319532>WS-2018-0072</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nodesecurity.io/advisories/593">https://nodesecurity.io/advisories/593</a></p> <p>Release Date: 2018-01-27</p> <p>Fix Resolution: 2.2.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"https-proxy-agent","packageVersion":"1.0.0","packageFilePaths":["/tvaultui/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-protractor:1.0.0;protractor:2.5.1;saucelabs:1.0.1;https-proxy-agent:1.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.2.0"}],"baseBranches":["dev"],"vulnerabilityIdentifier":"WS-2018-0072","vulnerabilityDetails":"Versions of https-proxy-agent before 2.2.0 are vulnerable to a denial of service. This is due to unsanitized options (proxy.auth) being passed to Buffer().","vulnerabilityUrl":"https://hackerone.com/reports/319532","cvss2Severity":"high","cvss2Score":"8.0","extraData":{}}</REMEDIATE> -->
True
WS-2018-0072 (High) detected in https-proxy-agent-1.0.0.tgz - ## WS-2018-0072 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>https-proxy-agent-1.0.0.tgz</b></p></summary> <p>An HTTP(s) proxy `http.Agent` implementation for HTTPS</p> <p>Library home page: <a href="https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-1.0.0.tgz">https://registry.npmjs.org/https-proxy-agent/-/https-proxy-agent-1.0.0.tgz</a></p> <p>Path to dependency file: t-vault/tvaultui/package.json</p> <p>Path to vulnerable library: t-vault/tvaultui/node_modules/https-proxy-agent/package.json</p> <p> Dependency Hierarchy: - gulp-protractor-1.0.0.tgz (Root Library) - protractor-2.5.1.tgz - saucelabs-1.0.1.tgz - :x: **https-proxy-agent-1.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Dima2021/t-vault/commit/259885b704776a5554c5d008b51b19c9b0ea9fd5">259885b704776a5554c5d008b51b19c9b0ea9fd5</a></p> <p>Found in base branch: <b>dev</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Versions of https-proxy-agent before 2.2.0 are vulnerable to a denial of service. This is due to unsanitized options (proxy.auth) being passed to Buffer(). 
<p>Publish Date: 2018-02-28 <p>URL: <a href=https://hackerone.com/reports/319532>WS-2018-0072</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.0</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nodesecurity.io/advisories/593">https://nodesecurity.io/advisories/593</a></p> <p>Release Date: 2018-01-27</p> <p>Fix Resolution: 2.2.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"https-proxy-agent","packageVersion":"1.0.0","packageFilePaths":["/tvaultui/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-protractor:1.0.0;protractor:2.5.1;saucelabs:1.0.1;https-proxy-agent:1.0.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.2.0"}],"baseBranches":["dev"],"vulnerabilityIdentifier":"WS-2018-0072","vulnerabilityDetails":"Versions of https-proxy-agent before 2.2.0 are vulnerable to a denial of service. This is due to unsanitized options (proxy.auth) being passed to Buffer().","vulnerabilityUrl":"https://hackerone.com/reports/319532","cvss2Severity":"high","cvss2Score":"8.0","extraData":{}}</REMEDIATE> -->
non_test
ws high detected in https proxy agent tgz ws high severity vulnerability vulnerable library https proxy agent tgz an http s proxy http agent implementation for https library home page a href path to dependency file t vault tvaultui package json path to vulnerable library t vault tvaultui node modules https proxy agent package json dependency hierarchy gulp protractor tgz root library protractor tgz saucelabs tgz x https proxy agent tgz vulnerable library found in head commit a href found in base branch dev vulnerability details versions of https proxy agent before are vulnerable to a denial of service this is due to unsanitized options proxy auth being passed to buffer publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree gulp protractor protractor saucelabs https proxy agent isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails versions of https proxy agent before are vulnerable to a denial of service this is due to unsanitized options proxy auth being passed to buffer vulnerabilityurl
0
40,602
5,310,342,836
IssuesEvent
2017-02-12 19:16:02
X01X012013/AdBlockProtector
https://api.github.com/repos/X01X012013/AdBlockProtector
closed
Update request for AAK List
Fixed Test In Progress
We have to remove something from AAk script because shows to much ads on wp.pl
1.0
Update request for AAK List - We have to remove something from AAk script because shows to much ads on wp.pl
test
update request for aak list we have to remove something from aak script because shows to much ads on wp pl
1
273,613
8,550,855,503
IssuesEvent
2018-11-07 16:30:45
duckduckgo/zeroclickinfo-spice
https://api.github.com/repos/duckduckgo/zeroclickinfo-spice
closed
Forecast - Gives weather for incorrect country
Bug Priority: Medium Relevancy
Moved from https://github.com/duckduckgo/duckduckgo/issues/54 > When I search: "Vancouver Weather" in DDG, I get weather for Vancouver, WA. I'm actually looking for Vancouver, BC weather. I can fix this by searching for "Vancouver, BC Weather", but that is annoying, considering this is a search I might make quite often. > > Perhaps a solution for this is to prioritize cities based on your region in settings. Since I set my region to Canada, perhaps it should prioritize the Vancouver in Canada. I agree with @penguinvasion's recommendation -- we should prioritize locations that are physically closer to the user when multiple results with the same name exist ------ <!-- added by @pjhampton :: 9/mar/2017 --> IA Page: https://duck.co/ia/view/forecast
1.0
Forecast - Gives weather for incorrect country - Moved from https://github.com/duckduckgo/duckduckgo/issues/54 > When I search: "Vancouver Weather" in DDG, I get weather for Vancouver, WA. I'm actually looking for Vancouver, BC weather. I can fix this by searching for "Vancouver, BC Weather", but that is annoying, considering this is a search I might make quite often. > > Perhaps a solution for this is to prioritize cities based on your region in settings. Since I set my region to Canada, perhaps it should prioritize the Vancouver in Canada. I agree with @penguinvasion's recommendation -- we should prioritize locations that are physically closer to the user when multiple results with the same name exist ------ <!-- added by @pjhampton :: 9/mar/2017 --> IA Page: https://duck.co/ia/view/forecast
non_test
forecast gives weather for incorrect country moved from when i search vancouver weather in ddg i get weather for vancouver wa i m actually looking for vancouver bc weather i can fix this by searching for vancouver bc weather but that is annoying considering this is a search i might make quite often perhaps a solution for this is to prioritize cities based on your region in settings since i set my region to canada perhaps it should prioritize the vancouver in canada i agree with penguinvasion s recommendation we should prioritize locations that are physically closer to the user when multiple results with the same name exist ia page
0
41,545
5,374,396,282
IssuesEvent
2017-02-23 00:12:48
Microsoft/vscode
https://api.github.com/repos/Microsoft/vscode
closed
Test: inline decorations for debug column location
debug testplan-item
- [x] any os @egamma - [x] any os @mousetraps Complexity: 2 We are now using inline decorations to show when a debug step over happened on the same line (to make the transition more obvious when there are multiple statements on a line). Have a workspace that you can debug (for example our smoke test sample). Have some lines which contain multiple statements, start debugging and verify: * When you are stepping over lines with a single statement everything works as before * When you are stepping over a line with multiple statements an inline yellow arrow decoration appears to clarify the exact column location you are stepping over * Make sure the colors work for both light / dark / high contrast theme Note: column locations are not 100% correct and are coming from the debug adapter. At the moment we can not improve this.
1.0
Test: inline decorations for debug column location - - [x] any os @egamma - [x] any os @mousetraps Complexity: 2 We are now using inline decorations to show when a debug step over happened on the same line (to make the transition more obvious when there are multiple statements on a line). Have a workspace that you can debug (for example our smoke test sample). Have some lines which contain multiple statements, start debugging and verify: * When you are stepping over lines with a single statement everything works as before * When you are stepping over a line with multiple statements an inline yellow arrow decoration appears to clarify the exact column location you are stepping over * Make sure the colors work for both light / dark / high contrast theme Note: column locations are not 100% correct and are coming from the debug adapter. At the moment we can not improve this.
test
test inline decorations for debug column location any os egamma any os mousetraps complexity we are now using inline decorations to show when a debug step over happened on the same line to make the transition more obvious when there are multiple statements on a line have a workspace that you can debug for example our smoke test sample have some lines which contain multiple statements start debugging and verify when you are stepping over lines with a single statement everything works as before when you are stepping over a line with multiple statements an inline yellow arrow decoration appears to clarify the exact column location you are stepping over make sure the colors work for both light dark high contrast theme note column locations are not correct and are coming from the debug adapter at the moment we can not improve this
1
331,051
28,503,553,815
IssuesEvent
2023-04-18 19:21:28
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Failing test: X-Pack EPM API Integration Tests.x-pack/test/fleet_api_integration/apis/epm/list·ts - Fleet Endpoints EPM Endpoints EPM - list list api tests lists all limited packages from the registry
failed-test skipped-test Team:Fleet v7.17.10
A test failed on a tracked branch ``` Error: expected [] to sort of equal [ 'endpoint' ] at Assertion.assert (/opt/local-ssd/buildkite/builds/kb-n2-4-226a62fcfcfe82e7/elastic/kibana-hourly/kibana/node_modules/@kbn/expect/expect.js:100:11) at Assertion.eql (/opt/local-ssd/buildkite/builds/kb-n2-4-226a62fcfcfe82e7/elastic/kibana-hourly/kibana/node_modules/@kbn/expect/expect.js:244:8) at Context.<anonymous> (test/fleet_api_integration/apis/epm/list.ts:57:42) at runMicrotasks (<anonymous>) at processTicksAndRejections (node:internal/process/task_queues:96:5) at Object.apply (/opt/local-ssd/buildkite/builds/kb-n2-4-226a62fcfcfe82e7/elastic/kibana-hourly/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16) { actual: '[]', expected: '[\n "endpoint"\n]', showDiff: true } ``` First failure: [CI Build - 7.17](https://buildkite.com/elastic/kibana-hourly/builds/10823#bf6e57b4-8665-409d-a9ed-3be3a8d99170) <!-- kibanaCiData = {"failed-test":{"test.class":"X-Pack EPM API Integration Tests.x-pack/test/fleet_api_integration/apis/epm/list·ts","test.name":"Fleet Endpoints EPM Endpoints EPM - list list api tests lists all limited packages from the registry","test.failCount":1}} -->
2.0
Failing test: X-Pack EPM API Integration Tests.x-pack/test/fleet_api_integration/apis/epm/list·ts - Fleet Endpoints EPM Endpoints EPM - list list api tests lists all limited packages from the registry - A test failed on a tracked branch ``` Error: expected [] to sort of equal [ 'endpoint' ] at Assertion.assert (/opt/local-ssd/buildkite/builds/kb-n2-4-226a62fcfcfe82e7/elastic/kibana-hourly/kibana/node_modules/@kbn/expect/expect.js:100:11) at Assertion.eql (/opt/local-ssd/buildkite/builds/kb-n2-4-226a62fcfcfe82e7/elastic/kibana-hourly/kibana/node_modules/@kbn/expect/expect.js:244:8) at Context.<anonymous> (test/fleet_api_integration/apis/epm/list.ts:57:42) at runMicrotasks (<anonymous>) at processTicksAndRejections (node:internal/process/task_queues:96:5) at Object.apply (/opt/local-ssd/buildkite/builds/kb-n2-4-226a62fcfcfe82e7/elastic/kibana-hourly/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16) { actual: '[]', expected: '[\n "endpoint"\n]', showDiff: true } ``` First failure: [CI Build - 7.17](https://buildkite.com/elastic/kibana-hourly/builds/10823#bf6e57b4-8665-409d-a9ed-3be3a8d99170) <!-- kibanaCiData = {"failed-test":{"test.class":"X-Pack EPM API Integration Tests.x-pack/test/fleet_api_integration/apis/epm/list·ts","test.name":"Fleet Endpoints EPM Endpoints EPM - list list api tests lists all limited packages from the registry","test.failCount":1}} -->
test
failing test x pack epm api integration tests x pack test fleet api integration apis epm list·ts fleet endpoints epm endpoints epm list list api tests lists all limited packages from the registry a test failed on a tracked branch error expected to sort of equal at assertion assert opt local ssd buildkite builds kb elastic kibana hourly kibana node modules kbn expect expect js at assertion eql opt local ssd buildkite builds kb elastic kibana hourly kibana node modules kbn expect expect js at context test fleet api integration apis epm list ts at runmicrotasks at processticksandrejections node internal process task queues at object apply opt local ssd buildkite builds kb elastic kibana hourly kibana node modules kbn test target node functional test runner lib mocha wrap function js actual expected showdiff true first failure
1
294,493
25,376,642,166
IssuesEvent
2022-11-21 14:34:37
NVIDIA/spark-rapids
https://api.github.com/repos/NVIDIA/spark-rapids
closed
[BUG] CPU mismatch GPU result in test_hash_groupby_collect_with_single_distinct intermittently
bug duplicate test
**Describe the bug** test_hash_groupby_collect_with_single_distinct[[('a', RepeatSeq(Long)), ('b', RepeatSeq(Boolean)), ('c', LongRange(not_null))]][IGNORE_ORDER({'local': True})] - AssertionError: GPU and CPU boolean values are different at [18, 'sort_array(collect_list(b), true)', 3] [2022-11-18T03:41:02.798Z] CPU OUTPUT: [Row(a=-7540734677356764604, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-5831592707909023540, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-5133656973475552689, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-4426181692283497353, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-3917032101531217289, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-3502159106106506455, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-2697073954890740236, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-2123199122092230623, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-1, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=207981845540287738, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=393905103838704542, 
sort_array(collect_list(b), true)=[False, False, False, False, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=875130347651831881, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=4751953708995107450, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=6084712057446794809, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7198729688045931692, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7528354001793048440, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=7618709293599214015, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7984374766242566542, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=9223372036854775807, sort_array(collect_list(b), true)=[**False, False, False, True**, True, True, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=10, count(c)=10)] [2022-11-18T03:41:02.799Z] GPU OUTPUT: [Row(a=-7540734677356764604, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-5831592707909023540, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-5133656973475552689, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), 
true)=[True], count(c)=5, count(c)=5), Row(a=-4426181692283497353, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-3917032101531217289, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-3502159106106506455, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-2697073954890740236, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-2123199122092230623, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-1, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=207981845540287738, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=393905103838704542, sort_array(collect_list(b), true)=[False, False, False, False, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=875130347651831881, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=4751953708995107450, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=6084712057446794809, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7198729688045931692, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), 
Row(a=7528354001793048440, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=7618709293599214015, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7984374766242566542, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=9223372036854775807, sort_array(collect_list(b), true)=[**False, False, False, False**, True, True, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=10, count(c)=10)] ``` [2022-11-18T03:41:02.797Z] _ test_hash_groupby_collect_with_single_distinct[[('a', RepeatSeq(Long)), ('b', RepeatSeq(Boolean)), ('c', LongRange(not_null))]] _ [2022-11-18T03:41:02.797Z] [gw3] linux -- Python 3.8.15 /usr/bin/python [2022-11-18T03:41:02.797Z] [2022-11-18T03:41:02.797Z] data_gen = [('a', RepeatSeq(Long)), ('b', RepeatSeq(Boolean)), ('c', LongRange(not_null))] [2022-11-18T03:41:02.797Z] [2022-11-18T03:41:02.797Z] @ignore_order(local=True) [2022-11-18T03:41:02.797Z] @pytest.mark.parametrize('data_gen', _full_gen_data_for_collect_op, ids=idfn) [2022-11-18T03:41:02.797Z] def test_hash_groupby_collect_with_single_distinct(data_gen): [2022-11-18T03:41:02.797Z] # test collect_ops with other distinct aggregations [2022-11-18T03:41:02.797Z] > assert_gpu_and_cpu_are_equal_collect( [2022-11-18T03:41:02.797Z] lambda spark: gen_df(spark, data_gen, length=100) [2022-11-18T03:41:02.797Z] .groupby('a') [2022-11-18T03:41:02.797Z] .agg(f.sort_array(f.collect_list('b')), [2022-11-18T03:41:02.797Z] f.sort_array(f.collect_set('b')), [2022-11-18T03:41:02.797Z] f.countDistinct('c'), [2022-11-18T03:41:02.797Z] f.count('c'))) [2022-11-18T03:41:02.797Z] [2022-11-18T03:41:02.797Z] ../../src/main/python/hash_aggregate_test.py:736: [2022-11-18T03:41:02.797Z] _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
[BUG] CPU mismatch GPU result in test_hash_groupby_collect_with_single_distinct intermittently

**Describe the bug**

test_hash_groupby_collect_with_single_distinct[[('a', RepeatSeq(Long)), ('b', RepeatSeq(Boolean)), ('c', LongRange(not_null))]][IGNORE_ORDER({'local': True})] - AssertionError: GPU and CPU boolean values are different at [18, 'sort_array(collect_list(b), true)', 3]

[2022-11-18T03:41:02.798Z] CPU OUTPUT: [Row(a=-7540734677356764604, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-5831592707909023540, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-5133656973475552689, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-4426181692283497353, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-3917032101531217289, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-3502159106106506455, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-2697073954890740236, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-2123199122092230623, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-1, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=207981845540287738, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=393905103838704542, sort_array(collect_list(b), true)=[False, False, False, False, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=875130347651831881, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=4751953708995107450, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=6084712057446794809, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7198729688045931692, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7528354001793048440, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=7618709293599214015, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7984374766242566542, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=9223372036854775807, sort_array(collect_list(b), true)=[**False, False, False, True**, True, True, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=10, count(c)=10)]

[2022-11-18T03:41:02.799Z] GPU OUTPUT: [Row(a=-7540734677356764604, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-5831592707909023540, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-5133656973475552689, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-4426181692283497353, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-3917032101531217289, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-3502159106106506455, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-2697073954890740236, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-2123199122092230623, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-1, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=207981845540287738, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=393905103838704542, sort_array(collect_list(b), true)=[False, False, False, False, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=875130347651831881, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=4751953708995107450, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=6084712057446794809, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7198729688045931692, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7528354001793048440, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=7618709293599214015, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7984374766242566542, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=9223372036854775807, sort_array(collect_list(b), true)=[**False, False, False, False**, True, True, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=10, count(c)=10)]

```
_ test_hash_groupby_collect_with_single_distinct[[('a', RepeatSeq(Long)), ('b', RepeatSeq(Boolean)), ('c', LongRange(not_null))]] _
[gw3] linux -- Python 3.8.15 /usr/bin/python

data_gen = [('a', RepeatSeq(Long)), ('b', RepeatSeq(Boolean)), ('c', LongRange(not_null))]

    @ignore_order(local=True)
    @pytest.mark.parametrize('data_gen', _full_gen_data_for_collect_op, ids=idfn)
    def test_hash_groupby_collect_with_single_distinct(data_gen):
        # test collect_ops with other distinct aggregations
>       assert_gpu_and_cpu_are_equal_collect(
            lambda spark: gen_df(spark, data_gen, length=100)
                .groupby('a')
                .agg(f.sort_array(f.collect_list('b')),
                     f.sort_array(f.collect_set('b')),
                     f.countDistinct('c'),
                     f.count('c')))

../../src/main/python/hash_aggregate_test.py:736:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
../../src/main/python/asserts.py:548: in assert_gpu_and_cpu_are_equal_collect
    _assert_gpu_and_cpu_are_equal(func, 'COLLECT', conf=conf, is_cpu_first=is_cpu_first)
../../src/main/python/asserts.py:479: in _assert_gpu_and_cpu_are_equal
    assert_equal(from_cpu, from_gpu)
../../src/main/python/asserts.py:106: in assert_equal
    _assert_equal(cpu, gpu, float_check=get_float_check(), path=[])
../../src/main/python/asserts.py:42: in _assert_equal
    _assert_equal(cpu[index], gpu[index], float_check, path + [index])
../../src/main/python/asserts.py:35: in _assert_equal
    _assert_equal(cpu[field], gpu[field], float_check, path + [field])
../../src/main/python/asserts.py:42: in _assert_equal
    _assert_equal(cpu[index], gpu[index], float_check, path + [index])
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

cpu = True, gpu = False
float_check = <function get_float_check.<locals>.<lambda> at 0x7fcbd6f26ee0>
path = [18, 'sort_array(collect_list(b), true)', 3]

    def _assert_equal(cpu, gpu, float_check, path):
        t = type(cpu)
        if (t is Row):
            assert len(cpu) == len(gpu), "CPU and GPU row have different lengths at {} CPU: {} GPU: {}".format(path, len(cpu), len(gpu))
            if hasattr(cpu, "__fields__") and hasattr(gpu, "__fields__"):
                assert cpu.__fields__ == gpu.__fields__, "CPU and GPU row have different fields at {} CPU: {} GPU: {}".format(path, cpu.__fields__, gpu.__fields__)
                for field in cpu.__fields__:
                    _assert_equal(cpu[field], gpu[field], float_check, path + [field])
            else:
                for index in range(len(cpu)):
                    _assert_equal(cpu[index], gpu[index], float_check, path + [index])
        elif (t is list):
            assert len(cpu) == len(gpu), "CPU and GPU list have different lengths at {} CPU: {} GPU: {}".format(path, len(cpu), len(gpu))
            for index in range(len(cpu)):
                _assert_equal(cpu[index], gpu[index], float_check, path + [index])
        elif (t is tuple):
            assert len(cpu) == len(gpu), "CPU and GPU list have different lengths at {} CPU: {} GPU: {}".format(path, len(cpu), len(gpu))
            for index in range(len(cpu)):
                _assert_equal(cpu[index], gpu[index], float_check, path + [index])
        elif (t is pytypes.GeneratorType):
            index = 0
            # generator has no zip :( so we have to do this the hard way
            done = False
            while not done:
                sub_cpu = None
                sub_gpu = None
                try:
                    sub_cpu = next(cpu)
                except StopIteration:
                    done = True

                try:
                    sub_gpu = next(gpu)
                except StopIteration:
                    done = True

                if done:
                    assert sub_cpu == sub_gpu and sub_cpu == None, "CPU and GPU generators have different lengths at {}".format(path)
                else:
                    _assert_equal(sub_cpu, sub_gpu, float_check, path + [index])

                index = index + 1
        elif (t is dict):
            # The order of key/values is not guaranteed in python dicts, nor are they guaranteed by Spark
            # so sort the items to do our best with ignoring the order of dicts
            cpu_items = list(cpu.items()).sort(key=_RowCmp)
            gpu_items = list(gpu.items()).sort(key=_RowCmp)
            _assert_equal(cpu_items, gpu_items, float_check, path + ["map"])
        elif (t is int):
            assert cpu == gpu, "GPU and CPU int values are different at {}".format(path)
        elif (t is float):
            if (math.isnan(cpu)):
                assert math.isnan(gpu), "GPU and CPU float values are different at {}".format(path)
            else:
                assert float_check(cpu, gpu), "GPU and CPU float values are different {}".format(path)
        elif isinstance(cpu, str):
            assert cpu == gpu, "GPU and CPU string values are different at {}".format(path)
        elif isinstance(cpu, datetime):
            assert cpu == gpu, "GPU and CPU timestamp values are different at {}".format(path)
        elif isinstance(cpu, date):
            assert cpu == gpu, "GPU and CPU date values are different at {}".format(path)
        elif isinstance(cpu, bool):
>           assert cpu == gpu, "GPU and CPU boolean values are different at {}".format(path)
E           AssertionError: GPU and CPU boolean values are different at [18, 'sort_array(collect_list(b), true)', 3]

../../src/main/python/asserts.py:90: AssertionError
----------------------------- Captured stdout call -----------------------------
### CPU RUN ###
### GPU RUN ###
### COLLECT: GPU TOOK 0.2924313545227051 CPU TOOK 0.29257845878601074 ###
CPU OUTPUT: [Row(a=-7540734677356764604, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-5831592707909023540, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-5133656973475552689, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-4426181692283497353, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-3917032101531217289, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-3502159106106506455, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-2697073954890740236, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-2123199122092230623, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-1, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=207981845540287738, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=393905103838704542, sort_array(collect_list(b), true)=[False, False, False, False, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=875130347651831881, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=4751953708995107450, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=6084712057446794809, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7198729688045931692, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7528354001793048440, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=7618709293599214015, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7984374766242566542, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=9223372036854775807, sort_array(collect_list(b), true)=[False, False, False, True, True, True, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=10, count(c)=10)]
GPU OUTPUT: [Row(a=-7540734677356764604, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-5831592707909023540, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-5133656973475552689, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-4426181692283497353, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-3917032101531217289, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-3502159106106506455, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=-2697073954890740236, sort_array(collect_list(b), true)=[False, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-2123199122092230623, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=-1, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=207981845540287738, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=393905103838704542, sort_array(collect_list(b), true)=[False, False, False, False, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=875130347651831881, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=4751953708995107450, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=6084712057446794809, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7198729688045931692, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7528354001793048440, sort_array(collect_list(b), true)=[False, False, False, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=7618709293599214015, sort_array(collect_list(b), true)=[True, True, True, True, True], sort_array(collect_set(b), true)=[True], count(c)=5, count(c)=5), Row(a=7984374766242566542, sort_array(collect_list(b), true)=[False, False, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=5, count(c)=5), Row(a=9223372036854775807, sort_array(collect_list(b), true)=[False, False, False, False, True, True, True, True, True, True], sort_array(collect_set(b), true)=[False, True], count(c)=10, count(c)=10)]
```

**Steps/Code to reproduce bug**

Not always reproducible, just saw one fail right now
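A note on reading the diff: because both sides run the collected booleans through `sort_array`, each list is fully determined by how many `False` values it contains, so the single mismatch at index 3 of the `a=9223372036854775807` group means the CPU collected 3 `False` values while the GPU collected 4. A minimal plain-Python sketch of that reduction (editorial illustration using the values from the log above; `first_mismatch` is a hypothetical helper, not part of the test framework):

```python
# Sorted collect_list(b) results reported for group a=9223372036854775807,
# the row at index 18 in the assertion path.
cpu_list = [False, False, False, True, True, True, True, True, True, True]
gpu_list = [False, False, False, False, True, True, True, True, True, True]

def first_mismatch(a, b):
    """Index of the first differing element, or None if the lists are equal."""
    for i, (x, y) in enumerate(zip(a, b)):
        if x != y:
            return i
    return None

# The first mismatch lands exactly at the CPU's False count, since a sorted
# boolean list is [False] * n_false + [True] * n_true.
print(first_mismatch(cpu_list, gpu_list))                  # 3, matching the path
print(cpu_list.count(False), gpu_list.count(False))        # 3 vs 4
```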
assert cpu gpu gpu and cpu int values are different at format path elif t is float if math isnan cpu assert math isnan gpu gpu and cpu float values are different at format path else assert float check cpu gpu gpu and cpu float values are different format path elif isinstance cpu str assert cpu gpu gpu and cpu string values are different at format path elif isinstance cpu datetime assert cpu gpu gpu and cpu timestamp values are different at format path elif isinstance cpu date assert cpu gpu gpu and cpu date values are different at format path elif isinstance cpu bool assert cpu gpu gpu and cpu boolean values are different at format path e assertionerror gpu and cpu boolean values are different at src main python asserts py assertionerror captured stdout call cpu run gpu run collect gpu took cpu took cpu output sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array 
collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c gpu output sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c row a sort array collect list b true sort array collect set b true count c count c steps code to reproduce bug not always reproducible just saw one fail right now
1
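The spark-rapids traceback above ends inside the plugin's recursive `assert_equal` helper, which walks the CPU and GPU result sets element by element (Rows, lists, dicts, then scalar leaves) and reports the index path of the first differing value — here a boolean at `[18][3]`. A much-simplified, self-contained sketch of that walk (a hypothetical simplification for illustration, not the plugin's actual code):

```python
def first_mismatch(cpu, gpu, path=()):
    """Return the index path of the first differing leaf, or None if equal.

    Simplified sketch of the recursive CPU-vs-GPU comparison shown in the
    traceback above; it handles only lists/tuples, dicts, and scalar
    leaves, while the real helper also special-cases Rows, floats with a
    tolerance callback, timestamps, and generators.
    """
    if isinstance(cpu, (list, tuple)):
        if not isinstance(gpu, (list, tuple)) or len(cpu) != len(gpu):
            return path  # length mismatch is reported at the container
        for i, (c, g) in enumerate(zip(cpu, gpu)):
            p = first_mismatch(c, g, path + (i,))
            if p is not None:
                return p
        return None
    if isinstance(cpu, dict):
        # Like the real helper, dict ordering is not trusted.
        if not isinstance(gpu, dict) or sorted(cpu) != sorted(gpu):
            return path
        for k in sorted(cpu):
            p = first_mismatch(cpu[k], gpu[k], path + (k,))
            if p is not None:
                return p
        return None
    return None if cpu == gpu else path

# A bool flipped deep inside nested rows is reported with its full index
# path, mirroring "boolean values are different at [18, 3]" in the log:
cpu_rows = [[1, [True, False]], [2, [True, True]]]
gpu_rows = [[1, [True, False]], [2, [True, False]]]
```

In this sketch `first_mismatch(cpu_rows, gpu_rows)` pinpoints the flipped boolean at path `(1, 1, 1)`; the path-reporting structure is the part the real helper's assertion message is built from.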
37,383
12,477,454,011
IssuesEvent
2020-05-29 14:59:12
LibrIT/passhport
https://api.github.com/repos/LibrIT/passhport
closed
CVE-2018-14042 (Medium) detected in bootstrap-3.3.2.min.js, bootstrap-3.3.7.min.js
New security vulnerability
## CVE-2018-14042 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.2.min.js</b>, <b>bootstrap-3.3.7.min.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.2.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.2/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.2/js/bootstrap.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/passhport/passhweb/app/static/bower_components/bootstrap-daterangepicker/website/index.html</p> <p>Path to vulnerable library: /passhport/passhweb/app/static/bower_components/bootstrap-daterangepicker/website/index.html,/passhport/passhweb/app/static/bower_components/bootstrap-daterangepicker/demo.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.2.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.7.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/passhport/passhweb/app/static/bower_components/bootstrap-colorpicker/index.html</p> <p>Path to vulnerable library: /passhport/passhweb/app/static/bower_components/bootstrap-colorpicker/index.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.7.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/LibrIT/passhport/commit/280394daf60b8887c5eebccaca5e3c390a11b1f2">280394daf60b8887c5eebccaca5e3c390a11b1f2</a></p> </p> </details> <p></p> 
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip. <p>Publish Date: 2018-07-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042>CVE-2018-14042</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-14042 (Medium) detected in bootstrap-3.3.2.min.js, bootstrap-3.3.7.min.js - ## CVE-2018-14042 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bootstrap-3.3.2.min.js</b>, <b>bootstrap-3.3.7.min.js</b></p></summary> <p> <details><summary><b>bootstrap-3.3.2.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.2/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.2/js/bootstrap.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/passhport/passhweb/app/static/bower_components/bootstrap-daterangepicker/website/index.html</p> <p>Path to vulnerable library: /passhport/passhweb/app/static/bower_components/bootstrap-daterangepicker/website/index.html,/passhport/passhweb/app/static/bower_components/bootstrap-daterangepicker/demo.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.2.min.js** (Vulnerable Library) </details> <details><summary><b>bootstrap-3.3.7.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.7/js/bootstrap.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/passhport/passhweb/app/static/bower_components/bootstrap-colorpicker/index.html</p> <p>Path to vulnerable library: /passhport/passhweb/app/static/bower_components/bootstrap-colorpicker/index.html</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.7.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a 
href="https://github.com/LibrIT/passhport/commit/280394daf60b8887c5eebccaca5e3c390a11b1f2">280394daf60b8887c5eebccaca5e3c390a11b1f2</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip. <p>Publish Date: 2018-07-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042>CVE-2018-14042</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in bootstrap min js bootstrap min js cve medium severity vulnerability vulnerable libraries bootstrap min js bootstrap min js bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file tmp ws scm passhport passhweb app static bower components bootstrap daterangepicker website index html path to vulnerable library passhport passhweb app static bower components bootstrap daterangepicker website index html passhport passhweb app static bower components bootstrap daterangepicker demo html dependency hierarchy x bootstrap min js vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to dependency file tmp ws scm passhport passhweb app static bower components bootstrap colorpicker index html path to vulnerable library passhport passhweb app static bower components bootstrap colorpicker index html dependency hierarchy x bootstrap min js vulnerable library found in head commit a href vulnerability details in bootstrap before xss is possible in the data container property of tooltip publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org webjars npm bootstrap org webjars bootstrap step up your open source security game with whitesource
0
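The vulnerability in the record above is a DOM XSS sink: Bootstrap's tooltip treated the user-supplied `data-container` value as trusted when deciding where to inject markup. The durable fix is the version upgrade the report suggests, but the underlying rule is that a value expected to be a plain CSS selector must never carry markup. A hypothetical validation helper sketching that rule (illustration only, not Bootstrap's actual patch):

```python
import re

def safe_container_value(value):
    """Accept a data-container value only if it looks like a bare selector.

    Hypothetical helper for illustration: "body" or "#app" pass, while
    anything containing angle brackets or quotes (the raw material of an
    injected tag or attribute breakout) is rejected instead of being
    handed onward to the DOM.
    """
    if re.search(r'[<>"\']', value):
        raise ValueError("markup not allowed in data-container")
    return value
```

Usage: `safe_container_value("body")` returns the selector unchanged, while a payload such as `<img src=x onerror=alert(1)>` raises `ValueError` before it can reach an HTML sink.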
407,863
27,634,666,373
IssuesEvent
2023-03-10 13:35:26
larszi/marax-pressure-mod
https://api.github.com/repos/larszi/marax-pressure-mod
closed
Question on MaraX MCU pinout
documentation question
Hello Lars, I'm really impressed by your work and I will follow your project with interest :-) I would appreciate if you could add more detailed information on hardware/wiring. In particular I'm interested in the pinout of MaraX' MCU. Today I did some measurements with a multimeter, but the only thing that I can confirm is that pin 4 is TX and pin 6 is VCC (around 4.4 Volts) (pin 1 is the nearest to the case edge). In your schematics I've seen that you have a connection to the GND pin, so can you please tell me which pin is it? Best regards! Daniel
1.0
Question on MaraX MCU pinout - Hello Lars, I'm really impressed by your work and I will follow your project with interest :-) I would appreciate if you could add more detailed information on hardware/wiring. In particular I'm interested in the pinout of MaraX' MCU. Today I did some measurements with a multimeter, but the only thing that I can confirm is that pin 4 is TX and pin 6 is VCC (around 4.4 Volts) (pin 1 is the nearest to the case edge). In your schematics I've seen that you have a connection to the GND pin, so can you please tell me which pin is it? Best regards! Daniel
non_test
question on marax mcu pinout hello lars i m really impressed by your work and i will follow your project with interest i would appreciate if you could add more detailed information on hardware wiring in particular i m interested in the pinout of marax mcu today i did some measurements with a multimeter but the only thing that i can confirm is that pin is tx and pin is vcc around volts pin is the nearest to the case edge in your schematics i ve seen that you have a connection to the gnd pin so can you please tell me which pin is it best regards daniel
0
103,369
8,902,496,904
IssuesEvent
2019-01-17 07:43:53
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
closed
Brave Action Item sizing
QA/Test-Plan-Specified QA/Yes priority/P2 release-notes/include
Icons change from 16px to 18px Notification bubble not cover lion eye Spec forthcoming in comment below... ## Test plan - Shields and Rewards icons look pixel perfect at new 18px size on 1x, 2x and 3x screens - Badge position is as indicated below, always anchored 12px from right edge of hover background on button. Same with 1, 2 or 3 character length strings. - Badge colors are as indicated below. - With 3 character length string in badge, badge extends right past edge of hover background of button
1.0
Brave Action Item sizing - Icons change from 16px to 18px Notification bubble not cover lion eye Spec forthcoming in comment below... ## Test plan - Shields and Rewards icons look pixel perfect at new 18px size on 1x, 2x and 3x screens - Badge position is as indicated below, always anchored 12px from right edge of hover background on button. Same with 1, 2 or 3 character length strings. - Badge colors are as indicated below. - With 3 character length string in badge, badge extends right past edge of hover background of button
test
brave action item sizing icons change from to notification bubble not cover lion eye spec forthcoming in comment below test plan shields and rewards icons look pixel perfect at new size on and screens badge position is as indicated below always anchored from right edge of hover background on button same with or character length strings badge colors are as indicated below with character length string in badge badge extends right past edge of hover background of button
1
71,870
7,266,957,707
IssuesEvent
2018-02-20 01:27:55
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
opened
[CI] RolloverIT#testRolloverWithDateMath failed at midnight
:Core/Index APIs >test jenkins
The rollover test with the date pattern happened and failed at midnight. ``` 1> [2018-02-19T11:59:59,017][INFO ][o.e.a.a.i.r.RolloverIT ] [testRolloverWithDateMath]: before test ... 1> [2018-02-19T12:00:00,546][INFO ][o.e.a.a.i.r.RolloverIT ] [RolloverIT#testRolloverWithDateMath]: cleaning up after test ``` The new index name was derived from the next day rather than today. ``` FAILURE 1.85s J2 | RolloverIT.testRolloverWithDateMath <<< FAILURES! > Throwable #1: java.lang.AssertionError: > Expected: "test-2018.02.19-000004" > but: was "test-2018.02.20-000004" > at __randomizedtesting.SeedInfo.seed([DE29C74DF54CE353:5FEB5F63B28659FB]:0) > at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20) > at org.elasticsearch.action.admin.indices.rollover.RolloverIT.testRolloverWithDateMath(RolloverIT.java:232) ``` CI: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+6.x+multijob-unix-compatibility/os=sles/695/console Log: [testRolloverWithDateMath.txt](https://github.com/elastic/elasticsearch/files/1738671/testRolloverWithDateMath.txt)
1.0
[CI] RolloverIT#testRolloverWithDateMath failed at midnight - The rollover test with the date pattern happened and failed at midnight. ``` 1> [2018-02-19T11:59:59,017][INFO ][o.e.a.a.i.r.RolloverIT ] [testRolloverWithDateMath]: before test ... 1> [2018-02-19T12:00:00,546][INFO ][o.e.a.a.i.r.RolloverIT ] [RolloverIT#testRolloverWithDateMath]: cleaning up after test ``` The new index name was derived from the next day rather than today. ``` FAILURE 1.85s J2 | RolloverIT.testRolloverWithDateMath <<< FAILURES! > Throwable #1: java.lang.AssertionError: > Expected: "test-2018.02.19-000004" > but: was "test-2018.02.20-000004" > at __randomizedtesting.SeedInfo.seed([DE29C74DF54CE353:5FEB5F63B28659FB]:0) > at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:20) > at org.elasticsearch.action.admin.indices.rollover.RolloverIT.testRolloverWithDateMath(RolloverIT.java:232) ``` CI: https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+6.x+multijob-unix-compatibility/os=sles/695/console Log: [testRolloverWithDateMath.txt](https://github.com/elastic/elasticsearch/files/1738671/testRolloverWithDateMath.txt)
test
rolloverit testrolloverwithdatemath failed at midnight the rollover test with the date pattern happened and failed at midnight before test cleaning up after test the new index name was derived from the next day rather than today failure rolloverit testrolloverwithdatemath failures throwable java lang assertionerror expected test but was test at randomizedtesting seedinfo seed at org hamcrest matcherassert assertthat matcherassert java at org elasticsearch action admin indices rollover rolloverit testrolloverwithdatemath rolloverit java ci log
1
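The failure above is a classic midnight race: the test computes the expected rollover name (`test-YYYY.MM.dd-000004`) from the clock at one instant, while the server evaluates the date-math pattern slightly later, so a run straddling midnight expects the 19th but receives the 20th. A hedged sketch of the usual hardening for this class of flake — sample the date both before and after the operation and accept the name derived from either day (hypothetical helper, not the actual RolloverIT code):

```python
from datetime import date

def rollover_name(day, counter=4):
    """Index name produced by the date-math pattern for a given day."""
    return day.strftime("test-%Y.%m.%d-") + "%06d" % counter

def acceptable_names(before, after, counter=4):
    """Every name a run straddling midnight may legitimately produce.

    Sample the clock before the rollover call (`before`) and after it
    (`after`); if the dates differ, both derived names are valid.
    """
    return {rollover_name(d, counter) for d in (before, after)}

# The failing run effectively saw 2018-02-19 before the call and
# 2018-02-20 after it, so both names should have been accepted:
names = acceptable_names(date(2018, 2, 19), date(2018, 2, 20))
```

With this shape the assertion becomes membership in `names` rather than equality against a single precomputed string, which is immune to the midnight boundary.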
147,098
19,500,308,845
IssuesEvent
2021-12-28 01:08:50
TreyM-WSS/terra-clinical
https://api.github.com/repos/TreyM-WSS/terra-clinical
opened
CVE-2021-3801 (Medium) detected in prismjs-1.17.1.tgz, prismjs-1.20.0.tgz
security vulnerability
## CVE-2021-3801 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>prismjs-1.17.1.tgz</b>, <b>prismjs-1.20.0.tgz</b></p></summary> <p> <details><summary><b>prismjs-1.17.1.tgz</b></p></summary> <p>Lightweight, robust, elegant syntax highlighting. A spin-off project from Dabblet.</p> <p>Library home page: <a href="https://registry.npmjs.org/prismjs/-/prismjs-1.17.1.tgz">https://registry.npmjs.org/prismjs/-/prismjs-1.17.1.tgz</a></p> <p>Path to dependency file: /packages/terra-clinical-item-collection/package.json</p> <p>Path to vulnerable library: /packages/terra-clinical-item-collection/node_modules/terra-markdown/node_modules/prismjs/package.json,/packages/terra-clinical-item-collection/node_modules/terra-markdown/node_modules/prismjs/package.json,/packages/terra-clinical-item-collection/node_modules/terra-markdown/node_modules/prismjs/package.json</p> <p> Dependency Hierarchy: - terra-responsive-element-4.8.0.tgz (Root Library) - terra-doc-template-2.24.0.tgz - react-syntax-highlighter-11.0.2.tgz - refractor-2.10.1.tgz - :x: **prismjs-1.17.1.tgz** (Vulnerable Library) </details> <details><summary><b>prismjs-1.20.0.tgz</b></p></summary> <p>Lightweight, robust, elegant syntax highlighting. 
A spin-off project from Dabblet.</p> <p>Library home page: <a href="https://registry.npmjs.org/prismjs/-/prismjs-1.20.0.tgz">https://registry.npmjs.org/prismjs/-/prismjs-1.20.0.tgz</a></p> <p>Path to dependency file: /packages/terra-clinical-item-collection/package.json</p> <p>Path to vulnerable library: /packages/terra-clinical-item-collection/node_modules/prismjs/package.json,/packages/terra-clinical-item-collection/node_modules/prismjs/package.json</p> <p> Dependency Hierarchy: - terra-responsive-element-4.8.0.tgz (Root Library) - terra-doc-template-2.24.0.tgz - react-syntax-highlighter-11.0.2.tgz - :x: **prismjs-1.20.0.tgz** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> prism is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3801>CVE-2021-3801</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-3801">https://nvd.nist.gov/vuln/detail/CVE-2021-3801</a></p> <p>Release Date: 2021-09-15</p> <p>Fix Resolution (prismjs): 1.25.0</p> <p>Direct dependency fix Resolution (terra-responsive-element): 5.16.0</p><p>Fix Resolution (prismjs): 1.25.0</p> <p>Direct dependency fix Resolution (terra-responsive-element): 5.16.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"terra-responsive-element","packageVersion":"4.8.0","packageFilePaths":["/packages/terra-clinical-item-collection/package.json"],"isTransitiveDependency":false,"dependencyTree":"terra-responsive-element:4.8.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.16.0","isBinary":false},{"packageType":"javascript/Node.js","packageName":"terra-responsive-element","packageVersion":"4.8.0","packageFilePaths":["/packages/terra-clinical-item-collection/package.json"],"isTransitiveDependency":false,"dependencyTree":"terra-responsive-element:4.8.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.16.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-3801","vulnerabilityDetails":"prism is vulnerable to Inefficient Regular Expression Complexity","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3801","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-3801 (Medium) detected in prismjs-1.17.1.tgz, prismjs-1.20.0.tgz - ## CVE-2021-3801 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>prismjs-1.17.1.tgz</b>, <b>prismjs-1.20.0.tgz</b></p></summary> <p> <details><summary><b>prismjs-1.17.1.tgz</b></p></summary> <p>Lightweight, robust, elegant syntax highlighting. A spin-off project from Dabblet.</p> <p>Library home page: <a href="https://registry.npmjs.org/prismjs/-/prismjs-1.17.1.tgz">https://registry.npmjs.org/prismjs/-/prismjs-1.17.1.tgz</a></p> <p>Path to dependency file: /packages/terra-clinical-item-collection/package.json</p> <p>Path to vulnerable library: /packages/terra-clinical-item-collection/node_modules/terra-markdown/node_modules/prismjs/package.json,/packages/terra-clinical-item-collection/node_modules/terra-markdown/node_modules/prismjs/package.json,/packages/terra-clinical-item-collection/node_modules/terra-markdown/node_modules/prismjs/package.json</p> <p> Dependency Hierarchy: - terra-responsive-element-4.8.0.tgz (Root Library) - terra-doc-template-2.24.0.tgz - react-syntax-highlighter-11.0.2.tgz - refractor-2.10.1.tgz - :x: **prismjs-1.17.1.tgz** (Vulnerable Library) </details> <details><summary><b>prismjs-1.20.0.tgz</b></p></summary> <p>Lightweight, robust, elegant syntax highlighting. 
A spin-off project from Dabblet.</p> <p>Library home page: <a href="https://registry.npmjs.org/prismjs/-/prismjs-1.20.0.tgz">https://registry.npmjs.org/prismjs/-/prismjs-1.20.0.tgz</a></p> <p>Path to dependency file: /packages/terra-clinical-item-collection/package.json</p> <p>Path to vulnerable library: /packages/terra-clinical-item-collection/node_modules/prismjs/package.json,/packages/terra-clinical-item-collection/node_modules/prismjs/package.json</p> <p> Dependency Hierarchy: - terra-responsive-element-4.8.0.tgz (Root Library) - terra-doc-template-2.24.0.tgz - react-syntax-highlighter-11.0.2.tgz - :x: **prismjs-1.20.0.tgz** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> prism is vulnerable to Inefficient Regular Expression Complexity <p>Publish Date: 2021-09-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3801>CVE-2021-3801</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-3801">https://nvd.nist.gov/vuln/detail/CVE-2021-3801</a></p> <p>Release Date: 2021-09-15</p> <p>Fix Resolution (prismjs): 1.25.0</p> <p>Direct dependency fix Resolution (terra-responsive-element): 5.16.0</p><p>Fix Resolution (prismjs): 1.25.0</p> <p>Direct dependency fix Resolution (terra-responsive-element): 5.16.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"terra-responsive-element","packageVersion":"4.8.0","packageFilePaths":["/packages/terra-clinical-item-collection/package.json"],"isTransitiveDependency":false,"dependencyTree":"terra-responsive-element:4.8.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.16.0","isBinary":false},{"packageType":"javascript/Node.js","packageName":"terra-responsive-element","packageVersion":"4.8.0","packageFilePaths":["/packages/terra-clinical-item-collection/package.json"],"isTransitiveDependency":false,"dependencyTree":"terra-responsive-element:4.8.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"5.16.0","isBinary":false}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2021-3801","vulnerabilityDetails":"prism is vulnerable to Inefficient Regular Expression Complexity","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3801","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"Required","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_test
cve medium detected in prismjs tgz prismjs tgz cve medium severity vulnerability vulnerable libraries prismjs tgz prismjs tgz prismjs tgz lightweight robust elegant syntax highlighting a spin off project from dabblet library home page a href path to dependency file packages terra clinical item collection package json path to vulnerable library packages terra clinical item collection node modules terra markdown node modules prismjs package json packages terra clinical item collection node modules terra markdown node modules prismjs package json packages terra clinical item collection node modules terra markdown node modules prismjs package json dependency hierarchy terra responsive element tgz root library terra doc template tgz react syntax highlighter tgz refractor tgz x prismjs tgz vulnerable library prismjs tgz lightweight robust elegant syntax highlighting a spin off project from dabblet library home page a href path to dependency file packages terra clinical item collection package json path to vulnerable library packages terra clinical item collection node modules prismjs package json packages terra clinical item collection node modules prismjs package json dependency hierarchy terra responsive element tgz root library terra doc template tgz react syntax highlighter tgz x prismjs tgz vulnerable library vulnerability details prism is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution prismjs direct dependency fix resolution terra responsive element fix resolution prismjs direct dependency fix resolution terra responsive element isopenpronvulnerability true 
ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree terra responsive element isminimumfixversionavailable true minimumfixversion isbinary false packagetype javascript node js packagename terra responsive element packageversion packagefilepaths istransitivedependency false dependencytree terra responsive element isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails prism is vulnerable to inefficient regular expression complexity vulnerabilityurl
0
270,897
23,546,187,592
IssuesEvent
2022-08-21 06:19:28
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
roachtest: autoupgrade failed
C-test-failure O-robot O-roachtest branch-master release-blocker
roachtest.autoupgrade [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6174190?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6174190?buildTab=artifacts#/autoupgrade) on master @ [e6a7dc2f8ee39549e186bd05626c4c375b76fd04](https://github.com/cockroachdb/cockroach/commits/e6a7dc2f8ee39549e186bd05626c4c375b76fd04): ``` | 3 true 43 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 41 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 38 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 35 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 32 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 29 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 27 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 26 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 24 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 23 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 21 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 19 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 18 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 15 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 13 true decommissioning false | id is_live replicas is_decommissioning membership 
is_draining | 3 true 12 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 7 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 6 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 5 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 4 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 3 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 2 true decommissioning false Wraps: (4) COMMAND_PROBLEM Wraps: (5) Node 3. Command with error: | `````` | ./cockroach node decommission 3 --insecure --port={pgport:3} | `````` Wraps: (6) exit status 1 Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) errors.Cmd (5) *hintdetail.withDetail (6) *exec.ExitError ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/kv-triage <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*autoupgrade.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
2.0
roachtest: autoupgrade failed - roachtest.autoupgrade [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6174190?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6174190?buildTab=artifacts#/autoupgrade) on master @ [e6a7dc2f8ee39549e186bd05626c4c375b76fd04](https://github.com/cockroachdb/cockroach/commits/e6a7dc2f8ee39549e186bd05626c4c375b76fd04): ``` | 3 true 43 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 41 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 38 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 35 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 32 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 29 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 27 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 26 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 24 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 23 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 21 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 19 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 18 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 15 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 13 true decommissioning false | id is_live replicas 
is_decommissioning membership is_draining | 3 true 12 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 7 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 6 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 5 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 4 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 3 true decommissioning false | id is_live replicas is_decommissioning membership is_draining | 3 true 2 true decommissioning false Wraps: (4) COMMAND_PROBLEM Wraps: (5) Node 3. Command with error: | `````` | ./cockroach node decommission 3 --insecure --port={pgport:3} | `````` Wraps: (6) exit status 1 Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) errors.Cmd (5) *hintdetail.withDetail (6) *exec.ExitError ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/kv-triage <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*autoupgrade.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
test
roachtest autoupgrade failed roachtest autoupgrade with on master true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas is decommissioning membership is draining true true decommissioning false id is live replicas 
is decommissioning membership is draining true true decommissioning false wraps command problem wraps node command with error cockroach node decommission insecure port pgport wraps exit status error types withstack withstack errutil withprefix cluster withcommanddetails errors cmd hintdetail withdetail exec exiterror parameters roachtest cloud gce roachtest cpu roachtest ssd help see see cc cockroachdb kv triage
1
63,770
17,933,120,587
IssuesEvent
2021-09-10 12:06:09
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
Unnecessary builded twice Member in Client Cluster Service
Type: Defect Team: Client Module: Cluster Not Release Notes content
build() call in https://github.com/hazelcast/hazelcast/blob/fb720d9e7f4b7643c31fdb61e9b83b6ca8ecc552/hazelcast/src/main/java/com/hazelcast/client/impl/spi/impl/ClientClusterServiceImpl.java#L289 seems like unnecessary and should be deleted.
1.0
Unnecessary builded twice Member in Client Cluster Service - build() call in https://github.com/hazelcast/hazelcast/blob/fb720d9e7f4b7643c31fdb61e9b83b6ca8ecc552/hazelcast/src/main/java/com/hazelcast/client/impl/spi/impl/ClientClusterServiceImpl.java#L289 seems like unnecessary and should be deleted.
non_test
unnecessary builded twice member in client cluster service build call in seems like unnecessary and should be deleted
0
606,567
18,764,889,988
IssuesEvent
2021-11-05 21:44:51
cyntaria/UniPal-Backend
https://api.github.com/repos/cyntaria/UniPal-Backend
opened
[PATCH] A Teacher Review
Status: Pending Priority: Medium user story Type: Feature
### Summary As a `student`, I should be able to **update the details of a teacher review**, so that I can **fix old/inconsistent entries**. ### Acceptance Criteria **GIVEN** a `student` is *editing a teacher review* in the app **WHEN** the app hits the `/teacher-reviews/:id` endpoint with a valid PATCH request, containing:- The path parameter: - `:id`, the unique id of the entity of which the details are edited. And any of the following body parameters: - learning - grading - attendance - strictness - toughness - overall_rating - comment - reviewed_at **THEN** the app should receive a status `200` **AND** in the response, the following information should be returned: - header message indicating update operation success - rows matched - rows changed Sample Request/Sample Response ``` headers: { error: 0, message: "The specified item was updated successfully" } body: { rows_matched: 1, rows_changed: 1, info: "..." } ``` ### Resources - Development URL: {Here goes a URL to the feature on development API} - Production URL: {Here goes a URL to the feature on production API} ### Dev Notes {Some complementary notes if necessary} ### Testing Notes #### Scenario 1: PATCH request is successful 1. Update a teacher with a **PATCH** request to `/teacher-reviews/:id` endpoint 2. A subsequent **GET** request to `/teacher-reviews/:id` endpoint should return a status code `200` 3. And the teacher details with the updated information i.e. matching the initially sent body. 4. Resubmit a **PATCH** request to `/teacher-reviews/:id` endpoint to reverse the change and ensure status code `200` is returned. #### Scenario 2: PATCH request is unsuccessful due to unknown review_id 1. Update a teacher with a **PATCH** request to `/teacher-reviews/:id` endpoint containing a non-existent `review_id`. 2. Ensure a `404` status code is returned. 3. And the response headers' `code` parameter should contain "**_NotFoundException_**". #### Scenario 3: PATCH request is incorrect 1. 
Send a **PATCH** request to `/teacher-reviews/:id` endpoint with an incorrect key name in the body 2. Ensure a `422` status code is returned 3. And the response headers' `code` parameter should contain "**_InvalidPropertiesException_**". 4. And the response headers' `data` parameter should contain the name of the invalid parameter. #### Scenario 4: PATCH request is forbidden 1. Make a **PATCH** request to `/teacher-reviews/:id` endpoint with reviewed_by_erp != erp in student account token. 2. Ensure the endpoint returns a `403` forbidden status id. 3. And the response headers' `id` parameter should contain "**_ForbiddenException_**" #### Scenario 5: PATCH request is unauthorized 1. Send a **PATCH** request to `/teacher-reviews/:id` endpoint without an **authorization token** 2. Ensure a `401` unauthorized status code is returned. 3. And the response headers' `code` parameter should contain "**_TokenMissingException_**"
1.0
[PATCH] A Teacher Review - ### Summary As a `student`, I should be able to **update the details of a teacher review**, so that I can **fix old/inconsistent entries**. ### Acceptance Criteria **GIVEN** a `student` is *editing a teacher review* in the app **WHEN** the app hits the `/teacher-reviews/:id` endpoint with a valid PATCH request, containing:- The path parameter: - `:id`, the unique id of the entity of which the details are edited. And any of the following body parameters: - learning - grading - attendance - strictness - toughness - overall_rating - comment - reviewed_at **THEN** the app should receive a status `200` **AND** in the response, the following information should be returned: - header message indicating update operation success - rows matched - rows changed Sample Request/Sample Response ``` headers: { error: 0, message: "The specified item was updated successfully" } body: { rows_matched: 1, rows_changed: 1, info: "..." } ``` ### Resources - Development URL: {Here goes a URL to the feature on development API} - Production URL: {Here goes a URL to the feature on production API} ### Dev Notes {Some complementary notes if necessary} ### Testing Notes #### Scenario 1: PATCH request is successful 1. Update a teacher with a **PATCH** request to `/teacher-reviews/:id` endpoint 2. A subsequent **GET** request to `/teacher-reviews/:id` endpoint should return a status code `200` 3. And the teacher details with the updated information i.e. matching the initially sent body. 4. Resubmit a **PATCH** request to `/teacher-reviews/:id` endpoint to reverse the change and ensure status code `200` is returned. #### Scenario 2: PATCH request is unsuccessful due to unknown review_id 1. Update a teacher with a **PATCH** request to `/teacher-reviews/:id` endpoint containing a non-existent `review_id`. 2. Ensure a `404` status code is returned. 3. And the response headers' `code` parameter should contain "**_NotFoundException_**". 
#### Scenario 3: PATCH request is incorrect 1. Send a **PATCH** request to `/teacher-reviews/:id` endpoint with an incorrect key name in the body 2. Ensure a `422` status code is returned 3. And the response headers' `code` parameter should contain "**_InvalidPropertiesException_**". 4. And the response headers' `data` parameter should contain the name of the invalid parameter. #### Scenario 4: PATCH request is forbidden 1. Make a **PATCH** request to `/teacher-reviews/:id` endpoint with reviewed_by_erp != erp in student account token. 2. Ensure the endpoint returns a `403` forbidden status id. 3. And the response headers' `id` parameter should contain "**_ForbiddenException_**" #### Scenario 5: PATCH request is unauthorized 1. Send a **PATCH** request to `/teacher-reviews/:id` endpoint without an **authorization token** 2. Ensure a `401` unauthorized status code is returned. 3. And the response headers' `code` parameter should contain "**_TokenMissingException_**"
non_test
a teacher review summary as a student i should be able to update the details of a teacher review so that i can fix old inconsistent entries acceptance criteria given a student is editing a teacher review in the app when the app hits the teacher reviews id endpoint with a valid patch request containing the path parameter id the unique id of the entity of which the details are edited and any of the following body parameters learning grading attendance strictness toughness overall rating comment reviewed at then the app should receive a status and in the response the following information should be returned header message indicating update operation success rows matched rows changed sample request sample response headers error message the specified item was updated successfully body rows matched rows changed info resources development url here goes a url to the feature on development api production url here goes a url to the feature on production api dev notes some complementary notes if necessary testing notes scenario patch request is successful update a teacher with a patch request to teacher reviews id endpoint a subsequent get request to teacher reviews id endpoint should return a status code and the teacher details with the updated information i e matching the initially sent body resubmit a patch request to teacher reviews id endpoint to reverse the change and ensure status code is returned scenario patch request is unsuccessful due to unknown review id update a teacher with a patch request to teacher reviews id endpoint containing a non existent review id ensure a status code is returned and the response headers code parameter should contain notfoundexception scenario patch request is incorrect send a patch request to teacher reviews id endpoint with an incorrect key name in the body ensure a status code is returned and the response headers code parameter should contain invalidpropertiesexception and the response headers data parameter should contain the name 
of the invalid parameter scenario patch request is forbidden make a patch request to teacher reviews id endpoint with reviewed by erp erp in student account token ensure the endpoint returns a forbidden status id and the response headers id parameter should contain forbiddenexception scenario patch request is unauthorized send a patch request to teacher reviews id endpoint without an authorization token ensure a unauthorized status code is returned and the response headers code parameter should contain tokenmissingexception
0
62,775
6,812,940,038
IssuesEvent
2017-11-06 06:49:01
owncloud/core
https://api.github.com/repos/owncloud/core
closed
UI tests - waitTillPageIsLoaded does not work correctly for trashbin
bug dev:acceptance-tests
### Steps to reproduce 1. run trashbin UI tests 2. watch selenium output during check if files are in trashbin ### Expected behaviour test should check if the files are listed in the trashbin after the page is loaded ### Actual behaviour the tests cycles in `waitTillPageIsLoaded()` till end of timeout even the page get loaded correctly in selenium output log many rounds of this is shown: ``` 10:19:19.785 INFO - Executing: [find elements: By.xpath: //html/.//*[./@id = 'fileList']]) 10:19:19.792 INFO - Done: [find elements: By.xpath: //html/.//*[./@id = 'fileList']] 10:19:19.793 INFO - Executing: [find elements: By.xpath: (//html/.//*[./@id = 'fileList'])[1]//a]) 10:19:19.799 INFO - Done: [find elements: By.xpath: (//html/.//*[./@id = 'fileList'])[1]//a] 10:19:19.801 INFO - Executing: [find elements: By.xpath: //html/.//*[@id='emptycontent']]) 10:19:19.812 INFO - Done: [find elements: By.xpath: //html/.//*[@id='emptycontent']] 10:19:19.814 INFO - Executing: [find element: By.xpath: (//html/.//*[@id='emptycontent'])[1]]) 10:19:19.820 INFO - Done: [find element: By.xpath: (//html/.//*[@id='emptycontent'])[1]] 10:19:19.821 INFO - Executing: [execute script: return arguments[0].getAttribute("class"), [[[ChromeDriver: chrome on LINUX (0ea2f3d5a4207f369db8d31e6ed6a80e)] -> xpath: //html/.//*[@id='emptycontent']]]]) ``` ### Server configuration **Operating system**: Debian **Web server:** Apache **Database:** SQLite **PHP version:** 7.1 **ownCloud version:** (see ownCloud admin page) 10.0.3 **Updated from an older ownCloud or fresh install:** fresh **Where did you install ownCloud from:** git
1.0
UI tests - waitTillPageIsLoaded does not work correctly for trashbin - ### Steps to reproduce 1. run trashbin UI tests 2. watch selenium output during check if files are in trashbin ### Expected behaviour test should check if the files are listed in the trashbin after the page is loaded ### Actual behaviour the tests cycles in `waitTillPageIsLoaded()` till end of timeout even the page get loaded correctly in selenium output log many rounds of this is shown: ``` 10:19:19.785 INFO - Executing: [find elements: By.xpath: //html/.//*[./@id = 'fileList']]) 10:19:19.792 INFO - Done: [find elements: By.xpath: //html/.//*[./@id = 'fileList']] 10:19:19.793 INFO - Executing: [find elements: By.xpath: (//html/.//*[./@id = 'fileList'])[1]//a]) 10:19:19.799 INFO - Done: [find elements: By.xpath: (//html/.//*[./@id = 'fileList'])[1]//a] 10:19:19.801 INFO - Executing: [find elements: By.xpath: //html/.//*[@id='emptycontent']]) 10:19:19.812 INFO - Done: [find elements: By.xpath: //html/.//*[@id='emptycontent']] 10:19:19.814 INFO - Executing: [find element: By.xpath: (//html/.//*[@id='emptycontent'])[1]]) 10:19:19.820 INFO - Done: [find element: By.xpath: (//html/.//*[@id='emptycontent'])[1]] 10:19:19.821 INFO - Executing: [execute script: return arguments[0].getAttribute("class"), [[[ChromeDriver: chrome on LINUX (0ea2f3d5a4207f369db8d31e6ed6a80e)] -> xpath: //html/.//*[@id='emptycontent']]]]) ``` ### Server configuration **Operating system**: Debian **Web server:** Apache **Database:** SQLite **PHP version:** 7.1 **ownCloud version:** (see ownCloud admin page) 10.0.3 **Updated from an older ownCloud or fresh install:** fresh **Where did you install ownCloud from:** git
test
ui tests waittillpageisloaded does not work correctly for trashbin steps to reproduce run trashbin ui tests watch selenium output during check if files are in trashbin expected behaviour test should check if the files are listed in the trashbin after the page is loaded actual behaviour the tests cycles in waittillpageisloaded till end of timeout even the page get loaded correctly in selenium output log many rounds of this is shown info executing info done info executing a info done a info executing info done info executing info done info executing getattribute class xpath html server configuration operating system debian web server apache database sqlite php version owncloud version see owncloud admin page updated from an older owncloud or fresh install fresh where did you install owncloud from git
1
54,586
13,912,483,354
IssuesEvent
2020-10-20 18:56:33
jgeraigery/LocalCatalogManager
https://api.github.com/repos/jgeraigery/LocalCatalogManager
closed
CVE-2016-3083 (High) detected in hive-service-1.2.1.jar, hive-jdbc-1.2.1.jar - autoclosed
security vulnerability
## CVE-2016-3083 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>hive-service-1.2.1.jar</b>, <b>hive-jdbc-1.2.1.jar</b></p></summary> <p> <details><summary><b>hive-service-1.2.1.jar</b></p></summary> <p>The Apache Software Foundation provides support for the Apache community of open-source software projects. The Apache projects are characterized by a collaborative, consensus based development process, an open and pragmatic software license, and a desire to create high quality software that leads the way in its field. We consider ourselves not simply a group of projects sharing a server, but rather a community of developers and users.</p> <p>Path to dependency file: LocalCatalogManager/lcm-packaging/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hive/hive-service/1.2.1/hive-service-1.2.1.jar,/home/wss-scanner/.m2/repository/org/apache/hive/hive-service/1.2.1/hive-service-1.2.1.jar</p> <p> Dependency Hierarchy: - hive-jdbc-1.2.1.jar (Root Library) - :x: **hive-service-1.2.1.jar** (Vulnerable Library) </details> <details><summary><b>hive-jdbc-1.2.1.jar</b></p></summary> <p>The Apache Software Foundation provides support for the Apache community of open-source software projects. The Apache projects are characterized by a collaborative, consensus based development process, an open and pragmatic software license, and a desire to create high quality software that leads the way in its field. 
We consider ourselves not simply a group of projects sharing a server, but rather a community of developers and users.</p> <p>Path to dependency file: LocalCatalogManager/lcm-server/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/org/apache/hive/hive-jdbc/1.2.1/hive-jdbc-1.2.1.jar,/home/wss-scanner/.m2/repository/org/apache/hive/hive-jdbc/1.2.1/hive-jdbc-1.2.1.jar</p> <p> Dependency Hierarchy: - :x: **hive-jdbc-1.2.1.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/jgeraigery/LocalCatalogManager/commit/b8c24e199f2d440dea3ce3cc2c66ada102d5d922">b8c24e199f2d440dea3ce3cc2c66ada102d5d922</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache Hive (JDBC + HiveServer2) implements SSL for plain TCP and HTTP connections (it supports both transport modes). While validating the server's certificate during the connection setup, the client in Apache Hive before 1.2.2 and 2.0.x before 2.0.1 doesn't seem to be verifying the common name attribute of the certificate. In this way, if a JDBC client sends an SSL request to server abc.com, and the server responds with a valid certificate (certified by CA) but issued to xyz.com, the client will accept that as a valid certificate and the SSL handshake will go through. 
<p>Publish Date: 2017-05-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-3083>CVE-2016-3083</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-3083">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-3083</a></p> <p>Release Date: 2017-05-30</p> <p>Fix Resolution: org.apache.hive:hive-service:1.2.2,2.0.1,org.apache.hive:hive-jdbc:1.2.2,2.0.1</p> </p> </details> <p></p> <!-- 
<REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.hive","packageName":"hive-service","packageVersion":"1.2.1","isTransitiveDependency":true,"dependencyTree":"org.apache.hive:hive-jdbc:1.2.1;org.apache.hive:hive-service:1.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.hive:hive-service:1.2.2,2.0.1,org.apache.hive:hive-jdbc:1.2.2,2.0.1"},{"packageType":"Java","groupId":"org.apache.hive","packageName":"hive-jdbc","packageVersion":"1.2.1","isTransitiveDependency":false,"dependencyTree":"org.apache.hive:hive-jdbc:1.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.hive:hive-service:1.2.2,2.0.1,org.apache.hive:hive-jdbc:1.2.2,2.0.1"}],"vulnerabilityIdentifier":"CVE-2016-3083","vulnerabilityDetails":"Apache Hive (JDBC + HiveServer2) implements SSL for plain TCP and HTTP connections (it supports both transport modes). While validating the server\u0027s certificate during the connection setup, the client in Apache Hive before 1.2.2 and 2.0.x before 2.0.1 doesn\u0027t seem to be verifying the common name attribute of the certificate. In this way, if a JDBC client sends an SSL request to server abc.com, and the server responds with a valid certificate (certified by CA) but issued to xyz.com, the client will accept that as a valid certificate and the SSL handshake will go through.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-3083","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2016-3083 (High) detected in hive-service-1.2.1.jar, hive-jdbc-1.2.1.jar - autoclosed - ## CVE-2016-3083 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>hive-service-1.2.1.jar</b>, <b>hive-jdbc-1.2.1.jar</b></p></summary> <p> <details><summary><b>hive-service-1.2.1.jar</b></p></summary> <p>The Apache Software Foundation provides support for the Apache community of open-source software projects. The Apache projects are characterized by a collaborative, consensus based development process, an open and pragmatic software license, and a desire to create high quality software that leads the way in its field. We consider ourselves not simply a group of projects sharing a server, but rather a community of developers and users.</p> <p>Path to dependency file: LocalCatalogManager/lcm-packaging/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hive/hive-service/1.2.1/hive-service-1.2.1.jar,/home/wss-scanner/.m2/repository/org/apache/hive/hive-service/1.2.1/hive-service-1.2.1.jar</p> <p> Dependency Hierarchy: - hive-jdbc-1.2.1.jar (Root Library) - :x: **hive-service-1.2.1.jar** (Vulnerable Library) </details> <details><summary><b>hive-jdbc-1.2.1.jar</b></p></summary> <p>The Apache Software Foundation provides support for the Apache community of open-source software projects. The Apache projects are characterized by a collaborative, consensus based development process, an open and pragmatic software license, and a desire to create high quality software that leads the way in its field. 
We consider ourselves not simply a group of projects sharing a server, but rather a community of developers and users.</p> <p>Path to dependency file: LocalCatalogManager/lcm-server/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/org/apache/hive/hive-jdbc/1.2.1/hive-jdbc-1.2.1.jar,/home/wss-scanner/.m2/repository/org/apache/hive/hive-jdbc/1.2.1/hive-jdbc-1.2.1.jar</p> <p> Dependency Hierarchy: - :x: **hive-jdbc-1.2.1.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/jgeraigery/LocalCatalogManager/commit/b8c24e199f2d440dea3ce3cc2c66ada102d5d922">b8c24e199f2d440dea3ce3cc2c66ada102d5d922</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache Hive (JDBC + HiveServer2) implements SSL for plain TCP and HTTP connections (it supports both transport modes). While validating the server's certificate during the connection setup, the client in Apache Hive before 1.2.2 and 2.0.x before 2.0.1 doesn't seem to be verifying the common name attribute of the certificate. In this way, if a JDBC client sends an SSL request to server abc.com, and the server responds with a valid certificate (certified by CA) but issued to xyz.com, the client will accept that as a valid certificate and the SSL handshake will go through. 
<p>Publish Date: 2017-05-30 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-3083>CVE-2016-3083</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-3083">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-3083</a></p> <p>Release Date: 2017-05-30</p> <p>Fix Resolution: org.apache.hive:hive-service:1.2.2,2.0.1,org.apache.hive:hive-jdbc:1.2.2,2.0.1</p> </p> </details> <p></p> <!-- 
<REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.hive","packageName":"hive-service","packageVersion":"1.2.1","isTransitiveDependency":true,"dependencyTree":"org.apache.hive:hive-jdbc:1.2.1;org.apache.hive:hive-service:1.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.hive:hive-service:1.2.2,2.0.1,org.apache.hive:hive-jdbc:1.2.2,2.0.1"},{"packageType":"Java","groupId":"org.apache.hive","packageName":"hive-jdbc","packageVersion":"1.2.1","isTransitiveDependency":false,"dependencyTree":"org.apache.hive:hive-jdbc:1.2.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.hive:hive-service:1.2.2,2.0.1,org.apache.hive:hive-jdbc:1.2.2,2.0.1"}],"vulnerabilityIdentifier":"CVE-2016-3083","vulnerabilityDetails":"Apache Hive (JDBC + HiveServer2) implements SSL for plain TCP and HTTP connections (it supports both transport modes). While validating the server\u0027s certificate during the connection setup, the client in Apache Hive before 1.2.2 and 2.0.x before 2.0.1 doesn\u0027t seem to be verifying the common name attribute of the certificate. In this way, if a JDBC client sends an SSL request to server abc.com, and the server responds with a valid certificate (certified by CA) but issued to xyz.com, the client will accept that as a valid certificate and the SSL handshake will go through.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-3083","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_test
cve high detected in hive service jar hive jdbc jar autoclosed cve high severity vulnerability vulnerable libraries hive service jar hive jdbc jar hive service jar the apache software foundation provides support for the apache community of open source software projects the apache projects are characterized by a collaborative consensus based development process an open and pragmatic software license and a desire to create high quality software that leads the way in its field we consider ourselves not simply a group of projects sharing a server but rather a community of developers and users path to dependency file localcatalogmanager lcm packaging pom xml path to vulnerable library home wss scanner repository org apache hive hive service hive service jar home wss scanner repository org apache hive hive service hive service jar dependency hierarchy hive jdbc jar root library x hive service jar vulnerable library hive jdbc jar the apache software foundation provides support for the apache community of open source software projects the apache projects are characterized by a collaborative consensus based development process an open and pragmatic software license and a desire to create high quality software that leads the way in its field we consider ourselves not simply a group of projects sharing a server but rather a community of developers and users path to dependency file localcatalogmanager lcm server pom xml path to vulnerable library canner repository org apache hive hive jdbc hive jdbc jar home wss scanner repository org apache hive hive jdbc hive jdbc jar dependency hierarchy x hive jdbc jar vulnerable library found in head commit a href found in base branch master vulnerability details apache hive jdbc implements ssl for plain tcp and http connections it supports both transport modes while validating the server s certificate during the connection setup the client in apache hive before and x before doesn t seem to be verifying the common name attribute of the 
certificate in this way if a jdbc client sends an ssl request to server abc com and the server responds with a valid certificate certified by ca but issued to xyz com the client will accept that as a valid certificate and the ssl handshake will go through publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache hive hive service org apache hive hive jdbc isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails apache hive jdbc implements ssl for plain tcp and http connections it supports both transport modes while validating the server certificate during the connection setup the client in apache hive before and x before doesn seem to be verifying the common name attribute of the certificate in this way if a jdbc client sends an ssl request to server abc com and the server responds with a valid certificate certified by ca but issued to xyz com the client will accept that as a valid certificate and the ssl handshake will go through vulnerabilityurl
0
54,878
6,415,805,006
IssuesEvent
2017-08-08 13:36:23
tgstation/tgstation
https://api.github.com/repos/tgstation/tgstation
reopened
Parallax looks dumb on a high FPS
Needs Reproducing/Testing
Parallax seems to animate each time you move between tiles, not based on frame rate. This means that if you turn up your framerate then parallax appears to jump around instead of smoothly animating as you walk.
1.0
Parallax looks dumb on a high FPS - Parallax seems to animate each time you move between tiles, not based on frame rate. This means that if you turn up your framerate then parallax appears to jump around instead of smoothly animating as you walk.
test
parallax looks dumb on a high fps parallax seems to animate each time you move between tiles not based on frame rate this means that if you turn up your framerate then parallax appears to jump around instead of smoothly animating as you walk
1
316,769
27,182,787,714
IssuesEvent
2023-02-18 21:06:35
MicrosoftDocs/minecraft-creator
https://api.github.com/repos/MicrosoftDocs/minecraft-creator
closed
[Improvement] [GameTest] Load chunk function
GameTest/Scripting
**A function which takes an X,Z pair and loads the chunk at that location, generating it if required** Not sure how the internal of this would work, perhaps it could return a chunk class which can then be manually closed, otherwise it will keep ticking until the server stops. This could be represented in game as ticking areas but not editable by in game users.
1.0
[Improvement] [GameTest] Load chunk function - **A function which takes an X,Z pair and loads the chunk at that location, generating it if required** Not sure how the internal of this would work, perhaps it could return a chunk class which can then be manually closed, otherwise it will keep ticking until the server stops. This could be represented in game as ticking areas but not editable by in game users.
test
load chunk function a function which takes an x z pair and loads the chunk at that location generating it if required not sure how the internal of this would work perhaps it could return a chunk class which can then be manually closed otherwise it will keep ticking until the server stops this could be represented in game as ticking areas but not editable by in game users
1
325,551
27,884,852,915
IssuesEvent
2023-03-21 22:45:55
aodn/nrmn-application
https://api.github.com/repos/aodn/nrmn-application
closed
Survey Deletion feature
ready to test planned production
## Description - Add a button to delete a survey and all associated observations - Check if there are PQs catalogued against a survey, and if so do not delete - Ensure there is a clear warning before deleting - Record information about the deleted survey in the transcript of a Corrections job ## Backlog Items 1. https://github.com/aodn/backlog/issues/4130 2. https://github.com/aodn/backlog/issues/4129 3. https://github.com/aodn/backlog/issues/4134
1.0
Survey Deletion feature - ## Description - Add a button to delete a survey and all associated observations - Check if there are PQs catalogued against a survey, and if so do not delete - Ensure there is a clear warning before deleting - Record information about the deleted survey in the transcript of a Corrections job ## Backlog Items 1. https://github.com/aodn/backlog/issues/4130 2. https://github.com/aodn/backlog/issues/4129 3. https://github.com/aodn/backlog/issues/4134
test
survey deletion feature description add a button to delete a survey and all associated observations check if there are pqs catalogued against a survey and if so do not delete ensure there is a clear warning before deleting record information about the deleted survey in the transcript of a corrections job backlog items
1
352,301
32,058,260,317
IssuesEvent
2023-09-24 10:55:33
photoprism/photoprism
https://api.github.com/repos/photoprism/photoprism
closed
Setup: Add installation instructions for Portainer
help wanted docs 📚 tested
When I use the directions for the portainer/synology docker install the MariaDB gets created but the PhotoPrism app does not get created and there is no error. Are these directions still accurate? Or maybe the docker-compose file is out of date. https://docs.photoprism.org/getting-started/portainer/synology/ If you could provide an example of your Synology based Portainer compose files with the paths that may help as well.
1.0
Setup: Add installation instructions for Portainer - When I use the directions for the portainer/synology docker install the MariaDB gets created but the PhotoPrism app does not get created and there is no error. Are these directions still accurate? Or maybe the docker-compose file is out of date. https://docs.photoprism.org/getting-started/portainer/synology/ If you could provide an example of your Synology based Portainer compose files with the paths that may help as well.
test
setup add installation instructions for portainer when i use the directions for the portainer synology docker install the mariadb gets created but the photoprism app does not get created and there is no error are these directions still accurate or maybe the docker compose file is out of date if you could provide an example of your synology based portainer compose files with the paths that may help as well
1
1,747
2,571,241,232
IssuesEvent
2015-02-10 15:28:47
photonstorm/phaser
https://api.github.com/repos/photonstorm/phaser
closed
Rotation prop missing on objects loaded from object layer in a tilemap.
please test
Hi, Thanks for the awesome framework. I think I've covered the issue pretty well in [this thread](http://www.html5gamedevs.com/topic/10897-rotation-prop-missing-on-objects-loaded-from-object-layer-in-a-tilemap/). Regards, Nikolay Tsenkov
1.0
Rotation prop missing on objects loaded from object layer in a tilemap. - Hi, Thanks for the awesome framework. I think I've covered the issue pretty well in [this thread](http://www.html5gamedevs.com/topic/10897-rotation-prop-missing-on-objects-loaded-from-object-layer-in-a-tilemap/). Regards, Nikolay Tsenkov
test
rotation prop missing on objects loaded from object layer in a tilemap hi thanks for the awesome framework i think i ve covered the issue pretty well in regards nikolay tsenkov
1
107,077
13,432,537,673
IssuesEvent
2020-09-07 08:34:58
JoscaWij/Spacey
https://api.github.com/repos/JoscaWij/Spacey
opened
Platform Graphic
Design User Story
# User Story As a player I want the platforms to match the space aesthetic of the game # Description - scalable SVG of an asteroid in comic style # Material First Design: ![image](https://user-images.githubusercontent.com/68224111/92366550-9c68f780-f0f5-11ea-98f8-2ecea9a533db.png) # To do's: - [ ] create SVG in Photoshop
1.0
Platform Graphic - # User Story As a player I want the platforms to match the space aesthetic of the game # Description - scalable SVG of an asteroid in comic style # Material First Design: ![image](https://user-images.githubusercontent.com/68224111/92366550-9c68f780-f0f5-11ea-98f8-2ecea9a533db.png) # To do's: - [ ] create SVG in Photoshop
non_test
platform graphic user story as a player i want the platforms to match the space aesthetic of the game description scalable svg of an asteroid in comic style material first design to do s create svg in photoshop
0
361,055
10,703,409,953
IssuesEvent
2019-10-24 09:30:18
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
www.youtube.com - see bug description
browser-firefox engine-gecko priority-critical
<!-- @browser: Firefox 69.0 --> <!-- @ua_header: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:69.0) Gecko/20100101 Firefox/69.0 --> <!-- @reported_with: addon-reporter-firefox --> **URL**: https://www.youtube.com/channel/UCgfO009hB_847kwL6NQnn-w/community?lb=UgzMsTvUY2W4b-cQIql4AaABCQ **Browser / Version**: Firefox 69.0 **Operating System**: Ubuntu **Tested Another Browser**: Unknown **Problem type**: Something else **Description**: When open youtube this channel appears **Steps to Reproduce**: when open youtube this channel appears and looping with this channel if any search occured. [![Screenshot Description](https://webcompat.com/uploads/2019/10/3c1f54e1-b804-4d08-924d-4b15f7f56edf-thumb.jpg)](https://webcompat.com/uploads/2019/10/3c1f54e1-b804-4d08-924d-4b15f7f56edf.jpg) <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
www.youtube.com - see bug description - <!-- @browser: Firefox 69.0 --> <!-- @ua_header: Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:69.0) Gecko/20100101 Firefox/69.0 --> <!-- @reported_with: addon-reporter-firefox --> **URL**: https://www.youtube.com/channel/UCgfO009hB_847kwL6NQnn-w/community?lb=UgzMsTvUY2W4b-cQIql4AaABCQ **Browser / Version**: Firefox 69.0 **Operating System**: Ubuntu **Tested Another Browser**: Unknown **Problem type**: Something else **Description**: When open youtube this channel appears **Steps to Reproduce**: when open youtube this channel appears and looping with this channel if any search occured. [![Screenshot Description](https://webcompat.com/uploads/2019/10/3c1f54e1-b804-4d08-924d-4b15f7f56edf-thumb.jpg)](https://webcompat.com/uploads/2019/10/3c1f54e1-b804-4d08-924d-4b15f7f56edf.jpg) <details> <summary>Browser Configuration</summary> <ul> <li>None</li> </ul> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_test
see bug description url browser version firefox operating system ubuntu tested another browser unknown problem type something else description when open youtube this channel appears steps to reproduce when open youtube this channel appears and looping with this channel if any search occured browser configuration none from with ❤️
0
5,667
5,872,295,272
IssuesEvent
2017-05-15 11:04:14
hzi-braunschweig/SORMAS-Open
https://api.github.com/repos/hzi-braunschweig/SORMAS-Open
opened
Users of the mobile app login with username and password once
Infrastructure sormas-app
- [ ] create login screen that allows to enter username and password - [ ] credentials are stored in the android keystore - [ ] In order to assure the safety of stored user credentials android device encryption has to be activated. If possible check this and otherwise refuse to work
1.0
Users of the mobile app login with username and password once - - [ ] create login screen that allows to enter username and password - [ ] credentials are stored in the android keystore - [ ] In order to assure the safety of stored user credentials android device encryption has to be activated. If possible check this and otherwise refuse to work
non_test
users of the mobile app login with username and password once create login screen that allows to enter username and password credentials are stored in the android keystore in order to assure the safety of stored user credentials android device encryption has to be activated if possible check this and otherwise refuse to work
0
330,229
28,360,669,455
IssuesEvent
2023-04-12 10:29:07
PTYB/SlayerIssueTracker
https://api.github.com/repos/PTYB/SlayerIssueTracker
closed
Vannaka - Blue Dragons -> Walks to Kourend Dungeon for brutal dragons then tries to bank (loops)
bug website script Awaiting test
Slayer task: Blue Dragons Slayer master: Vannaka Current behavior: It grabs the right gear from the bank, then walks to the area. Note that in the config it says "Taverly Dungeon" for blue dragons, but it walks to brutal blue dragons in Kourend Dungeon. Expected behavior: Walk to either of the dragons and starts killing them [PTY Slayer - Vannaka - Blue Dragons - Banking issue.txt](https://github.com/PTYB/SlayerIssueTracker/files/11173410/PTY.Slayer.-.Vannaka.-.Blue.Dragons.-.Banking.issue.txt)
1.0
Vannaka - Blue Dragons -> Walks to Kourend Dungeon for brutal dragons then tries to bank (loops) - Slayer task: Blue Dragons Slayer master: Vannaka Current behavior: It grabs the right gear from the bank, then walks to the area. Note that in the config it says "Taverly Dungeon" for blue dragons, but it walks to brutal blue dragons in Kourend Dungeon. Expected behavior: Walk to either of the dragons and starts killing them [PTY Slayer - Vannaka - Blue Dragons - Banking issue.txt](https://github.com/PTYB/SlayerIssueTracker/files/11173410/PTY.Slayer.-.Vannaka.-.Blue.Dragons.-.Banking.issue.txt)
test
vannaka blue dragons walks to kourend dungeon for brutal dragons then tries to bank loops slayer task blue dragons slayer master vannaka current behavior it grabs the right gear from the bank then walks to the area note that in the config it says taverly dungeon for blue dragons but it walks to brutal blue dragons in kourend dungeon expected behavior walk to either of the dragons and starts killing them
1
414,726
28,000,545,732
IssuesEvent
2023-03-27 11:26:40
babyfacegames/git-hub-test-trial
https://api.github.com/repos/babyfacegames/git-hub-test-trial
opened
Research on how to make the main character
documentation
How the main character should jump or target the enemy.
1.0
Research on how to make the main character - How the main character should jump or target the enemy.
non_test
research on how to make the main character how the main character should jump or target the enemy
0
183,721
14,247,613,477
IssuesEvent
2020-11-19 11:42:57
Tencent/bk-ci
https://api.github.com/repos/Tencent/bk-ci
closed
bug: default workspace hint for private build agents/build clusters is wrong
area/ci/frontend kind/bug stage/test stage/uat test/passed
![image](https://user-images.githubusercontent.com/10875468/77054449-c248f580-6a0a-11ea-95b2-b2f2dbd04af3.png) The placeholder hint for the specified workspace is missing the /src level; it should read: if left blank, the default is [Agent install directory]/workspace/[pipeline ID]/src
2.0
bug: default workspace hint for private build agents/build clusters is wrong - ![image](https://user-images.githubusercontent.com/10875468/77054449-c248f580-6a0a-11ea-95b2-b2f2dbd04af3.png) The placeholder hint for the specified workspace is missing the /src level; it should read: if left blank, the default is [Agent install directory]/workspace/[pipeline ID]/src
test
bug default workspace hint for private build agents build clusters is wrong the placeholder hint for the specified workspace is missing the src level it should read if left blank the default is workspace src
1
48,191
7,389,691,993
IssuesEvent
2018-03-16 09:39:04
bounswe/bounswe2018group1
https://api.github.com/repos/bounswe/bounswe2018group1
opened
How should we show 'time' in our class diagram
Platform: Documentation Position: Help-wanted Type: Question
There are two options, we can separate it such as posting time and memory time or we can show it as one as time class
1.0
How should we show 'time' in our class diagram - There are two options, we can separate it such as posting time and memory time or we can show it as one as time class
non_test
how should we show time in our class diagram there are two options we can separate it such as posting time and memory time or we can show it as one as time class
0
172,287
13,299,263,662
IssuesEvent
2020-08-25 09:28:33
microsoft/AzureStorageExplorer
https://api.github.com/repos/microsoft/AzureStorageExplorer
opened
Pop up an error information bar when switching to 'All blobs and blobs without current version' view for one Emulator blob container
:gear: blobs :gear: emulator 🧪 testing
**Storage Explorer Version:** 1.15.0-dev **Build**: 20200825.4 **Branch**: rel/1.15.0 **Platform/OS:** Windows 10 **Azurite Version:** 3.8.0 **Architecture**: ia32 **Regression From:** Not a regression **Steps to reproduce:** 1. Install and run Azurite -> Launch Storage Explorer. 2. Expand 'Local & Attached' -> Storage Accounts -> (Emulator - Default Ports) (Key). 3. Right click the Blob Containers node -> Create a blob container. 4. Switch to 'All blobs and blobs without current version' view on the blob container's editor. 5. Check the result. **Expect Experience:** No error occurs. **Actual Experience:** 1. An error information bar pops up. 2. The current blob tab is auto closed. ![image](https://user-images.githubusercontent.com/41351993/91157215-9dfaee80-e6f7-11ea-92b3-aaa4af7fe72b.png) **More Info:** This issue also reproduces when switching to 'Active blobs and blobs without current version' view.
1.0
Pop up an error information bar when switching to 'All blobs and blobs without current version' view for one Emulator blob container - **Storage Explorer Version:** 1.15.0-dev **Build**: 20200825.4 **Branch**: rel/1.15.0 **Platform/OS:** Windows 10 **Azurite Version:** 3.8.0 **Architecture**: ia32 **Regression From:** Not a regression **Steps to reproduce:** 1. Install and run Azurite -> Launch Storage Explorer. 2. Expand 'Local & Attached' -> Storage Accounts -> (Emulator - Default Ports) (Key). 3. Right click the Blob Containers node -> Create a blob container. 4. Switch to 'All blobs and blobs without current version' view on the blob container's editor. 5. Check the result. **Expect Experience:** No error occurs. **Actual Experience:** 1. An error information bar pops up. 2. The current blob tab is auto closed. ![image](https://user-images.githubusercontent.com/41351993/91157215-9dfaee80-e6f7-11ea-92b3-aaa4af7fe72b.png) **More Info:** This issue also reproduces when switching to 'Active blobs and blobs without current version' view.
test
pop up an error information bar when switching to all blobs and blobs without current version view for one emulator blob container storage explorer version dev build branch rel platform os windows azurite version architecture regression from not a regression steps to reproduce install and run azurite launch storage explorer expand local attached storage accounts emulator default ports key right click the blob containers node create a blob container switch to all blobs and blobs without current version view on the blob container s editor check the result expect experience no error occurs actual experience an error information bar pops up the current blob tab is auto closed more info this issue also reproduces when switching to active blobs and blobs without current version view
1
83,069
16,086,690,986
IssuesEvent
2021-04-26 12:10:27
surge-synthesizer/surge
https://api.github.com/repos/surge-synthesizer/surge
closed
UserPref Keys should be variables
Code Cleanup
the user prefs have "fooBarHootie" as keys but that should really be UserPrefs::FooBarHootie consistently.
1.0
UserPref Keys should be variables - the user prefs have "fooBarHootie" as keys but that should really be UserPrefs::FooBarHootie consistently.
non_test
userpref keys should be variables the user prefs have foobarhootie as keys but that should really be userprefs foobarhootie consistently
0
119,044
10,023,676,390
IssuesEvent
2019-07-16 19:49:50
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Visualize listing test failures
Feature:Visualizations PR sent failed-test test test-darwin test-linux test-windows test_ui_functional
└-: visualize listing page └-> "before all" hook └-: create and delete └-> "before all" hook └-> "before all" hook └-> create new viz └-> "before each" hook: global before each └- **✖ fail: "visualize app visualize listing page create and delete create new viz"** │ Error: expected 2 to equal 1 │ at Assertion.assert (packages/kbn-expect/expect.js:100:11) │ at Assertion.be.Assertion.equal (packages/kbn-expect/expect.js:221:8) │ at Context.equal (test/functional/apps/visualize/_visualize_listing.js:41:29) │ at process._tickCallback (internal/process/next_tick.js:68:7) │ │ └-> delete all viz └-> "before each" hook: global before each └- ✖ **fail: "visualize app visualize listing page create and delete delete all viz"** │ Error: expected 4 to equal 3 │ at Assertion.assert (packages/kbn-expect/expect.js:100:11) │ at Assertion.be.Assertion.equal (packages/kbn-expect/expect.js:221:8) │ at Context.equal (test/functional/apps/visualize/_visualize_listing.js:50:29) │ at process._tickCallback (internal/process/next_tick.js:68:7) │ **Version: 7.3**
6.0
Visualize listing test failures - └-: visualize listing page └-> "before all" hook └-: create and delete └-> "before all" hook └-> "before all" hook └-> create new viz └-> "before each" hook: global before each └- **✖ fail: "visualize app visualize listing page create and delete create new viz"** │ Error: expected 2 to equal 1 │ at Assertion.assert (packages/kbn-expect/expect.js:100:11) │ at Assertion.be.Assertion.equal (packages/kbn-expect/expect.js:221:8) │ at Context.equal (test/functional/apps/visualize/_visualize_listing.js:41:29) │ at process._tickCallback (internal/process/next_tick.js:68:7) │ │ └-> delete all viz └-> "before each" hook: global before each └- ✖ **fail: "visualize app visualize listing page create and delete delete all viz"** │ Error: expected 4 to equal 3 │ at Assertion.assert (packages/kbn-expect/expect.js:100:11) │ at Assertion.be.Assertion.equal (packages/kbn-expect/expect.js:221:8) │ at Context.equal (test/functional/apps/visualize/_visualize_listing.js:50:29) │ at process._tickCallback (internal/process/next_tick.js:68:7) │ **Version: 7.3**
test
visualize listing test failures └ visualize listing page └ before all hook └ create and delete └ before all hook └ before all hook └ create new viz └ before each hook global before each └ ✖ fail visualize app visualize listing page create and delete create new viz │ error expected to equal │ at assertion assert packages kbn expect expect js │ at assertion be assertion equal packages kbn expect expect js │ at context equal test functional apps visualize visualize listing js │ at process tickcallback internal process next tick js │ │ └ delete all viz └ before each hook global before each └ ✖ fail visualize app visualize listing page create and delete delete all viz │ error expected to equal │ at assertion assert packages kbn expect expect js │ at assertion be assertion equal packages kbn expect expect js │ at context equal test functional apps visualize visualize listing js │ at process tickcallback internal process next tick js │ version
1
686,599
23,497,785,899
IssuesEvent
2022-08-18 04:44:55
WordPress/Learn
https://api.github.com/repos/WordPress/Learn
closed
Update Workshop Presenter Application form
[Type] Enhancement [Priority] High
Hi there, we need to update the Workshop Presenter Application form on this page [here](https://learn.wordpress.org/workshop-presenter-application/) to remove the Presenter Details section which includes: - WordPress.org User Name - First Name - Last Name - Email - Where can we find you online? Please share links to your website(s) and as many social media accounts as applicable, including but not limited to Twitter, Linkedin, Facebook, Instagram, etc. Can we also update the name of this form from `Workshop Application Form` to `Workshop Submission Form`? The code for this form lives [here](https://github.com/WordPress/Learn/blob/trunk/wp-content/plugins/wporg-learn/inc/form.php).
1.0
Update Workshop Presenter Application form - Hi there, we need to update the Workshop Presenter Application form on this page [here](https://learn.wordpress.org/workshop-presenter-application/) to remove the Presenter Details section which includes: - WordPress.org User Name - First Name - Last Name - Email - Where can we find you online? Please share links to your website(s) and as many social media accounts as applicable, including but not limited to Twitter, Linkedin, Facebook, Instagram, etc. Can we also update the name of this form from `Workshop Application Form` to `Workshop Submission Form`? The code for this form lives [here](https://github.com/WordPress/Learn/blob/trunk/wp-content/plugins/wporg-learn/inc/form.php).
non_test
update workshop presenter application form hi there we need to update the workshop presenter application form on this page to remove the presenter details section which includes wordpress org user name first name last name email where can we find you online please share links to your website s and as many social media accounts as applicable including but not limited to twitter linkedin facebook instagram etc can we also update the name of this form from workshop application form to workshop submission form the code for this form lives
0
244
2,591,129,352
IssuesEvent
2015-02-18 23:15:02
okTurtles/dnschain
https://api.github.com/repos/okTurtles/dnschain
opened
Have a mechanism for clients to automatically accept new fingerprints
security
Good news is that for *everything other than the connection to DNSChain*, sysadmins no longer need to worry about setting expiration dates for their SSL/TLS certs (they just update the cert and the fingerprint in the blockchain). However, the connection to DNSChain itself should have its cert (and therefore its fingerprint) updated periodically. For end-users, it would be prohibitively annoying to have to manually re-enter (or re-verify) an updated fingerprint. Therefore DNSChain should be able to tell clients over the old cert connection: "Hey, I've got a new fingerprint, use this from now on." How exactly this should be done is TBD.
True
Have a mechanism for clients to automatically accept new fingerprints - Good news is that for *everything other than the connection to DNSChain*, sysadmins no longer need to worry about setting expiration dates for their SSL/TLS certs (they just update the cert and the fingerprint in the blockchain). However, the connection to DNSChain itself should have its cert (and therefore its fingerprint) updated periodically. For end-users, it would be prohibitively annoying to have to manually re-enter (or re-verify) an updated fingerprint. Therefore DNSChain should be able to tell clients over the old cert connection: "Hey, I've got a new fingerprint, use this from now on." How exactly this should be done is TBD.
non_test
have a mechanism for clients to automatically accept new fingerprints good news is that for everything other than the connection to dnschain sysadmins no longer need to worry about setting expiration dates for their ssl tls certs they just update the cert and the fingerprint in the blockchain however the connection to dnschain itself should have its cert and therefore its fingerprint updated periodically for end users it would be prohibitively annoying to have to manually re enter or re verify an updated fingerprint therefore dnschain should be able to tell clients over the old cert connection hey i ve got a new fingerprint use this from now on how exactly this should be done is tbd
0
198,057
15,699,536,973
IssuesEvent
2021-03-26 08:38:10
steroid-team/app
https://api.github.com/repos/steroid-team/app
opened
Create a README
documentation
It would be great to have a README with the following contents: - Our logo. - Badges. - Details on requirements, build...
1.0
Create a README - It would be great to have a README with the following contents: - Our logo. - Badges. - Details on requirements, build...
non_test
create a readme it would be great to have a readme with the following contents our logo badges details on requirements build
0
25,609
2,683,862,086
IssuesEvent
2015-03-28 11:57:27
ConEmu/old-issues
https://api.github.com/repos/ConEmu/old-issues
closed
Conemu 2010.2.2 - false message "Close confirmation - Incomplete operations: 1"
2–5 stars bug imported Priority-Medium
_From [Zero...@gmail.com](https://code.google.com/u/103642962356045697092/) on February 03, 2010 06:16:24_ OS version: Windows 7 FAR version: 1.75 build 2619 x86 or v2.0 (build 1369) x86 Bug description... Starting with ConEmu .Maximus5.090909.7z, a new feature was introduced: [+] Before ConEmu is closed (by clicking the cross in the window title bar), a warning is shown if there are unsaved editors or unfinished operations (copying, etc.). Starting with Maximus5 - 2010.01.12 and up to and including 2010.2.2, the following problem is observed: take a large file >3mb, e.g. Winword.exe, and pack it into a RAR archive test.rar (I have RAR 3.92.1). Then launch FAR through ConEmu with a single plugin, MA 1.75 build 192. Enter the archive and extract one file from it with F5 onto the adjacent panel. (After F5 and Enter, do not touch the keyboard or mouse.) Wait a few seconds, then click the window close cross; as a result the following message appears --------------------------- ConEmu --------------------------- Close confirmation. Incomplete operations: 1 Proceed with shutdown? \--------------------------- Reproducibility is 100% under these conditions. OK Cancel \--------------------------- _Original issue: http://code.google.com/p/conemu-maximus5/issues/detail?id=170_
1.0
Conemu 2010.2.2 - false message "Close confirmation - Incomplete operations: 1" - _From [Zero...@gmail.com](https://code.google.com/u/103642962356045697092/) on February 03, 2010 06:16:24_ OS version: Windows 7 FAR version: 1.75 build 2619 x86 or v2.0 (build 1369) x86 Bug description... Starting with ConEmu .Maximus5.090909.7z, a new feature was introduced: [+] Before ConEmu is closed (by clicking the cross in the window title bar), a warning is shown if there are unsaved editors or unfinished operations (copying, etc.). Starting with Maximus5 - 2010.01.12 and up to and including 2010.2.2, the following problem is observed: take a large file >3mb, e.g. Winword.exe, and pack it into a RAR archive test.rar (I have RAR 3.92.1). Then launch FAR through ConEmu with a single plugin, MA 1.75 build 192. Enter the archive and extract one file from it with F5 onto the adjacent panel. (After F5 and Enter, do not touch the keyboard or mouse.) Wait a few seconds, then click the window close cross; as a result the following message appears --------------------------- ConEmu --------------------------- Close confirmation. Incomplete operations: 1 Proceed with shutdown? \--------------------------- Reproducibility is 100% under these conditions. OK Cancel \--------------------------- _Original issue: http://code.google.com/p/conemu-maximus5/issues/detail?id=170_
non_test
conemu false message close confirmation incomplete operations from on february os version windows far version build or build bug description starting with conemu a new feature was introduced before conemu is closed by clicking the cross in the window title bar a warning is shown if there are unsaved editors or unfinished operations copying etc starting with and up to and including the following problem is observed take a large file e g winword exe and pack it into a rar archive test rar i have rar then launch far through conemu with a single plugin ma build enter the archive and extract one file from it with onto the adjacent panel after and enter do not touch the keyboard or mouse wait a few seconds then click the window close cross as a result the following message appears conemu close confirmation incomplete operations proceed with shutdown reproducibility is under these conditions ok cancel original issue
0
766,254
26,875,249,972
IssuesEvent
2023-02-05 00:05:15
apache/hudi
https://api.github.com/repos/apache/hudi
closed
[SUPPORT][CDC]UnresolvedUnionException: Not in union ["null","double"]: 20230202105806923_0_1
schema-and-data-types priority:blocker spark change-data-capture
**_Tips before filing an issue_** - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)? - Join the mailing list to engage in conversations and get faster support at dev-subscribe@hudi.apache.org. - If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly. **Describe the problem you faced** enable CDC, cannot perform compaction table service. **To Reproduce** Steps to reproduce the behavior: 1.hoodie.table.cdc.enabled=true hoodie.table.cdc.supplemental.logging.mode=data_before_after 2.table type: mor **Expected behavior** A clear and concise description of what you expected to happen. **Environment Description** * Hudi version : master * Spark version : 3.1.1 * Hive version : none * Hadoop version :none * Storage (HDFS/S3/GCS..) : * Running on Docker? (yes/no) : **Additional context** Add any other context about the problem here. **Stacktrace** ``` 23/02/02 10:58:21 ERROR HoodieStreamingSink: Micro batch id=1 threw following expections,aborting streaming app to avoid data loss: org.apache.hudi.exception.HoodieCompactionException: Could not compact /tmp/hudi/cdc_test at org.apache.hudi.table.action.compact.RunCompactionActionExecutor.execute(RunCompactionActionExecutor.java:116) at org.apache.hudi.table.HoodieSparkMergeOnReadTable.compact(HoodieSparkMergeOnReadTable.java:140) at org.apache.hudi.client.SparkRDDTableServiceClient.compact(SparkRDDTableServiceClient.java:75) at org.apache.hudi.client.BaseHoodieTableServiceClient.lambda$runAnyPendingCompactions$2(BaseHoodieTableServiceClient.java:191) at java.util.ArrayList.forEach(ArrayList.java:1259) at org.apache.hudi.client.BaseHoodieTableServiceClient.runAnyPendingCompactions(BaseHoodieTableServiceClient.java:189) at org.apache.hudi.client.BaseHoodieTableServiceClient.inlineCompaction(BaseHoodieTableServiceClient.java:160) at org.apache.hudi.client.BaseHoodieTableServiceClient.runTableServicesInline(BaseHoodieTableServiceClient.java:334) 
at org.apache.hudi.client.BaseHoodieWriteClient.runTableServicesInline(BaseHoodieWriteClient.java:540) at org.apache.hudi.client.BaseHoodieWriteClient.commitStats(BaseHoodieWriteClient.java:249) at org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:102) at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:903) at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:372) at org.apache.hudi.HoodieStreamingSink.$anonfun$addBatch$2(HoodieStreamingSink.scala:122) at scala.util.Try$.apply(Try.scala:213) at org.apache.hudi.HoodieStreamingSink.$anonfun$addBatch$1(HoodieStreamingSink.scala:120) at org.apache.hudi.HoodieStreamingSink.retry(HoodieStreamingSink.scala:244) at org.apache.hudi.HoodieStreamingSink.addBatch(HoodieStreamingSink.scala:119) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$16(MicroBatchExecution.scala:586) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:584) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355) at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:584) at 
org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:226) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355) at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:194) at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:188) at org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:333) at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:244) Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 70.0 failed 1 times, most recent failure: Lost task 0.0 in stage 70.0 (TID 67) (LAPTOP-DONGSJ executor driver): org.apache.avro.UnresolvedUnionException: Not in union ["null","double"]: 20230202105806923_0_1 at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:740) at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:205) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:123) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:166) at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156) at 
org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:125) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:166) at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62) at org.apache.hudi.avro.HoodieAvroUtils.indexedRecordToBytes(HoodieAvroUtils.java:136) at org.apache.hudi.avro.HoodieAvroUtils.avroToBytes(HoodieAvroUtils.java:128) at org.apache.hudi.common.model.HoodieAvroPayload.<init>(HoodieAvroPayload.java:47) at org.apache.hudi.io.HoodieCDCLogger.put(HoodieCDCLogger.java:175) at org.apache.hudi.io.HoodieMergeHandleWithChangeLog.writeInsertRecord(HoodieMergeHandleWithChangeLog.java:106) at org.apache.hudi.io.HoodieMergeHandle.writeIncomingRecords(HoodieMergeHandle.java:397) at org.apache.hudi.io.HoodieMergeHandle.close(HoodieMergeHandle.java:405) at org.apache.hudi.io.HoodieMergeHandleWithChangeLog.close(HoodieMergeHandleWithChangeLog.java:112) at org.apache.hudi.table.action.commit.HoodieMergeHelper.runMerge(HoodieMergeHelper.java:168) at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.handleUpdateInternal(HoodieSparkCopyOnWriteTable.java:224) at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.handleUpdate(HoodieSparkCopyOnWriteTable.java:215) at org.apache.hudi.table.action.compact.CompactionExecutionHelper.writeFileAndGetWriteStats(CompactionExecutionHelper.java:64) at 
org.apache.hudi.table.action.compact.HoodieCompactor.compact(HoodieCompactor.java:239) at org.apache.hudi.table.action.compact.HoodieCompactor.lambda$compact$9cd4b1be$1(HoodieCompactor.java:137) at org.apache.spark.api.java.JavaPairRDD$.$anonfun$toScalaFunction$1(JavaPairRDD.scala:1070) at scala.collection.Iterator$$anon$10.next(Iterator.scala:461) at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486) at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492) at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221) at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349) at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1440) at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1350) at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1414) at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1237) at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:384) at org.apache.spark.rdd.RDD.iterator(RDD.scala:335) at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373) at org.apache.spark.rdd.RDD.iterator(RDD.scala:337) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) at org.apache.spark.scheduler.Task.run(Task.scala:131) at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497) at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Driver stacktrace: at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2253) at 
org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2202) at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2201) at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2201) at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1078) at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1078) at scala.Option.foreach(Option.scala:407) at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1078) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2440) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2382) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2371) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:868) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2202) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2223) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2242) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267) at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:414) at org.apache.spark.rdd.RDD.collect(RDD.scala:1029) at org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:362) at org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:361) at 
org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45) at org.apache.hudi.data.HoodieJavaRDD.collectAsList(HoodieJavaRDD.java:163) at org.apache.hudi.table.action.compact.RunCompactionActionExecutor.execute(RunCompactionActionExecutor.java:101) ... 38 more ```
1.0
[SUPPORT][CDC]UnresolvedUnionException: Not in union ["null","double"]: 20230202105806923_0_1 - **_Tips before filing an issue_** - Have you gone through our [FAQs](https://hudi.apache.org/learn/faq/)? - Join the mailing list to engage in conversations and get faster support at dev-subscribe@hudi.apache.org. - If you have triaged this as a bug, then file an [issue](https://issues.apache.org/jira/projects/HUDI/issues) directly. **Describe the problem you faced** enable CDC, cannot perform compaction table service. **To Reproduce** Steps to reproduce the behavior: 1.hoodie.table.cdc.enabled=true hoodie.table.cdc.supplemental.logging.mode=data_before_after 2.table type: mor **Expected behavior** A clear and concise description of what you expected to happen. **Environment Description** * Hudi version : master * Spark version : 3.1.1 * Hive version : none * Hadoop version :none * Storage (HDFS/S3/GCS..) : * Running on Docker? (yes/no) : **Additional context** Add any other context about the problem here. 
**Stacktrace** ``` 23/02/02 10:58:21 ERROR HoodieStreamingSink: Micro batch id=1 threw following expections,aborting streaming app to avoid data loss: org.apache.hudi.exception.HoodieCompactionException: Could not compact /tmp/hudi/cdc_test at org.apache.hudi.table.action.compact.RunCompactionActionExecutor.execute(RunCompactionActionExecutor.java:116) at org.apache.hudi.table.HoodieSparkMergeOnReadTable.compact(HoodieSparkMergeOnReadTable.java:140) at org.apache.hudi.client.SparkRDDTableServiceClient.compact(SparkRDDTableServiceClient.java:75) at org.apache.hudi.client.BaseHoodieTableServiceClient.lambda$runAnyPendingCompactions$2(BaseHoodieTableServiceClient.java:191) at java.util.ArrayList.forEach(ArrayList.java:1259) at org.apache.hudi.client.BaseHoodieTableServiceClient.runAnyPendingCompactions(BaseHoodieTableServiceClient.java:189) at org.apache.hudi.client.BaseHoodieTableServiceClient.inlineCompaction(BaseHoodieTableServiceClient.java:160) at org.apache.hudi.client.BaseHoodieTableServiceClient.runTableServicesInline(BaseHoodieTableServiceClient.java:334) at org.apache.hudi.client.BaseHoodieWriteClient.runTableServicesInline(BaseHoodieWriteClient.java:540) at org.apache.hudi.client.BaseHoodieWriteClient.commitStats(BaseHoodieWriteClient.java:249) at org.apache.hudi.client.SparkRDDWriteClient.commit(SparkRDDWriteClient.java:102) at org.apache.hudi.HoodieSparkSqlWriter$.commitAndPerformPostOperations(HoodieSparkSqlWriter.scala:903) at org.apache.hudi.HoodieSparkSqlWriter$.write(HoodieSparkSqlWriter.scala:372) at org.apache.hudi.HoodieStreamingSink.$anonfun$addBatch$2(HoodieStreamingSink.scala:122) at scala.util.Try$.apply(Try.scala:213) at org.apache.hudi.HoodieStreamingSink.$anonfun$addBatch$1(HoodieStreamingSink.scala:120) at org.apache.hudi.HoodieStreamingSink.retry(HoodieStreamingSink.scala:244) at org.apache.hudi.HoodieStreamingSink.addBatch(HoodieStreamingSink.scala:119) at 
org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$16(MicroBatchExecution.scala:586) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$5(SQLExecution.scala:103) at org.apache.spark.sql.execution.SQLExecution$.withSQLConfPropagated(SQLExecution.scala:163) at org.apache.spark.sql.execution.SQLExecution$.$anonfun$withNewExecutionId$1(SQLExecution.scala:90) at org.apache.spark.sql.SparkSession.withActive(SparkSession.scala:772) at org.apache.spark.sql.execution.SQLExecution$.withNewExecutionId(SQLExecution.scala:64) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runBatch$15(MicroBatchExecution.scala:584) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355) at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runBatch(MicroBatchExecution.scala:584) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$2(MicroBatchExecution.scala:226) at scala.runtime.java8.JFunction0$mcV$sp.apply(JFunction0$mcV$sp.java:23) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken(ProgressReporter.scala:357) at org.apache.spark.sql.execution.streaming.ProgressReporter.reportTimeTaken$(ProgressReporter.scala:355) at org.apache.spark.sql.execution.streaming.StreamExecution.reportTimeTaken(StreamExecution.scala:68) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.$anonfun$runActivatedStream$1(MicroBatchExecution.scala:194) at org.apache.spark.sql.execution.streaming.ProcessingTimeExecutor.execute(TriggerExecutor.scala:57) at org.apache.spark.sql.execution.streaming.MicroBatchExecution.runActivatedStream(MicroBatchExecution.scala:188) at 
org.apache.spark.sql.execution.streaming.StreamExecution.org$apache$spark$sql$execution$streaming$StreamExecution$$runStream(StreamExecution.scala:333) at org.apache.spark.sql.execution.streaming.StreamExecution$$anon$1.run(StreamExecution.scala:244) Caused by: org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 70.0 failed 1 times, most recent failure: Lost task 0.0 in stage 70.0 (TID 67) (LAPTOP-DONGSJ executor driver): org.apache.avro.UnresolvedUnionException: Not in union ["null","double"]: 20230202105806923_0_1 at org.apache.avro.generic.GenericData.resolveUnion(GenericData.java:740) at org.apache.avro.generic.GenericDatumWriter.resolveUnion(GenericDatumWriter.java:205) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:123) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:166) at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:125) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.writeField(GenericDatumWriter.java:166) at org.apache.avro.generic.GenericDatumWriter.writeRecord(GenericDatumWriter.java:156) at org.apache.avro.generic.GenericDatumWriter.writeWithoutConversion(GenericDatumWriter.java:118) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:75) at org.apache.avro.generic.GenericDatumWriter.write(GenericDatumWriter.java:62) at org.apache.hudi.avro.HoodieAvroUtils.indexedRecordToBytes(HoodieAvroUtils.java:136) at 
org.apache.hudi.avro.HoodieAvroUtils.avroToBytes(HoodieAvroUtils.java:128) at org.apache.hudi.common.model.HoodieAvroPayload.<init>(HoodieAvroPayload.java:47) at org.apache.hudi.io.HoodieCDCLogger.put(HoodieCDCLogger.java:175) at org.apache.hudi.io.HoodieMergeHandleWithChangeLog.writeInsertRecord(HoodieMergeHandleWithChangeLog.java:106) at org.apache.hudi.io.HoodieMergeHandle.writeIncomingRecords(HoodieMergeHandle.java:397) at org.apache.hudi.io.HoodieMergeHandle.close(HoodieMergeHandle.java:405) at org.apache.hudi.io.HoodieMergeHandleWithChangeLog.close(HoodieMergeHandleWithChangeLog.java:112) at org.apache.hudi.table.action.commit.HoodieMergeHelper.runMerge(HoodieMergeHelper.java:168) at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.handleUpdateInternal(HoodieSparkCopyOnWriteTable.java:224) at org.apache.hudi.table.HoodieSparkCopyOnWriteTable.handleUpdate(HoodieSparkCopyOnWriteTable.java:215) at org.apache.hudi.table.action.compact.CompactionExecutionHelper.writeFileAndGetWriteStats(CompactionExecutionHelper.java:64) at org.apache.hudi.table.action.compact.HoodieCompactor.compact(HoodieCompactor.java:239) at org.apache.hudi.table.action.compact.HoodieCompactor.lambda$compact$9cd4b1be$1(HoodieCompactor.java:137) at org.apache.spark.api.java.JavaPairRDD$.$anonfun$toScalaFunction$1(JavaPairRDD.scala:1070) at scala.collection.Iterator$$anon$10.next(Iterator.scala:461) at scala.collection.Iterator$$anon$11.nextCur(Iterator.scala:486) at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:492) at org.apache.spark.storage.memory.MemoryStore.putIterator(MemoryStore.scala:221) at org.apache.spark.storage.memory.MemoryStore.putIteratorAsBytes(MemoryStore.scala:349) at org.apache.spark.storage.BlockManager.$anonfun$doPutIterator$1(BlockManager.scala:1440) at org.apache.spark.storage.BlockManager.org$apache$spark$storage$BlockManager$$doPut(BlockManager.scala:1350) at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1414) at 
org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:1237) at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:384) at org.apache.spark.rdd.RDD.iterator(RDD.scala:335) at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52) at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:373) at org.apache.spark.rdd.RDD.iterator(RDD.scala:337) at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:90) at org.apache.spark.scheduler.Task.run(Task.scala:131) at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497) at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439) at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500) at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) at java.lang.Thread.run(Thread.java:748) Driver stacktrace: at org.apache.spark.scheduler.DAGScheduler.failJobAndIndependentStages(DAGScheduler.scala:2253) at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2(DAGScheduler.scala:2202) at org.apache.spark.scheduler.DAGScheduler.$anonfun$abortStage$2$adapted(DAGScheduler.scala:2201) at scala.collection.mutable.ResizableArray.foreach(ResizableArray.scala:62) at scala.collection.mutable.ResizableArray.foreach$(ResizableArray.scala:55) at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:49) at org.apache.spark.scheduler.DAGScheduler.abortStage(DAGScheduler.scala:2201) at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1(DAGScheduler.scala:1078) at org.apache.spark.scheduler.DAGScheduler.$anonfun$handleTaskSetFailed$1$adapted(DAGScheduler.scala:1078) at scala.Option.foreach(Option.scala:407) at org.apache.spark.scheduler.DAGScheduler.handleTaskSetFailed(DAGScheduler.scala:1078) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.doOnReceive(DAGScheduler.scala:2440) at 
org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2382) at org.apache.spark.scheduler.DAGSchedulerEventProcessLoop.onReceive(DAGScheduler.scala:2371) at org.apache.spark.util.EventLoop$$anon$1.run(EventLoop.scala:49) at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:868) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2202) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2223) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2242) at org.apache.spark.SparkContext.runJob(SparkContext.scala:2267) at org.apache.spark.rdd.RDD.$anonfun$collect$1(RDD.scala:1030) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) at org.apache.spark.rdd.RDD.withScope(RDD.scala:414) at org.apache.spark.rdd.RDD.collect(RDD.scala:1029) at org.apache.spark.api.java.JavaRDDLike.collect(JavaRDDLike.scala:362) at org.apache.spark.api.java.JavaRDDLike.collect$(JavaRDDLike.scala:361) at org.apache.spark.api.java.AbstractJavaRDDLike.collect(JavaRDDLike.scala:45) at org.apache.hudi.data.HoodieJavaRDD.collectAsList(HoodieJavaRDD.java:163) at org.apache.hudi.table.action.compact.RunCompactionActionExecutor.execute(RunCompactionActionExecutor.java:101) ... 38 more ```
non_test
unresolvedunionexception not in union tips before filing an issue have you gone through our join the mailing list to engage in conversations and get faster support at dev subscribe hudi apache org if you have triaged this as a bug then file an directly describe the problem you faced enable cdc cannot perform compaction table service to reproduce steps to reproduce the behavior hoodie table cdc enabled true hoodie table cdc supplemental logging mode data before after table type mor expected behavior a clear and concise description of what you expected to happen environment description hudi version master spark version hive version none hadoop version none storage hdfs gcs running on docker yes no additional context add any other context about the problem here stacktrace error hoodiestreamingsink micro batch id threw following expections aborting streaming app to avoid data loss org apache hudi exception hoodiecompactionexception could not compact tmp hudi cdc test at org apache hudi table action compact runcompactionactionexecutor execute runcompactionactionexecutor java at org apache hudi table hoodiesparkmergeonreadtable compact hoodiesparkmergeonreadtable java at org apache hudi client sparkrddtableserviceclient compact sparkrddtableserviceclient java at org apache hudi client basehoodietableserviceclient lambda runanypendingcompactions basehoodietableserviceclient java at java util arraylist foreach arraylist java at org apache hudi client basehoodietableserviceclient runanypendingcompactions basehoodietableserviceclient java at org apache hudi client basehoodietableserviceclient inlinecompaction basehoodietableserviceclient java at org apache hudi client basehoodietableserviceclient runtableservicesinline basehoodietableserviceclient java at org apache hudi client basehoodiewriteclient runtableservicesinline basehoodiewriteclient java at org apache hudi client basehoodiewriteclient commitstats basehoodiewriteclient java at org apache hudi client 
sparkrddwriteclient commit sparkrddwriteclient java at org apache hudi hoodiesparksqlwriter commitandperformpostoperations hoodiesparksqlwriter scala at org apache hudi hoodiesparksqlwriter write hoodiesparksqlwriter scala at org apache hudi hoodiestreamingsink anonfun addbatch hoodiestreamingsink scala at scala util try apply try scala at org apache hudi hoodiestreamingsink anonfun addbatch hoodiestreamingsink scala at org apache hudi hoodiestreamingsink retry hoodiestreamingsink scala at org apache hudi hoodiestreamingsink addbatch hoodiestreamingsink scala at org apache spark sql execution streaming microbatchexecution anonfun runbatch microbatchexecution scala at org apache spark sql execution sqlexecution anonfun withnewexecutionid sqlexecution scala at org apache spark sql execution sqlexecution withsqlconfpropagated sqlexecution scala at org apache spark sql execution sqlexecution anonfun withnewexecutionid sqlexecution scala at org apache spark sql sparksession withactive sparksession scala at org apache spark sql execution sqlexecution withnewexecutionid sqlexecution scala at org apache spark sql execution streaming microbatchexecution anonfun runbatch microbatchexecution scala at org apache spark sql execution streaming progressreporter reporttimetaken progressreporter scala at org apache spark sql execution streaming progressreporter reporttimetaken progressreporter scala at org apache spark sql execution streaming streamexecution reporttimetaken streamexecution scala at org apache spark sql execution streaming microbatchexecution runbatch microbatchexecution scala at org apache spark sql execution streaming microbatchexecution anonfun runactivatedstream microbatchexecution scala at scala runtime mcv sp apply mcv sp java at org apache spark sql execution streaming progressreporter reporttimetaken progressreporter scala at org apache spark sql execution streaming progressreporter reporttimetaken progressreporter scala at org apache spark sql execution 
streaming streamexecution reporttimetaken streamexecution scala at org apache spark sql execution streaming microbatchexecution anonfun runactivatedstream microbatchexecution scala at org apache spark sql execution streaming processingtimeexecutor execute triggerexecutor scala at org apache spark sql execution streaming microbatchexecution runactivatedstream microbatchexecution scala at org apache spark sql execution streaming streamexecution org apache spark sql execution streaming streamexecution runstream streamexecution scala at org apache spark sql execution streaming streamexecution anon run streamexecution scala caused by org apache spark sparkexception job aborted due to stage failure task in stage failed times most recent failure lost task in stage tid laptop dongsj executor driver org apache avro unresolvedunionexception not in union at org apache avro generic genericdata resolveunion genericdata java at org apache avro generic genericdatumwriter resolveunion genericdatumwriter java at org apache avro generic genericdatumwriter writewithoutconversion genericdatumwriter java at org apache avro generic genericdatumwriter write genericdatumwriter java at org apache avro generic genericdatumwriter writefield genericdatumwriter java at org apache avro generic genericdatumwriter writerecord genericdatumwriter java at org apache avro generic genericdatumwriter writewithoutconversion genericdatumwriter java at org apache avro generic genericdatumwriter write genericdatumwriter java at org apache avro generic genericdatumwriter writewithoutconversion genericdatumwriter java at org apache avro generic genericdatumwriter write genericdatumwriter java at org apache avro generic genericdatumwriter writefield genericdatumwriter java at org apache avro generic genericdatumwriter writerecord genericdatumwriter java at org apache avro generic genericdatumwriter writewithoutconversion genericdatumwriter java at org apache avro generic genericdatumwriter write 
genericdatumwriter java at org apache avro generic genericdatumwriter write genericdatumwriter java at org apache hudi avro hoodieavroutils indexedrecordtobytes hoodieavroutils java at org apache hudi avro hoodieavroutils avrotobytes hoodieavroutils java at org apache hudi common model hoodieavropayload hoodieavropayload java at org apache hudi io hoodiecdclogger put hoodiecdclogger java at org apache hudi io hoodiemergehandlewithchangelog writeinsertrecord hoodiemergehandlewithchangelog java at org apache hudi io hoodiemergehandle writeincomingrecords hoodiemergehandle java at org apache hudi io hoodiemergehandle close hoodiemergehandle java at org apache hudi io hoodiemergehandlewithchangelog close hoodiemergehandlewithchangelog java at org apache hudi table action commit hoodiemergehelper runmerge hoodiemergehelper java at org apache hudi table hoodiesparkcopyonwritetable handleupdateinternal hoodiesparkcopyonwritetable java at org apache hudi table hoodiesparkcopyonwritetable handleupdate hoodiesparkcopyonwritetable java at org apache hudi table action compact compactionexecutionhelper writefileandgetwritestats compactionexecutionhelper java at org apache hudi table action compact hoodiecompactor compact hoodiecompactor java at org apache hudi table action compact hoodiecompactor lambda compact hoodiecompactor java at org apache spark api java javapairrdd anonfun toscalafunction javapairrdd scala at scala collection iterator anon next iterator scala at scala collection iterator anon nextcur iterator scala at scala collection iterator anon hasnext iterator scala at org apache spark storage memory memorystore putiterator memorystore scala at org apache spark storage memory memorystore putiteratorasbytes memorystore scala at org apache spark storage blockmanager anonfun doputiterator blockmanager scala at org apache spark storage blockmanager org apache spark storage blockmanager doput blockmanager scala at org apache spark storage blockmanager doputiterator 
blockmanager scala at org apache spark storage blockmanager getorelseupdate blockmanager scala at org apache spark rdd rdd getorcompute rdd scala at org apache spark rdd rdd iterator rdd scala at org apache spark rdd mappartitionsrdd compute mappartitionsrdd scala at org apache spark rdd rdd computeorreadcheckpoint rdd scala at org apache spark rdd rdd iterator rdd scala at org apache spark scheduler resulttask runtask resulttask scala at org apache spark scheduler task run task scala at org apache spark executor executor taskrunner anonfun run executor scala at org apache spark util utils trywithsafefinally utils scala at org apache spark executor executor taskrunner run executor scala at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java driver stacktrace at org apache spark scheduler dagscheduler failjobandindependentstages dagscheduler scala at org apache spark scheduler dagscheduler anonfun abortstage dagscheduler scala at org apache spark scheduler dagscheduler anonfun abortstage adapted dagscheduler scala at scala collection mutable resizablearray foreach resizablearray scala at scala collection mutable resizablearray foreach resizablearray scala at scala collection mutable arraybuffer foreach arraybuffer scala at org apache spark scheduler dagscheduler abortstage dagscheduler scala at org apache spark scheduler dagscheduler anonfun handletasksetfailed dagscheduler scala at org apache spark scheduler dagscheduler anonfun handletasksetfailed adapted dagscheduler scala at scala option foreach option scala at org apache spark scheduler dagscheduler handletasksetfailed dagscheduler scala at org apache spark scheduler dagschedulereventprocessloop doonreceive dagscheduler scala at org apache spark scheduler dagschedulereventprocessloop onreceive dagscheduler scala at org apache spark scheduler dagschedulereventprocessloop 
onreceive dagscheduler scala at org apache spark util eventloop anon run eventloop scala at org apache spark scheduler dagscheduler runjob dagscheduler scala at org apache spark sparkcontext runjob sparkcontext scala at org apache spark sparkcontext runjob sparkcontext scala at org apache spark sparkcontext runjob sparkcontext scala at org apache spark sparkcontext runjob sparkcontext scala at org apache spark rdd rdd anonfun collect rdd scala at org apache spark rdd rddoperationscope withscope rddoperationscope scala at org apache spark rdd rddoperationscope withscope rddoperationscope scala at org apache spark rdd rdd withscope rdd scala at org apache spark rdd rdd collect rdd scala at org apache spark api java javarddlike collect javarddlike scala at org apache spark api java javarddlike collect javarddlike scala at org apache spark api java abstractjavarddlike collect javarddlike scala at org apache hudi data hoodiejavardd collectaslist hoodiejavardd java at org apache hudi table action compact runcompactionactionexecutor execute runcompactionactionexecutor java more
0
220,744
24,569,281,002
IssuesEvent
2022-10-13 07:19:10
BrianMcDonaldWS/keymaster
https://api.github.com/repos/BrianMcDonaldWS/keymaster
opened
WS-2022-0328 (Medium) detected in github.com/etcd-io/etcd/wal-v3.4.5
security vulnerability
## WS-2022-0328 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/etcd-io/etcd/wal-v3.4.5</b></p></summary> <p>Distributed reliable key-value store for the most critical data of a distributed system</p> <p> Dependency Hierarchy: - github.com/cloudflare/cfssl/revoke-v1.4.1 (Root Library) - github.com/cloudflare/cfssl/helpers-v1.4.1 - github.com/google/certificate-transparency-go-v1.1.0 - github.com/etcd-io/etcd-v3.4.5 - github.com/etcd-io/etcd-v3.4.5 - github.com/etcd-io/etcd/etcdserver/api/v2v3-v3.4.5 - github.com/etcd-io/etcd-v3.4.5 - :x: **github.com/etcd-io/etcd/wal-v3.4.5** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> etcd user credentials are stored in WAL logs in plaintext. <p>Publish Date: 2022-10-07 <p>URL: <a href=https://github.com/etcd-io/etcd/commit/7d1cf640497cbcdfb932e619b13624112c7e3865>WS-2022-0328</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-528j-9r78-wffx">https://github.com/advisories/GHSA-528j-9r78-wffx</a></p> <p>Release Date: 2022-10-07</p> <p>Fix Resolution: 3.3.23,3.4.10 </p> </p> </details> <p></p>
True
WS-2022-0328 (Medium) detected in github.com/etcd-io/etcd/wal-v3.4.5 - ## WS-2022-0328 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/etcd-io/etcd/wal-v3.4.5</b></p></summary> <p>Distributed reliable key-value store for the most critical data of a distributed system</p> <p> Dependency Hierarchy: - github.com/cloudflare/cfssl/revoke-v1.4.1 (Root Library) - github.com/cloudflare/cfssl/helpers-v1.4.1 - github.com/google/certificate-transparency-go-v1.1.0 - github.com/etcd-io/etcd-v3.4.5 - github.com/etcd-io/etcd-v3.4.5 - github.com/etcd-io/etcd/etcdserver/api/v2v3-v3.4.5 - github.com/etcd-io/etcd-v3.4.5 - :x: **github.com/etcd-io/etcd/wal-v3.4.5** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> etcd user credentials are stored in WAL logs in plaintext. <p>Publish Date: 2022-10-07 <p>URL: <a href=https://github.com/etcd-io/etcd/commit/7d1cf640497cbcdfb932e619b13624112c7e3865>WS-2022-0328</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/advisories/GHSA-528j-9r78-wffx">https://github.com/advisories/GHSA-528j-9r78-wffx</a></p> <p>Release Date: 2022-10-07</p> <p>Fix Resolution: 3.3.23,3.4.10 </p> </p> </details> <p></p>
non_test
ws medium detected in github com etcd io etcd wal ws medium severity vulnerability vulnerable library github com etcd io etcd wal distributed reliable key value store for the most critical data of a distributed system dependency hierarchy github com cloudflare cfssl revoke root library github com cloudflare cfssl helpers github com google certificate transparency go github com etcd io etcd github com etcd io etcd github com etcd io etcd etcdserver api github com etcd io etcd x github com etcd io etcd wal vulnerable library vulnerability details etcd user credentials are stored in wal logs in plaintext publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
0
50,508
26,666,733,284
IssuesEvent
2023-01-26 05:16:20
CodeEditApp/CodeEditTextView
https://api.github.com/repos/CodeEditApp/CodeEditTextView
opened
✨ Editing Performance
help wanted performance
Editing a large file can take up to multiple seconds per character. Right now CodeEditTextView uses the default `NSTextStorage` as a text store, which stores text contiguously in memory. This means that for each insert all characters after the insert location must be copied one spot over in memory to make room for the inserted character. For the most part this isn't a problem for smaller files. But opening a file like `sqlite.c` (~151,000 LOC) the performance impact becomes very noticeable. I've attached a screen recording of typing a few characters into this file, it's easy to see how long it takes. https://user-images.githubusercontent.com/35942988/214759044-5ae45760-fe25-46ee-b81c-206a01c5aff8.mov The way to fix this issue is to implement a custom `NSTextStorage` object using a data structure more suited to a text editor. By default, `NSTextStorage` uses an `NSMutableAttributedString` to store both text and attributes. These implementations are very general and don't perform well with edits. Therefore the problem is twofold: text storage and attribute storage. There are a few algorithms that are generally used in text editors for storing text: - [Gap Buffers](https://en.wikipedia.org/wiki/Gap_buffer) - [Piece Tables](https://en.wikipedia.org/wiki/Piece_table) In my opinion, a piece table would be the best option. A gap buffer may be a solid solution, but doesn't offer the same append-only type performance a piece table can offer. With a piece table every edit is added to the text buffer as an immutable, appended, chunk of text which completely removes the need to copy text when inserting characters. The only downside of a piece table comes around when a *lot* of edits happen as found by the [VSCode team](https://code.visualstudio.com/blogs/2018/03/23/text-buffer-reimplementation#_conclusion-and-gotchas). The other problem is attribute storage. Attributes consist of a range (`UInt`-`UInt`) and a dictionary of attributes `NSDictionary`. 
A solution for attribute storage may be to implement a tree that stores ranges as its nodes and allows for `O(log n)` lookup time for a given index. However, a tree structure will have a large space requirement, and ranges may be very small (or even a single index) when we're talking about syntax highlighting, leading to many nodes in a tree. Another option would be to somehow store a small amount of metadata about each range alongside the string itself. We already have a `ThemeAttributesProviding` protocol that can be used to convert a capture type into attributes. We could store the capture type (`sqrt(21 available captures) = ~5` needed bits, rounded up) using 1 extra byte for every highlighted range and when asked for attribute data convert the capture into attributes. Any thoughts or discussion on implementation and performance is welcome!
True
✨ Editing Performance - Editing a large file can take up to multiple seconds per character. Right now CodeEditTextView uses the default `NSTextStorage` as a text store, which stores text contiguously in memory. This means that for each insert all characters after the insert location must be copied one spot over in memory to make room for the inserted character. For the most part this isn't a problem for smaller files. But opening a file like `sqlite.c` (~151,000 LOC) the performance impact becomes very noticeable. I've attached a screen recording of typing a few characters into this file, it's easy to see how long it takes. https://user-images.githubusercontent.com/35942988/214759044-5ae45760-fe25-46ee-b81c-206a01c5aff8.mov The way to fix this issue is to implement a custom `NSTextStorage` object using a data structure more suited to a text editor. By default, `NSTextStorage` uses an `NSMutableAttributedString` to store both text and attributes. These implementations are very general and don't perform well with edits. Therefore the problem is twofold: text storage and attribute storage. There are a few algorithms that are generally used in text editors for storing text: - [Gap Buffers](https://en.wikipedia.org/wiki/Gap_buffer) - [Piece Tables](https://en.wikipedia.org/wiki/Piece_table) In my opinion, a piece table would be the best option. A gap buffer may be a solid solution, but doesn't offer the same append-only type performance a piece table can offer. With a piece table every edit is added to the text buffer as an immutable, appended, chunk of text which completely removes the need to copy text when inserting characters. The only downside of a piece table comes around when a *lot* of edits happen as found by the [VSCode team](https://code.visualstudio.com/blogs/2018/03/23/text-buffer-reimplementation#_conclusion-and-gotchas). The other problem is attribute storage. Attributes consist of a range (`UInt`-`UInt`) and a dictionary of attributes `NSDictionary`. 
A solution for attribute storage may be to implement a tree that stores ranges as its nodes and allows for `O(log n)` lookup time for a given index. However, a tree structure will have a large space requirement, and ranges may be very small (or even a single index) when we're talking about syntax highlighting, leading to many nodes in a tree. Another option would be to somehow store a small amount of metadata about each range alongside the string itself. We already have a `ThemeAttributesProviding` protocol that can be used to convert a capture type into attributes. We could store the capture type (`sqrt(21 available captures) = ~5` needed bits, rounded up) using 1 extra byte for every highlighted range and when asked for attribute data convert the capture into attributes. Any thoughts or discussion on implementation and performance is welcome!
non_test
✨ editing performance editing a large file can take up to multiple seconds per character right now codeedittextview uses the default nstextstorage as a text store which stores text contiguously in memory this means that for each insert all characters after the insert location must be copied one spot over in memory to make room for the inserted character for the most part this isn t a problem for smaller files but opening a file like sqlite c loc the performance impact becomes very noticeable i ve attached a screen recording of typing a few characters into this file it s easy to see how long it takes the way to fix this issue is to implement a custom nstextstorage object using a data structure more suited to a text editor by default nstextstorage uses an nsmutableattributedstring to store both text and attributes these implementations are very general and don t perform well with edits therefore the problem is twofold text storage and attribute storage there are a few algorithms that are generally used in text editors for storing text in my opinion a piece table would be the best option a gap buffer may be a solid solution but doesn t offer the same append only type performance a piece table can offer with a piece table every edit is added to the text buffer as an immutable appended chunk of text which completely removes the need to copy text when inserting characters the only downside of a piece table comes around when a lot of edits happen as found by the the other problem is attribute storage attributes consist of a range uint uint and a dictionary of attributes nsdictionary a solution for attribute storage may be to implement a tree that stores ranges as it s nodes and allows for o log n lookup time for a given index however a tree structure will have a large space requirement and ranges may be very small or even a single index when we re talking about syntax highlighting leading to many nodes in a tree another option would be to somehow store a small amount of 
metadata about each range alongside the string itself we already have a themeattributesproviding protocol that can be used to convert a capture type into attributes we could store the capture type sqrt available captures needed bits rounded up using extra byte for every highlighted range and when asked for attribute data convert the capture into attributes any thoughts or discussion on implementation and performance is welcome
0
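The piece-table approach discussed in the CodeEditTextView record above can be sketched minimally. This is an illustrative toy under simplifying assumptions (string buffers, linear piece scan), not the project's actual implementation; all names are hypothetical. It shows the key property claimed in the issue: inserts only ever append to the add buffer, so no existing text is copied.

```python
# Toy piece table: text lives in two buffers, and the document is a
# sequence of (buffer, start, length) "pieces" referencing them.
class PieceTable:
    def __init__(self, original: str):
        self.original = original  # read-only original buffer, never mutated
        self.add = ""             # append-only add buffer
        self.pieces = [("original", 0, len(original))] if original else []

    def insert(self, index: int, text: str) -> None:
        start = len(self.add)
        self.add += text          # new text is only appended, never inserted mid-buffer
        new_piece = ("add", start, len(text))
        offset = 0
        for i, (buf, s, length) in enumerate(self.pieces):
            if offset + length >= index:
                # Split the piece containing `index` and splice the new piece in.
                split = index - offset
                left = (buf, s, split)
                right = (buf, s + split, length - split)
                self.pieces[i:i + 1] = [p for p in (left, new_piece, right) if p[2] > 0]
                return
            offset += length
        self.pieces.append(new_piece)  # insert past the end

    def text(self) -> str:
        # Materialize the document by walking the pieces in order.
        out = []
        for buf, s, length in self.pieces:
            src = self.original if buf == "original" else self.add
            out.append(src[s:s + length])
        return "".join(out)
```

A gap buffer would instead keep one contiguous buffer with a movable gap, trading the piece bookkeeping for occasional gap moves; the piece table's trade-off, as the VSCode post cited above notes, is that very long edit histories accumulate many pieces.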
231,777
25,547,366,993
IssuesEvent
2022-11-29 20:03:45
MatBenfield/news
https://api.github.com/repos/MatBenfield/news
closed
[SecurityWeek] Virginia County Confirms Personal Information Stolen in Ransomware Attack
SecurityWeek Stale
**Southampton County in Virginia last week started informing individuals that their personal information might have been compromised in a ransomware attack.** The incident was identified in September, when a threat actor accessed a server at Southampton and encrypted the data that was stored on it. [read more](https://www.securityweek.com/virginia-county-confirms-personal-information-stolen-ransomware-attack) <https://www.securityweek.com/virginia-county-confirms-personal-information-stolen-ransomware-attack>
True
[SecurityWeek] Virginia County Confirms Personal Information Stolen in Ransomware Attack - **Southampton County in Virginia last week started informing individuals that their personal information might have been compromised in a ransomware attack.** The incident was identified in September, when a threat actor accessed a server at Southampton and encrypted the data that was stored on it. [read more](https://www.securityweek.com/virginia-county-confirms-personal-information-stolen-ransomware-attack) <https://www.securityweek.com/virginia-county-confirms-personal-information-stolen-ransomware-attack>
non_test
virginia county confirms personal information stolen in ransomware attack southampton county in virginia last week started informing individuals that their personal information might have been compromised in a ransomware attack the incident was identified in september when a threat actor accessed a server at southampton and encrypted the data that was stored on it
0
4,896
6,893,943,626
IssuesEvent
2017-11-23 07:49:36
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
Kube-aggregator can't access /swagger.json on some aggregated API servers
kind/bug sig/api-machinery sig/auth sig/service-catalog
The Service Catalog API server looks at the existing RBAC rules (in the core k8s API server) to decide whether a client can access a nonResourceURL, such as `/swagger.json`. As I understand it, other API servers will use the same technique. The problem is that there is no `nonResourceURLs: ["/swagger.json"]` in the `system:kube-aggregator` ClusterRole. Additionally, there's no ClusterRoleBinding for the kube-aggregator in the default bootstrap policy. This prevents the aggregator from getting the OpenAPI schema from the aggregated API server. **Is this a BUG REPORT or FEATURE REQUEST?**: /kind bug **What happened**: The aggregator log shows the following: ``` Nov 22 15:09:10 minikube localkube[3220]: E1122 15:09:10.691211 3220 controller.go:111] loading OpenAPI spec for "v1beta1.servicecatalog.k8s.io" failed with: failed to retrieve openAPI spec, http error: ResponseCode: 403, Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:aggregator\" cannot get path \"/swagger.json\"","reason":"Forbidden","details":{},"code":403} ``` **What you expected to happen**: The aggregator should be able to access the OpenAPI schema of the underlying API servers. **How to reproduce it (as minimally and precisely as possible)**: 1. Deploy Service Catalog API server patched with https://github.com/kubernetes-incubator/service-catalog/pull/1208 2. Examine the log of the aggregator **Anything else we need to know?**: I can fix this myself; I'd just like to confirm that adding the nonResourceURL and the missing ClusterRoleBinding for the aggregator is OK or not. 
**Environment**: - Kubernetes version (use `kubectl version`): Client Version: version.Info{Major:"1", Minor:"8+", GitVersion:"v1.8.0-alpha.0.8556+d1e067be1f3065-dirty", GitCommit:"d1e067be1f30654867769ae9dd9aaa2e30513c21", GitTreeState:"dirty", BuildDate:"2017-11-15T21:38:13Z", GoVersion:"go1.9.2", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"8", GitVersion:"v1.8.0", GitCommit:"0b9efaeb34a2fc51ff8e4d34ad9bc6375459c4a4", GitTreeState:"dirty", BuildDate:"2017-10-17T15:09:55Z", GoVersion:"go1.8.3", Compiler:"gc", Platform:"linux/amd64"} - Cloud provider or hardware configuration: minikube version: v0.23.0
1.0
Kube-aggregator can't access /swagger.json on some aggregated API servers - The Service Catalog API server looks at the existing RBAC rules (in the core k8s API server) to decide whether a client can access a nonResourceURL, such as `/swagger.json`. As I understand it, other API servers will use the same technique. The problem is that there is no `nonResourceURLs: ["/swagger.json"]` in the `system:kube-aggregator` ClusterRole. Additionally, there's no ClusterRoleBinding for the kube-aggregator in the default bootstrap policy. This prevents the aggregator from getting the OpenAPI schema from the aggregated API server. **Is this a BUG REPORT or FEATURE REQUEST?**: /kind bug **What happened**: The aggregator log shows the following: ``` Nov 22 15:09:10 minikube localkube[3220]: E1122 15:09:10.691211 3220 controller.go:111] loading OpenAPI spec for "v1beta1.servicecatalog.k8s.io" failed with: failed to retrieve openAPI spec, http error: ResponseCode: 403, Body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:aggregator\" cannot get path \"/swagger.json\"","reason":"Forbidden","details":{},"code":403} ``` **What you expected to happen**: The aggregator should be able to access the OpenAPI schema of the underlying API servers. **How to reproduce it (as minimally and precisely as possible)**: 1. Deploy Service Catalog API server patched with https://github.com/kubernetes-incubator/service-catalog/pull/1208 2. Examine the log of the aggregator **Anything else we need to know?**: I can fix this myself; I'd just like to confirm that adding the nonResourceURL and the missing ClusterRoleBinding for the aggregator is OK or not. 
**Environment**: - Kubernetes version (use `kubectl version`): Client Version: version.Info{Major:"1", Minor:"8+", GitVersion:"v1.8.0-alpha.0.8556+d1e067be1f3065-dirty", GitCommit:"d1e067be1f30654867769ae9dd9aaa2e30513c21", GitTreeState:"dirty", BuildDate:"2017-11-15T21:38:13Z", GoVersion:"go1.9.2", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"8", GitVersion:"v1.8.0", GitCommit:"0b9efaeb34a2fc51ff8e4d34ad9bc6375459c4a4", GitTreeState:"dirty", BuildDate:"2017-10-17T15:09:55Z", GoVersion:"go1.8.3", Compiler:"gc", Platform:"linux/amd64"} - Cloud provider or hardware configuration: minikube version: v0.23.0
non_test
kube aggregator can t access swagger json on some aggregated api servers the service catalog api server looks at the existing rbac rules in the core api server to decide whether a client can access a nonresourceurl such as swagger json as i understand it other api servers will use the same technique the problem is that there is no nonresourceurls in the system kube aggregator clusterrole additionally there s no clusterrolebinding for the kube aggregator in the default bootstrap policy this prevents the aggregator from getting the openapi schema from the aggregated api server is this a bug report or feature request kind bug what happened the aggregator log shows the following nov minikube localkube controller go loading openapi spec for servicecatalog io failed with failed to retrieve openapi spec http error responsecode body kind status apiversion metadata status failure message forbidden user system aggregator cannot get path swagger json reason forbidden details code what you expected to happen the aggregator should be able to access the openapi schema of the underlying api servers how to reproduce it as minimally and precisely as possible deploy service catalog api server patched with examine the log of the aggregator anything else we need to know i can fix this myself i d just like to confirm that adding the nonresourceurl and the missing clusterrolebinding for the aggregator is ok or not environment kubernetes version use kubectl version client version version info major minor gitversion alpha dirty gitcommit gittreestate dirty builddate goversion compiler gc platform linux server version version info major minor gitversion gitcommit gittreestate dirty builddate goversion compiler gc platform linux cloud provider or hardware configuration minikube version
0
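The fix the Kubernetes record above asks to confirm — a `nonResourceURLs: ["/swagger.json"]` rule plus the missing binding for the aggregator — could look roughly like the following. This is a hedged sketch of the shape of the bootstrap policy change, not the exact manifests that were merged; names follow the identities quoted in the issue (`system:kube-aggregator`, `system:aggregator`).

```yaml
# Illustrative sketch only: grant the aggregator read access to /swagger.json
# on aggregated API servers, and bind the role to the aggregator identity.
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRole
metadata:
  name: system:kube-aggregator
rules:
- nonResourceURLs: ["/swagger.json"]
  verbs: ["get"]
---
apiVersion: rbac.authorization.k8s.io/v1
kind: ClusterRoleBinding
metadata:
  name: system:kube-aggregator
roleRef:
  apiGroup: rbac.authorization.k8s.io
  kind: ClusterRole
  name: system:kube-aggregator
subjects:
- apiGroup: rbac.authorization.k8s.io
  kind: User
  name: system:aggregator
```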
351,075
31,933,969,660
IssuesEvent
2023-09-19 09:15:35
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
opened
Fix array.test_jax_array_cumprod
JAX Frontend Sub Task Failing Test
| | | |---|---| |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6220351767"><img src=https://img.shields.io/badge/-failure-red></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a>
1.0
Fix array.test_jax_array_cumprod - | | | |---|---| |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6220351767"><img src=https://img.shields.io/badge/-failure-red></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a> |paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6201602967/job/16838607136"><img src=https://img.shields.io/badge/-success-success></a>
test
fix array test jax array cumprod numpy a href src jax a href src tensorflow a href src torch a href src paddle a href src
1
38,293
5,172,711,620
IssuesEvent
2017-01-18 14:21:00
reactor/reactor-addons
https://api.github.com/repos/reactor/reactor-addons
opened
Add StepVerifier expectations for Hooks
question reactor-test
This could set up the relevant hooks before verify, then correctly always reset them at the end. MVP probably `expectErrorDropped(Throwable...)` and `expectValueDropped(Object...)`, have the hook capture to a collection (as several drops can happen in a row, at least for values).
1.0
Add StepVerifier expectations for Hooks - This could set up the relevant hooks before verify, then correctly always reset them at the end. MVP probably `expectErrorDropped(Throwable...)` and `expectValueDropped(Object...)`, have the hook capture to a collection (as several drops can happen in a row, at least for values).
test
add stepverifier expectations for hooks this could set up the relevant hooks before verify then correctly always reset them at the end mvp probably expecterrordropped throwable and expectvaluedropped object have the hook capture to a collection as several drops can happen in a row at least for values
1
129,899
18,148,879,507
IssuesEvent
2021-09-26 00:05:05
ghc-dev/James-Fernandez
https://api.github.com/repos/ghc-dev/James-Fernandez
opened
CVE-2020-14365 (High) detected in ansible-2.9.9.tar.gz
security vulnerability
## CVE-2020-14365 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansible-2.9.9.tar.gz</b></p></summary> <p>Radically simple IT automation</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz">https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz</a></p> <p>Path to dependency file: James-Fernandez/requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **ansible-2.9.9.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/James-Fernandez/commit/58380ee1a5ee6575d496c120f9cfadade50fbc1a">58380ee1a5ee6575d496c120f9cfadade50fbc1a</a></p> <p>Found in base branch: <b>feature_branch</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in the Ansible Engine, in ansible-engine 2.8.x before 2.8.15 and ansible-engine 2.9.x before 2.9.13, when installing packages using the dnf module. GPG signatures are ignored during installation even when disable_gpg_check is set to False, which is the default behavior. This flaw leads to malicious packages being installed on the system and arbitrary code executed via package installation scripts. The highest threat from this vulnerability is to integrity and system availability. 
<p>Publish Date: 2020-09-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14365>CVE-2020-14365</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1869154">https://bugzilla.redhat.com/show_bug.cgi?id=1869154</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: 2.8.15,2.9.13</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"ansible","packageVersion":"2.9.9","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"ansible:2.9.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.15,2.9.13"}],"baseBranches":["feature_branch"],"vulnerabilityIdentifier":"CVE-2020-14365","vulnerabilityDetails":"A flaw was found in the Ansible Engine, in ansible-engine 2.8.x before 2.8.15 and ansible-engine 2.9.x before 2.9.13, when installing packages using the dnf module. GPG signatures are ignored during installation even when disable_gpg_check is set to False, which is the default behavior. 
This flaw leads to malicious packages being installed on the system and arbitrary code executed via package installation scripts. The highest threat from this vulnerability is to integrity and system availability.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14365","cvss3Severity":"high","cvss3Score":"7.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2020-14365 (High) detected in ansible-2.9.9.tar.gz - ## CVE-2020-14365 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansible-2.9.9.tar.gz</b></p></summary> <p>Radically simple IT automation</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz">https://files.pythonhosted.org/packages/00/5d/e10b83e0e6056dbd5b4809b451a191395175a57e3175ce04e35d9c5fc2a0/ansible-2.9.9.tar.gz</a></p> <p>Path to dependency file: James-Fernandez/requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **ansible-2.9.9.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/ghc-dev/James-Fernandez/commit/58380ee1a5ee6575d496c120f9cfadade50fbc1a">58380ee1a5ee6575d496c120f9cfadade50fbc1a</a></p> <p>Found in base branch: <b>feature_branch</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in the Ansible Engine, in ansible-engine 2.8.x before 2.8.15 and ansible-engine 2.9.x before 2.9.13, when installing packages using the dnf module. GPG signatures are ignored during installation even when disable_gpg_check is set to False, which is the default behavior. This flaw leads to malicious packages being installed on the system and arbitrary code executed via package installation scripts. The highest threat from this vulnerability is to integrity and system availability. 
<p>Publish Date: 2020-09-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14365>CVE-2020-14365</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1869154">https://bugzilla.redhat.com/show_bug.cgi?id=1869154</a></p> <p>Release Date: 2020-07-21</p> <p>Fix Resolution: 2.8.15,2.9.13</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"ansible","packageVersion":"2.9.9","packageFilePaths":["/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"ansible:2.9.9","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.15,2.9.13"}],"baseBranches":["feature_branch"],"vulnerabilityIdentifier":"CVE-2020-14365","vulnerabilityDetails":"A flaw was found in the Ansible Engine, in ansible-engine 2.8.x before 2.8.15 and ansible-engine 2.9.x before 2.9.13, when installing packages using the dnf module. GPG signatures are ignored during installation even when disable_gpg_check is set to False, which is the default behavior. 
This flaw leads to malicious packages being installed on the system and arbitrary code executed via package installation scripts. The highest threat from this vulnerability is to integrity and system availability.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14365","cvss3Severity":"high","cvss3Score":"7.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"Low","S":"Unchanged","C":"None","UI":"None","AV":"Local","I":"High"},"extraData":{}}</REMEDIATE> -->
non_test
cve high detected in ansible tar gz cve high severity vulnerability vulnerable library ansible tar gz radically simple it automation library home page a href path to dependency file james fernandez requirements txt path to vulnerable library requirements txt dependency hierarchy x ansible tar gz vulnerable library found in head commit a href found in base branch feature branch vulnerability details a flaw was found in the ansible engine in ansible engine x before and ansible engine x before when installing packages using the dnf module gpg signatures are ignored during installation even when disable gpg check is set to false which is the default behavior this flaw leads to malicious packages being installed on the system and arbitrary code executed via package installation scripts the highest threat from this vulnerability is to integrity and system availability publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree ansible isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails a flaw was found in the ansible engine in ansible engine x before and ansible engine x before when installing packages using the dnf module gpg signatures are ignored during installation even when disable gpg check is set to false which is the default behavior this flaw leads to malicious packages being installed on the system and arbitrary code executed via package installation scripts the highest threat 
from this vulnerability is to integrity and system availability vulnerabilityurl
0
238,163
19,701,826,746
IssuesEvent
2022-01-12 17:21:41
Kittentainment/Chuegelibahn
https://api.github.com/repos/Kittentainment/Chuegelibahn
closed
Print piece with right trigger
enhancement track printing to test
Additionally cut it off with the right trigger, so the printer does not need to be held.
1.0
Print piece with right trigger - Additionally cut it off with the right trigger, so the printer does not need to be held.
test
print piece with right trigger additionally cut it off with the right trigger so the printer does not need to be held
1
117,357
9,931,867,752
IssuesEvent
2019-07-02 08:32:42
knative/serving
https://api.github.com/repos/knative/serving
closed
Add e2e service throughput test
P1 area/test-and-release kind/feature
<!-- Pro-tip: You can leave this block commented, and it still works! Select the appropriate areas for your issue: /area test-and-release Classify what kind of issue this is: /kind feature --> ## Expected Behavior There is an end-to-end verification/guarantee that Serving can handle N parallel requests in scale from 0 scenario. ## Actual Behavior There is no such verification/guarantee. ## Additional Info https://github.com/knative/serving/issues/2988 has some background - 503s were returned when the number of concurrent requests was ramped up to 1K. The test would be useful to make sure no ingress config parameter change introduces a regression. I know that the tests might not provide a realistic throughput picture for a real production scenario when several services are competing for the resources, but it might be at least the starting point to drill into the throughput numbers of a single service.
1.0
Add e2e service throughput test - <!-- Pro-tip: You can leave this block commented, and it still works! Select the appropriate areas for your issue: /area test-and-release Classify what kind of issue this is: /kind feature --> ## Expected Behavior There is an end-to-end verification/guarantee that Serving can handle N parallel requests in scale from 0 scenario. ## Actual Behavior There is no such verification/guarantee. ## Additional Info https://github.com/knative/serving/issues/2988 has some background - 503s were returned when the number of concurrent requests was ramped up to 1K. The test would be useful to make sure no ingress config parameter change introduces a regression. I know that the tests might not provide a realistic throughput picture for a real production scenario when several services are competing for the resources, but it might be at least the starting point to drill into the throughput numbers of a single service.
test
add service throughput test pro tip you can leave this block commented and it still works select the appropriate areas for your issue area test and release classify what kind of issue this is kind feature expected behavior there is an end to end verification guarantee that serving can handle n parallel requests in scale from scenario actual behavior there is no such verification guarantee additional info has some background were returned when the number of concurrent requests was ramped up to the test would be useful to make sure no ingress config parameter change introduces a regression i know that the tests might not provide a realistic throughput picture for a real production scenario when several services are competing for the resources but it might be at least the starting point to drill into the throughput numbers of a single service
1
165,286
12,836,124,466
IssuesEvent
2020-07-07 13:55:19
prestosql/presto
https://api.github.com/repos/prestosql/presto
closed
Use apply-site-xml-override for kerberos test images configuration
maintenance test
(Creating issue here instead of prestosql/docker-images for visibility.) We could use https://github.com/prestosql/docker-images/blob/deb7329cd6b33df561ab70fd39d5ef4f292c1787/prestodev/centos7-oj8/files/usr/local/bin/apply-site-xml-override Instead of overriding `prestodev/hdp3.1-hive/files/etc/hadoop/conf/hdfs-site.xml` with `prestodev/hdp3.1-hive-kerberized/files/etc/hadoop/conf/hdfs-site.xml`.
1.0
Use apply-site-xml-override for kerberos test images configuration - (Creating issue here instead of prestosql/docker-images for visibility.) We could use https://github.com/prestosql/docker-images/blob/deb7329cd6b33df561ab70fd39d5ef4f292c1787/prestodev/centos7-oj8/files/usr/local/bin/apply-site-xml-override Instead of overriding `prestodev/hdp3.1-hive/files/etc/hadoop/conf/hdfs-site.xml` with `prestodev/hdp3.1-hive-kerberized/files/etc/hadoop/conf/hdfs-site.xml`.
test
use apply site xml override for kerberos test images configuration creating issue here instead of prestosql docker images for visibility we could use instead of overriding prestodev hive files etc hadoop conf hdfs site xml with prestodev hive kerberized files etc hadoop conf hdfs site xml
1
282,700
24,491,892,857
IssuesEvent
2022-10-10 03:34:09
MPMG-DCC-UFMG/F01
https://api.github.com/repos/MPMG-DCC-UFMG/F01
closed
Teste de generalizacao para a tag Informações institucionais - Unidades administrativas - Buenópolis
generalization test development template - Memory (66) tag - Informações Institucionais subtag - Unidades Administrativas
DoD: Realizar o teste de Generalização do validador da tag Informações institucionais - Unidades administrativas para o Município de Buenópolis.
1.0
Teste de generalizacao para a tag Informações institucionais - Unidades administrativas - Buenópolis - DoD: Realizar o teste de Generalização do validador da tag Informações institucionais - Unidades administrativas para o Município de Buenópolis.
test
teste de generalizacao para a tag informações institucionais unidades administrativas buenópolis dod realizar o teste de generalização do validador da tag informações institucionais unidades administrativas para o município de buenópolis
1
159,814
25,050,209,956
IssuesEvent
2022-11-05 19:47:01
apache/lucenenet
https://api.github.com/repos/apache/lucenenet
closed
Making Lucene.Net.Join.TermsQuery public
is:wontfix design is:question
Would it be possible to make [TermsQuery](https://github.com/apache/lucenenet/blob/master/src/Lucene.Net.Join/TermsQuery.cs) public? It seems a lot more performant than a BooleanQuery or even a TermsFilter so I think it would be quite useful if exposed publicly. (I believe it's public or at least [publicly documented ](https://lucene.apache.org/core/5_2_0/queries/org/apache/lucene/queries/TermsQuery.html)in 5.0 onwards).
1.0
Making Lucene.Net.Join.TermsQuery public - Would it be possible to make [TermsQuery](https://github.com/apache/lucenenet/blob/master/src/Lucene.Net.Join/TermsQuery.cs) public? It seems a lot more performant than a BooleanQuery or even a TermsFilter so I think it would be quite useful if exposed publicly. (I believe it's public or at least [publicly documented ](https://lucene.apache.org/core/5_2_0/queries/org/apache/lucene/queries/TermsQuery.html)in 5.0 onwards).
non_test
making lucene net join termsquery public would it be possible to make public it seems a lot more performant than a booleanquery or even a termsfilter so i think it would be quite useful if exposed publicly i believe it s public or at least onwards
0
98,704
8,684,929,671
IssuesEvent
2018-12-03 05:16:22
istio/istio
https://api.github.com/repos/istio/istio
closed
istio.io/istio/pilot/pkg/proxy/envoy/v2 TestAdsClusterUpdate is flaky
community/help wanted kind/test failure
Example of flake: https://k8s-gubernator.appspot.com/build/istio-prow/pr-logs/pull/istio_istio/9965/istio-unit-tests/16725#istioioistiopilotpkgproxyenvoyv2-testadsclusterupdate The owner is discovered using git blame. /assign @Nino-K
1.0
istio.io/istio/pilot/pkg/proxy/envoy/v2 TestAdsClusterUpdate is flaky - Example of flake: https://k8s-gubernator.appspot.com/build/istio-prow/pr-logs/pull/istio_istio/9965/istio-unit-tests/16725#istioioistiopilotpkgproxyenvoyv2-testadsclusterupdate The owner is discovered using git blame. /assign @Nino-K
test
istio io istio pilot pkg proxy envoy testadsclusterupdate is flaky example of flake the owner is discovered using git blame assign nino k
1
31,758
11,996,806,850
IssuesEvent
2020-04-08 17:24:22
wrbejar/JavaVulnerableLab
https://api.github.com/repos/wrbejar/JavaVulnerableLab
closed
CVE-2015-0254 (High) detected in jstl-1.2.jar
security vulnerability
## CVE-2015-0254 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jstl-1.2.jar</b></p></summary> <p>null</p> <p>Path to dependency file: /tmp/ws-scm/JavaVulnerableLab/pom.xml</p> <p>Path to vulnerable library: 20200408151214/downloadResource_b11fb59d-151c-4c61-8c23-583aa9a11de8/20200408154427/jstl-1.2.jar</p> <p> Dependency Hierarchy: - :x: **jstl-1.2.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/wrbejar/JavaVulnerableLab/commit/6a0fbfc49c97448a1babf624460dddf76995171e">6a0fbfc49c97448a1babf624460dddf76995171e</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache Standard Taglibs before 1.2.3 allows remote attackers to execute arbitrary code or conduct external XML entity (XXE) attacks via a crafted XSLT extension in a (1) <x:parse> or (2) <x:transform> JSTL XML tag. 
<p>Publish Date: 2015-03-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-0254>CVE-2015-0254</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-0254">https://nvd.nist.gov/vuln/detail/CVE-2015-0254</a></p> <p>Release Date: 2015-03-09</p> <p>Fix Resolution: 1.2.3</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"javax.servlet","packageName":"jstl","packageVersion":"1.2","isTransitiveDependency":false,"dependencyTree":"javax.servlet:jstl:1.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.2.3"}],"vulnerabilityIdentifier":"CVE-2015-0254","vulnerabilityDetails":"Apache Standard Taglibs before 1.2.3 allows remote attackers to execute arbitrary code or conduct external XML entity (XXE) attacks via a crafted XSLT extension in a (1) \u003cx:parse\u003e or (2) \u003cx:transform\u003e JSTL XML tag.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-0254","cvss2Severity":"high","cvss2Score":"7.5","extraData":{}}</REMEDIATE> -->
True
CVE-2015-0254 (High) detected in jstl-1.2.jar - ## CVE-2015-0254 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jstl-1.2.jar</b></p></summary> <p>null</p> <p>Path to dependency file: /tmp/ws-scm/JavaVulnerableLab/pom.xml</p> <p>Path to vulnerable library: 20200408151214/downloadResource_b11fb59d-151c-4c61-8c23-583aa9a11de8/20200408154427/jstl-1.2.jar</p> <p> Dependency Hierarchy: - :x: **jstl-1.2.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/wrbejar/JavaVulnerableLab/commit/6a0fbfc49c97448a1babf624460dddf76995171e">6a0fbfc49c97448a1babf624460dddf76995171e</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache Standard Taglibs before 1.2.3 allows remote attackers to execute arbitrary code or conduct external XML entity (XXE) attacks via a crafted XSLT extension in a (1) <x:parse> or (2) <x:transform> JSTL XML tag. 
<p>Publish Date: 2015-03-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-0254>CVE-2015-0254</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-0254">https://nvd.nist.gov/vuln/detail/CVE-2015-0254</a></p> <p>Release Date: 2015-03-09</p> <p>Fix Resolution: 1.2.3</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"javax.servlet","packageName":"jstl","packageVersion":"1.2","isTransitiveDependency":false,"dependencyTree":"javax.servlet:jstl:1.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.2.3"}],"vulnerabilityIdentifier":"CVE-2015-0254","vulnerabilityDetails":"Apache Standard Taglibs before 1.2.3 allows remote attackers to execute arbitrary code or conduct external XML entity (XXE) attacks via a crafted XSLT extension in a (1) \u003cx:parse\u003e or (2) \u003cx:transform\u003e JSTL XML tag.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-0254","cvss2Severity":"high","cvss2Score":"7.5","extraData":{}}</REMEDIATE> -->
non_test
cve high detected in jstl jar cve high severity vulnerability vulnerable library jstl jar null path to dependency file tmp ws scm javavulnerablelab pom xml path to vulnerable library downloadresource jstl jar dependency hierarchy x jstl jar vulnerable library found in head commit a href vulnerability details apache standard taglibs before allows remote attackers to execute arbitrary code or conduct external xml entity xxe attacks via a crafted xslt extension in a or jstl xml tag publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails apache standard taglibs before allows remote attackers to execute arbitrary code or conduct external xml entity xxe attacks via a crafted xslt extension in a parse or transform jstl xml tag vulnerabilityurl
0
430,982
12,468,265,820
IssuesEvent
2020-05-28 18:32:38
zephyrproject-rtos/zephyr
https://api.github.com/repos/zephyrproject-rtos/zephyr
opened
[Coverity CID :210564] Uninitialized scalar variable in tests/lib/cmsis_dsp/distance/src/u32.c
Coverity bug priority: low
Static code scan issues found in file: https://github.com/zephyrproject-rtos/zephyr/tree/4653b4e63f886a50ac7b72f8d47ba2950ab2dd0d/tests/lib/cmsis_dsp/distance/src/u32.c Category: Uninitialized variables Function: `test_arm_distance` Component: Tests CID: [210564](https://scan9.coverity.com/reports.htm#v29726/p12996/mergedDefectId=210564) Please fix or provide comments in coverity using the link: https://scan9.coverity.com/reports.htm#v32951/p12996. Note: This issue was created automatically. Priority was set based on classification of the file affected and the impact field in coverity. Assignees were set using the CODEOWNERS file.
1.0
[Coverity CID :210564] Uninitialized scalar variable in tests/lib/cmsis_dsp/distance/src/u32.c - Static code scan issues found in file: https://github.com/zephyrproject-rtos/zephyr/tree/4653b4e63f886a50ac7b72f8d47ba2950ab2dd0d/tests/lib/cmsis_dsp/distance/src/u32.c Category: Uninitialized variables Function: `test_arm_distance` Component: Tests CID: [210564](https://scan9.coverity.com/reports.htm#v29726/p12996/mergedDefectId=210564) Please fix or provide comments in coverity using the link: https://scan9.coverity.com/reports.htm#v32951/p12996. Note: This issue was created automatically. Priority was set based on classification of the file affected and the impact field in coverity. Assignees were set using the CODEOWNERS file.
non_test
uninitialized scalar variable in tests lib cmsis dsp distance src c static code scan issues found in file category uninitialized variables function test arm distance component tests cid please fix or provide comments in coverity using the link note this issue was created automatically priority was set based on classification of the file affected and the impact field in coverity assignees were set using the codeowners file
0
328,033
28,099,152,651
IssuesEvent
2023-03-30 18:02:57
wazuh/wazuh-qa
https://api.github.com/repos/wazuh/wazuh-qa
opened
Research about agent starts with an invalid fim configuration
level/task type/test
# Description The objective of this issue is to investigate and design a testing plan for the development issue: wazuh/wazuh#16268 # Planning stage - [ ] [Research the applied change](). - [ ] [Design of test scenarios and use cases](). - [ ] [Evaluate the current state of testing](8). - [ ] [Identify and justify the test "tier"]().
1.0
Research about agent starts with an invalid fim configuration - # Description The objective of this issue is to investigate and design a testing plan for the development issue: wazuh/wazuh#16268 # Planning stage - [ ] [Research the applied change](). - [ ] [Design of test scenarios and use cases](). - [ ] [Evaluate the current state of testing](8). - [ ] [Identify and justify the test "tier"]().
test
research about agent starts with an invalid fim configuration description the objective of this issue is to investigate and design a testing plan for the development issue wazuh wazuh planning stage
1
344,824
30,763,836,743
IssuesEvent
2023-07-30 03:07:27
neovim/neovim
https://api.github.com/repos/neovim/neovim
closed
test failure: Test_BufLeave_Wipe()
bug test
Related? #7869 66f5e5c7d7ce762db71886b72e759a745c2b41ce incorporated some Vim tests rewritten in "new style". `Test_BufLeave_Wipe()` provokes ASAN issue, but in order to move forward I marked it "pending". Looks related to GC and `tv_clear()`: global `g:` hashtable still has a reference to a freed buffer. (cc @ZyX-I , maybe you can see the issue quickly.) Steps to reproduce: ``` CC=clang make CMAKE_BUILD_TYPE=Debug CMAKE_EXTRA_FLAGS="-DCLANG_ASAN_UBSAN=ON" export ASAN_SYMBOLIZER_PATH=/usr/lib/llvm-5.0/bin/llvm-symbolizer TEST_FILE=test_autocmd.res make oldtest ``` ``` ==3668==ERROR: AddressSanitizer: heap-use-after-free on address 0x6260001411c8 at pc 0x000000a38a47 bp 0x7fff9982ee50 sp 0x7fff9982ee48 READ of size 4 at 0x6260001411c8 thread T0 0 0xa38a46 in _typval_encode_nothing_convert_one_value /home/vagrant/neovim/build/../src/nvim/eval/typval_encode.c.h:320:15 1 0xa340d5 in encode_vim_to_nothing /home/vagrant/neovim/build/../src/nvim/eval/typval_encode.c.h:830:9 2 0xa0ad63 in tv_clear /home/vagrant/neovim/build/../src/nvim/eval/typval.c:2189:25 3 0x85d584 in vars_clear_ext /home/vagrant/neovim/build/../src/nvim/eval.c:18894:9 4 0x7efac9 in vars_clear /home/vagrant/neovim/build/../src/nvim/eval.c:18871:3 5 0x7ef471 in eval_clear /home/vagrant/neovim/build/../src/nvim/eval.c:638:3 6 0xf80a7c in free_all_mem /home/vagrant/neovim/build/../src/nvim/memory.c:676:3 7 0x12789dc in mch_exit /home/vagrant/neovim/build/../src/nvim/os_unix.c:152:3 8 0xe85039 in getout /home/vagrant/neovim/build/../src/nvim/main.c:671:3 9 0xbb4496 in ex_quit_all /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:6051:5 10 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 11 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 12 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 13 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 14 0x825778 in get_func_tv 
/home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 15 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 16 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 17 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 18 0xb17fdd in do_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2973:3 19 0xb14630 in cmd_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2718:14 20 0xb14727 in ex_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2699:3 21 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 22 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 23 0xb34015 in do_cmdline_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:280:10 24 0xe829d3 in exe_commands /home/vagrant/neovim/build/../src/nvim/main.c:1702:5 25 0xe708bd in main /home/vagrant/neovim/build/../src/nvim/main.c:524:5 26 0x7f0cd5b7e82f in __libc_start_main /build/glibc-Cl5G7W/glibc-2.23/csu/../csu/libc-start.c:291 27 0x44d628 in _start (/home/vagrant/neovim/build/bin/nvim+0x44d628) 0x6260001411c8 is located 200 bytes inside of 10256-byte region [0x626000141100,0x626000143910) freed by thread T0 here: 0 0x50df60 in __interceptor_cfree.localalias.0 (/home/vagrant/neovim/build/bin/nvim+0x50df60) 1 0xf7dbe4 in xfree /home/vagrant/neovim/build/../src/nvim/memory.c:133:3 2 0x676577 in free_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:749:5 3 0x66e15b in close_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:590:5 4 0x67a835 in do_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:1216:9 5 0x681a1d in do_bufdel /home/vagrant/neovim/build/../src/nvim/buffer.c:945:16 6 0xb9cd72 in ex_bunload /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:4585:17 7 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 8 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 9 0x845302 in 
call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 10 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 11 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 12 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 13 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 14 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 15 0x8646b1 in ex_execute /home/vagrant/neovim/build/../src/nvim/eval.c:19478:7 16 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 17 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 18 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 19 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 20 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 21 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 22 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 23 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 24 0xb17fdd in do_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2973:3 25 0xb14630 in cmd_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2718:14 26 0xb14727 in ex_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2699:3 27 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 28 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 29 0xb34015 in do_cmdline_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:280:10 previously allocated by thread T0 here: 0 0x50e340 in calloc (/home/vagrant/neovim/build/bin/nvim+0x50e340) 1 0xf7dc97 in xcalloc /home/vagrant/neovim/build/../src/nvim/memory.c:147:15 2 0x67eb65 in buflist_new /home/vagrant/neovim/build/../src/nvim/buffer.c:1641:11 3 0xa8fdcc in do_ecmd 
/home/vagrant/neovim/build/../src/nvim/ex_cmds.c:2221:13 4 0x683b21 in empty_curbuf /home/vagrant/neovim/build/../src/nvim/buffer.c:1031:12 5 0x67a098 in do_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:1196:14 6 0x68173f in do_bufdel /home/vagrant/neovim/build/../src/nvim/buffer.c:927:11 7 0xb9cd72 in ex_bunload /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:4585:17 8 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 9 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 10 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 11 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 12 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 13 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 14 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 15 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 16 0x8646b1 in ex_execute /home/vagrant/neovim/build/../src/nvim/eval.c:19478:7 17 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 18 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 19 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 20 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 21 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 22 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 23 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 24 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 25 0xb17fdd in do_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2973:3 26 0xb14630 in cmd_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2718:14 27 0xb14727 in ex_source 
/home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2699:3 28 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 29 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 SUMMARY: AddressSanitizer: heap-use-after-free /home/vagrant/neovim/build/../src/nvim/eval/typval_encode.c.h:320:15 in _typval_encode_nothing_convert_one_va lue Shadow bytes around the buggy address: 0x0c4c800201e0: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c800201f0: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c80020200: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c80020210: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c80020220: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd =>0x0c4c80020230: fd fd fd fd fd fd fd fd fd[fd]fd fd fd fd fd fd 0x0c4c80020240: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020250: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020260: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020270: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020280: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd Shadow byte legend (one shadow byte represents 8 application bytes): Addressable: 00 Partially addressable: 01 02 03 04 05 06 07 Heap left redzone: fa Freed heap region: fd Stack left redzone: f1 Stack mid redzone: f2 Stack right redzone: f3 Stack after return: f5 Stack use after scope: f8 Global redzone: f9 Global init order: f6 Poisoned by user: f7 Container overflow: fc Array cookie: ac Intra object redzone: bb ASan internal: fe Left alloca redzone: ca Right alloca redzone: cb ==3668==ABORTING ```
1.0
test failure: Test_BufLeave_Wipe() - Related? #7869 66f5e5c7d7ce762db71886b72e759a745c2b41ce incorporated some Vim tests rewritten in "new style". `Test_BufLeave_Wipe()` provokes ASAN issue, but in order to move forward I marked it "pending". Looks related to GC and `tv_clear()`: global `g:` hashtable still has a reference to a freed buffer. (cc @ZyX-I , maybe you can see the issue quickly.) Steps to reproduce: ``` CC=clang make CMAKE_BUILD_TYPE=Debug CMAKE_EXTRA_FLAGS="-DCLANG_ASAN_UBSAN=ON" export ASAN_SYMBOLIZER_PATH=/usr/lib/llvm-5.0/bin/llvm-symbolizer TEST_FILE=test_autocmd.res make oldtest ``` ``` ==3668==ERROR: AddressSanitizer: heap-use-after-free on address 0x6260001411c8 at pc 0x000000a38a47 bp 0x7fff9982ee50 sp 0x7fff9982ee48 READ of size 4 at 0x6260001411c8 thread T0 0 0xa38a46 in _typval_encode_nothing_convert_one_value /home/vagrant/neovim/build/../src/nvim/eval/typval_encode.c.h:320:15 1 0xa340d5 in encode_vim_to_nothing /home/vagrant/neovim/build/../src/nvim/eval/typval_encode.c.h:830:9 2 0xa0ad63 in tv_clear /home/vagrant/neovim/build/../src/nvim/eval/typval.c:2189:25 3 0x85d584 in vars_clear_ext /home/vagrant/neovim/build/../src/nvim/eval.c:18894:9 4 0x7efac9 in vars_clear /home/vagrant/neovim/build/../src/nvim/eval.c:18871:3 5 0x7ef471 in eval_clear /home/vagrant/neovim/build/../src/nvim/eval.c:638:3 6 0xf80a7c in free_all_mem /home/vagrant/neovim/build/../src/nvim/memory.c:676:3 7 0x12789dc in mch_exit /home/vagrant/neovim/build/../src/nvim/os_unix.c:152:3 8 0xe85039 in getout /home/vagrant/neovim/build/../src/nvim/main.c:671:3 9 0xbb4496 in ex_quit_all /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:6051:5 10 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 11 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 12 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 13 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 14 0x825778 
in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 15 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 16 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 17 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 18 0xb17fdd in do_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2973:3 19 0xb14630 in cmd_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2718:14 20 0xb14727 in ex_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2699:3 21 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 22 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 23 0xb34015 in do_cmdline_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:280:10 24 0xe829d3 in exe_commands /home/vagrant/neovim/build/../src/nvim/main.c:1702:5 25 0xe708bd in main /home/vagrant/neovim/build/../src/nvim/main.c:524:5 26 0x7f0cd5b7e82f in __libc_start_main /build/glibc-Cl5G7W/glibc-2.23/csu/../csu/libc-start.c:291 27 0x44d628 in _start (/home/vagrant/neovim/build/bin/nvim+0x44d628) 0x6260001411c8 is located 200 bytes inside of 10256-byte region [0x626000141100,0x626000143910) freed by thread T0 here: 0 0x50df60 in __interceptor_cfree.localalias.0 (/home/vagrant/neovim/build/bin/nvim+0x50df60) 1 0xf7dbe4 in xfree /home/vagrant/neovim/build/../src/nvim/memory.c:133:3 2 0x676577 in free_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:749:5 3 0x66e15b in close_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:590:5 4 0x67a835 in do_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:1216:9 5 0x681a1d in do_bufdel /home/vagrant/neovim/build/../src/nvim/buffer.c:945:16 6 0xb9cd72 in ex_bunload /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:4585:17 7 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 8 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 9 0x845302 
in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 10 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 11 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 12 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 13 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 14 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 15 0x8646b1 in ex_execute /home/vagrant/neovim/build/../src/nvim/eval.c:19478:7 16 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 17 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 18 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 19 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 20 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 21 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 22 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 23 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 24 0xb17fdd in do_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2973:3 25 0xb14630 in cmd_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2718:14 26 0xb14727 in ex_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2699:3 27 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 28 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 29 0xb34015 in do_cmdline_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:280:10 previously allocated by thread T0 here: 0 0x50e340 in calloc (/home/vagrant/neovim/build/bin/nvim+0x50e340) 1 0xf7dc97 in xcalloc /home/vagrant/neovim/build/../src/nvim/memory.c:147:15 2 0x67eb65 in buflist_new /home/vagrant/neovim/build/../src/nvim/buffer.c:1641:11 3 0xa8fdcc in do_ecmd 
/home/vagrant/neovim/build/../src/nvim/ex_cmds.c:2221:13 4 0x683b21 in empty_curbuf /home/vagrant/neovim/build/../src/nvim/buffer.c:1031:12 5 0x67a098 in do_buffer /home/vagrant/neovim/build/../src/nvim/buffer.c:1196:14 6 0x68173f in do_bufdel /home/vagrant/neovim/build/../src/nvim/buffer.c:927:11 7 0xb9cd72 in ex_bunload /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:4585:17 8 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 9 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 10 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 11 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 12 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 13 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 14 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 15 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 16 0x8646b1 in ex_execute /home/vagrant/neovim/build/../src/nvim/eval.c:19478:7 17 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 18 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 19 0x845302 in call_user_func /home/vagrant/neovim/build/../src/nvim/eval.c:21332:3 20 0x81097b in call_func /home/vagrant/neovim/build/../src/nvim/eval.c:6358:11 21 0x825778 in get_func_tv /home/vagrant/neovim/build/../src/nvim/eval.c:6120:11 22 0x81e361 in ex_call /home/vagrant/neovim/build/../src/nvim/eval.c:2735:9 23 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 24 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 25 0xb17fdd in do_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2973:3 26 0xb14630 in cmd_source /home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2718:14 27 0xb14727 in ex_source 
/home/vagrant/neovim/build/../src/nvim/ex_cmds2.c:2699:3 28 0xb4ba9c in do_one_cmd /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:2242:5 29 0xb2dd03 in do_cmdline /home/vagrant/neovim/build/../src/nvim/ex_docmd.c:609:20 SUMMARY: AddressSanitizer: heap-use-after-free /home/vagrant/neovim/build/../src/nvim/eval/typval_encode.c.h:320:15 in _typval_encode_nothing_convert_one_va lue Shadow bytes around the buggy address: 0x0c4c800201e0: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c800201f0: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c80020200: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c80020210: fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa 0x0c4c80020220: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd =>0x0c4c80020230: fd fd fd fd fd fd fd fd fd[fd]fd fd fd fd fd fd 0x0c4c80020240: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020250: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020260: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020270: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd 0x0c4c80020280: fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd Shadow byte legend (one shadow byte represents 8 application bytes): Addressable: 00 Partially addressable: 01 02 03 04 05 06 07 Heap left redzone: fa Freed heap region: fd Stack left redzone: f1 Stack mid redzone: f2 Stack right redzone: f3 Stack after return: f5 Stack use after scope: f8 Global redzone: f9 Global init order: f6 Poisoned by user: f7 Container overflow: fc Array cookie: ac Intra object redzone: bb ASan internal: fe Left alloca redzone: ca Right alloca redzone: cb ==3668==ABORTING ```
test
test failure test bufleave wipe related incorporated some vim tests rewritten in new style test bufleave wipe provokes asan issue but in order to move forward i marked it pending looks related to gc and tv clear global g hashtable still has a reference to a freed buffer cc zyx i maybe you can see the issue quickly steps to reproduce cc clang make cmake build type debug cmake extra flags dclang asan ubsan on export asan symbolizer path usr lib llvm bin llvm symbolizer test file test autocmd res make oldtest error addresssanitizer heap use after free on address at pc bp sp read of size at thread in typval encode nothing convert one value home vagrant neovim build src nvim eval typval encode c h in encode vim to nothing home vagrant neovim build src nvim eval typval encode c h in tv clear home vagrant neovim build src nvim eval typval c in vars clear ext home vagrant neovim build src nvim eval c in vars clear home vagrant neovim build src nvim eval c in eval clear home vagrant neovim build src nvim eval c in free all mem home vagrant neovim build src nvim memory c in mch exit home vagrant neovim build src nvim os unix c in getout home vagrant neovim build src nvim main c in ex quit all home vagrant neovim build src nvim ex docmd c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in call user func home vagrant neovim build src nvim eval c in call func home vagrant neovim build src nvim eval c in get func tv home vagrant neovim build src nvim eval c in ex call home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in do source home vagrant neovim build src nvim ex c in cmd source home vagrant neovim build src nvim ex c in ex source home vagrant neovim build src nvim ex c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in do 
cmdline cmd home vagrant neovim build src nvim ex docmd c in exe commands home vagrant neovim build src nvim main c in main home vagrant neovim build src nvim main c in libc start main build glibc glibc csu csu libc start c in start home vagrant neovim build bin nvim is located bytes inside of byte region freed by thread here in interceptor cfree localalias home vagrant neovim build bin nvim in xfree home vagrant neovim build src nvim memory c in free buffer home vagrant neovim build src nvim buffer c in close buffer home vagrant neovim build src nvim buffer c in do buffer home vagrant neovim build src nvim buffer c in do bufdel home vagrant neovim build src nvim buffer c in ex bunload home vagrant neovim build src nvim ex docmd c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in call user func home vagrant neovim build src nvim eval c in call func home vagrant neovim build src nvim eval c in get func tv home vagrant neovim build src nvim eval c in ex call home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in ex execute home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in call user func home vagrant neovim build src nvim eval c in call func home vagrant neovim build src nvim eval c in get func tv home vagrant neovim build src nvim eval c in ex call home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in do source home vagrant neovim build src nvim ex c in cmd source home vagrant neovim build src nvim ex c in ex source home vagrant neovim build src nvim ex c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex 
docmd c in do cmdline cmd home vagrant neovim build src nvim ex docmd c previously allocated by thread here in calloc home vagrant neovim build bin nvim in xcalloc home vagrant neovim build src nvim memory c in buflist new home vagrant neovim build src nvim buffer c in do ecmd home vagrant neovim build src nvim ex cmds c in empty curbuf home vagrant neovim build src nvim buffer c in do buffer home vagrant neovim build src nvim buffer c in do bufdel home vagrant neovim build src nvim buffer c in ex bunload home vagrant neovim build src nvim ex docmd c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in call user func home vagrant neovim build src nvim eval c in call func home vagrant neovim build src nvim eval c in get func tv home vagrant neovim build src nvim eval c in ex call home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in ex execute home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in call user func home vagrant neovim build src nvim eval c in call func home vagrant neovim build src nvim eval c in get func tv home vagrant neovim build src nvim eval c in ex call home vagrant neovim build src nvim eval c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c in do source home vagrant neovim build src nvim ex c in cmd source home vagrant neovim build src nvim ex c in ex source home vagrant neovim build src nvim ex c in do one cmd home vagrant neovim build src nvim ex docmd c in do cmdline home vagrant neovim build src nvim ex docmd c summary addresssanitizer heap use after free home vagrant neovim build src nvim eval typval encode c h in typval encode nothing convert one va lue shadow bytes around the buggy 
address fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fa fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd fd shadow byte legend one shadow byte represents application bytes addressable partially addressable heap left redzone fa freed heap region fd stack left redzone stack mid redzone stack right redzone stack after return stack use after scope global redzone global init order poisoned by user container overflow fc array cookie ac intra object redzone bb asan internal fe left alloca redzone ca right alloca redzone cb aborting
1
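The ASAN report in the row above boils down to one pattern: the global `g:` variable table still holds a raw pointer to a buffer that `free_buffer()` already released, so the exit-time `vars_clear()`/`tv_clear()` pass reads freed memory. A minimal C++ sketch of that hazard and the erase-before-free fix — hypothetical types and names, not nvim's actual `buf_T` or hashtable code:

```cpp
#include <cassert>
#include <map>
#include <string>

// Hypothetical stand-in for nvim's buffer struct.
struct Buffer {
  int handle;
};

// Stand-in for the g: variable table holding raw buffer pointers
// (the hazard: nothing keeps these pointers alive).
static std::map<std::string, Buffer*> global_vars;

// Freeing the buffer without clearing references reproduces the ASAN
// pattern: global_vars would still point at freed memory, and a later
// cleanup pass (tv_clear at exit) would read through a dangling pointer.
// The fix sketched here: unlink every reference before freeing.
void free_buffer_safely(Buffer* buf) {
  for (auto it = global_vars.begin(); it != global_vars.end();) {
    if (it->second == buf) {
      it = global_vars.erase(it);  // drop the reference first
    } else {
      ++it;
    }
  }
  delete buf;  // only now is it safe to return the memory
}

// Simulates the exit-time sweep: every pointer still in the table
// must be live, which holds once erase-before-free is enforced.
bool cleanup_pass_is_safe() {
  for (const auto& kv : global_vars) {
    if (kv.second == nullptr) return false;
  }
  return true;
}
```

The design point is ordering: every table that can reference the buffer must be unlinked before the memory is returned — the trace suggests that for `g:` this unlinking never happens before `free_buffer()`.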
195,300
14,713,575,365
IssuesEvent
2021-01-05 10:35:02
Blackhandpl/bugtracker
https://api.github.com/repos/Blackhandpl/bugtracker
closed
Scholomancer
Low Priority Need Testing Quest
1. Scholomancer 2. 27162 3. After the required creature is beaten down to 1 hp, 2 NPCs should start attacking it and then carry on a conversation. Check one of the NPCs; it probably has the wrong faction (flag) 4. wowhead.com/quest=27162/scholomancer Western Plaguelands
1.0
Scholomancer - 1. Scholomancer 2. 27162 3. After the required creature is beaten down to 1 hp, 2 NPCs should start attacking it and then carry on a conversation. Check one of the NPCs; it probably has the wrong faction (flag) 4. wowhead.com/quest=27162/scholomancer Western Plaguelands
test
scholomancer scholomancer after the required creature is beaten down to hp npcs should start attacking it and then carry on a conversation check one of the npcs it probably has the wrong faction flag wowhead com quest scholomancer western plaguelands
1
336,543
30,199,953,757
IssuesEvent
2023-07-05 04:06:52
marcpage/libernet
https://api.github.com/repos/marcpage/libernet
closed
text::tolower should be tested with clear == ClearOutputFirst
good first issue test
Test code needs to be added to tests/Text_test.cpp that calls:

```
std::wstring &tolower(const std::wstring &mixed, std::wstring &lower, ClearFirst clear = ClearOutputFirst);
```

with clear set to ClearOutputFirst.
1.0
text::tolower should be tested with clear == ClearOutputFirst - Test code needs to be added to tests/Text_test.cpp that calls: ``` std::wstring &tolower(const std::wstring &mixed, std::wstring &lower, ClearFirst clear = ClearOutputFirst); ``` with clear set to ClearOutputFirst.
test
text tolower should be tested with clear clearoutputfirst test code needs to be added to tests text test cpp that calls std wstring tolower const std wstring mixed std wstring lower clearfirst clear clearoutputfirst with clear set to clearoutputfirst
1
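The libernet row above asks for a test exercising `tolower` with `clear == ClearOutputFirst`. A self-contained C++ sketch of the behavior such a test needs to pin down — the `tolower` below is a hypothetical stand-in written for illustration, not libernet's real implementation:

```cpp
#include <cwctype>
#include <string>

// Hypothetical stand-in mirroring the signature quoted in the issue.
enum ClearFirst { ClearOutputFirst, AppendToOutput };

std::wstring& tolower(const std::wstring& mixed, std::wstring& lower,
                      ClearFirst clear = ClearOutputFirst) {
  if (clear == ClearOutputFirst) {
    lower.clear();  // the behavior the missing test should verify
  }
  for (wchar_t c : mixed) {
    lower += static_cast<wchar_t>(std::towlower(c));
  }
  return lower;
}
```

A matching check in tests/Text_test.cpp would pre-fill the output string with stale contents, call with `ClearOutputFirst`, and assert the stale prefix is gone.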
112,433
9,574,202,735
IssuesEvent
2019-05-07 00:39:06
QubesOS/updates-status
https://api.github.com/repos/QubesOS/updates-status
closed
linux-kernel v4.14.114-1 (r4.0)
r4.0-dom0-cur-test
Update of linux-kernel to v4.14.114-1 for Qubes r4.0, see comments below for details.

Built from: https://github.com/QubesOS/qubes-linux-kernel/commit/75204c1cc2914b56c8354c1fcdef7a1ea49888d3

[Changes since previous version](https://github.com/QubesOS/qubes-linux-kernel/compare/v4.14.111-1...v4.14.114-1): QubesOS/qubes-linux-kernel@75204c1 Update to version 4.14.114

Referenced issues:

If you're release manager, you can issue GPG-inline signed command:

* `Upload linux-kernel 75204c1cc2914b56c8354c1fcdef7a1ea49888d3 r4.0 current repo` (available 7 days from now)
* `Upload linux-kernel 75204c1cc2914b56c8354c1fcdef7a1ea49888d3 r4.0 current (dists) repo`, you can choose subset of distributions, like `vm-fc24 vm-fc25` (available 7 days from now)
* `Upload linux-kernel 75204c1cc2914b56c8354c1fcdef7a1ea49888d3 r4.0 security-testing repo`

Above commands will work only if packages in current-testing repository were built from the given commit (i.e. no new version superseded it).
1.0
linux-kernel v4.14.114-1 (r4.0) - Update of linux-kernel to v4.14.114-1 for Qubes r4.0, see comments below for details. Built from: https://github.com/QubesOS/qubes-linux-kernel/commit/75204c1cc2914b56c8354c1fcdef7a1ea49888d3 [Changes since previous version](https://github.com/QubesOS/qubes-linux-kernel/compare/v4.14.111-1...v4.14.114-1): QubesOS/qubes-linux-kernel@75204c1 Update to version 4.14.114 Referenced issues: If you're release manager, you can issue GPG-inline signed command: * `Upload linux-kernel 75204c1cc2914b56c8354c1fcdef7a1ea49888d3 r4.0 current repo` (available 7 days from now) * `Upload linux-kernel 75204c1cc2914b56c8354c1fcdef7a1ea49888d3 r4.0 current (dists) repo`, you can choose subset of distributions, like `vm-fc24 vm-fc25` (available 7 days from now) * `Upload linux-kernel 75204c1cc2914b56c8354c1fcdef7a1ea49888d3 r4.0 security-testing repo` Above commands will work only if packages in current-testing repository were built from given commit (i.e. no new version superseded it).
test
linux kernel update of linux kernel to for qubes see comments below for details built from qubesos qubes linux kernel update to version referenced issues if you re release manager you can issue gpg inline signed command upload linux kernel current repo available days from now upload linux kernel current dists repo you can choose subset of distributions like vm vm available days from now upload linux kernel security testing repo above commands will work only if packages in current testing repository were built from given commit i e no new version superseded it
1
11,194
7,104,232,104
IssuesEvent
2018-01-16 09:16:26
Yakindu/statecharts
https://api.github.com/repos/Yakindu/statecharts
closed
Improve deletion of notes in diagram
Status-Consolidate is-Enhancement is-Usability-Issue
There is no explicit context menu item 'delete from model' for deleting notes from a statechart diagram. A workaround is to 'cut' the note.

- Add a context menu action 'delete from model'
True
Improve deletion of notes in diagram - There is no explicit context menu item 'delete from model' for deleting notes from a statechart diagram. A workaround is to 'cut' the note. - Add a context menu action 'delete from model'
non_test
improve deletion of notes in diagram there is no explicit context menu item delete from model for deleting notes from a statechart diagram a workaround is to cut the note add a context menu action delete from model
0
306,275
9,383,229,035
IssuesEvent
2019-04-05 02:18:47
teleskope/teleskope
https://api.github.com/repos/teleskope/teleskope
closed
Create Template Layout
Priority: High UI good first issue
Do all work in a branch called issue-1. Update App layout `/ui/App.jsx` with mockup styling.

- [x] Remove references to template code
- [x] Comment out unimplemented component code
1.0
Create Template Layout - Do all work in a branch called issue-1. Update App layout `/ui/App.jsx` with mockup styling. - [x] Remove references to template code - [x] Comment out unimplemented component code
non_test
create template layout do all work in a branch called issue update app layout ui app jsx with mockup styling remove references to template code comment out unimplemented component code
0
160,793
6,102,717,612
IssuesEvent
2017-06-20 17:06:31
chef/chef
https://api.github.com/repos/chef/chef
closed
git submodules are not downloaded with git resource
Priority: Low Status: Pending Contributor Response
_From @puneetloya on October 29, 2015 1:29_

```
git tests_dir do
  repository node['xyz']['repo']
  revision node['xyz']['branch']
  enable_submodules true
  action :sync
end
```

It is downloading the main parent repository but not the submodules. I am getting only the .gitmodules file and nothing else. Am I missing something very basic?

_Copied from original issue: jssjr/git#83_
1.0
git submodules are not downloaded with git resource - _From @puneetloya on October 29, 2015 1:29_ git tests_dir do repository node['xyz']['repo'] revision node['xyz']['branch'] enable_submodules true action :sync end It is downloading the main parent repository but not the submodules. I am getting only the .gitmodules file and nothing else. Am I missing something very basic? _Copied from original issue: jssjr/git#83_
non_test
git submodules are not downloaded with git resource from puneetloya on october git tests dir do repository node revision node enable submodules true action sync end it is downloading the main parent repository but not the submodules i am getting only the gitmodules file and nothing else am i missing something very basic copied from original issue jssjr git
0
99,885
8,714,188,736
IssuesEvent
2018-12-07 06:47:18
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
opened
Shields settings is lost after clearing data and relaunching
QA/Test-Plan-Specified QA/Yes bug feature/shields good first bug priority/P2 privacy release-notes/include security
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue. PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE. INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED--> ## Description <!--Provide a brief description of the issue--> Shields settings is lost after clearing data and relaunching ## Steps to Reproduce <!--Please add a series of steps to reproduce the issue--> 1. Visit brave.com and disable shields 2. Visit github.com and change site shields settings from default to custom 3. Clear browsing data from all time and select all checkboxes 4. Restart browser after data is cleared 5. Open brave.com, shields is auto enabled 6. Open github.com, site shields settings are lost and reverted back to default settings ## Actual result: <!--Please add screenshots if needed--> Shields settings is lost after clearing data and relaunching ## Expected result: Clear data shouldn't reset shields settings ## Reproduces how often: <!--[Easily reproduced/Intermittent issue/No steps to reproduce]--> Easy ## Brave version (brave://version info) <!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details--> Brave | 0.56.15 Chromium: 70.0.3538.110 (Official Build) (64-bit) -- | -- Revision | ca97ba107095b2a88cf04f9135463301e685cbb0-refs/branch-heads/3538@{#1094} OS | All ### Reproducible on current release: - Does it reproduce on brave-browser dev/beta builds? Yes ### Website problems only: - Does the issue resolve itself when disabling Brave Shields? - Is the issue reproducible on the latest version of Chrome? 
### Additional Information <!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue--> Issue reproduced by @kjozwiak on macOS and @GeetaSarvadnya on Windows cc: @diracdeltas @tomlowenthal @bbondy this should probably be part of 0.58.x or any earlier hotfix
1.0
Shields settings is lost after clearing data and relaunching - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue. PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE. INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED--> ## Description <!--Provide a brief description of the issue--> Shields settings is lost after clearing data and relaunching ## Steps to Reproduce <!--Please add a series of steps to reproduce the issue--> 1. Visit brave.com and disable shields 2. Visit github.com and change site shields settings from default to custom 3. Clear browsing data from all time and select all checkboxes 4. Restart browser after data is cleared 5. Open brave.com, shields is auto enabled 6. Open github.com, site shields settings are lost and reverted back to default settings ## Actual result: <!--Please add screenshots if needed--> Shields settings is lost after clearing data and relaunching ## Expected result: Clear data shouldn't reset shields settings ## Reproduces how often: <!--[Easily reproduced/Intermittent issue/No steps to reproduce]--> Easy ## Brave version (brave://version info) <!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details--> Brave | 0.56.15 Chromium: 70.0.3538.110 (Official Build) (64-bit) -- | -- Revision | ca97ba107095b2a88cf04f9135463301e685cbb0-refs/branch-heads/3538@{#1094} OS | All ### Reproducible on current release: - Does it reproduce on brave-browser dev/beta builds? Yes ### Website problems only: - Does the issue resolve itself when disabling Brave Shields? - Is the issue reproducible on the latest version of Chrome? 
### Additional Information <!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue--> Issue reproduced by @kjozwiak on macOS and @GeetaSarvadnya on Windows cc: @diracdeltas @tomlowenthal @bbondy this should probably be part of 0.58.x or any earlier hotfix
test
shields settings is lost after clearing data and relaunching have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description shields settings is lost after clearing data and relaunching steps to reproduce visit brave com and disable shields visit github com and change site shields settings from default to custom clear browsing data from all time and select all checkboxes restart browser after data is cleared open brave com shields is auto enabled open github com site shields settings are lost and reverted back to default settings actual result shields settings is lost after clearing data and relaunching expected result clear data shouldn t reset shields settings reproduces how often easy brave version brave version info brave chromium   official build   bit revision refs branch heads os all reproducible on current release does it reproduce on brave browser dev beta builds yes website problems only does the issue resolve itself when disabling brave shields is the issue reproducible on the latest version of chrome additional information issue reproduced by kjozwiak on macos and geetasarvadnya on windows cc diracdeltas tomlowenthal bbondy this should probably be part of x or any earlier hotfix
1
492,342
14,200,907,309
IssuesEvent
2020-11-16 06:36:58
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
support.mozilla.org - site is not usable
browser-fenix engine-gecko ml-needsdiagnosis-false ml-probability-high priority-important
<!-- @browser: Firefox Mobile 83.0 --> <!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Mobile; rv:83.0) Gecko/83.0 Firefox/83.0 --> <!-- @reported_with: android-components-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/61740 --> <!-- @extra_labels: browser-fenix --> **URL**: https://support.mozilla.org/en-US/kb/enhanced-tracking-protection-firefox-beta-android?redirectslug=tracking-protection-firefox-preview&redirectlocale=en-US **Browser / Version**: Firefox Mobile 83.0 **Operating System**: Android 8.1.0 **Tested Another Browser**: Yes Other **Problem type**: Site is not usable **Description**: Browser unsupported **Steps to Reproduce**: Hackef <details> <summary>View the screenshot</summary> <img alt="Screenshot" src="https://webcompat.com/uploads/2020/11/616e8652-1aed-427b-8d9c-50c4d4474042.jpeg"> </details> <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20201108174701</li><li>channel: beta</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2020/11/8808ea9c-9e60-4124-a3b8-4f9bdfa941ab) _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
support.mozilla.org - site is not usable - <!-- @browser: Firefox Mobile 83.0 --> <!-- @ua_header: Mozilla/5.0 (Android 8.1.0; Mobile; rv:83.0) Gecko/83.0 Firefox/83.0 --> <!-- @reported_with: android-components-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/61740 --> <!-- @extra_labels: browser-fenix --> **URL**: https://support.mozilla.org/en-US/kb/enhanced-tracking-protection-firefox-beta-android?redirectslug=tracking-protection-firefox-preview&redirectlocale=en-US **Browser / Version**: Firefox Mobile 83.0 **Operating System**: Android 8.1.0 **Tested Another Browser**: Yes Other **Problem type**: Site is not usable **Description**: Browser unsupported **Steps to Reproduce**: Hackef <details> <summary>View the screenshot</summary> <img alt="Screenshot" src="https://webcompat.com/uploads/2020/11/616e8652-1aed-427b-8d9c-50c4d4474042.jpeg"> </details> <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20201108174701</li><li>channel: beta</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2020/11/8808ea9c-9e60-4124-a3b8-4f9bdfa941ab) _From [webcompat.com](https://webcompat.com/) with ❤️_
non_test
support mozilla org site is not usable url browser version firefox mobile operating system android tested another browser yes other problem type site is not usable description browser unsupported steps to reproduce hackef view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel beta hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️
0
319,864
27,403,429,026
IssuesEvent
2023-03-01 03:44:18
culturesofknowledge/emlo-project
https://api.github.com/repos/culturesofknowledge/emlo-project
opened
Are the number fields in people search form useful?
enhancement question AR test
The people search form has (number of letters) `sent`, `received`, `sent or received` and `mentioned` as a search option. An exact number. Really? should this not be a range atleast to be useful. (this is a question for the EMLO editors I suppose. Possible feature request?)
1.0
Are the number fields in people search form useful? - The people search form has (number of letters) `sent`, `received`, `sent or received` and `mentioned` as a search option. An exact number. Really? should this not be a range atleast to be useful. (this is a question for the EMLO editors I suppose. Possible feature request?)
test
are the number fields in people search form useful the people search form has number of letters sent received sent or received and mentioned as a search option an exact number really should this not be a range atleast to be useful this is a question for the emlo editors i suppose possible feature request
1
55,377
14,426,817,314
IssuesEvent
2020-12-06 00:11:27
openzfs/zfs
https://api.github.com/repos/openzfs/zfs
closed
Failed install on packaging for 2.x in Fedora
Status: Triage Needed Type: Defect
<!-- Please fill out the following template, which will help other contributors address your issue. --> <!-- Thank you for reporting an issue. *IMPORTANT* - Please check our issue tracker before opening a new issue. Additional valuable information can be found in the OpenZFS documentation and mailing list archives. Please fill in as much of the template as possible. --> ### System information <!-- add version after "|" character --> Type | Version/Name --- | --- Distribution Name | Fedora Distribution Version | 32 Linux Kernel | 5.9.11 Architecture | x86_64 ZFS Version | 2.0.0 SPL Version | <!-- Commands to find ZFS/SPL versions: modinfo zfs | grep -iw version modinfo spl | grep -iw version --> ### Describe the problem you're observing Recently pulled update for 2.0.0-1. Appears to build properly. Appears to be installed (dkms {build, install} reports back that the package is built and installed. When you try to load the module you get: ``` modprobe: ERROR: could not insert 'zfs': Exec format error ``` ### Describe how to reproduce the problem update fedora with the zfs repostiory ### Include any warning/errors/backtraces from the system logs <!-- *IMPORTANT* - Please mark logs and text output from terminal commands or else Github will not display them correctly. An example is provided below. Example: ``` this is an example how log text should be marked (wrap it with ```) ``` -->
1.0
Failed install on packaging for 2.x in Fedora - <!-- Please fill out the following template, which will help other contributors address your issue. --> <!-- Thank you for reporting an issue. *IMPORTANT* - Please check our issue tracker before opening a new issue. Additional valuable information can be found in the OpenZFS documentation and mailing list archives. Please fill in as much of the template as possible. --> ### System information <!-- add version after "|" character --> Type | Version/Name --- | --- Distribution Name | Fedora Distribution Version | 32 Linux Kernel | 5.9.11 Architecture | x86_64 ZFS Version | 2.0.0 SPL Version | <!-- Commands to find ZFS/SPL versions: modinfo zfs | grep -iw version modinfo spl | grep -iw version --> ### Describe the problem you're observing Recently pulled update for 2.0.0-1. Appears to build properly. Appears to be installed (dkms {build, install} reports back that the package is built and installed. When you try to load the module you get: ``` modprobe: ERROR: could not insert 'zfs': Exec format error ``` ### Describe how to reproduce the problem update fedora with the zfs repostiory ### Include any warning/errors/backtraces from the system logs <!-- *IMPORTANT* - Please mark logs and text output from terminal commands or else Github will not display them correctly. An example is provided below. Example: ``` this is an example how log text should be marked (wrap it with ```) ``` -->
non_test
failed install on packaging for x in fedora thank you for reporting an issue important please check our issue tracker before opening a new issue additional valuable information can be found in the openzfs documentation and mailing list archives please fill in as much of the template as possible system information type version name distribution name fedora distribution version linux kernel architecture zfs version spl version commands to find zfs spl versions modinfo zfs grep iw version modinfo spl grep iw version describe the problem you re observing recently pulled update for appears to build properly appears to be installed dkms build install reports back that the package is built and installed when you try to load the module you get modprobe error could not insert zfs exec format error describe how to reproduce the problem update fedora with the zfs repostiory include any warning errors backtraces from the system logs important please mark logs and text output from terminal commands or else github will not display them correctly an example is provided below example this is an example how log text should be marked wrap it with
0
233,786
19,060,281,508
IssuesEvent
2021-11-26 06:27:21
pingcap/tidb
https://api.github.com/repos/pingcap/tidb
closed
IT unstable case mysqltest `gcol_view`
type/bug component/test sig/sql-infra severity/major
## Bug Report Please answer these questions before submitting your issue. Thanks! ### 1. Minimal reproduce step (Required) in ci https://ci.pingcap.net/blue/organizations/jenkins/tidb_ghpr_integration_common_test/detail/tidb_ghpr_integration_common_test/7886/pipeline/145 ```bash [2021-11-24T05:44:31.174Z] time="2021-11-24T13:44:31+08:00" level=error msg="run test [gcol_view] err: sql:select is_updatable from information_schema.views where table_name='v1';: failed to run query \n\"select is_updatable from information_schema.views where table_name='v1';\" \n around line 5, \nwe need(92):\nselect is_updatable from information_schema.views where table_name='v1';\nis_updatable\nNO\ncre\nbut got(92):\nselect is_updatable from information_schema.views where table_name='v1';\nis_updatable\nNO\nNO\n\n" ``` <!-- a step by step guide for reproducing the bug. --> ### 2. What did you expect to see? (Required) ### 3. What did you see instead (Required) ### 4. What is your TiDB version? (Required) <!-- Paste the output of SELECT tidb_version() -->
1.0
IT unstable case mysqltest `gcol_view` - ## Bug Report Please answer these questions before submitting your issue. Thanks! ### 1. Minimal reproduce step (Required) in ci https://ci.pingcap.net/blue/organizations/jenkins/tidb_ghpr_integration_common_test/detail/tidb_ghpr_integration_common_test/7886/pipeline/145 ```bash [2021-11-24T05:44:31.174Z] time="2021-11-24T13:44:31+08:00" level=error msg="run test [gcol_view] err: sql:select is_updatable from information_schema.views where table_name='v1';: failed to run query \n\"select is_updatable from information_schema.views where table_name='v1';\" \n around line 5, \nwe need(92):\nselect is_updatable from information_schema.views where table_name='v1';\nis_updatable\nNO\ncre\nbut got(92):\nselect is_updatable from information_schema.views where table_name='v1';\nis_updatable\nNO\nNO\n\n" ``` <!-- a step by step guide for reproducing the bug. --> ### 2. What did you expect to see? (Required) ### 3. What did you see instead (Required) ### 4. What is your TiDB version? (Required) <!-- Paste the output of SELECT tidb_version() -->
test
it unstable case mysqltest gcol view bug report please answer these questions before submitting your issue thanks minimal reproduce step required in ci bash time level error msg run test err sql select is updatable from information schema views where table name failed to run query n select is updatable from information schema views where table name n around line nwe need nselect is updatable from information schema views where table name nis updatable nno ncre nbut got nselect is updatable from information schema views where table name nis updatable nno nno n n what did you expect to see required what did you see instead required what is your tidb version required
1
207,191
15,796,706,767
IssuesEvent
2021-04-02 15:23:10
isabellevillasenor/crate
https://api.github.com/repos/isabellevillasenor/crate
closed
FE Integration Testing w Cypress
FE MVP Testing
As a dev, I should be able to apply integration testing via Cypress to test. Will require reading the documentation first for Redux.
1.0
FE Integration Testing w Cypress - As a dev, I should be able to apply integration testing via Cypress to test. Will require reading the documentation first for Redux.
test
fe integration testing w cypress as a dev i should be able to apply integration testing via cypress to test will require reading the documentation first for redux
1
132
2,493,038,690
IssuesEvent
2015-01-05 10:01:14
diplomod/WellBet-App
https://api.github.com/repos/diplomod/WellBet-App
closed
Open WellBets: Only show bets I have not placed a bet on myself
enhancement Testenvironment dokuhl
It's too confusing still. If possible: Please filter out in "Open WellBets" those bets I have placed a bet on. Show only the bets I have not placed a bet yet in "Open WellBets" and which are open. I find the open ones that I have placed a bet on in "My WellBets" anyway. It's not super clean but more intuitive.
1.0
Open WellBets: Only show bets I have not placed a bet on myself - It's too confusing still. If possible: Please filter out in "Open WellBets" those bets I have placed a bet on. Show only the bets I have not placed a bet yet in "Open WellBets" and which are open. I find the open ones that I have placed a bet on in "My WellBets" anyway. It's not super clean but more intuitive.
test
open wellbets only show bets i have not placed a bet on myself it s too confusing still if possible please filter out in open wellbets those bets i have placed a bet on show only the bets i have not placed a bet yet in open wellbets and which are open i find the open ones that i have placed a bet on in my wellbets anyway it s not super clean but more intuitive
1
420,071
12,232,739,331
IssuesEvent
2020-05-04 10:15:08
google/fonts
https://api.github.com/repos/google/fonts
closed
Add Renner* font family
Needs Confirmation Priority 2 - Important but not Urgent
https://github.com/indestructible-type/Renner Hopefully this conforms to the standards set out by the team! This is a passion project of mine.
1.0
Add Renner* font family - https://github.com/indestructible-type/Renner Hopefully this conforms to the standards set out by the team! This is a passion project of mine.
non_test
add renner font family hopefully this conforms to the standards set out by the team this is a passion project of mine
0
187,954
14,435,078,148
IssuesEvent
2020-12-07 08:12:09
hzi-braunschweig/SORMAS-Project
https://api.github.com/repos/hzi-braunschweig/SORMAS-Project
closed
Change Automation tests in connection to new contacts directory filtering logic
Testing change
<!-- If you've never submitted an issue to the SORMAS repository before or this is your first time using this template, please read the Contributing guidelines (accessible in the right sidebar) for an explanation about the information we'd like you to provide. --> ### Feature Description ### Problem Description Due to changes done for contacts filtering at issue #3591 the automation tests have impacted which have to be refactored. ### Proposed Change Change the logic to work in a bidirectional way so that the filter is considered only when the search result is not found. ### Possible Alternatives ### Additional Information
1.0
Change Automation tests in connection to new contacts directory filtering logic - <!-- If you've never submitted an issue to the SORMAS repository before or this is your first time using this template, please read the Contributing guidelines (accessible in the right sidebar) for an explanation about the information we'd like you to provide. --> ### Feature Description ### Problem Description Due to changes done for contacts filtering at issue #3591 the automation tests have impacted which have to be refactored. ### Proposed Change Change the logic to work in a bidirectional way so that the filter is considered only when the search result is not found. ### Possible Alternatives ### Additional Information
test
change automation tests in connection to new contacts directory filtering logic if you ve never submitted an issue to the sormas repository before or this is your first time using this template please read the contributing guidelines accessible in the right sidebar for an explanation about the information we d like you to provide feature description problem description due to changes done for contacts filtering at issue the automation tests have impacted which have to be refactored proposed change change the logic to work in a bidirectional way so that the filter is considered only when the search result is not found possible alternatives additional information
1
154,396
24,287,153,067
IssuesEvent
2022-09-29 00:00:17
ZcashFoundation/zebra
https://api.github.com/repos/ZcashFoundation/zebra
closed
Sapling/Orchard note encryption implementation.
C-design spec-readthrough NU-1 Sapling A-cryptography
We need an implementation of note encryption/decryption. This has possible overlap with librustzcash. Two levels of completion: All the types to represent in the transaction Some types: sapling::Note (plaintext) sapling::OutCiphertext sapling::EncryptedCiphertext And implementing methods to do the note encryption NoteEncryptionBuilder But first: note commitment trees: #36 addresses as above, full Note implementation Related: #181 Spec references: https://zips.z.cash/protocol/protocol.pdf#saplingsend https://zips.z.cash/protocol/protocol.pdf#orchardsend https://zips.z.cash/protocol/protocol.pdf#saplingandorchardencrypt Note that ZIP-212 is included in this ticket (it was incorporated into the spec)
1.0
Sapling/Orchard note encryption implementation. - We need an implementation of note encryption/decryption. This has possible overlap with librustzcash. Two levels of completion: All the types to represent in the transaction Some types: sapling::Note (plaintext) sapling::OutCiphertext sapling::EncryptedCiphertext And implementing methods to do the note encryption NoteEncryptionBuilder But first: note commitment trees: #36 addresses as above, full Note implementation Related: #181 Spec references: https://zips.z.cash/protocol/protocol.pdf#saplingsend https://zips.z.cash/protocol/protocol.pdf#orchardsend https://zips.z.cash/protocol/protocol.pdf#saplingandorchardencrypt Note that ZIP-212 is included in this ticket (it was incorporated into the spec)
non_test
sapling orchard note encryption implementation we need an implementation of note encryption decryption this has possible overlap with librustzcash two levels of completion all the types to represent in the transaction some types sapling note plaintext sapling outciphertext sapling encryptedciphertext and implementing methods to do the note encryption noteencryptionbuilder but first note commitment trees addresses as above full note implementation related spec references note that zip is included in this ticket it was incorporated into the spec
0
53,139
6,302,406,595
IssuesEvent
2017-07-21 10:46:44
Princeton-CDH/derrida-django
https://api.github.com/repos/Princeton-CDH/derrida-django
closed
intervention anchor text
awaiting testing
As an intervention data editor, I want to transcribe the anchor text (if there is any) for an annotation so I can document the text the intervener is referencing. As with other fields, this should be optional. This is the `q` field in the existing annotator.js data model (shouldn't require any extra data handling in the intervention model). I think it's probably already editable in the admin form, but we should confirm.
1.0
intervention anchor text - As an intervention data editor, I want to transcribe the anchor text (if there is any) for an annotation so I can document the text the intervener is referencing. As with other fields, this should be optional. This is the `q` field in the existing annotator.js data model (shouldn't require any extra data handling in the intervention model). I think it's probably already editable in the admin form, but we should confirm.
test
intervention anchor text as an intervention data editor i want to transcribe the anchor text if there is any for an annotation so i can document the text the intervener is referencing as with other fields this should be optional this is the q field in the existing annotator js data model shouldn t require any extra data handling in the intervention model i think it s probably already editable in the admin form but we should confirm
1
114,174
9,691,802,635
IssuesEvent
2019-05-24 12:15:43
lupyana/Ride-My-Way
https://api.github.com/repos/lupyana/Ride-My-Way
opened
Test ride offer api
Test
Test case for : - Endpoint: api/v1/rides - method : get - Expected response: array of ride objects
1.0
Test ride offer api - Test case for : - Endpoint: api/v1/rides - method : get - Expected response: array of ride objects
test
test ride offer api test case for endpoint api rides method get expected response array of ride objects
1
183,732
14,247,875,100
IssuesEvent
2020-11-19 12:06:48
denoland/deno
https://api.github.com/repos/denoland/deno
closed
cargo test fails on windows
bug :bug: tests :white_check_mark:
The problem is introduced by this PR: https://github.com/denoland/deno/pull/8291. printf doesn't work on powershell.
1.0
cargo test fails on windows - The problem is introduced by this PR: https://github.com/denoland/deno/pull/8291. printf doesn't work on powershell.
test
cargo test fails on windows the problem is introduced by this pr printf doesn t work on powershell
1
818,316
30,683,424,461
IssuesEvent
2023-07-26 10:39:52
ppy/osu
https://api.github.com/repos/ppy/osu
closed
Collections disappear on exit
type:performance missing-details area:database priority:1
### Type Other ### Bug description I have to manually import my collections from stable every time I open the game as they disappear in lazer after exiting. ### Screenshots or videos https://user-images.githubusercontent.com/104596572/167032550-b568aca4-8d26-4b53-a100-bf432ea8c2a7.mp4 ### Version Latest AppImage ### Logs [database.log](https://github.com/ppy/osu/files/8636305/database.log) [network.log](https://github.com/ppy/osu/files/8636302/network.log) [performance.log](https://github.com/ppy/osu/files/8636303/performance.log) [runtime.log](https://github.com/ppy/osu/files/8636304/runtime.log)
1.0
Collections disappear on exit - ### Type Other ### Bug description I have to manually import my collections from stable every time I open the game as they disappear in lazer after exiting. ### Screenshots or videos https://user-images.githubusercontent.com/104596572/167032550-b568aca4-8d26-4b53-a100-bf432ea8c2a7.mp4 ### Version Latest AppImage ### Logs [database.log](https://github.com/ppy/osu/files/8636305/database.log) [network.log](https://github.com/ppy/osu/files/8636302/network.log) [performance.log](https://github.com/ppy/osu/files/8636303/performance.log) [runtime.log](https://github.com/ppy/osu/files/8636304/runtime.log)
non_test
collections disappear on exit type other bug description i have to manually import my collections from stable every time i open the game as they disappear in lazer after exiting screenshots or videos version latest appimage logs
0
16,225
5,231,216,175
IssuesEvent
2017-01-30 00:30:52
dvr0006/UBUGRAPH-5.0
https://api.github.com/repos/dvr0006/UBUGRAPH-5.0
closed
Incluir preguntas tipo "cloze"
Code
Se incluyen las preguntas tipo "cloze" para cuestionarios moodle en "Generar XML estocástico".
1.0
Incluir preguntas tipo "cloze" - Se incluyen las preguntas tipo "cloze" para cuestionarios moodle en "Generar XML estocástico".
non_test
incluir preguntas tipo cloze se incluyen las preguntas tipo cloze para cuestionarios moodle en generar xml estocástico
0
86,657
10,779,151,762
IssuesEvent
2019-11-04 09:57:44
AugurProject/augur
https://api.github.com/repos/AugurProject/augur
closed
Trading: Updating resized components + color changes
Design Epic Roadmap: Trading
After changes made to the Trading page, resizing and refining components. We need to apply that to components on: - Account summary - Portfolio We're also updating the UI color scheme Need to be applied across desktop/mobile etc
1.0
Trading: Updating resized components + color changes - After changes made to the Trading page, resizing and refining components. We need to apply that to components on: - Account summary - Portfolio We're also updating the UI color scheme Need to be applied across desktop/mobile etc
non_test
trading updating resized components color changes after changes made to the trading page resizing and refining components we need to apply that to components on account summary portfolio we re also updating the ui color scheme need to be applied across desktop mobile etc
0
463,808
13,301,108,513
IssuesEvent
2020-08-25 12:27:47
aau-network-security/haaukins
https://api.github.com/repos/aau-network-security/haaukins
opened
Lab Reset Option for the Team -- Feedback from IT CPH University
enhancement good first issue low priority
It would be good to have a button on the Amigo Interface in order to restart you Lab instead of asking to the event manager all the time
1.0
Lab Reset Option for the Team -- Feedback from IT CPH University - It would be good to have a button on the Amigo Interface in order to restart you Lab instead of asking to the event manager all the time
non_test
lab reset option for the team feedback from it cph university it would be good to have a button on the amigo interface in order to restart you lab instead of asking to the event manager all the time
0
84,671
7,929,462,942
IssuesEvent
2018-07-06 15:09:07
mautic/mautic
https://api.github.com/repos/mautic/mautic
closed
wrong URL encodage for special caracters
Bug Pending Test Confirmation Ready To Test
**Please DO NOT report security vulnerabilities here. Send them to security@mautic.com instead.** What type of report is this: | Q | A | ---| --- | Bug report? | X | Feature request? | | Enhancement? | ## Description: When using a special caracter un URL (like &) it is not encoded right. ## If a bug: | Q | A | --- | --- | Mautic version | | PHP version | ### Steps to reproduce: 1. Create an email, add a link and use the caracter & in the link <img width="686" alt="capture d ecran 2018-06-05 a 15 25 40" src="https://user-images.githubusercontent.com/31535432/40979105-21927a40-68d5-11e8-9919-e4d58f6f6f0c.png"> 2. Send the email 3. Open the email and click on the link. 4. Check the contact history 5. See that the caracter in the link has been encoded as &#38 <img width="930" alt="capture d ecran 2018-06-05 a 15 15 29" src="https://user-images.githubusercontent.com/31535432/40979213-69ba5842-68d5-11e8-8205-51bc1ed3ad9d.png"> ### Log errors: _Please check for related errors in the latest log file in [mautic root]/app/log/ and/or the web server's logs and post them here. Be sure to remove sensitive information if applicable._
2.0
wrong URL encodage for special caracters - **Please DO NOT report security vulnerabilities here. Send them to security@mautic.com instead.** What type of report is this: | Q | A | ---| --- | Bug report? | X | Feature request? | | Enhancement? | ## Description: When using a special caracter un URL (like &) it is not encoded right. ## If a bug: | Q | A | --- | --- | Mautic version | | PHP version | ### Steps to reproduce: 1. Create an email, add a link and use the caracter & in the link <img width="686" alt="capture d ecran 2018-06-05 a 15 25 40" src="https://user-images.githubusercontent.com/31535432/40979105-21927a40-68d5-11e8-9919-e4d58f6f6f0c.png"> 2. Send the email 3. Open the email and click on the link. 4. Check the contact history 5. See that the caracter in the link has been encoded as &#38 <img width="930" alt="capture d ecran 2018-06-05 a 15 15 29" src="https://user-images.githubusercontent.com/31535432/40979213-69ba5842-68d5-11e8-8205-51bc1ed3ad9d.png"> ### Log errors: _Please check for related errors in the latest log file in [mautic root]/app/log/ and/or the web server's logs and post them here. Be sure to remove sensitive information if applicable._
test
wrong url encodage for special caracters please do not report security vulnerabilities here send them to security mautic com instead what type of report is this q a bug report x feature request enhancement description when using a special caracter un url like it is not encoded right if a bug q a mautic version php version steps to reproduce create an email add a link and use the caracter in the link img width alt capture d ecran a src send the email open the email and click on the link check the contact history see that the caracter in the link has been encoded as img width alt capture d ecran a src log errors please check for related errors in the latest log file in app log and or the web server s logs and post them here be sure to remove sensitive information if applicable
1
167,642
13,038,389,975
IssuesEvent
2020-07-28 15:08:29
PhilipDaniels/rtest
https://api.github.com/repos/PhilipDaniels/rtest
opened
Consider linking to cargo as a crate rather than executing external processes
compilation test-running
It is possible to link against cargo as a crate, for example https://github.com/roblabla/cargo-travis But this has downsides (see the README above) and https://users.rust-lang.org/t/how-stable-is-cargos-message-format-json/15662
1.0
Consider linking to cargo as a crate rather than executing external processes - It is possible to link against cargo as a crate, for example https://github.com/roblabla/cargo-travis But this has downsides (see the README above) and https://users.rust-lang.org/t/how-stable-is-cargos-message-format-json/15662
test
consider linking to cargo as a crate rather than executing external processes it is possible to link against cargo as a crate for example but this has downsides see the readme above and
1
447,564
31,714,487,142
IssuesEvent
2023-09-09 17:38:13
gnustep/libs-base
https://api.github.com/repos/gnustep/libs-base
closed
making gnu step look like mac
documentation
in the web site says that gnustep can look like mac but i cannot acces to learn more so i was wondering if someone can help me to get the amazing look of mac in gnustep
1.0
making gnu step look like mac - in the web site says that gnustep can look like mac but i cannot acces to learn more so i was wondering if someone can help me to get the amazing look of mac in gnustep
non_test
making gnu step look like mac in the web site says that gnustep can look like mac but i cannot acces to learn more so i was wondering if someone can help me to get the amazing look of mac in gnustep
0
26,110
5,224,738,911
IssuesEvent
2017-01-27 16:15:47
backdrop-ops/backdropcms.org
https://api.github.com/repos/backdrop-ops/backdropcms.org
closed
User guide should consistently use `example.com` for example URLs
docs - needs work type - documentation
On the [quickstart](https://backdropcms.org/user-guide/quick-start) guide we are jumping between `mysite.com` (which links out to some commercial service) and `yoursite.com/user` We should consistently use `example.com` which is reserved for documentation purposes: https://en.wikipedia.org/wiki/Example.com
1.0
User guide should consistently use `example.com` for example URLs - On the [quickstart](https://backdropcms.org/user-guide/quick-start) guide we are jumping between `mysite.com` (which links out to some commercial service) and `yoursite.com/user` We should consistently use `example.com` which is reserved for documentation purposes: https://en.wikipedia.org/wiki/Example.com
non_test
user guide should consistently use example com for example urls on the guide we are jumping between mysite com which links out to some commercial service and yoursite com user we should consistently use example com which is reserved for documentation purposes
0
339,139
10,242,742,662
IssuesEvent
2019-08-20 06:14:24
ubclaunchpad/rocket2
https://api.github.com/repos/ubclaunchpad/rocket2
closed
Use Github Actions CI
feature request high priority research tooling & deployment
**Please give a one-sentence summary of the feature you would like to see.** Since travis is no longer available, we should try and use Github Actions Python CI instead. **Please give as many details as possible about the requirements for the feature.** Just use it. Make sure it works. **Please list any additional context.** Travis CI contacted us saying that our `ubclaunchpad` account was suspected of mining cryptocurrency using their services (as far as I know, we don't), and so they have suspended our account. This means all of Travis doesn't work, which means we no longer have a reliable way of checking pull requests. We need to replace it.
1.0
Use Github Actions CI - **Please give a one-sentence summary of the feature you would like to see.** Since travis is no longer available, we should try and use Github Actions Python CI instead. **Please give as many details as possible about the requirements for the feature.** Just use it. Make sure it works. **Please list any additional context.** Travis CI contacted us saying that our `ubclaunchpad` account was suspected of mining cryptocurrency using their services (as far as I know, we don't), and so they have suspended our account. This means all of Travis doesn't work, which means we no longer have a reliable way of checking pull requests. We need to replace it.
non_test
use github actions ci please give a one sentence summary of the feature you would like to see since travis is no longer available we should try and use github actions python ci instead please give as many details as possible about the requirements for the feature just use it make sure it works please list any additional context travis ci contacted us saying that our ubclaunchpad account was suspected of mining cryptocurrency using their services as far as i know we don t and so they have suspended our account this means all of travis doesn t work which means we no longer have a reliable way of checking pull requests we need to replace it
0
34,991
4,958,806,455
IssuesEvent
2016-12-02 11:02:17
hoangzinh/test-github-issues
https://api.github.com/repos/hoangzinh/test-github-issues
closed
Email notification is an unstable test - aa6c5c6b0
PRIORITY UNSTABLED TEST
Unfortunately, we found a new unstable test `features/unstable_test.feature:5` (╯°□°)╯︵ ┻━┻ For more details, please look at this Circleci build Hope it will be sorted out soon.
1.0
Email notification is an unstable test - aa6c5c6b0 - Unfortunately, we found a new unstable test `features/unstable_test.feature:5` (╯°□°)╯︵ ┻━┻ For more details, please look at this Circleci build Hope it will be sorted out soon.
test
email notification is an unstable test unfortunately we found a new unstable test features unstable test feature ╯°□°)╯︵ ┻━┻ for more details please look at this circleci build hope it will be sorted out soon
1
12,804
8,122,331,510
IssuesEvent
2018-08-16 11:13:40
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
Duplicate String Literal Added on Autocomplete Partial String
bug topic:editor usability
Godot Version: 3.0.6-stable Current Behavior: - Open Editor - Enter `if Input.is_action_pressed("ui_up"):` - Change `ui_up` to `ui_right` and select it from the autocomplete menu - An extra `"` gets added at the end Expected: - The extra `"` should not be added
True
Duplicate String Literal Added on Autocomplete Partial String - Godot Version: 3.0.6-stable Current Behavior: - Open Editor - Enter `if Input.is_action_pressed("ui_up"):` - Change `ui_up` to `ui_right` and select it from the autocomplete menu - An extra `"` gets added at the end Expected: - The extra `"` should not be added
non_test
duplicate string literal added on autocomplete partial string godot version stable current behavior open editor enter if input is action pressed ui up change ui up to ui right and select it from the autocomplete menu an extra gets added at the end expected the extra should not be added
0
183,995
14,266,180,926
IssuesEvent
2020-11-20 18:16:49
elastic/elasticsearch
https://api.github.com/repos/elastic/elasticsearch
closed
CI: LongRareTerms failing with NPE
:Analytics/Aggregations >test-failure Team:Analytics
<!-- Please fill out the following information, and ensure you have attempted to reproduce locally --> **Build scan**: https://gradle-enterprise.elastic.co/s/2ilieh4lpxote **Repro line**: /gradlew ':server:test' --tests "org.elasticsearch.search.aggregations.bucket.terms.LongRareTermsTests.testReduceRandom" -Dtests.seed=4439B24ABA377C7F -Dtests.security.manager=true -Dtests.locale=cs -Dtests.timezone=Australia/Lindeman -Druntime.java=8 REPRODUCE WITH: ./gradlew ':server:test' --tests "org.elasticsearch.search.aggregations.bucket.terms.LongRareTermsTests.testReduceRandom" -Dtests.seed=4439B24ABA377C7F -Dtests.security.manager=true -Dtests.locale=cs -Dtests.timezone=Australia/Lindeman -Druntime.java=8 **Reproduces locally?**: Yes. **Applicable branches**: 7.10 **Failure history**: https://build-stats.elastic.co/goto/0fae824715006863554c738c6df7f5ab Looks to have started around 2 days ago, on the 18th. **Failure excerpt**: ``` at __randomizedtesting.SeedInfo.seed([4439B24ABA377C7F:84041E5B9C6AE529]:0) at org.elasticsearch.search.aggregations.bucket.terms.InternalMappedRareTerms.writeTermTypeInfoTo(InternalMappedRareTerms.java:84) at org.elasticsearch.search.aggregations.bucket.terms.InternalRareTerms.doWriteTo(InternalRareTerms.java:143) at org.elasticsearch.search.aggregations.InternalAggregation.writeTo(InternalAggregation.java:202) at org.elasticsearch.common.io.stream.StreamOutput.writeNamedWriteable(StreamOutput.java:1115) at org.elasticsearch.test.ESTestCase.lambda$copyNamedWriteable$15(ESTestCase.java:1310) at org.elasticsearch.test.ESTestCase.copyInstance(ESTestCase.java:1318) at org.elasticsearch.test.ESTestCase.copyNamedWriteable(ESTestCase.java:1309) at org.elasticsearch.test.ESTestCase.copyNamedWriteable(ESTestCase.java:1300) at org.elasticsearch.test.InternalAggregationTestCase.testReduceRandom(InternalAggregationTestCase.java:394) ```
1.0
CI: LongRareTerms failing with NPE - <!-- Please fill out the following information, and ensure you have attempted to reproduce locally --> **Build scan**: https://gradle-enterprise.elastic.co/s/2ilieh4lpxote **Repro line**: /gradlew ':server:test' --tests "org.elasticsearch.search.aggregations.bucket.terms.LongRareTermsTests.testReduceRandom" -Dtests.seed=4439B24ABA377C7F -Dtests.security.manager=true -Dtests.locale=cs -Dtests.timezone=Australia/Lindeman -Druntime.java=8 REPRODUCE WITH: ./gradlew ':server:test' --tests "org.elasticsearch.search.aggregations.bucket.terms.LongRareTermsTests.testReduceRandom" -Dtests.seed=4439B24ABA377C7F -Dtests.security.manager=true -Dtests.locale=cs -Dtests.timezone=Australia/Lindeman -Druntime.java=8 **Reproduces locally?**: Yes. **Applicable branches**: 7.10 **Failure history**: https://build-stats.elastic.co/goto/0fae824715006863554c738c6df7f5ab Looks to have started around 2 days ago, on the 18th. **Failure excerpt**: ``` at __randomizedtesting.SeedInfo.seed([4439B24ABA377C7F:84041E5B9C6AE529]:0) at org.elasticsearch.search.aggregations.bucket.terms.InternalMappedRareTerms.writeTermTypeInfoTo(InternalMappedRareTerms.java:84) at org.elasticsearch.search.aggregations.bucket.terms.InternalRareTerms.doWriteTo(InternalRareTerms.java:143) at org.elasticsearch.search.aggregations.InternalAggregation.writeTo(InternalAggregation.java:202) at org.elasticsearch.common.io.stream.StreamOutput.writeNamedWriteable(StreamOutput.java:1115) at org.elasticsearch.test.ESTestCase.lambda$copyNamedWriteable$15(ESTestCase.java:1310) at org.elasticsearch.test.ESTestCase.copyInstance(ESTestCase.java:1318) at org.elasticsearch.test.ESTestCase.copyNamedWriteable(ESTestCase.java:1309) at org.elasticsearch.test.ESTestCase.copyNamedWriteable(ESTestCase.java:1300) at org.elasticsearch.test.InternalAggregationTestCase.testReduceRandom(InternalAggregationTestCase.java:394) ```
test
ci longrareterms failing with npe please fill out the following information and ensure you have attempted to reproduce locally build scan repro line gradlew server test tests org elasticsearch search aggregations bucket terms longraretermstests testreducerandom dtests seed dtests security manager true dtests locale cs dtests timezone australia lindeman druntime java reproduce with gradlew server test tests org elasticsearch search aggregations bucket terms longraretermstests testreducerandom dtests seed dtests security manager true dtests locale cs dtests timezone australia lindeman druntime java reproduces locally yes applicable branches failure history looks to have started around days ago on the failure excerpt at randomizedtesting seedinfo seed at org elasticsearch search aggregations bucket terms internalmappedrareterms writetermtypeinfoto internalmappedrareterms java at org elasticsearch search aggregations bucket terms internalrareterms dowriteto internalrareterms java at org elasticsearch search aggregations internalaggregation writeto internalaggregation java at org elasticsearch common io stream streamoutput writenamedwriteable streamoutput java at org elasticsearch test estestcase lambda copynamedwriteable estestcase java at org elasticsearch test estestcase copyinstance estestcase java at org elasticsearch test estestcase copynamedwriteable estestcase java at org elasticsearch test estestcase copynamedwriteable estestcase java at org elasticsearch test internalaggregationtestcase testreducerandom internalaggregationtestcase java
1
30,486
2,723,879,431
IssuesEvent
2015-04-14 14:58:40
nprapps/dailygraphics
https://api.github.com/repos/nprapps/dailygraphics
closed
Backport automated Google Spreadsheet creation from podcast app
Priority: Normal
Need to talk to @eads about this one.
1.0
Backport automated Google Spreadsheet creation from podcast app - Need to talk to @eads about this one.
non_test
backport automated google spreadsheet creation from podcast app need to talk to eads about this one
0
66,994
3,265,118,458
IssuesEvent
2015-10-22 14:59:29
theodi/member-directory
https://api.github.com/repos/theodi/member-directory
opened
Changes to the student confirmation page text
0 - Backlog priority: high
Amend text in grey to: Your membership number is XXX and you’ll shortly receive a welcome email with all the information you need about your membership. In the meantime why not take a look at some of our open data stories and the other members in the network? Omit entire sentence ‘If you’d like to promote your membership you can display a badge on your site…..’ Omit ‘or preview how your organisation will show up in the directory.’ Keep the Supporter badge <!--- @huboard:{"order":0.014980316162109375,"milestone_order":0.232421875} -->
1.0
Changes to the student confirmation page text - Amend text in grey to: Your membership number is XXX and you’ll shortly receive a welcome email with all the information you need about your membership. In the meantime why not take a look at some of our open data stories and the other members in the network? Omit entire sentence ‘If you’d like to promote your membership you can display a badge on your site…..’ Omit ‘or preview how your organisation will show up in the directory.’ Keep the Supporter badge <!--- @huboard:{"order":0.014980316162109375,"milestone_order":0.232421875} -->
non_test
changes to the student confirmation page text amend text in grey to your membership number is xxx and you’ll shortly receive a welcome email with all the information you need about your membership in the meantime why not take a look at some of our open data stories and the other members in the network omit entire sentence ‘if you’d like to promote your membership you can display a badge on your site… ’ omit ‘or preview how your organisation will show up in the directory ’ keep the supporter badge huboard order milestone order
0
33,235
4,819,230,512
IssuesEvent
2016-11-04 18:37:14
leafo/scssphp
https://api.github.com/repos/leafo/scssphp
closed
Quotes error - Interpolation of lists
has-test-case
##### Expected behavior right half of the image ##### Actual behavior left half of the image The transform property is in quotes. ##### Reproduce steps `<link href="{{ [ 'assets/sass/main.scss' ]|theme }}" rel="stylesheet">` => `@include vendor('transform', 'translateX(#{_size(navPanel)})');` for example with other compilers like Scout-app for windows, it works fine. ##### October build 365 ![image](https://cloud.githubusercontent.com/assets/17984549/19606316/b589a2fa-97c1-11e6-99d1-df6cf5784e17.png)
1.0
Quotes error - Interpolation of lists - ##### Expected behavior right half of the image ##### Actual behavior left half of the image The transform property is in quotes. ##### Reproduce steps `<link href="{{ [ 'assets/sass/main.scss' ]|theme }}" rel="stylesheet">` => `@include vendor('transform', 'translateX(#{_size(navPanel)})');` for example with other compilers like Scout-app for windows, it works fine. ##### October build 365 ![image](https://cloud.githubusercontent.com/assets/17984549/19606316/b589a2fa-97c1-11e6-99d1-df6cf5784e17.png)
test
quotes error interpolation of lists expected behavior right half of the image actual behavior left half of the image the transform property is in quotes reproduce steps link href assets sass main scss theme rel stylesheet include vendor transform translatex size navpanel for example with other compilers like scout app for windows it works fine october build
1
189,030
15,175,965,856
IssuesEvent
2021-02-14 02:15:18
unoplatform/uno
https://api.github.com/repos/unoplatform/uno
closed
Following instructions for Xamarin forms and get same error each time
project/documentation triage/needs-information
<!-- Please only use this template for reporting issues with the documentation where the fix isn't clear. We greatly appreciate it when people send in pull-requests with fixes. If there's any friction, apart from knowledge, that's preventing you from doing so please let us know below. --> ## On which page? https://github.com/unoplatform/Uno.Xamarin.Forms.Platform ## What's wrong? Following the exact instructions for creating a first app using xamarin forms and uno, i always get the following issue with "failed to execute the linker" I have made no code changes at all. 1>------ Build started: Project: App4.Wasm, Configuration: Debug Any CPU ------ 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(45,17,45,58): warning Uno0001: Windows.UI.Xaml.DebugSettings.EnableFrameRateCounter is not implemented in Uno 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(62,21,62,45): warning Uno0001: Windows.ApplicationModel.Activation.LaunchActivatedEventArgs.PreviousExecutionState is not implemented in Uno 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(101,28,101,61): warning Uno0001: Windows.ApplicationModel.SuspendingOperation.GetDeferral() is not implemented in Uno 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(103,13,103,30): warning Uno0001: Windows.ApplicationModel.SuspendingDeferral.Complete() is not implemented in Uno 1>App4.Wasm -> C:\Users\andy\source\repos\App4\App4.Wasm\bin\Debug\netstandard2.0\App4.Wasm.dll 1>Determining projects to restore... 1>All projects are up-to-date for restore. 1>packager2 -> C:\Users\andy\AppData\Local\Temp\mono-wasm-f5cfc67c8ed\packager2.exe 1>C:\Users\andy\.nuget\packages\uno.wasm.bootstrap\1.0.10\build\Uno.Wasm.Bootstrap.targets(125,5): error : System.Exception: Failed to execute the linker 1>C:\Users\andy\.nuget\packages\uno.wasm.bootstrap\1.0.10\build\Uno.Wasm.Bootstrap.targets(125,5): error : at Uno.Wasm.Bootstrap.ShellTask_v1be21f07f93cd3a369c95001065dbda54db9dca8.RunPackager() in D:\a\1\s\src\Uno.Wasm.Bootstrap\ShellTask.cs:line 529 1>C:\Users\andy\.nuget\packages\uno.wasm.bootstrap\1.0.10\build\Uno.Wasm.Bootstrap.targets(125,5): error : at Uno.Wasm.Bootstrap.ShellTask_v1be21f07f93cd3a369c95001065dbda54db9dca8.Execute() in D:\a\1\s\src\Uno.Wasm.Bootstrap\ShellTask.cs:line 149 ## Any feedback?
1.0
Following instructions for Xamarin forms and get same error each time - <!-- Please only use this template for reporting issues with the documentation where the fix isn't clear. We greatly appreciate it when people send in pull-requests with fixes. If there's any friction, apart from knowledge, that's preventing you from doing so please let us know below. --> ## On which page? https://github.com/unoplatform/Uno.Xamarin.Forms.Platform ## What's wrong? Following the exact instructions for creating a first app using xamarin forms and uno, i always get the following issue with "failed to execute the linker" I have made no code changes at all. 1>------ Build started: Project: App4.Wasm, Configuration: Debug Any CPU ------ 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(45,17,45,58): warning Uno0001: Windows.UI.Xaml.DebugSettings.EnableFrameRateCounter is not implemented in Uno 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(62,21,62,45): warning Uno0001: Windows.ApplicationModel.Activation.LaunchActivatedEventArgs.PreviousExecutionState is not implemented in Uno 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(101,28,101,61): warning Uno0001: Windows.ApplicationModel.SuspendingOperation.GetDeferral() is not implemented in Uno 1>C:\Users\andy\source\repos\App4\App4.UWP\App.xaml.cs(103,13,103,30): warning Uno0001: Windows.ApplicationModel.SuspendingDeferral.Complete() is not implemented in Uno 1>App4.Wasm -> C:\Users\andy\source\repos\App4\App4.Wasm\bin\Debug\netstandard2.0\App4.Wasm.dll 1>Determining projects to restore... 1>All projects are up-to-date for restore. 1>packager2 -> C:\Users\andy\AppData\Local\Temp\mono-wasm-f5cfc67c8ed\packager2.exe 1>C:\Users\andy\.nuget\packages\uno.wasm.bootstrap\1.0.10\build\Uno.Wasm.Bootstrap.targets(125,5): error : System.Exception: Failed to execute the linker 1>C:\Users\andy\.nuget\packages\uno.wasm.bootstrap\1.0.10\build\Uno.Wasm.Bootstrap.targets(125,5): error : at Uno.Wasm.Bootstrap.ShellTask_v1be21f07f93cd3a369c95001065dbda54db9dca8.RunPackager() in D:\a\1\s\src\Uno.Wasm.Bootstrap\ShellTask.cs:line 529 1>C:\Users\andy\.nuget\packages\uno.wasm.bootstrap\1.0.10\build\Uno.Wasm.Bootstrap.targets(125,5): error : at Uno.Wasm.Bootstrap.ShellTask_v1be21f07f93cd3a369c95001065dbda54db9dca8.Execute() in D:\a\1\s\src\Uno.Wasm.Bootstrap\ShellTask.cs:line 149 ## Any feedback?
non_test
following instructions for xamarin forms and get same error each time on which page what s wrong following the exact instructions for creating a first app using xamarin forms and uno i always get the following issue with failed to execute the linker i have made no code changes at all build started project wasm configuration debug any cpu c users andy source repos uwp app xaml cs warning windows ui xaml debugsettings enableframeratecounter is not implemented in uno c users andy source repos uwp app xaml cs warning windows applicationmodel activation launchactivatedeventargs previousexecutionstate is not implemented in uno c users andy source repos uwp app xaml cs warning windows applicationmodel suspendingoperation getdeferral is not implemented in uno c users andy source repos uwp app xaml cs warning windows applicationmodel suspendingdeferral complete is not implemented in uno wasm c users andy source repos wasm bin debug wasm dll determining projects to restore all projects are up to date for restore c users andy appdata local temp mono wasm exe c users andy nuget packages uno wasm bootstrap build uno wasm bootstrap targets error system exception failed to execute the linker c users andy nuget packages uno wasm bootstrap build uno wasm bootstrap targets error at uno wasm bootstrap shelltask runpackager in d a s src uno wasm bootstrap shelltask cs line c users andy nuget packages uno wasm bootstrap build uno wasm bootstrap targets error at uno wasm bootstrap shelltask execute in d a s src uno wasm bootstrap shelltask cs line any feedback
0
267,015
23,274,184,295
IssuesEvent
2022-08-05 04:45:07
StoneCypher/fsl
https://api.github.com/repos/StoneCypher/fsl
closed
Remember to make theme heritable from graph to state
Graph use cases Test suite Graph language Visual improvements
This is required for the style cascase #114 to work out
1.0
Remember to make theme heritable from graph to state - This is required for the style cascase #114 to work out
test
remember to make theme heritable from graph to state this is required for the style cascase to work out
1
4,426
3,371,177,973
IssuesEvent
2015-11-23 17:59:46
angular/angular
https://api.github.com/repos/angular/angular
closed
How to bundle, import, and package separate modules
comp: build/dev-productivity effort3: hard (week) P!: backlog type: RFC / discussion / question
TL;DR skip to the table at the bottom to see how I'm suggesting we move things around ## Partial Backlog Based on Proposal and Discussion ### Done - [x] #2680 @jeffbcross _refactor(http): move http into its own module as sibling of angular2_ - [x] #2948 @jeffbcross _fix(typings): typings template should make module name at end of file dynamic_ - [x] #3540 @jeffbcross _feat(typings): bundle typings should be able to declare references_ - [x] #3590 @jeffbcross _feat(package): add "typescript" field with typings information_ - [x] #3555 _feat(bundles): include all bundles in npm distributions_ - [x] cancelled #3416 _refactor(di): move di into its own top-level module_ - [x] cancelled #3415 _refactor(facade): move facade into separate top-level module_ - [x] cancelled #3588 _feat(dist): include "core" modules in angular2 npm dist_ ### In-Progress - [ ] #3713 @jeffbcross _chore: re-arrange sources to better reflect npm package and import paths_ - [ ] #3714 @jeffbcross _chore: change bundle rules to generate one sfx module and several minimal modules_ ### Unscheduled - [ ] #3423 _export type definitions for test_lib_ - [ ] #3651 _docs: API reference should also include global/ambient object names included in sfx bundles_ - [ ] #3712 _chore(zone): add zone.js to bundles and npm distribution_ # Goals * Simple, predictable, consistent package installation and consumption experience for end-users * Have minimal, logical bundles for production use (router bundle, core bundle, http bundle, full bundle with deps, etc) * Pave path toward eventually being able to separate modules into separate repositories with independent development lifecycles * Maximum symmetry between Dart and JavaScript publishing (naming and structure mostly) * Logical import and require paths for end users, without having to think about hierarchy or relationships of modules (import \{Http\} from 'angular/http' or require('angular/router')) # Use Cases * User wants to create a quick code sample on plunker * User wants to create a production app with no build process but reasonable byte size * User wants to create a production app with a proprietary build process that creates minified bundles * User wants to create a production app on HTTP/2 server, using SystemJS loader, with no bundles, but minified modules that are loaded when needed. # Characteristics of Modules * Is likely to be used outside of an angular app, without a dependency on Angular? * Is it a runtime requirement of angular2? * Is useful as a runtime utility for third-party Angular libraries? * Is it primarily a tool, not useful in an app at runtime? Module | Used Outside | Required | Useful to Plugins | Tooling | -----------------------------------|--------------|----------|-------------------|------------| Angular2 | | &#10003; | | | Forms | | | &#10003; | | Facade | | &#10003; | &#10003; | | Router | | | | | Change Detection | | &#10003; | | | Zone.js | &#10003; | &#10003; | &#10003; | | Ts2Dart | | | &#10003; | &#10003; | Dependency Injection | &#10003; | &#10003; | &#10003; | | Http | &#10003; | | &#10003; | | WebSocket, Storage, etc | &#10003; | | &#10003; | | Tactical | &#10003; | | | | Benchpress | &#10003; | | | &#10003; | Material | | | | | Test lib | | | &#10003; | | NgUpgrade | | | &#10003; | | Universal | | | &#10003; | | React Native Renderer | | | &#10003; | | Workers | | | &#10003; | | # Proposal * If it is **not Required** and **not Tooling**, it becomes a subdirectory of modules/angular2/ and gets its own bundle. * If it is **not Required**, it should be moved to the top-level of the angular2/modules directory, , i.e. angular2/modules/http * All bundles will be pushed to code.angularjs.org * All runtime modules that are currently part of the core repo should be published inside of the angular2 package on NPM and Pub (zone.js should also be added to the npm dist). | Module | NPM Package Name | Pub Package | Bundle | Bundle Global | Primary Runtime JS Import Namespace* | Target Source Home (repo / path) | |-------------------------|-----------------------|-------------------|----------------|---------------|--------------------------------------|--------------------------------------------------------| | Angular2/Core | angular2 | angular2 | core.js | ngCore | angular2/core | angular/angular/modules/angular2/core | | Forms | angular2 | angular2 | forms.js | ngForms | angular2/forms | angular/angular/modules/angular2/forms | | Facade | angular2 | angular2 | | | angular2/core | angular/angular/modules/angular2/core/facade | | Router | angular2 | angular2 | router.js | ngRouter | angular2/router | angular/angular/modules/angular2/router | | Change Detection | angular2 | angular2 | | | angular2/core | angular/angular/modules/angular2/core/change_detection | | Zone.js | zone.js | | zone.js | ngZone | | angular/zone.js | | Ts2Dart | ts2dart | | | | | angular/ts2dart | | Dependency Injection | di | | | | angular2/core | angular/angular/modules/angular2/core/di | | Http | ngHttp | angular2 | http.js | ngHttp | angular2/http | angular/angular/modules/angular2/http | | WebSocket, Storage, etc | ng... | ... | ....js | ng... | angular2/... | angular/angular/modules/... | | Tactical | tactical | | tactical.js | ngTactical | | angular/tactical | | Benchpress | benchpress | benchpress | | | | angular/angular/modules/benchpress | | Material | angular-material2 | angular2_material | material2.js | ngMaterial | material2 | angular/angular/modules/material2 | | Test lib | angular2 | angular2 | test_lib.js | ngTestLib | angular2/test_lib | angular/angular/modules/angular2/test_lib | | NgUpgrade | angular2 | angular2_upgrade | upgrade.js | ngUpgrade | angular2/upgrade | angular/angular/modules/angular2/upgrade | | Universal | angular2_universal | | ** | | angular2_universal | angular/universal | | React Native Renderer | angular2_react_native | | | | angular2_react_native | angular/react-native-render | | Web Workers | angular2_workers | angular2_workers | web-workers.js | ngWebWorkers | angular/web-workers | angular/angular/modules/angular2/web-workers | \* Primary import namespaces when used in the context of an angular 2 app, as opposed to being used independently of Angular. For example: `import {FORM_DIRECTIVES} from 'angular2/forms';`. \** No bundle because only used in nodejs environment. ## Bundles These will be the bundles that are exported: * angular2.js (includes all other bundles) * core.js * forms.js * http.js * router.js * web-worker.js * zone.js And these are the extensions with which each bundle will be published to code.angularjs.org and npm. Bundles will be in the `bundles` folder of the npm package and will be published to code.angularjs.org as `code.angularjs.org/<version-number>/<bundle-name>`. * .js * .d.ts (typings are in separate `typings` folder in npm distribution) * .js.map * .min.js (Using uglify for now, eventually using angular/ts-minify) * .dev.js (runs in dev mode) * .dev.js.map * -testing.js * -testing.js.map * -testing.d.ts The angular2 bundle will also get an sfx version, which will include dependencies like Rx, reflect-metadata, traceur-runtime, and will export a global `ng` object. * angular2.sfx.js * angular2.sfx.min.js * angular2.sfx.js.map * angular2.sfx.min.js.map The `-testing` bundles will include testing utilities and mocks for the respective module. The core-testing.js will include the bulk of the testing utilities, leaving other testing bundles to mostly just include mocks.
1.0
How to bundle, import, and package separate modules - TL;DR skip to the table at the bottom to see how I'm suggesting we move things around

## Partial Backlog Based on Proposal and Discussion

### Done

- [x] #2680 @jeffbcross _refactor(http): move http into its own module as sibling of angular2_
- [x] #2948 @jeffbcross _fix(typings): typings template should make module name at end of file dynamic_
- [x] #3540 @jeffbcross _feat(typings): bundle typings should be able to declare references_
- [x] #3590 @jeffbcross _feat(package): add "typescript" field with typings information_
- [x] #3555 _feat(bundles): include all bundles in npm distributions_
- [x] cancelled #3416 _refactor(di): move di into its own top-level module_
- [x] cancelled #3415 _refactor(facade): move facade into separate top-level module_
- [x] cancelled #3588 _feat(dist): include "core" modules in angular2 npm dist_

### In-Progress

- [ ] #3713 @jeffbcross _chore: re-arrange sources to better reflect npm package and import paths_
- [ ] #3714 @jeffbcross _chore: change bundle rules to generate one sfx module and several minimal modules_

### Unscheduled

- [ ] #3423 _export type definitions for test_lib_
- [ ] #3651 _docs: API reference should also include global/ambient object names included in sfx bundles_
- [ ] #3712 _chore(zone): add zone.js to bundles and npm distribution_

# Goals

* Simple, predictable, consistent package installation and consumption experience for end users
* Have minimal, logical bundles for production use (router bundle, core bundle, http bundle, full bundle with deps, etc.)
* Pave a path toward eventually being able to separate modules into separate repositories with independent development lifecycles
* Maximum symmetry between Dart and JavaScript publishing (naming and structure, mostly)
* Logical import and require paths for end users, without having to think about the hierarchy or relationships of modules (`import {Http} from 'angular/http'` or `require('angular/router')`)

# Use Cases

* User wants to create a quick code sample on plunker
* User wants to create a production app with no build process but reasonable byte size
* User wants to create a production app with a proprietary build process that creates minified bundles
* User wants to create a production app on an HTTP/2 server, using the SystemJS loader, with no bundles, but minified modules that are loaded when needed

# Characteristics of Modules

* Is it likely to be used outside of an Angular app, without a dependency on Angular?
* Is it a runtime requirement of angular2?
* Is it useful as a runtime utility for third-party Angular libraries?
* Is it primarily a tool, not useful in an app at runtime?

Module                  | Used Outside | Required | Useful to Plugins | Tooling  |
------------------------|--------------|----------|-------------------|----------|
Angular2                |              | &#10003; |                   |          |
Forms                   |              |          | &#10003;          |          |
Facade                  |              | &#10003; | &#10003;          |          |
Router                  |              |          |                   |          |
Change Detection        |              | &#10003; |                   |          |
Zone.js                 | &#10003;     | &#10003; | &#10003;          |          |
Ts2Dart                 |              |          | &#10003;          | &#10003; |
Dependency Injection    | &#10003;     | &#10003; | &#10003;          |          |
Http                    | &#10003;     |          | &#10003;          |          |
WebSocket, Storage, etc | &#10003;     |          | &#10003;          |          |
Tactical                | &#10003;     |          |                   |          |
Benchpress              | &#10003;     |          |                   | &#10003; |
Material                |              |          |                   |          |
Test lib                |              |          | &#10003;          |          |
NgUpgrade               |              |          | &#10003;          |          |
Universal               |              |          | &#10003;          |          |
React Native Renderer   |              |          | &#10003;          |          |
Workers                 |              |          | &#10003;          |          |

# Proposal

* If it is **not Required** and **not Tooling**, it becomes a subdirectory of modules/angular2/ and gets its own bundle.
* If it is **not Required**, it should be moved to the top-level of the angular2/modules directory, i.e. angular2/modules/http
* All bundles will be pushed to code.angularjs.org
* All runtime modules that are currently part of the core repo should be published inside of the angular2 package on NPM and Pub (zone.js should also be added to the npm dist).
| Module | NPM Package Name | Pub Package | Bundle | Bundle Global | Primary Runtime JS Import Namespace* | Target Source Home (repo / path) |
|-------------------------|-----------------------|-------------------|----------------|---------------|--------------------------------------|--------------------------------------------------------|
| Angular2/Core | angular2 | angular2 | core.js | ngCore | angular2/core | angular/angular/modules/angular2/core |
| Forms | angular2 | angular2 | forms.js | ngForms | angular2/forms | angular/angular/modules/angular2/forms |
| Facade | angular2 | angular2 | | | angular2/core | angular/angular/modules/angular2/core/facade |
| Router | angular2 | angular2 | router.js | ngRouter | angular2/router | angular/angular/modules/angular2/router |
| Change Detection | angular2 | angular2 | | | angular2/core | angular/angular/modules/angular2/core/change_detection |
| Zone.js | zone.js | | zone.js | ngZone | | angular/zone.js |
| Ts2Dart | ts2dart | | | | | angular/ts2dart |
| Dependency Injection | di | | | | angular2/core | angular/angular/modules/angular2/core/di |
| Http | ngHttp | angular2 | http.js | ngHttp | angular2/http | angular/angular/modules/angular2/http |
| WebSocket, Storage, etc | ng... | ... | ....js | ng... | angular2/... | angular/angular/modules/... |
| Tactical | tactical | | tactical.js | ngTactical | | angular/tactical |
| Benchpress | benchpress | benchpress | | | | angular/angular/modules/benchpress |
| Material | angular-material2 | angular2_material | material2.js | ngMaterial | material2 | angular/angular/modules/material2 |
| Test lib | angular2 | angular2 | test_lib.js | ngTestLib | angular2/test_lib | angular/angular/modules/angular2/test_lib |
| NgUpgrade | angular2 | angular2_upgrade | upgrade.js | ngUpgrade | angular2/upgrade | angular/angular/modules/angular2/upgrade |
| Universal | angular2_universal | | ** | | angular2_universal | angular/universal |
| React Native Renderer | angular2_react_native | | | | angular2_react_native | angular/react-native-render |
| Web Workers | angular2_workers | angular2_workers | web-workers.js | ngWebWorkers | angular/web-workers | angular/angular/modules/angular2/web-workers |

\* Primary import namespaces when used in the context of an angular 2 app, as opposed to being used independently of Angular. For example: `import {FORM_DIRECTIVES} from 'angular2/forms';`.

\** No bundle because only used in a nodejs environment.

## Bundles

These will be the bundles that are exported:

* angular2.js (includes all other bundles)
* core.js
* forms.js
* http.js
* router.js
* web-worker.js
* zone.js

And these are the extensions with which each bundle will be published to code.angularjs.org and npm. Bundles will be in the `bundles` folder of the npm package and will be published to code.angularjs.org as `code.angularjs.org/<version-number>/<bundle-name>`.

* .js
* .d.ts (typings are in a separate `typings` folder in the npm distribution)
* .js.map
* .min.js (using uglify for now, eventually using angular/ts-minify)
* .dev.js (runs in dev mode)
* .dev.js.map
* -testing.js
* -testing.js.map
* -testing.d.ts

The angular2 bundle will also get an sfx version, which will include dependencies like Rx, reflect-metadata, traceur-runtime, and will export a global `ng` object.

* angular2.sfx.js
* angular2.sfx.min.js
* angular2.sfx.js.map
* angular2.sfx.min.js.map

The `-testing` bundles will include testing utilities and mocks for the respective module. The core-testing.js will include the bulk of the testing utilities, leaving other testing bundles to mostly just include mocks.
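To make the publishing scheme concrete, here is a hedged sketch that expands a bundle name into the code.angularjs.org URLs implied by the extension list above. The `bundleArtifacts` helper and the example version string are invented for illustration — this is not an actual Angular build tool, and it ignores the note that `.d.ts` typings live in a separate `typings` folder on npm:

```javascript
// Extensions each bundle is published with, per the list above.
const EXTENSIONS = [
  '.js', '.d.ts', '.js.map', '.min.js',
  '.dev.js', '.dev.js.map',
  '-testing.js', '-testing.js.map', '-testing.d.ts',
];

// Illustrative helper: bundle name + version -> published URLs, following
// the code.angularjs.org/<version-number>/<bundle-name> pattern.
function bundleArtifacts(name, version) {
  return EXTENSIONS.map((ext) => `code.angularjs.org/${version}/${name}${ext}`);
}

console.log(bundleArtifacts('router', '2.0.0-alpha.35')[0]);
// code.angularjs.org/2.0.0-alpha.35/router.js
```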
non_test
0
328,096
9,986,022,095
IssuesEvent
2019-07-10 18:02:13
ansible/galaxy-dev
https://api.github.com/repos/ansible/galaxy-dev
opened
Re-design Galaxy import process to work with pulp-ansible
area/backend priority/high status/in-prorgress type/enhancement
Come up with a design that takes the existing Galaxy importer/worker and integrates it with `pulp-ansible`.
1.0
non_test
0