| Column | Dtype | Values (min .. max, or class count) |
| --- | --- | --- |
| Unnamed: 0 | int64 | 0 .. 832k |
| id | float64 | 2.49B .. 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 .. 19 |
| repo | stringlengths | 4 .. 112 |
| repo_url | stringlengths | 33 .. 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 .. 1.02k |
| labels | stringlengths | 4 .. 1.54k |
| body | stringlengths | 1 .. 262k |
| index | stringclasses | 17 values |
| text_combine | stringlengths | 95 .. 262k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 .. 252k |
| binary_label | int64 | 0 .. 1 |
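Read as (column, dtype, min .. max or class count), the summary above can be spot-checked once the data is loaded. A minimal sketch with invented rows, assuming the dataset is available as a pandas DataFrame (only a few of the columns are shown):

```python
import pandas as pd

# Toy frame mirroring part of the column summary above; the row values
# here are invented for illustration, only the names/dtypes come from it.
df = pd.DataFrame(
    {
        "id": pd.Series([11_889_903_100, 24_089_317_239], dtype="float64"),
        "type": ["IssuesEvent", "IssuesEvent"],   # stringclasses: 1 value
        "label": ["test", "non_test"],            # stringclasses: 2 values
        "binary_label": pd.Series([1, 0], dtype="int64"),
    }
)

# binary_label appears to be a 0/1 encoding of the label column.
assert (df["binary_label"] == (df["label"] == "test").astype("int64")).all()
print(df.dtypes.to_dict())
```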
Unnamed: 0: 149,316
id: 11,889,903,100
type: IssuesEvent
created_at: 2020-03-28 15:56:25
repo: trilinos/Trilinos
repo_url: https://api.github.com/repos/trilinos/Trilinos
action: opened
title: Over 150 total failures of several MueLu tests failing in the majority of ATDM Trilinos builds showing "aggregation: phase3 avoid singletons" diffs starting 2020-03-27
labels: ATDM Sev: Blocker PA: Linear Solvers client: ATDM impacting: tests pkg: MueLu type: bug
CC: @trilinos/muelu, @srajama1 (Trilinos Linear Solvers Product Lead)

<Checklist>
<???: Add label "client: ATDM">
<???: Add label "ATDM Sev: Blocker" (by default but could be other "ATDM Sev: XXX")>
<???: Add label "type: bug"?>
<???: Add label "impacting: tests"?>
<???: Add label for affected packages (e.g. "pkg: MueLu", "pkg: Tpetra", "pkg: Kokkos", etc.)>
<???: Add label "PA: ???Project Area???" (e.g. "PA: Linear Solvers", "PA: Data Services")>
<???: Add milestone "Initial cleanup of new ATDM ..." or "Keep promoted ATDM ...">
<???: Once GitHub Issue is created, add entries for tests to TrilinosATDMStatus/*.csv files>

## Next Action Status

<status-and-or-first-action>

## Description

As shown in [this query](https://testing-dev.sandia.gov/cdash/queryTests.php?project=Trilinos&date=2020-03-27&filtercount=7&showfilters=1&filtercombine=and&field1=groupname&compare1=62&value1=Experimental&field2=buildname&compare2=65&value2=Trilinos-atdm-&field3=block&field3count=6&field3field1=testname&field3compare1=61&field3value1=MueLu_CreateOperatorTpetra_MPI_1&field3field2=testname&field3compare2=61&field3value2=MueLu_CreateOperatorTpetra_MPI_4&field3field3=testname&field3compare3=61&field3value3=MueLu_ParameterListInterpreterTpetra_MPI_1&field3field4=testname&field3compare4=61&field3value4=MueLu_ParameterListInterpreterTpetra_MPI_4&field3field5=testname&field3compare5=61&field3value5=MueLu_ParameterListInterpreterTpetraHeavy_MPI_1&field3field6=testname&field3compare6=61&field3value6=MueLu_ParameterListInterpreterTpetraHeavy_MPI_4&field4=details&compare4=64&value4=Timeout&field5=testoutput&compare5=94&value5=Error%20initializing%20RM%20connection.%20Exiting&field6=testoutput&compare6=94&value6=OPAL%20ERROR%3A%20Unreachable&field7=testoutput&compare7=96&value7=srun%3A%20error%3A%20s_p_parse_file%3A%20unable%20to%20read%20.%2Fetc%2Fslurm%2Fslurm.conf.%3A%20Permission%20denied_) and [this query](https://testing-dev.sandia.gov/cdash/queryTests.php?project=Trilinos&date=2020-03-27&filtercount=9&showfilters=1&filtercombine=and&field1=groupname&compare1=62&value1=Experimental&field2=buildname&compare2=65&value2=Trilinos-atdm-&field3=testname&compare3=65&value3=MueLu&field4=details&compare4=64&value4=Timeout&field5=status&compare5=62&value5=passed&field6=testoutput&compare6=94&value6=Error%20initializing%20RM%20connection.%20Exiting&field7=testoutput&compare7=94&value7=OPAL%20ERROR%3A%20Unreachable&field8=testoutput&compare8=96&value8=srun%3A%20error%3A%20s_p_parse_file%3A%20unable%20to%20read%20.%2Fetc%2Fslurm%2Fslurm.conf.%3A%20Permission%20denied&field9=testoutput&compare9=95&value9=%2Baggregation%3A%20error%20on%20nodes%20with%20no%20on-rank%20neighbors%20%3D%200) the 6 tests:

* `MueLu_CreateOperatorTpetra_MPI_1`
* `MueLu_CreateOperatorTpetra_MPI_4`
* `MueLu_ParameterListInterpreterTpetra_MPI_1`
* `MueLu_ParameterListInterpreterTpetra_MPI_4`
* `MueLu_ParameterListInterpreterTpetraHeavy_MPI_1`
* `MueLu_ParameterListInterpreterTpetraHeavy_MPI_4`

in the 35 builds:

* `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_dbg`
* `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_dbg_cuda-aware-mpi`
* `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_opt`
* `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_opt_cuda-aware-mpi`
* `Trilinos-atdm-ats2-gnu-7.3.1-spmpi-2019.06.24_serial_static_dbg`
* `Trilinos-atdm-ats2-gnu-7.3.1-spmpi-2019.06.24_serial_static_opt`
* `Trilinos-atdm-cee-rhel6_clang-5.0.1_openmpi-4.0.2_serial_static_opt`
* `Trilinos-atdm-cee-rhel6_gnu-7.2.0_openmpi-4.0.2_serial_shared_opt`
* `Trilinos-atdm-cee-rhel6_intel-18.0.2_mpich2-3.2_openmp_static_opt`
* `Trilinos-atdm-cee-rhel6_intel-19.0.3_intelmpi-2018.4_serial_static_opt`
* `Trilinos-atdm-cts1-intel-18.0.2_openmpi-2.0.3_openmp_static_dbg`
* `Trilinos-atdm-cts1-intel-18.0.2_openmpi-2.0.3_openmp_static_opt`
* `Trilinos-atdm-cts1-intel-19.0.5_openmpi-4.0.1_openmp_static_dbg`
* `Trilinos-atdm-cts1-intel-19.0.5_openmpi-4.0.1_openmp_static_opt`
* `Trilinos-atdm-mutrino-intel-opt-openmp-HSW`
* `Trilinos-atdm-sems-rhel6-gnu-7.2.0-openmp-debug`
* `Trilinos-atdm-sems-rhel6-gnu-7.2.0-openmp-release`
* `Trilinos-atdm-sems-rhel6-gnu-7.2.0-openmp-release-debug`
* `Trilinos-atdm-sems-rhel6-intel-17.0.1-openmp-release`
* `Trilinos-atdm-sems-rhel7-intel-18.0.5-openmp-shared-debug`
* `Trilinos-atdm-sems-rhel7-intel-18.0.5-openmp-shared-release-debug`
* `Trilinos-atdm-tlcc2-intel-debug-openmp`
* `Trilinos-atdm-tlcc2-intel-opt-openmp`
* `Trilinos-atdm-waterman_cuda-9.2_fpic_static_opt`
* `Trilinos-atdm-waterman_cuda-9.2_shared_opt`
* `Trilinos-atdm-waterman-cuda-9.2-debug`
* `Trilinos-atdm-waterman-cuda-9.2-opt`
* `Trilinos-atdm-waterman-cuda-9.2-rdc-release-debug`
* `Trilinos-atdm-waterman-cuda-9.2-release-debug`
* `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-debug`
* `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-rdc-release-debug`
* `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-release`
* `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-release-debug`
* `Trilinos-atdm-white-ride-gnu-7.2.0-openmp-debug`
* `Trilinos-atdm-white-ride-gnu-7.2.0-openmp-release`

started failing on testing day 2020-03-27.
All of these failing tests show failures with diffs like:

```
+aggregation: phase3 avoid singletons = 0 [default]
+aggregation: error on nodes with no on-rank neighbors = 0 [default]
```

## Current Status on CDash

* [Status of these 6 MueLu tests in all ATDM Trilinos builds for the current testing day](https://testing-dev.sandia.gov/cdash/queryTests.php?project=Trilinos&filtercount=7&showfilters=1&filtercombine=and&field1=groupname&compare1=62&value1=Experimental&field2=buildname&compare2=65&value2=Trilinos-atdm-&field3=block&field3count=6&field3field1=testname&field3compare1=61&field3value1=MueLu_CreateOperatorTpetra_MPI_1&field3field2=testname&field3compare2=61&field3value2=MueLu_CreateOperatorTpetra_MPI_4&field3field3=testname&field3compare3=61&field3value3=MueLu_ParameterListInterpreterTpetra_MPI_1&field3field4=testname&field3compare4=61&field3value4=MueLu_ParameterListInterpreterTpetra_MPI_4&field3field5=testname&field3compare5=61&field3value5=MueLu_ParameterListInterpreterTpetraHeavy_MPI_1&field3field6=testname&field3compare6=61&field3value6=MueLu_ParameterListInterpreterTpetraHeavy_MPI_4&field4=details&compare4=64&value4=Timeout&field5=testoutput&compare5=94&value5=Error%20initializing%20RM%20connection.%20Exiting&field6=testoutput&compare6=94&value6=OPAL%20ERROR%3A%20Unreachable&field7=testoutput&compare7=96&value7=srun%3A%20error%3A%20s_p_parse_file%3A%20unable%20to%20read%20.%2Fetc%2Fslurm%2Fslurm.conf.%3A%20Permission%20denied)

## Steps to Reproduce

One should be able to reproduce this failure on any of the machines with any of the builds described in:

* https://github.com/trilinos/Trilinos/blob/develop/cmake/std/atdm/README.md

More specifically, the commands given for the system 'sems-rhel7' are provided at:

* https://github.com/trilinos/Trilinos/blob/develop/cmake/std/atdm/README.md#sems-rhel7-environment

The exact commands to reproduce this issue for one of the 'sems-rhel7' builds should be:

```
$ cd <some_build_dir>/
$ source $TRILINOS_DIR/cmake/std/atdm/load-env.sh \
  Trilinos-atdm-sems-rhel7-intel-18.0.5-openmp-shared-debug
$ cmake \
  -GNinja \
  -DTrilinos_CONFIGURE_OPTIONS_FILE:STRING=cmake/std/atdm/ATDMDevEnv.cmake \
  -DTrilinos_ENABLE_TESTS=ON -DTrilinos_ENABLE_MueLu=ON \
  $TRILINOS_DIR
$ make NP=16
$ ctest -j4
```

However, one should
index: 1.0
Over 150 total failures of several MueLu tests failing in the majority of ATDM Trilinos builds showing "aggregation: phase3 avoid singletons" diffs starting 2020-03-27 - CC: @trilinos/muelu, @srajama1 (Trilinos Linear Solvers Product Lead) <Checklist> <???: Add label "client: ATDM"> <???: Add label "ATDM Sev: Blocker" (by default but could be other "ATDM Sev: XXX")> <???: Add label "type: bug"?> <???: Add label "impacting: tests"?> <???: Add label for affected packages (e.g. "pkg: MueLu", "pkg: Tpetra", "pkg: Kokkos", etc.)> <???: Add label "PA: ???Project Area???" (e.g. "PA: Linear Solvers", "PA: Data Services")> <???: Add milestone "Initial cleanup of new ATDM ..." or "Keep promoted ATDM ..."> <???: Once GitHub Issue is created, add entries for tests to TrilinosATDMStatus/*.csv files> ## Next Action Status <status-and-or-first-action> ## Description As shown in [this query](https://testing-dev.sandia.gov/cdash/queryTests.php?project=Trilinos&date=2020-03-27&filtercount=7&showfilters=1&filtercombine=and&field1=groupname&compare1=62&value1=Experimental&field2=buildname&compare2=65&value2=Trilinos-atdm-&field3=block&field3count=6&field3field1=testname&field3compare1=61&field3value1=MueLu_CreateOperatorTpetra_MPI_1&field3field2=testname&field3compare2=61&field3value2=MueLu_CreateOperatorTpetra_MPI_4&field3field3=testname&field3compare3=61&field3value3=MueLu_ParameterListInterpreterTpetra_MPI_1&field3field4=testname&field3compare4=61&field3value4=MueLu_ParameterListInterpreterTpetra_MPI_4&field3field5=testname&field3compare5=61&field3value5=MueLu_ParameterListInterpreterTpetraHeavy_MPI_1&field3field6=testname&field3compare6=61&field3value6=MueLu_ParameterListInterpreterTpetraHeavy_MPI_4&field4=details&compare4=64&value4=Timeout&field5=testoutput&compare5=94&value5=Error%20initializing%20RM%20connection.%20Exiting&field6=testoutput&compare6=94&value6=OPAL%20ERROR%3A%20Unreachable&field7=testoutput&compare7=96&value7=srun%3A%20error%3A%20s_p_parse_file%3A%20unable%20to%2
0read%20.%2Fetc%2Fslurm%2Fslurm.conf.%3A%20Permission%20denied_) and [this query](https://testing-dev.sandia.gov/cdash/queryTests.php?project=Trilinos&date=2020-03-27&filtercount=9&showfilters=1&filtercombine=and&field1=groupname&compare1=62&value1=Experimental&field2=buildname&compare2=65&value2=Trilinos-atdm-&field3=testname&compare3=65&value3=MueLu&field4=details&compare4=64&value4=Timeout&field5=status&compare5=62&value5=passed&field6=testoutput&compare6=94&value6=Error%20initializing%20RM%20connection.%20Exiting&field7=testoutput&compare7=94&value7=OPAL%20ERROR%3A%20Unreachable&field8=testoutput&compare8=96&value8=srun%3A%20error%3A%20s_p_parse_file%3A%20unable%20to%20read%20.%2Fetc%2Fslurm%2Fslurm.conf.%3A%20Permission%20denied&field9=testoutput&compare9=95&value9=%2Baggregation%3A%20error%20on%20nodes%20with%20no%20on-rank%20neighbors%20%3D%200) the 6 tests: * `MueLu_CreateOperatorTpetra_MPI_1` * `MueLu_CreateOperatorTpetra_MPI_4` * `MueLu_ParameterListInterpreterTpetra_MPI_1` * `MueLu_ParameterListInterpreterTpetra_MPI_4` * `MueLu_ParameterListInterpreterTpetraHeavy_MPI_1` * `MueLu_ParameterListInterpreterTpetraHeavy_MPI_4` in the 35 builds: * `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_dbg` * `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_dbg_cuda-aware-mpi` * `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_opt` * `Trilinos-atdm-ats2-cuda-10.1.243-gnu-7.3.1-spmpi-2019.06.24_static_opt_cuda-aware-mpi` * `Trilinos-atdm-ats2-gnu-7.3.1-spmpi-2019.06.24_serial_static_dbg` * `Trilinos-atdm-ats2-gnu-7.3.1-spmpi-2019.06.24_serial_static_opt` * `Trilinos-atdm-cee-rhel6_clang-5.0.1_openmpi-4.0.2_serial_static_opt` * `Trilinos-atdm-cee-rhel6_gnu-7.2.0_openmpi-4.0.2_serial_shared_opt` * `Trilinos-atdm-cee-rhel6_intel-18.0.2_mpich2-3.2_openmp_static_opt` * `Trilinos-atdm-cee-rhel6_intel-19.0.3_intelmpi-2018.4_serial_static_opt` * `Trilinos-atdm-cts1-intel-18.0.2_openmpi-2.0.3_openmp_static_dbg` * 
`Trilinos-atdm-cts1-intel-18.0.2_openmpi-2.0.3_openmp_static_opt` * `Trilinos-atdm-cts1-intel-19.0.5_openmpi-4.0.1_openmp_static_dbg` * `Trilinos-atdm-cts1-intel-19.0.5_openmpi-4.0.1_openmp_static_opt` * `Trilinos-atdm-mutrino-intel-opt-openmp-HSW` * `Trilinos-atdm-sems-rhel6-gnu-7.2.0-openmp-debug` * `Trilinos-atdm-sems-rhel6-gnu-7.2.0-openmp-release` * `Trilinos-atdm-sems-rhel6-gnu-7.2.0-openmp-release-debug` * `Trilinos-atdm-sems-rhel6-intel-17.0.1-openmp-release` * `Trilinos-atdm-sems-rhel7-intel-18.0.5-openmp-shared-debug` * `Trilinos-atdm-sems-rhel7-intel-18.0.5-openmp-shared-release-debug` * `Trilinos-atdm-tlcc2-intel-debug-openmp` * `Trilinos-atdm-tlcc2-intel-opt-openmp` * `Trilinos-atdm-waterman_cuda-9.2_fpic_static_opt` * `Trilinos-atdm-waterman_cuda-9.2_shared_opt` * `Trilinos-atdm-waterman-cuda-9.2-debug` * `Trilinos-atdm-waterman-cuda-9.2-opt` * `Trilinos-atdm-waterman-cuda-9.2-rdc-release-debug` * `Trilinos-atdm-waterman-cuda-9.2-release-debug` * `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-debug` * `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-rdc-release-debug` * `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-release` * `Trilinos-atdm-white-ride-cuda-9.2-gnu-7.2.0-release-debug` * `Trilinos-atdm-white-ride-gnu-7.2.0-openmp-debug` * `Trilinos-atdm-white-ride-gnu-7.2.0-openmp-release` started failing on testing day 2020-03-27. 
All of these failing tests show failures with diffs like: ``` +aggregation: phase3 avoid singletons = 0 [default] +aggregation: error on nodes with no on-rank neighbors = 0 [default] ``` ## Current Status on CDash * [Status of these 6 MueLu tests in all ATDM Trilinos builds for the current testing day](https://testing-dev.sandia.gov/cdash/queryTests.php?project=Trilinos&filtercount=7&showfilters=1&filtercombine=and&field1=groupname&compare1=62&value1=Experimental&field2=buildname&compare2=65&value2=Trilinos-atdm-&field3=block&field3count=6&field3field1=testname&field3compare1=61&field3value1=MueLu_CreateOperatorTpetra_MPI_1&field3field2=testname&field3compare2=61&field3value2=MueLu_CreateOperatorTpetra_MPI_4&field3field3=testname&field3compare3=61&field3value3=MueLu_ParameterListInterpreterTpetra_MPI_1&field3field4=testname&field3compare4=61&field3value4=MueLu_ParameterListInterpreterTpetra_MPI_4&field3field5=testname&field3compare5=61&field3value5=MueLu_ParameterListInterpreterTpetraHeavy_MPI_1&field3field6=testname&field3compare6=61&field3value6=MueLu_ParameterListInterpreterTpetraHeavy_MPI_4&field4=details&compare4=64&value4=Timeout&field5=testoutput&compare5=94&value5=Error%20initializing%20RM%20connection.%20Exiting&field6=testoutput&compare6=94&value6=OPAL%20ERROR%3A%20Unreachable&field7=testoutput&compare7=96&value7=srun%3A%20error%3A%20s_p_parse_file%3A%20unable%20to%20read%20.%2Fetc%2Fslurm%2Fslurm.conf.%3A%20Permission%20denied) ## Steps to Reproduce One should be able to reproduce this failure on on any of the machines with any of the builds described in: * https://github.com/trilinos/Trilinos/blob/develop/cmake/std/atdm/README.md More specifically, the commands given for the system 'sems-rhel7' are provided at: * https://github.com/trilinos/Trilinos/blob/develop/cmake/std/atdm/README.md#sems-rhel7-environment The exact commands to reproduce this issue for one of the 'sems-rhel7' builds should be: ``` $ cd <some_build_dir>/ $ source 
$TRILINOS_DIR/cmake/std/atdm/load-env.sh \ Trilinos-atdm-sems-rhel7-intel-18.0.5-openmp-shared-debug $ cmake \ -GNinja \ -DTrilinos_CONFIGURE_OPTIONS_FILE:STRING=cmake/std/atdm/ATDMDevEnv.cmake \ -DTrilinos_ENABLE_TESTS=ON -DTrilinos_ENABLE_MueLu=ON \ $TRILINOS_DIR $ make NP=16 $ ctest -j4 ``` However, one should
label: test
over total failures of several muelu tests failing in the majority of atdm trilinos builds showing aggregation avoid singletons diffs starting cc trilinos muelu trilinos linear solvers product lead next action status description as shown in and the tests muelu createoperatortpetra mpi muelu createoperatortpetra mpi muelu parameterlistinterpretertpetra mpi muelu parameterlistinterpretertpetra mpi muelu parameterlistinterpretertpetraheavy mpi muelu parameterlistinterpretertpetraheavy mpi in the builds trilinos atdm cuda gnu spmpi static dbg trilinos atdm cuda gnu spmpi static dbg cuda aware mpi trilinos atdm cuda gnu spmpi static opt trilinos atdm cuda gnu spmpi static opt cuda aware mpi trilinos atdm gnu spmpi serial static dbg trilinos atdm gnu spmpi serial static opt trilinos atdm cee clang openmpi serial static opt trilinos atdm cee gnu openmpi serial shared opt trilinos atdm cee intel openmp static opt trilinos atdm cee intel intelmpi serial static opt trilinos atdm intel openmpi openmp static dbg trilinos atdm intel openmpi openmp static opt trilinos atdm intel openmpi openmp static dbg trilinos atdm intel openmpi openmp static opt trilinos atdm mutrino intel opt openmp hsw trilinos atdm sems gnu openmp debug trilinos atdm sems gnu openmp release trilinos atdm sems gnu openmp release debug trilinos atdm sems intel openmp release trilinos atdm sems intel openmp shared debug trilinos atdm sems intel openmp shared release debug trilinos atdm intel debug openmp trilinos atdm intel opt openmp trilinos atdm waterman cuda fpic static opt trilinos atdm waterman cuda shared opt trilinos atdm waterman cuda debug trilinos atdm waterman cuda opt trilinos atdm waterman cuda rdc release debug trilinos atdm waterman cuda release debug trilinos atdm white ride cuda gnu debug trilinos atdm white ride cuda gnu rdc release debug trilinos atdm white ride cuda gnu release trilinos atdm white ride cuda gnu release debug trilinos atdm white ride gnu openmp debug trilinos atdm white 
ride gnu openmp release started failing on testing day all of these failing tests show failures with diffs like aggregation avoid singletons aggregation error on nodes with no on rank neighbors current status on cdash steps to reproduce one should be able to reproduce this failure on on any of the machines with any of the builds described in more specifically the commands given for the system sems are provided at the exact commands to reproduce this issue for one of the sems builds should be cd source trilinos dir cmake std atdm load env sh trilinos atdm sems intel openmp shared debug cmake gninja dtrilinos configure options file string cmake std atdm atdmdevenv cmake dtrilinos enable tests on dtrilinos enable muelu on trilinos dir make np ctest however one should
binary_label: 1
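Comparing the `text_combine` field with the derived `text` field above suggests the cleaning step lowercases the text, replaces digits, punctuation, and markdown syntax with spaces, and collapses whitespace. A rough sketch of that transform (a guess at the pipeline, not the dataset's actual code; it does not reproduce the wholesale URL and link removal the real output shows):

```python
import re

def clean(text: str) -> str:
    """Approximate the text_combine -> text transform: lowercase,
    replace every run of non-letter characters with a single space,
    and trim the ends."""
    return re.sub(r"[^a-z]+", " ", text.lower()).strip()

print(clean("Don't use `MPI_4` after 2020-03-27!"))  # → don t use mpi after
```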
Unnamed: 0: 153,217
id: 24,089,317,239
type: IssuesEvent
created_at: 2022-09-19 13:36:45
repo: Altinn/altinn-studio
repo_url: https://api.github.com/repos/Altinn/altinn-studio
action: closed
title: Api to update text resource
labels: area/language solution/studio/designer kind/user-story team/studio
### Description

Today Studio frontend does a POST request to `designer/{org}/{repo}/Text/SaveResource/nb`. Create a new endpoint for doing a PUT request to `/designer/api/v2/{org}/{repo}/texts/{languageCode}` that updates the specified language.

### Additional Information

Should discuss whether a PUT should return the updated file in the response or just a 204 No Content. The PUT'ed content should equal the on-disk version, so I don't see any need to include it in the response.

### Tasks

No response

### Acceptance Criteria

- [ ] New endpoint implemented
- [ ] New endpoint is covered by unit/integration tests
- [ ] The selected text file with the matching language code should be updated
- [ ] The HTTP response should be either 200 (with payload) or 204 (without payload)
index: 1.0
Api to update text resource - ### Description Today Studio frontend does a POST request to `designer/{org}/{repo}/Text/SaveResource/nb` Create a new endpoint for doing a PUT request to `/designer/api/v2/{org}/{repo}/texts/{languageCode}` that updates the specified language. ### Additional Information Should discuss whether a PUT should return the updated file in the response or just a 204 No Content. The PUT'ed content should equal on the on disk version so don't see any need to include it in the response. ### Tasks No response ### Acceptance Criterias - [ ] New endpoint implemented - [ ] New endpoint is covered by unit/integration tests - [ ] The selected text file with the matching language code should be updated - [ ] The http response should be either 200 (with payload) or 204 (without payload)
label: non_test
api to update text resource description today studio frontend does a post request to designer org repo text saveresource nb create a new endpoint for doing a put request to designer api org repo texts languagecode that updates the specified language additional information should discuss whether a put should return the updated file in the response or just a no content the put ed content should equal on the on disk version so don t see any need to include it in the response tasks no response acceptance criterias new endpoint implemented new endpoint is covered by unit integration tests the selected text file with the matching language code should be updated the http response should be either with payload or without payload
binary_label: 0
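The 200-vs-204 question in the record above comes down to standard PUT semantics: since the stored content equals what the client just sent, a body in the response is redundant. A toy sketch with a hypothetical in-memory store (every name here is invented, not Altinn Studio's actual designer API):

```python
# Hypothetical in-memory stand-in for the text-resource files on disk.
store: dict[tuple[str, str, str], str] = {}

def put_text_resource(org: str, repo: str, language_code: str,
                      content: str, return_payload: bool = False):
    """PUT semantics: replace the stored resource for (org, repo, language).
    Returns (204, None) by default, since the client already holds the
    content it sent; (200, content) if a payload is explicitly wanted."""
    store[(org, repo, language_code)] = content
    return (200, content) if return_payload else (204, None)

status, body = put_text_resource("ttd", "my-app", "nb", '{"title": "Skjema"}')
print(status, body)  # → 204 None
```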
Unnamed: 0: 7,040
id: 2,872,270,323
type: IssuesEvent
created_at: 2015-06-08 10:44:23
repo: elastic/elasticsearch
repo_url: https://api.github.com/repos/elastic/elasticsearch
action: opened
title: [CI] two_nodes_should_run_using_public_ip fails due to missing node
labels: test
Stack trace:

```
java.lang.AssertionError: expected:<2> but was:<1>
	at __randomizedtesting.SeedInfo.seed([6E452F021F50AE3F:1E58450F06FBF158]:0)
	at org.junit.Assert.fail(Assert.java:88)
	at org.junit.Assert.failNotEquals(Assert.java:743)
	at org.junit.Assert.assertEquals(Assert.java:118)
	at org.junit.Assert.assertEquals(Assert.java:555)
	at org.junit.Assert.assertEquals(Assert.java:542)
	at org.elasticsearch.discovery.azure.AbstractAzureComputeServiceTest.checkNumberOfNodes(AbstractAzureComputeServiceTest.java:51)
	at org.elasticsearch.discovery.azure.AzureTwoStartedNodesTest.two_nodes_should_run_using_public_ip(AzureTwoStartedNodesTest.java:78)
```

repro line:

```
mvn test -Pdev -Dtests.seed=6E452F021F50AE3F -Dtests.class=org.elasticsearch.discovery.azure.AzureTwoStartedNodesTest -Dtests.slow=true -Dtests.method="two_nodes_should_run_using_public_ip" -Des.logger.level=DEBUG -Des.node.mode=network -Dtests.assertion.disabled=false -Dtests.security.manager=true -Dtests.heap.size=512m -Dtests.locale=it -Dtests.timezone=Asia/Kashgar
```

Build url: http://build-us-00.elastic.co/job/es_core_master_centos/5094/
index: 1.0
[CI] two_nodes_should_run_using_public_ip fails due to missing node - Stack trace: ``` java.lang.AssertionError: expected:<2> but was:<1> at __randomizedtesting.SeedInfo.seed([6E452F021F50AE3F:1E58450F06FBF158]:0) at org.junit.Assert.fail(Assert.java:88) at org.junit.Assert.failNotEquals(Assert.java:743) at org.junit.Assert.assertEquals(Assert.java:118) at org.junit.Assert.assertEquals(Assert.java:555) at org.junit.Assert.assertEquals(Assert.java:542) at org.elasticsearch.discovery.azure.AbstractAzureComputeServiceTest.checkNumberOfNodes(AbstractAzureComputeServiceTest.java:51) at org.elasticsearch.discovery.azure.AzureTwoStartedNodesTest.two_nodes_should_run_using_public_ip(AzureTwoStartedNodesTest.java:78) ``` repro line: ``` mvn test -Pdev -Dtests.seed=6E452F021F50AE3F -Dtests.class=org.elasticsearch.discovery.azure.AzureTwoStartedNodesTest -Dtests.slow=true -Dtests.method="two_nodes_should_run_using_public_ip" -Des.logger.level=DEBUG -Des.node.mode=network -Dtests.assertion.disabled=false -Dtests.security.manager=true -Dtests.heap.size=512m -Dtests.locale=it -Dtests.timezone=Asia/Kashgar ``` Build url: http://build-us-00.elastic.co/job/es_core_master_centos/5094/
label: test
two nodes should run using public ip fails due to missing node stack trace java lang assertionerror expected but was at randomizedtesting seedinfo seed at org junit assert fail assert java at org junit assert failnotequals assert java at org junit assert assertequals assert java at org junit assert assertequals assert java at org junit assert assertequals assert java at org elasticsearch discovery azure abstractazurecomputeservicetest checknumberofnodes abstractazurecomputeservicetest java at org elasticsearch discovery azure azuretwostartednodestest two nodes should run using public ip azuretwostartednodestest java repro line mvn test pdev dtests seed dtests class org elasticsearch discovery azure azuretwostartednodestest dtests slow true dtests method two nodes should run using public ip des logger level debug des node mode network dtests assertion disabled false dtests security manager true dtests heap size dtests locale it dtests timezone asia kashgar build url
binary_label: 1
Unnamed: 0: 240,674
id: 20,069,010,309
type: IssuesEvent
created_at: 2022-02-04 02:33:49
repo: mellium/xmpp
repo_url: https://api.github.com/repos/mellium/xmpp
action: closed
title: internal/integration/python: add new testing package
labels: enhancement testing
Extracting and generalizing the Python bits of the [`aioxmpp`] package into a new `internal/integration/python` package will allow us to run integration tests against other Python libraries as well and share a lot of the code between them. The config and argument parsing can likely be shared, as well as the `Import` and `Args` options. [`aioxmpp`]: https://pkg.go.dev/mellium.im/xmpp/internal/integration/aioxmpp
index: 1.0
internal/integration/python: add new testing package - Extracting and generalizing the Python bits of the [`aioxmpp`] package into a new `internal/integration/python` package will allow us to run integration tests against other Python libraries as well and share a lot of the code between them. The config and argument parsing can likely be shared, as well as the `Import` and `Args` options. [`aioxmpp`]: https://pkg.go.dev/mellium.im/xmpp/internal/integration/aioxmpp
label: test
internal integration python add new testing package extracting and generalizing the python bits of the package into a new internal integration python package will allow us to run integration tests against other python libraries as well and share a lot of the code between them the config and argument parsing can likely be shared as well as the import and args options
binary_label: 1
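The refactor the record above describes — one shared python-integration layer whose config and argument parsing are reused by per-library packages such as `aioxmpp`, with shared `Import` and `Args` options — can be illustrated abstractly. The real packages are Go; this Python sketch only mirrors the shape of the idea, and every name below is invented:

```python
from dataclasses import dataclass, field

# Hypothetical shared "python integration" config: the generic layer owns
# the option handling, and each library package only supplies defaults.
@dataclass
class PythonConfig:
    imports: list[str] = field(default_factory=list)  # analogue of Import
    args: list[str] = field(default_factory=list)     # analogue of Args

    def command(self, script: str) -> list[str]:
        # Shared argument construction, reused by every library helper.
        return ["python3", *self.args, script]

def aioxmpp_config() -> PythonConfig:
    # Library-specific layer: only the defaults differ between libraries.
    return PythonConfig(imports=["aioxmpp"], args=["-u"])

cfg = aioxmpp_config()
print(cfg.command("run_test.py"))  # → ['python3', '-u', 'run_test.py']
```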
Unnamed: 0: 209,605
id: 16,044,680,339
type: IssuesEvent
created_at: 2021-04-22 12:21:05
repo: trinodb/trino
repo_url: https://api.github.com/repos/trinodb/trino
action: closed
title: Testing Phoenix service is flaky: Cannot assign requested address
labels: bug test
``` 2021-04-08T12:17:16.908-0500 INFO LogTestDurationListener enabled: true [INFO] ------------------------------------------------------- [INFO] T E S T S [INFO] ------------------------------------------------------- [INFO] Running TestSuite log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell). log4j:WARN Please initialize the log4j system properly. log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info. Error: Tests run: 163, Failures: 1, Errors: 0, Skipped: 159, Time elapsed: 3.424 s <<< FAILURE! - in TestSuite Error: init(io.trino.plugin.phoenix.TestPhoenixConnectorTest) Time elapsed: 2.213 s <<< FAILURE! java.lang.RuntimeException: Can't start phoenix server. at io.trino.plugin.phoenix.TestingPhoenixServer.<init>(TestingPhoenixServer.java:106) at io.trino.plugin.phoenix.TestingPhoenixServer.getInstance(TestingPhoenixServer.java:50) at io.trino.plugin.phoenix.TestPhoenixConnectorTest.createQueryRunner(TestPhoenixConnectorTest.java:50) at io.trino.testing.AbstractTestQueryFramework.init(AbstractTestQueryFramework.java:93) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104) at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144) at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169) at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108) at 
java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: java.io.IOException: Shutting down at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:234) at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:96) at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1089) at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1051) at io.trino.plugin.phoenix.TestingPhoenixServer.<init>(TestingPhoenixServer.java:101) ... 16 more Caused by: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCannot assign requested address at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143) at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:223) at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:158) at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:216) ... 20 more Caused by: java.io.IOException: Problem binding to fv-az177-155.internal.cloudapp.net/10.1.1.77:0 : Cannot assign requested address. To switch ports use the 'hbase.master.port' configuration property. 
Testing Phoenix service is flaky: Cannot assign requested address

```
2021-04-08T12:17:16.908-0500 INFO LogTestDurationListener enabled: true
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running TestSuite
log4j:WARN No appenders could be found for logger (org.apache.hadoop.util.Shell).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Error:  Tests run: 163, Failures: 1, Errors: 0, Skipped: 159, Time elapsed: 3.424 s <<< FAILURE! - in TestSuite
Error:  init(io.trino.plugin.phoenix.TestPhoenixConnectorTest)  Time elapsed: 2.213 s  <<< FAILURE!
java.lang.RuntimeException: Can't start phoenix server.
    at io.trino.plugin.phoenix.TestingPhoenixServer.<init>(TestingPhoenixServer.java:106)
    at io.trino.plugin.phoenix.TestingPhoenixServer.getInstance(TestingPhoenixServer.java:50)
    at io.trino.plugin.phoenix.TestPhoenixConnectorTest.createQueryRunner(TestPhoenixConnectorTest.java:50)
    at io.trino.testing.AbstractTestQueryFramework.init(AbstractTestQueryFramework.java:93)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
    at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144)
    at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169)
    at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.io.IOException: Shutting down
    at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:234)
    at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:96)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1089)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1051)
    at io.trino.plugin.phoenix.TestingPhoenixServer.<init>(TestingPhoenixServer.java:101)
    ... 16 more
Caused by: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasterCannot assign requested address
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:143)
    at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:223)
    at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:158)
    at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:216)
    ... 20 more
Caused by: java.io.IOException: Problem binding to fv-az177-155.internal.cloudapp.net/10.1.1.77:0 : Cannot assign requested address. To switch ports use the 'hbase.master.port' configuration property.
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1103)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1051)
    at org.apache.hadoop.hbase.master.MasterRpcServices.<init>(MasterRpcServices.java:242)
    at org.apache.hadoop.hbase.master.HMaster.createRpcServices(HMaster.java:611)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:562)
    at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:446)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:139)
    ... 23 more
Caused by: java.net.BindException: Cannot assign requested address
    at java.base/sun.nio.ch.Net.bind0(Native Method)
    at java.base/sun.nio.ch.Net.bind(Net.java:455)
    at java.base/sun.nio.ch.Net.bind(Net.java:447)
    at java.base/sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:227)
    at java.base/sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:80)
    at org.apache.hadoop.hbase.ipc.RpcServer.bind(RpcServer.java:2755)
    at org.apache.hadoop.hbase.ipc.RpcServer$Listener.<init>(RpcServer.java:658)
    at org.apache.hadoop.hbase.ipc.RpcServer.<init>(RpcServer.java:2191)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1095)
    ... 33 more
[INFO]
[INFO] Results:
[INFO]
Error:  Failures:
Error:    TestPhoenixConnectorTest>AbstractTestQueryFramework.init:93->createQueryRunner:50 » Runtime
[INFO]
Error:  Tests run: 116, Failures: 1, Errors: 0, Skipped: 112
[INFO]
[INFO]
[INFO] ----------------------< io.trino:trino-phoenix5 >-----------------------
[INFO] Building trino-phoenix5 355-SNAPSHOT                               [2/2]
[INFO] ----------------------------[ trino-plugin ]----------------------------
[INFO]
[INFO] --- trino-maven-plugin:10:check-spi-dependencies (default-check-spi-dependencies) @ trino-phoenix5 ---
[INFO]
[INFO] --- maven-enforcer-plugin:3.0.0-M3:enforce (default) @ trino-phoenix5 ---
[INFO] Skipping Rule Enforcement.
[INFO]
[INFO] --- dependency-scope-maven-plugin:0.10:check (default) @ trino-phoenix5 ---
[INFO] Skipping plugin execution
[INFO]
[INFO] --- license-maven-plugin:3.0:check (default) @ trino-phoenix5 ---
[INFO]
[INFO] --- maven-checkstyle-plugin:3.1.2:check (checkstyle) @ trino-phoenix5 ---
[INFO]
[INFO] --- jacoco-maven-plugin:0.8.6:prepare-agent (default) @ trino-phoenix5 ---
[INFO] Skipping JaCoCo execution because property jacoco.skip is set.
[INFO] argLine set to empty
[INFO]
[INFO] --- git-commit-id-plugin:4.0.3:revision (default) @ trino-phoenix5 ---
[INFO]
[INFO] --- maven-resources-plugin:3.2.0:resources (default-resources) @ trino-phoenix5 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Using 'UTF-8' encoding to copy filtered properties files.
[INFO] skip non existing resourceDirectory /home/runner/work/trino/trino/plugin/trino-phoenix5/src/main/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:compile (default-compile) @ trino-phoenix5 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- trino-maven-plugin:10:generate-service-descriptor (default-generate-service-descriptor) @ trino-phoenix5 ---
[INFO] Wrote io.trino.plugin.phoenix5.PhoenixPlugin to /home/runner/work/trino/trino/plugin/trino-phoenix5/target/trino-phoenix5-355-SNAPSHOT-services.jar
[INFO]
[INFO] --- maven-resources-plugin:3.2.0:testResources (default-testResources) @ trino-phoenix5 ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Using 'UTF-8' encoding to copy filtered properties files.
[INFO] skip non existing resourceDirectory /home/runner/work/trino/trino/plugin/trino-phoenix5/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.8.1:testCompile (default-testCompile) @ trino-phoenix5 ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-dependency-plugin:3.1.2:analyze-only (default) @ trino-phoenix5 ---
[INFO] Skipping plugin execution
[INFO]
[INFO] --- maven-dependency-plugin:3.1.2:analyze-duplicate (default) @ trino-phoenix5 ---
[INFO] Skipping plugin execution
[INFO]
[INFO] --- duplicate-finder-maven-plugin:1.5.0:check (default) @ trino-phoenix5 ---
[INFO] Skipping duplicate-finder execution!
[INFO]
[INFO] --- modernizer-maven-plugin:2.2.0:modernizer (modernizer) @ trino-phoenix5 ---
[INFO] Skipping modernizer execution!
[INFO]
[INFO] --- maven-surefire-plugin:2.22.0:test (default-test) @ trino-phoenix5 ---
[INFO]
[INFO] -------------------------------------------------------
[INFO]  T E S T S
[INFO] -------------------------------------------------------
[INFO] Running TestSuite
2021-04-08T12:17:21.715-0500 INFO LogTestDurationListener enabled: true
2021-04-08T12:17:24.172-0500 SEVERE Failed construction RegionServer
java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.NettyRpcServer
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:66)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:45)
    at org.apache.hadoop.hbase.ipc.RpcServerFactory.createRpcServer(RpcServerFactory.java:66)
    at org.apache.hadoop.hbase.master.MasterRpcServices.createRpcServer(MasterRpcServices.java:390)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1261)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1216)
    at org.apache.hadoop.hbase.master.MasterRpcServices.<init>(MasterRpcServices.java:368)
    at org.apache.hadoop.hbase.master.HMaster.createRpcServices(HMaster.java:720)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:606)
    at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:493)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:132)
    at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:227)
    at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:174)
    at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:245)
    at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:116)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1169)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1213)
    at io.trino.plugin.phoenix5.TestingPhoenixServer.<init>(TestingPhoenixServer.java:101)
    at io.trino.plugin.phoenix5.TestingPhoenixServer.getInstance(TestingPhoenixServer.java:50)
    at io.trino.plugin.phoenix5.TestPhoenixConnectorTest.createQueryRunner(TestPhoenixConnectorTest.java:48)
    at io.trino.testing.AbstractTestQueryFramework.init(AbstractTestQueryFramework.java:93)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
    at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144)
    at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169)
    at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:58)
    ... 37 more
Caused by: org.apache.hbase.thirdparty.io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Cannot assign requested address
    at org.apache.hbase.thirdparty.io.netty.channel.unix.Errors.newIOException(Errors.java:122)
    at org.apache.hbase.thirdparty.io.netty.channel.unix.Socket.bind(Socket.java:287)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.AbstractEpollChannel.doBind(AbstractEpollChannel.java:686)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollServerSocketChannel.doBind(EpollServerSocketChannel.java:70)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:563)
    at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1332)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:488)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:473)
    at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:984)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel.bind(AbstractChannel.java:259)
    at org.apache.hbase.thirdparty.io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:333)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    ... 1 more
2021-04-08T12:17:24.174-0500 SEVERE Error starting cluster
java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasternull
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:137)
    at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:227)
    at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:174)
    at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:245)
    at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:116)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1169)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1213)
    at io.trino.plugin.phoenix5.TestingPhoenixServer.<init>(TestingPhoenixServer.java:101)
    at io.trino.plugin.phoenix5.TestingPhoenixServer.getInstance(TestingPhoenixServer.java:50)
    at io.trino.plugin.phoenix5.TestPhoenixConnectorTest.createQueryRunner(TestPhoenixConnectorTest.java:48)
    at io.trino.testing.AbstractTestQueryFramework.init(AbstractTestQueryFramework.java:93)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
    at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144)
    at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169)
    at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.NettyRpcServer
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:66)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:45)
    at org.apache.hadoop.hbase.ipc.RpcServerFactory.createRpcServer(RpcServerFactory.java:66)
    at org.apache.hadoop.hbase.master.MasterRpcServices.createRpcServer(MasterRpcServices.java:390)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1261)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1216)
    at org.apache.hadoop.hbase.master.MasterRpcServices.<init>(MasterRpcServices.java:368)
    at org.apache.hadoop.hbase.master.HMaster.createRpcServices(HMaster.java:720)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:606)
    at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:493)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:132)
    ... 23 more
Caused by: java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:58)
    ... 37 more
Caused by: org.apache.hbase.thirdparty.io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Cannot assign requested address
    at org.apache.hbase.thirdparty.io.netty.channel.unix.Errors.newIOException(Errors.java:122)
    at org.apache.hbase.thirdparty.io.netty.channel.unix.Socket.bind(Socket.java:287)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.AbstractEpollChannel.doBind(AbstractEpollChannel.java:686)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollServerSocketChannel.doBind(EpollServerSocketChannel.java:70)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:563)
    at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1332)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:488)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:473)
    at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:984)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel.bind(AbstractChannel.java:259)
    at org.apache.hbase.thirdparty.io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:333)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    ... 1 more
Error:  Tests run: 163, Failures: 1, Errors: 0, Skipped: 159, Time elapsed: 3.964 s <<< FAILURE! - in TestSuite
Error:  init(io.trino.plugin.phoenix5.TestPhoenixConnectorTest)  Time elapsed: 2.676 s  <<< FAILURE!
java.lang.RuntimeException: Can't start phoenix server.
    at io.trino.plugin.phoenix5.TestingPhoenixServer.<init>(TestingPhoenixServer.java:106)
    at io.trino.plugin.phoenix5.TestingPhoenixServer.getInstance(TestingPhoenixServer.java:50)
    at io.trino.plugin.phoenix5.TestPhoenixConnectorTest.createQueryRunner(TestPhoenixConnectorTest.java:48)
    at io.trino.testing.AbstractTestQueryFramework.init(AbstractTestQueryFramework.java:93)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
    at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
    at java.base/java.lang.reflect.Method.invoke(Method.java:566)
    at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104)
    at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217)
    at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144)
    at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169)
    at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
    at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128)
    at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628)
    at java.base/java.lang.Thread.run(Thread.java:834)
Caused by: java.io.IOException: Shutting down
    at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:266)
    at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:116)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1169)
    at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1213)
    at io.trino.plugin.phoenix5.TestingPhoenixServer.<init>(TestingPhoenixServer.java:101)
    ... 16 more
Caused by: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasternull
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:137)
    at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:227)
    at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:174)
    at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:245)
    ... 20 more
Caused by: java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.NettyRpcServer
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:66)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:45)
    at org.apache.hadoop.hbase.ipc.RpcServerFactory.createRpcServer(RpcServerFactory.java:66)
    at org.apache.hadoop.hbase.master.MasterRpcServices.createRpcServer(MasterRpcServices.java:390)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1261)
    at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1216)
    at org.apache.hadoop.hbase.master.MasterRpcServices.<init>(MasterRpcServices.java:368)
    at org.apache.hadoop.hbase.master.HMaster.createRpcServices(HMaster.java:720)
    at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:606)
    at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:493)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:132)
    ... 23 more
Caused by: java.lang.reflect.InvocationTargetException
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
    at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62)
    at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
    at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490)
    at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:58)
    ... 37 more
Caused by: org.apache.hbase.thirdparty.io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Cannot assign requested address
    at org.apache.hbase.thirdparty.io.netty.channel.unix.Errors.newIOException(Errors.java:122)
    at org.apache.hbase.thirdparty.io.netty.channel.unix.Socket.bind(Socket.java:287)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.AbstractEpollChannel.doBind(AbstractEpollChannel.java:686)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollServerSocketChannel.doBind(EpollServerSocketChannel.java:70)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:563)
    at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1332)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:488)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:473)
    at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:984)
    at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel.bind(AbstractChannel.java:259)
    at org.apache.hbase.thirdparty.io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404)
    at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:333)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905)
    at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30)
    ... 1 more
[INFO]
[INFO] Results:
[INFO]
Error:  Failures:
Error:    TestPhoenixConnectorTest>AbstractTestQueryFramework.init:93->createQueryRunner:48 » Runtime
[INFO]
Error:  Tests run: 116, Failures: 1, Errors: 0, Skipped: 112
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary for trino-phoenix 355-SNAPSHOT:
[INFO]
[INFO] trino-phoenix ...................................... FAILURE [  8.371 s]
[INFO] trino-phoenix5 ..................................... FAILURE [  5.353 s]
```
org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104) at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144) at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169) at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.NettyRpcServer at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:66) at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:45) at org.apache.hadoop.hbase.ipc.RpcServerFactory.createRpcServer(RpcServerFactory.java:66) at org.apache.hadoop.hbase.master.MasterRpcServices.createRpcServer(MasterRpcServices.java:390) at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1261) at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1216) at org.apache.hadoop.hbase.master.MasterRpcServices.<init>(MasterRpcServices.java:368) at org.apache.hadoop.hbase.master.HMaster.createRpcServices(HMaster.java:720) at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:606) at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:493) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at 
java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:132) ... 23 more Caused by: java.lang.reflect.InvocationTargetException at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:58) ... 37 more Caused by: org.apache.hbase.thirdparty.io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Cannot assign requested address at org.apache.hbase.thirdparty.io.netty.channel.unix.Errors.newIOException(Errors.java:122) at org.apache.hbase.thirdparty.io.netty.channel.unix.Socket.bind(Socket.java:287) at org.apache.hbase.thirdparty.io.netty.channel.epoll.AbstractEpollChannel.doBind(AbstractEpollChannel.java:686) at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollServerSocketChannel.doBind(EpollServerSocketChannel.java:70) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:563) at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1332) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:488) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:473) at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:984) at 
org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel.bind(AbstractChannel.java:259) at org.apache.hbase.thirdparty.io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366) at org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404) at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:333) at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905) at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ... 1 more Error: Tests run: 163, Failures: 1, Errors: 0, Skipped: 159, Time elapsed: 3.964 s <<< FAILURE! - in TestSuite Error: init(io.trino.plugin.phoenix5.TestPhoenixConnectorTest) Time elapsed: 2.676 s <<< FAILURE! java.lang.RuntimeException: Can't start phoenix server. 
at io.trino.plugin.phoenix5.TestingPhoenixServer.<init>(TestingPhoenixServer.java:106) at io.trino.plugin.phoenix5.TestingPhoenixServer.getInstance(TestingPhoenixServer.java:50) at io.trino.plugin.phoenix5.TestPhoenixConnectorTest.createQueryRunner(TestPhoenixConnectorTest.java:48) at io.trino.testing.AbstractTestQueryFramework.init(AbstractTestQueryFramework.java:93) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:104) at org.testng.internal.Invoker.invokeConfigurationMethod(Invoker.java:515) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:217) at org.testng.internal.Invoker.invokeConfigurations(Invoker.java:144) at org.testng.internal.TestMethodWorker.invokeBeforeClassMethods(TestMethodWorker.java:169) at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108) at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1128) at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:628) at java.base/java.lang.Thread.run(Thread.java:834) Caused by: java.io.IOException: Shutting down at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:266) at org.apache.hadoop.hbase.MiniHBaseCluster.<init>(MiniHBaseCluster.java:116) at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1169) at org.apache.hadoop.hbase.HBaseTestingUtility.startMiniHBaseCluster(HBaseTestingUtility.java:1213) at io.trino.plugin.phoenix5.TestingPhoenixServer.<init>(TestingPhoenixServer.java:101) ... 
16 more Caused by: java.lang.RuntimeException: Failed construction of Master: class org.apache.hadoop.hbase.master.HMasternull at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:137) at org.apache.hadoop.hbase.LocalHBaseCluster.addMaster(LocalHBaseCluster.java:227) at org.apache.hadoop.hbase.LocalHBaseCluster.<init>(LocalHBaseCluster.java:174) at org.apache.hadoop.hbase.MiniHBaseCluster.init(MiniHBaseCluster.java:245) ... 20 more Caused by: java.lang.UnsupportedOperationException: Constructor threw an exception for org.apache.hadoop.hbase.ipc.NettyRpcServer at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:66) at org.apache.hadoop.hbase.util.ReflectionUtils.instantiateWithCustomCtor(ReflectionUtils.java:45) at org.apache.hadoop.hbase.ipc.RpcServerFactory.createRpcServer(RpcServerFactory.java:66) at org.apache.hadoop.hbase.master.MasterRpcServices.createRpcServer(MasterRpcServices.java:390) at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1261) at org.apache.hadoop.hbase.regionserver.RSRpcServices.<init>(RSRpcServices.java:1216) at org.apache.hadoop.hbase.master.MasterRpcServices.<init>(MasterRpcServices.java:368) at org.apache.hadoop.hbase.master.HMaster.createRpcServices(HMaster.java:720) at org.apache.hadoop.hbase.regionserver.HRegionServer.<init>(HRegionServer.java:606) at org.apache.hadoop.hbase.master.HMaster.<init>(HMaster.java:493) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at org.apache.hadoop.hbase.util.JVMClusterUtil.createMasterThread(JVMClusterUtil.java:132) ... 
23 more Caused by: java.lang.reflect.InvocationTargetException at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method) at java.base/jdk.internal.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45) at java.base/java.lang.reflect.Constructor.newInstance(Constructor.java:490) at org.apache.hadoop.hbase.util.ReflectionUtils.instantiate(ReflectionUtils.java:58) ... 37 more Caused by: org.apache.hbase.thirdparty.io.netty.channel.unix.Errors$NativeIoException: bind(..) failed: Cannot assign requested address at org.apache.hbase.thirdparty.io.netty.channel.unix.Errors.newIOException(Errors.java:122) at org.apache.hbase.thirdparty.io.netty.channel.unix.Socket.bind(Socket.java:287) at org.apache.hbase.thirdparty.io.netty.channel.epoll.AbstractEpollChannel.doBind(AbstractEpollChannel.java:686) at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollServerSocketChannel.doBind(EpollServerSocketChannel.java:70) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:563) at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1332) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:488) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:473) at org.apache.hbase.thirdparty.io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:984) at org.apache.hbase.thirdparty.io.netty.channel.AbstractChannel.bind(AbstractChannel.java:259) at org.apache.hbase.thirdparty.io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:366) at 
org.apache.hbase.thirdparty.io.netty.util.concurrent.AbstractEventExecutor.safeExecute(AbstractEventExecutor.java:163) at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:404) at org.apache.hbase.thirdparty.io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:333) at org.apache.hbase.thirdparty.io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:905) at org.apache.hbase.thirdparty.io.netty.util.concurrent.FastThreadLocalRunnable.run(FastThreadLocalRunnable.java:30) ... 1 more [INFO] [INFO] Results: [INFO] Error: Failures: Error: TestPhoenixConnectorTest>AbstractTestQueryFramework.init:93->createQueryRunner:48 » Runtime [INFO] Error: Tests run: 116, Failures: 1, Errors: 0, Skipped: 112 [INFO] [INFO] ------------------------------------------------------------------------ [INFO] Reactor Summary for trino-phoenix 355-SNAPSHOT: [INFO] [INFO] trino-phoenix ...................................... FAILURE [ 8.371 s] [INFO] trino-phoenix5 ..................................... FAILURE [ 5.353 s] ```
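The root cause in every nesting above is the same: netty's `bind(..)` fails with `EADDRNOTAVAIL` because the mini-cluster tries to bind an address (earlier in the log, a recycled `*.internal.cloudapp.net` hostname) that no longer resolves to a local interface. The JDK surfaces the identical error for a plain socket, which can be shown with a small standalone sketch. This is a hypothetical repro, not code from Trino or HBase; the `203.0.113.1` address is an assumption (a TEST-NET address chosen because it should not be assigned to any local interface).

```java
import java.net.BindException;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.ServerSocket;

public class BindRepro {
    /**
     * Tries to bind an ephemeral port on the given address.
     * Returns "bound" on success, or the error message on failure.
     */
    public static String tryBind(String address) {
        try (ServerSocket socket = new ServerSocket()) {
            socket.bind(new InetSocketAddress(InetAddress.getByName(address), 0));
            return "bound";
        } catch (BindException e) {
            // On a typical Linux host this is "Cannot assign requested address",
            // the same errno the HBase master hits above.
            return e.getMessage();
        } catch (Exception e) {
            return e.getClass().getSimpleName();
        }
    }

    public static void main(String[] args) {
        System.out.println(tryBind("127.0.0.1"));   // loopback is always local
        System.out.println(tryBind("203.0.113.1")); // TEST-NET address is not local
    }
}
```

The same failure mode explains the flakiness: whether the bind succeeds depends entirely on what the CI host's name resolves to at the moment the suite starts.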
reflectionutils java more caused by org apache hbase thirdparty io netty channel unix errors nativeioexception bind failed cannot assign requested address at org apache hbase thirdparty io netty channel unix errors newioexception errors java at org apache hbase thirdparty io netty channel unix socket bind socket java at org apache hbase thirdparty io netty channel epoll abstractepollchannel dobind abstractepollchannel java at org apache hbase thirdparty io netty channel epoll epollserversocketchannel dobind epollserversocketchannel java at org apache hbase thirdparty io netty channel abstractchannel abstractunsafe bind abstractchannel java at org apache hbase thirdparty io netty channel defaultchannelpipeline headcontext bind defaultchannelpipeline java at org apache hbase thirdparty io netty channel abstractchannelhandlercontext invokebind abstractchannelhandlercontext java at org apache hbase thirdparty io netty channel abstractchannelhandlercontext bind abstractchannelhandlercontext java at org apache hbase thirdparty io netty channel defaultchannelpipeline bind defaultchannelpipeline java at org apache hbase thirdparty io netty channel abstractchannel bind abstractchannel java at org apache hbase thirdparty io netty bootstrap abstractbootstrap run abstractbootstrap java at org apache hbase thirdparty io netty util concurrent abstracteventexecutor safeexecute abstracteventexecutor java at org apache hbase thirdparty io netty util concurrent singlethreadeventexecutor runalltasks singlethreadeventexecutor java at org apache hbase thirdparty io netty channel epoll epolleventloop run epolleventloop java at org apache hbase thirdparty io netty util concurrent singlethreadeventexecutor run singlethreadeventexecutor java at org apache hbase thirdparty io netty util concurrent fastthreadlocalrunnable run fastthreadlocalrunnable java more results error failures error testphoenixconnectortest abstracttestqueryframework init createqueryrunner » runtime error tests run 
failures errors skipped reactor summary for trino phoenix snapshot trino phoenix failure trino failure
1
323,574
9,856,769,251
IssuesEvent
2019-06-19 23:28:53
RobotLocomotion/drake
https://api.github.com/repos/RobotLocomotion/drake
closed
Use double precision in tinyobjloader
priority: low team: kitware type: installation and distribution
In https://github.com/RobotLocomotion/drake/tree/master/tools/workspace/tinyobjloader we don't set any definitions, which means that we get `float` precision by default. We should just use `double` so as not to discard digits from the file. \CC @SeanCurtis-TRI
1.0
Use double precision in tinyobjloader - In https://github.com/RobotLocomotion/drake/tree/master/tools/workspace/tinyobjloader we don't set any definitions, which means that we get `float` precision by default. We should just use `double` so as not to discard digits from the file. \CC @SeanCurtis-TRI
non_test
use double precision in tinyobjloader in we don t set any definitions which means that we get float precision by default we should just use double so as not to discard digits from the file cc seancurtis tri
0
213,094
16,498,455,531
IssuesEvent
2021-05-25 12:37:53
IntelPython/numba-dppy
https://api.github.com/repos/IntelPython/numba-dppy
opened
Split `dpnp_skip_test()` because it responsible for 2 functions
bug tests
- check device exists - check dpnp exists If `dpnp_skip_test()` is not used the DPNP tests fails too.
1.0
Split `dpnp_skip_test()` because it responsible for 2 functions - - check device exists - check dpnp exists If `dpnp_skip_test()` is not used the DPNP tests fails too.
test
split dpnp skip test because it responsible for functions check device exists check dpnp exists if dpnp skip test is not used the dpnp tests fails too
1
268,686
23,388,478,585
IssuesEvent
2022-08-11 15:35:11
NVIDIA/spark-rapids
https://api.github.com/repos/NVIDIA/spark-rapids
opened
Tracking Exception Handling in Plugin Vs Spark
? - Needs Triage test task Spark 3.2+ spark 3.3+ Spark 3.4+
This is triggered by #6165 While working on a fix for #6165, I saw some inconsistency on how the plugin handles errors. For example, I found that Cast uses inconsistent ways to handle overflow: - `throw new ArithmeticException` - `throw RapidsErrorUtils` - `throw new IllegalStateException` - `throw QueryErrorUtils` This inconsistency makes addressing a specific change more challenging because it requires looking grepping in the entire code/shims instead of just looking in one shim/class. **Suggested Solution** Similar to previous issue #5196, I suggest to: - Create tables to be used as reference. Those table list the expected exceptions, exception messages, and what the GPU throws for each Spark version. | Case | example Query | spark-3.1 | Spark-3.2 | Spark-3.3 | Spark-3.4 | |----------------------------------|---------------|---------------------|--------------------------|--------------------------|-----------------------------------------| | Cast Overflow Numeric to Numeric | bla..bla | ArithmeticException | SparkArithmeticException | SparkArithmeticException | SparkArithmeticException(CAST_OVERFLOW) | | | | | | | | - Add more tests to verify the exception throws by Cast operations. Currently, changing the exception type may not break any integration tests. **Opened issues related to error handling** - https://github.com/NVIDIA/spark-rapids/issues/5806 - https://github.com/NVIDIA/spark-rapids/issues/5826 - https://github.com/NVIDIA/spark-rapids/issues/5152 - https://github.com/NVIDIA/spark-rapids/issues/5908 - https://github.com/NVIDIA/spark-rapids/issues/6262 - https://github.com/NVIDIA/spark-rapids/issues/6199 - https://github.com/NVIDIA/spark-rapids/issues/6185 - https://github.com/NVIDIA/spark-rapids/issues/6182 - https://github.com/NVIDIA/spark-rapids/issues/6076 - https://github.com/NVIDIA/spark-rapids/issues/6039 - https://github.com/NVIDIA/spark-rapids/issues/6014 - https://github.com/NVIDIA/spark-rapids/issues/6012
1.0
Tracking Exception Handling in Plugin Vs Spark - This is triggered by #6165 While working on a fix for #6165, I saw some inconsistency on how the plugin handles errors. For example, I found that Cast uses inconsistent ways to handle overflow: - `throw new ArithmeticException` - `throw RapidsErrorUtils` - `throw new IllegalStateException` - `throw QueryErrorUtils` This inconsistency makes addressing a specific change more challenging because it requires looking grepping in the entire code/shims instead of just looking in one shim/class. **Suggested Solution** Similar to previous issue #5196, I suggest to: - Create tables to be used as reference. Those table list the expected exceptions, exception messages, and what the GPU throws for each Spark version. | Case | example Query | spark-3.1 | Spark-3.2 | Spark-3.3 | Spark-3.4 | |----------------------------------|---------------|---------------------|--------------------------|--------------------------|-----------------------------------------| | Cast Overflow Numeric to Numeric | bla..bla | ArithmeticException | SparkArithmeticException | SparkArithmeticException | SparkArithmeticException(CAST_OVERFLOW) | | | | | | | | - Add more tests to verify the exception throws by Cast operations. Currently, changing the exception type may not break any integration tests. 
**Opened issues related to error handling** - https://github.com/NVIDIA/spark-rapids/issues/5806 - https://github.com/NVIDIA/spark-rapids/issues/5826 - https://github.com/NVIDIA/spark-rapids/issues/5152 - https://github.com/NVIDIA/spark-rapids/issues/5908 - https://github.com/NVIDIA/spark-rapids/issues/6262 - https://github.com/NVIDIA/spark-rapids/issues/6199 - https://github.com/NVIDIA/spark-rapids/issues/6185 - https://github.com/NVIDIA/spark-rapids/issues/6182 - https://github.com/NVIDIA/spark-rapids/issues/6076 - https://github.com/NVIDIA/spark-rapids/issues/6039 - https://github.com/NVIDIA/spark-rapids/issues/6014 - https://github.com/NVIDIA/spark-rapids/issues/6012
test
tracking exception handling in plugin vs spark this is triggered by while working on a fix for i saw some inconsistency on how the plugin handles errors for example i found that cast uses inconsistent ways to handle overflow throw new arithmeticexception throw rapidserrorutils throw new illegalstateexception throw queryerrorutils this inconsistency makes addressing a specific change more challenging because it requires looking grepping in the entire code shims instead of just looking in one shim class suggested solution similar to previous issue i suggest to create tables to be used as reference those table list the expected exceptions exception messages and what the gpu throws for each spark version case example query spark spark spark spark cast overflow numeric to numeric bla bla arithmeticexception sparkarithmeticexception sparkarithmeticexception sparkarithmeticexception cast overflow add more tests to verify the exception throws by cast operations currently changing the exception type may not break any integration tests opened issues related to error handling
1
27,067
5,311,771,415
IssuesEvent
2017-02-13 05:55:29
webpack/webpack.js.org
https://api.github.com/repos/webpack/webpack.js.org
closed
Document usage of `import()` for code splitting
Documentation
Instead of the deprecated `System.import`. As requested by @TheLarkInn. 😄
1.0
Document usage of `import()` for code splitting - Instead of the deprecated `System.import`. As requested by @TheLarkInn. 😄
non_test
document usage of import for code splitting instead of the deprecated system import as requested by thelarkinn 😄
0
262,447
22,841,274,354
IssuesEvent
2022-07-12 22:14:53
mapbox/mapbox-gl-js
https://api.github.com/repos/mapbox/mapbox-gl-js
closed
`fog/terrain/basic` render test is flaky
testing :100:
`render-tests/fog/terrain/basic` failed once on a completely unrelated change. We should fix it
1.0
`fog/terrain/basic` render test is flaky - `render-tests/fog/terrain/basic` failed once on a completely unrelated change. We should fix it
test
fog terrain basic render test is flaky render tests fog terrain basic failed once on a completely unrelated change we should fix it
1
170,962
13,211,148,089
IssuesEvent
2020-08-15 21:06:39
Rocologo/MobHunting
https://api.github.com/repos/Rocologo/MobHunting
closed
ERROR]: Could not pass event EntityDeathEvent to MobHunting v7.5.2
Fixed - To be tested
Timings report: https://timings.aikar.co/?id=459ec5acf1854b8097a87b064ac6b56e Console Spam: https://pastebin.com/uS3TmVkB This spams my console sometimes. I think it's related to the Nether being loaded on 1.15.2, and the changes between the mob names.
1.0
ERROR]: Could not pass event EntityDeathEvent to MobHunting v7.5.2 - Timings report: https://timings.aikar.co/?id=459ec5acf1854b8097a87b064ac6b56e Console Spam: https://pastebin.com/uS3TmVkB This spams my console sometimes. I think it's related to the Nether being loaded on 1.15.2, and the changes between the mob names.
test
error could not pass event entitydeathevent to mobhunting timings report console spam this spams my console sometimes i think it s related to the nether being loaded on and the changes between the mob names
1
433,298
12,505,529,041
IssuesEvent
2020-06-02 10:54:08
Warcraft-GoA-Development-Team/Warcraft-Guardians-of-Azeroth
https://api.github.com/repos/Warcraft-GoA-Development-Team/Warcraft-Guardians-of-Azeroth
closed
Change "Claim Capital City" Decision
:exclamation: priority critical :question: suggestion - balance :balance_scale: :question: suggestion :question:
**Describe your suggestion in full detail below:** I suggest changing it effect to "character get capital of its primary title". Currently, characters can usurp capital of non-primary titles and that can be weird sometimes. For example, in my game, my character's liege with Lordaeron primary title created Dalaran Kingdom and usurped Dalaran that belonged to my vassal, although liege already has Lordaeron as capital.
1.0
Change "Claim Capital City" Decision - **Describe your suggestion in full detail below:** I suggest changing it effect to "character get capital of its primary title". Currently, characters can usurp capital of non-primary titles and that can be weird sometimes. For example, in my game, my character's liege with Lordaeron primary title created Dalaran Kingdom and usurped Dalaran that belonged to my vassal, although liege already has Lordaeron as capital.
non_test
change claim capital city decision describe your suggestion in full detail below i suggest changing it effect to character get capital of its primary title currently characters can usurp capital of non primary titles and that can be weird sometimes for example in my game my character s liege with lordaeron primary title created dalaran kingdom and usurped dalaran that belonged to my vassal although liege already has lordaeron as capital
0
85,779
16,739,976,589
IssuesEvent
2021-06-11 08:36:19
Regalis11/Barotrauma
https://api.github.com/repos/Regalis11/Barotrauma
closed
Sprayer sound effect continues until player stops aiming sprayer
Bug Code
- [X] I have searched the issue tracker to check if the issue has already been reported. **Description** Sprayer starts sound effect after aiming and beginning to spray with left mouse button. When not spraying anymore, sound effect continues until player stops aiming with right mouse button. **Steps To Reproduce** Acquire sprayer with ethanol. Aim sprayer, then start spraying. Stop spraying while continuing to aim. **Version** Windows, latest unstable release (0.14.4.0) **Additional information** Add any other context about the problem here.
1.0
Sprayer sound effect continues until player stops aiming sprayer - - [X] I have searched the issue tracker to check if the issue has already been reported. **Description** Sprayer starts sound effect after aiming and beginning to spray with left mouse button. When not spraying anymore, sound effect continues until player stops aiming with right mouse button. **Steps To Reproduce** Acquire sprayer with ethanol. Aim sprayer, then start spraying. Stop spraying while continuing to aim. **Version** Windows, latest unstable release (0.14.4.0) **Additional information** Add any other context about the problem here.
non_test
sprayer sound effect continues until player stops aiming sprayer i have searched the issue tracker to check if the issue has already been reported description sprayer starts sound effect after aiming and beginning to spray with left mouse button when not spraying anymore sound effect continues until player stops aiming with right mouse button steps to reproduce acquire sprayer with ethanol aim sprayer then start spraying stop spraying while continuing to aim version windows latest unstable release additional information add any other context about the problem here
0
235,851
19,431,330,739
IssuesEvent
2021-12-21 12:20:42
wazuh/wazuh-qa
https://api.github.com/repos/wazuh/wazuh-qa
closed
Development of the launcher script that will contain the logic for the test automation
team/qa subteam/qa-thunder test/check-files type/development
We are asked to develop a python script to be able to group all the necessary processes to perform the testing of the check-files of the system. This script would have the necessary logic to be able to launch the tests with the following parameters: - `v --version` to specify Wazuh version to test (relese versions only). - `a --action` to specify the action we want to test (install, upgrade, uninstall) - `o --os` to specify the linux system we want to use. For now the following would be supported [centos7, centos8, ubuntu20.04]. The processes carried out by this script are as follows: - Build the `qa-ctl` configuration file to prepare for execution. (Deployment + actions on the environment) - Execute `qa-ctl` with the configuration file created above. - Launch the system check-files test and display results. This script will be located in the following location: ``` test_check_files/ ├── launcher.py ├── playbooks │ ├── install_wazuh.yaml │ ├── uninstall_wazuh.yaml │ └── upgrade_wauh.yaml └── test_system_check_files.py ``` ## Estimated tasks - [x] Create `launcher.py` script with parameters - [x] Validate input parameters - [x] Generate `qa-ctl` configuration file in `launcher.py` script - [x] Run `qa-ctl` with generated configuration in `launcher.py` script - [x] Launch the pytest of `test_system_check_files` script - [x] Get and display results obtained ## Update It has been decided that instead of having the playbooks uploaded in the repository, they will be created dynamically according to the parameters entered in the test launcher. This allows us to avoid having duplicate playbooks, while having more flexibility according to the specific condition for each test, without having to alter any playbook. In addition, this new development is beneficial for future tests, since once the functionality is developed, it will only be necessary to specify which tasks and playbooks will run each test. 
This new change, involves the following development: - [x] Add generic tasks to carry out actions with Wazuh. - [x] Create a class that allows us to generate playbooks dynamically. - [x] Build the necessary playbooks for the test-launcher, taking into account the input parameters of the test-launcher. - [x] Test that all the tasks are carried out correctly.
1.0
Development of the launcher script that will contain the logic for the test automation - We are asked to develop a python script to be able to group all the necessary processes to perform the testing of the check-files of the system. This script would have the necessary logic to be able to launch the tests with the following parameters: - `v --version` to specify Wazuh version to test (relese versions only). - `a --action` to specify the action we want to test (install, upgrade, uninstall) - `o --os` to specify the linux system we want to use. For now the following would be supported [centos7, centos8, ubuntu20.04]. The processes carried out by this script are as follows: - Build the `qa-ctl` configuration file to prepare for execution. (Deployment + actions on the environment) - Execute `qa-ctl` with the configuration file created above. - Launch the system check-files test and display results. This script will be located in the following location: ``` test_check_files/ ├── launcher.py ├── playbooks │ ├── install_wazuh.yaml │ ├── uninstall_wazuh.yaml │ └── upgrade_wauh.yaml └── test_system_check_files.py ``` ## Estimated tasks - [x] Create `launcher.py` script with parameters - [x] Validate input parameters - [x] Generate `qa-ctl` configuration file in `launcher.py` script - [x] Run `qa-ctl` with generated configuration in `launcher.py` script - [x] Launch the pytest of `test_system_check_files` script - [x] Get and display results obtained ## Update It has been decided that instead of having the playbooks uploaded in the repository, they will be created dynamically according to the parameters entered in the test launcher. This allows us to avoid having duplicate playbooks, while having more flexibility according to the specific condition for each test, without having to alter any playbook. 
In addition, this new development is beneficial for future tests, since once the functionality is developed, it will only be necessary to specify which tasks and playbooks will run each test. This new change, involves the following development: - [x] Add generic tasks to carry out actions with Wazuh. - [x] Create a class that allows us to generate playbooks dynamically. - [x] Build the necessary playbooks for the test-launcher, taking into account the input parameters of the test-launcher. - [x] Test that all the tasks are carried out correctly.
test
development of the launcher script that will contain the logic for the test automation we are asked to develop a python script to be able to group all the necessary processes to perform the testing of the check files of the system this script would have the necessary logic to be able to launch the tests with the following parameters v version to specify wazuh version to test relese versions only a action to specify the action we want to test install upgrade uninstall o os to specify the linux system we want to use for now the following would be supported the processes carried out by this script are as follows build the qa ctl configuration file to prepare for execution deployment actions on the environment execute qa ctl with the configuration file created above launch the system check files test and display results this script will be located in the following location test check files ├── launcher py ├── playbooks │ ├── install wazuh yaml │ ├── uninstall wazuh yaml │ └── upgrade wauh yaml └── test system check files py estimated tasks create launcher py script with parameters validate input parameters generate qa ctl configuration file in launcher py script run qa ctl with generated configuration in launcher py script launch the pytest of test system check files script get and display results obtained update it has been decided that instead of having the playbooks uploaded in the repository they will be created dynamically according to the parameters entered in the test launcher this allows us to avoid having duplicate playbooks while having more flexibility according to the specific condition for each test without having to alter any playbook in addition this new development is beneficial for future tests since once the functionality is developed it will only be necessary to specify which tasks and playbooks will run each test this new change involves the following development add generic tasks to carry out actions with wazuh create a class that allows us to 
generate playbooks dynamically build the necessary playbooks for the test launcher taking into account the input parameters of the test launcher test that all the tasks are carried out correctly
1
171,840
20,999,759,121
IssuesEvent
2022-03-29 16:18:44
AlexRogalskiy/github-action-user-contribution
https://api.github.com/repos/AlexRogalskiy/github-action-user-contribution
opened
CVE-2021-37712 (High) detected in tar-2.2.2.tgz, tar-6.1.0.tgz
security vulnerability
## CVE-2021-37712 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tar-2.2.2.tgz</b>, <b>tar-6.1.0.tgz</b></p></summary> <p> <details><summary><b>tar-2.2.2.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.2.tgz">https://registry.npmjs.org/tar/-/tar-2.2.2.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/@definitelytyped/utils/node_modules/tar/package.json,/node_modules/node-gyp/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - dtslint-4.1.6.tgz (Root Library) - utils-0.0.89.tgz - :x: **tar-2.2.2.tgz** (Vulnerable Library) </details> <details><summary><b>tar-6.1.0.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/tar/package.json,/node_modules/npm/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - editorconfig-checker-4.0.2.tgz (Root Library) - :x: **tar-6.1.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-user-contribution/commit/e51ac509e11100b261cb1c0ba3c578d747ec913a">e51ac509e11100b261cb1c0ba3c578d747ec913a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. 
This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p. 
<p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution: tar - 4.4.18,5.0.10,6.1.9</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-37712 (High) detected in tar-2.2.2.tgz, tar-6.1.0.tgz - ## CVE-2021-37712 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>tar-2.2.2.tgz</b>, <b>tar-6.1.0.tgz</b></p></summary> <p> <details><summary><b>tar-2.2.2.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.2.tgz">https://registry.npmjs.org/tar/-/tar-2.2.2.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/@definitelytyped/utils/node_modules/tar/package.json,/node_modules/node-gyp/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - dtslint-4.1.6.tgz (Root Library) - utils-0.0.89.tgz - :x: **tar-2.2.2.tgz** (Vulnerable Library) </details> <details><summary><b>tar-6.1.0.tgz</b></p></summary> <p>tar for node</p> <p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-6.1.0.tgz">https://registry.npmjs.org/tar/-/tar-6.1.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/tar/package.json,/node_modules/npm/node_modules/tar/package.json</p> <p> Dependency Hierarchy: - editorconfig-checker-4.0.2.tgz (Root Library) - :x: **tar-6.1.0.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-user-contribution/commit/e51ac509e11100b261cb1c0ba3c578d747ec913a">e51ac509e11100b261cb1c0ba3c578d747ec913a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The npm package "tar" (aka node-tar) before versions 4.4.18, 5.0.10, and 6.1.9 has an arbitrary file creation/overwrite and arbitrary code execution vulnerability. 
node-tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary stat calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value. Additionally, on Windows systems, long path portions would resolve to the same file system entities as their 8.3 "short path" counterparts. A specially crafted tar archive could thus include a directory with one form of the path, followed by a symbolic link with a different string that resolves to the same file system entity, followed by a file using the first form. By first creating a directory, and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem, it was thus possible to bypass node-tar symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. These issues were addressed in releases 4.4.18, 5.0.10 and 6.1.9. The v3 branch of node-tar has been deprecated and did not receive patches for these issues. If you are still using a v3 release we recommend you update to a more recent version of node-tar. If this is not possible, a workaround is available in the referenced GHSA-qq89-hq3f-393p. 
<p>Publish Date: 2021-08-31 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37712>CVE-2021-37712</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p">https://github.com/npm/node-tar/security/advisories/GHSA-qq89-hq3f-393p</a></p> <p>Release Date: 2021-08-31</p> <p>Fix Resolution: tar - 4.4.18,5.0.10,6.1.9</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in tar tgz tar tgz cve high severity vulnerability vulnerable libraries tar tgz tar tgz tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules definitelytyped utils node modules tar package json node modules node gyp node modules tar package json dependency hierarchy dtslint tgz root library utils tgz x tar tgz vulnerable library tar tgz tar for node library home page a href path to dependency file package json path to vulnerable library node modules tar package json node modules npm node modules tar package json dependency hierarchy editorconfig checker tgz root library x tar tgz vulnerable library found in head commit a href vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite and arbitrary code execution vulnerability node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with names containing unicode values that normalized to the same value additionally on windows systems long path portions would resolve to the same file system entities as their short path counterparts a specially crafted tar archive could thus include a directory with one form of the path followed by a symbolic link with a different string that resolves to the same file system entity followed by a file using the first form by first creating a directory and then replacing that directory with a symlink that had a different apparent name that resolved to the same entry in the filesystem it was thus possible to bypass node tar symlink checks on directories essentially 
allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite these issues were addressed in releases and the branch of node tar has been deprecated and did not receive patches for these issues if you are still using a release we recommend you update to a more recent version of node tar if this is not possible a workaround is available in the referenced ghsa publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar step up your open source security game with whitesource
0
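The vulnerability description above hinges on two distinct strings "containing unicode values that normalized to the same value", which let a symlink evade node-tar's directory cache. The collision itself can be demonstrated in a few lines (this only illustrates the normalization behavior, not the tar exploit):

```python
import unicodedata

# Two distinct spellings of "café": composed vs decomposed forms.
composed = "caf\u00e9"     # 'é' as a single code point (U+00E9)
decomposed = "cafe\u0301"  # 'e' followed by a combining acute accent (U+0301)

# The raw strings differ, so a path cache keyed on one spelling
# will not match the other spelling...
assert composed != decomposed

# ...yet the filesystem may treat them as the same entry, because
# both normalize to the identical NFC form.
assert (unicodedata.normalize("NFC", composed)
        == unicodedata.normalize("NFC", decomposed))
```

This is why the patched releases normalize (or re-check) paths before trusting the directory cache.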
11,197
4,927,677,858
IssuesEvent
2016-11-26 21:52:34
numpy/numpy
https://api.github.com/repos/numpy/numpy
closed
Build numpy for Python3.5: get_mathlib_info raise RuntimeError("Broken toolchain: cannot link a simple C program")
53 - Invalid component: build
I installed python3.5 using the installer from python official website, then trying to install numpy, but get the following error (I have Visual Studio 2015 installed). ``` pip3.5 install numpy Collecting numpy Using cached numpy-1.10.1.tar.gz Installing collected packages: numpy Running setup.py install for numpy Complete output from command d:\local\python3.5\python.exe -c "import setuptools, tokenize;__file__='C:\\Users\\SUOKUN~1\\AppData\\Local\\Temp\\pip-build-yevxt05r\\numpy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\SUOKUN~1\AppData\Local\Temp\pip-5_a5vpkk-record\install-record.txt --single-version-externally-managed --compile: blas_opt_info: blas_mkl_info: libraries mkl,vml,guide not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE openblas_info: libraries openblas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_3_10_blas_threads_info: Setting PTATLAS=ATLAS libraries tatlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_3_10_blas_info: libraries satlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_blas_threads_info: Setting PTATLAS=ATLAS libraries ptf77blas,ptcblas,atlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_blas_info: libraries f77blas,cblas,atlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE blas_info: libraries blas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE blas_src_info: NOT AVAILABLE NOT AVAILABLE non-existing path in 'numpy\\distutils': 'site.cfg' F2PY Version 2 lapack_opt_info: openblas_lapack_info: libraries openblas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 
'd:\\local\\python3.5\\libs'] NOT AVAILABLE lapack_mkl_info: mkl_info: libraries mkl,vml,guide not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE NOT AVAILABLE atlas_3_10_threads_info: Setting PTATLAS=ATLAS libraries tatlas,tatlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries tatlas,tatlas not found in C:\ libraries lapack_atlas not found in C:\ libraries tatlas,tatlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_3_10_threads_info'> NOT AVAILABLE atlas_3_10_info: libraries satlas,satlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries satlas,satlas not found in C:\ libraries lapack_atlas not found in C:\ libraries satlas,satlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_3_10_info'> NOT AVAILABLE atlas_threads_info: Setting PTATLAS=ATLAS libraries ptf77blas,ptcblas,atlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries ptf77blas,ptcblas,atlas not found in C:\ libraries lapack_atlas not found in C:\ libraries ptf77blas,ptcblas,atlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_threads_info'> NOT AVAILABLE atlas_info: libraries f77blas,cblas,atlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries f77blas,cblas,atlas not found in C:\ libraries lapack_atlas not found in C:\ libraries f77blas,cblas,atlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_info'> NOT AVAILABLE lapack_info: libraries lapack not found in 
['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE lapack_src_info: NOT AVAILABLE NOT AVAILABLE running install running build running config_cc unifing config_cc, config, build_clib, build_ext, build commands --compiler options running config_fc unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options running build_src build_src building py_modules sources creating build creating build\src.win32-3.5 creating build\src.win32-3.5\numpy creating build\src.win32-3.5\numpy\distutils building library "npymath" sources No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils customize GnuFCompiler Could not locate executable g77 Could not locate executable f77 customize IntelVisualFCompiler Could not locate executable ifort Could not locate executable ifl customize AbsoftFCompiler Could not locate executable f90 customize CompaqVisualFCompiler Could not locate executable DF customize IntelItaniumVisualFCompiler Could not locate executable efl customize Gnu95FCompiler Could not locate executable gfortran Could not locate executable f95 customize G95FCompiler Could not locate executable g95 customize IntelEM64VisualFCompiler customize IntelEM64TFCompiler Could not locate executable efort Could not locate executable efc don't know how to compile Fortran code on platform 'nt' C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -Id:\local\python3.5\include -Id:\local\python3.5\include -I"C:\Program Files\Microsoft Visual Studio 14.0\VC\INCLUDE" -I"C:\Program Files\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" -I"C:\Program Files\Windows Kits\NETFXSDK\4.6\include\um" /Tc_configtest.c /Fo_configtest.obj Found executable C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\cl.exe Running 
from numpy source directory. C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1651: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1660: UserWarning: Blas (http://www.netlib.org/blas/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [blas]) or by setting the BLAS environment variable. warnings.warn(BlasNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1663: UserWarning: Blas (http://www.netlib.org/blas/) sources not found. Directories to search for the sources can be specified in the numpy/distutils/site.cfg file (section [blas_src]) or by setting the BLAS_SRC environment variable. warnings.warn(BlasSrcNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1552: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1563: UserWarning: Lapack (http://www.netlib.org/lapack/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [lapack]) or by setting the LAPACK environment variable. 
warnings.warn(LapackNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1566: UserWarning: Lapack (http://www.netlib.org/lapack/) sources not found. Directories to search for the sources can be specified in the numpy/distutils/site.cfg file (section [lapack_src]) or by setting the LAPACK_SRC environment variable. warnings.warn(LapackSrcNotFoundError.__doc__) d:\local\python3.5\lib\distutils\dist.py:261: UserWarning: Unknown distribution option: 'define_macros' warnings.warn(msg) C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\link.exe /nologo /INCREMENTAL:NO /LTCG /MANIFEST:EMBED,ID=1 /LIBPATH:"C:\Program Files\Microsoft Visual Studio 14.0\VC\LIB" /LIBPATH:"C:\Program Files\Microsoft Visual Studio 14.0\VC\ATLMFC\LIB" /LIBPATH:"C:\Program Files\Windows Kits\NETFXSDK\4.6\lib\um\x86" _configtest.obj /OUT:_configtest.exe Found executable C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\link.exe failure. removing: _configtest.c _configtest.obj Traceback (most recent call last): File "<string>", line 1, in <module> File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\setup.py", line 264, in <module> setup_package() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\setup.py", line 256, in setup_package setup(**metadata) File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\core.py", line 169, in setup return old_setup(**new_attr) File "d:\local\python3.5\lib\distutils\core.py", line 148, in setup dist.run_commands() File "d:\local\python3.5\lib\distutils\dist.py", line 955, in run_commands self.run_command(cmd) File "d:\local\python3.5\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\install.py", line 62, in run r = self.setuptools_run() File 
"C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\install.py", line 36, in setuptools_run return distutils_install.run(self) File "d:\local\python3.5\lib\distutils\command\install.py", line 539, in run self.run_command('build') File "d:\local\python3.5\lib\distutils\cmd.py", line 313, in run_command self.distribution.run_command(command) File "d:\local\python3.5\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build.py", line 47, in run old_build.run(self) File "d:\local\python3.5\lib\distutils\command\build.py", line 135, in run self.run_command(cmd_name) File "d:\local\python3.5\lib\distutils\cmd.py", line 313, in run_command self.distribution.run_command(command) File "d:\local\python3.5\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 153, in run self.build_sources() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 164, in build_sources self.build_library_sources(*libname_info) File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 299, in build_library_sources sources = self.generate_sources(sources, (lib_name, build_info)) File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 386, in generate_sources source = func(extension, build_dir) File "numpy\core\setup.py", line 669, in get_mathlib_info raise RuntimeError("Broken toolchain: cannot link a simple C program") RuntimeError: Broken toolchain: cannot link a simple C program ---------------------------------------- Command "d:\local\python3.5\python.exe -c "import setuptools, 
tokenize;__file__='C:\\Users\\SUOKUN~1\\AppData\\Local\\Temp\\pip-build-yevxt05r\\numpy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\SUOKUN~1\AppData\Local\Temp\pip-5_a5vpkk-record\install-record.txt --single-version-externally-managed --compile" failed with error code 1 in C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy ```
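The `RuntimeError("Broken toolchain: cannot link a simple C program")` in the log above is raised when numpy's config step fails to compile and link a trivial `_configtest.c`. A minimal sketch of reproducing that check outside the numpy build — assuming a `cc`-style compiler on `PATH`; the function name and flags here are illustrative, not numpy's actual implementation — is:

```python
import os
import shutil
import subprocess
import tempfile

# Roughly what numpy's config test compiles and links (_configtest.c).
CONFIG_TEST = "int main(void) { return 0; }\n"

def toolchain_ok(compiler: str = "cc"):
    """Try to link a trivial C program; True/False on result, None if no compiler found."""
    if shutil.which(compiler) is None:
        return None  # compiler not on PATH; cannot run the check
    with tempfile.TemporaryDirectory() as tmp:
        src = os.path.join(tmp, "_configtest.c")
        exe = os.path.join(tmp, "_configtest")
        with open(src, "w") as f:
            f.write(CONFIG_TEST)
        result = subprocess.run([compiler, src, "-o", exe],
                                capture_output=True)
        return result.returncode == 0
```

If this check fails with MSVC, the usual cause in this era was a Visual Studio install whose linker environment (`LIB`/`INCLUDE` paths) did not match the Python build, as the `link.exe ... failure.` line in the log suggests.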
1.0
Build numpy for Python3.5: get_mathlib_info raise RuntimeError("Broken toolchain: cannot link a simple C program") - I installed python3.5 using the installer from python official website, then trying to install numpy, but get the following error (I have Visual Studio 2015 installed). ``` pip3.5 install numpy Collecting numpy Using cached numpy-1.10.1.tar.gz Installing collected packages: numpy Running setup.py install for numpy Complete output from command d:\local\python3.5\python.exe -c "import setuptools, tokenize;__file__='C:\\Users\\SUOKUN~1\\AppData\\Local\\Temp\\pip-build-yevxt05r\\numpy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\SUOKUN~1\AppData\Local\Temp\pip-5_a5vpkk-record\install-record.txt --single-version-externally-managed --compile: blas_opt_info: blas_mkl_info: libraries mkl,vml,guide not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE openblas_info: libraries openblas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_3_10_blas_threads_info: Setting PTATLAS=ATLAS libraries tatlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_3_10_blas_info: libraries satlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_blas_threads_info: Setting PTATLAS=ATLAS libraries ptf77blas,ptcblas,atlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE atlas_blas_info: libraries f77blas,cblas,atlas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE blas_info: libraries blas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE blas_src_info: NOT AVAILABLE NOT AVAILABLE non-existing path in 'numpy\\distutils': 'site.cfg' F2PY Version 2 
lapack_opt_info: openblas_lapack_info: libraries openblas not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE lapack_mkl_info: mkl_info: libraries mkl,vml,guide not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE NOT AVAILABLE atlas_3_10_threads_info: Setting PTATLAS=ATLAS libraries tatlas,tatlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries tatlas,tatlas not found in C:\ libraries lapack_atlas not found in C:\ libraries tatlas,tatlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_3_10_threads_info'> NOT AVAILABLE atlas_3_10_info: libraries satlas,satlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries satlas,satlas not found in C:\ libraries lapack_atlas not found in C:\ libraries satlas,satlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_3_10_info'> NOT AVAILABLE atlas_threads_info: Setting PTATLAS=ATLAS libraries ptf77blas,ptcblas,atlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries ptf77blas,ptcblas,atlas not found in C:\ libraries lapack_atlas not found in C:\ libraries ptf77blas,ptcblas,atlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 'numpy.distutils.system_info.atlas_threads_info'> NOT AVAILABLE atlas_info: libraries f77blas,cblas,atlas not found in d:\local\python3.5\lib libraries lapack_atlas not found in d:\local\python3.5\lib libraries f77blas,cblas,atlas not found in C:\ libraries lapack_atlas not found in C:\ libraries f77blas,cblas,atlas not found in d:\local\python3.5\libs libraries lapack_atlas not found in d:\local\python3.5\libs <class 
'numpy.distutils.system_info.atlas_info'> NOT AVAILABLE lapack_info: libraries lapack not found in ['d:\\local\\python3.5\\lib', 'C:\\', 'd:\\local\\python3.5\\libs'] NOT AVAILABLE lapack_src_info: NOT AVAILABLE NOT AVAILABLE running install running build running config_cc unifing config_cc, config, build_clib, build_ext, build commands --compiler options running config_fc unifing config_fc, config, build_clib, build_ext, build commands --fcompiler options running build_src build_src building py_modules sources creating build creating build\src.win32-3.5 creating build\src.win32-3.5\numpy creating build\src.win32-3.5\numpy\distutils building library "npymath" sources No module named 'numpy.distutils._msvccompiler' in numpy.distutils; trying from distutils customize GnuFCompiler Could not locate executable g77 Could not locate executable f77 customize IntelVisualFCompiler Could not locate executable ifort Could not locate executable ifl customize AbsoftFCompiler Could not locate executable f90 customize CompaqVisualFCompiler Could not locate executable DF customize IntelItaniumVisualFCompiler Could not locate executable efl customize Gnu95FCompiler Could not locate executable gfortran Could not locate executable f95 customize G95FCompiler Could not locate executable g95 customize IntelEM64VisualFCompiler customize IntelEM64TFCompiler Could not locate executable efort Could not locate executable efc don't know how to compile Fortran code on platform 'nt' C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\cl.exe /c /nologo /Ox /W3 /GL /DNDEBUG /MD -Inumpy\core\src\private -Inumpy\core\src -Inumpy\core -Inumpy\core\src\npymath -Inumpy\core\src\multiarray -Inumpy\core\src\umath -Inumpy\core\src\npysort -Id:\local\python3.5\include -Id:\local\python3.5\include -I"C:\Program Files\Microsoft Visual Studio 14.0\VC\INCLUDE" -I"C:\Program Files\Microsoft Visual Studio 14.0\VC\ATLMFC\INCLUDE" -I"C:\Program Files\Windows Kits\NETFXSDK\4.6\include\um" /Tc_configtest.c 
/Fo_configtest.obj Found executable C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\cl.exe Running from numpy source directory. C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1651: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1660: UserWarning: Blas (http://www.netlib.org/blas/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [blas]) or by setting the BLAS environment variable. warnings.warn(BlasNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1663: UserWarning: Blas (http://www.netlib.org/blas/) sources not found. Directories to search for the sources can be specified in the numpy/distutils/site.cfg file (section [blas_src]) or by setting the BLAS_SRC environment variable. warnings.warn(BlasSrcNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1552: UserWarning: Atlas (http://math-atlas.sourceforge.net/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [atlas]) or by setting the ATLAS environment variable. warnings.warn(AtlasNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1563: UserWarning: Lapack (http://www.netlib.org/lapack/) libraries not found. Directories to search for the libraries can be specified in the numpy/distutils/site.cfg file (section [lapack]) or by setting the LAPACK environment variable. 
warnings.warn(LapackNotFoundError.__doc__) C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\system_info.py:1566: UserWarning: Lapack (http://www.netlib.org/lapack/) sources not found. Directories to search for the sources can be specified in the numpy/distutils/site.cfg file (section [lapack_src]) or by setting the LAPACK_SRC environment variable. warnings.warn(LapackSrcNotFoundError.__doc__) d:\local\python3.5\lib\distutils\dist.py:261: UserWarning: Unknown distribution option: 'define_macros' warnings.warn(msg) C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\link.exe /nologo /INCREMENTAL:NO /LTCG /MANIFEST:EMBED,ID=1 /LIBPATH:"C:\Program Files\Microsoft Visual Studio 14.0\VC\LIB" /LIBPATH:"C:\Program Files\Microsoft Visual Studio 14.0\VC\ATLMFC\LIB" /LIBPATH:"C:\Program Files\Windows Kits\NETFXSDK\4.6\lib\um\x86" _configtest.obj /OUT:_configtest.exe Found executable C:\Program Files\Microsoft Visual Studio 14.0\VC\BIN\link.exe failure. removing: _configtest.c _configtest.obj Traceback (most recent call last): File "<string>", line 1, in <module> File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\setup.py", line 264, in <module> setup_package() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\setup.py", line 256, in setup_package setup(**metadata) File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\core.py", line 169, in setup return old_setup(**new_attr) File "d:\local\python3.5\lib\distutils\core.py", line 148, in setup dist.run_commands() File "d:\local\python3.5\lib\distutils\dist.py", line 955, in run_commands self.run_command(cmd) File "d:\local\python3.5\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\install.py", line 62, in run r = self.setuptools_run() File 
"C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\install.py", line 36, in setuptools_run return distutils_install.run(self) File "d:\local\python3.5\lib\distutils\command\install.py", line 539, in run self.run_command('build') File "d:\local\python3.5\lib\distutils\cmd.py", line 313, in run_command self.distribution.run_command(command) File "d:\local\python3.5\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build.py", line 47, in run old_build.run(self) File "d:\local\python3.5\lib\distutils\command\build.py", line 135, in run self.run_command(cmd_name) File "d:\local\python3.5\lib\distutils\cmd.py", line 313, in run_command self.distribution.run_command(command) File "d:\local\python3.5\lib\distutils\dist.py", line 974, in run_command cmd_obj.run() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 153, in run self.build_sources() File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 164, in build_sources self.build_library_sources(*libname_info) File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 299, in build_library_sources sources = self.generate_sources(sources, (lib_name, build_info)) File "C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy\numpy\distutils\command\build_src.py", line 386, in generate_sources source = func(extension, build_dir) File "numpy\core\setup.py", line 669, in get_mathlib_info raise RuntimeError("Broken toolchain: cannot link a simple C program") RuntimeError: Broken toolchain: cannot link a simple C program ---------------------------------------- Command "d:\local\python3.5\python.exe -c "import setuptools, 
tokenize;__file__='C:\\Users\\SUOKUN~1\\AppData\\Local\\Temp\\pip-build-yevxt05r\\numpy\\setup.py';exec(compile(getattr(tokenize, 'open', open)(__file__).read().replace('\r\n', '\n'), __file__, 'exec'))" install --record C:\Users\SUOKUN~1\AppData\Local\Temp\pip-5_a5vpkk-record\install-record.txt --single-version-externally-managed --compile" failed with error code 1 in C:\Users\SUOKUN~1\AppData\Local\Temp\pip-build-yevxt05r\numpy ```
non_test
build numpy for get mathlib info raise runtimeerror broken toolchain cannot link a simple c program i installed using the installer from python official website then trying to install numpy but get the following error i have visual studio installed install numpy collecting numpy using cached numpy tar gz installing collected packages numpy running setup py install for numpy complete output from command d local python exe c import setuptools tokenize file c users suokun appdata local temp pip build numpy setup py exec compile getattr tokenize open open file read replace r n n file exec install record c users suokun appdata local temp pip record install record txt single version externally managed compile blas opt info blas mkl info libraries mkl vml guide not found in not available openblas info libraries openblas not found in not available atlas blas threads info setting ptatlas atlas libraries tatlas not found in not available atlas blas info libraries satlas not found in not available atlas blas threads info setting ptatlas atlas libraries ptcblas atlas not found in not available atlas blas info libraries cblas atlas not found in not available blas info libraries blas not found in not available blas src info not available not available non existing path in numpy distutils site cfg version lapack opt info openblas lapack info libraries openblas not found in not available lapack mkl info mkl info libraries mkl vml guide not found in not available not available atlas threads info setting ptatlas atlas libraries tatlas tatlas not found in d local lib libraries lapack atlas not found in d local lib libraries tatlas tatlas not found in c libraries lapack atlas not found in c libraries tatlas tatlas not found in d local libs libraries lapack atlas not found in d local libs not available atlas info libraries satlas satlas not found in d local lib libraries lapack atlas not found in d local lib libraries satlas satlas not found in c libraries lapack atlas not found in c 
libraries satlas satlas not found in d local libs libraries lapack atlas not found in d local libs not available atlas threads info setting ptatlas atlas libraries ptcblas atlas not found in d local lib libraries lapack atlas not found in d local lib libraries ptcblas atlas not found in c libraries lapack atlas not found in c libraries ptcblas atlas not found in d local libs libraries lapack atlas not found in d local libs not available atlas info libraries cblas atlas not found in d local lib libraries lapack atlas not found in d local lib libraries cblas atlas not found in c libraries lapack atlas not found in c libraries cblas atlas not found in d local libs libraries lapack atlas not found in d local libs not available lapack info libraries lapack not found in not available lapack src info not available not available running install running build running config cc unifing config cc config build clib build ext build commands compiler options running config fc unifing config fc config build clib build ext build commands fcompiler options running build src build src building py modules sources creating build creating build src creating build src numpy creating build src numpy distutils building library npymath sources no module named numpy distutils msvccompiler in numpy distutils trying from distutils customize gnufcompiler could not locate executable could not locate executable customize intelvisualfcompiler could not locate executable ifort could not locate executable ifl customize absoftfcompiler could not locate executable customize compaqvisualfcompiler could not locate executable df customize intelitaniumvisualfcompiler could not locate executable efl customize could not locate executable gfortran could not locate executable customize could not locate executable customize customize could not locate executable efort could not locate executable efc don t know how to compile fortran code on platform nt c program files microsoft visual studio vc bin cl exe c 
nologo ox gl dndebug md inumpy core src private inumpy core src inumpy core inumpy core src npymath inumpy core src multiarray inumpy core src umath inumpy core src npysort id local include id local include i c program files microsoft visual studio vc include i c program files microsoft visual studio vc atlmfc include i c program files windows kits netfxsdk include um tc configtest c fo configtest obj found executable c program files microsoft visual studio vc bin cl exe running from numpy source directory c users suokun appdata local temp pip build numpy numpy distutils system info py userwarning atlas libraries not found directories to search for the libraries can be specified in the numpy distutils site cfg file section or by setting the atlas environment variable warnings warn atlasnotfounderror doc c users suokun appdata local temp pip build numpy numpy distutils system info py userwarning blas libraries not found directories to search for the libraries can be specified in the numpy distutils site cfg file section or by setting the blas environment variable warnings warn blasnotfounderror doc c users suokun appdata local temp pip build numpy numpy distutils system info py userwarning blas sources not found directories to search for the sources can be specified in the numpy distutils site cfg file section or by setting the blas src environment variable warnings warn blassrcnotfounderror doc c users suokun appdata local temp pip build numpy numpy distutils system info py userwarning atlas libraries not found directories to search for the libraries can be specified in the numpy distutils site cfg file section or by setting the atlas environment variable warnings warn atlasnotfounderror doc c users suokun appdata local temp pip build numpy numpy distutils system info py userwarning lapack libraries not found directories to search for the libraries can be specified in the numpy distutils site cfg file section or by setting the lapack environment variable warnings 
warn lapacknotfounderror doc c users suokun appdata local temp pip build numpy numpy distutils system info py userwarning lapack sources not found directories to search for the sources can be specified in the numpy distutils site cfg file section or by setting the lapack src environment variable warnings warn lapacksrcnotfounderror doc d local lib distutils dist py userwarning unknown distribution option define macros warnings warn msg c program files microsoft visual studio vc bin link exe nologo incremental no ltcg manifest embed id libpath c program files microsoft visual studio vc lib libpath c program files microsoft visual studio vc atlmfc lib libpath c program files windows kits netfxsdk lib um configtest obj out configtest exe found executable c program files microsoft visual studio vc bin link exe failure removing configtest c configtest obj traceback most recent call last file line in file c users suokun appdata local temp pip build numpy setup py line in setup package file c users suokun appdata local temp pip build numpy setup py line in setup package setup metadata file c users suokun appdata local temp pip build numpy numpy distutils core py line in setup return old setup new attr file d local lib distutils core py line in setup dist run commands file d local lib distutils dist py line in run commands self run command cmd file d local lib distutils dist py line in run command cmd obj run file c users suokun appdata local temp pip build numpy numpy distutils command install py line in run r self setuptools run file c users suokun appdata local temp pip build numpy numpy distutils command install py line in setuptools run return distutils install run self file d local lib distutils command install py line in run self run command build file d local lib distutils cmd py line in run command self distribution run command command file d local lib distutils dist py line in run command cmd obj run file c users suokun appdata local temp pip build numpy numpy 
distutils command build py line in run old build run self file d local lib distutils command build py line in run self run command cmd name file d local lib distutils cmd py line in run command self distribution run command command file d local lib distutils dist py line in run command cmd obj run file c users suokun appdata local temp pip build numpy numpy distutils command build src py line in run self build sources file c users suokun appdata local temp pip build numpy numpy distutils command build src py line in build sources self build library sources libname info file c users suokun appdata local temp pip build numpy numpy distutils command build src py line in build library sources sources self generate sources sources lib name build info file c users suokun appdata local temp pip build numpy numpy distutils command build src py line in generate sources source func extension build dir file numpy core setup py line in get mathlib info raise runtimeerror broken toolchain cannot link a simple c program runtimeerror broken toolchain cannot link a simple c program command d local python exe c import setuptools tokenize file c users suokun appdata local temp pip build numpy setup py exec compile getattr tokenize open open file read replace r n n file exec install record c users suokun appdata local temp pip record install record txt single version externally managed compile failed with error code in c users suokun appdata local temp pip build numpy
0
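The "broken toolchain: cannot link a simple c program" failure in the record above is numpy's build aborting because no working C compiler/linker was found. A minimal, hypothetical pre-flight sketch (the candidate compiler names are illustrative, not an exhaustive list):

```python
import shutil

def find_c_compilers(candidates=("cl", "gcc", "clang")):
    """Return the subset of candidate C compilers found on PATH.

    An empty result suggests the same root cause as the
    "broken toolchain: cannot link a simple C program" error:
    no usable C compiler is visible to the build.
    """
    return [name for name in candidates if shutil.which(name) is not None]

found = find_c_compilers()
print("C compilers on PATH:", found or "none")
```

On Windows this kind of check only tells you a compiler binary exists; the numpy failure above additionally involved the linker step, which this sketch does not exercise.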
18,559
10,256,407,036
IssuesEvent
2019-08-21 17:35:51
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
No longer able to nominate yourself as an MVP
Pri2 assigned-to-author doc-bug security/svc triaged
The last paragraph is no longer relevant as the MVP program no longer accepts self nominations & therefore needs to be amended --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3bb1b889-02ca-2b75-11f0-12798ff285b4 * Version Independent ID: b9ffc6b2-5666-d27c-96a2-da58ada50eba * Content: [Azure Security MVP Program](https://docs.microsoft.com/en-us/azure/security/azure-security-mvp) * Content Source: [articles/security/azure-security-mvp.md](https://github.com/Microsoft/azure-docs/blob/master/articles/security/azure-security-mvp.md) * Service: **security** * GitHub Login: @barclayn * Microsoft Alias: **barclayn**
True
No longer able to nominate yourself as an MVP - The last paragraph is no longer relevant as the MVP program no longer accepts self nominations & therefore needs to be amended --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3bb1b889-02ca-2b75-11f0-12798ff285b4 * Version Independent ID: b9ffc6b2-5666-d27c-96a2-da58ada50eba * Content: [Azure Security MVP Program](https://docs.microsoft.com/en-us/azure/security/azure-security-mvp) * Content Source: [articles/security/azure-security-mvp.md](https://github.com/Microsoft/azure-docs/blob/master/articles/security/azure-security-mvp.md) * Service: **security** * GitHub Login: @barclayn * Microsoft Alias: **barclayn**
non_test
no longer able to nominate yourself as an mvp the last paragraph is no longer relevant as the mvp program no longer accepts self nominations therefore needs to be amended document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service security github login barclayn microsoft alias barclayn
0
100,945
8,761,424,047
IssuesEvent
2018-12-16 17:14:39
litehelpers/Cordova-sqlite-storage
https://api.github.com/repos/litehelpers/Cordova-sqlite-storage
opened
SQLITE_DBCONFIG_DEFENSIVE flag
bug-data-loss-risk bug-general testing
Extra defense that "prevents ordinary SQL statements from corrupting the database file" (<https://www.sqlite.org/security.html>), as a followup to SQLite 3.26.0 security update (#837). Will be part of the upcoming major release (#773) due to the potential for this to be a breaking change.
1.0
SQLITE_DBCONFIG_DEFENSIVE flag - Extra defense that "prevents ordinary SQL statements from corrupting the database file" (<https://www.sqlite.org/security.html>), as a followup to SQLite 3.26.0 security update (#837). Will be part of the upcoming major release (#773) due to the potential for this to be a breaking change.
test
sqlite dbconfig defensive flag extra defense that prevents ordinary sql statements from corrupting the database file as a followup to sqlite security update will be part of the upcoming major release due to the potential for this to be a breaking change
1
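The defensive flag in the record above is set per-connection via `sqlite3_db_config()` in the SQLite C API; Python 3.12+ exposes the same switch as `Connection.setconfig()` with the `sqlite3.SQLITE_DBCONFIG_DEFENSIVE` constant. A minimal sketch, guarded so it degrades gracefully on interpreters that predate that API:

```python
import sqlite3

def enable_defensive(con):
    """Try to put the connection in defensive mode; return the resulting state.

    Connection.setconfig()/getconfig() and SQLITE_DBCONFIG_DEFENSIVE are
    only available on Python 3.12+, so older interpreters return False
    without attempting the change.
    """
    flag = getattr(sqlite3, "SQLITE_DBCONFIG_DEFENSIVE", None)
    if flag is None or not hasattr(con, "setconfig"):
        return False
    con.setconfig(flag, True)   # forbid ordinary SQL from corrupting the file
    return con.getconfig(flag)

con = sqlite3.connect(":memory:")
print("defensive mode:", enable_defensive(con))
```

With defensive mode on, statements that write directly to internal schema tables (e.g. `sqlite_master`) are rejected, which is the protection the linked sqlite.org security page describes.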
131,130
10,681,525,810
IssuesEvent
2019-10-22 01:11:58
fga-eps-mds/2019.2-Chatbot-Nilo
https://api.github.com/repos/fga-eps-mds/2019.2-Chatbot-Nilo
opened
Remove style test from gitlab-ci
0 - DevOps 1 - Viabilidade Técnica 2 - Integraçã Contínua 2 - Test 4 - Avançado Fix
## Description Problems with the "- flake8 --exclude venv" step: code changes are constantly being hindered by it, so it was removed from the code.
1.0
Remove style test from gitlab-ci - ## Description Problems with the "- flake8 --exclude venv" step: code changes are constantly being hindered by it, so it was removed from the code.
test
remove style test from gitlab ci description problems with the exclude venv step code changes are constantly being hindered by it so it was removed from the code

1
301,920
26,109,186,519
IssuesEvent
2022-12-27 17:07:45
boa-dev/boa
https://api.github.com/repos/boa-dev/boa
closed
Test262 action gives warnings in GitHub actions
bug good first issue test
We are getting these warnings that we should fix in PR Test262 checks. They might become hard errors in the future: ![image](https://user-images.githubusercontent.com/597469/200281684-04960c33-6281-445d-8736-95e6d7817b80.png)
1.0
Test262 action gives warnings in GitHub actions - We are getting these warnings that we should fix in PR Test262 checks. They might become hard errors in the future: ![image](https://user-images.githubusercontent.com/597469/200281684-04960c33-6281-445d-8736-95e6d7817b80.png)
test
action gives warnings in github actions we are getting these warnings that we should fix in pr checks they might become hard errors in the future
1
351,166
25,016,231,630
IssuesEvent
2022-11-03 19:04:09
zowe/zowe-cli
https://api.github.com/repos/zowe/zowe-cli
closed
Update SDK guidelines to mention NPM workspaces
documentation priority-low
Our documentation for SDK guidelines mentions `lerna bootstrap` as the tool that handles monorepo interdependencies. This information is now obsolete and should be updated to reflect that we have migrated from Lerna to NPM workspaces for our dependency management.
1.0
Update SDK guidelines to mention NPM workspaces - Our documentation for SDK guidelines mentions `lerna bootstrap` as the tool that handles monorepo interdependencies. This information is now obsolete and should be updated to reflect that we have migrated from Lerna to NPM workspaces for our dependency management.
non_test
update sdk guidelines to mention npm workspaces our documentation for sdk guidelines mentions lerna bootstrap as the tool that handles monorepo interdependencies this information is now obsolete and should be updated to reflect that we have migrated from lerna to npm workspaces for our dependency management
0
600
8,799,554,964
IssuesEvent
2018-12-24 15:03:04
Cha-OS/colabo
https://api.github.com/repos/Cha-OS/colabo
opened
Registration: existing email
IMPORTANT UX.UsrOnBoard+AvoidUsrErr bug domain:RIMA reliability
Check whether the user's email already exists (offer sign-in and data updating instead); currently we can get several users with the same email.
True
Registration: existing email - Check whether the user's email already exists (offer sign-in and data updating instead); currently we can get several users with the same email.
non_test
registration existing email check if the user s email is already existing offer sign in instead and data updating now we can get several users with the same email
0
340,440
30,515,952,220
IssuesEvent
2023-07-19 03:01:42
d3ward/toolz
https://api.github.com/repos/d3ward/toolz
opened
Adguard 100% Test
passed-test
**Browser: Microsoft Edge** **OS: Windows 11** **Adblock Solution: Adguard Adblocker** **Test Passed Value ( 100% ):** **Description:** Adguard Filter List - [x] AdGuard Base filter - [x] AdGuard Mobile Ads filter - [x] AdGuard URL Tracking filter - [x] AdGuard Spyware filter - [x] Legitimate URL Shortener - [x] AdGuard Annoyances filter - [x] AdGuard Cookie Notices filter - [x] AdGuard Popups filter - [x] AdGuard Mobile App Banners filter - [x] AdGuard Other Annoyances filter - [x] AdGuard Widgets filter - [x] Adblock Warning Removal List - [x] ABPindo - [x] AdBlockID **Custom Filter List** - [x] [HaGeZi's Light DNS Blocklist](https://raw.githubusercontent.com/hagezi/dns-blocklists/main/adblock/light.txt) **Screenshots:** ![image](https://github.com/d3ward/toolz/assets/88470966/61a35009-fc17-4d15-8e18-5826258d0797) **Test Result Data:** ![image](https://github.com/d3ward/toolz/assets/88470966/a54e98e2-0520-4649-aaaf-eb7d92f52e03) **Test Log:** script_ads : true script_pagead : true script_partenrads : true ------------------------- Amazon => n° tests => 5 Google Ads => n° tests => 4 Doubleclick.net => n° tests => 5 Adcolony => n° tests => 4 Media.net => n° tests => 3 Ads => Total n° tests => 21 ------------------------- Google Analytics => n° tests => 5 Hotjar => n° tests => 7 MouseFlow => n° tests => 7 FreshWorks => n° tests => 3 Luckyorange => n° tests => 8 Stats WP Plugin => n° tests => 1 Analytics => Total n° tests => 31 ------------------------- Bugsnag => n° tests => 4 Sentry => n° tests => 2 Error Trackers => Total n° tests => 6 ------------------------- Facebook => n° tests => 2 Twitter => n° tests => 2 LinkedIn => n° tests => 2 Pinterest => n° tests => 5 Reddit => n° tests => 2 YouTube => n° tests => 1 TikTok => n° tests => 7 Social Trackers => Total n° tests => 21 ------------------------- Yahoo => n° tests => 11 Yandex => n° tests => 7 Unity => n° tests => 4 Mix => Total n° tests => 22 ------------------------- Realme => n° tests => 4 Xiaomi => n° tests 
=> 8 Oppo => n° tests => 4 Huawei => n° tests => 6 OnePlus => n° tests => 2 Samsung => n° tests => 5 Apple => n° tests => 10 OEMs => Total n° tests => 39 ------------------------- cosmetic_static_ad : true ------------------------- cosmetic_dynamic_ad : true ------------------------- adtago.s3.amazonaws.com - blocked analyticsengine.s3.amazonaws.com - blocked analytics.s3.amazonaws.com - blocked advice-ads.s3.amazonaws.com - blocked advertising-api-eu.amazon.com - blocked pagead2.googlesyndication.com - blocked adservice.google.com - blocked pagead2.googleadservices.com - blocked afs.googlesyndication.com - blocked stats.g.doubleclick.net - blocked ad.doubleclick.net - blocked static.doubleclick.net - blocked m.doubleclick.net - blocked mediavisor.doubleclick.net - blocked ads30.adcolony.com - blocked adc3-launch.adcolony.com - blocked events3alt.adcolony.com - blocked wd.adcolony.com - blocked static.media.net - blocked media.net - blocked adservetx.media.net - blocked app-measurement.com - blocked analytics.google.com - blocked click.googleanalytics.com - blocked google-analytics.com - blocked ssl.google-analytics.com - blocked adm.hotjar.com - blocked identify.hotjar.com - blocked insights.hotjar.com - blocked script.hotjar.com - blocked surveys.hotjar.com - blocked careers.hotjar.com - blocked events.hotjar.io - blocked mouseflow.com - blocked cdn.mouseflow.com - blocked o2.mouseflow.com - blocked gtm.mouseflow.com - blocked api.mouseflow.com - blocked tools.mouseflow.com - blocked cdn-test.mouseflow.com - blocked freshmarketer.com - blocked claritybt.freshmarketer.com - blocked fwtracks.freshmarketer.com - blocked luckyorange.com - blocked api.luckyorange.com - blocked realtime.luckyorange.com - blocked cdn.luckyorange.com - blocked w1.luckyorange.com - blocked upload.luckyorange.net - blocked cs.luckyorange.net - blocked settings.luckyorange.net - blocked stats.wp.com - blocked notify.bugsnag.com - blocked sessions.bugsnag.com - blocked api.bugsnag.com - 
blocked app.bugsnag.com - blocked browser.sentry-cdn.com - blocked app.getsentry.com - blocked pixel.facebook.com - blocked an.facebook.com - blocked static.ads-twitter.com - blocked ads-api.twitter.com - blocked ads.linkedin.com - blocked analytics.pointdrive.linkedin.com - blocked ads.pinterest.com - blocked log.pinterest.com - blocked analytics.pinterest.com - blocked trk.pinterest.com - blocked widgets.pinterest.com - blocked events.reddit.com - blocked events.redditmedia.com - blocked ads.youtube.com - blocked ads-api.tiktok.com - blocked analytics.tiktok.com - blocked ads-sg.tiktok.com - blocked analytics-sg.tiktok.com - blocked business-api.tiktok.com - blocked ads.tiktok.com - blocked log.byteoversea.com - blocked ads.yahoo.com - blocked analytics.yahoo.com - blocked geo.yahoo.com - blocked udc.yahoo.com - blocked udcm.yahoo.com - blocked advertising.yahoo.com - blocked analytics.query.yahoo.com - blocked partnerads.ysm.yahoo.com - blocked log.fc.yahoo.com - blocked gemini.yahoo.com - blocked adtech.yahooinc.com - blocked extmaps-api.yandex.net - blocked appmetrica.yandex.ru - blocked adfstat.yandex.ru - blocked metrika.yandex.ru - blocked advertising.yandex.ru - blocked offerwall.yandex.net - blocked adfox.yandex.ru - blocked auction.unityads.unity3d.com - blocked webview.unityads.unity3d.com - blocked config.unityads.unity3d.com - blocked adserver.unityads.unity3d.com - blocked iot-eu-logser.realme.com - blocked iot-logser.realme.com - blocked bdapi-ads.realmemobile.com - blocked bdapi-in-ads.realmemobile.com - blocked api.ad.xiaomi.com - blocked data.mistat.xiaomi.com - blocked data.mistat.india.xiaomi.com - blocked data.mistat.rus.xiaomi.com - blocked sdkconfig.ad.xiaomi.com - blocked sdkconfig.ad.intl.xiaomi.com - blocked globalapi.ad.xiaomi.com - blocked tracking.rus.miui.com - blocked adsfs.oppomobile.com - blocked adx.ads.oppomobile.com - blocked ck.ads.oppomobile.com - blocked data.ads.oppomobile.com - blocked metrics.data.hicloud.com - blocked 
metrics2.data.hicloud.com - blocked grs.hicloud.com - blocked logservice.hicloud.com - blocked logservice1.hicloud.com - blocked logbak.hicloud.com - blocked click.oneplus.cn - blocked open.oneplus.net - blocked samsungads.com - blocked smetrics.samsung.com - blocked nmetrics.samsung.com - blocked samsung-com.112.2o7.net - blocked analytics-api.samsunghealthcn.com - blocked advertising.apple.com - blocked tr.iadsdk.apple.com - blocked iadsdk.apple.com - blocked metrics.icloud.com - blocked metrics.apple.com - blocked metrics.mzstatic.com - blocked api-adservices.apple.com - blocked books-analytics-events.apple.com - blocked weather-analytics-events.apple.com - blocked notes-analytics-events.apple.com - blocked ----- Total : 150 Blocked : 150 Not Blocked : 0
1.0
Adguard 100% Test - **Browser: Microsoft Edge** **OS: Windows 11** **Adblock Solution: Adguard Adblocker** **Test Passed Value ( 100% ):** **Description:** Adguard Filter List - [x] AdGuard Base filter - [x] AdGuard Mobile Ads filter - [x] AdGuard URL Tracking filter - [x] AdGuard Spyware filter - [x] Legitimate URL Shortener - [x] AdGuard Annoyances filter - [x] AdGuard Cookie Notices filter - [x] AdGuard Popups filter - [x] AdGuard Mobile App Banners filter - [x] AdGuard Other Annoyances filter - [x] AdGuard Widgets filter - [x] Adblock Warning Removal List - [x] ABPindo - [x] AdBlockID **Custom Filter List** - [x] [HaGeZi's Light DNS Blocklist](https://raw.githubusercontent.com/hagezi/dns-blocklists/main/adblock/light.txt) **Screenshots:** ![image](https://github.com/d3ward/toolz/assets/88470966/61a35009-fc17-4d15-8e18-5826258d0797) **Test Result Data:** ![image](https://github.com/d3ward/toolz/assets/88470966/a54e98e2-0520-4649-aaaf-eb7d92f52e03) **Test Log:** script_ads : true script_pagead : true script_partenrads : true ------------------------- Amazon => n° tests => 5 Google Ads => n° tests => 4 Doubleclick.net => n° tests => 5 Adcolony => n° tests => 4 Media.net => n° tests => 3 Ads => Total n° tests => 21 ------------------------- Google Analytics => n° tests => 5 Hotjar => n° tests => 7 MouseFlow => n° tests => 7 FreshWorks => n° tests => 3 Luckyorange => n° tests => 8 Stats WP Plugin => n° tests => 1 Analytics => Total n° tests => 31 ------------------------- Bugsnag => n° tests => 4 Sentry => n° tests => 2 Error Trackers => Total n° tests => 6 ------------------------- Facebook => n° tests => 2 Twitter => n° tests => 2 LinkedIn => n° tests => 2 Pinterest => n° tests => 5 Reddit => n° tests => 2 YouTube => n° tests => 1 TikTok => n° tests => 7 Social Trackers => Total n° tests => 21 ------------------------- Yahoo => n° tests => 11 Yandex => n° tests => 7 Unity => n° tests => 4 Mix => Total n° tests => 22 ------------------------- Realme => n° tests => 
4 Xiaomi => n° tests => 8 Oppo => n° tests => 4 Huawei => n° tests => 6 OnePlus => n° tests => 2 Samsung => n° tests => 5 Apple => n° tests => 10 OEMs => Total n° tests => 39 ------------------------- cosmetic_static_ad : true ------------------------- cosmetic_dynamic_ad : true ------------------------- adtago.s3.amazonaws.com - blocked analyticsengine.s3.amazonaws.com - blocked analytics.s3.amazonaws.com - blocked advice-ads.s3.amazonaws.com - blocked advertising-api-eu.amazon.com - blocked pagead2.googlesyndication.com - blocked adservice.google.com - blocked pagead2.googleadservices.com - blocked afs.googlesyndication.com - blocked stats.g.doubleclick.net - blocked ad.doubleclick.net - blocked static.doubleclick.net - blocked m.doubleclick.net - blocked mediavisor.doubleclick.net - blocked ads30.adcolony.com - blocked adc3-launch.adcolony.com - blocked events3alt.adcolony.com - blocked wd.adcolony.com - blocked static.media.net - blocked media.net - blocked adservetx.media.net - blocked app-measurement.com - blocked analytics.google.com - blocked click.googleanalytics.com - blocked google-analytics.com - blocked ssl.google-analytics.com - blocked adm.hotjar.com - blocked identify.hotjar.com - blocked insights.hotjar.com - blocked script.hotjar.com - blocked surveys.hotjar.com - blocked careers.hotjar.com - blocked events.hotjar.io - blocked mouseflow.com - blocked cdn.mouseflow.com - blocked o2.mouseflow.com - blocked gtm.mouseflow.com - blocked api.mouseflow.com - blocked tools.mouseflow.com - blocked cdn-test.mouseflow.com - blocked freshmarketer.com - blocked claritybt.freshmarketer.com - blocked fwtracks.freshmarketer.com - blocked luckyorange.com - blocked api.luckyorange.com - blocked realtime.luckyorange.com - blocked cdn.luckyorange.com - blocked w1.luckyorange.com - blocked upload.luckyorange.net - blocked cs.luckyorange.net - blocked settings.luckyorange.net - blocked stats.wp.com - blocked notify.bugsnag.com - blocked sessions.bugsnag.com - blocked 
api.bugsnag.com - blocked app.bugsnag.com - blocked browser.sentry-cdn.com - blocked app.getsentry.com - blocked pixel.facebook.com - blocked an.facebook.com - blocked static.ads-twitter.com - blocked ads-api.twitter.com - blocked ads.linkedin.com - blocked analytics.pointdrive.linkedin.com - blocked ads.pinterest.com - blocked log.pinterest.com - blocked analytics.pinterest.com - blocked trk.pinterest.com - blocked widgets.pinterest.com - blocked events.reddit.com - blocked events.redditmedia.com - blocked ads.youtube.com - blocked ads-api.tiktok.com - blocked analytics.tiktok.com - blocked ads-sg.tiktok.com - blocked analytics-sg.tiktok.com - blocked business-api.tiktok.com - blocked ads.tiktok.com - blocked log.byteoversea.com - blocked ads.yahoo.com - blocked analytics.yahoo.com - blocked geo.yahoo.com - blocked udc.yahoo.com - blocked udcm.yahoo.com - blocked advertising.yahoo.com - blocked analytics.query.yahoo.com - blocked partnerads.ysm.yahoo.com - blocked log.fc.yahoo.com - blocked gemini.yahoo.com - blocked adtech.yahooinc.com - blocked extmaps-api.yandex.net - blocked appmetrica.yandex.ru - blocked adfstat.yandex.ru - blocked metrika.yandex.ru - blocked advertising.yandex.ru - blocked offerwall.yandex.net - blocked adfox.yandex.ru - blocked auction.unityads.unity3d.com - blocked webview.unityads.unity3d.com - blocked config.unityads.unity3d.com - blocked adserver.unityads.unity3d.com - blocked iot-eu-logser.realme.com - blocked iot-logser.realme.com - blocked bdapi-ads.realmemobile.com - blocked bdapi-in-ads.realmemobile.com - blocked api.ad.xiaomi.com - blocked data.mistat.xiaomi.com - blocked data.mistat.india.xiaomi.com - blocked data.mistat.rus.xiaomi.com - blocked sdkconfig.ad.xiaomi.com - blocked sdkconfig.ad.intl.xiaomi.com - blocked globalapi.ad.xiaomi.com - blocked tracking.rus.miui.com - blocked adsfs.oppomobile.com - blocked adx.ads.oppomobile.com - blocked ck.ads.oppomobile.com - blocked data.ads.oppomobile.com - blocked 
metrics.data.hicloud.com - blocked metrics2.data.hicloud.com - blocked grs.hicloud.com - blocked logservice.hicloud.com - blocked logservice1.hicloud.com - blocked logbak.hicloud.com - blocked click.oneplus.cn - blocked open.oneplus.net - blocked samsungads.com - blocked smetrics.samsung.com - blocked nmetrics.samsung.com - blocked samsung-com.112.2o7.net - blocked analytics-api.samsunghealthcn.com - blocked advertising.apple.com - blocked tr.iadsdk.apple.com - blocked iadsdk.apple.com - blocked metrics.icloud.com - blocked metrics.apple.com - blocked metrics.mzstatic.com - blocked api-adservices.apple.com - blocked books-analytics-events.apple.com - blocked weather-analytics-events.apple.com - blocked notes-analytics-events.apple.com - blocked ----- Total : 150 Blocked : 150 Not Blocked : 0
test
adguard test browser microsoft edge os windows adblock solution adguard adblocker test passed value description adguard filter list adguard base filter adguard mobile ads filter adguard url tracking filter adguard spyware filter legitimate url shortener adguard annoyances filter adguard cookie notices filter adguard popups filter adguard mobile app banners filter adguard other annoyances filter adguard widgets filter adblock warning removal list abpindo adblockid custom filter list screenshots test result data test log script ads true script pagead true script partenrads true amazon n° tests google ads n° tests doubleclick net n° tests adcolony n° tests media net n° tests ads total n° tests google analytics n° tests hotjar n° tests mouseflow n° tests freshworks n° tests luckyorange n° tests stats wp plugin n° tests analytics total n° tests bugsnag n° tests sentry n° tests error trackers total n° tests facebook n° tests twitter n° tests linkedin n° tests pinterest n° tests reddit n° tests youtube n° tests tiktok n° tests social trackers total n° tests yahoo n° tests yandex n° tests unity n° tests mix total n° tests realme n° tests xiaomi n° tests oppo n° tests huawei n° tests oneplus n° tests samsung n° tests apple n° tests oems total n° tests cosmetic static ad true cosmetic dynamic ad true adtago amazonaws com blocked analyticsengine amazonaws com blocked analytics amazonaws com blocked advice ads amazonaws com blocked advertising api eu amazon com blocked googlesyndication com blocked adservice google com blocked googleadservices com blocked afs googlesyndication com blocked stats g doubleclick net blocked ad doubleclick net blocked static doubleclick net blocked m doubleclick net blocked mediavisor doubleclick net blocked adcolony com blocked launch adcolony com blocked adcolony com blocked wd adcolony com blocked static media net blocked media net blocked adservetx media net blocked app measurement com blocked analytics google com blocked click googleanalytics 
com blocked google analytics com blocked ssl google analytics com blocked adm hotjar com blocked identify hotjar com blocked insights hotjar com blocked script hotjar com blocked surveys hotjar com blocked careers hotjar com blocked events hotjar io blocked mouseflow com blocked cdn mouseflow com blocked mouseflow com blocked gtm mouseflow com blocked api mouseflow com blocked tools mouseflow com blocked cdn test mouseflow com blocked freshmarketer com blocked claritybt freshmarketer com blocked fwtracks freshmarketer com blocked luckyorange com blocked api luckyorange com blocked realtime luckyorange com blocked cdn luckyorange com blocked luckyorange com blocked upload luckyorange net blocked cs luckyorange net blocked settings luckyorange net blocked stats wp com blocked notify bugsnag com blocked sessions bugsnag com blocked api bugsnag com blocked app bugsnag com blocked browser sentry cdn com blocked app getsentry com blocked pixel facebook com blocked an facebook com blocked static ads twitter com blocked ads api twitter com blocked ads linkedin com blocked analytics pointdrive linkedin com blocked ads pinterest com blocked log pinterest com blocked analytics pinterest com blocked trk pinterest com blocked widgets pinterest com blocked events reddit com blocked events redditmedia com blocked ads youtube com blocked ads api tiktok com blocked analytics tiktok com blocked ads sg tiktok com blocked analytics sg tiktok com blocked business api tiktok com blocked ads tiktok com blocked log byteoversea com blocked ads yahoo com blocked analytics yahoo com blocked geo yahoo com blocked udc yahoo com blocked udcm yahoo com blocked advertising yahoo com blocked analytics query yahoo com blocked partnerads ysm yahoo com blocked log fc yahoo com blocked gemini yahoo com blocked adtech yahooinc com blocked extmaps api yandex net blocked appmetrica yandex ru blocked adfstat yandex ru blocked metrika yandex ru blocked advertising yandex ru blocked offerwall yandex net 
blocked adfox yandex ru blocked auction unityads com blocked webview unityads com blocked config unityads com blocked adserver unityads com blocked iot eu logser realme com blocked iot logser realme com blocked bdapi ads realmemobile com blocked bdapi in ads realmemobile com blocked api ad xiaomi com blocked data mistat xiaomi com blocked data mistat india xiaomi com blocked data mistat rus xiaomi com blocked sdkconfig ad xiaomi com blocked sdkconfig ad intl xiaomi com blocked globalapi ad xiaomi com blocked tracking rus miui com blocked adsfs oppomobile com blocked adx ads oppomobile com blocked ck ads oppomobile com blocked data ads oppomobile com blocked metrics data hicloud com blocked data hicloud com blocked grs hicloud com blocked logservice hicloud com blocked hicloud com blocked logbak hicloud com blocked click oneplus cn blocked open oneplus net blocked samsungads com blocked smetrics samsung com blocked nmetrics samsung com blocked samsung com net blocked analytics api samsunghealthcn com blocked advertising apple com blocked tr iadsdk apple com blocked iadsdk apple com blocked metrics icloud com blocked metrics apple com blocked metrics mzstatic com blocked api adservices apple com blocked books analytics events apple com blocked weather analytics events apple com blocked notes analytics events apple com blocked total blocked not blocked
1
97,910
8,673,294,139
IssuesEvent
2018-11-30 01:41:56
backdrop/backdrop-issues
https://api.github.com/repos/backdrop/backdrop-issues
closed
[DX] Use WATCHDOG_* constants instead of numerical values
pr - reviewed & tested by the community status - has pull request type - task
When reading through code, I often find it annoying to have numerical values instead of plain constants. Following PR will replace any occurence of watchdog related value with the proper constant string. ---- PR by @opi https://github.com/backdrop/backdrop/pull/2327
1.0
[DX] Use WATCHDOG_* constants instead of numerical values - When reading through code, I often find it annoying to have numerical values instead of plain constants. Following PR will replace any occurence of watchdog related value with the proper constant string. ---- PR by @opi https://github.com/backdrop/backdrop/pull/2327
test
use watchdog constants instead of numerical values when reading through code i often find it annoying to have numerical values instead of plain constants following pr will replace any occurence of watchdog related value with the proper constant string pr by opi
1
2,384
2,611,135,932
IssuesEvent
2015-02-27 01:07:53
phetsims/least-squares-regression
https://api.github.com/repos/phetsims/least-squares-regression
closed
use of white boxes?
design
While investigating #25, these design-related questions came to mind, and could affect how we resolve #25 (equation layout). (1) Why is the "best fit" line shown with a white box (sun.Panel) around it, like the line in the "My Line" control panel? It makes it look like a text field, sending the cue that the "best fit" line is dynamic/editable (like My Line), which it is not. (2) Ditto for the `r = value` in the Correlation Coefficient box. This panel-within-a-panel doesn't seem to fill any purpose, and in fact sends the wrong cue that it's and editable text field. Assigning to @bereaphysics to discuss with the designer.
1.0
use of white boxes? - While investigating #25, these design-related questions came to mind, and could affect how we resolve #25 (equation layout). (1) Why is the "best fit" line shown with a white box (sun.Panel) around it, like the line in the "My Line" control panel? It makes it look like a text field, sending the cue that the "best fit" line is dynamic/editable (like My Line), which it is not. (2) Ditto for the `r = value` in the Correlation Coefficient box. This panel-within-a-panel doesn't seem to fill any purpose, and in fact sends the wrong cue that it's and editable text field. Assigning to @bereaphysics to discuss with the designer.
non_test
use of white boxes while investigating these design related questions came to mind and could affect how we resolve equation layout why is the best fit line shown with a white box sun panel around it like the line in the my line control panel it makes it look like a text field sending the cue that the best fit line is dynamic editable like my line which it is not ditto for the r value in the correlation coefficient box this panel within a panel doesn t seem to fill any purpose and in fact sends the wrong cue that it s and editable text field assigning to bereaphysics to discuss with the designer
0
156,241
19,843,949,704
IssuesEvent
2022-01-21 02:28:02
harrinry/zaproxy
https://api.github.com/repos/harrinry/zaproxy
opened
CVE-2021-23369 (High) detected in handlebars-4.2.0.jar
security vulnerability
## CVE-2021-23369 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.2.0.jar</b></p></summary> <p>Logic-less and semantic templates with Java</p> <p>Library home page: <a href="https://github.com/jknack/handlebars.java">https://github.com/jknack/handlebars.java</a></p> <p>Path to dependency file: /zap/zap.gradle.kts</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.github.jknack/handlebars/4.2.0/116306614fd4d250af27fe1ef48665e7830fc10b/handlebars-4.2.0.jar</p> <p> Dependency Hierarchy: - wiremock-jre8-2.31.0.jar (Root Library) - :x: **handlebars-4.2.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/harrinry/zaproxy/commit/6c702dc201587758d9c83537d740cfeda5b4c806">6c702dc201587758d9c83537d740cfeda5b4c806</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted source. 
<p>Publish Date: 2021-04-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369>CVE-2021-23369</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369</a></p> <p>Release Date: 2021-04-12</p> <p>Fix Resolution: com.github.jknack:handlebars:4.2.0, handlebars - 4.7.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.github.jknack","packageName":"handlebars","packageVersion":"4.2.0","packageFilePaths":["/zap/zap.gradle.kts"],"isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock-jre8:2.31.0;com.github.jknack:handlebars:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.github.jknack:handlebars:4.2.0, handlebars - 4.7.7","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-23369","vulnerabilityDetails":"The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted 
source.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2021-23369 (High) detected in handlebars-4.2.0.jar - ## CVE-2021-23369 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.2.0.jar</b></p></summary> <p>Logic-less and semantic templates with Java</p> <p>Library home page: <a href="https://github.com/jknack/handlebars.java">https://github.com/jknack/handlebars.java</a></p> <p>Path to dependency file: /zap/zap.gradle.kts</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.github.jknack/handlebars/4.2.0/116306614fd4d250af27fe1ef48665e7830fc10b/handlebars-4.2.0.jar</p> <p> Dependency Hierarchy: - wiremock-jre8-2.31.0.jar (Root Library) - :x: **handlebars-4.2.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/harrinry/zaproxy/commit/6c702dc201587758d9c83537d740cfeda5b4c806">6c702dc201587758d9c83537d740cfeda5b4c806</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted source. 
<p>Publish Date: 2021-04-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369>CVE-2021-23369</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23369</a></p> <p>Release Date: 2021-04-12</p> <p>Fix Resolution: com.github.jknack:handlebars:4.2.0, handlebars - 4.7.7</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.github.jknack","packageName":"handlebars","packageVersion":"4.2.0","packageFilePaths":["/zap/zap.gradle.kts"],"isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock-jre8:2.31.0;com.github.jknack:handlebars:4.2.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.github.jknack:handlebars:4.2.0, handlebars - 4.7.7","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-23369","vulnerabilityDetails":"The package handlebars before 4.7.7 are vulnerable to Remote Code Execution (RCE) when selecting certain compiling options to compile templates coming from an untrusted 
source.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23369","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_test
cve high detected in handlebars jar cve high severity vulnerability vulnerable library handlebars jar logic less and semantic templates with java library home page a href path to dependency file zap zap gradle kts path to vulnerable library home wss scanner gradle caches modules files com github jknack handlebars handlebars jar dependency hierarchy wiremock jar root library x handlebars jar vulnerable library found in head commit a href found in base branch main vulnerability details the package handlebars before are vulnerable to remote code execution rce when selecting certain compiling options to compile templates coming from an untrusted source publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com github jknack handlebars handlebars isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com github tomakehurst wiremock com github jknack handlebars isminimumfixversionavailable true minimumfixversion com github jknack handlebars handlebars isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails the package handlebars before are vulnerable to remote code execution rce when selecting certain compiling options to compile templates coming from an untrusted source vulnerabilityurl
0
263,909
23,089,183,106
IssuesEvent
2022-07-26 13:57:06
threefoldtech/twin_aydo
https://api.github.com/repos/threefoldtech/twin_aydo
opened
[Whisper]: remove markdown from message preview
type_bug acceptance testing
**Describe the bug** Remove markdown signs from message preview **To Reproduce** Steps to reproduce the behavior: 1. Go to 'whisper' 2. Click on 'any chat' 3. Send markdown 4. See markdown signs in preview **Expected behavior** Should just display message body string without markdown signs. **Screenshots** ![unknown](https://user-images.githubusercontent.com/47602885/181023874-6602574a-0f1a-4453-ac4d-0e07338bf39d.png)
1.0
[Whisper]: remove markdown from message preview - **Describe the bug** Remove markdown signs from message preview **To Reproduce** Steps to reproduce the behavior: 1. Go to 'whisper' 2. Click on 'any chat' 3. Send markdown 4. See markdown signs in preview **Expected behavior** Should just display message body string without markdown signs. **Screenshots** ![unknown](https://user-images.githubusercontent.com/47602885/181023874-6602574a-0f1a-4453-ac4d-0e07338bf39d.png)
test
remove markdown from message preview describe the bug remove markdown signs from message preview to reproduce steps to reproduce the behavior go to whisper click on any chat send markdown see markdown signs in preview expected behavior should just display message body string without markdown signs screenshots
1
99,285
12,415,041,716
IssuesEvent
2020-05-22 15:31:48
cityofaustin/techstack
https://api.github.com/repos/cityofaustin/techstack
closed
Dev Design sync 5/19/20
Meeting Team: Design + Research Team: Dev
- [ ] From devs: - [x] What's our handoff flow? Answer: Chase will assign issue to "Team:dev", remove "Team:design", Move into "On deck" lane, when handoff issue is ready for devs. From design: - [ ] topics/topic collection flow from create modal - [ ] OPO joplin container and topic collection container
1.0
Dev Design sync 5/19/20 - - [ ] From devs: - [x] What's our handoff flow? Answer: Chase will assign issue to "Team:dev", remove "Team:design", Move into "On deck" lane, when handoff issue is ready for devs. From design: - [ ] topics/topic collection flow from create modal - [ ] OPO joplin container and topic collection container
non_test
dev design sync from devs what s our handoff flow answer chase will assign issue to team dev remove team design move into on deck lane when handoff issue is ready for devs from design topics topic collection flow from create modal opo joplin container and topic collection container
0
30,876
4,218,235,002
IssuesEvent
2016-06-30 15:23:49
SethClydesdale/forumactif-edge
https://api.github.com/repos/SethClydesdale/forumactif-edge
closed
Sticky Footer
css design
**Post :** http://themedesign.forumotion.com/t24p25-suggestions-feedback#188 Currently the footer does not stick to the bottom of the page when there's little content. This is something that will need to be addressed so that the footer is always accessible at the absolute bottom. **Page where this occurs :** http://themedesign.forumotion.com/search?search_keywords=1
1.0
Sticky Footer - **Post :** http://themedesign.forumotion.com/t24p25-suggestions-feedback#188 Currently the footer does not stick to the bottom of the page when there's little content. This is something that will need to be addressed so that the footer is always accessible at the absolute bottom. **Page where this occurs :** http://themedesign.forumotion.com/search?search_keywords=1
non_test
sticky footer post currently the footer does not stick to the bottom of the page when there s little content this is something that will need to be addressed so that the footer is always accessible at the absolute bottom page where this occurs
0
1,640
3,297,030,482
IssuesEvent
2015-11-02 04:59:34
t3kt/vjzual2
https://api.github.com/repos/t3kt/vjzual2
opened
source selector menu doesn't update when parameter is set externally
bug infrastructure ui
this is relevant for soloing (see #15)
1.0
source selector menu doesn't update when parameter is set externally - this is relevant for soloing (see #15)
non_test
source selector menu doesn t update when parameter is set externally this is relevant for soloing see
0
331,701
29,047,923,561
IssuesEvent
2023-05-13 20:17:50
IntellectualSites/FastAsyncWorldEdit
https://api.github.com/repos/IntellectualSites/FastAsyncWorldEdit
opened
//copy - java.nio.channels.OverlappingFileLockExcption: null
Requires Testing
### Server Implementation Paper ### Server Version 1.19.4 ### Describe the bug If I copy something with //copy, then I am not on the server for a while and join again and try to copy something this error appears ### To Reproduce 1. Join Server 2. Copy Selection 3. Go offline for around 30-60min 4. Go Online 5. Copy again 6. Error 7. Restart Server 8. all wokrs ### Expected behaviour Fix ### Screenshots / Videos ![Screenshot_4](https://github.com/IntellectualSites/FastAsyncWorldEdit/assets/88806692/bf56775f-e530-44ee-9f8f-4e60804c897c) ![Screenshot_5](https://github.com/IntellectualSites/FastAsyncWorldEdit/assets/88806692/46ff3e9c-a61f-4524-a429-81d5f9484e02) ### Error log (if applicable) _No response_ ### Fawe Debugpaste https://athion.net/ISPaster/paste/view/93dc8edd01cc4565aef94f6f82ca60b9 ### Fawe Version FAWE-1.20 ### Checklist - [X] I have included a Fawe debugpaste. - [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists. ### Anything else? _No response_
1.0
//copy - java.nio.channels.OverlappingFileLockExcption: null - ### Server Implementation Paper ### Server Version 1.19.4 ### Describe the bug If I copy something with //copy, then I am not on the server for a while and join again and try to copy something this error appears ### To Reproduce 1. Join Server 2. Copy Selection 3. Go offline for around 30-60min 4. Go Online 5. Copy again 6. Error 7. Restart Server 8. all wokrs ### Expected behaviour Fix ### Screenshots / Videos ![Screenshot_4](https://github.com/IntellectualSites/FastAsyncWorldEdit/assets/88806692/bf56775f-e530-44ee-9f8f-4e60804c897c) ![Screenshot_5](https://github.com/IntellectualSites/FastAsyncWorldEdit/assets/88806692/46ff3e9c-a61f-4524-a429-81d5f9484e02) ### Error log (if applicable) _No response_ ### Fawe Debugpaste https://athion.net/ISPaster/paste/view/93dc8edd01cc4565aef94f6f82ca60b9 ### Fawe Version FAWE-1.20 ### Checklist - [X] I have included a Fawe debugpaste. - [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists. ### Anything else? _No response_
test
copy java nio channels overlappingfilelockexcption null server implementation paper server version describe the bug if i copy something with copy then i am not on the server for a while and join again and try to copy something this error appears to reproduce join server copy selection go offline for around go online copy again error restart server all wokrs expected behaviour fix screenshots videos error log if applicable no response fawe debugpaste fawe version fawe checklist i have included a fawe debugpaste i am using the newest build from and the issue still persists anything else no response
1
37,587
15,340,927,219
IssuesEvent
2021-02-27 09:32:48
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
The speech service returns an error. (Python)
Pri3 cognitive-services/svc cxp product-question speech-service/subsvc triaged
Previously, I sent for processing a voice recording in the .mp3 and .wav format and everything worked, I received a phrase back in the form of text. Now, when sending in .mp3 format, an error is returned in the console: **Process finished with exit code -1073741819 (0xC0000005)** at the same time when i send .wav everything works --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 095c74d0-66b3-9cd7-4588-291bf4f56a47 * Version Independent ID: 8125527b-407f-c34f-57e8-fea89db144e8 * Content: [Stream codec compressed audio with the Speech SDK - Speech service - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-use-codec-compressed-audio-input-streams?tabs=debian&pivots=programming-language-python) * Content Source: [articles/cognitive-services/Speech-Service/how-to-use-codec-compressed-audio-input-streams.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/cognitive-services/Speech-Service/how-to-use-codec-compressed-audio-input-streams.md) * Service: **cognitive-services** * Sub-service: **speech-service** * GitHub Login: @amitkumarshukla * Microsoft Alias: **amishu**
2.0
The speech service returns an error. (Python) - Previously, I sent for processing a voice recording in the .mp3 and .wav format and everything worked, I received a phrase back in the form of text. Now, when sending in .mp3 format, an error is returned in the console: **Process finished with exit code -1073741819 (0xC0000005)** at the same time when i send .wav everything works --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 095c74d0-66b3-9cd7-4588-291bf4f56a47 * Version Independent ID: 8125527b-407f-c34f-57e8-fea89db144e8 * Content: [Stream codec compressed audio with the Speech SDK - Speech service - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/speech-service/how-to-use-codec-compressed-audio-input-streams?tabs=debian&pivots=programming-language-python) * Content Source: [articles/cognitive-services/Speech-Service/how-to-use-codec-compressed-audio-input-streams.md](https://github.com/MicrosoftDocs/azure-docs/blob/master/articles/cognitive-services/Speech-Service/how-to-use-codec-compressed-audio-input-streams.md) * Service: **cognitive-services** * Sub-service: **speech-service** * GitHub Login: @amitkumarshukla * Microsoft Alias: **amishu**
non_test
the speech service returns an error python previously i sent for processing a voice recording in the and wav format and everything worked i received a phrase back in the form of text now when sending in format an error is returned in the console process finished with exit code at the same time when i send wav everything works document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cognitive services sub service speech service github login amitkumarshukla microsoft alias amishu
0
63,566
6,849,885,751
IssuesEvent
2017-11-14 00:04:29
Iridescent-CM/technovation-app
https://api.github.com/repos/Iridescent-CM/technovation-app
closed
RA/Admin Team filter: "teams with/without mentors"
4 - Test <= 8 added during sprint [sprint topic] admin / RA
Login as the admin Go to Teams Verify that you can search by teams with or without mentors --- Go to Participants Find an RA Login as the RA Verify the same steps above for RAs <!--- @huboard:{"order":3.680602004182461e-36,"milestone_order":7.915016439424366e-45,"custom_state":""} -->
1.0
RA/Admin Team filter: "teams with/without mentors" - Login as the admin Go to Teams Verify that you can search by teams with or without mentors --- Go to Participants Find an RA Login as the RA Verify the same steps above for RAs <!--- @huboard:{"order":3.680602004182461e-36,"milestone_order":7.915016439424366e-45,"custom_state":""} -->
test
ra admin team filter teams with without mentors login as the admin go to teams verify that you can search by teams with or without mentors go to participants find an ra login as the ra verify the same steps above for ras huboard order milestone order custom state
1
170,047
13,171,523,144
IssuesEvent
2020-08-11 16:48:59
istio/istio
https://api.github.com/repos/istio/istio
opened
Improve release notes tooling
area/test and release
(This is used to request new product features, please visit <https://discuss.istio.io> for questions on using Istio) **Describe the feature request** **Describe alternatives you've considered** **Affected product area (please put an X in all that apply)** [ ] Docs [ ] Installation [ ] Networking [ ] Performance and Scalability [ ] Extensions and Telemetry [ ] Security [ ] Test and Release [ ] User Experience [ ] Developer Infrastructure **Affected features (please put an X in all that apply)** [ ] Multi Cluster [ ] Virtual Machine [ ] Multi Control Plane **Additional context**
1.0
Improve release notes tooling - (This is used to request new product features, please visit <https://discuss.istio.io> for questions on using Istio) **Describe the feature request** **Describe alternatives you've considered** **Affected product area (please put an X in all that apply)** [ ] Docs [ ] Installation [ ] Networking [ ] Performance and Scalability [ ] Extensions and Telemetry [ ] Security [ ] Test and Release [ ] User Experience [ ] Developer Infrastructure **Affected features (please put an X in all that apply)** [ ] Multi Cluster [ ] Virtual Machine [ ] Multi Control Plane **Additional context**
test
improve release notes tooling this is used to request new product features please visit for questions on using istio describe the feature request describe alternatives you ve considered affected product area please put an x in all that apply docs installation networking performance and scalability extensions and telemetry security test and release user experience developer infrastructure affected features please put an x in all that apply multi cluster virtual machine multi control plane additional context
1
174,218
27,595,096,041
IssuesEvent
2023-03-09 05:23:14
codestates-seb/seb42_main_013
https://api.github.com/repos/codestates-seb/seb42_main_013
opened
[FE] 관리 페이지 CSS
FE design
### 만들고자 하는 기능 - 약 관리 페이지 ### 해당 기능을 구현하기 위해 할 일 - [ ] 탭 컴포넌트 - [ ] 필터 컴포넌트 - [ ] 약 리스트 컴포넌트 ### 작업 완료 예상 날짜 2023-03-29 ### 기타 사항(선택) - 추가로 작성할 내용 있으면 작성
1.0
[FE] 관리 페이지 CSS - ### 만들고자 하는 기능 - 약 관리 페이지 ### 해당 기능을 구현하기 위해 할 일 - [ ] 탭 컴포넌트 - [ ] 필터 컴포넌트 - [ ] 약 리스트 컴포넌트 ### 작업 완료 예상 날짜 2023-03-29 ### 기타 사항(선택) - 추가로 작성할 내용 있으면 작성
non_test
관리 페이지 css 만들고자 하는 기능 약 관리 페이지 해당 기능을 구현하기 위해 할 일 탭 컴포넌트 필터 컴포넌트 약 리스트 컴포넌트 작업 완료 예상 날짜 기타 사항 선택 추가로 작성할 내용 있으면 작성
0
191,229
14,593,582,755
IssuesEvent
2020-12-19 23:43:31
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
mongodb-labs/mongoreplay: mongoreplay/filter_test.go; 73 LoC
fresh medium test
Found a possible issue in [mongodb-labs/mongoreplay](https://www.github.com/mongodb-labs/mongoreplay) at [mongoreplay/filter_test.go](https://github.com/mongodb-labs/mongoreplay/blob/0ba3daf929315d59daf3b98ee9b5ca399324993d/mongoreplay/filter_test.go#L57-L129) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable c used in defer or goroutine at line 72 [Click here to see the code in its original context.](https://github.com/mongodb-labs/mongoreplay/blob/0ba3daf929315d59daf3b98ee9b5ca399324993d/mongoreplay/filter_test.go#L57-L129) <details> <summary>Click here to show the 73 line(s) of Go which triggered the analyzer.</summary> ```go for _, c := range cases { t.Logf("running case: %s\n", c.name) // make an iowriter that just buffers b := &bytes.Buffer{} bufferFile := NopWriteCloser(b) playbackWriter, err := playbackFileWriterFromWriteCloser(bufferFile, "file", PlaybackFileMetadata{}) if err != nil { t.Fatalf("couldn't create playbackfile writer %v", err) } // start a goroutine to write recorded ops to the opChan generator := newRecordedOpGenerator() go func() { defer close(generator.opChan) t.Logf("Generating %d inserts\n", c.numInsertsToGenerate) err := generator.generateInsertHelper("insert", 0, c.numInsertsToGenerate) if err != nil { t.Error(err) } t.Log("Generating driver ops") for _, opName := range c.driverOpsToGenerate { err = generator.generateCommandOp(opName, bson.D{}, 123) if err != nil { t.Error(err) } } }() skipConf := newSkipConfig(c.shouldRemoveDriverOps, time.Time{}, 0*time.Second) // run Filter to remove the driver op from the file if err := Filter(generator.opChan, []*PlaybackFileWriter{playbackWriter}, skipConf); err != nil { t.Error(err) } rs := bytes.NewReader(b.Bytes()) // open a reader into the written output playbackReader, err := 
playbackFileReaderFromReadSeeker(rs, "") if err != nil { t.Fatalf("couldn't create playbackfile reader %v", err) } opChan, errChan := playbackReader.OpChan(1) // loop over the found operations and verify that the correct number and // types of operations are found numOpsFound := 0 numDriverOpsFound := 0 for op := range opChan { numOpsFound++ parsedOp, err := op.RawOp.Parse() if err != nil { t.Error(err) } if IsDriverOp(parsedOp) { numDriverOpsFound++ } } if c.shouldRemoveDriverOps && numDriverOpsFound > 0 { t.Errorf("expected to have removed driver ops but instead found %d", numDriverOpsFound) } if c.numOpsExpectedAfterFilter != numOpsFound { t.Errorf("expected to have found %d total ops after filter but instead found %d", c.numOpsExpectedAfterFilter, numOpsFound) } err = <-errChan if err != io.EOF { t.Errorf("should have eof at end, but got %v", err) } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 0ba3daf929315d59daf3b98ee9b5ca399324993d
1.0
mongodb-labs/mongoreplay: mongoreplay/filter_test.go; 73 LoC - Found a possible issue in [mongodb-labs/mongoreplay](https://www.github.com/mongodb-labs/mongoreplay) at [mongoreplay/filter_test.go](https://github.com/mongodb-labs/mongoreplay/blob/0ba3daf929315d59daf3b98ee9b5ca399324993d/mongoreplay/filter_test.go#L57-L129) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable c used in defer or goroutine at line 72 [Click here to see the code in its original context.](https://github.com/mongodb-labs/mongoreplay/blob/0ba3daf929315d59daf3b98ee9b5ca399324993d/mongoreplay/filter_test.go#L57-L129) <details> <summary>Click here to show the 73 line(s) of Go which triggered the analyzer.</summary> ```go for _, c := range cases { t.Logf("running case: %s\n", c.name) // make an iowriter that just buffers b := &bytes.Buffer{} bufferFile := NopWriteCloser(b) playbackWriter, err := playbackFileWriterFromWriteCloser(bufferFile, "file", PlaybackFileMetadata{}) if err != nil { t.Fatalf("couldn't create playbackfile writer %v", err) } // start a goroutine to write recorded ops to the opChan generator := newRecordedOpGenerator() go func() { defer close(generator.opChan) t.Logf("Generating %d inserts\n", c.numInsertsToGenerate) err := generator.generateInsertHelper("insert", 0, c.numInsertsToGenerate) if err != nil { t.Error(err) } t.Log("Generating driver ops") for _, opName := range c.driverOpsToGenerate { err = generator.generateCommandOp(opName, bson.D{}, 123) if err != nil { t.Error(err) } } }() skipConf := newSkipConfig(c.shouldRemoveDriverOps, time.Time{}, 0*time.Second) // run Filter to remove the driver op from the file if err := Filter(generator.opChan, []*PlaybackFileWriter{playbackWriter}, skipConf); err != nil { t.Error(err) } rs := bytes.NewReader(b.Bytes()) // open a reader into the 
written output playbackReader, err := playbackFileReaderFromReadSeeker(rs, "") if err != nil { t.Fatalf("couldn't create playbackfile reader %v", err) } opChan, errChan := playbackReader.OpChan(1) // loop over the found operations and verify that the correct number and // types of operations are found numOpsFound := 0 numDriverOpsFound := 0 for op := range opChan { numOpsFound++ parsedOp, err := op.RawOp.Parse() if err != nil { t.Error(err) } if IsDriverOp(parsedOp) { numDriverOpsFound++ } } if c.shouldRemoveDriverOps && numDriverOpsFound > 0 { t.Errorf("expected to have removed driver ops but instead found %d", numDriverOpsFound) } if c.numOpsExpectedAfterFilter != numOpsFound { t.Errorf("expected to have found %d total ops after filter but instead found %d", c.numOpsExpectedAfterFilter, numOpsFound) } err = <-errChan if err != io.EOF { t.Errorf("should have eof at end, but got %v", err) } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 0ba3daf929315d59daf3b98ee9b5ca399324993d
test
mongodb labs mongoreplay mongoreplay filter test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable c used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for c range cases t logf running case s n c name make an iowriter that just buffers b bytes buffer bufferfile nopwritecloser b playbackwriter err playbackfilewriterfromwritecloser bufferfile file playbackfilemetadata if err nil t fatalf couldn t create playbackfile writer v err start a goroutine to write recorded ops to the opchan generator newrecordedopgenerator go func defer close generator opchan t logf generating d inserts n c numinsertstogenerate err generator generateinserthelper insert c numinsertstogenerate if err nil t error err t log generating driver ops for opname range c driveropstogenerate err generator generatecommandop opname bson d if err nil t error err skipconf newskipconfig c shouldremovedriverops time time time second run filter to remove the driver op from the file if err filter generator opchan playbackfilewriter playbackwriter skipconf err nil t error err rs bytes newreader b bytes open a reader into the written output playbackreader err playbackfilereaderfromreadseeker rs if err nil t fatalf couldn t create playbackfile reader v err opchan errchan playbackreader opchan loop over the found operations and verify that the correct number and types of operations are found numopsfound numdriveropsfound for op range opchan numopsfound parsedop err op rawop parse if err nil t error err if isdriverop parsedop numdriveropsfound if c shouldremovedriverops numdriveropsfound t errorf expected to have removed driver ops but instead found d numdriveropsfound if c numopsexpectedafterfilter numopsfound t errorf expected to have found d total 
ops after filter but instead found d c numopsexpectedafterfilter numopsfound err errchan if err io eof t errorf should have eof at end but got v err leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
593,395
17,989,056,935
IssuesEvent
2021-09-15 01:55:30
shoepro/server
https://api.github.com/repos/shoepro/server
closed
[Feat]: Create faq flow
2.5h Server Feat Priority: Middle
### ISSUE - Group: `Server` - Type: `Feat` - Time: `2.5h` - Priority: `Middle` ### TODO 1. [x] Create and Define FaqController 2. [x] Create and Define FaqService 3. [x] Create and Define FaqRepository 4. [x] Create and Define FaqModel 5. [x] Create and Define FaqDTOs 6. [x] Create and Define FaqResponseInterceptors
1.0
[Feat]: Create faq flow - ### ISSUE - Group: `Server` - Type: `Feat` - Time: `2.5h` - Priority: `Middle` ### TODO 1. [x] Create and Define FaqController 2. [x] Create and Define FaqService 3. [x] Create and Define FaqRepository 4. [x] Create and Define FaqModel 5. [x] Create and Define FaqDTOs 6. [x] Create and Define FaqResponseInterceptors
non_test
create faq flow issue group server type feat time priority middle todo create and define faqcontroller create and define faqservice create and define faqrepository create and define faqmodel create and define faqdtos create and define faqresponseinterceptors
0
340,998
10,281,356,555
IssuesEvent
2019-08-26 08:18:28
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
outlook.live.com - site is not usable
browser-firefox engine-gecko priority-critical
<!-- @browser: Firefox 64.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0 --> <!-- @reported_with: desktop-reporter --> **URL**: https://outlook.live.com/mail/ **Browser / Version**: Firefox 64.0 **Operating System**: Windows 10 **Tested Another Browser**: No **Problem type**: Site is not usable **Description**: la pagina è bianca **Steps to Reproduce**: <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20181206201918</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: release</li> </ul> <p>Console Messages:</p> <pre> [u'[JavaScript Warning: "Content Security Policy: Impossibile elaborare la direttiva sconosciuta prefetch-src"]', u'[JavaScript Warning: "Content Security Policy: La direttiva child-src deprecata. Al suo posto utilizzare la direttiva worker-src per gestire i worker, oppure la direttiva frame-src per controllare i frame."]', u'[JavaScript Error: "Non stata dichiarata la codifica caratteri del documento HTML. Il documento verr visualizzato con del testo incomprensibile in alcune configurazioni del browser nel caso in cui contenga dei caratteri al di fuori dellintervallo US-ASCII. La codifica caratteri di una pagina deve essere dichiarata nel documento o nel protocollo di trasferimento." {file: "https://outlook.live.com/mail/" line: 0}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
outlook.live.com - site is not usable - <!-- @browser: Firefox 64.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:64.0) Gecko/20100101 Firefox/64.0 --> <!-- @reported_with: desktop-reporter --> **URL**: https://outlook.live.com/mail/ **Browser / Version**: Firefox 64.0 **Operating System**: Windows 10 **Tested Another Browser**: No **Problem type**: Site is not usable **Description**: la pagina è bianca **Steps to Reproduce**: <details> <summary>Browser Configuration</summary> <ul> <li>mixed active content blocked: false</li><li>image.mem.shared: true</li><li>buildID: 20181206201918</li><li>tracking content blocked: false</li><li>gfx.webrender.blob-images: true</li><li>hasTouchScreen: false</li><li>mixed passive content blocked: false</li><li>gfx.webrender.enabled: false</li><li>gfx.webrender.all: false</li><li>channel: release</li> </ul> <p>Console Messages:</p> <pre> [u'[JavaScript Warning: "Content Security Policy: Impossibile elaborare la direttiva sconosciuta prefetch-src"]', u'[JavaScript Warning: "Content Security Policy: La direttiva child-src deprecata. Al suo posto utilizzare la direttiva worker-src per gestire i worker, oppure la direttiva frame-src per controllare i frame."]', u'[JavaScript Error: "Non stata dichiarata la codifica caratteri del documento HTML. Il documento verr visualizzato con del testo incomprensibile in alcune configurazioni del browser nel caso in cui contenga dei caratteri al di fuori dellintervallo US-ASCII. La codifica caratteri di una pagina deve essere dichiarata nel documento o nel protocollo di trasferimento." {file: "https://outlook.live.com/mail/" line: 0}]'] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_test
outlook live com site is not usable url browser version firefox operating system windows tested another browser no problem type site is not usable description la pagina è bianca steps to reproduce browser configuration mixed active content blocked false image mem shared true buildid tracking content blocked false gfx webrender blob images true hastouchscreen false mixed passive content blocked false gfx webrender enabled false gfx webrender all false channel release console messages u u from with ❤️
0
212,038
16,413,291,471
IssuesEvent
2021-05-19 00:44:35
WordPress/gutenberg
https://api.github.com/repos/WordPress/gutenberg
closed
Group alignfull within a group align full
Needs Testing [Block] Group
## Description If we use group align full within a group alignfull there is an overflow on the left ## Step-by-step reproduction instructions Add a group alignfull Add another one in it... ## Expected behaviour It would be nice to have it align and nice ## Actual behaviour <img width="1629" alt="dc_screen_shot 2021-04-29 à 10 11 52" src="https://user-images.githubusercontent.com/2622001/116521577-6cfd3a00-a8d4-11eb-99ed-d60c11470f97.png"> ## WordPress information - WordPress version: 5.7.1 - Gutenberg version: Not installed - Are all plugins except Gutenberg deactivated? Yes - Are you using a default theme (e.g. Twenty Twenty-One)? Yes ## Device information - Device: Desktop - Operating system: "iOS 14" - Browser: Chrome 86.0
1.0
Group alignfull within a group align full - ## Description If we use group align full within a group alignfull there is an overflow on the left ## Step-by-step reproduction instructions Add a group alignfull Add another one in it... ## Expected behaviour It would be nice to have it align and nice ## Actual behaviour <img width="1629" alt="dc_screen_shot 2021-04-29 à 10 11 52" src="https://user-images.githubusercontent.com/2622001/116521577-6cfd3a00-a8d4-11eb-99ed-d60c11470f97.png"> ## WordPress information - WordPress version: 5.7.1 - Gutenberg version: Not installed - Are all plugins except Gutenberg deactivated? Yes - Are you using a default theme (e.g. Twenty Twenty-One)? Yes ## Device information - Device: Desktop - Operating system: "iOS 14" - Browser: Chrome 86.0
test
group alignfull within a group align full description if we use group align full within a group alignfull there is an overflow on the left step by step reproduction instructions add a group alignfull add another one in it expected behaviour it would be nice to have it align and nice actual behaviour img width alt dc screen shot à src wordpress information wordpress version gutenberg version not installed are all plugins except gutenberg deactivated yes are you using a default theme e g twenty twenty one yes device information device desktop operating system ios browser chrome
1
193,406
14,652,584,662
IssuesEvent
2020-12-28 02:26:55
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
census-instrumentation/opencensus-service: translator/trace/zipkin/zipkinv1_to_protospan_test.go; 8 LoC
fresh test tiny
Found a possible issue in [census-instrumentation/opencensus-service](https://www.github.com/census-instrumentation/opencensus-service) at [translator/trace/zipkin/zipkinv1_to_protospan_test.go](https://github.com/census-instrumentation/opencensus-service/blob/57e037c599b75fe2dddf89cb9ada67bf4c8de9fc/translator/trace/zipkin/zipkinv1_to_protospan_test.go#L213-L220) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > reference to tsr is reassigned at line 218 [Click here to see the code in its original context.](https://github.com/census-instrumentation/opencensus-service/blob/57e037c599b75fe2dddf89cb9ada67bf4c8de9fc/translator/trace/zipkin/zipkinv1_to_protospan_test.go#L213-L220) <details> <summary>Click here to show the 8 line(s) of Go which triggered the analyzer.</summary> ```go for _, tsr := range g { key := tsr.Node.String() if pTsr, ok := nodeToTraceReqs[key]; ok { pTsr.Spans = append(pTsr.Spans, tsr.Spans...) } else { nodeToTraceReqs[key] = &tsr } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 57e037c599b75fe2dddf89cb9ada67bf4c8de9fc
1.0
census-instrumentation/opencensus-service: translator/trace/zipkin/zipkinv1_to_protospan_test.go; 8 LoC - Found a possible issue in [census-instrumentation/opencensus-service](https://www.github.com/census-instrumentation/opencensus-service) at [translator/trace/zipkin/zipkinv1_to_protospan_test.go](https://github.com/census-instrumentation/opencensus-service/blob/57e037c599b75fe2dddf89cb9ada67bf4c8de9fc/translator/trace/zipkin/zipkinv1_to_protospan_test.go#L213-L220) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > reference to tsr is reassigned at line 218 [Click here to see the code in its original context.](https://github.com/census-instrumentation/opencensus-service/blob/57e037c599b75fe2dddf89cb9ada67bf4c8de9fc/translator/trace/zipkin/zipkinv1_to_protospan_test.go#L213-L220) <details> <summary>Click here to show the 8 line(s) of Go which triggered the analyzer.</summary> ```go for _, tsr := range g { key := tsr.Node.String() if pTsr, ok := nodeToTraceReqs[key]; ok { pTsr.Spans = append(pTsr.Spans, tsr.Spans...) } else { nodeToTraceReqs[key] = &tsr } } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 57e037c599b75fe2dddf89cb9ada67bf4c8de9fc
test
census instrumentation opencensus service translator trace zipkin to protospan test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message reference to tsr is reassigned at line click here to show the line s of go which triggered the analyzer go for tsr range g key tsr node string if ptsr ok nodetotracereqs ok ptsr spans append ptsr spans tsr spans else nodetotracereqs tsr leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
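The `&tsr` finding recorded above flags a pointer taken to a range-loop variable. The sketch below is illustrative only — the `span` type and its fields are invented for the demo, not taken from opencensus-service — and it emulates the pre-Go 1.22 loop semantics explicitly with a single reused variable, since Go 1.22+ gives each iteration a fresh variable and the original bug no longer reproduces as written.

```go
package main

import "fmt"

type span struct {
	node string
	n    int
}

// collectAliased mimics the flagged pattern: a single variable is reused
// across iterations (as range loops behaved before Go 1.22), so every map
// entry stores the same pointer and ends up seeing the last element.
func collectAliased(g []span) map[string]*span {
	m := make(map[string]*span)
	var tsr span
	for _, v := range g {
		tsr = v
		m[tsr.node] = &tsr // same address inserted for every key
	}
	return m
}

// collectCopied is the idiomatic fix: copy the loop variable into a fresh
// local before taking its address, so each entry owns its own value.
func collectCopied(g []span) map[string]*span {
	m := make(map[string]*span)
	for _, v := range g {
		v := v // fresh copy per iteration
		m[v.node] = &v
	}
	return m
}

func main() {
	g := []span{{"a", 1}, {"b", 2}}
	bad := collectAliased(g)
	good := collectCopied(g)
	fmt.Println(bad["a"].node, good["a"].node) // prints "b a": the aliased entry was overwritten
}
```

The copy-before-address fix in `collectCopied` is the conventional pre-1.22 remedy; on Go 1.22+ the per-iteration variable makes `&v` safe directly.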
60,300
7,329,841,776
IssuesEvent
2018-03-05 07:31:33
MarlinFirmware/Marlin
https://api.github.com/repos/MarlinFirmware/Marlin
closed
[2.0.x] Proposed workaround for STM32 enum pins
Needs: Work T: 32-Bit & HAL T: Design Concept T: Development T: Suggestion
**Problem:** Because pins are defined for the STM32F1 platform with an `enum` the preprocessor can't see them. This breaks some of our indirection and causes compilation to fail. **Solution:** Define the pin numbers in the HAL in the usual manner but with an underscore prefix. When we need to refer to the integer value of (_e.g._) `PA0` inside of a preprocessor `#if` expression such as `#if BEEPER_PIN == PA0`, wrap the pin name in a conversion macro. ### Proof of concept ```cpp #include <iostream> #include <string> #include <math.h> using namespace std; #define _CAT(a, ...) a ## __VA_ARGS__ #define _PIN(P) _CAT(_, P) #define PIN_EXISTS(P) (defined(P ##_PIN) && P ##_PIN >= 0) enum { PA0, PA1, PA2, PA3, PA4 }; // constexpr also works fine #define _PA0 0 #define _PA1 1 #define _PA2 2 #define _PA3 3 #define _PA4 4 #define HEATED_BED_PIN PA3 int main(int argc, char const *argv[]) { #if PIN_EXISTS(HEATED_BED) cout << "HEATED_BED_PIN=" << int(_PIN(HEATED_BED_PIN)) << endl; #else cout << "HEATED_BED_PIN undefined" << endl; #endif return 0; } ``` CC: @MarlinFirmware/32bit-maintainers @victorpv
1.0
[2.0.x] Proposed workaround for STM32 enum pins - **Problem:** Because pins are defined for the STM32F1 platform with an `enum` the preprocessor can't see them. This breaks some of our indirection and causes compilation to fail. **Solution:** Define the pin numbers in the HAL in the usual manner but with an underscore prefix. When we need to refer to the integer value of (_e.g._) `PA0` inside of a preprocessor `#if` expression such as `#if BEEPER_PIN == PA0`, wrap the pin name in a conversion macro. ### Proof of concept ```cpp #include <iostream> #include <string> #include <math.h> using namespace std; #define _CAT(a, ...) a ## __VA_ARGS__ #define _PIN(P) _CAT(_, P) #define PIN_EXISTS(P) (defined(P ##_PIN) && P ##_PIN >= 0) enum { PA0, PA1, PA2, PA3, PA4 }; // constexpr also works fine #define _PA0 0 #define _PA1 1 #define _PA2 2 #define _PA3 3 #define _PA4 4 #define HEATED_BED_PIN PA3 int main(int argc, char const *argv[]) { #if PIN_EXISTS(HEATED_BED) cout << "HEATED_BED_PIN=" << int(_PIN(HEATED_BED_PIN)) << endl; #else cout << "HEATED_BED_PIN undefined" << endl; #endif return 0; } ``` CC: @MarlinFirmware/32bit-maintainers @victorpv
non_test
proposed workaround for enum pins problem because pins are defined for the platform with an enum the preprocessor can t see them this breaks some of our indirection and causes compilation to fail solution define the pin numbers in the hal in the usual manner but with an underscore prefix when we need to refer to the integer value of e g inside of a preprocessor if expression such as if beeper pin wrap the pin name in a conversion macro proof of concept cpp include include include using namespace std define cat a a va args define pin p cat p define pin exists p defined p pin p pin enum constexpr also works fine define define define define define define heated bed pin int main int argc char const argv if pin exists heated bed cout heated bed pin int pin heated bed pin endl else cout heated bed pin undefined endl endif return cc marlinfirmware maintainers victorpv
0
191,454
14,594,346,764
IssuesEvent
2020-12-20 05:06:18
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
softcorp-io/hqs-user-service: app/testdev/experiments/speedtest/speed_test.go; 9 LoC
fresh test tiny
Found a possible issue in [softcorp-io/hqs-user-service](https://www.github.com/softcorp-io/hqs-user-service) at [app/testdev/experiments/speedtest/speed_test.go](https://github.com/softcorp-io/hqs-user-service/blob/31b98ed99b740937c6b9d2c087294fa1cbfe81ae/app/testdev/experiments/speedtest/speed_test.go#L103-L111) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable cl used in defer or goroutine at line 106 [Click here to see the code in its original context.](https://github.com/softcorp-io/hqs-user-service/blob/31b98ed99b740937c6b9d2c087294fa1cbfe81ae/app/testdev/experiments/speedtest/speed_test.go#L103-L111) <details> <summary>Click here to show the 9 line(s) of Go which triggered the analyzer.</summary> ```go for _, cl := range clients { go func(wg *sync.WaitGroup) { wg.Add(1) _, err := client.Auth(cl, seedEmail, seedPassword, wg) if err != nil { log.Println("Warning:", "Test.Speed.Speed", "Could not authenticate") } }(&wgClient) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 31b98ed99b740937c6b9d2c087294fa1cbfe81ae
1.0
softcorp-io/hqs-user-service: app/testdev/experiments/speedtest/speed_test.go; 9 LoC - Found a possible issue in [softcorp-io/hqs-user-service](https://www.github.com/softcorp-io/hqs-user-service) at [app/testdev/experiments/speedtest/speed_test.go](https://github.com/softcorp-io/hqs-user-service/blob/31b98ed99b740937c6b9d2c087294fa1cbfe81ae/app/testdev/experiments/speedtest/speed_test.go#L103-L111) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > range-loop variable cl used in defer or goroutine at line 106 [Click here to see the code in its original context.](https://github.com/softcorp-io/hqs-user-service/blob/31b98ed99b740937c6b9d2c087294fa1cbfe81ae/app/testdev/experiments/speedtest/speed_test.go#L103-L111) <details> <summary>Click here to show the 9 line(s) of Go which triggered the analyzer.</summary> ```go for _, cl := range clients { go func(wg *sync.WaitGroup) { wg.Add(1) _, err := client.Auth(cl, seedEmail, seedPassword, wg) if err != nil { log.Println("Warning:", "Test.Speed.Speed", "Could not authenticate") } }(&wgClient) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 31b98ed99b740937c6b9d2c087294fa1cbfe81ae
test
softcorp io hqs user service app testdev experiments speedtest speed test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable cl used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for cl range clients go func wg sync waitgroup wg add err client auth cl seedemail seedpassword wg if err nil log println warning test speed speed could not authenticate wgclient leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
172,900
21,074,566,081
IssuesEvent
2022-04-02 01:11:34
DavidSpek/pipelines
https://api.github.com/repos/DavidSpek/pipelines
opened
CVE-2021-32798 (High) detected in notebook-5.7.10-py2.py3-none-any.whl
security vulnerability
## CVE-2021-32798 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>notebook-5.7.10-py2.py3-none-any.whl</b></p></summary> <p>A web-based notebook environment for interactive computing</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/7a/fb/6b1735e8ff43f68e867928526134cd6ba22554d596862a7fe71092ba8fc8/notebook-5.7.10-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/7a/fb/6b1735e8ff43f68e867928526134cd6ba22554d596862a7fe71092ba8fc8/notebook-5.7.10-py2.py3-none-any.whl</a></p> <p>Path to dependency file: /backend/src/apiserver/visualization/requirements.txt</p> <p>Path to vulnerable library: /backend/src/apiserver/visualization/requirements.txt,/test/sample-test/requirements.txt,/backend/requirements.txt</p> <p> Dependency Hierarchy: - jupyter-1.0.0-py2.py3-none-any.whl (Root Library) - :x: **notebook-5.7.10-py2.py3-none-any.whl** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The Jupyter notebook is a web-based notebook environment for interactive computing. In affected versions untrusted notebook can execute code on load. Jupyter Notebook uses a deprecated version of Google Caja to sanitize user inputs. A public Caja bypass can be used to trigger an XSS when a victim opens a malicious ipynb document in Jupyter Notebook. The XSS allows an attacker to execute arbitrary code on the victim computer using Jupyter APIs. 
<p>Publish Date: 2021-08-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32798>CVE-2021-32798</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/jupyter/notebook/security/advisories/GHSA-hwvq-6gjx-j797">https://github.com/jupyter/notebook/security/advisories/GHSA-hwvq-6gjx-j797</a></p> <p>Release Date: 2021-08-09</p> <p>Fix Resolution: notebook - 5.7.11, 6.4.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-32798 (High) detected in notebook-5.7.10-py2.py3-none-any.whl - ## CVE-2021-32798 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>notebook-5.7.10-py2.py3-none-any.whl</b></p></summary> <p>A web-based notebook environment for interactive computing</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/7a/fb/6b1735e8ff43f68e867928526134cd6ba22554d596862a7fe71092ba8fc8/notebook-5.7.10-py2.py3-none-any.whl">https://files.pythonhosted.org/packages/7a/fb/6b1735e8ff43f68e867928526134cd6ba22554d596862a7fe71092ba8fc8/notebook-5.7.10-py2.py3-none-any.whl</a></p> <p>Path to dependency file: /backend/src/apiserver/visualization/requirements.txt</p> <p>Path to vulnerable library: /backend/src/apiserver/visualization/requirements.txt,/test/sample-test/requirements.txt,/backend/requirements.txt</p> <p> Dependency Hierarchy: - jupyter-1.0.0-py2.py3-none-any.whl (Root Library) - :x: **notebook-5.7.10-py2.py3-none-any.whl** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The Jupyter notebook is a web-based notebook environment for interactive computing. In affected versions untrusted notebook can execute code on load. Jupyter Notebook uses a deprecated version of Google Caja to sanitize user inputs. A public Caja bypass can be used to trigger an XSS when a victim opens a malicious ipynb document in Jupyter Notebook. The XSS allows an attacker to execute arbitrary code on the victim computer using Jupyter APIs. 
<p>Publish Date: 2021-08-09 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32798>CVE-2021-32798</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.6</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/jupyter/notebook/security/advisories/GHSA-hwvq-6gjx-j797">https://github.com/jupyter/notebook/security/advisories/GHSA-hwvq-6gjx-j797</a></p> <p>Release Date: 2021-08-09</p> <p>Fix Resolution: notebook - 5.7.11, 6.4.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in notebook none any whl cve high severity vulnerability vulnerable library notebook none any whl a web based notebook environment for interactive computing library home page a href path to dependency file backend src apiserver visualization requirements txt path to vulnerable library backend src apiserver visualization requirements txt test sample test requirements txt backend requirements txt dependency hierarchy jupyter none any whl root library x notebook none any whl vulnerable library found in base branch master vulnerability details the jupyter notebook is a web based notebook environment for interactive computing in affected versions untrusted notebook can execute code on load jupyter notebook uses a deprecated version of google caja to sanitize user inputs a public caja bypass can be used to trigger an xss when a victim opens a malicious ipynb document in jupyter notebook the xss allows an attacker to execute arbitrary code on the victim computer using jupyter apis publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution notebook step up your open source security game with whitesource
0
291,641
25,163,505,386
IssuesEvent
2022-11-10 18:41:37
TeamHavit/Havit-Server-Upptime
https://api.github.com/repos/TeamHavit/Havit-Server-Upptime
closed
🛑 TEST slack is down
status test-slack
In [`4f5ddb5`](https://github.com/TeamHavit/Havit-Server-Upptime/commit/4f5ddb5ed4d6eaa0f3620e21a2642419df3a4550 ), TEST slack (http://sldkfjlsjdflksdjf.com) was **down**: - HTTP code: 0 - Response time: 0 ms
1.0
🛑 TEST slack is down - In [`4f5ddb5`](https://github.com/TeamHavit/Havit-Server-Upptime/commit/4f5ddb5ed4d6eaa0f3620e21a2642419df3a4550 ), TEST slack (http://sldkfjlsjdflksdjf.com) was **down**: - HTTP code: 0 - Response time: 0 ms
test
🛑 test slack is down in test slack was down http code response time ms
1
158,939
12,439,587,280
IssuesEvent
2020-05-26 10:22:17
gradle/gradle
https://api.github.com/repos/gradle/gradle
closed
java14 helpful NPEs not working in test
@jvm a:bug from:contributor in:testing
### Expected Behavior I configure the test task to add the new jvm flag (java14) that enables helpful NPE messages *build.gradle.kts* ```kotlin tasks.withType<Test> { jvmArgs("-XX:+ShowCodeDetailsInExceptionMessages") // Helpful NPEs not working :( testLogging { setExceptionFormat("full") // Prints exception messages } } ``` With a test like that ```java @Test public void true_npe() { Object o = null; o.toString(); } ``` I expect NPEs to have messages with details, something like this ``` TestNPE > true_npe FAILED java.lang.NullPointerException: Cannot invoke "Object.toString()" because "o" is null at TestNPE.true_npe(TestNPE.java:8) ``` ### Current Behavior It prints no message ``` TestNPE > true_npe FAILED java.lang.NullPointerException at TestNPE.true_npe(TestNPE.java:8) ``` ### Context For now, I guess only a few people uses this flag but as mentioned in the JEP https://openjdk.java.net/jeps/358, it will become the default behavior in a future version. I posted it on stackoverflow https://stackoverflow.com/questions/61664979/how-to-enable-helpful-nullpointerexceptions-in-gradle-test/61737264#61737264 but I'm putting the same infos here ### Steps to Reproduce I created a simple project to reproduce it. https://github.com/apflieger/gradle-helpful-npe Just `./gradlew test` Make sure to have java14. ### Environment ``` java -version openjdk version "14.0.1" 2020-04-14 OpenJDK Runtime Environment (build 14.0.1+7) OpenJDK 64-Bit Server VM (build 14.0.1+7, mixed mode, sharing) ``` I retrieved the classpath gradle uses by running test with debug logs. With that, I could run JUnit directly. 
With `-XX:+ShowCodeDetailsInExceptionMessages` and the helpful NPE message shows up: ``` java -cp '/Users/apflieger/src/gradle-helpfull-npe/build/classes/java/test:/Users/apflieger/.gradle/caches/modules-2/files-2.1/junit/junit/4.13/e49ccba652b735c93bd6e6f59760d8254cf597dd/junit-4.13.jar:/Users/apflieger/.gradle/caches/modules-2/files-2.1/org.hamcrest/hamcrest-core/1.3/42a25dc3219429f0e5d060061f71acb49bf010a0/hamcrest-core-1.3.jar' -XX:+ShowCodeDetailsInExceptionMessages org.junit.runner.JUnitCore TestNPE JUnit version 4.13 .E.E Time: 0.006 There were 2 failures: 1) throw_npe(TestNPE) java.lang.NullPointerException: My own message at TestNPE.throw_npe(TestNPE.java:13) 2) true_npe(TestNPE) java.lang.NullPointerException: Cannot invoke "Object.toString()" because "o" is null at TestNPE.true_npe(TestNPE.java:8) FAILURES!!! Tests run: 2, Failures: 2 ``` So it doesn't come from JUnit
1.0
java14 helpful NPEs not working in test - ### Expected Behavior I configure the test task to add the new jvm flag (java14) that enables helpful NPE messages *build.gradle.kts* ```kotlin tasks.withType<Test> { jvmArgs("-XX:+ShowCodeDetailsInExceptionMessages") // Helpful NPEs not working :( testLogging { setExceptionFormat("full") // Prints exception messages } } ``` With a test like that ```java @Test public void true_npe() { Object o = null; o.toString(); } ``` I expect NPEs to have messages with details, something like this ``` TestNPE > true_npe FAILED java.lang.NullPointerException: Cannot invoke "Object.toString()" because "o" is null at TestNPE.true_npe(TestNPE.java:8) ``` ### Current Behavior It prints no message ``` TestNPE > true_npe FAILED java.lang.NullPointerException at TestNPE.true_npe(TestNPE.java:8) ``` ### Context For now, I guess only a few people uses this flag but as mentioned in the JEP https://openjdk.java.net/jeps/358, it will become the default behavior in a future version. I posted it on stackoverflow https://stackoverflow.com/questions/61664979/how-to-enable-helpful-nullpointerexceptions-in-gradle-test/61737264#61737264 but I'm putting the same infos here ### Steps to Reproduce I created a simple project to reproduce it. https://github.com/apflieger/gradle-helpful-npe Just `./gradlew test` Make sure to have java14. ### Environment ``` java -version openjdk version "14.0.1" 2020-04-14 OpenJDK Runtime Environment (build 14.0.1+7) OpenJDK 64-Bit Server VM (build 14.0.1+7, mixed mode, sharing) ``` I retrieved the classpath gradle uses by running test with debug logs. With that, I could run JUnit directly. 
With `-XX:+ShowCodeDetailsInExceptionMessages` and the helpful NPE message shows up: ``` java -cp '/Users/apflieger/src/gradle-helpfull-npe/build/classes/java/test:/Users/apflieger/.gradle/caches/modules-2/files-2.1/junit/junit/4.13/e49ccba652b735c93bd6e6f59760d8254cf597dd/junit-4.13.jar:/Users/apflieger/.gradle/caches/modules-2/files-2.1/org.hamcrest/hamcrest-core/1.3/42a25dc3219429f0e5d060061f71acb49bf010a0/hamcrest-core-1.3.jar' -XX:+ShowCodeDetailsInExceptionMessages org.junit.runner.JUnitCore TestNPE JUnit version 4.13 .E.E Time: 0.006 There were 2 failures: 1) throw_npe(TestNPE) java.lang.NullPointerException: My own message at TestNPE.throw_npe(TestNPE.java:13) 2) true_npe(TestNPE) java.lang.NullPointerException: Cannot invoke "Object.toString()" because "o" is null at TestNPE.true_npe(TestNPE.java:8) FAILURES!!! Tests run: 2, Failures: 2 ``` So it doesn't come from JUnit
test
helpful npes not working in test expected behavior i configure the test task to add the new jvm flag that enables helpful npe messages build gradle kts kotlin tasks withtype jvmargs xx showcodedetailsinexceptionmessages helpful npes not working testlogging setexceptionformat full prints exception messages with a test like that java test public void true npe object o null o tostring i expect npes to have messages with details something like this testnpe true npe failed java lang nullpointerexception cannot invoke object tostring because o is null at testnpe true npe testnpe java current behavior it prints no message testnpe true npe failed java lang nullpointerexception at testnpe true npe testnpe java context for now i guess only a few people uses this flag but as mentioned in the jep it will become the default behavior in a future version i posted it on stackoverflow but i m putting the same infos here steps to reproduce i created a simple project to reproduce it just gradlew test make sure to have environment java version openjdk version openjdk runtime environment build openjdk bit server vm build mixed mode sharing i retrieved the classpath gradle uses by running test with debug logs with that i could run junit directly with xx showcodedetailsinexceptionmessages and the helpful npe message shows up java cp users apflieger src gradle helpfull npe build classes java test users apflieger gradle caches modules files junit junit junit jar users apflieger gradle caches modules files org hamcrest hamcrest core hamcrest core jar xx showcodedetailsinexceptionmessages org junit runner junitcore testnpe junit version e e time there were failures throw npe testnpe java lang nullpointerexception my own message at testnpe throw npe testnpe java true npe testnpe java lang nullpointerexception cannot invoke object tostring because o is null at testnpe true npe testnpe java failures tests run failures so it doesn t come from junit
1
124,890
26,555,463,631
IssuesEvent
2023-01-20 11:36:34
sourcegraph/sourcegraph
https://api.github.com/repos/sourcegraph/sourcegraph
closed
insights: update backend architecture diagram
docs team/code-insights backend
the architecture diagram in our developing docs page is outdated and needs updating: http://docs.sourcegraph.com/dev/background-information/insights/backend#architecture
1.0
insights: update backend architecture diagram - the architecture diagram in our developing docs page is outdated and needs updating: http://docs.sourcegraph.com/dev/background-information/insights/backend#architecture
non_test
insights update backend architecture diagram the architecture diagram in our developing docs page is outdated and needs updating
0
305,491
26,390,749,508
IssuesEvent
2023-01-12 15:30:46
CSOIreland/PxStat
https://api.github.com/repos/CSOIreland/PxStat
closed
[BUG] Locally authenticated users not getting some automated emails
bug released tested fixed
Locally authenticated users not getting some automated emails - for check
1.0
[BUG] Locally authenticated users not getting some automated emails - Locally authenticated users not getting some automated emails - for check
test
locally authenticated users not getting some automated emails locally authenticated users not getting some automated emails for check
1
86,981
8,055,536,539
IssuesEvent
2018-08-02 09:36:38
geopython/OWSLib
https://api.github.com/repos/geopython/OWSLib
opened
doctest sos_20_bom_gov_au.txt is a flickering test
bug tests
... needs to be converted to a "normal" test with `skipif` check for the service availability.
1.0
doctest sos_20_bom_gov_au.txt is a flickering test - ... needs to be converted to a "normal" test with `skipif` check for the service availability.
test
doctest sos bom gov au txt is a flickering test needs to be converted to a normal test with skipif check for the service availability
1
228,259
18,167,155,782
IssuesEvent
2021-09-27 15:42:14
juju/python-libjuju
https://api.github.com/repos/juju/python-libjuju
closed
tox+py.test using wrong version of python
issue in testing
### Problem On a tox run, for example on [this one](https://jenkins.juju.canonical.com/job/github-integration-tests-pylibjuju-python3.9/134/console), even though we run it for `py39` ![image](https://user-images.githubusercontent.com/120652/134402191-a1a006d1-504a-4e1d-8d5a-f701794d5ca2.png) scroll down just a bit, and we see that it uses `python3.6`: ![image](https://user-images.githubusercontent.com/120652/134402570-8e649702-4d3f-4313-94ae-f16738520d6b.png) ### Discussion Once analysed it a bit, I saw that on my local system tox seems to be using the `python3` link for the Python versions that are **higher** than whichever `python` is used to run the `tox` itself. Check this out: On my local system, the `python3` is `python3.8.10`. So when I run `tox -r -e py37 --notest --vv` I get: ![image](https://user-images.githubusercontent.com/120652/134403532-ef0a1d53-adea-400d-9d4b-b61155f80012.png) However, when I run `tox -r -e py39 --notest -vv` (recall that 3.9 is higher than my default 3.8) Then I get: ![image](https://user-images.githubusercontent.com/120652/134403813-5be70dfe-9234-475c-bd61-8fed5499df4d.png) However, when I run `tox` itself with a higher interpreter, like: `python3.9 -m tox -r -e py39 --notest -vv` Then we get a similar response to what we've seen for `python3.7` above: ![image](https://user-images.githubusercontent.com/120652/134404222-77f56d71-d07f-43ca-a224-322b3a4450d7.png) ### Furthermore In the CI run example above, we see that it defaults to `3.6`, so if we look at the [same run for `3.5`](https://jenkins.juju.canonical.com/job/github-integration-tests-pylibjuju-python3.5/130/) (which we expect to use the correct version), ![image](https://user-images.githubusercontent.com/120652/134404672-bfe1e3b0-1d6b-4539-aea1-4eae0c8e3186.png) there we see that `tox` is actually using the correct version, which supports the initial prognosis: ![image](https://user-images.githubusercontent.com/120652/134404996-804f347c-cc17-4522-94af-0ef2abb7cef4.png) ### 
Possible Solutions - Ensure the `python3` to point to `python3.9` (or highest possible Python version), which might be tricky since it might break some system dependencies. - run `tox` explicitly with `python3.9`
1.0
tox+py.test using wrong version of python - ### Problem On a tox run, for example on [this one](https://jenkins.juju.canonical.com/job/github-integration-tests-pylibjuju-python3.9/134/console), even though we run it for `py39` ![image](https://user-images.githubusercontent.com/120652/134402191-a1a006d1-504a-4e1d-8d5a-f701794d5ca2.png) scroll down just a bit, and we see that it uses `python3.6`: ![image](https://user-images.githubusercontent.com/120652/134402570-8e649702-4d3f-4313-94ae-f16738520d6b.png) ### Discussion Once analysed it a bit, I saw that on my local system tox seems to be using the `python3` link for the Python versions that are **higher** than whichever `python` is used to run the `tox` itself. Check this out: On my local system, the `python3` is `python3.8.10`. So when I run `tox -r -e py37 --notest --vv` I get: ![image](https://user-images.githubusercontent.com/120652/134403532-ef0a1d53-adea-400d-9d4b-b61155f80012.png) However, when I run `tox -r -e py39 --notest -vv` (recall that 3.9 is higher than my default 3.8) Then I get: ![image](https://user-images.githubusercontent.com/120652/134403813-5be70dfe-9234-475c-bd61-8fed5499df4d.png) However, when I run `tox` itself with a higher interpreter, like: `python3.9 -m tox -r -e py39 --notest -vv` Then we get a similar response to what we've seen for `python3.7` above: ![image](https://user-images.githubusercontent.com/120652/134404222-77f56d71-d07f-43ca-a224-322b3a4450d7.png) ### Furthermore In the CI run example above, we see that it defaults to `3.6`, so if we look at the [same run for `3.5`](https://jenkins.juju.canonical.com/job/github-integration-tests-pylibjuju-python3.5/130/) (which we expect to use the correct version), ![image](https://user-images.githubusercontent.com/120652/134404672-bfe1e3b0-1d6b-4539-aea1-4eae0c8e3186.png) there we see that `tox` is actually using the correct version, which supports the initial prognosis: 
![image](https://user-images.githubusercontent.com/120652/134404996-804f347c-cc17-4522-94af-0ef2abb7cef4.png) ### Possible Solutions - Ensure the `python3` to point to `python3.9` (or highest possible Python version), which might be tricky since it might break some system dependencies. - run `tox` explicitly with `python3.9`
test
tox py test using wrong version of python problem on a tox run for example on even though we run it for scroll down just a bit and we see that it uses discussion once analysed it a bit i saw that on my local system tox seems to be using the link for the python versions that are higher than whichever python is used to run the tox itself check this out on my local system the is so when i run tox r e notest vv i get however when i run tox r e notest vv recall that is higher than my default then i get however when i run tox itself with a higher interpreter like m tox r e notest vv then we get a similar response to what we ve seen for above furthermore in the ci run example above we see that it defaults to so if we look at the which we expect to use the correct version there we see that tox is actually using the correct version which supports the initial prognosis possible solutions ensure the to point to or highest possible python version which might be tricky since it might break some system dependencies run tox explicitly with
1
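The tox record above describes tests silently running under the wrong interpreter. One way to catch that early is a guard inside the test suite that fails fast when the running Python does not match the tox environment. This is a hedged sketch, not part of tox or py.test; the helper name `assert_python_version` is illustrative:

```python
import sys

def assert_python_version(expected_major, expected_minor):
    """Fail fast if the interpreter running the tests is not the one
    the tox environment (e.g. py39) was supposed to select."""
    actual = sys.version_info[:2]
    if actual != (expected_major, expected_minor):
        raise RuntimeError(
            f"expected Python {expected_major}.{expected_minor}, "
            f"got {actual[0]}.{actual[1]} ({sys.executable})"
        )

# The current interpreter always matches its own reported version:
assert_python_version(sys.version_info[0], sys.version_info[1])
```

Dropping a call like this into a conftest or an early test makes a misrouted tox environment fail with an explicit interpreter path instead of obscure version-dependent errors.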
423,490
28,565,779,585
IssuesEvent
2023-04-21 01:54:34
storybookjs/storybook
https://api.github.com/repos/storybookjs/storybook
closed
[Bug]: storySort not working as a preset add-on in v7
question / support documentation core
### Describe the bug I have a preset addon that, among other things, also sorts stories through multiple projects, but since upgrading to v7, the storySort configuration is no longer picked up. If I copy that over to the project's `.storybook/preset.js`, the sorting works as expected, but it doesn't as part of the add-on. ```js // addon/index.js module.exports = { managerWebpack: async (config) => { return config; }, managerBabel: async (config) => { return config; }, webpackFinal: async (config) => { return config; }, babel: async (config) => { return config; }, previewAnnotations: (entry = []) => { return [...entry, path.resolve(__dirname, './preview')]; }, managerEntries: (entry = []) => { return [...entry, path.resolve(__dirname, './manager')]; }, }; ``` ```js // addon/preset.js export const parameters = { options: { method: 'alphabetical', storySort: { order: ['Base', ['Intro', '*']], }, }, }; ``` ### To Reproduce _No response_ ### System _No response_ ### Additional context _No response_
1.0
[Bug]: storySort not working as a preset add-on in v7 - ### Describe the bug I have a preset addon that, among other things, also sorts stories through multiple projects, but since upgrading to v7, the storySort configuration is no longer picked up. If I copy that over to the project's `.storybook/preset.js`, the sorting works as expected, but it doesn't as part of the add-on. ```js // addon/index.js module.exports = { managerWebpack: async (config) => { return config; }, managerBabel: async (config) => { return config; }, webpackFinal: async (config) => { return config; }, babel: async (config) => { return config; }, previewAnnotations: (entry = []) => { return [...entry, path.resolve(__dirname, './preview')]; }, managerEntries: (entry = []) => { return [...entry, path.resolve(__dirname, './manager')]; }, }; ``` ```js // addon/preset.js export const parameters = { options: { method: 'alphabetical', storySort: { order: ['Base', ['Intro', '*']], }, }, }; ``` ### To Reproduce _No response_ ### System _No response_ ### Additional context _No response_
non_test
storysort not working as a preset add on in describe the bug i have a preset addon that among other things also sorts stories through multiple projects but since upgrading to the storysort configuration is no longer picked up if i copy that over to the project s storybook preset js the sorting works as expected but it doesn t as part of the add on js addon index js module exports managerwebpack async config return config managerbabel async config return config webpackfinal async config return config babel async config return config previewannotations entry return managerentries entry return js addon preset js export const parameters options method alphabetical storysort order to reproduce no response system no response additional context no response
0
386,361
11,437,539,110
IssuesEvent
2020-02-05 00:17:50
kubeflow/kubeflow
https://api.github.com/repos/kubeflow/kubeflow
closed
Central dashboard liveness and readiness probes
area/centraldashboard kind/feature lifecycle/frozen priority/p0
/kind feature Central dashboard needs liveness and readiness probes.
1.0
Central dashboard liveness and readiness probes - /kind feature Central dashboard needs liveness and readiness probes.
non_test
central dashboard liveness and readiness probes kind feature central dashboard needs liveness and readiness probes
0
229,295
17,538,472,152
IssuesEvent
2021-08-12 09:12:29
Luos-io/Documentation
https://api.github.com/repos/Luos-io/Documentation
opened
[DOC] Images should be opened larger on clicked
documentation
**Insert the documentation link:** https://docs.luos.io/ (whole doc) **Describe the issue or the suggestion:** Some images are too small to get the details. **How would you fix the issue / which content would you add or delete?** Clicking on them should open them larger without leaving the page
1.0
[DOC] Images should be opened larger on clicked - **Insert the documentation link:** https://docs.luos.io/ (whole doc) **Describe the issue or the suggestion:** Some images are too small to get the details. **How would you fix the issue / which content would you add or delete?** Clicking on them should open them larger without leaving the page
non_test
images should be opened larger on clicked insert the documentation link whole doc describe the issue or the suggestion some images are too small to get the details how would you fix the issue which content would you add or delete clicking on them should open them larger without leaving the page
0
1,885
4,020,351,705
IssuesEvent
2016-05-16 18:06:37
ngageoint/hootenanny
https://api.github.com/repos/ngageoint/hootenanny
opened
Exception Handling In Hoot Services Tier
Category: Services Priority: Medium Status: Defined Type: Task
Exception handling in Hoot Services needs to be closely examined and refactored. There are place where exceptions are: 1) Swallowed 2) Not logged 3) Thrown, but are too generic
1.0
Exception Handling In Hoot Services Tier - Exception handling in Hoot Services needs to be closely examined and refactored. There are place where exceptions are: 1) Swallowed 2) Not logged 3) Thrown, but are too generic
non_test
exception handling in hoot services tier exception handling in hoot services needs to be closely examined and refactored there are place where exceptions are swallowed not logged thrown but are too generic
0
164,569
20,370,647,138
IssuesEvent
2022-02-21 10:52:47
SeldonIO/seldon-core
https://api.github.com/repos/SeldonIO/seldon-core
closed
Address vulneraibilities as part of 1.13.1 / 1.14.0 images
bug security
<!-- Welcome and thank you for helping us make Seldon Core better! To help us address your issue, please provide us as much of the information requested below as possble. Thanks! --> ## Describe the bug <!-- A clear and concise description of what the bug is. --> TODO: Update report add link ## To reproduce <!-- Steps required to reproduce the issue. For example: 1. define model ... 2. build image ... (especially what wrapper version is used) 3. deploy ... --> ## Expected behaviour <!-- A clear and concise description of what you expected to happen. --> ## Environment <!-- Description of environment --> <!-- You Can fill it manually or paste the output of the command below: * Cloud Provider: [e.g. GKE, AWS, Bare Metal, Kind, Minikube] * Kubernetes Cluster Version [Output of `kubectl version`] * Deployed Seldon System Images: [Output of `kubectl get --namespace seldon-system deploy seldon-controller-manager -o yaml | grep seldonio`] Alternatively run `echo "#### Kubernetes version:\n $(kubectl version) \n\n#### Seldon Images:\n$(kubectl get --namespace seldon-system deploy seldon-controller-manager -o yaml | grep seldonio)"` --> ## Model Details <!-- If the issue is with your deployed model you can also provide the following for fulll insights --> * Images of your model: [Output of: `kubectl get seldondeployment -n <yourmodelnamespace> <seldondepname> -o yaml | grep image:` where `<yourmodelnamespace>`] * Logs of your model: [You can get the logs of your model by running `kubectl logs -n <yourmodelnamespace> <seldonpodname> <container>`]
True
Address vulneraibilities as part of 1.13.1 / 1.14.0 images - <!-- Welcome and thank you for helping us make Seldon Core better! To help us address your issue, please provide us as much of the information requested below as possble. Thanks! --> ## Describe the bug <!-- A clear and concise description of what the bug is. --> TODO: Update report add link ## To reproduce <!-- Steps required to reproduce the issue. For example: 1. define model ... 2. build image ... (especially what wrapper version is used) 3. deploy ... --> ## Expected behaviour <!-- A clear and concise description of what you expected to happen. --> ## Environment <!-- Description of environment --> <!-- You Can fill it manually or paste the output of the command below: * Cloud Provider: [e.g. GKE, AWS, Bare Metal, Kind, Minikube] * Kubernetes Cluster Version [Output of `kubectl version`] * Deployed Seldon System Images: [Output of `kubectl get --namespace seldon-system deploy seldon-controller-manager -o yaml | grep seldonio`] Alternatively run `echo "#### Kubernetes version:\n $(kubectl version) \n\n#### Seldon Images:\n$(kubectl get --namespace seldon-system deploy seldon-controller-manager -o yaml | grep seldonio)"` --> ## Model Details <!-- If the issue is with your deployed model you can also provide the following for fulll insights --> * Images of your model: [Output of: `kubectl get seldondeployment -n <yourmodelnamespace> <seldondepname> -o yaml | grep image:` where `<yourmodelnamespace>`] * Logs of your model: [You can get the logs of your model by running `kubectl logs -n <yourmodelnamespace> <seldonpodname> <container>`]
non_test
address vulneraibilities as part of images welcome and thank you for helping us make seldon core better to help us address your issue please provide us as much of the information requested below as possble thanks describe the bug todo update report add link to reproduce steps required to reproduce the issue for example define model build image especially what wrapper version is used deploy expected behaviour environment you can fill it manually or paste the output of the command below cloud provider kubernetes cluster version deployed seldon system images alternatively run echo kubernetes version n kubectl version n n seldon images n kubectl get namespace seldon system deploy seldon controller manager o yaml grep seldonio model details images of your model logs of your model
0
85,123
7,961,846,440
IssuesEvent
2018-07-13 12:23:24
w3c/webrtc-pc
https://api.github.com/repos/w3c/webrtc-pc
closed
Checks for negotiation-needed are underspecified when currentLocalDescription/currentRemoteDescription are not set
Needs submitter action Test suite issue question
The following language does not work if the PC has not performed an initial negotiation. (Transceivers are expected to be associated before offer/answer completes, so this language can apply in this situation) " - If t's direction is "sendrecv" or "sendonly", and the associated m= section in connection's currentLocalDescription doesn't contain an "a=msid" line, return "true". - If connection's currentLocalDescription if of type "offer", and the direction of the associated m= section in neither the offer nor answer matches t's direction, return "true". - If connection's currentLocalDescription if of type "answer", and the direction of the associated m= section in the answer does not match t's direction intersected with the offered direction (as described in [JSEP] (section 5.3.1.)), return "true". "
1.0
Checks for negotiation-needed are underspecified when currentLocalDescription/currentRemoteDescription are not set - The following language does not work if the PC has not performed an initial negotiation. (Transceivers are expected to be associated before offer/answer completes, so this language can apply in this situation) " - If t's direction is "sendrecv" or "sendonly", and the associated m= section in connection's currentLocalDescription doesn't contain an "a=msid" line, return "true". - If connection's currentLocalDescription if of type "offer", and the direction of the associated m= section in neither the offer nor answer matches t's direction, return "true". - If connection's currentLocalDescription if of type "answer", and the direction of the associated m= section in the answer does not match t's direction intersected with the offered direction (as described in [JSEP] (section 5.3.1.)), return "true". "
test
checks for negotiation needed are underspecified when currentlocaldescription currentremotedescription are not set the following language does not work if the pc has not performed an initial negotiation transceivers are expected to be associated before offer answer completes so this language can apply in this situation if t s direction is sendrecv or sendonly and the associated m section in connection s currentlocaldescription doesn t contain an a msid line return true if connection s currentlocaldescription if of type offer and the direction of the associated m section in neither the offer nor answer matches t s direction return true if connection s currentlocaldescription if of type answer and the direction of the associated m section in the answer does not match t s direction intersected with the offered direction as described in section return true
1
14,234
3,386,926,230
IssuesEvent
2015-11-27 23:16:32
orome/crypto-enigma-py
https://api.github.com/repos/orome/crypto-enigma-py
closed
Add tests for exceptions to testing scripts
help wanted testing
* [x] Ask how to do this with `pytest`. * [x] Add tests for expected exceptions (e.g. for bad Unicode arguments) to testing scripts.
1.0
Add tests for exceptions to testing scripts - * [x] Ask how to do this with `pytest`. * [x] Add tests for expected exceptions (e.g. for bad Unicode arguments) to testing scripts.
test
add tests for exceptions to testing scripts ask how to do this with pytest add tests for expected exceptions e g for bad unicode arguments to testing scripts
1
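The crypto-enigma record above asks how to test expected exceptions with pytest; the idiomatic answer is `pytest.raises`. The same check can be sketched with only the standard library, which also shows what the pytest helper does under the hood. The `raises` context manager below is an illustrative stand-in, not part of pytest or the project:

```python
from contextlib import contextmanager

@contextmanager
def raises(exc_type):
    """Minimal stand-in for pytest.raises: the with-block must raise exc_type."""
    try:
        yield
    except exc_type:
        return  # expected exception observed; suppress it
    # The block completed without raising: that is the test failure.
    raise AssertionError(f"expected {exc_type.__name__} to be raised")

# Example: a bad (non-numeric) Unicode argument to int() raises ValueError
with raises(ValueError):
    int("π")
```

A wrong exception type is deliberately not caught, so it propagates and fails the test, matching `pytest.raises` semantics.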
305,689
23,126,857,219
IssuesEvent
2022-07-28 06:45:15
StoneCypher/fsl
https://api.github.com/repos/StoneCypher/fsl
closed
Master feature table
Research material Documentation Generators Chore Cleanup Publicity and Visibility Readme Ease of use Tutorials Issue needs work Help sidebar Collected propaganda List issues Low hanging fruit
One great thing for documentation, cleanup, and planning would be a master feature table. This would list ***every*** feature, as well as its status: 1. concept 2. planned 3. designed 4. in-grammar 5. in-compiler 6. in-machine 7. in-editor 8. in-visualizer 9. in-documentation 10. in-tutorials 11. in-sidepanel 12. on-site
1.0
Master feature table - One great thing for documentation, cleanup, and planning would be a master feature table. This would list ***every*** feature, as well as its status: 1. concept 2. planned 3. designed 4. in-grammar 5. in-compiler 6. in-machine 7. in-editor 8. in-visualizer 9. in-documentation 10. in-tutorials 11. in-sidepanel 12. on-site
non_test
master feature table one great thing for documentation cleanup and planning would be a master feature table this would list every feature as well as its status concept planned designed in grammar in compiler in machine in editor in visualizer in documentation in tutorials in sidepanel on site
0
473,286
13,639,711,420
IssuesEvent
2020-09-25 11:30:42
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
cse.google.com - design is broken
browser-firefox engine-gecko priority-critical
<!-- @browser: Firefox 66.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0 --> <!-- @reported_with: desktop-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/58791 --> **URL**: https://cse.google.com/cse?q=what+are+the+three+aspect+of+the+nazi++germany&sa=Search&ie=UTF-8&cx=partner%2Dpub%2D6638247779433690:3873384991#%9C&gsc.tab=0&gsc.q=what%20are%20the%20three%20aspect%20of%20the%20nazi%20%20germany&gsc.page=1 **Browser / Version**: Firefox 66.0 **Operating System**: Windows 7 **Tested Another Browser**: Yes Safari **Problem type**: Design is broken **Description**: Items are overlapped **Steps to Reproduce**: <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20190204181317</li><li>channel: beta</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2020/9/a98efda1-c273-4e48-ad84-810615bab137) _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
cse.google.com - design is broken - <!-- @browser: Firefox 66.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:66.0) Gecko/20100101 Firefox/66.0 --> <!-- @reported_with: desktop-reporter --> <!-- @public_url: https://github.com/webcompat/web-bugs/issues/58791 --> **URL**: https://cse.google.com/cse?q=what+are+the+three+aspect+of+the+nazi++germany&sa=Search&ie=UTF-8&cx=partner%2Dpub%2D6638247779433690:3873384991#%9C&gsc.tab=0&gsc.q=what%20are%20the%20three%20aspect%20of%20the%20nazi%20%20germany&gsc.page=1 **Browser / Version**: Firefox 66.0 **Operating System**: Windows 7 **Tested Another Browser**: Yes Safari **Problem type**: Design is broken **Description**: Items are overlapped **Steps to Reproduce**: <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20190204181317</li><li>channel: beta</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> </details> [View console log messages](https://webcompat.com/console_logs/2020/9/a98efda1-c273-4e48-ad84-810615bab137) _From [webcompat.com](https://webcompat.com/) with ❤️_
non_test
cse google com design is broken url browser version firefox operating system windows tested another browser yes safari problem type design is broken description items are overlapped steps to reproduce browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel beta hastouchscreen false mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️
0
234,213
19,101,822,710
IssuesEvent
2021-11-29 23:53:17
libreswan/libreswan
https://api.github.com/repos/libreswan/libreswan
opened
f35: end Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0)
testing
running make kvm-fedora-base: ``` testing/libvirt/fedora/base.py \ m1fedora-base \ 192.168.234.1 /home/pool /home/libreswan/wip-misc /home/libreswan/wip-misc/testing \ sudo virt-install --connect=qemu:///system --check=path_in_use=off --graphics=none --virt-type=kvm --noreboot --console=pty,target_type=serial --vcpus=4 --memory=5120 --cpu=host-passthrough --network=network:swandefault,model=virtio --rng=type=random,device=/dev/random --security=type=static,model=dac,label='1000:107',relabel=yes --filesystem=type=mount,accessmode=squash,source=/home/libreswan/wip-misc,target=source --filesystem=type=mount,accessmode=squash,source=/home/libreswan/wip-misc/testing,target=testing --filesystem=type=mount,accessmode=squash,source=/home/pool,target=pool \ --name=m1fedora-base \ --os-variant=fedora30 \ --disk=path=/home/pool/m1fedora-base.qcow2,size=8,bus=virtio,format=qcow2 \ --location=/home/pool/Fedora-Server-dvd-x86_64-35-1.2.iso --initrd-inject=testing/libvirt/fedora/fedora.ks --extra-args="inst.ks=file:/fedora.ks console=ttyS0,115200 net.ifnames=0 biosdevname=0" ``` can sometimes lead to: ``` Starting install... Retrieving file vmlinuz... | 11 MB 00:00:00 Retrieving file initrd.img... 
| 84 MB 00:00:00 Allocating 'm1fedora-base.qcow2' | 8.0 GB 00:00:01 Running text console command: virsh --connect qemu:///system console m1fedora-base Connected to domain 'm1fedora-base' Escape character is ^] (Ctrl + ]) [ 0.506136] PCI-DMA: Using software bounce buffering for IO (SWIOTLB) [ 0.507805] software IO TLB: mapped [mem 0x0000000071000000-0x0000000075000000] (64MB) [ 0.510668] Freeing initrd memory: 85604K [ 0.512951] Initialise system trusted keyrings [ 0.513560] Key type blacklist registered [ 0.514130] workingset: timestamp_bits=36 max_order=21 bucket_order=0 [ 0.516782] zbud: loaded [ 0.517895] integrity: Platform Keyring initialized [ 0.532746] NET: Registered PF_ALG protocol family [ 0.533381] xor: automatically using best checksumming function avx [ 0.534284] Key type asymmetric registered [ 0.534814] Asymmetric key parser 'x509' registered [ 0.535468] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 245) [ 0.536491] io scheduler mq-deadline registered [ 0.537069] io scheduler kyber registered [ 0.537645] io scheduler bfq registered [ 0.538542] atomic64_test: passed for x86-64 platform with CX8 and with SSE [ 0.541906] pcieport 0000:00:01.0: PME: Signaling with IRQ 24 [ 0.542850] pcieport 0000:00:01.0: AER: enabled with IRQ 24 [ 0.545946] pcieport 0000:00:01.1: PME: Signaling with IRQ 25 [ 0.546887] pcieport 0000:00:01.1: AER: enabled with IRQ 25 [ 0.550062] pcieport 0000:00:01.2: PME: Signaling with IRQ 26 [ 0.550996] pcieport 0000:00:01.2: AER: enabled with IRQ 26 [ 0.554066] pcieport 0000:00:01.3: PME: Signaling with IRQ 27 [ 0.554995] pcieport 0000:00:01.3: AER: enabled with IRQ 27 [ 0.556782] pcieport 0000:00:01.4: PME: Signaling with IRQ 28 [ 0.557722] pcieport 0000:00:01.4: AER: enabled with IRQ 28 [ 0.560801] pcieport 0000:00:01.5: PME: Signaling with IRQ 29 [ 0.561736] pcieport 0000:00:01.5: AER: enabled with IRQ 29 [ 0.564820] pcieport 0000:00:01.6: PME: Signaling with IRQ 30 [ 0.565767] pcieport 0000:00:01.6: AER: enabled 
with IRQ 30 [ 0.569397] pcieport 0000:00:01.7: PME: Signaling with IRQ 31 [ 0.570145] pcieport 0000:00:01.7: AER: enabled with IRQ 31 [ 0.571097] ACPI: \_SB_.GSIG: Enabled at IRQ 22 [ 0.572565] pcieport 0000:00:02.0: PME: Signaling with IRQ 32 [ 0.573316] pcieport 0000:00:02.0: AER: enabled with IRQ 32 [ 0.575328] pcieport 0000:00:02.1: PME: Signaling with IRQ 33 [ 0.576087] pcieport 0000:00:02.1: AER: enabled with IRQ 33 [ 0.576890] shpchp: Standard Hot Plug PCI Controller Driver version: 0.4 [ 0.577659] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.578453] ACPI: button: Power Button [PWRF] [ 0.587276] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.588009] 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.590211] Non-volatile memory driver v1.3 [ 0.590927] Linux agpgart interface v0.103 [ 0.592252] ACPI: \_SB_.GSIA: Enabled at IRQ 16 [ 0.593095] ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode [ 0.593941] ahci 0000:00:1f.2: flags: 64bit ncq only [ 0.595332] scsi host0: ahci [ 0.595806] scsi host1: ahci [ 0.596264] scsi host2: ahci [ 0.596718] scsi host3: ahci [ 0.597159] scsi host4: ahci [ 0.597609] scsi host5: ahci [ 0.597938] ata1: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a100 irq 34 [ 0.598701] ata2: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a180 irq 34 [ 0.599457] ata3: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a200 irq 34 [ 0.599949] ata4: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a280 irq 34 [ 0.600453] ata5: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a300 irq 34 [ 0.600944] ata6: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a380 irq 34 [ 0.601525] libphy: Fixed MDIO Bus: probed [ 0.601925] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.602367] ehci-pci: EHCI PCI platform driver [ 0.602680] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.603099] ohci-pci: OHCI PCI platform driver [ 
0.603403] uhci_hcd: USB Universal Host Controller Interface driver [ 0.604188] xhci_hcd 0000:05:00.0: xHCI Host Controller [ 0.604585] xhci_hcd 0000:05:00.0: new USB bus registered, assigned bus number 1 [ 0.605190] xhci_hcd 0000:05:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 [ 0.606174] xhci_hcd 0000:05:00.0: xHCI Host Controller [ 0.606568] xhci_hcd 0000:05:00.0: new USB bus registered, assigned bus number 2 [ 0.607062] xhci_hcd 0000:05:00.0: Host supports USB 3.0 SuperSpeed [ 0.607511] usb usb1: New USB device found, idVendor=1d6b, idProduct=0002, bcdDevice= 5.14 [ 0.608055] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.608542] usb usb1: Product: xHCI Host Controller [ 0.608868] usb usb1: Manufacturer: Linux 5.14.10-300.fc35.x86_64 xhci-hcd [ 0.609324] usb usb1: SerialNumber: 0000:05:00.0 [ 0.609714] hub 1-0:1.0: USB hub found [ 0.610001] hub 1-0:1.0: 15 ports detected [ 0.611169] usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
[ 0.611723] usb usb2: New USB device found, idVendor=1d6b, idProduct=0003, bcdDevice= 5.14 [ 0.612270] usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.612751] usb usb2: Product: xHCI Host Controller [ 0.613075] usb usb2: Manufacturer: Linux 5.14.10-300.fc35.x86_64 xhci-hcd [ 0.613543] usb usb2: SerialNumber: 0000:05:00.0 [ 0.613910] hub 2-0:1.0: USB hub found [ 0.614204] hub 2-0:1.0: 15 ports detected [ 0.614762] usbcore: registered new interface driver usbserial_generic [ 0.615205] usbserial: USB Serial support registered for generic [ 0.615635] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.616582] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.616920] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.617338] mousedev: PS/2 mouse device common for all mice [ 0.617857] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.618538] rtc_cmos 00:03: RTC can wake from S4 [ 0.619279] input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4 [ 0.619968] rtc_cmos 00:03: registered as rtc0 [ 0.620353] rtc_cmos 00:03: setting system clock to 2021-11-29T23:46:15 UTC (1638229575) [ 0.620919] rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram [ 0.621413] input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3 [ 0.621997] device-mapper: uevent: version 1.0.3 [ 0.622408] device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com [ 0.623101] hid: raw HID events driver (C) Jiri Kosina [ 0.623468] usbcore: registered new interface driver usbhid [ 0.623838] usbhid: USB HID core driver [ 0.624134] drop_monitor: Initializing network drop monitor service [ 0.624609] Initializing XFRM netlink socket [ 0.624956] NET: Registered PF_INET6 protocol family [ 0.626158] Segment Routing with IPv6 [ 0.626407] RPL Segment Routing with IPv6 [ 0.626693] mip6: Mobile IPv6 [ 0.626890] NET: Registered PF_PACKET protocol family [ 0.627551] IPI 
shorthand broadcast: enabled [ 0.627841] AVX2 version of gcm_enc/dec engaged. [ 0.628215] AES CTR mode by8 optimization enabled [ 0.632260] sched_clock: Marking stable (594995080, 37088462)->(681749969, -49666427) [ 0.632921] registered taskstats version 1 [ 0.633320] Loading compiled-in X.509 certificates [ 0.634139] Loaded X.509 cert 'Fedora kernel signing key: c942454291c91bc973770746141181a1953c2bb5' [ 0.634784] zswap: loaded using pool lzo/zbud [ 0.635164] page_owner is disabled [ 0.635460] Key type ._fscrypt registered [ 0.635729] Key type .fscrypt registered [ 0.635993] Key type fscrypt-provisioning registered [ 0.636713] Btrfs loaded, crc32c=crc32c-generic, zoned=yes [ 0.637102] Key type big_key registered [ 0.637728] Key type encrypted registered [ 0.638001] ima: No TPM chip found, activating TPM-bypass! [ 0.638379] Loading compiled-in module X.509 certificates [ 0.639090] Loaded X.509 cert 'Fedora kernel signing key: c942454291c91bc973770746141181a1953c2bb5' [ 0.639704] ima: Allocated hash algorithm: sha256 [ 0.640029] ima: No architecture policies found [ 0.640352] evm: Initialising EVM extended attributes: [ 0.640696] evm: security.selinux [ 0.640918] evm: security.SMACK64 (disabled) [ 0.641209] evm: security.SMACK64EXEC (disabled) [ 0.641528] evm: security.SMACK64TRANSMUTE (disabled) [ 0.641863] evm: security.SMACK64MMAP (disabled) [ 0.642176] evm: security.apparmor (disabled) [ 0.642479] evm: security.ima [ 0.642679] evm: security.capability [ 0.642918] evm: HMAC attrs: 0x1 [ 0.643327] PM: Magic number: 13:887:808 [ 0.643683] RAS: Correctable Errors collector initialized. 
[ 0.906566] ata3: SATA link down (SStatus 0 SControl 300) [ 0.907342] ata6: SATA link down (SStatus 0 SControl 300) [ 0.908068] ata5: SATA link down (SStatus 0 SControl 300) [ 0.908791] ata2: SATA link down (SStatus 0 SControl 300) [ 0.909532] ata4: SATA link down (SStatus 0 SControl 300) [ 0.910287] ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) [ 0.911128] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.911789] ata1.00: applying bridge limits [ 0.912402] ata1.00: configured for UDMA/100 [ 0.913180] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.914397] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.915184] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.933817] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.934542] md: Waiting for all devices to be available before autodetect [ 0.935326] md: If you don't use raid, use raid=noautodetect [ 0.935966] md: Autodetecting RAID arrays. [ 0.936384] md: autorun ... [ 0.936637] md: ... autorun DONE. [ 0.937149] VFS: Cannot open root device "(null)" or unknown-block(0,0): error -6 [ 0.937815] Please append a correct "root=" boot option; here are the available partitions: [ 0.938558] 0b00 2180096 sr0 [ 0.938560] driver: sr [ 0.939112] Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0) [ 0.939835] CPU: 3 PID: 1 Comm: swapper/0 Not tainted 5.14.10-300.fc35.x86_64 #1 [ 0.940493] Hardware name: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.14.0-6.fc35 04/01/2014 [ 0.941238] Call Trace: [ 0.941470] dump_stack_lvl+0x46/0x5a [ 0.941796] panic+0xf1/0x2d3 [ 0.942065] ? printk+0x48/0x4a [ 0.942355] mount_block_root+0x299/0x313 [ 0.942723] prepare_namespace+0x13b/0x16a [ 0.943092] kernel_init_freeable+0x23e/0x24d [ 0.943483] ? 
rest_init+0xc0/0xc0 [ 0.943785] kernel_init+0x16/0x120 [ 0.944096] ret_from_fork+0x22/0x30 [ 0.944465] Kernel Offset: 0x17000000 from 0xffffffff81000000 (relocation range: 0xffffffff80000000-0xffffffffbfffffff) [ 0.945415] ---[ end Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0) ]--- ```
1.0
f35: end Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0) - running make kvm-fedora-base: ``` testing/libvirt/fedora/base.py \ m1fedora-base \ 192.168.234.1 /home/pool /home/libreswan/wip-misc /home/libreswan/wip-misc/testing \ sudo virt-install --connect=qemu:///system --check=path_in_use=off --graphics=none --virt-type=kvm --noreboot --console=pty,target_type=serial --vcpus=4 --memory=5120 --cpu=host-passthrough --network=network:swandefault,model=virtio --rng=type=random,device=/dev/random --security=type=static,model=dac,label='1000:107',relabel=yes --filesystem=type=mount,accessmode=squash,source=/home/libreswan/wip-misc,target=source --filesystem=type=mount,accessmode=squash,source=/home/libreswan/wip-misc/testing,target=testing --filesystem=type=mount,accessmode=squash,source=/home/pool,target=pool \ --name=m1fedora-base \ --os-variant=fedora30 \ --disk=path=/home/pool/m1fedora-base.qcow2,size=8,bus=virtio,format=qcow2 \ --location=/home/pool/Fedora-Server-dvd-x86_64-35-1.2.iso --initrd-inject=testing/libvirt/fedora/fedora.ks --extra-args="inst.ks=file:/fedora.ks console=ttyS0,115200 net.ifnames=0 biosdevname=0" ``` can sometimes lead to: ``` Starting install... Retrieving file vmlinuz... | 11 MB 00:00:00 Retrieving file initrd.img... 
| 84 MB 00:00:00 Allocating 'm1fedora-base.qcow2' | 8.0 GB 00:00:01 Running text console command: virsh --connect qemu:///system console m1fedora-base Connected to domain 'm1fedora-base' Escape character is ^] (Ctrl + ]) [ 0.506136] PCI-DMA: Using software bounce buffering for IO (SWIOTLB) [ 0.507805] software IO TLB: mapped [mem 0x0000000071000000-0x0000000075000000] (64MB) [ 0.510668] Freeing initrd memory: 85604K [ 0.512951] Initialise system trusted keyrings [ 0.513560] Key type blacklist registered [ 0.514130] workingset: timestamp_bits=36 max_order=21 bucket_order=0 [ 0.516782] zbud: loaded [ 0.517895] integrity: Platform Keyring initialized [ 0.532746] NET: Registered PF_ALG protocol family [ 0.533381] xor: automatically using best checksumming function avx [ 0.534284] Key type asymmetric registered [ 0.534814] Asymmetric key parser 'x509' registered [ 0.535468] Block layer SCSI generic (bsg) driver version 0.4 loaded (major 245) [ 0.536491] io scheduler mq-deadline registered [ 0.537069] io scheduler kyber registered [ 0.537645] io scheduler bfq registered [ 0.538542] atomic64_test: passed for x86-64 platform with CX8 and with SSE [ 0.541906] pcieport 0000:00:01.0: PME: Signaling with IRQ 24 [ 0.542850] pcieport 0000:00:01.0: AER: enabled with IRQ 24 [ 0.545946] pcieport 0000:00:01.1: PME: Signaling with IRQ 25 [ 0.546887] pcieport 0000:00:01.1: AER: enabled with IRQ 25 [ 0.550062] pcieport 0000:00:01.2: PME: Signaling with IRQ 26 [ 0.550996] pcieport 0000:00:01.2: AER: enabled with IRQ 26 [ 0.554066] pcieport 0000:00:01.3: PME: Signaling with IRQ 27 [ 0.554995] pcieport 0000:00:01.3: AER: enabled with IRQ 27 [ 0.556782] pcieport 0000:00:01.4: PME: Signaling with IRQ 28 [ 0.557722] pcieport 0000:00:01.4: AER: enabled with IRQ 28 [ 0.560801] pcieport 0000:00:01.5: PME: Signaling with IRQ 29 [ 0.561736] pcieport 0000:00:01.5: AER: enabled with IRQ 29 [ 0.564820] pcieport 0000:00:01.6: PME: Signaling with IRQ 30 [ 0.565767] pcieport 0000:00:01.6: AER: enabled 
with IRQ 30 [ 0.569397] pcieport 0000:00:01.7: PME: Signaling with IRQ 31 [ 0.570145] pcieport 0000:00:01.7: AER: enabled with IRQ 31 [ 0.571097] ACPI: \_SB_.GSIG: Enabled at IRQ 22 [ 0.572565] pcieport 0000:00:02.0: PME: Signaling with IRQ 32 [ 0.573316] pcieport 0000:00:02.0: AER: enabled with IRQ 32 [ 0.575328] pcieport 0000:00:02.1: PME: Signaling with IRQ 33 [ 0.576087] pcieport 0000:00:02.1: AER: enabled with IRQ 33 [ 0.576890] shpchp: Standard Hot Plug PCI Controller Driver version: 0.4 [ 0.577659] input: Power Button as /devices/LNXSYSTM:00/LNXPWRBN:00/input/input0 [ 0.578453] ACPI: button: Power Button [PWRF] [ 0.587276] Serial: 8250/16550 driver, 32 ports, IRQ sharing enabled [ 0.588009] 00:00: ttyS0 at I/O 0x3f8 (irq = 4, base_baud = 115200) is a 16550A [ 0.590211] Non-volatile memory driver v1.3 [ 0.590927] Linux agpgart interface v0.103 [ 0.592252] ACPI: \_SB_.GSIA: Enabled at IRQ 16 [ 0.593095] ahci 0000:00:1f.2: AHCI 0001.0000 32 slots 6 ports 1.5 Gbps 0x3f impl SATA mode [ 0.593941] ahci 0000:00:1f.2: flags: 64bit ncq only [ 0.595332] scsi host0: ahci [ 0.595806] scsi host1: ahci [ 0.596264] scsi host2: ahci [ 0.596718] scsi host3: ahci [ 0.597159] scsi host4: ahci [ 0.597609] scsi host5: ahci [ 0.597938] ata1: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a100 irq 34 [ 0.598701] ata2: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a180 irq 34 [ 0.599457] ata3: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a200 irq 34 [ 0.599949] ata4: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a280 irq 34 [ 0.600453] ata5: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a300 irq 34 [ 0.600944] ata6: SATA max UDMA/133 abar m4096@0xfd60a000 port 0xfd60a380 irq 34 [ 0.601525] libphy: Fixed MDIO Bus: probed [ 0.601925] ehci_hcd: USB 2.0 'Enhanced' Host Controller (EHCI) Driver [ 0.602367] ehci-pci: EHCI PCI platform driver [ 0.602680] ohci_hcd: USB 1.1 'Open' Host Controller (OHCI) Driver [ 0.603099] ohci-pci: OHCI PCI platform driver [ 
0.603403] uhci_hcd: USB Universal Host Controller Interface driver [ 0.604188] xhci_hcd 0000:05:00.0: xHCI Host Controller [ 0.604585] xhci_hcd 0000:05:00.0: new USB bus registered, assigned bus number 1 [ 0.605190] xhci_hcd 0000:05:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 [ 0.606174] xhci_hcd 0000:05:00.0: xHCI Host Controller [ 0.606568] xhci_hcd 0000:05:00.0: new USB bus registered, assigned bus number 2 [ 0.607062] xhci_hcd 0000:05:00.0: Host supports USB 3.0 SuperSpeed [ 0.607511] usb usb1: New USB device found, idVendor=1d6b, idProduct=0002, bcdDevice= 5.14 [ 0.608055] usb usb1: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.608542] usb usb1: Product: xHCI Host Controller [ 0.608868] usb usb1: Manufacturer: Linux 5.14.10-300.fc35.x86_64 xhci-hcd [ 0.609324] usb usb1: SerialNumber: 0000:05:00.0 [ 0.609714] hub 1-0:1.0: USB hub found [ 0.610001] hub 1-0:1.0: 15 ports detected [ 0.611169] usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
[ 0.611723] usb usb2: New USB device found, idVendor=1d6b, idProduct=0003, bcdDevice= 5.14 [ 0.612270] usb usb2: New USB device strings: Mfr=3, Product=2, SerialNumber=1 [ 0.612751] usb usb2: Product: xHCI Host Controller [ 0.613075] usb usb2: Manufacturer: Linux 5.14.10-300.fc35.x86_64 xhci-hcd [ 0.613543] usb usb2: SerialNumber: 0000:05:00.0 [ 0.613910] hub 2-0:1.0: USB hub found [ 0.614204] hub 2-0:1.0: 15 ports detected [ 0.614762] usbcore: registered new interface driver usbserial_generic [ 0.615205] usbserial: USB Serial support registered for generic [ 0.615635] i8042: PNP: PS/2 Controller [PNP0303:KBD,PNP0f13:MOU] at 0x60,0x64 irq 1,12 [ 0.616582] serio: i8042 KBD port at 0x60,0x64 irq 1 [ 0.616920] serio: i8042 AUX port at 0x60,0x64 irq 12 [ 0.617338] mousedev: PS/2 mouse device common for all mice [ 0.617857] input: AT Translated Set 2 keyboard as /devices/platform/i8042/serio0/input/input1 [ 0.618538] rtc_cmos 00:03: RTC can wake from S4 [ 0.619279] input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input4 [ 0.619968] rtc_cmos 00:03: registered as rtc0 [ 0.620353] rtc_cmos 00:03: setting system clock to 2021-11-29T23:46:15 UTC (1638229575) [ 0.620919] rtc_cmos 00:03: alarms up to one day, y3k, 242 bytes nvram [ 0.621413] input: VirtualPS/2 VMware VMMouse as /devices/platform/i8042/serio1/input/input3 [ 0.621997] device-mapper: uevent: version 1.0.3 [ 0.622408] device-mapper: ioctl: 4.45.0-ioctl (2021-03-22) initialised: dm-devel@redhat.com [ 0.623101] hid: raw HID events driver (C) Jiri Kosina [ 0.623468] usbcore: registered new interface driver usbhid [ 0.623838] usbhid: USB HID core driver [ 0.624134] drop_monitor: Initializing network drop monitor service [ 0.624609] Initializing XFRM netlink socket [ 0.624956] NET: Registered PF_INET6 protocol family [ 0.626158] Segment Routing with IPv6 [ 0.626407] RPL Segment Routing with IPv6 [ 0.626693] mip6: Mobile IPv6 [ 0.626890] NET: Registered PF_PACKET protocol family [ 0.627551] IPI 
shorthand broadcast: enabled [ 0.627841] AVX2 version of gcm_enc/dec engaged. [ 0.628215] AES CTR mode by8 optimization enabled [ 0.632260] sched_clock: Marking stable (594995080, 37088462)->(681749969, -49666427) [ 0.632921] registered taskstats version 1 [ 0.633320] Loading compiled-in X.509 certificates [ 0.634139] Loaded X.509 cert 'Fedora kernel signing key: c942454291c91bc973770746141181a1953c2bb5' [ 0.634784] zswap: loaded using pool lzo/zbud [ 0.635164] page_owner is disabled [ 0.635460] Key type ._fscrypt registered [ 0.635729] Key type .fscrypt registered [ 0.635993] Key type fscrypt-provisioning registered [ 0.636713] Btrfs loaded, crc32c=crc32c-generic, zoned=yes [ 0.637102] Key type big_key registered [ 0.637728] Key type encrypted registered [ 0.638001] ima: No TPM chip found, activating TPM-bypass! [ 0.638379] Loading compiled-in module X.509 certificates [ 0.639090] Loaded X.509 cert 'Fedora kernel signing key: c942454291c91bc973770746141181a1953c2bb5' [ 0.639704] ima: Allocated hash algorithm: sha256 [ 0.640029] ima: No architecture policies found [ 0.640352] evm: Initialising EVM extended attributes: [ 0.640696] evm: security.selinux [ 0.640918] evm: security.SMACK64 (disabled) [ 0.641209] evm: security.SMACK64EXEC (disabled) [ 0.641528] evm: security.SMACK64TRANSMUTE (disabled) [ 0.641863] evm: security.SMACK64MMAP (disabled) [ 0.642176] evm: security.apparmor (disabled) [ 0.642479] evm: security.ima [ 0.642679] evm: security.capability [ 0.642918] evm: HMAC attrs: 0x1 [ 0.643327] PM: Magic number: 13:887:808 [ 0.643683] RAS: Correctable Errors collector initialized. 
[ 0.906566] ata3: SATA link down (SStatus 0 SControl 300) [ 0.907342] ata6: SATA link down (SStatus 0 SControl 300) [ 0.908068] ata5: SATA link down (SStatus 0 SControl 300) [ 0.908791] ata2: SATA link down (SStatus 0 SControl 300) [ 0.909532] ata4: SATA link down (SStatus 0 SControl 300) [ 0.910287] ata1: SATA link up 1.5 Gbps (SStatus 113 SControl 300) [ 0.911128] ata1.00: ATAPI: QEMU DVD-ROM, 2.5+, max UDMA/100 [ 0.911789] ata1.00: applying bridge limits [ 0.912402] ata1.00: configured for UDMA/100 [ 0.913180] scsi 0:0:0:0: CD-ROM QEMU QEMU DVD-ROM 2.5+ PQ: 0 ANSI: 5 [ 0.914397] sr 0:0:0:0: [sr0] scsi3-mmc drive: 4x/4x cd/rw xa/form2 tray [ 0.915184] cdrom: Uniform CD-ROM driver Revision: 3.20 [ 0.933817] sr 0:0:0:0: Attached scsi generic sg0 type 5 [ 0.934542] md: Waiting for all devices to be available before autodetect [ 0.935326] md: If you don't use raid, use raid=noautodetect [ 0.935966] md: Autodetecting RAID arrays. [ 0.936384] md: autorun ... [ 0.936637] md: ... autorun DONE. [ 0.937149] VFS: Cannot open root device "(null)" or unknown-block(0,0): error -6 [ 0.937815] Please append a correct "root=" boot option; here are the available partitions: [ 0.938558] 0b00 2180096 sr0 [ 0.938560] driver: sr [ 0.939112] Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0) [ 0.939835] CPU: 3 PID: 1 Comm: swapper/0 Not tainted 5.14.10-300.fc35.x86_64 #1 [ 0.940493] Hardware name: QEMU Standard PC (Q35 + ICH9, 2009), BIOS 1.14.0-6.fc35 04/01/2014 [ 0.941238] Call Trace: [ 0.941470] dump_stack_lvl+0x46/0x5a [ 0.941796] panic+0xf1/0x2d3 [ 0.942065] ? printk+0x48/0x4a [ 0.942355] mount_block_root+0x299/0x313 [ 0.942723] prepare_namespace+0x13b/0x16a [ 0.943092] kernel_init_freeable+0x23e/0x24d [ 0.943483] ? 
rest_init+0xc0/0xc0 [ 0.943785] kernel_init+0x16/0x120 [ 0.944096] ret_from_fork+0x22/0x30 [ 0.944465] Kernel Offset: 0x17000000 from 0xffffffff81000000 (relocation range: 0xffffffff80000000-0xffffffffbfffffff) [ 0.945415] ---[ end Kernel panic - not syncing: VFS: Unable to mount root fs on unknown-block(0,0) ]--- ```
test
end kernel panic not syncing vfs unable to mount root fs on unknown block running make kvm fedora base testing libvirt fedora base py base home pool home libreswan wip misc home libreswan wip misc testing sudo virt install connect qemu system check path in use off graphics none virt type kvm noreboot console pty target type serial vcpus memory cpu host passthrough network network swandefault model virtio rng type random device dev random security type static model dac label relabel yes filesystem type mount accessmode squash source home libreswan wip misc target source filesystem type mount accessmode squash source home libreswan wip misc testing target testing filesystem type mount accessmode squash source home pool target pool name base os variant disk path home pool base size bus virtio format location home pool fedora server dvd iso initrd inject testing libvirt fedora fedora ks extra args inst ks file fedora ks console net ifnames biosdevname can sometimes lead to starting install retrieving file vmlinuz mb retrieving file initrd img mb allocating base gb running text console command virsh connect qemu system console base connected to domain base escape character is ctrl pci dma using software bounce buffering for io swiotlb software io tlb mapped freeing initrd memory initialise system trusted keyrings key type blacklist registered workingset timestamp bits max order bucket order zbud loaded integrity platform keyring initialized net registered pf alg protocol family xor automatically using best checksumming function avx key type asymmetric registered asymmetric key parser registered block layer scsi generic bsg driver version loaded major io scheduler mq deadline registered io scheduler kyber registered io scheduler bfq registered test passed for platform with and with sse pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq 
pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq acpi sb gsig enabled at irq pcieport pme signaling with irq pcieport aer enabled with irq pcieport pme signaling with irq pcieport aer enabled with irq shpchp standard hot plug pci controller driver version input power button as devices lnxsystm lnxpwrbn input acpi button power button serial driver ports irq sharing enabled at i o irq base baud is a non volatile memory driver linux agpgart interface acpi sb gsia enabled at irq ahci ahci slots ports gbps impl sata mode ahci flags ncq only scsi ahci scsi ahci scsi ahci scsi ahci scsi ahci scsi ahci sata max udma abar port irq sata max udma abar port irq sata max udma abar port irq sata max udma abar port irq sata max udma abar port irq sata max udma abar port irq libphy fixed mdio bus probed ehci hcd usb enhanced host controller ehci driver ehci pci ehci pci platform driver ohci hcd usb open host controller ohci driver ohci pci ohci pci platform driver uhci hcd usb universal host controller interface driver xhci hcd xhci host controller xhci hcd new usb bus registered assigned bus number xhci hcd hcc params hci version quirks xhci hcd xhci host controller xhci hcd new usb bus registered assigned bus number xhci hcd host supports usb superspeed usb new usb device found idvendor idproduct bcddevice usb new usb device strings mfr product serialnumber usb product xhci host controller usb manufacturer linux xhci hcd usb serialnumber hub usb hub found hub ports detected usb we don t know the algorithms for lpm for this host disabling lpm usb new usb device found idvendor idproduct bcddevice usb new usb device strings mfr product serialnumber usb product xhci host controller usb manufacturer linux xhci hcd usb 
serialnumber hub usb hub found hub ports detected usbcore registered new interface driver usbserial generic usbserial usb serial support registered for generic pnp ps controller at irq serio kbd port at irq serio aux port at irq mousedev ps mouse device common for all mice input at translated set keyboard as devices platform input rtc cmos rtc can wake from input virtualps vmware vmmouse as devices platform input rtc cmos registered as rtc cmos setting system clock to utc rtc cmos alarms up to one day bytes nvram input virtualps vmware vmmouse as devices platform input device mapper uevent version device mapper ioctl ioctl initialised dm devel redhat com hid raw hid events driver c jiri kosina usbcore registered new interface driver usbhid usbhid usb hid core driver drop monitor initializing network drop monitor service initializing xfrm netlink socket net registered pf protocol family segment routing with rpl segment routing with mobile net registered pf packet protocol family ipi shorthand broadcast enabled version of gcm enc dec engaged aes ctr mode optimization enabled sched clock marking stable registered taskstats version loading compiled in x certificates loaded x cert fedora kernel signing key zswap loaded using pool lzo zbud page owner is disabled key type fscrypt registered key type fscrypt registered key type fscrypt provisioning registered btrfs loaded generic zoned yes key type big key registered key type encrypted registered ima no tpm chip found activating tpm bypass loading compiled in module x certificates loaded x cert fedora kernel signing key ima allocated hash algorithm ima no architecture policies found evm initialising evm extended attributes evm security selinux evm security disabled evm security disabled evm security disabled evm security disabled evm security apparmor disabled evm security ima evm security capability evm hmac attrs pm magic number ras correctable errors collector initialized sata link down sstatus scontrol sata link down 
sstatus scontrol sata link down sstatus scontrol sata link down sstatus scontrol sata link down sstatus scontrol sata link up gbps sstatus scontrol atapi qemu dvd rom max udma applying bridge limits configured for udma scsi cd rom qemu qemu dvd rom pq ansi sr mmc drive cd rw xa tray cdrom uniform cd rom driver revision sr attached scsi generic type md waiting for all devices to be available before autodetect md if you don t use raid use raid noautodetect md autodetecting raid arrays md autorun md autorun done vfs cannot open root device null or unknown block error please append a correct root boot option here are the available partitions driver sr kernel panic not syncing vfs unable to mount root fs on unknown block cpu pid comm swapper not tainted hardware name qemu standard pc bios call trace dump stack lvl panic printk mount block root prepare namespace kernel init freeable rest init kernel init ret from fork kernel offset from relocation range
1
318,598
27,316,853,371
IssuesEvent
2023-02-24 16:21:08
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
roachtest: replicagc-changed-peers/restart=false failed
C-bug C-test-failure O-robot X-duplicate A-testing O-roachtest branch-master T-kv
roachtest.replicagc-changed-peers/restart=false [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/8803571?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/8803571?buildTab=artifacts#/replicagc-changed-peers/restart=false) on master @ [e028ce5b14505dfd17ef8b13001c0ab8ac811e3c](https://github.com/cockroachdb/cockroach/commits/e028ce5b14505dfd17ef8b13001c0ab8ac811e3c): ``` test artifacts and logs in: /artifacts/replicagc-changed-peers/restart=false/run_1 (cluster.go:1883).Start: COMMAND_PROBLEM: ssh verbose log retained in ssh_112059.361445944_n1_COCKROACHCONNECTTIME.log: exit status 1 ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_encrypted=false</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/replication <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*replicagc-changed-peers/restart=false.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-24760
3.0
roachtest: replicagc-changed-peers/restart=false failed - roachtest.replicagc-changed-peers/restart=false [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/8803571?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/8803571?buildTab=artifacts#/replicagc-changed-peers/restart=false) on master @ [e028ce5b14505dfd17ef8b13001c0ab8ac811e3c](https://github.com/cockroachdb/cockroach/commits/e028ce5b14505dfd17ef8b13001c0ab8ac811e3c): ``` test artifacts and logs in: /artifacts/replicagc-changed-peers/restart=false/run_1 (cluster.go:1883).Start: COMMAND_PROBLEM: ssh verbose log retained in ssh_112059.361445944_n1_COCKROACHCONNECTTIME.log: exit status 1 ``` <p>Parameters: <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_encrypted=false</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/replication <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*replicagc-changed-peers/restart=false.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> Jira issue: CRDB-24760
test
roachtest replicagc changed peers restart false failed roachtest replicagc changed peers restart false with on master test artifacts and logs in artifacts replicagc changed peers restart false run cluster go start command problem ssh verbose log retained in ssh cockroachconnecttime log exit status parameters roachtest cloud gce roachtest cpu roachtest encrypted false roachtest ssd help see see cc cockroachdb replication jira issue crdb
1
58,502
6,606,642,158
IssuesEvent
2017-09-19 01:36:53
airbnb/enzyme
https://api.github.com/repos/airbnb/enzyme
opened
test organization in a lerna repo
Tests
From #1101 As many tests as possible should live inside individual package dirs - cd-ing into each dir and running npm install && npm test should provide as close to 100% coverage of that dir as possible. It's obviously still fine if there are some tests that span multiple package dirs; these should be in a top-level tests dir (and not inside any package).
1.0
test organization in a lerna repo - From #1101 As many tests as possible should live inside individual package dirs - cd-ing into each dir and running npm install && npm test should provide as close to 100% coverage of that dir as possible. It's obviously still fine if there are some tests that span multiple package dirs; these should be in a top-level tests dir (and not inside any package).
test
test organization in a lerna repo from as many tests as possible should live inside individual package dirs cd ing into each dir and running npm install npm test should provide as close to coverage of that dir as possible it s obviously still fine if there are some tests that span multiple package dirs these should be in a top level tests dir and not inside any package
1
113,812
4,571,542,272
IssuesEvent
2016-09-16 00:20:11
SLC3/MARS
https://api.github.com/repos/SLC3/MARS
closed
Catch out cheating
enhancement High Priority
We'll need to add an option to log client IP's and provide presenters with a list of audience members using non-organisation IP's (e.g. Non Monash WiFi IP's). This list will then need to be displayed to presenters so that they can verify these audience members are actually in the room - not just completing the quiz from home or through a script. Audience members will also need to be notified on their phones (upon answering their first question) that they are required to verify their attendance after the presentation.
1.0
Catch out cheating - We'll need to add an option to log client IP's and provide presenters with a list of audience members using non-organisation IP's (e.g. Non Monash WiFi IP's). This list will then need to be displayed to presenters so that they can verify these audience members are actually in the room - not just completing the quiz from home or through a script. Audience members will also need to be notified on their phones (upon answering their first question) that they are required to verify their attendance after the presentation.
non_test
catch out cheating we ll need to add an option to log client ip s and provide presenters with a list of audience members using non organisation ip s e g non monash wifi ip s this list will then need to be displayed to presenters so that they can verify these audience members are actually in the room not just completing the quiz from home or through a script audience members will also need to be notified on their phones upon answering their first question that they are required to verify their attendance after the presentation
0
98,349
8,675,492,788
IssuesEvent
2018-11-30 11:02:40
shahkhan40/shantestrep
https://api.github.com/repos/shahkhan40/shantestrep
reopened
fxscantest : ApiV1DataRecordsGetQueryParamPageNegativeNumber
fxscantest
Project : fxscantest Job : uatenv Env : uatenv Region : US_WEST Result : fail Status Code : 404 Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=MThlNTJiNmMtYTQ2Yy00YjI1LWI0OWUtNWUzNDhmMWRhZDI0; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Fri, 30 Nov 2018 10:16:00 GMT]} Endpoint : http://13.56.210.25/api/v1/api/v1/data-records?page=-1 Request : Response : { "timestamp" : "2018-11-30T10:16:00.572+0000", "status" : 404, "error" : "Not Found", "message" : "No message available", "path" : "/api/v1/api/v1/data-records" } Logs : Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed] --- FX Bot ---
1.0
fxscantest : ApiV1DataRecordsGetQueryParamPageNegativeNumber - Project : fxscantest Job : uatenv Env : uatenv Region : US_WEST Result : fail Status Code : 404 Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=MThlNTJiNmMtYTQ2Yy00YjI1LWI0OWUtNWUzNDhmMWRhZDI0; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Fri, 30 Nov 2018 10:16:00 GMT]} Endpoint : http://13.56.210.25/api/v1/api/v1/data-records?page=-1 Request : Response : { "timestamp" : "2018-11-30T10:16:00.572+0000", "status" : 404, "error" : "Not Found", "message" : "No message available", "path" : "/api/v1/api/v1/data-records" } Logs : Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed] --- FX Bot ---
test
fxscantest project fxscantest job uatenv env uatenv region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date endpoint request response timestamp status error not found message no message available path api api data records logs assertion resolved to result assertion resolved to result fx bot
1
519,842
15,058,036,207
IssuesEvent
2021-02-03 22:41:45
dietterc/SEO-ker
https://api.github.com/repos/dietterc/SEO-ker
opened
Feature 4: Integrate with Google Trends API
feature feature 4 high priority
The game relies on counting google searches, so it should be referring back to historic search data.
1.0
Feature 4: Integrate with Google Trends API - The game relies on counting google searches, so it should be referring back to historic search data.
non_test
feature integrate with google trends api the game relies on counting google searches so it should be referring back to historic search data
0
84,692
10,553,995,234
IssuesEvent
2019-10-03 18:27:38
legesher/legesher
https://api.github.com/repos/legesher/legesher
closed
style: add linting
Hacktoberfest Language: Python Opportunity: Design Opportunity: Technical Beginner Priority: Low Status: Completed Type: Enhancement Type: Maintenance
## Linting Being able to have formatted code that is aligned with the style guide will be important in the future. However, this might take additional time to make sure the linters apply to "Legesher" grammar and not just "Python" or "Javascript" grammar. This quote from [stack overflow](https://stackoverflow.com/questions/48363647/editorconfig-vs-eslint-vs-prettier-is-it-worthwhile-to-use-them-all) gives a good explanation of the different attributes we will have to touch on: >**EditorConfig**: This helps your editor produce code that looks like your style guide as you go. While this isn't strictly necessary in order to achieve your goals, it's nice if you're always looking at code that follows the same coding styles. Otherwise if you don't have EditorConfig, as you're typing your editor will auto-format differently to the rest of the code base, which is confusing. Of course if you've set up prettier it'll fix it before it goes into your code base, but still, why would you want to look at it in one format while you're writing it and then have it switch when you go to commit? Might as well be consistent. > >**Prettier**: Automatically formats your code. I like to set it up to do this when I stage my files for a commit, so that it's physically impossible for me to commit code that doesn't match my style guide. > >**ESLint**: So why would you want a linter too? Because ESLint does more than just style. It picks up when you declare variables you don't use, or reference things that aren't defined, amongst a few other niceties. So while its role diminishes somewhat compared to the days before prettier, it's still useful to have in a project to catch the other errors. This will eventually need a larger discussion of adopting best coding practices per programming language.
1.0
style: add linting - ## Linting Being able to have formatted code that is aligned with the style guide will be important in the future. However, this might take additional time to make sure the linters apply to "Legesher" grammar and not just "Python" or "Javascript" grammar. This quote from [stack overflow](https://stackoverflow.com/questions/48363647/editorconfig-vs-eslint-vs-prettier-is-it-worthwhile-to-use-them-all) gives a good explanation of the different attributes we will have to touch on: >**EditorConfig**: This helps your editor produce code that looks like your style guide as you go. While this isn't strictly necessary in order to achieve your goals, it's nice if you're always looking at code that follows the same coding styles. Otherwise if you don't have EditorConfig, as you're typing your editor will auto-format differently to the rest of the code base, which is confusing. Of course if you've set up prettier it'll fix it before it goes into your code base, but still, why would you want to look at it in one format while you're writing it and then have it switch when you go to commit? Might as well be consistent. > >**Prettier**: Automatically formats your code. I like to set it up to do this when I stage my files for a commit, so that it's physically impossible for me to commit code that doesn't match my style guide. > >**ESLint**: So why would you want a linter too? Because ESLint does more than just style. It picks up when you declare variables you don't use, or reference things that aren't defined, amongst a few other niceties. So while its role diminishes somewhat compared to the days before prettier, it's still useful to have in a project to catch the other errors. This will eventually need a larger discussion of adopting best coding practices per programming language.
non_test
style add linting linting being able to have formatted code that is aligned with the style guide will be important in the future however this might take additional time to make sure the linters apply to legesher grammar and not just python or javascript grammar this quote from gives a good explanation of the different attributes we will have to touch on editorconfig this helps your editor produce code that looks like your style guide as you go while this isn t strictly necessary in order to achieve your goals it s nice if you re always looking at code that follows the same coding styles otherwise if you don t have editorconfig as you re typing your editor will auto format differently to the rest of the code base which is confusing of course if you ve set up prettier it ll fix it before it goes into your code base but still why would you want to look at it in one format while you re writing it and then have it switch when you go to commit might as well be consistent prettier automatically formats your code i like to set it up to do this when i stage my files for a commit so that it s physically impossible for me to commit code that doesn t match my style guide eslint so why would you want a linter too because eslint does more than just style it picks up when you declare variables you don t use or reference things that aren t defined amongst a few other niceties so while its role diminishes somewhat compared to the days before prettier it s still useful to have in a project to catch the other errors this will eventually need a larger discussion of adopting best coding practices per programming language
0
20,641
10,863,116,520
IssuesEvent
2019-11-14 14:34:35
scalableminds/webknossos
https://api.github.com/repos/scalableminds/webknossos
closed
compress volume tracing data in fossildb?
backend discussion performance
generally, we should investigate why fossil is so slow with volume data
True
compress volume tracing data in fossildb? - generally, we should investigate why fossil is so slow with volume data
non_test
compress volume tracing data in fossildb generally we should investigate why fossil is so slow with volume data
0
22,230
3,945,891,610
IssuesEvent
2016-04-28 00:27:28
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
dns_internal and dns_search_internal values in instance table is sometimes not populated for LB instances.
kind/bug status/resolved status/to-test
Rancher Version:v1.0.1-rc2 On an upgraded rancher-server setup (from v.0.63.1) , the upgraded LB instances have dns_internal and dns_search_internal values populated in the instance table. But new instances that get created , do not have dns_internal and dns_search_internal values populated. LB services are functional in both these cases ``` mysql> select id,name,state,dns_internal, dns_search_internal,service_index_id from instance where name like "%lb%"; +----+--------------------+---------+-----------------+--------------------------------------------------------------------------+------------------+ | id | name | state | dns_internal | dns_search_internal | service_index_id | +----+--------------------+---------+-----------------+--------------------------------------------------------------------------+------------------+ | 26 | dns2_LBCROSSREAL_1 | purged | 169.254.169.250 | dns2.rancher.internal,lbcrossreal.dns2.rancher.internal | 26 | | 41 | dns1_LB-CROSS_1 | running | 169.254.169.250 | dns1.rancher.internal,lb-cross.dns1.rancher.internal | 38 | | 42 | testupg_lb-2_1 | purged | 169.254.169.250 | testupg.rancher.internal,lb-2.testupg.rancher.internal | 39 | | 43 | testupg_lb-1_1 | running | 169.254.169.250 | testupg.rancher.internal,lb-1.testupg.rancher.internal | 40 | | 44 | testupg_lb-1_2 | running | 169.254.169.250 | testupg.rancher.internal,lb-1.testupg.rancher.internal | 41 | | 45 | testupg_lb-1_3 | running | 169.254.169.250 | testupg.rancher.internal,lb-1.testupg.rancher.internal | 42 | | 47 | dns2_LBCROSSREAL_1 | running | 169.254.169.250 | dns2.rancher.internal,lbcrossreal.dns2.rancher.internal,rancher.internal | 26 | | 51 | testupg_lb-2_1 | running | NULL | testupg.rancher.internal,lb-2.testupg.rancher.internal | 39 | | 52 | testupg_lb-2_2 | running | NULL | NULL | 47 | +----+--------------------+---------+-----------------+--------------------------------------------------------------------------+------------------+ ```
1.0
dns_internal and dns_search_internal values in instance table is sometimes not populated for LB instances. - Rancher Version:v1.0.1-rc2 On an upgraded rancher-server setup (from v.0.63.1) , the upgraded LB instances have dns_internal and dns_search_internal values populated in the instance table. But new instances that get created , do not have dns_internal and dns_search_internal values populated. LB services are functional in both these cases ``` mysql> select id,name,state,dns_internal, dns_search_internal,service_index_id from instance where name like "%lb%"; +----+--------------------+---------+-----------------+--------------------------------------------------------------------------+------------------+ | id | name | state | dns_internal | dns_search_internal | service_index_id | +----+--------------------+---------+-----------------+--------------------------------------------------------------------------+------------------+ | 26 | dns2_LBCROSSREAL_1 | purged | 169.254.169.250 | dns2.rancher.internal,lbcrossreal.dns2.rancher.internal | 26 | | 41 | dns1_LB-CROSS_1 | running | 169.254.169.250 | dns1.rancher.internal,lb-cross.dns1.rancher.internal | 38 | | 42 | testupg_lb-2_1 | purged | 169.254.169.250 | testupg.rancher.internal,lb-2.testupg.rancher.internal | 39 | | 43 | testupg_lb-1_1 | running | 169.254.169.250 | testupg.rancher.internal,lb-1.testupg.rancher.internal | 40 | | 44 | testupg_lb-1_2 | running | 169.254.169.250 | testupg.rancher.internal,lb-1.testupg.rancher.internal | 41 | | 45 | testupg_lb-1_3 | running | 169.254.169.250 | testupg.rancher.internal,lb-1.testupg.rancher.internal | 42 | | 47 | dns2_LBCROSSREAL_1 | running | 169.254.169.250 | dns2.rancher.internal,lbcrossreal.dns2.rancher.internal,rancher.internal | 26 | | 51 | testupg_lb-2_1 | running | NULL | testupg.rancher.internal,lb-2.testupg.rancher.internal | 39 | | 52 | testupg_lb-2_2 | running | NULL | NULL | 47 | +----+--------------------+---------+-----------------+--------------------------------------------------------------------------+------------------+ ```
test
dns internal and dns search internal values in instance table is sometimes not populated for lb instances rancher version on an upgraded rancher server setup from v the upgraded lb instances have dns internal and dns search internal values populated in the instance table but new instances that get created do not have dns internal and dns search internal values populated lb services are functional in both these cases mysql select id name state dns internal dns search internal service index id from instance where name like lb id name state dns internal dns search internal service index id lbcrossreal purged rancher internal lbcrossreal rancher internal lb cross running rancher internal lb cross rancher internal testupg lb purged testupg rancher internal lb testupg rancher internal testupg lb running testupg rancher internal lb testupg rancher internal testupg lb running testupg rancher internal lb testupg rancher internal testupg lb running testupg rancher internal lb testupg rancher internal lbcrossreal running rancher internal lbcrossreal rancher internal rancher internal testupg lb running null testupg rancher internal lb testupg rancher internal testupg lb running null null
1
246,216
20,829,277,048
IssuesEvent
2022-03-19 06:20:43
apache/incubator-kyuubi
https://api.github.com/repos/apache/incubator-kyuubi
opened
[Umbrella] Manage test failures with kyuubi spark nightly build
help wanted good first issue kind:umbrella kind:test priority:major module:spark
### Code of Conduct - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct) ### Search before asking - [X] I have searched in the [issues](https://github.com/apache/incubator-kyuubi/issues) and found no similar issues. ### Describe the proposal We have nightly Github action[1] to check compatibility with the Spark master branch as some breaking changes might be introduced upstream. It's a good way for new contributors and those who want to get familiar with Kyuubi and Spark. Please help us to fix those test failures, compile failures, etc. [1] https://github.com/apache/incubator-kyuubi/actions/workflows/nightly.yml ### Task list - [ ] ### Are you willing to submit PR? - [ ] Yes I am willing to submit a PR!
1.0
[Umbrella] Manage test failures with kyuubi spark nightly build - ### Code of Conduct - [X] I agree to follow this project's [Code of Conduct](https://www.apache.org/foundation/policies/conduct) ### Search before asking - [X] I have searched in the [issues](https://github.com/apache/incubator-kyuubi/issues) and found no similar issues. ### Describe the proposal We have nightly Github action[1] to check compatibility with the Spark master branch as some breaking changes might be introduced upstream. It's a good way for new contributors and those who want to get familiar with Kyuubi and Spark. Please help us to fix those test failures, compile failures, etc. [1] https://github.com/apache/incubator-kyuubi/actions/workflows/nightly.yml ### Task list - [ ] ### Are you willing to submit PR? - [ ] Yes I am willing to submit a PR!
test
manage test failures with kyuubi spark nightly build code of conduct i agree to follow this project s search before asking i have searched in the and found no similar issues describe the proposal we have nightly github action to check compatibility with the spark master branch as some breaking changes might be introduced upstream it s a good way for new contributors and those who want to get familiar with kyuubi and spark please help us to fix those test failures compile failures etc task list are you willing to submit pr yes i am willing to submit a pr
1
287,308
31,834,293,611
IssuesEvent
2023-09-14 12:38:58
Trinadh465/linux-4.1.15_CVE-2022-3564
https://api.github.com/repos/Trinadh465/linux-4.1.15_CVE-2022-3564
closed
CVE-2018-10882 (Medium) detected in linux-stable-rtv4.1.33 - autoclosed
Mend: dependency security vulnerability
## CVE-2018-10882 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/Trinadh465/linux-4.1.15_CVE-2022-3564/commit/3e3e73fc07b87b414cc43712be62ebb4ac7de2e9">3e3e73fc07b87b414cc43712be62ebb4ac7de2e9</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/ext4.h</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/ext4.h</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in the Linux kernel's ext4 filesystem. A local user can cause an out-of-bound write in in fs/jbd2/transaction.c code, a denial of service, and a system crash by unmounting a crafted ext4 filesystem image. 
<p>Publish Date: 2018-07-27 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-10882>CVE-2018-10882</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://source.android.com/security/bulletin/2019-01-01">https://source.android.com/security/bulletin/2019-01-01</a></p> <p>Release Date: 2018-07-27</p> <p>Fix Resolution: v4.18-rc4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-10882 (Medium) detected in linux-stable-rtv4.1.33 - autoclosed - ## CVE-2018-10882 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/Trinadh465/linux-4.1.15_CVE-2022-3564/commit/3e3e73fc07b87b414cc43712be62ebb4ac7de2e9">3e3e73fc07b87b414cc43712be62ebb4ac7de2e9</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/ext4.h</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/ext4/ext4.h</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> A flaw was found in the Linux kernel's ext4 filesystem. A local user can cause an out-of-bound write in in fs/jbd2/transaction.c code, a denial of service, and a system crash by unmounting a crafted ext4 filesystem image. 
<p>Publish Date: 2018-07-27 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-10882>CVE-2018-10882</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://source.android.com/security/bulletin/2019-01-01">https://source.android.com/security/bulletin/2019-01-01</a></p> <p>Release Date: 2018-07-27</p> <p>Fix Resolution: v4.18-rc4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in linux stable autoclosed cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files fs h fs h vulnerability details a flaw was found in the linux kernel s filesystem a local user can cause an out of bound write in in fs transaction c code a denial of service and a system crash by unmounting a crafted filesystem image publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
10,430
3,388,495,033
IssuesEvent
2015-11-29 10:36:28
leeper/MTurkR
https://api.github.com/repos/leeper/MTurkR
opened
Switch request() to use environment variables
documentation enhancement
Pull AWS credentials from environment variables. - [ ] Need to completely deprecate use of options for credentials - [ ] Formal argument to `request()` should pull from environment variables - [ ] Add warning if credentials are stored in options but not environment variables
1.0
Switch request() to use environment variables - Pull AWS credentials from environment variables. - [ ] Need to completely deprecate use of options for credentials - [ ] Formal argument to `request()` should pull from environment variables - [ ] Add warning if credentials are stored in options but not environment variables
non_test
switch request to use environment variables pull aws credentials from environment variables need to completely deprecate use of options for credentials formal argument to request should pull from environment variables add warning if credentials are stored in options but not environment variables
0
315,703
9,631,172,472
IssuesEvent
2019-05-15 13:43:02
inverse-inc/packetfence
https://api.github.com/repos/inverse-inc/packetfence
closed
Need to be able to push admin compiled files through pf-maint.pl
Priority: High
The new admin compiled files need to be pushed through pf-maint.pl Similarly to the golang binaries we could host the files on inverse.ca/downloads and simply download + replace the existing ones
1.0
Need to be able to push admin compiled files through pf-maint.pl - The new admin compiled files need to be pushed through pf-maint.pl Similarly to the golang binaries we could host the files on inverse.ca/downloads and simply download + replace the existing ones
non_test
need to be able to push admin compiled files through pf maint pl the new admin compiled files need to be pushed through pf maint pl similarly to the golang binaries we could host the files on inverse ca downloads and simply download replace the existing ones
0
287,936
24,876,164,243
IssuesEvent
2022-10-27 19:17:45
Project-Sustain/data-sparsity-grpc-service
https://api.github.com/repos/Project-Sustain/data-sparsity-grpc-service
opened
Validate flask server input
test
- Write a function that validates input for flask server - Look into libraries
1.0
Validate flask server input - - Write a function that validates input for flask server - Look into libraries
test
validate flask server input write a function that validates input for flask server look into libraries
1
34,419
4,918,458,061
IssuesEvent
2016-11-24 08:59:43
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
opened
github.com/cockroachdb/cockroach/vendor/github.com/coreos/etcd/wal: TestOpenForRead failed under stress
Robot test-failure
SHA: https://github.com/cockroachdb/cockroach/commits/b54490b2cf70c155ec2b7af5133276ffe24dc02c Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=true TAGS=stress GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=58266&tab=buildLog ``` wal_test.go:495: mkdir /tmp/waltest017267957.tmp: no space left on device ```
1.0
github.com/cockroachdb/cockroach/vendor/github.com/coreos/etcd/wal: TestOpenForRead failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/b54490b2cf70c155ec2b7af5133276ffe24dc02c Parameters: ``` COCKROACH_PROPOSER_EVALUATED_KV=true TAGS=stress GOFLAGS= ``` Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=58266&tab=buildLog ``` wal_test.go:495: mkdir /tmp/waltest017267957.tmp: no space left on device ```
test
github com cockroachdb cockroach vendor github com coreos etcd wal testopenforread failed under stress sha parameters cockroach proposer evaluated kv true tags stress goflags stress build found a failed test wal test go mkdir tmp tmp no space left on device
1
172,518
27,292,388,748
IssuesEvent
2023-02-23 17:30:40
KDT3-MiniProject-8/MiniProject-FE
https://api.github.com/repos/KDT3-MiniProject-8/MiniProject-FE
closed
Hyunju) Auth-related screen composition and state management
Feature API Design
## Purpose Authentication feature, screen composition per authentication state, state management ## Details - [x] Store the access token in a cookie on login - [x] Set the access token expiration time - [x] Delete the access token on logout - [x] Show login/logout buttons in the nav bar depending on token presence - [x] While logged in, show a "go back to the previous page" notice when the `/login`, `/findpassword`, or `/signup` routes are accessed - [x] While logged out, show only a login prompt on the My Page and personalized-recommendation pages - [x] While logged out, show a login prompt on the `/cart` page - [x] Compose the main screen according to login state - [x] While logged out, disable the favorite button on the detail page
1.0
Hyunju) Auth-related screen composition and state management - ## Purpose Authentication feature, screen composition per authentication state, state management ## Details - [x] Store the access token in a cookie on login - [x] Set the access token expiration time - [x] Delete the access token on logout - [x] Show login/logout buttons in the nav bar depending on token presence - [x] While logged in, show a "go back to the previous page" notice when the `/login`, `/findpassword`, or `/signup` routes are accessed - [x] While logged out, show only a login prompt on the My Page and personalized-recommendation pages - [x] While logged out, show a login prompt on the `/cart` page - [x] Compose the main screen according to login state - [x] While logged out, disable the favorite button on the detail page
non_test
hyunju auth related screen composition and state management purpose authentication feature screen composition per authentication state state management details store the access token in a cookie on login set the access token expiration time delete the access token on logout show login logout buttons in the nav bar depending on token presence while logged in show a go back to the previous page notice when the login findpassword signup routes are accessed while logged out show only a login prompt on the my page and personalized recommendation pages while logged out show a login prompt on the cart page compose the main screen according to login state while logged out disable the favorite button on the detail page
0
71,820
7,261,868,759
IssuesEvent
2018-02-19 01:04:59
istio/istio
https://api.github.com/repos/istio/istio
opened
vm test failure to download artifacts
area/test and release automated-release kind/test-failure
2 issues a) it doesn't say what it tried to download so it's hard to trouble shoot (same as #3231) b) it doesn't work/download https://k8s-gubernator.appspot.com/build/istio-prow/pull/istio_istio/3513/e2e-suite-rbac-auth/850/ ``` I0218 20:37:11.821] *** Fetching istio packages... I0218 20:37:11.821] % Total % Received % Xferd Average Speed Time Time Time Current I0218 20:37:11.821] Dload Upload Total Spent Left Speed I0218 20:37:11.821] 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 I0218 20:37:11.821] curl: (22) The requested URL returned error: 404 I0218 20:37:11.822] dpkg-deb: error: './istio-sidecar.deb' is not a debian format archive I0218 20:37:11.822] dpkg: error processing archive ./istio-sidecar.deb (--install): I0218 20:37:11.822] subprocess dpkg-deb --control returned error exit status 2 I0218 20:37:11.822] Errors were encountered while processing: I0218 20:37:11.822] ./istio-sidecar.deb I0218 20:37:11.822] cp: cannot create regular file '/var/lib/istio/envoy': No such file or directory I0218 20:37:11.822] chown: invalid user: ‘istio-proxy’ I0218 20:37:11.822] chown: invalid user: ‘istio-proxy’ I0218 20:37:11.822] Reading package lists... ```
2.0
vm test failure to download artifacts - 2 issues a) it doesn't say what it tried to download so it's hard to trouble shoot (same as #3231) b) it doesn't work/download https://k8s-gubernator.appspot.com/build/istio-prow/pull/istio_istio/3513/e2e-suite-rbac-auth/850/ ``` I0218 20:37:11.821] *** Fetching istio packages... I0218 20:37:11.821] % Total % Received % Xferd Average Speed Time Time Time Current I0218 20:37:11.821] Dload Upload Total Spent Left Speed I0218 20:37:11.821] 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 I0218 20:37:11.821] curl: (22) The requested URL returned error: 404 I0218 20:37:11.822] dpkg-deb: error: './istio-sidecar.deb' is not a debian format archive I0218 20:37:11.822] dpkg: error processing archive ./istio-sidecar.deb (--install): I0218 20:37:11.822] subprocess dpkg-deb --control returned error exit status 2 I0218 20:37:11.822] Errors were encountered while processing: I0218 20:37:11.822] ./istio-sidecar.deb I0218 20:37:11.822] cp: cannot create regular file '/var/lib/istio/envoy': No such file or directory I0218 20:37:11.822] chown: invalid user: ‘istio-proxy’ I0218 20:37:11.822] chown: invalid user: ‘istio-proxy’ I0218 20:37:11.822] Reading package lists... ```
test
vm test failure to download artifacts issues a it doesn t say what it tried to download so it s hard to trouble shoot same as b it doesn t work download fetching istio packages total received xferd average speed time time time current dload upload total spent left speed curl the requested url returned error dpkg deb error istio sidecar deb is not a debian format archive dpkg error processing archive istio sidecar deb install subprocess dpkg deb control returned error exit status errors were encountered while processing istio sidecar deb cp cannot create regular file var lib istio envoy no such file or directory chown invalid user ‘istio proxy’ chown invalid user ‘istio proxy’ reading package lists
1
322,393
27,598,324,606
IssuesEvent
2023-03-09 08:19:52
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
opened
Fix jax_numpy_math.test_jax_numpy_nextafter
JAX Frontend Sub Task Failing Test
| | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details>
1.0
Fix jax_numpy_math.test_jax_numpy_nextafter - | | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details> <details> <summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_nextafter[cpu-ivy.functional.backends.jax-False-False]</summary> 2023-03-09T01:00:07.6445479Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6445838Z E 2023-03-09T01:00:07.6446142Z E The stack trace below excludes JAX-internal frames. 2023-03-09T01:00:07.6446795Z E The preceding is the original exception that occurred, unmodified. 
2023-03-09T01:00:07.6447191Z E 2023-03-09T01:00:07.6447434Z E -------------------- 2023-03-09T01:00:07.6452365Z E TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2' 2023-03-09T01:00:07.6452817Z E Falsifying example: test_jax_numpy_nextafter( 2023-03-09T01:00:07.6453248Z E dtype_and_x=(['float32', 'float32'], 2023-03-09T01:00:07.6453749Z E [array([-1.], dtype=float32), array([-1.], dtype=float32)]), 2023-03-09T01:00:07.6454137Z E test_flags=FrontendFunctionTestFlags( 2023-03-09T01:00:07.6454822Z E num_positional_args=0, 2023-03-09T01:00:07.6455053Z E with_out=False, 2023-03-09T01:00:07.6455272Z E inplace=False, 2023-03-09T01:00:07.6455502Z E as_variable=[False], 2023-03-09T01:00:07.6455731Z E native_arrays=[False], 2023-03-09T01:00:07.6455942Z E ), 2023-03-09T01:00:07.6456298Z E fn_tree='ivy.functional.frontends.jax.numpy.nextafter', 2023-03-09T01:00:07.6456637Z E on_device='cpu', 2023-03-09T01:00:07.6456888Z E frontend='jax', 2023-03-09T01:00:07.6457091Z E ) 2023-03-09T01:00:07.6457271Z E 2023-03-09T01:00:07.6457798Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2BkAAMoBaaR2QwAAH0ABg==') as a decorator on your test case </details>
test
fix jax numpy math test jax numpy nextafter tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy nextafter e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments e e the stack trace below excludes jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments e falsifying example test jax numpy nextafter e dtype and x e dtype array dtype e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e fn tree ivy functional frontends jax numpy nextafter e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy nextafter e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments e e the stack trace below excludes jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments e falsifying example test jax numpy nextafter e dtype and x e dtype array dtype e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e fn tree ivy functional frontends jax numpy nextafter e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy nextafter e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments e e the stack trace below excludes 
jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments e falsifying example test jax numpy nextafter e dtype and x e dtype array dtype e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e fn tree ivy functional frontends jax numpy nextafter e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy nextafter e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments e e the stack trace below excludes jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments e falsifying example test jax numpy nextafter e dtype and x e dtype array dtype e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e fn tree ivy functional frontends jax numpy nextafter e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case
1
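The failing test above hits `TypeError: <lambda>() got some positional-only arguments passed as keyword arguments: 'x1, x2'`. The mechanism can be reproduced in plain Python without JAX: parameters declared before a `/` are positional-only and reject keyword calls. The `nextafter_like` name below is a hypothetical stand-in for the frontend function, not the actual ivy/JAX code.

```python
# Parameters before '/' are positional-only (PEP 570); calling them by keyword
# raises the same error shape seen in the CI log above.

def nextafter_like(x1, x2, /):
    return (x1, x2)

result = nextafter_like(1.0, 2.0)  # fine: positional call
try:
    nextafter_like(x1=1.0, x2=2.0)  # TypeError: positional-only args passed as keywords
except TypeError as exc:
    print(exc)
```

The usual fix in a frontend test harness is to pass such arguments positionally (or raise the generated `num_positional_args`) so the wrapped backend lambda never receives them as keywords.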
349,254
31,789,151,439
IssuesEvent
2023-09-13 00:53:08
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
roachtest/unoptimized-query-oracle: internal error: decoding unset EncDatum
C-test-failure O-rsg branch-master T-sql-queries
roachtest.unoptimized-query-oracle/disable-rules=half/seed-multi-region [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/11626633?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/11626633?buildTab=artifacts#/unoptimized-query-oracle/disable-rules=half/seed-multi-region) on master @ [8ee0bc7e4391add8c3d8ec99cddf510f581b1a8f](https://github.com/cockroachdb/cockroach/commits/8ee0bc7e4391add8c3d8ec99cddf510f581b1a8f): ``` (query_comparison_util.go:388).runOneRoundQueryComparison: internal error while running optimized statement. 1082 statements run: pq: internal error: decoding unset EncDatum test artifacts and logs in: /artifacts/unoptimized-query-oracle/disable-rules=half/seed-multi-region/run_1 ``` <p>Parameters: <code>ROACHTEST_arch=amd64</code> , <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_encrypted=false</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) See: [Grafana](https://go.crdb.dev/p/roachfana/teamcity-11626633-1693892811-61-n9cpu4-geo/1693930301232/1693932186162) </p> </details> <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*unoptimized-query-oracle/disable-rules=half/seed-multi-region.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> _Originally posted by @cockroach-teamcity in https://github.com/cockroachdb/cockroach/issues/108527#issuecomment-1706960088_ Jira issue: CRDB-31359
1.0
roachtest/unoptimized-query-oracle: internal error: decoding unset EncDatum - roachtest.unoptimized-query-oracle/disable-rules=half/seed-multi-region [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/11626633?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/11626633?buildTab=artifacts#/unoptimized-query-oracle/disable-rules=half/seed-multi-region) on master @ [8ee0bc7e4391add8c3d8ec99cddf510f581b1a8f](https://github.com/cockroachdb/cockroach/commits/8ee0bc7e4391add8c3d8ec99cddf510f581b1a8f): ``` (query_comparison_util.go:388).runOneRoundQueryComparison: internal error while running optimized statement. 1082 statements run: pq: internal error: decoding unset EncDatum test artifacts and logs in: /artifacts/unoptimized-query-oracle/disable-rules=half/seed-multi-region/run_1 ``` <p>Parameters: <code>ROACHTEST_arch=amd64</code> , <code>ROACHTEST_cloud=gce</code> , <code>ROACHTEST_cpu=4</code> , <code>ROACHTEST_encrypted=false</code> , <code>ROACHTEST_ssd=0</code> </p> <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) See: [Grafana](https://go.crdb.dev/p/roachfana/teamcity-11626633-1693892811-61-n9cpu4-geo/1693930301232/1693932186162) </p> </details> <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*unoptimized-query-oracle/disable-rules=half/seed-multi-region.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub> _Originally posted by @cockroach-teamcity in https://github.com/cockroachdb/cockroach/issues/108527#issuecomment-1706960088_ Jira issue: CRDB-31359
test
roachtest unoptimized query oracle internal error decoding unset encdatum roachtest unoptimized query oracle disable rules half seed multi region with on master query comparison util go runoneroundquerycomparison internal error while running optimized statement statements run pq internal error decoding unset encdatum test artifacts and logs in artifacts unoptimized query oracle disable rules half seed multi region run parameters roachtest arch roachtest cloud gce roachtest cpu roachtest encrypted false roachtest ssd help see see see originally posted by cockroach teamcity in jira issue crdb
1
108,532
9,309,758,977
IssuesEvent
2019-03-25 17:10:18
vicrosa25/D02-Functional-Testing
https://api.github.com/repos/vicrosa25/D02-Functional-Testing
closed
Test requirement 16 - Sponsor manage his sponsorships
TEST
Manage his or her sponsorships, which includes listing, showing, creating, updating, and removing them. Note that removing a sponsorship does not actually delete it from the system, but de-activates it. A de-activated sponsorship can be re-activated later.
1.0
Test requirement 16 - Sponsor manage his sponsorships - Manage his or her sponsorships, which includes listing, showing, creating, updating, and removing them. Note that removing a sponsorship does not actually delete it from the system, but de-activates it. A de-activated sponsorship can be re-activated later.
test
test requirement sponsor manage his sponsorships manage his or her sponsorships which includes listing showing creating updating and removing them note that removing a sponsorship does not actually delete it from the system but de activates it a de activated sponsorship can be re activated later
1
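The requirement above specifies soft deletion: removing a sponsorship only de-activates it, and it can be re-activated later. A minimal language-agnostic sketch of that pattern follows (the project's actual stack is not shown in the record, so the class and field names here are illustrative assumptions).

```python
class Sponsorship:
    """Soft-delete pattern: 'remove' only flips an active flag, so the data survives."""

    def __init__(self, sponsor, banner_url):
        self.sponsor = sponsor
        self.banner_url = banner_url
        self.active = True

    def deactivate(self):
        # what "removing" means in the requirement: no actual deletion
        self.active = False

    def reactivate(self):
        # a de-activated sponsorship can be brought back later
        self.active = True
```

Listing views would then filter on `active` rather than relying on row existence, which is exactly what a functional test for this requirement should assert.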
78,195
7,622,960,195
IssuesEvent
2018-05-03 13:48:14
fabric8io/fabric8-test
https://api.github.com/repos/fabric8io/fabric8-test
closed
Environment not reset if che test fails
E2E-test
Che test case opens a new browser tab so if this test fails, environment is not reset.
1.0
Environment not reset if che test fails - Che test case opens a new browser tab so if this test fails, environment is not reset.
test
environment not reset if che test fails che test case opens a new browser tab so if this test fails environment is not reset
1
43,628
5,545,848,167
IssuesEvent
2017-03-22 22:43:45
websharks/s2member
https://api.github.com/repos/websharks/s2member
closed
<> Symbols in Shortcode Attributes = No Worky
bug needs testing pro
Jason - I sent this error to support, to Cristian forum and he asked me to send it here, I wonder if it has to do with the new version of s2member pro released plus I also got an error when doing my first Paypal Pro form - or a warning 1) here is the error I received when trying to view my page to offer my members their files I am using this s2IF condition to make sure a person sees all the member files for the 1st month and that they continue to see these files throughout the life of their membership - depending how long they are a paid member ... [s2If php="S2MEMBER_CURRENT_USER_PAID_REGISTRATION_DAYS >= 0"] the error message I received for this s2IF condition was: Parse error: syntax error, unexpected ‘;’ in /home2/crystals/public_html/cse/wp-content/plugins/s2member/includes/classes/sc-if-conds-in.inc.php(319) : eval()’d code on line 1 now before I upgraded to the new release of s2member pro I did not see this error I have not added any new plugins to my site the only change is the number of lines on my webpage were increased to accomodate all the files available to the member - (lines between the /s2If and s2If/ I am guessing something changed in the new release since the page is protected by the level – I guess I could assume that all the files for the first month can be viewed from day 0 (1) without any conditions but then for month 2, if above > 30 – is this going to also produce a parsing error also? 
2) warning message after I created first Paypal Pro form using the Paypal Pro form generator here is the code given to me by the generator I am using this s2IF condition to make sure a person sees all the member files for the 1st month [s2If php="S2MEMBER_CURRENT_USER_PAID_REGISTRATION_DAYS >= 0"] the error message I received for this s2IF condition was: Parse error: syntax error, unexpected ‘;’ in /home2/crystals/public_html/cse/wp-content/plugins/s2member/includes/classes/sc-if-conds-in.inc.php(319) : eval()’d code on line 1 I am guessing something changed in the new release since the page is protected by the level – I guess I could assume that all the files for the first month can be viewed from day 0 (1) without any conditions [s2Member-Pro-PayPal-Form level="2" ccaps="novel1" desc="1st Month @$7.00, then $7.00 USD / Monthly (recurring charge) for Novel: CS Chronicles/Book 2" ps="paypal" lc="" cc="USD" dg="0" ns="1" custom="cse.crystalskullexplorers.com" ta="7.00" tp="1" tt="M" ra="7.00" rp="1" rt="M" rr="1" rrt="" rra="2" accept="paypal" accept_via_paypal="paypal" coupon="" accept_coupons="0" default_country_code="" captcha="0" /] so my membership is $7 for the first month and $7 there after - 1st error was my description was longer than 100 characters which it wasn't but I got pass that error this warning showed up on the top of the page in pink: Invalid form configuration. Invalid "ta, tp, tt" attributes. Trial Period. When provided, these cannot be exactly the same as your "ra, rp, rt" attributes. I only copied the code given to me by the on-line form generator - also I do not have Paypal Pro but Paypal Express Checkout so I changed accept to just 'paypal" we don't want our members to see this warning, why tp attributes are being added to the code I don't know - I use level 1 for trial period Joshua Shapiro
1.0
<> Symbols in Shortcode Attributes = No Worky - Jason - I sent this error to support, to Cristian forum and he asked me to send it here, I wonder if it has to do with the new version of s2member pro released plus I also got an error when doing my first Paypal Pro form - or a warning 1) here is the error I received when trying to view my page to offer my members their files I am using this s2IF condition to make sure a person sees all the member files for the 1st month and that they continue to see these files throughout the life of their membership - depending how long they are a paid member ... [s2If php="S2MEMBER_CURRENT_USER_PAID_REGISTRATION_DAYS >= 0"] the error message I received for this s2IF condition was: Parse error: syntax error, unexpected ‘;’ in /home2/crystals/public_html/cse/wp-content/plugins/s2member/includes/classes/sc-if-conds-in.inc.php(319) : eval()’d code on line 1 now before I upgraded to the new release of s2member pro I did not see this error I have not added any new plugins to my site the only change is the number of lines on my webpage were increased to accomodate all the files available to the member - (lines between the /s2If and s2If/ I am guessing something changed in the new release since the page is protected by the level – I guess I could assume that all the files for the first month can be viewed from day 0 (1) without any conditions but then for month 2, if above > 30 – is this going to also produce a parsing error also? 
2) warning message after I created first Paypal Pro form using the Paypal Pro form generator here is the code given to me by the generator I am using this s2IF condition to make sure a person sees all the member files for the 1st month [s2If php="S2MEMBER_CURRENT_USER_PAID_REGISTRATION_DAYS >= 0"] the error message I received for this s2IF condition was: Parse error: syntax error, unexpected ‘;’ in /home2/crystals/public_html/cse/wp-content/plugins/s2member/includes/classes/sc-if-conds-in.inc.php(319) : eval()’d code on line 1 I am guessing something changed in the new release since the page is protected by the level – I guess I could assume that all the files for the first month can be viewed from day 0 (1) without any conditions [s2Member-Pro-PayPal-Form level="2" ccaps="novel1" desc="1st Month @$7.00, then $7.00 USD / Monthly (recurring charge) for Novel: CS Chronicles/Book 2" ps="paypal" lc="" cc="USD" dg="0" ns="1" custom="cse.crystalskullexplorers.com" ta="7.00" tp="1" tt="M" ra="7.00" rp="1" rt="M" rr="1" rrt="" rra="2" accept="paypal" accept_via_paypal="paypal" coupon="" accept_coupons="0" default_country_code="" captcha="0" /] so my membership is $7 for the first month and $7 there after - 1st error was my description was longer than 100 characters which it wasn't but I got pass that error this warning showed up on the top of the page in pink: Invalid form configuration. Invalid "ta, tp, tt" attributes. Trial Period. When provided, these cannot be exactly the same as your "ra, rp, rt" attributes. I only copied the code given to me by the on-line form generator - also I do not have Paypal Pro but Paypal Express Checkout so I changed accept to just 'paypal" we don't want our members to see this warning, why tp attributes are being added to the code I don't know - I use level 1 for trial period Joshua Shapiro
test
symbols in shortcode attributes no worky jason i sent this error to support to cristian forum and he asked me to send it here i wonder if it has to do with the new version of pro released plus i also got an error when doing my first paypal pro form or a warning here is the error i received when trying to view my page to offer my members their files i am using this condition to make sure a person sees all the member files for the month and that they continue to see these files throughout the life of their membership depending how long they are a paid member the error message i received for this condition was parse error syntax error unexpected ‘ ’ in crystals public html cse wp content plugins includes classes sc if conds in inc php eval ’d code on line now before i upgraded to the new release of pro i did not see this error i have not added any new plugins to my site the only change is the number of lines on my webpage were increased to accomodate all the files available to the member lines between the and i am guessing something changed in the new release since the page is protected by the level – i guess i could assume that all the files for the first month can be viewed from day without any conditions but then for month if above – is this going to also produce a parsing error also warning message after i created first paypal pro form using the paypal pro form generator here is the code given to me by the generator i am using this condition to make sure a person sees all the member files for the month the error message i received for this condition was parse error syntax error unexpected ‘ ’ in crystals public html cse wp content plugins includes classes sc if conds in inc php eval ’d code on line i am guessing something changed in the new release since the page is protected by the level – i guess i could assume that all the files for the first month can be viewed from day without any conditions so my membership is for the first month and there after error was my 
description was longer than characters which it wasn t but i got pass that error this warning showed up on the top of the page in pink invalid form configuration invalid ta tp tt attributes trial period when provided these cannot be exactly the same as your ra rp rt attributes i only copied the code given to me by the on line form generator also i do not have paypal pro but paypal express checkout so i changed accept to just paypal we don t want our members to see this warning why tp attributes are being added to the code i don t know i use level for trial period joshua shapiro
1
35,709
32,059,015,698
IssuesEvent
2023-09-24 12:42:46
ForNeVeR/fornever.me
https://api.github.com/repos/ForNeVeR/fornever.me
opened
Migrate away from nodesource builds
kind:infrastructure
This weird message and a build pause started to appear in container build on CI. ![image](https://github.com/ForNeVeR/fornever.me/assets/92793/32591019-4c97-4a05-bc45-13581ab6b184) This is ridiculous. We should migrate from nodesource.
1.0
Migrate away from nodesource builds - This weird message and a build pause started to appear in container build on CI. ![image](https://github.com/ForNeVeR/fornever.me/assets/92793/32591019-4c97-4a05-bc45-13581ab6b184) This is ridiculous. We should migrate from nodesource.
non_test
migrate away from nodesource builds this weird message and a build pause started to appear in container build on ci this is ridiculous we should migrate from nodesource
0
97,593
16,236,395,207
IssuesEvent
2021-05-07 01:37:55
michaeldotson/movie-app
https://api.github.com/repos/michaeldotson/movie-app
opened
CVE-2020-7595 (High) detected in nokogiri-1.10.3.gem
security vulnerability
## CVE-2020-7595 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nokogiri-1.10.3.gem</b></p></summary> <p>Nokogiri (鋸) is an HTML, XML, SAX, and Reader parser. Among Nokogiri's many features is the ability to search documents via XPath or CSS3 selectors.</p> <p>Library home page: <a href="https://rubygems.org/gems/nokogiri-1.10.3.gem">https://rubygems.org/gems/nokogiri-1.10.3.gem</a></p> <p>Path to dependency file: /movie-app/Gemfile.lock</p> <p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/nokogiri-1.10.3.gem</p> <p> Dependency Hierarchy: - sass-rails-5.0.7.gem (Root Library) - sprockets-rails-3.2.1.gem - actionpack-5.2.2.gem - rails-html-sanitizer-1.0.4.gem - loofah-2.2.3.gem - :x: **nokogiri-1.10.3.gem** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> xmlStringLenDecodeEntities in parser.c in libxml2 2.9.10 has an infinite loop in a certain end-of-file situation. <p>Publish Date: 2020-01-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7595>CVE-2020-7595</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://security.gentoo.org/glsa/202010-04">https://security.gentoo.org/glsa/202010-04</a></p> <p>Fix Resolution: All libxml2 users should upgrade to the latest version # emerge --sync # emerge --ask --oneshot --verbose >=dev-libs/libxml2-2.9.10 >= </p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7595 (High) detected in nokogiri-1.10.3.gem - ## CVE-2020-7595 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nokogiri-1.10.3.gem</b></p></summary> <p>Nokogiri (鋸) is an HTML, XML, SAX, and Reader parser. Among Nokogiri's many features is the ability to search documents via XPath or CSS3 selectors.</p> <p>Library home page: <a href="https://rubygems.org/gems/nokogiri-1.10.3.gem">https://rubygems.org/gems/nokogiri-1.10.3.gem</a></p> <p>Path to dependency file: /movie-app/Gemfile.lock</p> <p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/nokogiri-1.10.3.gem</p> <p> Dependency Hierarchy: - sass-rails-5.0.7.gem (Root Library) - sprockets-rails-3.2.1.gem - actionpack-5.2.2.gem - rails-html-sanitizer-1.0.4.gem - loofah-2.2.3.gem - :x: **nokogiri-1.10.3.gem** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> xmlStringLenDecodeEntities in parser.c in libxml2 2.9.10 has an infinite loop in a certain end-of-file situation. <p>Publish Date: 2020-01-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7595>CVE-2020-7595</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://security.gentoo.org/glsa/202010-04">https://security.gentoo.org/glsa/202010-04</a></p> <p>Fix Resolution: All libxml2 users should upgrade to the latest version # emerge --sync # emerge --ask --oneshot --verbose >=dev-libs/libxml2-2.9.10 >= </p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in nokogiri gem cve high severity vulnerability vulnerable library nokogiri gem nokogiri 鋸 is an html xml sax and reader parser among nokogiri s many features is the ability to search documents via xpath or selectors library home page a href path to dependency file movie app gemfile lock path to vulnerable library var lib gems cache nokogiri gem dependency hierarchy sass rails gem root library sprockets rails gem actionpack gem rails html sanitizer gem loofah gem x nokogiri gem vulnerable library vulnerability details xmlstringlendecodeentities in parser c in has an infinite loop in a certain end of file situation publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href fix resolution all users should upgrade to the latest version emerge sync emerge ask oneshot verbose dev libs step up your open source security game with whitesource
0
45,052
18,356,456,279
IssuesEvent
2021-10-08 18:56:57
keepassxreboot/keepassxc
https://api.github.com/repos/keepassxreboot/keepassxc
closed
Libsecret integration wants to move NetworkManager VPN entry to trash
more info needed downstream feature: Secret Service
## Overview [TIP]: # ( DO NOT include screenshots of your actual database! ) [NOTE]: # ( Give a BRIEF summary about your problem ) When I want to connect to an VPN Network via nm-applet, KeePassXC wants to move the entry to the trash folder. ## Steps to Reproduce [NOTE]: # ( Provide a simple set of steps to reproduce this bug. ) 1. Left-click on the nm-applet icon. 2. Chose a VPN connection for which you have an entry in KeePassXC. ## Expected Behavior [NOTE]: # ( Tell us what you expected to happen ) After choosing the VPN connection entry, NetworkManager should just connect to the VPN. ## Actual Behavior A popup appears that asks "Do you really want to move [entry name] to trash?" ## Context [NOTE]: # ( Give us any additional information you may have. ) [NOTE]: # ( Paste debug info from Help → About here ) KeePassXC - 2.5.4 Revision: dcca5aa [NOTE]: # ( Pick choices based on your environment ) Operating System: ArchLinux Desktop Env: Awesome WM Windowing System: X11 Packages: - networkmanager - networkmanager-pptp - network-manager-applet
1.0
Libsecret integration wants to move NetworkManager VPN entry to trash - ## Overview [TIP]: # ( DO NOT include screenshots of your actual database! ) [NOTE]: # ( Give a BRIEF summary about your problem ) When I want to connect to an VPN Network via nm-applet, KeePassXC wants to move the entry to the trash folder. ## Steps to Reproduce [NOTE]: # ( Provide a simple set of steps to reproduce this bug. ) 1. Left-click on the nm-applet icon. 2. Chose a VPN connection for which you have an entry in KeePassXC. ## Expected Behavior [NOTE]: # ( Tell us what you expected to happen ) After choosing the VPN connection entry, NetworkManager should just connect to the VPN. ## Actual Behavior A popup appears that asks "Do you really want to move [entry name] to trash?" ## Context [NOTE]: # ( Give us any additional information you may have. ) [NOTE]: # ( Paste debug info from Help → About here ) KeePassXC - 2.5.4 Revision: dcca5aa [NOTE]: # ( Pick choices based on your environment ) Operating System: ArchLinux Desktop Env: Awesome WM Windowing System: X11 Packages: - networkmanager - networkmanager-pptp - network-manager-applet
non_test
libsecret integration wants to move networkmanager vpn entry to trash overview do not include screenshots of your actual database give a brief summary about your problem when i want to connect to an vpn network via nm applet keepassxc wants to move the entry to the trash folder steps to reproduce provide a simple set of steps to reproduce this bug left click on the nm applet icon chose a vpn connection for which you have an entry in keepassxc expected behavior tell us what you expected to happen after choosing the vpn connection entry networkmanager should just connect to the vpn actual behavior a popup appears that asks do you really want to move to trash context give us any additional information you may have paste debug info from help → about here keepassxc revision pick choices based on your environment operating system archlinux desktop env awesome wm windowing system packages networkmanager networkmanager pptp network manager applet
0
296,151
22,290,857,501
IssuesEvent
2022-06-12 10:41:46
Flashky/rss-tracker-bot
https://api.github.com/repos/Flashky/rss-tracker-bot
closed
Change commands order at /help and /start commands
documentation
Currently: /newfeed - adds a new feed /myfeeds - see and manag your feeds Proposed: /myfeeds - see and manage your feeds /newfeed - adds a new feed
1.0
Change commands order at /help and /start commands - Currently: /newfeed - adds a new feed /myfeeds - see and manag your feeds Proposed: /myfeeds - see and manage your feeds /newfeed - adds a new feed
non_test
change commands order at help and start commands currently newfeed adds a new feed myfeeds see and manag your feeds proposed myfeeds see and manage your feeds newfeed adds a new feed
0
635,408
20,387,025,681
IssuesEvent
2022-02-22 08:12:36
o3de/o3de
https://api.github.com/repos/o3de/o3de
closed
Can't create project
kind/bug sig/content triage/accepted priority/major feature/project-manager
Tried to create a project but create project button is not there. **Steps to reproduce** Steps to reproduce the behavior: 1. Go to 'Project Manager' 2. Click on 'Create a project 3. Select 'Project Name, Location, a template' 4. Scroll down 5. See no 'button to config or create a project' **Desktop/Device (please complete the following information):** - Device: [PC] - OS: [Windows] - Version [10] - CPU [ intel i5 3470 ] - Memory [10GB] <img width="960" alt="o3dem" src="https://user-images.githubusercontent.com/13874736/153753349-3f2f62f7-406e-4bf0-9220-b8f11bbfea1d.png">
1.0
Can't create project - Tried to create a project but create project button is not there. **Steps to reproduce** Steps to reproduce the behavior: 1. Go to 'Project Manager' 2. Click on 'Create a project 3. Select 'Project Name, Location, a template' 4. Scroll down 5. See no 'button to config or create a project' **Desktop/Device (please complete the following information):** - Device: [PC] - OS: [Windows] - Version [10] - CPU [ intel i5 3470 ] - Memory [10GB] <img width="960" alt="o3dem" src="https://user-images.githubusercontent.com/13874736/153753349-3f2f62f7-406e-4bf0-9220-b8f11bbfea1d.png">
non_test
can t create project tried to create a project but create project button is not there steps to reproduce steps to reproduce the behavior go to project manager click on create a project select project name location a template scroll down see no button to config or create a project desktop device please complete the following information device os version cpu memory img width alt src
0
10,788
12,792,397,004
IssuesEvent
2020-07-02 01:16:23
GameModsBR/PowerNukkit
https://api.github.com/repos/GameModsBR/PowerNukkit
closed
Incompatibility with Creeperface01's GAC
Resolution: resolved Type: compatibility
# 💬 Let's talk <!--✍ Feel free to ask questions or start related discussion below --> pls help me ### 📋 Debug information <!-- ⚠ This information may help us to give you better answers but they are not required ⚠ --> <!-- Use the 'debugpaste' command in PowerNukkit --> <!-- You can get the version from the file name, the 'about' or 'debugpaste' command outputs --> * PowerNukkit version: 1.3.0.0-PN * YouTrack: [PN-123](https://youtrack.gamemods.com.br/issue/PN-123) * Debug link: https://hastebin.com/ofotohayah ### 💬 Anything else we should know? <!-- ✍ This is the perfect place to add any additional details --> ```java 22:01:54 [ERROR] Throwing java.lang.NoSuchMethodException: cz.creeperface.nukkit.gac.player.NukkitCheatPlayer.<init>(cn.nukkit.network.SourceInterface, java.lang.Long, java.net.InetSocketAddress) at java.lang.Class.getConstructor0(Class.java:3082) ~[?:1.8.0_231] at java.lang.Class.getConstructor(Class.java:1825) ~[?:1.8.0_231] at cn.nukkit.network.RakNetInterface.onSessionCreation(RakNetInterface.java:209) [nukkit-1.0-SNAPSHOT.jar:?] at com.nukkitx.network.raknet.RakNetServer.onOpenConnectionRequest1(RakNetServer.java:171) [nukkit-1.0-SNAPSHOT.jar:?] at com.nukkitx.network.raknet.RakNetServer.access$300(RakNetServer.java:31) [nukkit-1.0-SNAPSHOT.jar:?] at com.nukkitx.network.raknet.RakNetServer$ServerDatagramHandler.channelRead(RakNetServer.java:280) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1422) [nukkit-1.0-SNAPSHOT.jar:?] 
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:931) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollDatagramChannel.read(EpollDatagramChannel.java:688) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollDatagramChannel.access$100(EpollDatagramChannel.java:57) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollDatagramChannel$EpollDatagramChannelUnsafe.epollInReady(EpollDatagramChannel.java:507) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:502) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:407) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1050) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [nukkit-1.0-SNAPSHOT.jar:?] at java.lang.Thread.run(Thread.java:748) [?:1.8.0_231] ``` i onley edit the jar name for my start.sh
True
Incompatibility with Creeperface01's GAC - # 💬 Let's talk <!--✍ Feel free to ask questions or start related discussion below --> pls help me ### 📋 Debug information <!-- ⚠ This information may help us to give you better answers but they are not required ⚠ --> <!-- Use the 'debugpaste' command in PowerNukkit --> <!-- You can get the version from the file name, the 'about' or 'debugpaste' command outputs --> * PowerNukkit version: 1.3.0.0-PN * YouTrack: [PN-123](https://youtrack.gamemods.com.br/issue/PN-123) * Debug link: https://hastebin.com/ofotohayah ### 💬 Anything else we should know? <!-- ✍ This is the perfect place to add any additional details --> ```java 22:01:54 [ERROR] Throwing java.lang.NoSuchMethodException: cz.creeperface.nukkit.gac.player.NukkitCheatPlayer.<init>(cn.nukkit.network.SourceInterface, java.lang.Long, java.net.InetSocketAddress) at java.lang.Class.getConstructor0(Class.java:3082) ~[?:1.8.0_231] at java.lang.Class.getConstructor(Class.java:1825) ~[?:1.8.0_231] at cn.nukkit.network.RakNetInterface.onSessionCreation(RakNetInterface.java:209) [nukkit-1.0-SNAPSHOT.jar:?] at com.nukkitx.network.raknet.RakNetServer.onOpenConnectionRequest1(RakNetServer.java:171) [nukkit-1.0-SNAPSHOT.jar:?] at com.nukkitx.network.raknet.RakNetServer.access$300(RakNetServer.java:31) [nukkit-1.0-SNAPSHOT.jar:?] at com.nukkitx.network.raknet.RakNetServer$ServerDatagramHandler.channelRead(RakNetServer.java:280) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:352) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1422) [nukkit-1.0-SNAPSHOT.jar:?] 
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:374) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:360) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:931) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollDatagramChannel.read(EpollDatagramChannel.java:688) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollDatagramChannel.access$100(EpollDatagramChannel.java:57) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollDatagramChannel$EpollDatagramChannelUnsafe.epollInReady(EpollDatagramChannel.java:507) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollEventLoop.processReady(EpollEventLoop.java:502) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.channel.epoll.EpollEventLoop.run(EpollEventLoop.java:407) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.util.concurrent.SingleThreadEventExecutor$6.run(SingleThreadEventExecutor.java:1050) [nukkit-1.0-SNAPSHOT.jar:?] at io.netty.util.internal.ThreadExecutorMap$2.run(ThreadExecutorMap.java:74) [nukkit-1.0-SNAPSHOT.jar:?] at java.lang.Thread.run(Thread.java:748) [?:1.8.0_231] ``` i onley edit the jar name for my start.sh
non_test
incompatibility with s gac 💬 let s talk pls help me 📋 debug information powernukkit version pn youtrack debug link 💬 anything else we should know java throwing java lang nosuchmethodexception cz creeperface nukkit gac player nukkitcheatplayer cn nukkit network sourceinterface java lang long java net inetsocketaddress at java lang class class java at java lang class getconstructor class java at cn nukkit network raknetinterface onsessioncreation raknetinterface java at com nukkitx network raknet raknetserver raknetserver java at com nukkitx network raknet raknetserver access raknetserver java at com nukkitx network raknet raknetserver serverdatagramhandler channelread raknetserver java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty channel defaultchannelpipeline headcontext channelread defaultchannelpipeline java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel defaultchannelpipeline firechannelread defaultchannelpipeline java at io netty channel epoll epolldatagramchannel read epolldatagramchannel java at io netty channel epoll epolldatagramchannel access epolldatagramchannel java at io netty channel epoll epolldatagramchannel epolldatagramchannelunsafe epollinready epolldatagramchannel java at io netty channel epoll epolleventloop processready epolleventloop java at io netty channel epoll epolleventloop run epolleventloop java at io netty util concurrent singlethreadeventexecutor run singlethreadeventexecutor java at io netty util internal threadexecutormap run threadexecutormap java at java lang thread run thread java i 
onley edit the jar name for my start sh
0
41,895
10,821,118,286
IssuesEvent
2019-11-08 17:53:56
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
closed
libc.so.6 : GLIBC_2.14 not found. Instruct Tensorflow to look for glibc in custom path.
stat:awaiting tensorflower subtype: ubuntu/linux type:build/install
<em>Please make sure that this is a build/installation issue. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:build_template</em> **System information** - OS Platform and Distribution (e.g., Linux Ubuntu 16.04): RHEL6 - Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: N/A - TensorFlow installed from (source or binary): source - TensorFlow version: 1.13.1 - Python version: 3.6.2 - Installed using virtualenv? pip? conda?: Still at build phase - Bazel version (if compiling from source): 0.20.0 - GCC/Compiler version (if compiling from source): gcc-5.5.0 (installed from Linuxbrew) - CUDA/cuDNN version: N/A - GPU model and memory: 16GB RAM **Describe the problem** My computer has a rather old `glibc-2.12` and since I'm not an admin, there's nothing I can do to change that. Upon trying to build Tensorflow, I'd get an error along the lines of `$SECRET_PATH/lib64/libc.so.6 : GLIBC_2.14 not found`, which implies that the Tensorflow build system looks for `glibc` in the standard system paths. However, I have been able to install `glibc-2.23` into `$HOME/.linuxbrew`. I cannot simply do `LD_LIBRARY_PATH=$HOME/.linuxbrew/lib:$LD_LIBRARY_PATH <tensorflow build commands>` because it would segfault all the other utilities. How do I instruct the Tensorflow build system to look for `glibc` within `$HOME/.linuxbrew`? Which files should I edit? 
**Provide the exact sequence of commands / steps that you executed before running into the problem** ``` $ cd ~ $ mkdir build-bazel $ cd build-bazel $ wget https://github.com/bazelbuild/bazel/releases/download/0.20.0/bazel-0.20.0-dist.zip $ unzip bazel-0.20.0-dist.zip $ env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./compile.sh $ cd ~ $ mkdir build-tensorflow $ cd build-tensorflow $ wget https://codeload.github.com/tensorflow/tensorflow/zip/v1.13.1 -O tensorflow-1.13.1.zip $ unzip tensorflow-1.13.1.zip $ cd tensorflow-1.13.1 $ PATH=$HOME/build-bazel/output:$PATH ./configure $ PATH=$HOME/build-bazel/output:$PATH bazel build --config=monolithic //tensorflow/tools/pip_package:build_pip_package ``` In another issue #26432, I had succeeded in building Tensorflow with Python 3.7.2 (installed from Linuxbrew, and likely linked with the Linuxbrew `glibc-2.23`). In this issue, however, I am building it with Python 3.6.2 (official office Python), which I suspect may have been linked with the old `glibc-2.12`. For various reasons I need to build with Python 3.6.2. **Any other info / logs** Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
1.0
libc.so.6 : GLIBC_2.14 not found. Instruct Tensorflow to look for glibc in custom path. - <em>Please make sure that this is a build/installation issue. As per our [GitHub Policy](https://github.com/tensorflow/tensorflow/blob/master/ISSUES.md), we only address code/doc bugs, performance issues, feature requests and build/installation issues on GitHub. tag:build_template</em> **System information** - OS Platform and Distribution (e.g., Linux Ubuntu 16.04): RHEL6 - Mobile device (e.g. iPhone 8, Pixel 2, Samsung Galaxy) if the issue happens on mobile device: N/A - TensorFlow installed from (source or binary): source - TensorFlow version: 1.13.1 - Python version: 3.6.2 - Installed using virtualenv? pip? conda?: Still at build phase - Bazel version (if compiling from source): 0.20.0 - GCC/Compiler version (if compiling from source): gcc-5.5.0 (installed from Linuxbrew) - CUDA/cuDNN version: N/A - GPU model and memory: 16GB RAM **Describe the problem** My computer has a rather old `glibc-2.12` and since I'm not an admin, there's nothing I can do to change that. Upon trying to build Tensorflow, I'd get an error along the lines of `$SECRET_PATH/lib64/libc.so.6 : GLIBC_2.14 not found`, which implies that the Tensorflow build system looks for `glibc` in the standard system paths. However, I have been able to install `glibc-2.23` into `$HOME/.linuxbrew`. I cannot simply do `LD_LIBRARY_PATH=$HOME/.linuxbrew/lib:$LD_LIBRARY_PATH <tensorflow build commands>` because it would segfault all the other utilities. How do I instruct the Tensorflow build system to look for `glibc` within `$HOME/.linuxbrew`? Which files should I edit? 
**Provide the exact sequence of commands / steps that you executed before running into the problem** ``` $ cd ~ $ mkdir build-bazel $ cd build-bazel $ wget https://github.com/bazelbuild/bazel/releases/download/0.20.0/bazel-0.20.0-dist.zip $ unzip bazel-0.20.0-dist.zip $ env EXTRA_BAZEL_ARGS="--host_javabase=@local_jdk//:jdk" bash ./compile.sh $ cd ~ $ mkdir build-tensorflow $ cd build-tensorflow $ wget https://codeload.github.com/tensorflow/tensorflow/zip/v1.13.1 -O tensorflow-1.13.1.zip $ unzip tensorflow-1.13.1.zip $ cd tensorflow-1.13.1 $ PATH=$HOME/build-bazel/output:$PATH ./configure $ PATH=$HOME/build-bazel/output:$PATH bazel build --config=monolithic //tensorflow/tools/pip_package:build_pip_package ``` In another issue #26432, I had succeeded in building Tensorflow with Python 3.7.2 (installed from Linuxbrew, and likely linked with the Linuxbrew `glibc-2.23`). In this issue, however, I am building it with Python 3.6.2 (official office Python), which I suspect may have been linked with the old `glibc-2.12`. For various reasons I need to build with Python 3.6.2. **Any other info / logs** Include any logs or source code that would be helpful to diagnose the problem. If including tracebacks, please include the full traceback. Large logs and files should be attached.
non_test
libc so glibc not found instruct tensorflow to look for glibc in custom path please make sure that this is a build installation issue as per our we only address code doc bugs performance issues feature requests and build installation issues on github tag build template system information os platform and distribution e g linux ubuntu mobile device e g iphone pixel samsung galaxy if the issue happens on mobile device n a tensorflow installed from source or binary source tensorflow version python version installed using virtualenv pip conda still at build phase bazel version if compiling from source gcc compiler version if compiling from source gcc installed from linuxbrew cuda cudnn version n a gpu model and memory ram describe the problem my computer has a rather old glibc and since i m not an admin there s nothing i can do to change that upon trying to build tensorflow i d get an error along the lines of secret path libc so glibc not found which implies that the tensorflow build system looks for glibc in the standard system paths however i have been able to install glibc into home linuxbrew i cannot simply do ld library path home linuxbrew lib ld library path because it would segfault all the other utilities how do i instruct the tensorflow build system to look for glibc within home linuxbrew which files should i edit provide the exact sequence of commands steps that you executed before running into the problem cd mkdir build bazel cd build bazel wget unzip bazel dist zip env extra bazel args host javabase local jdk jdk bash compile sh cd mkdir build tensorflow cd build tensorflow wget o tensorflow zip unzip tensorflow zip cd tensorflow path home build bazel output path configure path home build bazel output path bazel build config monolithic tensorflow tools pip package build pip package in another issue i had succeeded in building tensorflow with python installed from linuxbrew and likely linked with the linuxbrew glibc in this issue however i am building it with 
python official office python which i suspect may have been linked with the old glibc for various reasons i need to build with python any other info logs include any logs or source code that would be helpful to diagnose the problem if including tracebacks please include the full traceback large logs and files should be attached
0
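The glibc record above asks how to point a build at a newer glibc installed under `$HOME/.linuxbrew` without exporting `LD_LIBRARY_PATH` globally (which segfaults other utilities). One commonly used approach — a sketch only; the Linuxbrew prefix and glibc version below are assumptions, not taken from the report — is to invoke the newer dynamic loader explicitly, which scopes the override to a single process:

```shell
#!/usr/bin/env bash
# Hypothetical locations — adjust to the actual Linuxbrew glibc install.
BREW_GLIBC="$HOME/.linuxbrew/opt/glibc"
LOADER="$BREW_GLIBC/lib/ld-2.23.so"

# Invoking a binary through the newer loader confines the glibc override to
# that one process, unlike exporting LD_LIBRARY_PATH for the whole shell:
#   "$LOADER" --library-path "$BREW_GLIBC/lib" ./some_binary
echo "would run: $LOADER --library-path $BREW_GLIBC/lib <binary>"
```

Whether Bazel picks this up depends on its toolchain configuration; wrapping the compiler driver in a script that prepends the loader, or configuring a crosstool with `--sysroot` pointed at the Linuxbrew prefix, are related approaches.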
45,620
5,722,505,801
IssuesEvent
2017-04-20 09:39:29
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
ClientReconnectTest.testReconnectToNewInstanceAtSameAddress
Team: Client Type: Test-Failure
https://hazelcast-l337.ci.cloudbees.com/job/new-lab-fast-pr/8400/ ``` java.lang.AssertionError: CountDownLatch failed to complete within 120 seconds, count left: 1 at org.junit.Assert.fail(Assert.java:88) at org.junit.Assert.assertTrue(Assert.java:41) at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:899) at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:892) at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:884) at com.hazelcast.client.ClientReconnectTest.testReconnectToNewInstanceAtSameAddress(ClientReconnectTest.java:95) ```
1.0
ClientReconnectTest.testReconnectToNewInstanceAtSameAddress - https://hazelcast-l337.ci.cloudbees.com/job/new-lab-fast-pr/8400/ ``` java.lang.AssertionError: CountDownLatch failed to complete within 120 seconds, count left: 1 at org.junit.Assert.fail(Assert.java:88) at org.junit.Assert.assertTrue(Assert.java:41) at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:899) at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:892) at com.hazelcast.test.HazelcastTestSupport.assertOpenEventually(HazelcastTestSupport.java:884) at com.hazelcast.client.ClientReconnectTest.testReconnectToNewInstanceAtSameAddress(ClientReconnectTest.java:95) ```
test
clientreconnecttest testreconnecttonewinstanceatsameaddress java lang assertionerror countdownlatch failed to complete within seconds count left at org junit assert fail assert java at org junit assert asserttrue assert java at com hazelcast test hazelcasttestsupport assertopeneventually hazelcasttestsupport java at com hazelcast test hazelcasttestsupport assertopeneventually hazelcasttestsupport java at com hazelcast test hazelcasttestsupport assertopeneventually hazelcasttestsupport java at com hazelcast client clientreconnecttest testreconnecttonewinstanceatsameaddress clientreconnecttest java
1
392,241
11,585,563,094
IssuesEvent
2020-02-23 00:51:24
google/ground-platform
https://api.github.com/repos/google/ground-platform
closed
[Project header] Styling and layout
good first issue priority: p1 type: feature request web
For rough parity with React prototype: - [ ] Add even padding and vertical middle align components - [ ] Title should use "Open Sans" as main font - [ ] Right-align user profile pic - [ ] Add attached icon to left of project title (scale to 48x48px) ![ic_launcher_round](https://user-images.githubusercontent.com/228050/71560993-70618680-2a3e-11ea-98ae-78d9d9dcd174.png)
1.0
[Project header] Styling and layout - For rough parity with React prototype: - [ ] Add even padding and vertical middle align components - [ ] Title should use "Open Sans" as main font - [ ] Right-align user profile pic - [ ] Add attached icon to left of project title (scale to 48x48px) ![ic_launcher_round](https://user-images.githubusercontent.com/228050/71560993-70618680-2a3e-11ea-98ae-78d9d9dcd174.png)
non_test
styling and layout for rough parity with react prototype add even padding and vertical middle align components title should use open sans as main font right align user profile pic add attached icon to left of project title scale to
0
82,758
7,852,274,505
IssuesEvent
2018-06-20 14:12:26
leeensminger/DelDOT-NPDES-Viewer
https://api.github.com/repos/leeensminger/DelDOT-NPDES-Viewer
reopened
Add ability to query outfall table in advanced query
Ready For Testing in next release
Currently you cannot search the NPDES field within the advanced query. The field should be available to the user.
1.0
Add ability to query outfall table in advanced query - Currently you cannot search the NPDES field within the advanced query. The field should be available to the user.
test
add ability to query outfall table in advanced query currently you cannot search the npdes field within the advanced query the field should be available to the user
1
119,608
10,058,794,218
IssuesEvent
2019-07-22 14:36:48
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
closed
[Desktop] System.Net.Http.Functional.Tests.CancellationTest / GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly
area-System.Net.Http disabled-test test bug test-run-desktop
Failed test: System.Net.Http.Functional.Tests.CancellationTest.GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly Detail: https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_netfx_windows_nt_debug/32/testReport/System.Net.Http.Functional.Tests/CancellationTest/GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly_useTimeout__True__startResponseBody__True_/ Configuration: outerloop_netfx_windows_nt_debug MESSAGE: ~~~ Assert.Throws() Failure Expected: typeof(System.Net.WebException) Actual: typeof(System.OperationCanceledException): The operation was canceled. ~~~ STACK TRACE: ~~~ at System.Net.Http.HttpClient.HandleFinishSendAsyncError(Exception e, CancellationTokenSource cts) in D:\j\workspace\outerloop_net---903ddde6\src\System.Net.Http\src\System\Net\Http\HttpClient.cs:line 506 at System.Net.Http.HttpClient.<FinishSendAsyncBuffered>d__58.MoveNext() in D:\j\workspace\outerloop_net---903ddde6\src\System.Net.Http\src\System\Net\Http\HttpClient.cs:line 461 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult() at System.Net.Http.Functional.Tests.CancellationTest.<>c__DisplayClass2_3.<<GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly>b__2>d.MoveNext() in D:\j\workspace\outerloop_net---903ddde6\src\System.Net.Http\tests\FunctionalTests\CancellationTest.cs:line 71 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) ~~~
3.0
[Desktop] System.Net.Http.Functional.Tests.CancellationTest / GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly - Failed test: System.Net.Http.Functional.Tests.CancellationTest.GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly Detail: https://ci.dot.net/job/dotnet_corefx/job/master/job/outerloop_netfx_windows_nt_debug/32/testReport/System.Net.Http.Functional.Tests/CancellationTest/GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly_useTimeout__True__startResponseBody__True_/ Configuration: outerloop_netfx_windows_nt_debug MESSAGE: ~~~ Assert.Throws() Failure Expected: typeof(System.Net.WebException) Actual: typeof(System.OperationCanceledException): The operation was canceled. ~~~ STACK TRACE: ~~~ at System.Net.Http.HttpClient.HandleFinishSendAsyncError(Exception e, CancellationTokenSource cts) in D:\j\workspace\outerloop_net---903ddde6\src\System.Net.Http\src\System\Net\Http\HttpClient.cs:line 506 at System.Net.Http.HttpClient.<FinishSendAsyncBuffered>d__58.MoveNext() in D:\j\workspace\outerloop_net---903ddde6\src\System.Net.Http\src\System\Net\Http\HttpClient.cs:line 461 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult() at System.Net.Http.Functional.Tests.CancellationTest.<>c__DisplayClass2_3.<<GetAsync_ResponseContentRead_CancelUsingTimeoutOrToken_TaskCanceledQuickly>b__2>d.MoveNext() in D:\j\workspace\outerloop_net---903ddde6\src\System.Net.Http\tests\FunctionalTests\CancellationTest.cs:line 71 --- End of stack trace from previous location where exception was thrown --- at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task) at 
System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task) ~~~
test
system net http functional tests cancellationtest getasync responsecontentread cancelusingtimeoutortoken taskcanceledquickly failed test system net http functional tests cancellationtest getasync responsecontentread cancelusingtimeoutortoken taskcanceledquickly detail configuration outerloop netfx windows nt debug message assert throws failure expected typeof system net webexception actual typeof system operationcanceledexception the operation was canceled stack trace at system net http httpclient handlefinishsendasyncerror exception e cancellationtokensource cts in d j workspace outerloop net src system net http src system net http httpclient cs line at system net http httpclient d movenext in d j workspace outerloop net src system net http src system net http httpclient cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task at system runtime compilerservices taskawaiter getresult at system net http functional tests cancellationtest c b d movenext in d j workspace outerloop net src system net http tests functionaltests cancellationtest cs line end of stack trace from previous location where exception was thrown at system runtime compilerservices taskawaiter throwfornonsuccess task task at system runtime compilerservices taskawaiter handlenonsuccessanddebuggernotification task task
1
70,627
7,194,255,983
IssuesEvent
2018-02-04 02:01:52
walterbender/musicblocks
https://api.github.com/repos/walterbender/musicblocks
closed
Harmonics block
Needs testing enhancement
harmonics are the overtones that are included in a particular pitch. They are a defining feature of a sound. It would be nice to have a feature that allows the user to access just one of the overtones of a note. 0 would be just the fundamental 1 would be first overtone 2 would be second overtone etc. (I lack the imagination to know what negative numbers would do. They do not really exist in the real world--more of a theoretical construct as far as I can tell.) It would be a clamp that goes over a pitch. All the other sounds for the enclosed pitch would be discarded and only the specified overtone (or overtones in the case of multiple clamps) would be performed. See https://en.wikipedia.org/wiki/Overtones for some basic info on overtones. Side note: I thought that tone.js had a feature to control overtones specifically, but I cannot find it now. It is worth looking into before starting this issue.
1.0
Harmonics block - harmonics are the overtones that are included in a particular pitch. They are a defining feature of a sound. It would be nice to have a feature that allows the user to access just one of the overtones of a note. 0 would be just the fundamental 1 would be first overtone 2 would be second overtone etc. (I lack the imagination to know what negative numbers would do. They do not really exist in the real world--more of a theoretical construct as far as I can tell.) It would be a clamp that goes over a pitch. All the other sounds for the enclosed pitch would be discarded and only the specified overtone (or overtones in the case of multiple clamps) would be performed. See https://en.wikipedia.org/wiki/Overtones for some basic info on overtones. Side note: I thought that tone.js had a feature to control overtones specifically, but I cannot find it now. It is worth looking into before starting this issue.
test
harmonics block harmonics are the overtones that are included in a particular pitch they are a defining feature of a sound it would be nice to have a feature that allows the user to access just one of the overtones of a note would be just the fundamental would be first overtone would be second overtone etc i lack the imagination to know what negative numbers would do they do not really exist in the real world more of a theoretical construct as far as i can tell it would be a clamp that goes over a pitch all the other sounds for the enclosed pitch would be discarded and only the specified overtone or overtones in the case of multiple clamps would be performed see for some basic info on overtones side note i thought that tone js had a feature to control overtones specifically but i cannot find it now it is worth looking into before starting this issue
1
275,521
23,919,935,817
IssuesEvent
2022-09-09 15:51:45
arfc/treetop
https://api.github.com/repos/arfc/treetop
closed
Add continuous integration for treetop
Comp:Core Difficulty:1-Beginner Priority:1-Critical Status:4-In Progress Type:Test
This issue can be closed when there is a continuous integration check that runs the tests in `treetop`.
1.0
Add continuous integration for treetop - This issue can be closed when there is a continuous integration check that runs the tests in `treetop`.
test
add continuous integration for treetop this issue can be closed when there is a continuous integration check that runs the tests in treetop
1
286,740
24,779,323,261
IssuesEvent
2022-10-24 02:19:58
likelion-backendschool/FinalProject_LimSoMang_team7
https://api.github.com/repos/likelion-backendschool/FinalProject_LimSoMang_team7
opened
[TEST] Test code
✅ test
### Issue - Implement TEST code ### Details - Implement TEST code to shorten turnaround time ### TODO - [ ] HomeControllerTest - [ ] AdmHomeControllerTest - [ ] MemberControllerTest - [ ] PostControllerTest - [ ] ProductControllerTest
1.0
[TEST] Test code - ### Issue - Implement TEST code ### Details - Implement TEST code to shorten turnaround time ### TODO - [ ] HomeControllerTest - [ ] AdmHomeControllerTest - [ ] MemberControllerTest - [ ] PostControllerTest - [ ] ProductControllerTest
test
test code issue implement test code details implement test code to shorten turnaround time todo homecontrollertest admhomecontrollertest membercontrollertest postcontrollertest productcontrollertest
1
329,267
24,212,643,453
IssuesEvent
2022-09-26 01:51:18
jarischaefer/docker-librenms
https://api.github.com/repos/jarischaefer/docker-librenms
closed
standard_init_linux.go:211: exec user process caused "exec format error" on Raspberry Pi 4
enhancement waiting for feedback documentation
I have an issue trying to run it on a Raspberry Pi 4 (Raspbian GNU/Linux 10 (buster)). Once I try to generate the key I get this error: "standard_init_linux.go:211: exec user process caused "exec format error""
1.0
standard_init_linux.go:211: exec user process caused "exec format error" on Raspberry Pi 4 - I have an issue trying to run it on a Raspberry Pi 4 (Raspbian GNU/Linux 10 (buster)). Once I try to generate the key I get this error: "standard_init_linux.go:211: exec user process caused "exec format error""
non_test
standard init linux go exec user process caused exec format error on raspberry pi i have an issue trying to run it on a raspberry pi raspbian gnu linux buster once i try to generate the key i get this error standard init linux go exec user process caused exec format error
0
129,348
10,571,336,585
IssuesEvent
2019-10-07 06:48:07
a2000-erp-team/WEBERP
https://api.github.com/repos/a2000-erp-team/WEBERP
closed
Mandatory Cust PO still prompts even though user setting not set as mandatory Cust PO. (SALES - Order Processing - Operations - Customer Sales Order)
WEB ERP Testing By Katrina
Mandatory Cust PO still prompts even though user setting not set as mandatory Cust PO. User id: Katrina (User maintenance -> Security-2 -> UNTICK Mandatory Customer's PO in Sales Order. ![image.png](https://images.zenhubusercontent.com/5cf8a04f2e4fe4691d7f073e/63d38bd3-c5cb-4ad7-8868-6b49cb3da09a)
1.0
Mandatory Cust PO still prompts even though user setting not set as mandatory Cust PO. (SALES - Order Processing - Operations - Customer Sales Order) - Mandatory Cust PO still prompts even though user setting not set as mandatory Cust PO. User id: Katrina (User maintenance -> Security-2 -> UNTICK Mandatory Customer's PO in Sales Order. ![image.png](https://images.zenhubusercontent.com/5cf8a04f2e4fe4691d7f073e/63d38bd3-c5cb-4ad7-8868-6b49cb3da09a)
test
mandatory cust po still prompts even though user setting not set as mandatory cust po sales order processing operations customer sales order mandatory cust po still prompts even though user setting not set as mandatory cust po user id katrina user maintenance security untick mandatory customer s po in sales order
1
80,675
23,277,749,125
IssuesEvent
2022-08-05 08:55:48
OpenAssetIO/OpenAssetIO
https://api.github.com/repos/OpenAssetIO/OpenAssetIO
closed
Autogeneration of Trait/Specification classes
build
## What Automatic generation of C, C++ and Python classes for Traits and Specifications based on a common, simple JSON declaration. ## Why Maintaining manually created language-specific implementations is not scalable or suitable for end-use needs. ## Tasks - [x] #390 - [x] #389 - [x] #415
1.0
Autogeneration of Trait/Specification classes - ## What Automatic generation of C, C++ and Python classes for Traits and Specifications based on a common, simple JSON declaration. ## Why Maintaining manually created language-specific implementations is not scalable or suitable for end-use needs. ## Tasks - [x] #390 - [x] #389 - [x] #415
non_test
autogeneration of trait specification classes what automatic generation of c c and python classes for traits and specifications based on a common simple json declaration why maintaining manually created language specific implementations is not scalable or suitable for end use needs tasks
0
268,887
23,401,884,124
IssuesEvent
2022-08-12 08:49:56
OrdinaNederland/Stichting-NUTwente
https://api.github.com/repos/OrdinaNederland/Stichting-NUTwente
closed
Set up automated testing for the live environment
Prio 3 Test Automation
Set up testing for the live environment We want this because it validates that the live environment works correctly DoD - [x] The current automated tests must be split into tests that are safe and not safe for the live environment - [x] The safe tests must be run automatically against the live environment
1.0
Set up automated testing for the live environment - Set up testing for the live environment We want this because it validates that the live environment works correctly DoD - [x] The current automated tests must be split into tests that are safe and not safe for the live environment - [x] The safe tests must be run automatically against the live environment
test
set up automated testing for the live environment set up testing for the live environment we want this because it validates that the live environment works correctly dod the current automated tests must be split into tests that are safe and not safe for the live environment the safe tests must be run automatically against the live environment
1
203,585
15,375,641,319
IssuesEvent
2021-03-02 15:08:31
istio/istio
https://api.github.com/repos/istio/istio
closed
istio/tools manifests need modernization
area/perf and scalability area/test and release
https://prow.istio.io/view/gs/istio-prow/pr-logs/pull/istio_tools/1458/lint_tools/1366477716181749760 We are using a ton of deprecated/removed stuff. We are even using apps/v1beta1 which was removed in 1.16. We need to go through and delete the stuff that is useless, and update the stuff that is useful.
1.0
istio/tools manifests need modernization - https://prow.istio.io/view/gs/istio-prow/pr-logs/pull/istio_tools/1458/lint_tools/1366477716181749760 We are using a ton of deprecated/removed stuff. We are even using apps/v1beta1 which was removed in 1.16. We need to go through and delete the stuff that is useless, and update the stuff that is useful.
test
istio tools manifests need modernization we are using a ton of deprecated removed stuff we are even using apps which was removed in we need to go through and delete the stuff that is useless and update the stuff that is useful
1
532,452
15,556,993,347
IssuesEvent
2021-03-16 08:35:00
dis-moi/backend
https://api.github.com/repos/dis-moi/backend
closed
URGENT: Err 500 when users try to create a new contributor profile
bug high priority
**Describe the bug** Please see the Trello card here: https://trello.com/c/vtYQ4WWT/680-quand-je-cr%C3%A9e-enti%C3%A8rement-un-nouveau-contributeur-%C3%A7a-p%C3%A8te-parfois > __@JalilArfaoui said__ Yes, known problem … you have to create it without images first and then fill in the rest, I believe > __Etienne said__ Indeed, -> just the name -> save -> add everything else > __@MaartenLMEM said__ you have to create your contributor in a minimal way and then enrich it with an image etc.
1.0
URGENT: Err 500 when users try to create a new contributor profile - **Describe the bug** Please see the Trello card here: https://trello.com/c/vtYQ4WWT/680-quand-je-cr%C3%A9e-enti%C3%A8rement-un-nouveau-contributeur-%C3%A7a-p%C3%A8te-parfois > __@JalilArfaoui said__ Yes, known problem … you have to create it without images first and then fill in the rest, I believe > __Etienne said__ Indeed, -> just the name -> save -> add everything else > __@MaartenLMEM said__ you have to create your contributor in a minimal way and then enrich it with an image etc.
non_test
urgent err when users try to create a new contributor profile describe the bug please see the trello card here jalilarfaoui said yes known problem … you have to create it without images first and then fill in the rest i believe etienne said indeed just the name save add everything else maartenlmem said you have to create your contributor in a minimal way and then enrich it with an image etc
0