| Unnamed: 0 (int64) | id (float64) | type (string) | created_at (string) | repo (string) | repo_url (string) | action (string) | title (string) | labels (string) | body (string) | index (string) | text_combine (string) | label (string) | text (string) | binary_label (int64) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
262,421 | 22,840,149,849 | IssuesEvent | 2022-07-12 20:52:31 | dapr/components-contrib | https://api.github.com/repos/dapr/components-contrib | opened | InfluxDB binding as stable candidate | area/runtime/binding P1 pinned area/test/certification | ## Describe the proposal
Create certification tests for InfluxDB binding and mark it as stable candidate:
https://docs.dapr.io/reference/components-reference/supported-bindings/influxdb/ | 1.0 | InfluxDB binding as stable candidate - ## Describe the proposal
Create certification tests for InfluxDB binding and mark it as stable candidate:
https://docs.dapr.io/reference/components-reference/supported-bindings/influxdb/ | test | influxdb binding as stable candidate describe the proposal create certification tests for influxdb binding and mark it as stable candidate | 1 |
615,423 | 19,255,072,419 | IssuesEvent | 2021-12-09 10:23:39 | openscd/open-scd | https://api.github.com/repos/openscd/open-scd | closed | DOType or CDC creation helper wizards | Kind: Enhancement Reviewed: Prioritized Priority: Important | As a user of OpenSCD I want to be guided through the process of creating a new `DOType` element. I want the software to tell me which data objects `SDO`, data attributes (DA) are mandatory for the data object.
Two wizards are required here:
1. Choose the common data class (CDC) array automatically from `IEC_61850-7-3_2007B3`. Based on common data class, display possible data objects and data attributes in a selectable list. Mandatory elements shall be preselected.
2. Second wizard is the `sDOTypeWizard`, `dATypeWizard` where the user shall have the possibility to select the type. Types shall be preselected based on the information in `IEC_61850-7-3_2007B3`.
**Features**
- Array of possible common data classes are based on the definition in `IEC_61850-7-3_2007B3`
- Array of possible type in sDOTypeWizard and dATypeWizard are based on the CDC type definition in `IEC_61850-7-3_2007B3` and defined DOTypes, DATypes in the DataTypeTemplates section
- Progress shall be indicated in terms of how many wizards follow
- When canceling the process, no actions shall be performed. This is especially important when canceling in sDOTypeWizard, dATypeWizard | 1.0 | DOType or CDC creation helper wizards - As a user of OpenSCD I want to be guided through the process of creating a new `DOType` element. I want the software to tell me which data objects `SDO`, data attributes (DA) are mandatory for the data object.
Two wizards are required here:
1. Choose the common data class (CDC) array automatically from `IEC_61850-7-3_2007B3`. Based on common data class, display possible data objects and data attributes in a selectable list. Mandatory elements shall be preselected.
2. Second wizard is the `sDOTypeWizard`, `dATypeWizard` where the user shall have the possibility to select the type. Types shall be preselected based on the information in `IEC_61850-7-3_2007B3`.
**Features**
- Array of possible common data classes are based on the definition in `IEC_61850-7-3_2007B3`
- Array of possible type in sDOTypeWizard and dATypeWizard are based on the CDC type definition in `IEC_61850-7-3_2007B3` and defined DOTypes, DATypes in the DataTypeTemplates section
- Progress shall be indicated in terms of how many wizards follow
- When canceling the process, no actions shall be performed. This is especially important when canceling in sDOTypeWizard, dATypeWizard | non_test | dotype or cdc creation helper wizards as a user of openscd i want to be guided through the process of creating a new dotype element i want the software to tell me which data objects sdo data attributes da are mandatory for the data object two wizards are required here choose the common data class cdc array automatically from iec based on common data class display possible data objects and data attributes in a selectable list mandatory elements shall be preselected second wizard is the sdotypewizard datypewizard where the user shall have the possibility to select the type types shall be preselected based on the information in iec features array of possible common data classes are based on the definition in iec array of possible type in sdotypewizard and datypewizard are based on the cdc type definition in iec and defined dotypes datypes in the datatypetemplates section progress shall be indicated in terms of how many wizards follow when canceling the process no actions shall be performed this is especially important when canceling in sdotypewizard datypewizard | 0 |
305,755 | 26,409,613,719 | IssuesEvent | 2023-01-13 11:03:02 | OpenBB-finance/OpenBBTerminal | https://api.github.com/repos/OpenBB-finance/OpenBBTerminal | closed | [Bug] stocks/disc/arkord --sell_only | bug tests | **Describe the bug**
A clear and concise description of what the bug is.
`main` branch

`develop` branch
* This unit test file is related to the command and fails when `--record-mode=rewrite`
tests/openbb_terminal/stocks/discovery/test_ark_view.py
**To Reproduce**
Steps(from the start) and commands to reproduce the behavior
**Screenshots**
If applicable, add screenshots to help explain your problem.
If you are running the terminal using the conda version please
rerun the terminal with `python terminal.py --debug`, and then
recreate your issue. Then include a screenshot of the entire
error printout.
**Desktop (please complete the following information):**
- OS: [e.g. Mac Sierra]
- Python version [e.g. 3.6.8]
**Additional context**
Add any other information that you think could be useful for us.
| 1.0 | [Bug] stocks/disc/arkord --sell_only - **Describe the bug**
A clear and concise description of what the bug is.
`main` branch

`develop` branch
* This unit test file is related to the command and fails when `--record-mode=rewrite`
tests/openbb_terminal/stocks/discovery/test_ark_view.py
**To Reproduce**
Steps(from the start) and commands to reproduce the behavior
**Screenshots**
If applicable, add screenshots to help explain your problem.
If you are running the terminal using the conda version please
rerun the terminal with `python terminal.py --debug`, and then
recreate your issue. Then include a screenshot of the entire
error printout.
**Desktop (please complete the following information):**
- OS: [e.g. Mac Sierra]
- Python version [e.g. 3.6.8]
**Additional context**
Add any other information that you think could be useful for us.
| test | stocks disc arkord sell only describe the bug a clear and concise description of what the bug is main branch develop branch this unit test file is related to the command and fails when record mode rewrite tests openbb terminal stocks discovery test ark view py to reproduce steps from the start and commands to reproduce the behavior screenshots if applicable add screenshots to help explain your problem if you are running the terminal using the conda version please rerun the terminal with python terminal py debug and then recreate your issue then include a screenshot of the entire error printout desktop please complete the following information os python version additional context add any other information that you think could be useful for us | 1 |
84,458 | 24,314,143,546 | IssuesEvent | 2022-09-30 03:32:47 | eclipse-openj9/openj9 | https://api.github.com/repos/eclipse-openj9/openj9 | opened | AIX cannot load libfontmanager.so | comp:build test failure | The problem occurs in an extended.openjdk test, but easily duplicated by a simple test case.
```
public class Load {
public static void main(String[] args) throws Exception {
System.loadLibrary("fontmanager");
}
}
```
It occurs on at least jdk11 and jdk17.
jdk17
```
[2022-09-08T13:10:23.577Z] java.lang.UnsatisfiedLinkError: Failed to load library "/home/jenkins/workspace/Test_openjdk17_j9_extended.openjdk_ppc64_aix/openjdkbinary/j2sdk-image/lib/libfontmanager.so"
[2022-09-08T13:10:23.577Z] at java.base/jdk.internal.loader.NativeLibraries.load(Native Method)
```
jdk11
```
Exception in thread "main" java.lang.UnsatisfiedLinkError: fontmanager (rtld: 0712-001 Symbol _ZN2hb8vtable_tI8hb_set_tXadL_Z16hb_set_get_emptyEEXadL_Z16hb_set_referenceEEXadL_Z14hb_set_destroyEEXadL_Z20hb_set_set_user_dataEEXadL_Z20hb_set_get_user_dataEEE7destroyE was referenced
from module /home/jenkins/peter/jdk/lib/libfontmanager.so(), but a runtime definition
of the symbol was not found.)
```
Even on the same machine where the JVM was compiled it doesn't work, with the same error.
Seems like a problem with xlc 16.1.0. I think what's happening is that harfbuzz was updated from 2.8 to 4.4.1 and it no longer works with xlc 16.1.0. | 1.0 | AIX cannot load libfontmanager.so - The problem occurs in an extended.openjdk test, but easily duplicated by a simple test case.
```
public class Load {
public static void main(String[] args) throws Exception {
System.loadLibrary("fontmanager");
}
}
```
It occurs on at least jdk11 and jdk17.
jdk17
```
[2022-09-08T13:10:23.577Z] java.lang.UnsatisfiedLinkError: Failed to load library "/home/jenkins/workspace/Test_openjdk17_j9_extended.openjdk_ppc64_aix/openjdkbinary/j2sdk-image/lib/libfontmanager.so"
[2022-09-08T13:10:23.577Z] at java.base/jdk.internal.loader.NativeLibraries.load(Native Method)
```
jdk11
```
Exception in thread "main" java.lang.UnsatisfiedLinkError: fontmanager (rtld: 0712-001 Symbol _ZN2hb8vtable_tI8hb_set_tXadL_Z16hb_set_get_emptyEEXadL_Z16hb_set_referenceEEXadL_Z14hb_set_destroyEEXadL_Z20hb_set_set_user_dataEEXadL_Z20hb_set_get_user_dataEEE7destroyE was referenced
from module /home/jenkins/peter/jdk/lib/libfontmanager.so(), but a runtime definition
of the symbol was not found.)
```
Even on the same machine where the JVM was compiled it doesn't work, with the same error.
Seems like a problem with xlc 16.1.0. I think what's happening is that harfbuzz was updated from 2.8 to 4.4.1 and it no longer works with xlc 16.1.0. | non_test | aix cannot load libfontmanager so the problem occurs in an extended openjdk test but easily duplicated by a simple test case public class load public static void main string args throws exception system loadlibrary fontmanager it occurs on at least and java lang unsatisfiedlinkerror failed to load library home jenkins workspace test extended openjdk aix openjdkbinary image lib libfontmanager so at java base jdk internal loader nativelibraries load native method exception in thread main java lang unsatisfiedlinkerror fontmanager rtld symbol set txadl set get emptyeexadl set referenceeexadl set destroyeexadl set set user dataeexadl set get user was referenced from module home jenkins peter jdk lib libfontmanager so but a runtime definition of the symbol was not found even on the same machine where the jvm was compiled it doesn t work with the same error seems like a problem with xlc i think what s happening is that harfbuzz was updated from to and it no longer works with xlc | 0 |
183,701 | 14,246,491,454 | IssuesEvent | 2020-11-19 10:08:24 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | closed | [CI] HistoryTemplateEmailMappingsTests.testEmailFields fails | :Core/Features/Watcher >test-failure Team:Core/Features | **Build scan**: https://gradle-enterprise.elastic.co/s/uwm5wrx2u7vrg
**Repro line**:
```
./gradlew ':x-pack:plugin:watcher:internalClusterTest' --tests "org.elasticsearch.xpack.watcher.history.HistoryTemplateEmailMappingsTests.testEmailFields" -Dtests.seed=2E9825D4080F72BF -Dtests.security.manager=true -Dtests.locale=hr-HR -Dtests.timezone=SystemV/EST5EDT -Druntime.java=11
```
**Reproduces locally?**: No
**Applicable branches**: `master`
**Failure history**:
https://build-stats.elastic.co/app/kibana#/discover?_g=(refreshInterval:(pause:!t,value:0),time:(from:now-7d,mode:quick,to:now))&_a=(columns:!(_source),index:b646ed00-7efc-11e8-bf69-63c8ef516157,interval:auto,query:(language:lucene,query:testEmailFields),sort:!(process.time-start,desc))
**Failure excerpt**:
```
[2020-11-16T14:23:38,919][INFO ][o.s.s.s.SMTPServer ] [testEmailFields] SMTP server *:0 starting
[2020-11-16T14:23:38,930][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] before test
[2020-11-16T14:23:38,932][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: setting up test
[2020-11-16T14:23:38,932][INFO ][o.s.s.s.ServerThread ] [[org.subethamail.smtp.server.ServerThread *:39213]{smtpServerLocalSocketAddress=*:39213}] SMTP server *:39213 started
[2020-11-16T14:23:38,933][INFO ][o.e.t.InternalTestCluster] [testEmailFields] Setup InternalTestCluster [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster] with seed [98A0BE4EB87ED660] using [0] dedicated masters, [3] (data) nodes and [0] coord only nodes (master nodes are [auto-managed])
[2020-11-16T14:23:38,945][INFO ][o.e.n.Node ] [testEmailFields] version[8.0.0-SNAPSHOT], pid[246899], build[unknown/unknown/c2864e38fb097bd0e8095aaa716263356a0c3f8a/2020-11-16T05:45:22.776947Z], OS[Linux/4.18.0-193.28.1.el8_2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/11.0.2/11.0.2+7]
[2020-11-16T14:23:38,945][INFO ][o.e.n.Node ] [testEmailFields] JVM home [/var/lib/jenkins/.java/openjdk-11.0.2-linux]
[2020-11-16T14:23:38,945][DEPRECATION][o.e.d.n.Node ] [testEmailFields] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="no-jdk" message="no-jdk distributions that do not bundle a JDK are deprecated and will be removed in a future release"
[2020-11-16T14:23:38,946][INFO ][o.e.n.Node ] [testEmailFields] JVM arguments [-Dfile.encoding=UTF8, -Des.scripting.update.ctx_in_params=false, -Des.search.rewrite_sort=true, -Des.set.netty.runtime.available.processors=false, -Des.transport.cname_in_publish_address=true, -Dgradle.dist.lib=/var/lib/jenkins/.gradle/wrapper/dists/gradle-6.6.1-all/ejrtlte9hlw8v6ii20a9584rs/gradle-6.6.1/lib, -Dgradle.user.home=/var/lib/jenkins/.gradle, -Dgradle.worker.jar=/var/lib/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar, -Dio.netty.noKeySetOptimization=true, -Dio.netty.noUnsafe=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Djava.awt.headless=true, -Djava.locale.providers=SPI,COMPAT, -Djna.nosys=true, -Dorg.gradle.native=false, -Dtests.artifact=watcher, -Dtests.gradle=true, -Dtests.logger.level=WARN, -Dtests.security.manager=true, -Dtests.seed=F1C7AE174070F0E4, -Dtests.task=:x-pack:plugin:watcher:internalClusterTest, --illegal-access=warn, -XX:+HeapDumpOnOutOfMemoryError, -esa, -XX:HeapDumpPath=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/heapdump, -Xms512m, -Xmx512m, -Dfile.encoding=UTF-8, -Djava.io.tmpdir=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/testrun/internalClusterTest/temp, -Duser.country=US, -Duser.language=en, -Duser.variant, -ea]
[2020-11-16T14:23:38,947][WARN ][o.e.n.Node ] [testEmailFields] version [8.0.0-SNAPSHOT] is a pre-release version of Elasticsearch and is not suitable for production
[2020-11-16T14:23:38,947][INFO ][o.e.x.w.t.TimeWarpedWatcher] [testEmailFields] using time warped watchers plugin
[2020-11-16T14:23:38,948][INFO ][o.e.p.PluginsService ] [testEmailFields] no modules loaded
[2020-11-16T14:23:38,948][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.analysis.common.CommonAnalysisPlugin]
[2020-11-16T14:23:38,948][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.node.NodeMocksPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockMustacheScriptEngine$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockScriptService$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$AssertActionNamePlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$TestSeedPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.MockHttpTransport$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.TestGeoShapeFieldMapperPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.store.MockFSIndexStore$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.transport.nio.MockNioTransportPlugin]
[2020-11-16T14:23:38,950][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.datastreams.DataStreamsPlugin]
[2020-11-16T14:23:38,950][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.ilm.IndexLifecycle]
[2020-11-16T14:23:38,950][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.watcher.test.TimeWarpedWatcher]
[2020-11-16T14:23:38,959][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] using [3] data paths, mounts [[/ (/dev/sda2)]], net usable_space [232.4gb], net total_space [349.7gb], types [xfs]
[2020-11-16T14:23:38,963][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] heap size [512mb], compressed ordinary object pointers [true]
[2020-11-16T14:23:38,969][INFO ][o.e.n.Node ] [testEmailFields] node name [node_s0], node ID [5WY1CYTqRbejYfTqT5LK1w], cluster name [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster], roles [master, remote_cluster_client, data_hot, data_content, ingest, data_warm, data, data_cold]
[2020-11-16T14:23:39,052][INFO ][o.e.d.DiscoveryModule ] [testEmailFields] using discovery type [zen] and seed hosts providers [settings, file]
[2020-11-16T14:23:39,133][INFO ][o.e.n.Node ] [testEmailFields] initialized
[2020-11-16T14:23:39,136][INFO ][o.e.n.Node ] [testEmailFields] version[8.0.0-SNAPSHOT], pid[246899], build[unknown/unknown/c2864e38fb097bd0e8095aaa716263356a0c3f8a/2020-11-16T05:45:22.776947Z], OS[Linux/4.18.0-193.28.1.el8_2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/11.0.2/11.0.2+7]
[2020-11-16T14:23:39,136][INFO ][o.e.n.Node ] [testEmailFields] JVM home [/var/lib/jenkins/.java/openjdk-11.0.2-linux]
[2020-11-16T14:23:39,137][DEPRECATION][o.e.d.n.Node ] [testEmailFields] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="no-jdk" message="no-jdk distributions that do not bundle a JDK are deprecated and will be removed in a future release"
[2020-11-16T14:23:39,137][INFO ][o.e.n.Node ] [testEmailFields] JVM arguments [-Dfile.encoding=UTF8, -Des.scripting.update.ctx_in_params=false, -Des.search.rewrite_sort=true, -Des.set.netty.runtime.available.processors=false, -Des.transport.cname_in_publish_address=true, -Dgradle.dist.lib=/var/lib/jenkins/.gradle/wrapper/dists/gradle-6.6.1-all/ejrtlte9hlw8v6ii20a9584rs/gradle-6.6.1/lib, -Dgradle.user.home=/var/lib/jenkins/.gradle, -Dgradle.worker.jar=/var/lib/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar, -Dio.netty.noKeySetOptimization=true, -Dio.netty.noUnsafe=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Djava.awt.headless=true, -Djava.locale.providers=SPI,COMPAT, -Djna.nosys=true, -Dorg.gradle.native=false, -Dtests.artifact=watcher, -Dtests.gradle=true, -Dtests.logger.level=WARN, -Dtests.security.manager=true, -Dtests.seed=F1C7AE174070F0E4, -Dtests.task=:x-pack:plugin:watcher:internalClusterTest, --illegal-access=warn, -XX:+HeapDumpOnOutOfMemoryError, -esa, -XX:HeapDumpPath=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/heapdump, -Xms512m, -Xmx512m, -Dfile.encoding=UTF-8, -Djava.io.tmpdir=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/testrun/internalClusterTest/temp, -Duser.country=US, -Duser.language=en, -Duser.variant, -ea]
[2020-11-16T14:23:39,138][WARN ][o.e.n.Node ] [testEmailFields] version [8.0.0-SNAPSHOT] is a pre-release version of Elasticsearch and is not suitable for production
[2020-11-16T14:23:39,138][INFO ][o.e.x.w.t.TimeWarpedWatcher] [testEmailFields] using time warped watchers plugin
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] no modules loaded
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.analysis.common.CommonAnalysisPlugin]
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.node.NodeMocksPlugin]
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockMustacheScriptEngine$TestPlugin]
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockScriptService$TestPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$AssertActionNamePlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$TestSeedPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.MockHttpTransport$TestPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.TestGeoShapeFieldMapperPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.store.MockFSIndexStore$TestPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.transport.nio.MockNioTransportPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.datastreams.DataStreamsPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.ilm.IndexLifecycle]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.watcher.test.TimeWarpedWatcher]
[2020-11-16T14:23:39,150][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] using [3] data paths, mounts [[/ (/dev/sda2)]], net usable_space [232.4gb], net total_space [349.7gb], types [xfs]
[2020-11-16T14:23:39,150][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] heap size [512mb], compressed ordinary object pointers [true]
[2020-11-16T14:23:39,156][INFO ][o.e.n.Node ] [testEmailFields] node name [node_s1], node ID [VADujvzoSQyhHelIQIPhGA], cluster name [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster], roles [master, remote_cluster_client, data_hot, data_content, ingest, data_warm, data, data_cold]
[2020-11-16T14:23:39,222][INFO ][o.e.d.DiscoveryModule ] [testEmailFields] using discovery type [zen] and seed hosts providers [settings, file]
[2020-11-16T14:23:39,299][INFO ][o.e.n.Node ] [testEmailFields] initialized
[2020-11-16T14:23:39,302][INFO ][o.e.n.Node ] [testEmailFields] version[8.0.0-SNAPSHOT], pid[246899], build[unknown/unknown/c2864e38fb097bd0e8095aaa716263356a0c3f8a/2020-11-16T05:45:22.776947Z], OS[Linux/4.18.0-193.28.1.el8_2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/11.0.2/11.0.2+7]
[2020-11-16T14:23:39,303][INFO ][o.e.n.Node ] [testEmailFields] JVM home [/var/lib/jenkins/.java/openjdk-11.0.2-linux]
[2020-11-16T14:23:39,303][DEPRECATION][o.e.d.n.Node ] [testEmailFields] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="no-jdk" message="no-jdk distributions that do not bundle a JDK are deprecated and will be removed in a future release"
[2020-11-16T14:23:39,304][INFO ][o.e.n.Node ] [testEmailFields] JVM arguments [-Dfile.encoding=UTF8, -Des.scripting.update.ctx_in_params=false, -Des.search.rewrite_sort=true, -Des.set.netty.runtime.available.processors=false, -Des.transport.cname_in_publish_address=true, -Dgradle.dist.lib=/var/lib/jenkins/.gradle/wrapper/dists/gradle-6.6.1-all/ejrtlte9hlw8v6ii20a9584rs/gradle-6.6.1/lib, -Dgradle.user.home=/var/lib/jenkins/.gradle, -Dgradle.worker.jar=/var/lib/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar, -Dio.netty.noKeySetOptimization=true, -Dio.netty.noUnsafe=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Djava.awt.headless=true, -Djava.locale.providers=SPI,COMPAT, -Djna.nosys=true, -Dorg.gradle.native=false, -Dtests.artifact=watcher, -Dtests.gradle=true, -Dtests.logger.level=WARN, -Dtests.security.manager=true, -Dtests.seed=F1C7AE174070F0E4, -Dtests.task=:x-pack:plugin:watcher:internalClusterTest, --illegal-access=warn, -XX:+HeapDumpOnOutOfMemoryError, -esa, -XX:HeapDumpPath=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/heapdump, -Xms512m, -Xmx512m, -Dfile.encoding=UTF-8, -Djava.io.tmpdir=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/testrun/internalClusterTest/temp, -Duser.country=US, -Duser.language=en, -Duser.variant, -ea]
[2020-11-16T14:23:39,304][WARN ][o.e.n.Node ] [testEmailFields] version [8.0.0-SNAPSHOT] is a pre-release version of Elasticsearch and is not suitable for production
[2020-11-16T14:23:39,304][INFO ][o.e.x.w.t.TimeWarpedWatcher] [testEmailFields] using time warped watchers plugin
[2020-11-16T14:23:39,305][INFO ][o.e.p.PluginsService ] [testEmailFields] no modules loaded
[2020-11-16T14:23:39,305][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.analysis.common.CommonAnalysisPlugin]
[2020-11-16T14:23:39,305][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.node.NodeMocksPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockMustacheScriptEngine$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockScriptService$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$AssertActionNamePlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$TestSeedPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.MockHttpTransport$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.TestGeoShapeFieldMapperPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.store.MockFSIndexStore$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.transport.nio.MockNioTransportPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.datastreams.DataStreamsPlugin]
[2020-11-16T14:23:39,307][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.ilm.IndexLifecycle]
[2020-11-16T14:23:39,307][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.watcher.test.TimeWarpedWatcher]
[2020-11-16T14:23:39,316][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] using [3] data paths, mounts [[/ (/dev/sda2)]], net usable_space [232.4gb], net total_space [349.7gb], types [xfs]
[2020-11-16T14:23:39,325][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] heap size [512mb], compressed ordinary object pointers [true]
[2020-11-16T14:23:39,331][INFO ][o.e.n.Node ] [testEmailFields] node name [node_s2], node ID [ssLW2ClgQ1eVaIok0ITx-A], cluster name [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster], roles [master, remote_cluster_client, data_hot, data_content, ingest, data_warm, data, data_cold]
[2020-11-16T14:23:39,389][INFO ][o.e.d.DiscoveryModule ] [testEmailFields] using discovery type [zen] and seed hosts providers [settings, file]
[2020-11-16T14:23:39,451][INFO ][o.e.n.Node ] [testEmailFields] initialized
[2020-11-16T14:23:39,467][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#2]]] starting ...
[2020-11-16T14:23:39,473][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#3]]] starting ...
[2020-11-16T14:23:39,481][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#1]]] starting ...
[2020-11-16T14:23:39,486][INFO ][o.e.t.TransportService ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#2]]] publish_address {127.0.0.1:44717}, bound_addresses {[::1]:43339}, {127.0.0.1:44717}
[2020-11-16T14:23:39,505][INFO ][o.e.t.TransportService ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#1]]] publish_address {127.0.0.1:37883}, bound_addresses {[::1]:37255}, {127.0.0.1:37883}
[2020-11-16T14:23:39,544][INFO ][o.e.t.TransportService ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#3]]] publish_address {127.0.0.1:38021}, bound_addresses {[::1]:39523}, {127.0.0.1:38021}
[2020-11-16T14:23:39,714][INFO ][o.e.c.c.Coordinator ] [node_s0] setting initial configuration to VotingConfiguration{5WY1CYTqRbejYfTqT5LK1w,{bootstrap-placeholder}-node_s1,ssLW2ClgQ1eVaIok0ITx-A}
[2020-11-16T14:23:39,895][INFO ][o.e.c.s.MasterService ] [node_s0] elected-as-master ([2] nodes joined)[{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true} elect leader, {node_s2}{ssLW2ClgQ1eVaIok0ITx-A}{Pd7_nYW2Roq0DxCQdyLOyw}{127.0.0.1}{127.0.0.1:38021}{cdhimrsw}{xpack.installed=true} elect leader, _BECOME_MASTER_TASK_, _FINISH_ELECTION_], term: 1, version: 1, delta: master node changed {previous [], current [{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}]}, added {{node_s2}{ssLW2ClgQ1eVaIok0ITx-A}{Pd7_nYW2Roq0DxCQdyLOyw}{127.0.0.1}{127.0.0.1:38021}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:39,993][INFO ][o.e.c.c.CoordinationState] [node_s0] cluster UUID set to [wU0jGhUqSaS_LV4XG_p1Zg]
[2020-11-16T14:23:40,004][INFO ][o.e.c.c.CoordinationState] [node_s2] cluster UUID set to [wU0jGhUqSaS_LV4XG_p1Zg]
[2020-11-16T14:23:40,113][INFO ][o.e.c.s.ClusterApplierService] [node_s2] master node changed {previous [], current [{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}]}, added {{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 1, reason: ApplyCommitRequest{term=1, version=1, sourceNode={node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:40,115][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#3]]] started
[2020-11-16T14:23:40,122][INFO ][o.e.c.s.ClusterApplierService] [node_s0] master node changed {previous [], current [{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}]}, added {{node_s2}{ssLW2ClgQ1eVaIok0ITx-A}{Pd7_nYW2Roq0DxCQdyLOyw}{127.0.0.1}{127.0.0.1:38021}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 1, reason: Publication{term=1, version=1}
[2020-11-16T14:23:40,166][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#1]]] started
[2020-11-16T14:23:40,183][INFO ][o.e.c.s.MasterService ] [node_s0] node-join[{node_s1}{VADujvzoSQyhHelIQIPhGA}{p5u_mAK3RpG62yrbtlzsbw}{127.0.0.1}{127.0.0.1:44717}{cdhimrsw}{xpack.installed=true} join existing leader], term: 1, version: 2, delta: added {{node_s1}{VADujvzoSQyhHelIQIPhGA}{p5u_mAK3RpG62yrbtlzsbw}{127.0.0.1}{127.0.0.1:44717}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:40,218][INFO ][o.e.c.r.a.DiskThresholdMonitor] [node_s0] skipping monitor as a check is already in progress
[2020-11-16T14:23:40,226][INFO ][o.e.c.s.ClusterApplierService] [node_s2] added {{node_s1}{VADujvzoSQyhHelIQIPhGA}{p5u_mAK3RpG62yrbtlzsbw}{127.0.0.1}{127.0.0.1:44717}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 2, reason: ApplyCommitRequest{term=1, version=2, sourceNode={node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:40,282][INFO ][o.e.c.s.ClusterApplierService] [node_s1] master node changed {previous [], current [{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}]}, added {{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true},{node_s2}{ssLW2ClgQ1eVaIok0ITx-A}{Pd7_nYW2Roq0DxCQdyLOyw}{127.0.0.1}{127.0.0.1:38021}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 2, reason: ApplyCommitRequest{term=1, version=2, sourceNode={node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:40,284][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#2]]] started
[2020-11-16T14:23:40,285][INFO ][o.e.c.s.ClusterApplierService] [node_s0] added {{node_s1}{VADujvzoSQyhHelIQIPhGA}{p5u_mAK3RpG62yrbtlzsbw}{127.0.0.1}{127.0.0.1:44717}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 2, reason: Publication{term=1, version=2}
[2020-11-16T14:23:40,439][INFO ][o.e.g.GatewayService ] [node_s0] recovered [0] indices into cluster_state
[2020-11-16T14:23:40,461][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.slm-history] for index patterns [.slm-history-5*]
[2020-11-16T14:23:40,545][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.triggered_watches] for index patterns [.triggered_watches*]
[2020-11-16T14:23:40,634][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.watches] for index patterns [.watches*]
[2020-11-16T14:23:40,744][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.watch-history-14] for index patterns [.watcher-history-14*]
[2020-11-16T14:23:40,838][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [node_s0] adding index lifecycle policy [watch-history-ilm-policy]
[2020-11-16T14:23:40,921][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [node_s0] adding index lifecycle policy [slm-history-ilm-policy]
[2020-11-16T14:23:41,284][INFO ][o.e.l.LicenseService ] [node_s1] license [2860a10f-36d3-4bfd-b78f-c792d06c0567] mode [trial] - valid
[2020-11-16T14:23:41,288][INFO ][o.e.l.LicenseService ] [node_s2] license [2860a10f-36d3-4bfd-b78f-c792d06c0567] mode [trial] - valid
[2020-11-16T14:23:41,331][INFO ][o.e.l.LicenseService ] [node_s0] license [2860a10f-36d3-4bfd-b78f-c792d06c0567] mode [trial] - valid
[2020-11-16T14:23:41,351][WARN ][o.e.c.m.MetadataIndexTemplateService] [node_s0] legacy template [random_index_template] has index patterns [*] matching patterns from existing composable templates [.triggered_watches,.watch-history-14,.slm-history,.watches] with patterns (.triggered_watches => [.triggered_watches*],.watch-history-14 => [.watcher-history-14*],.slm-history => [.slm-history-5*],.watches => [.watches*]); this template [random_index_template] may be ignored in favor of a composable template at index creation time
[2020-11-16T14:23:41,353][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding template [random_index_template] for index patterns [*]
[2020-11-16T14:23:41,432][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: all set up test
[2020-11-16T14:23:41,437][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: freezing time on nodes
[2020-11-16T14:23:41,440][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STARTED], Tuple [v1=node_s2 (0), v2=STARTED], Tuple [v1=node_s1 (0), v2=STARTED]]
[2020-11-16T14:23:41,511][INFO ][o.e.x.w.WatcherService ] [node_s1] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:41,511][INFO ][o.e.x.w.WatcherService ] [node_s2] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:41,512][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s1] watcher has stopped
[2020-11-16T14:23:41,520][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s2] watcher has stopped
[2020-11-16T14:23:41,521][INFO ][o.e.x.w.WatcherService ] [node_s0] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:41,521][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s0] watcher has stopped
[2020-11-16T14:23:41,530][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STOPPED], Tuple [v1=node_s2 (0), v2=STOPPED], Tuple [v1=node_s1 (0), v2=STOPPED]]
[2020-11-16T14:23:41,540][DEPRECATION][o.e.d.c.m.MetadataCreateIndexService] [node_s0] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="index_name_starts_with_dot" message="index name [.watches] starts with a dot '.', in the next major version, index names starting with a dot are reserved for hidden indices and system indices"
[2020-11-16T14:23:41,550][INFO ][o.e.c.m.MetadataCreateIndexService] [node_s0] [.watches] creating index, cause [api], templates [.watches], shards [1]/[0]
[2020-11-16T14:23:41,555][INFO ][o.e.c.r.a.AllocationService] [node_s0] updating number_of_replicas to [1] for indices [.watches]
[2020-11-16T14:23:41,868][DEPRECATION][o.e.d.c.m.MetadataCreateIndexService] [node_s0] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="index_name_starts_with_dot" message="index name [.triggered_watches] starts with a dot '.', in the next major version, index names starting with a dot are reserved for hidden indices and system indices"
[2020-11-16T14:23:41,887][INFO ][o.e.c.m.MetadataCreateIndexService] [node_s0] [.triggered_watches] creating index, cause [api], templates [.triggered_watches], shards [1]/[1]
[2020-11-16T14:23:42,285][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STOPPED], Tuple [v1=node_s2, v2=STOPPED], Tuple [v1=node_s1, v2=STOPPED]]
[2020-11-16T14:23:42,534][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTING], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,544][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTING], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,553][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTING], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,568][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTED], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,608][INFO ][o.e.x.w.WatcherService ] [node_s2] reloading watcher, reason [new local watcher shard allocation ids], cancelled [0] queued tasks
[2020-11-16T14:23:42,619][INFO ][o.e.x.w.WatcherService ] [node_s0] reloading watcher, reason [new local watcher shard allocation ids], cancelled [0] queued tasks
[2020-11-16T14:23:42,636][INFO ][o.e.c.r.a.AllocationService] [node_s0] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[.triggered_watches][0]]])." previous.health="YELLOW" reason="shards started [[.triggered_watches][0]]"
[2020-11-16T14:23:42,803][INFO ][o.s.s.s.SMTPServer ] [testEmailFields] SMTP server *:39213 stopping
[2020-11-16T14:23:42,804][INFO ][o.s.s.s.ServerThread ] [[org.subethamail.smtp.server.ServerThread *:39213]{smtpServerLocalSocketAddress=*:39213}] SMTP server *:39213 stopped
[2020-11-16T14:23:42,805][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [#testEmailFields]: clearing watcher state
[2020-11-16T14:23:42,808][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STARTED], Tuple [v1=node_s2 (0), v2=STARTED], Tuple [v1=node_s1 (0), v2=STARTED]]
[2020-11-16T14:23:42,889][INFO ][o.e.x.w.WatcherService ] [node_s2] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:42,889][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s2] watcher has stopped
[2020-11-16T14:23:42,895][INFO ][o.e.x.w.WatcherService ] [node_s1] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:42,895][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s1] watcher has stopped
[2020-11-16T14:23:42,905][INFO ][o.e.x.w.WatcherService ] [node_s0] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:42,905][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s0] watcher has stopped
[2020-11-16T14:23:42,910][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STOPPED], Tuple [v1=node_s2 (0), v2=STOPPED], Tuple [v1=node_s1 (0), v2=STOPPED]]
[2020-11-16T14:23:42,919][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: cleaning up after test
[2020-11-16T14:23:42,919][INFO ][o.e.t.InternalTestCluster] [testEmailFields] Clearing active scheme time frozen, expected healing time 0s
[2020-11-16T14:23:43,002][INFO ][o.e.c.m.MetadataDeleteIndexService] [node_s0] [.watches/DgNTavZhQqSWESdZgB3WmA] deleting index
[2020-11-16T14:23:43,002][INFO ][o.e.c.m.MetadataDeleteIndexService] [node_s0] [.triggered_watches/NrMycBW_QZGlIMxOa4-pSw] deleting index
[2020-11-16T14:23:43,188][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] removing template [random_index_template]
[2020-11-16T14:23:43,252][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: cleaned up after test
[2020-11-16T14:23:43,252][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] after test
REPRODUCE WITH: ./gradlew ':x-pack:plugin:watcher:internalClusterTest' --tests "org.elasticsearch.xpack.watcher.history.HistoryTemplateEmailMappingsTests.testEmailFields" -Dtests.seed=F1C7AE174070F0E4 -Dtests.security.manager=true -Dtests.locale=pt-BR -Dtests.timezone=Asia/Brunei -Druntime.java=11
```
[CI] HistoryTemplateEmailMappingsTests.testEmailFields fails

**Build scan**: https://gradle-enterprise.elastic.co/s/uwm5wrx2u7vrg
**Repro line**:
```
./gradlew ':x-pack:plugin:watcher:internalClusterTest' --tests "org.elasticsearch.xpack.watcher.history.HistoryTemplateEmailMappingsTests.testEmailFields" -Dtests.seed=2E9825D4080F72BF -Dtests.security.manager=true -Dtests.locale=hr-HR -Dtests.timezone=SystemV/EST5EDT -Druntime.java=11
```
**Reproduces locally?**: No
**Applicable branches**: `master`
**Failure history**:
https://build-stats.elastic.co/app/kibana#/discover?_g=(refreshInterval:(pause:!t,value:0),time:(from:now-7d,mode:quick,to:now))&_a=(columns:!(_source),index:b646ed00-7efc-11e8-bf69-63c8ef516157,interval:auto,query:(language:lucene,query:testEmailFields),sort:!(process.time-start,desc))
**Failure excerpt**:
```
[2020-11-16T14:23:38,919][INFO ][o.s.s.s.SMTPServer ] [testEmailFields] SMTP server *:0 starting
[2020-11-16T14:23:38,930][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] before test
[2020-11-16T14:23:38,932][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: setting up test
[2020-11-16T14:23:38,932][INFO ][o.s.s.s.ServerThread ] [[org.subethamail.smtp.server.ServerThread *:39213]{smtpServerLocalSocketAddress=*:39213}] SMTP server *:39213 started
[2020-11-16T14:23:38,933][INFO ][o.e.t.InternalTestCluster] [testEmailFields] Setup InternalTestCluster [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster] with seed [98A0BE4EB87ED660] using [0] dedicated masters, [3] (data) nodes and [0] coord only nodes (master nodes are [auto-managed])
[2020-11-16T14:23:38,945][INFO ][o.e.n.Node ] [testEmailFields] version[8.0.0-SNAPSHOT], pid[246899], build[unknown/unknown/c2864e38fb097bd0e8095aaa716263356a0c3f8a/2020-11-16T05:45:22.776947Z], OS[Linux/4.18.0-193.28.1.el8_2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/11.0.2/11.0.2+7]
[2020-11-16T14:23:38,945][INFO ][o.e.n.Node ] [testEmailFields] JVM home [/var/lib/jenkins/.java/openjdk-11.0.2-linux]
[2020-11-16T14:23:38,945][DEPRECATION][o.e.d.n.Node ] [testEmailFields] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="no-jdk" message="no-jdk distributions that do not bundle a JDK are deprecated and will be removed in a future release"
[2020-11-16T14:23:38,946][INFO ][o.e.n.Node ] [testEmailFields] JVM arguments [-Dfile.encoding=UTF8, -Des.scripting.update.ctx_in_params=false, -Des.search.rewrite_sort=true, -Des.set.netty.runtime.available.processors=false, -Des.transport.cname_in_publish_address=true, -Dgradle.dist.lib=/var/lib/jenkins/.gradle/wrapper/dists/gradle-6.6.1-all/ejrtlte9hlw8v6ii20a9584rs/gradle-6.6.1/lib, -Dgradle.user.home=/var/lib/jenkins/.gradle, -Dgradle.worker.jar=/var/lib/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar, -Dio.netty.noKeySetOptimization=true, -Dio.netty.noUnsafe=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Djava.awt.headless=true, -Djava.locale.providers=SPI,COMPAT, -Djna.nosys=true, -Dorg.gradle.native=false, -Dtests.artifact=watcher, -Dtests.gradle=true, -Dtests.logger.level=WARN, -Dtests.security.manager=true, -Dtests.seed=F1C7AE174070F0E4, -Dtests.task=:x-pack:plugin:watcher:internalClusterTest, --illegal-access=warn, -XX:+HeapDumpOnOutOfMemoryError, -esa, -XX:HeapDumpPath=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/heapdump, -Xms512m, -Xmx512m, -Dfile.encoding=UTF-8, -Djava.io.tmpdir=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/testrun/internalClusterTest/temp, -Duser.country=US, -Duser.language=en, -Duser.variant, -ea]
[2020-11-16T14:23:38,947][WARN ][o.e.n.Node ] [testEmailFields] version [8.0.0-SNAPSHOT] is a pre-release version of Elasticsearch and is not suitable for production
[2020-11-16T14:23:38,947][INFO ][o.e.x.w.t.TimeWarpedWatcher] [testEmailFields] using time warped watchers plugin
[2020-11-16T14:23:38,948][INFO ][o.e.p.PluginsService ] [testEmailFields] no modules loaded
[2020-11-16T14:23:38,948][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.analysis.common.CommonAnalysisPlugin]
[2020-11-16T14:23:38,948][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.node.NodeMocksPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockMustacheScriptEngine$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockScriptService$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$AssertActionNamePlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$TestSeedPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.MockHttpTransport$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.TestGeoShapeFieldMapperPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.store.MockFSIndexStore$TestPlugin]
[2020-11-16T14:23:38,949][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.transport.nio.MockNioTransportPlugin]
[2020-11-16T14:23:38,950][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.datastreams.DataStreamsPlugin]
[2020-11-16T14:23:38,950][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.ilm.IndexLifecycle]
[2020-11-16T14:23:38,950][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.watcher.test.TimeWarpedWatcher]
[2020-11-16T14:23:38,959][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] using [3] data paths, mounts [[/ (/dev/sda2)]], net usable_space [232.4gb], net total_space [349.7gb], types [xfs]
[2020-11-16T14:23:38,963][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] heap size [512mb], compressed ordinary object pointers [true]
[2020-11-16T14:23:38,969][INFO ][o.e.n.Node ] [testEmailFields] node name [node_s0], node ID [5WY1CYTqRbejYfTqT5LK1w], cluster name [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster], roles [master, remote_cluster_client, data_hot, data_content, ingest, data_warm, data, data_cold]
[2020-11-16T14:23:39,052][INFO ][o.e.d.DiscoveryModule ] [testEmailFields] using discovery type [zen] and seed hosts providers [settings, file]
[2020-11-16T14:23:39,133][INFO ][o.e.n.Node ] [testEmailFields] initialized
[2020-11-16T14:23:39,136][INFO ][o.e.n.Node ] [testEmailFields] version[8.0.0-SNAPSHOT], pid[246899], build[unknown/unknown/c2864e38fb097bd0e8095aaa716263356a0c3f8a/2020-11-16T05:45:22.776947Z], OS[Linux/4.18.0-193.28.1.el8_2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/11.0.2/11.0.2+7]
[2020-11-16T14:23:39,136][INFO ][o.e.n.Node ] [testEmailFields] JVM home [/var/lib/jenkins/.java/openjdk-11.0.2-linux]
[2020-11-16T14:23:39,137][DEPRECATION][o.e.d.n.Node ] [testEmailFields] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="no-jdk" message="no-jdk distributions that do not bundle a JDK are deprecated and will be removed in a future release"
[2020-11-16T14:23:39,137][INFO ][o.e.n.Node ] [testEmailFields] JVM arguments [-Dfile.encoding=UTF8, -Des.scripting.update.ctx_in_params=false, -Des.search.rewrite_sort=true, -Des.set.netty.runtime.available.processors=false, -Des.transport.cname_in_publish_address=true, -Dgradle.dist.lib=/var/lib/jenkins/.gradle/wrapper/dists/gradle-6.6.1-all/ejrtlte9hlw8v6ii20a9584rs/gradle-6.6.1/lib, -Dgradle.user.home=/var/lib/jenkins/.gradle, -Dgradle.worker.jar=/var/lib/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar, -Dio.netty.noKeySetOptimization=true, -Dio.netty.noUnsafe=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Djava.awt.headless=true, -Djava.locale.providers=SPI,COMPAT, -Djna.nosys=true, -Dorg.gradle.native=false, -Dtests.artifact=watcher, -Dtests.gradle=true, -Dtests.logger.level=WARN, -Dtests.security.manager=true, -Dtests.seed=F1C7AE174070F0E4, -Dtests.task=:x-pack:plugin:watcher:internalClusterTest, --illegal-access=warn, -XX:+HeapDumpOnOutOfMemoryError, -esa, -XX:HeapDumpPath=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/heapdump, -Xms512m, -Xmx512m, -Dfile.encoding=UTF-8, -Djava.io.tmpdir=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/testrun/internalClusterTest/temp, -Duser.country=US, -Duser.language=en, -Duser.variant, -ea]
[2020-11-16T14:23:39,138][WARN ][o.e.n.Node ] [testEmailFields] version [8.0.0-SNAPSHOT] is a pre-release version of Elasticsearch and is not suitable for production
[2020-11-16T14:23:39,138][INFO ][o.e.x.w.t.TimeWarpedWatcher] [testEmailFields] using time warped watchers plugin
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] no modules loaded
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.analysis.common.CommonAnalysisPlugin]
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.node.NodeMocksPlugin]
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockMustacheScriptEngine$TestPlugin]
[2020-11-16T14:23:39,139][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockScriptService$TestPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$AssertActionNamePlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$TestSeedPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.MockHttpTransport$TestPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.TestGeoShapeFieldMapperPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.store.MockFSIndexStore$TestPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.transport.nio.MockNioTransportPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.datastreams.DataStreamsPlugin]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.ilm.IndexLifecycle]
[2020-11-16T14:23:39,140][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.watcher.test.TimeWarpedWatcher]
[2020-11-16T14:23:39,150][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] using [3] data paths, mounts [[/ (/dev/sda2)]], net usable_space [232.4gb], net total_space [349.7gb], types [xfs]
[2020-11-16T14:23:39,150][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] heap size [512mb], compressed ordinary object pointers [true]
[2020-11-16T14:23:39,156][INFO ][o.e.n.Node ] [testEmailFields] node name [node_s1], node ID [VADujvzoSQyhHelIQIPhGA], cluster name [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster], roles [master, remote_cluster_client, data_hot, data_content, ingest, data_warm, data, data_cold]
[2020-11-16T14:23:39,222][INFO ][o.e.d.DiscoveryModule ] [testEmailFields] using discovery type [zen] and seed hosts providers [settings, file]
[2020-11-16T14:23:39,299][INFO ][o.e.n.Node ] [testEmailFields] initialized
[2020-11-16T14:23:39,302][INFO ][o.e.n.Node ] [testEmailFields] version[8.0.0-SNAPSHOT], pid[246899], build[unknown/unknown/c2864e38fb097bd0e8095aaa716263356a0c3f8a/2020-11-16T05:45:22.776947Z], OS[Linux/4.18.0-193.28.1.el8_2.x86_64/amd64], JVM[Oracle Corporation/OpenJDK 64-Bit Server VM/11.0.2/11.0.2+7]
[2020-11-16T14:23:39,303][INFO ][o.e.n.Node ] [testEmailFields] JVM home [/var/lib/jenkins/.java/openjdk-11.0.2-linux]
[2020-11-16T14:23:39,303][DEPRECATION][o.e.d.n.Node ] [testEmailFields] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="no-jdk" message="no-jdk distributions that do not bundle a JDK are deprecated and will be removed in a future release"
[2020-11-16T14:23:39,304][INFO ][o.e.n.Node ] [testEmailFields] JVM arguments [-Dfile.encoding=UTF8, -Des.scripting.update.ctx_in_params=false, -Des.search.rewrite_sort=true, -Des.set.netty.runtime.available.processors=false, -Des.transport.cname_in_publish_address=true, -Dgradle.dist.lib=/var/lib/jenkins/.gradle/wrapper/dists/gradle-6.6.1-all/ejrtlte9hlw8v6ii20a9584rs/gradle-6.6.1/lib, -Dgradle.user.home=/var/lib/jenkins/.gradle, -Dgradle.worker.jar=/var/lib/jenkins/.gradle/caches/6.6.1/workerMain/gradle-worker.jar, -Dio.netty.noKeySetOptimization=true, -Dio.netty.noUnsafe=true, -Dio.netty.recycler.maxCapacityPerThread=0, -Djava.awt.headless=true, -Djava.locale.providers=SPI,COMPAT, -Djna.nosys=true, -Dorg.gradle.native=false, -Dtests.artifact=watcher, -Dtests.gradle=true, -Dtests.logger.level=WARN, -Dtests.security.manager=true, -Dtests.seed=F1C7AE174070F0E4, -Dtests.task=:x-pack:plugin:watcher:internalClusterTest, --illegal-access=warn, -XX:+HeapDumpOnOutOfMemoryError, -esa, -XX:HeapDumpPath=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/heapdump, -Xms512m, -Xmx512m, -Dfile.encoding=UTF-8, -Djava.io.tmpdir=/var/lib/jenkins/workspace/elastic+elasticsearch+master+multijob-unix-compatibility/os/centos-8&&immutable/x-pack/plugin/watcher/build/testrun/internalClusterTest/temp, -Duser.country=US, -Duser.language=en, -Duser.variant, -ea]
[2020-11-16T14:23:39,304][WARN ][o.e.n.Node ] [testEmailFields] version [8.0.0-SNAPSHOT] is a pre-release version of Elasticsearch and is not suitable for production
[2020-11-16T14:23:39,304][INFO ][o.e.x.w.t.TimeWarpedWatcher] [testEmailFields] using time warped watchers plugin
[2020-11-16T14:23:39,305][INFO ][o.e.p.PluginsService ] [testEmailFields] no modules loaded
[2020-11-16T14:23:39,305][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.analysis.common.CommonAnalysisPlugin]
[2020-11-16T14:23:39,305][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.node.NodeMocksPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockMustacheScriptEngine$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.script.MockScriptService$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$AssertActionNamePlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.ESIntegTestCase$TestSeedPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.MockHttpTransport$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.TestGeoShapeFieldMapperPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.test.store.MockFSIndexStore$TestPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.transport.nio.MockNioTransportPlugin]
[2020-11-16T14:23:39,306][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.datastreams.DataStreamsPlugin]
[2020-11-16T14:23:39,307][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.ilm.IndexLifecycle]
[2020-11-16T14:23:39,307][INFO ][o.e.p.PluginsService ] [testEmailFields] loaded plugin [org.elasticsearch.xpack.watcher.test.TimeWarpedWatcher]
[2020-11-16T14:23:39,316][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] using [3] data paths, mounts [[/ (/dev/sda2)]], net usable_space [232.4gb], net total_space [349.7gb], types [xfs]
[2020-11-16T14:23:39,325][INFO ][o.e.e.NodeEnvironment ] [testEmailFields] heap size [512mb], compressed ordinary object pointers [true]
[2020-11-16T14:23:39,331][INFO ][o.e.n.Node ] [testEmailFields] node name [node_s2], node ID [ssLW2ClgQ1eVaIok0ITx-A], cluster name [SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster], roles [master, remote_cluster_client, data_hot, data_content, ingest, data_warm, data, data_cold]
[2020-11-16T14:23:39,389][INFO ][o.e.d.DiscoveryModule ] [testEmailFields] using discovery type [zen] and seed hosts providers [settings, file]
[2020-11-16T14:23:39,451][INFO ][o.e.n.Node ] [testEmailFields] initialized
[2020-11-16T14:23:39,467][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#2]]] starting ...
[2020-11-16T14:23:39,473][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#3]]] starting ...
[2020-11-16T14:23:39,481][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#1]]] starting ...
[2020-11-16T14:23:39,486][INFO ][o.e.t.TransportService ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#2]]] publish_address {127.0.0.1:44717}, bound_addresses {[::1]:43339}, {127.0.0.1:44717}
[2020-11-16T14:23:40,226][INFO ][o.e.c.s.ClusterApplierService] [node_s2] added {{node_s1}{VADujvzoSQyhHelIQIPhGA}{p5u_mAK3RpG62yrbtlzsbw}{127.0.0.1}{127.0.0.1:44717}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 2, reason: ApplyCommitRequest{term=1, version=2, sourceNode={node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:40,282][INFO ][o.e.c.s.ClusterApplierService] [node_s1] master node changed {previous [], current [{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}]}, added {{node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true},{node_s2}{ssLW2ClgQ1eVaIok0ITx-A}{Pd7_nYW2Roq0DxCQdyLOyw}{127.0.0.1}{127.0.0.1:38021}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 2, reason: ApplyCommitRequest{term=1, version=2, sourceNode={node_s0}{5WY1CYTqRbejYfTqT5LK1w}{dilD5u-TTAODUYWqiqEEfQ}{127.0.0.1}{127.0.0.1:37883}{cdhimrsw}{xpack.installed=true}}
[2020-11-16T14:23:40,284][INFO ][o.e.n.Node ] [[test_SUITE-TEST_WORKER_VM=[826]-CLUSTER_SEED=[-7448744538358753696]-HASH=[226EBECF37E]-cluster[T#2]]] started
[2020-11-16T14:23:40,285][INFO ][o.e.c.s.ClusterApplierService] [node_s0] added {{node_s1}{VADujvzoSQyhHelIQIPhGA}{p5u_mAK3RpG62yrbtlzsbw}{127.0.0.1}{127.0.0.1:44717}{cdhimrsw}{xpack.installed=true}}, term: 1, version: 2, reason: Publication{term=1, version=2}
[2020-11-16T14:23:40,439][INFO ][o.e.g.GatewayService ] [node_s0] recovered [0] indices into cluster_state
[2020-11-16T14:23:40,461][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.slm-history] for index patterns [.slm-history-5*]
[2020-11-16T14:23:40,545][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.triggered_watches] for index patterns [.triggered_watches*]
[2020-11-16T14:23:40,634][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.watches] for index patterns [.watches*]
[2020-11-16T14:23:40,744][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding index template [.watch-history-14] for index patterns [.watcher-history-14*]
[2020-11-16T14:23:40,838][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [node_s0] adding index lifecycle policy [watch-history-ilm-policy]
[2020-11-16T14:23:40,921][INFO ][o.e.x.i.a.TransportPutLifecycleAction] [node_s0] adding index lifecycle policy [slm-history-ilm-policy]
[2020-11-16T14:23:41,284][INFO ][o.e.l.LicenseService ] [node_s1] license [2860a10f-36d3-4bfd-b78f-c792d06c0567] mode [trial] - valid
[2020-11-16T14:23:41,288][INFO ][o.e.l.LicenseService ] [node_s2] license [2860a10f-36d3-4bfd-b78f-c792d06c0567] mode [trial] - valid
[2020-11-16T14:23:41,331][INFO ][o.e.l.LicenseService ] [node_s0] license [2860a10f-36d3-4bfd-b78f-c792d06c0567] mode [trial] - valid
[2020-11-16T14:23:41,351][WARN ][o.e.c.m.MetadataIndexTemplateService] [node_s0] legacy template [random_index_template] has index patterns [*] matching patterns from existing composable templates [.triggered_watches,.watch-history-14,.slm-history,.watches] with patterns (.triggered_watches => [.triggered_watches*],.watch-history-14 => [.watcher-history-14*],.slm-history => [.slm-history-5*],.watches => [.watches*]); this template [random_index_template] may be ignored in favor of a composable template at index creation time
[2020-11-16T14:23:41,353][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] adding template [random_index_template] for index patterns [*]
[2020-11-16T14:23:41,432][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: all set up test
[2020-11-16T14:23:41,437][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: freezing time on nodes
[2020-11-16T14:23:41,440][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STARTED], Tuple [v1=node_s2 (0), v2=STARTED], Tuple [v1=node_s1 (0), v2=STARTED]]
[2020-11-16T14:23:41,511][INFO ][o.e.x.w.WatcherService ] [node_s1] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:41,511][INFO ][o.e.x.w.WatcherService ] [node_s2] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:41,512][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s1] watcher has stopped
[2020-11-16T14:23:41,520][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s2] watcher has stopped
[2020-11-16T14:23:41,521][INFO ][o.e.x.w.WatcherService ] [node_s0] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:41,521][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s0] watcher has stopped
[2020-11-16T14:23:41,530][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STOPPED], Tuple [v1=node_s2 (0), v2=STOPPED], Tuple [v1=node_s1 (0), v2=STOPPED]]
[2020-11-16T14:23:41,540][DEPRECATION][o.e.d.c.m.MetadataCreateIndexService] [node_s0] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="index_name_starts_with_dot" message="index name [.watches] starts with a dot '.', in the next major version, index names starting with a dot are reserved for hidden indices and system indices"
[2020-11-16T14:23:41,550][INFO ][o.e.c.m.MetadataCreateIndexService] [node_s0] [.watches] creating index, cause [api], templates [.watches], shards [1]/[0]
[2020-11-16T14:23:41,555][INFO ][o.e.c.r.a.AllocationService] [node_s0] updating number_of_replicas to [1] for indices [.watches]
[2020-11-16T14:23:41,868][DEPRECATION][o.e.d.c.m.MetadataCreateIndexService] [node_s0] data_stream.dataset="deprecation.elasticsearch" data_stream.namespace="default" data_stream.type="logs" ecs.version="1.6" key="index_name_starts_with_dot" message="index name [.triggered_watches] starts with a dot '.', in the next major version, index names starting with a dot are reserved for hidden indices and system indices"
[2020-11-16T14:23:41,887][INFO ][o.e.c.m.MetadataCreateIndexService] [node_s0] [.triggered_watches] creating index, cause [api], templates [.triggered_watches], shards [1]/[1]
[2020-11-16T14:23:42,285][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STOPPED], Tuple [v1=node_s2, v2=STOPPED], Tuple [v1=node_s1, v2=STOPPED]]
[2020-11-16T14:23:42,534][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTING], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,544][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTING], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,553][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTING], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,568][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to start watcher, current states [Tuple [v1=node_s0, v2=STARTED], Tuple [v1=node_s2, v2=STARTED], Tuple [v1=node_s1, v2=STARTED]]
[2020-11-16T14:23:42,608][INFO ][o.e.x.w.WatcherService ] [node_s2] reloading watcher, reason [new local watcher shard allocation ids], cancelled [0] queued tasks
[2020-11-16T14:23:42,619][INFO ][o.e.x.w.WatcherService ] [node_s0] reloading watcher, reason [new local watcher shard allocation ids], cancelled [0] queued tasks
[2020-11-16T14:23:42,636][INFO ][o.e.c.r.a.AllocationService] [node_s0] current.health="GREEN" message="Cluster health status changed from [YELLOW] to [GREEN] (reason: [shards started [[.triggered_watches][0]]])." previous.health="YELLOW" reason="shards started [[.triggered_watches][0]]"
[2020-11-16T14:23:42,803][INFO ][o.s.s.s.SMTPServer ] [testEmailFields] SMTP server *:39213 stopping
[2020-11-16T14:23:42,804][INFO ][o.s.s.s.ServerThread ] [[org.subethamail.smtp.server.ServerThread *:39213]{smtpServerLocalSocketAddress=*:39213}] SMTP server *:39213 stopped
[2020-11-16T14:23:42,805][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [#testEmailFields]: clearing watcher state
[2020-11-16T14:23:42,808][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STARTED], Tuple [v1=node_s2 (0), v2=STARTED], Tuple [v1=node_s1 (0), v2=STARTED]]
[2020-11-16T14:23:42,889][INFO ][o.e.x.w.WatcherService ] [node_s2] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:42,889][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s2] watcher has stopped
[2020-11-16T14:23:42,895][INFO ][o.e.x.w.WatcherService ] [node_s1] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:42,895][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s1] watcher has stopped
[2020-11-16T14:23:42,905][INFO ][o.e.x.w.WatcherService ] [node_s0] stopping watch service, reason [watcher manually marked to shutdown by cluster state update]
[2020-11-16T14:23:42,905][INFO ][o.e.x.w.WatcherLifeCycleService] [node_s0] watcher has stopped
[2020-11-16T14:23:42,910][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] waiting to stop watcher, current states [Tuple [v1=node_s0 (0), v2=STOPPED], Tuple [v1=node_s2 (0), v2=STOPPED], Tuple [v1=node_s1 (0), v2=STOPPED]]
[2020-11-16T14:23:42,919][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: cleaning up after test
[2020-11-16T14:23:42,919][INFO ][o.e.t.InternalTestCluster] [testEmailFields] Clearing active scheme time frozen, expected healing time 0s
[2020-11-16T14:23:43,002][INFO ][o.e.c.m.MetadataDeleteIndexService] [node_s0] [.watches/DgNTavZhQqSWESdZgB3WmA] deleting index
[2020-11-16T14:23:43,002][INFO ][o.e.c.m.MetadataDeleteIndexService] [node_s0] [.triggered_watches/NrMycBW_QZGlIMxOa4-pSw] deleting index
[2020-11-16T14:23:43,188][INFO ][o.e.c.m.MetadataIndexTemplateService] [node_s0] removing template [random_index_template]
[2020-11-16T14:23:43,252][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] [HistoryTemplateEmailMappingsTests#testEmailFields]: cleaned up after test
[2020-11-16T14:23:43,252][INFO ][o.e.x.w.h.HistoryTemplateEmailMappingsTests] [testEmailFields] after test
REPRODUCE WITH: ./gradlew ':x-pack:plugin:watcher:internalClusterTest' --tests "org.elasticsearch.xpack.watcher.history.HistoryTemplateEmailMappingsTests.testEmailFields" -Dtests.seed=F1C7AE174070F0E4 -Dtests.security.manager=true -Dtests.locale=pt-BR -Dtests.timezone=Asia/Brunei -Druntime.java=11
```
| test | historytemplateemailmappingstests testemailfields fails build scan repro line gradlew x pack plugin watcher internalclustertest tests org elasticsearch xpack watcher history historytemplateemailmappingstests testemailfields dtests seed dtests security manager true dtests locale hr hr dtests timezone systemv druntime java reproduces locally no applicable branches master failure history failure excerpt smtp server starting before test setting up test smtpserverlocalsocketaddress smtp server started setup internaltestcluster cluster seed hash cluster with seed using dedicated masters data nodes and coord only nodes master nodes are version pid build os jvm jvm home data stream dataset deprecation elasticsearch data stream namespace default data stream type logs ecs version key no jdk message no jdk distributions that do not bundle a jdk are deprecated and will be removed in a future release jvm arguments version is a pre release version of elasticsearch and is not suitable for production using time warped watchers plugin no modules loaded loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin using data paths mounts net usable space net total space types heap size compressed ordinary object pointers node name node id cluster name cluster seed hash cluster roles using discovery type and seed hosts providers initialized version pid build os jvm jvm home data stream dataset deprecation elasticsearch data stream namespace default data stream type logs ecs version key no jdk message no jdk distributions that do not bundle a jdk are deprecated and will be removed in a future release jvm arguments version is a pre release version of elasticsearch and is not suitable for production using time warped watchers plugin no modules loaded loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin 
loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin using data paths mounts net usable space net total space types heap size compressed ordinary object pointers node name node id cluster name cluster seed hash cluster roles using discovery type and seed hosts providers initialized version pid build os jvm jvm home data stream dataset deprecation elasticsearch data stream namespace default data stream type logs ecs version key no jdk message no jdk distributions that do not bundle a jdk are deprecated and will be removed in a future release jvm arguments version is a pre release version of elasticsearch and is not suitable for production using time warped watchers plugin no modules loaded loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin loaded plugin using data paths mounts net usable space net total space types heap size compressed ordinary object pointers node name node id cluster name cluster seed hash cluster roles using discovery type and seed hosts providers initialized cluster seed hash cluster starting cluster seed hash cluster starting cluster seed hash cluster starting cluster seed hash cluster publish address bound addresses cluster seed hash cluster publish address bound addresses cluster seed hash cluster publish address bound addresses setting initial configuration to votingconfiguration bootstrap placeholder node a elected as master nodes joined term version delta master node changed previous current added node a cdhimrsw xpack installed true cluster uuid set to cluster uuid set to master node changed previous current added node ttaoduywqiqeefq cdhimrsw xpack installed true term version reason applycommitrequest term version sourcenode node ttaoduywqiqeefq cdhimrsw xpack installed true cluster seed hash cluster started master node changed previous current added node a cdhimrsw xpack installed true term version reason 
publication term version cluster seed hash cluster started node join term version delta added node vadujvzosqyhheliqiphga cdhimrsw xpack installed true skipping monitor as a check is already in progress added node vadujvzosqyhheliqiphga cdhimrsw xpack installed true term version reason applycommitrequest term version sourcenode node ttaoduywqiqeefq cdhimrsw xpack installed true master node changed previous current added node ttaoduywqiqeefq cdhimrsw xpack installed true node a cdhimrsw xpack installed true term version reason applycommitrequest term version sourcenode node ttaoduywqiqeefq cdhimrsw xpack installed true cluster seed hash cluster started added node vadujvzosqyhheliqiphga cdhimrsw xpack installed true term version reason publication term version recovered indices into cluster state adding index template for index patterns adding index template for index patterns adding index template for index patterns adding index template for index patterns adding index lifecycle policy adding index lifecycle policy license mode valid license mode valid license mode valid legacy template has index patterns matching patterns from existing composable templates with patterns triggered watches watch history slm history watches this template may be ignored in favor of a composable template at index creation time adding template for index patterns all set up test freezing time on nodes waiting to stop watcher current states tuple tuple stopping watch service reason stopping watch service reason watcher has stopped watcher has stopped stopping watch service reason watcher has stopped waiting to stop watcher current states tuple tuple data stream dataset deprecation elasticsearch data stream namespace default data stream type logs ecs version key index name starts with dot message index name starts with a dot in the next major version index names starting with a dot are reserved for hidden indices and system indices creating index cause templates shards updating number of 
replicas to for indices data stream dataset deprecation elasticsearch data stream namespace default data stream type logs ecs version key index name starts with dot message index name starts with a dot in the next major version index names starting with a dot are reserved for hidden indices and system indices creating index cause templates shards waiting to start watcher current states tuple tuple waiting to start watcher current states tuple tuple waiting to start watcher current states tuple tuple waiting to start watcher current states tuple tuple waiting to start watcher current states tuple tuple reloading watcher reason cancelled queued tasks reloading watcher reason cancelled queued tasks current health green message cluster health status changed from to reason previous health yellow reason shards started smtp server stopping smtpserverlocalsocketaddress smtp server stopped clearing watcher state waiting to stop watcher current states tuple tuple stopping watch service reason watcher has stopped stopping watch service reason watcher has stopped stopping watch service reason watcher has stopped waiting to stop watcher current states tuple tuple cleaning up after test clearing active scheme time frozen expected healing time deleting index deleting index removing template cleaned up after test after test reproduce with gradlew x pack plugin watcher internalclustertest tests org elasticsearch xpack watcher history historytemplateemailmappingstests testemailfields dtests seed dtests security manager true dtests locale pt br dtests timezone asia brunei druntime java | 1 |
58,382 | 3,088,985,232 | IssuesEvent | 2015-08-25 19:18:13 | pavel-pimenov/flylinkdc-r5xx | https://api.github.com/repos/pavel-pimenov/flylinkdc-r5xx | closed | Hang when closing an unconnected ADC hub | bug imported Priority-Medium | _From [mike.kor...@gmail.com](https://code.google.com/u/101495626515388303633/) on October 14, 2014 22:28:41_
When the client started, it did not connect to the FlylinkDC support ADC hub.
The reconnect button did not work, and neither did the menu command.
There was no scrollbar in the main chat.
When I tried to close the window, the client hung.
Dump - https://yadi.sk/d/jCMx-T9ec2Wtc
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=1505_ | 1.0 | Hang when closing an unconnected ADC hub - _From [mike.kor...@gmail.com](https://code.google.com/u/101495626515388303633/) on October 14, 2014 22:28:41_
When the client started, it did not connect to the FlylinkDC support ADC hub.
The reconnect button did not work, and neither did the menu command.
There was no scrollbar in the main chat.
When I tried to close the window, the client hung.
Dump - https://yadi.sk/d/jCMx-T9ec2Wtc
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=1505_ | non_test | hang when closing an unconnected adc hub from on october when the client started it did not connect to the flylinkdc support adc hub the reconnect button did not work and neither did the menu command there was no scrollbar in the main chat when i tried to close the window the client hung dump original issue | 0 |
140,966 | 11,385,456,521 | IssuesEvent | 2020-01-29 11:08:17 | brave/brave-ios | https://api.github.com/repos/brave/brave-ios | closed | No SNTP image is shown when toggle the `Show sponsored images` switch | Epic: NTP QA/Yes bug duplicate test-plan/specified | <!-- Have you searched for similar issues on the repository?
Before submitting this issue, please visit our wiki for common ones: https://github.com/brave/browser-ios/wiki
For more, check out our community site: https://community.brave.com/ -->
### Description:
No SNTP image is shown when toggling the `Show sponsored images` switch
### Steps to Reproduce
1. Clean profile 1.14.3
2. Enable rewards through on-boarding
3. Open SNTP saw sponsored image once
4. Navigate to settings->New Tab page->Turn off `Show sponsored images`
5. Remove the app from memory
6. Relaunch the app and navigate to settings->New Tab page->Turn on `show sponsored images`
Actual: No sponsored images are shown.
**Actual result:** <!-- Add screenshots if needed -->
No SNTP image is shown when toggling the `Show sponsored images` switch to ON
**Expected result:**
SNTP images should be shown when the `Show sponsored images` switch is ON
**Reproduces how often:** [Easily reproduced, Intermittent Issue]
Always
**Brave Version:** <!-- Provide full details Eg: v1.4.2(17.09.08.16) -->
1.14.3 (20.01.28.18)
**Device details:** <!-- Model type and iOS version Eg: iPhone 6s+ (iOS 10.3.3)-->
iPhone 8 - iOS 13.3
**Website problems only:**
- did you check with Brave Shields down? NA
- did you check in Safari/Firefox (WkWebView-based browsers)? NA
### Additional Information
cc: @brave/legacy_qa @jhreis @davidtemkin @anthonypkeane @jamesmudgett | 1.0 | No SNTP image is shown when toggle the `Show sponsored images` switch - <!-- Have you searched for similar issues on the repository?
Before submitting this issue, please visit our wiki for common ones: https://github.com/brave/browser-ios/wiki
For more, check out our community site: https://community.brave.com/ -->
### Description:
No SNTP image is shown when toggling the `Show sponsored images` switch
### Steps to Reproduce
1. Clean profile 1.14.3
2. Enable rewards through on-boarding
3. Open SNTP saw sponsored image once
4. Navigate to settings->New Tab page->Turn off `Show sponsored images`
5. Remove the app from memory
6. Relaunch the app and navigate to settings->New Tab page->Turn on `show sponsored images`
Actual: No sponsored images are shown.
**Actual result:** <!-- Add screenshots if needed -->
No SNTP image is shown when toggling the `Show sponsored images` switch to ON
**Expected result:**
SNTP images should be shown when the `Show sponsored images` switch is ON
**Reproduces how often:** [Easily reproduced, Intermittent Issue]
Always
**Brave Version:** <!-- Provide full details Eg: v1.4.2(17.09.08.16) -->
1.14.3 (20.01.28.18)
**Device details:** <!-- Model type and iOS version Eg: iPhone 6s+ (iOS 10.3.3)-->
iPhone 8 - iOS 13.3
**Website problems only:**
- did you check with Brave Shields down? NA
- did you check in Safari/Firefox (WkWebView-based browsers)? NA
### Additional Information
cc: @brave/legacy_qa @jhreis @davidtemkin @anthonypkeane @jamesmudgett | test | no sntp image is shown when toggle the show sponsored images switch have you searched for similar issues on the repository before submitting this issue please visit our wiki for common ones for more check out our community site description no sntp image is shown when toggle the show sponsored images switch steps to reproduce clean profile enable rewards through on boarding open sntp saw sponsored image once navigate to settings new tab page turn off show sponsored images remove the app from memory relaunch the app and navigate to settings new tab page turn on show sponsored images actual no sponsored images are shown actual result no sntp image is shown when toggle the show sponsored images switch to on expected result sntp images should be shown when show sponsored images switch is on reproduces how often always brave version device details iphone ios website problems only did you check with brave shields down na did you check in safari firefox wkwebview based browsers na additional information cc brave legacy qa jhreis davidtemkin anthonypkeane jamesmudgett | 1 |
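Each record in this dump ends with a lowercased `text` column derived from the record's title and body. The derivation code is not shown anywhere in the dump; from inspection it behaves roughly like lowercasing and collapsing every non-letter run into a single space. A rough, assumed reimplementation:

```python
import re

def clean(text: str) -> str:
    """Approximate the dump's `text` column: lowercase, keep only letter runs."""
    text = text.lower()
    # collapse every run of non-letters (punctuation, digits, backticks) to one space
    text = re.sub(r"[^a-z]+", " ", text)
    return " ".join(text.split())

# Title of the Brave record above, as it appears in the raw `title` column.
sample = "No SNTP image is shown when toggle the `Show sponsored images` switch"
print(clean(sample))
```

Run on the Brave record's title, this reproduces the opening words of that record's `text` column; the real pipeline evidently also drops some tokens entirely (dates, hex test seeds), so this is only an approximation.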
306,014 | 26,428,776,892 | IssuesEvent | 2023-01-14 14:45:27 | PalisadoesFoundation/talawa-api | https://api.github.com/repos/PalisadoesFoundation/talawa-api | closed | Resolver: Create tests for mailer.js | good first issue parent points 01 test | The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage.
Tests need to be written for file `lib/helper_functions/mailer.js`
- We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed.
- When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a
single file with the name `tests/helper_functions/mailer.spec.js`. You may need to create the appropriate directory structure to do this.
### IMPORTANT:
Please refer to the parent issue on how to implement these tests correctly:
- https://github.com/PalisadoesFoundation/talawa-api/issues/490
### PR Acceptance Criteria
- When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR.
- [The current code coverage for the file can be found here](https://codecov.io/gh/PalisadoesFoundation/talawa-api/tree/develop/lib/resolvers/organization_query/). If the file isn't found in this directory, or there is a 404 error, then tests have not been created.
- The PR will show a report for the code coverage for the file you have added. You can use that as a guide. | 1.0 | Resolver: Create tests for mailer.js - The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage.
Tests need to be written for file `lib/helper_functions/mailer.js`
- We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed.
- When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a
single file with the name `tests/helper_functions/mailer.spec.js`. You may need to create the appropriate directory structure to do this.
### IMPORTANT:
Please refer to the parent issue on how to implement these tests correctly:
- https://github.com/PalisadoesFoundation/talawa-api/issues/490
### PR Acceptance Criteria
- When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR.
- [The current code coverage for the file can be found here](https://codecov.io/gh/PalisadoesFoundation/talawa-api/tree/develop/lib/resolvers/organization_query/). If the file isn't found in this directory, or there is a 404 error, then tests have not been created.
- The PR will show a report for the code coverage for the file you have added. You can use that as a guide. | test | resolver create tests for mailer js the talawa api code base needs to be reliable this means we need to have test code coverage tests need to be written for file lib helper functions mailer js we will need the api to be refactored for all methods classes and or functions found in this file for testing to be correctly executed when complete all methods classes and or functions in the refactored file will need to be tested these tests must be placed in a single file with the name tests helper functions mailer sepc js you may need to create the appropriate directory structure to do this important please refer to the parent issue on how to implement these tests correctly pr acceptance criteria when complete this file must show coverage when merged into the code base this will be clearly visible when you submit your pr if the file isn t found in this directory or there is a error then tests have not been created the pr will show a report for the code coverage for the file you have added you can use that as a guide | 1 |
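The coverage goal in the Talawa record above hinges on one refactor: the mail transport must be injectable, so that both the success and error branches of the mailer can be exercised without a live SMTP server. A minimal sketch of that pattern, in Python for brevity (the actual spec would be a JavaScript/Jest file, and the `mailer`/`StubTransport` names here are hypothetical, not the module's real API):

```python
class StubTransport:
    """Records send_mail calls instead of talking to a real SMTP server."""
    def __init__(self):
        self.sent = []

    def send_mail(self, to, subject, body):
        self.sent.append((to, subject, body))
        return {"accepted": [to]}

def mailer(transport, to, subject, body):
    """Hypothetical stand-in for the module under test."""
    if not to:
        # the error branch must also be exercised to reach 100% coverage
        raise ValueError("recipient is required")
    return transport.send_mail(to, subject, body)

# Exercise both branches against the stub.
stub = StubTransport()
result = mailer(stub, "user@example.com", "greeting", "hello")

try:
    mailer(stub, "", "greeting", "hello")
    error_raised = False
except ValueError:
    error_raised = True
```

In a Jest spec the same shape becomes a `jest.fn()` stub plus `expect(...).toHaveBeenCalledWith(...)` assertions, one test per branch.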
217,046 | 16,832,924,713 | IssuesEvent | 2021-06-18 08:09:12 | nuxsmin/sysPass | https://api.github.com/repos/nuxsmin/sysPass | closed | LDAP Causing System Failure | kind/question triage/need-test v3 | **sysPass Version**
Can be found on `Config -> Information` tab
3.1 (312.20030701)
Config: 312.20030701
App: 312.20030701
DB: 312.20030701
**Describe the question**
A clear and concise description.
When LDAP is enabled, the system no longer works. Upon logging in, I get redirected to http://syspass/undefined, as shown in the screenshot below. I had to disable LDAP from the config.xml file.
**Screenshots**
If applicable, add screenshots to help explain your problem.

**Platform (please complete the following information):**
- OS: [e.g. Linux, Android]: Windows (Client) // Linux (Server)
- OS Version: 10 (Client) // 20.04 LTS
- Browser [e.g. Firefox, Chrome] Chrome
**Additional context**
Add any other context about the problem here.
LDAP Configuration from config.xml (stripping some information):
```
root@syspass:/var/www/html/syspass/app/config# cat config.xml | grep -i ldap
<ldapAds>0</ldapAds>
<ldapBase></ldapBase>
<ldapBindPass>password</ldapBindPass> >>>>>>>>>>>> Here using the server admin credentials
<ldapBindUser>username</ldapBindUser> >>>>>>>>>>>> Here using the server admin credentials
<ldapDefaultGroup>1</ldapDefaultGroup>
<ldapDefaultProfile>1</ldapDefaultProfile>
<ldapEnabled>0</ldapEnabled> >>>>>>>>>>>> Here set to 1 when enabled
<ldapGroup>CN=****,OU=****,DC=****,DC=****</ldapGroup>
<ldapProxyUser></ldapProxyUser>
<ldapServer>1.2.3.4</ldapServer> >>>>>>>>>>>> Here using the server IP Address
<ldapTlsEnabled>0</ldapTlsEnabled>
<ldapType>2</ldapType>
```
Connecting is done with Microsoft Active Directory | 1.0 | test | 1
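A misconfiguration like the one quoted in this record can be caught before LDAP is switched on. The sketch below parses the same `config.xml` element names with Python's standard library; the validation rules and sample values are assumptions for illustration, not sysPass's actual checks — though note that `ldapBase` is empty in the report above, which a check like this would flag:

```python
import xml.etree.ElementTree as ET

# Sample values for illustration only; element names follow the report's config.xml.
CONFIG_XML = """
<root>
  <ldapEnabled>1</ldapEnabled>
  <ldapServer>1.2.3.4</ldapServer>
  <ldapBase></ldapBase>
  <ldapBindUser>username</ldapBindUser>
  <ldapType>2</ldapType>
</root>
"""

def ldap_config_problems(xml_text: str) -> list:
    """Return human-readable problems with the LDAP settings.

    The rules below are assumptions for illustration, not sysPass's
    actual validation logic.
    """
    root = ET.fromstring(xml_text)
    get = lambda tag: (root.findtext(tag) or "").strip()
    problems = []
    if get("ldapEnabled") == "1":
        if not get("ldapServer"):
            problems.append("ldapServer is empty")
        if not get("ldapBase"):
            problems.append("ldapBase is empty (search base is required)")
        if not get("ldapBindUser"):
            problems.append("ldapBindUser is empty")
    return problems

print(ldap_config_problems(CONFIG_XML))
# ['ldapBase is empty (search base is required)']
```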
335,987 | 30,111,945,250 | IssuesEvent | 2023-06-30 08:28:50 | hpcaitech/ColossalAI | https://api.github.com/repos/hpcaitech/ColossalAI | closed | [shardformer] add embedding gradient check | testing shardformer | In the shardformer tests, we did not check the embedding gradients. However, this is extremely important: the embedding is usually the first layer of the module, so its correctness largely ensures the whole backward pass is correct. Moreover, some embeddings are tied weights, and it is important that the tied-weight gradient is correct. | 1.0 | test | 1
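The kind of check this record asks for can be illustrated without any deep-learning framework: compare the analytic gradient of a toy embedding lookup against a finite-difference estimate. This is only a sketch of the idea (ColossalAI's shardformer tests operate on real modules); the repeated token shows why gradients must accumulate per row — the same accumulation that matters for tied weights:

```python
def loss(W, tokens):
    # Toy "embedding" loss: sum of squares of the looked-up rows.
    return sum(W[t][d] ** 2 for t in tokens for d in range(len(W[0])))

def analytic_grad(W, tokens):
    # dL/dW[v][d] = 2 * W[v][d] * (number of times v appears in tokens)
    counts = {t: tokens.count(t) for t in set(tokens)}
    return [[2 * W[v][d] * counts.get(v, 0) for d in range(len(W[0]))]
            for v in range(len(W))]

def numeric_grad(W, tokens, eps=1e-6):
    # Central finite differences, perturbing one entry at a time.
    G = [[0.0] * len(W[0]) for _ in W]
    for v in range(len(W)):
        for d in range(len(W[0])):
            orig = W[v][d]
            W[v][d] = orig + eps
            up = loss(W, tokens)
            W[v][d] = orig - eps
            down = loss(W, tokens)
            W[v][d] = orig
            G[v][d] = (up - down) / (2 * eps)
    return G

W = [[0.1, -0.2], [0.3, 0.4], [0.0, 0.5]]
tokens = [0, 2, 2]   # token 2 repeats: its row's gradient must accumulate
ga, gn = analytic_grad(W, tokens), numeric_grad(W, tokens)
assert all(abs(a - b) < 1e-4 for ra, rb in zip(ga, gn) for a, b in zip(ra, rb))
```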
208,446 | 23,605,495,371 | IssuesEvent | 2022-08-24 07:54:43 | ioana-nicolae/second | https://api.github.com/repos/ioana-nicolae/second | closed | CVE-2020-36189 (High) detected in jackson-databind-2.7.9.jar - autoclosed | security vulnerability | ## CVE-2020-36189 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.7.9.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.7.9/a4c0b14c7dd85bdf4d25da074e90a10fa4b9b88b/jackson-databind-2.7.9.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.7.9.jar** (Vulnerable Library)
<p>Found in base branch: <b>branch3</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189>CVE-2020-36189</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: 2.9.10.8</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue | True | non_test | 0
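The advisory fields in this record (affected range "2.x before 2.9.10.8", fix resolution "2.9.10.8") can be turned into a simple dependency gate. A hedged sketch with naive dotted-version parsing — it ignores pre-release qualifiers, so treat it as an illustration rather than a complete checker:

```python
def parse_version(v: str):
    # Naive: assumes purely numeric dotted versions (no "-rc1" etc.).
    return tuple(int(p) for p in v.split("."))

FIXED_IN = parse_version("2.9.10.8")  # fix resolution from the advisory above

def is_vulnerable(version: str) -> bool:
    """CVE-2020-36189 affects jackson-databind 2.x before 2.9.10.8."""
    v = parse_version(version)
    return v[0] == 2 and v < FIXED_IN

print(is_vulnerable("2.7.9"))     # True  (the version in this report)
print(is_vulnerable("2.9.10.8"))  # False
```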
732,363 | 25,256,848,645 | IssuesEvent | 2022-11-15 18:52:50 | serverlessworkflow/synapse | https://api.github.com/repos/serverlessworkflow/synapse | closed | Cannot save a workflow after an unsuccessful save attempt | bug priority: high dashboard weight: 1 | **What happened**:
Cannot save a workflow after an unsuccessful save attempt in the editor
**What you expected to happen**:
The workflow to be saved after errors have been fixed
**How to reproduce it**:
1. Create a workflow such as the following, which references a non-existing external definition (i.e. `functions`)
2. Try saving the workflow => an exception will be thrown because of a 404 NOT FOUND (i.e. `https://test.com/path_that_does_not_exist`)
3. Fix the `functions` reference by using a URI that returns an actual function definition collection
4. Save => nothing happens (no error, no console logs: nothing)
```yaml
id: breaks-editor-save
name: Breaks editor save feature
version: 1.0.0
specVersion: 0.8
functions: https://test.com/path_that_does_not_exist
states:
- name: CallToExternalFunction
type: operation
actions:
- name: callToExternalFunction
functionRef:
refName: get-pet-by-id
end: true
```
**Environment**:
Main branch at https://github.com/neuroglia-io/synapse | 1.0 | non_test | 0
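The symptom described in this record — one failed save, then silent no-ops with no errors or console logs — is the classic shape of a busy flag that is set before a remote call and never cleared when the call throws. The sketch below reproduces that shape and the try/finally fix; it is an illustration only, not Synapse's dashboard code, and the second URL is a hypothetical "fixed" endpoint:

```python
class WorkflowSaver:
    """Sketch of the failure mode above: if a busy flag is not cleared
    after a failed save, every later save is silently ignored."""

    def __init__(self, resolver, buggy=False):
        self.resolver = resolver   # callable: uri -> definitions (may raise)
        self.buggy = buggy
        self.saving = False
        self.saved = []

    def save(self, workflow):
        if self.saving:            # stale flag left by an earlier failure
            return False           # ...silently does nothing (the symptom)
        self.saving = True
        if self.buggy:
            self.resolver(workflow["functions"])  # raise -> flag never reset
            self.saved.append(workflow)
            self.saving = False
        else:
            try:
                self.resolver(workflow["functions"])
                self.saved.append(workflow)
            finally:
                self.saving = False  # always reset, even on error
        return True

def resolver(uri):
    if "does_not_exist" in uri:
        raise LookupError("404 NOT FOUND: " + uri)

def attempt(saver, uri):
    try:
        return saver.save({"functions": uri})
    except LookupError:
        return False

bad, good = WorkflowSaver(resolver, buggy=True), WorkflowSaver(resolver)
for s in (bad, good):
    attempt(s, "https://test.com/path_that_does_not_exist")  # step 2: fails
    attempt(s, "https://test.com/functions.json")            # step 4: retry
print(len(bad.saved), len(good.saved))  # 0 1
```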
23,538 | 16,384,132,056 | IssuesEvent | 2021-05-17 08:16:47 | google/web-stories-wp | https://api.github.com/repos/google/web-stories-wp | opened | E2E Tests: take screenshots on failures | Package: E2E Tests Pod: WP & Infra Type: Infrastructure | <!-- NOTE: For help requests, support questions, or general feedback, please use the WordPress.org forums instead: https://wordpress.org/support/plugin/web-stories/ -->
## Task Description
<!-- A clear and concise description of what this task is about. -->
Taking screenshots when an e2e test fails on CI would make it much easier for us to debug.
Some prior art:
https://github.com/WordPress/gutenberg/pull/26664
https://github.com/WordPress/gutenberg/pull/28449
Just like for #7549, it would mean using `jest-circus` as the test runner due to the events it emits to make this easier.
We could try using Percy for the screenshots, but storing them as GitHub Actions artifacts is cheaper. | 1.0 | non_test | 0
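Stripped of the jest-circus/Percy specifics, "take a screenshot when a test fails" is a wrapper that runs the test and, on exception, stores a capture plus the traceback as an artifact. A language-agnostic sketch of that pattern (in the real setup the capture callable would be the browser screenshot):

```python
import traceback

def run_with_artifact(test_fn, capture_fn, artifacts: dict):
    """Run test_fn(); if it raises, store whatever capture_fn() returns
    under the test's name. Generic sketch of the screenshot-on-failure
    pattern; the jest-circus event hooks are not reproduced here."""
    try:
        test_fn()
        return True
    except Exception:
        artifacts[test_fn.__name__] = {
            "capture": capture_fn(),
            "trace": traceback.format_exc(),
        }
        return False

artifacts = {}

def passing():
    pass

def failing():
    raise AssertionError("element not found")

run_with_artifact(passing, lambda: "fake-screenshot-bytes", artifacts)
run_with_artifact(failing, lambda: "fake-screenshot-bytes", artifacts)
print(sorted(artifacts))  # ['failing'] -- only failures leave artifacts
```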
184,277 | 14,284,922,389 | IssuesEvent | 2020-11-23 13:13:31 | INRIA/spoon | https://api.github.com/repos/INRIA/spoon | closed | Refactor(Test): Remove jre matching from test cases and replace with junit annotation | good first issue test | ### Problem
At spoon we have some test cases which have some weird problems at jdk8 and are run on jdk9+ only.
Example: [testRemoveDeprecatedMethods](https://github.com/INRIA/spoon/blob/f6a2849bb9d3de9a992055fbbb67a0a27e7dc874/src/test/java/spoon/test/refactoring/RefactoringTest.java#L152)
JUnit recently got an annotation for this. We could replace the self-written code with it.
See https://junit.org/junit5/docs/current/api/org.junit.jupiter.api/org/junit/jupiter/api/condition/DisabledForJreRange.html
### Tasks
Tasks are:
- Find all testcases using the self written jre matching
- Replace with annotation
- Remove/refactor unused functions afterwards
As this is a starter issue, feel free to give it a try, ask questions or ping me for help if you need some.
### Why refactor this?
As always, self-written code is nice but requires maintenance. If a stable framework provides the same features and we already use it, we can replace our code with its out-of-the-box solution. This reduces the Spoon codebase and improves maintainability. | 1.0 | test | 1
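With JUnit 5, the hand-rolled check this record wants to remove can become a condition annotation such as `@DisabledForJreRange(max = JRE.JAVA_8)` (see the javadoc linked above). The same "framework condition instead of self-written version matching" pattern is shown below with Python's `unittest`, purely as a runnable illustration of the idea:

```python
import sys
import unittest

class RefactoringTest(unittest.TestCase):
    # Framework-provided condition instead of a hand-rolled version check.
    # (The real issue is about JUnit 5's @DisabledForJreRange; this is the
    # analogous decorator in Python's unittest, for illustration only.)
    @unittest.skipIf(sys.version_info < (3, 9), "needs runtime >= 3.9")
    def test_remove_deprecated_methods(self):
        self.assertTrue(True)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(RefactoringTest)
result = unittest.TestResult()
suite.run(result)
print(result.wasSuccessful())  # True -- the test either runs or is skipped
```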
101,391 | 12,681,121,001 | IssuesEvent | 2020-06-19 14:51:31 | cds-snc/notification-api | https://api.github.com/repos/cds-snc/notification-api | opened | Allow users to change between English and French default brandings | Design M/M :tshirt: | Currently a platform admin has to update the branding for a service. This is because before, we had only a single default branding, and we needed to validate a change to a custom brand.
In addition to this, we should allow users to be able to change between the English and French default brandings without requiring CDS support.
I'm thinking maybe radio buttons above the existing custom branding upload page? This probably needs Design input before it's ready for Dev.
<img width="1074" alt="Screen Shot 2020-06-19 at 8 51 13 AM" src="https://user-images.githubusercontent.com/5032149/85145616-0f0abb00-b20a-11ea-9712-358498a781da.png">
 | 1.0 | non_test | 0
403,325 | 27,411,593,581 | IssuesEvent | 2023-03-01 10:55:12 | zsabbagh/dd2480-assignment-4 | https://api.github.com/repos/zsabbagh/dd2480-assignment-4 | closed | Create UML diagram for issue "Log something only once (or only N times) " | documentation | **Part of the criteria for P**
Key features affected by the issue are shown in UML class diagrams (for refactorings: include before/after).
Note: you do not have to show classes, fields, or methods that are not relevant, unless they help with
the overall understanding. Typically, the diagram would contain 5–10 classes.
 | 1.0 | non_test | 0
136,274 | 5,279,087,238 | IssuesEvent | 2017-02-07 10:15:23 | projectcalico/felix | https://api.github.com/repos/projectcalico/felix | closed | Host endpoint resolution confused by interface renaming | area/felix kind/bug priority/P1 | Investigating the flaky test https://github.com/projectcalico/calicoctl/issues/1492, turns out to be a real failure:
- The test adds a second veth to the gateway `DockerHost` container.
- When adding a veth to the container, felix sees the veth appear with name `veth12345`, then libcalico-go renames it to `eth1` but felix seems to continue thinking the veth has the old name.
- Endpoint never resolves.
 | 1.0 | non_test | 0
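One way to avoid being confused by the rename described in this record is to key interface state on the kernel's stable ifindex rather than on the name, so a rename event simply updates the existing record. A minimal sketch of that idea — Felix's actual dataplane logic is far more involved:

```python
class InterfaceTracker:
    """Track interfaces by kernel ifindex, not by name, so a rename
    (veth12345 -> eth1) updates the record instead of orphaning it.
    Sketch only, not Felix's real code."""

    def __init__(self):
        self.by_index = {}

    def link_update(self, ifindex: int, name: str):
        # A 'new link' event and a 'link renamed' event have the same
        # shape here: both just record the current name for the index.
        self.by_index[ifindex] = name

    def name_of(self, ifindex: int) -> str:
        return self.by_index[ifindex]

t = InterfaceTracker()
t.link_update(7, "veth12345")   # veth appears
t.link_update(7, "eth1")        # libcalico-go renames it
print(t.name_of(7))             # eth1 -- the rename is not missed
```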
136,124 | 11,038,107,894 | IssuesEvent | 2019-12-08 11:43:49 | ably/ably-cocoa | https://api.github.com/repos/ably/ably-cocoa | opened | Flaky test: JSON encoder: ignore invalid response | test suite | ```
✗ Utilities__JSON_Encoder__in_Rest__should_ignore_invalid_response_payload, Error is nil
``` | 1.0 | test | 1
276,459 | 23,992,934,882 | IssuesEvent | 2022-09-14 04:04:07 | Tencent/bk-ci | https://api.github.com/repos/Tencent/bk-ci | closed | bug: the monitoring service errors when inserting numeric fields with empty values into InfluxDB | kind/bug for gray for test area/ci/backend tested grayed streams/for test streams/for gray streams/done | **Problem:** When the monitoring service inserts a numeric field whose value is empty into InfluxDB, it converts the field's value to an empty string, which causes a type-mismatch error. The error message is:
org.influxdb.InfluxDBException$FieldTypeConflictException: partial write: field type conflict: input field "endTime" on measurement "AtomMonitorData" is type string, already exists as type integer dropped=1
**Fix:** Numeric fields whose values are empty must not be force-converted to empty strings. | 3.0 | test | 1
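The fix described in this record — do not coerce empty numeric fields to `""` — amounts to dropping `None`-valued fields from a point before writing it, so `endTime` keeps its integer type on the server. A sketch (the field names follow the error message above; this is not bk-ci's actual monitoring code):

```python
def clean_fields(fields: dict) -> dict:
    """Drop fields whose value is None instead of coercing them to ''.

    Writing endTime='' after endTime has already been stored as an
    integer makes InfluxDB reject the point with a field type conflict,
    as in the error quoted above."""
    return {k: v for k, v in fields.items() if v is not None}

point = {
    "measurement": "AtomMonitorData",
    "fields": clean_fields({
        "startTime": 1663128000000,  # illustrative timestamp
        "endTime": None,             # not finished yet -> omit, don't blank
        "errorCode": 0,              # falsy but valid -> must be kept
    }),
}
print(point["fields"])  # {'startTime': 1663128000000, 'errorCode': 0}
```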
4,256 | 21,102,810,495 | IssuesEvent | 2022-04-04 15:51:21 | carbon-design-system/carbon | https://api.github.com/repos/carbon-design-system/carbon | closed | carbon-components.min.css and carbon-components.min.js not found | status: needs triage 🕵️♀️ status: waiting for maintainer response 💬 | ### Package
carbon-components
### Browser
_No response_
### Operating System
_No response_
### Package version
11.0.0
### React version
_No response_
### Automated testing tool and ruleset
not needed
### Assistive technology
_No response_
### Description
https://unpkg.com/carbon-components/scripts/carbon-components.min.js
https://unpkg.com/carbon-components/css/carbon-components.min.css
### WCAG 2.1 Violation
_No response_
### CodeSandbox example
not needed
### Steps to reproduce
open unpkg.com links, files not found
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/carbon-design-system/carbon/blob/f555616971a03fd454c0f4daea184adf41fff05b/.github/CODE_OF_CONDUCT.md)
- [X] I checked the [current issues](https://github.com/carbon-design-system/carbon/issues) for duplicate problems | True | non_test | 0
102,626 | 8,851,055,371 | IssuesEvent | 2019-01-08 14:52:06 | researchstudio-sat/webofneeds | https://api.github.com/repos/researchstudio-sat/webofneeds | closed | usecase search not working for all use cases | bug testing | For example, the usecase search finds 'Meet People', but not 'Offer Taxi Service' | 1.0 | test | 1
15,400 | 19,591,951,577 | IssuesEvent | 2022-01-05 13:56:47 | RobertCraigie/prisma-client-py | https://api.github.com/repos/RobertCraigie/prisma-client-py | closed | Increase default HTTP timeout | kind/improvement process/candidate | ## Problem
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Currently, `httpx.ReadTimeout` errors can be encountered easily (#197).
## Suggested solution
<!-- A clear and concise description of what you want to happen. -->
We should increase the default timeout; however, I do not know by how much.
We should look into how Prisma handles this.
 | 1.0 | non_test | 0
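Raising a library's default timeout usually means a module-level constant plus a resolution rule that lets callers override or disable it. A sketch of that rule — the 30-second value and the `0 = disable` convention are assumptions for illustration, not prisma-client-py's actual semantics:

```python
# Hypothetical default for illustration; the real fix would pick a value
# informed by how long the query engine can legitimately take to respond.
DEFAULT_HTTP_TIMEOUT = 30.0   # seconds

def effective_timeout(user_timeout=None):
    """Resolve the timeout an HTTP client should use.

    None  -> library default (raised here to avoid easy ReadTimeouts)
    0     -> caller explicitly disables the timeout (wait forever)
    other -> caller-supplied value wins
    """
    if user_timeout is None:
        return DEFAULT_HTTP_TIMEOUT
    if user_timeout == 0:
        return None
    return float(user_timeout)

print(effective_timeout())      # 30.0
print(effective_timeout(2.5))   # 2.5
print(effective_timeout(0))     # None
```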
28,318 | 6,979,472,929 | IssuesEvent | 2017-12-12 21:10:15 | PBR/QTM | https://api.github.com/repos/PBR/QTM | opened | The use of some config parameters is not clear. | code question | Config file: `config/configQtm.properties`
Setting `textFiles=./` generates a [some PMCID].txt file. What's the purpose of this output file? | 1.0 | non_test | 0
282,498 | 24,480,983,450 | IssuesEvent | 2022-10-08 20:43:51 | Energy-Price-News-API/energy-prices-api | https://api.github.com/repos/Energy-Price-News-API/energy-prices-api | closed | Test: Add a test suite for `api/news/sources/:sourceId` | :rescue_worker_helmet: help wanted :beers: good first issue :heavy_check_mark: testing | ### What would you like to share?
Add a test suite for the `/sourceId` endpoint.
### Steps:
- Add a new `describe` block for `/api/news/sources` inside `/__tests__/integration-tests/routes/news.test.js`
- Add test suite for all possible scenarios
### Additional information
> **Note**
> All `axios` requests are mocked inside the `__mocks__` folder
> Feel free to add more mock data inside that file | 1.0 | test | 1
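The suite this record asks for — a `describe` block for the route with the upstream `axios` calls mocked — looks, in outline, like the following. This sketch uses Python's `unittest`/`unittest.mock` purely as a runnable stand-in for the project's Jest setup; the handler name and response shapes are assumptions for illustration:

```python
import unittest
from unittest import mock

def get_source(source_id, fetch):
    """Toy handler for GET /api/news/sources/:sourceId.

    `fetch` stands in for the mocked axios call the real route makes."""
    for source in fetch("/news/sources"):
        if source["id"] == source_id:
            return {"status": 200, "body": source}
    return {"status": 404, "body": {"message": "source not found"}}

class SourceRouteTest(unittest.TestCase):
    def setUp(self):
        # Mock the upstream request, mirroring the __mocks__ folder idea.
        self.fetch = mock.Mock(return_value=[{"id": "bbc", "name": "BBC"}])

    def test_known_source_returns_200(self):
        self.assertEqual(get_source("bbc", self.fetch)["status"], 200)

    def test_unknown_source_returns_404(self):
        self.assertEqual(get_source("cnn", self.fetch)["status"], 404)

suite = unittest.defaultTestLoader.loadTestsFromTestCase(SourceRouteTest)
result = unittest.TestResult()
suite.run(result)
print(result.testsRun, result.wasSuccessful())  # 2 True
```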
55,025 | 6,422,938,306 | IssuesEvent | 2017-08-09 09:42:41 | fsprojects/Paket | https://api.github.com/repos/fsprojects/Paket | closed | Forcing lowercase with tolower(Id) yields poor Proget performance during resolution | needs-test-case | ### Description
When resolving packages on a Proget Nuget feed with 150k package versions, the use of tolower(Id) in tryGetAllVersionsFromNugetODataWithFilter breaks performance optimizations around "Id eq 'package'" in Proget. This forces full enumeration of packages in the feed, which is orders of magnitude slower.
See related question posted to http://inedo.com/support/questions/6810
### Repro steps
Please provide the steps required to reproduce the problem
1. paket.dependencies references a package with source as a Proget feed with 100s of thousands of package version
2. Issue a get using Id='<package>', Version='<version>' and compare performance to a paket update on the same package.
### Expected behavior
Paket could provide syntax or a keyword override within paket.dependencies to disable case insensitivity handling like tolower()
| 1.0 | Forcing lowercase with tolower(Id) yields poor Proget performance during resolution - ### Description
When resolving packages on a Proget Nuget feed with 150k package versions, the use of tolower(Id) in tryGetAllVersionsFromNugetODataWithFilter breaks performance optimizations around "Id eq 'package'" in Proget. This forces full enumeration of packages in the feed, which is orders of magnitude slower.
See related question posted to http://inedo.com/support/questions/6810
### Repro steps
Please provide the steps required to reproduce the problem
1. paket.dependencies references a package with source as a Proget feed with 100s of thousands of package version
2. Issue a get using Id='<package>', Version='<version>' and compare performance to a paket update on the same package.
### Expected behavior
Paket could provide syntax or a keyword override within paket.dependencies to disable case insensitivity handling like tolower()
| test | forcing lowercase with tolower id yields poor proget performance during resolution description when resolving packages on a proget nuget feed with package versions the use of tolower id in trygetallversionsfromnugetodatawithfilter breaks performance optimizations around id eq package in proget this forces full enumeration of packages in the feed which is orders of magnitude slower see related question posted to repro steps please provide the steps required to reproduce the problem paket dependencies references a package with source as a proget feed with of thousands of package version issue a get using id version and compare performance to a paket update on the same package expected behavior paket could provide syntax or a keyword override within paket dependencies to disable case insensitivity handling like tolower | 1 |
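The query-shape difference the Paket row above describes can be sketched in isolation. The following is a hypothetical Python illustration (the feed URL and package name are invented, and this is not Paket's actual code) of why a `tolower(Id)` filter defeats a server-side index on `Id`:

```python
from urllib.parse import quote

def versions_query(feed_url: str, package_id: str, case_insensitive: bool) -> str:
    """Build an OData query URL for all versions of a NuGet package.

    With case_insensitive=True the filter wraps Id in tolower(), which a
    server such as ProGet typically cannot answer from its index on Id,
    forcing a scan over every package version in the feed.
    """
    if case_insensitive:
        flt = f"tolower(Id) eq '{package_id.lower()}'"
    else:
        flt = f"Id eq '{package_id}'"
    return f"{feed_url}/Packages()?$filter={quote(flt)}"

# Hypothetical feed and package, for illustration only.
fast = versions_query("https://proget.example/nuget/feed", "Newtonsoft.Json", False)
slow = versions_query("https://proget.example/nuget/feed", "Newtonsoft.Json", True)
```

The exact-match form can be served from the `Id` index; the `tolower(...)` form is the one the issue reports as orders of magnitude slower on a large feed.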
112,161 | 24,235,725,754 | IssuesEvent | 2022-09-26 22:59:06 | robert-altom/test | https://api.github.com/repos/robert-altom/test | closed | Return value for CallStaticMethod is wrong in documentation | documentation 1.6.2 in code review gitlab | <sub>You can find the original issue from GitLab [here](https://gitlab.com/altom/altunity/altunitytester/-/issues/464).</sub>
| 1.0 | Return value for CallStaticMethod is wrong in documentation - <sub>You can find the original issue from GitLab [here](https://gitlab.com/altom/altunity/altunitytester/-/issues/464).</sub>
| non_test | return value for callstaticmethod is wrong in documentation you can find the original issue from gitlab | 0 |
720,553 | 24,796,754,830 | IssuesEvent | 2022-10-24 17:57:58 | AY2223S1-CS2103T-T12-4/tp | https://api.github.com/repos/AY2223S1-CS2103T-T12-4/tp | closed | Display patient's medication type and dosages | type.Story priority.Medium type.NoteBased | As a private nurse I want to know what medication my patient needs so that I can prepare the dosages accordingly | 1.0 | Display patient's medication type and dosages - As a private nurse I want to know what medication my patient needs so that I can prepare the dosages accordingly | non_test | display patient s medication type and dosages as a private nurse i want to know what medication my patient needs so that i can prepare the dosages accordingly | 0 |
347,745 | 31,271,651,417 | IssuesEvent | 2023-08-22 00:50:24 | cca-ffodregamdi/running-hi-back | https://api.github.com/repos/cca-ffodregamdi/running-hi-back | opened | [Feature] Week 4 - [BOOKMARK] Exception handling for already-saved bookmarks | ✨ Feature 🎯 Test | ✏️Description
-
When saving a bookmark, handle the case where the post is already saved in that folder as an exception
✅TODO
-
- [ ]
🐾ETC
-
 | 1.0 | [Feature] Week 4 - [BOOKMARK] Exception handling for already-saved bookmarks - ✏️Description
-
When saving a bookmark, handle the case where the post is already saved in that folder as an exception
✅TODO
-
- [ ]
🐾ETC
-
 | test | exception handling for already saved bookmarks ✏️description when saving a bookmark handle the case where the post is already saved in that folder as an exception ✅todo 🐾etc | 1
43,089 | 5,518,380,031 | IssuesEvent | 2017-03-18 08:33:20 | Orderella/PopupDialog | https://api.github.com/repos/Orderella/PopupDialog | closed | Because of FXBlurView, any view behind the popup that was hidden is shown after the popup is dismissed | ready for testing | # Report
## Environment
Please provide information on your development environment, so we can build with the same scenario.
- Xcode version (e.g. 8.0): 8.2.1
- PopupDialog version (e.g. 0.5.0): 0.5.3
- Minimum deployment target (e.g. 9.0): 10.0
- Language (Objective-C / Swift): Swift
- In case of Swift - Version (e.g. 3.0): 3.0
## Dependency management
If you are not using any dependency managers, you can remove this section.
> Please note: If you are using CocoaPods with Xcode 8, CocoaPods 1.1.0 is required.
- Dependency manager (e.g. CocoaPods): CocoaPods
- Version (e.g. 1.1.0): 1.1.1
## What did you do?
Presented a popup over a view that has a subview hidden
## What did you expect to happen?
After the popup is dismissed, the view should be exactly as it was before, with the subview hidden.
## What happened instead?
After the popup is dismissed, the hidden view is being displayed. This is being caused by FXBlurView, see https://github.com/nicklockwood/FXBlurView/issues/126 for more details. | 1.0 | Because of FXBlurView, any view behind the popup that was hidden is shown after the popup is dismissed - # Report
## Environment
Please provide information on your development environment, so we can build with the same scenario.
- Xcode version (e.g. 8.0): 8.2.1
- PopupDialog version (e.g. 0.5.0): 0.5.3
- Minimum deployment target (e.g. 9.0): 10.0
- Language (Objective-C / Swift): Swift
- In case of Swift - Version (e.g. 3.0): 3.0
## Dependency management
If you are not using any dependency managers, you can remove this section.
> Please note: If you are using CocoaPods with Xcode 8, CocoaPods 1.1.0 is required.
- Dependency manager (e.g. CocoaPods): CocoaPods
- Version (e.g. 1.1.0): 1.1.1
## What did you do?
Presented a popup over a view that has a subview hidden
## What did you expect to happen?
After the popup is dismissed, the view should be exactly as it was before, with the subview hidden.
## What happened instead?
After the popup is dismissed, the hidden view is being displayed. This is being caused by FXBlurView, see https://github.com/nicklockwood/FXBlurView/issues/126 for more details. | test | because of fxblurview any view behind the popup that was hidden is shown after the popup is dismissed report environment please provide information on your development environment so we can build with the same scenario xcode version e g popupdialog version e g minimum deployment target e g language objective c swift swift in case of swift version e g dependency management if you are not using any dependency managers you can remove this section please note if you are using cocoapods with xcode cocoapods is required dependency manager e g cocoapods cocoapods version e g what did you do presented a popup over a view that has a subview hidden what did you expect to happen after the popup is dismissed the view should be exactly as it was before with the subview hidden what happened instead after the popup is dismissed the hidden view is being displayed this is being caused by fxblurview see for more details | 1 |
382,464 | 11,306,841,383 | IssuesEvent | 2020-01-18 16:52:05 | JensenJ/BunkerSurvival | https://api.github.com/repos/JensenJ/BunkerSurvival | opened | [FEATURE] Better Spawning / Despawning | enhancement low-priority | **Is your feature request related to a problem?**
Spawning and despawning is quite abrupt.
**Describe the solution you'd like instead**
An animation of some sort in order to make spawning / despawning more natural. This could be done using shadergraph.
**Describe alternatives you've considered**
Creating a multitude of animations and have different animations for different objects such as enemies, buildings or players.
| 1.0 | [FEATURE] Better Spawning / Despawning - **Is your feature request related to a problem?**
Spawning and despawning is quite abrupt.
**Describe the solution you'd like instead**
An animation of some sort in order to make spawning / despawning more natural. This could be done using shadergraph.
**Describe alternatives you've considered**
Creating a multitude of animations and have different animations for different objects such as enemies, buildings or players.
| non_test | better spawning despawning is your feature request related to a problem spawning and despawning is quite abrupt describe the solution you d like instead an animation of some sort in order to make spawning despawning more natural this could be done using shadergraph describe alternatives you ve considered creating a multitude of animations and have different animations for different objects such as enemies buildings or players | 0 |
275,666 | 23,929,189,154 | IssuesEvent | 2022-09-10 09:11:16 | IntellectualSites/FastAsyncWorldEdit | https://api.github.com/repos/IntellectualSites/FastAsyncWorldEdit | opened | ForwardExtentCopy doesn't overwrite existing item frames at the destination | Requires Testing | ### Server Implementation
Paper
### Server Version
1.19.2
### Describe the bug
When using a `ForwardExtentCopy` to paste a schematic which contains only blocks, existing item frames are not removed at the destination.
### To Reproduce
1. Create a schematic in a 2x2x2 area
2. Place an item frame anywhere within the target destination
3. Paste the schematic at the 2x2x2 area
The item frame should still be there.
### Expected behaviour
The item frame should not be there.
### Screenshots / Videos
_No response_
### Error log (if applicable)
_No response_
### Fawe Debugpaste
https://athion.net/ISPaster/paste/view/40da6ca507fe454ba5208c5f2051fca2
### Fawe Version
FastAsyncWorldEdit-v2.4.5-SNAPSHOT-273;8233f13
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists.
### Anything else?
Using FAWE-273 as 276 has an issue with the AsyncCatcher (see: https://discord.com/channels/268444645527126017/344128526435221505/1018058263813693480)
Debugpaste contains AreaShop which is the plugin I'm working on which performs the schematic pasting. | 1.0 | ForwardExtentCopy doesn't overwrite existing item frames at the destination - ### Server Implementation
Paper
### Server Version
1.19.2
### Describe the bug
When using a `ForwardExtentCopy` to paste a schematic which contains only blocks, existing item frames are not removed at the destination.
### To Reproduce
1. Create a schematic in a 2x2x2 area
2. Place an item frame anywhere within the target destination
3. Paste the schematic at the 2x2x2 area
The item frame should still be there.
### Expected behaviour
The item frame should not be there.
### Screenshots / Videos
_No response_
### Error log (if applicable)
_No response_
### Fawe Debugpaste
https://athion.net/ISPaster/paste/view/40da6ca507fe454ba5208c5f2051fca2
### Fawe Version
FastAsyncWorldEdit-v2.4.5-SNAPSHOT-273;8233f13
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists.
### Anything else?
Using FAWE-273 as 276 has an issue with the AsyncCatcher (see: https://discord.com/channels/268444645527126017/344128526435221505/1018058263813693480)
Debugpaste contains AreaShop which is the plugin I'm working on which performs the schematic pasting. | test | forwardextentcopy doesn t overwrite existing item frames at the destination server implementation paper server version describe the bug when using a forwardextentcopy to paste a schematic which contains only blocks existing item frames are not removed at the destination to reproduce create a schematic in a area place an item frame anywhere within the target destination paste the schematic at the area the item frame should still be there expected behaviour the item frame should not be there screenshots videos no response error log if applicable no response fawe debugpaste fawe version fastasyncworldedit snapshot checklist i have included a fawe debugpaste i am using the newest build from and the issue still persists anything else using fawe as has an issue with the asynccatcher see debugpaste contains areashop which is the plugin i m working on which performs the schematic pasting | 1 |
16,899 | 9,545,910,997 | IssuesEvent | 2019-05-01 18:25:08 | couchbase/couchbase-lite-core | https://api.github.com/repos/couchbase/couchbase-lite-core | opened | Replicator should stop quickly when told to stop | bug :bug: f/replicator performance :stopwatch: | When the replicator is explicitly told to stop, it should try to stop ASAP, since the app may be quitting. We have customer bug reports involving replicators not stopping in a timely fashion.
From a quick code perusal, I think the bottleneck is the WebSocket close protocol. `WebSocketImpl` sends a `close` message and waits for a `close` back from the peer. But if the peer isn't responsive, or if the TCP socket is backed up (e.g. connectivity is lost), nothing will happen.
`WebSocketImpl` doesn't have any timeout of its own; it has a timer to send a PING message every 5 minutes (by default), but nothing that detects whether the peer isn't sending messages. (And the close timeout should be shorter than the regular heartbeat timeout, probably just a few seconds.)
[[CM-163](https://issues.couchbase.com/browse/CM-163)] | True | Replicator should stop quickly when told to stop - When the replicator is explicitly told to stop, it should try to stop ASAP, since the app may be quitting. We have customer bug reports involving replicators not stopping in a timely fashion.
From a quick code perusal, I think the bottleneck is the WebSocket close protocol. `WebSocketImpl` sends a `close` message and waits for a `close` back from the peer. But if the peer isn't responsive, or if the TCP socket is backed up (e.g. connectivity is lost), nothing will happen.
`WebSocketImpl` doesn't have any timeout of its own; it has a timer to send a PING message every 5 minutes (by default), but nothing that detects whether the peer isn't sending messages. (And the close timeout should be shorter than the regular heartbeat timeout, probably just a few seconds.)
[[CM-163](https://issues.couchbase.com/browse/CM-163)] | non_test | replicator should stop quickly when told to stop when the replicator is explicitly told to stop it should try to stop asap since the app may be quitting we have customer bug reports involving replicators not stopping in a timely fashion from a quick code perusal i think the bottleneck is the websocket close protocol websocketimpl sends a close message and waits for a close back from the peer but if the peer isn t responsive or if the tcp socket is backed up e g connectivity is lost nothing will happen websocketimpl doesn t have any timeout of its own it has a timer to send a ping message every minutes by default but nothing that detects whether the peer isn t sending messages and the close timeout should be shorter than the regular heartbeat timeout probably just a few seconds | 0 |
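The fix the Couchbase row above points toward — a close timeout much shorter than the heartbeat interval — can be sketched as follows. This is an illustrative asyncio model of the idea, not LiteCore's actual C++ `WebSocketImpl` API:

```python
import asyncio

async def close_websocket(send_close, wait_for_peer_close, timeout=2.0):
    """Send CLOSE, then wait at most `timeout` seconds for the peer's CLOSE.

    If the peer is unresponsive (or the TCP socket is backed up because
    connectivity was lost), give up after `timeout` and tear the
    connection down anyway instead of hanging indefinitely.
    """
    await send_close()
    try:
        await asyncio.wait_for(wait_for_peer_close(), timeout=timeout)
        return "clean close"
    except asyncio.TimeoutError:
        return "forced close after timeout"

async def _demo():
    async def send_close():
        pass  # pretend the CLOSE frame was written to the socket

    async def unresponsive_peer():
        await asyncio.sleep(3600)  # peer never replies

    return await close_websocket(send_close, unresponsive_peer, timeout=0.05)

result = asyncio.run(_demo())
```

In the demo the peer never answers, so the handshake is abandoned after 50 ms rather than waiting on the periodic ping timer that fires only every few minutes.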
20,347 | 3,808,444,733 | IssuesEvent | 2016-03-25 15:06:32 | uProxy/uproxy | https://api.github.com/repos/uProxy/uproxy | opened | dashboard for cloud servers | C:Cloud C:Performance C:Testing P1 | I want to a dashboard for my cloud server. A few essential statistics;
* number of established peerconnections
* throughput (bytes being proxied/second)
* CPU load
* memory usage
This is for load testing and, eventually, end-users. I think this is higher priority even than a health checker because currently we don't even know how to check health.
I think there must be a bunch of open source tools we can use; we'll need to investigate those, and presumably add a bunch of instrumentation to the server itself. | 1.0 | dashboard for cloud servers - I want to a dashboard for my cloud server. A few essential statistics;
* number of established peerconnections
* throughput (bytes being proxied/second)
* CPU load
* memory usage
This is for load testing and, eventually, end-users. I think this is higher priority even than a health checker because currently we don't even know how to check health.
I think there must be a bunch of open source tools we can use; we'll need to investigate those, and presumably add a bunch of instrumentation to the server itself. | test | dashboard for cloud servers i want to a dashboard for my cloud server a few essential statistics number of established peerconnections throughput bytes being proxied second cpu load memory usage this is for load testing and eventually end users i think this is higher priority even than a health checker because currently we don t even know how to check health i think there must be a bunch of open source tools we can use we ll need to investigate those and presumably add a bunch of instrumentation to the server itself | 1 |
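The essential statistics listed in the uProxy row above reduce to a few in-process counters plus a snapshot the server can expose to a dashboard. A hypothetical sketch (field names are invented; CPU load and memory usage would come from the host OS and are omitted):

```python
import json
import time

class ProxyMetrics:
    """In-memory counters a cloud proxy server could expose on a dashboard."""

    def __init__(self):
        self._started = time.monotonic()
        self.peer_connections = 0   # established peerconnections
        self.bytes_proxied = 0      # total bytes relayed so far

    def record_transfer(self, nbytes: int) -> None:
        """Called by the proxy loop each time a chunk is relayed."""
        self.bytes_proxied += nbytes

    def snapshot(self) -> str:
        """Return a JSON snapshot suitable for a dashboard endpoint."""
        uptime = max(time.monotonic() - self._started, 1e-9)
        return json.dumps({
            "peer_connections": self.peer_connections,
            "bytes_proxied": self.bytes_proxied,
            "throughput_bytes_per_sec": round(self.bytes_proxied / uptime, 2),
        })

metrics = ProxyMetrics()
metrics.peer_connections = 3
metrics.record_transfer(1024)
```

A load tester (or, eventually, an end user) would poll the snapshot endpoint periodically and plot the series over time.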
267,495 | 28,509,064,247 | IssuesEvent | 2023-04-19 01:32:11 | dpteam/RK3188_TABLET | https://api.github.com/repos/dpteam/RK3188_TABLET | closed | CVE-2011-4131 (Medium) detected in randomv3.0.66, linuxv3.0 - autoclosed | Mend: dependency security vulnerability | ## CVE-2011-4131 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>randomv3.0.66</b>, <b>linuxv3.0</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The NFSv4 implementation in the Linux kernel before 3.2.2 does not properly handle bitmap sizes in GETACL replies, which allows remote NFS servers to cause a denial of service (OOPS) by sending an excessive number of bitmap words.
<p>Publish Date: 2012-05-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2011-4131>CVE-2011-4131</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2011-4131">https://www.linuxkernelcves.com/cves/CVE-2011-4131</a></p>
<p>Release Date: 2012-05-17</p>
<p>Fix Resolution: v3.3-rc1,v3.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2011-4131 (Medium) detected in randomv3.0.66, linuxv3.0 - autoclosed - ## CVE-2011-4131 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>randomv3.0.66</b>, <b>linuxv3.0</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The NFSv4 implementation in the Linux kernel before 3.2.2 does not properly handle bitmap sizes in GETACL replies, which allows remote NFS servers to cause a denial of service (OOPS) by sending an excessive number of bitmap words.
<p>Publish Date: 2012-05-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2011-4131>CVE-2011-4131</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Adjacent
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2011-4131">https://www.linuxkernelcves.com/cves/CVE-2011-4131</a></p>
<p>Release Date: 2012-05-17</p>
<p>Fix Resolution: v3.3-rc1,v3.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve medium detected in autoclosed cve medium severity vulnerability vulnerable libraries vulnerability details the implementation in the linux kernel before does not properly handle bitmap sizes in getacl replies which allows remote nfs servers to cause a denial of service oops by sending an excessive number of bitmap words publish date url a href cvss score details base score metrics exploitability metrics attack vector adjacent attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
172,418 | 13,305,128,237 | IssuesEvent | 2020-08-25 18:02:11 | mathjax/MathJax | https://api.github.com/repos/mathjax/MathJax | closed | AssistiveMML crashes on output for "Math output error" | Accepted Fixed Test Needed v3 v3.1 | The assistive-mml extension fails when trying to add the MathML to a "Math output error" element, since the structure is not the same as usual elements. The error output should be marked as escaped so that assistive-mml won't attempt to adjust it. | 1.0 | AssistiveMML crashes on output for "Math output error" - The assistive-mml extension fails when trying to add the MathML to a "Math output error" element, since the structure is not the same as usual elements. The error output should be marked as escaped so that assistive-mml won't attempt to adjust it. | test | assistivemml crashes on output for math output error the assistive mml extension fails when trying to add the mathml to a math output error element since the structure is not the same as usual elements the error output should be marked as escaped so that assistive mml won t attempt to adjust it | 1 |
141,379 | 18,983,375,863 | IssuesEvent | 2021-11-21 09:37:52 | Seagate/cortx-utils | https://api.github.com/repos/Seagate/cortx-utils | closed | CVE-2020-36242 (High) detected in cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl | needs-attention security vulnerability | ## CVE-2020-36242 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl</b></p></summary>
<p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ca/9a/7cece52c46546e214e10811b36b2da52ce1ea7fa203203a629b8dfadad53/cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ca/9a/7cece52c46546e214e10811b36b2da52ce1ea7fa203203a629b8dfadad53/cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: cortx-utils/py-utils/src/utils/setup/kafka/kafka</p>
<p>Path to vulnerable library: cortx-utils/py-utils/src/utils/setup/kafka/kafka,cortx-utils/py-utils,cortx-utils/py-utils/requirements.txt,cortx-utils/py-utils/src/setup/utils</p>
<p>
Dependency Hierarchy:
- :x: **cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Seagate/cortx-utils/commits/0e59cfa613b30347e0280346b3b3c2f411c4dd36">0e59cfa613b30347e0280346b3b3c2f411c4dd36</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.
<p>Publish Date: 2021-02-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242>CVE-2020-36242</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst">https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst</a></p>
<p>Release Date: 2021-02-07</p>
<p>Fix Resolution: cryptography - 3.3.2</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"cryptography","packageVersion":"2.8","packageFilePaths":["/py-utils/src/utils/setup/kafka/kafka","/py-utils","/py-utils/requirements.txt","/py-utils/src/setup/utils"],"isTransitiveDependency":false,"dependencyTree":"cryptography:2.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"cryptography - 3.3.2"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-36242","vulnerabilityDetails":"In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-36242 (High) detected in cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl - ## CVE-2020-36242 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl</b></p></summary>
<p>cryptography is a package which provides cryptographic recipes and primitives to Python developers.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/ca/9a/7cece52c46546e214e10811b36b2da52ce1ea7fa203203a629b8dfadad53/cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/ca/9a/7cece52c46546e214e10811b36b2da52ce1ea7fa203203a629b8dfadad53/cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl</a></p>
<p>Path to dependency file: cortx-utils/py-utils/src/utils/setup/kafka/kafka</p>
<p>Path to vulnerable library: cortx-utils/py-utils/src/utils/setup/kafka/kafka,cortx-utils/py-utils,cortx-utils/py-utils/requirements.txt,cortx-utils/py-utils/src/setup/utils</p>
<p>
Dependency Hierarchy:
- :x: **cryptography-2.8-cp34-abi3-manylinux2010_x86_64.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Seagate/cortx-utils/commits/0e59cfa613b30347e0280346b3b3c2f411c4dd36">0e59cfa613b30347e0280346b3b3c2f411c4dd36</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.
<p>Publish Date: 2021-02-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242>CVE-2020-36242</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst">https://github.com/pyca/cryptography/blob/master/CHANGELOG.rst</a></p>
<p>Release Date: 2021-02-07</p>
<p>Fix Resolution: cryptography - 3.3.2</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"cryptography","packageVersion":"2.8","packageFilePaths":["/py-utils/src/utils/setup/kafka/kafka","/py-utils","/py-utils/requirements.txt","/py-utils/src/setup/utils"],"isTransitiveDependency":false,"dependencyTree":"cryptography:2.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"cryptography - 3.3.2"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2020-36242","vulnerabilityDetails":"In the cryptography package before 3.3.2 for Python, certain sequences of update calls to symmetrically encrypt multi-GB values could result in an integer overflow and buffer overflow, as demonstrated by the Fernet class.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36242","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_test | cve high detected in cryptography whl cve high severity vulnerability vulnerable library cryptography whl cryptography is a package which provides cryptographic recipes and primitives to python developers library home page a href path to dependency file cortx utils py utils src utils setup kafka kafka path to vulnerable library cortx utils py utils src utils setup kafka kafka cortx utils py utils cortx utils py utils requirements txt cortx utils py utils src setup utils dependency hierarchy x cryptography whl vulnerable library found in head commit a href found in base branch main vulnerability details in the cryptography package before for python certain sequences of update calls to symmetrically encrypt multi gb values could result in an integer overflow and buffer overflow as demonstrated by the fernet class publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack 
complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution cryptography rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree cryptography isminimumfixversionavailable true minimumfixversion cryptography basebranches vulnerabilityidentifier cve vulnerabilitydetails in the cryptography package before for python certain sequences of update calls to symmetrically encrypt multi gb values could result in an integer overflow and buffer overflow as demonstrated by the fernet class vulnerabilityurl | 0 |
110,858 | 13,943,870,853 | IssuesEvent | 2020-10-23 00:19:38 | jupyterlab/jupyterlab | https://api.github.com/repos/jupyterlab/jupyterlab | closed | New Experience for Command Palette | help wanted tag:Design and UX | Resurrecting some discussion around the idea of modal Command Palette #4851 .
I think the current Command Palette is disjointed from where it may be having an effect; a single modal window similar to VSCode, Sublime, Classic Notebook, etc. is worth exploring.
@isabela-sf @javag97 | 1.0 | New Experience for Command Palette - Resurrecting some discussion around the idea of modal Command Palette #4851 .
I think the current Command Palette is disjointed from where it may be having an effect; a single modal window similar to VSCode, Sublime, Classic Notebook, etc. is worth exploring.
@isabela-sf @javag97 | non_test | new experience for command palette resurrecting some discussion around the idea of modal command palette i think the current command palette is disjointed from the where it may be having an effect a single modal window similar to vscode sublime classic notebook etc is worth exploring isabela sf | 0 |
298,992 | 25,874,418,200 | IssuesEvent | 2022-12-14 06:34:17 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: sqlsmith/setup=empty/setting=default failed | C-test-failure O-robot O-roachtest branch-master | roachtest.sqlsmith/setup=empty/setting=default [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7952891?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7952891?buildTab=artifacts#/sqlsmith/setup=empty/setting=default) on master @ [052acc88ad9d7296ce6b8b441627fb469cc74d95](https://github.com/cockroachdb/cockroach/commits/052acc88ad9d7296ce6b8b441627fb469cc74d95):
```
test artifacts and logs in: /artifacts/sqlsmith/setup=empty/setting=default/run_1
(test_impl.go:297).Fatalf: error: pq: internal error: comparison overload not found (is, void, unknown)
stmt:
WITH
with_106 (col_465)
AS (SELECT * FROM (VALUES (1881675994:::OID), (1699118865:::OID), (3245422930:::OID)) AS tab_246 (col_465))
SELECT
cte_ref_27.col_465 AS col_469, tab_247.col_468 AS col_470, NULL AS col_471
FROM
with_106 AS cte_ref_27,
(
VALUES
(
'':::VOID,
(),
'BOX(-0.5472482673152966 0.11051620319044264,0.9723179745751926 0.4001892774859597)':::BOX2D
),
(
'':::VOID,
NULL,
'BOX(-0.3017628734619263 -1.6107285056715497,0.32348021420669676 0.9140915887967789)':::BOX2D
)
)
AS tab_247 (col_466, col_467, col_468)
ORDER BY
tab_247.col_468 NULLS FIRST,
tab_247.col_466 ASC NULLS FIRST,
tab_247.col_466 ASC NULLS LAST,
tab_247.col_468 DESC NULLS LAST
LIMIT
74:::INT8;
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=empty/setting=default.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | roachtest: sqlsmith/setup=empty/setting=default failed - roachtest.sqlsmith/setup=empty/setting=default [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7952891?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/7952891?buildTab=artifacts#/sqlsmith/setup=empty/setting=default) on master @ [052acc88ad9d7296ce6b8b441627fb469cc74d95](https://github.com/cockroachdb/cockroach/commits/052acc88ad9d7296ce6b8b441627fb469cc74d95):
```
test artifacts and logs in: /artifacts/sqlsmith/setup=empty/setting=default/run_1
(test_impl.go:297).Fatalf: error: pq: internal error: comparison overload not found (is, void, unknown)
stmt:
WITH
with_106 (col_465)
AS (SELECT * FROM (VALUES (1881675994:::OID), (1699118865:::OID), (3245422930:::OID)) AS tab_246 (col_465))
SELECT
cte_ref_27.col_465 AS col_469, tab_247.col_468 AS col_470, NULL AS col_471
FROM
with_106 AS cte_ref_27,
(
VALUES
(
'':::VOID,
(),
'BOX(-0.5472482673152966 0.11051620319044264,0.9723179745751926 0.4001892774859597)':::BOX2D
),
(
'':::VOID,
NULL,
'BOX(-0.3017628734619263 -1.6107285056715497,0.32348021420669676 0.9140915887967789)':::BOX2D
)
)
AS tab_247 (col_466, col_467, col_468)
ORDER BY
tab_247.col_468 NULLS FIRST,
tab_247.col_466 ASC NULLS FIRST,
tab_247.col_466 ASC NULLS LAST,
tab_247.col_468 DESC NULLS LAST
LIMIT
74:::INT8;
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=empty/setting=default.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | roachtest sqlsmith setup empty setting default failed roachtest sqlsmith setup empty setting default with on master test artifacts and logs in artifacts sqlsmith setup empty setting default run test impl go fatalf error pq internal error comparison overload not found is void unknown stmt with with col as select from values oid oid oid as tab col select cte ref col as col tab col as col null as col from with as cte ref values void box void null box as tab col col col order by tab col nulls first tab col asc nulls first tab col asc nulls last tab col desc nulls last limit parameters roachtest cloud gce roachtest cpu roachtest encrypted false roachtest ssd help see see cc cockroachdb sql queries | 1 |
71,482 | 7,245,830,926 | IssuesEvent | 2018-02-14 19:27:15 | phetsims/capacitor-lab-basics | https://api.github.com/repos/phetsims/capacitor-lab-basics | closed | Change battery slider to have snap-to behavior at 0.05 jumps? | status:fixed-pending-testing | @arouinfar - Comparison of doubling/halving is challenging with the voltage slider right now because it does not "snap to" nice values. What do you think of making them 0.05 jumps reproducible? Currently jumps seem to be around 0.03, so you would have close to as much flexibility in exact numbers.
Note that voltage readout would need to stay 3 decimal places because changing area and separation can change voltage too. | 1.0 | Change battery slider to have snap-to behavior at 0.05 jumps? - @arouinfar - Comparison of doubling/halving is challenging with the voltage slider right now because it does not "snap to" nice values. What do you think of making them 0.05 jumps reproducible? Currently jumps seem to be around 0.03, so you would have close to as much flexibility in exact numbers.
Note that voltage readout would need to stay 3 decimal places because changing area and separation can change voltage too. | test | change battery slider to have snap to behavior at jumps arouinfar comparison of doubling halving is challenging with the voltage slider right now because it does not snap to nice values what do you think of making them jumps reproducible currently jumps seem to be around so you would have close to as much flexibility in exact numbers note that voltage readout would need to stay decimal places because changing area and separation can change voltage too | 1 |
54,873 | 7,928,753,434 | IssuesEvent | 2018-07-06 12:53:34 | spring-projects/spring-boot | https://api.github.com/repos/spring-projects/spring-boot | closed | Ensure reference manual doesn't generate horizontal scrollbar | type: documentation | Spring Boot's reference manual currently generates horizontal scrollbar at common screen sizes, which is a bit annoying for the reader.
The cause of this is a table in _Customize the Jackson ObjectMapper_ section which has a cell containing value `spring.jackson.default-property-inclusion=always|non_null|non_absent|non_default|non_empty` that sticks out like a sore thumb and results in a horizontal scrollbar being rendered. This is present in both `1.5.x` and `2.0.x`:
- https://docs.spring.io/spring-boot/docs/1.5.x/reference/htmlsingle/#howto-customize-the-jackson-objectmapper
- https://docs.spring.io/spring-boot/docs/2.0.x/reference/htmlsingle/#howto-customize-the-jackson-objectmapper
Ideally, the horizontal scrollbar (if really needed) should be placed on the enclosing element (table in this case) rather than on the whole document. This approach is already present in many places in the reference manual, like the [application properties appendix](https://docs.spring.io/spring-boot/docs/2.0.x/reference/htmlsingle/#common-application-properties). | 1.0 | Ensure reference manual doesn't generate horizontal scrollbar - Spring Boot's reference manual currently generates horizontal scrollbar at common screen sizes, which is a bit annoying for the reader.
The cause of this is a table in _Customize the Jackson ObjectMapper_ section which has a cell containing value `spring.jackson.default-property-inclusion=always|non_null|non_absent|non_default|non_empty` that sticks out like a sore thumb and results in a horizontal scrollbar being rendered. This is present in both `1.5.x` and `2.0.x`:
- https://docs.spring.io/spring-boot/docs/1.5.x/reference/htmlsingle/#howto-customize-the-jackson-objectmapper
- https://docs.spring.io/spring-boot/docs/2.0.x/reference/htmlsingle/#howto-customize-the-jackson-objectmapper
Ideally, the horizontal scrollbar (if really needed) should be placed on the enclosing element (table in this case) rather than on the whole document. This approach is already present in many places in the reference manual, like the [application properties appendix](https://docs.spring.io/spring-boot/docs/2.0.x/reference/htmlsingle/#common-application-properties). | non_test | ensure reference manual doesn t generate horizontal scrollbar spring boot s reference manual currently generates horizontal scrollbar at common screen sizes which is a bit annoying for the reader the cause of this is a table in customize the jackson objectmapper section which has a cell containing value spring jackson default property inclusion always non null non absent non default non empty that sticks out like a sore thumb and results in a horizontal scrollbar being rendered this is present in both x and x ideally the horizontal scrollbar if really needed should be placed on the enclosing element table in this case rather than on the whole document this approach is already present in many places in the reference manual like the | 0 |
319,105 | 27,349,941,633 | IssuesEvent | 2023-02-27 08:52:03 | NuGet/Home | https://api.github.com/repos/NuGet/Home | closed | [Cross-platform Dotnet NuGet Sign] An error occurred when adding the trust for certificate on Mac | Priority:2 Type:Bug Functionality:Signing Product:dotnet.exe Category:Quality Week Found:ManualTests | ### NuGet Product Used
dotnet.exe
### Product Version
.Net SDK 7.0.102
### Worked before?
_No response_
### Impact
It's more difficult to complete my work
### Repro Steps & Context
#### Repro Steps:
1. [Patched dotnet SDK](https://microsoft.sharepoint.com/teams/NuGet/_layouts/15/Doc.aspx?sourcedoc=%7b8a3c6685-1661-47f4-8807-b765ca62aea1%7d&action=edit&wd=target%28Package%20Signing%20Manual%20Test.one%7C0344a15b-01b1-4828-af4c-63fe53687f8a%2FPatch%20dotnet%20SDK%7Cfd0e3c41-e7a8-4de8-b83a-1a09fdb56271%2F%29&wdorigin=703).
2. Create a new test certificate: ` .\CreateTestCertificate.ps1 -AddAsTrustedRootAuthority -Password password -GenerateCerFile `(in the powershell "Developer Command Prompt") on Windows.
3. Copy the .cer file(should be generated under the same path with .pfx file) and the .pfx file from the above Windows machine to macOS machine.
4. Add the trust for certificate: `./dotnet run --project ./Entropy/TrustTestCert/TrustTestCert.csproj --framework net7.0 -- add -c <CertificateFilePath> -vsd <VersionedSdkDirectoryPath>` on Mac.
#### Expected:
Adding the trust for certificate is successful.
#### Actual:
An error occurred when adding the trust for certificate as below screenshot.

#### Notes:
1. The issue doesn’t repro on Windows & Linux.
2. The issue doesn’t repro on .NET SDK Version: 7.0.101 patched with Dev\6.5.0.136.
### Verbose Logs
_No response_ | 1.0 | [Cross-platform Dotnet NuGet Sign] An error occurred when adding the trust for certificate on Mac - ### NuGet Product Used
dotnet.exe
### Product Version
.Net SDK 7.0.102
### Worked before?
_No response_
### Impact
It's more difficult to complete my work
### Repro Steps & Context
#### Repro Steps:
1. [Patched dotnet SDK](https://microsoft.sharepoint.com/teams/NuGet/_layouts/15/Doc.aspx?sourcedoc=%7b8a3c6685-1661-47f4-8807-b765ca62aea1%7d&action=edit&wd=target%28Package%20Signing%20Manual%20Test.one%7C0344a15b-01b1-4828-af4c-63fe53687f8a%2FPatch%20dotnet%20SDK%7Cfd0e3c41-e7a8-4de8-b83a-1a09fdb56271%2F%29&wdorigin=703).
2. Create a new test certificate: ` .\CreateTestCertificate.ps1 -AddAsTrustedRootAuthority -Password password -GenerateCerFile `(in the powershell "Developer Command Prompt") on Windows.
3. Copy the .cer file(should be generated under the same path with .pfx file) and the .pfx file from the above Windows machine to macOS machine.
4. Add the trust for certificate: `./dotnet run --project ./Entropy/TrustTestCert/TrustTestCert.csproj --framework net7.0 -- add -c <CertificateFilePath> -vsd <VersionedSdkDirectoryPath>` on Mac.
#### Expected:
Adding the trust for certificate is successful.
#### Actual:
An error occurred when adding the trust for certificate as below screenshot.

#### Notes:
1. The issue doesn’t repro on Windows & Linux.
2. The issue doesn’t repro on .NET SDK Version: 7.0.101 patched with Dev\6.5.0.136.
### Verbose Logs
_No response_ | test | an error occurred when adding the trust for certificate on mac nuget product used dotnet exe product version net sdk worked before no response impact it s more difficult to complete my work repro steps context repro steps create a new test certificate createtestcertificate addastrustedrootauthority password password generatecerfile in the powershell developer command prompt on windows copy the cer file should be generated under the same path with pfx file and the pfx file from the above windows machine to macos machine add the trust for certificate dotnet run project entropy trusttestcert trusttestcert csproj framework add c vsd on mac expected adding the trust for certificate is successful actual an error occurred when adding the trust for certificate as below screenshot notes the issue doesn’t repro on windows linux the issue doesn’t repro on net sdk version patched with dev verbose logs no response | 1 |
250,252 | 18,876,946,048 | IssuesEvent | 2021-11-14 06:17:18 | girlscript/winter-of-contributing | https://api.github.com/repos/girlscript/winter-of-contributing | closed | Lowest common ancestor of a binary tree | documentation GWOC21 Assigned C/CPP | ### Description
lowest common ancestor (lca) in a binary tree
### Domain
C/CPP
### Type of Contribution
Documentation
### Code of Conduct
- [X] I follow [Contributing Guidelines](https://github.com/girlscript/winter-of-contributing/blob/main/.github/CONTRIBUTING.md) & [Code of conduct](https://github.com/girlscript/winter-of-contributing/blob/main/.github/CODE_OF_CONDUCT.md) of this project. | 1.0 | Lowest common ancestor of a binary tree - ### Description
lowest common ancestor (lca) in a binary tree
### Domain
C/CPP
### Type of Contribution
Documentation
### Code of Conduct
- [X] I follow [Contributing Guidelines](https://github.com/girlscript/winter-of-contributing/blob/main/.github/CONTRIBUTING.md) & [Code of conduct](https://github.com/girlscript/winter-of-contributing/blob/main/.github/CODE_OF_CONDUCT.md) of this project. | non_test | lowest common ancestor of a binary tree description lowest common ancestor lca in a binary tree domain c cpp type of contribution documentation code of conduct i follow of this project | 0 |
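The LCA topic documented in the record above lends itself to a short sketch. A minimal Python illustration (the issue targets C/CPP; the `Node` class and the sample tree here are assumptions of ours, not repository code):

```python
# Minimal LCA sketch. Node and the sample tree are illustrative only.

class Node:
    def __init__(self, val, left=None, right=None):
        self.val = val
        self.left = left
        self.right = right

def lca(root, a, b):
    """Lowest node having both values a and b in its subtree
    (assumes both values are present in the tree)."""
    if root is None:
        return None
    if root.val == a or root.val == b:
        return root
    left = lca(root.left, a, b)
    right = lca(root.right, a, b)
    if left and right:
        return root           # a and b sit in different subtrees
    return left or right      # both on one side, or not found here

# Sample tree:      3
#                  / \
#                 5   1
#                / \
#               6   2
tree = Node(3, Node(5, Node(6), Node(2)), Node(1))
print(lca(tree, 6, 2).val)  # -> 5
print(lca(tree, 5, 1).val)  # -> 3
```

The recursion returns the first node where the two search values split into different subtrees, which is the classic single-pass approach.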
256,296 | 8,127,330,249 | IssuesEvent | 2018-08-17 07:39:36 | aowen87/BAR | https://api.github.com/repos/aowen87/BAR | closed | viewercore introduces _ser library dependencies into simV2runtime_par. | Bug Likelihood: 3 - Occasional Priority: High Severity: 3 - Major Irritation | The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries.
The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds.
As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 2017
Status: Resolved
Project: VisIt
Tracker: Bug
Priority: High
Subject: viewercore introduces _ser library dependencies into simV2runtime_par.
Assigned to: Brad Whitlock
Category:
Target version: 2.11.0
Author: Brad Whitlock
Start: 10/09/2014
Due date:
% Done: 100
Estimated time:
Created: 10/09/2014 03:40 pm
Updated: 06/28/2016 03:37 pm
Likelihood: 3 - Occasional
Severity: 3 - Major Irritation
Found in version: 2.7.3
Impact:
Expected Use:
OS: All
Support Group: Any
Description:
The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries.
The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds.
As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
Comments:
This needs to be fixed to test simv2 in parallel some more and to test it with static builds.
I'm working on this right now.
I moved a few things from viewercore into libviewer and changed the build to produce viewercore_ser, viewercore_par libraries with appropriate dependencies. This prevents _ser libraries from linking into the SimV2 parallel runtime library. For static builds, the SimV2 reader is built into the SimV2 runtime library. I turned off the build of some rpc and proxy libraries for engine or server only builds.
| 1.0 | viewercore introduces _ser library dependencies into simV2runtime_par. - The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries.
The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds.
As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
-----------------------REDMINE MIGRATION-----------------------
This ticket was migrated from Redmine. As such, not all
information was able to be captured in the transition. Below is
a complete record of the original redmine ticket.
Ticket number: 2017
Status: Resolved
Project: VisIt
Tracker: Bug
Priority: High
Subject: viewercore introduces _ser library dependencies into simV2runtime_par.
Assigned to: Brad Whitlock
Category:
Target version: 2.11.0
Author: Brad Whitlock
Start: 10/09/2014
Due date:
% Done: 100
Estimated time:
Created: 10/09/2014 03:40 pm
Updated: 06/28/2016 03:37 pm
Likelihood: 3 - Occasional
Severity: 3 - Major Irritation
Found in version: 2.7.3
Impact:
Expected Use:
OS: All
Support Group: Any
Description:
The viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim. Since viewercore relies on some AVT stuff and was destined for the viewer, it ends up linking with some _ser avt libraries. When viewercore is linked into the parallel simV2runtime_par library, it brings along its _ser dependencies, even though we really want only the _par versions of the avt libraries.
The best thing might be to build the viewercore sources directly into the simV2runtime libraries instead of linking with viewercore. This would permit us to use the _ser and _par libraries that we want explicitly. We could also omit files like ViewerFileServer and ViewerEngineManager, which are not needed by the simV2runtime, and have some _ser dependencies of their own due to libraries like engineproxy. Furthermore, by not using some of those classes, we could revert to not building libraries like mdserverrpc, mdserverproxy, launcherrpc, launcherproxy, when we're doing engine-only builds.
As things stand on the trunk, engine-only builds are broken because the simv2 runtimes can't be linked with viewercore, which is not built.
Comments:
This needs to be fixed to test simv2 in parallel some more and to test it with static builds.
I'm working on this right now.
I moved a few things from viewercore into libviewer and changed the build to produce viewercore_ser, viewercore_par libraries with appropriate dependencies. This prevents _ser libraries from linking into the SimV2 parallel runtime library. For static builds, the SimV2 reader is built into the SimV2 runtime library. I turned off the build of some rpc and proxy libraries for engine or server only builds.
| non_test | viewercore introduces ser library dependencies into par the viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim since viewercore relies on some avt stuff and was destined for the viewer it ends up linking with some ser avt libraries when viewercore is linked into the parallel par library it brings along its ser dependencies even though we really want only the par versions of the avt libraries the best thing might be to build the viewercore sources directly into the libraries instead of linking with viewercore this would permit us to use the ser and par libraries that we want explicitly we could also omit files like viewerfileserver and viewerenginemanager which are not needed by the and have some ser dependencies of their own due to libraries like engineproxy furthermore by not using some of those classes we could revert to not building libraries like mdserverrpc mdserverproxy launcherrpc launcherproxy when we re doing engine only builds as things stand on the trunk engine only builds are broken because the runtimes can t be linked with viewercore which is not built redmine migration this ticket was migrated from redmine as such not all information was able to be captured in the transition below is a complete record of the original redmine ticket ticket number status resolved project visit tracker bug priority high subject viewercore introduces ser library dependencies into par assigned to brad whitlock category target version author brad whitlock start due date done estimated time created pm updated pm likelihood occasional severity major irritation found in version impact expected use os all support group any description the viewercore library contains the core parts of the viewer and it is used to add viewer functionality to libsim since viewercore relies on some avt stuff and was destined for the viewer it ends up linking with some ser avt libraries when viewercore is linked into the 
parallel par library it brings along its ser dependencies even though we really want only the par versions of the avt libraries the best thing might be to build the viewercore sources directly into the libraries instead of linking with viewercore this would permit us to use the ser and par libraries that we want explicitly we could also omit files like viewerfileserver and viewerenginemanager which are not needed by the and have some ser dependencies of their own due to libraries like engineproxy furthermore by not using some of those classes we could revert to not building libraries like mdserverrpc mdserverproxy launcherrpc launcherproxy when we re doing engine only builds as things stand on the trunk engine only builds are broken because the runtimes can t be linked with viewercore which is not built comments this needs to be fixed to test in parallel some more and to test it with static builds i m working on this right now i moved a few things from viewercore into libviewer and changed the build to produce viewercore ser viewercore par libraries with appropriate dependencies this prevents ser libraries from linking into the parallel runtime library for static builds the reader is built into the runtime library i turned off the build of some rpc and proxy libraries for engine or server only builds | 0 |
26,072 | 4,197,756,753 | IssuesEvent | 2016-06-27 05:09:37 | hsyyid/EssentialCmds | https://api.github.com/repos/hsyyid/EssentialCmds | closed | blacklist | high priority needs testing sponge implementation missing | Banned items are removed only after taking them into hand. Would it be possible to remove them immediately on pickup or prevent pickup? | 1.0 | blacklist - Banned items are removed only after taking them into hand. Would it be possible to remove them immediately on pickup or prevent pickup? | test | blacklist banned items are removed only after taking them into hand would it be possible to remove them immediately on pickup or prevent pickup | 1 |
238,513 | 19,724,844,735 | IssuesEvent | 2022-01-13 18:53:33 | Sifchain/sifnode | https://api.github.com/repos/Sifchain/sifnode | opened | Minting/Pegging- Script Improvements | Testing Team | These are the requirements identified to improve certain aspects of minting/pegging script:
1) Exporting list of tokens from Betanet as assets.json that would be input to the minting script for Devnet/Testnet
2) Transferring the tokens over from Ethereum to Sifchain and distributing them to test accounts
1) Exporting list of tokens from Betanet as assets.json that would be input to the minting script for Devnet/Testnet
2) Transferring the tokens over from Ethereum to Sifchain and distributing them to test accounts | test | minting pegging script improvements these are the requirements identified to improve certain aspects of minting pegging script exporting list of tokens from betanet as assets json that would be input to the minting script for devnet testnet transferring the tokens over from ethereum to sifchain and distribute to test accounts | 1 |
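The two steps in the record above can be sketched, under heavy assumptions, as reading an exported `assets.json` and fanning tokens out to test accounts (the file shape, symbols, and account names are all hypothetical, not the actual Sifchain format):

```python
# Hypothetical sketch of the minting-script input flow: parse an exported
# assets.json token list and split each token evenly across test accounts.

import json

assets_json = '{"tokens": [{"symbol": "ceth", "amount": 300}, {"symbol": "cusdc", "amount": 900}]}'
test_accounts = ["sif1test-a", "sif1test-b", "sif1test-c"]

tokens = json.loads(assets_json)["tokens"]
distribution = {
    acct: {t["symbol"]: t["amount"] // len(test_accounts) for t in tokens}
    for acct in test_accounts
}
print(distribution["sif1test-a"])  # -> {'ceth': 100, 'cusdc': 300}
```

In a real pipeline the `assets.json` would come from the Betanet export and the transfer itself would go through the peg bridge; this only shows the bookkeeping shape.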
114,744 | 14,630,666,308 | IssuesEvent | 2020-12-23 18:11:29 | department-of-veterans-affairs/va.gov-team | https://api.github.com/repos/department-of-veterans-affairs/va.gov-team | opened | Solve and Create First Version of Search for a Representative | POA-Search design vsa vsa-ebenefits | Use known components
Address those that do not take digital submissions | 1.0 | Solve and Create First Version of Search for a Representative - Use known components
Address those that do not take digital submissions | non_test | solve and create first version of search for a representative use known components address those that do not take digital submissions | 0 |
61,791 | 6,758,403,783 | IssuesEvent | 2017-10-24 14:06:56 | artin-phares/microcosm | https://api.github.com/repos/artin-phares/microcosm | closed | rejected promises do not fail tests | bug test | if a promise of the target method is expected to be rejected but it isn't - it should fail the test.
currently such a test stays green, with ignored `unhandled promise rejection` error in console | 1.0 | rejected promises do not fail tests - if a promise of the target method is expected to be rejected but it isn't - it should fail the test.
currently such a test stays green, with ignored `unhandled promise rejection` error in console | test | rejected promises do not fail tests if promise of target method expected to be rejected but it don t it should fail the test currently such test stays green with ignored unhandled promise rejection error in console | 1 |
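The failure mode in the record above — an expected rejection that silently never happens — can be illustrated with a minimal Python analogue (the `assert_raises` helper and the sample functions are hypothetical, not microcosm code):

```python
# A test that expects a failure must itself fail when the failure
# does not happen; otherwise it stays green for the wrong reason.

def assert_raises(exc_type, fn, *args):
    """Fail loudly if fn(*args) does NOT raise exc_type."""
    try:
        fn(*args)
    except exc_type:
        return  # expected rejection happened: test passes
    raise AssertionError(f"{fn.__name__} was expected to raise {exc_type.__name__}")

def rejecting():
    raise ValueError("rejected")

def resolving():
    return "ok"

assert_raises(ValueError, rejecting)        # passes silently

try:
    assert_raises(ValueError, resolving)    # must turn a green test red
    outcome = "missed"
except AssertionError:
    outcome = "caught"
print(outcome)  # -> caught
```

The JavaScript equivalent is returning (or awaiting) the promise inside the test so the runner sees the missing rejection instead of an unhandled-rejection warning in the console.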
832,409 | 32,079,417,829 | IssuesEvent | 2023-09-25 13:05:40 | yalelibrary/YUL-DC | https://api.github.com/repos/yalelibrary/YUL-DC | closed | 9/20 Deployment | dev-ops HIGH PRIORITY | **Versions**
Management: no changes
Blacklight: no changes
**Production**
- [x] restart image servers | 1.0 | 9/20 Deployment - **Versions**
Management: no changes
Blacklight: no changes
**Production**
- [x] restart image servers | non_test | deployment versions management no changes blacklight no changes production restart image servers | 0 |
126,413 | 10,421,050,735 | IssuesEvent | 2019-09-16 04:11:41 | Fundynamic/RealBot | https://api.github.com/repos/Fundynamic/RealBot | closed | rblog() causes extremely low FPS when running on normal magnetic HDD (non-SSD) | bug ready-for-test | as title.
video:
https://youtu.be/5g2ZyasCHQc
I profiled a bit with Instruments tool included with xcode, and found that rblog() (which writes a lot of log messages to reallog.txt) caused problem:

possible solution:
only open the reallog.txt file once, and close it when program exits. | 1.0 | rblog() causes extremely low FPS when running on normal magnetic HDD (non-SSD) - as title.
video:
https://youtu.be/5g2ZyasCHQc
I profiled a bit with Instruments tool included with xcode, and found that rblog() (which writes a lot of log messages to reallog.txt) caused problem:

possible solution:
only open the reallog.txt file once, and close it when program exits. | test | rblog causes extremely low fps when running on normal magnetic hdd non ssd as title video i profiled a bit with instruments tool included with xcode and found that rblog which writes a lot of log messages to reallog txt caused problem possible solution only open the reallog txt file once and close it when program exits | 1 |
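The proposed fix — open the log sink once and close it only at program exit, instead of reopening reallog.txt inside every rblog() call — can be sketched like this (TypeScript with an in-memory sink standing in for the file; RealBot itself is C++ and would keep a single FILE* in the same way):

```typescript
// In-memory stand-in for reallog.txt so the open-once pattern is visible:
// `opens` counts how many times the sink was (re)opened.
class RealLog {
  private sink: { opens: number; lines: string[] } | null = null;

  private ensureOpen(): { opens: number; lines: string[] } {
    if (this.sink === null) {
      this.sink = { opens: 1, lines: [] }; // single open, at first use
    }
    return this.sink;
  }

  rblog(msg: string): void {
    this.ensureOpen().lines.push(msg); // no per-call open/close
  }

  close(): { opens: number; lines: string[] } {
    const s = this.ensureOpen(); // called once, when the program exits
    this.sink = null;
    return s;
  }
}
```

With the per-call open/close removed, the cost of each rblog() is a buffered write, which is what keeps the frame rate stable on a magnetic HDD.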
51,573 | 12,748,067,357 | IssuesEvent | 2020-06-26 19:17:28 | cucumber/cucumber | https://api.github.com/repos/cucumber/cucumber | closed | javascript: Eslint 7 requires explicit types on module boundaries | language: javascript type: build | When running `make pre-release` cucumber will automatically upgrade all dependencies. This includes an upgrade of `eslint` which brings some new rules.
The rule looks pretty sensible but I've suppressed this check for now. Enabling it will require an upgrade of all js projects.
Btw: `GeneratedExpression` and `GroupBuilder` shouldn't be public and don't need any types.
```
> @cucumber/cucumber-expressions@10.2.0 lint-fix /app/cucumber-expressions/javascript
> eslint --ext ts --max-warnings 0 --fix src test
Warning: React version was set to "detect" in eslint-plugin-react settings, but the "react" package is not installed. Assuming latest React version for linting.
/app/cucumber-expressions/javascript/src/Argument.ts
47:18 warning Argument 'thisObj' should be typed @typescript-eslint/explicit-module-boundary-types
52:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/GeneratedExpression.ts
10:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/GroupBuilder.ts
9:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
26:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
30:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
34:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/ParameterType.ts
9:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
19:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
65:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
65:19 warning Argument 'thisObj' should be typed @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/ParameterTypeMatcher.ts
16:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
42:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
46:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
50:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
54:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
58:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
66:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
70:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/ParameterTypeRegistry.ts
72:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
76:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
106:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
✖ 22 problems (0 errors, 22 warnings)
``` | 1.0 | javascript: Eslint 7 requires explicit types on module boundaries - When running `make pre-release` cucumber will automatically upgrade all dependencies. This includes an upgrade of `eslint` which brings some new rules.
The rule looks pretty sensible but I've suppressed this check for now. Enabling it will require an upgrade of all js projects.
Btw: `GeneratedExpression` and `GroupBuilder` shouldn't be public and don't need any types.
```
> @cucumber/cucumber-expressions@10.2.0 lint-fix /app/cucumber-expressions/javascript
> eslint --ext ts --max-warnings 0 --fix src test
Warning: React version was set to "detect" in eslint-plugin-react settings, but the "react" package is not installed. Assuming latest React version for linting.
/app/cucumber-expressions/javascript/src/Argument.ts
47:18 warning Argument 'thisObj' should be typed @typescript-eslint/explicit-module-boundary-types
52:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/GeneratedExpression.ts
10:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/GroupBuilder.ts
9:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
26:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
30:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
34:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/ParameterType.ts
9:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
19:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
65:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
65:19 warning Argument 'thisObj' should be typed @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/ParameterTypeMatcher.ts
16:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
42:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
46:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
50:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
54:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
58:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
66:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
70:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
/app/cucumber-expressions/javascript/src/ParameterTypeRegistry.ts
72:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
76:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
106:3 warning Missing return type on function @typescript-eslint/explicit-module-boundary-types
✖ 22 problems (0 errors, 22 warnings)
``` | non_test | javascript eslint requires explicit types on module boundries when running make pre release cucumber will automatically upgrade all dependencies this includes an upgrade of eslint which brings some new rules the rule looks pretty sensible but i ve supressed this check for now enabling it will require an upgrade of all js projects btw generatedexpression and groupbuilder shouldn t be public and not need any types cucumber cucumber expressions lint fix app cucumber expressions javascript eslint ext ts max warnings fix src test warning react version was set to detect in eslint plugin react settings but the react package is not installed assuming latest react version for linting app cucumber expressions javascript src argument ts warning argument thisobj should be typed typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types app cucumber expressions javascript src generatedexpression ts warning missing return type on function typescript eslint explicit module boundary types app cucumber expressions javascript src groupbuilder ts warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types app cucumber expressions javascript src parametertype ts warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning argument thisobj should be typed typescript eslint explicit module boundary types app cucumber expressions javascript src parametertypematcher ts warning missing return 
type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types app cucumber expressions javascript src parametertyperegistry ts warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types warning missing return type on function typescript eslint explicit module boundary types ✖ problems errors warnings | 0 |
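The rule behind these warnings, `@typescript-eslint/explicit-module-boundary-types`, asks exported functions to declare their parameter and return types. A minimal before/after sketch (the function name is made up for illustration, not taken from cucumber-expressions):

```typescript
// Before — ESLint 7 with typescript-eslint would warn here:
//   "Missing return type on function"
//   "Argument 'input' should be typed"
// export function toCanonical(input) { return input.trim().toLowerCase(); }

// After — explicit types on the module boundary satisfy the rule:
export function toCanonical(input: string): string {
  return input.trim().toLowerCase();
}
```

Fixing the 22 warnings above amounts to repeating this change on each flagged public method, which is why it is deferred to a coordinated upgrade of all the js projects.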
301,738 | 26,093,196,823 | IssuesEvent | 2022-12-26 14:42:55 | bounswe/bounswe2022group4 | https://api.github.com/repos/bounswe/bounswe2022group4 | opened | Frontend: Unit Test for Search Functionality | Status: In Test whom: individual Language - React.js Frontend | ### About Test:
The search feature of the medical experience sharing platform has been implemented, but it does not have any unit test cases. Since it is important to ensure that the search is functioning correctly, I will implement unit tests for it. Thanks to these tests, the likelihood of the search malfunctioning will be decreased.
### Steps to be taken:
* I will design realistic scenarios that can be encountered while using the medical experience sharing platform.
* I will determine expected results according to the scenarios.
* I will implement unit tests by comparing actual results with expected results.
* If there is a problem, I will fix it with the help of the tests.
I will use @testing-library/react to implement unit tests.
### 🔎 Reviewer:
@umutdenizsenerr
### ⏰ Deadline:
26.12.2022 23.59 | 1.0 | Frontend: Unit Test for Search Functionality - ### About Test:
The search feature of the medical experience sharing platform has been implemented, but it does not have any unit test cases. Since it is important to ensure that the search is functioning correctly, I will implement unit tests for it. Thanks to these tests, the likelihood of the search malfunctioning will be decreased.
### Steps to be taken:
* I will design realistic scenarios that can be encountered while using the medical experience sharing platform.
* I will determine expected results according to the scenarios.
* I will implement unit tests by comparing actual results with expected results.
* If there is a problem, I will fix it with the help of the tests.
I will use @testing-library/react to implement unit tests.
### 🔎 Reviewer:
@umutdenizsenerr
### ⏰ Deadline:
26.12.2022 23.59 | test | frontend unit test for search functionality about test the search feature of the medical experience sharing platform has been implemented but it does not have any unit test cases since it is important to ensure that the search is functioning correctly i will impleement unit test for it thanks to tests the possibility of malfunctioning of the search will be decreased steps to be taken i will design realistic scenarious that can be encountered while using the medical experience sharing platform i will determine expected results according to scenarious i will implement unit test by comparing actual results with expexted results if there is a problemi i will fix it thanks to tests i will use testing library react to impelement unit tests 🔎 reviewer umutdenizsenerr ⏰ deadline | 1 |
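The expected-vs-actual comparison such unit tests make can be sketched in plain TypeScript (the `searchPosts` filter here is hypothetical; the real tests would render the search component with @testing-library/react):

```typescript
// Hypothetical filter resembling the platform's search behaviour,
// used to show how expected results are compared against actual results.
interface Post {
  title: string;
}

export function searchPosts(posts: Post[], query: string): Post[] {
  const q = query.trim().toLowerCase();
  if (q === "") return []; // realistic scenario: no query entered yet
  return posts.filter(p => p.title.toLowerCase().includes(q));
}
```

Each test case pairs a realistic query with the result list the user should see, so a regression in matching or case handling fails an assertion instead of reaching users.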
37,884 | 5,147,357,061 | IssuesEvent | 2017-01-13 06:49:22 | jiksoo75/MOHA-Kafka | https://api.github.com/repos/jiksoo75/MOHA-Kafka | opened | Testing of the new MOHA TaskExecutor Implementation | Testing | Based on the implementation of new MOHA Manager Class in Issue #33, we need to test its functionality on our testbed. | 1.0 | Testing of the new MOHA TaskExecutor Implementation - Based on the implementation of new MOHA Manager Class in Issue #33, we need to test its functionality on our testbed. | test | testing of the new moha taskexecutor implementation based on the implementation of new moha manager class in issue we need to test its functionality on our testbed | 1 |
32,003 | 2,742,739,180 | IssuesEvent | 2015-04-21 17:57:58 | fog/fog-google | https://api.github.com/repos/fog/fog-google | opened | Images functionality broken due to API change | bug priority/high | Per #20 and testing done yesterday:
The `images` functionality is currently broken, as [the API](https://cloud.google.com/compute/docs/reference/latest/images#resource) has changed, but the codebase hasn't reflected that change. | 1.0 | Images functionality broken due to API change - Per #20 and testing done yesterday:
The `images` functionality is currently broken, as [the API](https://cloud.google.com/compute/docs/reference/latest/images#resource) has changed, but the codebase hasn't reflected that change. | non_test | images functionality broken due to api change per and testing done yesterday the images functionality is currently broken as has changed but the codebase hasn t reflected that change | 0 |
73,596 | 7,345,578,867 | IssuesEvent | 2018-03-07 17:51:17 | openshift/origin | https://api.github.com/repos/openshift/origin | closed | Jenkins testing (v2) should pass a smoke test after an s2i build | kind/test-flake | https://ci.openshift.redhat.com/jenkins/job/push_jenkins_images/139/
```
Jenkins testing (v2)
should pass a smoke test after an s2i build
/data/src/github.com/openshift/jenkins/2/test/jenkins_test.go:181
STEP: running s2i build
Checking if image "registry.access.redhat.com/openshift3/jenkins-2-rhel7-candidate" is available locally ...
Checking if image "registry.access.redhat.com/openshift3/jenkins-2-rhel7-candidate" is available locally ...
panic: test timed out after 30m0s
```
<summary>
<details>
```
goroutine 1875 [running]:
testing.startAlarm.func1()
/usr/lib/golang/src/testing/testing.go:1145 +0xf9
created by time.goFunc
/usr/lib/golang/src/time/sleep.go:170 +0x44
goroutine 1 [chan receive, 29 minutes]:
testing.(*T).Run(0xc420358000, 0x84b9b2, 0x4, 0x86c338, 0x480dc6)
/usr/lib/golang/src/testing/testing.go:790 +0x2fc
testing.runTests.func1(0xc420358000)
/usr/lib/golang/src/testing/testing.go:1004 +0x64
testing.tRunner(0xc420358000, 0xc4201e3de0)
/usr/lib/golang/src/testing/testing.go:746 +0xd0
testing.runTests(0xc420237260, 0xb17c30, 0x1, 0x1, 0x2cd)
/usr/lib/golang/src/testing/testing.go:1002 +0x2d8
testing.(*M).Run(0xc420433f18, 0xc4201e3f70)
/usr/lib/golang/src/testing/testing.go:921 +0x111
main.main()
github.com/openshift/jenkins/2/test/_test/_testmain.go:44 +0xdb
goroutine 5 [syscall, 29 minutes]:
os/signal.signal_recv(0x0)
/usr/lib/golang/src/runtime/sigqueue.go:131 +0xa6
os/signal.loop()
/usr/lib/golang/src/os/signal/signal_unix.go:22 +0x22
created by os/signal.init.0
/usr/lib/golang/src/os/signal/signal_unix.go:28 +0x41
goroutine 7 [syscall, 25 minutes]:
syscall.Syscall6(0xf7, 0x1, 0x4775, 0xc42016aed0, 0x1000004, 0x0, 0x0, 0x0, 0x0, 0x0)
/usr/lib/golang/src/syscall/asm_linux_amd64.s:44 +0x5
os.(*Process).blockUntilWaitable(0xc4201da510, 0xc4201da510, 0x0, 0x0)
/usr/lib/golang/src/os/wait_waitid.go:31 +0xa5
os.(*Process).wait(0xc4201da510, 0xc42043ff20, 0xc420467578, 0x60)
/usr/lib/golang/src/os/exec_unix.go:22 +0x42
os.(*Process).Wait(0xc4201da510, 0x0, 0x0, 0x86c7f8)
/usr/lib/golang/src/os/exec.go:115 +0x2b
os/exec.(*Cmd).Wait(0xc4204674a0, 0x0, 0x0)
/usr/lib/golang/src/os/exec/exec.go:446 +0x62
os/exec.(*Cmd).Run(0xc4204674a0, 0xc42043ff20, 0xc42016b228)
/usr/lib/golang/src/os/exec/exec.go:289 +0x5c
github.com/openshift/jenkins/2/test.glob..func2.7()
/data/src/github.com/openshift/jenkins/2/test/jenkins_test.go:203 +0x34f
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).runSync(0xc420274d20, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:110 +0x9c
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*runner).run(0xc420274d20, 0x3, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/leafnodes/runner.go:64 +0x13e
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/leafnodes.(*ItNode).Run(0xc4202371a0, 0xa5e1c0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/leafnodes/it_node.go:26 +0x7f
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).runSample(0xc420081930, 0x0, 0xa5e1c0, 0xc420076cf0)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:176 +0x5a9
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/spec.(*Spec).Run(0xc420081930, 0xa5e1c0, 0xc420076cf0)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/spec/spec.go:127 +0xe3
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpec(0xc4202937c0, 0xc420081930, 0x0)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:198 +0x10d
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).runSpecs(0xc4202937c0, 0x86c301)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:168 +0x32c
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run(0xc4202937c0, 0x12)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:64 +0xdc
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/suite.(*Suite).Run(0xc4200102d0, 0x7fc42beca0f0, 0xc4203580f0, 0x85527c, 0x12, 0xc4202aaf60, 0x1, 0x1, 0xa64dc0, 0xc420076cf0, ...)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/suite/suite.go:62 +0x283
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo.RunSpecsWithCustomReporters(0xa5f080, 0xc4203580f0, 0x85527c, 0x12, 0xc42003df50, 0x1, 0x1, 0xf)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:218 +0x276
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo.RunSpecs(0xa5f080, 0xc4203580f0, 0x85527c, 0x12, 0x5371b3)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/ginkgo_dsl.go:199 +0x1e5
github.com/openshift/jenkins/2/test.Test(0xc4203580f0)
/data/src/github.com/openshift/jenkins/2/test/jenkins_test.go:22 +0x64
testing.tRunner(0xc4203580f0, 0x86c338)
/usr/lib/golang/src/testing/testing.go:746 +0xd0
created by testing.(*T).Run
/usr/lib/golang/src/testing/testing.go:789 +0x2de
goroutine 8 [chan receive, 29 minutes]:
github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).registerForInterrupts(0xc4202937c0)
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:220 +0xd7
created by github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner.(*SpecRunner).Run
/data/src/github.com/openshift/jenkins/vendor/github.com/onsi/ginkgo/internal/specrunner/spec_runner.go:59 +0x60
goroutine 18 [select, 29 minutes, locked to thread]:
runtime.gopark(0x86cbe0, 0x0, 0x84d9b0, 0x6, 0x18, 0x1)
/usr/lib/golang/src/runtime/proc.go:287 +0x12c
runtime.selectgo(0xc420038f50, 0xc420370060)
/usr/lib/golang/src/runtime/select.go:395 +0x1149
runtime.ensureSigM.func1()
/usr/lib/golang/src/runtime/signal_unix.go:511 +0x220
runtime.goexit()
/usr/lib/golang/src/runtime/asm_amd64.s:2337 +0x1
goroutine 11 [IO wait, 25 minutes]:
internal/poll.runtime_pollWait(0x7fc42bec1f70, 0x72, 0x0)
/usr/lib/golang/src/runtime/netpoll.go:173 +0x57
internal/poll.(*pollDesc).wait(0xc42037c098, 0x72, 0xffffffffffffff00, 0xa60400, 0xa5c4c0)
/usr/lib/golang/src/internal/poll/fd_poll_runtime.go:85 +0xae
internal/poll.(*pollDesc).waitRead(0xc42037c098, 0xc42009f000, 0x1000, 0x1000)
/usr/lib/golang/src/internal/poll/fd_poll_runtime.go:90 +0x3d
internal/poll.(*FD).Read(0xc42037c080, 0xc42009f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/usr/lib/golang/src/internal/poll/fd_unix.go:126 +0x18a
net.(*netFD).Read(0xc42037c080, 0xc42009f000, 0x1000, 0x1000, 0x42f05b, 0xc420388a40, 0x45a6a0)
/usr/lib/golang/src/net/fd_unix.go:202 +0x52
net.(*conn).Read(0xc42000e4b8, 0xc42009f000, 0x1000, 0x1000, 0x0, 0x0, 0x0)
/usr/lib/golang/src/net/net.go:176 +0x6d
net/http.(*persistConn).Read(0xc42009dc20, 0xc42009f000, 0x1000, 0x1000, 0xc4202e2640, 0xc42038c058, 0x458740)
/usr/lib/golang/src/net/http/transport.go:1391 +0x140
bufio.(*Reader).fill(0xc420275320)
/usr/lib/golang/src/bufio/bufio.go:97 +0x11a
bufio.(*Reader).Peek(0xc420275320, 0x1, 0x0, 0x0, 0x0, 0xc420341920, 0x0)
/usr/lib/golang/src/bufio/bufio.go:129 +0x3a
net/http.(*persistConn).readLoop(0xc42009dc20)
/usr/lib/golang/src/net/http/transport.go:1539 +0x185
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 12 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc42009dc20)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 647 [select, 29 minutes]:
net/http.(*persistConn).writeLoop(0xc42009dd40)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 646 [select, 29 minutes]:
net/http.(*persistConn).readLoop(0xc42009dd40)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 390 [select, 29 minutes]:
net/http.(*persistConn).writeLoop(0xc42009cea0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 658 [select, 29 minutes]:
net/http.(*persistConn).writeLoop(0xc420436ea0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 389 [select, 29 minutes]:
net/http.(*persistConn).readLoop(0xc42009cea0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1215 [select, 27 minutes]:
net/http.(*persistConn).readLoop(0xc420327320)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 381 [select, 29 minutes]:
net/http.(*persistConn).writeLoop(0xc420412d80)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 380 [select, 29 minutes]:
net/http.(*persistConn).readLoop(0xc420412d80)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 385 [select, 29 minutes]:
net/http.(*persistConn).writeLoop(0xc420412ea0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 384 [select, 29 minutes]:
net/http.(*persistConn).readLoop(0xc420412ea0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 662 [select, 29 minutes]:
net/http.(*persistConn).writeLoop(0xc4204370e0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1168 [select, 27 minutes]:
net/http.(*persistConn).readLoop(0xc42031c7e0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 661 [select, 29 minutes]:
net/http.(*persistConn).readLoop(0xc4204370e0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 625 [select, 29 minutes]:
net/http.(*persistConn).readLoop(0xc420436ea0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1507 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc42031c360)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1211 [select, 27 minutes]:
net/http.(*persistConn).readLoop(0xc4203270e0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1212 [select, 27 minutes]:
net/http.(*persistConn).writeLoop(0xc4203270e0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1169 [select, 27 minutes]:
net/http.(*persistConn).writeLoop(0xc42031c7e0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1506 [select, 25 minutes]:
net/http.(*persistConn).readLoop(0xc42031c360)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1216 [select, 27 minutes]:
net/http.(*persistConn).writeLoop(0xc420327320)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1839 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc420326a20)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1438 [select, 25 minutes]:
net/http.(*persistConn).readLoop(0xc42031c120)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1439 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc42031c120)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1494 [select, 25 minutes]:
net/http.(*persistConn).readLoop(0xc42009d0e0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1495 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc42009d0e0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1838 [select, 25 minutes]:
net/http.(*persistConn).readLoop(0xc420326a20)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1848 [select, 25 minutes]:
net/http.(*persistConn).readLoop(0xc4204a67e0)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1890 [IO wait, 25 minutes]:
internal/poll.runtime_pollWait(0x7fc42bec1370, 0x72, 0x1)
/usr/lib/golang/src/runtime/netpoll.go:173 +0x57
internal/poll.(*pollDesc).wait(0xc420420978, 0x72, 0xffffffffffffff01, 0xa60400, 0xa5c4c0)
/usr/lib/golang/src/internal/poll/fd_poll_runtime.go:85 +0xae
internal/poll.(*pollDesc).waitRead(0xc420420978, 0xc4201ae001, 0x8000, 0x8000)
/usr/lib/golang/src/internal/poll/fd_poll_runtime.go:90 +0x3d
internal/poll.(*FD).Read(0xc420420960, 0xc4201ae000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
/usr/lib/golang/src/internal/poll/fd_unix.go:126 +0x18a
os.(*File).read(0xc4203f8460, 0xc4201ae000, 0x8000, 0x8000, 0xc4203ece60, 0x6f5e9a, 0xc4200165f8)
/usr/lib/golang/src/os/file_unix.go:216 +0x4e
os.(*File).Read(0xc4203f8460, 0xc4201ae000, 0x8000, 0x8000, 0x6d, 0x0, 0x0)
/usr/lib/golang/src/os/file.go:103 +0x6d
io.copyBuffer(0xa5e1c0, 0xc420076cf0, 0xa5ec40, 0xc4203f8460, 0xc4201ae000, 0x8000, 0x8000, 0x0, 0x0, 0x0)
/usr/lib/golang/src/io/io.go:392 +0x123
io.Copy(0xa5e1c0, 0xc420076cf0, 0xa5ec40, 0xc4203f8460, 0x0, 0x0, 0x0)
/usr/lib/golang/src/io/io.go:362 +0x68
os/exec.(*Cmd).writerDescriptor.func1(0x435118, 0x86ca68)
/usr/lib/golang/src/os/exec/exec.go:264 +0x4d
os/exec.(*Cmd).Start.func1(0xc4204674a0, 0xc420395960)
/usr/lib/golang/src/os/exec/exec.go:380 +0x27
created by os/exec.(*Cmd).Start
/usr/lib/golang/src/os/exec/exec.go:379 +0x646
goroutine 1849 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc4204a67e0)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
goroutine 1852 [select, 25 minutes]:
net/http.(*persistConn).readLoop(0xc4204a6900)
/usr/lib/golang/src/net/http/transport.go:1654 +0x7a7
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1186 +0xa2e
goroutine 1853 [select, 25 minutes]:
net/http.(*persistConn).writeLoop(0xc4204a6900)
/usr/lib/golang/src/net/http/transport.go:1759 +0x165
created by net/http.(*Transport).dialConn
/usr/lib/golang/src/net/http/transport.go:1187 +0xa53
exit status 2
```
</details>
</summary>
FAIL	github.com/openshift/jenkins/2/test	1800.014s | test | 1
99,068 | 8,689,183,656 | IssuesEvent | 2018-12-03 17:58:50 | wp-cli/wp-cli-bundle | https://api.github.com/repos/wp-cli/wp-cli-bundle | closed | Bump PHP Compatibility Checker version minimum to 5.4 | scope:testing | While we raised the required PHP minimum for WP-CLI to version 5.4+ with release v2.0.0, we are still checking for PHP 5.3 compatibility with the PHP Compatibility Checker sniffs. | test | 1
531,294 | 15,444,338,957 | IssuesEvent | 2021-03-08 10:16:39 | AY2021S2-CS2103T-T12-4/tp | https://api.github.com/repos/AY2021S2-CS2103T-T12-4/tp | closed | Bug: Response time adds multiple seconds | priority.High severity.High type.Bug | To recreate, open the app, do a few commands and close the app multiple times.
Sample of error:
{
"endpoints" : [ {
"method" : "GET",
"address" : "jakjdfl",
"data" : "somedata",
"headers" : [ ],
"tagged" : [ ],
"response" : {
"protocolVersion" : "",
"statusCode" : "",
"reasonPhrase" : "",
"statusLine" : "",
"responseEntity" : "",
"responseTime" : " seconds seconds seconds"
}
}, {
"method" : "GET",
"address" : "testingurl",
"data" : "",
"headers" : [ ],
"tagged" : [ ],
"response" : {
"protocolVersion" : "",
"statusCode" : "",
"reasonPhrase" : "",
"statusLine" : "",
"responseEntity" : "",
"responseTime" : " seconds"
}
}, {
"method" : "POST",
"address" : "urlwdata",
"data" : "testingdata",
"headers" : [ ],
"tagged" : [ ],
"response" : {
"protocolVersion" : "",
"statusCode" : "",
"reasonPhrase" : "",
"statusLine" : "",
"responseEntity" : "",
"responseTime" : " seconds"
}
} ]
} | non_test | 0
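The JSON sample above shows `responseTime` gaining one extra " seconds" per open/close cycle, which suggests the unit suffix is appended to an already-formatted string each time the model is re-serialized. A minimal sketch of an idempotent formatter (hypothetical function name; the app's real fields may differ) that cannot accumulate the suffix:

```go
package main

import (
	"fmt"
	"strings"
)

// formatResponseTime is idempotent: it appends " seconds" only when the
// stored value does not already end with the unit, so repeated save/load
// round-trips leave the string unchanged.
func formatResponseTime(v string) string {
	v = strings.TrimSpace(v)
	if v == "" || strings.HasSuffix(v, "seconds") {
		return v
	}
	return v + " seconds"
}

func main() {
	s := "0.42"
	for i := 0; i < 3; i++ { // simulate three save/load cycles
		s = formatResponseTime(s)
	}
	fmt.Println(s) // stays "0.42 seconds", never "0.42 seconds seconds seconds"
}
```

The safer design is to persist the raw numeric duration and attach the unit only at render time; the guard above is the minimal patch when the stored value is already a formatted string.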
761,731 | 26,694,665,264 | IssuesEvent | 2023-01-27 09:18:28 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | twitter.com - design is broken | priority-critical browser-fenix engine-gecko device-tablet | <!-- @browser: Firefox Mobile (Tablet) 109.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 12; Tablet; rv:109.0) Gecko/109.0 Firefox/109.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/117196 -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://twitter.com/firefoxbeta
**Browser / Version**: Firefox Mobile (Tablet) 109.0
**Operating System**: Android 12
**Tested Another Browser**: Yes Chrome
**Problem type**: Design is broken
**Description**: Items are misaligned
**Steps to Reproduce**:
Page is not occupying 100% width.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2023/1/85176f62-4ce0-414b-a810-6b80daf9d5a2.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20230105190654</li><li>channel: beta</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2023/1/77027c23-ce66-466e-b7eb-beaf3b3fcb4a)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | twitter.com - design is broken - <!-- @browser: Firefox Mobile (Tablet) 109.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 12; Tablet; rv:109.0) Gecko/109.0 Firefox/109.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/117196 -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://twitter.com/firefoxbeta
**Browser / Version**: Firefox Mobile (Tablet) 109.0
**Operating System**: Android 12
**Tested Another Browser**: Yes Chrome
**Problem type**: Design is broken
**Description**: Items are misaligned
**Steps to Reproduce**:
Page is not occupying 100% width.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2023/1/85176f62-4ce0-414b-a810-6b80daf9d5a2.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20230105190654</li><li>channel: beta</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2023/1/77027c23-ce66-466e-b7eb-beaf3b3fcb4a)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | non_test | twitter com design is broken url browser version firefox mobile tablet operating system android tested another browser yes chrome problem type design is broken description items are misaligned steps to reproduce page is not occupying width view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel beta hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️ | 0 |
262,512 | 22,908,694,593 | IssuesEvent | 2022-07-16 01:19:47 | backend-br/vagas | https://api.github.com/repos/backend-br/vagas | closed | [São Paulo] Back-end Developer @ Carreira Preta | CLT PJ Pleno JavaScript Remoto DevOps Especialista TypeScript NoSQL Testes Unitários SQL CI GraphQL Docker Stale | <!--
==================================================
"Remoto"
==================================================
-->
<!--
==================================================
"BACK-END DEVELOPER"
==================================================
> Vaga Remota
## Carreira Preta contrata Desenvolvedor Back-End para nossa cliente Cubos
## Descrição da vaga
RESPONSABILIDADES E ATRIBUIÇÕES
- Participar ativamente nas etapas de ideação, planejamento e desenvolvimento dos produtos dos nossos clientes;
- Participar de discussão das tecnologias que serão utilizadas nos projetos;
- Criar endpoints de APIs que serão consumidas por desenvolvedores web, android e iOS. Seja amigo deles! :p
- Compreender as ferramentas que usa em um contexto DevOps, sendo capaz, em situações problemáticas, de desenvolver um raciocínio lógico consistente para investigar e resolver o problema;
- Disposição para ajudar e compartilhar conhecimento;
- Ser uma pessoa curiosa e estar disposta a se arriscar em novas tecnologias.
## Local
Remoto
## Requisitos
**Obrigatórios:**
- Conhecimento em JavaScript ES6+;
- Experiência fazendo APIs;
- Experiência com Docker;
- Experiência com bancos de dados SQL e/ou NoSQL;
**Diferenciais:**
- Leitura na língua inglesa;
- Conhecimento em Typescript;
- Experiência com testes unitários e de integração;
- Experiência com GraphQL;
- Apresentação de portfólio, GitHub ou similares;
- Experiência com CI.
- Conhecimento em linguagem Elixir.
## Benefícios
- Vale refeição ou alimentação; - Plano de saúde Bradesco Saúde ;
- Plano odontológico Bradesco Saúde;
- Vale transporte (Auxílio home office);
- Day off aniversário;
- Licença casamento estendida;
- Licença maternidade estendida;
- Licença paternidade estendida;
- Gympass;
- Aulas semanais multidisciplinares;
- Licença Pet;
- Auxílio compra de livros;
- Auxílio curso de inglês;
- Auxílio bem-estar;
## Contratação
CLT ou PJ - a combinar
## Como se candidatar
Por favor envie um email para magali@carreirapreta.com.br com seu CV anexado - enviar no assunto: Vaga Back-End
## Tempo médio de feedbacks
Costumamos enviar feedbacks em até 05 dias após cada processo.
E-mail para contato em caso de não haver resposta: ola@carreirapreta.com.br
#### Alocação
- Remoto
#### Regime
- CLT
- PJ
#### Nível
- Pleno
- Sênior
- Especialista
| 1.0 | [São Paulo] Back-end Developer @ Carreira Preta - <!--
==================================================
"Remoto"
==================================================
-->
<!--
==================================================
"BACK-END DEVELOPER"
==================================================
> Vaga Remota
## Carreira Preta contrata Desenvolvedor Back-End para nossa cliente Cubos
## Descrição da vaga
RESPONSABILIDADES E ATRIBUIÇÕES
- Participar ativamente nas etapas de ideação, planejamento e desenvolvimento dos produtos dos nossos clientes;
- Participar de discussão das tecnologias que serão utilizadas nos projetos;
- Criar endpoints de APIs que serão consumidas por desenvolvedores web, android e iOS. Seja amigo deles! :p
- Compreender as ferramentas que usa em um contexto DevOps, sendo capaz, em situações problemáticas, de desenvolver um raciocínio lógico consistente para investigar e resolver o problema;
- Disposição para ajudar e compartilhar conhecimento;
- Ser uma pessoa curiosa e estar disposta a se arriscar em novas tecnologias.
## Local
Remoto
## Requisitos
**Obrigatórios:**
- Conhecimento em JavaScript ES6+;
- Experiência fazendo APIs;
- Experiência com Docker;
- Experiência com bancos de dados SQL e/ou NoSQL;
**Diferenciais:**
- Leitura na língua inglesa;
- Conhecimento em Typescript;
- Experiência com testes unitários e de integração;
- Experiência com GraphQL;
- Apresentação de portfólio, GitHub ou similares;
- Experiência com CI.
- Conhecimento em linguagem Elixir.
## Benefícios
- Vale refeição ou alimentação; - Plano de saúde Bradesco Saúde ;
- Plano odontológico Bradesco Saúde;
- Vale transporte (Auxílio home office);
- Day off aniversário;
- Licença casamento estendida;
- Licença maternidade estendida;
- Licença paternidade estendida;
- Gympass;
- Aulas semanais multidisciplinares;
- Licença Pet;
- Auxílio compra de livros;
- Auxílio curso de inglês;
- Auxílio bem-estar;
## Contratação
CLT ou PJ - a combinar
## Como se candidatar
Por favor envie um email para magali@carreirapreta.com.br com seu CV anexado - enviar no assunto: Vaga Back-End
## Tempo médio de feedbacks
Costumamos enviar feedbacks em até 05 dias após cada processo.
E-mail para contato em caso de não haver resposta: ola@carreirapreta.com.br
#### Alocação
- Remoto
#### Regime
- CLT
- PJ
#### Nível
- Pleno
- Sênior
- Especialista
| test | back end developer carreira preta remoto back end developer vaga remota carreira preta contrata desenvolvedor back end para nossa cliente cubos descrição da vaga responsabilidades e atribuições participar ativamente nas etapas de ideação planejamento e desenvolvimento dos produtos dos nossos clientes participar de discussão das tecnologias que serão utilizadas nos projetos criar endpoints de apis que serão consumidas por desenvolvedores web android e ios seja amigo deles p compreender as ferramentas que usa em um contexto devops sendo capaz em situações problemáticas de desenvolver um raciocínio lógico consistente para investigar e resolver o problema disposição para ajudar e compartilhar conhecimento ser uma pessoa curiosa e estar disposta a se arriscar em novas tecnologias local remoto requisitos obrigatórios conhecimento em javascript experiência fazendo apis experiência com docker experiência com bancos de dados sql e ou nosql diferenciais leitura na língua inglesa conhecimento em typescript experiência com testes unitários e de integração experiência com graphql apresentação de portfólio github ou similares experiência com ci conhecimento em linguagem elixir benefícios vale refeição ou alimentação plano de saúde bradesco saúde plano odontológico bradesco saúde vale transporte auxílio home office day off aniversário licença casamento estendida licença maternidade estendida licença paternidade estendida gympass aulas semanais multidisciplinares licença pet auxílio compra de livros auxílio curso de inglês auxílio bem estar contratação clt ou pj a combinar como se candidatar por favor envie um email para magali carreirapreta com br com seu cv anexado enviar no assunto vaga back end tempo médio de feedbacks costumamos enviar feedbacks em até dias após cada processo e mail para contato em caso de não haver resposta ola carreirapreta com br alocação remoto regime clt pj nível pleno sênior especialista | 1 |
44,935 | 5,661,016,893 | IssuesEvent | 2017-04-10 16:21:09 | research-resource/research_resource | https://api.github.com/repos/research-resource/research_resource | closed | Changing saliva kit request copy | please-test priority-2 | Please put 'Thank you for requesting your saliva sample kit, you should receive it in the post shortly'
| 1.0 | Changing saliva kit request copy - Please put 'Thank you for requesting your saliva sample kit, you should receive it in the post shortly'
| test | changing saliva kit request copy please put thank you for requesting your saliva sample kit you should receive it in the post shortly | 1 |
608,926 | 18,851,427,442 | IssuesEvent | 2021-11-11 21:27:01 | watertap-org/watertap | https://api.github.com/repos/watertap-org/watertap | closed | Type Error on EDB local test | bug Priority:High | After installing a fresh proteuslib environment, I get the following error running pytest. This error seems localized to the EDB. No other tests seem to fail when running individually.
=========================================================================== test session starts ===========================================================================
platform win32 -- Python 3.8.10, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: C:\Users\1pi\projects\proteuslib, configfile: pytest.ini
plugins: cov-2.12.1
collected 109 items / 1 error / 108 selected
================================================================================= ERRORS ==================================================================================
_______________________________________________________ ERROR collecting proteuslib/tests/test_pure_water_pH_edb.py _______________________________________________________
test_pure_water_pH_edb.py:75: in <module>
class TestPureWaterEdb(TestPureWater):
test_pure_water_pH_edb.py:79: in TestPureWaterEdb
thermo_config = get_thermo_config(g_edb)
test_pure_water_pH_edb.py:51: in get_thermo_config
return base.idaes_config
E AttributeError: 'NoneType' object has no attribute 'idaes_config'
---------- coverage: platform win32, python 3.8.10-final-0 ----------- | 1.0 | Type Error on EDB local test - After installing a fresh proteuslib environment, I get the following error running pytest. This error seems localized to the EDB. No other tests seem to fail when running individually.
=========================================================================== test session starts ===========================================================================
platform win32 -- Python 3.8.10, pytest-6.2.4, py-1.10.0, pluggy-0.13.1
rootdir: C:\Users\1pi\projects\proteuslib, configfile: pytest.ini
plugins: cov-2.12.1
collected 109 items / 1 error / 108 selected
================================================================================= ERRORS ==================================================================================
_______________________________________________________ ERROR collecting proteuslib/tests/test_pure_water_pH_edb.py _______________________________________________________
test_pure_water_pH_edb.py:75: in <module>
class TestPureWaterEdb(TestPureWater):
test_pure_water_pH_edb.py:79: in TestPureWaterEdb
thermo_config = get_thermo_config(g_edb)
test_pure_water_pH_edb.py:51: in get_thermo_config
return base.idaes_config
E AttributeError: 'NoneType' object has no attribute 'idaes_config'
---------- coverage: platform win32, python 3.8.10-final-0 ----------- | non_test | type error on edb local test after installing a fresh proteuslib environment i get the following error running pytest this error seems localized to the edb no other tests seem to fail when running individually test session starts platform python pytest py pluggy rootdir c users projects proteuslib configfile pytest ini plugins cov collected items error selected errors error collecting proteuslib tests test pure water ph edb py test pure water ph edb py in class testpurewateredb testpurewater test pure water ph edb py in testpurewateredb thermo config get thermo config g edb test pure water ph edb py in get thermo config return base idaes config e attributeerror nonetype object has no attribute idaes config coverage platform python final | 0 |
274,123 | 20,826,282,394 | IssuesEvent | 2022-03-18 21:23:16 | hashgraph/guardian | https://api.github.com/repos/hashgraph/guardian | closed | Admin Panel - Clear all filters functionality | documentation enhancement UI/UX | ### Problem description
It would be nice to implement functionality where a user can clear filters/set filters to default.
### Requirements
Exact requirement for the product - what needs to be done?
### Definition of done
A user can clear applied filters in the Admin panel by clicking the 'Clear Filters' button.
### Acceptance criteria
When a user with access to the Admin panel navigates to the Admin panel,
Then a user sees clickable the 'Clear filters' button.
Then a user enters any values into the filters fields and clicks on the 'Clear filters' button.
And a user should be able to clear all values entered before.
| 1.0 | Admin Panel - Clear all filters functionality - ### Problem description
It would be nice to implement functionality where a user can clear filters/set filters to default.
### Requirements
Exact requirement for the product - what needs to be done?
### Definition of done
A user can clear applied filters in the Admin panel by clicking the 'Clear Filters' button.
### Acceptance criteria
When a user with access to the Admin panel navigates to the Admin panel,
Then a user sees clickable the 'Clear filters' button.
Then a user enters any values into the filters fields and clicks on the 'Clear filters' button.
And a user should be able to clear all values entered before.
| non_test | admin panel clear all filters functionality problem description it would be nice to implement functionality where a user can clear filters set filters to default requirements exact requirement for the product what needs to be done definition of done a user can clear applied filters in the admin panel by clicking the clear filters button acceptance criteria when a user with access to the admin panel navigates to the admin panel then a user sees clickable the clear filters button then a user enters any values into the filters fields and clicks on the clear filters button and a user should be able to clear all values entered before | 0 |
286,926 | 24,795,011,272 | IssuesEvent | 2022-10-24 16:29:20 | near/nearcore | https://api.github.com/repos/near/nearcore | opened | #7695 breaks bunch of NayDuck tests | A-testing P-high | For example
```
$ git checkout 73de01b93f4e7e2a182d194b69d7b0b142265e05
$ cargo test -pintegration-tests --features test_features,expensive_tests -- \
--exact --nocapture tests::client::chunks_management::chunks_recovered_from_others2
→ fails
$ git checkout @~
$ cargo test -pintegration-tests --features test_features,expensive_tests -- \
--exact --nocapture tests::client::chunks_management::chunks_recovered_from_others2
→ passes
``` | 1.0 | #7695 breaks bunch of NayDuck tests - For example
```
$ git checkout 73de01b93f4e7e2a182d194b69d7b0b142265e05
$ cargo test -pintegration-tests --features test_features,expensive_tests -- \
--exact --nocapture tests::client::chunks_management::chunks_recovered_from_others2
→ fails
$ git checkout @~
$ cargo test -pintegration-tests --features test_features,expensive_tests -- \
--exact --nocapture tests::client::chunks_management::chunks_recovered_from_others2
→ passes
``` | test | breaks bunch of nayduck tests for example git checkout cargo test pintegration tests features test features expensive tests exact nocapture tests client chunks management chunks recovered from → fails git checkout cargo test pintegration tests features test features expensive tests exact nocapture tests client chunks management chunks recovered from → passes | 1 |
229,518 | 7,575,302,690 | IssuesEvent | 2018-04-24 00:54:56 | Azure/acs-engine | https://api.github.com/repos/Azure/acs-engine | closed | Kubernetes Tracking - Attach/Detach Disk api calls do not correctly monitor status | kind/bug orchestrator/k8s priority/P1 upstream | **Is this an ISSUE or FEATURE REQUEST?** (choose one): Issue
Tracking issue for https://github.com/kubernetes/kubernetes/issues/47623 - Attach/Detach Disk api calls do not correctly monitor status | 1.0 | Kubernetes Tracking - Attach/Detach Disk api calls do not correctly monitor status - **Is this an ISSUE or FEATURE REQUEST?** (choose one): Issue
Tracking issue for https://github.com/kubernetes/kubernetes/issues/47623 - Attach/Detach Disk api calls do not correctly monitor status | non_test | kubernetes tracking attach detach disk api calls do not correctly monitor status is this an issue or feature request choose one issue tracking issue for attach detach disk api calls do not correctly monitor status | 0 |
271,486 | 20,676,339,370 | IssuesEvent | 2022-03-10 09:38:50 | SpenceKonde/DxCore | https://api.github.com/repos/SpenceKonde/DxCore | closed | Fix DA32 pinout diagram | Documentation | due to copy-paste from the DB-series where that isn't an I/O pin PD0 is not marked as a possible pin for a TCA PWM group; | 1.0 | Fix DA32 pinout diagram - due to copy-paste from the DB-series where that isn't an I/O pin PD0 is not marked as a possible pin for a TCA PWM group; | non_test | fix pinout diagram due to copy paste from the db series where that isn t an i o pin is not marked as a possible pin for a tca pwm group | 0 |
48,107 | 13,067,452,667 | IssuesEvent | 2020-07-31 00:30:07 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | closed | [fillratio] uninitialized value (Trac #1795) | Migrated from Trac combo reconstruction defect | found by static analyser http://software.icecube.wisc.edu/static_analysis/2016-07-26-030212-26135-1/report-844164.html#EndPath
Migrated from https://code.icecube.wisc.edu/ticket/1795
```json
{
"status": "closed",
"changetime": "2017-01-17T18:45:27",
"description": "found by static analyser http://software.icecube.wisc.edu/static_analysis/2016-07-26-030212-26135-1/report-844164.html#EndPath",
"reporter": "kjmeagher",
"cc": "",
"resolution": "invalid",
"_ts": "1484678727298194",
"component": "combo reconstruction",
"summary": "[fillratio] uninitialized value",
"priority": "normal",
"keywords": "",
"time": "2016-07-27T07:53:29",
"milestone": "Long-Term Future",
"owner": "mjl5147",
"type": "defect"
}
```
| 1.0 | [fillratio] uninitialized value (Trac #1795) - found by static analyser http://software.icecube.wisc.edu/static_analysis/2016-07-26-030212-26135-1/report-844164.html#EndPath
Migrated from https://code.icecube.wisc.edu/ticket/1795
```json
{
"status": "closed",
"changetime": "2017-01-17T18:45:27",
"description": "found by static analyser http://software.icecube.wisc.edu/static_analysis/2016-07-26-030212-26135-1/report-844164.html#EndPath",
"reporter": "kjmeagher",
"cc": "",
"resolution": "invalid",
"_ts": "1484678727298194",
"component": "combo reconstruction",
"summary": "[fillratio] uninitialized value",
"priority": "normal",
"keywords": "",
"time": "2016-07-27T07:53:29",
"milestone": "Long-Term Future",
"owner": "mjl5147",
"type": "defect"
}
```
| non_test | uninitialized value trac found by static analyser migrated from json status closed changetime description found by static analyser reporter kjmeagher cc resolution invalid ts component combo reconstruction summary uninitialized value priority normal keywords time milestone long term future owner type defect | 0 |
5,464 | 8,328,330,549 | IssuesEvent | 2018-09-27 00:11:00 | ArctosDB/new-collections | https://api.github.com/repos/ArctosDB/new-collections | closed | Ohio Wesleyan University - Notify Arctos Working Group | Application in process MOU draft needed | Forward questionnaire to [Arctos Working Group](arctos-working-group@googlegroups.com) and request volunteers for collection mentor.
AWG member can volunteer to act as primary contact, especially if they have similar collections or specific knowledge about a collection; can serve as ‘in kind support’ for collections to help offset costs | 1.0 | Ohio Wesleyan University - Notify Arctos Working Group - Forward questionnaire to [Arctos Working Group](arctos-working-group@googlegroups.com) and request volunteers for collection mentor.
AWG member can volunteer to act as primary contact, especially if they have similar collections or specific knowledge about a collection; can serve as ‘in kind support’ for collections to help offset costs | non_test | ohio wesleyan university notify arctos working group forward questionnaire to arctos working group googlegroups com and request volunteers for collection mentor awg member can volunteer to act as primary contact especially if they have similar collections or specific knowledge about a collection can serve as ‘in kind support’ for collections to help offset costs | 0 |
345,426 | 30,810,652,322 | IssuesEvent | 2023-08-01 10:09:03 | wazuh/wazuh | https://api.github.com/repos/wazuh/wazuh | closed | Release 4.5.0 - Alpha 1 - Packages tests | type/test tracking level/task type/release | ### Packages tests information
|||
| :-- | :-- |
| **Main release candidate issue** | #18058 |
| **Version** | 4.5.0 |
| **Release candidate** | Alpha 1 |
| **Tag** | https://github.com/wazuh/wazuh/tree/v4.5.0-alpha1 |
| **Previous packages metrics** | #17078 |
---
| Status | Test | Issue |
| :--: | :-- | :--: |
| :yellow_circle: | Installation | #18069 |
| :yellow_circle: | Upgrade | #18070 |
| :yellow_circle: | SELinux | #18072 |
| :yellow_circle: | Register | #18071 |
| :yellow_circle: | Service | #18073 |
| :yellow_circle: | Specific systems | #18074 |
| :yellow_circle: | Indexer/Dashboard | #18075 |
---
Status legend:
:black_circle: - Pending/In progress
:white_circle: - Skipped
:red_circle: - Rejected
:yellow_circle: - Ready to review
:green_circle: - Approved
---
## Auditor's validation
In order to close and proceed with the release or the next candidate version, the following auditors must give the green light to this RC.
- [ ] @davidjiglesias
- [ ] @teddytpc1
--- | 1.0 | Release 4.5.0 - Alpha 1 - Packages tests - ### Packages tests information
|||
| :-- | :-- |
| **Main release candidate issue** | #18058 |
| **Version** | 4.5.0 |
| **Release candidate** | Alpha 1 |
| **Tag** | https://github.com/wazuh/wazuh/tree/v4.5.0-alpha1 |
| **Previous packages metrics** | #17078 |
---
| Status | Test | Issue |
| :--: | :-- | :--: |
| :yellow_circle: | Installation | #18069 |
| :yellow_circle: | Upgrade | #18070 |
| :yellow_circle: | SELinux | #18072 |
| :yellow_circle: | Register | #18071 |
| :yellow_circle: | Service | #18073 |
| :yellow_circle: | Specific systems | #18074 |
| :yellow_circle: | Indexer/Dashboard | #18075 |
---
Status legend:
:black_circle: - Pending/In progress
:white_circle: - Skipped
:red_circle: - Rejected
:yellow_circle: - Ready to review
:green_circle: - Approved
---
## Auditor's validation
In order to close and proceed with the release or the next candidate version, the following auditors must give the green light to this RC.
- [ ] @davidjiglesias
- [ ] @teddytpc1
--- | test | release alpha packages tests packages tests information main release candidate issue version release candidate alpha tag previous packages metrics status test issue yellow circle installation yellow circle upgrade yellow circle selinux yellow circle register yellow circle service yellow circle specific systems yellow circle indexer dashboard status legend black circle pending in progress white circle skipped red circle rejected yellow circle ready to review green circle approved auditor s validation in order to close and proceed with the release or the next candidate version the following auditors must give the green light to this rc davidjiglesias | 1 |
57,683 | 14,189,483,388 | IssuesEvent | 2020-11-14 01:02:03 | idonthaveafifaaddiction/debug | https://api.github.com/repos/idonthaveafifaaddiction/debug | opened | CVE-2020-7769 (High) detected in nodemailer-2.7.2.tgz | security vulnerability | ## CVE-2020-7769 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nodemailer-2.7.2.tgz</b></p></summary>
<p>Easy as cake e-mail sending from your Node.js applications</p>
<p>Library home page: <a href="https://registry.npmjs.org/nodemailer/-/nodemailer-2.7.2.tgz">https://registry.npmjs.org/nodemailer/-/nodemailer-2.7.2.tgz</a></p>
<p>Path to dependency file: debug/package.json</p>
<p>Path to vulnerable library: debug/node_modules/nodemailer/package.json</p>
<p>
Dependency Hierarchy:
- karma-2.0.5.tgz (Root Library)
- log4js-2.11.0.tgz
- :x: **nodemailer-2.7.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package nodemailer before 6.4.16. Use of crafted recipient email addresses may result in arbitrary command flag injection in sendmail transport for sending mails.
<p>Publish Date: 2020-11-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7769>CVE-2020-7769</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7769</a></p>
<p>Release Date: 2020-11-12</p>
<p>Fix Resolution: v6.4.16</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"nodemailer","packageVersion":"2.7.2","isTransitiveDependency":true,"dependencyTree":"karma:2.0.5;log4js:2.11.0;nodemailer:2.7.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v6.4.16"}],"vulnerabilityIdentifier":"CVE-2020-7769","vulnerabilityDetails":"This affects the package nodemailer before 6.4.16. Use of crafted recipient email addresses may result in arbitrary command flag injection in sendmail transport for sending mails.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7769","cvss3Severity":"high","cvss3Score":"8.6","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-7769 (High) detected in nodemailer-2.7.2.tgz - ## CVE-2020-7769 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nodemailer-2.7.2.tgz</b></p></summary>
<p>Easy as cake e-mail sending from your Node.js applications</p>
<p>Library home page: <a href="https://registry.npmjs.org/nodemailer/-/nodemailer-2.7.2.tgz">https://registry.npmjs.org/nodemailer/-/nodemailer-2.7.2.tgz</a></p>
<p>Path to dependency file: debug/package.json</p>
<p>Path to vulnerable library: debug/node_modules/nodemailer/package.json</p>
<p>
Dependency Hierarchy:
- karma-2.0.5.tgz (Root Library)
- log4js-2.11.0.tgz
- :x: **nodemailer-2.7.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package nodemailer before 6.4.16. Use of crafted recipient email addresses may result in arbitrary command flag injection in sendmail transport for sending mails.
<p>Publish Date: 2020-11-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7769>CVE-2020-7769</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.6</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7769</a></p>
<p>Release Date: 2020-11-12</p>
<p>Fix Resolution: v6.4.16</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"nodemailer","packageVersion":"2.7.2","isTransitiveDependency":true,"dependencyTree":"karma:2.0.5;log4js:2.11.0;nodemailer:2.7.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v6.4.16"}],"vulnerabilityIdentifier":"CVE-2020-7769","vulnerabilityDetails":"This affects the package nodemailer before 6.4.16. Use of crafted recipient email addresses may result in arbitrary command flag injection in sendmail transport for sending mails.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7769","cvss3Severity":"high","cvss3Score":"8.6","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_test | cve high detected in nodemailer tgz cve high severity vulnerability vulnerable library nodemailer tgz easy as cake e mail sending from your node js applications library home page a href path to dependency file debug package json path to vulnerable library debug node modules nodemailer package json dependency hierarchy karma tgz root library tgz x nodemailer tgz vulnerable library vulnerability details this affects the package nodemailer before use of crafted recipient email addresses may result in arbitrary command flag injection in sendmail transport for sending mails publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails this 
affects the package nodemailer before use of crafted recipient email addresses may result in arbitrary command flag injection in sendmail transport for sending mails vulnerabilityurl | 0 |
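The `<REMEDIATE>` block above is machine-readable JSON wrapped in an HTML comment. A minimal sketch of pulling the vulnerability id and fix version out of such a block (Python stdlib only; the shortened sample below keeps the field names from the payload above):

```python
import json
import re

# A <REMEDIATE> block as it appears in the issue body: JSON wrapped in
# an HTML comment. Shortened sample with the same field names as above.
body = ('<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,'
        '"packages":[{"packageName":"nodemailer","packageVersion":"2.7.2",'
        '"minimumFixVersion":"v6.4.16"}],'
        '"vulnerabilityIdentifier":"CVE-2020-7769"}</REMEDIATE> -->')

def parse_remediate(text):
    """Pull the JSON out of a <REMEDIATE>...</REMEDIATE> comment, if present."""
    m = re.search(r"<REMEDIATE>(\{.*\})</REMEDIATE>", text, re.DOTALL)
    return json.loads(m.group(1)) if m else None

info = parse_remediate(body)
print(info["vulnerabilityIdentifier"])           # CVE-2020-7769
print(info["packages"][0]["minimumFixVersion"])  # v6.4.16
```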
49,392 | 13,453,388,861 | IssuesEvent | 2020-09-09 00:48:20 | nasifimtiazohi/openmrs-module-fhir-1.20.0 | https://api.github.com/repos/nasifimtiazohi/openmrs-module-fhir-1.20.0 | opened | CVE-2020-14060 (High) detected in jackson-databind-2.9.10.1.jar | security vulnerability | ## CVE-2020-14060 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.10.1.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to vulnerable library: /openmrs-module-fhir-1.20.0/omod/target/fhir-1.20.0/lib/jackson-databind-2.9.10.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.10.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nasifimtiazohi/openmrs-module-fhir-1.20.0/commit/af058c71e43795da6378a96dc75d1d9e6147f1a6">af058c71e43795da6378a96dc75d1d9e6147f1a6</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.5 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.xalan.lib.sql.JNDIConnectionPool (aka apache/drill).
<p>Publish Date: 2020-06-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-14060>CVE-2020-14060</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14060">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-14060</a></p>
<p>Release Date: 2020-06-14</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.10.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | non_test | 0
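The advisory above flags the bundled 2.9.10.1 against the range "2.x before 2.9.10.5" and suggests 2.10.0 as the fix. A minimal sketch of that range check, assuming plain dotted numeric versions (enough for the versions quoted in this report, though not a full Maven version comparator):

```python
# Decide whether a jackson-databind version falls inside the range the
# advisory names ("2.x before 2.9.10.5"). Dotted numeric versions are
# compared as integer tuples; qualifiers like "-rc1" are not handled.
def parse_version(v):
    return tuple(int(part) for part in v.split("."))

def is_vulnerable(version):
    t = parse_version(version)
    return t[0] == 2 and t < parse_version("2.9.10.5")

print(is_vulnerable("2.9.10.1"))  # True  (the library in this report)
print(is_vulnerable("2.10.0"))    # False (the suggested fix resolution)
```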
279,204 | 24,206,393,707 | IssuesEvent | 2022-09-25 09:34:35 | PlasmaLang/plasma | https://api.github.com/repos/PlasmaLang/plasma | closed | Sometimes the test script can't detect the end of the compiler errors section in test output. | skill: scripting meta: no-domain-knowledge meta: triaged status: accepted type: maintenance component: tests | Sometimes the test script can't detect the end of the compiler errors section in test output.
```
not ok 121 tests/build/dup_module_name_2 diff # exited with 1 expected 0
--- dup_module_name_2.exp 2022-09-24 12:43:12.933663044 +0000
+++ dup_module_name_2.outs 2022-09-24 12:44:04.853843131 +0000
@@ -1,3 +1,4 @@
dup_module_name_2.p:10: The module name from the source file 'DupModuleName2'
does not match the module name from the BUILD.plz file
'Dup_ModuleName2'
+[3/6] Compiling DupModuleName2
``` | 1.0 | test | 1
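The diff above shows a build progress line (`[3/6] Compiling DupModuleName2`) leaking into the captured compiler-errors section. A hypothetical sketch of the filtering the report suggests; this is not the project's actual test script, and the progress-line pattern is an assumption:

```python
import re

# Hypothetical illustration of the failure mode: a "[k/n] Verb ..."
# build progress line trails the compiler errors, so a naive capture of
# everything after the error marker includes it. Dropping lines that
# match the progress pattern recovers just the error section.
PROGRESS = re.compile(r"^\[\d+/\d+\] ")

output = (
    "dup_module_name_2.p:10: The module name from the source file 'DupModuleName2'\n"
    "does not match the module name from the BUILD.plz file\n"
    "'Dup_ModuleName2'\n"
    "[3/6] Compiling DupModuleName2"
)

def error_section(text):
    return [line for line in text.splitlines() if not PROGRESS.match(line)]

for line in error_section(output):
    print(line)
```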
11,908 | 3,237,545,046 | IssuesEvent | 2015-10-14 12:31:29 | MTG/freesound | https://api.github.com/repos/MTG/freesound | closed | Fix status update SQL | Bug High Testing _Sounds | In https://github.com/MTG/freesound/blob/master/sounds/models.py#L337 we perform immediate updates of sound status values so that other workers don't clobber status.
This SQL needs to be constructed using proper attribute positioning, not by manually constructing strings. | 1.0 | test | 1
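The fix the issue asks for is standard parameter binding. A minimal illustration using `sqlite3` as a stand-in (freesound's actual backend, and the `sounds_sound` table and column names here, are assumptions made for the example):

```python
import sqlite3

# Illustration of the point in the issue: pass values as bound
# parameters instead of splicing them into the SQL string. The table
# and column names are hypothetical.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sounds_sound (id INTEGER PRIMARY KEY, processing_state TEXT)")
conn.execute("INSERT INTO sounds_sound VALUES (1, 'PE')")

sound_id, new_state = 1, "OK"

# Bad: manual string construction (injection-prone, quoting bugs):
#   "UPDATE sounds_sound SET processing_state='%s' WHERE id=%s" % (new_state, sound_id)

# Good: placeholders; the driver handles quoting and typing.
conn.execute(
    "UPDATE sounds_sound SET processing_state = ? WHERE id = ?",
    (new_state, sound_id),
)
row = conn.execute("SELECT processing_state FROM sounds_sound WHERE id = 1").fetchone()
print(row[0])  # OK
```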
292,791 | 25,238,486,341 | IssuesEvent | 2022-11-15 04:23:48 | swsnu/swppfall2022-team9 | https://api.github.com/repos/swsnu/swppfall2022-team9 | opened | [TASK] View another user's profile | backend testing | ### Detail
- GET /api/profile/:user_id/
- Get profile of another user, if the current user has read permission (is onechon or twochon)
### Checklist
- [ ] Design
- [ ] Implementation
- [ ] Testing
- Unit Test | 1.0 | test | 1
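A framework-free sketch of the endpoint described above. The social-graph model and the onechon/twochon semantics (read here as direct friend and friend-of-a-friend) are assumptions made for the example; the real project presumably implements this as a Django view:

```python
# In-memory stand-ins for the project's user and friendship models.
friends = {
    "alice": {"bob"},
    "bob": {"alice", "carol"},
    "carol": {"bob"},
    "dave": set(),
}
profiles = {name: {"name": name} for name in friends}

def can_read(viewer, target):
    if target in friends.get(viewer, set()):        # onechon: direct friend
        return True
    return any(target in friends.get(mid, set())    # twochon: friend of a friend
               for mid in friends.get(viewer, set()))

def get_profile(viewer, user_id):
    """GET /api/profile/:user_id/ -> (200, profile) or (403, None)."""
    if viewer == user_id or can_read(viewer, user_id):
        return 200, profiles[user_id]
    return 403, None

print(get_profile("alice", "carol"))  # (200, {'name': 'carol'}) via bob
print(get_profile("alice", "dave"))   # (403, None)
```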
61,374 | 12,180,522,755 | IssuesEvent | 2020-04-28 12:36:56 | kwk/test-llvm-bz-import-5 | https://api.github.com/repos/kwk/test-llvm-bz-import-5 | closed | offsetof() inconsistent with gcc on ParamBlockRec from Carbon headers | BZ-BUG-STATUS: RESOLVED BZ-RESOLUTION: DUPLICATE clang/LLVM Codegen dummy import from bugzilla | This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=4465. | 1.0 | non_test | 0
219,478 | 7,342,791,972 | IssuesEvent | 2018-03-07 09:13:31 | CS2103JAN2018-W09-B3/main | https://api.github.com/repos/CS2103JAN2018-W09-B3/main | opened | As a user with a filled timetable I want to sync with cloud storage calenders | priority.high type.story | ... so that I can easily sync my timetable to my schedule | 1.0 | non_test | 0
508,198 | 14,692,690,973 | IssuesEvent | 2021-01-03 03:58:07 | stride3d/stride | https://api.github.com/repos/stride3d/stride | closed | [VR] Problems with steam vr 1.13.9 (mirror display related ?) | area-Graphics bug help wanted priority-high work-estimate-M | with the new steamvr update VR in stride no longer works and you get this error
[Game]: Error: Unexpected exception. SharpDX.SharpDXException: HRESULT: [0x80070057], Module: [General], ApiCode: [E_INVALIDARG/Invalid Arguments], Message: The parameter is incorrect.
at SharpDX.Result.CheckError()
at SharpDX.Direct3D11.Device.CreateShaderResourceView(Resource resourceRef, Nullable`1 descRef, ShaderResourceView sRViewOut)
at SharpDX.Direct3D11.ShaderResourceView..ctor(Device device, Resource resource, ShaderResourceViewDescription description)
at Stride.Graphics.Texture.GetShaderResourceView(ViewType viewType, Int32 arrayOrDepthSlice, Int32 mipIndex)
at Stride.Graphics.Texture.InitializeFromImpl(DataBox[] dataBoxes)
at Stride.Graphics.Texture.InitializeFrom(Texture parentTexture, TextureDescription description, TextureViewDescription viewDescription, DataBox[] textureDatas)
at Stride.Graphics.Texture.InitializeFrom(TextureDescription description, DataBox[] textureDatas)
at Stride.Graphics.Texture.InitializeFromImpl(Texture2D texture, Boolean isSrgb)
at Stride.Graphics.Texture.InitializeFromImpl(ShaderResourceView srv)
at Stride.VirtualReality.OpenVR.GetMirrorTexture(GraphicsDevice device, Int32 eyeIndex)
at Stride.VirtualReality.OpenVRHmd.Enable(GraphicsDevice device, GraphicsDeviceManager graphicsDeviceManager, Boolean requireMirror, Int32 mirrorWidth, Int32 mirrorHeight)
at Stride.VirtualReality.VRDeviceSystem.OnEnabledChanged(Object sender, EventArgs eventArgs)
at Stride.Games.GameSystemBase.OnEnabledChanged(EventArgs e)
at Stride.Games.GameSystemBase.set_Enabled(Boolean value)
at Stride.Rendering.Compositing.ForwardRenderer.InitializeCore()
at Stride.Rendering.RendererCoreBase.Initialize(RenderContext context)
at Stride.Rendering.RendererCoreBase.EnsureContext(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererBase.Collect(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererCollection.CollectCore(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererBase.Collect(RenderContext context)
at Stride.Rendering.Compositing.SceneCameraRenderer.CollectInner(RenderContext renderContext)
at Stride.Rendering.Compositing.SceneCameraRenderer.CollectCore(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererBase.Collect(RenderContext context)
at Stride.Rendering.Compositing.GraphicsCompositor.DrawCore(RenderDrawContext context)
at Stride.Rendering.RendererBase.Draw(RenderDrawContext context)
at Stride.Engine.SceneSystem.Draw(GameTime gameTime)
at Stride.Games.GameSystemCollection.Draw(GameTime gameTime)
at Stride.Games.GameBase.Draw(GameTime gameTime)
at Stride.Games.GameBase.RawTick(TimeSpan elapsedTimePerUpdate, Int32 updateCount, Single drawInterpolationFactor, Boolean drawFrame)
at Stride.Games.GameBase.RawTickProducer()
SharpDX.SharpDXException: HRESULT: [0x80070057], Module: [General], ApiCode: [E_INVALIDARG/Invalid Arguments], Message: The parameter is incorrect.
at SharpDX.Result.CheckError()
at SharpDX.Direct3D11.Device.CreateShaderResourceView(Resource resourceRef, Nullable`1 descRef, ShaderResourceView sRViewOut)
at SharpDX.Direct3D11.ShaderResourceView..ctor(Device device, Resource resource, ShaderResourceViewDescription description)
at Stride.Graphics.Texture.GetShaderResourceView(ViewType viewType, Int32 arrayOrDepthSlice, Int32 mipIndex)
at Stride.Graphics.Texture.InitializeFromImpl(DataBox[] dataBoxes)
at Stride.Graphics.Texture.InitializeFrom(Texture parentTexture, TextureDescription description, TextureViewDescription viewDescription, DataBox[] textureDatas)
at Stride.Graphics.Texture.InitializeFrom(TextureDescription description, DataBox[] textureDatas)
at Stride.Graphics.Texture.InitializeFromImpl(Texture2D texture, Boolean isSrgb)
at Stride.Graphics.Texture.InitializeFromImpl(ShaderResourceView srv)
at Stride.VirtualReality.OpenVR.GetMirrorTexture(GraphicsDevice device, Int32 eyeIndex)
at Stride.VirtualReality.OpenVRHmd.Enable(GraphicsDevice device, GraphicsDeviceManager graphicsDeviceManager, Boolean requireMirror, Int32 mirrorWidth, Int32 mirrorHeight)
at Stride.VirtualReality.VRDeviceSystem.OnEnabledChanged(Object sender, EventArgs eventArgs)
at Stride.Games.GameSystemBase.OnEnabledChanged(EventArgs e)
at Stride.Games.GameSystemBase.set_Enabled(Boolean value)
at Stride.Rendering.Compositing.ForwardRenderer.InitializeCore()
at Stride.Rendering.RendererCoreBase.Initialize(RenderContext context)
at Stride.Rendering.RendererCoreBase.EnsureContext(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererBase.Collect(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererCollection.CollectCore(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererBase.Collect(RenderContext context)
at Stride.Rendering.Compositing.SceneCameraRenderer.CollectInner(RenderContext renderContext)
at Stride.Rendering.Compositing.SceneCameraRenderer.CollectCore(RenderContext context)
at Stride.Rendering.Compositing.SceneRendererBase.Collect(RenderContext context)
at Stride.Rendering.Compositing.GraphicsCompositor.DrawCore(RenderDrawContext context)
at Stride.Rendering.RendererBase.Draw(RenderDrawContext context)
at Stride.Engine.SceneSystem.Draw(GameTime gameTime)
at Stride.Games.GameSystemCollection.Draw(GameTime gameTime)
at Stride.Games.GameBase.Draw(GameTime gameTime)
at Stride.Games.GameBase.RawTick(TimeSpan elapsedTimePerUpdate, Int32 updateCount, Single drawInterpolationFactor, Boolean drawFrame)
at Stride.Games.GameBase.RawTickProducer() | 1.0 | non_test | 0
26,343 | 11,297,130,951 | IssuesEvent | 2020-01-17 04:43:07 | drakeg/drakeweb.org | https://api.github.com/repos/drakeg/drakeweb.org | opened | WS-2015-0024 (High) detected in uglify-js-1.3.3.tgz | security vulnerability | ## WS-2015-0024 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>uglify-js-1.3.3.tgz</b></p></summary>
<p>JavaScript parser and compressor/beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-1.3.3.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-1.3.3.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/drakeweb.org/wp-content/themes/decode/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/drakeweb.org/wp-content/themes/decode/node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- grunt-modernizr-0.5.2.tgz (Root Library)
- :x: **uglify-js-1.3.3.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/drakeg/drakeweb.org/commit/3759e4bc84151f770a6f4cc54da1c6232f666a6e">3759e4bc84151f770a6f4cc54da1c6232f666a6e</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
UglifyJS versions 2.4.23 and earlier are affected by a vulnerability which allows a specially crafted Javascript file to have altered functionality after minification.
<p>Publish Date: 2015-08-24
<p>URL: <a href=https://github.com/mishoo/UglifyJS2/issues/751>WS-2015-0024</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/mishoo/UglifyJS2/commit/905b6011784ca60d41919ac1a499962b7c1d4b02">https://github.com/mishoo/UglifyJS2/commit/905b6011784ca60d41919ac1a499962b7c1d4b02</a></p>
<p>Release Date: 2017-01-31</p>
<p>Fix Resolution: v2.4.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2015-0024 (High) detected in uglify-js-1.3.3.tgz
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | ws high detected in uglify js tgz ws high severity vulnerability vulnerable library uglify js tgz javascript parser and compressor beautifier toolkit library home page a href path to dependency file tmp ws scm drakeweb org wp content themes decode package json path to vulnerable library tmp ws scm drakeweb org wp content themes decode node modules uglify js package json dependency hierarchy grunt modernizr tgz root library x uglify js tgz vulnerable library found in head commit a href vulnerability details uglifyjs versions and earlier are affected by a vulnerability which allows a specially crafted javascript file to have altered functionality after minification publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
139,621 | 20,920,021,828 | IssuesEvent | 2022-03-24 16:32:11 | Notiooo/PawTravel | https://api.github.com/repos/Notiooo/PawTravel | opened | Detailed display of the offer | design | Please use the materials from pull #7 to keep the designs consistent. Work on this design should be done on the **_dev/issue11_** branch. Please base your work on the [documentation](https://docs.google.com/document/d/1LjV2i6rANkNiA21UoczNkxFHLk3vx4O0/edit?usp=sharing&ouid=100136228169225905054&rtpof=true&sd=true) provided. | 1.0 | Detailed display of the offer - Please use the materials from pull #7 to keep the designs consistent. Work on this design should be done on the **_dev/issue11_** branch. Please base your work on the [documentation](https://docs.google.com/document/d/1LjV2i6rANkNiA21UoczNkxFHLk3vx4O0/edit?usp=sharing&ouid=100136228169225905054&rtpof=true&sd=true) provided. | non_test | detailed display of the offer please use the materials from pull to keep the designs consistent work on this design should be done on the dev branch please base your work on the provided | 0 |
264,876 | 23,144,617,272 | IssuesEvent | 2022-07-28 22:28:47 | sasjs/cli | https://api.github.com/repos/sasjs/cli | closed | sasjs test should return non-zero code when result contains any FAILures | sasjs test | To assist with pipeline development, `sasjs test` should return a non-zero return code by default if the test results are not 100% PASSes.
To enable tests to run regardless, we should enable a new, optional flag (`--ignoreFail`). When this is in place, the `sasjs test` will return normally even if there are failures (like it does now).
- [ ] The documentation (both the --help and the cli.sasjs.io website) should be updated with this new flag
- [ ] We should have tests, both with and without the new flag (with / without failing tests)
Regardless of the flag, the tests should always run to the end. The check for failures is made after running ALL the tests.
Linked issue: https://github.com/sasjs/cli/issues/1101 | 1.0 | sasjs test should return non-zero code when result contains any FAILures
Linked issue: https://github.com/sasjs/cli/issues/1101 | test | sasjs test should return non zero code when result contains any failures to assist with pipeline development sasjs test should return a non zero return code by default if the test results are not passes to enable tests to run regardless we should enable a new optional flag ignorefail when this is in place the sasjs test will return normally even if there are failures like it does now the documentation both the help and the cli sasjs io website should be updated with this new flag we should have tests both with and without the new flag with without failing tests regardless of the flag the tests should always run to the end the check for failures is made after running all the tests linked issue | 1 |
99,974 | 16,479,852,777 | IssuesEvent | 2021-05-24 10:09:47 | anyulled/ReactWorkshop | https://api.github.com/repos/anyulled/ReactWorkshop | opened | CVE-2021-23386 (High) detected in dns-packet-1.3.1.tgz | security vulnerability | ## CVE-2021-23386 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dns-packet-1.3.1.tgz</b></p></summary>
<p>An abstract-encoding compliant module for encoding / decoding DNS packets</p>
<p>Library home page: <a href="https://registry.npmjs.org/dns-packet/-/dns-packet-1.3.1.tgz">https://registry.npmjs.org/dns-packet/-/dns-packet-1.3.1.tgz</a></p>
<p>Path to dependency file: ReactWorkshop/package.json</p>
<p>Path to vulnerable library: ReactWorkshop/node_modules/dns-packet/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-4.0.3.tgz (Root Library)
- webpack-dev-server-3.11.1.tgz
- bonjour-3.5.0.tgz
- multicast-dns-6.2.3.tgz
- :x: **dns-packet-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/anyulled/ReactWorkshop/commit/451bfef705a5284b8a85c2eb2b512a3e27b29ebc">451bfef705a5284b8a85c2eb2b512a3e27b29ebc</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package dns-packet before 5.2.2. It creates buffers with allocUnsafe and does not always fill them before forming network packets. This can expose internal application memory over unencrypted network when querying crafted invalid domain names.
<p>Publish Date: 2021-05-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23386>CVE-2021-23386</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23386">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23386</a></p>
<p>Release Date: 2021-05-20</p>
<p>Fix Resolution: dns-packet - 5.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-23386 (High) detected in dns-packet-1.3.1.tgz
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve high detected in dns packet tgz cve high severity vulnerability vulnerable library dns packet tgz an abstract encoding compliant module for encoding decoding dns packets library home page a href path to dependency file reactworkshop package json path to vulnerable library reactworkshop node modules dns packet package json dependency hierarchy react scripts tgz root library webpack dev server tgz bonjour tgz multicast dns tgz x dns packet tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package dns packet before it creates buffers with allocunsafe and does not always fill them before forming network packets this can expose internal application memory over unencrypted network when querying crafted invalid domain names publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope changed impact metrics confidentiality impact high integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution dns packet step up your open source security game with whitesource | 0 |
181,477 | 14,021,734,981 | IssuesEvent | 2020-10-29 21:49:15 | rapidsai/cuml | https://api.github.com/repos/rapidsai/cuml | reopened | [TEST] Speed up test_linear_model, test_kmeans, and test_incremental PCA | tests | Each of these is well over a minute on unit tests for now. Probably all have similar issues of being over-parametrized.
Related to #3026 | 1.0 | [TEST] Speed up test_linear_model, test_kmeans, and test_incremental PCA
Related to #3026 | test | speed up test linear model test kmeans and test incremental pca each of these is well over a minute on unit tests for now probably all have similar issues of being over parametrized related to | 1 |
110,099 | 9,430,491,105 | IssuesEvent | 2019-04-12 09:09:34 | presscustomizr/customizr | https://api.github.com/repos/presscustomizr/customizr | opened | mobile menu => javascript tap / click event possible issue | Needs tests question | https://secure.helpscout.net/conversation/825871704/216938?folderId=610151
> Here is the fix for another theme that was having similar issues. Basically, the JavaScript is listening for both a tap and click - unless the default is prevented, both actions fire on touch screens. https://wordpress.org/support/topic/mobile-menu-issues-on-pages-with-draw-attention/
>
> Wordpress 5.1.1-en. CustomizrVersion: 4.1.37. Draw Attention Version 1.8.9 | By N Squared. Here is the site I am referencing, and the Draw Attention plugin is being used on the "Get Involved" menu page. http://pranichealingrichmond.com/ | 1.0 | mobile menu => javascript tap / click event possible issue
> Wordpress 5.1.1-en. CustomizrVersion: 4.1.37. Draw Attention Version 1.8.9 | By N Squared. Here is the site I am referencing, and the Draw Attention plugin is being used on the "Get Involved" menu page. http://pranichealingrichmond.com/ | test | mobile menu javascript tap click event possible issue here is the fix for another theme that was having similar issues basically the javascript is listening for both a tap and click unless the default is prevented both actions fire on touch screens wordpress en customizrversion draw attention version by n squared here is the site i am referencing and the draw attention plugin is being used on the get involved menu page | 1 |
8,809 | 11,908,300,077 | IssuesEvent | 2020-03-31 00:33:13 | qgis/QGIS | https://api.github.com/repos/qgis/QGIS | closed | Add support for undo/redo actions, such as deleting an object in modeler | Feature Request Processing | Author Name: **Magnus Nilsson** (Magnus Nilsson)
Original Redmine Issue: [5471](https://issues.qgis.org/issues/5471)
Redmine category:processing/modeller
Assignee: Victor Olaya
---
Add support for undo/redo actions, such as deleting an object in modeler
---
Related issue(s): #24172 (duplicates)
Redmine related issue(s): [16262](https://issues.qgis.org/issues/16262)
---
| 1.0 | Add support for undo/redo actions, such as deleting an object in modeler
| non_test | add support for undo redo actions such as deleting an object in modeler author name magnus nilsson magnus nilsson original redmine issue redmine category processing modeller assignee victor olaya add support for undo redo actions such as deleting an object in modeler related issue s duplicates redmine related issue s | 0 |
344,579 | 30,751,740,089 | IssuesEvent | 2023-07-28 19:58:24 | saltstack/salt | https://api.github.com/repos/saltstack/salt | opened | [Increase Test Coverage] Batch 136 | Tests | Increase the code coverage percent on the following files to at least 80%.
Please be aware that currently the percentage might be inaccurate if the module uses salt due to #64696
File | Percent
salt/utils/mount.py 39
salt/utils/pkg/deb.py 75
salt/utils/profile.py 24
salt/utils/psutil_compat.py 5
salt/utils/ssh.py 27
| 1.0 | [Increase Test Coverage] Batch 136
| test | batch increase the code coverage percent on the following files to at least please be aware that currently the percentage might be inaccurate if the module uses salt due to file percent salt utils mount py salt utils pkg deb py salt utils profile py salt utils psutil compat py salt utils ssh py | 1 |
14,698 | 3,418,720,925 | IssuesEvent | 2015-12-08 04:22:15 | dotnet/wcf | https://api.github.com/repos/dotnet/wcf | closed | Create test for PR #545 - implement SecurityBindingElement.IsSetKeyDerivation | test bug | A test needs to be written for the commit made; this was merged without a test for the time being in order to unblock customer testing. | 1.0 | Create test for PR #545 - implement SecurityBindingElement.IsSetKeyDerivation - A test needs to be written for the commit made; this was merged without a test for the time being in order to unblock customer testing. | test | create test for pr implement securitybindingelement issetkeyderivation a test needs to be written for the commit made this was merged without a test for the time being in order to unblock customer testing | 1 |
131,117 | 10,681,092,504 | IssuesEvent | 2019-10-21 23:23:20 | microsoft/react-native-windows | https://api.github.com/repos/microsoft/react-native-windows | opened | Create native module to support Accessiblity testing. | Area: Tests Proposal vnext | WebDriver doesn't implement the function of accessible function like Invoke, Expand. We can workaround the problem by native module. | 1.0 | Create native module to support Accessiblity testing. - WebDriver doesn't implement the function of accessible function like Invoke, Expand. We can workaround the problem by native module. | test | create native module to support accessiblity testing webdriver doesn t implement the function of accessible function like invoke expand we can workaround the problem by native module | 1 |
156,035 | 12,291,974,006 | IssuesEvent | 2020-05-10 12:39:04 | kotest/kotest | https://api.github.com/repos/kotest/kotest | closed | Make the arb.choice function take derived classes | Good First Issue enhancement property-testing | I am using the new Arb.choice function (Kotest version 4.0.5) and when I use derived classes I need to cast them. For example
```
Arb.choice(
Arb.constant(RuntimeException(message)) as Arb<Exception>,
Arb.constant(Exception(message)),
Arb.constant(IllegalArgumentException(message)) as Arb<Exception>,
Arb.constant(IllegalStateException(message)) as Arb<Exception>
)
```
I would like to know if it is possible to declare their types internally as anything that derive from T so I won't need to cast them.
Thank you. | 1.0 | Make the arb.choice function take derived classes
Thank you. | test | make the arb choice function take derived classes i am using the new arb choice function kotest version and when i use derived classes i need to cast them for example arb choice arb constant runtimeexception message as arb arb constant exception message arb constant illegalargumentexception message as arb arb constant illegalstateexception message as arb i would like to know if it is possible to declare their types internally as anything that derive from t so i won t need to cast them thank you | 1 |
128,544 | 10,542,318,738 | IssuesEvent | 2019-10-02 12:56:50 | OpenArchive/openarchive-android | https://api.github.com/repos/OpenArchive/openarchive-android | closed | On Private Server screen Server Name field is not getting displayed | PLEASE TEST | When we create new private server "Server Name" field is not getting displayed. Only server URL, Password and Login fields are getting displayed.
Note: Server Name field is getting displayed on iOS devices.
Tested on Samsung Galaxy A70 Android version 9.

| 1.0 | On Private Server screen Server Name field is not getting displayed
| test | on private server screen server name field is not getting displayed when we create new private server server name field is not getting displayed only server url password and login fields are getting displayed note server name field is getting displayed on ios devices tested on samsung galaxy android version | 1 |
563,594 | 16,701,504,227 | IssuesEvent | 2021-06-09 03:33:10 | rohan-kulkarni-25/Learn-GITHUB | https://api.github.com/repos/rohan-kulkarni-25/Learn-GITHUB | closed | Learner Addition | ⭐ goal: addition 🟧 priority: high | Enter Details Below !
Github Username :- AtrikGit6174
Your Name :- Atrik Ray
Thanks !!!
| 1.0 | Learner Addition
| non_test | learner addition enter details below github username your name atrik ray thanks | 0 |
316,450 | 27,165,430,092 | IssuesEvent | 2023-02-17 15:02:43 | rancher/dashboard | https://api.github.com/repos/rancher/dashboard | closed | PSA: Update PSA default | [zube]: To Test QA/XS kind/enhancement team/area2 size/3 area/psa | **Changes required:**
The PSP default 'RKE Default' should be labeled 'Default - RKE2 Embedded'.
The PSA default is currently labeled '(None)' - this is fine for k8s <1.25 (because in k8s <1.25 RKE2 does not automatically apply any PSA).
When the version is >= 1.25, instead of "(None)" we should have "Default - RKE2 Embedded" (which "maps" to empty PSACT on Cluster YAML). Note that this does _not_ apply to k3s as it doesn't automatically apply PSA.
If the user changes the CIS Worker Profile to anything other than "None" **on k8s >= 1.25**, then we need to reset the PSA to "rancher-restricted" (it's guaranteed to exist) and disable the control to prevent the user changing it. We should show a banner or somehow indicate that we've done this.
In some cases, the user may create a PSA Configuration Template (PSACT) that is compatible with the "rancher-restricted" - they may want to use this with CIS Worker Profile set to something other than None - I suggest a checkbox 'Allow PSA Default Template to be overridden when using a CIS Profile". Checking this would enable the PSA template dropdown and allow them to select their own.
We should show a banner if they check this that they understand that they know what they are doing.
Note: The Rancher Default is different to the RKE2 Default, so we can't use their docs - Rancher should provide docs that we can link to when done. | 1.0 | PSA: Update PSA default
Note: The Rancher Default is different to the RKE2 Default, so we can't use their docs - Rancher should provide docs that we can link to when done. | test | psa update psa default changes required the psp default rke default should be labeled default embedded the psa default is currently labeled none this is fine for because in does not automatically apply any psa when the version is instead of none we should have default embedded which maps to empty psact on cluster yaml note that this does not apply to as it doesn t automatically apply psa if the user changes the cis worker profile to anything other than none on then we need to reset the psa to rancher restricted it s guaranteed to exist and disable the control to prevent the user changing it we should show a banner or somehow indicate that we ve done this in some cases the user may create a psa configuration template psact that is compatible with the rancher restricted they may want to use this with cis worker profile set to something other than none i suggest a checkbox allow psa default template to be overridden when using a cis profile checking this would enable the psa template dropdown and allow them to select their own we should show a banner if they check this that they understand that they know what they are doing note the rancher default is different to the default so we can t use their docs rancher should provide docs that we can link to when done | 1 |
746,585 | 26,036,587,704 | IssuesEvent | 2022-12-22 05:59:37 | magento/magento2 | https://api.github.com/repos/magento/magento2 | closed | Cannot empty Customer`s Custom Attribute (Text Field) after Save it with any Value | Issue: Confirmed Triage: Dev.Experience Reproduced on 2.4.x Progress: PR in progress Priority: P2 Component: CustomerAttributes Area: Framework Reported on 2.4.5 | ### Preconditions and environment
- Magento version: Latest version, Magento Open Source 2.4.5
- you just need to install Magento Open Source Version 2.4.5
- Create Customer Attribute in any way you want
### Steps to reproduce
1. Install Magento Open Source Version 2.4.5 by composer (https://experienceleague.adobe.com/docs/commerce-operations/installation-guide/composer.html)
2. Create Custom Text Field in any way, for my example I used an article that already exists on Adobe Magento website [https://developer.adobe.com/commerce/php/tutorials/admin/custom-text-field-attribute/](https://developer.adobe.com/commerce/php/tutorials/admin/custom-text-field-attribute/)
3. Create a Customer from the backend
4. fill in the attribute mentioned in the article with any value (EX: "External ID" = "Test")
5. try to delete the value
### Expected result
the value of this attribute [External ID] should be empty
### Actual result
Magento ignores the empty value and keeps saving the old value
### Additional information
Nothing more, for easy use I have attached the extension which creates the attribute
[ExampleCorp.zip](https://github.com/magento/magento2/files/9590886/ExampleCorp.zip)

this function that causes the issue "**populateWithOrigData**", here is the place of it
**Magento\Customer\Model\ResourceModel\CustomerRepository** line 306
### Release note
_No response_
### Triage and priority
- [ ] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._
- [ ] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._ | 1.0 | Cannot empty Customer`s Custom Attribute (Text Field) after Save it with any Value
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._ | non_test | cannot empty customer s custom attribute text field after save it with any value preconditions and environment magento version latest version magento open source you just need to install magento open source version create customer attribute in any way you want steps to reproduce install magento open source version by composer create custom text field in any way for my example i used an article that already exists on adobe magento website create a customer from the backend fill in the attribute mentioned in the article with any value ex external id test try to delete the value expected result the value of this attribute should be empty actual result magento ignores the empty value and keeps saving the old value additional information nothing more for easy use i have attached the extension which creates the attribute this function that causes the issue populatewithorigdata here is the place of it magento customer model resourcemodel customerrepository line release note no response triage and priority severity affects critical data or functionality and leaves users without workaround severity affects critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and does not force users to employ a workaround severity affects aesthetics professional look and feel “quality” or “usability” | 0 |
255,887 | 21,964,981,326 | IssuesEvent | 2022-05-24 19:18:57 | pulp/pulp_file | https://api.github.com/repos/pulp/pulp_file | closed | Test - sync does not report non-fatal errors properly | Tests Priority Finished? | Author: kersom (kersom)
Redmine Issue: 5467, https://pulp.plan.io/issues/5467
---
If you sync a file repository where one of the files is missing, it seems that the repository syncs as much as it can (as expected), but it's reported as a fatal error, with a state of 'failed'.
Steps to reproduce:
1\) create a file repository where one of the files is missing
2\) create a file remote and repository and sync them
Actual task status (apologies its been yaml-fied):
~~~
- _href: "/pulp/api/v3/tasks/b6f9b619-c174-4e43-b546-0bbefdfb11e7/"
_created: '2019-08-15T15:21:37.058+00:00'
state: failed
name: pulp_file.app.tasks.synchronizing.synchronize
started_at: '2019-08-15T15:21:37.177+00:00'
finished_at: '2019-08-15T15:21:37.382+00:00'
non_fatal_errors: "[]"
error:
code: ''
description: 404, message='Not Found'
traceback: |2
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/rq/worker.py", line 822, in perform_job
rv = job.perform()
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/rq/job.py", line 605, in perform
self._result = self._execute()
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/rq/job.py", line 611, in _execute
return self.func(*self.args, **self.kwargs)
File "/usr/local/lib/pulp/src/pulp-file/pulp_file/app/tasks/synchronizing.py", line 45, in synchronize
dv.create()
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/declarative_version.py", line 169, in create
loop.run_until_complete(pipeline)
File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
return future.result()
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/api.py", line 209, in create_pipeline
await asyncio.gather(*futures)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/api.py", line 43, in __call__
await self.run()
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/artifact_stages.py", line 132, in run
pb.done += task.result() # download_count
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/artifact_stages.py", line 155, in _handle_content_unit
await asyncio.gather(*downloaders_for_content)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/models.py", line 78, in download
download_result = await downloader.run(extra_data=self.extra_data)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/download/base.py", line 212, in run
return await self._run(extra_data=extra_data)
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/backoff/_async.py", line 131, in retry
ret = await target(*args, **kwargs)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/download/http.py", line 183, in _run
response.raise_for_status()
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/aiohttp/client_reqrep.py", line 942, in raise_for_status
headers=self.headers)
worker: "/pulp/api/v3/workers/df7e0085-b0dd-4073-b74d-9ab78ad27a03/"
spawned_tasks: []
progress_reports:
- message: Downloading Metadata
state: completed
total: 1
done: 1
- message: Parsing Metadata Lines
state: completed
total: 2
done: 2
- message: Downloading Artifacts
state: failed
done: 0
- message: Associating Content
state: canceled
done: 0
created_resources: []
reserved_resources_record: []
create_version: true
poll_attempts:
total: 1
failed: 1
~~~
I'd expect this error to be in the 'non-fatal' errors attribute, and the state to not be 'failed'
| 1.0 | Test - sync does not report non-fatal errors properly - Author: kersom (kersom)
Redmine Issue: 5467, https://pulp.plan.io/issues/5467
---
If you sync a file repository where one of the files is missing, it seems that the repository syncs as much as it can (as expected), but it's reported as a fatal error, with a state of 'failed'.
Steps to reproduce:
1\) create a file repository where one of the files is missing
2\) create a file remote and repository and sync them
Actual task status (apologies its been yaml-fied):
~~~
- _href: "/pulp/api/v3/tasks/b6f9b619-c174-4e43-b546-0bbefdfb11e7/"
_created: '2019-08-15T15:21:37.058+00:00'
state: failed
name: pulp_file.app.tasks.synchronizing.synchronize
started_at: '2019-08-15T15:21:37.177+00:00'
finished_at: '2019-08-15T15:21:37.382+00:00'
non_fatal_errors: "[]"
error:
code: ''
description: 404, message='Not Found'
traceback: |2
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/rq/worker.py", line 822, in perform_job
rv = job.perform()
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/rq/job.py", line 605, in perform
self._result = self._execute()
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/rq/job.py", line 611, in _execute
return self.func(*self.args, **self.kwargs)
File "/usr/local/lib/pulp/src/pulp-file/pulp_file/app/tasks/synchronizing.py", line 45, in synchronize
dv.create()
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/declarative_version.py", line 169, in create
loop.run_until_complete(pipeline)
File "/usr/lib64/python3.6/asyncio/base_events.py", line 484, in run_until_complete
return future.result()
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/api.py", line 209, in create_pipeline
await asyncio.gather(*futures)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/api.py", line 43, in __call__
await self.run()
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/artifact_stages.py", line 132, in run
pb.done += task.result() # download_count
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/artifact_stages.py", line 155, in _handle_content_unit
await asyncio.gather(*downloaders_for_content)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/stages/models.py", line 78, in download
download_result = await downloader.run(extra_data=self.extra_data)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/download/base.py", line 212, in run
return await self._run(extra_data=extra_data)
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/backoff/_async.py", line 131, in retry
ret = await target(*args, **kwargs)
File "/usr/local/lib/pulp/src/pulpcore-plugin/pulpcore/plugin/download/http.py", line 183, in _run
response.raise_for_status()
File "/usr/local/lib/pulp/lib64/python3.6/site-packages/aiohttp/client_reqrep.py", line 942, in raise_for_status
headers=self.headers)
worker: "/pulp/api/v3/workers/df7e0085-b0dd-4073-b74d-9ab78ad27a03/"
spawned_tasks: []
progress_reports:
- message: Downloading Metadata
state: completed
total: 1
done: 1
- message: Parsing Metadata Lines
state: completed
total: 2
done: 2
- message: Downloading Artifacts
state: failed
done: 0
- message: Associating Content
state: canceled
done: 0
created_resources: []
reserved_resources_record: []
create_version: true
poll_attempts:
total: 1
failed: 1
~~~
I'd expect this error to be in the 'non-fatal' errors attribute, and the state to not be 'failed'
| test | test sync does not report non fatal errors properly author kersom kersom redmine issue if you sync a file repository where one of the files is missing it seems that the repository syncs as much as it can as expected but its reported as a fatal error with a state of failed steps to reproduce create a file repository where one of the files is missing create a file remote and repository and sync them actual task status apologies its been yaml fied href pulp api tasks created state failed name pulp file app tasks synchronizing synchronize started at finished at non fatal errors error code description message not found traceback file usr local lib pulp site packages rq worker py line in perform job rv job perform file usr local lib pulp site packages rq job py line in perform self result self execute file usr local lib pulp site packages rq job py line in execute return self func self args self kwargs file usr local lib pulp src pulp file pulp file app tasks synchronizing py line in synchronize dv create file usr local lib pulp src pulpcore plugin pulpcore plugin stages declarative version py line in create loop run until complete pipeline file usr asyncio base events py line in run until complete return future result file usr local lib pulp src pulpcore plugin pulpcore plugin stages api py line in create pipeline await asyncio gather futures file usr local lib pulp src pulpcore plugin pulpcore plugin stages api py line in call await self run file usr local lib pulp src pulpcore plugin pulpcore plugin stages artifact stages py line in run pb done task result download count file usr local lib pulp src pulpcore plugin pulpcore plugin stages artifact stages py line in handle content unit await asyncio gather downloaders for content file usr local lib pulp src pulpcore plugin pulpcore plugin stages models py line in download download result await downloader run extra data self extra data file usr local lib pulp src pulpcore plugin pulpcore plugin download base py 
line in run return await self run extra data extra data file usr local lib pulp site packages backoff async py line in retry ret await target args kwargs file usr local lib pulp src pulpcore plugin pulpcore plugin download http py line in run response raise for status file usr local lib pulp site packages aiohttp client reqrep py line in raise for status headers self headers worker pulp api workers spawned tasks progress reports message downloading metadata state completed total done message parsing metadata lines state completed total done message downloading artifacts state failed done message associating content state canceled done created resources reserved resources record create version true poll attempts total failed i d expect this error to be in the non fatal errors attribute and the state to not be failed | 1 |
219,572 | 24,501,523,332 | IssuesEvent | 2022-10-10 13:10:24 | nidhi7598/linux-3.0.35 | https://api.github.com/repos/nidhi7598/linux-3.0.35 | opened | CVE-2018-9517 (High) detected in multiple libraries | security vulnerability | ## CVE-2018-9517 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxlinux-3.0.40</b>, <b>linuxlinux-3.0.40</b>, <b>linuxlinux-3.0.40</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In pppol2tp_connect, there is possible memory corruption due to a use after free. This could lead to local escalation of privilege with System execution privileges needed. User interaction is not needed for exploitation. Product: Android. Versions: Android kernel. Android ID: A-38159931.
<p>Publish Date: 2018-12-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-9517>CVE-2018-9517</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-9517">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-9517</a></p>
<p>Release Date: 2018-12-07</p>
<p>Fix Resolution: v4.14</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-9517 (High) detected in multiple libraries - ## CVE-2018-9517 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxlinux-3.0.40</b>, <b>linuxlinux-3.0.40</b>, <b>linuxlinux-3.0.40</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In pppol2tp_connect, there is possible memory corruption due to a use after free. This could lead to local escalation of privilege with System execution privileges needed. User interaction is not needed for exploitation. Product: Android. Versions: Android kernel. Android ID: A-38159931.
<p>Publish Date: 2018-12-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-9517>CVE-2018-9517</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-9517">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-9517</a></p>
<p>Release Date: 2018-12-07</p>
<p>Fix Resolution: v4.14</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries linuxlinux linuxlinux linuxlinux vulnerability details in connect there is possible memory corruption due to a use after free this could lead to local escalation of privilege with system execution privileges needed user interaction is not needed for exploitation product android versions android kernel android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
134,757 | 10,928,689,703 | IssuesEvent | 2019-11-22 19:38:13 | hyperledger/quilt | https://api.github.com/repos/hyperledger/quilt | closed | Start using Stream test fixture from ILP RFC Repo instead | enhancement good first issue stream test coverage | [StreamPacketFixtureTest.java](https://github.com/hyperledger/quilt/blob/master/codecs-parent/codecs-stream/src/test/java/org/interledger/codecs/stream/StreamPacketFixturesTest.java) currently uses a fixture in this repo (see [here](https://github.com/hyperledger/quilt/blob/master/codecs-parent/codecs-stream/src/test/resources/StreamPacketFixtures.json)). However, we should instead be using the standard fixture, which is published in the ILP RFC repo [here](https://github.com/interledger/rfcs/tree/master/0029-stream/test-vectors). | 1.0 | Start using Stream test fixture from ILP RFC Repo instead - [StreamPacketFixtureTest.java](https://github.com/hyperledger/quilt/blob/master/codecs-parent/codecs-stream/src/test/java/org/interledger/codecs/stream/StreamPacketFixturesTest.java) currently uses a fixture in this repo (see [here](https://github.com/hyperledger/quilt/blob/master/codecs-parent/codecs-stream/src/test/resources/StreamPacketFixtures.json)). However, we should instead be using the standard fixture, which is published in the ILP RFC repo [here](https://github.com/interledger/rfcs/tree/master/0029-stream/test-vectors). | test | start using stream test fixture from ilp rfc repo instead currently uses a fixture in this repo see however we should instead be using the standard fixture which is published in the ilp rfc repo | 1 |
94,082 | 8,469,033,412 | IssuesEvent | 2018-10-23 21:29:39 | Microsoft/openenclave | https://api.github.com/repos/Microsoft/openenclave | closed | tests/VectorException fails sporadically in CI | bug testing | Specifically in the Kabylake configuration, we've seen the VectorException test fail a couple of times when running CI that recovers on rerun:
```
=== This program is used to test basic vector exception functionalities.
TestCpuidInGlobalConstructors: completed successfully.
TestVectorException: will generate a hardware exception inside enclave!
TestVectorException: hardware exception is handled correctly!
TestGetsecInstruction stack parameters are ok.
Success-Illegal GETSEC raised 2nd chance exception.
The value of cpuidRAX is now: 8
.Success-Unsupported CPUID leaf 00000008 raised 2nd chance exception.
The value of cpuidRAX is now: -2147483648
.Success-Unsupported CPUID leaf 80000000 raised 2nd chance exception.
The value of cpuidRAX is now: 21
.The value of cpuidRAX is now: 591593
.The value of cpuidRAX is now: 1979933441
.The value of cpuidRAX is now: 469762337
.TestSigillHandling: completed successfully.
Test failed: /home/jenkins/workspace/Bors_staging-SAVQQOGG4TMIJW6ZHSGG2MHERTL45R6RNECLBYNXEIQ6O4GKH54Q/tests/VectorException/host/host.c(78): TestSigillHandling cpuid_info[j] == args.cpuid_table[i][j]
``` | 1.0 | tests/VectorException fails sporadically in CI - Specifically in the Kabylake configuration, we've seen the VectorException test fail a couple of times when running CI that recovers on rerun:
```
=== This program is used to test basic vector exception functionalities.
TestCpuidInGlobalConstructors: completed successfully.
TestVectorException: will generate a hardware exception inside enclave!
TestVectorException: hardware exception is handled correctly!
TestGetsecInstruction stack parameters are ok.
Success-Illegal GETSEC raised 2nd chance exception.
The value of cpuidRAX is now: 8
.Success-Unsupported CPUID leaf 00000008 raised 2nd chance exception.
The value of cpuidRAX is now: -2147483648
.Success-Unsupported CPUID leaf 80000000 raised 2nd chance exception.
The value of cpuidRAX is now: 21
.The value of cpuidRAX is now: 591593
.The value of cpuidRAX is now: 1979933441
.The value of cpuidRAX is now: 469762337
.TestSigillHandling: completed successfully.
Test failed: /home/jenkins/workspace/Bors_staging-SAVQQOGG4TMIJW6ZHSGG2MHERTL45R6RNECLBYNXEIQ6O4GKH54Q/tests/VectorException/host/host.c(78): TestSigillHandling cpuid_info[j] == args.cpuid_table[i][j]
``` | test | tests vectorexception fails sporadically in ci specifically in the kabylake configuration we ve seen the vectorexception test fail a couple of times when running ci that recovers on rerun this program is used to test basic vector exception functionalities testcpuidinglobalconstructors completed successfully testvectorexception will generate a hardware exception inside enclave testvectorexception hardware exception is handled correctly testgetsecinstruction stack parameters are ok success illegal getsec raised chance exception the value of cpuidrax is now success unsupported cpuid leaf raised chance exception the value of cpuidrax is now success unsupported cpuid leaf raised chance exception the value of cpuidrax is now the value of cpuidrax is now the value of cpuidrax is now the value of cpuidrax is now testsigillhandling completed successfully test failed home jenkins workspace bors staging tests vectorexception host host c testsigillhandling cpuid info args cpuid table | 1 |
348,334 | 31,551,622,905 | IssuesEvent | 2023-09-02 05:49:54 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | sql/tests: TestRandomSyntaxGeneration failed | C-test-failure O-robot branch-master release-blocker T-sql-foundations | sql/tests.TestRandomSyntaxGeneration [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11597390?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11597390?buildTab=artifacts#/) on master @ [b994d025c678f495cb8b93044e35a8c59595bd78](https://github.com/cockroachdb/cockroach/commits/b994d025c678f495cb8b93044e35a8c59595bd78):
Random syntax error:
```
rsg_test.go:887: Crash detected: server panic: statement exec timeout
```
Query:
```
DROP DATABASE IF EXISTS ident CASCADE;
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/sql-foundations
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxGeneration.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 1.0 | sql/tests: TestRandomSyntaxGeneration failed - sql/tests.TestRandomSyntaxGeneration [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11597390?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11597390?buildTab=artifacts#/) on master @ [b994d025c678f495cb8b93044e35a8c59595bd78](https://github.com/cockroachdb/cockroach/commits/b994d025c678f495cb8b93044e35a8c59595bd78):
Random syntax error:
```
rsg_test.go:887: Crash detected: server panic: statement exec timeout
```
Query:
```
DROP DATABASE IF EXISTS ident CASCADE;
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/sql-foundations
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxGeneration.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration with on master random syntax error rsg test go crash detected server panic statement exec timeout query drop database if exists ident cascade help see also cc cockroachdb sql foundations | 1 |
270,130 | 23,492,585,553 | IssuesEvent | 2022-08-17 20:21:56 | Shopify/quilt | https://api.github.com/repos/Shopify/quilt | opened | Improved `fillGraphQL` utility | Type: Feature Request :raised_hands: Package: graphql-testing | ## Overview
### Problem
`fillGraphQL` has some issues.
1. It's very verbose when combined with createGraphQL (almost all the time)
```ts
graphQL: createGraphQL({
FakeQuery: fillGraphQL(FakeQuery, {
fakeQuery: {
nodes: [
{
id: '1',
name: 'Batman',
},
{
id: '2',
name: 'John Doe',
},
],
},
}),
}),
```
Here you can see we've repeated `FakeQuery` three times, and it nests very deeply
2. If you make a mistake, your tests will fail with almost no explanation. Here, `Fake` doesn't match the inner query name of `FakeQuery` so the test will fail to hit the fakeQuery endpoint.
```ts
graphQL: createGraphQL({
Fake: fillGraphQL(FakeQuery, {
fakeQuery: {
nodes: [
{
id: '1',
name: 'Batman',
},
{
id: '2',
name: 'John Doe',
},
],
},
}),
}),
```
### Solution
I recently wrote a utility in one of Shopify's internal Repos that is an improvement on fillGraphQL
```ts
/**
* Build a GraphQL stub from a GraphQL file
*
* @example
* buildGraphQL(CoolQuery, {nodes: [{id: 1, name: 'foo'}]})
* = {CoolQuery: coolQuery: {nodes: [{__typename: 'CoolNode', id: 1, name: 'foo'}]}}
*
* // If you need to include multiple queries/mutations
* {...buildGraphQL(QueryOne, {...}), ...buildGraphQL(QueryTwo, {...})}
* = {QueryOne: queryOne: {...}, QueryTwo: queryTwo: {...}}
*/
export function buildGraphQL<T extends object>(
graphQLOperation: DocumentNode<T, any, any>,
data?: ExtractQueryData<T>,
) {
const name = getQueryName(graphQLOperation);
const operation = getGraphQLOperation(graphQLOperation);
return {
[name]: fillGraphQL(graphQLOperation, {
[operation]: data,
}),
};
}
```
Usage with createGraphQL
Single mock
```ts
graphQL: createGraphQL(
buildGraphQL(FakeMutation, {
errors: ['cool error'],
}),
)
```
Multiple mocks
```ts
graphQL: createGraphQL({
...buildGraphQL(FakeQuery, {
nodes: [],
}),
...buildGraphQL(FakeMutation, {
errors: ['cool error'],
}),
}),
```
## Type
- [x] New feature
- [x] Changes to existing features
## Motivation
The boilerplate required to use `fillGraphQL`, and the fact that it's so easy to make a typo and not realize it
## Checklist
- [ ] Please delete the labels section before submitting your issue
- [ ] I have described this issue in a way that is actionable (if possible)
| 1.0 | Improved `fillGraphQL` utility - ## Overview
### Problem
`fillGraphQL` has some issues.
1. It's very verbose when combined with createGraphQL (almost all the time)
```ts
graphQL: createGraphQL({
FakeQuery: fillGraphQL(FakeQuery, {
fakeQuery: {
nodes: [
{
id: '1',
name: 'Batman',
},
{
id: '2',
name: 'John Doe',
},
],
},
}),
}),
```
Here you can see we've repeated `FakeQuery` three times, and it nests very deeply
2. If you make a mistake, your tests will fail with almost no explanation. Here, `Fake` doesn't match the inner query name of `FakeQuery` so the test will fail to hit the fakeQuery endpoint.
```ts
graphQL: createGraphQL({
Fake: fillGraphQL(FakeQuery, {
fakeQuery: {
nodes: [
{
id: '1',
name: 'Batman',
},
{
id: '2',
name: 'John Doe',
},
],
},
}),
}),
```
### Solution
I recently wrote a utility in one of Shopify's internal Repos that is an improvement on fillGraphQL
```ts
/**
* Build a GraphQL stub from a GraphQL file
*
* @example
* buildGraphQL(CoolQuery, {nodes: [{id: 1, name: 'foo'}]})
* = {CoolQuery: coolQuery: {nodes: [{__typename: 'CoolNode', id: 1, name: 'foo'}]}}
*
* // If you need to include multiple queries/mutations
* {...buildGraphQL(QueryOne, {...}), ...buildGraphQL(QueryTwo, {...})}
* = {QueryOne: queryOne: {...}, QueryTwo: queryTwo: {...}}
*/
export function buildGraphQL<T extends object>(
graphQLOperation: DocumentNode<T, any, any>,
data?: ExtractQueryData<T>,
) {
const name = getQueryName(graphQLOperation);
const operation = getGraphQLOperation(graphQLOperation);
return {
[name]: fillGraphQL(graphQLOperation, {
[operation]: data,
}),
};
}
```
Usage with createGraphQL
Single mock
```ts
graphQL: createGraphQL(
buildGraphQL(FakeMutation, {
errors: ['cool error'],
}),
)
```
Multiple mocks
```ts
graphQL: createGraphQL({
...buildGraphQL(FakeQuery, {
nodes: [],
}),
...buildGraphQL(FakeMutation, {
errors: ['cool error'],
}),
}),
```
## Type
- [x] New feature
- [x] Changes to existing features
## Motivation
The boilerplate required to use `fillGraphQL`, and the fact that it's so easy to make a typo and not realize it
## Checklist
- [ ] Please delete the labels section before submitting your issue
- [ ] I have described this issue in a way that is actionable (if possible)
| test | improved fillgraphql utility overview problem fillgraphql has some issues it s very verbose when combined with creategraphql almost all the time ts graphql creategraphql fakequery fillgraphql fakequery fakequery nodes id name batman id name john doe in here you can see we ve repeated fakequery times and it nests very deeply if you make a mistake your tests will fail with almost no explanation here fake doesn t match the inner query name of fakequery so the test will fail to hit the fakequery endpoint ts graphql creategraphql fake fillgraphql fakequery fakequery nodes id name batman id name john doe solution i recently wrote a utility in one of shopify s internal repos that is an improvement on fillgraphql ts build a graphql stub from a graphql file example buildgraphql coolquery nodes coolquery coolquery nodes if you need to include multiple queries mutations buildgraphql queryone buildgraphql querytwo queryone queryone querytwo querytwo export function buildgraphql graphqloperation documentnode data extractquerydata const name getqueryname graphqloperation const operation getgraphqloperation graphqloperation return fillgraphql graphqloperation data usage with creategraphql single mock ts graphql creategraphql buildgraphql fakemutation errors multiple mocks ts graphql creategraphql buildgraphql fakequery nodes buildgraphql fakemutation errors type new feature changes to existing features motivation the boilerplate required to use fillgraphql and the fact that its so easy to make a typo and not realize checklist please delete the labels section before submitting your issue i have described this issue in a way that is actionable if possible | 1 |
14,914 | 11,222,789,370 | IssuesEvent | 2020-01-07 21:01:40 | enarx/enarx | https://api.github.com/repos/enarx/enarx | closed | Better handling of `cargo audit` | infrastructure | Currently we have three test tasks: `stable`, `beta`, and `nightly`. As part of each of these, we install and run `cargo-audit`. However, this takes a lot of time and is duplicated unnecessarily. We should add a fourth test task: `audit` that runs `cargo-audit` in parallel to the other test tasks. | 1.0 | Better handling of `cargo audit` - Currently we have three test tasks: `stable`, `beta`, and `nightly`. As part of each of these, we install and run `cargo-audit`. However, this takes a lot of time and is duplicated unnecessarily. We should add a fourth test task: `audit` that runs `cargo-audit` in parallel to the other test tasks. | non_test | better handling of cargo audit currently we have three test tasks stable beta and nightly as part of each of these we install and run cargo audit however this takes a lot of time and is duplicated unnecessarily we should add a fourth test task audit that runs cargo audit in parallel to the other test tasks | 0 |
146,697 | 11,745,741,858 | IssuesEvent | 2020-03-12 10:22:08 | microsoft/azure-pipelines-tasks | https://api.github.com/repos/microsoft/azure-pipelines-tasks | closed | PublishTestResults: Publishing test results fails because no file PTR_TEST_RUNSUMMARY.json | Area: Test Area: TestManagement question | ## Note
Issues in this repo are for tracking bugs, feature requests and questions for the tasks in this repo
For a list:
https://github.com/Microsoft/azure-pipelines-tasks/tree/master/Tasks
If you have an issue or request for the Azure Pipelines service, use developer community instead:
https://developercommunity.visualstudio.com/spaces/21/index.html )
## Required Information
Entering this information will route you directly to the right team and expedite traction.
**Question, Bug, or Feature?**
*Type*: Question
**Enter Task Name**: PublishTestResults
list here (V# not needed):
https://github.com/Microsoft/azure-pipelines-tasks/tree/master/Tasks
## Environment
- Server - Azure Pipelines or TFS on-premises?
- If using Azure Pipelines, provide the account name, team project name, build definition name/build number:
- Agent - Hosted or Private:
- If using private agent, provide the OS of the machine running the agent and the agent version: Windows, agent.version: 2.164.7
## Issue Description
Running tests with Protractor using npm, which outputs a test results file in VSTest test results XML format. Trying to publish the test results gives a warning that the PTR_TEST_RUNSUMMARY.json file is not found. When is this file supposed to be created/used by my Azure pipeline?
### Task logs
[ReleaseLogs_10908.zip](https://github.com/microsoft/azure-pipelines-tasks/files/4158116/ReleaseLogs_10908.zip)
## Troubleshooting
Check out how to troubleshoot failures and collect debug logs: https://docs.microsoft.com/en-us/vsts/build-release/actions/troubleshooting
### Error logs
[##[debug]Unable to publish the test run summary to evidencestore, error details:Error: ENOENT: no such file or directory, open 'C:\Agent_AD1\_work\_temp\PTR_TEST_RUNSUMMARY.json']
| 2.0 | test | 1
85,348 | 10,437,101,008 | IssuesEvent | 2019-09-17 21:08:19 | raiden-network/raiden | https://api.github.com/repos/raiden-network/raiden | closed | Specify the state machines | documentation | This issue keeps track of writing a specification of the state machines in the client implementation. The spec will be useful for writing unit tests and keeping track of the test coverage. The spec will live in the `raiden` repository rather than the `spec` repository because it is about this particular implementation (I discussed this with @LefterisJP).
The first task is to write a sample specification for one state change on one state. After getting some feedback on that, the spec can be extended to cover one state machine. | 1.0 | non_test | 0
20,418 | 27,078,780,559 | IssuesEvent | 2023-02-14 12:34:42 | island-is/island.is | https://api.github.com/repos/island-is/island.is | opened | New process - project/service lifecycle docs | sync 2022-Q1 process | What happens when a service is "done" and goes into maintenance and the team is no longer around.
Suggestion is to assign it to "Core" team in codeowners. | 1.0 | non_test | 0