Unnamed: 0
int64
0
832k
id
float64
2.49B
32.1B
type
stringclasses
1 value
created_at
stringlengths
19
19
repo
stringlengths
4
112
repo_url
stringlengths
33
141
action
stringclasses
3 values
title
stringlengths
1
1.02k
labels
stringlengths
4
1.54k
body
stringlengths
1
262k
index
stringclasses
17 values
text_combine
stringlengths
95
262k
label
stringclasses
2 values
text
stringlengths
96
252k
binary_label
int64
0
1
278,867
24,182,187,599
IssuesEvent
2022-09-23 09:55:58
MrBrax/LiveStreamDVR
https://api.github.com/repos/MrBrax/LiveStreamDVR
closed
Add FTP export and automatic export
enhancement needs testing
It would be nice to have FTP export, an option for automatic export of VODs after remuxing has finished, with an option to automatically delete the VOD after export has finished
1.0
Add FTP export and automatic export - It would be nice to have FTP export, an option for automatic export of VODs after remuxing has finished, with an option to automatically delete the VOD after export has finished
test
add ftp export and automatic export it would be nice to have ftp export an option for automatic export of vods after remuxing has finished with an option to automatically delete the vod after export has finished
1
72,990
7,319,676,630
IssuesEvent
2018-03-02 02:10:58
EFForg/https-everywhere
https://api.github.com/repos/EFForg/https-everywhere
closed
Update platform certs in https-everywhere-checker
Ruleset Testing
We should pull the latest platform certs trusted by Firefox.
1.0
Update platform certs in https-everywhere-checker - We should pull the latest platform certs trusted by Firefox.
test
update platform certs in https everywhere checker we should pull the latest platform certs trusted by firefox
1
181,851
21,664,454,695
IssuesEvent
2022-05-07 01:24:15
venkateshreddypala/post-it-a4
https://api.github.com/repos/venkateshreddypala/post-it-a4
closed
WS-2019-0333 (High) detected in handlebars-1.3.0.tgz, handlebars-4.0.10.tgz - autoclosed
security vulnerability
## WS-2019-0333 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>handlebars-1.3.0.tgz</b>, <b>handlebars-4.0.10.tgz</b></p></summary> <p> <details><summary><b>handlebars-1.3.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz</a></p> <p>Path to dependency file: /post-it-a4/package.json</p> <p>Path to vulnerable library: post-it-a4/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - cli-1.1.1.tgz (Root Library) - postcss-url-5.1.2.tgz - directory-encoder-0.7.2.tgz - :x: **handlebars-1.3.0.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.0.10.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz</a></p> <p>Path to dependency file: /post-it-a4/package.json</p> <p>Path to vulnerable library: post-it-a4/node_modules/istanbul-reports/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-1.3.0.tgz (Root Library) - istanbul-api-1.1.9.tgz - istanbul-reports-1.1.1.tgz - :x: **handlebars-4.0.10.tgz** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In handlebars, versions prior to v4.5.3 are vulnerable to prototype pollution. Using a malicious template it's possbile to add or modify properties to the Object prototype. This can also lead to DOS and RCE in certain conditions. <p>Publish Date: 2019-11-18 <p>URL: <a href=https://github.com/wycats/handlebars.js/commit/f7f05d7558e674856686b62a00cde5758f3b7a08>WS-2019-0333</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1325">https://www.npmjs.com/advisories/1325</a></p> <p>Release Date: 2019-11-18</p> <p>Fix Resolution: handlebars - 4.5.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2019-0333 (High) detected in handlebars-1.3.0.tgz, handlebars-4.0.10.tgz - autoclosed - ## WS-2019-0333 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>handlebars-1.3.0.tgz</b>, <b>handlebars-4.0.10.tgz</b></p></summary> <p> <details><summary><b>handlebars-1.3.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz</a></p> <p>Path to dependency file: /post-it-a4/package.json</p> <p>Path to vulnerable library: post-it-a4/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - cli-1.1.1.tgz (Root Library) - postcss-url-5.1.2.tgz - directory-encoder-0.7.2.tgz - :x: **handlebars-1.3.0.tgz** (Vulnerable Library) </details> <details><summary><b>handlebars-4.0.10.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.0.10.tgz</a></p> <p>Path to dependency file: /post-it-a4/package.json</p> <p>Path to vulnerable library: post-it-a4/node_modules/istanbul-reports/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - karma-coverage-istanbul-reporter-1.3.0.tgz (Root Library) - istanbul-api-1.1.9.tgz - istanbul-reports-1.1.1.tgz - :x: **handlebars-4.0.10.tgz** (Vulnerable Library) </details> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In handlebars, versions prior to v4.5.3 are vulnerable to prototype pollution. Using a malicious template it's possbile to add or modify properties to the Object prototype. This can also lead to DOS and RCE in certain conditions. <p>Publish Date: 2019-11-18 <p>URL: <a href=https://github.com/wycats/handlebars.js/commit/f7f05d7558e674856686b62a00cde5758f3b7a08>WS-2019-0333</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1325">https://www.npmjs.com/advisories/1325</a></p> <p>Release Date: 2019-11-18</p> <p>Fix Resolution: handlebars - 4.5.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
ws high detected in handlebars tgz handlebars tgz autoclosed ws high severity vulnerability vulnerable libraries handlebars tgz handlebars tgz handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file post it package json path to vulnerable library post it node modules handlebars package json dependency hierarchy cli tgz root library postcss url tgz directory encoder tgz x handlebars tgz vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file post it package json path to vulnerable library post it node modules istanbul reports node modules handlebars package json dependency hierarchy karma coverage istanbul reporter tgz root library istanbul api tgz istanbul reports tgz x handlebars tgz vulnerable library vulnerability details in handlebars versions prior to are vulnerable to prototype pollution using a malicious template it s possbile to add or modify properties to the object prototype this can also lead to dos and rce in certain conditions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution handlebars step up your open source security game with whitesource
0
120,432
17,644,194,964
IssuesEvent
2021-08-20 01:55:34
logbie/HyperGAN
https://api.github.com/repos/logbie/HyperGAN
opened
CVE-2021-37686 (Medium) detected in tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl
security vulnerability
## CVE-2021-37686 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl</b></p></summary> <p>TensorFlow is an open source machine learning framework for everyone.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/0a/93/c7bca39b23aae45cd2e85ad3871c81eccc63b9c5276e926511e2e5b0879d/tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/0a/93/c7bca39b23aae45cd2e85ad3871c81eccc63b9c5276e926511e2e5b0879d/tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl</a></p> <p>Path to dependency file: HyperGAN/requirements.txt</p> <p>Path to vulnerable library: HyperGAN/requirements.txt</p> <p> Dependency Hierarchy: - :x: **tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> TensorFlow is an end-to-end open source platform for machine learning. In affected versions the strided slice implementation in TFLite has a logic bug which can allow an attacker to trigger an infinite loop. This arises from newly introduced support for [ellipsis in axis definition](https://github.com/tensorflow/tensorflow/blob/149562d49faa709ea80df1d99fc41d005b81082a/tensorflow/lite/kernels/strided_slice.cc#L103-L122). An attacker can craft a model such that `ellipsis_end_idx` is smaller than `i` (e.g., always negative). In this case, the inner loop does not increase `i` and the `continue` statement causes execution to skip over the preincrement at the end of the outer loop. We have patched the issue in GitHub commit dfa22b348b70bb89d6d6ec0ff53973bacb4f4695. TensorFlow 2.6.0 is the only affected version. <p>Publish Date: 2021-08-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37686>CVE-2021-37686</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-mhhc-q96p-mfm9">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-mhhc-q96p-mfm9</a></p> <p>Release Date: 2021-08-12</p> <p>Fix Resolution: tensorflow - 2.6.0, tensorflow-cpu - 2.6.0, tensorflow-gpu - 2.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-37686 (Medium) detected in tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl - ## CVE-2021-37686 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl</b></p></summary> <p>TensorFlow is an open source machine learning framework for everyone.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/0a/93/c7bca39b23aae45cd2e85ad3871c81eccc63b9c5276e926511e2e5b0879d/tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl">https://files.pythonhosted.org/packages/0a/93/c7bca39b23aae45cd2e85ad3871c81eccc63b9c5276e926511e2e5b0879d/tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl</a></p> <p>Path to dependency file: HyperGAN/requirements.txt</p> <p>Path to vulnerable library: HyperGAN/requirements.txt</p> <p> Dependency Hierarchy: - :x: **tensorflow_gpu-2.1.0-cp36-cp36m-manylinux2010_x86_64.whl** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> TensorFlow is an end-to-end open source platform for machine learning. In affected versions the strided slice implementation in TFLite has a logic bug which can allow an attacker to trigger an infinite loop. This arises from newly introduced support for [ellipsis in axis definition](https://github.com/tensorflow/tensorflow/blob/149562d49faa709ea80df1d99fc41d005b81082a/tensorflow/lite/kernels/strided_slice.cc#L103-L122). An attacker can craft a model such that `ellipsis_end_idx` is smaller than `i` (e.g., always negative). In this case, the inner loop does not increase `i` and the `continue` statement causes execution to skip over the preincrement at the end of the outer loop. We have patched the issue in GitHub commit dfa22b348b70bb89d6d6ec0ff53973bacb4f4695. TensorFlow 2.6.0 is the only affected version. <p>Publish Date: 2021-08-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37686>CVE-2021-37686</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-mhhc-q96p-mfm9">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-mhhc-q96p-mfm9</a></p> <p>Release Date: 2021-08-12</p> <p>Fix Resolution: tensorflow - 2.6.0, tensorflow-cpu - 2.6.0, tensorflow-gpu - 2.6.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium detected in tensorflow gpu whl cve medium severity vulnerability vulnerable library tensorflow gpu whl tensorflow is an open source machine learning framework for everyone library home page a href path to dependency file hypergan requirements txt path to vulnerable library hypergan requirements txt dependency hierarchy x tensorflow gpu whl vulnerable library vulnerability details tensorflow is an end to end open source platform for machine learning in affected versions the strided slice implementation in tflite has a logic bug which can allow an attacker to trigger an infinite loop this arises from newly introduced support for an attacker can craft a model such that ellipsis end idx is smaller than i e g always negative in this case the inner loop does not increase i and the continue statement causes execution to skip over the preincrement at the end of the outer loop we have patched the issue in github commit tensorflow is the only affected version publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with whitesource
0
186,132
14,394,638,339
IssuesEvent
2020-12-03 01:46:13
github-vet/rangeclosure-findings
https://api.github.com/repos/github-vet/rangeclosure-findings
closed
pingcap/tidb-operator: pkg/webhook/pod/pd_deleter_test.go; 3 LoC
fresh test tiny
Found a possible issue in [pingcap/tidb-operator](https://www.github.com/pingcap/tidb-operator) at [pkg/webhook/pod/pd_deleter_test.go](https://github.com/pingcap/tidb-operator/blob/57b7160e1586cf6f4bd0354fc8c4a7ee750a4f69/pkg/webhook/pod/pd_deleter_test.go#L280-L282) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to test at line 281 may start a goroutine [Click here to see the code in its original context.](https://github.com/pingcap/tidb-operator/blob/57b7160e1586cf6f4bd0354fc8c4a7ee750a4f69/pkg/webhook/pod/pd_deleter_test.go#L280-L282) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, test := range tests { testFn(&test) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 57b7160e1586cf6f4bd0354fc8c4a7ee750a4f69
1.0
pingcap/tidb-operator: pkg/webhook/pod/pd_deleter_test.go; 3 LoC - Found a possible issue in [pingcap/tidb-operator](https://www.github.com/pingcap/tidb-operator) at [pkg/webhook/pod/pd_deleter_test.go](https://github.com/pingcap/tidb-operator/blob/57b7160e1586cf6f4bd0354fc8c4a7ee750a4f69/pkg/webhook/pod/pd_deleter_test.go#L280-L282) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > function call which takes a reference to test at line 281 may start a goroutine [Click here to see the code in its original context.](https://github.com/pingcap/tidb-operator/blob/57b7160e1586cf6f4bd0354fc8c4a7ee750a4f69/pkg/webhook/pod/pd_deleter_test.go#L280-L282) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for _, test := range tests { testFn(&test) } ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 57b7160e1586cf6f4bd0354fc8c4a7ee750a4f69
test
pingcap tidb operator pkg webhook pod pd deleter test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call which takes a reference to test at line may start a goroutine click here to show the line s of go which triggered the analyzer go for test range tests testfn test leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
186,626
14,402,110,179
IssuesEvent
2020-12-03 14:32:17
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
sql: several schema changer tests are skipped
C-cleanup skipped-test
`schema_changer_test.go` contains many skipped tests: - [ ] `TestSchemaChangePurgeFailure` - [ ] `TestSchemaChangeReverseMutations` - [ ] `TestSchemaChangeCompletion` - [ ] `TestCancelSchemaChange` - [ ] `TestOrphanedGCMutationsRemoved` - [ ] `TestTruncateWhileColumnBackfill` (in progress: #49399)
1.0
sql: several schema changer tests are skipped - `schema_changer_test.go` contains many skipped tests: - [ ] `TestSchemaChangePurgeFailure` - [ ] `TestSchemaChangeReverseMutations` - [ ] `TestSchemaChangeCompletion` - [ ] `TestCancelSchemaChange` - [ ] `TestOrphanedGCMutationsRemoved` - [ ] `TestTruncateWhileColumnBackfill` (in progress: #49399)
test
sql several schema changer tests are skipped schema changer test go contains many skipped tests testschemachangepurgefailure testschemachangereversemutations testschemachangecompletion testcancelschemachange testorphanedgcmutationsremoved testtruncatewhilecolumnbackfill in progress
1
47,927
13,066,383,819
IssuesEvent
2020-07-30 21:34:57
icecube-trac/tix2
https://api.github.com/repos/icecube-trac/tix2
closed
[dipolefit] no sphinx documentation (Trac #1445)
Migrated from Trac combo reconstruction defect
Good documentation is now deemed essential. Migrated from https://code.icecube.wisc.edu/ticket/1445 ```json { "status": "closed", "changetime": "2019-02-13T14:11:57", "description": "Good documentation is now deemed essential.", "reporter": "david.schultz", "cc": "olivas", "resolution": "fixed", "_ts": "1550067117911749", "component": "combo reconstruction", "summary": "[dipolefit] no sphinx documentation", "priority": "major", "keywords": "", "time": "2015-11-24T23:14:58", "milestone": "", "owner": "olivas", "type": "defect" } ```
1.0
[dipolefit] no sphinx documentation (Trac #1445) - Good documentation is now deemed essential. Migrated from https://code.icecube.wisc.edu/ticket/1445 ```json { "status": "closed", "changetime": "2019-02-13T14:11:57", "description": "Good documentation is now deemed essential.", "reporter": "david.schultz", "cc": "olivas", "resolution": "fixed", "_ts": "1550067117911749", "component": "combo reconstruction", "summary": "[dipolefit] no sphinx documentation", "priority": "major", "keywords": "", "time": "2015-11-24T23:14:58", "milestone": "", "owner": "olivas", "type": "defect" } ```
non_test
no sphinx documentation trac good documentation is now deemed essential migrated from json status closed changetime description good documentation is now deemed essential reporter david schultz cc olivas resolution fixed ts component combo reconstruction summary no sphinx documentation priority major keywords time milestone owner olivas type defect
0
135,639
11,014,145,322
IssuesEvent
2019-12-04 22:03:11
rancher/rancher
https://api.github.com/repos/rancher/rancher
closed
Associate AD group with Rancher role
[zube]: To Test internal team/az
I have AD authentication enabled in Rancher 2.2.8, with 2 AD groups: - rancher-admin - rancher-user I would like to automatically associate all `rancher-admin` users to Rancher `Admin` role and `rancher-user` users to Rancher `User` role. Is there a way to do this?
1.0
Associate AD group with Rancher role - I have AD authentication enabled in Rancher 2.2.8, with 2 AD groups: - rancher-admin - rancher-user I would like to automatically associate all `rancher-admin` users to Rancher `Admin` role and `rancher-user` users to Rancher `User` role. Is there a way to do this?
test
associate ad group with rancher role i have ad authentication enabled in rancher with ad groups rancher admin rancher user i would like to automatically associate all rancher admin users to rancher admin role and rancher user users to rancher user role is there a way to do this
1
319,214
9,740,331,139
IssuesEvent
2019-06-01 19:25:08
draden05/LICAsphere_BACK
https://api.github.com/repos/draden05/LICAsphere_BACK
closed
En tant qu'utilisateur, je souhaite pouvoir lister mes projets
priority user story
Se résume à une route GET /project/all.
1.0
En tant qu'utilisateur, je souhaite pouvoir lister mes projets - Se résume à une route GET /project/all.
non_test
en tant qu utilisateur je souhaite pouvoir lister mes projets se résume à une route get project all
0
255,009
21,893,410,216
IssuesEvent
2022-05-20 05:53:01
milvus-io/milvus
https://api.github.com/repos/milvus-io/milvus
opened
[Enhancement]: The statement of the expression reporting an error is not clear and cannot be seen as not matching ""
kind/enhancement test-string
### Is there an existing issue for this? - [X] I have searched the existing issues ### What would you like to be added? testcase: ``` [2022-05-19T16:27:28.390Z] [2022-05-19T16:27:28.390Z] @pytest.mark.tags(CaseLabel.L2) [2022-05-19T16:27:28.390Z] def test_query_expr_non_primary_fields(self): [2022-05-19T16:27:28.390Z] """ [2022-05-19T16:27:28.390Z] target: test query on non-primary non-vector fields [2022-05-19T16:27:28.390Z] method: query on non-primary non-vector fields [2022-05-19T16:27:28.390Z] expected: verify query result [2022-05-19T16:27:28.390Z] """ [2022-05-19T16:27:28.390Z] self._connect() [2022-05-19T16:27:28.390Z] # construct dataframe and inert data [2022-05-19T16:27:28.390Z] df = pd.DataFrame({ [2022-05-19T16:27:28.390Z] ct.default_int64_field_name: pd.Series(data=[i for i in range(ct.default_nb)]), [2022-05-19T16:27:28.390Z] ct.default_int32_field_name: pd.Series(data=[np.int32(i) for i in range(ct.default_nb)], dtype="int32"), [2022-05-19T16:27:28.390Z] ct.default_int16_field_name: pd.Series(data=[np.int16(i) for i in range(ct.default_nb)], dtype="int16"), [2022-05-19T16:27:28.390Z] ct.default_float_field_name: pd.Series(data=[np.float32(i) for i in range(ct.default_nb)], dtype="float32"), [2022-05-19T16:27:28.390Z] ct.default_double_field_name: pd.Series(data=[np.double(i) for i in range(ct.default_nb)], dtype="double"), [2022-05-19T16:27:28.390Z] ct.default_string_field_name: pd.Series(data=[str(i) for i in range(ct.default_nb)], dtype="string"), [2022-05-19T16:27:28.390Z] ct.default_float_vec_field_name: cf.gen_vectors(ct.default_nb, ct.default_dim) [2022-05-19T16:27:28.390Z] }) [2022-05-19T16:27:28.390Z] self.collection_wrap.construct_from_dataframe(cf.gen_unique_str(prefix), df, [2022-05-19T16:27:28.390Z] primary_field=ct.default_int64_field_name) [2022-05-19T16:27:28.390Z] assert self.collection_wrap.num_entities == ct.default_nb [2022-05-19T16:27:28.390Z] self.collection_wrap.load() [2022-05-19T16:27:28.390Z] [2022-05-19T16:27:28.390Z] # query by non_primary non_vector scalar field [2022-05-19T16:27:28.390Z] non_primary_field = [ct.default_int32_field_name, ct.default_int16_field_name, [2022-05-19T16:27:28.390Z] ct.default_float_field_name, ct.default_double_field_name, ct.default_string_field_name] [2022-05-19T16:27:28.390Z] [2022-05-19T16:27:28.390Z] # exp res: first two rows and all fields expect last vec field [2022-05-19T16:27:28.390Z] res = df.iloc[:2, :-1].to_dict('records') [2022-05-19T16:27:28.390Z] for field in non_primary_field: [2022-05-19T16:27:28.390Z] filter_values = df[field].tolist()[:2] [2022-05-19T16:27:28.390Z] term_expr = f'{field} in {filter_values}' [2022-05-19T16:27:28.390Z] self.collection_wrap.query(term_expr, output_fields=["*"], [2022-05-19T16:27:28.390Z] > check_task=CheckTasks.check_query_results, check_items={exp_res: res}) ``` ``` 'double': 1.0, 'varchar': '1', 'int32': 1, 'int16': 1, 'float': 1.0}] (api_request.py:27) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - DEBUG - ci_test]: (api_request) : [Collection.query] args: ["varchar in ['0', '1']", ['*'], None, 20], kwargs: {} (api_request.py:55) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - ERROR - pymilvus.decorators]: RPC error: [query], <MilvusException: (code=1, message=cannot parse expression: varchar in ['0', '1'], error: line 1:19 token recognition error at: ''')>, <Time:{'RPC start': '2022-05-19 15:09:04.328408', 'RPC error': '2022-05-19 15:09:04.339856'}> (decorators.py:73) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - ERROR - ci_test]: Traceback (most recent call last): [2022-05-19T16:27:28.391Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 22, in inner_wrapper [2022-05-19T16:27:28.391Z] res = func(*args, **kwargs) [2022-05-19T16:27:28.391Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 56, in api_request [2022-05-19T16:27:28.391Z] return func(*arg, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/orm/collection.py", line 772, in query [2022-05-19T16:27:28.391Z] res = conn.query(self._name, expr, output_fields, partition_names, timeout, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 56, in handler [2022-05-19T16:27:28.391Z] raise e [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 41, in handler [2022-05-19T16:27:28.391Z] return func(self, *args, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 74, in handler [2022-05-19T16:27:28.391Z] raise e [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 70, in handler [2022-05-19T16:27:28.391Z] return func(*args, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/client/grpc_handler.py", line 1033, in query [2022-05-19T16:27:28.391Z] raise MilvusException(response.status.error_code, response.status.reason) [2022-05-19T16:27:28.391Z] pymilvus.exceptions.MilvusException: <MilvusException: (code=1, message=cannot parse expression: varchar in ['0', '1'], error: line 1:19 token recognition error at: ''')> [2022-05-19T16:27:28.391Z] (api_request.py:35) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - ERROR - ci_test]: (api_response) : <MilvusException: (code=1, message=cannot parse expression: varchar in ['0', '1'], error: line 1:19 token recognition error at: ''')> (api_request.py:36) ``` ### Why is this needed? _No response_ ### Anything else? _No response_
1.0
[Enhancement]: The statement of the expression reporting an error is not clear and cannot be seen as not matching "" - ### Is there an existing issue for this? - [X] I have searched the existing issues ### What would you like to be added? testcase: ``` [2022-05-19T16:27:28.390Z] [2022-05-19T16:27:28.390Z] @pytest.mark.tags(CaseLabel.L2) [2022-05-19T16:27:28.390Z] def test_query_expr_non_primary_fields(self): [2022-05-19T16:27:28.390Z] """ [2022-05-19T16:27:28.390Z] target: test query on non-primary non-vector fields [2022-05-19T16:27:28.390Z] method: query on non-primary non-vector fields [2022-05-19T16:27:28.390Z] expected: verify query result [2022-05-19T16:27:28.390Z] """ [2022-05-19T16:27:28.390Z] self._connect() [2022-05-19T16:27:28.390Z] # construct dataframe and inert data [2022-05-19T16:27:28.390Z] df = pd.DataFrame({ [2022-05-19T16:27:28.390Z] ct.default_int64_field_name: pd.Series(data=[i for i in range(ct.default_nb)]), [2022-05-19T16:27:28.390Z] ct.default_int32_field_name: pd.Series(data=[np.int32(i) for i in range(ct.default_nb)], dtype="int32"), [2022-05-19T16:27:28.390Z] ct.default_int16_field_name: pd.Series(data=[np.int16(i) for i in range(ct.default_nb)], dtype="int16"), [2022-05-19T16:27:28.390Z] ct.default_float_field_name: pd.Series(data=[np.float32(i) for i in range(ct.default_nb)], dtype="float32"), [2022-05-19T16:27:28.390Z] ct.default_double_field_name: pd.Series(data=[np.double(i) for i in range(ct.default_nb)], dtype="double"), [2022-05-19T16:27:28.390Z] ct.default_string_field_name: pd.Series(data=[str(i) for i in range(ct.default_nb)], dtype="string"), [2022-05-19T16:27:28.390Z] ct.default_float_vec_field_name: cf.gen_vectors(ct.default_nb, ct.default_dim) [2022-05-19T16:27:28.390Z] }) [2022-05-19T16:27:28.390Z] self.collection_wrap.construct_from_dataframe(cf.gen_unique_str(prefix), df, [2022-05-19T16:27:28.390Z] primary_field=ct.default_int64_field_name) [2022-05-19T16:27:28.390Z] assert self.collection_wrap.num_entities == ct.default_nb [2022-05-19T16:27:28.390Z] self.collection_wrap.load() [2022-05-19T16:27:28.390Z] [2022-05-19T16:27:28.390Z] # query by non_primary non_vector scalar field [2022-05-19T16:27:28.390Z] non_primary_field = [ct.default_int32_field_name, ct.default_int16_field_name, [2022-05-19T16:27:28.390Z] ct.default_float_field_name, ct.default_double_field_name, ct.default_string_field_name] [2022-05-19T16:27:28.390Z] [2022-05-19T16:27:28.390Z] # exp res: first two rows and all fields expect last vec field [2022-05-19T16:27:28.390Z] res = df.iloc[:2, :-1].to_dict('records') [2022-05-19T16:27:28.390Z] for field in non_primary_field: [2022-05-19T16:27:28.390Z] filter_values = df[field].tolist()[:2] [2022-05-19T16:27:28.390Z] term_expr = f'{field} in {filter_values}' [2022-05-19T16:27:28.390Z] self.collection_wrap.query(term_expr, output_fields=["*"], [2022-05-19T16:27:28.390Z] > check_task=CheckTasks.check_query_results, check_items={exp_res: res}) ``` ``` 'double': 1.0, 'varchar': '1', 'int32': 1, 'int16': 1, 'float': 1.0}] (api_request.py:27) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - DEBUG - ci_test]: (api_request) : [Collection.query] args: ["varchar in ['0', '1']", ['*'], None, 20], kwargs: {} (api_request.py:55) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - ERROR - pymilvus.decorators]: RPC error: [query], <MilvusException: (code=1, message=cannot parse expression: varchar in ['0', '1'], error: line 1:19 token recognition error at: ''')>, <Time:{'RPC start': '2022-05-19 15:09:04.328408', 'RPC error': '2022-05-19 15:09:04.339856'}> (decorators.py:73) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - ERROR - ci_test]: Traceback (most recent call last): [2022-05-19T16:27:28.391Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 22, in inner_wrapper [2022-05-19T16:27:28.391Z] res = func(*args, **kwargs) [2022-05-19T16:27:28.391Z] File "/home/jenkins/agent/workspace/tests/python_client/utils/api_request.py", line 56, in api_request [2022-05-19T16:27:28.391Z] return func(*arg, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/orm/collection.py", line 772, in query [2022-05-19T16:27:28.391Z] res = conn.query(self._name, expr, output_fields, partition_names, timeout, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 56, in handler [2022-05-19T16:27:28.391Z] raise e [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 41, in handler [2022-05-19T16:27:28.391Z] return func(self, *args, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 74, in handler [2022-05-19T16:27:28.391Z] raise e [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/decorators.py", line 70, in handler [2022-05-19T16:27:28.391Z] return func(*args, **kwargs) [2022-05-19T16:27:28.391Z] File "/usr/local/lib/python3.6/site-packages/pymilvus/client/grpc_handler.py", line 1033, in query [2022-05-19T16:27:28.391Z] raise MilvusException(response.status.error_code, response.status.reason) [2022-05-19T16:27:28.391Z] pymilvus.exceptions.MilvusException: <MilvusException: (code=1, message=cannot parse expression: varchar in ['0', '1'], error: line 1:19 token recognition error at: ''')> [2022-05-19T16:27:28.391Z] (api_request.py:35) [2022-05-19T16:27:28.391Z] [2022-05-19 15:09:04 - ERROR - ci_test]: (api_response) : <MilvusException: (code=1, message=cannot parse expression: varchar in ['0', '1'], error: line 1:19 token recognition error at: ''')> (api_request.py:36) ``` ### Why is this needed? _No response_ ### Anything else? _No response_
test
the statement of the expression reporting an error is not clear and cannot be seen as not matching is there an existing issue for this i have searched the existing issues what would you like to be added testcase: pytest mark tags caselabel def test query expr non primary fields self target test query on non primary non vector fields method query on non primary non vector fields expected verify query result self connect construct dataframe and inert data df pd dataframe ct default field name pd series data ct default field name pd series data dtype ct default field name pd series data dtype ct default float field name pd series data dtype ct default double field name pd series data dtype double ct default string field name pd series data dtype string ct default float vec field name cf gen vectors ct default nb ct default dim self collection wrap construct from dataframe cf gen unique str prefix df primary field ct default field name assert self collection wrap num entities ct default nb self collection wrap load query by non primary non vector scalar field non primary field ct default field name ct default field name ct default float field name ct default double field name ct default string field name exp res first two rows and all fields expect last vec field res df iloc to dict records for field in non primary field filter values df tolist term expr f field in filter values self collection wrap query term expr output fields check task checktasks check query results check items exp res res double varchar float api request py api request args none kwargs api request py rpc error decorators py traceback most recent call last file home jenkins agent workspace tests python client utils api request py line in inner wrapper res func args kwargs file home jenkins agent workspace tests python client utils api request py line in api request return func arg kwargs file usr local lib site packages pymilvus orm collection py line in query res conn query self name expr output fields partition names timeout kwargs file usr local lib site packages pymilvus decorators py line in handler raise e file usr local lib site packages pymilvus decorators py line in handler return func self args kwargs file usr local lib site packages pymilvus decorators py line in handler raise e file usr local lib site packages pymilvus decorators py line in handler return func args kwargs file usr local lib site packages pymilvus client grpc handler py line in query raise milvusexception response status error code response status reason pymilvus exceptions milvusexception api request py api response api request py why is this needed no response anything else no response
1
7,130
4,786,513,307
IssuesEvent
2016-10-29 13:28:47
hajicj/MUSCIMarker
https://api.github.com/repos/hajicj/MUSCIMarker
opened
LineTracer could be deleteable by retracing a part of the line
prio:medium type:functional-enhancement type:usability-enhancement
Would enable self-corrections during drawing. Must carefully test so that it isn't annoying.
True
LineTracer could be deleteable by retracing a part of the line - Would enable self-corrections during drawing. Must carefully test so that it isn't annoying.
non_test
linetracer could be deleteable by retracing a part of the line would enable self corrections during drawing must carefully test so that it isn t annoying
0
266,893
23,266,879,732
IssuesEvent
2022-08-04 18:18:20
GoogleCloudPlatform/appengine-plugins-core
https://api.github.com/repos/GoogleCloudPlatform/appengine-plugins-core
closed
CloudSdk needs more tests
testing
#137 should have a matching test that fails when format is incorrect. More broadly we should make sure that the cloud sdk is executing commands according to expectations.
1.0
CloudSdk needs more tests - #137 should have a matching test that fails when format is incorrect. More broadly we should make sure that the cloud sdk is executing commands according to expectations.
test
cloudsdk needs more tests should have a matching test that fails when format is incorrect more broadly we should make sure that the cloud sdk is executing commands according to expectations
1
401,616
11,795,233,703
IssuesEvent
2020-03-18 08:32:37
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
closed
userguides.tritondigital.com doesn't load properly because of adblock
bug feature/shields/webcompat priority/P5 workaround/allow-ads-and-tracking
Carried over from https://github.com/brave/browser-laptop/issues/14689 > 1. With shields enabled, visit https://userguides.tritondigital.com/adv/a2xspecdsp/ > 2. Notice totally blank screen > 3. Use shields menu and pick `Allow ads and tracking` under `Ad Control` > 4. Site works
1.0
userguides.tritondigital.com doesn't load properly because of adblock - Carried over from https://github.com/brave/browser-laptop/issues/14689 > 1. With shields enabled, visit https://userguides.tritondigital.com/adv/a2xspecdsp/ > 2. Notice totally blank screen > 3. Use shields menu and pick `Allow ads and tracking` under `Ad Control` > 4. Site works
non_test
userguides tritondigital com doesn t load properly because of adblock carried over from with shields enabled visit notice totally blank screen use shields menu and pick allow ads and tracking under ad control site works
0
151,495
12,041,526,652
IssuesEvent
2020-04-14 08:59:36
WoWManiaUK/Redemption
https://api.github.com/repos/WoWManiaUK/Redemption
closed
[Raid] ICC - Spinestalker - immunities
Fix - Tester Confirmed
@Rushor Addition to #4281 The NPC(s) missing immunities to the below: Arcane Torrent - Spell ID: 28730 / 25046 / 50613 Thunderstorm - Spell ID: 59159 Silencing Shot - Spell ID: 34490 If you can, please also check: 15487 and pet ability 19647 I believe these two i missed when i was checking on the initial report.
1.0
[Raid] ICC - Spinestalker - immunities - @Rushor Addition to #4281 The NPC(s) missing immunities to the below: Arcane Torrent - Spell ID: 28730 / 25046 / 50613 Thunderstorm - Spell ID: 59159 Silencing Shot - Spell ID: 34490 If you can, please also check: 15487 and pet ability 19647 I believe these two i missed when i was checking on the initial report.
test
icc spinestalker immunities rushor addition to the npc s missing immunities to the below arcane torrent spell id thunderstorm spell id silencing shot spell id if you can please also check and pet ability i believe these two i missed when i was checking on the initial report
1
784,933
27,589,765,717
IssuesEvent
2023-03-08 23:02:00
hotosm/fmtm
https://api.github.com/repos/hotosm/fmtm
opened
Create test cases
enhancement Priority: Must have
Currently FMTM lacks working test cases. Ideally every endpoint should have unit level test case.
1.0
Create test cases - Currently FMTM lacks working test cases. Ideally every endpoint should have unit level test case.
non_test
create test cases currently fmtm lacks working test cases ideally every endpoint should have unit level test case
0
225,097
24,808,016,498
IssuesEvent
2022-10-25 07:07:36
sast-automation-dev/easybuggy4django-45
https://api.github.com/repos/sast-automation-dev/easybuggy4django-45
opened
psutil-5.4.3.tar.gz: 1 vulnerabilities (highest severity is: 7.5)
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>psutil-5.4.3.tar.gz</b></p></summary> <p>Cross-platform lib for process and system monitoring in Python.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz">https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-45/commit/6ec2ca349bdfa185e7fedb3ba2422dd8824b9420">6ec2ca349bdfa185e7fedb3ba2422dd8824b9420</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (psutil version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2019-18874](https://www.mend.io/vulnerability-database/CVE-2019-18874) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | psutil-5.4.3.tar.gz | Direct | 5.6.6 | &#9989; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-18874</summary> ### Vulnerable Library - <b>psutil-5.4.3.tar.gz</b></p> <p>Cross-platform lib for process and system monitoring in Python.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz">https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **psutil-5.4.3.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-45/commit/6ec2ca349bdfa185e7fedb3ba2422dd8824b9420">6ec2ca349bdfa185e7fedb3ba2422dd8824b9420</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> psutil (aka python-psutil) through 5.6.5 can have a double free. This occurs because of refcount mishandling within a while or for loop that converts system data into a Python object. <p>Publish Date: 2019-11-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-18874>CVE-2019-18874</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-18874">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-18874</a></p> <p>Release Date: 2019-11-18</p> <p>Fix Resolution: 5.6.6</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details> *** <p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
True
psutil-5.4.3.tar.gz: 1 vulnerabilities (highest severity is: 7.5) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>psutil-5.4.3.tar.gz</b></p></summary> <p>Cross-platform lib for process and system monitoring in Python.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz">https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-45/commit/6ec2ca349bdfa185e7fedb3ba2422dd8824b9420">6ec2ca349bdfa185e7fedb3ba2422dd8824b9420</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (psutil version) | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | ------------- | --- | | [CVE-2019-18874](https://www.mend.io/vulnerability-database/CVE-2019-18874) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | psutil-5.4.3.tar.gz | Direct | 5.6.6 | &#9989; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-18874</summary> ### Vulnerable Library - <b>psutil-5.4.3.tar.gz</b></p> <p>Cross-platform lib for process and system monitoring in Python.</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz">https://files.pythonhosted.org/packages/e2/e1/600326635f97fee89bf8426fef14c5c29f4849c79f68fd79f433d8c1bd96/psutil-5.4.3.tar.gz</a></p> <p>Path to dependency file: /requirements.txt</p> <p>Path to vulnerable library: /requirements.txt</p> <p> Dependency Hierarchy: - :x: **psutil-5.4.3.tar.gz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/sast-automation-dev/easybuggy4django-45/commit/6ec2ca349bdfa185e7fedb3ba2422dd8824b9420">6ec2ca349bdfa185e7fedb3ba2422dd8824b9420</a></p> <p>Found in base branch: <b>master</b></p> </p> <p></p> ### Vulnerability Details <p> psutil (aka python-psutil) through 5.6.5 can have a double free. This occurs because of refcount mishandling within a while or for loop that converts system data into a Python object. <p>Publish Date: 2019-11-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-18874>CVE-2019-18874</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-18874">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-18874</a></p> <p>Release Date: 2019-11-18</p> <p>Fix Resolution: 5.6.6</p> </p> <p></p> :rescue_worker_helmet: Automatic Remediation is available for this issue </details> *** <p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
non_test
psutil tar gz vulnerabilities highest severity is vulnerable library psutil tar gz cross platform lib for process and system monitoring in python library home page a href path to dependency file requirements txt path to vulnerable library requirements txt found in head commit a href vulnerabilities cve severity cvss dependency type fixed in psutil version remediation available high psutil tar gz direct details cve vulnerable library psutil tar gz cross platform lib for process and system monitoring in python library home page a href path to dependency file requirements txt path to vulnerable library requirements txt dependency hierarchy x psutil tar gz vulnerable library found in head commit a href found in base branch master vulnerability details psutil aka python psutil through can have a double free this occurs because of refcount mishandling within a while or for loop that converts system data into a python object publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
0
297,831
25,766,466,568
IssuesEvent
2022-12-09 02:30:15
PalisadoesFoundation/talawa-api
https://api.github.com/repos/PalisadoesFoundation/talawa-api
closed
Resolver: Create Tests for postsByOrganization.js
good first issue parent points 02 test
The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage. Tests need to be written for file `lib/resolvers/Query/postsByOrganization.ts` - We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed. - When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a single file with the name `tests/resolvers/post_query/postsByOrganization.sepc.js`. You may need to create the appropriate directory structure to do this. ### IMPORTANT: Please refer to the parent issue on how to implement these tests correctly: - https://github.com/PalisadoesFoundation/talawa-api/issues/490 ### PR Acceptance Criteria - When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR. - [The current code coverage for the file can be found here](https://codecov.io/gh/PalisadoesFoundation/talawa-api/tree/develop/lib/resolvers/organization_query/). If the file isn't found in this directory, or there is a 404 error, then tests have not been created. - The PR will show a report for the code coverage for the file you have added. You can use that as a guide.
1.0
Resolver: Create Tests for postsByOrganization.js - The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage. Tests need to be written for file `lib/resolvers/Query/postsByOrganization.ts` - We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed. - When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a single file with the name `tests/resolvers/post_query/postsByOrganization.sepc.js`. You may need to create the appropriate directory structure to do this. ### IMPORTANT: Please refer to the parent issue on how to implement these tests correctly: - https://github.com/PalisadoesFoundation/talawa-api/issues/490 ### PR Acceptance Criteria - When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR. - [The current code coverage for the file can be found here](https://codecov.io/gh/PalisadoesFoundation/talawa-api/tree/develop/lib/resolvers/organization_query/). If the file isn't found in this directory, or there is a 404 error, then tests have not been created. - The PR will show a report for the code coverage for the file you have added. You can use that as a guide.
test
resolver create tests for postsbyorganization js the talawa api code base needs to be reliable this means we need to have test code coverage tests need to be written for file lib resolvers query postsbyorganization ts we will need the api to be refactored for all methods classes and or functions found in this file for testing to be correctly executed when complete all methods classes and or functions in the refactored file will need to be tested these tests must be placed in a single file with the name tests resolvers post query postsbyorganization sepc js you may need to create the appropriate directory structure to do this important please refer to the parent issue on how to implement these tests correctly pr acceptance criteria when complete this file must show coverage when merged into the code base this will be clearly visible when you submit your pr if the file isn t found in this directory or there is a error then tests have not been created the pr will show a report for the code coverage for the file you have added you can use that as a guide
1
231,483
7,633,357,700
IssuesEvent
2018-05-06 03:45:45
OperationCode/operationcode_frontend
https://api.github.com/repos/OperationCode/operationcode_frontend
opened
Polyfill rAF in test environment to prevent CI build warnings for tests
Priority: Low Type: Developer Experience beginner friendly
## What is the current behavior? Travis CI build currently displays the following warning: ``` console.error node_modules/fbjs/lib/warning.js:33 Warning: React depends on requestAnimationFrame. Make sure that you load a polyfill in older browsers. http://fb.me/react-polyfills ``` ## What is the expected behavior? No warnings. We can follow the pattern used in create-react-app to fix this: https://github.com/facebook/create-react-app/pull/3340/files
1.0
Polyfill rAF in test environment to prevent CI build warnings for tests - ## What is the current behavior? Travis CI build currently displays the following warning: ``` console.error node_modules/fbjs/lib/warning.js:33 Warning: React depends on requestAnimationFrame. Make sure that you load a polyfill in older browsers. http://fb.me/react-polyfills ``` ## What is the expected behavior? No warnings. We can follow the pattern used in create-react-app to fix this: https://github.com/facebook/create-react-app/pull/3340/files
non_test
polyfill raf in test environment to prevent ci build warnings for tests what is the current behavior travis ci build currently displays the following warning console error node modules fbjs lib warning js warning react depends on requestanimationframe make sure that you load a polyfill in older browsers what is the expected behavior no warnings we can follow the pattern used in create react app to fix this
0
213,124
16,500,777,243
IssuesEvent
2021-05-25 14:24:34
spack/spack
https://api.github.com/repos/spack/spack
closed
Perform smoke tests in temporary directory
feature smoke-tests
Currently post-installation tests are usually run inside the install prefix and must be cleaned before reusing. I suggest a mechanism for running inside a temporary directory. ### Rationale Installation tests, especially for libraries, can be messy (e.g. create a `build` dir with many files), and the result of one build might affect another build resulting in a false positive or negative. Furthermore, it's possible that users of a shared spack installation other than the original installer would want to test, or spack is installed onto a read-only device. Currently the test system assumes the user testing has full write privileges on the installation prefix. Finally, packages might `make clean` the test builds to avoid polluting the next run or leaving detritus in the installation directories, but that behavior interferes with the ability to debug a smoke test if it fails. The better solution is a temporary directory with a `--keep-stage` option. ### Description I think a `self.test_build_dir` or something could easily replace the handrolled smoke test "build" dirs in place. ### General information - [x] Spack version 0.16.1-2652-39a4f3ba88 - [x] I have searched the issues of this repo and believe this is not a duplicate
1.0
Perform smoke tests in temporary directory - Currently post-installation tests are usually run inside the install prefix and must be cleaned before reusing. I suggest a mechanism for running inside a temporary directory. ### Rationale Installation tests, especially for libraries, can be messy (e.g. create a `build` dir with many files), and the result of one build might affect another build resulting in a false positive or negative. Furthermore, it's possible that users of a shared spack installation other than the original installer would want to test, or spack is installed onto a read-only device. Currently the test system assumes the user testing has full write privileges on the installation prefix. Finally, packages might `make clean` the test builds to avoid polluting the next run or leaving detritus in the installation directories, but that behavior interferes with the ability to debug a smoke test if it fails. The better solution is a temporary directory with a `--keep-stage` option. ### Description I think a `self.test_build_dir` or something could easily replace the handrolled smoke test "build" dirs in place. ### General information - [x] Spack version 0.16.1-2652-39a4f3ba88 - [x] I have searched the issues of this repo and believe this is not a duplicate
test
perform smoke tests in temporary directory currently post installation tests are usually run inside the install prefix and must be cleaned before reusing i suggest a mechanism for running inside a temporary directory rationale installation tests especially for libraries can be messy e g create a build dir with many files and the result of one build might affect another build resulting in a false positive or negative furthermore it s possible that users of a shared spack installation other than the original installer would want to test or spack is installed onto a read only device currently the test system assumes the user testing has full write privileges on the installation prefix finally packages might make clean the test builds to avoid polluting the next run or leaving detritus in the installation directories but that behavior interferes with the ability to debug a smoke test if it fails the better solution is a temporary directory with a keep stage option description i think a self test build dir or something could easily replace the handrolled smoke test build dirs in place general information spack version i have searched the issues of this repo and believe this is not a duplicate
1
255,440
21,924,224,205
IssuesEvent
2022-05-23 01:04:19
backend-br/vagas
https://api.github.com/repos/backend-br/vagas
closed
[Remoto] Java Developer na CuboConnect
CLT JavaScript Java Remoto DevOps Testes automatizados Spring SQL Apache Linux Stale
## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/desenvolvedor-java-170907192?utm_source=github&utm_medium=backend-br-vagas&modal=open) com o pop-up personalizado de candidatura. 👋 <p><strong>A Cuboconnect</strong> busca pessoa <strong>Java Developer</strong> para compor seu time!</p> <p>A cuboconnect atua no mercado há mais de 20 anos como uma transformadora digital de soluções de tecnologia. São mais de 100 clientes em uma relação de confiança e transparência acrescidos de nossa experiência e profundo conhecimento em diversos segmentos de mercado.</p> <p>Levamos soluções que resolvem e entregam resultados com inovação, capacidade de entrega e performance em projetos customizados e de alta complexidade.</p> <p></p> ## CuboConnect: <p>A cuboconnect atua no mercado há mais de 20 anos como uma transformadora digital de soluções de tecnologia. São mais de 100 clientes em uma relação de confiança e transparência acrescidos de nossa experiência e profundo conhecimento em diversos segmentos de mercado.</p> <p>Levamos soluções que resolvem e entregam resultados com inovação, capacidade de entrega e performance em projetos customizados e de alta complexidade.</p><a href='https://coodesh.com/empresas/cuboconnect'>Veja mais no site</a> ## Habilidades: - Oracle SQL - Apache - Java EE - Javascript ## Local: 100% Remoto ## Requisitos: - Conhecimento em JVM (JAVA) - Conhecimento em Frameworks Web (VRaptor) - Conhecimento Frameworks Java (Spring, Hibernate) - Conhecimento Bancos SQL (Oracle) - Conhecimento WebServer (Apache) - Conhecimento Frameworks JavaScript (jQuery, RequireJS) ## Diferenciais: - Ferramentas de Build (Maven, SBT) - Application Server (Oracle Weblogic) - DevOps (Linux, Chef, Vagrant, Virtualbox) - Testes Automatizados (Selenium, Specs2, Jenkins) - Frameworks JavaScript (Backbone, Underscore, AngularJS 1 e 2) - Front end workflow (Bower, Grunt, SASS) ## Benefícios: - Vale refeição - Assistência médica - Assistência odontológica: com franquia paga pelo beneficiário diretamente ao dentista não abrange dependentes - Seguro de Vida : Itaú Seguros - Carteirinha do Clube Sesc - Treinamentos para atualização do profissional (de acordo com solicitação do cliente) - Reembolso para certificações: de acordo com o perfil do profissional - Acompanhamento de carreira - Descontos de *20% em cursos no SENAC - Interclube: plano de descontos em diversos estabelecimento ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Java Developer na CuboConnect](https://coodesh.com/vagas/desenvolvedor-java-170907192?utm_source=github&utm_medium=backend-br-vagas&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime CLT #### Categoria Back-End
1.0
[Remoto] Java Developer na CuboConnect - ## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/desenvolvedor-java-170907192?utm_source=github&utm_medium=backend-br-vagas&modal=open) com o pop-up personalizado de candidatura. 👋 <p><strong>A Cuboconnect</strong> busca pessoa <strong>Java Developer</strong> para compor seu time!</p> <p>A cuboconnect atua no mercado há mais de 20 anos como uma transformadora digital de soluções de tecnologia. São mais de 100 clientes em uma relação de confiança e transparência acrescidos de nossa experiência e profundo conhecimento em diversos segmentos de mercado.</p> <p>Levamos soluções que resolvem e entregam resultados com inovação, capacidade de entrega e performance em projetos customizados e de alta complexidade.</p> <p></p> ## CuboConnect: <p>A cuboconnect atua no mercado há mais de 20 anos como uma transformadora digital de soluções de tecnologia. São mais de 100 clientes em uma relação de confiança e transparência acrescidos de nossa experiência e profundo conhecimento em diversos segmentos de mercado.</p> <p>Levamos soluções que resolvem e entregam resultados com inovação, capacidade de entrega e performance em projetos customizados e de alta complexidade.</p><a href='https://coodesh.com/empresas/cuboconnect'>Veja mais no site</a> ## Habilidades: - Oracle SQL - Apache - Java EE - Javascript ## Local: 100% Remoto ## Requisitos: - Conhecimento em JVM (JAVA) - Conhecimento em Frameworks Web (VRaptor) - Conhecimento Frameworks Java (Spring, Hibernate) - Conhecimento Bancos SQL (Oracle) - Conhecimento WebServer (Apache) - Conhecimento Frameworks JavaScript (jQuery, RequireJS) ## Diferenciais: - Ferramentas de Build (Maven, SBT) - Application Server (Oracle Weblogic) - DevOps (Linux, Chef, Vagrant, Virtualbox) - Testes Automatizados (Selenium, Specs2, Jenkins) - Frameworks JavaScript (Backbone, Underscore, AngularJS 1 e 2) - Front end workflow (Bower, Grunt, SASS) ## Benefícios: - Vale refeição - Assistência médica - Assistência odontológica: com franquia paga pelo beneficiário diretamente ao dentista não abrange dependentes - Seguro de Vida : Itaú Seguros - Carteirinha do Clube Sesc - Treinamentos para atualização do profissional (de acordo com solicitação do cliente) - Reembolso para certificações: de acordo com o perfil do profissional - Acompanhamento de carreira - Descontos de *20% em cursos no SENAC - Interclube: plano de descontos em diversos estabelecimento ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [Java Developer na CuboConnect](https://coodesh.com/vagas/desenvolvedor-java-170907192?utm_source=github&utm_medium=backend-br-vagas&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime CLT #### Categoria Back-End
test
java developer na cuboconnect descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a cuboconnect busca pessoa java developer para compor seu time a cuboconnect atua no mercado há mais de anos como uma transformadora digital de soluções de tecnologia são mais de clientes em uma relação de confiança e transparência acrescidos de nossa experiência e profundo conhecimento em diversos segmentos de mercado levamos soluções que resolvem e entregam resultados com inovação capacidade de entrega e performance em projetos customizados e de alta complexidade cuboconnect a cuboconnect atua no mercado há mais de anos como uma transformadora digital de soluções de tecnologia são mais de clientes em uma relação de confiança e transparência acrescidos de nossa experiência e profundo conhecimento em diversos segmentos de mercado levamos soluções que resolvem e entregam resultados com inovação capacidade de entrega e performance em projetos customizados e de alta complexidade habilidades oracle sql apache java ee javascript local remoto requisitos conhecimento em jvm java conhecimento em frameworks web vraptor conhecimento frameworks java spring hibernate conhecimento bancos sql oracle conhecimento webserver apache conhecimento frameworks javascript jquery requirejs diferenciais ferramentas de build maven sbt application server oracle weblogic devops linux chef vagrant virtualbox testes automatizados selenium jenkins frameworks javascript backbone underscore angularjs e front end workflow bower grunt sass benefícios vale refeição assistência médica assistência odontológica com franquia paga pelo beneficiário diretamente ao dentista não abrange dependentes seguro de vida itaú seguros carteirinha do clube sesc treinamentos para atualização do profissional de acordo com solicitação do cliente reembolso para certificações de acordo com o perfil do profissional acompanhamento de carreira descontos de em cursos no senac interclube plano de descontos em diversos estabelecimento como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação remoto regime clt categoria back end
1
26,783
4,241,983,025
IssuesEvent
2016-07-06 18:01:24
galenframework/galen
https://api.github.com/repos/galenframework/galen
closed
Spec 'contains' does not support object groups
bug c4 p3 ready for test
At the moment the following code doesn't work ``` # ... @groups menu_items menu_item-* = Menu = menu: contains &menu_items ```
1.0
Spec 'contains' does not support object groups - At the moment the following code doesn't work ``` # ... @groups menu_items menu_item-* = Menu = menu: contains &menu_items ```
test
spec contains does not support object groups at the moment the following code doesn t work groups menu items menu item menu menu contains menu items
1
26,588
4,235,246,075
IssuesEvent
2016-07-05 14:39:40
hioa-cs/IncludeOS
https://api.github.com/repos/hioa-cs/IncludeOS
closed
test: quick_exit move inside STL test
C++ standard library Test
The test for quick_exit should be moved inside STL, and this test should cease to exists. Anyone disagrees?
1.0
test: quick_exit move inside STL test - The test for quick_exit should be moved inside STL, and this test should cease to exists. Anyone disagrees?
test
test quick exit move inside stl test the test for quick exit should be moved inside stl and this test should cease to exists anyone disagrees
1
304,449
26,277,331,033
IssuesEvent
2023-01-07 00:13:00
openservicemesh/osm
https://api.github.com/repos/openservicemesh/osm
closed
Unit test TestGetInboundMeshTrafficPolicy flakes in CI
kind/bug size/XS kind/flaky-test stale
<!-- This issue tracker is a best-effort forum for users and customers to report bugs. Be sure to not include any sensitive information. Sensitive information should __NOT__ be included in this issue. --> **Bug description**: Unit test TestGetInboundMeshTrafficPolicy/multiple_services,_SMI_mode,_1_TrafficTarget,_1_HTTPRouteGroup,_1_TrafficSplit_and_multiple_trust_domains flakes in the CI. ``` inbound_traffic_policies_test.go:2417: Error Trace: inbound_traffic_policies_test.go:2417 Error: Condition never satisfied Test: TestGetInboundMeshTrafficPolicy/multiple_services,_SMI_mode,_1_TrafficTarget,_1_HTTPRouteGroup,_1_TrafficSplit_and_multiple_trust_domains ``` Failed run: https://github.com/openservicemesh/osm/actions/runs/3299314699/jobs/5442591548 **Affected area (please mark with X where applicable)**: - Tests [X]
1.0
Unit test TestGetInboundMeshTrafficPolicy flakes in CI - <!-- This issue tracker is a best-effort forum for users and customers to report bugs. Be sure to not include any sensitive information. Sensitive information should __NOT__ be included in this issue. --> **Bug description**: Unit test TestGetInboundMeshTrafficPolicy/multiple_services,_SMI_mode,_1_TrafficTarget,_1_HTTPRouteGroup,_1_TrafficSplit_and_multiple_trust_domains flakes in the CI. ``` inbound_traffic_policies_test.go:2417: Error Trace: inbound_traffic_policies_test.go:2417 Error: Condition never satisfied Test: TestGetInboundMeshTrafficPolicy/multiple_services,_SMI_mode,_1_TrafficTarget,_1_HTTPRouteGroup,_1_TrafficSplit_and_multiple_trust_domains ``` Failed run: https://github.com/openservicemesh/osm/actions/runs/3299314699/jobs/5442591548 **Affected area (please mark with X where applicable)**: - Tests [X]
test
unit test testgetinboundmeshtrafficpolicy flakes in ci this issue tracker is a best effort forum for users and customers to report bugs be sure to not include any sensitive information sensitive information should not be included in this issue bug description unit test testgetinboundmeshtrafficpolicy multiple services smi mode traffictarget httproutegroup trafficsplit and multiple trust domains flakes in the ci inbound traffic policies test go error trace inbound traffic policies test go error condition never satisfied test testgetinboundmeshtrafficpolicy multiple services smi mode traffictarget httproutegroup trafficsplit and multiple trust domains failed run affected area please mark with x where applicable tests
1
242,317
20,239,423,229
IssuesEvent
2022-02-14 07:41:50
dotnet/machinelearning-modelbuilder
https://api.github.com/repos/dotnet/machinelearning-modelbuilder
opened
The right border of the predicted table is not displayed.
Priority:2 Test Team Forecasting bug bash
**System Information (please complete the following information):** - Model Builder Version (available in Manage Extensions dialog): 16.13.3.2211104 (latest main) - Microsoft Visual Studio Enterprise 2022 (64-bit) Version 17.0.6 **Describe the bug** - On which step of the process did you run into an issue: Prediction result on Evaluate page - Clear description of the problem: **To Reproduce** Steps to reproduce the behavior: 1. Select Create a new project from the Visual Studio 2022 start window; 2. Choose the C# Console App (.NET Core) project template with .Net 6.0; 3. Add model builder by right click on the project; 4. Select forecasting scenario to complete training; 5. Navigate to Evaluate page, click "Predict" button to see the predication result; 6. See that the right border of the predicted table is not displayed. **Expected behavior** Display all border of the predicted table. **Screenshots** If applicable, add screenshots to help explain your problem. ![image](https://user-images.githubusercontent.com/81727020/153820747-3fd43797-5842-46a4-833f-f979e69ed1ac.png)
1.0
The right border of the predicted table is not displayed. - **System Information (please complete the following information):** - Model Builder Version (available in Manage Extensions dialog): 16.13.3.2211104 (latest main) - Microsoft Visual Studio Enterprise 2022 (64-bit) Version 17.0.6 **Describe the bug** - On which step of the process did you run into an issue: Prediction result on Evaluate page - Clear description of the problem: **To Reproduce** Steps to reproduce the behavior: 1. Select Create a new project from the Visual Studio 2022 start window; 2. Choose the C# Console App (.NET Core) project template with .Net 6.0; 3. Add model builder by right click on the project; 4. Select forecasting scenario to complete training; 5. Navigate to Evaluate page, click "Predict" button to see the predication result; 6. See that the right border of the predicted table is not displayed. **Expected behavior** Display all border of the predicted table. **Screenshots** If applicable, add screenshots to help explain your problem. ![image](https://user-images.githubusercontent.com/81727020/153820747-3fd43797-5842-46a4-833f-f979e69ed1ac.png)
test
the right border of the predicted table is not displayed system information please complete the following information model builder version available in manage extensions dialog latest main microsoft visual studio enterprise bit version describe the bug on which step of the process did you run into an issue prediction result on evaluate page clear description of the problem to reproduce steps to reproduce the behavior select create a new project from the visual studio start window choose the c console app net core project template with net add model builder by right click on the project select forecasting scenario to complete training navigate to evaluate page click predict button to see the predication result see that the right border of the predicted table is not displayed expected behavior display all border of the predicted table screenshots if applicable add screenshots to help explain your problem
1
194,221
14,671,888,245
IssuesEvent
2020-12-30 09:18:12
github-vet/rangeloop-pointer-findings
https://api.github.com/repos/github-vet/rangeloop-pointer-findings
closed
yalue/elf_reader: elf32_format_test.go; 3 LoC
fresh test tiny
Found a possible issue in [yalue/elf_reader](https://www.github.com/yalue/elf_reader) at [elf32_format_test.go](https://github.com/yalue/elf_reader/blob/04ba8f01deb53a8b7d85321a9cc611e3a60605f7/elf32_format_test.go#L210-L212) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > [Click here to see the code in its original context.](https://github.com/yalue/elf_reader/blob/04ba8f01deb53a8b7d85321a9cc611e3a60605f7/elf32_format_test.go#L210-L212) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for k, x := range aux[j] { t.Logf(" Aux %d: %s\n", k, &x) } ``` </details> <details> <summary>Click here to show extra information the analyzer produced.</summary> ``` No path was found through the callgraph that could lead to a function which writes a pointer argument. No path was found through the callgraph that could lead to a function which passes a pointer to third-party code. root signature {Logf 3} was not found in the callgraph; reference was passed directly to third-party code ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 04ba8f01deb53a8b7d85321a9cc611e3a60605f7
1.0
yalue/elf_reader: elf32_format_test.go; 3 LoC - Found a possible issue in [yalue/elf_reader](https://www.github.com/yalue/elf_reader) at [elf32_format_test.go](https://github.com/yalue/elf_reader/blob/04ba8f01deb53a8b7d85321a9cc611e3a60605f7/elf32_format_test.go#L210-L212) Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first issue it finds, so please do not limit your consideration to the contents of the below message. > [Click here to see the code in its original context.](https://github.com/yalue/elf_reader/blob/04ba8f01deb53a8b7d85321a9cc611e3a60605f7/elf32_format_test.go#L210-L212) <details> <summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary> ```go for k, x := range aux[j] { t.Logf(" Aux %d: %s\n", k, &x) } ``` </details> <details> <summary>Click here to show extra information the analyzer produced.</summary> ``` No path was found through the callgraph that could lead to a function which writes a pointer argument. No path was found through the callgraph that could lead to a function which passes a pointer to third-party code. root signature {Logf 3} was not found in the callgraph; reference was passed directly to third-party code ``` </details> Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket: See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information. commit ID: 04ba8f01deb53a8b7d85321a9cc611e3a60605f7
test
yalue elf reader format test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message click here to show the line s of go which triggered the analyzer go for k x range aux t logf aux d s n k x click here to show extra information the analyzer produced no path was found through the callgraph that could lead to a function which writes a pointer argument no path was found through the callgraph that could lead to a function which passes a pointer to third party code root signature logf was not found in the callgraph reference was passed directly to third party code leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
1
178,191
13,767,954,258
IssuesEvent
2020-10-07 16:24:14
probabilistic-numerics/probnum
https://api.github.com/repos/probabilistic-numerics/probnum
closed
Use isort to sort the order of import statements
improvement testing and CI
While discussing #108 (flake8/pylint) with @marvinpfoertner we also talked about using [isort](https://github.com/timothycrosley/isort) to sort the order of import statements. It seems easy to set up in most popular editors (e.g. VS Code and PyCharm), or it can be used with pre-commit. It makes code more readable. And even if someone is completely unaware of the tool, PR errors could be fixed by installing and running isort (`pip install isort`, and `isort`), or through tox (`tox -e isort`). It should also be quick to add to probnum. @JonathanWenger any thoughts?
1.0
Use isort to sort the order of import statements - While discussing #108 (flake8/pylint) with @marvinpfoertner we also talked about using [isort](https://github.com/timothycrosley/isort) to sort the order of import statements. It seems easy to set up in most popular editors (e.g. VS Code and PyCharm), or it can be used with pre-commit. It makes code more readable. And even if someone is completely unaware of the tool, PR errors could be fixed by installing and running isort (`pip install isort`, and `isort`), or through tox (`tox -e isort`). It should also be quick to add to probnum. @JonathanWenger any thoughts?
test
use isort to sort the order of import statements while discussing pylint with marvinpfoertner we also talked about using to sort the order of import statements it seems easy to set up in most popular editors e g vs code and pycharm or it can be used with pre commit it makes code more readable and even if someone is completely unaware of the tool pr errors could be fixed by installing and running isort pip install isort and isort or through tox tox e isort it should also be quick to add to probnum jonathanwenger any thoughts
1
467,710
13,453,110,317
IssuesEvent
2020-09-08 23:58:55
TryGhost/Ghost
https://api.github.com/repos/TryGhost/Ghost
closed
Error when accepting invitation with existing email
bug priority
### Issue Summary Accepting a staff invitation email, and submitting an email of an already existing user will cause the DB to throw an error because the email would no longer be unique. The fix is likely to be to check if the email already exists. ### To Reproduce 1. Invite a new staff user 1. Open the invitation link 1. Use an email of an already existing staff member 1. :boom: Error shown below form ![image](https://user-images.githubusercontent.com/964245/87911928-5633d800-ca64-11ea-9c83-b6ca7223a939.png) ``` HTTP/1.1 500 Internal Server Error -- InternalServerError: insert into `users` (`accessibility`, `bio`, `cover_image`, `created_at`, `created_by`, `email`, `facebook`, `id`, `last_seen`, `locale`, `location`, `meta_description`, `meta_title`, `name`, `password`, `profile_image`, `slug`, `status`, `tour`, `twitter`, `updated_at`, `updated_by`, `visibility`, `website`) values (NULL, NULL, NULL, '2020-07-20 03:24:52', '1', 'secret', NULL, 'id', NULL, NULL, NULL, NULL, NULL, 'my name', 'secret password', NULL, 'secret', 'active', NULL, NULL, '2020-07-20 03:24:52', '1', 'public', NULL) - ER_DUP_ENTRY: Duplicate entry 'edited@email.com' for key 'users_email_unique' at new GhostError (/home/ghost/node_modules/@tryghost/errors/lib/errors.js:10:26) at _private.prepareError (/home/ghost/core/server/web/shared/middlewares/error-handler.js:53:19) at Layer.handle_error (/home/ghost/node_modules/express/lib/router/layer.js:71:5) at trim_prefix (/home/ghost/node_modules/express/lib/router/index.js:315:13) at /home/ghost/node_modules/express/lib/router/index.js:284:7 at Function.process_params (/home/ghost/node_modules/express/lib/router/index.js:335:12) ........ Error: ER_DUP_ENTRY: Duplicate entry 'edited@email.com' for key 'users_email_unique' at Query.Sequence._packetToError (/home/ghost/node_modules/mysql/lib/protocol/sequences/Sequence.js:47:14) at Query.ErrorPacket (/home/ghost/node_modules/mysql/lib/protocol/sequences/Query.js:79:18) at Protocol._parsePacket (/home/ghost/node_modules/mysql/lib/protocol/Protocol.js:291:23) at Parser._parsePacket (/home/ghost/node_modules/mysql/lib/protocol/Parser.js:433:10) ........ ``` Originating from Sentry: https://sentry.io/organizations/ghost-foundation/issues/1634959159/events/ea2ce7b6681a4c119104533d78c9520e/ ### Technical details: * Ghost Version: 3.25.0 * Node Version: 12.18.0 * Browser/OS: Firefox/Linux * Database: SQLite
1.0
Error when accepting invitation with existing email - ### Issue Summary Accepting a staff invitation email, and submitting an email of an already existing user will cause the DB to throw an error because the email would no longer be unique. The fix is likely to be to check if the email already exists. ### To Reproduce 1. Invite a new staff user 1. Open the invitation link 1. Use an email of an already existing staff member 1. :boom: Error shown below form ![image](https://user-images.githubusercontent.com/964245/87911928-5633d800-ca64-11ea-9c83-b6ca7223a939.png) ``` HTTP/1.1 500 Internal Server Error -- InternalServerError: insert into `users` (`accessibility`, `bio`, `cover_image`, `created_at`, `created_by`, `email`, `facebook`, `id`, `last_seen`, `locale`, `location`, `meta_description`, `meta_title`, `name`, `password`, `profile_image`, `slug`, `status`, `tour`, `twitter`, `updated_at`, `updated_by`, `visibility`, `website`) values (NULL, NULL, NULL, '2020-07-20 03:24:52', '1', 'secret', NULL, 'id', NULL, NULL, NULL, NULL, NULL, 'my name', 'secret password', NULL, 'secret', 'active', NULL, NULL, '2020-07-20 03:24:52', '1', 'public', NULL) - ER_DUP_ENTRY: Duplicate entry 'edited@email.com' for key 'users_email_unique' at new GhostError (/home/ghost/node_modules/@tryghost/errors/lib/errors.js:10:26) at _private.prepareError (/home/ghost/core/server/web/shared/middlewares/error-handler.js:53:19) at Layer.handle_error (/home/ghost/node_modules/express/lib/router/layer.js:71:5) at trim_prefix (/home/ghost/node_modules/express/lib/router/index.js:315:13) at /home/ghost/node_modules/express/lib/router/index.js:284:7 at Function.process_params (/home/ghost/node_modules/express/lib/router/index.js:335:12) ........ Error: ER_DUP_ENTRY: Duplicate entry 'edited@email.com' for key 'users_email_unique' at Query.Sequence._packetToError (/home/ghost/node_modules/mysql/lib/protocol/sequences/Sequence.js:47:14) at Query.ErrorPacket (/home/ghost/node_modules/mysql/lib/protocol/sequences/Query.js:79:18) at Protocol._parsePacket (/home/ghost/node_modules/mysql/lib/protocol/Protocol.js:291:23) at Parser._parsePacket (/home/ghost/node_modules/mysql/lib/protocol/Parser.js:433:10) ........ ``` Originating from Sentry: https://sentry.io/organizations/ghost-foundation/issues/1634959159/events/ea2ce7b6681a4c119104533d78c9520e/ ### Technical details: * Ghost Version: 3.25.0 * Node Version: 12.18.0 * Browser/OS: Firefox/Linux * Database: SQLite
non_test
error when accepting invitation with existing email issue summary accepting a staff invitation email and submitting an email of an already existing user will cause the db to throw an error because the email would no longer be unique the fix is likely to be to check if the email already exists to reproduce invite a new staff user open the invitation link use an email of an already existing staff member boom error shown below form http internal server error internalservererror insert into users accessibility bio cover image created at created by email facebook id last seen locale location meta description meta title name password profile image slug status tour twitter updated at updated by visibility website values null null null secret null id null null null null null my name secret password null secret active null null public null er dup entry duplicate entry edited email com for key users email unique at new ghosterror home ghost node modules tryghost errors lib errors js at private prepareerror home ghost core server web shared middlewares error handler js at layer handle error home ghost node modules express lib router layer js at trim prefix home ghost node modules express lib router index js at home ghost node modules express lib router index js at function process params home ghost node modules express lib router index js error er dup entry duplicate entry edited email com for key users email unique at query sequence packettoerror home ghost node modules mysql lib protocol sequences sequence js at query errorpacket home ghost node modules mysql lib protocol sequences query js at protocol parsepacket home ghost node modules mysql lib protocol protocol js at parser parsepacket home ghost node modules mysql lib protocol parser js originating from sentry technical details ghost version node version browser os firefox linux database sqlite
0
474,800
13,676,693,226
IssuesEvent
2020-09-29 14:11:24
DroneDB/DroneDB
https://api.github.com/repos/DroneDB/DroneDB
opened
Do not allow files with forbidden characters to be added to an index
low priority software fault
Certain paths are valid on Unix, but not on Windows (e.g. "a\b" is a valid filename on Linux, but not on Windows). Such files should be skipped from inclusion in an index.
1.0
Do not allow files with forbidden characters to be added to an index - Certain paths are valid on Unix, but not on Windows (e.g. "a\b" is a valid filename on Linux, but not on Windows). Such files should be skipped from inclusion in an index.
non_test
do not allow files with forbidden characters to be added to an index certain paths are valid on unix but not on windows e g a b is a valid filename on linux but not on windows such files should be skipped from inclusion in an index
0
324,357
27,804,066,100
IssuesEvent
2023-03-17 18:10:08
PalisadoesFoundation/talawa-admin
https://api.github.com/repos/PalisadoesFoundation/talawa-admin
closed
Rectify Errors/Warnings : `src/components/OrgPostCard/OrgPostCard.test.tsx`
bug test
**Describe the bug** There are certain warning in the console when we run the test ```OrgPostCard.test.tsx``` using the command ```yarn test OrgPostCard.test``` **To Reproduce** Steps to reproduce the behavior: 1. Open a terminal inside the Talawa Admin Project 2. Run the command ```yarn test OrgPostCard.test``` 3. The tests will pass but, there would be many warning in the terminal. #### Issue tracker for this file: Make sure to complete the following issues. * [ ] Removed the warning with statement ```This typically indicates a configuration error in your mocks setup, usually due to a typo or mismatched variable.``` * [ ] The terminal has no other warning Parent Issue: #555 ### PR Acceptance Criteria - When complete this file must show 100% coverage when merged into the code base. - [The current code coverage for the file can be found here](https://app.codecov.io/gh/PalisadoesFoundation/talawa-admin?search=&displayType=list) - The PR will show a report for the code coverage for the file you have added. You can use that as a guide **Potential internship candidates** Please read this if you are planning to apply for a Palisadoes Foundation internship https://github.com/PalisadoesFoundation/talawa/issues/359
1.0
Rectify Errors/Warnings : `src/components/OrgPostCard/OrgPostCard.test.tsx` - **Describe the bug** There are certain warning in the console when we run the test ```OrgPostCard.test.tsx``` using the command ```yarn test OrgPostCard.test``` **To Reproduce** Steps to reproduce the behavior: 1. Open a terminal inside the Talawa Admin Project 2. Run the command ```yarn test OrgPostCard.test``` 3. The tests will pass but, there would be many warning in the terminal. #### Issue tracker for this file: Make sure to complete the following issues. * [ ] Removed the warning with statement ```This typically indicates a configuration error in your mocks setup, usually due to a typo or mismatched variable.``` * [ ] The terminal has no other warning Parent Issue: #555 ### PR Acceptance Criteria - When complete this file must show 100% coverage when merged into the code base. - [The current code coverage for the file can be found here](https://app.codecov.io/gh/PalisadoesFoundation/talawa-admin?search=&displayType=list) - The PR will show a report for the code coverage for the file you have added. You can use that as a guide **Potential internship candidates** Please read this if you are planning to apply for a Palisadoes Foundation internship https://github.com/PalisadoesFoundation/talawa/issues/359
test
rectify errors warnings src components orgpostcard orgpostcard test tsx describe the bug there are certain warning in the console when we run the test orgpostcard test tsx using the command yarn test orgpostcard test to reproduce steps to reproduce the behavior open a terminal inside the talawa admin project run the command yarn test orgpostcard test the tests will pass but there would be many warning in the terminal issue tracker for this file make sure to complete the following issues removed the warning with statement this typically indicates a configuration error in your mocks setup usually due to a typo or mismatched variable the terminal has no other warning parent issue pr acceptance criteria when complete this file must show coverage when merged into the code base the pr will show a report for the code coverage for the file you have added you can use that as a guide potential internship candidates please read this if you are planning to apply for a palisadoes foundation internship
1
33,288
4,466,790,741
IssuesEvent
2016-08-25 00:31:12
dotnet/roslyn
https://api.github.com/repos/dotnet/roslyn
closed
No errors reported when out parameter of ImmutableArray type is not initialized before function returns
4 - In Review Area-Infrastructure Bug Language-C# Resolution-By Design Resolution-Fixed
in roslyn\src\ExpressionEvaluator\CSharp\Source\ExpressionCompiler\CompilationContext.cs There is a following code at the end of ExtendBinderChain function, which has out parameter ```out ImmutableArray<LocalSymbol> declaredLocals``` ``` if (declaredLocalsScopeDesignator != null) { declaredLocals = originalRootBinder.GetDeclaredLocalsForScope(declaredLocalsScopeDesignator); } else { declaredLocals = ImmutableArray<LocalSymbol>.Empty; } ``` If the ```else``` block is removed, no error reported that declaredLocals might be not fully assigned. Expected to get an error.
1.0
No errors reported when out parameter of ImmutableArray type is not initialized before function returns - in roslyn\src\ExpressionEvaluator\CSharp\Source\ExpressionCompiler\CompilationContext.cs There is a following code at the end of ExtendBinderChain function, which has out parameter ```out ImmutableArray<LocalSymbol> declaredLocals``` ``` if (declaredLocalsScopeDesignator != null) { declaredLocals = originalRootBinder.GetDeclaredLocalsForScope(declaredLocalsScopeDesignator); } else { declaredLocals = ImmutableArray<LocalSymbol>.Empty; } ``` If the ```else``` block is removed, no error reported that declaredLocals might be not fully assigned. Expected to get an error.
non_test
no errors reported when out parameter of immutablearray type is not initialized before function returns in roslyn src expressionevaluator csharp source expressioncompiler compilationcontext cs there is a following code at the end of extendbinderchain function which has out parameter out immutablearray declaredlocals if declaredlocalsscopedesignator null declaredlocals originalrootbinder getdeclaredlocalsforscope declaredlocalsscopedesignator else declaredlocals immutablearray empty if the else block is removed no error reported that declaredlocals might be not fully assigned expected to get an error
0
180,308
13,929,775,584
IssuesEvent
2020-10-22 00:33:57
backend-br/vagas
https://api.github.com/repos/backend-br/vagas
closed
[São Paulo] Back-end software engineer @ Acesso Bank
.NET ASP AWS CI CLT Docker Entity Framework MySQL NoSQL NodeJS Presencial RabbitMQ React Native SQL Scrum Stale Testes automatizados
A Acesso é uma Fintech que está há 9 anos, simplificando a movimentação financeira de pessoas e empresas! Estamos em processo de transformação de solução de pagamento para banco digital ([Acesso Bank](https://acessobank.com.br)). Mas não é qualquer banco tá? É um banco que vai propor pro mercado o que nenhum sistema bancário propôs: democratização financeira. Somos apaixonados pelo que fazemos e pelo impacto que causamos e estamos buscando #gentequepulsa e que acredite em nosso propósito! ## Sobre a vaga Como Backend Software Engineer você irá trabalhar no desenvolvimento de aplicações Web / Microsserviços e irá interagir com times multidisciplinares em reuniões de briefing e acompanhamento, além disso, como membro do time de engenharia, você vai trabalhar com programadores experientes, além de interagir com times de produto, design e BI. Em nossa stack contamos com .NET Core, NodeJS, DynamoDb, SQL Server (com Entity Framework), RabbitMQ (usando [Masstransit](https://masstransit-project.com/)), Docker, mas estamos aberto a novas tecnologias. ## Esses conhecimentos são essenciais para o seu dia-a-dia: ✔ Desenvolvimento C# .Net Core, Asp.net core ✔ OOP, SOLID e Clean Code ✔ Testes automatizados, CI e CD ✔ Desenvolvimento ágil (Scrum ou Kanban) ✔ Conhecimento em NoSQL (Mongo, DynamoDB ou DocumentsDB) ✔ Mensageria (utilizamos RabbitMQ) ✔ SQL, Entity Framework ou Dapper e Migrations (utilizamos SQL Server) ## Com certeza ajudará no seu dia-a-dia se: ⬆ ter conhecimento em NodeJS ⬆ ter conhecimento em Docker ⬆ ter conhecimento em ambiente cloud AWS ou Azure ⬆ compreender os princípios de 12 Factor Apps ⬆ conhecer algum framework JS (React, React Native ou Vue) Mas se não tiver, relaxa, o importante é estar disposto à aprender, aqui tem uma galera disposta a ensinar! 😉 ## Nossos Benefícios: 👨‍⚕ Plano Médico e Odontológico 🍽 Vale Refeição/Alimentação (Flex) 🏥 Seguro de Vida 🚌 Vale Transporte 💵 Bônus anual mediante atingimento de metas da empresa ## Employee Experience Acesso: 👔 #SerAcessoéSerVocê - Seja de bermuda, seja de gravata, seja de boné... Como? Não importa! Apenas, seja você! ✈ Worktrip: Possibilidade de uma experiência internacional de trabalho de 30 dias na Holanda ⏰ Horário flexível 🏠 Liberdade para trabalhar home office ⤴ #PRACIMA: Oportunidade de Crescimento de Carreira 📈 Avaliação de desempenho 📅 Day One: Reunião semanal para compartilhamento de resultados e evoluções da empresa 📚 Workshops de Desenvolvimento 🎓#EstudaAcesso – Programa de incentivo à cursos, pós, extensões e etc. ⚡ Trabalhamos com Metodologia Ágil 🏖 Day off de aniversário 🎉#SextouAcesso: Traga seu pet de estimação, havaiana e pantufa liberada e muita pipoca pra galera 😴 Espaço de descompressão 🚴 Bicicletário e chuveiros 🎲 Sala de Jogos 👶 Sala de Amamentação ## Contratação: CLT PRESENCIAL / REMOTO ## Como se candidatar: Envie currículo ou link para o LinkedIn para vagas.tech@acesso.com com o assunto [Seu Nome] - Backend Software Engineer no Acesso Bank **Dúvidas**: https://www.linkedin.com/in/fernandoseguim
1.0
[São Paulo] Back-end software engineer @ Acesso Bank - A Acesso é uma Fintech que está há 9 anos, simplificando a movimentação financeira de pessoas e empresas! Estamos em processo de transformação de solução de pagamento para banco digital ([Acesso Bank](https://acessobank.com.br)). Mas não é qualquer banco tá? É um banco que vai propor pro mercado o que nenhum sistema bancário propôs: democratização financeira. Somos apaixonados pelo que fazemos e pelo impacto que causamos e estamos buscando #gentequepulsa e que acredite em nosso propósito! ## Sobre a vaga Como Backend Software Engineer você irá trabalhar no desenvolvimento de aplicações Web / Microsserviços e irá interagir com times multidisciplinares em reuniões de briefing e acompanhamento, além disso, como membro do time de engenharia, você vai trabalhar com programadores experientes, além de interagir com times de produto, design e BI. Em nossa stack contamos com .NET Core, NodeJS, DynamoDb, SQL Server (com Entity Framework), RabbitMQ (usando [Masstransit](https://masstransit-project.com/)), Docker, mas estamos aberto a novas tecnologias. ## Esses conhecimentos são essenciais para o seu dia-a-dia: ✔ Desenvolvimento C# .Net Core, Asp.net core ✔ OOP, SOLID e Clean Code ✔ Testes automatizados, CI e CD ✔ Desenvolvimento ágil (Scrum ou Kanban) ✔ Conhecimento em NoSQL (Mongo, DynamoDB ou DocumentsDB) ✔ Mensageria (utilizamos RabbitMQ) ✔ SQL, Entity Framework ou Dapper e Migrations (utilizamos SQL Server) ## Com certeza ajudará no seu dia-a-dia se: ⬆ ter conhecimento em NodeJS ⬆ ter conhecimento em Docker ⬆ ter conhecimento em ambiente cloud AWS ou Azure ⬆ compreender os princípios de 12 Factor Apps ⬆ conhecer algum framework JS (React, React Native ou Vue) Mas se não tiver, relaxa, o importante é estar disposto à aprender, aqui tem uma galera disposta a ensinar! 😉 ## Nossos Benefícios: 👨‍⚕ Plano Médico e Odontológico 🍽 Vale Refeição/Alimentação (Flex) 🏥 Seguro de Vida 🚌 Vale Transporte 💵 Bônus anual mediante atingimento de metas da empresa ## Employee Experience Acesso: 👔 #SerAcessoéSerVocê - Seja de bermuda, seja de gravata, seja de boné... Como? Não importa! Apenas, seja você! ✈ Worktrip: Possibilidade de uma experiência internacional de trabalho de 30 dias na Holanda ⏰ Horário flexível 🏠 Liberdade para trabalhar home office ⤴ #PRACIMA: Oportunidade de Crescimento de Carreira 📈 Avaliação de desempenho 📅 Day One: Reunião semanal para compartilhamento de resultados e evoluções da empresa 📚 Workshops de Desenvolvimento 🎓#EstudaAcesso – Programa de incentivo à cursos, pós, extensões e etc. ⚡ Trabalhamos com Metodologia Ágil 🏖 Day off de aniversário 🎉#SextouAcesso: Traga seu pet de estimação, havaiana e pantufa liberada e muita pipoca pra galera 😴 Espaço de descompressão 🚴 Bicicletário e chuveiros 🎲 Sala de Jogos 👶 Sala de Amamentação ## Contratação: CLT PRESENCIAL / REMOTO ## Como se candidatar: Envie currículo ou link para o LinkedIn para vagas.tech@acesso.com com o assunto [Seu Nome] - Backend Software Engineer no Acesso Bank **Dúvidas**: https://www.linkedin.com/in/fernandoseguim
test
back end software engineer acesso bank a acesso é uma fintech que está há anos simplificando a movimentação financeira de pessoas e empresas estamos em processo de transformação de solução de pagamento para banco digital mas não é qualquer banco tá é um banco que vai propor pro mercado o que nenhum sistema bancário propôs democratização financeira somos apaixonados pelo que fazemos e pelo impacto que causamos e estamos buscando gentequepulsa e que acredite em nosso propósito sobre a vaga como backend software engineer você irá trabalhar no desenvolvimento de aplicações web microsserviços e irá interagir com times multidisciplinares em reuniões de briefing e acompanhamento além disso como membro do time de engenharia você vai trabalhar com programadores experientes além de interagir com times de produto design e bi em nossa stack contamos com net core nodejs dynamodb sql server com entity framework rabbitmq usando docker mas estamos aberto a novas tecnologias esses conhecimentos são essenciais para o seu dia a dia ✔ desenvolvimento c net core asp net core ✔ oop solid e clean code ✔ testes automatizados ci e cd ✔ desenvolvimento ágil scrum ou kanban ✔ conhecimento em nosql mongo dynamodb ou documentsdb ✔ mensageria utilizamos rabbitmq ✔ sql entity framework ou dapper e migrations utilizamos sql server com certeza ajudará no seu dia a dia se ⬆ ter conhecimento em nodejs ⬆ ter conhecimento em docker ⬆ ter conhecimento em ambiente cloud aws ou azure ⬆ compreender os princípios de factor apps ⬆ conhecer algum framework js react react native ou vue mas se não tiver relaxa o importante é estar disposto à aprender aqui tem uma galera disposta a ensinar 😉 nossos benefícios 👨‍⚕ plano médico e odontológico 🍽 vale refeição alimentação flex 🏥 seguro de vida 🚌 vale transporte 💵 bônus anual mediante atingimento de metas da empresa employee experience acesso 👔 seracessoéservocê seja de bermuda seja de gravata seja de boné como não importa apenas seja você ✈ worktrip possibilidade de uma experiência internacional de trabalho de dias na holanda ⏰ horário flexível 🏠 liberdade para trabalhar home office ⤴ pracima oportunidade de crescimento de carreira 📈 avaliação de desempenho 📅 day one reunião semanal para compartilhamento de resultados e evoluções da empresa 📚 workshops de desenvolvimento 🎓 estudaacesso – programa de incentivo à cursos pós extensões e etc ⚡ trabalhamos com metodologia ágil 🏖 day off de aniversário 🎉 sextouacesso traga seu pet de estimação havaiana e pantufa liberada e muita pipoca pra galera 😴 espaço de descompressão 🚴 bicicletário e chuveiros 🎲 sala de jogos 👶 sala de amamentação contratação clt presencial remoto como se candidatar envie currículo ou link para o linkedin para vagas tech acesso com com o assunto backend software engineer no acesso bank dúvidas
1
74,948
7,453,130,583
IssuesEvent
2018-03-29 10:46:26
italia/spid
https://api.github.com/repos/italia/spid
closed
Validazione Metadata Comune di San Donato Milanese
metadata nuovo md test
Richiesta dal Comune di San Donato Milanese Si prega di validare metadata per il Comune di San Donato Milanese
1.0
Validazione Metadata Comune di San Donato Milanese - Richiesta dal Comune di San Donato Milanese Si prega di validare metadata per il Comune di San Donato Milanese
test
validazione metadata comune di san donato milanese richiesta dal comune di san donato milanese si prega di validare metadata per il comune di san donato milanese
1
164,747
20,386,923,284
IssuesEvent
2022-02-22 08:06:05
mheob/itsb-web
https://api.github.com/repos/mheob/itsb-web
closed
CVE-2021-3757 (High) detected in immer-8.0.1.tgz - autoclosed
security vulnerability
## CVE-2021-3757 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>immer-8.0.1.tgz</b></p></summary> <p>Create your next immutable state by mutating the current one</p> <p>Library home page: <a href="https://registry.npmjs.org/immer/-/immer-8.0.1.tgz">https://registry.npmjs.org/immer/-/immer-8.0.1.tgz</a></p> <p>Path to dependency file: /frontend/package.json</p> <p>Path to vulnerable library: /frontend/node_modules/immer/package.json</p> <p> Dependency Hierarchy: - react-scripts-4.0.3.tgz (Root Library) - react-dev-utils-11.0.4.tgz - :x: **immer-8.0.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/mheob/itsb-web/commit/63cd64e6436a5ddc88015de41fc0c04679e53b1f">63cd64e6436a5ddc88015de41fc0c04679e53b1f</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> immer is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') <p>Publish Date: 2021-09-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3757>CVE-2021-3757</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/">https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/</a></p> <p>Release Date: 2021-09-02</p> <p>Fix Resolution (immer): 9.0.6</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.0-next.47</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-3757 (High) detected in immer-8.0.1.tgz - autoclosed - ## CVE-2021-3757 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>immer-8.0.1.tgz</b></p></summary> <p>Create your next immutable state by mutating the current one</p> <p>Library home page: <a href="https://registry.npmjs.org/immer/-/immer-8.0.1.tgz">https://registry.npmjs.org/immer/-/immer-8.0.1.tgz</a></p> <p>Path to dependency file: /frontend/package.json</p> <p>Path to vulnerable library: /frontend/node_modules/immer/package.json</p> <p> Dependency Hierarchy: - react-scripts-4.0.3.tgz (Root Library) - react-dev-utils-11.0.4.tgz - :x: **immer-8.0.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/mheob/itsb-web/commit/63cd64e6436a5ddc88015de41fc0c04679e53b1f">63cd64e6436a5ddc88015de41fc0c04679e53b1f</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> immer is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') <p>Publish Date: 2021-09-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3757>CVE-2021-3757</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/">https://huntr.dev/bounties/23d38099-71cd-42ed-a77a-71e68094adfa/</a></p> <p>Release Date: 2021-09-02</p> <p>Fix Resolution (immer): 9.0.6</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.0-next.47</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve high detected in immer tgz autoclosed cve high severity vulnerability vulnerable library immer tgz create your next immutable state by mutating the current one library home page a href path to dependency file frontend package json path to vulnerable library frontend node modules immer package json dependency hierarchy react scripts tgz root library react dev utils tgz x immer tgz vulnerable library found in head commit a href found in base branch master vulnerability details immer is vulnerable to improperly controlled modification of object prototype attributes prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution immer direct dependency fix resolution react scripts next step up your open source security game with whitesource
0
139,337
11,258,049,526
IssuesEvent
2020-01-13 02:41:41
microsoft/AzureStorageExplorer
https://api.github.com/repos/microsoft/AzureStorageExplorer
closed
The date in Release Notes is incorrect
🧪 testing
**Storage Explorer Version:** 1.12.0 **Build:** [20200109.2](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3369695&view=results) **Branch:** rel/1.12.0 **Platform/OS:** Windows 10/ Linux Ubuntu 18.04/ MacOS High Sierra **Architecture:** ia32/x64 **Regression From:** Not a regression **Steps to reproduce:** 1. Launch Storage Explorer -> Open Release Notes. 2. Check the date in Release Notes. **Expect Experience:** Show a correct date.(2020) **Actual Experience:** Show an incorrect date. ![image](https://user-images.githubusercontent.com/41351993/72147652-fdf78b80-33d9-11ea-92e6-65361e0901f8.png)
1.0
The date in Release Notes is incorrect - **Storage Explorer Version:** 1.12.0 **Build:** [20200109.2](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3369695&view=results) **Branch:** rel/1.12.0 **Platform/OS:** Windows 10/ Linux Ubuntu 18.04/ MacOS High Sierra **Architecture:** ia32/x64 **Regression From:** Not a regression **Steps to reproduce:** 1. Launch Storage Explorer -> Open Release Notes. 2. Check the date in Release Notes. **Expect Experience:** Show a correct date.(2020) **Actual Experience:** Show an incorrect date. ![image](https://user-images.githubusercontent.com/41351993/72147652-fdf78b80-33d9-11ea-92e6-65361e0901f8.png)
test
the date in release notes is incorrect storage explorer version build branch rel platform os windows linux ubuntu macos high sierra architecture regression from not a regression steps to reproduce launch storage explorer open release notes check the date in release notes expect experience show a correct date actual experience show an incorrect date
1
30,293
4,579,566,308
IssuesEvent
2016-09-18 09:03:11
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
e2e permafail on upgrade: [k8s.io] Deployment deployment should support rollback when there's replica set with no revision
component/scheduler kind/upgrade-test-failure team/ux
on [kubernetes-e2e-gce-1.3-1.4-upgrade-master](http://kubekins.dls.corp.google.com/view/Upgrade%20Test%20-%20GCE/job/kubernetes-e2e-gce-1.3-1.4-upgrade-master/5/) [k8s.io] Deployment deployment should support rollback when there's replica set with no revision /go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:79 Expected <[]extensions.ReplicaSet | len:1, cap:1>: [ { TypeMeta: {Kind: "", APIVersion: ""}, ObjectMeta: { Name: "test-rollback-no-revision-deployment-783459730", GenerateName: "", Namespace: "e2e-tests-deployment-2bt7a", SelfLink: "/apis/extensions/v1beta1/namespaces/e2e-tests-deployment-2bt7a/replicasets/test-rollback-no-revision-deployment-783459730", UID: "f32a416d-7c2e-11e6-891a-42010af00002", ResourceVersion: "17825", Generation: 5, CreationTimestamp: { Time: { sec: 63609641977, nsec: 0, loc: { name: "Local", zone: [ {name: "PDT", offset: -25200, isDST: true}, {name: "PST", offset: -28800, isDST: false}, {name: "PWT", offset: -25200, isDST: true}, {name: "PPT", offset: -25200, isDST: true}, ], tx: [ {when: -1633269600, index: 0, isstd: false, isutc: false}, {when: -1615129200, index: 1, isstd: false, isutc: false}, {when: -1601820000, index: 0, isstd: false, isutc: false}, {when: -1583679600, index: 1, isstd: true, isutc: true}, {when: -880207200, index: 2, isstd: false, isutc: false}, {when: -769395600, index: 3, isstd: false, isutc: false}, {when: -765385200, index: 1, isstd: false, isutc: false}, {when: -687967200, index: 0, isstd: false, isutc: false}, {when: -662655600, index: 1, isstd: false, isutc: false}, {when: -620834400, index: 0, isstd: false, isutc: false}, {when: -608137200, index: 1, isstd: false, isutc: false}, {when: -589384800, index: 0, isstd: false, isutc: false}, {when: -576082800, index: 1, isstd: false, isutc: false}, {when: -557935200, index: 0, isstd: false, isutc: false}, {when: -544633200, index: 1, isstd: false, isutc: false}, {when: -526485600, index: 0, isstd: false, isutc: false}, {when: -513183600, index: 1, isstd: false, isutc: false}, {when: -495036000, index: 0, isstd: false, isutc: false}, {when: -481734000, index: 1, isstd: false, isutc: false}, {when: -463586400, index: 0, isstd: false, isutc: false}, {when: -450284400, index: 1, isstd: false, isutc: false}, {when: -431532000, index: 0, isstd: false, isutc: false}, {when: -418230000, index: 1, isstd: false, isutc: false}, {when: -400082400, index: 0, isstd: false, isutc: false}, {when: -386780400, index: 1, isstd: false, isutc: false}, {when: -368632800, index: 0, isstd: false, isutc: false}, {when: -355330800, index: 1, isstd: false, isutc: false}, {when: -337183200, index: 0, isstd: false, isutc: false}, {when: -323881200, index: 1, isstd: false, isutc: false}, {when: -305733600, index: 0, isstd: false, isutc: false}, {when: -292431600, index: 1, isstd: false, isutc: false}, {when: -273679200, index: 0, isstd: false, isutc: false}, {when: -260982000, index: 1, isstd: false, isutc: false}, {when: -242229600, index: 0, isstd: false, isutc: false}, {when: -226508400, index: 1, isstd: false, isutc: false}, {when: -210780000, index: 0, isstd: false, isutc: false}, {when: -195058800, index: 1, isstd: false, isutc: false}, {when: -179330400, index: 0, isstd: false, isutc: false}, {when: -163609200, index: 1, isstd: false, isutc: false}, {when: -147880800, index: 0, isstd: false, isutc: false}, {when: -131554800, index: 1, isstd: false, isutc: false}, {when: -116431200, index: 0, isstd: false, isutc: false}, {when: -100105200, index: 1, isstd: false, isutc: false}, {when: -84376800, index: 0, isstd: false, isutc: false}, {when: -68655600, index: 1, isstd: false, isutc: false}, {when: -52927200, index: 0, isstd: false, isutc: false}, {when: -37206000, index: 1, isstd: false, isutc: false}, {when: -21477600, index: 0, isstd: false, isutc: false}, {when: -5756400, index: 1, isstd: false, isutc: false}, {when: 9972000, index: 0, isstd: false, isutc: false}, {when: 25693200, index: 1, isstd: false, isutc: false}, {when: 41421600, index: 0, isstd: false, isutc: false}, {when: 57747600, index: 1, isstd: false, isutc: false}, {when: 73476000, index: 0, isstd: false, isutc: false}, {when: 89197200, index: 1, isstd: false, isutc: false}, {when: 104925600, index: 0, isstd: false, isutc: false}, {when: 120646800, index: 1, isstd: false, isutc: false}, {when: 126698400, index: 0, isstd: false, isutc: false}, {when: 152096400, index: 1, isstd: false, isutc: false}, {when: 162381600, index: 0, isstd: false, isutc: false}, {when: 183546000, index: 1, isstd: false, isutc: false}, {when: 199274400, index: 0, isstd: false, isutc: false}, {when: 215600400, index: 1, isstd: false, isutc: false}, {when: 230724000, index: 0, isstd: false, isutc: false}, {when: 247050000, index: 1, isstd: false, isutc: false}, {when: 262778400, index: 0, isstd: false, isutc: false}, {when: 278499600, index: 1, isstd: false, isutc: false}, {when: 294228000, index: 0, isstd: false, isutc: false}, {when: 309949200, index: 1, isstd: false, isutc: false}, {when: 325677600, index: 0, isstd: false, isutc: false}, {when: 341398800, index: 1, isstd: false, isutc: false}, {when: 357127200, index: 0, isstd: false, isutc: false}, {when: 372848400, index: 1, isstd: false, isutc: false}, {when: 388576800, index: 0, isstd: false, isutc: false}, {when: 404902800, index: 1, isstd: false, isutc: false}, {when: 420026400, index: 0, isstd: false, isutc: false}, {when: 436352400, index: 1, isstd: false, isutc: false}, {when: 452080800, index: 0, isstd: false, isutc: false}, {when: 467802000, index: 1, isstd: false, isutc: false}, {when: 483530400, index: 0, isstd: false, isutc: false}, {when: 499251600, index: 1, isstd: false, isutc: false}, {when: 514980000, index: 0, isstd: false, isutc: false}, {when: 530701200, index: 1, isstd: false, isutc: false}, {when: 544615200, index: 0, isstd: false, isutc: false}, {when: 562150800, index: 1, isstd: false, isutc: false}, {when: 576064800, index: 0, isstd: false, isutc: false}, {when: 594205200, index: 1, isstd: false, isutc: false}, {when: 607514400, index: 0, isstd: false, isutc: false}, {when: 625654800, index: 1, isstd: false, isutc: false}, {when: 638964000, index: 0, isstd: false, isutc: false}, {when: 657104400, index: 1, isstd: false, isutc: false}, {when: 671018400, index: 0, isstd: false, isutc: false}, {when: 688554000, index: 1, isstd: false, isutc: false}, {when: 702468000, index: 0, isstd: false, isutc: false}, {when: 720003600, index: 1, isstd: false, isutc: false}, {when: 733917600, index: 0, isstd: false, isutc: false}, {when: 752058000, index: 1, isstd: false, isutc: false}, {when: 765367200, index: 0, isstd: false, isutc: false}, {when: 783507600, index: 1, isstd: false, isutc: false}, {when: 796816800, index: 0, isstd: false, isutc: false}, {when: 814957200, index: 1, isstd: false, isutc: false}, {when: 828871200, index: 0, isstd: false, isutc: false}, {when: 846406800, index: 1, isstd: false, isutc: false}, {when: 860320800, index: 0, isstd: false, isutc: false}, {when: 877856400, index: 1, isstd: false, isutc: false}, {when: 891770400, index: 0, isstd: false, isutc: false}, {when: 909306000, index: 1, isstd: false, isutc: false}, {when: 923220000, index: 0, isstd: false, isutc: false}, {when: 941360400, index: 1, isstd: false, isutc: false}, {when: 954669600, index: 0, isstd: false, isutc: false}, {when: 972810000, index: 1, isstd: false, isutc: false}, {when: 986119200, index: 0, isstd: false, isutc: false}, {when: 1004259600, index: 1, isstd: false, isutc: false}, {when: 1018173600, index: 0, isstd: false, isutc: false}, {when: 1035709200, index: 1, isstd: false, isutc: false}, {when: 1049623200, index: 0, isstd: false, isutc: false}, {when: 1067158800, index: 1, isstd: false, isutc: false}, {when: 1081072800, index: 0, isstd: false, isutc: false}, {when: 1099213200, index: 1, isstd: false, isutc: false}, {when: 1112522400, index: 0, isstd: false, isutc: false}, {when: 1130662800, index: 1, isstd: false, isutc: false}, {when: 1143972000, index: 0, isstd: false, isutc: false}, {when: 1162112400, index: 1, isstd: false, isutc: false}, {when: 1173607200, index: 0, isstd: false, isutc: false}, {when: 1194166800, index: 1, isstd: false, isutc: false}, {when: 1205056800, index: 0, isstd: false, isutc: false}, {when: 1225616400, index: 1, isstd: false, isutc: false}, {when: 1236506400, index: 0, isstd: false, isutc: false}, {when: 1257066000, index: 1, isstd: false, isutc: false}, {when: 1268560800, index: 0, isstd: false, isutc: false}, {when: 1289120400, index: 1, isstd: false, isutc: false}, {when: 1300010400, index: 0, isstd: false, isutc: false}, {when: 1320570000, index: 1, isstd: false, isutc: false}, {when: 1331460000, index: 0, isstd: false, isutc: false}, {when: 1352019600, index: 1, isstd: false, isutc: false}, {when: 1362909600, index: 0, isstd: false, isutc: false}, {when: 1383469200, index: 1, isstd: false, isutc: false}, {when: 1394359200, index: 0, isstd: false, isutc: false}, {when: 1414918800, index: 1, isstd: false, isutc: false}, {when: 1425808800, index: 0, isstd: false, isutc: false}, {when: 1446368400, index: 1, isstd: false, isutc: false}, {when: 1457863200, index: 0, isstd: false, isutc: false}, {when: 1478422800, index: 1, isstd: false, isutc: false}, {when: 1489312800, index: 0, isstd: false, isutc: false}, {when: 1509872400, index: 1, isstd: false, isutc: false}, {when: 1520762400, index: 0, isstd: false, isutc: false}, {when: 1541322000, index: 1, isstd: false, isutc: false}, {when: 1552212000, index: 0, isstd: false, isutc: false}, {when: 1572771600, index: 1, isstd: false, isutc: false}, {when: 1583661600, index: 0, isstd: false, isutc: false}, {when: 1604221200, index: 1, isstd: false, isutc: false}, {when: 1615716000, index: 0, isstd: false, isutc: false}, {when: 1636275600, index: 1, isstd: false, isutc: false}, {when: 1647165600, index: 0, isstd: false, isutc: false}, {when: 1667725200, index: 1, isstd: false, isutc: false}, {when: 1678615200, index: 0, isstd: false, isutc: false}, {when: 1699174800, index: 1, isstd: false, isutc: false}, {when: 1710064800, index: 0, isstd: false, isutc: false}, {when: 1730624400, index: 1, isstd: false, isutc: false}, {when: 1741514400, index: 0, isstd: false, isutc: false}, {when: 1762074000, index: 1, isstd: false, isutc: false}, {when: 1772964000, index: 0, isstd: false, isutc: false}, {when: 1793523600, index: 1, isstd: false, isutc: false}, {when: 1805018400, index: 0, isstd: false, isutc: false}, {when: 1825578000, index: 1, isstd: false, isutc: false}, {when: 1836468000, index: 0, isstd: false, isutc: false}, {when: 1857027600, index: 1, isstd: false, isutc: false}, {when: 1867917600, index: 0, isstd: false, isutc: false}, {when: 1888477200, index: 1, isstd: false, isutc: false}, {when: 1899367200, index: 0, isstd: false, isutc: false}, {when: 1919926800, index: 1, isstd: false, isutc: false}, {when: 1930816800, index: 0, isstd: false, isutc: false}, {when: 1951376400, index: 1, isstd: false, isutc: false}, {when: 1962871200, index: 0, isstd: false, isutc: false}, {when: 1983430800, index: 1, isstd: false, isutc: false}, {when: 1994320800, index: 0, isstd: false, isutc: false}, {when: 2014880400, index: 1, isstd: false, isutc: false}, {when: 2025770400, index: 0, isstd: false, isutc: false}, {when: 2046330000, index: 1, isstd: false, isutc: false}, {when: 2057220000, index: 0, isstd: false, isutc: false}, {when: 2077779600, index: 1, isstd: false, isutc: false}, {when: 2088669600, index: 0, isstd: false, isutc: false}, {when: 2109229200, index: 1, isstd: false, isutc: false}, {when: 2120119200, index: 0, isstd: false, isutc: false}, {when: 2140678800, index: 1, isstd: false, isutc: false}, ], cacheStart: 1457863200, cacheEnd: 1478422800, cacheZone: {name: "PDT", offset: -25200, isDST: true}, }, }, }, DeletionTimestamp: { Time: { sec: 63609642001, nsec: 0, loc: { name: "Local", zone: [ {name: "PDT", offset: -25200, isDST: true}, {name: "PST", offset: -28800, isDST: false}, {name: "PWT", offset: -25200, isDST: true}, {name: "PPT", offset: -25200, isDST: true}, ], tx: [ {when: -1633269600, index: 0, isstd: false, isutc: false}, {when: -1615129200, index: 1, isstd: false, isutc: false}, {when: -1601820000, index: 0, isstd: false, isutc: false}, {when: -1583679600, index: 1, isstd: true, isutc: true}, {when: -880207200, index: 2, isstd: false, isutc: false}, {when: -769395600, index: 3, isstd: false, isutc: false}, {when: -765385200, index: 1, isstd: false, isutc: false}, {when: -687967200, index: 0, isstd: false, isutc: false}, {when: -662655600, index: 1, isstd: false, isutc: false}, {when: -620834400, index: 0, isstd: false, isutc: false}, {when: -608137200, index: 1, isstd: false, isutc: false}, {when: -589384800, index: 0, isstd: false, isutc: false}, {when: -576082800, index: 1, isstd: false, isutc: false}, {when: -557935200, index: 0, isstd: false, isutc: false}, {when: -544633200, index: 1, isstd: false, isutc: false}, {when: -526485600, index: 0, isstd: false, isutc: false}, {when: -513183600, index: 1, isstd: false, isutc: false}, {when: -495036000, index: 0, isstd: false, isutc: false}, {when: -481734000, index: 1, isstd: false, isutc: false}, {when: -463586400, index: 0, isstd: false, isutc: false}, {when: -450284400, index: 1, isstd: false, isutc: false}, {when: -431532000, index: 0, isstd: false, isutc: false}, {when: -418230000, index: 1, isstd: false, isutc: false}, {when: -400082400, index: 0, isstd: false, isutc: false}, {when: -386780400, index: 1, isstd: false, isutc: false}, {when: -368632800, index: 0, isstd: false, isutc: false}, {when: -355330800, index: 1, isstd: false, isutc: false}, {when: -337183200, index: 0, isstd: false, isutc: false}, {when: -323881200, index: 1, isstd: false, isutc: false}, {when: -305733600, index: 0, isstd: false, isutc: false}, {when: -292431600, index: 1, isstd: false, isutc: false}, {when: -273679200, index: 0, isstd: false, isutc: false}, {when: -260982000, index: 1, isstd: false, isutc: false}, {when: -242229600, index: 0, isstd: false, isutc: false}, {when: -226508400, index: 1, isstd: false, isutc: false}, {when: -210780000, index: 0, isstd: false, isutc: false}, {when: -195058800, index: 1, isstd: false, isutc: false}, {when: -179330400, index: 0, isstd: false, isutc: false}, {when: -163609200, index: 1, isstd: false, isutc: false}, {when: -147880800, index: 0, isstd: false, isutc: false}, {when: -131554800, index: 1, isstd: false, isutc: false}, {when: -116431200, index: 0, isstd: false, isutc: false}, {when: -100105200, index: 1, isstd: false, isutc: false}, {when: -84376800, index: 0, isstd: false, isutc: false}, {when: -68655600, index: 1, isstd: false, isutc: false}, {when: -52927200, index: 0, isstd: false, isutc: false}, {when: -37206000, index: 1, isstd: false, isutc: false}, {when: -21477600, index: 0, isstd: false, isutc: false}, {when: -5756400, index: 1, isstd: false, isutc: false}, {when: 9972000, index: 0, isstd: false, isutc: false}, {when: 25693200, index: 1, isstd: false, isutc: false}, {when: 41421600, index: 0, isstd: false, isutc: false}, {when: 57747600, index: 1, isstd: false, isutc: false}, {when: 73476000, index: 0, isstd: false, isutc: false}, {when: 89197200, index: 1, isstd: false, isutc: false}, {when: 104925600, index: 0, isstd: false, isutc: false}, {when: 120646800, index: 1, isstd: false, isutc: false}, {when: 126698400, index: 0, isstd: false, isutc: false}, {when: 152096400, index: 1, isstd: false, isutc: false}, {when: 162381600, index: 0, isstd: false, isutc: false}, {when: 183546000, index: 1, isstd: false, isutc: false}, {when: 199274400, index: 0, isstd: false, isutc: false}, {when: 215600400, index: 1, isstd: false, isutc: false}, {when: 230724000, index: 0, isstd: false, isutc: false}, {when: 247050000, index: 1, isstd: false, isutc: false}, {when: 262778400, index: 0, isstd: false, isutc: false}, {when: 278499600, index: 1, isstd: false, isutc: false}, {when: 294228000, index: 0, isstd: false, isutc: false}, {when: 309949200, index: 1, isstd: false, isutc: false}, {when: 325677600, index: 0, isstd: false, isutc: false}, {when: 341398800, index: 1, isstd: false, isutc: false}, {when: 357127200, index: 0, isstd: false, isutc: false}, {when: 372848400, index: 1, isstd: false, isutc: false}, {when: 388576800, index: 0, isstd: false, isutc: false}, {when: 404902800, index: 1, isstd: false, isutc: false}, {when: 420026400, index: 0, isstd: false, isutc: false}, {when: 436352400, index: 1, isstd: false, isutc: false}, {when: 452080800, index: 0, isstd: false, isutc: false}, {when: 467802000, index: 1, isstd: false, isutc: false}, {when: 483530400, index: 0, isstd: false, isutc: false}, {when: 499251600, index: 1, isstd: false, isutc: false}, {when: 514980000, index: 0, isstd: false, isutc: false}, {when: 530701200, index: 1, isstd: false, isutc: false}, {when: 544615200, index: 0, isstd: false, isutc: false}, {when: 562150800, index: 1, isstd: false, isutc: false}, {when: 576064800, index: 0, isstd: false, isutc: false}, {when: 594205200, index: 1, isstd: false, isutc: false}, {when: 607514400, index: 0, isstd: false, isutc: false}, {when: 625654800, index: 1, isstd: false, isutc: false}, {when: 638964000, index: 0, isstd: false, isutc: false}, {when: 657104400, index: 1, isstd: false, isutc: false}, {when: 671018400, index: 0, isstd: false, isutc: false}, {when: 688554000, index: 1, isstd: false, isutc: false}, {when: 702468000, index: 0, isstd: false, isutc: false}, {when: 720003600, index: 1, isstd: false, isutc: false}, {when: 733917600, index: 0, isstd: false, isutc: false}, {when: 752058000, index: 1, isstd: false, isutc: false}, {when: 765367200, index: 0, isstd: false, isutc: false}, {when: 783507600, index: 1, isstd: false, isutc: false}, {when: 796816800, index: 0, isstd: false, isutc: false}, {when: 814957200, index: 1, isstd: false, isutc: false}, {when: 828871200, index: 0, isstd: false, isutc: false}, {when: 846406800, index: 1, isstd: false, isutc: false}, {when: 860320800, index: 0, isstd: false, isutc: false}, {when: 877856400, index: 1, isstd: false, isutc: false}, {when: 891770400, index: 0, isstd: false, isutc: false}, {when: 909306000, index: 1, isstd: false, isutc: false}, {when: 923220000, index: 0, isstd: false, isutc: false}, {when: 941360400, index: 1, isstd: false, isutc: false}, {when: 954669600, index: 0, isstd: false, isutc: false}, {when: 972810000, index: 1, isstd: false, isutc: false}, {when: 986119200, index: 0, isstd: false, isutc: false}, {when: 1004259600, index: 1, isstd: false, isutc: false}, {when: 1018173600, index: 0, isstd: false, isutc: false}, {when: 1035709200, index: 1, isstd: false, isutc: false}, {when: 1049623200, index: 0, isstd: false, isutc: false}, {when: 1067158800, index: 1, isstd: false, isutc: false}, {when: 1081072800, index: 0, isstd: false, isutc: false}, {when: 1099213200, index: 1, isstd: false, isutc: false}, {when: 1112522400, index: 0, isstd: false, isutc: false}, {when: 1130662800, index: 1, isstd: false, isutc: false}, {when: 1143972000, index: 0, isstd: false, isutc: false}, {when: 1162112400, index: 1, isstd: false, isutc: false}, {when: 1173607200, index: 0, isstd: false, isutc: false}, {when: 1194166800, index: 1, isstd: false, isutc: false}, {when: 1205056800, index: 0, isstd: false, isutc: false}, {when: 1225616400, index: 1, isstd: false, isutc: false}, {when: 1236506400, index: 0, isstd: false, isutc: false}, {when: 1257066000, index: 1, isstd: false, isutc: false}, {when: 1268560800, index: 0, isstd: false, isutc: false}, {when: 1289120400, index: 1, isstd: false, isutc: false}, {when: 1300010400, index: 0, isstd: false, isutc: false}, {when: 1320570000, index: 1, isstd: false, isutc: false}, {when: 1331460000, index: 0, isstd: false, isutc: false}, {when: 1352019600, index: 1, isstd: false, isutc: false}, {when: 1362909600, index: 0, isstd: false, isutc: false}, {when: 1383469200, index: 1, isstd: false, isutc: false}, {when: 1394359200, index: 0, isstd: false, isutc: false}, {when: 1414918800, index: 1, isstd: false, isutc: false}, {when: 1425808800, index: 0, isstd: false, isutc: false}, {when: 1446368400, index: 1, isstd: false, isutc: false}, {when: 1457863200, index: 0, isstd: false, isutc: false}, {when: 1478422800, index: 1, isstd: false, isutc: false}, {when: 1489312800, index: 0, isstd: false, isutc: false}, {when: 1509872400, index: 1, isstd: false, isutc: false}, {when: 1520762400, index: 0, isstd: false, isutc: false}, {when: 1541322000, index: 1, isstd: false, isutc: false}, {when: 1552212000, index: 0, isstd: false, isutc: false}, {when: 1572771600, index: 1, isstd: false, isutc: false}, {when: 1583661600, index: 0, isstd: false, isutc: false}, {when: 1604221200, index: 1, isstd: false, isutc: false}, {when: 1615716000, index: 0, isstd: false, isutc: false}, {when: 1636275600, index: 1, isstd: false, isutc: false}, {when: 1647165600, index: 0, isstd: false, isutc: false}, {when: 1667725200, index: 1, isstd: false, isutc: false}, {when: 1678615200, index: 0, isstd: false, isutc: false}, {when: 1699174800, index: 1, isstd: false, isutc: false}, {when: 1710064800, index: 0, isstd: false, isutc: false}, {when: 1730624400, index: 1, isstd: false, isutc: false}, {when: 1741514400, index: 0, isstd: false, isutc: false}, {when: 1762074000, index: 1, isstd: false, isutc: false}, {when: 1772964000, index: 0, isstd: false, isutc: false}, {when: 1793523600, index: 1, isstd: false, isutc: false}, {when: 1805018400, index: 0, isstd: false, isutc: false}, {when: 1825578000, index: 1, isstd: false, isutc: false}, {when: 1836468000, index: 0, isstd: false, isutc: false}, {when: 1857027600, index: 1, isstd: false, isutc: false}, {when: 1867917600, index: 0, isstd: false, isutc: false}, {when: 1888477200, index: 1, isstd: false, isutc: false}, {when: 1899367200, index: 0, isstd: false, isutc: false}, {when: 1919926800, index: 1, isstd: false, isutc: false}, {when: 1930816800, index: 0, isstd: false, isutc: false}, {when: 1951376400, index: 1, isstd: false, isutc: false}, {when: 1962871200, index: 0, isstd: false, isutc: false}, {when: 1983430800, index: 1, isstd: false, isutc: false}, {when: 1994320800, index: 0, isstd: false, isutc: false}, {when: 2014880400, index: 1, isstd: false, isutc: false}, {when: 2025770400, index: 0, isstd: false, isutc: false}, {when: 2046330000, index: 1, isstd: false, isutc: false}, {when: 2057220000, index: 0, isstd: false, isutc: false}, {when: 2077779600, index: 1, isstd: false, isutc: false}, {when: 2088669600, index: 0, isstd: false, isutc: false}, {when: 2109229200, index: 1, isstd: false, isutc: false}, {when: 2120119200, index: 0, isstd: false, isutc: false}, {when: 2140678800, index: 1, isstd: false, isutc: false}, ], cacheStart: 1457863200, cacheEnd: 1478422800, cacheZone: {name: "PDT", offset: -25200, isDST: true}, }, }, }, DeletionGracePeriodSeconds: 0, Labels: { "name": "nginx", "pod-template-hash": "783459730", }, Annotations: { "deployment.kubernetes.io/desired-replicas": "0", "deployment.kubernetes.io/max-replicas": "1", "deployment.kubernetes.io/revision": "3", }, OwnerReferences: nil, Finalizers: ["orphan"], }, Spec: { Replicas: 0, Selector: { MatchLabels: { "name": "nginx", "pod-template-hash": "783459730", }, MatchExpressions: nil, }, Template: { ObjectMeta: { Name: "", GenerateName: "", Namespace: "", SelfLink: "", UID: "", ResourceVersion: "", Generation: 0, CreationTimestamp: { Time: {sec: 0, nsec: 0, loc: nil}, }, DeletionTimestamp: nil, DeletionGracePeriodSeconds: nil, Labels: { "pod-template-hash": "783459730", "name": "nginx", }, Annotations: nil, OwnerReferences: nil, Finalizers: nil, }, Spec: { Volumes: nil, InitContainers: nil, Containers: [ { Name: "nginx", Image: "gcr.io/google_containers/nginx:1.7.9", Command: nil, Args: nil, WorkingDir: "", Ports: nil, Env: nil, Resources: {Limits: nil, Requests: nil}, VolumeMounts: nil, LivenessProbe: nil, ReadinessProbe: nil, Lifecycle: nil, TerminationMessagePath: "/dev/termination-log", ImagePullPolicy: "IfNotPresent", SecurityContext: nil, Stdin: false, StdinOnce: false, TTY: false, }, ], RestartPolicy: "Always", TerminationGracePeriodSeconds: 0, ActiveDeadlineSeconds: nil, DNSPolicy: "ClusterFirst", NodeSelector: nil, ServiceAccountName: "", NodeName: "", SecurityContext: { HostNetwork: false, HostPID: false, HostIPC: false, SELinuxOptions: nil, RunAsUser: nil, RunAsNonRoot: nil, SupplementalGroups: nil, FSGroup: nil, }, ImagePullSecrets: nil, Hostname: "", Subdomain: "", }, }, }, Status: { Replicas: 0, FullyLabeledReplicas: 0, ObservedGeneration: 5, }, }, ] to have length 0
1.0
e2e permafail on upgrade: [k8s.io] Deployment deployment should support rollback when there's replica set with no revision - on [kubernetes-e2e-gce-1.3-1.4-upgrade-master](http://kubekins.dls.corp.google.com/view/Upgrade%20Test%20-%20GCE/job/kubernetes-e2e-gce-1.3-1.4-upgrade-master/5/) [k8s.io] Deployment deployment should support rollback when there's replica set with no revision /go/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/test/e2e/deployment.go:79 Expected <[]extensions.ReplicaSet | len:1, cap:1>: [ { TypeMeta: {Kind: "", APIVersion: ""}, ObjectMeta: { Name: "test-rollback-no-revision-deployment-783459730", GenerateName: "", Namespace: "e2e-tests-deployment-2bt7a", SelfLink: "/apis/extensions/v1beta1/namespaces/e2e-tests-deployment-2bt7a/replicasets/test-rollback-no-revision-deployment-783459730", UID: "f32a416d-7c2e-11e6-891a-42010af00002", ResourceVersion: "17825", Generation: 5, CreationTimestamp: { Time: { sec: 63609641977, nsec: 0, loc: { name: "Local", zone: [ {name: "PDT", offset: -25200, isDST: true}, {name: "PST", offset: -28800, isDST: false}, {name: "PWT", offset: -25200, isDST: true}, {name: "PPT", offset: -25200, isDST: true}, ], tx: [ {when: -1633269600, index: 0, isstd: false, isutc: false}, {when: -1615129200, index: 1, isstd: false, isutc: false}, {when: -1601820000, index: 0, isstd: false, isutc: false}, {when: -1583679600, index: 1, isstd: true, isutc: true}, {when: -880207200, index: 2, isstd: false, isutc: false}, {when: -769395600, index: 3, isstd: false, isutc: false}, {when: -765385200, index: 1, isstd: false, isutc: false}, {when: -687967200, index: 0, isstd: false, isutc: false}, {when: -662655600, index: 1, isstd: false, isutc: false}, {when: -620834400, index: 0, isstd: false, isutc: false}, {when: -608137200, index: 1, isstd: false, isutc: false}, {when: -589384800, index: 0, isstd: false, isutc: false}, {when: -576082800, index: 1, isstd: false, isutc: false}, {when: -557935200, index: 0, isstd: false, isutc: false}, {when: -544633200, index: 1, isstd: false, isutc: false}, {when: -526485600, index: 0, isstd: false, isutc: false}, {when: -513183600, index: 1, isstd: false, isutc: false}, {when: -495036000, index: 0, isstd: false, isutc: false}, {when: -481734000, index: 1, isstd: false, isutc: false}, {when: -463586400, index: 0, isstd: false, isutc: false}, {when: -450284400, index: 1, isstd: false, isutc: false}, {when: -431532000, index: 0, isstd: false, isutc: false}, {when: -418230000, index: 1, isstd: false, isutc: false}, {when: -400082400, index: 0, isstd: false, isutc: false}, {when: -386780400, index: 1, isstd: false, isutc: false}, {when: -368632800, index: 0, isstd: false, isutc: false}, {when: -355330800, index: 1, isstd: false, isutc: false}, {when: -337183200, index: 0, isstd: false, isutc: false}, {when: -323881200, index: 1, isstd: false, isutc: false}, {when: -305733600, index: 0, isstd: false, isutc: false}, {when: -292431600, index: 1, isstd: false, isutc: false}, {when: -273679200, index: 0, isstd: false, isutc: false}, {when: -260982000, index: 1, isstd: false, isutc: false}, {when: -242229600, index: 0, isstd: false, isutc: false}, {when: -226508400, index: 1, isstd: false, isutc: false}, {when: -210780000, index: 0, isstd: false, isutc: false}, {when: -195058800, index: 1, isstd: false, isutc: false}, {when: -179330400, index: 0, isstd: false, isutc: false}, {when: -163609200, index: 1, isstd: false, isutc: false}, {when: -147880800, index: 0, isstd: false, isutc: false}, {when: -131554800, index: 1, isstd: false, isutc: false}, {when: -116431200, index: 0, isstd: false, isutc: false}, {when: -100105200, index: 1, isstd: false, isutc: false}, {when: -84376800, index: 0, isstd: false, isutc: false}, {when: -68655600, index: 1, isstd: false, isutc: false}, {when: -52927200, index: 0, isstd: false, isutc: false}, {when: -37206000, index: 1, isstd: false, isutc: false}, {when: -21477600, index: 0, isstd: false, isutc: false}, {when: -5756400, index: 1, isstd: false, isutc: false}, {when: 9972000, index: 0, isstd: false, isutc: false}, {when: 25693200, index: 1, isstd: false, isutc: false}, {when: 41421600, index: 0, isstd: false, isutc: false}, {when: 57747600, index: 1, isstd: false, isutc: false}, {when: 73476000, index: 0, isstd: false, isutc: false}, {when: 89197200, index: 1, isstd: false, isutc: false}, {when: 104925600, index: 0, isstd: false, isutc: false}, {when: 120646800, index: 1, isstd: false, isutc: false}, {when: 126698400, index: 0, isstd: false, isutc: false}, {when: 152096400, index: 1, isstd: false, isutc: false}, {when: 162381600, index: 0, isstd: false, isutc: false}, {when: 183546000, index: 1, isstd: false, isutc: false}, {when: 199274400, index: 0, isstd: false, isutc: false}, {when: 215600400, index: 1, isstd: false, isutc: false}, {when: 230724000, index: 0, isstd: false, isutc: false}, {when: 247050000, index: 1, isstd: false, isutc: false}, {when: 262778400, index: 0, isstd: false, isutc: false}, {when: 278499600, index: 1, isstd: false, isutc: false}, {when: 294228000, index: 0, isstd: false, isutc: false}, {when: 309949200, index: 1, isstd: false, isutc: false}, {when: 325677600, index: 0, isstd: false, isutc: false}, {when: 341398800, index: 1, isstd: false, isutc: false}, {when: 357127200, index: 0, isstd: false, isutc: false}, {when: 372848400, index: 1, isstd: false, isutc: false}, {when: 388576800, index: 0, isstd: false, isutc: false}, {when: 404902800, index: 1, isstd: false, isutc: false}, {when: 420026400, index: 0, isstd: false, isutc: false}, {when: 436352400, index: 1, isstd: false, isutc: false}, {when: 452080800, index: 0, isstd: false, isutc: false}, {when: 467802000, index: 1, isstd: false, isutc: false}, {when: 483530400, index: 0, isstd: false, isutc: false}, {when: 499251600, index: 1, isstd: false, isutc: false}, {when: 514980000, index: 0, isstd: false, isutc: false}, {when: 530701200, index: 1, isstd: false, isutc: false}, {when: 544615200, index: 0, isstd: false, isutc: false}, {when: 562150800, index: 1, isstd: false, isutc: false}, {when: 576064800, index: 0, isstd: false, isutc: false}, {when: 594205200, index: 1, isstd: false, isutc: false}, {when: 607514400, index: 0, isstd: false, isutc: false}, {when: 625654800, index: 1, isstd: false, isutc: false}, {when: 638964000, index: 0, isstd: false, isutc: false}, {when: 657104400, index: 1, isstd: false, isutc: false}, {when: 671018400, index: 0, isstd: false, isutc: false}, {when: 688554000, index: 1, isstd: false, isutc: false}, {when: 702468000, index: 0, isstd: false, isutc: false}, {when: 720003600, index: 1, isstd: false, isutc: false}, {when: 733917600, index: 0, isstd: false, isutc: false}, {when: 752058000, index: 1, isstd: false, isutc: false}, {when: 765367200, index: 0, isstd: false, isutc: false}, {when: 783507600, index: 1, isstd: false, isutc: false}, {when: 796816800, index: 0, isstd: false, isutc: false}, {when: 814957200, index: 1, isstd: false, isutc: false}, {when: 828871200, index: 0, isstd: false, isutc: false}, {when: 846406800, index: 1, isstd: false, isutc: false}, {when: 860320800, index: 0, isstd: false, isutc: false}, {when: 877856400, index: 1, isstd: false, isutc: false}, {when: 891770400, index: 0, isstd: false, isutc: false}, {when: 909306000, index: 1, isstd: false, isutc: false}, {when: 923220000, index: 0, isstd: false, isutc: false}, {when: 941360400, index: 1, isstd: false, isutc: false}, {when: 954669600, index: 0, isstd: false, isutc: false}, {when: 972810000, index: 1, isstd: false, isutc: false}, {when: 986119200, index: 0, isstd: false, isutc: false}, {when: 1004259600, index: 1, isstd: false, isutc: false}, {when: 1018173600, index: 0, isstd: false, isutc: false}, {when: 1035709200, index: 1, isstd: false, isutc: false}, {when: 1049623200, index: 0, isstd: false, isutc: false}, {when: 1067158800, index: 1, isstd: false, isutc: false}, {when: 1081072800, index: 0, isstd: false, isutc: false}, {when: 1099213200, index: 1, isstd: false, isutc: false}, {when: 1112522400, index: 0, isstd: false, isutc: false}, {when: 1130662800, index: 1, isstd: false, isutc: false}, {when: 1143972000, index: 0, isstd: false, isutc: false}, {when: 1162112400, index: 1, isstd: false, isutc: false}, {when: 1173607200, index: 0, isstd: false, isutc: false}, {when: 1194166800, index: 1, isstd: false, isutc: false}, {when: 1205056800, index: 0, isstd: false, isutc: false}, {when: 1225616400, index: 1, isstd: false, isutc: false}, {when: 1236506400, index: 0, isstd: false, isutc: false}, {when: 1257066000, index: 1, isstd: false, isutc: false}, {when: 1268560800, index: 0, isstd: false, isutc: false}, {when: 1289120400, index: 1, isstd: false, isutc: false}, {when: 1300010400, index: 0, isstd: false, isutc: false}, {when: 1320570000, index: 1, isstd: false, isutc: false}, {when: 1331460000, index: 0, isstd: false, isutc: false}, {when: 1352019600, index: 1, isstd: false, isutc: false}, {when: 1362909600, index: 0, isstd: false, isutc: false}, {when: 1383469200, index: 1, isstd: false, isutc: false}, {when: 1394359200, index: 0, isstd: false, isutc: false}, {when: 1414918800, index: 1, isstd: false, isutc: false}, {when: 1425808800, index: 0, isstd: false, isutc: false}, {when: 1446368400, index: 1, isstd: false, isutc: false}, {when: 1457863200, index: 0, isstd: false, isutc: false}, {when: 1478422800, index: 1, isstd: false, isutc: false}, {when: 1489312800, index: 0, isstd: false, isutc: false}, {when: 1509872400, index: 1, isstd: false, isutc: false}, {when: 1520762400, index: 0, isstd: false, isutc: false}, {when: 1541322000, index: 1, isstd: false, isutc: false}, {when: 1552212000, index: 0, isstd: false, isutc: false}, {when: 1572771600, index: 1, isstd: false, isutc: false}, {when: 1583661600, index: 0, isstd: false, isutc: false}, {when: 1604221200, index: 1, isstd: false, isutc: false}, {when: 1615716000, index: 0, isstd: false, isutc: false}, {when: 1636275600, index: 1, isstd: false, isutc: false}, {when: 1647165600, index: 0, isstd: false, isutc: false}, {when: 1667725200, index: 1, isstd: false, isutc: false}, {when: 1678615200, index: 0, isstd: false, isutc: false}, {when: 1699174800, index: 1, isstd: false, isutc: false}, {when: 1710064800, index: 0, isstd: false, isutc: false}, {when: 1730624400, index: 1, isstd: false, isutc: false}, {when: 1741514400, index: 0, isstd: false, isutc: false}, {when: 1762074000, index: 1, isstd: false, isutc: false}, {when: 1772964000, index: 0, isstd: false, isutc: false}, {when: 1793523600, index: 1, isstd: false, isutc: false}, {when: 1805018400, index: 0, isstd: false, isutc: false}, {when: 1825578000, index: 1, isstd: false, isutc: false}, {when: 1836468000, index: 0, isstd: false, isutc: false}, {when: 1857027600, index: 1, isstd: false, isutc: false}, {when: 1867917600, index: 0, isstd: false, isutc: false}, {when: 1888477200, index: 1, isstd: false, isutc: false}, {when: 1899367200, index: 0, isstd: false, isutc: false}, {when: 1919926800, index: 1, isstd: false, isutc: false}, {when: 1930816800, index: 0, isstd: false, isutc: false}, {when: 1951376400, index: 1, isstd: false, isutc: false}, {when: 1962871200, index: 0, isstd: false, isutc: false}, {when: 1983430800, index: 1, isstd: false, isutc: false}, {when: 1994320800, index: 0, isstd: false, isutc: false}, {when: 2014880400, index: 1, isstd: false, isutc: false}, {when: 2025770400, index: 0, isstd: false, isutc: false}, {when: 2046330000, index: 1, isstd: false, isutc: false}, {when: 2057220000, index: 0, isstd: false, isutc: false}, {when: 2077779600, index: 1, isstd: false, isutc: false}, {when: 2088669600, index: 0, isstd: false, isutc: false}, {when: 2109229200, index: 1, isstd: false, isutc: false}, {when: 2120119200, index: 0, isstd: false, isutc: false}, {when: 2140678800, index: 1, isstd: false, isutc: false}, ], cacheStart: 1457863200, cacheEnd: 1478422800, cacheZone: {name: "PDT", offset: -25200, isDST: true}, }, }, }, DeletionTimestamp: { Time: { sec: 63609642001, nsec: 0, loc: { name: "Local", zone: [ {name: "PDT", offset: -25200, isDST: true}, {name: "PST", offset: -28800, isDST: false}, {name: "PWT", offset: -25200, isDST: true}, {name: "PPT", offset: -25200, isDST: true}, ], tx: [ {when: -1633269600, index: 0, isstd: false, isutc: false}, {when: -1615129200, index: 1, isstd: false, isutc: false}, {when: -1601820000, index: 0, isstd: false, isutc: false}, {when: -1583679600, index: 1, isstd: true, isutc: true}, {when: -880207200, index: 2, isstd: false, isutc: false}, {when: -769395600, index: 3, isstd: false, isutc: false}, {when: -765385200, index: 1, isstd: false, isutc: false}, {when: -687967200, index: 0, isstd: false, isutc: false}, {when: -662655600, index: 1, isstd: false, isutc: false}, {when: -620834400, index: 0, isstd: false, isutc: false}, {when: -608137200, index: 1, isstd: false, isutc: false}, {when: -589384800, index: 0, isstd: false, isutc: false}, {when: -576082800, index: 1, isstd: false, isutc: false}, {when: -557935200, index: 0, isstd: false, isutc: false}, {when: -544633200, index: 1, isstd: false, isutc: false}, {when: -526485600, index: 0, isstd: false, isutc: false}, {when: -513183600, index: 1, isstd: false, isutc: false}, {when: -495036000, index: 0, isstd: false, isutc: false}, {when: -481734000, index: 1, isstd: false, isutc: false}, {when: -463586400, index: 0, isstd: false, isutc: false}, {when: -450284400, index: 1, isstd: false, isutc: false}, {when: -431532000, index: 0, isstd: false, isutc: false}, {when: -418230000, index: 1, isstd: false, isutc: false}, {when: -400082400, index: 0, isstd: false, isutc: false}, {when: -386780400, index: 1, isstd: false, isutc: false}, {when: -368632800, index: 0, isstd: false, isutc: false}, {when: -355330800, index: 1, isstd: false, isutc: false}, {when: -337183200, index: 0, isstd: false, isutc: false}, {when: -323881200, index: 1, isstd: false, isutc: false}, {when: -305733600, index: 0, isstd: false, isutc: false}, {when: -292431600, index: 1, isstd: false, isutc: false}, {when: -273679200, index: 0, isstd: false, isutc: false}, {when: -260982000, index: 1, isstd: false, isutc: false}, {when: -242229600, index: 0, isstd: false, isutc: false}, {when: -226508400, index: 1, isstd: false, isutc: false}, {when: -210780000, index: 0, isstd: false, isutc: false}, {when: -195058800, index: 1, isstd: false, isutc: false}, {when: -179330400, index: 0, isstd: false, isutc: false}, {when: -163609200, index: 1, isstd: false, isutc: false}, {when: -147880800, index: 0, isstd: false, isutc: false}, {when: -131554800, index: 1, isstd: false, isutc: false}, {when: -116431200, index: 0, isstd: false, isutc: false}, {when: -100105200, index: 1, isstd: false, isutc: false}, {when: -84376800, index: 0, isstd: false, isutc: false}, {when: -68655600, index: 1, isstd: false, isutc: false}, {when: -52927200, index: 0, isstd: false, isutc: false}, {when: -37206000, index: 1, isstd: false, isutc: false}, {when: -21477600, index: 0, isstd: false, isutc: false}, {when: -5756400, index: 1, isstd: false, isutc: false}, {when: 9972000, index: 0, isstd: false, isutc: false}, {when: 25693200, index: 1, isstd: false, isutc: false}, {when: 41421600, index: 0, isstd: false, isutc: false}, {when: 57747600, index: 1, isstd: false, isutc: false}, {when: 73476000, index: 0, isstd: false, isutc: false}, {when: 89197200, index: 1, isstd: false, isutc: false}, {when: 104925600, index: 0, isstd: false, isutc: false}, {when: 120646800, index: 1, isstd: false, isutc: false}, {when: 126698400, index: 0, isstd: false, isutc: false}, {when: 152096400, index: 1, isstd: false, isutc: false}, {when: 162381600, index: 0, isstd: false, isutc: false}, {when: 183546000, index: 1, isstd: false, isutc: false}, {when: 199274400, index: 0, isstd: false, isutc: false}, {when: 215600400, index: 1, isstd: false, isutc: false}, {when: 230724000, index: 0, isstd: false, isutc: false}, {when: 247050000, index: 1, isstd: false, isutc: false}, {when: 262778400, index: 0, isstd: false, isutc: false}, {when: 278499600, index: 1, isstd: false, isutc: false}, {when: 294228000, index: 0, isstd: false, isutc: false}, {when: 309949200, index: 1, isstd: false, isutc: false}, {when: 325677600, index: 0, isstd: false, isutc: false}, {when: 341398800, index: 1, isstd: false, isutc: false}, {when: 357127200, index: 0, isstd: false, isutc: false}, {when: 372848400, index: 1, isstd: false, isutc: false}, {when: 388576800, index: 0, isstd: false, isutc: false}, {when: 404902800, index: 1, isstd: false, isutc: false}, {when: 420026400, index: 0, isstd: false, isutc: false}, {when: 436352400, index: 1, isstd: false, isutc: false}, {when: 452080800, index: 0, isstd: false, isutc: false}, {when: 467802000, index: 1, isstd: false, isutc: false}, {when: 483530400, index: 0, isstd: false, isutc: false}, {when: 499251600, index: 1, isstd: false, isutc: false}, {when: 514980000, index: 0, isstd: false, isutc: false}, {when: 530701200, index: 1, isstd: false, isutc: false}, {when: 544615200, index: 0, isstd: false, isutc: false}, {when: 562150800, index: 1, isstd: false, isutc: false}, {when: 576064800, index: 0, isstd: false, isutc: false}, {when: 594205200, index: 1, isstd: false, isutc: false}, {when: 607514400, index: 0, isstd: false, isutc: false}, {when: 625654800, index: 1, isstd: false, isutc: false}, {when: 638964000, index: 0, isstd: false, isutc: false}, {when: 657104400, index: 1, isstd: false, isutc: false}, {when: 671018400, index: 0, isstd: false, isutc: false}, {when: 688554000, index: 1, isstd: false, isutc: false}, {when: 702468000, index: 0, isstd: false, isutc: false}, {when: 720003600, index: 1, isstd: false, isutc: false}, {when: 733917600, index: 0, isstd: false, isutc: false}, {when: 752058000, index: 1, isstd: false, isutc: false}, {when: 765367200, index: 0, isstd: false, isutc: false}, {when: 783507600, index: 1, isstd: false, isutc: false}, {when: 796816800, index: 0, isstd: false, isutc: false}, {when: 814957200, index: 1, isstd: false, isutc: false}, {when: 828871200, index: 0, isstd: false, isutc: false}, {when: 846406800, index: 1, isstd: false, isutc: false}, {when: 860320800, index: 0, isstd: false, isutc: false}, {when: 877856400, index: 1, isstd: false, isutc: false}, {when: 891770400, index: 0, isstd: false, isutc: false}, {when: 909306000, index: 1, isstd: false, isutc: false}, {when: 923220000, index: 0, isstd: false, isutc: false}, {when: 941360400, index: 1, isstd: false, isutc: false}, {when: 954669600, index: 0, isstd: false, isutc: false}, {when: 972810000, index: 1, isstd: false, isutc: false}, {when: 986119200, index: 0, isstd: false, isutc: false}, {when: 1004259600, index: 1, isstd: false, isutc: false}, {when: 1018173600, index: 0, isstd: false, isutc: false}, {when: 1035709200, index: 1, isstd: false, isutc: false}, {when: 1049623200, index: 0, isstd: false, isutc: false}, {when: 1067158800, index: 1, isstd: false, isutc: false}, {when: 1081072800, index: 0, isstd: false, isutc: false}, {when: 1099213200, index: 1, isstd: false, isutc: false}, {when: 1112522400, index: 0, isstd: false, isutc: false}, {when: 1130662800, index: 1, isstd: false, isutc: false}, {when: 1143972000, index: 0, isstd: false, isutc: false}, {when: 1162112400, index: 1, isstd: false, isutc: false}, {when: 1173607200, index: 0, isstd: false, isutc: false}, {when: 1194166800, index: 1, isstd: false, isutc: false}, {when: 1205056800, index: 0, isstd: false, isutc: false}, {when: 1225616400, index: 1, isstd: false, isutc: false}, {when: 1236506400, index: 0, isstd: false, isutc: false}, {when: 1257066000, index: 1, isstd: false, isutc: false}, {when: 1268560800, index: 0, isstd: false, isutc: false}, {when: 1289120400, index: 1, isstd: false, isutc: false}, {when: 1300010400, index: 0, isstd: false, isutc: false}, {when: 1320570000, index: 1, isstd: false, isutc: false}, {when: 1331460000, index: 0, isstd: false, isutc: false}, {when: 1352019600, index: 1, isstd: false, isutc: false}, {when: 1362909600, index: 0, isstd: false, isutc: false}, {when: 1383469200, index: 1, isstd: false, isutc: false}, {when: 1394359200, index: 0, isstd: false, isutc: false}, {when: 1414918800, index: 1, isstd: false, isutc: false}, {when: 1425808800, index: 0, isstd: false, isutc: false}, {when: 1446368400, index: 1, isstd: false, isutc: false}, {when: 1457863200, index: 0, isstd: false, isutc: false}, {when: 1478422800, index: 1, isstd: false, isutc: false}, {when: 1489312800, index: 0, isstd: false, isutc: false}, {when: 1509872400, index: 1, isstd: false, isutc: false}, {when: 1520762400, index: 0, isstd: false, isutc: false}, {when: 1541322000, index: 1, isstd: false, isutc: false}, {when: 1552212000, index: 0, isstd: false, isutc: false}, {when: 1572771600, index: 1, isstd: false, isutc: false}, {when: 1583661600, index: 0, isstd: false, isutc: false}, {when: 1604221200, index: 1, isstd: false, isutc: false}, {when: 1615716000, index: 0, isstd: false, isutc: false}, {when: 1636275600, index: 1, isstd: false, isutc: false}, {when: 1647165600, index: 0, isstd: false, isutc: false}, {when: 1667725200, index: 1, isstd: false, isutc: false}, {when: 1678615200, index: 0, isstd: false, isutc: false}, {when: 1699174800, index: 1, isstd: false, isutc: false}, {when: 1710064800, index: 0, isstd: false, isutc: false}, {when: 1730624400, index: 1, isstd: false, isutc: false}, {when: 1741514400, index: 0, isstd: false, isutc: false}, {when: 1762074000, index: 1, isstd: false, isutc: false}, {when: 1772964000, index: 0, isstd: false, isutc: false}, {when: 1793523600, index: 1, isstd: false, isutc: false}, {when: 1805018400, index: 0, isstd: false, isutc: false}, {when: 1825578000, index: 1, isstd: false, isutc: false}, {when: 1836468000, index: 0, isstd: false, isutc: false}, {when: 1857027600, index: 1, isstd: false, isutc: false}, {when: 1867917600, index: 0, isstd: false, isutc: false}, {when: 1888477200, index: 1, isstd: false, isutc: false}, {when: 1899367200, index: 0, isstd: false, isutc: false}, {when: 1919926800, index: 1, isstd: false, isutc: false}, {when: 1930816800, index: 0, isstd: false, isutc: false}, {when: 1951376400, index: 1, isstd: false, isutc: false}, {when: 1962871200, index: 0, isstd: false, isutc: false}, {when: 1983430800, index: 1, isstd: false, isutc: false}, {when: 1994320800, index: 0, isstd: false, isutc: false}, {when: 2014880400, index: 1, isstd: false, isutc: false}, {when: 2025770400, index: 0, isstd: false, isutc: false}, {when: 2046330000, index: 1, isstd: false, isutc: false}, {when: 2057220000, index: 0, isstd: false, isutc: false}, {when: 2077779600, index: 1, isstd: false, isutc: false}, {when: 2088669600, index: 0, isstd: false, isutc: false}, {when: 2109229200, index: 1, isstd: false, isutc: false}, {when: 2120119200, index: 0, isstd: false, isutc: false}, {when: 2140678800, index: 1, isstd: false, isutc: false}, ], cacheStart: 1457863200, cacheEnd: 1478422800, cacheZone: {name: "PDT", offset: -25200, isDST: true}, }, }, }, DeletionGracePeriodSeconds: 0, Labels: { "name": "nginx", "pod-template-hash": "783459730", }, Annotations: { "deployment.kubernetes.io/desired-replicas": "0", "deployment.kubernetes.io/max-replicas": "1", "deployment.kubernetes.io/revision": "3", }, OwnerReferences: nil, Finalizers: ["orphan"], }, Spec: { Replicas: 0, Selector: { MatchLabels: { "name": "nginx", "pod-template-hash": "783459730", }, MatchExpressions: nil, }, Template: { ObjectMeta: { Name: "", GenerateName: "", Namespace: "", SelfLink: "", UID: "", ResourceVersion: "", Generation: 0, CreationTimestamp: { Time: {sec: 0, nsec: 0, loc: nil}, }, DeletionTimestamp: nil, DeletionGracePeriodSeconds: nil, Labels: { "pod-template-hash": "783459730", "name": "nginx", }, Annotations: nil, OwnerReferences: nil, Finalizers: nil, }, Spec: { Volumes: nil, InitContainers: nil, Containers: [ { Name: "nginx", Image: "gcr.io/google_containers/nginx:1.7.9", Command: nil, Args: nil, WorkingDir: "", Ports: nil, Env: nil, Resources: {Limits: nil, Requests: nil}, VolumeMounts: nil, LivenessProbe: nil, ReadinessProbe: nil, Lifecycle: nil, TerminationMessagePath: "/dev/termination-log", ImagePullPolicy: "IfNotPresent", SecurityContext: nil, Stdin: false, StdinOnce: false, TTY: false, }, ], RestartPolicy: "Always", TerminationGracePeriodSeconds: 0, ActiveDeadlineSeconds: nil, DNSPolicy: "ClusterFirst", NodeSelector: nil, ServiceAccountName: "", NodeName: "", SecurityContext: { HostNetwork: false, HostPID: false, HostIPC: false, SELinuxOptions: nil, RunAsUser: nil, RunAsNonRoot: nil, SupplementalGroups: nil, FSGroup: nil, }, ImagePullSecrets: nil, Hostname: "", Subdomain: "", }, }, }, Status: { Replicas: 0, FullyLabeledReplicas: 0, ObservedGeneration: 5, }, }, ] to have length 0
test
permafail on upgrade deployment deployment should support rollback when there s replica set with no revision on deployment deployment should support rollback when there s replica set with no revision go src io kubernetes output dockerized go src io kubernetes test deployment go expected tx cachestart cacheend cachezone name pdt offset isdst true deletiontimestamp time sec nsec loc name local zone tx cachestart cacheend cachezone name pdt offset isdst true deletiongraceperiodseconds labels name nginx pod template hash annotations deployment kubernetes io desired replicas deployment kubernetes io max replicas deployment kubernetes io revision ownerreferences nil finalizers spec replicas selector matchlabels name nginx pod template hash matchexpressions nil template objectmeta name generatename namespace selflink uid resourceversion generation creationtimestamp time sec nsec loc nil deletiontimestamp nil deletiongraceperiodseconds nil labels pod template hash name nginx annotations nil ownerreferences nil finalizers nil spec volumes nil initcontainers nil containers restartpolicy always terminationgraceperiodseconds activedeadlineseconds nil dnspolicy clusterfirst nodeselector nil serviceaccountname nodename securitycontext hostnetwork false hostpid false hostipc false selinuxoptions nil runasuser nil runasnonroot nil supplementalgroups nil fsgroup nil imagepullsecrets nil hostname subdomain status replicas fullylabeledreplicas observedgeneration to have length
1
19,685
14,428,121,278
IssuesEvent
2020-12-06 08:09:52
tarantool/test-run
https://api.github.com/repos/tarantool/test-run
opened
Fail testing of tarantool when it is not built
feature usability
It is convenient that test-run uses a tarantool / a tarantoolctl executables found in PATH, when we want to test a module (say, vshard). However it is undesirable behaviour, when we test tarantool itself: it is not always obvious what is going on when I forgot to build tarantool prior to run tests. We already detect tarantool source directory: https://github.com/tarantool/test-run/blob/e84355283f117e91964fe84eac70f26bb32d7f06/lib/__init__.py#L14-L22 So it should be easy to forbid using of an external tarantool in the case.
True
Fail testing of tarantool when it is not built - It is convenient that test-run uses a tarantool / a tarantoolctl executables found in PATH, when we want to test a module (say, vshard). However it is undesirable behaviour, when we test tarantool itself: it is not always obvious what is going on when I forgot to build tarantool prior to run tests. We already detect tarantool source directory: https://github.com/tarantool/test-run/blob/e84355283f117e91964fe84eac70f26bb32d7f06/lib/__init__.py#L14-L22 So it should be easy to forbid using of an external tarantool in the case.
non_test
fail testing of tarantool when it is not built it is convenient that test run uses a tarantool a tarantoolctl executables found in path when we want to test a module say vshard however it is undesirable behaviour when we test tarantool itself it is not always obvious what is going on when i forgot to build tarantool prior to run tests we already detect tarantool source directory so it should be easy to forbid using of an external tarantool in the case
0
315,056
23,543,192,077
IssuesEvent
2022-08-20 18:22:19
jpablo-ortiz/Reconocimiento-LSC-Lengua-Senas-Colombiana
https://api.github.com/repos/jpablo-ortiz/Reconocimiento-LSC-Lengua-Senas-Colombiana
closed
[Feature]: Agregar convenciones y plantillas al repositorio
documentation enhancement
Requerimiento convenciones y plantillas: - Git hooks para conventional commits. - Versionamiento con Standard Version. - Automatización de CHANGELOG.MD. - Cambio Readme para uso de convenciones. - Issues Templates.
1.0
[Feature]: Agregar convenciones y plantillas al repositorio - Requerimiento convenciones y plantillas: - Git hooks para conventional commits. - Versionamiento con Standard Version. - Automatización de CHANGELOG.MD. - Cambio Readme para uso de convenciones. - Issues Templates.
non_test
agregar convenciones y plantillas al repositorio requerimiento convenciones y plantillas git hooks para conventional commits versionamiento con standard version automatización de changelog md cambio readme para uso de convenciones issues templates
0
338,788
30,321,693,207
IssuesEvent
2023-07-10 19:46:17
ntop/ntopng
https://api.github.com/repos/ntop/ntopng
closed
Invalid Protocol in URL
Bug Ready to Test
<img width="1476" alt="image" src="https://github.com/ntop/ntopng/assets/4493366/ccc5e5db-fc3a-4814-abc0-a420d8a16a59"> This is an HTTP flow. Clicking on the copy button it copies https://support.content.office.microsoft.com/en-us/static/AF101807649.wat instead of http://support.content.office.microsoft.com/en-us/static/AF101807649.wat
1.0
Invalid Protocol in URL - <img width="1476" alt="image" src="https://github.com/ntop/ntopng/assets/4493366/ccc5e5db-fc3a-4814-abc0-a420d8a16a59"> This is an HTTP flow. Clicking on the copy button it copies https://support.content.office.microsoft.com/en-us/static/AF101807649.wat instead of http://support.content.office.microsoft.com/en-us/static/AF101807649.wat
test
invalid protocol in url img width alt image src this is an http flow clicking on the copy button it copies instead of
1
46,086
13,150,009,060
IssuesEvent
2020-08-09 08:57:40
shaundmorris/ddf
https://api.github.com/repos/shaundmorris/ddf
closed
CVE-2016-1000339 Medium Severity Vulnerability detected by WhiteSource
security vulnerability wontfix
## CVE-2016-1000339 - Medium Severity Vulnerability <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bcprov-jdk15on-1.54.jar</b></p></summary> <p>The Bouncy Castle Crypto package is a Java implementation of cryptographic algorithms. This jar contains JCE provider and lightweight API for the Bouncy Castle Cryptography APIs for JDK 1.5 to JDK 1.8.</p> <p>path: /root/.m2/repository/org/bouncycastle/bcprov-jdk15on/1.54/bcprov-jdk15on-1.54.jar,/ddf/distribution/ddf/target/dependencies/solr/contrib/extraction/lib/bcprov-jdk15on-1.54.jar,/ddf/distribution/solr-distro/target/solr-7.4.0/contrib/extraction/lib/bcprov-jdk15on-1.54.jar,/ddf/distribution/kernel/target/dependencies/solr/contrib/extraction/lib/bcprov-jdk15on-1.54.jar</p> <p> <p>Library home page: <a href=http://www.bouncycastle.org/java.html>http://www.bouncycastle.org/java.html</a></p> Dependency Hierarchy: - :x: **bcprov-jdk15on-1.54.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In the Bouncy Castle JCE Provider version 1.55 and earlier the primary engine class used for AES was AESFastEngine. Due to the highly table driven approach used in the algorithm it turns out that if the data channel on the CPU can be monitored the lookup table accesses are sufficient to leak information on the AES key being used. There was also a leak in AESEngine although it was substantially less. AESEngine has been modified to remove any signs of leakage (testing carried out on Intel X86-64) and is now the primary AES class for the BC JCE provider from 1.56. Use of AESFastEngine is now only recommended where otherwise deemed appropriate. <p>Publish Date: 2018-06-04 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-1000339>CVE-2016-1000339</a></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Change files</p> <p>Origin: <a href="https://github.com/bcgit/bc-java/commit/413b42f4d770456508585c830cfcde95f9b0e93b#diff-54656f860db94b867ba7542430cd2ef0">https://github.com/bcgit/bc-java/commit/413b42f4d770456508585c830cfcde95f9b0e93b#diff-54656f860db94b867ba7542430cd2ef0</a></p> <p>Release Date: 2016-10-31</p> <p>Fix Resolution: Replace or update the following files: IESCipher.java, KeyFactorySpi.java, AESEngine.java, DHTest.java, AES.java, DHUtil.java, DHPublicKeyParameters.java, BCDHPublicKey.java</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2016-1000339 Medium Severity Vulnerability detected by WhiteSource - ## CVE-2016-1000339 - Medium Severity Vulnerability <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bcprov-jdk15on-1.54.jar</b></p></summary> <p>The Bouncy Castle Crypto package is a Java implementation of cryptographic algorithms. This jar contains JCE provider and lightweight API for the Bouncy Castle Cryptography APIs for JDK 1.5 to JDK 1.8.</p> <p>path: /root/.m2/repository/org/bouncycastle/bcprov-jdk15on/1.54/bcprov-jdk15on-1.54.jar,/ddf/distribution/ddf/target/dependencies/solr/contrib/extraction/lib/bcprov-jdk15on-1.54.jar,/ddf/distribution/solr-distro/target/solr-7.4.0/contrib/extraction/lib/bcprov-jdk15on-1.54.jar,/ddf/distribution/kernel/target/dependencies/solr/contrib/extraction/lib/bcprov-jdk15on-1.54.jar</p> <p> <p>Library home page: <a href=http://www.bouncycastle.org/java.html>http://www.bouncycastle.org/java.html</a></p> Dependency Hierarchy: - :x: **bcprov-jdk15on-1.54.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In the Bouncy Castle JCE Provider version 1.55 and earlier the primary engine class used for AES was AESFastEngine. Due to the highly table driven approach used in the algorithm it turns out that if the data channel on the CPU can be monitored the lookup table accesses are sufficient to leak information on the AES key being used. There was also a leak in AESEngine although it was substantially less. AESEngine has been modified to remove any signs of leakage (testing carried out on Intel X86-64) and is now the primary AES class for the BC JCE provider from 1.56. Use of AESFastEngine is now only recommended where otherwise deemed appropriate. <p>Publish Date: 2018-06-04 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-1000339>CVE-2016-1000339</a></p> </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Change files</p> <p>Origin: <a href="https://github.com/bcgit/bc-java/commit/413b42f4d770456508585c830cfcde95f9b0e93b#diff-54656f860db94b867ba7542430cd2ef0">https://github.com/bcgit/bc-java/commit/413b42f4d770456508585c830cfcde95f9b0e93b#diff-54656f860db94b867ba7542430cd2ef0</a></p> <p>Release Date: 2016-10-31</p> <p>Fix Resolution: Replace or update the following files: IESCipher.java, KeyFactorySpi.java, AESEngine.java, DHTest.java, AES.java, DHUtil.java, DHPublicKeyParameters.java, BCDHPublicKey.java</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_test
cve medium severity vulnerability detected by whitesource cve medium severity vulnerability vulnerable library bcprov jar the bouncy castle crypto package is a java implementation of cryptographic algorithms this jar contains jce provider and lightweight api for the bouncy castle cryptography apis for jdk to jdk path root repository org bouncycastle bcprov bcprov jar ddf distribution ddf target dependencies solr contrib extraction lib bcprov jar ddf distribution solr distro target solr contrib extraction lib bcprov jar ddf distribution kernel target dependencies solr contrib extraction lib bcprov jar library home page a href dependency hierarchy x bcprov jar vulnerable library vulnerability details in the bouncy castle jce provider version and earlier the primary engine class used for aes was aesfastengine due to the highly table driven approach used in the algorithm it turns out that if the data channel on the cpu can be monitored the lookup table accesses are sufficient to leak information on the aes key being used there was also a leak in aesengine although it was substantially less aesengine has been modified to remove any signs of leakage testing carried out on intel and is now the primary aes class for the bc jce provider from use of aesfastengine is now only recommended where otherwise deemed appropriate publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type change files origin a href release date fix resolution replace or update the following files iescipher java keyfactoryspi java aesengine java dhtest java aes java dhutil java dhpublickeyparameters java bcdhpublickey java step up your open source security game with whitesource
0
37,954
12,510,903,729
IssuesEvent
2020-06-02 19:31:58
kenferrara/react-base-table
https://api.github.com/repos/kenferrara/react-base-table
opened
CVE-2015-9251 (Medium) detected in jquery-2.1.4.min.js
security vulnerability
## CVE-2015-9251 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-2.1.4.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/react-base-table/node_modules/js-base64/.attic/test-moment/index.html</p> <p>Path to vulnerable library: /react-base-table/node_modules/js-base64/.attic/test-moment/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-2.1.4.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/kenferrara/react-base-table/commit/8e278435a954b3faf16104b3f871a7a2a913555a">8e278435a954b3faf16104b3f871a7a2a913555a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v3.0.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.1.4","isTransitiveDependency":false,"dependencyTree":"jquery:2.1.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
CVE-2015-9251 (Medium) detected in jquery-2.1.4.min.js - ## CVE-2015-9251 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-2.1.4.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.1.4/jquery.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/react-base-table/node_modules/js-base64/.attic/test-moment/index.html</p> <p>Path to vulnerable library: /react-base-table/node_modules/js-base64/.attic/test-moment/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-2.1.4.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/kenferrara/react-base-table/commit/8e278435a954b3faf16104b3f871a7a2a913555a">8e278435a954b3faf16104b3f871a7a2a913555a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v3.0.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"2.1.4","isTransitiveDependency":false,"dependencyTree":"jquery:2.1.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_test
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file tmp ws scm react base table node modules js attic test moment index html path to vulnerable library react base table node modules js attic test moment index html dependency hierarchy x jquery min js vulnerable library found in head commit a href vulnerability details jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed vulnerabilityurl
0
286,027
24,714,283,378
IssuesEvent
2022-10-20 05:14:21
microsoft/MixedRealityToolkit-Unity
https://api.github.com/repos/microsoft/MixedRealityToolkit-Unity
opened
Rig origin changes break absolute-world-placed unit tests
Bug Tests MRTK3
## Describe the bug When #11035 changed the XR Rig to specify user height and origin offset explicitly, unit tests that expected y=0 for head height broke. ## To reproduce Run unit tests. Unit tests fail. ## Expected behavior Unit tests should not fail. ## Fix Objects in unit tests should be placed in head-height-relative reference frame.
1.0
Rig origin changes break absolute-world-placed unit tests - ## Describe the bug When #11035 changed the XR Rig to specify user height and origin offset explicitly, unit tests that expected y=0 for head height broke. ## To reproduce Run unit tests. Unit tests fail. ## Expected behavior Unit tests should not fail. ## Fix Objects in unit tests should be placed in head-height-relative reference frame.
test
rig origin changes break absolute world placed unit tests describe the bug when changed the xr rig to specify user height and origin offset explicitly unit tests that expected y for head height broke to reproduce run unit tests unit tests fail expected behavior unit tests should not fail fix objects in unit tests should be placed in head height relative reference frame
1
249,646
18,858,219,817
IssuesEvent
2021-11-12 09:31:07
ryantianj/pe
https://api.github.com/repos/ryantianj/pe
closed
Irrelevant information in DG
severity.VeryLow type.DocumentationBug
Personally, I feel that this should not be included in the final DG release to other developers. (The plant-uml tutorial part) ![image.png](https://raw.githubusercontent.com/ryantianj/pe/main/files/8bf9d8ec-50e3-4959-bc52-644e39d4cadf.png) <!--session: 1636704548904-49691a7e-abf1-493b-87bf-e92fa43c37b6--> <!--Version: Web v3.4.1-->
1.0
Irrelevant information in DG - Personally, I feel that this should not be included in the final DG release to other developers. (The plant-uml tutorial part) ![image.png](https://raw.githubusercontent.com/ryantianj/pe/main/files/8bf9d8ec-50e3-4959-bc52-644e39d4cadf.png) <!--session: 1636704548904-49691a7e-abf1-493b-87bf-e92fa43c37b6--> <!--Version: Web v3.4.1-->
non_test
irrelevant information in dg personally i feel that this should not be included in the final dg release to other developers the plant uml tutorial part
0
200,212
15,094,073,804
IssuesEvent
2021-02-07 04:12:34
TsubakiBotPad/pad-cogs
https://api.github.com/repos/TsubakiBotPad/pad-cogs
closed
in idtest run print the ref number for each failing test
id3 tests
![image](https://user-images.githubusercontent.com/18037011/107136109-031d5780-68c6-11eb-85a4-a19c468c0e0c.png) Should say: ` 40. "hello world"` This will make it much easier to delete stuff (especially fake test data)
1.0
in idtest run print the ref number for each failing test - ![image](https://user-images.githubusercontent.com/18037011/107136109-031d5780-68c6-11eb-85a4-a19c468c0e0c.png) Should say: ` 40. "hello world"` This will make it much easier to delete stuff (especially fake test data)
test
in idtest run print the ref number for each failing test should say hello world this will make it much easier to delete stuff especially fake test data
1
655,162
21,678,997,411
IssuesEvent
2022-05-09 03:04:27
StackExchange/dnscontrol
https://api.github.com/repos/StackExchange/dnscontrol
closed
Providers should lazy-load, not touching their API until used
Type: Enhancement Priority: p4 - Lowest
If a provider is broken or down, it can't be skipped. Even -providers=foo initializes foo and non-foo providers, which requires working API keys, the API being up, and so on.
1.0
Providers should lazy-load, not touching their API until used - If a provider is broken or down, it can't be skipped. Even -providers=foo initializes foo and non-foo providers, which requires working API keys, the API being up, and so on.
non_test
providers should lazy load not touching their api until used if a provider is broken or down it can t be skipped even providers foo initializes foo and non foo providers which requires working api keys the api being up and so on
0
176,296
28,064,394,614
IssuesEvent
2023-03-29 14:31:52
202212-GIZ-YE-FEW/iDev
https://api.github.com/repos/202212-GIZ-YE-FEW/iDev
closed
Design Auth as Social Media
needs-detailed-design
**Task title** Design Auth as Social Media **Task description** Design auth as social media **Subtasks** - OR line - Facebook & Google **Screenshots** ![image](https://user-images.githubusercontent.com/96573043/227829229-2d196338-4853-49a9-9bfe-23e92f9761fb.png) **Link to the component on Figma** [Signup as social media](https://www.figma.com/file/r2YCuflKtmP8FK2KVXhnNx/Online-Therapist?node-id=279-4868&t=DD6EMQdwZQDBT8YE-4)
1.0
Design Auth as Social Media - **Task title** Design Auth as Social Media **Task description** Design auth as social media **Subtasks** - OR line - Facebook & Google **Screenshots** ![image](https://user-images.githubusercontent.com/96573043/227829229-2d196338-4853-49a9-9bfe-23e92f9761fb.png) **Link to the component on Figma** [Signup as social media](https://www.figma.com/file/r2YCuflKtmP8FK2KVXhnNx/Online-Therapist?node-id=279-4868&t=DD6EMQdwZQDBT8YE-4)
non_test
design auth as social media task title design auth as social media task description design auth as social media subtasks or line facebook google screenshots link to the component on figma
0
154,235
13,542,132,246
IssuesEvent
2020-09-16 16:51:49
Atos20/Travel-tracker
https://api.github.com/repos/Atos20/Travel-tracker
opened
Create method that would delete a single trip
API data documentation
In the API class add the method that would remove an element from the server
1.0
Create method that would delete a single trip - In the API class add the method that would remove an element from the server
non_test
create method that would delete a single trip in the api class add the method that would remove an element from the server
0
347,638
31,237,626,241
IssuesEvent
2023-08-20 13:17:32
sixes-sense/back-end
https://api.github.com/repos/sixes-sense/back-end
closed
[Feature] [태그] 태그 저장 기능
feature test
✏️Description - 태그 저장 기능 구현 ✅TODO - - [x] 태그 엔티티 생성 - [x] 태그 저장 기능 구현 - [x] 태그 통합 테스트 진행 🐾ETC -
1.0
[Feature] [태그] 태그 저장 기능 - ✏️Description - 태그 저장 기능 구현 ✅TODO - - [x] 태그 엔티티 생성 - [x] 태그 저장 기능 구현 - [x] 태그 통합 테스트 진행 🐾ETC -
test
태그 저장 기능 ✏️description 태그 저장 기능 구현 ✅todo 태그 엔티티 생성 태그 저장 기능 구현 태그 통합 테스트 진행 🐾etc
1
79,638
7,721,977,384
IssuesEvent
2018-05-24 07:48:45
zephyrproject-rtos/zephyr
https://api.github.com/repos/zephyrproject-rtos/zephyr
closed
[Coverity CID: 183486] Null pointer dereferences in /tests/net/traffic_class/src/main.c
Coverity area: Tests bug priority: low
Static code scan issues seen in File: /tests/net/traffic_class/src/main.c Category: Null pointer dereferences Function: address_setup Component: Tests CID: 183486 Please fix or provide comments to square it off in coverity in the link: https://scan9.coverity.com/reports.htm#v32951/p12996
1.0
[Coverity CID: 183486] Null pointer dereferences in /tests/net/traffic_class/src/main.c - Static code scan issues seen in File: /tests/net/traffic_class/src/main.c Category: Null pointer dereferences Function: address_setup Component: Tests CID: 183486 Please fix or provide comments to square it off in coverity in the link: https://scan9.coverity.com/reports.htm#v32951/p12996
test
null pointer dereferences in tests net traffic class src main c static code scan issues seen in file tests net traffic class src main c category null pointer dereferences function address setup component tests cid please fix or provide comments to square it off in coverity in the link
1
155,569
24,483,454,141
IssuesEvent
2022-10-09 05:36:08
microsoft/pyright
https://api.github.com/repos/microsoft/pyright
closed
drodown list has too many "noises"
as designed
**Is your feature request related to a problem? Please describe.** when I want to see a dropdown list of methods, coc-pyright gave me lots of options and many of them are not useful, see the screenshot, what I really need is just the "f", not "v". ![image](https://user-images.githubusercontent.com/18351761/194735521-033ca832-1ca5-466f-a1ac-173b3ad651db.png) **Describe the solution you'd like** as a comparison to my c++ code, only useful methods are shown up in the dropdown list: ![image](https://user-images.githubusercontent.com/18351761/194735537-5ff3555b-ec19-4e7d-8c53-babf15c21e18.png) so is typescript: ![image](https://user-images.githubusercontent.com/18351761/194735549-46f8e8c9-e037-4ece-96a6-af0d288a3cc0.png) can pyright do similar sorting for dropdown suggestions? i.e. the most useful methods on the top? **Additional context**
1.0
drodown list has too many "noises" - **Is your feature request related to a problem? Please describe.** when I want to see a dropdown list of methods, coc-pyright gave me lots of options and many of them are not useful, see the screenshot, what I really need is just the "f", not "v". ![image](https://user-images.githubusercontent.com/18351761/194735521-033ca832-1ca5-466f-a1ac-173b3ad651db.png) **Describe the solution you'd like** as a comparison to my c++ code, only useful methods are shown up in the dropdown list: ![image](https://user-images.githubusercontent.com/18351761/194735537-5ff3555b-ec19-4e7d-8c53-babf15c21e18.png) so is typescript: ![image](https://user-images.githubusercontent.com/18351761/194735549-46f8e8c9-e037-4ece-96a6-af0d288a3cc0.png) can pyright do similar sorting for dropdown suggestions? i.e. the most useful methods on the top? **Additional context**
non_test
drodown list has too many noises is your feature request related to a problem please describe when i want to see a dropdown list of methods coc pyright gave me lots of options and many of them are not useful see the screenshot what i really need is just the f not v describe the solution you d like as a comparison to my c code only useful methods are shown up in the dropdown list so is typescript can pyright do similar sorting for dropdown suggestions i e the most useful methods on the top additional context
0
135,801
11,017,952,859
IssuesEvent
2019-12-05 09:33:38
aliasrobotics/RVD
https://api.github.com/repos/aliasrobotics/RVD
opened
(error) syntax error
bug cppcheck static analysis testing triage
```yaml { "severity": { "rvss-score": 0, "cvss-score": 0, "severity-description": "", "cvss-vector": "", "rvss-vector": "" }, "exploitation": { "exploitation-vector": "", "exploitation-image": "", "description": "" }, "flaw": { "trace": "", "reproduction": "See artifacts below (if available)", "languages": "None", "reproducibility": "always", "detected-by-method": "testing static", "subsystem": "N/A", "specificity": "N/A", "issue": "", "reported-by-relationship": "automatic", "phase": "testing", "reported-by": "Alias Robotics", "reproduction-image": "gitlab.com/aliasrobotics/offensive/alurity/pipelines/pipeline_ur_ros_official_cppcheck/-/jobs/370181107/artifacts/download", "package": "N/A", "date-detected": "2019-12-05 (09:33)", "detected-by": "Alias Robotics", "date-reported": "2019-12-05 (09:33)", "application": "N/A", "architectural-location": "N/A" }, "keywords": [ "cppcheck", "static analysis", "testing", "triage", "bug" ], "links": "", "mitigation": { "pull-request": "", "description": "" }, "system": "src/Universal_Robots_ROS_Driver/ur_calibration/test/calibration_test.cpp", "description": "[src/Universal_Robots_ROS_Driver/ur_calibration/test/calibration_test.cpp:37]: (error) syntax error", "vendor": null, "cwe": "None", "title": "(error) syntax error", "cve": "None", "type": "bug", "id": 1 } ```
1.0
(error) syntax error - ```yaml { "severity": { "rvss-score": 0, "cvss-score": 0, "severity-description": "", "cvss-vector": "", "rvss-vector": "" }, "exploitation": { "exploitation-vector": "", "exploitation-image": "", "description": "" }, "flaw": { "trace": "", "reproduction": "See artifacts below (if available)", "languages": "None", "reproducibility": "always", "detected-by-method": "testing static", "subsystem": "N/A", "specificity": "N/A", "issue": "", "reported-by-relationship": "automatic", "phase": "testing", "reported-by": "Alias Robotics", "reproduction-image": "gitlab.com/aliasrobotics/offensive/alurity/pipelines/pipeline_ur_ros_official_cppcheck/-/jobs/370181107/artifacts/download", "package": "N/A", "date-detected": "2019-12-05 (09:33)", "detected-by": "Alias Robotics", "date-reported": "2019-12-05 (09:33)", "application": "N/A", "architectural-location": "N/A" }, "keywords": [ "cppcheck", "static analysis", "testing", "triage", "bug" ], "links": "", "mitigation": { "pull-request": "", "description": "" }, "system": "src/Universal_Robots_ROS_Driver/ur_calibration/test/calibration_test.cpp", "description": "[src/Universal_Robots_ROS_Driver/ur_calibration/test/calibration_test.cpp:37]: (error) syntax error", "vendor": null, "cwe": "None", "title": "(error) syntax error", "cve": "None", "type": "bug", "id": 1 } ```
test
error syntax error yaml severity rvss score cvss score severity description cvss vector rvss vector exploitation exploitation vector exploitation image description flaw trace reproduction see artifacts below if available languages none reproducibility always detected by method testing static subsystem n a specificity n a issue reported by relationship automatic phase testing reported by alias robotics reproduction image gitlab com aliasrobotics offensive alurity pipelines pipeline ur ros official cppcheck jobs artifacts download package n a date detected detected by alias robotics date reported application n a architectural location n a keywords cppcheck static analysis testing triage bug links mitigation pull request description system src universal robots ros driver ur calibration test calibration test cpp description error syntax error vendor null cwe none title error syntax error cve none type bug id
1
33,583
4,839,160,398
IssuesEvent
2016-11-09 08:19:50
codepress/admin-columns-issues
https://api.github.com/repos/codepress/admin-columns-issues
opened
Remove or change old filter so third party code will not break
status:need_testing type:improvement
Think about changing the following filters - [ ] **cac/column/meta/value** We send a complete refactored column to this filter so I expect a lot of problem when people update to the latest version. Good use case is Pods which does a lot of checked based on the old CPAC_Column object
1.0
Remove or change old filter so third party code will not break - Think about changing the following filters - [ ] **cac/column/meta/value** We send a complete refactored column to this filter so I expect a lot of problem when people update to the latest version. Good use case is Pods which does a lot of checked based on the old CPAC_Column object
test
remove or change old filter so third party code will not break think about changing the following filters cac column meta value we send a complete refactored column to this filter so i expect a lot of problem when people update to the latest version good use case is pods which does a lot of checked based on the old cpac column object
1
259,541
22,497,518,932
IssuesEvent
2022-06-23 08:53:35
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
com.hazelcast.internal.dynamicconfig.DynamicConfigSlowPreJoinBouncingTest.doNotThrowExceptionWhenMemberIsGone [HZ-978]
Team: Core Type: Test-Failure Source: Internal Module: Config to-jira
_master_ (commit c7d5d4ed0150e9e927af1541e2e0da730df401f5) Failed on Sonar build (Oracle JDK 11): https://jenkins.hazelcast.com/view/Official%20Builds/job/Hazelcast-master-sonar/1008/testReport/com.hazelcast.internal.dynamicconfig/DynamicConfigSlowPreJoinBouncingTest/doNotThrowExceptionWhenMemberIsGone/ <details><summary>Stacktrace:</summary> ``` java.lang.AssertionError: expected:<MapConfig{name='20bb9939-3b4c-4510-86ea-85b2568843be', inMemoryFormat='OBJECT', metadataPolicy=CREATE_ON_UPDATE, backupCount=3, asyncBackupCount=0, timeToLiveSeconds=12, maxIdleSeconds=20, readBackupData=true, evictionConfig=EvictionConfig{size=1000, maxSizePolicy=FREE_HEAP_SIZE, evictionPolicy=LRU, comparatorClassName=null, comparator=LRUEvictionPolicyComparator{com.hazelcast.internal.eviction.impl.comparator.LRUEvictionPolicyComparator@5e2c3ca2} }, merkleTree=MerkleTreeConfig{enabled=null, depth=10}, eventJournal=EventJournalConfig{enabled=false, capacity=10000, timeToLiveSeconds=0}, hotRestart=HotRestartConfig{enabled=true, fsync=true}, dataPersistenceConfig=DataPersistenceConfig{enabled=true, fsync=true}, nearCacheConfig=NearCacheConfig{name=default, inMemoryFormat=NATIVE, invalidateOnChange=true, timeToLiveSeconds=0, maxIdleSeconds=0, evictionConfig=EvictionConfig{size=10000, maxSizePolicy=ENTRY_COUNT, evictionPolicy=LRU, comparatorClassName=null, comparator=null}, cacheLocalEntries=true, localUpdatePolicy=CACHE_ON_UPDATE, preloaderConfig=NearCachePreloaderConfig{enabled=true, directory=, storeInitialDelaySeconds=600, storeIntervalSeconds=600}}, mapStoreConfig=MapStoreConfig{enabled=true, className='foo.bar.MapStoreDoesNotExist', factoryClassName='null', writeDelaySeconds=0, writeBatchSize=1, implementation=null, factoryImplementation=null, properties={}, initialLoadMode=LAZY, writeCoalescing=true}, mergePolicyConfig=MergePolicyConfig{policy='com.hazelcast.spi.merge.PutIfAbsentMergePolicy', batchSize=100}, wanReplicationRef=WanReplicationRef{name='name', mergePolicy='foo.bar.PolicyClass', filters='[]', republishingEnabled='true'}, entryListenerConfigs=[EntryListenerConfig{local=true, includeValue=true}, EntryListenerConfig{local=true, includeValue=true}, EntryListenerConfig{local=true, includeValue=true}], indexConfigs=[IndexConfig{name=null, type=SORTED, attributes=[orderAttribute]}, IndexConfig{name=null, type=HASH, attributes=[unorderedAttribute]}], attributeConfigs=[AttributeConfig{name='attribute'extractorClassName='foo.bar.ExtractorClass'}], splitBrainProtectionName=split-brain-protection, queryCacheConfigs=[QueryCacheConfig{batchSize=100, bufferSize=16, delaySeconds=0, includeValue=true, populate=true, coalesce=false, inMemoryFormat=OBJECT, name='queryCacheName', predicateConfig=PredicateConfig{className='null', sql='null', implementation=null}, evictionConfig=EvictionConfig{size=10000, maxSizePolicy=ENTRY_COUNT, evictionPolicy=LRU, comparatorClassName=null, comparator=null}, entryListenerConfigs=[EntryListenerConfig{local=false, includeValue=true}], indexConfigs=[IndexConfig{name=null, type=HASH, attributes=[attribute]}]}], cacheDeserializedValues=ALWAYS, statisticsEnabled=false, entryStatsEnabled=true}> but was:<MapConfig{name='20bb9939-3b4c-4510-86ea-85b2568843be', inMemoryFormat='BINARY', metadataPolicy=CREATE_ON_UPDATE, backupCount=1, asyncBackupCount=0, timeToLiveSeconds=0, maxIdleSeconds=0, readBackupData=false, evictionConfig=EvictionConfig{size=2147483647, maxSizePolicy=PER_NODE, evictionPolicy=NONE, comparatorClassName=null, comparator=null}, merkleTree=MerkleTreeConfig{enabled=null, depth=10}, eventJournal=EventJournalConfig{enabled=false, capacity=10000, timeToLiveSeconds=0}, hotRestart=HotRestartConfig{enabled=false, fsync=false}, dataPersistenceConfig=DataPersistenceConfig{enabled=false, fsync=false}, nearCacheConfig=null, mapStoreConfig=MapStoreConfig{enabled=false, className='null', factoryClassName='null', writeDelaySeconds=0, writeBatchSize=1, implementation=null, factoryImplementation=null, properties={}, initialLoadMode=LAZY, writeCoalescing=true}, mergePolicyConfig=MergePolicyConfig{policy='com.hazelcast.spi.merge.PutIfAbsentMergePolicy', batchSize=100}, wanReplicationRef=null, entryListenerConfigs=[], indexConfigs=[], attributeConfigs=[], splitBrainProtectionName=null, queryCacheConfigs=[], cacheDeserializedValues=INDEX_ONLY, statisticsEnabled=true, entryStatsEnabled=false}> at org.junit.Assert.fail(Assert.java:89) at org.junit.Assert.failNotEquals(Assert.java:835) at org.junit.Assert.assertEquals(Assert.java:120) at org.junit.Assert.assertEquals(Assert.java:146) at com.hazelcast.internal.dynamicconfig.DynamicConfigBouncingTest.doNotThrowExceptionWhenMemberIsGone(DynamicConfigBouncingTest.java:84) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:115) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:107) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.lang.Thread.run(Thread.java:834) ``` </details> <details><summary>Standard output:</summary> ``` 01:00:59,239 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5701 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:00:59,242 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:00:59,245 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:00:59,248 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:00:59,248 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is STARTING 01:00:59,248 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:1, ver:1} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this ] 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is STARTED 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5702 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:00:59,251 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:00:59,254 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:00:59,257 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:00:59,257 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is STARTING 01:00:59,257 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:00:59,258 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:00:59,258 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:2, ver:2} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 ] 01:01:00,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:2, ver:2} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this ] 01:01:00,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is STARTED 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5703 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:01:00,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:01:00,267 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:01:00,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:01:00,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is STARTING 01:01:00,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:01:00,371 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:01,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:3} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.generic-operation.thread-3 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:3, ver:3} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this ] 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:3, ver:3} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is STARTED 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5704 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:01:02,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:01:02,267 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:01:02,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:01:02,271 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is STARTING 01:01:02,271 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:01:05,261 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:06,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:07,762 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:09,262 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 ] 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.epic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 this ] 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 ] 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is STARTED 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 ] 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5705 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:01:10,266 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:01:10,269 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:01:10,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:01:10,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is STARTING 01:01:10,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:01:12,263 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:15,274 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:20,776 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:24,267 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:26,278 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:31,780 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:32,270 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:33,270 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:34,271 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:34,271 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:36,272 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,272 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba this ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.epic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 this Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is STARTED 01:01:48,277 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 since it is expired (now: 2021-10-15 01:01:48.277, timestamp: 2021-10-15 01:01:17.269) 01:01:54,279 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:55,280 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:59,282 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:02,284 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 since it is expired (now: 2021-10-15 01:02:02.284, timestamp: 2021-10-15 01:01:24.256) 01:02:05,285 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:10,287 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:12,288 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 since it is expired (now: 2021-10-15 01:02:12.288, timestamp: 2021-10-15 01:01:29.256) 01:02:14,247 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Suspecting Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 because it has not sent any heartbeats since 2021-10-15 01:01:09.262. Now: 2021-10-15 01:02:14.247, heartbeat timeout: 60000 ms, suspicion level: 1.00 01:02:16,289 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5702, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5703, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.vigilant_bose.cached.thread-2 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-8 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.admiring_bose.cached.thread-6 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba this ] 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.elastic_bose.cached.thread-5 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:02:20,292 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:22,269 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:02:22,269 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:02:22,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:02:22,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:02:23,293 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:27,294 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 since it is expired (now: 2021-10-15 01:02:27.294, timestamp: 2021-10-15 01:01:35.268) 01:02:34,296 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:36,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:36,297 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:02:36,397 INFO |doNotThrowExceptionWhenMemberIsGone| - [ExplicitSuspicionOp] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Received suspicion request from: [127.0.0.1]:5701 01:02:36,397 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d is suspected to be dead for reason: explicit suspicion 01:02:36,397 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:36,397 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:02:37,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:38,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:38,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:38,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 since it is expired (now: 2021-10-15 01:02:38.298, timestamp: 2021-10-15 01:01:44.256) 01:02:39,247 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Suspecting Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba because it has not sent any heartbeats since 2021-10-15 01:01:36.272. Now: 2021-10-15 01:02:39.247, heartbeat timeout: 60000 ms, suspicion level: 1.00 01:02:39,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5705, could not acquire lock in time. 01:02:39,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:39,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5705, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5702, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5703, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:7} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-10 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5705, UUID: b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.vigilant_bose.cached.thread-12 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5705, UUID: b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:3, ver:7} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.elastic_bose.cached.thread-1 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5705, UUID: b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:3, ver:7} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this ] 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:02:40,299 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAP ...[truncated 153624 chars]... ce] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:27} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 ] 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.thirsty_bose.cached.thread-7 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5713, UUID: 83ca37ea-e72a-4868-99ef-7cd13b6b48ac 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-12 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5713, UUID: 83ca37ea-e72a-4868-99ef-7cd13b6b48ac 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.blissful_bose.cached.thread-1 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5713, UUID: 83ca37ea-e72a-4868-99ef-7cd13b6b48ac 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.generic-operation.thread-2 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:3, ver:27} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this ] 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.thirsty_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Members {size:3, ver:27} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 ] 01:03:30,681 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:30,683 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:30,683 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:30,683 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5713 is SHUTDOWN 01:03:31,341 INFO |doNotThrowExceptionWhenMemberIsGone| - [ProgressMonitor] doNotThrowExceptionWhenMemberIsGone - Aggregated progress: 503975 operations. Maximum latency: 5520 ms.Throughput in last 5000 ms: 55237 ops / second. 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5716 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:03:32,697 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:03:32,700 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is STARTING 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:32,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:33,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.thirsty_bose.generic-operation.thread-0 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:33,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.blissful_bose.generic-operation.thread-3 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:33,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this ] 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5714, alive=true} 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5715, alive=true} 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.generic-operation.thread-3 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:33,705 WARN |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Config seed port is 5701 and cluster size is 4. Some of the ports seem occupied! 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.thirsty_bose.generic-operation.thread-0 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is STARTED 01:03:33,706 INFO |doNotThrowExceptionWhenMemberIsGone| - [HealthMonitor] hz.silly_bose.HealthMonitor - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] processors=8, physical.memory.total=755.6G, physical.memory.free=660.1G, swap.space.total=4.0G, swap.space.free=4.0G, heap.memory.used=1.7G, heap.memory.free=277.5M, heap.memory.total=2.0G, heap.memory.max=2.0G, heap.memory.used/total=86.40%, heap.memory.used/max=86.19%, minor.gc.count=5137, minor.gc.time=45848ms, major.gc.count=2, major.gc.time=574ms, load.process=11.11%, load.system=16.22%, load.systemAverage=7.92, thread.count=649, thread.peakCount=2810, cluster.timeDiff=-1001, event.q.size=0, executor.q.async.size=0, executor.q.client.size=0, executor.q.client.query.size=0, executor.q.client.blocking.size=0, executor.q.query.size=0, executor.q.scheduled.size=0, executor.q.io.size=0, executor.q.system.size=0, executor.q.operations.size=0, executor.q.priorityOperation.size=0, operations.completed.count=7, executor.q.mapLoad.size=0, executor.q.mapLoadAllKeys.size=0, executor.q.cluster.size=0, executor.q.response.size=0, operations.running.count=0, operations.pending.invocations.percentage=0.00%, operations.pending.invocations.count=0, proxy.count=0, clientEndpoint.count=0, connection.active.count=0, client.connection.count=0, connection.count=0 01:03:35,706 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTTING_DOWN 01:03:35,706 WARN |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Terminating forcefully... 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5714, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5714, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5714, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:35,707 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58679 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this 01:03:35,707 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:29} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-5 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5714, UUID: 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.blissful_bose.cached.thread-6 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5714, UUID: 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.silly_bose.cached.thread-4 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5714, UUID: 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,708 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.generic-operation.thread-2 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:3, ver:29} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:35,708 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:3, ver:29} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this ] 01:03:35,710 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:35,710 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 4 ms. 01:03:35,710 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTDOWN 01:03:36,341 INFO |doNotThrowExceptionWhenMemberIsGone| - [ProgressMonitor] doNotThrowExceptionWhenMemberIsGone - Aggregated progress: 560072 operations. Maximum latency: 5520 ms.Throughput in last 5000 ms: 56765 ops / second. 01:03:37,712 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5717 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:03:37,715 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:03:37,719 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:03:37,723 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:03:37,723 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is STARTING 01:03:37,723 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:03:37,724 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5717, alive=true} 01:03:37,724 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.blissful_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5717, alive=true} 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5717, alive=true} 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.wizardly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d this ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5715, alive=true} 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:38,725 WARN |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Config seed port is 5701 and cluster size is 4. Some of the ports seem occupied! 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is STARTED 01:03:38,727 INFO |doNotThrowExceptionWhenMemberIsGone| - [HealthMonitor] hz.wizardly_bose.HealthMonitor - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] processors=8, physical.memory.total=755.6G, physical.memory.free=660.1G, swap.space.total=4.0G, swap.space.free=4.0G, heap.memory.used=1.5G, heap.memory.free=516.4M, heap.memory.total=2.0G, heap.memory.max=2.0G, heap.memory.used/total=74.71%, heap.memory.used/max=74.52%, minor.gc.count=5139, minor.gc.time=45870ms, major.gc.count=2, major.gc.time=574ms, load.process=14.29%, load.system=25.00%, load.systemAverage=8.01, thread.count=647, thread.peakCount=2810, cluster.timeDiff=-1001, event.q.size=0, executor.q.async.size=0, executor.q.client.size=0, executor.q.client.query.size=0, executor.q.client.blocking.size=0, executor.q.query.size=0, executor.q.scheduled.size=0, executor.q.io.size=0, executor.q.system.size=0, executor.q.operations.size=0, executor.q.priorityOperation.size=0, operations.completed.count=7, executor.q.mapLoad.size=0, executor.q.mapLoadAllKeys.size=0, executor.q.cluster.size=0, executor.q.response.size=0, operations.running.count=0, operations.pending.invocations.percentage=0.00%, operations.pending.invocations.count=0, proxy.count=0, clientEndpoint.count=0, connection.active.count=0, client.connection.count=0, connection.count=0 01:03:39,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [HealthMonitor] hz.serene_bose.HealthMonitor - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] processors=8, physical.memory.total=755.6G, physical.memory.free=660.1G, swap.space.total=4.0G, swap.space.free=4.0G, heap.memory.used=1.7G, heap.memory.free=311.7M, heap.memory.total=2.0G, heap.memory.max=2.0G, heap.memory.used/total=84.74%, heap.memory.used/max=84.53%, minor.gc.count=5139, minor.gc.time=45870ms, major.gc.count=2, major.gc.time=574ms, load.process=5.00%, load.system=14.71%, load.systemAverage=8.01, thread.count=647, thread.peakCount=2810, cluster.timeDiff=0, event.q.size=0, executor.q.async.size=0, executor.q.client.size=0, executor.q.client.query.size=0, executor.q.client.blocking.size=0, executor.q.query.size=0, executor.q.scheduled.size=0, executor.q.io.size=0, executor.q.system.size=0, executor.q.operations.size=0, executor.q.priorityOperation.size=0, operations.completed.count=436, executor.q.mapLoad.size=0, executor.q.mapLoadAllKeys.size=0, executor.q.cluster.size=0, executor.q.response.size=0, operations.running.count=2, operations.pending.invocations.percentage=0.00%, operations.pending.invocations.count=0, proxy.count=0, clientEndpoint.count=0, connection.active.count=0, client.connection.count=0, connection.count=0 01:03:40,342 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] doNotThrowExceptionWhenMemberIsGone - Test deadline reached, tearing down 01:03:40,342 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] doNotThrowExceptionWhenMemberIsGone - Waiting until 2021-10-15 01:04:10.342 for test tasks to complete gracefully. 01:03:40,342 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58673 - Waiting for member bouncing thread to stop... 01:03:40,427 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58679 - Member bouncing thread exiting 01:03:40,427 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58673 - Member bouncing thread stopped. 01:03:40,819 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58673 - Tearing down BounceMemberRule 01:03:40,819 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTTING_DOWN 01:03:40,820 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,820 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,821 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5711 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5711 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5711 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5711 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5711 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5710 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5710 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5710 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5710 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5710 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5713 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5713 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5712 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5712 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5712 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5712 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5712 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5715 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5717, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:40,822 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this 01:03:40,822 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:31} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5715, UUID: 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.wizardly_bose.cached.thread-1 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5715, UUID: 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.silly_bose.cached.thread-6 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5715, UUID: 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:3, ver:31} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.wizardly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Members {size:3, ver:31} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d this ] 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5715 is SHUTDOWN 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTTING_DOWN 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTDOWN 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is SHUTTING_DOWN 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5717, alive=false} 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5717, alive=false} 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:40,826 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d this 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:2, ver:32} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5717, UUID: cc7ae526-0a6b-4d05-b92b-6cdd0166920d 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:2, ver:32} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this ] 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.silly_bose.cached.thread-6 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5717, UUID: cc7ae526-0a6b-4d05-b92b-6cdd0166920d 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is SHUTDOWN 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is SHUTTING_DOWN 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 01:03:40,829 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:1, ver:33} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this ] 01:03:40,829 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5716, UUID: c5e0bfc3-97dc-429e-ac61-163a781a3a30 01:03:40,829 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 2 ms. 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5707 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5707 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5707 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5707 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5707 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5706 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5706 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5706 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5706 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5706 is SHUTDOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5709 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5709 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5709 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5709 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5709 is SHUTDOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5708 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5708 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5708 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5708 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5708 is SHUTDOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,833 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,833 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 2 ms. 01:03:40,833 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is SHUTDOWN BuildInfo right after doNotThrowExceptionWhenMemberIsGone(com.hazelcast.internal.dynamicconfig.DynamicConfigSlowPreJoinBouncingTest): BuildInfo{version='5.1-SNAPSHOT', build='20211014', buildNumber=20211014, revision=c7d5d4e, enterprise=false, serializationVersion=1} Hiccups measured while running test 'doNotThrowExceptionWhenMemberIsGone(com.hazelcast.internal.dynamicconfig.DynamicConfigSlowPreJoinBouncingTest):' 01:00:55, accumulated pauses: 62 ms, max pause: 19 ms, pauses over 1000 ms: 0 01:01:00, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:05, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:10, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:15, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:20, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:25, accumulated pauses: 37 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:30, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:35, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:40, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:45, accumulated pauses: 33 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:50, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:55, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:00, accumulated pauses: 36 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:05, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:10, accumulated pauses: 31 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:15, accumulated pauses: 33 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:20, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:25, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:30, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:35, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:40, accumulated pauses: 33 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:45, accumulated pauses: 37 ms, max pause: 7 ms, pauses over 1000 ms: 0 01:02:50, accumulated pauses: 107 ms, max pause: 74 ms, pauses over 1000 ms: 0 01:02:55, accumulated pauses: 75 ms, max pause: 26 ms, pauses over 1000 ms: 0 01:03:00, accumulated pauses: 64 ms, max pause: 26 ms, pauses over 1000 ms: 0 01:03:05, accumulated pauses: 158 ms, max pause: 101 ms, pauses over 1000 ms: 0 01:03:10, accumulated pauses: 76 ms, max pause: 22 ms, pauses over 1000 ms: 0 01:03:15, accumulated pauses: 48 ms, max pause: 15 ms, pauses over 1000 ms: 0 01:03:20, accumulated pauses: 50 ms, max pause: 19 ms, pauses over 1000 ms: 0 01:03:25, accumulated pauses: 56 ms, max pause: 18 ms, pauses over 1000 ms: 0 01:03:30, accumulated pauses: 65 ms, max pause: 25 ms, pauses over 1000 ms: 0 01:03:35, accumulated pauses: 71 ms, max pause: 21 ms, pauses over 1000 ms: 0 01:03:40, accumulated pauses: 343 ms, max pause: 335 ms, pauses over 1000 ms: 0 ``` </details>
1.0
com.hazelcast.internal.dynamicconfig.DynamicConfigSlowPreJoinBouncingTest.doNotThrowExceptionWhenMemberIsGone [HZ-978] - _master_ (commit c7d5d4ed0150e9e927af1541e2e0da730df401f5) Failed on Sonar build (Oracle JDK 11): https://jenkins.hazelcast.com/view/Official%20Builds/job/Hazelcast-master-sonar/1008/testReport/com.hazelcast.internal.dynamicconfig/DynamicConfigSlowPreJoinBouncingTest/doNotThrowExceptionWhenMemberIsGone/ <details><summary>Stacktrace:</summary> ``` java.lang.AssertionError: expected:<MapConfig{name='20bb9939-3b4c-4510-86ea-85b2568843be', inMemoryFormat='OBJECT', metadataPolicy=CREATE_ON_UPDATE, backupCount=3, asyncBackupCount=0, timeToLiveSeconds=12, maxIdleSeconds=20, readBackupData=true, evictionConfig=EvictionConfig{size=1000, maxSizePolicy=FREE_HEAP_SIZE, evictionPolicy=LRU, comparatorClassName=null, comparator=LRUEvictionPolicyComparator{com.hazelcast.internal.eviction.impl.comparator.LRUEvictionPolicyComparator@5e2c3ca2} }, merkleTree=MerkleTreeConfig{enabled=null, depth=10}, eventJournal=EventJournalConfig{enabled=false, capacity=10000, timeToLiveSeconds=0}, hotRestart=HotRestartConfig{enabled=true, fsync=true}, dataPersistenceConfig=DataPersistenceConfig{enabled=true, fsync=true}, nearCacheConfig=NearCacheConfig{name=default, inMemoryFormat=NATIVE, invalidateOnChange=true, timeToLiveSeconds=0, maxIdleSeconds=0, evictionConfig=EvictionConfig{size=10000, maxSizePolicy=ENTRY_COUNT, evictionPolicy=LRU, comparatorClassName=null, comparator=null}, cacheLocalEntries=true, localUpdatePolicy=CACHE_ON_UPDATE, preloaderConfig=NearCachePreloaderConfig{enabled=true, directory=, storeInitialDelaySeconds=600, storeIntervalSeconds=600}}, mapStoreConfig=MapStoreConfig{enabled=true, className='foo.bar.MapStoreDoesNotExist', factoryClassName='null', writeDelaySeconds=0, writeBatchSize=1, implementation=null, factoryImplementation=null, properties={}, initialLoadMode=LAZY, writeCoalescing=true}, mergePolicyConfig=MergePolicyConfig{policy='com.hazelcast.spi.merge.PutIfAbsentMergePolicy', batchSize=100}, wanReplicationRef=WanReplicationRef{name='name', mergePolicy='foo.bar.PolicyClass', filters='[]', republishingEnabled='true'}, entryListenerConfigs=[EntryListenerConfig{local=true, includeValue=true}, EntryListenerConfig{local=true, includeValue=true}, EntryListenerConfig{local=true, includeValue=true}], indexConfigs=[IndexConfig{name=null, type=SORTED, attributes=[orderAttribute]}, IndexConfig{name=null, type=HASH, attributes=[unorderedAttribute]}], attributeConfigs=[AttributeConfig{name='attribute'extractorClassName='foo.bar.ExtractorClass'}], splitBrainProtectionName=split-brain-protection, queryCacheConfigs=[QueryCacheConfig{batchSize=100, bufferSize=16, delaySeconds=0, includeValue=true, populate=true, coalesce=false, inMemoryFormat=OBJECT, name='queryCacheName', predicateConfig=PredicateConfig{className='null', sql='null', implementation=null}, evictionConfig=EvictionConfig{size=10000, maxSizePolicy=ENTRY_COUNT, evictionPolicy=LRU, comparatorClassName=null, comparator=null}, entryListenerConfigs=[EntryListenerConfig{local=false, includeValue=true}], indexConfigs=[IndexConfig{name=null, type=HASH, attributes=[attribute]}]}], cacheDeserializedValues=ALWAYS, statisticsEnabled=false, entryStatsEnabled=true}> but was:<MapConfig{name='20bb9939-3b4c-4510-86ea-85b2568843be', inMemoryFormat='BINARY', metadataPolicy=CREATE_ON_UPDATE, backupCount=1, asyncBackupCount=0, timeToLiveSeconds=0, maxIdleSeconds=0, readBackupData=false, evictionConfig=EvictionConfig{size=2147483647, maxSizePolicy=PER_NODE, evictionPolicy=NONE, comparatorClassName=null, comparator=null}, merkleTree=MerkleTreeConfig{enabled=null, depth=10}, eventJournal=EventJournalConfig{enabled=false, capacity=10000, timeToLiveSeconds=0}, hotRestart=HotRestartConfig{enabled=false, fsync=false}, dataPersistenceConfig=DataPersistenceConfig{enabled=false, fsync=false}, nearCacheConfig=null, mapStoreConfig=MapStoreConfig{enabled=false, className='null', factoryClassName='null', writeDelaySeconds=0, writeBatchSize=1, implementation=null, factoryImplementation=null, properties={}, initialLoadMode=LAZY, writeCoalescing=true}, mergePolicyConfig=MergePolicyConfig{policy='com.hazelcast.spi.merge.PutIfAbsentMergePolicy', batchSize=100}, wanReplicationRef=null, entryListenerConfigs=[], indexConfigs=[], attributeConfigs=[], splitBrainProtectionName=null, queryCacheConfigs=[], cacheDeserializedValues=INDEX_ONLY, statisticsEnabled=true, entryStatsEnabled=false}> at org.junit.Assert.fail(Assert.java:89) at org.junit.Assert.failNotEquals(Assert.java:835) at org.junit.Assert.assertEquals(Assert.java:120) at org.junit.Assert.assertEquals(Assert.java:146) at com.hazelcast.internal.dynamicconfig.DynamicConfigBouncingTest.doNotThrowExceptionWhenMemberIsGone(DynamicConfigBouncingTest.java:84) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method) at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62) at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43) at java.base/java.lang.reflect.Method.invoke(Method.java:566) at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:59) at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12) at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:56) at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:115) at com.hazelcast.test.FailOnTimeoutStatement$CallableStatement.call(FailOnTimeoutStatement.java:107) at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264) at java.base/java.lang.Thread.run(Thread.java:834) ``` </details> <details><summary>Standard output:</summary> ``` 01:00:59,239 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5701 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:00:59,240 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:00:59,242 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:00:59,245 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:00:59,248 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:00:59,248 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is STARTING 01:00:59,248 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:1, ver:1} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this ] 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is STARTED 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5702 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:00:59,249 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:00:59,251 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:00:59,254 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:00:59,257 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:00:59,257 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is STARTING 01:00:59,257 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:00:59,258 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:00:59,258 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:2, ver:2} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 ] 01:01:00,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:2, ver:2} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this ] 01:01:00,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is STARTED 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5703 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:01:00,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:01:00,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:01:00,267 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:01:00,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:01:00,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is STARTING 01:01:00,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:01:00,371 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:01,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:3} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.generic-operation.thread-3 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:3, ver:3} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this ] 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:3, ver:3} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:01:02,260 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is STARTED 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5704 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:01:02,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:01:02,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:01:02,267 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:01:02,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:01:02,271 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is STARTING 01:01:02,271 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:01:05,261 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:06,261 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:07,762 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:09,262 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 ] 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.epic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 this ] 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 ] 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is STARTED 01:01:10,263 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:4, ver:4} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 ] 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5705 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:01:10,264 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:01:10,266 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:01:10,269 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:01:10,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:01:10,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is STARTING 01:01:10,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:01:12,263 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:15,274 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:20,776 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:24,267 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:26,278 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:31,780 WARN |doNotThrowExceptionWhenMemberIsGone| - [MockJoiner] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Resetting master address because join address timeout 01:01:32,270 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:33,270 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:34,271 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:34,271 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:36,272 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,272 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba this ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.epic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 this Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:5, ver:5} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:01:37,273 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is STARTED 01:01:48,277 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 since it is expired (now: 2021-10-15 01:01:48.277, timestamp: 2021-10-15 01:01:17.269) 01:01:54,279 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:01:55,280 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:01:59,282 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:02,284 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 since it is expired (now: 2021-10-15 01:02:02.284, timestamp: 2021-10-15 01:01:24.256) 01:02:05,285 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:10,287 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:12,288 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 since it is expired (now: 2021-10-15 01:02:12.288, timestamp: 2021-10-15 01:01:29.256) 01:02:14,247 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Suspecting Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 because it has not sent any heartbeats since 2021-10-15 01:01:09.262. Now: 2021-10-15 01:02:14.247, heartbeat timeout: 60000 ms, suspicion level: 1.00 01:02:16,289 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5704 - 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5702, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5703, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.vigilant_bose.cached.thread-2 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,290 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-8 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.admiring_bose.cached.thread-6 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba this ] 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.elastic_bose.cached.thread-5 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5704, UUID: 84d277a8-eb56-41f9-9584-9dffd3eb1898 01:02:17,291 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:4, ver:6} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba ] 01:02:20,292 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:22,269 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:02:22,269 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5702, alive=true} 01:02:22,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5703, alive=true} 01:02:22,270 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.cached.thread-3 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:02:23,293 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:27,294 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 since it is expired (now: 2021-10-15 01:02:27.294, timestamp: 2021-10-15 01:01:35.268) 01:02:34,296 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:36,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:36,297 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.generic-operation.thread-3 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:02:36,397 INFO |doNotThrowExceptionWhenMemberIsGone| - [ExplicitSuspicionOp] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Received suspicion request from: [127.0.0.1]:5701 01:02:36,397 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d is suspected to be dead for reason: explicit suspicion 01:02:36,397 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:02:36,397 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.epic_bose.generic-operation.thread-2 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5704, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:02:37,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:38,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:38,297 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:38,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Ignoring heartbeat from Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 since it is expired (now: 2021-10-15 01:02:38.298, timestamp: 2021-10-15 01:01:44.256) 01:02:39,247 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Suspecting Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba because it has not sent any heartbeats since 2021-10-15 01:01:36.272. Now: 2021-10-15 01:02:39.247, heartbeat timeout: 60000 ms, suspicion level: 1.00 01:02:39,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5705, could not acquire lock in time. 01:02:39,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:39,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5705, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5704, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5702, could not acquire lock in time. 01:02:40,298 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.generic-operation.thread-2 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Cannot handle heartbeat from [127.0.0.1]:5703, could not acquire lock in time. 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.admiring_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:02:40,298 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5705 - b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5702, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5702, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5703, connection: MockConnection{localEndpoint=[127.0.0.1]:5705, remoteEndpoint=[127.0.0.1]:5703, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5702, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5703, remoteEndpoint=[127.0.0.1]:5705, alive=false} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.cached.thread-13 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:7} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-10 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5705, UUID: b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.vigilant_bose.cached.thread-12 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5705, UUID: b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.vigilant_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Members {size:3, ver:7} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 this Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 ] 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5705, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5705, alive=true} 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.elastic_bose.cached.thread-1 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5705, UUID: b04f3c0d-060f-470f-8034-14757b529bba 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.elastic_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Members {size:3, ver:7} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5702 - a898837c-6ab1-4e68-bd17-6aa1ef2eff47 Member [127.0.0.1]:5703 - 088c43df-7e6b-4243-881f-678ecbf43272 this ] 01:02:40,299 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=true} 01:02:40,299 WARN |doNotThrowExceptionWhenMemberIsGone| - [ClusterHeartbeatManager] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAP ...[truncated 153624 chars]... ce] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:27} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 ] 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.thirsty_bose.cached.thread-7 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5713, UUID: 83ca37ea-e72a-4868-99ef-7cd13b6b48ac 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-12 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5713, UUID: 83ca37ea-e72a-4868-99ef-7cd13b6b48ac 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.blissful_bose.cached.thread-1 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5713, UUID: 83ca37ea-e72a-4868-99ef-7cd13b6b48ac 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.generic-operation.thread-2 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:3, ver:27} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this ] 01:03:30,680 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.thirsty_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Members {size:3, ver:27} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 ] 01:03:30,681 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:30,683 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:30,683 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:30,683 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5713 is SHUTDOWN 01:03:31,341 INFO |doNotThrowExceptionWhenMemberIsGone| - [ProgressMonitor] doNotThrowExceptionWhenMemberIsGone - Aggregated progress: 503975 operations. Maximum latency: 5520 ms.Throughput in last 5000 ms: 55237 ops / second. 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5716 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:03:32,695 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:03:32,697 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:03:32,700 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is STARTING 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:03:32,703 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:32,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:33,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.thirsty_bose.generic-operation.thread-0 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:33,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.blissful_bose.generic-operation.thread-3 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:33,704 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this ] 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5714, alive=true} 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5715, alive=true} 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.generic-operation.thread-3 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:33,705 WARN |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Config seed port is 5701 and cluster size is 4. Some of the ports seem occupied! 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.thirsty_bose.generic-operation.thread-0 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Members {size:4, ver:28} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:33,705 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is STARTED 01:03:33,706 INFO |doNotThrowExceptionWhenMemberIsGone| - [HealthMonitor] hz.silly_bose.HealthMonitor - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] processors=8, physical.memory.total=755.6G, physical.memory.free=660.1G, swap.space.total=4.0G, swap.space.free=4.0G, heap.memory.used=1.7G, heap.memory.free=277.5M, heap.memory.total=2.0G, heap.memory.max=2.0G, heap.memory.used/total=86.40%, heap.memory.used/max=86.19%, minor.gc.count=5137, minor.gc.time=45848ms, major.gc.count=2, major.gc.time=574ms, load.process=11.11%, load.system=16.22%, load.systemAverage=7.92, thread.count=649, thread.peakCount=2810, cluster.timeDiff=-1001, event.q.size=0, executor.q.async.size=0, executor.q.client.size=0, executor.q.client.query.size=0, executor.q.client.blocking.size=0, executor.q.query.size=0, executor.q.scheduled.size=0, executor.q.io.size=0, executor.q.system.size=0, executor.q.operations.size=0, executor.q.priorityOperation.size=0, operations.completed.count=7, executor.q.mapLoad.size=0, executor.q.mapLoadAllKeys.size=0, executor.q.cluster.size=0, executor.q.response.size=0, operations.running.count=0, operations.pending.invocations.percentage=0.00%, operations.pending.invocations.count=0, proxy.count=0, clientEndpoint.count=0, connection.active.count=0, client.connection.count=0, connection.count=0 01:03:35,706 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTTING_DOWN 01:03:35,706 WARN |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Terminating forcefully... 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5714, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5714, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5714, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5714, alive=false} 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5714, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:35,707 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58679 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this 01:03:35,707 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58679 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 this 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5714 - 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58679 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:29} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-5 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5714, UUID: 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.blissful_bose.cached.thread-6 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5714, UUID: 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,707 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.silly_bose.cached.thread-4 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5714, UUID: 29769b30-142e-42b6-95a7-70cfea4bba80 01:03:35,708 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.generic-operation.thread-2 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:3, ver:29} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:35,708 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:3, ver:29} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this ] 01:03:35,710 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:35,710 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 4 ms. 01:03:35,710 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTDOWN 01:03:36,341 INFO |doNotThrowExceptionWhenMemberIsGone| - [ProgressMonitor] doNotThrowExceptionWhenMemberIsGone - Aggregated progress: 560072 operations. Maximum latency: 5520 ms.Throughput in last 5000 ms: 56765 ops / second. 01:03:37,712 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [LOCAL] [dev] [5.1-SNAPSHOT] Overridden metrics configuration with system property 'hazelcast.metrics.collection.frequency'='1' -> 'MetricsConfig.collectionFrequencySeconds'='1' 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [logo] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] + + o o o o---o o----o o o---o o o----o o--o--o + + + + | | / \ / | | / / \ | | + + + + + o----o o o o o----o | o o o o----o | + + + + | | / \ / | | \ / \ | | + + o o o o o---o o----o o----o o---o o o o----o o 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Copyright (c) 2008-2021, Hazelcast, Inc. All Rights Reserved. 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Hazelcast Platform 5.1-SNAPSHOT (20211014 - c7d5d4e) starting at [127.0.0.1]:5717 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Cluster name: dev 01:03:37,713 INFO |doNotThrowExceptionWhenMemberIsGone| - [system] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] The Jet engine is disabled. To enable the Jet engine on the members, please do one of the following: - Change member config using Java API: config.getJetConfig().setEnabled(true); - Change XML/YAML configuration property: Set hazelcast.jet.enabled to true - Add system property: -Dhz.jet.enabled=true - Add environment variable: HZ_JET_ENABLED=true 01:03:37,715 INFO |doNotThrowExceptionWhenMemberIsGone| - [MetricsConfigHelper] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Collecting debug metrics and sending to diagnostics is enabled 01:03:37,719 WARN |doNotThrowExceptionWhenMemberIsGone| - [CPSubsystem] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] CP Subsystem is not enabled. CP data structures will operate in UNSAFE mode! Please note that UNSAFE mode will not provide strong consistency guarantees. 01:03:37,723 INFO |doNotThrowExceptionWhenMemberIsGone| - [Diagnostics] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Diagnostics disabled. To enable add -Dhazelcast.diagnostics.enabled=true to the JVM arguments. 01:03:37,723 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is STARTING 01:03:37,723 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5701, alive=true} 01:03:37,724 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5717, alive=true} 01:03:37,724 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.serene_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.blissful_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5717, alive=true} 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5717, alive=true} 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.wizardly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d this ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.blissful_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:4, ver:30} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5715, alive=true} 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Created connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5716, alive=true} 01:03:38,725 WARN |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Config seed port is 5701 and cluster size is 4. Some of the ports seem occupied! 01:03:38,725 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58679 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is STARTED 01:03:38,727 INFO |doNotThrowExceptionWhenMemberIsGone| - [HealthMonitor] hz.wizardly_bose.HealthMonitor - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] processors=8, physical.memory.total=755.6G, physical.memory.free=660.1G, swap.space.total=4.0G, swap.space.free=4.0G, heap.memory.used=1.5G, heap.memory.free=516.4M, heap.memory.total=2.0G, heap.memory.max=2.0G, heap.memory.used/total=74.71%, heap.memory.used/max=74.52%, minor.gc.count=5139, minor.gc.time=45870ms, major.gc.count=2, major.gc.time=574ms, load.process=14.29%, load.system=25.00%, load.systemAverage=8.01, thread.count=647, thread.peakCount=2810, cluster.timeDiff=-1001, event.q.size=0, executor.q.async.size=0, executor.q.client.size=0, executor.q.client.query.size=0, executor.q.client.blocking.size=0, executor.q.query.size=0, executor.q.scheduled.size=0, executor.q.io.size=0, executor.q.system.size=0, executor.q.operations.size=0, executor.q.priorityOperation.size=0, operations.completed.count=7, executor.q.mapLoad.size=0, executor.q.mapLoadAllKeys.size=0, executor.q.cluster.size=0, executor.q.response.size=0, operations.running.count=0, operations.pending.invocations.percentage=0.00%, operations.pending.invocations.count=0, proxy.count=0, clientEndpoint.count=0, connection.active.count=0, client.connection.count=0, connection.count=0 01:03:39,259 INFO |doNotThrowExceptionWhenMemberIsGone| - [HealthMonitor] hz.serene_bose.HealthMonitor - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] processors=8, physical.memory.total=755.6G, physical.memory.free=660.1G, swap.space.total=4.0G, swap.space.free=4.0G, heap.memory.used=1.7G, heap.memory.free=311.7M, heap.memory.total=2.0G, heap.memory.max=2.0G, heap.memory.used/total=84.74%, heap.memory.used/max=84.53%, minor.gc.count=5139, minor.gc.time=45870ms, major.gc.count=2, major.gc.time=574ms, load.process=5.00%, load.system=14.71%, load.systemAverage=8.01, thread.count=647, thread.peakCount=2810, cluster.timeDiff=0, event.q.size=0, executor.q.async.size=0, executor.q.client.size=0, executor.q.client.query.size=0, executor.q.client.blocking.size=0, executor.q.query.size=0, executor.q.scheduled.size=0, executor.q.io.size=0, executor.q.system.size=0, executor.q.operations.size=0, executor.q.priorityOperation.size=0, operations.completed.count=436, executor.q.mapLoad.size=0, executor.q.mapLoadAllKeys.size=0, executor.q.cluster.size=0, executor.q.response.size=0, operations.running.count=2, operations.pending.invocations.percentage=0.00%, operations.pending.invocations.count=0, proxy.count=0, clientEndpoint.count=0, connection.active.count=0, client.connection.count=0, connection.count=0 01:03:40,342 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] doNotThrowExceptionWhenMemberIsGone - Test deadline reached, tearing down 01:03:40,342 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] doNotThrowExceptionWhenMemberIsGone - Waiting until 2021-10-15 01:04:10.342 for test tasks to complete gracefully. 01:03:40,342 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58673 - Waiting for member bouncing thread to stop... 01:03:40,427 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58679 - Member bouncing thread exiting 01:03:40,427 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58673 - Member bouncing thread stopped. 01:03:40,819 INFO |doNotThrowExceptionWhenMemberIsGone| - [BounceMemberRule] Thread-58673 - Tearing down BounceMemberRule 01:03:40,819 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTTING_DOWN 01:03:40,820 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,820 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,821 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5711 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5711 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5711 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5711 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5711 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5710 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5710 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5710 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5710 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5710 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5713 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5713 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5713 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5712 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5712 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5712 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5712 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5712 is SHUTDOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5715 is SHUTTING_DOWN 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5717, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5715, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5715, alive=false} 01:03:40,822 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5715, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:40,822 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this 01:03:40,822 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 this 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5715 - 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:3, ver:31} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5715, UUID: 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.wizardly_bose.cached.thread-1 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5715, UUID: 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.silly_bose.cached.thread-6 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5715, UUID: 048b1b6b-2385-4a9b-83de-ce5ffd7e9bc3 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:3, ver:31} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d ] 01:03:40,823 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.wizardly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Members {size:3, ver:31} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d this ] 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5715 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5715 is SHUTDOWN 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTTING_DOWN 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5714 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5714 is SHUTDOWN 01:03:40,825 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is SHUTTING_DOWN 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5717, alive=false} 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5717, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5717, alive=false} 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5717, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:40,826 WARN |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d is suspected to be dead for reason: Connection manager is stopped on Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d this 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5717 - cc7ae526-0a6b-4d05-b92b-6cdd0166920d 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:2, ver:32} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 ] 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5717, UUID: cc7ae526-0a6b-4d05-b92b-6cdd0166920d 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] hz.silly_bose.priority-generic-operation.thread-0 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Members {size:2, ver:32} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 this ] 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.silly_bose.cached.thread-6 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5717, UUID: cc7ae526-0a6b-4d05-b92b-6cdd0166920d 01:03:40,826 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 3 ms. 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5717 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5717 is SHUTDOWN 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is SHUTTING_DOWN 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5716, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5716, alive=false} 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5701, connection: MockConnection{localEndpoint=[127.0.0.1]:5716, remoteEndpoint=[127.0.0.1]:5701, alive=false} 01:03:40,828 INFO |doNotThrowExceptionWhenMemberIsGone| - [MembershipManager] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removing Member [127.0.0.1]:5716 - c5e0bfc3-97dc-429e-ac61-163a781a3a30 01:03:40,829 INFO |doNotThrowExceptionWhenMemberIsGone| - [ClusterService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Members {size:1, ver:33} [ Member [127.0.0.1]:5701 - a330b824-fec1-42bc-99f4-4fc79fb15f8d this ] 01:03:40,829 INFO |doNotThrowExceptionWhenMemberIsGone| - [TransactionManagerService] hz.serene_bose.cached.thread-1 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Committing/rolling-back live transactions of [127.0.0.1]:5716, UUID: c5e0bfc3-97dc-429e-ac61-163a781a3a30 01:03:40,829 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 2 ms. 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5716 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5716 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5703 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5703 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5702 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5702 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5705 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5705 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5704 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5704 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5707 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5707 is SHUTTING_DOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5707 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5707 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5707 is SHUTDOWN 01:03:40,830 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5706 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5706 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5706 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5706 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5706 is SHUTDOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5709 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5709 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5709 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5709 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5709 is SHUTDOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5708 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5708 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5708 [dev] [5.1-SNAPSHOT] Node is already shutting down... Waiting for shutdown process to complete... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5708 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5708 is SHUTDOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is SHUTTING_DOWN 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Shutting down connection manager... 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [MockServer] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Removed connection to endpoint: [127.0.0.1]:5704, connection: MockConnection{localEndpoint=[127.0.0.1]:5701, remoteEndpoint=[127.0.0.1]:5704, alive=false} 01:03:40,831 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Shutting down node engine... 01:03:40,833 INFO |doNotThrowExceptionWhenMemberIsGone| - [NodeExtension] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Destroying node NodeExtension. 01:03:40,833 INFO |doNotThrowExceptionWhenMemberIsGone| - [Node] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] Hazelcast Shutdown is completed in 2 ms. 01:03:40,833 INFO |doNotThrowExceptionWhenMemberIsGone| - [LifecycleService] Thread-58673 - [127.0.0.1]:5701 [dev] [5.1-SNAPSHOT] [127.0.0.1]:5701 is SHUTDOWN BuildInfo right after doNotThrowExceptionWhenMemberIsGone(com.hazelcast.internal.dynamicconfig.DynamicConfigSlowPreJoinBouncingTest): BuildInfo{version='5.1-SNAPSHOT', build='20211014', buildNumber=20211014, revision=c7d5d4e, enterprise=false, serializationVersion=1} Hiccups measured while running test 'doNotThrowExceptionWhenMemberIsGone(com.hazelcast.internal.dynamicconfig.DynamicConfigSlowPreJoinBouncingTest):' 01:00:55, accumulated pauses: 62 ms, max pause: 19 ms, pauses over 1000 ms: 0 01:01:00, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:05, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:10, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:15, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:20, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:25, accumulated pauses: 37 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:30, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:35, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:40, accumulated pauses: 35 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:45, accumulated pauses: 33 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:50, accumulated pauses: 34 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:01:55, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:00, accumulated pauses: 36 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:05, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:10, accumulated pauses: 31 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:15, accumulated pauses: 33 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:20, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:25, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:30, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:35, accumulated pauses: 32 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:40, accumulated pauses: 33 ms, max pause: 0 ms, pauses over 1000 ms: 0 01:02:45, accumulated pauses: 37 ms, max pause: 7 ms, pauses over 1000 ms: 0 01:02:50, accumulated pauses: 107 ms, max pause: 74 ms, pauses over 1000 ms: 0 01:02:55, accumulated pauses: 75 ms, max pause: 26 ms, pauses over 1000 ms: 0 01:03:00, accumulated pauses: 64 ms, max pause: 26 ms, pauses over 1000 ms: 0 01:03:05, accumulated pauses: 158 ms, max pause: 101 ms, pauses over 1000 ms: 0 01:03:10, accumulated pauses: 76 ms, max pause: 22 ms, pauses over 1000 ms: 0 01:03:15, accumulated pauses: 48 ms, max pause: 15 ms, pauses over 1000 ms: 0 01:03:20, accumulated pauses: 50 ms, max pause: 19 ms, pauses over 1000 ms: 0 01:03:25, accumulated pauses: 56 ms, max pause: 18 ms, pauses over 1000 ms: 0 01:03:30, accumulated pauses: 65 ms, max pause: 25 ms, pauses over 1000 ms: 0 01:03:35, accumulated pauses: 71 ms, max pause: 21 ms, pauses over 1000 ms: 0 01:03:40, accumulated pauses: 343 ms, max pause: 335 ms, pauses over 1000 ms: 0 ``` </details>
test
com hazelcast internal dynamicconfig dynamicconfigslowprejoinbouncingtest donotthrowexceptionwhenmemberisgone master commit failed on sonar build oracle jdk stacktrace java lang assertionerror expected but was at org junit assert fail assert java at org junit assert failnotequals assert java at org junit assert assertequals assert java at org junit assert assertequals assert java at com hazelcast internal dynamicconfig dynamicconfigbouncingtest donotthrowexceptionwhenmemberisgone dynamicconfigbouncingtest java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at org junit runners model frameworkmethod runreflectivecall frameworkmethod java at org junit internal runners model reflectivecallable run reflectivecallable java at org junit runners model frameworkmethod invokeexplosively frameworkmethod java at org junit internal runners statements invokemethod evaluate invokemethod java at com hazelcast test failontimeoutstatement callablestatement call failontimeoutstatement java at com hazelcast test failontimeoutstatement callablestatement call failontimeoutstatement java at java base java util concurrent futuretask run futuretask java at java base java lang thread run thread java standard output info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread members size ver member this info donotthrowexceptionwhenmemberisgone thread is started info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread members size ver member this member info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread members size ver member member this info donotthrowexceptionwhenmemberisgone thread is started info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread members size ver member this member member info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz elastic bose generic operation thread members size ver member member member this info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread members size ver member member this member info donotthrowexceptionwhenmemberisgone thread is started info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true warn donotthrowexceptionwhenmemberisgone thread resetting master address because join address timeout info donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread members size ver member this member member member info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose priority generic operation thread members size ver member member member member this info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread members size ver member member member this member info donotthrowexceptionwhenmemberisgone thread is started info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread members size ver member member this member member info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone thread resetting master address because join address timeout warn donotthrowexceptionwhenmemberisgone thread resetting master address because join address timeout warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone thread resetting master address because join address timeout warn donotthrowexceptionwhenmemberisgone thread resetting master address because join address timeout warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread members size ver member this member member member member warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz admiring bose priority generic operation thread members size ver member member member member member this info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread members size ver member member member this member member info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose priority generic operation thread members size ver member member member member this member info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread members size ver member member this member member member info donotthrowexceptionwhenmemberisgone thread is started warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread ignoring heartbeat from member since it is expired now timestamp warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread ignoring heartbeat from member since it is expired now timestamp warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread ignoring heartbeat from member since it is expired now timestamp warn donotthrowexceptionwhenmemberisgone hz serene bose cached thread suspecting member because it has not sent any heartbeats since now heartbeat timeout ms suspicion level warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz serene bose cached thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz serene bose cached thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz serene bose cached thread removing member info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz admiring bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz admiring bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz serene bose cached thread members size ver member this member member member info donotthrowexceptionwhenmemberisgone hz vigilant bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread members size ver member member this member member info donotthrowexceptionwhenmemberisgone hz admiring bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz admiring bose priority generic operation thread members size ver member member member member this info donotthrowexceptionwhenmemberisgone hz elastic bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread members size ver member member member this member warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz epic bose cached thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose cached thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose cached thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose cached thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread ignoring heartbeat from member since it is expired now timestamp warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz epic bose generic operation thread received suspicion request from warn donotthrowexceptionwhenmemberisgone hz epic bose generic operation thread member is suspected to be dead for reason explicit suspicion info donotthrowexceptionwhenmemberisgone hz epic bose generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz epic bose generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread ignoring heartbeat from member since it is expired now timestamp warn donotthrowexceptionwhenmemberisgone hz serene bose cached thread suspecting member because it has not sent any heartbeats since now heartbeat timeout ms suspicion level warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time warn donotthrowexceptionwhenmemberisgone hz serene bose generic operation thread cannot handle heartbeat from could not acquire lock in time info donotthrowexceptionwhenmemberisgone hz serene bose cached thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz serene bose cached thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz admiring bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose cached thread removing member info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone hz serene bose cached thread members size ver member this member member info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz vigilant bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz vigilant bose priority generic operation thread members size ver member member this member info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz elastic bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz elastic bose priority generic operation thread members size ver member member member this info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true warn donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread snap ce thread members size ver member this member member info donotthrowexceptionwhenmemberisgone hz thirsty bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz blissful bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz blissful bose generic operation thread members size ver member member member this info donotthrowexceptionwhenmemberisgone hz thirsty bose priority generic operation thread members size ver member member this member info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone donotthrowexceptionwhenmemberisgone aggregated progress operations maximum latency ms throughput in last ms ops second info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread members size ver member this member member member info donotthrowexceptionwhenmemberisgone hz thirsty bose generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz blissful bose generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz silly bose priority generic operation thread members size ver member member member member this info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz blissful bose generic operation thread members size ver member member member this member warn donotthrowexceptionwhenmemberisgone thread config seed port is and cluster size is some of the ports seem occupied info donotthrowexceptionwhenmemberisgone hz thirsty bose generic operation thread members size ver member member this member member info donotthrowexceptionwhenmemberisgone thread is started info donotthrowexceptionwhenmemberisgone hz silly bose healthmonitor processors physical memory total physical memory free swap space total swap space free heap memory used heap memory free heap memory total heap memory max heap memory used total heap memory used max minor gc count minor gc time major gc count major gc time load process load system load systemaverage thread count thread peakcount cluster timediff event q size executor q async size executor q client size executor q client query size executor q client blocking size executor q query size executor q scheduled size executor q io size executor q system size executor q operations size executor q priorityoperation size operations completed count executor q mapload size executor q maploadallkeys size executor q cluster size executor q response size operations running count operations pending invocations percentage operations pending invocations count proxy count clientendpoint count connection active count client connection count connection count info donotthrowexceptionwhenmemberisgone thread is shutting down warn donotthrowexceptionwhenmemberisgone thread terminating forcefully info donotthrowexceptionwhenmemberisgone thread shutting down connection manager info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false warn donotthrowexceptionwhenmemberisgone thread member is suspected to be dead for reason connection manager is stopped on member this warn donotthrowexceptionwhenmemberisgone thread member is suspected to be dead for reason connection manager is stopped on member this info donotthrowexceptionwhenmemberisgone thread removing member info donotthrowexceptionwhenmemberisgone thread members size ver member this member member info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone hz blissful bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz silly bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz blissful bose generic operation thread members size ver member member this member info donotthrowexceptionwhenmemberisgone hz silly bose priority generic operation thread members size ver member member member this info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone donotthrowexceptionwhenmemberisgone aggregated progress operations maximum latency ms throughput in last ms ops second info donotthrowexceptionwhenmemberisgone thread overridden metrics configuration with system property hazelcast metrics collection frequency metricsconfig collectionfrequencyseconds info donotthrowexceptionwhenmemberisgone thread o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o o info donotthrowexceptionwhenmemberisgone thread copyright c hazelcast inc all rights reserved info donotthrowexceptionwhenmemberisgone thread hazelcast platform snapshot starting at info donotthrowexceptionwhenmemberisgone thread cluster name dev info donotthrowexceptionwhenmemberisgone thread the jet engine is disabled to enable the jet engine on the members please do one of the following change member config using java api config getjetconfig setenabled true change xml yaml configuration property set hazelcast jet enabled to true add system property dhz jet enabled true add environment variable hz jet enabled true info donotthrowexceptionwhenmemberisgone thread collecting debug metrics and sending to diagnostics is enabled warn donotthrowexceptionwhenmemberisgone thread cp subsystem is not enabled cp data structures will operate in unsafe mode please note that unsafe mode will not provide strong consistency guarantees info donotthrowexceptionwhenmemberisgone thread diagnostics disabled to enable add dhazelcast diagnostics enabled true to the jvm arguments info donotthrowexceptionwhenmemberisgone thread is starting info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz serene bose priority generic operation thread members size ver member this member member member info donotthrowexceptionwhenmemberisgone hz blissful bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz silly bose priority generic operation thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone hz wizardly bose priority generic operation thread members size ver member member member member this info donotthrowexceptionwhenmemberisgone hz blissful bose priority generic operation thread members size ver member member this member member info donotthrowexceptionwhenmemberisgone hz silly bose priority generic operation thread members size ver member member member this member info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true info donotthrowexceptionwhenmemberisgone thread created connection to endpoint connection mockconnection localendpoint remoteendpoint alive true warn donotthrowexceptionwhenmemberisgone thread config seed port is and cluster size is some of the ports seem occupied info donotthrowexceptionwhenmemberisgone thread is started info donotthrowexceptionwhenmemberisgone hz wizardly bose healthmonitor processors physical memory total physical memory free swap space total swap space free heap memory used heap memory free heap memory total heap memory max heap memory used total heap memory used max minor gc count minor gc time major gc count major gc time load process load system load systemaverage thread count thread peakcount cluster timediff event q size executor q async size executor q client size executor q client query size executor q client blocking size executor q query size executor q scheduled size executor q io size executor q system size executor q operations size executor q priorityoperation size operations completed count executor q mapload size executor q maploadallkeys size executor q cluster size executor q response size operations running count operations pending invocations percentage operations pending invocations count proxy count clientendpoint count connection active count client connection count connection count info donotthrowexceptionwhenmemberisgone hz serene bose healthmonitor processors physical memory total physical memory free swap space total swap space free heap memory used heap memory free heap memory total heap memory max heap memory used total heap memory used max minor gc count minor gc time major gc count major gc time load process load system load systemaverage thread count thread peakcount cluster timediff event q size executor q async size executor q client size executor q client query size executor q client blocking size executor q query size executor q scheduled size executor q io size executor q system size executor q operations size executor q priorityoperation size operations completed count executor q mapload size executor q maploadallkeys size executor q cluster size executor q response size operations running count operations pending invocations percentage operations pending invocations count proxy count clientendpoint count connection active count client connection count connection count info donotthrowexceptionwhenmemberisgone donotthrowexceptionwhenmemberisgone test deadline reached tearing down info donotthrowexceptionwhenmemberisgone donotthrowexceptionwhenmemberisgone waiting until for test tasks to complete gracefully info donotthrowexceptionwhenmemberisgone thread waiting for member bouncing thread to stop info donotthrowexceptionwhenmemberisgone thread member bouncing thread exiting info donotthrowexceptionwhenmemberisgone thread member bouncing thread stopped info donotthrowexceptionwhenmemberisgone thread tearing down bouncememberrule info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread shutting down connection manager info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread shutting down connection manager info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false warn donotthrowexceptionwhenmemberisgone thread member is suspected to be dead for reason connection manager is stopped on member this warn donotthrowexceptionwhenmemberisgone thread member is suspected to be dead for reason connection manager is stopped on member this info donotthrowexceptionwhenmemberisgone thread removing member info donotthrowexceptionwhenmemberisgone thread members size ver member this member member info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz wizardly bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz silly bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz silly bose priority generic operation thread members size ver member member this member info donotthrowexceptionwhenmemberisgone hz wizardly bose priority generic operation thread members size ver member member member this info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread shutting down connection manager info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false warn donotthrowexceptionwhenmemberisgone thread member is suspected to be dead for reason connection manager is stopped on member this info donotthrowexceptionwhenmemberisgone thread removing member info donotthrowexceptionwhenmemberisgone thread members size ver member this member info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone hz silly bose priority generic operation thread members size ver member member this info donotthrowexceptionwhenmemberisgone hz silly bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread shutting down connection manager info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread removing member info donotthrowexceptionwhenmemberisgone thread members size ver member this info donotthrowexceptionwhenmemberisgone hz serene bose cached thread committing rolling back live transactions of uuid info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread node is already shutting down waiting for shutdown process to complete info donotthrowexceptionwhenmemberisgone thread is shutdown info donotthrowexceptionwhenmemberisgone thread is shutting down info donotthrowexceptionwhenmemberisgone thread shutting down connection manager info donotthrowexceptionwhenmemberisgone thread removed connection to endpoint connection mockconnection localendpoint remoteendpoint alive false info donotthrowexceptionwhenmemberisgone thread shutting down node engine info donotthrowexceptionwhenmemberisgone thread destroying node nodeextension info donotthrowexceptionwhenmemberisgone thread hazelcast shutdown is completed in ms info donotthrowexceptionwhenmemberisgone thread is shutdown buildinfo right after donotthrowexceptionwhenmemberisgone com hazelcast internal dynamicconfig dynamicconfigslowprejoinbouncingtest buildinfo version snapshot build buildnumber revision enterprise false serializationversion hiccups measured while running test donotthrowexceptionwhenmemberisgone com hazelcast internal dynamicconfig dynamicconfigslowprejoinbouncingtest accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms accumulated pauses ms max pause ms pauses over ms
1
69,459
30,285,991,517
IssuesEvent
2023-07-08 17:32:24
ps2gg/ps2.gg
https://api.github.com/repos/ps2gg/ps2.gg
closed
Alt matched friends should show original character name
Scope: UI Type: Enhancement Service: Peepo
### I'm submitting a... <!-- Check with [x] --> - [ ] Bug report - [x] Feature request - [ ] Documentation request ### Current behavior <!-- Describe how the issue manifests. --> We show only the matched friends' name, making it unclear who they actually are. ### Expected behavior <!-- Describe the desired behavior. --> Should show the original player's name behind the matched alt. ### Definition of Done <!-- What requirements need to be fulfilled before we can release it --> - [Universal Definition of Done](https://github.com/ps2gg/ps2.gg/blob/master/docs/standards/Definition-Of-Done.md) is adhered to ## <!-- Additional information (optional) -->
1.0
Alt matched friends should show original character name - ### I'm submitting a... <!-- Check with [x] --> - [ ] Bug report - [x] Feature request - [ ] Documentation request ### Current behavior <!-- Describe how the issue manifests. --> We show only the matched friends' name, making it unclear who they actually are. ### Expected behavior <!-- Describe the desired behavior. --> Should show the original player's name behind the matched alt. ### Definition of Done <!-- What requirements need to be fulfilled before we can release it --> - [Universal Definition of Done](https://github.com/ps2gg/ps2.gg/blob/master/docs/standards/Definition-Of-Done.md) is adhered to ## <!-- Additional information (optional) -->
non_test
alt matched friends should show original character name i m submitting a bug report feature request documentation request current behavior we show only the matched friends name making it unclear who they actually are expected behavior should show the original player s name behind the matched alt definition of done is adhered to
0
75,488
14,478,338,307
IssuesEvent
2020-12-10 08:13:51
Foggalong/hardcode-fixer
https://api.github.com/repos/Foggalong/hardcode-fixer
opened
hardcoded app : silmbook battery
hardcoded app
slimbattery.desktop content of this desktop file [Desktop Entry] Name=Slimbook Battery Version=3.0beta Exec=slimbookbattery Comment=Show notifications of your battery states and manage energy! Icon=/usr/share/pixmaps/slimbookbattery.png Type=Application Terminal=false StartupNotify=true Encoding=UTF-8 Categories=Utility; Name[es_ES]=Slimbook Battery Name=Slimbook Battery
1.0
hardcoded app : silmbook battery - slimbattery.desktop content of this desktop file [Desktop Entry] Name=Slimbook Battery Version=3.0beta Exec=slimbookbattery Comment=Show notifications of your battery states and manage energy! Icon=/usr/share/pixmaps/slimbookbattery.png Type=Application Terminal=false StartupNotify=true Encoding=UTF-8 Categories=Utility; Name[es_ES]=Slimbook Battery Name=Slimbook Battery
non_test
hardcoded app silmbook battery slimbattery desktop content of this desktop file name slimbook battery version exec slimbookbattery comment show notifications of your battery states and manage energy icon usr share pixmaps slimbookbattery png type application terminal false startupnotify true encoding utf categories utility name slimbook battery name slimbook battery
0
78,675
22,345,429,462
IssuesEvent
2022-06-15 07:21:24
actonlang/acton
https://api.github.com/repos/actonlang/acton
opened
Fix dependency declaration for stdlib
enhancement Compiler build system
Changing the C source files in stdlib often does not lead to recompilation since actonc is so kind and efficient as to not recompile files if the sources has not been updated. Naturally, it has a rather big blind spot for modules implemented in C. - top level Makefile need to realize something has changed - actonc needs to rerun Make targets in acton project regardless of what it thinks since it simply does not have a complete view of dependencies
1.0
Fix dependency declaration for stdlib - Changing the C source files in stdlib often does not lead to recompilation since actonc is so kind and efficient as to not recompile files if the sources has not been updated. Naturally, it has a rather big blind spot for modules implemented in C. - top level Makefile need to realize something has changed - actonc needs to rerun Make targets in acton project regardless of what it thinks since it simply does not have a complete view of dependencies
non_test
fix dependency declaration for stdlib changing the c source files in stdlib often does not lead to recompilation since actonc is so kind and efficient as to not recompile files if the sources has not been updated naturally it has a rather big blind spot for modules implemented in c top level makefile need to realize something has changed actonc needs to rerun make targets in acton project regardless of what it thinks since it simply does not have a complete view of dependencies
0
96,135
8,597,336,004
IssuesEvent
2018-11-15 18:21:52
offa/stm32-eth
https://api.github.com/repos/offa/stm32-eth
closed
Clang / GCC Leak Sanitizer fails on Travis
arch: desktop ci integration test
Clan 7 Leak Sanitizer fails on Travis: ``` ==488==LeakSanitizer has encountered a fatal error. ==488==HINT: For debugging, try setting environment variable LSAN_OPTIONS=verbosity=1:log_threads=1 ==488==HINT: LeakSanitizer does not work under ptrace (strace, gdb, etc) test/CMakeFiles/unittest.dir/build.make:57: recipe for target 'test/CMakeFiles/unittest' failed ``` https://travis-ci.org/offa/stm32-eth/jobs/454570187 ### Progress - [x] `sudo: required` — **Fails** - [x] `sudo: true` — **Fails** - [x] `--cap-add SYS_PTRACE` — **Works** - [x] Docker invocation with `sudo` — **Fails** - [x] Docker invocation with `sudo` and `--cap-add SYS_PTRACE` — **Works** **Ref.** #125, #128
1.0
Clang / GCC Leak Sanitizer fails on Travis - Clan 7 Leak Sanitizer fails on Travis: ``` ==488==LeakSanitizer has encountered a fatal error. ==488==HINT: For debugging, try setting environment variable LSAN_OPTIONS=verbosity=1:log_threads=1 ==488==HINT: LeakSanitizer does not work under ptrace (strace, gdb, etc) test/CMakeFiles/unittest.dir/build.make:57: recipe for target 'test/CMakeFiles/unittest' failed ``` https://travis-ci.org/offa/stm32-eth/jobs/454570187 ### Progress - [x] `sudo: required` — **Fails** - [x] `sudo: true` — **Fails** - [x] `--cap-add SYS_PTRACE` — **Works** - [x] Docker invocation with `sudo` — **Fails** - [x] Docker invocation with `sudo` and `--cap-add SYS_PTRACE` — **Works** **Ref.** #125, #128
test
clang gcc leak sanitizer fails on travis clan leak sanitizer fails on travis leaksanitizer has encountered a fatal error hint for debugging try setting environment variable lsan options verbosity log threads hint leaksanitizer does not work under ptrace strace gdb etc test cmakefiles unittest dir build make recipe for target test cmakefiles unittest failed progress sudo required — fails sudo true — fails cap add sys ptrace — works docker invocation with sudo — fails docker invocation with sudo and cap add sys ptrace — works ref
1
238,124
7,775,025,478
IssuesEvent
2018-06-05 00:15:15
GoogleCloudPlatform/google-cloud-cpp
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-cpp
opened
Use consistent aliases for namespaces in tests and examples
bigtable priority: p2 type: cleanup
Most of the tests use `btproto` as a namespace alias for `google::bigtable::v2` and `btadmin` for `google::bigtable::admin::v2`. But some use `admin_proto`, or other names. Not very important, but a pass cleaning these up would be nice.
1.0
Use consistent aliases for namespaces in tests and examples - Most of the tests use `btproto` as a namespace alias for `google::bigtable::v2` and `btadmin` for `google::bigtable::admin::v2`. But some use `admin_proto`, or other names. Not very important, but a pass cleaning these up would be nice.
non_test
use consistent aliases for namespaces in tests and examples most of the tests use btproto as a namespace alias for google bigtable and btadmin for google bigtable admin but some use admin proto or other names not very important but a pass cleaning these up would be nice
0
126,670
10,431,965,923
IssuesEvent
2019-09-17 10:11:36
MangopearUK/European-Boating-Association--Theme
https://api.github.com/repos/MangopearUK/European-Boating-Association--Theme
reopened
Test & audit group: DSV
Testing
Page URL: https://eba.eu.com/membership/groups/dsv/ ## Table of contents - [ ] **Task 1:** Perform automated audits _(10 tasks)_ - [ ] **Task 2:** Manual standards & accessibility tests _(61 tasks)_ - [ ] **Task 3:** Breakpoint testing _(15 tasks)_ - [ ] **Task 4:** Re-run automated audits _(10 tasks)_ ## 1: Perform automated audits _(10 tasks)_ ### Lighthouse: - [ ] Run "Accessibility" audit in lighthouse _(using incognito tab)_ - [ ] Run "Performance" audit in lighthouse _(using incognito tab)_ - [ ] Run "Best practices" audit in lighthouse _(using incognito tab)_ - [ ] Run "SEO" audit in lighthouse _(using incognito tab)_ - [ ] Run "PWA" audit in lighthouse _(using incognito tab)_ ### Pingdom - [ ] Run full audit of the the page's performance in Pingdom ### Browser's console - [ ] Check Chrome's console for errors ### Log results of audits - [ ] Screenshot snapshot of the lighthouse audits - [ ] Upload PDF of detailed lighthouse reports - [ ] Provide a screenshot of any console errors ## 2: Manual standards & accessibility tests _(61 tasks)_ ### Forms - [ ] Give all form elements permanently visible labels - [ ] Place labels above form elements - [ ] Mark invalid fields clearly and provide associated error messages - [ ] Make forms as short as possible; offer shortcuts like autocompleting the address using the postcode - [ ] Ensure all form fields have the correct requried state - [ ] Provide status and error messages as WAI-ARIA live regions ### Readability of content - [ ] Ensure page has good grammar - [ ] Ensure page content has been spell-checked - [ ] Make sure headings are in logical order - [ ] Ensure the same content is available across different devices and platforms - [ ] Begin long, multi-section documents with a table of contents ### Presentation - [ ] Make sure all content is formatted correctly - [ ] Avoid all-caps text - [ ] Make sure data tables wider than their container can be scrolled horizontally - [ ] Use the same design patterns to solve the same problems - [ ] Do not mark up subheadings/straplines with separate heading elements ### Links & buttons #### Links - [ ] Check all links to ensure they work - [ ] Check all links to third party websites use `rel="noopener"` - [ ] Make sure the purpose of a link is clearly described: "read more" vs. "read more about accessibility" - [ ] Provide a skip link if necessary - [ ] Underline links — at least in body copy - [ ] Warn users of links that have unusual behaviors, like linking off-site, or loading a new tab (i.e. aria-label) #### Buttons - [ ] Ensure primary calls to action are easy to recognize and reach - [ ] Provide clear, unambiguous focus styles - [ ] Ensure states (pressed, expanded, invalid, etc) are communicated to assistive software - [ ] Ensure disabled controls are not focusable - [ ] Make sure controls within hidden content are not focusable - [ ] Provide large touch "targets" for interactive elements - [ ] Make controls look like controls; give them strong perceived affordance - [ ] Use well-established, therefore recognizable, icons and symbols ### Assistive technology - [ ] Ensure content is not obscured through zooming - [ ] Support Windows high contrast mode (use images, not background images) - [ ] Provide alternative text for salient images - [ ] Make scrollable elements focusable for keyboard users - [ ] Ensure keyboard focus order is logical regarding visual layout - [ ] Match semantics to behavior for assistive technology users - [ ] Provide a default language and use lang="[ISO code]" for subsections in different languages - [ ] Inform the user when there are important changes to the application state - [ ] Do not hijack standard scrolling behavior - [ ] Do not instate "infinite scroll" by default; provide buttons to load more items ### General accessibility - [ ] Make sure text and background colors contrast sufficiently - [ ] Do not rely on color for differentiation of visual elements - [ ] Avoid images of text — text that cannot be translated, selected, or understood by assistive tech - [ ] Provide a print stylesheet - [ ] Honour requests to remove animation via the prefers-reduced-motion media query ### SEO - [ ] Ensure all pages have appropriate title - [ ] Ensure all pages have meta descriptions - [ ] Make content easier to find and improve search results with structured data [Read more](https://developers.google.com/search/docs/guides/prototype) - [ ] Check whether page should be appearing in sitemap - [ ] Make sure page has Facebook and Twitter large image previews set correctly - [ ] Check canonical links for page - [ ] Mark as cornerstone content? ### Performance - [ ] Ensure all CSS assets are minified and concatenated - [ ] Ensure all JS assets are minified and concatenated - [ ] Ensure all images are compressed - [ ] Where possible, remove redundant code - [ ] Ensure all SVG assets have been optimised - [ ] Make sure styles and scripts are not render blocking - [ ] Ensure large image assets are lazy loaded ### Other - [ ] Make sure all content belongs to a landmark element - [ ] Provide a manifest.json file for identifiable homescreen entries ## 3: Breakpoint testing _(15 tasks)_ ### Desktop - [ ] Provide a full screenshot of **1920px** wide page - [ ] Provide a full screenshot of **1500px** wide page - [ ] Provide a full screenshot of **1280px** wide page - [ ] Provide a full screenshot of **1024px** wide page ### Tablet - [ ] Provide a full screenshot of **960px** wide page - [ ] Provide a full screenshot of **800px** wide page - [ ] Provide a full screenshot of **760px** wide page - [ ] Provide a full screenshot of **650px** wide page ### Mobile - [ ] Provide a full screenshot of **600px** wide page - [ ] Provide a full screenshot of **500px** wide page - [ ] Provide a full screenshot of **450px** wide page - [ ] Provide a full screenshot of **380px** wide page - [ ] Provide a full screenshot of **320px** wide page - [ ] Provide a full screenshot of **280px** wide page - [ ] Provide a full screenshot of **250px** wide page ## 4: Re-run automated audits _(10 tasks)_ ### Lighthouse: - [ ] Run "Accessibility" audit in lighthouse _(using incognito tab)_ - [ ] Run "Performance" audit in lighthouse _(using incognito tab)_ - [ ] Run "Best practices" audit in lighthouse _(using incognito tab)_ - [ ] Run "SEO" audit in lighthouse _(using incognito tab)_ - [ ] Run "PWA" audit in lighthouse _(using incognito tab)_ ### Pingdom - [ ] Run full audit of the the page's performance in Pingdom ### Browser's console - [ ] Check Chrome's console for errors ### Log results of audits - [ ] Screenshot snapshot of the lighthouse audits - [ ] Upload PDF of detailed lighthouse reports - [ ] Provide a screenshot of any console errors
1.0
Test & audit group: DSV - Page URL: https://eba.eu.com/membership/groups/dsv/ ## Table of contents - [ ] **Task 1:** Perform automated audits _(10 tasks)_ - [ ] **Task 2:** Manual standards & accessibility tests _(61 tasks)_ - [ ] **Task 3:** Breakpoint testing _(15 tasks)_ - [ ] **Task 4:** Re-run automated audits _(10 tasks)_ ## 1: Perform automated audits _(10 tasks)_ ### Lighthouse: - [ ] Run "Accessibility" audit in lighthouse _(using incognito tab)_ - [ ] Run "Performance" audit in lighthouse _(using incognito tab)_ - [ ] Run "Best practices" audit in lighthouse _(using incognito tab)_ - [ ] Run "SEO" audit in lighthouse _(using incognito tab)_ - [ ] Run "PWA" audit in lighthouse _(using incognito tab)_ ### Pingdom - [ ] Run full audit of the the page's performance in Pingdom ### Browser's console - [ ] Check Chrome's console for errors ### Log results of audits - [ ] Screenshot snapshot of the lighthouse audits - [ ] Upload PDF of detailed lighthouse reports - [ ] Provide a screenshot of any console errors ## 2: Manual standards & accessibility tests _(61 tasks)_ ### Forms - [ ] Give all form elements permanently visible labels - [ ] Place labels above form elements - [ ] Mark invalid fields clearly and provide associated error messages - [ ] Make forms as short as possible; offer shortcuts like autocompleting the address using the postcode - [ ] Ensure all form fields have the correct requried state - [ ] Provide status and error messages as WAI-ARIA live regions ### Readability of content - [ ] Ensure page has good grammar - [ ] Ensure page content has been spell-checked - [ ] Make sure headings are in logical order - [ ] Ensure the same content is available across different devices and platforms - [ ] Begin long, multi-section documents with a table of contents ### Presentation - [ ] Make sure all content is formatted correctly - [ ] Avoid all-caps text - [ ] Make sure data tables wider than their container can be scrolled horizontally - [ ] Use the same design patterns to solve the same problems - [ ] Do not mark up subheadings/straplines with separate heading elements ### Links & buttons #### Links - [ ] Check all links to ensure they work - [ ] Check all links to third party websites use `rel="noopener"` - [ ] Make sure the purpose of a link is clearly described: "read more" vs. "read more about accessibility" - [ ] Provide a skip link if necessary - [ ] Underline links — at least in body copy - [ ] Warn users of links that have unusual behaviors, like linking off-site, or loading a new tab (i.e. aria-label) #### Buttons - [ ] Ensure primary calls to action are easy to recognize and reach - [ ] Provide clear, unambiguous focus styles - [ ] Ensure states (pressed, expanded, invalid, etc) are communicated to assistive software - [ ] Ensure disabled controls are not focusable - [ ] Make sure controls within hidden content are not focusable - [ ] Provide large touch "targets" for interactive elements - [ ] Make controls look like controls; give them strong perceived affordance - [ ] Use well-established, therefore recognizable, icons and symbols ### Assistive technology - [ ] Ensure content is not obscured through zooming - [ ] Support Windows high contrast mode (use images, not background images) - [ ] Provide alternative text for salient images - [ ] Make scrollable elements focusable for keyboard users - [ ] Ensure keyboard focus order is logical regarding visual layout - [ ] Match semantics to behavior for assistive technology users - [ ] Provide a default language and use lang="[ISO code]" for subsections in different languages - [ ] Inform the user when there are important changes to the application state - [ ] Do not hijack standard scrolling behavior - [ ] Do not instate "infinite scroll" by default; provide buttons to load more items ### General accessibility - [ ] Make sure text and background colors contrast sufficiently - [ ] Do not rely on color for differentiation of visual elements - [ ] Avoid images of text — text that cannot be translated, selected, or understood by assistive tech - [ ] Provide a print stylesheet - [ ] Honour requests to remove animation via the prefers-reduced-motion media query ### SEO - [ ] Ensure all pages have appropriate title - [ ] Ensure all pages have meta descriptions - [ ] Make content easier to find and improve search results with structured data [Read more](https://developers.google.com/search/docs/guides/prototype) - [ ] Check whether page should be appearing in sitemap - [ ] Make sure page has Facebook and Twitter large image previews set correctly - [ ] Check canonical links for page - [ ] Mark as cornerstone content? ### Performance - [ ] Ensure all CSS assets are minified and concatenated - [ ] Ensure all JS assets are minified and concatenated - [ ] Ensure all images are compressed - [ ] Where possible, remove redundant code - [ ] Ensure all SVG assets have been optimised - [ ] Make sure styles and scripts are not render blocking - [ ] Ensure large image assets are lazy loaded ### Other - [ ] Make sure all content belongs to a landmark element - [ ] Provide a manifest.json file for identifiable homescreen entries ## 3: Breakpoint testing _(15 tasks)_ ### Desktop - [ ] Provide a full screenshot of **1920px** wide page - [ ] Provide a full screenshot of **1500px** wide page - [ ] Provide a full screenshot of **1280px** wide page - [ ] Provide a full screenshot of **1024px** wide page ### Tablet - [ ] Provide a full screenshot of **960px** wide page - [ ] Provide a full screenshot of **800px** wide page - [ ] Provide a full screenshot of **760px** wide page - [ ] Provide a full screenshot of **650px** wide page ### Mobile - [ ] Provide a full screenshot of **600px** wide page - [ ] Provide a full screenshot of **500px** wide page - [ ] Provide a full screenshot of **450px** wide page - [ ] Provide a full screenshot of **380px** wide page - [ ] Provide a full screenshot of **320px** wide page - [ ] Provide a full screenshot of **280px** wide page - [ ] Provide a full screenshot of **250px** wide page ## 4: Re-run automated audits _(10 tasks)_ ### Lighthouse: - [ ] Run "Accessibility" audit in lighthouse _(using incognito tab)_ - [ ] Run "Performance" audit in lighthouse _(using incognito tab)_ - [ ] Run "Best practices" audit in lighthouse _(using incognito tab)_ - [ ] Run "SEO" audit in lighthouse _(using incognito tab)_ - [ ] Run "PWA" audit in lighthouse _(using incognito tab)_ ### Pingdom - [ ] Run full audit of the the page's performance in Pingdom ### Browser's console - [ ] Check Chrome's console for errors ### Log results of audits - [ ] Screenshot snapshot of the lighthouse audits - [ ] Upload PDF of detailed lighthouse reports - [ ] Provide a screenshot of any console errors
test
test audit group dsv page url table of contents task perform automated audits tasks task manual standards accessibility tests tasks task breakpoint testing tasks task re run automated audits tasks perform automated audits tasks lighthouse run accessibility audit in lighthouse using incognito tab run performance audit in lighthouse using incognito tab run best practices audit in lighthouse using incognito tab run seo audit in lighthouse using incognito tab run pwa audit in lighthouse using incognito tab pingdom run full audit of the the page s performance in pingdom browser s console check chrome s console for errors log results of audits screenshot snapshot of the lighthouse audits upload pdf of detailed lighthouse reports provide a screenshot of any console errors manual standards accessibility tests tasks forms give all form elements permanently visible labels place labels above form elements mark invalid fields clearly and provide associated error messages make forms as short as possible offer shortcuts like autocompleting the address using the postcode ensure all form fields have the correct requried state provide status and error messages as wai aria live regions readability of content ensure page has good grammar ensure page content has been spell checked make sure headings are in logical order ensure the same content is available across different devices and platforms begin long multi section documents with a table of contents presentation make sure all content is formatted correctly avoid all caps text make sure data tables wider than their container can be scrolled horizontally use the same design patterns to solve the same problems do not mark up subheadings straplines with separate heading elements links buttons links check all links to ensure they work check all links to third party websites use rel noopener make sure the purpose of a link is clearly described read more vs read more about accessibility provide a skip link if necessary underline links — at least in body copy warn users of links that have unusual behaviors like linking off site or loading a new tab i e aria label buttons ensure primary calls to action are easy to recognize and reach provide clear unambiguous focus styles ensure states pressed expanded invalid etc are communicated to assistive software ensure disabled controls are not focusable make sure controls within hidden content are not focusable provide large touch targets for interactive elements make controls look like controls give them strong perceived affordance use well established therefore recognizable icons and symbols assistive technology ensure content is not obscured through zooming support windows high contrast mode use images not background images provide alternative text for salient images make scrollable elements focusable for keyboard users ensure keyboard focus order is logical regarding visual layout match semantics to behavior for assistive technology users provide a default language and use lang for subsections in different languages inform the user when there are important changes to the application state do not hijack standard scrolling behavior do not instate infinite scroll by default provide buttons to load more items general accessibility make sure text and background colors contrast sufficiently do not rely on color for differentiation of visual elements avoid images of text — text that cannot be translated selected or understood by assistive tech provide a print stylesheet honour requests to remove animation via the prefers reduced motion media query seo ensure all pages have appropriate title ensure all pages have meta descriptions make content easier to find and improve search results with structured data check whether page should be appearing in sitemap make sure page has facebook and twitter large image previews set correctly check canonical links for page mark as cornerstone content performance ensure all css assets are minified and concatenated ensure all js assets are minified and concatenated ensure all images are compressed where possible remove redundant code ensure all svg assets have been optimised make sure styles and scripts are not render blocking ensure large image assets are lazy loaded other make sure all content belongs to a landmark element provide a manifest json file for identifiable homescreen entries breakpoint testing tasks desktop provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page tablet provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page mobile provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page provide a full screenshot of wide page re run automated audits tasks lighthouse run accessibility audit in lighthouse using incognito tab run performance audit in lighthouse using incognito tab run best practices audit in lighthouse using incognito tab run seo audit in lighthouse using incognito tab run pwa audit in lighthouse using incognito tab pingdom run full audit of the the page s performance in pingdom browser s console check chrome s console for errors log results of audits screenshot snapshot of the lighthouse audits upload pdf of detailed lighthouse reports provide a screenshot of any console errors
1
51,828
6,198,948,932
IssuesEvent
2017-07-05 20:23:57
ePADD/epadd
https://api.github.com/repos/ePADD/epadd
closed
Folder View Count Not Correct
Bug FIXED_WAITING_FOR_TEST Medium priority
Version 3.01 Total message count is given in Folder View Count. It should give the no of folders.
1.0
Folder View Count Not Correct - Version 3.01 Total message count is given in Folder View Count. It should give the no of folders.
test
folder view count not correct version total message count is given in folder view count it should give the no of folders
1
15,331
10,298,151,426
IssuesEvent
2019-08-28 07:40:17
aws/aws-cdk
https://api.github.com/repos/aws/aws-cdk
closed
aws-lambda Code.Asset static method name is awkward
feature-request service/lambda
The `Asset` static method in aws-lambd module's `Code` class is a little awkward. From the name of the method its not clear/obvious that this static method constructs a Code from a Asset's folder. It would be more clear if the `Asset` method was named to something more similar to `FromDirectory`. This naming change would also be consistent with other aspects of the CDK such as aws-ecs module's `ContainerImage` Class's static method `FromDockerHub`.
1.0
aws-lambda Code.Asset static method name is awkward - The `Asset` static method in aws-lambd module's `Code` class is a little awkward. From the name of the method its not clear/obvious that this static method constructs a Code from a Asset's folder. It would be more clear if the `Asset` method was named to something more similar to `FromDirectory`. This naming change would also be consistent with other aspects of the CDK such as aws-ecs module's `ContainerImage` Class's static method `FromDockerHub`.
non_test
aws lambda code asset static method name is awkward the asset static method in aws lambd module s code class is a little awkward from the name of the method its not clear obvious that this static method constructs a code from a asset s folder it would be more clear if the asset method was named to something more similar to fromdirectory this naming change would also be consistent with other aspects of the cdk such as aws ecs module s containerimage class s static method fromdockerhub
0
134,632
10,925,040,069
IssuesEvent
2019-11-22 11:31:37
ubtue/DatenProbleme
https://api.github.com/repos/ubtue/DatenProbleme
closed
ISSN 1556-1836 Terrorism and Political Violence "Or"
high priority ready for testing
Bei diesem Artikel https://www.tandfonline.com/doi/full/10.1080/09546553.2017.1330201 heißt der 1. Verfasser Or Honig. "Or" wird in 100 getilgt: `<subfield code="a">, Honig</subfield>`
1.0
ISSN 1556-1836 Terrorism and Political Violence "Or" - Bei diesem Artikel https://www.tandfonline.com/doi/full/10.1080/09546553.2017.1330201 heißt der 1. Verfasser Or Honig. "Or" wird in 100 getilgt: `<subfield code="a">, Honig</subfield>`
test
issn terrorism and political violence or bei diesem artikel heißt der verfasser or honig or wird in getilgt honig
1
120,294
10,112,762,351
IssuesEvent
2019-07-30 15:18:09
QubesOS/updates-status
https://api.github.com/repos/QubesOS/updates-status
closed
vmm-xen v4.12.0-2 (r4.1)
r4.1-buster-cur-test r4.1-dom0-cur-test r4.1-stretch-cur-test
Update of vmm-xen to v4.12.0-2 for Qubes r4.1, see comments below for details. Built from: https://github.com/QubesOS/qubes-vmm-xen/commit/11cedc5ba1769bc4d86be47720848b7c0e019150 [Changes since previous version](https://github.com/QubesOS/qubes-vmm-xen/compare/v4.12.0-1...v4.12.0-2): QubesOS/qubes-vmm-xen@11cedc5 version 4.12.0-2 QubesOS/qubes-vmm-xen@00ed41a Abort the major upgrade if any VM is running QubesOS/qubes-vmm-xen@dfcb257 Remove obsolete patches from series-vm.conf QubesOS/qubes-vmm-xen@bfc179f Fix xc_physinfo python wrapper Referenced issues: If you're release manager, you can issue GPG-inline signed command: * `Upload vmm-xen 11cedc5ba1769bc4d86be47720848b7c0e019150 r4.1 current repo` (available 7 days from now) * `Upload vmm-xen 11cedc5ba1769bc4d86be47720848b7c0e019150 r4.1 current (dists) repo`, you can choose subset of distributions, like `vm-fc24 vm-fc25` (available 7 days from now) * `Upload vmm-xen 11cedc5ba1769bc4d86be47720848b7c0e019150 r4.1 security-testing repo` Above commands will work only if packages in current-testing repository were built from given commit (i.e. no new version superseded it).
3.0
vmm-xen v4.12.0-2 (r4.1) - Update of vmm-xen to v4.12.0-2 for Qubes r4.1, see comments below for details. Built from: https://github.com/QubesOS/qubes-vmm-xen/commit/11cedc5ba1769bc4d86be47720848b7c0e019150 [Changes since previous version](https://github.com/QubesOS/qubes-vmm-xen/compare/v4.12.0-1...v4.12.0-2): QubesOS/qubes-vmm-xen@11cedc5 version 4.12.0-2 QubesOS/qubes-vmm-xen@00ed41a Abort the major upgrade if any VM is running QubesOS/qubes-vmm-xen@dfcb257 Remove obsolete patches from series-vm.conf QubesOS/qubes-vmm-xen@bfc179f Fix xc_physinfo python wrapper Referenced issues: If you're release manager, you can issue GPG-inline signed command: * `Upload vmm-xen 11cedc5ba1769bc4d86be47720848b7c0e019150 r4.1 current repo` (available 7 days from now) * `Upload vmm-xen 11cedc5ba1769bc4d86be47720848b7c0e019150 r4.1 current (dists) repo`, you can choose subset of distributions, like `vm-fc24 vm-fc25` (available 7 days from now) * `Upload vmm-xen 11cedc5ba1769bc4d86be47720848b7c0e019150 r4.1 security-testing repo` Above commands will work only if packages in current-testing repository were built from given commit (i.e. no new version superseded it).
test
vmm xen update of vmm xen to for qubes see comments below for details built from qubesos qubes vmm xen version qubesos qubes vmm xen abort the major upgrade if any vm is running qubesos qubes vmm xen remove obsolete patches from series vm conf qubesos qubes vmm xen fix xc physinfo python wrapper referenced issues if you re release manager you can issue gpg inline signed command upload vmm xen current repo available days from now upload vmm xen current dists repo you can choose subset of distributions like vm vm available days from now upload vmm xen security testing repo above commands will work only if packages in current testing repository were built from given commit i e no new version superseded it
1
287,027
24,803,573,106
IssuesEvent
2022-10-25 01:01:51
dotnet/maui
https://api.github.com/repos/dotnet/maui
closed
[MacCatalyst] [Scenario Day] Crash using Shell
t/bug platform/macOS 🍏 p/1 area/shell 🐢 s/verified s/try-latest-version
### Description Crash using Xaminals Shell sample. <img width="1231" alt="Captura de Pantalla 2022-05-17 a las 17 37 23" src="https://user-images.githubusercontent.com/6755973/168853312-42b56ff6-dc42-4b66-b755-92ed50b0836a.png"> ### Steps to Reproduce 1. Download https://github.com/davidbritch/dotnet-maui-samples/tree/main/Fundamentals/Shell 2. Launch the App in macOS. ### Version with bug Release Candidate 3 (current) ### Last version that worked well Unknown/Other ### Affected platforms macOS ### Affected platform versions macOS 12.3.1 ### Did you find any workaround? _No response_ ### Relevant log output _No response_
1.0
[MacCatalyst] [Scenario Day] Crash using Shell - ### Description Crash using Xaminals Shell sample. <img width="1231" alt="Captura de Pantalla 2022-05-17 a las 17 37 23" src="https://user-images.githubusercontent.com/6755973/168853312-42b56ff6-dc42-4b66-b755-92ed50b0836a.png"> ### Steps to Reproduce 1. Download https://github.com/davidbritch/dotnet-maui-samples/tree/main/Fundamentals/Shell 2. Launch the App in macOS. ### Version with bug Release Candidate 3 (current) ### Last version that worked well Unknown/Other ### Affected platforms macOS ### Affected platform versions macOS 12.3.1 ### Did you find any workaround? _No response_ ### Relevant log output _No response_
test
crash using shell description crash using xaminals shell sample img width alt captura de pantalla a las src steps to reproduce download launch the app in macos version with bug release candidate current last version that worked well unknown other affected platforms macos affected platform versions macos did you find any workaround no response relevant log output no response
1
246,711
20,910,361,588
IssuesEvent
2022-03-24 08:44:23
LiskHQ/lisk-desktop
https://api.github.com/repos/LiskHQ/lisk-desktop
reopened
Prepare 2.3.0 for production
epic type: test
### Description The purpose of this epic is to prepare version 2.3.0 for production release. ### Tasks ### Sprint 70 - [x] #4222 - [x] #4223 - [x] #4218
1.0
Prepare 2.3.0 for production - ### Description The purpose of this epic is to prepare version 2.3.0 for production release. ### Tasks ### Sprint 70 - [x] #4222 - [x] #4223 - [x] #4218
test
prepare for production description the purpose of this epic is to prepare version for production release tasks sprint
1
436,297
30,545,360,966
IssuesEvent
2023-07-20 03:05:11
Nalini1998/Project_Public
https://api.github.com/repos/Nalini1998/Project_Public
closed
26. Change directories two levels up back to the artusi/ directory. Print the working directory
documentation duplicate enhancement help wanted good first issue question
``` $ cd ../.. $ pwd /home/ccuser/workspace/artusi ```
1.0
26. Change directories two levels up back to the artusi/ directory. Print the working directory - ``` $ cd ../.. $ pwd /home/ccuser/workspace/artusi ```
non_test
change directories two levels up back to the artusi directory print the working directory cd pwd home ccuser workspace artusi
0
306,789
23,172,160,537
IssuesEvent
2022-07-30 22:12:06
BoXunTong/Automated_Pet_Feeding
https://api.github.com/repos/BoXunTong/Automated_Pet_Feeding
closed
New branches basing on structure
documentation good first issue
@BoXunTong @Yue2022pro @Vrachaos Build branches basing on your own part of code and name it by its function
1.0
New branches basing on structure - @BoXunTong @Yue2022pro @Vrachaos Build branches basing on your own part of code and name it by its function
non_test
new branches basing on structure boxuntong vrachaos build branches basing on your own part of code and name it by its function
0
94,274
8,482,542,363
IssuesEvent
2018-10-25 18:50:49
WallarooLabs/wallaroo
https://api.github.com/repos/WallarooLabs/wallaroo
closed
Improve handling of crashed workers in the test harness
test
The test harness needs better detection for crashed workers. While investigating #2267 it turned out that an unreported segfault was what led to the test failure. Not catching and reporting on that segfault (or in the general case, on any non-0 worker exit code) resulted in increased complexity of the trying to identify the cause of the test failure as well as several days spent searching for it. Once a rough segfault detection was added to the harness (in branch `bug-2267`), it was a matter of a few hours to diagnose the source of the issue and come up with a fix. Worker error detection in the test harness should have the following properties: - [ ] The ability to short-circuit or interrupt a test when a worker exits for an unexpected reason - [ ] Be able to accommodate workers with exit codes relating to intentional termination as part of resilience tests - [ ] Be able to produce meaningful artifacts, such as worker log files and core dumps in the case of a segfault - [ ] Should not shadow other important test errors, since that can lead us astray when investigating test errors.
1.0
Improve handling of crashed workers in the test harness - The test harness needs better detection for crashed workers. While investigating #2267 it turned out that an unreported segfault was what led to the test failure. Not catching and reporting on that segfault (or in the general case, on any non-0 worker exit code) resulted in increased complexity of the trying to identify the cause of the test failure as well as several days spent searching for it. Once a rough segfault detection was added to the harness (in branch `bug-2267`), it was a matter of a few hours to diagnose the source of the issue and come up with a fix. Worker error detection in the test harness should have the following properties: - [ ] The ability to short-circuit or interrupt a test when a worker exits for an unexpected reason - [ ] Be able to accommodate workers with exit codes relating to intentional termination as part of resilience tests - [ ] Be able to produce meaningful artifacts, such as worker log files and core dumps in the case of a segfault - [ ] Should not shadow other important test errors, since that can lead us astray when investigating test errors.
test
improve handling of crashed workers in the test harness the test harness needs better detection for crashed workers while investigating it turned out that an unreported segfault was what led to the test failure not catching and reporting on that segfault or in the general case on any non worker exit code resulted in increased complexity of the trying to identify the cause of the test failure as well as several days spent searching for it once a rough segfault detection was added to the harness in branch bug it was a matter of a few hours to diagnose the source of the issue and come up with a fix worker error detection in the test harness should have the following properties the ability to short circuit or interrupt a test when a worker exits for an unexpected reason be able to accommodate workers with exit codes relating to intentional termination as part of resilience tests be able to produce meaningful artifacts such as worker log files and core dumps in the case of a segfault should not shadow other important test errors since that can lead us astray when investigating test errors
1
428,709
30,007,071,298
IssuesEvent
2023-06-26 13:07:04
elbywan/wretch
https://api.github.com/repos/elbywan/wretch
closed
Doc: Typescript examples?
documentation
I can't find Typescript examples anywhere online! Is it possible to provide few examples? Currently I get the following error: ``` Property 'json' does not exist on type '<Result = unknown>(cb?: ((type: any) => Result | Promise<Result>) | undefined) => Promise<Awaited<Result>>'. ```
1.0
Doc: Typescript examples? - I can't find Typescript examples anywhere online! Is it possible to provide few examples? Currently I get the following error: ``` Property 'json' does not exist on type '<Result = unknown>(cb?: ((type: any) => Result | Promise<Result>) | undefined) => Promise<Awaited<Result>>'. ```
non_test
doc typescript examples i can t find typescript examples anywhere online is it possible to provide few examples currently i get the following error property json does not exist on type cb type any result promise undefined promise
0
341,072
30,566,149,364
IssuesEvent
2023-07-20 17:56:12
giorgi-injgia-1-btu-edu-ge/final_git
https://api.github.com/repos/giorgi-injgia-1-btu-edu-ge/final_git
opened
577b6d8 failed unit and formatting tests.
ci-pytest ci-black
Results of latest checkings 577b6d857d9cc97c4601ace9e30a337eeb24ba60 failed unit and formatting tests. pytest bisect result [7cfc79(https://giorgi-injgia-1-btu-edu-ge/final_git/commit/[7cfc7970ddbb33fa3e96984cbfb1cf08a387d716] pytest_failed) black bisect result \n <a class=commit-link data-hovercard-type=commit href=https://giorgi-injgia-1-btu-edu-ge/final_git/commit/[d5be486f0e30ea11bb9e5b6a9e3740b309aecf5f] pytest_failed_black_failed><tt>[d5be48</tt></a> Pytest report: \n<a href="https://github.com/giorgi-injgia-1-btu-edu-ge/test-repo/blob/report_branch/577b6d857d9cc97c4601ace9e30a337eeb24ba60-1689875765/pytest.html">pytest report</a> Black report: \n<a href="https://github.com/giorgi-injgia-1-btu-edu-ge/test-repo/blob/report_branch/577b6d857d9cc97c4601ace9e30a337eeb24ba60-1689875765/black.html">black report</a>
1.0
577b6d8 failed unit and formatting tests. - Results of latest checkings 577b6d857d9cc97c4601ace9e30a337eeb24ba60 failed unit and formatting tests. pytest bisect result [7cfc79(https://giorgi-injgia-1-btu-edu-ge/final_git/commit/[7cfc7970ddbb33fa3e96984cbfb1cf08a387d716] pytest_failed) black bisect result \n <a class=commit-link data-hovercard-type=commit href=https://giorgi-injgia-1-btu-edu-ge/final_git/commit/[d5be486f0e30ea11bb9e5b6a9e3740b309aecf5f] pytest_failed_black_failed><tt>[d5be48</tt></a> Pytest report: \n<a href="https://github.com/giorgi-injgia-1-btu-edu-ge/test-repo/blob/report_branch/577b6d857d9cc97c4601ace9e30a337eeb24ba60-1689875765/pytest.html">pytest report</a> Black report: \n<a href="https://github.com/giorgi-injgia-1-btu-edu-ge/test-repo/blob/report_branch/577b6d857d9cc97c4601ace9e30a337eeb24ba60-1689875765/black.html">black report</a>
test
failed unit and formatting tests results of latest checkings failed unit and formatting tests pytest bisect result pytest failed black bisect result n pytest report n black report n
1
191,618
14,595,007,387
IssuesEvent
2020-12-20 09:15:10
pytorch/pytorch
https://api.github.com/repos/pytorch/pytorch
opened
ROCm CI is intermittently failing with std::out_of_range
module: flaky-tests module: rocm
See https://ci.pytorch.org/jenkins/job/pytorch-builds/job/pytorch-linux-bionic-rocm3.10-py3.6-test1/126//console and https://ci.pytorch.org/jenkins/job/pytorch-builds/job/pytorch-linux-bionic-rocm3.10-py3.6-test1/110//console for examples. As best I can tell this error always happens around the same place in the test suite (but note that in these examples the actual tests where the error occurs are different).
1.0
ROCm CI is intermittently failing with std::out_of_range - See https://ci.pytorch.org/jenkins/job/pytorch-builds/job/pytorch-linux-bionic-rocm3.10-py3.6-test1/126//console and https://ci.pytorch.org/jenkins/job/pytorch-builds/job/pytorch-linux-bionic-rocm3.10-py3.6-test1/110//console for examples. As best I can tell this error always happens around the same place in the test suite (but note that in these examples the actual tests where the error occurs are different).
test
rocm ci is intermittently failing with std out of range see and for examples as best i can tell this error always happens around the same place in the test suite but note that in these examples the actual tests where the error occurs are different
1
66,600
7,008,122,957
IssuesEvent
2017-12-19 14:47:02
godotengine/godot
https://api.github.com/repos/godotengine/godot
closed
Rotation of tiles in tilemap should also rotate collision shape
bug needs testing topic:physics
**Issue description:** When you add a rotated or mirrored tile to a tile map, the collision shape is not rotated with it. **Steps to reproduce:** 1. Open the isometric example from the official examples. 2. Run the game in the simulator. 3. Try to collide with the doors that were added with the mirror / rotate option. 4. Notice that the collision doesn't match because only the visual element was rotated, not the collision. I haven't tested this yet with gridmaps, but I suspect the same issue may be present.
1.0
Rotation of tiles in tilemap should also rotate collision shape - **Issue description:** When you add a rotated or mirrored tile to a tile map, the collision shape is not rotated with it. **Steps to reproduce:** 1. Open the isometric example from the official examples. 2. Run the game in the simulator. 3. Try to collide with the doors that were added with the mirror / rotate option. 4. Notice that the collision doesn't match because only the visual element was rotated, not the collision. I haven't tested this yet with gridmaps, but I suspect the same issue may be present.
test
rotation of tiles in tilemap should also rotate collision shape issue description when you add a rotated or mirrored tile to a tile map the collision shape is not rotated with it steps to reproduce open the isometric example from the official examples run the game in the simulator try to collide with the doors that were added with the mirror rotate option notice that the collision doesn t match because only the visual element was rotated not the collision i haven t tested this yet with gridmaps but i suspect the same issue may be present
1
30,664
6,219,063,731
IssuesEvent
2017-07-09 09:43:45
coin-or/pulp
https://api.github.com/repos/coin-or/pulp
closed
ValueError in GUROBI_CMD
auto-migrated Priority-Medium Type-Defect
``` In version 51:e30da6f4b623 GUROBI_CMD.actualSolve() only iterates over the values of dictionary self.options instead of key-value pairs (line 1891). This results in a ValueError when using the solver GUROBI_CMD with custom options. What steps will reproduce the problem? 1. m.solve(pulp.GUROBI_CMD(options={'ResultFile': 'tmp.sol'})) on an arbitrary model m I have attached a patch to fix the problem. ``` Original issue reported on code.google.com by `DenisLoh...@googlemail.com` on 27 Jun 2014 at 1:36 Attachments: - [gurobi_cmd.patch](https://storage.googleapis.com/google-code-attachments/pulp-or/issue-64/comment-0/gurobi_cmd.patch)
1.0
ValueError in GUROBI_CMD - ``` In version 51:e30da6f4b623 GUROBI_CMD.actualSolve() only iterates over the values of dictionary self.options instead of key-value pairs (line 1891). This results in a ValueError when using the solver GUROBI_CMD with custom options. What steps will reproduce the problem? 1. m.solve(pulp.GUROBI_CMD(options={'ResultFile': 'tmp.sol'})) on an arbitrary model m I have attached a patch to fix the problem. ``` Original issue reported on code.google.com by `DenisLoh...@googlemail.com` on 27 Jun 2014 at 1:36 Attachments: - [gurobi_cmd.patch](https://storage.googleapis.com/google-code-attachments/pulp-or/issue-64/comment-0/gurobi_cmd.patch)
non_test
valueerror in gurobi cmd in version gurobi cmd actualsolve only iterates over the values of dictionary self options instead of key value pairs line this results in a valueerror when using the solver gurobi cmd with custom options what steps will reproduce the problem m solve pulp gurobi cmd options resultfile tmp sol on an arbitrary model m i have attached a patch to fix the problem original issue reported on code google com by denisloh googlemail com on jun at attachments
0
153,589
12,152,834,675
IssuesEvent
2020-04-24 23:33:10
3ilson/pfelk
https://api.github.com/repos/3ilson/pfelk
closed
Debian 9/10 Installation
setup/config issue testing needed wip
**Describe the bug** Installing on Debian 9/10 requires additional configuration beyond what is provided in the instructions. **To Reproduce** Follow the instructions and pfELK will not work on Debian 9/10 without additional configuration **Firewall System (please complete the following information):** N/A **Operating System (please complete the following information):** - OS Debian 9/10 **Elasticsearch, Logstash, Kibana (please complete the following information):** - Version of ELK 7.6.2 **Elasticsearch, Logstash, Kibana logs:** N/A **Additional context** Add any other context about the problem here.
1.0
Debian 9/10 Installation - **Describe the bug** Installing on Debian 9/10 requires additional configuration beyond what is provided in the instructions. **To Reproduce** Follow the instructions and pfELK will not work on Debian 9/10 without additional configuration **Firewall System (please complete the following information):** N/A **Operating System (please complete the following information):** - OS Debian 9/10 **Elasticsearch, Logstash, Kibana (please complete the following information):** - Version of ELK 7.6.2 **Elasticsearch, Logstash, Kibana logs:** N/A **Additional context** Add any other context about the problem here.
test
debian installation describe the bug installing on debian requires additional configuration beyond what is provided in the instructions to reproduce follow the instructions and pfelk will not work on debian without additional configuration firewall system please complete the following information n a operating system please complete the following information os debian elasticsearch logstash kibana please complete the following information version of elk elasticsearch logstash kibana logs n a additional context add any other context about the problem here
1
34,364
6,314,528,431
IssuesEvent
2017-07-24 11:06:21
discos/simulators
https://api.github.com/repos/discos/simulators
closed
Add README file
documentation
Add a README file containing a brief project description, and also the icons for Travis, Landscape.io and Coverage.
1.0
Add README file - Add a README file containing a brief project description, and also the icons for Travis, Landscape.io and Coverage.
non_test
add readme file add a readme file containing a brief project description and also the icons for travis landscape io and coverage
0
175,630
6,552,690,085
IssuesEvent
2017-09-05 19:22:21
inverse-inc/packetfence
https://api.github.com/repos/inverse-inc/packetfence
closed
FreeRADIUS: Segfault in syslog.so
Priority: Critical Type: Bug
FreeRADIUS is segfaulting in syslog.so when using `packetfence-multi-domain.pm` module with undefined values `kernel: radiusd[28886]: segfault at 8 ip 00007f671f1ca4ee sp 00007f66ccff6520 error 4 in Syslog.so[7f671f1c9000+3000]`
1.0
FreeRADIUS: Segfault in syslog.so - FreeRADIUS is segfaulting in syslog.so when using `packetfence-multi-domain.pm` module with undefined values `kernel: radiusd[28886]: segfault at 8 ip 00007f671f1ca4ee sp 00007f66ccff6520 error 4 in Syslog.so[7f671f1c9000+3000]`
non_test
freeradius segfault in syslog so freeradius is segfaulting in syslog so when using packetfence multi domain pm module with undefined values kernel radiusd segfault at ip sp error in syslog so
0
685,122
23,444,476,464
IssuesEvent
2022-08-15 18:09:04
IAmTamal/Milan
https://api.github.com/repos/IAmTamal/Milan
closed
🚀 Add a banner for clubs !
🕹 aspect: interface 🟨 priority: medium ⭐ goal: addition 🛠 status : under development
### Description ### We need a banner If you look at the **Donate Us** page, you will see a banner --> `Image 1` But the **Clubs** page has none. --> `Image 2` **Donate Us Page** : https://milaan.vercel.app/donateus **Clubs Page** : https://milaan.vercel.app/display/clubs Can we have a banner where the clubs page has a similar one too ? You can use a good image from internet. **Heading** Clubs and communities ! **Sub-heading** Here are some clubs you can follow, you can attend club events and even get notified about it once you subscribe ! ### Screenshots ## Image 1 ![image](https://user-images.githubusercontent.com/72851613/184502960-fb6e0013-2dc6-4c9d-8c5e-30db7b2ca556.png) ## Image 2 ![image](https://user-images.githubusercontent.com/72851613/184502974-74956885-01e1-4878-9189-383c7d10d9df.png) ### Additional information The Design is upto your skills. **Image resources :** - https://undraw.co/illustrations - https://storyset.com/
1.0
🚀 Add a banner for clubs ! - ### Description ### We need a banner If you look at the **Donate Us** page, you will see a banner --> `Image 1` But the **Clubs** page has none. --> `Image 2` **Donate Us Page** : https://milaan.vercel.app/donateus **Clubs Page** : https://milaan.vercel.app/display/clubs Can we have a banner where the clubs page has a similar one too ? You can use a good image from internet. **Heading** Clubs and communities ! **Sub-heading** Here are some clubs you can follow, you can attend club events and even get notified about it once you subscribe ! ### Screenshots ## Image 1 ![image](https://user-images.githubusercontent.com/72851613/184502960-fb6e0013-2dc6-4c9d-8c5e-30db7b2ca556.png) ## Image 2 ![image](https://user-images.githubusercontent.com/72851613/184502974-74956885-01e1-4878-9189-383c7d10d9df.png) ### Additional information The Design is upto your skills. **Image resources :** - https://undraw.co/illustrations - https://storyset.com/
non_test
🚀 add a banner for clubs description we need a banner if you look at the donate us page you will see a banner image but the clubs page has none image donate us page clubs page can we have a banner where the clubs page has a similar one too you can use a good image from internet heading clubs and communities sub heading here are some clubs you can follow you can attend club events and even get notified about it once you subscribe screenshots image image additional information the design is upto your skills image resources
0
178,582
29,926,138,082
IssuesEvent
2023-06-22 05:45:57
KEEPER31337/Homepage-Front-R2
https://api.github.com/repos/KEEPER31337/Homepage-Front-R2
closed
버튼 line height 제거
Design
### 기능 추가 요약 - line height을 고정시켰는데, 폰트 사이즈 14px의 경우 고정된 길이때문에 윗부분이 조금 짤려서 이를 해결함. ![image](https://github.com/KEEPER31337/Homepage-Front-R2/assets/81643702/6e3182b3-03a0-4a40-b7ca-f39ca9aec6fc) ### 추가되는 Dependency - ### 개발 시 참고되는 Documentation -
1.0
버튼 line height 제거 - ### 기능 추가 요약 - line height을 고정시켰는데, 폰트 사이즈 14px의 경우 고정된 길이때문에 윗부분이 조금 짤려서 이를 해결함. ![image](https://github.com/KEEPER31337/Homepage-Front-R2/assets/81643702/6e3182b3-03a0-4a40-b7ca-f39ca9aec6fc) ### 추가되는 Dependency - ### 개발 시 참고되는 Documentation -
non_test
버튼 line height 제거 기능 추가 요약 line height을 고정시켰는데 폰트 사이즈 경우 고정된 길이때문에 윗부분이 조금 짤려서 이를 해결함 추가되는 dependency 개발 시 참고되는 documentation
0
273,108
23,728,339,051
IssuesEvent
2022-08-30 22:04:44
WASdev/websphere-liberty-operator
https://api.github.com/repos/WASdev/websphere-liberty-operator
opened
Intermittent test failures
zenhub-dev test
1) The probe test failed intermittently on few manual test runs for OCP 4.11 ``` resource Deployment:wlo-test-123/probes-wsliberty: .status.replicas: value mismatch, expected: 1 != actual: 2 ``` 2) The pull policy test also failed on some builds (example [build](url)): ``` Name: pullpolicy State: fail Errors: resource Deployment:wlo-test-521/example-wsliberty-applicaion: .status.readyReplicas: key is missing from map ```
1.0
Intermittent test failures - 1) The probe test failed intermittently on few manual test runs for OCP 4.11 ``` resource Deployment:wlo-test-123/probes-wsliberty: .status.replicas: value mismatch, expected: 1 != actual: 2 ``` 2) The pull policy test also failed on some builds (example [build](url)): ``` Name: pullpolicy State: fail Errors: resource Deployment:wlo-test-521/example-wsliberty-applicaion: .status.readyReplicas: key is missing from map ```
test
intermittent test failures the probe test failed intermittently on few manual test runs for ocp resource deployment wlo test probes wsliberty status replicas value mismatch expected actual the pull policy test also failed on some builds example url name pullpolicy state fail errors resource deployment wlo test example wsliberty applicaion status readyreplicas key is missing from map
1
776,482
27,262,063,412
IssuesEvent
2023-02-22 15:30:46
WordPress/openverse
https://api.github.com/repos/WordPress/openverse
closed
Improve `pnpm i18n` performance
🟨 priority: medium ✨ goal: improvement 🤖 aspect: dx 🧱 stack: frontend
## Problem <!-- Describe a problem solved by this feature; or delete the section entirely. --> Currently the longest step of our Docker builds is the `pnpm i18n` script. Are there things we could do to speed this up, such as making fetch calls concurrent or other optimizations? ## Description <!-- Describe the feature and how it solves the problem. --> This job runs three scripts in the `src/locales/scripts` directory in order: 1. create-wp-locale-list 1. get-translations 1. get-validated-locales In total, this takes around 2 minutes on my machine for most runs. The second longest running step, `pnpm build:only`, takes less than 30 seconds. ## Alternatives <!-- Describe any alternative solutions or features you have considered. How is this feature better? --> ## Additional context <!-- Add any other context about the feature here; or delete the section entirely. --> ## Implementation <!-- Replace the [ ] with [x] to check the box. --> - [ ] 🙋 I would be interested in implementing this feature.
1.0
Improve `pnpm i18n` performance - ## Problem <!-- Describe a problem solved by this feature; or delete the section entirely. --> Currently the longest step of our Docker builds is the `pnpm i18n` script. Are there things we could do to speed this up, such as making fetch calls concurrent or other optimizations? ## Description <!-- Describe the feature and how it solves the problem. --> This job runs three scripts in the `src/locales/scripts` directory in order: 1. create-wp-locale-list 1. get-translations 1. get-validated-locales In total, this takes around 2 minutes on my machine for most runs. The second longest running step, `pnpm build:only`, takes less than 30 seconds. ## Alternatives <!-- Describe any alternative solutions or features you have considered. How is this feature better? --> ## Additional context <!-- Add any other context about the feature here; or delete the section entirely. --> ## Implementation <!-- Replace the [ ] with [x] to check the box. --> - [ ] 🙋 I would be interested in implementing this feature.
non_test
improve pnpm performance problem currently the longest step of our docker builds is the pnpm script are there things we could do to speed this up such as making fetch calls concurrent or other optimizations description this job runs three scripts in the src locales scripts directory in order create wp locale list get translations get validated locales in total this takes around minutes on my machine for most runs the second longest running step pnpm build only takes less than seconds alternatives additional context implementation 🙋 i would be interested in implementing this feature
0
72,098
7,279,109,704
IssuesEvent
2018-02-22 02:24:12
MultiPoolMiner/MultiPoolMiner
https://api.github.com/repos/MultiPoolMiner/MultiPoolMiner
closed
Stats for YiiMP.eu are off for some coins/messing up/ impossible autoswitch
available for testing bug priority
The stats for yiimp are off EUR/day and EUR/GH/Day messing up/ impossible autoswitch My config.txt [Config.txt](https://github.com/MultiPoolMiner/MultiPoolMiner/files/1743515/Config.txt) Bat file cd` /d %~dp0 setx GPU_FORCE_64BIT_PTR 1 setx GPU_MAX_HEAP_SIZE 100 setx GPU_USE_SYNC_OBJECTS 1 setx GPU_MAX_ALLOC_PERCENT 100 setx GPU_SINGLE_ALLOC_PERCENT 100 pwsh -noexit -executionpolicy bypass -command "& .\multipoolminer.ps1 -region europe -currency eur -type nvidia -poolname yiimp -algorithm x17,phi,bitcore,keccakc,lyra2v2,tribus,skein,bitcore,neoscrypt -donate 10 -watchdog -minerstatusurl https://multipoolminer.io/monitor/miner.php -switchingprevention 2" pause ![stats](https://user-images.githubusercontent.com/36695084/36473633-62cfe470-16f5-11e8-9490-cde597bba407.jpg) Please advise...
1.0
Stats for YiiMP.eu are off for some coins/messing up/ impossible autoswitch - The stats for yiimp are off EUR/day and EUR/GH/Day messing up/ impossible autoswitch My config.txt [Config.txt](https://github.com/MultiPoolMiner/MultiPoolMiner/files/1743515/Config.txt) Bat file cd` /d %~dp0 setx GPU_FORCE_64BIT_PTR 1 setx GPU_MAX_HEAP_SIZE 100 setx GPU_USE_SYNC_OBJECTS 1 setx GPU_MAX_ALLOC_PERCENT 100 setx GPU_SINGLE_ALLOC_PERCENT 100 pwsh -noexit -executionpolicy bypass -command "& .\multipoolminer.ps1 -region europe -currency eur -type nvidia -poolname yiimp -algorithm x17,phi,bitcore,keccakc,lyra2v2,tribus,skein,bitcore,neoscrypt -donate 10 -watchdog -minerstatusurl https://multipoolminer.io/monitor/miner.php -switchingprevention 2" pause ![stats](https://user-images.githubusercontent.com/36695084/36473633-62cfe470-16f5-11e8-9490-cde597bba407.jpg) Please advise...
test
stats for yiimp eu are off for some coins messing up impossible autoswitch the stats for yiimp are off eur day and eur gh day messing up impossible autoswitch my config txt bat file cd d setx gpu force ptr setx gpu max heap size setx gpu use sync objects setx gpu max alloc percent setx gpu single alloc percent pwsh noexit executionpolicy bypass command multipoolminer region europe currency eur type nvidia poolname yiimp algorithm phi bitcore keccakc tribus skein bitcore neoscrypt donate watchdog minerstatusurl switchingprevention pause please advise
1
583,431
17,388,697,094
IssuesEvent
2021-08-02 02:29:03
openmsupply/application-manager-web-app
https://api.github.com/repos/openmsupply/application-manager-web-app
closed
Epic #69 - Versioning templates
EPIC Epic effort: 3 Epic priority: 1 User Stories Epic
Anytime a change is made to a template, it needs to be duplicated, versioned and set in a 'draft' mode, where only tester/admin user can do actions on the new application version. After editing it can be set to available and 'live' mode, affecting only new applications for that template. * Also includes list of templates and versions * Edit = make draft and inactive * And then make active and not editable ## Documentation - [ ] TODO: Add as part of [Epic #38](338) Public docs & tutorials ## Issues
1.0
Epic #69 - Versioning templates - Anytime a change is made to a template, it needs to be duplicated, versioned and set in a 'draft' mode, where only tester/admin user can do actions on the new application version. After editing it can be set to available and 'live' mode, affecting only new applications for that template. * Also includes list of templates and versions * Edit = make draft and inactive * And then make active and not editable ## Documentation - [ ] TODO: Add as part of [Epic #38](338) Public docs & tutorials ## Issues
non_test
epic versioning templates anytime a change is made to a template it needs to be duplicated versioned and set in a draft mode where only tester admin user can do actions on the new application version after editing it can be set to available and live mode affecting only new applications for that template also includes list of templates and versions edit make draft and inactive and then make active and not editable documentation todo add as part of public docs tutorials issues
0
97,357
3,989,172,655
IssuesEvent
2016-05-09 13:06:44
kubernetes/kubernetes
https://api.github.com/repos/kubernetes/kubernetes
closed
Namespace deletion doesn't work with protobufs
priority/P0 team/control-plane team/CSI-API Machinery SIG
All e2e tests when filed in protobuf configuration are failing with the error like these: Couldn't delete ns "e2e-tests-kubectl-1gnev": namespace e2e-tests-kubectl-1gnev was not deleted within limit: timed out waiting for the condition, pods remaining: [] or Couldn't delete ns "e2e-tests-services-b5mh0": namespace e2e-tests-services-b5mh0 was not deleted within limit: timed out waiting for the condition, some pods were not marked with a deletion timestamp, pods remaining: [service2-fpbb3 service2-olp92 service2-t8sad service3-6uilv service3-98i1z service3-m8cbu]
1.0
Namespace deletion doesn't work with protobufs - All e2e tests when filed in protobuf configuration are failing with the error like these: Couldn't delete ns "e2e-tests-kubectl-1gnev": namespace e2e-tests-kubectl-1gnev was not deleted within limit: timed out waiting for the condition, pods remaining: [] or Couldn't delete ns "e2e-tests-services-b5mh0": namespace e2e-tests-services-b5mh0 was not deleted within limit: timed out waiting for the condition, some pods were not marked with a deletion timestamp, pods remaining: [service2-fpbb3 service2-olp92 service2-t8sad service3-6uilv service3-98i1z service3-m8cbu]
non_test
namespace deletion doesn t work with protobufs all tests when filed in protobuf configuration are failing with the error like these couldn t delete ns tests kubectl namespace tests kubectl was not deleted within limit timed out waiting for the condition pods remaining or couldn t delete ns tests services namespace tests services was not deleted within limit timed out waiting for the condition some pods were not marked with a deletion timestamp pods remaining
0
362,749
10,731,598,961
IssuesEvent
2019-10-28 19:55:25
AY1920S1-CS2113-T14-1/main
https://api.github.com/repos/AY1920S1-CS2113-T14-1/main
closed
Create disambiguation window logic
component.Logic priority.Core status.Ongoing type.Task
A general purpose disambiguation screen is required for the functioning of a lot of commands. It should be able to display the string representation of several DukeObjects and allow the user to select them by index in order to execute the Command on them.
1.0
Create disambiguation window logic - A general purpose disambiguation screen is required for the functioning of a lot of commands. It should be able to display the string representation of several DukeObjects and allow the user to select them by index in order to execute the Command on them.
non_test
create disambiguation window logic a general purpose disambiguation screen is required for the functioning of a lot of commands it should be able to display the string representation of several dukeobjects and allow the user to select them by index in order to execute the command on them
0
158,386
24,829,969,673
IssuesEvent
2022-10-26 01:54:48
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
[SerDes 45] Add an "update this Saved Question from YAML" button on frontend
Type:New Feature .Design Needed Organization/Saved Questions .Frontend Operation/Serialization
Part of #24766 Depends on: - #24780 Once #24780 is in place we should add a button somewhere to the Saved Question view in the frontend UI to update it with YAML. This button should only be visible if we have the appropriate enterprise token feature. Whether we should support uploading a local YAML file or pasting into a text field or both is an open design question.
1.0
[SerDes 45] Add an "update this Saved Question from YAML" button on frontend - Part of #24766 Depends on: - #24780 Once #24780 is in place we should add a button somewhere to the Saved Question view in the frontend UI to update it with YAML. This button should only be visible if we have the appropriate enterprise token feature. Whether we should support uploading a local YAML file or pasting into a text field or both is an open design question.
non_test
add an update this saved question from yaml button on frontend part of depends on once is in place we should add a button somewhere to the saved question view in the frontend ui to update it with yaml this button should only be visible if we have the appropriate enterprise token feature whether we should support uploading a local yaml file or pasting into a text field or both is an open design question
0
179,742
30,292,020,099
IssuesEvent
2023-07-09 12:00:26
netcreateorg/netcreate-itest
https://api.github.com/repos/netcreateorg/netcreate-itest
opened
Add a network statistic panel
design
This is currently a rough idea to be refined later but we wanted to add it to the list in case it changes architecture thinking. We imagine two kinds of statistics may be useful to auto-calculate and display long-term: 1: attribute statistics (e.g., the sun or average value of a property) 2: network statistics such as average number of nearest neighbors, average degree, etc We will provide examples later but wanted to get it on your radar. Goal is for Spring or beyond.
1.0
Add a network statistic panel - This is currently a rough idea to be refined later but we wanted to add it to the list in case it changes architecture thinking. We imagine two kinds of statistics may be useful to auto-calculate and display long-term: 1: attribute statistics (e.g., the sun or average value of a property) 2: network statistics such as average number of nearest neighbors, average degree, etc We will provide examples later but wanted to get it on your radar. Goal is for Spring or beyond.
non_test
add a network statistic panel this is currently a rough idea to be refined later but we wanted to add it to the list in case it changes architecture thinking we imagine two kinds of statistics may be useful to auto calculate and display long term attribute statistics e g the sun or average value of a property network statistics such as average number of nearest neighbors average degree etc we will provide examples later but wanted to get it on your radar goal is for spring or beyond
0
216,523
16,768,406,281
IssuesEvent
2021-06-14 11:58:24
SAP/fosstars-rating-core
https://api.github.com/repos/SAP/fosstars-rating-core
opened
Better tests for the command-line tool
test
https://github.com/SAP/fosstars-rating-core/tree/master/src/test/shell/tool/github This directory contains a tiny test suite for the command-line tool, but it doesn't run in CI/CD. Furthermore, it looks outdated, It would be good to make it up-to-date and add a check for pull requests that run the test suite.
1.0
Better tests for the command-line tool - https://github.com/SAP/fosstars-rating-core/tree/master/src/test/shell/tool/github This directory contains a tiny test suite for the command-line tool, but it doesn't run in CI/CD. Furthermore, it looks outdated, It would be good to make it up-to-date and add a check for pull requests that run the test suite.
test
better tests for the command line tool this directory contains a tiny test suite for the command line tool but it doesn t run in ci cd furthermore it looks outdated it would be good to make it up to date and add a check for pull requests that run the test suite
1
139,443
5,375,479,836
IssuesEvent
2017-02-23 05:02:26
ArctosDB/documentation-wiki
https://api.github.com/repos/ArctosDB/documentation-wiki
opened
remove hyphens from anchor links when creating from <h3>subtitles
bug Priority: High question
Here's the problem: in the current site the anchors were not necessarily the subheader/subtitle phrase: https://arctosdb.org/documentation/agent/#namesearch BUT the subheader is "Searching Agents" Opt#1 I can go through and update the subheaders to match but in some cases like this one is leaves wtih some awkward phrasing. Opt#2? Anyway to create anchors not from <h3>tags? In any case we need to get rid of hyphens when constructing the anchors from subtitles (readability pfft!) This blocks the launch if we dont have a solid fix
1.0
remove hyphens from anchor links when creating from <h3>subtitles - Here's the problem: in the current site the anchors were not necessarily the subheader/subtitle phrase: https://arctosdb.org/documentation/agent/#namesearch BUT the subheader is "Searching Agents" Opt#1 I can go through and update the subheaders to match but in some cases like this one is leaves wtih some awkward phrasing. Opt#2? Anyway to create anchors not from <h3>tags? In any case we need to get rid of hyphens when constructing the anchors from subtitles (readability pfft!) This blocks the launch if we dont have a solid fix
non_test
remove hyphens from anchor links when creating from subtitles here s the problem in the current site the anchors were not necessarily the subheader subtitle phrase but the subheader is searching agents opt i can go through and update the subheaders to match but in some cases like this one is leaves wtih some awkward phrasing opt anyway to create anchors not from tags in any case we need to get rid of hyphens when constructing the anchors from subtitles readability pfft this blocks the launch if we dont have a solid fix
0
22,967
3,985,060,728
IssuesEvent
2016-05-07 16:35:33
red/red
https://api.github.com/repos/red/red
closed
Compiler inserts a CR in literal strings line-endings
status.built status.tested type.bug
When the following code is in a file with Windows (CR LF) endings: ``` Red [] s: {a b} probe s ``` The interpreter gives `"a^/b"` while the compiled executable gives `"a^M^/b"`.
1.0
Compiler inserts a CR in literal strings line-endings - When the following code is in a file with Windows (CR LF) endings: ``` Red [] s: {a b} probe s ``` The interpreter gives `"a^/b"` while the compiled executable gives `"a^M^/b"`.
test
compiler inserts a cr in literal strings line endings when the following code is in a file with windows cr lf endings red s a b probe s the interpreter gives a b while the compiled executable gives a m b
1
233,852
17,909,841,913
IssuesEvent
2021-09-09 02:39:37
sefatecol/SFTLeishmaniasis
https://api.github.com/repos/sefatecol/SFTLeishmaniasis
closed
DevOps (Configuración de ScrumBoard y Tareas)
documentation
- [x] Definición de Milestones para cada tarea - [x] Definir Tareas como Issues - [x] Asignar Issues al Kanban (ScrumBoard) - [x] Asignar responsables a cada Tarea (issue) - [x] Configurar la wiki para la documentación e información del proyecto
1.0
DevOps (Configuración de ScrumBoard y Tareas) - - [x] Definición de Milestones para cada tarea - [x] Definir Tareas como Issues - [x] Asignar Issues al Kanban (ScrumBoard) - [x] Asignar responsables a cada Tarea (issue) - [x] Configurar la wiki para la documentación e información del proyecto
non_test
devops configuración de scrumboard y tareas definición de milestones para cada tarea definir tareas como issues asignar issues al kanban scrumboard asignar responsables a cada tarea issue configurar la wiki para la documentación e información del proyecto
0
420,404
12,237,602,172
IssuesEvent
2020-05-04 18:18:44
GoogleContainerTools/kaniko
https://api.github.com/repos/GoogleContainerTools/kaniko
closed
error building image: error building stage: mkdir /usr/share/bug/systemd: not a directory
area/filesystems in progress kind/bug priority/p1
**Actual behavior** A clear and concise description of what the bug is. The error is : ``` INFO[0001] Unpacking rootfs as cmd RUN echo $HOME requires it. error building image: error building stage: mkdir /usr/share/bug/systemd: not a directory ``` This is this Dockerfile which is used to have this error : ``` FROM eu.gcr.io/xxxxxxxxxxxxx/jessie:v0.0.1-dist-kaniko USER www-data RUN echo $HOME ``` Our home made `jessie:v0.0.1-dist-kaniko` jessie docker image use `debian:jessie-slim` as FROM. `systemd` deb package is in version 215 on `debian:jessie-slim`, and `eu.gcr.io/xxxxxxxxxxxxx/jessie:v0.0.1-dist-kaniko` add as debian distribution a home made one, with inside systemd_230 deb package version. On the `systemd_215`, this is a file : `/usr/share/bug/systemd` On the `systemd_230`, this is now a directory with inside files : ``` /usr/share/bug/systemd/control /usr/share/bug/systemd/script ``` The error appears during the `apt-get -y -q dist-upgrade` command while upgrading the `systemd` deb packages from `version 215` to `version 230` Here i suppose then the error is kaniko doesn't succeed to create the new directory `/usr/share/bug/systemd/` with inside files `control` and `script` **Expected behavior** A clear and concise description of what you expected to happen. To pass to the next step which is taking snapshot. **To Reproduce** Steps to reproduce the behavior: 1. Create new Dockerfile for instance tagged `jessie:v0.0.1-dist-kaniko` with `FROM debian:jessie-slim`, which contains `systemd_215` deb packages 2. On this image, Add distribution with inside `systemd_230` (or newest version) deb packages 3. On this image, install new version packages with command `apt-get -y -q dist-upgrade` 4. Create new Dockerfile for instance tagged `jessie:v0.0.1-test-kaniko` which will be : ``` FROM eu.gcr.io/xxxxxxxxxxxxx/jessie:v0.0.1-dist-kaniko USER www-data RUN echo $HOME ``` 5. Build this `jessie:v0.0.1-test-kaniko` image using Kaniko 6. Expect the error : ``` INFO[0001] Unpacking rootfs as cmd RUN echo $HOME requires it. error building image: error building stage: mkdir /usr/share/bug/systemd: not a directory ``` **Triage Notes for the Maintainers** <!-- 🎉🎉🎉 Thank you for an opening an issue !!! 🎉🎉🎉 We are doing our best to get to this. Please help us by helping us prioritize your issue by filling the section below --> | **Description** | **Yes/No** | |----------------|---------------| | Please check if this a new feature you are proposing | <ul><li>- [ ] </li></ul>| | Please check if the build works in docker but not in kaniko | <ul><li>- [x] </li></ul>| | Please check if this error is seen when you use `--cache` flag | <ul><li>- [ ] </li></ul>| | Please check if your dockerfile is a multistage dockerfile | <ul><li>- [ ] </li></ul>|
1.0
error building image: error building stage: mkdir /usr/share/bug/systemd: not a directory - **Actual behavior** A clear and concise description of what the bug is. The error is : ``` INFO[0001] Unpacking rootfs as cmd RUN echo $HOME requires it. error building image: error building stage: mkdir /usr/share/bug/systemd: not a directory ``` This is this Dockerfile which is used to have this error : ``` FROM eu.gcr.io/xxxxxxxxxxxxx/jessie:v0.0.1-dist-kaniko USER www-data RUN echo $HOME ``` Our home made `jessie:v0.0.1-dist-kaniko` jessie docker image use `debian:jessie-slim` as FROM. `systemd` deb package is in version 215 on `debian:jessie-slim`, and `eu.gcr.io/xxxxxxxxxxxxx/jessie:v0.0.1-dist-kaniko` add as debian distribution a home made one, with inside systemd_230 deb package version. On the `systemd_215`, this is a file : `/usr/share/bug/systemd` On the `systemd_230`, this is now a directory with inside files : ``` /usr/share/bug/systemd/control /usr/share/bug/systemd/script ``` The error appears during the `apt-get -y -q dist-upgrade` command while upgrading the `systemd` deb packages from `version 215` to `version 230` Here i suppose then the error is kaniko doesn't succeed to create the new directory `/usr/share/bug/systemd/` with inside files `control` and `script` **Expected behavior** A clear and concise description of what you expected to happen. To pass to the next step which is taking snapshot. **To Reproduce** Steps to reproduce the behavior: 1. Create new Dockerfile for instance tagged `jessie:v0.0.1-dist-kaniko` with `FROM debian:jessie-slim`, which contains `systemd_215` deb packages 2. On this image, Add distribution with inside `systemd_230` (or newest version) deb packages 3. On this image, install new version packages with command `apt-get -y -q dist-upgrade` 4. Create new Dockerfile for instance tagged `jessie:v0.0.1-test-kaniko` which will be : ``` FROM eu.gcr.io/xxxxxxxxxxxxx/jessie:v0.0.1-dist-kaniko USER www-data RUN echo $HOME ``` 5. Build this `jessie:v0.0.1-test-kaniko` image using Kaniko 6. Expect the error : ``` INFO[0001] Unpacking rootfs as cmd RUN echo $HOME requires it. error building image: error building stage: mkdir /usr/share/bug/systemd: not a directory ``` **Triage Notes for the Maintainers** <!-- 🎉🎉🎉 Thank you for an opening an issue !!! 🎉🎉🎉 We are doing our best to get to this. Please help us by helping us prioritize your issue by filling the section below --> | **Description** | **Yes/No** | |----------------|---------------| | Please check if this a new feature you are proposing | <ul><li>- [ ] </li></ul>| | Please check if the build works in docker but not in kaniko | <ul><li>- [x] </li></ul>| | Please check if this error is seen when you use `--cache` flag | <ul><li>- [ ] </li></ul>| | Please check if your dockerfile is a multistage dockerfile | <ul><li>- [ ] </li></ul>|
non_test
error building image error building stage mkdir usr share bug systemd not a directory actual behavior a clear and concise description of what the bug is the error is info unpacking rootfs as cmd run echo home requires it error building image error building stage mkdir usr share bug systemd not a directory this is this dockerfile which is used to have this error from eu gcr io xxxxxxxxxxxxx jessie dist kaniko user www data run echo home our home made jessie dist kaniko jessie docker image use debian jessie slim as from systemd deb package is in version on debian jessie slim and eu gcr io xxxxxxxxxxxxx jessie dist kaniko add as debian distribution a home made one with inside systemd deb package version on the systemd this is a file usr share bug systemd on the systemd this is now a directory with inside files usr share bug systemd control usr share bug systemd script the error appears during the apt get y q dist upgrade command while upgrading the systemd deb packages from version to version here i suppose then the error is kaniko doesn t succeed to create the new directory usr share bug systemd with inside files control and script expected behavior a clear and concise description of what you expected to happen to pass to the next step which is taking snapshot to reproduce steps to reproduce the behavior create new dockerfile for instance tagged jessie dist kaniko with from debian jessie slim which contains systemd deb packages on this image add distribution with inside systemd or newest version deb packages on this image install new version packages with command apt get y q dist upgrade create new dockerfile for instance tagged jessie test kaniko which will be from eu gcr io xxxxxxxxxxxxx jessie dist kaniko user www data run echo home build this jessie test kaniko image using kaniko expect the error info unpacking rootfs as cmd run echo home requires it error building image error building stage mkdir usr share bug systemd not a directory triage notes for the maintainers 🎉🎉🎉 thank you for an opening an issue 🎉🎉🎉 we are doing our best to get to this please help us by helping us prioritize your issue by filling the section below description yes no please check if this a new feature you are proposing please check if the build works in docker but not in kaniko please check if this error is seen when you use cache flag please check if your dockerfile is a multistage dockerfile
0
26,199
12,885,478,138
IssuesEvent
2020-07-13 06:55:22
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
closed
Resource Exhausted when re-training Half of Efficientnet b0 on V100 32GB.
stalled stat:awaiting response type:performance
Hi All, i am experiencing a scenario where i can train anEfficientNetB0 ([efficientnet](https://github.com/qubvel/efficientnet)) with a batch size of 4 . when i cut the model at some layer and make a new (smaller) model out of it , the same training is throwing Resource Exhausted error . ` `Working implementation` ``` model_conv = efn.EfficientNetB0(weights='/work/source/pre_trained/efficientnet-b0_weights_tf_dim_ordering_tf_kernels_autoaugment_notop.h5',include_top=False,input_tensor=input_layer) model = tf.keras.Sequential() model.add(tf.keras.layers.TimeDistributed(model_conv, input_shape=(3, 1024,1024,3))) model.add(tf.keras.layers.GlobalAveragePooling3D()) model.add(tf.keras.layers.Dropout(0.2)) model.add(tf.keras.layers.Dense(classes_n - 1)) model.add(tf.keras.layers.Activation('sigmoid', dtype='float32', name='predictions')) ``` `Not Working implementation` ``` input_layer = tf.keras.layers.Input(shape=(1024,1024,3)) model_conv = efn.EfficientNetB0(weights='/work/source/pre_trained/efficientnet-b0_weights_tf_dim_ordering_tf_kernels_autoaugment_notop.h5', include_top=False,input_tensor=input_layer) model_conv = tf.keras.Model(model_conv.input,model_conv.get_layer('block6d_add').output) model_conv = tf.keras.models.load_model('./models_cut/effb0_5block.h5') model = tf.keras.Sequential() model.add(tf.keras.layers.TimeDistributed(model_conv, input_shape=(3, 1024,1024,3))) model.add(tf.keras.layers.GlobalAveragePooling3D()) model.add(tf.keras.layers.Dropout(0.2)) model.add(tf.keras.layers.Dense(classes_n - 1)) model.add(tf.keras.layers.Activation('sigmoid', dtype='float32', name='predictions')) ``` Error ``` tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[15,64,64,672] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:Conv2D] ```
True
Resource Exhausted when re-training Half of Efficientnet b0 on V100 32GB. - Hi All, i am experiencing a scenario where i can train anEfficientNetB0 ([efficientnet](https://github.com/qubvel/efficientnet)) with a batch size of 4 . when i cut the model at some layer and make a new (smaller) model out of it , the same training is throwing Resource Exhausted error . ` `Working implementation` ``` model_conv = efn.EfficientNetB0(weights='/work/source/pre_trained/efficientnet-b0_weights_tf_dim_ordering_tf_kernels_autoaugment_notop.h5',include_top=False,input_tensor=input_layer) model = tf.keras.Sequential() model.add(tf.keras.layers.TimeDistributed(model_conv, input_shape=(3, 1024,1024,3))) model.add(tf.keras.layers.GlobalAveragePooling3D()) model.add(tf.keras.layers.Dropout(0.2)) model.add(tf.keras.layers.Dense(classes_n - 1)) model.add(tf.keras.layers.Activation('sigmoid', dtype='float32', name='predictions')) ``` `Not Working implementation` ``` input_layer = tf.keras.layers.Input(shape=(1024,1024,3)) model_conv = efn.EfficientNetB0(weights='/work/source/pre_trained/efficientnet-b0_weights_tf_dim_ordering_tf_kernels_autoaugment_notop.h5', include_top=False,input_tensor=input_layer) model_conv = tf.keras.Model(model_conv.input,model_conv.get_layer('block6d_add').output) model_conv = tf.keras.models.load_model('./models_cut/effb0_5block.h5') model = tf.keras.Sequential() model.add(tf.keras.layers.TimeDistributed(model_conv, input_shape=(3, 1024,1024,3))) model.add(tf.keras.layers.GlobalAveragePooling3D()) model.add(tf.keras.layers.Dropout(0.2)) model.add(tf.keras.layers.Dense(classes_n - 1)) model.add(tf.keras.layers.Activation('sigmoid', dtype='float32', name='predictions')) ``` Error ``` tensorflow.python.framework.errors_impl.ResourceExhaustedError: OOM when allocating tensor with shape[15,64,64,672] and type float on /job:localhost/replica:0/task:0/device:GPU:0 by allocator GPU_0_bfc [Op:Conv2D] ```
non_test
resource exhausted when re training half of efficientnet on hi all i am experiencing a scenario where i can train with a batch size of when i cut the model at some layer and make a new smaller model out of it the same training is throwing resource exhausted error working implementation model conv efn weights work source pre trained efficientnet weights tf dim ordering tf kernels autoaugment notop include top false input tensor input layer model tf keras sequential model add tf keras layers timedistributed model conv input shape model add tf keras layers model add tf keras layers dropout model add tf keras layers dense classes n model add tf keras layers activation sigmoid dtype name predictions not working implementation input layer tf keras layers input shape model conv efn weights work source pre trained efficientnet weights tf dim ordering tf kernels autoaugment notop include top false input tensor input layer model conv tf keras model model conv input model conv get layer add output model conv tf keras models load model models cut model tf keras sequential model add tf keras layers timedistributed model conv input shape model add tf keras layers model add tf keras layers dropout model add tf keras layers dense classes n model add tf keras layers activation sigmoid dtype name predictions error tensorflow python framework errors impl resourceexhaustederror oom when allocating tensor with shape and type float on job localhost replica task device gpu by allocator gpu bfc
0
180,578
6,651,117,378
IssuesEvent
2017-09-28 18:49:48
mozilla/addons-frontend
https://api.github.com/repos/mozilla/addons-frontend
closed
Clicking a star while editing a review will erase some text
component: add-on ratings priority: mvp project: amo triaged type: bug
from https://github.com/mozilla/addons-frontend/issues/3180#issuecomment-332513834 I've verified https://github.com/mozilla/addons-frontend/issues/3180 on AMO-dev F55 (WIn 10 and Android 7.0) In both tests I observe the following: - Click on 'Edit My review" - Type additional text in the text area - Modify the star rating Expected result: The modifications made in the text area should show the updates just made Actual result: The updates made in the text area disappear if the star rating is modified after new text is added Video reproduction: ![edit review](https://user-images.githubusercontent.com/31961530/30914711-3f24a4d2-a39d-11e7-9222-8ce4384e9576.gif)
1.0
Clicking a star while editing a review will erase some text - from https://github.com/mozilla/addons-frontend/issues/3180#issuecomment-332513834 I've verified https://github.com/mozilla/addons-frontend/issues/3180 on AMO-dev F55 (WIn 10 and Android 7.0) In both tests I observe the following: - Click on 'Edit My review" - Type additional text in the text area - Modify the star rating Expected result: The modifications made in the text area should show the updates just made Actual result: The updates made in the text area disappear if the star rating is modified after new text is added Video reproduction: ![edit review](https://user-images.githubusercontent.com/31961530/30914711-3f24a4d2-a39d-11e7-9222-8ce4384e9576.gif)
non_test
clicking a star while editing a review will erase some text from i ve verified on amo dev win and android in both tests i observe the following click on edit my review type additional text in the text area modify the star rating expected result the modifications made in the text area should show the updates just made actual result the updates made in the text area disappear if the star rating is modified after new text is added video reproduction
0
139,542
11,273,488,370
IssuesEvent
2020-01-14 16:39:04
HumanCellAtlas/ingest-graph-validator
https://api.github.com/repos/HumanCellAtlas/ingest-graph-validator
opened
refactor test "sequencing files have links to appropriate protocols"
graph validation test refactoring
The first process upstream of a sequencing file should have two links to protocols and those should be 'library_preparation_protocol' and 'sequencing_protocol'.
1.0
refactor test "sequencing files have links to appropriate protocols" - The first process upstream of a sequencing file should have two links to protocols and those should be 'library_preparation_protocol' and 'sequencing_protocol'.
test
refactor test sequencing files have links to appropriate protocols the first process upstream of a sequencing file should have two links to protocols and those should be library preparation protocol and sequencing protocol
1
294,766
9,041,392,739
IssuesEvent
2019-02-11 00:02:07
telstra/open-kilda
https://api.github.com/repos/telstra/open-kilda
closed
Implement checks for PathVerificationService#handlePacketIn
area/security bug priority/2-high refactor
Step to reproduce 1. Create simple topology 2. Run code in scapy ``` $ docker exec -ti mininet scapy Welcome to Scapy (2.2.0) >>> sendp(Ether(dst="00:26:e1:ff:ff:ff")/ IP(dst="192.168.0.255")/UDP(dport=61231,sport=61231)/'\0x01\0xFF\x00\x00', iface="00000001-eth1") . Sent 1 packets. ``` In `docker logs -f openkilda_floodlight_1 ` you can see ``` 2017-11-13 16:37:24.434 ERROR [o.o.f.p.PathVerificationService] unknown error during packet_in message processing: null java.lang.NullPointerException: null at org.openkilda.floodlight.pathverification.PathVerificationService.handlePacketIn(PathVerificationService.java:423) [floodlight-modules.jar:na] at org.openkilda.floodlight.pathverification.PathVerificationService.receive(PathVerificationService.java:207) [floodlight-modules.jar:na] at net.floodlightcontroller.core.internal.Controller.handleMessage(Controller.java:407) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchManager.handleMessage(OFSwitchManager.java:487) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.dispatchMessage(OFSwitchHandshakeHandler.java:1752) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.access$2000(OFSwitchHandshakeHandler.java:95) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler$MasterState.processOFPacketIn(OFSwitchHandshakeHandler.java:1488) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler$OFSwitchHandshakeState.processOFMessage(OFSwitchHandshakeHandler.java:839) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.processOFMessage(OFSwitchHandshakeHandler.java:1790) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.messageReceived(OFSwitchHandshakeHandler.java:1964) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFConnection.messageReceived(OFConnection.java:414) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.sendMessageToConnection(OFChannelHandler.java:579) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.access$800(OFChannelHandler.java:57) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler$OFChannelState.processOFMessage(OFChannelHandler.java:284) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.channelRead0(OFChannelHandler.java:696) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.channelRead0(OFChannelHandler.java:57) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.handler.timeout.ReadTimeoutHandler.channelRead(ReadTimeoutHandler.java:152) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [floodlight.jar:1.2-SNAPSHOT] at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112) [floodlight.jar:1.2-SNAPSHOT] at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137) [floodlight.jar:1.2-SNAPSHOT] at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131] ``` Need to write code for avoiding malformed packets.
1.0
Implement checks for PathVerificationService#handlePacketIn - Step to reproduce 1. Create simple topology 2. Run code in scapy ``` $ docker exec -ti mininet scapy Welcome to Scapy (2.2.0) >>> sendp(Ether(dst="00:26:e1:ff:ff:ff")/ IP(dst="192.168.0.255")/UDP(dport=61231,sport=61231)/'\0x01\0xFF\x00\x00', iface="00000001-eth1") . Sent 1 packets. ``` In `docker logs -f openkilda_floodlight_1 ` you can see ``` 2017-11-13 16:37:24.434 ERROR [o.o.f.p.PathVerificationService] unknown error during packet_in message processing: null java.lang.NullPointerException: null at org.openkilda.floodlight.pathverification.PathVerificationService.handlePacketIn(PathVerificationService.java:423) [floodlight-modules.jar:na] at org.openkilda.floodlight.pathverification.PathVerificationService.receive(PathVerificationService.java:207) [floodlight-modules.jar:na] at net.floodlightcontroller.core.internal.Controller.handleMessage(Controller.java:407) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchManager.handleMessage(OFSwitchManager.java:487) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.dispatchMessage(OFSwitchHandshakeHandler.java:1752) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.access$2000(OFSwitchHandshakeHandler.java:95) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler$MasterState.processOFPacketIn(OFSwitchHandshakeHandler.java:1488) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler$OFSwitchHandshakeState.processOFMessage(OFSwitchHandshakeHandler.java:839) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.processOFMessage(OFSwitchHandshakeHandler.java:1790) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFSwitchHandshakeHandler.messageReceived(OFSwitchHandshakeHandler.java:1964) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFConnection.messageReceived(OFConnection.java:414) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.sendMessageToConnection(OFChannelHandler.java:579) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.access$800(OFChannelHandler.java:57) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler$OFChannelState.processOFMessage(OFChannelHandler.java:284) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.channelRead0(OFChannelHandler.java:696) [floodlight.jar:1.2-SNAPSHOT] at net.floodlightcontroller.core.internal.OFChannelHandler.channelRead0(OFChannelHandler.java:57) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.SimpleChannelInboundHandler.channelRead(SimpleChannelInboundHandler.java:105) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.ChannelInboundHandlerAdapter.channelRead(ChannelInboundHandlerAdapter.java:86) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.handler.timeout.ReadTimeoutHandler.channelRead(ReadTimeoutHandler.java:152) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:266) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.handler.codec.ByteToMessageDecoder.channelRead(ByteToMessageDecoder.java:244) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:308) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:294) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:846) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:131) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:511) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:468) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:382) [floodlight.jar:1.2-SNAPSHOT] at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:354) [floodlight.jar:1.2-SNAPSHOT] at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:112) [floodlight.jar:1.2-SNAPSHOT] at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:137) [floodlight.jar:1.2-SNAPSHOT] at java.lang.Thread.run(Thread.java:748) [na:1.8.0_131] ``` Need to write code for avoiding malformed packets.
non_test
implement checks for pathverificationservice handlepacketin step to reproduce create simple topology run code in scapy docker exec ti mininet scapy welcome to scapy sendp ether dst ff ff ff ip dst udp dport sport iface sent packets in docker logs f openkilda floodlight you can see error unknown error during packet in message processing null java lang nullpointerexception null at org openkilda floodlight pathverification pathverificationservice handlepacketin pathverificationservice java at org openkilda floodlight pathverification pathverificationservice receive pathverificationservice java at net floodlightcontroller core internal controller handlemessage controller java at net floodlightcontroller core internal ofswitchmanager handlemessage ofswitchmanager java at net floodlightcontroller core internal ofswitchhandshakehandler dispatchmessage ofswitchhandshakehandler java at net floodlightcontroller core internal ofswitchhandshakehandler access ofswitchhandshakehandler java at net floodlightcontroller core internal ofswitchhandshakehandler masterstate processofpacketin ofswitchhandshakehandler java at net floodlightcontroller core internal ofswitchhandshakehandler ofswitchhandshakestate processofmessage ofswitchhandshakehandler java at net floodlightcontroller core internal ofswitchhandshakehandler processofmessage ofswitchhandshakehandler java at net floodlightcontroller core internal ofswitchhandshakehandler messagereceived ofswitchhandshakehandler java at net floodlightcontroller core internal ofconnection messagereceived ofconnection java at net floodlightcontroller core internal ofchannelhandler sendmessagetoconnection ofchannelhandler java at net floodlightcontroller core internal ofchannelhandler access ofchannelhandler java at net floodlightcontroller core internal ofchannelhandler ofchannelstate processofmessage ofchannelhandler java at net floodlightcontroller core internal ofchannelhandler ofchannelhandler java at net floodlightcontroller core internal ofchannelhandler ofchannelhandler java at io netty channel simplechannelinboundhandler channelread simplechannelinboundhandler java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty channel channelinboundhandleradapter channelread channelinboundhandleradapter java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler timeout readtimeouthandler channelread readtimeouthandler java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler timeout idlestatehandler channelread idlestatehandler java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler codec bytetomessagedecoder channelread bytetomessagedecoder java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty channel defaultchannelpipeline firechannelread defaultchannelpipeline java at io netty channel nio abstractniobytechannel niobyteunsafe read abstractniobytechannel java at io netty channel nio nioeventloop processselectedkey nioeventloop java at io netty channel nio nioeventloop processselectedkeysoptimized nioeventloop java at io netty channel nio nioeventloop processselectedkeys nioeventloop java at io netty channel nio nioeventloop run nioeventloop java at io netty util concurrent singlethreadeventexecutor run singlethreadeventexecutor java at io netty util concurrent defaultthreadfactory defaultrunnabledecorator run defaultthreadfactory java at java lang thread run thread java need to write code for avoiding malformed packets
0
115,630
9,807,166,112
IssuesEvent
2019-06-12 13:09:44
brave/brave-browser
https://api.github.com/repos/brave/brave-browser
closed
Ad notifications not being shown after re-enabling until browser is restarted
QA/Test-Plan-Specified QA/Yes feature/ads feature/rewards release-notes/include
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue. PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE. INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED--> ## Description After Ads have been disabled and then re-enabled, Ad notifications will not be shown until you restart Brave. ## Steps to Reproduce <!--Please add a series of steps to reproduce the issue--> 1. Clean install/profile. 2. Enable Rewards (Ads are also enabled) 3. Go to brave://rewards page 4. Disable Ads 5. Wait 30s or so, enable Ads. 6. Visit a site, wait until you see `Browser state changed to idle` in terminal then move the mouse. 7. No ad is shown. You get `Notification not made: Confirmations not ready` in terminal. 8. Give it another minute or so, try step 6 again. Same result. This will continue (you won't get Ads) until you restart Brave. Then you will start to see Ads again. ## Actual result: Ad notifications not shown after re-enabling Ads from panel on brave://rewards ## Expected result: Ad notifications should be shown after re-enabling Ads from panel. ## Reproduces how often: easily ## Brave version (brave://version info) Brave | 0.63.55 Chromium: 74.0.3729.131 (Official Build) (64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X Brave | 0.64.75 Chromium: 74.0.3729.131 (Official Build) (64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X Brave | 0.65.92 Chromium: 74.0.3729.131 (Official Build) beta(64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X Brave | 0.66.63 Chromium: 74.0.3729.131 (Official Build) dev(64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X ## Version/Channel Information: <!--Does this issue happen on any other channels? Or is it specific to a certain channel?--> - Can you reproduce this issue with the current release? yes - Can you reproduce this issue with the beta channel? yes - Can you reproduce this issue with the dev channel? yes - Can you reproduce this issue with the nightly channel? unsure, but probably yes ## Other Additional Information: - Does the issue resolve itself when disabling Brave Shields? n/a - Does the issue resolve itself when disabling Brave Rewards? n/a - Is the issue reproducible on the latest version of Chrome? n/a ## Miscellaneous Information: reproduced by @srirambv cc @jsecretan @tmancey
1.0
Ad notifications not being shown after re-enabling until browser is restarted - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue. PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE. INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED--> ## Description After Ads have been disabled and then re-enabled, Ad notifications will not be shown until you restart Brave. ## Steps to Reproduce <!--Please add a series of steps to reproduce the issue--> 1. Clean install/profile. 2. Enable Rewards (Ads are also enabled) 3. Go to brave://rewards page 4. Disable Ads 5. Wait 30s or so, enable Ads. 6. Visit a site, wait until you see `Browser state changed to idle` in terminal then move the mouse. 7. No ad is shown. You get `Notification not made: Confirmations not ready` in terminal. 8. Give it another minute or so, try step 6 again. Same result. This will continue (you won't get Ads) until you restart Brave. Then you will start to see Ads again. ## Actual result: Ad notifications not shown after re-enabling Ads from panel on brave://rewards ## Expected result: Ad notifications should be shown after re-enabling Ads from panel. ## Reproduces how often: easily ## Brave version (brave://version info) Brave | 0.63.55 Chromium: 74.0.3729.131 (Official Build) (64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X Brave | 0.64.75 Chromium: 74.0.3729.131 (Official Build) (64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X Brave | 0.65.92 Chromium: 74.0.3729.131 (Official Build) beta(64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X Brave | 0.66.63 Chromium: 74.0.3729.131 (Official Build) dev(64-bit) -- | -- Revision | 518a41c1fa7ce1c8bb5e22346e82e42b4d76a96f-refs/branch-heads/3729@{#954} OS | Mac OS X ## Version/Channel Information: <!--Does this issue happen on any other channels? Or is it specific to a certain channel?--> - Can you reproduce this issue with the current release? yes - Can you reproduce this issue with the beta channel? yes - Can you reproduce this issue with the dev channel? yes - Can you reproduce this issue with the nightly channel? unsure, but probably yes ## Other Additional Information: - Does the issue resolve itself when disabling Brave Shields? n/a - Does the issue resolve itself when disabling Brave Rewards? n/a - Is the issue reproducible on the latest version of Chrome? n/a ## Miscellaneous Information: reproduced by @srirambv cc @jsecretan @tmancey
test
ad notifications not being shown after re enabling until browser is restarted have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description after ads have been disabled and then re enabled ad notifications will not be shown until you restart brave steps to reproduce clean install profile enable rewards ads are also enabled go to brave rewards page disable ads wait or so enable ads visit a site wait until you see browser state changed to idle in terminal then move the mouse no ad is shown you get notification not made confirmations not ready in terminal give it another minute or so try step again same result this will continue you won t get ads until you restart brave then you will start to see ads again actual result ad notifications not shown after re enabling ads from panel on brave rewards expected result ad notifications should be shown after re enabling ads from panel reproduces how often easily brave version brave version info brave chromium   official build   bit revision refs branch heads os mac os x brave chromium   official build   bit revision refs branch heads os mac os x brave chromium   official build  beta bit revision refs branch heads os mac os x brave chromium   official build  dev bit revision refs branch heads os mac os x version channel information can you reproduce this issue with the current release yes can you reproduce this issue with the beta channel yes can you reproduce this issue with the dev channel yes can you reproduce this issue with the nightly channel unsure but probably yes other additional information does the issue resolve itself when disabling brave shields n a does the issue resolve itself when disabling brave rewards n a is the issue reproducible on the latest version of chrome n a miscellaneous information reproduced by srirambv cc jsecretan tmancey
1
171,678
13,244,041,656
IssuesEvent
2020-08-19 12:27:39
openfoodfacts/openfoodfacts-androidapp
https://api.github.com/repos/openfoodfacts/openfoodfacts-androidapp
closed
Test that translations of strings that contain a placeholder also contain a placeholder
help wanted tests
Test that translations of strings that contain a placeholder also contain a placeholder
1.0
Test that translations of strings that contain a placeholder also contain a placeholder - Test that translations of strings that contain a placeholder also contain a placeholder
test
test that translations of strings that contain a placeholder also contain a placeholder test that translations of strings that contain a placeholder also contain a placeholder
1
428,464
12,412,021,678
IssuesEvent
2020-05-22 09:43:38
ahmedkaludi/accelerated-mobile-pages
https://api.github.com/repos/ahmedkaludi/accelerated-mobile-pages
closed
The attribute 'src' in tag 'amp-form extension .js script' is set to the invalid value 'https://cdn.ampproject.org/v0/amp-form-latest.js' defer onload=''.
NEXT UPDATE [Priority: HIGH] bug
REF:https://secure.helpscout.net/conversation/1157797858/128141/
1.0
The attribute 'src' in tag 'amp-form extension .js script' is set to the invalid value 'https://cdn.ampproject.org/v0/amp-form-latest.js' defer onload=''. - REF:https://secure.helpscout.net/conversation/1157797858/128141/
non_test
the attribute src in tag amp form extension js script is set to the invalid value defer onload ref
0
136,991
30,609,778,987
IssuesEvent
2023-07-23 13:11:37
sara-abu-zeineh/portfolio
https://api.github.com/repos/sara-abu-zeineh/portfolio
closed
Create Education section
code review
#7 Education Section --- This section contains the following: - [ ] A timeline for the most important education I have in the current year 🏫 . - [ ] Used font for the information in each card, ` font-family: 'Raleway', sans-serif;` ✒. - [ ] Used icon from [fontawsome ](https://fontawesome.com/) Used icon : 1. ` <i class=" fa-solid fa-caret-left"></i>` ◀ 2. ` <i class="fa-solid fa-caret-right"></i>` ▶ - [ ] Implement Responsive Home Design 🖥. - [ ] Add A forward Transitions to the timeline✨. - [ ] Colors: 1. `--golden-orange : #F9A602` 2. `--black : #000;` 3. ` --white :#fff;`
1.0
Create Education section - #7 Education Section --- This section contains the following: - [ ] A timeline for the most important education I have in the current year 🏫 . - [ ] Used font for the information in each card, ` font-family: 'Raleway', sans-serif;` ✒. - [ ] Used icon from [fontawsome ](https://fontawesome.com/) Used icon : 1. ` <i class=" fa-solid fa-caret-left"></i>` ◀ 2. ` <i class="fa-solid fa-caret-right"></i>` ▶ - [ ] Implement Responsive Home Design 🖥. - [ ] Add A forward Transitions to the timeline✨. - [ ] Colors: 1. `--golden-orange : #F9A602` 2. `--black : #000;` 3. ` --white :#fff;`
non_test
create education section education section this section contains the following a timeline for the most important education i have in the current year 🏫 used font for the information in each card font family raleway sans serif ✒ used icon from used icon ◀ ▶ implement responsive home design 🖥 add a forward transitions to the timeline✨ colors golden orange black white fff
0
285,549
24,676,141,715
IssuesEvent
2022-10-18 17:09:47
nrwl/nx
https://api.github.com/repos/nrwl/nx
closed
@nrwl/cypress not installing cypress
type: bug scope: testing tools
<!-- Please do your best to fill out all of the sections below! --> ## Current Behavior <!-- What is the behavior that currently you experience? --> When adding `@nrwl/cypress` and generating a new e2e project, it does not add cypress itself to the dependencies. ## Expected Behavior <!-- What is the behavior that you expect to happen? --> <!-- Is this a regression? .i.e Did this used to be the behavior at one point? --> When using `@nrwl/cypress`, running e2e should work OOTB without installing cypress manually, at least without concrete warnings in the CLI that it's supposed to be a manual step. ## Steps to Reproduce <!-- Help us help you by making it easy for us to reproduce your issue! --> ```sh npx create-nx-workspace@14.8.4 test-repo # version is 14.8.4 cd test-repo npm add -D @nrwl/cypress npx nx generate @nrwl/cypress:cypress-project test-1 --baseUrl=localhost:3000 ``` <!-- Can you reproduce this on https://github.com/nrwl/nx-examples? --> <!-- If so, open a PR with your changes and link it below. --> <!-- If not, please provide a minimal Github repo --> <!-- At the very least, provide as much detail as possible to help us reproduce the issue --> ### Failure Logs <!-- Please include any relevant log snippets or files here. --> ``` > nx run test-1:e2e >  NX  Cannot find module 'cypress' Require stack: - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/@nrwl/cypress/src/executors/cypress/cypress.impl.js - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/nx/src/config/workspaces.js - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/nx/src/command-line/run.js - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/nx/bin/run-executor.js Pass --verbose to see the stacktrace. > NX Running target "test-1:e2e" failed Failed tasks: - test-1:e2e Hint: run the command with --verbose for more details. ``` ### Environment <!-- It's important for us to know the context in which you experience this behavior! --> <!-- Please paste the result of `nx report` below! --> ``` > NX Report complete - copy this into the issue template Node : 14.19.3 OS : darwin x64 npm : 6.14.17 nx : 14.8.4 @nrwl/angular : Not Found @nrwl/cypress : 14.8.4 @nrwl/detox : Not Found @nrwl/devkit : 14.8.4 @nrwl/esbuild : Not Found @nrwl/eslint-plugin-nx : 14.8.4 @nrwl/expo : Not Found @nrwl/express : Not Found @nrwl/jest : 14.8.4 @nrwl/js : Not Found @nrwl/linter : 14.8.4 @nrwl/nest : Not Found @nrwl/next : Not Found @nrwl/node : Not Found @nrwl/nx-cloud : Not Found @nrwl/nx-plugin : Not Found @nrwl/react : Not Found @nrwl/react-native : Not Found @nrwl/rollup : Not Found @nrwl/schematics : Not Found @nrwl/storybook : Not Found @nrwl/web : Not Found @nrwl/webpack : Not Found @nrwl/workspace : 14.8.4 typescript : 4.8.4 --------------------------------------- Local workspace plugins: --------------------------------------- Community plugins: ```
1.0
@nrwl/cypress not installing cypress - <!-- Please do your best to fill out all of the sections below! --> ## Current Behavior <!-- What is the behavior that currently you experience? --> When adding `@nrwl/cypress` and generating a new e2e project, it does not add cypress itself to the dependencies. ## Expected Behavior <!-- What is the behavior that you expect to happen? --> <!-- Is this a regression? .i.e Did this used to be the behavior at one point? --> When using `@nrwl/cypress`, running e2e should work OOTB without installing cypress manually, at least without concrete warnings in the CLI that it's supposed to be a manual step. ## Steps to Reproduce <!-- Help us help you by making it easy for us to reproduce your issue! --> ```sh npx create-nx-workspace@14.8.4 test-repo # version is 14.8.4 cd test-repo npm add -D @nrwl/cypress npx nx generate @nrwl/cypress:cypress-project test-1 --baseUrl=localhost:3000 ``` <!-- Can you reproduce this on https://github.com/nrwl/nx-examples? --> <!-- If so, open a PR with your changes and link it below. --> <!-- If not, please provide a minimal Github repo --> <!-- At the very least, provide as much detail as possible to help us reproduce the issue --> ### Failure Logs <!-- Please include any relevant log snippets or files here. --> ``` > nx run test-1:e2e >  NX  Cannot find module 'cypress' Require stack: - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/@nrwl/cypress/src/executors/cypress/cypress.impl.js - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/nx/src/config/workspaces.js - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/nx/src/command-line/run.js - /Users/gioraguttsait/Git/POCs/shopify-cypress-poc/node_modules/nx/bin/run-executor.js Pass --verbose to see the stacktrace. > NX Running target "test-1:e2e" failed Failed tasks: - test-1:e2e Hint: run the command with --verbose for more details. ``` ### Environment <!-- It's important for us to know the context in which you experience this behavior! --> <!-- Please paste the result of `nx report` below! --> ``` > NX Report complete - copy this into the issue template Node : 14.19.3 OS : darwin x64 npm : 6.14.17 nx : 14.8.4 @nrwl/angular : Not Found @nrwl/cypress : 14.8.4 @nrwl/detox : Not Found @nrwl/devkit : 14.8.4 @nrwl/esbuild : Not Found @nrwl/eslint-plugin-nx : 14.8.4 @nrwl/expo : Not Found @nrwl/express : Not Found @nrwl/jest : 14.8.4 @nrwl/js : Not Found @nrwl/linter : 14.8.4 @nrwl/nest : Not Found @nrwl/next : Not Found @nrwl/node : Not Found @nrwl/nx-cloud : Not Found @nrwl/nx-plugin : Not Found @nrwl/react : Not Found @nrwl/react-native : Not Found @nrwl/rollup : Not Found @nrwl/schematics : Not Found @nrwl/storybook : Not Found @nrwl/web : Not Found @nrwl/webpack : Not Found @nrwl/workspace : 14.8.4 typescript : 4.8.4 --------------------------------------- Local workspace plugins: --------------------------------------- Community plugins: ```
test
nrwl cypress not installing cypress current behavior when adding nrwl cypress and generating a new project it does not add cypress itself to the dependencies expected behavior when using nrwl cypress running should work ootb without installing cypress manually at least without concrete warnings in the cli that it s supposed to be a manual step steps to reproduce sh npx create nx workspace test repo version is cd test repo npm add d nrwl cypress npx nx generate nrwl cypress cypress project test baseurl localhost failure logs nx run test       nx      find module cypress  require stack users gioraguttsait git pocs shopify cypress poc node modules nrwl cypress src executors cypress cypress impl js users gioraguttsait git pocs shopify cypress poc node modules nx src config workspaces js users gioraguttsait git pocs shopify cypress poc node modules nx src command line run js users gioraguttsait git pocs shopify cypress poc node modules nx bin run executor js pass verbose to see the stacktrace nx running target test failed failed tasks test hint run the command with verbose for more details environment nx report complete copy this into the issue template node os darwin npm nx nrwl angular not found nrwl cypress nrwl detox not found nrwl devkit nrwl esbuild not found nrwl eslint plugin nx nrwl expo not found nrwl express not found nrwl jest nrwl js not found nrwl linter nrwl nest not found nrwl next not found nrwl node not found nrwl nx cloud not found nrwl nx plugin not found nrwl react not found nrwl react native not found nrwl rollup not found nrwl schematics not found nrwl storybook not found nrwl web not found nrwl webpack not found nrwl workspace typescript local workspace plugins community plugins
1
608,625
18,844,588,078
IssuesEvent
2021-11-11 13:37:16
betagouv/service-national-universel
https://api.github.com/repos/betagouv/service-national-universel
closed
fix: pays de naissance
enhancement priority-HIGH
### Fonctionnalité liée à un problème ? _No response_ ### Fonctionnalité ![image](https://user-images.githubusercontent.com/33058720/141279941-a2fea9c7-7c9b-40a5-add4-4958f8f2bb23.png) il faudrait qu'aucune des chacbox ne soient selectionnées. Si il selectionne France, on set a le pays de naissance a France. Le but est d'anlever les valeurs par défaut et les deduction du style "si il n'a pas de pays renseigné c'est que c'est lq France parce qu'en fait blablabla" ### Commentaires _No response_
1.0
fix: pays de naissance - ### Fonctionnalité liée à un problème ? _No response_ ### Fonctionnalité ![image](https://user-images.githubusercontent.com/33058720/141279941-a2fea9c7-7c9b-40a5-add4-4958f8f2bb23.png) il faudrait qu'aucune des chacbox ne soient selectionnées. Si il selectionne France, on set a le pays de naissance a France. Le but est d'anlever les valeurs par défaut et les deduction du style "si il n'a pas de pays renseigné c'est que c'est lq France parce qu'en fait blablabla" ### Commentaires _No response_
non_test
fix pays de naissance fonctionnalité liée à un problème no response fonctionnalité il faudrait qu aucune des chacbox ne soient selectionnées si il selectionne france on set a le pays de naissance a france le but est d anlever les valeurs par défaut et les deduction du style si il n a pas de pays renseigné c est que c est lq france parce qu en fait blablabla commentaires no response
0