| Unnamed: 0 (int64, 0–832k) | id (float64, 2.49B–32.1B) | type (string, 1 class) | created_at (string, length 19) | repo (string, length 4–112) | repo_url (string, length 33–141) | action (string, 3 classes) | title (string, length 1–1.02k) | labels (string, length 4–1.54k) | body (string, length 1–262k) | index (string, 17 classes) | text_combine (string, length 95–262k) | label (string, 2 classes) | text (string, length 96–252k) | binary_label (int64, 0–1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
381,994
| 11,299,254,814
|
IssuesEvent
|
2020-01-17 10:47:01
|
bryntum/support
|
https://api.github.com/repos/bryntum/support
|
closed
|
Cannot save unscheduled task with ENTER key
|
bug high-priority resolved
|
Await https://github.com/bryntum/bryntum-suite/pull/387 or try this in that branch (where the duration field can be empty and still valid)
In basic demo, run:
```
const added = gantt.taskStore.rootNode.appendChild({ name : 'New' });
// run propagation to calculate new task fields
await gantt.project.propagate();
gantt.editTask(added);
```
Change the name, then press the ENTER key.
|
1.0
|
Cannot save unscheduled task with ENTER key - Await https://github.com/bryntum/bryntum-suite/pull/387 or try this in that branch (where the duration field can be empty and still valid)
In basic demo, run:
```
const added = gantt.taskStore.rootNode.appendChild({ name : 'New' });
// run propagation to calculate new task fields
await gantt.project.propagate();
gantt.editTask(added);
```
Change the name, then press the ENTER key.
|
non_test
|
cannot save unscheduled task with enter key await or try this in that branch where duration field can be empty and still valid in basic demo run const added gantt taskstore rootnode appendchild name new run propagation to calculate new task fields await gantt project propagate gantt edittask added change name enter key
| 0
|
245,550
| 20,777,114,133
|
IssuesEvent
|
2022-03-16 11:32:20
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
tests: cmsis_dsp: rf16 and cf16 tests are not executed on Native POSIX
|
bug priority: low area: Tests area: CMSIS-DSP
|
**Describe the bug**
Following test scenarios:
* `libraries.cmsis_dsp.transform.cf16`
* `libraries.cmsis_dsp.transform.cf16.fpu`
* `libraries.cmsis_dsp.transform.rf16`
* `libraries.cmsis_dsp.transform.rf16.fpu`
From this directory:
`tests/lib/cmsis_dsp/transform/`
are not executed - only `PROJECT EXECUTION SUCCESSFUL` is printed.
Problem concerns only `native_posix` platform.
On `mps2_an521` and `mps2_an521_remote` platforms everything works properly.
**To Reproduce**
Steps to reproduce the behavior:
1. Run Twister by this command:
`./scripts/twister -p native_posix -T tests/lib/cmsis_dsp/transform/`
2. Analyze the twister.log file, especially the execution of the `rf16` and `cf16` tests.
**Expected behavior**
Tests should be executed - not only printing of `PROJECT EXECUTION SUCCESSFUL` information.
**Impact**
At this moment those tests are marked as `PASS`, which is misleading because they aren't even executed.
**Logs and console output**
```
2022-03-15 17:14:39,977 - twister - DEBUG - Spawning BinaryHandler Thread for native_posix/tests/lib/cmsis_dsp/transform/libraries.cmsis_dsp.transform.cf16.fpu
2022-03-15 17:14:39,978 - twister - DEBUG - OUTPUT: *** Booting Zephyr OS build zephyr-v3.0.0-884-gadc901aa6a39 ***
2022-03-15 17:14:39,978 - twister - DEBUG - OUTPUT: ===================================================================
2022-03-15 17:14:39,979 - twister - DEBUG - OUTPUT: PROJECT EXECUTION SUCCESSFUL
2022-03-15 17:14:39,979 - twister - DEBUG - OUTPUT:
```
**Environment (please complete the following information):**
- OS: Linux
- Toolchain: Zephyr SDK
- Commit SHA: adc901aa6a39caffd7971ff99456b26414cd3793
**Additional context**
Similar problem occurred here:
https://github.com/zephyrproject-rtos/zephyr/issues/42396
But I verified that those tests have not worked since the beginning, i.e. since they were added in these commits:
https://github.com/zephyrproject-rtos/zephyr/commit/600ca01464ef253097644f2fc4dd0f3f2a3f5087
https://github.com/zephyrproject-rtos/zephyr/commit/6547025bc19b0bfd6078b3e76ba54b1193de9196
So the source of the problem is probably not connected with the changes introduced in the ZTest API in October last year.
This problem was discovered during review of this PR:
https://github.com/zephyrproject-rtos/zephyr/pull/42482
The enhancement proposed in this PR could help avoid this situation, thanks to the additional verification of the printed test suite name.
If everything works properly on QEMU platforms, then perhaps removing Native POSIX support for those tests should be considered?
|
1.0
|
tests: cmsis_dsp: rf16 and cf16 tests are not executed on Native POSIX - **Describe the bug**
Following test scenarios:
* `libraries.cmsis_dsp.transform.cf16`
* `libraries.cmsis_dsp.transform.cf16.fpu`
* `libraries.cmsis_dsp.transform.rf16`
* `libraries.cmsis_dsp.transform.rf16.fpu`
From this directory:
`tests/lib/cmsis_dsp/transform/`
are not executed - only `PROJECT EXECUTION SUCCESSFUL` is printed.
Problem concerns only `native_posix` platform.
On `mps2_an521` and `mps2_an521_remote` platforms everything works properly.
**To Reproduce**
Steps to reproduce the behavior:
1. Run Twister by this command:
`./scripts/twister -p native_posix -T tests/lib/cmsis_dsp/transform/`
2. Analyze the twister.log file, especially the execution of the `rf16` and `cf16` tests.
**Expected behavior**
Tests should be executed - not only printing of `PROJECT EXECUTION SUCCESSFUL` information.
**Impact**
At this moment those tests are marked as `PASS`, which is misleading because they aren't even executed.
**Logs and console output**
```
2022-03-15 17:14:39,977 - twister - DEBUG - Spawning BinaryHandler Thread for native_posix/tests/lib/cmsis_dsp/transform/libraries.cmsis_dsp.transform.cf16.fpu
2022-03-15 17:14:39,978 - twister - DEBUG - OUTPUT: *** Booting Zephyr OS build zephyr-v3.0.0-884-gadc901aa6a39 ***
2022-03-15 17:14:39,978 - twister - DEBUG - OUTPUT: ===================================================================
2022-03-15 17:14:39,979 - twister - DEBUG - OUTPUT: PROJECT EXECUTION SUCCESSFUL
2022-03-15 17:14:39,979 - twister - DEBUG - OUTPUT:
```
**Environment (please complete the following information):**
- OS: Linux
- Toolchain: Zephyr SDK
- Commit SHA: adc901aa6a39caffd7971ff99456b26414cd3793
**Additional context**
Similar problem occurred here:
https://github.com/zephyrproject-rtos/zephyr/issues/42396
But I verified that those tests have not worked since the beginning, i.e. since they were added in these commits:
https://github.com/zephyrproject-rtos/zephyr/commit/600ca01464ef253097644f2fc4dd0f3f2a3f5087
https://github.com/zephyrproject-rtos/zephyr/commit/6547025bc19b0bfd6078b3e76ba54b1193de9196
So the source of the problem is probably not connected with the changes introduced in the ZTest API in October last year.
This problem was discovered during review of this PR:
https://github.com/zephyrproject-rtos/zephyr/pull/42482
The enhancement proposed in this PR could help avoid this situation, thanks to the additional verification of the printed test suite name.
If everything works properly on QEMU platforms, then perhaps removing Native POSIX support for those tests should be considered?
|
test
|
tests cmsis dsp and tests are not executed on native posix describe the bug following test scenarios libraries cmsis dsp transform libraries cmsis dsp transform fpu libraries cmsis dsp transform libraries cmsis dsp transform fpu from this directory tests lib cmsis dsp transform are not executed only project execution successful is printed problem concerns only native posix platform on and remote platforms everything works properly to reproduce steps to reproduce the behavior run twister by this command scripts twister p native posix t tests lib cmsis dsp transform analyze twister log file especially execution of and tests expected behavior tests should be executed not only printing of project execution successful information impact at this moment those tests are marked as pass what is misleading information because they aren t even executed logs and console output twister debug spawning binaryhandler thread for native posix tests lib cmsis dsp transform libraries cmsis dsp transform fpu twister debug output booting zephyr os build zephyr twister debug output twister debug output project execution successful twister debug output environment please complete the following information os linux toolchain zephyr sdk commit sha additional context similar problem occurred here but i verified that those tests do not work since the beginning since they was added in those commits so the source of problem is probably not connected with changes introduced in ztest api in october last year this problem was discovered during review of this pr enhancement proposed in this pr could help to avoid this situation due to the additional verification of printed test suite name if everything works properly on qemu platforms then perhaps it should be considered to remove support for native posix platform for those tests
| 1
|
41,910
| 5,408,907,166
|
IssuesEvent
|
2017-03-01 01:44:33
|
nwjs/nw.js
|
https://api.github.com/repos/nwjs/nw.js
|
closed
|
webview does not work correctly when "node" added to Chrome app manifest
|
bug P2 test-todo triaged
|
When "node" is added in the permissions in manifest.json in a Chrome App, a webview is not working immediately after the app has started. For example, the request handlers are undefined.
If you add a delay of say 500 ms, it works.
In the attached example it will print the stringified request handlers of the webview 1) just after startup and 2) after a delay. Only the delayed print will show the handlers.
If you remove node from the permissions, both are printed out correctly.
It is probably the whole UI that is delayed by adding node permission.
Also, the font changes when adding node to permissions. Why?
Sample tested in nwjs-sdk-v0.20.3-linux-x64 on Linux Mint 18.
[webviewtest.zip](https://github.com/nwjs/nw.js/files/801561/webviewtest.zip)
|
1.0
|
webview does not work correctly when "node" added to Chrome app manifest - When "node" is added in the permissions in manifest.json in a Chrome App, a webview is not working immediately after the app has started. For example, the request handlers are undefined.
If you add a delay of say 500 ms, it works.
In the attached example it will print the stringified request handlers of the webview 1) just after startup and 2) after a delay. Only the delayed print will show the handlers.
If you remove node from the permissions, both are printed out correctly.
It is probably the whole UI that is delayed by adding node permission.
Also, the font changes when adding node to permissions. Why?
Sample tested in nwjs-sdk-v0.20.3-linux-x64 on Linux Mint 18.
[webviewtest.zip](https://github.com/nwjs/nw.js/files/801561/webviewtest.zip)
|
test
|
webview does not work correctly when node added to chrome app manifest when node is added in the permissions in manifest json in a chrome app a webview is not working immediately after the app has started for example the request handlers are undefined if you add a delay of say ms it works in the attached example it will print the stringified request handlers of the webview just after startup and after a delay only the delayed print will show the handlers if you remove node from the permissions both are printed out correctly it is probably the whole ui that is delayed by adding node permission also the font changes when adding node to permissions why sample tested in nwjs sdk linux on linux mint
| 1
|
82,266
| 10,237,650,035
|
IssuesEvent
|
2019-08-19 14:19:20
|
Shopify/polaris-react
|
https://api.github.com/repos/Shopify/polaris-react
|
closed
|
[Button] Add “pressed” state
|
⚗️ Development 🎨 Design
|
## Problem
Sometimes buttons are used as radio buttons, usually in a button group with segmented buttons, but potentially on their own (as a toggle).
This has been implemented in polaris-rails: https://github.com/Shopify/polaris-ux/issues/58
## Examples
<img width="315" alt="screen shot 2017-10-12 at 6 21 06 pm" src="https://user-images.githubusercontent.com/804014/31557049-5590d798-b015-11e7-92a0-bc040d5d19af.png">
Re-created from the previous polaris-react repo.
|
1.0
|
[Button] Add “pressed” state - ## Problem
Sometimes buttons are used as radio buttons, usually in a button group with segmented buttons, but potentially on their own (as a toggle).
This has been implemented in polaris-rails: https://github.com/Shopify/polaris-ux/issues/58
## Examples
<img width="315" alt="screen shot 2017-10-12 at 6 21 06 pm" src="https://user-images.githubusercontent.com/804014/31557049-5590d798-b015-11e7-92a0-bc040d5d19af.png">
Re-created from the previous polaris-react repo.
|
non_test
|
add “pressed” state problem sometimes buttons are used as a radio button usually in a button group with segmented buttons but potentially on their own as a toggle this has been implemented in polaris rails examples img width alt screen shot at pm src re created from the previous polaris react repo
| 0
|
124,321
| 12,228,657,953
|
IssuesEvent
|
2020-05-03 20:22:23
|
brunoarueira/rss-hub-backend
|
https://api.github.com/repos/brunoarueira/rss-hub-backend
|
closed
|
Improve README
|
documentation
|
The README needs some attention to describe the purpose of this project, which technologies will be used, and how someone can contribute.
|
1.0
|
Improve README - The README needs some attention to describe the purpose of this project, which technologies will be used, and how someone can contribute.
|
non_test
|
improve readme the readme needs some attention to describe which is the purpose of this project which technologies will be used and how someone can contribute to
| 0
|
48,406
| 20,144,144,686
|
IssuesEvent
|
2022-02-09 04:37:13
|
Solemates-Turing2108/frontend
|
https://api.github.com/repos/Solemates-Turing2108/frontend
|
opened
|
service: API
|
service
|
A service to be used throughout the app to provide interfacing between the front end and back end.
AC:
- Functions/Methods are used to avoid components touching business logic
|
1.0
|
service: API - A service to be used throughout the app to provide interfacing between the front end and back end.
AC:
- Functions/Methods are used to avoid components touching business logic
|
non_test
|
service api a service to be used throughout the app to provide interfacing between the front end and back end ac functions methods are used to avoid components touching business logic
| 0
|
155,920
| 19,803,121,613
|
IssuesEvent
|
2022-01-19 01:31:11
|
ChoeMinji/HtmlUnit-2.37.0
|
https://api.github.com/repos/ChoeMinji/HtmlUnit-2.37.0
|
opened
|
CVE-2022-23307 (Medium) detected in log4j-1.2.12.jar
|
security vulnerability
|
## CVE-2022-23307 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-1.2.12.jar</b></p></summary>
<p></p>
<p>Path to vulnerable library: /src/test/resources/libraries/DWR/2.0.5/WEB-INF/lib/log4j-1.2.12.jar</p>
<p>
Dependency Hierarchy:
- :x: **log4j-1.2.12.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
CVE-2020-9493 identified a deserialization issue that was present in Apache Chainsaw. Prior to Chainsaw V2.0 Chainsaw was a component of Apache Log4j 1.2.x where the same issue exists.
<p>Publish Date: 2022-01-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23307>CVE-2022-23307</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-23307 (Medium) detected in log4j-1.2.12.jar - ## CVE-2022-23307 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-1.2.12.jar</b></p></summary>
<p></p>
<p>Path to vulnerable library: /src/test/resources/libraries/DWR/2.0.5/WEB-INF/lib/log4j-1.2.12.jar</p>
<p>
Dependency Hierarchy:
- :x: **log4j-1.2.12.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
CVE-2020-9493 identified a deserialization issue that was present in Apache Chainsaw. Prior to Chainsaw V2.0 Chainsaw was a component of Apache Log4j 1.2.x where the same issue exists.
<p>Publish Date: 2022-01-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-23307>CVE-2022-23307</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in jar cve medium severity vulnerability vulnerable library jar path to vulnerable library src test resources libraries dwr web inf lib jar dependency hierarchy x jar vulnerable library found in base branch master vulnerability details cve identified a deserialization issue that was present in apache chainsaw prior to chainsaw chainsaw was a component of apache x where the same issue exists publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href step up your open source security game with whitesource
| 0
|
311,131
| 26,770,005,942
|
IssuesEvent
|
2023-01-31 13:29:04
|
SUNET/eduid-front
|
https://api.github.com/repos/SUNET/eduid-front
|
closed
|
DASHBOARD NAV: notification tips text state for unit test
|
testing
|
#### Description of issue:
To ensure correct data for different conditions, we will add a DashboardNav-test
|
1.0
|
DASHBOARD NAV: notification tips text state for unit test - #### Description of issue:
To ensure correct data for different conditions, we will add a DashboardNav-test
|
test
|
dashboard nav notification tips text state for unit test description of issue to ensure correct data for different conditions we will add a dashboardnav test
| 1
|
83,183
| 10,329,920,510
|
IssuesEvent
|
2019-09-02 13:25:57
|
vector-im/riotX-android
|
https://api.github.com/repos/vector-im/riotX-android
|
opened
|
blacklist/unblacklist devices
|
feature:e2e legacy-feature need-design
|
In stabilization because it's a missing functionality regarding other Riot clients
|
1.0
|
blacklist/unblacklist devices - In stabilization because it's a missing functionality regarding other Riot clients
|
non_test
|
blacklist unblacklist devices in stabilization because it s a missing functionality regarding other riot clients
| 0
|
104,558
| 22,691,523,909
|
IssuesEvent
|
2022-07-04 21:12:28
|
vatro/svelthree
|
https://api.github.com/repos/vatro/svelthree
|
opened
|
`SvelthreeInteraction` properly type emitted `CustomEvents`
|
general interaction code quality
|
Currently those are just `CustomEvents` not revealing anything about the contents of the `detail` object.
|
1.0
|
`SvelthreeInteraction` properly type emitted `CustomEvents` - Currently those are just `CustomEvents` not revealing anything about the contents of the `detail` object.
|
non_test
|
svelthreeinteraction properly type emitted customevents currently those are just customevents not revealing anything about the contents of the detail object
| 0
|
148,771
| 11,864,417,190
|
IssuesEvent
|
2020-03-25 21:38:12
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
opened
|
Netflix does not work due to Widevine Content Decryption Module error in brave://components
|
OS/Windows QA/Test-Plan-Specified QA/Yes intermittent-issue plugin/Widevine
|
Follow up to https://github.com/brave/brave-browser/issues/4646
We do not recover gracefully when `Widevine Content Decryption Module` fails to download.
Browser needs to be restarted to trigger the update.
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean profile
2. Navigate to netflix.com and login.
3. Make `Widevine Content Decryption Module` fail to download (TODO: Determine how)
4. Open `brave://components` and check `Widevine Content Decryption Module` status
5. Stream a video
## Actual result:
<!--Please add screenshots if needed-->
`Widevine Content Decryption Module` error and the module stays at version 0.0.0.0

## Expected result:
`Widevine Content Decryption Module`
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
10% repro rate
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 1.8.36 Chromium: 81.0.4044.69 (Official Build) nightly (64-bit)
-- | --
Revision | 6813546031a4bc83f717a2ef7cd4ac6ec1199132-refs/branch-heads/4044@{#776}
OS | Windows 7 Service Pack 1 (Build 7601.24544)
cc @simonhong @brave/legacy_qa @bsclifton
|
1.0
|
Netflix does not work due to Widevine Content Decryption Module error in brave://components - Follow up to https://github.com/brave/brave-browser/issues/4646
We do not recover gracefully when `Widevine Content Decryption Module` fails to download.
Browser needs to be restarted to trigger the update.
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Clean profile
2. Navigate to netflix.com and login.
3. Make `Widevine Content Decryption Module` fail to download (TODO: Determine how)
4. Open `brave://components` and check `Widevine Content Decryption Module` status
5. Stream a video
## Actual result:
<!--Please add screenshots if needed-->
`Widevine Content Decryption Module` error and the module stays at version 0.0.0.0

## Expected result:
`Widevine Content Decryption Module`
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
10% repro rate
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
Brave | 1.8.36 Chromium: 81.0.4044.69 (Official Build) nightly (64-bit)
-- | --
Revision | 6813546031a4bc83f717a2ef7cd4ac6ec1199132-refs/branch-heads/4044@{#776}
OS | Windows 7 Service Pack 1 (Build 7601.24544)
cc @simonhong @brave/legacy_qa @bsclifton
|
test
|
netflix does not work due to widevine content decryption module error in brave components follow up to we do not recover gracefully when widevine content decryption module fails to download browser needs to be restarted to trigger the update steps to reproduce clean profile navigate to netflix com and login make widevine content decryption module fail to download todo determine how open brave components and check widevine content decryption module status stream a video actual result widevine content decryption module error and the module stays at version expected result widevine content decryption module reproduces how often repro rate brave version brave version info brave chromium official build nightly bit revision refs branch heads os windows service pack build cc simonhong brave legacy qa bsclifton
| 1
|
292,352
| 25,206,805,590
|
IssuesEvent
|
2022-11-13 19:35:45
|
MinhazMurks/Bannerlord.Tweaks
|
https://api.github.com/repos/MinhazMurks/Bannerlord.Tweaks
|
opened
|
Test Quality of Recruitment Balancing
|
testing
|
Test to see if tweak: "Quality of Recruitment Balancing" works
|
1.0
|
Test Quality of Recruitment Balancing - Test to see if tweak: "Quality of Recruitment Balancing" works
|
test
|
test quality of recruitment balancing test to see if tweak quality of recruitment balancing works
| 1
|
184,646
| 14,289,809,560
|
IssuesEvent
|
2020-11-23 19:51:46
|
github-vet/rangeclosure-findings
|
https://api.github.com/repos/github-vet/rangeclosure-findings
|
closed
|
barakmich/go_sse2: src/sync/atomic/value_test.go; 30 LoC
|
fresh small test
|
Found a possible issue in [barakmich/go_sse2](https://www.github.com/barakmich/go_sse2) at [src/sync/atomic/value_test.go](https://github.com/barakmich/go_sse2/blob/a6a26455d4f4f81cfe89b1ec3261da5b80fa96aa/src/sync/atomic/value_test.go#L92-L121)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/barakmich/go_sse2/blob/a6a26455d4f4f81cfe89b1ec3261da5b80fa96aa/src/sync/atomic/value_test.go#L92-L121)
<details>
<summary>Click here to show the 30 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range tests {
var v Value
done := make(chan bool, p)
for i := 0; i < p; i++ {
go func() {
r := rand.New(rand.NewSource(rand.Int63()))
expected := true
loop:
for j := 0; j < N; j++ {
x := test[r.Intn(len(test))]
v.Store(x)
x = v.Load()
for _, x1 := range test {
if x == x1 {
continue loop
}
}
t.Logf("loaded unexpected value %+v, want %+v", x, test)
expected = false
break
}
done <- expected
}()
}
for i := 0; i < p; i++ {
if !<-done {
t.FailNow()
}
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: a6a26455d4f4f81cfe89b1ec3261da5b80fa96aa
|
1.0
|
barakmich/go_sse2: src/sync/atomic/value_test.go; 30 LoC -
Found a possible issue in [barakmich/go_sse2](https://www.github.com/barakmich/go_sse2) at [src/sync/atomic/value_test.go](https://github.com/barakmich/go_sse2/blob/a6a26455d4f4f81cfe89b1ec3261da5b80fa96aa/src/sync/atomic/value_test.go#L92-L121)
The below snippet of Go code triggered static analysis which searches for goroutines and/or defer statements
which capture loop variables.
[Click here to see the code in its original context.](https://github.com/barakmich/go_sse2/blob/a6a26455d4f4f81cfe89b1ec3261da5b80fa96aa/src/sync/atomic/value_test.go#L92-L121)
<details>
<summary>Click here to show the 30 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range tests {
var v Value
done := make(chan bool, p)
for i := 0; i < p; i++ {
go func() {
r := rand.New(rand.NewSource(rand.Int63()))
expected := true
loop:
for j := 0; j < N; j++ {
x := test[r.Intn(len(test))]
v.Store(x)
x = v.Load()
for _, x1 := range test {
if x == x1 {
continue loop
}
}
t.Logf("loaded unexpected value %+v, want %+v", x, test)
expected = false
break
}
done <- expected
}()
}
for i := 0; i < p; i++ {
if !<-done {
t.FailNow()
}
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: a6a26455d4f4f81cfe89b1ec3261da5b80fa96aa
|
test
|
barakmich go src sync atomic value test go loc found a possible issue in at the below snippet of go code triggered static analysis which searches for goroutines and or defer statements which capture loop variables click here to show the line s of go which triggered the analyzer go for test range tests var v value done make chan bool p for i i p i go func r rand new rand newsource rand expected true loop for j j n j x test v store x x v load for range test if x continue loop t logf loaded unexpected value v want v x test expected false break done expected for i i p i if done t failnow leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 1
|
5,144
| 2,762,960,904
|
IssuesEvent
|
2015-04-29 04:26:23
|
TricksterGuy/complx
|
https://api.github.com/repos/TricksterGuy/complx
|
opened
|
Write a LC3Test Xml Generator [for complx]
|
complx feature request job lc3test
|
This is actually two jobs.
1. A simple command line interface that just generates the test xml files.
Design it however you want; some ideas:
* You can read the .asm file and automatically determine what the input and output is via looking at the symbol table and then ask the user for test input and output values
* Just ask the user several questions (like input symbol name, what is the type of it, etc) and then ask for test input output values
2. Integrating this in complx
|
1.0
|
Write a LC3Test Xml Generator [for complx] - This is actually two jobs.
1. A simple command line interface that just generates the test xml files.
Design it however you want; some ideas:
* You can read the .asm file and automatically determine what the input and output is via looking at the symbol table and then ask the user for test input and output values
* Just ask the user several questions (like input symbol name, what is the type of it, etc) and then ask for test input output values
2. Integrating this in complx
|
test
|
write a xml generator this is actually two jobs a simple command line interface that just generates the test xml files design it however you want some ideas you can read the asm file and automatically determine what the input and output is via looking at the symbol table and then ask the user for test input and output values just ask the user several questions like input symbol name what is the type of it etc and then ask for test input output values integrating this in complx
| 1
|
181,288
| 14,014,296,971
|
IssuesEvent
|
2020-10-29 11:42:11
|
publiclab/image-sequencer
|
https://api.github.com/repos/publiclab/image-sequencer
|
closed
|
adding tests for CLI functionality - brainstorming
|
discussion help wanted testing
|
Trying to think through the best way to do better CLI testing. Right now we have only: https://github.com/publiclab/image-sequencer/blob/440c3e0ad0ab081bdcc7dfff0d000f52a0902035/test/core/cli.js which is pretty non-existent.
Noting that we encountered this in https://github.com/publiclab/image-sequencer/issues/659 but did not at that time implement tests.
Ok, so I had thought this was the way to go, installing https://github.com/sstephenson/bats (blog https://medium.com/@pimterry/testing-your-shell-scripts-with-bats-abfca9bdc5b9) and comparing images like perhaps:
```js
const looksSame = require('looks-same');
looksSame(process.argv[2], process.argv[3], function(error, {equal}) {
// equal will be true, if images looks the same
console.log(equal ? 1 : 0);
});
```
But actually maybe it's just easier to use one of our existing JS tests and run our CLI tests from inside node, like this:
```js
const { exec } = require('child_process');
var yourscript = exec('sh hi.sh',
(error, stdout, stderr) => {
console.log(stdout);
console.log(stderr);
if (error !== null) {
console.log(`exec error: ${error}`);
}
});
```
https://stackoverflow.com/questions/44647778/how-to-run-shell-script-file-using-nodejs#44667294 has some guidance on this.
Is there a standard way to write CLI tests for `commander.js`?
UPDATE: yes, it looks like there are some good ways: https://github.com/tj/commander.js/issues/438
|
1.0
|
adding tests for CLI functionality - brainstorming - Trying to think through the best way to do better CLI testing. Right now we have only: https://github.com/publiclab/image-sequencer/blob/440c3e0ad0ab081bdcc7dfff0d000f52a0902035/test/core/cli.js which is pretty non-existent.
Noting that we encountered this in https://github.com/publiclab/image-sequencer/issues/659 but did not at that time implement tests.
Ok, so I had thought this was the way to go, installing https://github.com/sstephenson/bats (blog https://medium.com/@pimterry/testing-your-shell-scripts-with-bats-abfca9bdc5b9) and comparing images like perhaps:
```js
const looksSame = require('looks-same');
looksSame(process.argv[2], process.argv[3], function(error, {equal}) {
// equal will be true, if images looks the same
console.log(equal ? 1 : 0);
});
```
But actually maybe it's just easier to use one of our existing JS tests and run our CLI tests from inside node, like this:
```js
const { exec } = require('child_process');
var yourscript = exec('sh hi.sh',
(error, stdout, stderr) => {
console.log(stdout);
console.log(stderr);
if (error !== null) {
console.log(`exec error: ${error}`);
}
});
```
https://stackoverflow.com/questions/44647778/how-to-run-shell-script-file-using-nodejs#44667294 has some guidance on this.
Is there a standard way to write CLI tests for `commander.js`?
UPDATE: yes, it looks like there are some good ways: https://github.com/tj/commander.js/issues/438
|
test
|
adding tests for cli functionality brainstorming trying to think through the best way to do better cli testing right now we have only which is pretty non existent noting that we encountered this in but did not at that time implement tests ok so i had thought this was the way to go installing blog and comparing images like perhaps js lookssame require looks same lookssame process argv process argv function error equal equal will be true if images looks the same console log equal but actually maybe it s just easier to use one of our existing js tests and running our cli tests from inside node like this js const exec require child process var yourscript exec sh hi sh error stdout stderr console log stdout console log stderr if error null console log exec error error has some guidance on this is there a standard way to write cli tests for commander js update yes it looks like there are some good ways
| 1
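The child-process idea above — shelling out to the CLI from inside the normal test runner and asserting on what comes back — is the same pattern in any language; a minimal sketch (here `echo` stands in for the real image-sequencer CLI):

```python
import subprocess

def run_cli(argv):
    """Run a command-line tool and capture exit code, stdout and stderr."""
    result = subprocess.run(argv, capture_output=True, text=True)
    return result.returncode, result.stdout, result.stderr

# `echo` is a stand-in; a real suite would invoke the project's CLI binary
# and compare its output (or an output image) against a fixture.
code, out, err = run_cli(["echo", "hello"])
assert code == 0
assert out.strip() == "hello"
```

The image-comparison step from the issue would then run on the files the CLI wrote, rather than on captured stdout.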
|
140,311
| 11,309,157,740
|
IssuesEvent
|
2020-01-19 11:14:22
|
qmetry/qaf
|
https://api.github.com/repos/qmetry/qaf
|
closed
|
Random data and expression in variable interpolation support
|
configuration feature testdata
|
Till 2.1.15, variable interpolation looks up available properties. With this feature it should support a random value or an expression as a variable/parameter, in addition to a property as a variable.
Examples:
```
${rnd:aaa-aaa-aaa}
${rnd:99999}
${expr:java.time.Instant.now()}
${expr:java.lang.System.currentTimeMillis()}
${expr:java.util.UUID.randomUUID()}
${expr:com.qmetry.qaf.automation.util.DateUtil.getDate(0, 'MM/dd/yyyy')}
```
|
1.0
|
Random data and expression in variable interpolation support - Till 2.1.15, variable interpolation looks up available properties. With this feature it should support a random value or an expression as a variable/parameter, in addition to a property as a variable.
Examples:
```
${rnd:aaa-aaa-aaa}
${rnd:99999}
${expr:java.time.Instant.now()}
${expr:java.lang.System.currentTimeMillis()}
${expr:java.util.UUID.randomUUID()}
${expr:com.qmetry.qaf.automation.util.DateUtil.getDate(0, 'MM/dd/yyyy')}
```
|
test
|
random data and expression in variable interpolation support till variable interpolation looks up for available properties with this feature it should support random value and expression as variable parameter in addition to property as variable examples rnd aaa aaa aaa rnd expr java time instant now expr java lang system currenttimemillis expr java util uuid randomuuid expr com qmetry qaf automation util dateutil getdate mm dd yyyy
| 1
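The placeholder styles in the examples above can be illustrated with a toy resolver: in an `rnd:` pattern, `a` becomes a random letter and `9` a random digit, while `expr:` evaluates an expression. This is only a sketch of the idea — QAF itself resolves Java expressions, and the `eval` here is for illustration, never for untrusted input:

```python
import random
import re
import string

def resolve(template):
    """Resolve ${rnd:...} and ${expr:...} placeholders in a string."""
    def sub(match):
        kind, arg = match.group(1), match.group(2)
        if kind == "rnd":
            # 'a' -> random lowercase letter, '9' -> random digit, rest unchanged.
            return "".join(
                random.choice(string.ascii_lowercase) if c == "a"
                else random.choice(string.digits) if c == "9"
                else c
                for c in arg
            )
        # 'expr' evaluates a Python expression here; QAF evaluates Java.
        return str(eval(arg))  # illustration only -- never eval untrusted input
    return re.sub(r"\$\{(rnd|expr):([^}]*)\}", sub, template)

print(resolve("id=${rnd:aaa-999}"))   # e.g. id=qkx-482
print(resolve("sum=${expr:2 + 3}"))   # sum=5
```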
|
195,405
| 14,728,309,866
|
IssuesEvent
|
2021-01-06 09:47:08
|
OpenPaaS-Suite/esn-frontend-inbox
|
https://api.github.com/repos/OpenPaaS-Suite/esn-frontend-inbox
|
closed
|
As a user, I want to see my folders list in a tree view
|
QA:Testing enhancement
|
Use a feature flag, and the user can enable the tree view in configuration. This feature flag is recorded in the browser's localStorage.
|
1.0
|
As a user, I want to see my folders list in a tree view - Use a feature flag, and the user can enable the tree view in configuration. This feature flag is recorded in the browser's localStorage.
|
test
|
as a user i want to see my folders list in a tree view use a feature flag and user can enable the tree view in configuration this feature flag is recorded in broswer s localstorage
| 1
|
261,395
| 22,743,253,716
|
IssuesEvent
|
2022-07-07 06:47:34
|
kubernetes-sigs/cluster-api
|
https://api.github.com/repos/kubernetes-sigs/cluster-api
|
closed
|
Adjust clusterctl upgrade jobs for v1.3
|
kind/feature area/testing
|
Once we start development of CAPI v1.3 we have to adjust our clusterctl upgrade test jobs.
The goal is to have jobs which test clusterctl upgrade from the latest release of each currently supported contract version (v1alpha3 (v0.3.x), v1alpha4 (v0.4.x) and v1beta1 (v1.2.x)).
We already have jobs for v0.3, v0.4 and v1.1. As we don't need the job for v1.1 anymore we can just change it for v1.2:
Job: [periodic-cluster-api-e2e-upgrade-v1-1-to-main](https://github.com/kubernetes/test-infra/blob/master/config/jobs/kubernetes-sigs/cluster-api/cluster-api-periodics-main.yaml#L100-L141)
* Change name to "periodic-cluster-api-e2e-upgrade-v1-2-to-main"
* INIT_WITH_BINARY should use v1.2.0
* Change testgrid-tab-name to "capi-e2e-upgrade-v1-2-to-main"
/kind feature
/area testing
|
1.0
|
Adjust clusterctl upgrade jobs for v1.3 - Once we start development of CAPI v1.3 we have to adjust our clusterctl upgrade test jobs.
The goal is to have jobs which test clusterctl upgrade from the latest release of each currently supported contract version (v1alpha3 (v0.3.x), v1alpha4 (v0.4.x) and v1beta1 (v1.2.x)).
We already have jobs for v0.3, v0.4 and v1.1. As we don't need the job for v1.1 anymore we can just change it for v1.2:
Job: [periodic-cluster-api-e2e-upgrade-v1-1-to-main](https://github.com/kubernetes/test-infra/blob/master/config/jobs/kubernetes-sigs/cluster-api/cluster-api-periodics-main.yaml#L100-L141)
* Change name to "periodic-cluster-api-e2e-upgrade-v1-2-to-main"
* INIT_WITH_BINARY should use v1.2.0
* Change testgrid-tab-name to "capi-e2e-upgrade-v1-2-to-main"
/kind feature
/area testing
|
test
|
adjust clusterctl upgrade jobs for once we start development of capi we have to adjust our clusterctl upgrade test jobs the goal is to have jobs which test clusterctl upgrade from the latest release of each currently supported contract version x x and x we already have jobs for and as we don t need the job for anymore we can just change it for job change name to periodic cluster api upgrade to main init with binary should use change testgrid tab name to capi upgrade to main kind feature area testing
| 1
|
3,984
| 6,813,769,202
|
IssuesEvent
|
2017-11-06 10:30:29
|
TEDxYouthJPIS/main
|
https://api.github.com/repos/TEDxYouthJPIS/main
|
reopened
|
Nomination form display error on mobile
|
bug mobile-compatibility
|
The Google Form is not displayed at a smaller size on iPhone; it extends to its full size. Change the code of the nomination HTML page. Look for help online with similar issues


|
True
|
Nomination form display error on mobile - The Google Form is not displayed at a smaller size on iPhone; it extends to its full size. Change the code of the nomination HTML page. Look for help online with similar issues


|
non_test
|
nomination form display error on mobile the google form is not being seen in a smaller size on iphone it s extending to the full limits change the code of the nomination html page look for help online with similar issues
| 0
|
40,928
| 16,596,913,772
|
IssuesEvent
|
2021-06-01 14:28:00
|
Azure/azure-cli
|
https://api.github.com/repos/Azure/azure-cli
|
closed
|
az deployment group what-if is failing after release 2.24.0
|
ARM Service Attention
|
### **This is autogenerated. Please review and update as needed.**
## Describe the bug
**Command Name**
`az deployment group what-if`
**Errors:**
```
'str' object has no attribute 'value'
```
## To Reproduce:
Steps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.
- _Put any pre-requisite steps here..._
- `az deployment group what-if -f {} -p {} -g {}`
## Expected Behavior
## Environment Summary
```
macOS-11.4-x86_64-i386-64bit
Python 3.8.10
Installer: HOMEBREW
azure-cli 2.24.0
Extensions:
aks-preview 0.5.14
azure-devops 0.18.0
```
## Additional Context
<!--Please don't remove this:-->
<!--auto-generated-->
|
1.0
|
az deployment group what-if is failing after release 2.24.0 - ### **This is autogenerated. Please review and update as needed.**
## Describe the bug
**Command Name**
`az deployment group what-if`
**Errors:**
```
'str' object has no attribute 'value'
```
## To Reproduce:
Steps to reproduce the behavior. Note that argument values have been redacted, as they may contain sensitive information.
- _Put any pre-requisite steps here..._
- `az deployment group what-if -f {} -p {} -g {}`
## Expected Behavior
## Environment Summary
```
macOS-11.4-x86_64-i386-64bit
Python 3.8.10
Installer: HOMEBREW
azure-cli 2.24.0
Extensions:
aks-preview 0.5.14
azure-devops 0.18.0
```
## Additional Context
<!--Please don't remove this:-->
<!--auto-generated-->
|
non_test
|
az deployment group what if is failing after release this is autogenerated please review and update as needed describe the bug command name az deployment group what if errors str object has no attribute value to reproduce steps to reproduce the behavior note that argument values have been redacted as they may contain sensitive information put any pre requisite steps here az deployment group what if f p g expected behavior environment summary macos python installer homebrew azure cli extensions aks preview azure devops additional context
| 0
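The `'str' object has no attribute 'value'` failure above is a plain AttributeError: some field the what-if code expects to be an enum member arrives as a bare string. The class of bug, and a typical defensive fix, can be sketched as follows (illustrative only — these names are not the actual azure-cli code):

```python
from enum import Enum

class ChangeType(Enum):
    CREATE = "Create"
    MODIFY = "Modify"

def describe(change_type):
    # Buggy: assumes an Enum member and crashes on a plain string with
    # AttributeError: 'str' object has no attribute 'value'
    return change_type.value

def describe_safe(change_type):
    # Defensive: accept either an Enum member or its raw string value.
    return change_type.value if isinstance(change_type, Enum) else change_type

assert describe(ChangeType.CREATE) == "Create"
assert describe_safe("Create") == "Create"
try:
    describe("Create")
except AttributeError as err:
    assert "no attribute 'value'" in str(err)
```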
|
66,763
| 7,017,574,895
|
IssuesEvent
|
2017-12-21 10:09:37
|
AdrianAntonGarcia/-TFG-UBUSetas
|
https://api.github.com/repos/AdrianAntonGarcia/-TFG-UBUSetas
|
closed
|
S14-17. Create the integration tests for the show mushroom information activity.
|
Android Test
|
The integration tests for the show mushroom information activity will be created in order to test the functionality of this activity.
|
1.0
|
S14-17. Create the integration tests for the show mushroom information activity. - The integration tests for the show mushroom information activity will be created in order to test the functionality of this activity.
|
test
|
create the integration tests for the show mushroom information activity the integration tests for the show mushroom information activity will be created in order to test the functionality of this activity
| 1
|
188,900
| 14,478,878,959
|
IssuesEvent
|
2020-12-10 09:03:14
|
kalexmills/github-vet-tests-dec2020
|
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
|
closed
|
code4pantanal/onibus.ms: internal/router/ponto_test.go; 3 LoC
|
fresh test tiny
|
Found a possible issue in [code4pantanal/onibus.ms](https://www.github.com/code4pantanal/onibus.ms) at [internal/router/ponto_test.go](https://github.com/code4pantanal/onibus.ms/blob/f771376df274eda5d44f4f298b4e1c4c670aab3e/internal/router/ponto_test.go#L43-L45)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call at line 44 may store a reference to ponto
[Click here to see the code in its original context.](https://github.com/code4pantanal/onibus.ms/blob/f771376df274eda5d44f4f298b4e1c4c670aab3e/internal/router/ponto_test.go#L43-L45)
<details>
<summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary>
```go
for _, ponto := range pontos {
store.PontoStore().Insere(&ponto)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: f771376df274eda5d44f4f298b4e1c4c670aab3e
|
1.0
|
code4pantanal/onibus.ms: internal/router/ponto_test.go; 3 LoC -
Found a possible issue in [code4pantanal/onibus.ms](https://www.github.com/code4pantanal/onibus.ms) at [internal/router/ponto_test.go](https://github.com/code4pantanal/onibus.ms/blob/f771376df274eda5d44f4f298b4e1c4c670aab3e/internal/router/ponto_test.go#L43-L45)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call at line 44 may store a reference to ponto
[Click here to see the code in its original context.](https://github.com/code4pantanal/onibus.ms/blob/f771376df274eda5d44f4f298b4e1c4c670aab3e/internal/router/ponto_test.go#L43-L45)
<details>
<summary>Click here to show the 3 line(s) of Go which triggered the analyzer.</summary>
```go
for _, ponto := range pontos {
store.PontoStore().Insere(&ponto)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: f771376df274eda5d44f4f298b4e1c4c670aab3e
|
test
|
onibus ms internal router ponto test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call at line may store a reference to ponto click here to show the line s of go which triggered the analyzer go for ponto range pontos store pontostore insere ponto leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 1
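The analyzer's warning above is about `&ponto` pointing at the single loop variable that every iteration reuses, so all stored pointers may end up aliasing the last element. Python has a close cousin of this pitfall — closures capturing the loop variable — which sketches the same failure mode and fix:

```python
# Buggy: each lambda closes over the variable i itself, not its current
# value, so after the loop every callback sees the final element.
callbacks = [lambda: i for i in [1, 2, 3]]
assert [f() for f in callbacks] == [3, 3, 3]

# Fixed: bind the current value each iteration. The Go analog is copying
# the loop variable first (p := ponto) and storing &p instead of &ponto.
callbacks = [lambda v=i: v for i in [1, 2, 3]]
assert [f() for f in callbacks] == [1, 2, 3]
```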
|
180,522
| 13,934,853,327
|
IssuesEvent
|
2020-10-22 10:37:32
|
kyma-project/console
|
https://api.github.com/repos/kyma-project/console
|
closed
|
UI Test for opening externally linked cluster microfrontends
|
area/console area/quality stale test-missing
|
**Description**
Implement an automated test that will test opening CMFs with an external link attribute (e.g. `Stats & Metrics`, `Tracing`)
**Reasons**
To increase e2e test coverage and deploy new features with confidence
|
1.0
|
UI Test for opening externally linked cluster microfrontends - **Description**
Implement an automated test that will test opening CMFs with an external link attribute (e.g. `Stats & Metrics`, `Tracing`)
**Reasons**
To increase e2e test coverage and deploy new features with confidence
|
test
|
ui test for opening externally linked cluster microfrontends description implement an automated test that will test opening cmfs with external link attribute i e stats metrics tracing reasons to increase test coverage and deploy new features with confidence
| 1
|
8,545
| 22,826,998,070
|
IssuesEvent
|
2022-07-12 09:27:38
|
arduino/arduino-cli
|
https://api.github.com/repos/arduino/arduino-cli
|
closed
|
panic: runtime error when running the CLI daemon on my Pi
|
os: linux architecture: armv7 type: imperfection topic: gRPC
|
## Bug Report
### Current behavior
<!-- Paste the full command you run -->
I am trying to use the Arduino CLI daemon on `armv7l` over gRPC. I had this error in the Pro IDE:
```
daemon INFO panic: runtime error: invalid memory address or nil pointer dereference
daemon INFO [signal SIGSEGV: segmentation violation code=0x1 addr=0x4 pc=0x12c28]
daemon INFO goroutine 1 [running]:
daemon INFO github.com/segmentio/stats/v4/prometheus.(*Handler).HandleMeasures(0xec44d8, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x21f1300, 0x1, 0x1)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/prometheus/handler.go:96 +0x1d8
daemon INFO github.com/segmentio/stats/v4.(*Engine).measure(0x21f11c0, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x85f175, 0x6, 0x785f10, 0x985a50, ...)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:155 +0x2b4
daemon INFO github.com/segmentio/stats/v4.(*Engine).Add(0x21f11c0, 0x85f175, 0x6, 0x785f10, 0x985a50, 0x2155e80, 0x1, 0x1)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:94 +0x78
daemon INFO github.com/segmentio/stats/v4.(*Engine).Incr(...)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:84
daemon INFO github.com/segmentio/stats/v4.Incr(...)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:251
daemon INFO github.com/arduino/arduino-cli/cli/daemon.runDaemonCommand(0x21f74a0, 0x21ea630, 0x0, 0x5)
daemon INFO /__w/arduino-cli/arduino-cli/cli/daemon/daemon.go:65 +0x7e4
daemon INFO github.com/spf13/cobra.(*Command).execute(0x21f74a0, 0x21ea600, 0x5, 0x6, 0x21f74a0, 0x21ea600)
daemon INFO /github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:846 +0x1f4
daemon INFO github.com/spf13/cobra.(*Command).ExecuteC(0x20ad760, 0x17, 0x0, 0x20000e0)
daemon INFO /github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:950 +0x26c
daemon INFO github.com/spf13/cobra.(*Command).Execute(...)
daemon INFO /github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:887
daemon INFO main.main()
daemon INFO /__w/arduino-cli/arduino-cli/main.go:31 +0x24
daemon INFO Failed to start the daemon.
daemon ERROR Error: panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x4 pc=0x12c28]
goroutine 1 [running]:
github.com/segmentio/stats/v4/prometheus.(*Handler).HandleMeasures(0xec44d8, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x21f1300, 0x1, 0x1)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/prometheus/handler.go:96 +0x1d8
github.com/segmentio/stats/v4.(*Engine).measure(0x21f11c0, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x85f175, 0x6, 0x785f10, 0x985a50, ...)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:155 +0x2b4
github.com/segmentio/stats/v4.(*Engine).Add(0x21f11c0, 0x85f175, 0x6, 0x785f10, 0x985a50, 0x2155e80, 0x1, 0x1)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:94 +0x78
github.com/segmentio/stats/v4.(*Engine).Incr(...)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:84
github.com/segmentio/stats/v4.Incr(...)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:251
github.com/arduino/arduino-cli/cli/daemon.runDaemonCommand(0x21f74a0, 0x21ea630, 0x0, 0x5)
/__w/arduino-cli/arduino-cli/cli/daemon/daemon.go:65 +0x7e4
github.com/spf13/cobra.(*Command).execute(0x21f74a0, 0x21ea600, 0x5, 0x6, 0x21f74a0, 0x21ea600)
/github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:846 +0x1f4
github.com/spf13/cobra.(*Command).ExecuteC(0x20ad760, 0x17, 0x0, 0x20000e0)
/github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:950 +0x26c
github.com/spf13/cobra.(*Command).Execute(...)
/github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:887
main.main()
/__w/arduino-cli/arduino-cli/main.go:31 +0x24
```
Disabling the `telemetry` **did** help:
```
cat ~/.arduinoProIDE/arduino-cli.yaml
board_manager:
additional_urls: []
daemon:
port: "50051"
directories:
data: /home/pi/.arduino15
downloads: /home/pi/.arduino15/staging
user: /home/pi/Arduino
logging:
file: ""
format: text
level: info
telemetry:
addr: :9090
enabled: false
```
<!-- Add a clear and concise description of the behavior. -->
### Expected behavior
<!-- Add a clear and concise description of what you expected to happen. -->
### Environment
- CLI version (output of `arduino-cli version`): `arduino-cli Version: 0.11.0 Commit: 0296f4d`
- OS and platform: `Linux raspberrypi 4.19.118-v7l+ #1311 SMP Mon Apr 27 14:26:42 BST 2020 armv7l GNU/Linux`
### Additional context
<!-- (Optional) Add any other context about the problem here. -->
I used the https://downloads.arduino.cc/arduino-cli/arduino-cli_0.11.0_Linux_ARMv7.tar.gz URL to get the CLI.
Update: I have corrected my original description; disabling the `telemetry` helped to work around the problem. I can confirm, without the telemetry, the daemon runs on ARM.
|
1.0
|
panic: runtime error when running the CLI daemon on my Pi - ## Bug Report
### Current behavior
<!-- Paste the full command you run -->
I am trying to use the Arduino CLI daemon on `armv7l` over gRPC. I had this error in the Pro IDE:
```
daemon INFO panic: runtime error: invalid memory address or nil pointer dereference
daemon INFO [signal SIGSEGV: segmentation violation code=0x1 addr=0x4 pc=0x12c28]
daemon INFO goroutine 1 [running]:
daemon INFO github.com/segmentio/stats/v4/prometheus.(*Handler).HandleMeasures(0xec44d8, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x21f1300, 0x1, 0x1)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/prometheus/handler.go:96 +0x1d8
daemon INFO github.com/segmentio/stats/v4.(*Engine).measure(0x21f11c0, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x85f175, 0x6, 0x785f10, 0x985a50, ...)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:155 +0x2b4
daemon INFO github.com/segmentio/stats/v4.(*Engine).Add(0x21f11c0, 0x85f175, 0x6, 0x785f10, 0x985a50, 0x2155e80, 0x1, 0x1)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:94 +0x78
daemon INFO github.com/segmentio/stats/v4.(*Engine).Incr(...)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:84
daemon INFO github.com/segmentio/stats/v4.Incr(...)
daemon INFO /github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:251
daemon INFO github.com/arduino/arduino-cli/cli/daemon.runDaemonCommand(0x21f74a0, 0x21ea630, 0x0, 0x5)
daemon INFO /__w/arduino-cli/arduino-cli/cli/daemon/daemon.go:65 +0x7e4
daemon INFO github.com/spf13/cobra.(*Command).execute(0x21f74a0, 0x21ea600, 0x5, 0x6, 0x21f74a0, 0x21ea600)
daemon INFO /github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:846 +0x1f4
daemon INFO github.com/spf13/cobra.(*Command).ExecuteC(0x20ad760, 0x17, 0x0, 0x20000e0)
daemon INFO /github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:950 +0x26c
daemon INFO github.com/spf13/cobra.(*Command).Execute(...)
daemon INFO /github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:887
daemon INFO main.main()
daemon INFO /__w/arduino-cli/arduino-cli/main.go:31 +0x24
daemon INFO Failed to start the daemon.
daemon ERROR Error: panic: runtime error: invalid memory address or nil pointer dereference
[signal SIGSEGV: segmentation violation code=0x1 addr=0x4 pc=0x12c28]
goroutine 1 [running]:
github.com/segmentio/stats/v4/prometheus.(*Handler).HandleMeasures(0xec44d8, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x21f1300, 0x1, 0x1)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/prometheus/handler.go:96 +0x1d8
github.com/segmentio/stats/v4.(*Engine).measure(0x21f11c0, 0x44dd7270, 0xbfba24fe, 0x1a71c0e, 0x0, 0xec44a0, 0x85f175, 0x6, 0x785f10, 0x985a50, ...)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:155 +0x2b4
github.com/segmentio/stats/v4.(*Engine).Add(0x21f11c0, 0x85f175, 0x6, 0x785f10, 0x985a50, 0x2155e80, 0x1, 0x1)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:94 +0x78
github.com/segmentio/stats/v4.(*Engine).Incr(...)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:84
github.com/segmentio/stats/v4.Incr(...)
/github/home/go/pkg/mod/github.com/segmentio/stats/v4@v4.5.3/engine.go:251
github.com/arduino/arduino-cli/cli/daemon.runDaemonCommand(0x21f74a0, 0x21ea630, 0x0, 0x5)
/__w/arduino-cli/arduino-cli/cli/daemon/daemon.go:65 +0x7e4
github.com/spf13/cobra.(*Command).execute(0x21f74a0, 0x21ea600, 0x5, 0x6, 0x21f74a0, 0x21ea600)
/github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:846 +0x1f4
github.com/spf13/cobra.(*Command).ExecuteC(0x20ad760, 0x17, 0x0, 0x20000e0)
/github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:950 +0x26c
github.com/spf13/cobra.(*Command).Execute(...)
/github/home/go/pkg/mod/github.com/spf13/cobra@v1.0.0/command.go:887
main.main()
/__w/arduino-cli/arduino-cli/main.go:31 +0x24
```
Disabling the `telemetry` **did** help:
```
cat ~/.arduinoProIDE/arduino-cli.yaml
board_manager:
additional_urls: []
daemon:
port: "50051"
directories:
data: /home/pi/.arduino15
downloads: /home/pi/.arduino15/staging
user: /home/pi/Arduino
logging:
file: ""
format: text
level: info
telemetry:
addr: :9090
enabled: false
```
<!-- Add a clear and concise description of the behavior. -->
### Expected behavior
<!-- Add a clear and concise description of what you expected to happen. -->
### Environment
- CLI version (output of `arduino-cli version`): `arduino-cli Version: 0.11.0 Commit: 0296f4d`
- OS and platform: `Linux raspberrypi 4.19.118-v7l+ #1311 SMP Mon Apr 27 14:26:42 BST 2020 armv7l GNU/Linux`
### Additional context
<!-- (Optional) Add any other context about the problem here. -->
I used the https://downloads.arduino.cc/arduino-cli/arduino-cli_0.11.0_Linux_ARMv7.tar.gz URL to get the CLI.
Update: I have corrected my original description; disabling the `telemetry` helped to work around the problem. I can confirm, without the telemetry, the daemon runs on ARM.
|
non_test
|
panic runtime error when running the cli daemon on my pi bug report current behavior i am trying to use the arduino cli daemon on over grpc i had this error in the pro ide daemon info panic runtime error invalid memory address or nil pointer dereference daemon info daemon info goroutine daemon info github com segmentio stats prometheus handler handlemeasures daemon info github home go pkg mod github com segmentio stats prometheus handler go daemon info github com segmentio stats engine measure daemon info github home go pkg mod github com segmentio stats engine go daemon info github com segmentio stats engine add daemon info github home go pkg mod github com segmentio stats engine go daemon info github com segmentio stats engine incr daemon info github home go pkg mod github com segmentio stats engine go daemon info github com segmentio stats incr daemon info github home go pkg mod github com segmentio stats engine go daemon info github com arduino arduino cli cli daemon rundaemoncommand daemon info w arduino cli arduino cli cli daemon daemon go daemon info github com cobra command execute daemon info github home go pkg mod github com cobra command go daemon info github com cobra command executec daemon info github home go pkg mod github com cobra command go daemon info github com cobra command execute daemon info github home go pkg mod github com cobra command go daemon info main main daemon info w arduino cli arduino cli main go daemon info failed to start the daemon daemon error error panic runtime error invalid memory address or nil pointer dereference goroutine github com segmentio stats prometheus handler handlemeasures github home go pkg mod github com segmentio stats prometheus handler go github com segmentio stats engine measure github home go pkg mod github com segmentio stats engine go github com segmentio stats engine add github home go pkg mod github com segmentio stats engine go github com segmentio stats engine incr github home go pkg mod github com 
segmentio stats engine go github com segmentio stats incr github home go pkg mod github com segmentio stats engine go github com arduino arduino cli cli daemon rundaemoncommand w arduino cli arduino cli cli daemon daemon go github com cobra command execute github home go pkg mod github com cobra command go github com cobra command executec github home go pkg mod github com cobra command go github com cobra command execute github home go pkg mod github com cobra command go main main w arduino cli arduino cli main go disabling the telemetry did help cat arduinoproide arduino cli yaml board manager additional urls daemon port directories data home pi downloads home pi staging user home pi arduino logging file format text level info telemetry addr enabled false expected behavior environment cli version output of arduino cli version arduino cli version commit os and platform linux raspberrypi smp mon apr bst gnu linux additional context i used the url to get the cli update i have corrected my original description disabling the telemetry helped to work around the problem i can confirm without the telemetry the daemon runs on arm
| 0
|
68,307
| 3,286,064,223
|
IssuesEvent
|
2015-10-28 23:42:33
|
dart-lang/sdk
|
https://api.github.com/repos/dart-lang/sdk
|
opened
|
Add .analysis_options support for `enableSuperMixins`.
|
Area-Analyzer Priority-Medium Type-Enhancement
|
Straw-man:
```
analyzer:
enableSuperMixins: true
```
@sethladd : are there other options we need while I'm at it?
|
1.0
|
Add .analysis_options support for `enableSuperMixins`. - Straw-man:
```
analyzer:
enableSuperMixins: true
```
@sethladd : are there other options we need while I'm at it?
|
non_test
|
add analysis options support for enablesupermixins straw man analyzer enablesupermixins true sethladd are there other options we need while i m at it
| 0
|
642,284
| 20,883,404,864
|
IssuesEvent
|
2022-03-23 00:30:24
|
mskcc/helix_filters_01
|
https://api.github.com/repos/mskcc/helix_filters_01
|
closed
|
need to run test cases inside container
|
enhancement low priority
|
Right now the test case runner recipes run in the current terminal session; they need to be modified so that they execute inside the Docker and Singularity containers instead
|
1.0
|
need to run test cases inside container - Right now the test case runner recipes run in the current terminal session; they need to be modified so that they execute inside the Docker and Singularity containers instead
|
non_test
|
need to run test cases inside container right now the test case runner recipes run in the current terminal session need to modify them so that they execute inside the docker and singularity containers instead
| 0
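One way to approach this is to wrap each recipe command in a container invocation instead of executing it directly in the terminal session. A minimal sketch of the wrapping logic (the image names and mount flags are illustrative; real recipes would supply their own):

```python
def containerize(command, image, engine="docker", workdir="/work"):
    """Wrap a shell command so it runs inside a container.

    Returns the argv list a caller would hand to subprocess.run; the
    image names and flags here are illustrative.
    """
    if engine == "docker":
        return ["docker", "run", "--rm", "-v", f"{workdir}:{workdir}",
                "-w", workdir, image, "bash", "-c", command]
    if engine == "singularity":
        return ["singularity", "exec", image, "bash", "-c", command]
    raise ValueError(f"unknown engine: {engine}")

print(" ".join(containerize("make test", "example/pipeline:latest")))
```

Keeping the wrapper as pure argv construction makes it easy to unit-test without Docker or Singularity installed.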
|
87,102
| 8,058,898,758
|
IssuesEvent
|
2018-08-02 20:02:40
|
rancher/rancher
|
https://api.github.com/repos/rancher/rancher
|
closed
|
Rancher Logging does not take custom Docker root dir into account
|
area/tools kind/bug priority/-1 status/resolved status/to-test team/cn version/2.0
|
Due to server storage restrictions, my team needs to store anything large on our servers in `/app`, including the docker installation and `/var/log` directories, and then symlink to them from the original location. This seems to cause issues when enabling cluster logging in rancher. I dug into the issue, and saw the following logs for the fluentd container on my agent for each of my containers:
1: `2018-05-04 19:51:34 +0000 [warn]: #0 /var/lib/rancher/rke/log/kubelet_2512018f1c0e124fe6356cb915db9973ec5e24f2a0e77536274f661d9335706c.log unreadable. It is excluded and would be examined next time.`
AND
2: `2018-05-04 20:09:10 +0000 [warn]: #0 /var/log/containers/redis-7bd4689f65-q8w7r_default_redis-6272889bf0dfdc2609c1d05a8fd53115fc4352e6fbfb9409f343488889422759.log unreadable. It is excluded and would be examined next time.`
After seeing this I went into the container and dug down to see why it was not able to read the file. For example 2 above, the end symlink was trying to point to `/app/docker/containers`.... (the correct location on my host of the docker install/log location). The issue with this is that `/app/docker/containers` is not bind mounted for the fluentd container, only `/var/lib/docker` is bind mounted.
My proposed fix is to add the docker root dir, found from running `docker info`, as a bind mount when spinning up the fluentd container.
Thoughts?
**Rancher versions:**
2.0.0
**Infrastructure Stack versions:**
kubernetes (if applicable): 1.10
**Docker version: (`docker version`,`docker info` preferred)**
17.03
**Operating system and kernel: (`cat /etc/os-release`, `uname -r` preferred)**
centos 7
**Type/provider of hosts: (VirtualBox/Bare-metal/AWS/GCE/DO)**
bare-metal
**Setup details: (single node rancher vs. HA rancher, internal DB vs. external DB)**
single server with 3 agents on a single cluster
**Environment Template: (Cattle/Kubernetes/Swarm/Mesos)**
k8
**Steps to Reproduce:**
on your agent:
- Install docker
- move the docker installation to another dir
- sym link /var/lib/docker to your new location
- start rancher agent as normal on that machine and add it to your cluster
- configure cluster logging in rancher UI to send to a valid kafka endpoint (other methods may work as well not sure if this matters)
- view logs of fluentd container started on the agent
**Results:**
|
1.0
|
Rancher Logging does not take custom Docker root dir into account - Due to server storage restrictions, my team needs to store anything large on our servers in `/app`, including the docker installation and `/var/log` directories, and then symlink to them from the original location. This seems to cause issues when enabling cluster logging in rancher. I dug into the issue, and saw the following logs for the fluentd container on my agent for each of my containers:
1: `2018-05-04 19:51:34 +0000 [warn]: #0 /var/lib/rancher/rke/log/kubelet_2512018f1c0e124fe6356cb915db9973ec5e24f2a0e77536274f661d9335706c.log unreadable. It is excluded and would be examined next time.`
AND
2: `2018-05-04 20:09:10 +0000 [warn]: #0 /var/log/containers/redis-7bd4689f65-q8w7r_default_redis-6272889bf0dfdc2609c1d05a8fd53115fc4352e6fbfb9409f343488889422759.log unreadable. It is excluded and would be examined next time.`
After seeing this I went into the container and dug down to see why it was not able to read the file. For example 2 above, the end symlink was trying to point to `/app/docker/containers`.... (the correct location on my host of the docker install/log location). The issue with this is that `/app/docker/containers` is not bind mounted for the fluentd container, only `/var/lib/docker` is bind mounted.
My proposed fix is to add the docker root dir, found from running `docker info`, as a bind mount when spinning up the fluentd container.
Thoughts?
**Rancher versions:**
2.0.0
**Infrastructure Stack versions:**
kubernetes (if applicable): 1.10
**Docker version: (`docker version`,`docker info` preferred)**
17.03
**Operating system and kernel: (`cat /etc/os-release`, `uname -r` preferred)**
centos 7
**Type/provider of hosts: (VirtualBox/Bare-metal/AWS/GCE/DO)**
bare-metal
**Setup details: (single node rancher vs. HA rancher, internal DB vs. external DB)**
single server with 3 agents on a single cluster
**Environment Template: (Cattle/Kubernetes/Swarm/Mesos)**
k8
**Steps to Reproduce:**
on your agent:
- Install docker
- move the docker installation to another dir
- sym link /var/lib/docker to your new location
- start rancher agent as normal on that machine and add it to your cluster
- configure cluster logging in rancher UI to send to a valid kafka endpoint (other methods may work as well not sure if this matters)
- view logs of fluentd container started on the agent
**Results:**
|
test
|
rancher logging does not take custom docker root dir into account due to server storage restrictions my team needs to store anything large on our servers in app including the docker installation and var log directories and then symlink to them from the original location this seems to cause issues when enabling cluster logging in rancher i dug into the issue and saw the following logs for the fluentd container on my agent for each of my containers var lib rancher rke log kubelet log unreadable it is excluded and would be examined next time and var log containers redis default redis log unreadable it is excluded and would be examined next time after seeing this i went into the container and dug down to see why it was not able to read the file for example above the end symlink was trying to point to app docker containers the correct location on my host of the docker install log location the issue with this is that app docker containers is not bind mounted for the fluentd container only var lib docker is bind mounted my proposed fix is to add the docker root dir found from running docker info as a bind mount when spinning up the fluentd container thoughts rancher versions infrastructure stack versions kubernetes if applicable docker version docker version docker info preferred operating system and kernel cat etc os release uname r preferred centos type provider of hosts virtualbox bare metal aws gce do bare metal setup details single node rancher vs ha rancher internal db vs external db single server with agents on a single cluster environment template cattle kubernetes swarm mesos steps to reproduce on your agent install docker move the docker installation to another dir sym link var lib docker to your new location start rancher agent as normal on that machine and add it to your cluster configure cluster logging in rancher ui to send to a valid kafka endpoint other methods may work as well not sure if this matters view logs of fluentd container started on the agent results
| 1
|
10,182
| 8,837,383,154
|
IssuesEvent
|
2019-01-05 04:00:03
|
terraform-providers/terraform-provider-aws
|
https://api.github.com/repos/terraform-providers/terraform-provider-aws
|
closed
|
Add import support for aws_ssm_maintenance_window
|
enhancement service/ssm
|
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
Add support for importing aws_ssm_maintenance_window resources via ID.
### New or Affected Resource(s)
* aws_ssm_maintenance_window
### Potential Terraform Configuration
N/A
### References
PR:
* https://github.com/terraform-providers/terraform-provider-aws/pull/6747
|
1.0
|
Add import support for aws_ssm_maintenance_window - <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### Description
Add support for importing aws_ssm_maintenance_window resources via ID.
### New or Affected Resource(s)
* aws_ssm_maintenance_window
### Potential Terraform Configuration
N/A
### References
PR:
* https://github.com/terraform-providers/terraform-provider-aws/pull/6747
|
non_test
|
add import support for aws ssm maintenance window community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment description add support for importing aws ssm maintenance window resources via id new or affected resource s aws ssm maintenance window potential terraform configuration n a references pr
| 0
|
117,714
| 9,957,336,014
|
IssuesEvent
|
2019-07-05 16:27:18
|
elastic/elasticsearch
|
https://api.github.com/repos/elastic/elasticsearch
|
closed
|
Reproducible Failure in CoordinatorTests.testStateRecoveryResetAfterPreviousLeadership
|
:Distributed/Cluster Coordination >test-failure
|
```
REPRODUCE WITH: ./gradlew :server:test --tests "org.elasticsearch.cluster.coordination.CoordinatorTests.testStateRecoveryResetAfterPreviousLeadership" -Dtests.seed=283C43800B2E265E -Dtests.security.manager=true -Dtests.locale=dsb -Dtests.timezone=SystemV/MST7 -Dcompiler.java=12 -Druntime.java=11
NOTE: leaving temporary files on disk at: /var/lib/jenkins/workspace/elastic+elasticsearch+pull-request-1/server/build/testrun/test/temp/org.elasticsearch.cluster.coordination.CoordinatorTests_283C43800B2E265E-001
```
fails every time on `master` with:
```
java.lang.AssertionError
at __randomizedtesting.SeedInfo.seed([283C43800B2E265E:F95A83802CC7DFFB]:0)
at org.elasticsearch.cluster.coordination.PublicationTransportHandler$2.sendPublishRequest(PublicationTransportHandler.java:168)
at org.elasticsearch.cluster.coordination.Coordinator$CoordinatorPublication.sendPublishRequest(Coordinator.java:1401)
at org.elasticsearch.cluster.coordination.Publication$PublicationTarget.sendPublishRequest(Publication.java:231)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
at org.elasticsearch.cluster.coordination.Publication.start(Publication.java:72)
at org.elasticsearch.cluster.coordination.Coordinator.publish(Coordinator.java:1020)
at org.elasticsearch.indices.cluster.FakeThreadPoolMasterService.publish(FakeThreadPoolMasterService.java:131)
at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:238)
at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:142)
at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150)
at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:188)
at org.elasticsearch.indices.cluster.FakeThreadPoolMasterService$2.run(FakeThreadPoolMasterService.java:110)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$1.run(AbstractCoordinatorTestCase.java:1125)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster$ClusterNode$2.run(AbstractCoordinatorTestCase.java:929)
at org.elasticsearch.cluster.coordination.DeterministicTaskQueue.runTask(DeterministicTaskQueue.java:130)
at org.elasticsearch.cluster.coordination.DeterministicTaskQueue.runRandomTask(DeterministicTaskQueue.java:124)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster.runFor(AbstractCoordinatorTestCase.java:575)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster.stabilise(AbstractCoordinatorTestCase.java:473)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster.stabilise(AbstractCoordinatorTestCase.java:463)
at org.elasticsearch.cluster.coordination.CoordinatorTests.testStateRecoveryResetAfterPreviousLeadership(CoordinatorTests.java:119)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)
```
E.g. here https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+pull-request-1/2177/testReport/junit/org.elasticsearch.cluster.coordination/CoordinatorTests/testStateRecoveryResetAfterPreviousLeadership/
|
1.0
|
Reproducible Failure in CoordinatorTests.testStateRecoveryResetAfterPreviousLeadership - ```
REPRODUCE WITH: ./gradlew :server:test --tests "org.elasticsearch.cluster.coordination.CoordinatorTests.testStateRecoveryResetAfterPreviousLeadership" -Dtests.seed=283C43800B2E265E -Dtests.security.manager=true -Dtests.locale=dsb -Dtests.timezone=SystemV/MST7 -Dcompiler.java=12 -Druntime.java=11
NOTE: leaving temporary files on disk at: /var/lib/jenkins/workspace/elastic+elasticsearch+pull-request-1/server/build/testrun/test/temp/org.elasticsearch.cluster.coordination.CoordinatorTests_283C43800B2E265E-001
```
fails every time on `master` with:
```
java.lang.AssertionError
at __randomizedtesting.SeedInfo.seed([283C43800B2E265E:F95A83802CC7DFFB]:0)
at org.elasticsearch.cluster.coordination.PublicationTransportHandler$2.sendPublishRequest(PublicationTransportHandler.java:168)
at org.elasticsearch.cluster.coordination.Coordinator$CoordinatorPublication.sendPublishRequest(Coordinator.java:1401)
at org.elasticsearch.cluster.coordination.Publication$PublicationTarget.sendPublishRequest(Publication.java:231)
at java.base/java.util.ArrayList.forEach(ArrayList.java:1540)
at org.elasticsearch.cluster.coordination.Publication.start(Publication.java:72)
at org.elasticsearch.cluster.coordination.Coordinator.publish(Coordinator.java:1020)
at org.elasticsearch.indices.cluster.FakeThreadPoolMasterService.publish(FakeThreadPoolMasterService.java:131)
at org.elasticsearch.cluster.service.MasterService.runTasks(MasterService.java:238)
at org.elasticsearch.cluster.service.MasterService$Batcher.run(MasterService.java:142)
at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150)
at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:188)
at org.elasticsearch.indices.cluster.FakeThreadPoolMasterService$2.run(FakeThreadPoolMasterService.java:110)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$1.run(AbstractCoordinatorTestCase.java:1125)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster$ClusterNode$2.run(AbstractCoordinatorTestCase.java:929)
at org.elasticsearch.cluster.coordination.DeterministicTaskQueue.runTask(DeterministicTaskQueue.java:130)
at org.elasticsearch.cluster.coordination.DeterministicTaskQueue.runRandomTask(DeterministicTaskQueue.java:124)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster.runFor(AbstractCoordinatorTestCase.java:575)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster.stabilise(AbstractCoordinatorTestCase.java:473)
at org.elasticsearch.cluster.coordination.AbstractCoordinatorTestCase$Cluster.stabilise(AbstractCoordinatorTestCase.java:463)
at org.elasticsearch.cluster.coordination.CoordinatorTests.testStateRecoveryResetAfterPreviousLeadership(CoordinatorTests.java:119)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:566)
at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:834)
```
E.g. here https://elasticsearch-ci.elastic.co/job/elastic+elasticsearch+pull-request-1/2177/testReport/junit/org.elasticsearch.cluster.coordination/CoordinatorTests/testStateRecoveryResetAfterPreviousLeadership/
|
test
|
reproducible failure in coordinatortests teststaterecoveryresetafterpreviousleadership reproduce with gradlew server test tests org elasticsearch cluster coordination coordinatortests teststaterecoveryresetafterpreviousleadership dtests seed dtests security manager true dtests locale dsb dtests timezone systemv dcompiler java druntime java note leaving temporary files on disk at var lib jenkins workspace elastic elasticsearch pull request server build testrun test temp org elasticsearch cluster coordination coordinatortests fails every time on master with java lang assertionerror at randomizedtesting seedinfo seed at org elasticsearch cluster coordination publicationtransporthandler sendpublishrequest publicationtransporthandler java at org elasticsearch cluster coordination coordinator coordinatorpublication sendpublishrequest coordinator java at org elasticsearch cluster coordination publication publicationtarget sendpublishrequest publication java at java base java util arraylist foreach arraylist java at org elasticsearch cluster coordination publication start publication java at org elasticsearch cluster coordination coordinator publish coordinator java at org elasticsearch indices cluster fakethreadpoolmasterservice publish fakethreadpoolmasterservice java at org elasticsearch cluster service masterservice runtasks masterservice java at org elasticsearch cluster service masterservice batcher run masterservice java at org elasticsearch cluster service taskbatcher runifnotprocessed taskbatcher java at org elasticsearch cluster service taskbatcher batchedtask run taskbatcher java at org elasticsearch indices cluster fakethreadpoolmasterservice run fakethreadpoolmasterservice java at org elasticsearch cluster coordination abstractcoordinatortestcase run abstractcoordinatortestcase java at org elasticsearch cluster coordination abstractcoordinatortestcase cluster clusternode run abstractcoordinatortestcase java at org elasticsearch cluster coordination deterministictaskqueue runtask deterministictaskqueue java at org elasticsearch cluster coordination deterministictaskqueue runrandomtask deterministictaskqueue java at org elasticsearch cluster coordination abstractcoordinatortestcase cluster runfor abstractcoordinatortestcase java at org elasticsearch cluster coordination abstractcoordinatortestcase cluster stabilise abstractcoordinatortestcase java at org elasticsearch cluster coordination abstractcoordinatortestcase cluster stabilise abstractcoordinatortestcase java at org elasticsearch cluster coordination coordinatortests teststaterecoveryresetafterpreviousleadership coordinatortests java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulestoreclassname evaluate testrulestoreclassname java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testruleassertionsrequired evaluate testruleassertionsrequired java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testruleignoretestsuites evaluate testruleignoretestsuites java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at java base java lang thread run thread java e g here
| 1
|
355,871
| 25,176,050,147
|
IssuesEvent
|
2022-11-11 09:21:34
|
xhphoong/pe
|
https://api.github.com/repos/xhphoong/pe
|
opened
|
Unnecessary KEY table in the list command
|
severity.Low type.DocumentationBug
|
The examples for list already demonstrate what the KEY table wants to demonstrate, and thus I think that the KEY table for the list command is unnecessary.
<!--session: 1668154147439-97f1d9c1-50c9-4671-ae85-1d2870b53e7a-->
<!--Version: Web v3.4.4-->
|
1.0
|
Unnecessary KEY table in the list command - The examples for list already demonstrate what the KEY table wants to demonstrate, and thus I think that the KEY table for the list command is unnecessary.
<!--session: 1668154147439-97f1d9c1-50c9-4671-ae85-1d2870b53e7a-->
<!--Version: Web v3.4.4-->
|
non_test
|
unnecessary key table in the list command the examples for list has demonstrate what the key table want to demonstrate and thus i think that the key table for list command is unnecessary
| 0
|
29,331
| 5,625,478,042
|
IssuesEvent
|
2017-04-04 19:29:21
|
minishift/minishift
|
https://api.github.com/repos/minishift/minishift
|
closed
|
'openshift config' command rejects all patches as non valid JSON
|
component/documentation kind/bug os/windows priority/major
|
```
Environment
OS: Windows 7
Minishift version: minishift 1.0.0-beta.4-1, cdk 3.0.0-beta.2
```
I am following docs - [Updating OpenShift configuration](https://github.com/minishift/minishift/blob/724f750d5b1116b8651d5b5f75378906ed91a357/docs/using.md#updating-openshift-configuration). Running example command in both cmd and powershell gives me following message: "The specified patch need to be a valid JSON." I tried several alternations of quotes but without success:
```
PS C:\Users\CDK> minishift openshift config set --patch '{"corsAllowedOrigins": [".*"]}'
The specified patch need to be valid JSON.
PS C:\Users\CDK> minishift openshift config set --patch "{'corsAllowedOrigins': ['.*']}"
The specified patch need to be valid JSON.
PS C:\Users\CDK> minishift openshift config set --patch '{apiVersion : v1}'
The specified patch need to be valid JSON.
C:\Users\CDK\.minishift\cache\oc\v3.4.1.2>minishift openshift config set --patch "{apiVersion:v1}"
The specified patch need to be valid JSON.
C:\Users\CDK\.minishift\cache\oc\v3.4.1.2>minishift openshift config set --patch "{ apiVersion:v1 }"
The specified patch need to be valid JSON.
C:\Users\CDK\.minishift\cache\oc\v3.4.1.2>minishift openshift config set --patch "{ apiVersion : v1 }"
The specified patch need to be valid JSON.
```
|
1.0
|
'openshift config' command rejects all patches as non valid JSON - ```
Environment
OS: Windows 7
Minishift version: minishift 1.0.0-beta.4-1, cdk 3.0.0-beta.2
```
I am following docs - [Updating OpenShift configuration](https://github.com/minishift/minishift/blob/724f750d5b1116b8651d5b5f75378906ed91a357/docs/using.md#updating-openshift-configuration). Running example command in both cmd and powershell gives me following message: "The specified patch need to be a valid JSON." I tried several alternations of quotes but without success:
```
PS C:\Users\CDK> minishift openshift config set --patch '{"corsAllowedOrigins": [".*"]}'
The specified patch need to be valid JSON.
PS C:\Users\CDK> minishift openshift config set --patch "{'corsAllowedOrigins': ['.*']}"
The specified patch need to be valid JSON.
PS C:\Users\CDK> minishift openshift config set --patch '{apiVersion : v1}'
The specified patch need to be valid JSON.
C:\Users\CDK\.minishift\cache\oc\v3.4.1.2>minishift openshift config set --patch "{apiVersion:v1}"
The specified patch need to be valid JSON.
C:\Users\CDK\.minishift\cache\oc\v3.4.1.2>minishift openshift config set --patch "{ apiVersion:v1 }"
The specified patch need to be valid JSON.
C:\Users\CDK\.minishift\cache\oc\v3.4.1.2>minishift openshift config set --patch "{ apiVersion : v1 }"
The specified patch need to be valid JSON.
```
|
non_test
|
openshift config command rejects all patches as non valid json environment os windows minishift version minishift beta cdk beta i am following docs running example command in both cmd and powershell gives me following message the specified patch need to be a valid json i tried several alternations of quotes but without success ps c users cdk minishift openshift config set patch corsallowedorigins the specified patch need to be valid json ps c users cdk minishift openshift config set patch corsallowedorigins the specified patch need to be valid json ps c users cdk minishift openshift config set patch apiversion the specified patch need to be valid json c users cdk minishift cache oc minishift openshift config set patch apiversion the specified patch need to be valid json c users cdk minishift cache oc minishift openshift config set patch apiversion the specified patch need to be valid json c users cdk minishift cache oc minishift openshift config set patch apiversion the specified patch need to be valid json
| 0
|
413,042
| 27,884,650,384
|
IssuesEvent
|
2023-03-21 22:31:51
|
TheThingsIndustries/lorawan-stack-docs
|
https://api.github.com/repos/TheThingsIndustries/lorawan-stack-docs
|
closed
|
Missing information about uploading `config.yml` file to `EDCSConfigBucket` when deploying TTS CF templates
|
documentation
|
<!--
Thanks for submitting this documentation request. Please fill the template
below, otherwise we will not be able to process this request.
-->
#### Summary
<!-- Summarize the request in a few sentences: -->
Update note about uploading `config.yml` file to also include the EDCS bucket.
#### Why do we need this ?
<!-- Please explain the motivation, for whom, etc. -->
TTS won't deploy properly without this file.
#### What is already there? What do you see now?
<!--
Please add any relevant documentation articles and resources. Screenshots if necessary.
-->
The documentation is there and there is a note about this file being uploaded to the interop bucket already.
#### What is missing? What do you want to see?
<!-- Please add documentation sources (forum links, outside sources...), mock-ups if applicable -->
The same note about the file for the EDCS bucket.
#### How do you propose to document this?
<!-- Please think about how this could be fixed. -->
Update the docs to contain this note.
#### Can you do this yourself and submit a Pull Request?
<!-- You can also @mention experts if you need help with this. -->
I can do this myself.
|
1.0
|
Missing information about uploading `config.yml` file to `EDCSConfigBucket` when deploying TTS CF templates - <!--
Thanks for submitting this documentation request. Please fill the template
below, otherwise we will not be able to process this request.
-->
#### Summary
<!-- Summarize the request in a few sentences: -->
Update note about uploading `config.yml` file to also include the EDCS bucket.
#### Why do we need this ?
<!-- Please explain the motivation, for whom, etc. -->
TTS won't deploy properly without this file.
#### What is already there? What do you see now?
<!--
Please add any relevant documentation articles and resources. Screenshots if necessary.
-->
The documentation is there and there is a note about this file being uploaded to the interop bucket already.
#### What is missing? What do you want to see?
<!-- Please add documentation sources (forum links, outside sources...), mock-ups if applicable -->
The same note about the file for the EDCS bucket.
#### How do you propose to document this?
<!-- Please think about how this could be fixed. -->
Update the docs to contain this note.
#### Can you do this yourself and submit a Pull Request?
<!-- You can also @mention experts if you need help with this. -->
I can do this myself.
|
non_test
|
missing information about uploading config yml file to edcsconfigbucket when deploying tts cf templates thanks for submitting this documentation request please fill the template below otherwise we will not be able to process this request summary update note about uploading config yml file to also include the edcs bucket why do we need this tts won t deploy properly without this file what is already there what do you see now please add any relevant documentation articles and resources screenshots if necessary the documentation is there and there is a note about this file being uploaded to the interop bucket already what is missing what do you want to see the same note about the file for the edcs bucket how do you propose to document this update the docs to contain this note can you do this yourself and submit a pull request i can do this myself
| 0
|
314,577
| 27,012,061,315
|
IssuesEvent
|
2023-02-10 16:08:43
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix raw_ops.test_tensorflow_FloorDiv
|
TensorFlow Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_tensorflow/test_raw_ops.py::test_tensorflow_FloorDiv[cpu-ivy.functional.backends.numpy-False-False]</summary>
2023-01-26T03:39:26.9334044Z E AssertionError: -1.0 != -0.0
2023-01-26T03:39:26.9334380Z E Falsifying example: test_tensorflow_FloorDiv(
2023-01-26T03:39:26.9334792Z E dtype_and_x=(['float32', 'float32'],
2023-01-26T03:39:26.9335262Z E [array(-1., dtype=float32), array(8.50706e+37, dtype=float32)]),
2023-01-26T03:39:26.9335759Z E test_flags=num_positional_args=0. with_out=False. inplace=False. native_arrays=[False]. as_variable=[False]. ,
2023-01-26T03:39:26.9336362Z E fn_tree='ivy.functional.frontends.tensorflow.raw_ops.FloorDiv',
2023-01-26T03:39:26.9336788Z E frontend='tensorflow',
2023-01-26T03:39:26.9337106Z E on_device='cpu',
2023-01-26T03:39:26.9337350Z E )
2023-01-26T03:39:26.9337560Z E
2023-01-26T03:39:26.9338234Z E You can reproduce this example by temporarily adding @reproduce_failure('6.55.0', b'AXicY2AAAkYGCGAEMdofMDAIQMUAENEBfA==') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_tensorflow/test_raw_ops.py::test_tensorflow_FloorDiv[cpu-ivy.functional.backends.numpy-False-False]</summary>
2023-01-26T03:39:26.9334044Z E AssertionError: -1.0 != -0.0
2023-01-26T03:39:26.9334380Z E Falsifying example: test_tensorflow_FloorDiv(
2023-01-26T03:39:26.9334792Z E dtype_and_x=(['float32', 'float32'],
2023-01-26T03:39:26.9335262Z E [array(-1., dtype=float32), array(8.50706e+37, dtype=float32)]),
2023-01-26T03:39:26.9335759Z E test_flags=num_positional_args=0. with_out=False. inplace=False. native_arrays=[False]. as_variable=[False]. ,
2023-01-26T03:39:26.9336362Z E fn_tree='ivy.functional.frontends.tensorflow.raw_ops.FloorDiv',
2023-01-26T03:39:26.9336788Z E frontend='tensorflow',
2023-01-26T03:39:26.9337106Z E on_device='cpu',
2023-01-26T03:39:26.9337350Z E )
2023-01-26T03:39:26.9337560Z E
2023-01-26T03:39:26.9338234Z E You can reproduce this example by temporarily adding @reproduce_failure('6.55.0', b'AXicY2AAAkYGCGAEMdofMDAIQMUAENEBfA==') as a decorator on your test case
</details>
|
1.0
|
Fix raw_ops.test_tensorflow_FloorDiv - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4012069726/jobs/6890208840" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_tensorflow/test_raw_ops.py::test_tensorflow_FloorDiv[cpu-ivy.functional.backends.numpy-False-False]</summary>
2023-01-26T03:39:26.9334044Z E AssertionError: -1.0 != -0.0
2023-01-26T03:39:26.9334380Z E Falsifying example: test_tensorflow_FloorDiv(
2023-01-26T03:39:26.9334792Z E dtype_and_x=(['float32', 'float32'],
2023-01-26T03:39:26.9335262Z E [array(-1., dtype=float32), array(8.50706e+37, dtype=float32)]),
2023-01-26T03:39:26.9335759Z E test_flags=num_positional_args=0. with_out=False. inplace=False. native_arrays=[False]. as_variable=[False]. ,
2023-01-26T03:39:26.9336362Z E fn_tree='ivy.functional.frontends.tensorflow.raw_ops.FloorDiv',
2023-01-26T03:39:26.9336788Z E frontend='tensorflow',
2023-01-26T03:39:26.9337106Z E on_device='cpu',
2023-01-26T03:39:26.9337350Z E )
2023-01-26T03:39:26.9337560Z E
2023-01-26T03:39:26.9338234Z E You can reproduce this example by temporarily adding @reproduce_failure('6.55.0', b'AXicY2AAAkYGCGAEMdofMDAIQMUAENEBfA==') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_tensorflow/test_raw_ops.py::test_tensorflow_FloorDiv[cpu-ivy.functional.backends.numpy-False-False]</summary>
2023-01-26T03:39:26.9334044Z E AssertionError: -1.0 != -0.0
2023-01-26T03:39:26.9334380Z E Falsifying example: test_tensorflow_FloorDiv(
2023-01-26T03:39:26.9334792Z E dtype_and_x=(['float32', 'float32'],
2023-01-26T03:39:26.9335262Z E [array(-1., dtype=float32), array(8.50706e+37, dtype=float32)]),
2023-01-26T03:39:26.9335759Z E test_flags=num_positional_args=0. with_out=False. inplace=False. native_arrays=[False]. as_variable=[False]. ,
2023-01-26T03:39:26.9336362Z E fn_tree='ivy.functional.frontends.tensorflow.raw_ops.FloorDiv',
2023-01-26T03:39:26.9336788Z E frontend='tensorflow',
2023-01-26T03:39:26.9337106Z E on_device='cpu',
2023-01-26T03:39:26.9337350Z E )
2023-01-26T03:39:26.9337560Z E
2023-01-26T03:39:26.9338234Z E You can reproduce this example by temporarily adding @reproduce_failure('6.55.0', b'AXicY2AAAkYGCGAEMdofMDAIQMUAENEBfA==') as a decorator on your test case
</details>
|
test
|
fix raw ops test tensorflow floordiv tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test frontends test tensorflow test raw ops py test tensorflow floordiv e assertionerror e falsifying example test tensorflow floordiv e dtype and x e e test flags num positional args with out false inplace false native arrays as variable e fn tree ivy functional frontends tensorflow raw ops floordiv e frontend tensorflow e on device cpu e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test tensorflow test raw ops py test tensorflow floordiv e assertionerror e falsifying example test tensorflow floordiv e dtype and x e e test flags num positional args with out false inplace false native arrays as variable e fn tree ivy functional frontends tensorflow raw ops floordiv e frontend tensorflow e on device cpu e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case
| 1
|
94,653
| 10,840,385,562
|
IssuesEvent
|
2019-11-12 08:11:31
|
milvus-io/milvus
|
https://api.github.com/repos/milvus-io/milvus
|
opened
|
[DOC] The Japanese version of the Milvus readme needs to be updated
|
documentation
|
## Report incorrect documentation
**Describe the problems or issues found in the documentation**
The [Japanese version of the Milvus readme](https://github.com/milvus-io/milvus/blob/master/README_JP.md) is not up to date.
**Suggested fix for documentation**
Update the Japanese version of the Milvus readme to match the latest English readme.
|
1.0
|
[DOC] The Japanese version of the Milvus readme needs to be updated - ## Report incorrect documentation
**Describe the problems or issues found in the documentation**
The [Japanese version of the Milvus readme](https://github.com/milvus-io/milvus/blob/master/README_JP.md) is not up to date.
**Suggested fix for documentation**
Update the Japanese version of the Milvus readme to match the latest English readme.
|
non_test
|
the japanese version of the milvus readme needs to be updated report incorrect documentation describe the problems or issues found in the documentation the is not up to date suggested fix for documentation update the japanese version of the milvus readme to match the latest english readme
| 0
|
161,060
| 12,530,149,755
|
IssuesEvent
|
2020-06-04 12:36:43
|
aces/Loris
|
https://api.github.com/repos/aces/Loris
|
closed
|
[DQT] JS Error on View Data > Longitudinal view
|
23.0.0-testing Bug Critical to release
|
**To reproduce**
Define Fields: aosi | Candidate_Age
Define Filters: aosi | Candidate_Age = 33 All Visits
View Data > Run Query
Switch to Longitudinal
>
> react_devtools_backend.js:6 TypeError: Cannot read property '0' of null
> at StatsVisualizationTabPane.render (react.tabs.js:651)
|
1.0
|
[DQT] JS Error on View Data > Longitudinal view - **To reproduce**
Define Fields: aosi | Candidate_Age
Define Filters: aosi | Candidate_Age = 33 All Visits
View Data > Run Query
Switch to Longitudinal
>
> react_devtools_backend.js:6 TypeError: Cannot read property '0' of null
> at StatsVisualizationTabPane.render (react.tabs.js:651)
|
test
|
js error on view data longitudinal view to reproduce define fields aosi candidate age define filters aosi candidate age all visits view data run query switch to longitudinal react devtools backend js typeerror cannot read property of null at statsvisualizationtabpane render react tabs js
| 1
|
14,235
| 3,386,948,696
|
IssuesEvent
|
2015-11-27 23:39:57
|
demiurgosoft/maelstrom
|
https://api.github.com/repos/demiurgosoft/maelstrom
|
opened
|
[world] Fix & Test Get ship details
|
bug test
|
get.shipDetails currently not tested
Get user Ships test skipped
|
1.0
|
[world] Fix & Test Get ship details - get.shipDetails currently not tested
Get user Ships test skipped
|
test
|
fix test get ship details get shipdetails currently not tested get user ships test skipped
| 1
|
94,054
| 3,918,559,249
|
IssuesEvent
|
2016-04-21 13:00:19
|
kubernetes/dashboard
|
https://api.github.com/repos/kubernetes/dashboard
|
closed
|
RC details page loads slowly when there are a lot of events
|
area/performance priority/P2
|
#### Issue details
##### Environment
```
Dashboard version: gcr.io/google_containers/kubernetes-dashboard-amd64:v1.0.0
Kubernetes version: v1.3.0-alpha.0.604+b494cbc0c17dd7
Operating system: linux/amd64
Node.js version: v0.10.40
Go version: go1.4.2
```
##### Steps to reproduce
I had generated a large number of events in short period time
##### Observed result
Load time RC detail page is very long (for 10000 events time load is over 60 s), more then 10000 events crashes script. Moreover I've seen that command "kubectl get events ..." displaying all of events (node and pods) but in Dashboard we may see only events for pods.
##### Comments
I think we should do pagination for Events page. This should solve the problem.
IMHO it is not our fault that script in browser was crashed.
I don't know we should displaying all of events (like a "kubectl...") what are your opinions?
|
1.0
|
RC details page loads slowly when there are a lot of events - #### Issue details
##### Environment
```
Dashboard version: gcr.io/google_containers/kubernetes-dashboard-amd64:v1.0.0
Kubernetes version: v1.3.0-alpha.0.604+b494cbc0c17dd7
Operating system: linux/amd64
Node.js version: v0.10.40
Go version: go1.4.2
```
##### Steps to reproduce
I had generated a large number of events in short period time
##### Observed result
Load time RC detail page is very long (for 10000 events time load is over 60 s), more then 10000 events crashes script. Moreover I've seen that command "kubectl get events ..." displaying all of events (node and pods) but in Dashboard we may see only events for pods.
##### Comments
I think we should do pagination for Events page. This should solve the problem.
IMHO it is not our fault that script in browser was crashed.
I don't know we should displaying all of events (like a "kubectl...") what are your opinions?
|
non_test
|
rc details page loads slowly when there are a lot of events issue details environment dashboard version gcr io google containers kubernetes dashboard kubernetes version alpha operating system linux node js version go version steps to reproduce i had generated a large number of events in short period time observed result load time rc detail page is very long for events time load is over s more then events crashes script moreover i ve seen that command kubectl get events displaying all of events node and pods but in dashboard we may see only events for pods comments i think we should do pagination for events page this should solve the problem imho it is not our fault that script in browser was crashed i don t know we should displaying all of events like a kubectl what are your opinions
| 0
|
93,018
| 8,391,763,466
|
IssuesEvent
|
2018-10-09 15:45:40
|
edenlabllc/ehealth.web
|
https://api.github.com/repos/edenlabllc/ehealth.web
|
closed
|
Legal entity list page
|
FE epic/legal-entity epic/nhs_admin_portal_v2 in progress kind/task status/test
|
NHS admin must be able to get legal entities details described in [scheme](
https://github.com/edenlabllc/ehealth.web/blob/master/packages/mock-server/__schema__/legal_entities.graphql) and [features](
https://github.com/edenlabllc/ehealth.web/blob/master/packages/admin/e2e/__features__/legal_entities_details.feature)
```
- legalEntities( filter: LegalEntityFilter orderBy: LegalEntityOrderBy ): [LegalEntity]
```
|
1.0
|
Legal entity list page - NHS admin must be able to get legal entities details described in [scheme](
https://github.com/edenlabllc/ehealth.web/blob/master/packages/mock-server/__schema__/legal_entities.graphql) and [features](
https://github.com/edenlabllc/ehealth.web/blob/master/packages/admin/e2e/__features__/legal_entities_details.feature)
```
- legalEntities( filter: LegalEntityFilter orderBy: LegalEntityOrderBy ): [LegalEntity]
```
|
test
|
legal entity list page nhs admin must be able to get legal entities details described in and legalentities filter legalentityfilter orderby legalentityorderby
| 1
|
193,984
| 14,666,995,560
|
IssuesEvent
|
2020-12-29 17:33:02
|
elastic/elasticsearch
|
https://api.github.com/repos/elastic/elasticsearch
|
opened
|
ReadOnlyEngineTests testSearcherId fails
|
>test-failure
|
**Build scan**:
https://gradle-enterprise.elastic.co/s/j4tztllphmnpo
**Repro line**:
./gradlew ':server:test' --tests "org.elasticsearch.index.engine.ReadOnlyEngineTests.testSearcherId" \
-Dtests.seed=5F23C74BF9BA02D2 \
-Dtests.security.manager=true \
-Dtests.locale=es-AR \
-Dtests.timezone=America/Noronha \
-Druntime.java=11
**Reproduces locally?**:
yes
**Applicable branches**:
7.x
**Failure history**:
14 failures over the last week
**Failure excerpt**:
```
org.elasticsearch.index.engine.ReadOnlyEngineTests > testSearcherId FAILED
java.lang.AssertionError:
Expected: not "0b4176a0054fce403abb72350d27706d167ec061d30031a49e5db37e4a50113b"
but: was "0b4176a0054fce403abb72350d27706d167ec061d30031a49e5db37e4a50113b"
at __randomizedtesting.SeedInfo.seed([5F23C74BF9BA02D2:BDA0039B9DAD7920]:0)
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
at org.elasticsearch.index.engine.ReadOnlyEngineTests.testSearcherId(ReadOnlyEngineTests.java:373)
```
|
1.0
|
ReadOnlyEngineTests testSearcherId fails -
**Build scan**:
https://gradle-enterprise.elastic.co/s/j4tztllphmnpo
**Repro line**:
./gradlew ':server:test' --tests "org.elasticsearch.index.engine.ReadOnlyEngineTests.testSearcherId" \
-Dtests.seed=5F23C74BF9BA02D2 \
-Dtests.security.manager=true \
-Dtests.locale=es-AR \
-Dtests.timezone=America/Noronha \
-Druntime.java=11
**Reproduces locally?**:
yes
**Applicable branches**:
7.x
**Failure history**:
14 failures over the last week
**Failure excerpt**:
```
org.elasticsearch.index.engine.ReadOnlyEngineTests > testSearcherId FAILED
java.lang.AssertionError:
Expected: not "0b4176a0054fce403abb72350d27706d167ec061d30031a49e5db37e4a50113b"
but: was "0b4176a0054fce403abb72350d27706d167ec061d30031a49e5db37e4a50113b"
at __randomizedtesting.SeedInfo.seed([5F23C74BF9BA02D2:BDA0039B9DAD7920]:0)
at org.hamcrest.MatcherAssert.assertThat(MatcherAssert.java:18)
at org.junit.Assert.assertThat(Assert.java:956)
at org.junit.Assert.assertThat(Assert.java:923)
at org.elasticsearch.index.engine.ReadOnlyEngineTests.testSearcherId(ReadOnlyEngineTests.java:373)
```
|
test
|
readonlyenginetests testsearcherid fails build scan repro line gradlew server test tests org elasticsearch index engine readonlyenginetests testsearcherid dtests seed dtests security manager true dtests locale es ar dtests timezone america noronha druntime java reproduces locally yes applicable branches x failure history failures over the last week failure excerpt org elasticsearch index engine readonlyenginetests testsearcherid failed java lang assertionerror expected not but was at randomizedtesting seedinfo seed at org hamcrest matcherassert assertthat matcherassert java at org junit assert assertthat assert java at org junit assert assertthat assert java at org elasticsearch index engine readonlyenginetests testsearcherid readonlyenginetests java
| 1
|
824,680
| 31,166,154,081
|
IssuesEvent
|
2023-08-16 19:53:11
|
IRPTeam/IRP
|
https://api.github.com/repos/IRPTeam/IRP
|
opened
|
GetWorkstationHardwareByEquipmentType
|
bug Critical Priority
|
- [ ] - Add a check in the hardware catalog so that Enable cannot be set on 2 rows of fiscal printers and barcode scanners. 2 rows are only allowed for different POS terminals
- [ ] - &ChangeAndValidate("GetWorkstationHardwareByEquipmentType")
Function BF99_GetWorkstationHardwareByEquipmentType(Workstation, EquipmentType)
Query = New Query();
Query.Text =
"SELECT
| HardwareList.Hardware
|FROM
| Catalog.Workstations.HardwareList AS HardwareList
|WHERE
| HardwareList.Ref = &Workstation
| And HardwareList.Hardware.EquipmentType = &EquipmentType
| And HardwareList.Enable
| And Not HardwareList.Hardware.DeletionMark";
Query.SetParameter("Workstation", Workstation);
Query.SetParameter("EquipmentType", EquipmentType);
QueryResult = Query.Execute();
SelectionDetailRecords = QueryResult.Select();
HardwareList = New Array();
#Delete
If SelectionDetailRecords.Next() Then
HardwareList.Add(SelectionDetailRecords.Hardware);
EndIf;
#EndDelete
#Insert
While SelectionDetailRecords.Next() Do
HardwareList.Add(SelectionDetailRecords.Hardware);
EndDo;
#EndInsert
Return HardwareList;
EndFunction
|
1.0
|
GetWorkstationHardwareByEquipmentType - - [ ] - Add a check in the hardware catalog so that Enable cannot be set on 2 rows of fiscal printers and barcode scanners. 2 rows are only allowed for different POS terminals
- [ ] - &ChangeAndValidate("GetWorkstationHardwareByEquipmentType")
Function BF99_GetWorkstationHardwareByEquipmentType(Workstation, EquipmentType)
Query = New Query();
Query.Text =
"SELECT
| HardwareList.Hardware
|FROM
| Catalog.Workstations.HardwareList AS HardwareList
|WHERE
| HardwareList.Ref = &Workstation
| And HardwareList.Hardware.EquipmentType = &EquipmentType
| And HardwareList.Enable
| And Not HardwareList.Hardware.DeletionMark";
Query.SetParameter("Workstation", Workstation);
Query.SetParameter("EquipmentType", EquipmentType);
QueryResult = Query.Execute();
SelectionDetailRecords = QueryResult.Select();
HardwareList = New Array();
#Delete
If SelectionDetailRecords.Next() Then
HardwareList.Add(SelectionDetailRecords.Hardware);
EndIf;
#EndDelete
#Insert
While SelectionDetailRecords.Next() Do
HardwareList.Add(SelectionDetailRecords.Hardware);
EndDo;
#EndInsert
Return HardwareList;
EndFunction
|
non_test
|
getworkstationhardwarebyequipmenttype add a check in the hardware so that enable cannot be set on rows of fiscal printers and barcode scanners rows can only be for different pos terminals changeandvalidate getworkstationhardwarebyequipmenttype function getworkstationhardwarebyequipmenttype workstation equipmenttype query new query query text select hardwarelist hardware from catalog workstations hardwarelist as hardwarelist where hardwarelist ref workstation and hardwarelist hardware equipmenttype equipmenttype and hardwarelist enable and not hardwarelist hardware deletionmark query setparameter workstation workstation query setparameter equipmenttype equipmenttype queryresult query execute selectiondetailrecords queryresult select hardwarelist new array delete if selectiondetailrecords next then hardwarelist add selectiondetailrecords hardware endif enddelete insert while selectiondetailrecords next do hardwarelist add selectiondetailrecords hardware enddo endinsert return hardwarelist endfunction
| 0
|
34,071
| 28,151,188,163
|
IssuesEvent
|
2023-04-03 01:30:45
|
APSIMInitiative/ApsimX
|
https://api.github.com/repos/APSIMInitiative/ApsimX
|
closed
|
Won't download .NET Core dependency
|
bug interface/infrastructure Informative
|
When installing APSIM next generation the installation wizard won't automatically download/install the .NET core dependency. Attempting to run APSIM will cause a popup to appear that states that .NET core isn't installed and asks whether the user wants to install it. However, clicking yes does nothing and I had to manually install .NET core. I tried installing the latest version of APSIM (7127) and encountered this issue. I also tried an earlier version of APSIM (7009) which had worked on another computer but still encountered this issue.
|
1.0
|
Won't download .NET Core dependency - When installing APSIM next generation the installation wizard won't automatically download/install the .NET core dependency. Attempting to run APSIM will cause a popup to appear that states that .NET core isn't installed and asks whether the user wants to install it. However, clicking yes does nothing and I had to manually install .NET core. I tried installing the latest version of APSIM (7127) and encountered this issue. I also tried an earlier version of APSIM (7009) which had worked on another computer but still encountered this issue.
|
non_test
|
won t download net core dependency when installing apsim next generation the installation wizard won t automatically download install the net core dependency attempting to run apsim will cause a popup to appear that states that net core isn t installed and asks whether the user wants to install it however clicking yes does nothing and i had to manually install net core i tried installing the latest version of apsim and encountered this issue i also tried an earlier version of apsim which had worked on another computer but still encountered this issue
| 0
|
118,258
| 9,979,513,425
|
IssuesEvent
|
2019-07-09 23:11:44
|
saltstack/salt
|
https://api.github.com/repos/saltstack/salt
|
opened
|
unit.modules.test_gpg.GpgTestCase.test_delete_key
|
Neon Test Failure
|
Neon failing on salt-fedora-29-py2, salt-ubuntu-1604-py2-tcp, salt-ubuntu-1604-py3-tcp
Bring #49092 into Neon to fix
https://github.com/saltstack/salt/pull/49092
|
1.0
|
unit.modules.test_gpg.GpgTestCase.test_delete_key - Neon failing on salt-fedora-29-py2, salt-ubuntu-1604-py2-tcp, salt-ubuntu-1604-py3-tcp
Bring #49092 into Neon to fix
https://github.com/saltstack/salt/pull/49092
|
test
|
unit modules test gpg gpgtestcase test delete key neon failing on salt fedora salt ubuntu tcp salt ubuntu tcp bring into neon to fix
| 1
|
239,686
| 19,908,632,100
|
IssuesEvent
|
2022-01-25 15:09:46
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/uptime/synthetics_integration·ts - Uptime app with generated data When on the Synthetics Integration Policy Create Page displays custom UI prevent saving when integration name, url/host, or schedule is missing
|
blocker failed-test Team:uptime skipped-test uptime v8.1.0
|
A test failed on a tracked branch
```
ElementClickInterceptedError: element click intercepted: Element <button class="euiButton euiButton--primary euiButton--fill" type="button" data-test-subj="createPackagePolicySaveButton">...</button> is not clickable at point (1480, 964). Other element would receive the click: <span class="euiButton__text">...</span>
(Session info: headless chrome=95.0.4638.69)
at Object.throwDecodedError (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/node_modules/selenium-webdriver/lib/error.js:517:15)
at parseHttpResponse (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/node_modules/selenium-webdriver/lib/http.js:643:13)
at Executor.execute (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/node_modules/selenium-webdriver/lib/http.js:569:28)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Task.exec (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/test/functional/services/remote/prevent_parallel_calls.ts:28:20) {
remoteStacktrace: '#0 0x56538a5a9f93 <unknown>\n' +
'#1 0x56538a084908 <unknown>\n' +
'#2 0x56538a0c1b94 <unknown>\n' +
'#3 0x56538a0bf664 <unknown>\n' +
'#4 0x56538a0bce2a <unknown>\n' +
'#5 0x56538a0bb832 <unknown>\n' +
'#6 0x56538a0af538 <unknown>\n' +
'#7 0x56538a0d7a82 <unknown>\n' +
'#8 0x56538a0af2b3 <unknown>\n' +
'#9 0x56538a0d7b8e <unknown>\n' +
'#10 0x56538a0eac91 <unknown>\n' +
'#11 0x56538a0d7973 <unknown>\n' +
'#12 0x56538a0addf4 <unknown>\n' +
'#13 0x56538a0aede5 <unknown>\n' +
'#14 0x56538a5d92be <unknown>\n' +
'#15 0x56538a5eeba0 <unknown>\n' +
'#16 0x56538a5da215 <unknown>\n' +
'#17 0x56538a5effe8 <unknown>\n' +
'#18 0x56538a5ce9db <unknown>\n' +
'#19 0x56538a60b218 <unknown>\n' +
'#20 0x56538a60b398 <unknown>\n' +
'#21 0x56538a6266cd <unknown>\n' +
'#22 0x7f042aa57609 <unknown>\n'
}
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-hourly/builds/2354#000689d8-9fba-42f4-8073-3e9ee8204da8)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/uptime/synthetics_integration·ts","test.name":"Uptime app with generated data When on the Synthetics Integration Policy Create Page displays custom UI prevent saving when integration name, url/host, or schedule is missing","test.failCount":2}} -->
|
2.0
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/uptime/synthetics_integration·ts - Uptime app with generated data When on the Synthetics Integration Policy Create Page displays custom UI prevent saving when integration name, url/host, or schedule is missing - A test failed on a tracked branch
```
ElementClickInterceptedError: element click intercepted: Element <button class="euiButton euiButton--primary euiButton--fill" type="button" data-test-subj="createPackagePolicySaveButton">...</button> is not clickable at point (1480, 964). Other element would receive the click: <span class="euiButton__text">...</span>
(Session info: headless chrome=95.0.4638.69)
at Object.throwDecodedError (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/node_modules/selenium-webdriver/lib/error.js:517:15)
at parseHttpResponse (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/node_modules/selenium-webdriver/lib/http.js:643:13)
at Executor.execute (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/node_modules/selenium-webdriver/lib/http.js:569:28)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Task.exec (/opt/local-ssd/buildkite/builds/kb-cigroup-6-2adc9311a4ab7581/elastic/kibana-hourly/kibana/test/functional/services/remote/prevent_parallel_calls.ts:28:20) {
remoteStacktrace: '#0 0x56538a5a9f93 <unknown>\n' +
'#1 0x56538a084908 <unknown>\n' +
'#2 0x56538a0c1b94 <unknown>\n' +
'#3 0x56538a0bf664 <unknown>\n' +
'#4 0x56538a0bce2a <unknown>\n' +
'#5 0x56538a0bb832 <unknown>\n' +
'#6 0x56538a0af538 <unknown>\n' +
'#7 0x56538a0d7a82 <unknown>\n' +
'#8 0x56538a0af2b3 <unknown>\n' +
'#9 0x56538a0d7b8e <unknown>\n' +
'#10 0x56538a0eac91 <unknown>\n' +
'#11 0x56538a0d7973 <unknown>\n' +
'#12 0x56538a0addf4 <unknown>\n' +
'#13 0x56538a0aede5 <unknown>\n' +
'#14 0x56538a5d92be <unknown>\n' +
'#15 0x56538a5eeba0 <unknown>\n' +
'#16 0x56538a5da215 <unknown>\n' +
'#17 0x56538a5effe8 <unknown>\n' +
'#18 0x56538a5ce9db <unknown>\n' +
'#19 0x56538a60b218 <unknown>\n' +
'#20 0x56538a60b398 <unknown>\n' +
'#21 0x56538a6266cd <unknown>\n' +
'#22 0x7f042aa57609 <unknown>\n'
}
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-hourly/builds/2354#000689d8-9fba-42f4-8073-3e9ee8204da8)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/uptime/synthetics_integration·ts","test.name":"Uptime app with generated data When on the Synthetics Integration Policy Create Page displays custom UI prevent saving when integration name, url/host, or schedule is missing","test.failCount":2}} -->
|
test
|
failing test chrome x pack ui functional tests x pack test functional apps uptime synthetics integration·ts uptime app with generated data when on the synthetics integration policy create page displays custom ui prevent saving when integration name url host or schedule is missing a test failed on a tracked branch elementclickinterceptederror element click intercepted element is not clickable at point other element would receive the click session info headless chrome at object throwdecodederror opt local ssd buildkite builds kb cigroup elastic kibana hourly kibana node modules selenium webdriver lib error js at parsehttpresponse opt local ssd buildkite builds kb cigroup elastic kibana hourly kibana node modules selenium webdriver lib http js at executor execute opt local ssd buildkite builds kb cigroup elastic kibana hourly kibana node modules selenium webdriver lib http js at runmicrotasks at processticksandrejections node internal process task queues at task exec opt local ssd buildkite builds kb cigroup elastic kibana hourly kibana test functional services remote prevent parallel calls ts remotestacktrace n n n n n n n n n n n n n n n n n n n n n n n first failure
| 1
|
62,964
| 6,821,638,003
|
IssuesEvent
|
2017-11-07 17:21:00
|
SWEZenith/Fall2017Swe574-Zenith
|
https://api.github.com/repos/SWEZenith/Fall2017Swe574-Zenith
|
closed
|
Mobile Project Practice (Text/Image Selector)
|
mobile research test
|
## Research / Test Issue
How to use Text & Image Selector on a mobile application?
## Sample Links:
- http://docs.annotatorjs.org/en/latest/usage.html
- https://developer.mozilla.org/en-US/docs/Learn/HTML/Multimedia_and_embedding/Images_in_HTML
- http://caniuse.com/#feat=queryselector
|
1.0
|
Mobile Project Practice (Text/Image Selector) - ## Research / Test Issue
How to use Text & Image Selector on a mobile application?
## Sample Links:
- http://docs.annotatorjs.org/en/latest/usage.html
- https://developer.mozilla.org/en-US/docs/Learn/HTML/Multimedia_and_embedding/Images_in_HTML
- http://caniuse.com/#feat=queryselector
|
test
|
mobile project practice text image selector research test issue how to use text image selector on a mobile application sample links
| 1
|
264,477
| 20,022,299,897
|
IssuesEvent
|
2022-02-01 17:29:10
|
casper-network/docs
|
https://api.github.com/repos/casper-network/docs
|
closed
|
Add info about Casper SDKs
|
documentation
|
Move this information into the docs: https://docs.google.com/document/d/1L7ZdDMTO_TVN5LgOL4wyDNFdvcxylGACIxJmLXnnM8c/edit#
Here is the landing page: https://casper.network/docs/sdk
|
1.0
|
Add info about Casper SDKs - Move this information into the docs: https://docs.google.com/document/d/1L7ZdDMTO_TVN5LgOL4wyDNFdvcxylGACIxJmLXnnM8c/edit#
Here is the landing page: https://casper.network/docs/sdk
|
non_test
|
add info about casper sdks move this information into the docs here is the landing page
| 0
|
60,350
| 6,688,486,958
|
IssuesEvent
|
2017-10-08 15:19:41
|
r9y9/nnmnkwii
|
https://api.github.com/repos/r9y9/nnmnkwii
|
closed
|
Find a correct way to collect test coverage of backward passes of autograd functions
|
Tests
|
or simply ignore test coverage of these passes? Currently there are missing test coverage in autograd functions, while we are actually testing. For example, see https://codecov.io/gh/r9y9/nnmnkwii/src/fea5e54fa98c317d0d082225de4ebc32278d22ac/nnmnkwii/autograd/_impl/mlpg.py#L158.
|
1.0
|
Find a correct way to collect test coverage of backward passes of autograd functions - or simply ignore test coverage of these passes? Currently there are missing test coverage in autograd functions, while we are actually testing. For example, see https://codecov.io/gh/r9y9/nnmnkwii/src/fea5e54fa98c317d0d082225de4ebc32278d22ac/nnmnkwii/autograd/_impl/mlpg.py#L158.
|
test
|
find a correct way to collect test coverage of backward passes of autograd functions or simply ignore test coverage of these passes currently there are missing test coverage in autograd functions while we are actually testing for example see
| 1
|
279,195
| 24,205,772,755
|
IssuesEvent
|
2022-09-25 07:36:26
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
roachtest: kv/splits/nodes=3/quiesce=false failed
|
C-test-failure O-robot O-roachtest branch-master release-blocker
|
roachtest.kv/splits/nodes=3/quiesce=false [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6610307?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6610307?buildTab=artifacts#/kv/splits/nodes=3/quiesce=false) on master @ [51c8aae748d338549400c047796c6c9b892527da](https://github.com/cockroachdb/cockroach/commits/51c8aae748d338549400c047796c6c9b892527da):
```
| r12 0x6
| r13 0x2
| r14 0xc000102680
| r15 0x7f15679fbf40
| rip 0x49a101
| rflags 0x286
| cs 0x33
| fs 0x0
| gs 0x0
|
| stdout:
Wraps: (4) SSH_PROBLEM
Wraps: (5) Node 4. Command with error:
| ``````
| ./workload run kv --init --max-ops=1 --concurrency=192 --splits=30000 {pgurl:1-3}
| ``````
Wraps: (6) exit status 255
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) errors.SSH (5) *hintdetail.withDetail (6) *exec.ExitError
monitor.go:127,kv.go:729,test_runner.go:928: monitor failure: monitor task failed: t.Fatal() was called
(1) attached stack trace
-- stack trace:
| main.(*monitorImpl).WaitE
| main/pkg/cmd/roachtest/monitor.go:115
| main.(*monitorImpl).Wait
| main/pkg/cmd/roachtest/monitor.go:123
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.registerKVSplits.func1
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/kv.go:729
| main.(*testRunner).runTest.func2
| main/pkg/cmd/roachtest/test_runner.go:928
Wraps: (2) monitor failure
Wraps: (3) attached stack trace
-- stack trace:
| main.(*monitorImpl).wait.func2
| main/pkg/cmd/roachtest/monitor.go:171
Wraps: (4) monitor task failed
Wraps: (5) attached stack trace
-- stack trace:
| main.init
| main/pkg/cmd/roachtest/monitor.go:80
| runtime.doInit
| GOROOT/src/runtime/proc.go:6340
| runtime.main
| GOROOT/src/runtime/proc.go:233
| runtime.goexit
| GOROOT/src/runtime/asm_amd64.s:1594
Wraps: (6) t.Fatal() was called
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.leafError
test_runner.go:1059,test_runner.go:958: test timed out (2h0m0s)
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*kv/splits/nodes=3/quiesce=false.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: kv/splits/nodes=3/quiesce=false failed - roachtest.kv/splits/nodes=3/quiesce=false [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6610307?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6610307?buildTab=artifacts#/kv/splits/nodes=3/quiesce=false) on master @ [51c8aae748d338549400c047796c6c9b892527da](https://github.com/cockroachdb/cockroach/commits/51c8aae748d338549400c047796c6c9b892527da):
```
| r12 0x6
| r13 0x2
| r14 0xc000102680
| r15 0x7f15679fbf40
| rip 0x49a101
| rflags 0x286
| cs 0x33
| fs 0x0
| gs 0x0
|
| stdout:
Wraps: (4) SSH_PROBLEM
Wraps: (5) Node 4. Command with error:
| ``````
| ./workload run kv --init --max-ops=1 --concurrency=192 --splits=30000 {pgurl:1-3}
| ``````
Wraps: (6) exit status 255
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) errors.SSH (5) *hintdetail.withDetail (6) *exec.ExitError
monitor.go:127,kv.go:729,test_runner.go:928: monitor failure: monitor task failed: t.Fatal() was called
(1) attached stack trace
-- stack trace:
| main.(*monitorImpl).WaitE
| main/pkg/cmd/roachtest/monitor.go:115
| main.(*monitorImpl).Wait
| main/pkg/cmd/roachtest/monitor.go:123
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.registerKVSplits.func1
| github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/kv.go:729
| main.(*testRunner).runTest.func2
| main/pkg/cmd/roachtest/test_runner.go:928
Wraps: (2) monitor failure
Wraps: (3) attached stack trace
-- stack trace:
| main.(*monitorImpl).wait.func2
| main/pkg/cmd/roachtest/monitor.go:171
Wraps: (4) monitor task failed
Wraps: (5) attached stack trace
-- stack trace:
| main.init
| main/pkg/cmd/roachtest/monitor.go:80
| runtime.doInit
| GOROOT/src/runtime/proc.go:6340
| runtime.main
| GOROOT/src/runtime/proc.go:233
| runtime.goexit
| GOROOT/src/runtime/asm_amd64.s:1594
Wraps: (6) t.Fatal() was called
Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *withstack.withStack (4) *errutil.withPrefix (5) *withstack.withStack (6) *errutil.leafError
test_runner.go:1059,test_runner.go:958: test timed out (2h0m0s)
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*kv/splits/nodes=3/quiesce=false.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
roachtest kv splits nodes quiesce false failed roachtest kv splits nodes quiesce false with on master rip rflags cs fs gs stdout wraps ssh problem wraps node command with error workload run kv init max ops concurrency splits pgurl wraps exit status error types withstack withstack errutil withprefix cluster withcommanddetails errors ssh hintdetail withdetail exec exiterror monitor go kv go test runner go monitor failure monitor task failed t fatal was called attached stack trace stack trace main monitorimpl waite main pkg cmd roachtest monitor go main monitorimpl wait main pkg cmd roachtest monitor go github com cockroachdb cockroach pkg cmd roachtest tests registerkvsplits github com cockroachdb cockroach pkg cmd roachtest tests kv go main testrunner runtest main pkg cmd roachtest test runner go wraps monitor failure wraps attached stack trace stack trace main monitorimpl wait main pkg cmd roachtest monitor go wraps monitor task failed wraps attached stack trace stack trace main init main pkg cmd roachtest monitor go runtime doinit goroot src runtime proc go runtime main goroot src runtime proc go runtime goexit goroot src runtime asm s wraps t fatal was called error types withstack withstack errutil withprefix withstack withstack errutil withprefix withstack withstack errutil leaferror test runner go test runner go test timed out parameters roachtest cloud gce roachtest cpu roachtest encrypted false roachtest ssd help see see cc cockroachdb kv triage
| 1
|
72,879
| 24,343,412,646
|
IssuesEvent
|
2022-10-02 01:31:01
|
naev/naev
|
https://api.github.com/repos/naev/naev
|
closed
|
System name visible in notes
|
Type-Defect Priority-Low GoodFirstProject
|
LJ_Dude
—
Today 07:12
You can see the name of an undiscovered system with the note tool. Doesn't really tell you anything, but you're still probably not supposed to know.

|
1.0
|
System name visible in notes - LJ_Dude
—
Today 07:12
You can see the name of an undiscovered system with the note tool. Doesn't really tell you anything, but you're still probably not supposed to know.

|
non_test
|
system name visible in notes lj dude — today you can see the name of an undiscovered system with the note tool it doesn t really tell you anything but you re still probably not supposed to know
| 0
|
44,889
| 5,659,079,486
|
IssuesEvent
|
2017-04-10 12:00:27
|
tgstation/tgstation
|
https://api.github.com/repos/tgstation/tgstation
|
closed
|
Multiple reports of Radiation storms managing to tag/kill people in maintence
|
Bug Needs Reproducing/Testing
|
@Shadowlight213
Also the nuke op shuttle wasn't immune to the storm and it nearly wiped them out
|
1.0
|
Multiple reports of Radiation storms managing to tag/kill people in maintence - @Shadowlight213
Also the nuke op shuttle wasn't immune to the storm and it nearly wiped them out
|
test
|
multiple reports of radiation storms managing to tag kill people in maintence also the nuke op shuttle wasn t immune to the storm and it nearly wiped them out
| 1
|
128,288
| 10,523,692,380
|
IssuesEvent
|
2019-09-30 11:39:10
|
telstra/open-kilda
|
https://api.github.com/repos/telstra/open-kilda
|
opened
|
Refactor test for isl.unstable.timeout.sec
|
area/testing priority/3-normal
|
Refactor test `"System takes isl time_unstable info into account while creating a flow"`
1. Verify that `isl.unstable.timeout.sec` is respected and isl cost is no longer increased after timeout passes
2. Verify that proper 'isl.cost.when.port.down' is added
|
1.0
|
Refactor test for isl.unstable.timeout.sec - Refactor test `"System takes isl time_unstable info into account while creating a flow"`
1. Verify that `isl.unstable.timeout.sec` is respected and isl cost is no longer increased after timeout passes
2. Verify that proper 'isl.cost.when.port.down' is added
|
test
|
refactor test for isl unstable timeout sec refactor test system takes isl time unstable info into account while creating a flow verify that isl unstable timeout sec is respected and isl cost is no longer increased after timeout passes verify that proper isl cost when port down is added
| 1
|
511,495
| 14,861,439,114
|
IssuesEvent
|
2021-01-18 22:52:04
|
craftercms/craftercms
|
https://api.github.com/repos/craftercms/craftercms
|
closed
|
[commons] Upgrade manager is not updating all versions in the pipelines
|
bug priority: high
|
## Describe the bug
After the refactor done in https://github.com/craftercms/craftercms/issues/3801 it has been pointed out that the upgrade manager should write every version in the pipeline instead of only the last one.
## Expected behavior
The UM should track all versions in the pipelines to properly resume upgrades after errors
## Specs
### Version
3.2-SNAPSHOT
|
1.0
|
[commons] Upgrade manager is not updating all versions in the pipelines - ## Describe the bug
After the refactor done in https://github.com/craftercms/craftercms/issues/3801 it has been pointed out that the upgrade manager should write every version in the pipeline instead of only the last one.
## Expected behavior
The UM should track all versions in the pipelines to properly resume upgrades after errors
## Specs
### Version
3.2-SNAPSHOT
|
non_test
|
upgrade manager is not updating all versions in the pipelines describe the bug after the refactor done in it has been pointed out that the upgrade manager should write every version in the pipeline instead of only the last one expected behavior the um should track all versions in the pipelines to properly resume upgrades after errors specs version snapshot
| 0
|
95,839
| 8,579,107,459
|
IssuesEvent
|
2018-11-13 08:07:54
|
humera987/FXLabs-Test-Automation
|
https://api.github.com/repos/humera987/FXLabs-Test-Automation
|
closed
|
testing : ApiV1EnvsProjectIdIdGetQueryParamPagesizeDdos
|
testing
|
Project : testing
Job : UAT
Env : UAT
Region : US_WEST
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=MTM5ZjI0ZmEtYTU3OC00N2ViLWIzMGMtZmUzNDViYjI5MGE0; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Tue, 13 Nov 2018 06:58:51 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/envs/project-id/N5mi37lP?pageSize=1001
Request :
Response :
{
"timestamp" : "2018-11-13T06:58:51.307+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/envs/project-id/N5mi37lP"
}
Logs :
Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed]
--- FX Bot ---
|
1.0
|
testing : ApiV1EnvsProjectIdIdGetQueryParamPagesizeDdos - Project : testing
Job : UAT
Env : UAT
Region : US_WEST
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=MTM5ZjI0ZmEtYTU3OC00N2ViLWIzMGMtZmUzNDViYjI5MGE0; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Tue, 13 Nov 2018 06:58:51 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/envs/project-id/N5mi37lP?pageSize=1001
Request :
Response :
{
"timestamp" : "2018-11-13T06:58:51.307+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/envs/project-id/N5mi37lP"
}
Logs :
Assertion [@StatusCode != 401] resolved-to [404 != 401] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed]
--- FX Bot ---
|
test
|
testing project testing job uat env uat region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date endpoint request response timestamp status error not found message no message available path api api envs project id logs assertion resolved to result assertion resolved to result fx bot
| 1
|
144,407
| 13,104,997,688
|
IssuesEvent
|
2020-08-04 11:21:05
|
allenai/pybart
|
https://api.github.com/repos/allenai/pybart
|
opened
|
Refactoring
|
documentation enhancement
|
- [ ] Fix conversion-functions selection mechanism (also in the test suite)
- [ ] For each conversion-function (standing on 5/34):
- [ ] Move conversions functions to classes
- [ ] Simplify the 'change-graph' code, and try to move part of it to the 'match-code' (restrictions)
- [ ] Add documentation as much as possible
- [ ] Try to add support for ud-v2 and to multilingual on the fly (or should I net mix and drink, and seperate it to a different issue/PR)
This is the current suggestion for the refactoring task, which we should discuss @hillelt @yoavg
|
1.0
|
Refactoring - - [ ] Fix conversion-functions selection mechanism (also in the test suite)
- [ ] For each conversion-function (standing on 5/34):
- [ ] Move conversions functions to classes
- [ ] Simplify the 'change-graph' code, and try to move part of it to the 'match-code' (restrictions)
- [ ] Add documentation as much as possible
- [ ] Try to add support for ud-v2 and to multilingual on the fly (or should I net mix and drink, and seperate it to a different issue/PR)
This is the current suggestion for the refactoring task, which we should discuss @hillelt @yoavg
|
non_test
|
refactoring fix conversion functions selection mechanism also in the test suite for each conversion function standing on move conversions functions to classes simplify the change graph code and try to move part of it to the match code restrictions add documentation as much as possible try to add support for ud and to multilingual on the fly or should i net mix and drink and seperate it to a different issue pr this is the current suggestion for the refactoring task which we should discuss hillelt yoavg
| 0
|
32,452
| 8,855,324,371
|
IssuesEvent
|
2019-01-09 05:57:14
|
opencv/opencv
|
https://api.github.com/repos/opencv/opencv
|
closed
|
build error, 4.0, openvx enabled. "modules/core/include/opencv2/core/hal/interface.h:68:28: error: void value not ignored as it ought to be"
|
category: build/install priority: low
|
- OpenCV => 4.0
- Operating System / Platform => ARM 64bit
- Compiler => GCC 7.3.0
script: cmake -DOPENVX_ROOT=/usr -DWITH_OPENVX=ON -DENABLE_PRECOMPILED_HEADERS=NO -DCMAKE_BUILD_TYPE=Release ../
##### Steps to reproduce
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/kmeans.cpp.o
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lapack.cpp.o
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lda.cpp.o
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/logger.cpp.o
[ 28%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lpsolver.cpp.o
[ 28%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lut.cpp.o
In file included from /home/root/opencv/modules/core/include/opencv2/core/cvdef.h:165:0,
from /home/root/opencv/modules/core/include/opencv2/core.hpp:52,
from /home/root/opencv/modules/core/include/opencv2/core/utility.hpp:56,
from /home/root/opencv/modules/core/src/precomp.hpp:49,
from /home/root/opencv/modules/core/src/lut.cpp:6:
/home/root/opencv/3rdparty/openvx/include/ivx.hpp: In static member function 'static int ivx::Image::formatToMatType(vx_df_image, vx_uint32)':
/home/root/opencv/modules/core/include/opencv2/core/hal/interface.h:68:28: error: void value not ignored as it ought to be
#define CV_USRTYPE1 (void) "CV_USRTYPE1 support has been dropped in OpenCV 4.0"
^
/home/root/opencv/modules/core/include/opencv2/core/hal/interface.h:68:28: note: in definition of macro 'CV_USRTYPE1'
#define CV_USRTYPE1 (void) "CV_USRTYPE1 support has been dropped in OpenCV 4.0"
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
make[2]: *** [modules/core/CMakeFiles/opencv_core.dir/build.make:506: modules/core/CMakeFiles/opencv_core.dir/src/lut.cpp.o] Error 1
|
1.0
|
build error, 4.0, openvx enabled. "modules/core/include/opencv2/core/hal/interface.h:68:28: error: void value not ignored as it ought to be" - - OpenCV => 4.0
- Operating System / Platform => ARM 64bit
- Compiler => GCC 7.3.0
script: cmake -DOPENVX_ROOT=/usr -DWITH_OPENVX=ON -DENABLE_PRECOMPILED_HEADERS=NO -DCMAKE_BUILD_TYPE=Release ../
##### Steps to reproduce
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/kmeans.cpp.o
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lapack.cpp.o
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lda.cpp.o
[ 27%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/logger.cpp.o
[ 28%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lpsolver.cpp.o
[ 28%] Building CXX object modules/core/CMakeFiles/opencv_core.dir/src/lut.cpp.o
In file included from /home/root/opencv/modules/core/include/opencv2/core/cvdef.h:165:0,
from /home/root/opencv/modules/core/include/opencv2/core.hpp:52,
from /home/root/opencv/modules/core/include/opencv2/core/utility.hpp:56,
from /home/root/opencv/modules/core/src/precomp.hpp:49,
from /home/root/opencv/modules/core/src/lut.cpp:6:
/home/root/opencv/3rdparty/openvx/include/ivx.hpp: In static member function 'static int ivx::Image::formatToMatType(vx_df_image, vx_uint32)':
/home/root/opencv/modules/core/include/opencv2/core/hal/interface.h:68:28: error: void value not ignored as it ought to be
#define CV_USRTYPE1 (void) "CV_USRTYPE1 support has been dropped in OpenCV 4.0"
^
/home/root/opencv/modules/core/include/opencv2/core/hal/interface.h:68:28: note: in definition of macro 'CV_USRTYPE1'
#define CV_USRTYPE1 (void) "CV_USRTYPE1 support has been dropped in OpenCV 4.0"
^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
make[2]: *** [modules/core/CMakeFiles/opencv_core.dir/build.make:506: modules/core/CMakeFiles/opencv_core.dir/src/lut.cpp.o] Error 1
|
non_test
|
build error openvx enabled modules core include core hal interface h error void value not ignored as it ought to be opencv operating system platform arm compiler gcc script cmake dopenvx root usr dwith openvx on denable precompiled headers no dcmake build type release steps to reproduce building cxx object modules core cmakefiles opencv core dir src kmeans cpp o building cxx object modules core cmakefiles opencv core dir src lapack cpp o building cxx object modules core cmakefiles opencv core dir src lda cpp o building cxx object modules core cmakefiles opencv core dir src logger cpp o building cxx object modules core cmakefiles opencv core dir src lpsolver cpp o building cxx object modules core cmakefiles opencv core dir src lut cpp o in file included from home root opencv modules core include core cvdef h from home root opencv modules core include core hpp from home root opencv modules core include core utility hpp from home root opencv modules core src precomp hpp from home root opencv modules core src lut cpp home root opencv openvx include ivx hpp in static member function static int ivx image formattomattype vx df image vx home root opencv modules core include core hal interface h error void value not ignored as it ought to be define cv void cv support has been dropped in opencv home root opencv modules core include core hal interface h note in definition of macro cv define cv void cv support has been dropped in opencv make error
| 0
|
28,789
| 11,694,228,443
|
IssuesEvent
|
2020-03-06 03:17:18
|
fufunoyu/example-pip-travis
|
https://api.github.com/repos/fufunoyu/example-pip-travis
|
opened
|
CVE-2016-9190 (High) detected in Pillow-3.2.0.tar.gz
|
security vulnerability
|
## CVE-2016-9190 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Pillow-3.2.0.tar.gz</b></p></summary>
<p>Python Imaging Library (Fork)</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/e2/af/0a3981fffc5cd43078eb8b1057702e0dd2d5771e5aaa36cbd140e32f8473/Pillow-3.2.0.tar.gz">https://files.pythonhosted.org/packages/e2/af/0a3981fffc5cd43078eb8b1057702e0dd2d5771e5aaa36cbd140e32f8473/Pillow-3.2.0.tar.gz</a></p>
<p>Path to dependency file: /tmp/ws-scm/example-pip-travis/requirements.txt</p>
<p>Path to vulnerable library: /tmp/ws-scm/example-pip-travis/requirements.txt</p>
<p>
Dependency Hierarchy:
- image-1.5.5.tar.gz (Root Library)
- :x: **Pillow-3.2.0.tar.gz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fufunoyu/example-pip-travis/commit/3f5372056bdcd8eea6c1d9f545fe09c32d6d54ea">3f5372056bdcd8eea6c1d9f545fe09c32d6d54ea</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Pillow before 3.3.2 allows context-dependent attackers to execute arbitrary code by using the "crafted image file" approach, related to an "Insecure Sign Extension" issue affecting the ImagingNew in Storage.c component.
<p>Publish Date: 2016-11-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-9190>CVE-2016-9190</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2016-9190">https://nvd.nist.gov/vuln/detail/CVE-2016-9190</a></p>
<p>Release Date: 2016-11-04</p>
<p>Fix Resolution: 3.3.2</p>
</p>
</details>
<p></p>
|
True
|
CVE-2016-9190 (High) detected in Pillow-3.2.0.tar.gz - ## CVE-2016-9190 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Pillow-3.2.0.tar.gz</b></p></summary>
<p>Python Imaging Library (Fork)</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/e2/af/0a3981fffc5cd43078eb8b1057702e0dd2d5771e5aaa36cbd140e32f8473/Pillow-3.2.0.tar.gz">https://files.pythonhosted.org/packages/e2/af/0a3981fffc5cd43078eb8b1057702e0dd2d5771e5aaa36cbd140e32f8473/Pillow-3.2.0.tar.gz</a></p>
<p>Path to dependency file: /tmp/ws-scm/example-pip-travis/requirements.txt</p>
<p>Path to vulnerable library: /tmp/ws-scm/example-pip-travis/requirements.txt</p>
<p>
Dependency Hierarchy:
- image-1.5.5.tar.gz (Root Library)
- :x: **Pillow-3.2.0.tar.gz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fufunoyu/example-pip-travis/commit/3f5372056bdcd8eea6c1d9f545fe09c32d6d54ea">3f5372056bdcd8eea6c1d9f545fe09c32d6d54ea</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Pillow before 3.3.2 allows context-dependent attackers to execute arbitrary code by using the "crafted image file" approach, related to an "Insecure Sign Extension" issue affecting the ImagingNew in Storage.c component.
<p>Publish Date: 2016-11-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-9190>CVE-2016-9190</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2016-9190">https://nvd.nist.gov/vuln/detail/CVE-2016-9190</a></p>
<p>Release Date: 2016-11-04</p>
<p>Fix Resolution: 3.3.2</p>
</p>
</details>
<p></p>
|
non_test
|
cve high detected in pillow tar gz cve high severity vulnerability vulnerable library pillow tar gz python imaging library fork library home page a href path to dependency file tmp ws scm example pip travis requirements txt path to vulnerable library tmp ws scm example pip travis requirements txt dependency hierarchy image tar gz root library x pillow tar gz vulnerable library found in head commit a href vulnerability details pillow before allows context dependent attackers to execute arbitrary code by using the crafted image file approach related to an insecure sign extension issue affecting the imagingnew in storage c component publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
| 0
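The "Insecure Sign Extension" wording in the Pillow record above describes a general C bug class: a byte or short read from a crafted file is interpreted as signed, turning a large value negative and corrupting a later size computation. A minimal illustration of the class (not Pillow's actual Storage.c code) using typed arrays:

```javascript
// Illustration only of the sign-extension bug class behind CVE-2016-9190,
// not Pillow's actual code: the same raw byte from a crafted file reads as
// -1 when sign-extended but 255 when treated as unsigned, so a signed read
// can flip a large dimension negative and corrupt an allocation size.
const raw = new Uint8Array([0xff]);              // byte taken from the file
const asSigned = new Int8Array(raw.buffer)[0];   // sign-extended view
const asUnsigned = raw[0];                       // intended unsigned view
```

The fix in Pillow 3.3.2 amounts to keeping such reads in the unsigned interpretation.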
|
167,151
| 6,333,178,851
|
IssuesEvent
|
2017-07-26 14:16:04
|
azavea/raster-foundry
|
https://api.github.com/repos/azavea/raster-foundry
|
opened
|
Implement selection of visualization options
|
priority
|
**Nature of data**
- sequential
- diverging
- categorical
**Color Scheme**
UI and service
|
1.0
|
Implement selection of visualization options - **Nature of data**
- sequential
- diverging
- categorical
**Color Scheme**
UI and service
|
non_test
|
implement selection of visualization options nature of data sequential diverging categorical color scheme ui and service
| 0
|
352,719
| 25,079,893,445
|
IssuesEvent
|
2022-11-07 18:20:45
|
octokit/auth-token.js
|
https://api.github.com/repos/octokit/auth-token.js
|
closed
|
headers are undefined
|
Good first issue Priority: Normal Type: Documentation
|
Following the instructions in the [readme](https://github.com/octokit/auth-token.js#find-out-what-scopes-are-enabled-for-oauth-tokens) of this example code:
```js
const TOKEN = "ghp_PersonalAccessToken01245678900000000";
const auth = createTokenAuth(TOKEN);
const authentication = await auth();
const response = await request("HEAD /", {
headers: authentication.headers,
});
const scopes = response.headers["x-oauth-scopes"].split(/,\s+/);
if (scopes.length) {
console.log(
`"${TOKEN}" has ${scopes.length} scopes enabled: ${scopes.join(", ")}`
);
} else {
console.log(`"${TOKEN}" has no scopes enabled`);
}
```
I get this error:
```
const scopes = response.headers['x-oauth-scopes'].split(/,\s+/)
^
TypeError: Cannot read properties of undefined (reading 'split')
```
Also, `authentication.headers` is always `undefined`. I don't know why it has to be set in the request, since there is no `header` property in the [authentication object](https://github.com/octokit/auth-token.js#authentication-object).
There is a closed but unanswered issue on the same topic: https://github.com/octokit/auth-token.js/issues/205
|
1.0
|
headers are undefined - Following the instructions in the [readme](https://github.com/octokit/auth-token.js#find-out-what-scopes-are-enabled-for-oauth-tokens) of this example code:
```js
const TOKEN = "ghp_PersonalAccessToken01245678900000000";
const auth = createTokenAuth(TOKEN);
const authentication = await auth();
const response = await request("HEAD /", {
headers: authentication.headers,
});
const scopes = response.headers["x-oauth-scopes"].split(/,\s+/);
if (scopes.length) {
console.log(
`"${TOKEN}" has ${scopes.length} scopes enabled: ${scopes.join(", ")}`
);
} else {
console.log(`"${TOKEN}" has no scopes enabled`);
}
```
I get this error:
```
const scopes = response.headers['x-oauth-scopes'].split(/,\s+/)
^
TypeError: Cannot read properties of undefined (reading 'split')
```
Also, `authentication.headers` is always `undefined`. I don't know why it has to be set in the request, since there is no `header` property in the [authentication object](https://github.com/octokit/auth-token.js#authentication-object).
There is a closed but unanswered issue on the same topic: https://github.com/octokit/auth-token.js/issues/205
|
non_test
|
headers are undefined following the instructions in the of this example code js const token ghp const auth createtokenauth token const authentication await auth const response await request head headers authentication headers const scopes response headers split s if scopes length console log token has scopes length scopes enabled scopes join else console log token has no scopes enabled i get this error const scopes response headers split s typeerror cannot read properties of undefined reading split also authentication headers is always undefined i don t know why it has to be set in the request since there is no header property in the there is a closed but unanswered issue on the same topic
| 0
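The octokit record above fails on two fronts: `authentication.headers` is `undefined`, and `response.headers["x-oauth-scopes"]` is absent for some token types, so calling `.split()` on it throws. A defensive parse — sketched here as a hypothetical helper, not part of `@octokit/auth-token` — avoids the crash:

```javascript
// Hypothetical helper (not part of @octokit/auth-token): read the classic
// "x-oauth-scopes" response header defensively. The header is missing for
// fine-grained and some app tokens, so indexing it can yield undefined.
function parseScopes(headers) {
  const raw = headers && headers["x-oauth-scopes"];
  if (typeof raw !== "string" || raw.trim() === "") {
    return []; // no header (or empty) -> report no scopes instead of throwing
  }
  return raw.split(/,\s*/).map((s) => s.trim()).filter(Boolean);
}
```

With this, `parseScopes(response.headers)` returns `[]` instead of raising `TypeError: Cannot read properties of undefined`.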
|
3,645
| 6,527,700,009
|
IssuesEvent
|
2017-08-30 02:34:52
|
pingcap/tidb
|
https://api.github.com/repos/pingcap/tidb
|
closed
|
SHOW CREATE TABLE not escaped
|
bug compatibility for-new-contributors help wanted
|
For example, single quote in comment when CREATE TABLE will lead to incorrect syntax CREATE TABLE.
```sql
CREATE TABLE `t1` (
`c1` int(11) NOT NULL AUTO_INCREMENT,
`c2` varchar(100) DEFAULT NULL,
`c3` varchar(100) DEFAULT NULL,
PRIMARY KEY (`c1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='DoubleQuotes"SingleQuotes\'Hello'
```
TiDB:
```
mysql> show create table t1;
+-------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Table | Create Table |
+-------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| t1 | CREATE TABLE `t1` (
`c1` int(11) NOT NULL AUTO_INCREMENT,
`c2` varchar(100) DEFAULT NULL,
`c3` varchar(100) DEFAULT NULL,
PRIMARY KEY (`c1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin COMMENT='DoubleQuotes"SingleQuotes'Hello' |
+-------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
```
MySQL:
```
mysql> show create table t1;
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Table | Create Table |
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| t1 | CREATE TABLE `t1` (
`c1` int(11) NOT NULL AUTO_INCREMENT,
`c2` varchar(100) DEFAULT NULL,
`c3` varchar(100) DEFAULT NULL,
PRIMARY KEY (`c1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='DoubleQuotes"SingleQuotes''Hello' |
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
```
|
True
|
SHOW CREATE TABLE not escaped - For example, single quote in comment when CREATE TABLE will lead to incorrect syntax CREATE TABLE.
```sql
CREATE TABLE `t1` (
`c1` int(11) NOT NULL AUTO_INCREMENT,
`c2` varchar(100) DEFAULT NULL,
`c3` varchar(100) DEFAULT NULL,
PRIMARY KEY (`c1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='DoubleQuotes"SingleQuotes\'Hello'
```
TiDB:
```
mysql> show create table t1;
+-------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Table | Create Table |
+-------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| t1 | CREATE TABLE `t1` (
`c1` int(11) NOT NULL AUTO_INCREMENT,
`c2` varchar(100) DEFAULT NULL,
`c3` varchar(100) DEFAULT NULL,
PRIMARY KEY (`c1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COLLATE=utf8_bin COMMENT='DoubleQuotes"SingleQuotes'Hello' |
+-------+------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
```
MySQL:
```
mysql> show create table t1;
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| Table | Create Table |
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
| t1 | CREATE TABLE `t1` (
`c1` int(11) NOT NULL AUTO_INCREMENT,
`c2` varchar(100) DEFAULT NULL,
`c3` varchar(100) DEFAULT NULL,
PRIMARY KEY (`c1`)
) ENGINE=InnoDB DEFAULT CHARSET=utf8 COMMENT='DoubleQuotes"SingleQuotes''Hello' |
+-------+--------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------------+
1 row in set (0.00 sec)
```
|
non_test
|
show create table not escaped for example single quote in comment when create table will lead to incorrect syntax create table sql create table int not null auto increment varchar default null varchar default null primary key engine innodb default charset comment doublequotes singlequotes hello tidb mysql show create table table create table create table int not null auto increment varchar default null varchar default null primary key engine innodb default charset collate bin comment doublequotes singlequotes hello row in set sec mysql mysql show create table table create table create table int not null auto increment varchar default null varchar default null primary key engine innodb default charset comment doublequotes singlequotes hello row in set sec
| 0
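The TiDB record above hinges on MySQL's quoting rule: a single quote inside a single-quoted literal is rendered by doubling it (`''`), which TiDB's `SHOW CREATE TABLE` output omits for the `COMMENT` clause. A minimal sketch of that rule (illustrative, not TiDB or MySQL source):

```javascript
// Sketch of MySQL-style string-literal quoting (quote doubling), the rule
// the TiDB output above violates. Not taken from TiDB/MySQL source code.
function quoteSqlString(value) {
  return "'" + value.replace(/'/g, "''") + "'";
}
```

For the comment in the record, `quoteSqlString('DoubleQuotes"SingleQuotes\'Hello')` produces `'DoubleQuotes"SingleQuotes''Hello'`, matching MySQL's output.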
|
65,766
| 14,761,894,342
|
IssuesEvent
|
2021-01-09 00:53:20
|
AlexRogalskiy/electron-vue-template
|
https://api.github.com/repos/AlexRogalskiy/electron-vue-template
|
opened
|
CVE-2020-7656 (Medium) detected in jquery-1.7.1.min.js
|
security vulnerability
|
## CVE-2020-7656 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p>
<p>Path to dependency file: electron-vue-template/node_modules/sockjs/examples/express-3.x/index.html</p>
<p>Path to vulnerable library: electron-vue-template/node_modules/sockjs/examples/express-3.x/index.html,electron-vue-template/node_modules/sockjs/examples/hapi/html/index.html,electron-vue-template/node_modules/sockjs/examples/echo/index.html,electron-vue-template/node_modules/sockjs/examples/multiplex/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.7.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/electron-vue-template/commit/e180436ddc869ab181e9108f09eafef3237f5eb6">e180436ddc869ab181e9108f09eafef3237f5eb6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed.
<p>Publish Date: 2020-05-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568">https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568</a></p>
<p>Release Date: 2020-05-19</p>
<p>Fix Resolution: jquery-rails - 2.2.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-7656 (Medium) detected in jquery-1.7.1.min.js - ## CVE-2020-7656 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.7.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p>
<p>Path to dependency file: electron-vue-template/node_modules/sockjs/examples/express-3.x/index.html</p>
<p>Path to vulnerable library: electron-vue-template/node_modules/sockjs/examples/express-3.x/index.html,electron-vue-template/node_modules/sockjs/examples/hapi/html/index.html,electron-vue-template/node_modules/sockjs/examples/echo/index.html,electron-vue-template/node_modules/sockjs/examples/multiplex/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.7.1.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/electron-vue-template/commit/e180436ddc869ab181e9108f09eafef3237f5eb6">e180436ddc869ab181e9108f09eafef3237f5eb6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jquery prior to 1.9.0 allows Cross-site Scripting attacks via the load method. The load method fails to recognize and remove "<script>" HTML tags that contain a whitespace character, i.e: "</script >", which results in the enclosed script logic to be executed.
<p>Publish Date: 2020-05-19
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7656>CVE-2020-7656</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568">https://github.com/rails/jquery-rails/commit/8f601cbfa08749ee5bbd2bffb6e509db9d753568</a></p>
<p>Release Date: 2020-05-19</p>
<p>Fix Resolution: jquery-rails - 2.2.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file electron vue template node modules sockjs examples express x index html path to vulnerable library electron vue template node modules sockjs examples express x index html electron vue template node modules sockjs examples hapi html index html electron vue template node modules sockjs examples echo index html electron vue template node modules sockjs examples multiplex index html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details jquery prior to allows cross site scripting attacks via the load method the load method fails to recognize and remove html tags that contain a whitespace character i e which results in the enclosed script logic to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery rails step up your open source security game with whitesource
| 0
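The jQuery record above turns on a parsing gap: a script-stripping pattern that requires the literal `</script>` misses a closing tag written with whitespace (`</script >`), so the enclosed script survives. The regexes below are illustrative only, not jQuery's actual source:

```javascript
// Illustrative regexes (not jQuery's source) showing the CVE-2020-7656 gap:
// the naive pattern demands "</script>" verbatim and so fails to strip a
// tag closed as "</script >", leaving the script payload in the markup.
const naive = /<script\b[^>]*>[\s\S]*?<\/script>/gi;
const fixed = /<script\b[^>]*>[\s\S]*?<\/script\s*>/gi; // tolerate whitespace

const payload = '<div>ok</div><script>alert(1)</script >';
const afterNaive = payload.replace(naive, ""); // script tag still present
const afterFixed = payload.replace(fixed, ""); // script tag stripped
```

The upgrade to jQuery 1.9.0+ (jquery-rails 2.2.0 in the suggested fix) ships the whitespace-tolerant behavior.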
|
242,753
| 20,261,865,812
|
IssuesEvent
|
2022-02-15 08:22:53
|
tempus-finance/tempus-app
|
https://api.github.com/repos/tempus-finance/tempus-app
|
closed
|
Yield details visibility should be consistent for Fixed/Variable
|
bug ready for test
|
Prerequisites:
- Connect MetaMask to Tempus App
- Switch MetaMask to Fantom network ([details](https://docs.fantom.foundation/tutorials/set-up-metamask))
Steps:
1. Navigate to https://tempus-app-testnet-v2.web.app/
2. Expand YFI.
3. Manage.
4. Pick YFI in "From" dropdown box.
5. Type in "0" amount.
Expected:
The Variable Yield details should be the same as Fixed Yield details.
Actual:

|
1.0
|
Yield details visibility should be consistent for Fixed/Variable - Prerequisites:
- Connect MetaMask to Tempus App
- Switch MetaMask to Fantom network ([details](https://docs.fantom.foundation/tutorials/set-up-metamask))
Steps:
1. Navigate to https://tempus-app-testnet-v2.web.app/
2. Expand YFI.
3. Manage.
4. Pick YFI in "From" dropdown box.
5. Type in "0" amount.
Expected:
The Variable Yield details should be the same as Fixed Yield details.
Actual:

|
test
|
yield details visibility should be consistent for fixed variable prerequest connect metamask to tempus app switch metamask to fantom network steps navigate to expand yfi manage pick yfi in from dropdown box type in amount expected the variable yield details should be the same as fixed yield details actual
| 1
|
356,176
| 25,176,130,346
|
IssuesEvent
|
2022-11-11 09:25:13
|
adeearyaa/pe
|
https://api.github.com/repos/adeearyaa/pe
|
opened
|
Hardly visible diagrams
|
severity.VeryLow type.DocumentationBug
|

Diagram in the user guide cannot be read clearly by the reader
<!--session: 1668153101613-87df3e8a-e823-41da-a9bb-388c22a8a3ba-->
<!--Version: Web v3.4.4-->
|
1.0
|
Hardly visible diagrams - 
Diagram in the user guide cannot be read clearly by the reader
<!--session: 1668153101613-87df3e8a-e823-41da-a9bb-388c22a8a3ba-->
<!--Version: Web v3.4.4-->
|
non_test
|
hardly visible diagrams diagram in the user guide cannot be read clearly by the reader
| 0
|
246,846
| 20,920,485,780
|
IssuesEvent
|
2022-03-24 16:55:58
|
vegaprotocol/frontend-monorepo
|
https://api.github.com/repos/vegaprotocol/frontend-monorepo
|
closed
|
UI Testing Stats page
|
Testing 🧪 Stats app
|
Now that the stats page has been migrated to mono repo we can add Cypress tests to assert that everything is displayed correctly
|
1.0
|
UI Testing Stats page - Now that the stats page has been migrated to mono repo we can add Cypress tests to assert that everything is displayed correctly
|
test
|
ui testing stats page now that the stats page has been migrated to mono repo we can add cypress tests to assert that everything is displayed correctly
| 1
|
74,884
| 3,449,334,903
|
IssuesEvent
|
2015-12-16 13:11:36
|
YJSoft/xe-module-loginxeserver
|
https://api.github.com/repos/YJSoft/xe-module-loginxeserver
|
opened
|
Change the authentication scheme from ID/KEY to ID/Public Key
|
priority/high status/enhancement
|
Generate a private/public key pair on the client and register the public key with the server, so that the server encrypts with the public key and the client can decrypt with its private key.
|
1.0
|
Change the authentication scheme from ID/KEY to ID/Public Key - Generate a private/public key pair on the client and register the public key with the server, so that the server encrypts with the public key and the client can decrypt with its private key.
|
non_test
|
change the authentication scheme from id key to id public key generate a private public key pair on the client and register the public key with the server so that the server encrypts with the public key and the client can decrypt with its private key
| 0
|
94,752
| 8,515,447,710
|
IssuesEvent
|
2018-10-31 21:38:10
|
hashmapinc/WitsmlApi-Server
|
https://api.github.com/repos/hashmapinc/WitsmlApi-Server
|
closed
|
Create a skeleton project that serves a legal WITSML API
|
API Testing
|
A skeleton Spring Boot / CXF application that can execute getBaseMsg/getVersion/getCap from the PDS client
|
1.0
|
Create a skeleton project that serves a legal WITSML API - A skeleton Spring Boot / CXF application that can execute getBaseMsg/getVersion/getCap from the PDS client
|
test
|
create a skeleton project that serves a legal witsml api a skeleton spring boot cxf application that can execute getbasemsg getversion getcap from the pds client
| 1
|
149,373
| 11,898,927,332
|
IssuesEvent
|
2020-03-30 08:07:52
|
hazelcast/hazelcast-jet
|
https://api.github.com/repos/hazelcast/hazelcast-jet
|
closed
|
com.hazelcast.jet.impl.connector.JmsIntegration_NonSharedClusterTest.com.hazelcast.jet.impl.connector.JmsIntegration_NonSharedClusterTest
|
test-failure
|
_master_ (commit b5a714f5e76ed142a313959dacfee786e75249a8)
Failed on Zulu JDK 8: http://jenkins.hazelcast.com/job/jet-oss-master-zulu-jdk8/187/testReport/junit/com.hazelcast.jet.impl.connector/JmsIntegration_NonSharedClusterTest/com_hazelcast_jet_impl_connector_JmsIntegration_NonSharedClusterTest/
Stacktrace:
```
org.junit.runners.model.TestTimedOutException: test timed out after 900 seconds
at java.lang.Object.wait(Native Method)
at org.apache.activemq.transport.failover.FailoverTransport.oneway(FailoverTransport.java:623)
at org.apache.activemq.transport.MutexTransport.oneway(MutexTransport.java:68)
at org.apache.activemq.transport.ResponseCorrelator.asyncRequest(ResponseCorrelator.java:81)
at org.apache.activemq.transport.ResponseCorrelator.request(ResponseCorrelator.java:86)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1392)
at org.apache.activemq.ActiveMQConnection.ensureConnectionInfoSent(ActiveMQConnection.java:1486)
at org.apache.activemq.ActiveMQConnection.createSession(ActiveMQConnection.java:329)
at org.apache.activemq.junit.EmbeddedActiveMQBroker$InternalClient.start(EmbeddedActiveMQBroker.java:711)
at org.apache.activemq.junit.EmbeddedActiveMQBroker.start(EmbeddedActiveMQBroker.java:131)
at org.apache.activemq.junit.EmbeddedActiveMQBroker.before(EmbeddedActiveMQBroker.java:170)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46)
at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:298)
at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:292)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:748)
```
Standard output:
```
INFO | Starting embedded ActiveMQ broker: embedded-broker
WARN | Memory Usage for the Broker (1024mb) is more than the maximum available for the JVM: 1011 mb - resetting to 70% of maximum available: 708 mb
INFO | Using Persistence Adapter: MemoryPersistenceAdapter
INFO | Apache ActiveMQ 5.15.11 (embedded-broker, ID:64e2684f4972-38725-1585360487282-0:2) is starting
INFO | Apache ActiveMQ 5.15.11 (embedded-broker, ID:64e2684f4972-38725-1585360487282-0:2) started
INFO | For help or more information please see: http://activemq.apache.org
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
```
|
1.0
|
com.hazelcast.jet.impl.connector.JmsIntegration_NonSharedClusterTest.com.hazelcast.jet.impl.connector.JmsIntegration_NonSharedClusterTest - _master_ (commit b5a714f5e76ed142a313959dacfee786e75249a8)
Failed on Zulu JDK 8: http://jenkins.hazelcast.com/job/jet-oss-master-zulu-jdk8/187/testReport/junit/com.hazelcast.jet.impl.connector/JmsIntegration_NonSharedClusterTest/com_hazelcast_jet_impl_connector_JmsIntegration_NonSharedClusterTest/
Stacktrace:
```
org.junit.runners.model.TestTimedOutException: test timed out after 900 seconds
at java.lang.Object.wait(Native Method)
at org.apache.activemq.transport.failover.FailoverTransport.oneway(FailoverTransport.java:623)
at org.apache.activemq.transport.MutexTransport.oneway(MutexTransport.java:68)
at org.apache.activemq.transport.ResponseCorrelator.asyncRequest(ResponseCorrelator.java:81)
at org.apache.activemq.transport.ResponseCorrelator.request(ResponseCorrelator.java:86)
at org.apache.activemq.ActiveMQConnection.syncSendPacket(ActiveMQConnection.java:1392)
at org.apache.activemq.ActiveMQConnection.ensureConnectionInfoSent(ActiveMQConnection.java:1486)
at org.apache.activemq.ActiveMQConnection.createSession(ActiveMQConnection.java:329)
at org.apache.activemq.junit.EmbeddedActiveMQBroker$InternalClient.start(EmbeddedActiveMQBroker.java:711)
at org.apache.activemq.junit.EmbeddedActiveMQBroker.start(EmbeddedActiveMQBroker.java:131)
at org.apache.activemq.junit.EmbeddedActiveMQBroker.before(EmbeddedActiveMQBroker.java:170)
at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:46)
at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:298)
at org.junit.internal.runners.statements.FailOnTimeout$CallableStatement.call(FailOnTimeout.java:292)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.lang.Thread.run(Thread.java:748)
```
Standard output:
```
INFO | Starting embedded ActiveMQ broker: embedded-broker
WARN | Memory Usage for the Broker (1024mb) is more than the maximum available for the JVM: 1011 mb - resetting to 70% of maximum available: 708 mb
INFO | Using Persistence Adapter: MemoryPersistenceAdapter
INFO | Apache ActiveMQ 5.15.11 (embedded-broker, ID:64e2684f4972-38725-1585360487282-0:2) is starting
INFO | Apache ActiveMQ 5.15.11 (embedded-broker, ID:64e2684f4972-38725-1585360487282-0:2) started
INFO | For help or more information please see: http://activemq.apache.org
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 10 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 20 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 30 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
WARN | Failed to connect to [vm://embedded-broker?create=false] after: 40 attempt(s) continuing to retry.
```
|
test
|
com hazelcast jet impl connector jmsintegration nonsharedclustertest com hazelcast jet impl connector jmsintegration nonsharedclustertest master commit failed on zulu jdk stacktrace org junit runners model testtimedoutexception test timed out after seconds at java lang object wait native method at org apache activemq transport failover failovertransport oneway failovertransport java at org apache activemq transport mutextransport oneway mutextransport java at org apache activemq transport responsecorrelator asyncrequest responsecorrelator java at org apache activemq transport responsecorrelator request responsecorrelator java at org apache activemq activemqconnection syncsendpacket activemqconnection java at org apache activemq activemqconnection ensureconnectioninfosent activemqconnection java at org apache activemq activemqconnection createsession activemqconnection java at org apache activemq junit embeddedactivemqbroker internalclient start embeddedactivemqbroker java at org apache activemq junit embeddedactivemqbroker start embeddedactivemqbroker java at org apache activemq junit embeddedactivemqbroker before embeddedactivemqbroker java at org junit rules externalresource evaluate externalresource java at org junit internal runners statements failontimeout callablestatement call failontimeout java at org junit internal runners statements failontimeout callablestatement call failontimeout java at java util concurrent futuretask run futuretask java at java lang thread run thread java standard output info starting embedded activemq broker embedded broker warn memory usage for the broker is more than the maximum available for the jvm mb resetting to of maximum available mb info using persistence adapter memorypersistenceadapter info apache activemq embedded broker id is starting info apache activemq embedded broker id started info for help or more information please see warn failed to connect to after attempt s continuing to retry warn failed to connect to after 
attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry warn failed to connect to after attempt s continuing to retry
| 1
|
50,161
| 6,062,680,609
|
IssuesEvent
|
2017-06-14 10:00:23
|
khartec/waltz
|
https://api.github.com/repos/khartec/waltz
|
closed
|
Move Entity Named Notes into own section
|
enhancement fixed (test & close)
|
Section called `Additional Notes`
Hide section if no notes & unable to add them
|
1.0
|
Move Entity Named Notes into own section - Section called `Additional Notes`
Hide section if no notes & unable to add them
|
test
|
move entity named notes into own section section called additional notes hide section if no notes unable to add them
| 1
|
288,629
| 8,849,609,892
|
IssuesEvent
|
2019-01-08 10:44:05
|
aiidateam/aiida_core
|
https://api.github.com/repos/aiidateam/aiida_core
|
closed
|
Add support for slicing and indexing in `Group.nodes` property
|
priority/nice to have topic/ORM type/accepted feature
|
Without this functionality, one while always have to needlessly iterate over the array to get the items of interest
|
1.0
|
Add support for slicing and indexing in `Group.nodes` property - Without this functionality, one while always have to needlessly iterate over the array to get the items of interest
|
non_test
|
add support for slicing and indexing in group nodes property without this functionality one while always have to needlessly iterate over the array to get the items of interest
| 0
|
750,447
| 26,202,327,077
|
IssuesEvent
|
2023-01-03 18:43:55
|
qutebrowser/qutebrowser
|
https://api.github.com/repos/qutebrowser/qutebrowser
|
reopened
|
Google sheets renders black text as white with qt6 branch
|
priority: 1 - middle status: needs triage bug: behavior qt: 6
|
**Version info**:
```
qutebrowser v2.5.2
Git commit: ca667d642 on makepkg (2022-11-20 16:15:06 +1300)
Backend: QtWebEngine 6.4.1, based on Chromium 102.0.5005.177 (from api)
Qt: 6.4.1 (compiled 6.4.0)
CPython: 3.10.8
PyQt: 6.4.0
sip: no
colorama: 0.4.6
jinja2: 3.1.2
pygments: 2.13.0
yaml: 6.0
adblock: 0.6.0
objc: no
PyQt6.QtWebEngineCore: 6.4.0
pdf.js: 3.0.279 (/usr/share/pdf.js/build/pdf.js)
sqlite: 3.40.0
QtNetwork SSL: OpenSSL 3.0.7 1 Nov 2022
Style: QFusionStyle
Platform plugin: xcb
OpenGL: Intel, 4.6 (Compatibility Profile) Mesa 22.2.3
Platform: Linux-5.15.79-1-lts-x86_64-with-glibc2.36, 64bit
Linux distribution: Arch Linux (arch)
Frozen: False
Imported from /usr/lib/python3.10/site-packages/qutebrowser
Using Python from /usr/bin/python3
Qt library executable path: /usr/lib/qt6, data path: /usr/share/qt6
Paths:
cache: /run/user/1000/qutebrowser/Web/cache
config: /run/user/1000/qutebrowser/Web/config
data: /run/user/1000/qutebrowser/Web/data
runtime: /run/user/1000/qutebrowser/Web/runtime
system data: /usr/share/qutebrowser
Autoconfig loaded: no
Config.py: /run/user/1000/qutebrowser/Web/config/config.py has been loaded
Uptime: 0:20:22
```
**Does the bug happen if you start with `--temp-basedir`?**:
Yes
**Description**
Since some time (~1 month) Google sheets renders black text as white with qt6 branch.
I tried running qutebrowser without any configuration and got the same results. I also tried downgrading qt to version 6.3 to no avail.
**How to reproduce**
<!-- Link to the affected site, or steps to reproduce the issue (if possible/applicable). -->
Open any Google sheet, observe that the cell contents are white, regardless of the font color.
|
1.0
|
Google sheets renders black text as white with qt6 branch - **Version info**:
```
qutebrowser v2.5.2
Git commit: ca667d642 on makepkg (2022-11-20 16:15:06 +1300)
Backend: QtWebEngine 6.4.1, based on Chromium 102.0.5005.177 (from api)
Qt: 6.4.1 (compiled 6.4.0)
CPython: 3.10.8
PyQt: 6.4.0
sip: no
colorama: 0.4.6
jinja2: 3.1.2
pygments: 2.13.0
yaml: 6.0
adblock: 0.6.0
objc: no
PyQt6.QtWebEngineCore: 6.4.0
pdf.js: 3.0.279 (/usr/share/pdf.js/build/pdf.js)
sqlite: 3.40.0
QtNetwork SSL: OpenSSL 3.0.7 1 Nov 2022
Style: QFusionStyle
Platform plugin: xcb
OpenGL: Intel, 4.6 (Compatibility Profile) Mesa 22.2.3
Platform: Linux-5.15.79-1-lts-x86_64-with-glibc2.36, 64bit
Linux distribution: Arch Linux (arch)
Frozen: False
Imported from /usr/lib/python3.10/site-packages/qutebrowser
Using Python from /usr/bin/python3
Qt library executable path: /usr/lib/qt6, data path: /usr/share/qt6
Paths:
cache: /run/user/1000/qutebrowser/Web/cache
config: /run/user/1000/qutebrowser/Web/config
data: /run/user/1000/qutebrowser/Web/data
runtime: /run/user/1000/qutebrowser/Web/runtime
system data: /usr/share/qutebrowser
Autoconfig loaded: no
Config.py: /run/user/1000/qutebrowser/Web/config/config.py has been loaded
Uptime: 0:20:22
```
**Does the bug happen if you start with `--temp-basedir`?**:
Yes
**Description**
Since some time (~1 month) Google sheets renders black text as white with qt6 branch.
I tried running qutebrowser without any configuration and got the same results. I also tried downgrading qt to version 6.3 to no avail.
**How to reproduce**
<!-- Link to the affected site, or steps to reproduce the issue (if possible/applicable). -->
Open any Google sheet, observe that the cell contents are white, regardless of the font color.
|
non_test
|
google sheets renders black text as white with branch version info qutebrowser git commit on makepkg backend qtwebengine based on chromium from api qt compiled cpython pyqt sip no colorama pygments yaml adblock objc no qtwebenginecore pdf js usr share pdf js build pdf js sqlite qtnetwork ssl openssl nov style qfusionstyle platform plugin xcb opengl intel compatibility profile mesa platform linux lts with linux distribution arch linux arch frozen false imported from usr lib site packages qutebrowser using python from usr bin qt library executable path usr lib data path usr share paths cache run user qutebrowser web cache config run user qutebrowser web config data run user qutebrowser web data runtime run user qutebrowser web runtime system data usr share qutebrowser autoconfig loaded no config py run user qutebrowser web config config py has been loaded uptime does the bug happen if you start with temp basedir yes description since some time month google sheets renders black text as white with branch i tried running qutebrowser without any configuration and got the same results i also tried downgrading qt to version to no avail how to reproduce open any google sheet observe that the cell contents are white regardless of the font color
| 0
|
26,471
| 2,684,556,306
|
IssuesEvent
|
2015-03-29 03:31:39
|
gtcasl/gpuocelot
|
https://api.github.com/repos/gtcasl/gpuocelot
|
opened
|
Ocelot won't interpretate ld.volatile.v4.f32 instruction
|
bug imported Priority-Medium
|
_From [unde...@gmail.com](https://code.google.com/u/100354420691982552014/) on June 04, 2012 18:00:32_
What steps will reproduce the problem? 1. Compile particles app in NVIDIA SDK 4.1 using nvcc 4.2 (nvcc -arch sm_20), exporting ptx file.
2. Use ptxOptimizer to parse output ptx file. What is the expected output? What do you see instead? Should create new ptx code.
Prints line and error:
1199 ld.volatile.v4.f32 {%f123, %f124, %f125, %f126}, [%rl19];
(1199, 12): syntax error, unexpected TOKEN_V4 What version of the product are you using? On what operating system? ocelot svn 1940
ptxgramar.ll won't accept a ".v4" token after a ".volatile" token, it requires to have a "addressSpace" between them.
Possible solution:
Add construction
ldModifier : TOKEN_VOLATILE instructionVectorType
{
state.volatileFlag( true );
};
**Attachment:** [make.particles.sdk41.log.bz2](http://code.google.com/p/gpuocelot/issues/detail?id=69)
_Original issue: http://code.google.com/p/gpuocelot/issues/detail?id=69_
|
1.0
|
Ocelot won't interpretate ld.volatile.v4.f32 instruction - _From [unde...@gmail.com](https://code.google.com/u/100354420691982552014/) on June 04, 2012 18:00:32_
What steps will reproduce the problem? 1. Compile particles app in NVIDIA SDK 4.1 using nvcc 4.2 (nvcc -arch sm_20), exporting ptx file.
2. Use ptxOptimizer to parse output ptx file. What is the expected output? What do you see instead? Should create new ptx code.
Prints line and error:
1199 ld.volatile.v4.f32 {%f123, %f124, %f125, %f126}, [%rl19];
(1199, 12): syntax error, unexpected TOKEN_V4 What version of the product are you using? On what operating system? ocelot svn 1940
ptxgramar.ll won't accept a ".v4" token after a ".volatile" token, it requires to have a "addressSpace" between them.
Possible solution:
Add construction
ldModifier : TOKEN_VOLATILE instructionVectorType
{
state.volatileFlag( true );
};
**Attachment:** [make.particles.sdk41.log.bz2](http://code.google.com/p/gpuocelot/issues/detail?id=69)
_Original issue: http://code.google.com/p/gpuocelot/issues/detail?id=69_
|
non_test
|
ocelot won t interpretate ld volatile instruction from on june what steps will reproduce the problem compile particles app in nvidia sdk using nvcc nvcc arch sm exporting ptx file use ptxoptimizer to parse output ptx file what is the expected output what do you see instead should create new ptx code prints line and error ld volatile syntax error unexpected token what version of the product are you using on what operating system ocelot svn ptxgramar ll won t accept a token after a volatile token it requires to have a addressspace between them possible solution add construction ldmodifier token volatile instructionvectortype state volatileflag true attachment original issue
| 0
|
137,040
| 12,743,973,474
|
IssuesEvent
|
2020-06-26 11:32:56
|
CatalogueLegacies/antconc.github.io
|
https://api.github.com/repos/CatalogueLegacies/antconc.github.io
|
opened
|
Zoom rules for online teaching
|
documentation
|
Draft:
- we will not aim to get through all the material, but a sensible amount in the time we have
- RA manage the chat/hands.
- JB concentrate on teaching.
- JB mute all by default
- in 'participants' start by all clicking "yes" for happy, then hit "no" if there is a problem. RA inform JB and we pause - unmute to resolve - then only continue when all are back to "yes"
- for exercises, you get randomly assigned breakout rooms and a time frame. If you get stuck, jump back out to speak to JB.
- feedback form at the end
|
1.0
|
Zoom rules for online teaching - Draft:
- we will not aim to get through all the material, but a sensible amount in the time we have
- RA manage the chat/hands.
- JB concentrate on teaching.
- JB mute all by default
- in 'participants' start by all clicking "yes" for happy, then hit "no" if there is a problem. RA inform JB and we pause - unmute to resolve - then only continue when all are back to "yes"
- for exercises, you get randomly assigned breakout rooms and a time frame. If you get stuck, jump back out to speak to JB.
- feedback form at the end
|
non_test
|
zoom rules for online teaching draft we will not aim to get through all the material but a sensible amount in the time we have ra manage the chat hands jb concentrate on teaching jb mute all by default in participants start by all clicking yes for happy then hit no if there is a problem ra inform jb and we pause unmute to resolve then only continue when all are back to yes for exercises you get randomly assigned breakout rooms and a time frame if you get stuck jump back out to speak to jb feedback form at the end
| 0
|
42,084
| 5,425,135,971
|
IssuesEvent
|
2017-03-03 04:28:03
|
Princeton-CDH/winthrop-django
|
https://api.github.com/repos/Princeton-CDH/winthrop-django
|
closed
|
person list fields
|
awaiting testing
|
As a data editor, when I’m viewing the list of people I want to see authorized name, sort name, birth and death dates, viaf id, and family group so that I can quickly get a sense of the records I’m looking at.
|
1.0
|
person list fields - As a data editor, when I’m viewing the list of people I want to see authorized name, sort name, birth and death dates, viaf id, and family group so that I can quickly get a sense of the records I’m looking at.
|
test
|
person list fields as a data editor when i’m viewing the list of people i want to see authorized name sort name birth and death dates viaf id and family group so that i can quickly get a sense of the records i’m looking at
| 1
|
298,096
| 22,440,451,382
|
IssuesEvent
|
2022-06-21 00:30:21
|
ProjectEvergreen/eleventy-plugin-wcc
|
https://api.github.com/repos/ProjectEvergreen/eleventy-plugin-wcc
|
opened
|
declarative shadow dom polyfill
|
documentation enhancement
|
Would be good to have support for Declarative Shadow DOM through options. Should be able to pull it off NPM and conditionally load it for the user, or provide instructions for how to bootstrap an app that needs it.
|
1.0
|
declarative shadow dom polyfill - Would be good to have support for Declarative Shadow DOM through options. Should be able to pull it off NPM and conditionally load it for the user, or provide instructions for how to bootstrap an app that needs it.
|
non_test
|
declarative shadow dom polyfill would be good to have support for declarative shadow dom through options should be able to pull it off npm and conditionally load it for the user or provide instructions for how to bootstrap an app that needs it
| 0
|
50,498
| 21,132,368,547
|
IssuesEvent
|
2022-04-06 00:40:24
|
shchuko/demo-service-181
|
https://api.github.com/repos/shchuko/demo-service-181
|
closed
|
Figure out how delivery works
|
bug delivery-service order-service
|
An issue has been filed on kvipi: [link](http://77.234.215.138:28082/projects/f7c99203-ce2d-4d47-84d5-3a6500dc16ec?questionId=c7f72010-ec20-4226-b9c3-372040ff0301)
Once the processing algorithm is clear (slot generation, timeouts, interaction with the external system) - fix it on our side
|
2.0
|
Figure out how delivery works - An issue has been filed on kvipi: [link](http://77.234.215.138:28082/projects/f7c99203-ce2d-4d47-84d5-3a6500dc16ec?questionId=c7f72010-ec20-4226-b9c3-372040ff0301)
Once the processing algorithm is clear (slot generation, timeouts, interaction with the external system) - fix it on our side
|
non_test
|
figure out how delivery works an issue has been filed on kvipi once the processing algorithm is clear slot generation timeouts interaction with the external system fix it on our side
| 0
|
64,176
| 26,637,406,189
|
IssuesEvent
|
2023-01-24 23:31:34
|
Azure/azure-sdk-for-js
|
https://api.github.com/repos/Azure/azure-sdk-for-js
|
closed
|
[Service Bus] Sender Tests PartitionedQueueWithSessions: Schedule multiple messages failing in nightly runs
|
Client Service Bus test-reliability
|
Service Bus nightly test runs are failing with:
> Error message:
>ServiceBusError: Failed to create a receiver for the requested session 'my-session'. It may be locked by another receiver.
at translateServiceBusError (D:\a\_work\1\s\sdk\servicebus\service-bus\src\serviceBusError.ts:86:63)
at MessageSession._init (D:\a\_work\1\s\sdk\servicebus\service-bus\dist-esm\src\session\messageSession.js:39:282)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (internal/process/task_queues.js:97:5)
at Function.create (D:\a\_work\1\s\sdk\servicebus\service-bus\src\session\messageSession.ts:450:27)
at ServiceBusClient.acceptSession (D:\a\_work\1\s\sdk\servicebus\service-bus\src\serviceBusClient.ts:87:47)
at ServiceBusTestHelpers.acceptSessionWithPeekLock (D:\a\_work\1\s\sdk\servicebus\service-bus\test\public\utils\testutils2.ts:405:11)
at beforeEachTest (D:\a\_work\1\s\sdk\servicebus\service-bus\test\public\sendAndSchedule.spec.ts:46:16)
at Context.<anonymous> (D:\a\_work\1\
For more details check here:
- https://dev.azure.com/azure-sdk/internal/_build/results?buildId=1731557&view=results
|
1.0
|
[Service Bus] Sender Tests PartitionedQueueWithSessions: Schedule multiple messages failing in nightly runs - Service Bus nightly test runs are failing with:
> Error message:
>ServiceBusError: Failed to create a receiver for the requested session 'my-session'. It may be locked by another receiver.
at translateServiceBusError (D:\a\_work\1\s\sdk\servicebus\service-bus\src\serviceBusError.ts:86:63)
at MessageSession._init (D:\a\_work\1\s\sdk\servicebus\service-bus\dist-esm\src\session\messageSession.js:39:282)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (internal/process/task_queues.js:97:5)
at Function.create (D:\a\_work\1\s\sdk\servicebus\service-bus\src\session\messageSession.ts:450:27)
at ServiceBusClient.acceptSession (D:\a\_work\1\s\sdk\servicebus\service-bus\src\serviceBusClient.ts:87:47)
at ServiceBusTestHelpers.acceptSessionWithPeekLock (D:\a\_work\1\s\sdk\servicebus\service-bus\test\public\utils\testutils2.ts:405:11)
at beforeEachTest (D:\a\_work\1\s\sdk\servicebus\service-bus\test\public\sendAndSchedule.spec.ts:46:16)
at Context.<anonymous> (D:\a\_work\1\
For more details check here:
- https://dev.azure.com/azure-sdk/internal/_build/results?buildId=1731557&view=results
|
non_test
|
sender tests partitionedqueuewithsessions schedule multiple messages failing in nightly runs service bus nightly test runs are failing with error message servicebuserror failed to create a receiver for the requested session my session it may be locked by another receiver at translateservicebuserror d a work s sdk servicebus service bus src servicebuserror ts at messagesession init d a work s sdk servicebus service bus dist esm src session messagesession js at runmicrotasks at processticksandrejections internal process task queues js at function create d a work s sdk servicebus service bus src session messagesession ts at servicebusclient acceptsession d a work s sdk servicebus service bus src servicebusclient ts at servicebustesthelpers acceptsessionwithpeeklock d a work s sdk servicebus service bus test public utils ts at beforeeachtest d a work s sdk servicebus service bus test public sendandschedule spec ts at context d a work for more details check here
| 0
|
135,746
| 11,016,450,000
|
IssuesEvent
|
2019-12-05 05:26:57
|
openequella/openEQUELLA
|
https://api.github.com/repos/openequella/openEQUELLA
|
closed
|
Taxonomy PUT requests for data is broken
|
Ready for Testing
|
**Describe the bug**
For the PUT request `/{uuid}/term/{termUuid}/data/{datakey}/{datavalue}` in `TaxonomyResource`, the `datakey` parameter is listed twice instead of once and then the `datavalue`parameter.
This messes up the Swagger UI apidocs.do page as well.
**Platform:**
- OpenEquella Version: 2019.1
|
1.0
|
Taxonomy PUT requests for data is broken - **Describe the bug**
For the PUT request `/{uuid}/term/{termUuid}/data/{datakey}/{datavalue}` in `TaxonomyResource`, the `datakey` parameter is listed twice instead of once and then the `datavalue`parameter.
This messes up the Swagger UI apidocs.do page as well.
**Platform:**
- OpenEquella Version: 2019.1
|
test
|
taxonomy put requests for data is broken describe the bug for the put request uuid term termuuid data datakey datavalue in taxonomyresource the datakey parameter is listed twice instead of once and then the datavalue parameter this messes up the swagger ui apidocs do page as well platform openequella version
| 1
|
70,206
| 7,179,394,202
|
IssuesEvent
|
2018-01-31 19:32:34
|
CuBoulder/express
|
https://api.github.com/repos/CuBoulder/express
|
closed
|
Cleanup Test Dependencies
|
3.0:Alex:Testing
|
The number of test dependencies should be kept to a minimum to reduce complexity.
To resolve this issue, the behat.yml file should be completely rewritten. There are a bunch of parts of that file I've left in just because.
The real question is if the Chromedriver should still be used and how JS tests can run and be observable.
The recommended solution seems to be using Sauce Labs locally and on a CI server for the JS tests.
|
1.0
|
Cleanup Test Dependencies - The number of test dependencies should be kept to a minimum to reduce complexity.
To resolve this issue, the behat.yml file should be completely rewritten. There are a bunch of parts of that file I've left in just because.
The real question is if the Chromedriver should still be used and how JS tests can run and be observable.
The recommended solution seems to be using Sauce Labs locally and on a CI server for the JS tests.
|
test
|
cleanup test dependencies the number of test dependencies should be kept to a minimum to reduce complexity to resolve this issue the behat yml file should be completely rewritten there are a bunch of parts of that file i ve left in just because the real question is if the chromedriver should still be used and how js tests can run and be observable the recommended solution seems to be using sauce labs locally and on a ci server for the js tests
| 1
|
94,792
| 11,911,390,428
|
IssuesEvent
|
2020-03-31 08:32:39
|
Disfactory/Disfactory
|
https://api.github.com/repos/Disfactory/Disfactory
|
opened
|
Dashboard on Landing Page
|
design medium priority
|
Dashboard will be showed on landing page
(split out from the back-end Admin Page issue: #268)
Purpose:
- For staff: transparency and impact metrics to measure success
- For active reporters: rewarded and know the progress
- For general public: know the overview advocacy impact and join the reporting movement
Design principle:
- simple and strong messages
- realtime update (max interval: 30 min)
Data:
- Reporting actions: total number of factories reported by the public, number of updated data records, distribution by county/city (can be visualized on a Taiwan map), top 5 ranking of popular search and report areas (`town-district`) (listing reported factory counts and data record counts), past 30 days unique users (this one is a bit revealing XD, publish it only if the number looks good)
- Reporting progress: percentage of all reported locations intended for formal reporting (split into medium/high pollution and newly built factories), local authority processing progress `cet_report_status` (counts shown by stage and time), government responses `gov_response` (shown by category)
## **Out of Scope**
- Interactive design or motion graphic, e.g. search, filter
- more than 3 sections to display the dashboard
|
1.0
|
Dashboard on Landing Page - Dashboard will be showed on landing page
(split out from the back-end Admin Page issue: #268)
Purpose:
- For staff: transparency and impact metrics to measure success
- For active reporters: rewarded and know the progress
- For general public: know the overview advocacy impact and join the reporting movement
Design principle:
- simple and strong messages
- realtime update (max interval: 30 min)
Data:
- Reporting actions: total number of factories reported by the public, number of updated data records, distribution by county/city (can be visualized on a Taiwan map), top 5 ranking of popular search and report areas (`town-district`) (listing reported factory counts and data record counts), past 30 days unique users (this one is a bit revealing XD, publish it only if the number looks good)
- Reporting progress: percentage of all reported locations intended for formal reporting (split into medium/high pollution and newly built factories), local authority processing progress `cet_report_status` (counts shown by stage and time), government responses `gov_response` (shown by category)
## **Out of Scope**
- Interactive design or motion graphic, e.g. search, filter
- more than 3 sections to display the dashboard
|
non_test
|
dashboard on landing page dashboard will be showed on landing page split out from the back end admin page issue purpose for staff transparency and impact metrics to measure success for active reporters rewarded and know the progress for general public know the overview advocacy impact and join the reporting movement design principle simple and strong messages realtime update max interval min data reporting actions total number of factories reported by the public number of updated data records distribution by county city can be visualized on a taiwan map top ranking of popular search and report areas town district listing reported factory counts and data record counts past days unique users this one is a bit revealing xd publish it only if the number looks good reporting progress percentage of all reported locations intended for formal reporting split into medium high pollution and newly built factories local authority processing progress cet report status counts shown by stage and time government responses gov response shown by category out of scope interactive design or motion graphic e g search filter more than sections to display the dashboard
| 0
|
236,907
| 19,584,635,751
|
IssuesEvent
|
2022-01-05 04:13:33
|
pombase/pombase-chado
|
https://api.github.com/repos/pombase/pombase-chado
|
closed
|
Fix warning in curation-tool-data-load-output log file
|
bug logs needs testing
|
more the one relation cvterm returned for part_of:
terms: BFO:0000050 and BFO:0000050
Also occurs when loading PHAF files: https://curation.pombase.org/dumps/builds/pombase-build-2022-01-02/logs/log.2022-01-02-21-30-14.phenotypes_from_PMID_28410370_phaf
|
1.0
|
Fix warning in curation-tool-data-load-output log file - more the one relation cvterm returned for part_of:
terms: BFO:0000050 and BFO:0000050
Also occurs when loading PHAF files: https://curation.pombase.org/dumps/builds/pombase-build-2022-01-02/logs/log.2022-01-02-21-30-14.phenotypes_from_PMID_28410370_phaf
|
test
|
fix warning in curation tool data load output log file more the one relation cvterm returned for part of terms bfo and bfo also occurs when loading phaf files
| 1
|
24,493
| 4,086,936,472
|
IssuesEvent
|
2016-06-01 08:08:45
|
menatwork/syncCto
|
https://api.github.com/repos/menatwork/syncCto
|
closed
|
optionally allow sync of tl_files
|
Accepted Feature Testing
|
`tl_files` is currently excluded by default when using the database sync. `tl_files` will only be transferred, if you use the "overwrite system" option.
It would be useful to optionally also allow the `tl_files` table to be transferred/compared when using the regular sync. e.g. as an additional option below "Database-synchronisation".
|
1.0
|
optionally allow sync of tl_files - `tl_files` is currently excluded by default when using the database sync. `tl_files` will only be transferred, if you use the "overwrite system" option.
It would be useful to optionally also allow the `tl_files` table to be transferred/compared when using the regular sync. e.g. as an additional option below "Database-synchronisation".
|
test
|
optionally allow sync of tl files tl files is currently excluded by default when using the database sync tl files will only be transferred if you use the overwrite system option it would be useful to optionally also allow the tl files table to be transferred compared when using the regular sync e g as an additional option below database synchronisation
| 1
|
329,685
| 28,301,563,718
|
IssuesEvent
|
2023-04-10 06:42:54
|
wazuh/wazuh
|
https://api.github.com/repos/wazuh/wazuh
|
closed
|
Release 4.4.1 - Release Candidate 1 - Ruleset Test
|
type/test level/task release test/4.4.1
|
### Packages tests information
|||
| :-- | :-- |
| **Main release candidate issue** | #16620 |
| **Version** | 4.4.1 |
| **Release candidate** | RC1 |
| **Tag** | https://github.com/wazuh/wazuh/tree/v4.4.1-rc1 |
| **Previous ruleset test** | - |
#### SCA
|OS | installed | Executed |
| -------- | -------- | -------- |
| Red Hat Enterprise Linux 9 | ✅ | ✅ |
| Debian Liunux 11 | ✅ | ✅ |
#### Decoders and Rules
|Component | Tested | Total | Coverage |
| -------- | -------- | -------- | -------- |
| Rules | 1343 | 4248 | 31.61% |
| Decoders | 120 | 165 | 72.73% |
| File | Passed | Failed | Status |
| -------- | -------- | -------- | -------- |
|./tests/proftpd.ini | 7 | 0 | ✅ |
|./tests/exim.ini | 7 | 0 | ✅ |
|./tests/squid_rules.ini | 2 | 0 | ✅ |
|./tests/sysmon_eid_3.ini | 10 | 0 | ✅ |
|./tests/checkpoint_smart1.ini| 18 | 0 | ✅ |
|./tests/iptables.ini | 9 | 0 | ✅ |
|./tests/sysmon_eid_10.ini| 4 | 0 | ✅ |
|./tests/SonicWall.ini | 11 | 0 | ✅ |
|./tests/panda_paps.ini | 8 | 0 | ✅ |
|./tests/fortiauth.ini | 4 | 0 | ✅ |
|./tests/openldap.ini | 9 | 0 | ✅ |
|./tests/f5_big_ip.ini | 48 | 0 | ✅ |
|./tests/sophos.ini | 8 | 0 | ✅ |
|./tests/opensmtpd.ini | 7 | 0 | ✅ |
|./tests/netscreen.ini | 4 | 0 | ✅ |
|./tests/rsh.ini | 2 | 0 | ✅ |
|./tests/arbor.ini | 2 | 0 | ✅ |
|./tests/web_rules.ini | 10 | 0 | ✅ |
|./tests/exchange.ini | 2 | 0 | ✅ |
|./tests/vuln_detector.ini| 2 | 0 | ✅ |
|./tests/sysmon_eid_7.ini | 6 | 0 | ✅ |
|./tests/samba.ini | 4 | 0 | ✅ |
|./tests/apparmor.ini | 5 | 0 | ✅ |
|./tests/test_osmatch_regex.ini| 6 | 0 | ✅ |
|./tests/test_features.ini| 7 | 0 | ✅ |
|./tests/pam.ini | 5 | 0 | ✅ |
|./tests/apache.ini | 12 | 0 | ✅ |
|./tests/fireeye.ini | 3 | 0 | ✅ |
|./tests/sysmon.ini | 25 | 0 | ✅ |
|./tests/dovecot.ini | 15 | 0 | ✅ |
|./tests/web_appsec.ini | 31 | 0 | ✅ |
|./tests/nextcloud.ini | 8 | 0 | ✅ |
|./tests/kernel_usb.ini | 6 | 0 | ✅ |
|./tests/paloalto.ini | 16 | 0 | ✅ |
|./tests/sysmon_eid_13.ini| 7 | 0 | ✅ |
|./tests/ossec.ini | 5 | 0 | ✅ |
|./tests/test_osregex_regex.ini| 28 | 0 | ✅ |
|./tests/cisco_ftd.ini | 42 | 0 | ✅ |
|./tests/sysmon_eid_1.ini | 59 | 0 | ✅ |
|./tests/github.ini | 324 | 0 | ✅ |
|./tests/gitlab.ini | 27 | 0 | ✅ |
|./tests/named.ini | 5 | 0 | ✅ |
|./tests/powershell.ini | 32 | 0 | ✅ |
|./tests/openvpn_ldap.ini | 2 | 0 | ✅ |
|./tests/sysmon_eid_8.ini | 4 | 0 | ✅ |
|./tests/cloudflare-waf.ini| 13 | 0 | ✅ |
|./tests/huawei_usg.ini | 3 | 0 | ✅ |
|./tests/eset.ini | 8 | 0 | ✅ |
|./tests/test_expr_negation.ini| 56 | 0 | ✅ |
|./tests/gcp.ini | 31 | 0 | ✅ |
|./tests/cpanel.ini | 7 | 0 | ✅ |
|./tests/pfsense.ini | 2 | 0 | ✅ |
|./tests/cisco_asa.ini | 88 | 0 | ✅ |
|./tests/systemd.ini | 2 | 0 | ✅ |
|./tests/nginx.ini | 12 | 0 | ✅ |
|./tests/sysmon_eid_11.ini| 28 | 0 | ✅ |
|./tests/cisco_ios.ini | 17 | 0 | ✅ |
|./tests/pix.ini | 22 | 0 | ✅ |
|./tests/php.ini | 2 | 0 | ✅ |
|./tests/office365.ini | 128 | 0 | ✅ |
|./tests/fortigate.ini | 45 | 0 | ✅ |
|./tests/fortimail.ini | 6 | 0 | ✅ |
|./tests/audit_scp.ini | 8 | 0 | ✅ |
|./tests/win_application.ini| 0 | 0 | ✅ |
|./tests/fortiddos.ini | 1 | 0 | ✅ |
|./tests/cimserver.ini | 2 | 0 | ✅ |
|./tests/freepbx.ini | 6 | 0 | ✅ |
|./tests/overwrite.ini | 10 | 0 | ✅ |
|./tests/sshd.ini | 48 | 0 | ✅ |
|./tests/win_event_channel.ini| 8 | 0 | ✅ |
|./tests/test_pcre2_regex.ini| 33 | 0 | ✅ |
|./tests/glpi.ini | 3 | 0 | ✅ |
|./tests/api.ini | 21 | 0 | ✅ |
|./tests/dropbear.ini | 3 | 0 | ✅ |
|./tests/firewalld.ini | 2 | 0 | ✅ |
|./tests/mailscanner.ini | 1 | 0 | ✅ |
|./tests/owlh.ini | 4 | 0 | ✅ |
|./tests/sysmon_eid_20.ini| 2 | 0 | ✅ |
|./tests/auditd.ini | 31 | 0 | ✅ |
|./tests/mcafee_epo.ini | 1 | 0 | ✅ |
|./tests/doas.ini | 4 | 0 | ✅ |
|./tests/junos.ini | 3 | 0 | ✅ |
|./tests/sudo.ini | 8 | 0 | ✅ |
|./tests/syslog.ini | 6 | 0 | ✅ |
|./tests/sophos_fw.ini | 10 | 0 | ✅ |
|./tests/vsftpd.ini | 4 | 0 | ✅ |
|./tests/postfix.ini | 2 | 0 | ✅ |
|./tests/modsecurity.ini | 6 | 0 | ✅ |
|./tests/su.ini | 5 | 0 | ✅ |
|./tests/unbound.ini | 0 | 0 | ✅ |
|./tests/aws_s3_access.ini| 10 | 0 | ✅ |
|./tests/test_static_filters.ini| 28 | 0 | ✅ |
|./tests/oscap.ini | 32 | 0 | ✅ |
```
# python runtests.py
- [ File = ./tests/SonicWall.ini ] ---------
...........
- [ File = ./tests/apache.ini ] ---------
............
- [ File = ./tests/api.ini ] ---------
.....................
- [ File = ./tests/apparmor.ini ] ---------
.....
- [ File = ./tests/arbor.ini ] ---------
..
- [ File = ./tests/audit_scp.ini ] ---------
........
- [ File = ./tests/auditd.ini ] ---------
...............................
- [ File = ./tests/aws_s3_access.ini ] ---------
..........
- [ File = ./tests/checkpoint_smart1.ini ] ---------
..................
- [ File = ./tests/cimserver.ini ] ---------
..
- [ File = ./tests/cisco_asa.ini ] ---------
........................................................................................
- [ File = ./tests/cisco_ftd.ini ] ---------
..........................................
- [ File = ./tests/cisco_ios.ini ] ---------
.................
- [ File = ./tests/cloudflare-waf.ini ] ---------
.............
- [ File = ./tests/cpanel.ini ] ---------
.......
- [ File = ./tests/doas.ini ] ---------
....
- [ File = ./tests/dovecot.ini ] ---------
...............
- [ File = ./tests/dropbear.ini ] ---------
...
- [ File = ./tests/eset.ini ] ---------
........
- [ File = ./tests/exchange.ini ] ---------
..
- [ File = ./tests/exim.ini ] ---------
.......
- [ File = ./tests/f5_big_ip.ini ] ---------
................................................
- [ File = ./tests/fireeye.ini ] ---------
...
- [ File = ./tests/firewalld.ini ] ---------
..
- [ File = ./tests/fortiauth.ini ] ---------
....
- [ File = ./tests/fortiddos.ini ] ---------
.
- [ File = ./tests/fortigate.ini ] ---------
.............................................
- [ File = ./tests/fortimail.ini ] ---------
......
- [ File = ./tests/freepbx.ini ] ---------
......
- [ File = ./tests/gcp.ini ] ---------
...............................
- [ File = ./tests/github.ini ] ---------
....................................................................................................................................................................................................................................................................................................................................
- [ File = ./tests/gitlab.ini ] ---------
...........................
- [ File = ./tests/glpi.ini ] ---------
...
- [ File = ./tests/huawei_usg.ini ] ---------
...
- [ File = ./tests/iptables.ini ] ---------
.........
- [ File = ./tests/junos.ini ] ---------
...
- [ File = ./tests/kernel_usb.ini ] ---------
......
- [ File = ./tests/mailscanner.ini ] ---------
.
- [ File = ./tests/mcafee_epo.ini ] ---------
.
- [ File = ./tests/modsecurity.ini ] ---------
......
- [ File = ./tests/named.ini ] ---------
.....
- [ File = ./tests/netscreen.ini ] ---------
....
- [ File = ./tests/nextcloud.ini ] ---------
........
- [ File = ./tests/nginx.ini ] ---------
............
- [ File = ./tests/office365.ini ] ---------
................................................................................................................................
- [ File = ./tests/openldap.ini ] ---------
.........
- [ File = ./tests/opensmtpd.ini ] ---------
.......
- [ File = ./tests/openvpn_ldap.ini ] ---------
..
- [ File = ./tests/oscap.ini ] ---------
................................
- [ File = ./tests/ossec.ini ] ---------
.....
- [ File = ./tests/overwrite.ini ] ---------
..........
- [ File = ./tests/owlh.ini ] ---------
....
- [ File = ./tests/paloalto.ini ] ---------
................
- [ File = ./tests/pam.ini ] ---------
.....
- [ File = ./tests/panda_paps.ini ] ---------
........
- [ File = ./tests/pfsense.ini ] ---------
..
- [ File = ./tests/php.ini ] ---------
..
- [ File = ./tests/pix.ini ] ---------
......................
- [ File = ./tests/postfix.ini ] ---------
..
- [ File = ./tests/powershell.ini ] ---------
................................
- [ File = ./tests/proftpd.ini ] ---------
.......
- [ File = ./tests/rsh.ini ] ---------
..
- [ File = ./tests/samba.ini ] ---------
....
- [ File = ./tests/sophos.ini ] ---------
........
- [ File = ./tests/sophos_fw.ini ] ---------
..........
- [ File = ./tests/squid_rules.ini ] ---------
..
- [ File = ./tests/sshd.ini ] ---------
................................................
- [ File = ./tests/su.ini ] ---------
.....
- [ File = ./tests/sudo.ini ] ---------
........
- [ File = ./tests/syslog.ini ] ---------
......
- [ File = ./tests/sysmon.ini ] ---------
.........................
- [ File = ./tests/sysmon_eid_1.ini ] ---------
...........................................................
- [ File = ./tests/sysmon_eid_10.ini ] ---------
....
- [ File = ./tests/sysmon_eid_11.ini ] ---------
............................
- [ File = ./tests/sysmon_eid_13.ini ] ---------
.......
- [ File = ./tests/sysmon_eid_20.ini ] ---------
..
- [ File = ./tests/sysmon_eid_3.ini ] ---------
..........
- [ File = ./tests/sysmon_eid_7.ini ] ---------
......
- [ File = ./tests/sysmon_eid_8.ini ] ---------
....
- [ File = ./tests/systemd.ini ] ---------
..
- [ File = ./tests/test_expr_negation.ini ] ---------
........................................................
- [ File = ./tests/test_features.ini ] ---------
.......
- [ File = ./tests/test_osmatch_regex.ini ] ---------
......
- [ File = ./tests/test_osregex_regex.ini ] ---------
............................
- [ File = ./tests/test_pcre2_regex.ini ] ---------
.................................
- [ File = ./tests/test_static_filters.ini ] ---------
............................
- [ File = ./tests/unbound.ini ] ---------
- [ File = ./tests/vsftpd.ini ] ---------
....
- [ File = ./tests/vuln_detector.ini ] ---------
..
- [ File = ./tests/web_appsec.ini ] ---------
...............................
- [ File = ./tests/web_rules.ini ] ---------
..........
- [ File = ./tests/win_application.ini ] ---------
- [ File = ./tests/win_event_channel.ini ] ---------
........
```
|
2.0
|
Release 4.4.1 - Release Candidate 1 - Ruleset Test - ### Packages tests information
|||
| :-- | :-- |
| **Main release candidate issue** | #16620 |
| **Version** | 4.4.1 |
| **Release candidate** | RC1 |
| **Tag** | https://github.com/wazuh/wazuh/tree/v4.4.1-rc1 |
| **Previous ruleset test** | - |
#### SCA
|OS | installed | Executed |
| -------- | -------- | -------- |
| Red Hat Enterprise Linux 9 | ✅ | ✅ |
| Debian Linux 11 | ✅ | ✅ |
#### Decoders and Rules
|Component | Tested | Total | Coverage |
| -------- | -------- | -------- | -------- |
| Rules | 1343 | 4248 | 31.61% |
| Decoders | 120 | 165 | 72.73% |
| File | Passed | Failed | Status |
| -------- | -------- | -------- | -------- |
|./tests/proftpd.ini | 7 | 0 | ✅ |
|./tests/exim.ini | 7 | 0 | ✅ |
|./tests/squid_rules.ini | 2 | 0 | ✅ |
|./tests/sysmon_eid_3.ini | 10 | 0 | ✅ |
|./tests/checkpoint_smart1.ini| 18 | 0 | ✅ |
|./tests/iptables.ini | 9 | 0 | ✅ |
|./tests/sysmon_eid_10.ini| 4 | 0 | ✅ |
|./tests/SonicWall.ini | 11 | 0 | ✅ |
|./tests/panda_paps.ini | 8 | 0 | ✅ |
|./tests/fortiauth.ini | 4 | 0 | ✅ |
|./tests/openldap.ini | 9 | 0 | ✅ |
|./tests/f5_big_ip.ini | 48 | 0 | ✅ |
|./tests/sophos.ini | 8 | 0 | ✅ |
|./tests/opensmtpd.ini | 7 | 0 | ✅ |
|./tests/netscreen.ini | 4 | 0 | ✅ |
|./tests/rsh.ini | 2 | 0 | ✅ |
|./tests/arbor.ini | 2 | 0 | ✅ |
|./tests/web_rules.ini | 10 | 0 | ✅ |
|./tests/exchange.ini | 2 | 0 | ✅ |
|./tests/vuln_detector.ini| 2 | 0 | ✅ |
|./tests/sysmon_eid_7.ini | 6 | 0 | ✅ |
|./tests/samba.ini | 4 | 0 | ✅ |
|./tests/apparmor.ini | 5 | 0 | ✅ |
|./tests/test_osmatch_regex.ini| 6 | 0 | ✅ |
|./tests/test_features.ini| 7 | 0 | ✅ |
|./tests/pam.ini | 5 | 0 | ✅ |
|./tests/apache.ini | 12 | 0 | ✅ |
|./tests/fireeye.ini | 3 | 0 | ✅ |
|./tests/sysmon.ini | 25 | 0 | ✅ |
|./tests/dovecot.ini | 15 | 0 | ✅ |
|./tests/web_appsec.ini | 31 | 0 | ✅ |
|./tests/nextcloud.ini | 8 | 0 | ✅ |
|./tests/kernel_usb.ini | 6 | 0 | ✅ |
|./tests/paloalto.ini | 16 | 0 | ✅ |
|./tests/sysmon_eid_13.ini| 7 | 0 | ✅ |
|./tests/ossec.ini | 5 | 0 | ✅ |
|./tests/test_osregex_regex.ini| 28 | 0 | ✅ |
|./tests/cisco_ftd.ini | 42 | 0 | ✅ |
|./tests/sysmon_eid_1.ini | 59 | 0 | ✅ |
|./tests/github.ini | 324 | 0 | ✅ |
|./tests/gitlab.ini | 27 | 0 | ✅ |
|./tests/named.ini | 5 | 0 | ✅ |
|./tests/powershell.ini | 32 | 0 | ✅ |
|./tests/openvpn_ldap.ini | 2 | 0 | ✅ |
|./tests/sysmon_eid_8.ini | 4 | 0 | ✅ |
|./tests/cloudflare-waf.ini| 13 | 0 | ✅ |
|./tests/huawei_usg.ini | 3 | 0 | ✅ |
|./tests/eset.ini | 8 | 0 | ✅ |
|./tests/test_expr_negation.ini| 56 | 0 | ✅ |
|./tests/gcp.ini | 31 | 0 | ✅ |
|./tests/cpanel.ini | 7 | 0 | ✅ |
|./tests/pfsense.ini | 2 | 0 | ✅ |
|./tests/cisco_asa.ini | 88 | 0 | ✅ |
|./tests/systemd.ini | 2 | 0 | ✅ |
|./tests/nginx.ini | 12 | 0 | ✅ |
|./tests/sysmon_eid_11.ini| 28 | 0 | ✅ |
|./tests/cisco_ios.ini | 17 | 0 | ✅ |
|./tests/pix.ini | 22 | 0 | ✅ |
|./tests/php.ini | 2 | 0 | ✅ |
|./tests/office365.ini | 128 | 0 | ✅ |
|./tests/fortigate.ini | 45 | 0 | ✅ |
|./tests/fortimail.ini | 6 | 0 | ✅ |
|./tests/audit_scp.ini | 8 | 0 | ✅ |
|./tests/win_application.ini| 0 | 0 | ✅ |
|./tests/fortiddos.ini | 1 | 0 | ✅ |
|./tests/cimserver.ini | 2 | 0 | ✅ |
|./tests/freepbx.ini | 6 | 0 | ✅ |
|./tests/overwrite.ini | 10 | 0 | ✅ |
|./tests/sshd.ini | 48 | 0 | ✅ |
|./tests/win_event_channel.ini| 8 | 0 | ✅ |
|./tests/test_pcre2_regex.ini| 33 | 0 | ✅ |
|./tests/glpi.ini | 3 | 0 | ✅ |
|./tests/api.ini | 21 | 0 | ✅ |
|./tests/dropbear.ini | 3 | 0 | ✅ |
|./tests/firewalld.ini | 2 | 0 | ✅ |
|./tests/mailscanner.ini | 1 | 0 | ✅ |
|./tests/owlh.ini | 4 | 0 | ✅ |
|./tests/sysmon_eid_20.ini| 2 | 0 | ✅ |
|./tests/auditd.ini | 31 | 0 | ✅ |
|./tests/mcafee_epo.ini | 1 | 0 | ✅ |
|./tests/doas.ini | 4 | 0 | ✅ |
|./tests/junos.ini | 3 | 0 | ✅ |
|./tests/sudo.ini | 8 | 0 | ✅ |
|./tests/syslog.ini | 6 | 0 | ✅ |
|./tests/sophos_fw.ini | 10 | 0 | ✅ |
|./tests/vsftpd.ini | 4 | 0 | ✅ |
|./tests/postfix.ini | 2 | 0 | ✅ |
|./tests/modsecurity.ini | 6 | 0 | ✅ |
|./tests/su.ini | 5 | 0 | ✅ |
|./tests/unbound.ini | 0 | 0 | ✅ |
|./tests/aws_s3_access.ini| 10 | 0 | ✅ |
|./tests/test_static_filters.ini| 28 | 0 | ✅ |
|./tests/oscap.ini | 32 | 0 | ✅ |
```
# python runtests.py
- [ File = ./tests/SonicWall.ini ] ---------
...........
- [ File = ./tests/apache.ini ] ---------
............
- [ File = ./tests/api.ini ] ---------
.....................
- [ File = ./tests/apparmor.ini ] ---------
.....
- [ File = ./tests/arbor.ini ] ---------
..
- [ File = ./tests/audit_scp.ini ] ---------
........
- [ File = ./tests/auditd.ini ] ---------
...............................
- [ File = ./tests/aws_s3_access.ini ] ---------
..........
- [ File = ./tests/checkpoint_smart1.ini ] ---------
..................
- [ File = ./tests/cimserver.ini ] ---------
..
- [ File = ./tests/cisco_asa.ini ] ---------
........................................................................................
- [ File = ./tests/cisco_ftd.ini ] ---------
..........................................
- [ File = ./tests/cisco_ios.ini ] ---------
.................
- [ File = ./tests/cloudflare-waf.ini ] ---------
.............
- [ File = ./tests/cpanel.ini ] ---------
.......
- [ File = ./tests/doas.ini ] ---------
....
- [ File = ./tests/dovecot.ini ] ---------
...............
- [ File = ./tests/dropbear.ini ] ---------
...
- [ File = ./tests/eset.ini ] ---------
........
- [ File = ./tests/exchange.ini ] ---------
..
- [ File = ./tests/exim.ini ] ---------
.......
- [ File = ./tests/f5_big_ip.ini ] ---------
................................................
- [ File = ./tests/fireeye.ini ] ---------
...
- [ File = ./tests/firewalld.ini ] ---------
..
- [ File = ./tests/fortiauth.ini ] ---------
....
- [ File = ./tests/fortiddos.ini ] ---------
.
- [ File = ./tests/fortigate.ini ] ---------
.............................................
- [ File = ./tests/fortimail.ini ] ---------
......
- [ File = ./tests/freepbx.ini ] ---------
......
- [ File = ./tests/gcp.ini ] ---------
...............................
- [ File = ./tests/github.ini ] ---------
....................................................................................................................................................................................................................................................................................................................................
- [ File = ./tests/gitlab.ini ] ---------
...........................
- [ File = ./tests/glpi.ini ] ---------
...
- [ File = ./tests/huawei_usg.ini ] ---------
...
- [ File = ./tests/iptables.ini ] ---------
.........
- [ File = ./tests/junos.ini ] ---------
...
- [ File = ./tests/kernel_usb.ini ] ---------
......
- [ File = ./tests/mailscanner.ini ] ---------
.
- [ File = ./tests/mcafee_epo.ini ] ---------
.
- [ File = ./tests/modsecurity.ini ] ---------
......
- [ File = ./tests/named.ini ] ---------
.....
- [ File = ./tests/netscreen.ini ] ---------
....
- [ File = ./tests/nextcloud.ini ] ---------
........
- [ File = ./tests/nginx.ini ] ---------
............
- [ File = ./tests/office365.ini ] ---------
................................................................................................................................
- [ File = ./tests/openldap.ini ] ---------
.........
- [ File = ./tests/opensmtpd.ini ] ---------
.......
- [ File = ./tests/openvpn_ldap.ini ] ---------
..
- [ File = ./tests/oscap.ini ] ---------
................................
- [ File = ./tests/ossec.ini ] ---------
.....
- [ File = ./tests/overwrite.ini ] ---------
..........
- [ File = ./tests/owlh.ini ] ---------
....
- [ File = ./tests/paloalto.ini ] ---------
................
- [ File = ./tests/pam.ini ] ---------
.....
- [ File = ./tests/panda_paps.ini ] ---------
........
- [ File = ./tests/pfsense.ini ] ---------
..
- [ File = ./tests/php.ini ] ---------
..
- [ File = ./tests/pix.ini ] ---------
......................
- [ File = ./tests/postfix.ini ] ---------
..
- [ File = ./tests/powershell.ini ] ---------
................................
- [ File = ./tests/proftpd.ini ] ---------
.......
- [ File = ./tests/rsh.ini ] ---------
..
- [ File = ./tests/samba.ini ] ---------
....
- [ File = ./tests/sophos.ini ] ---------
........
- [ File = ./tests/sophos_fw.ini ] ---------
..........
- [ File = ./tests/squid_rules.ini ] ---------
..
- [ File = ./tests/sshd.ini ] ---------
................................................
- [ File = ./tests/su.ini ] ---------
.....
- [ File = ./tests/sudo.ini ] ---------
........
- [ File = ./tests/syslog.ini ] ---------
......
- [ File = ./tests/sysmon.ini ] ---------
.........................
- [ File = ./tests/sysmon_eid_1.ini ] ---------
...........................................................
- [ File = ./tests/sysmon_eid_10.ini ] ---------
....
- [ File = ./tests/sysmon_eid_11.ini ] ---------
............................
- [ File = ./tests/sysmon_eid_13.ini ] ---------
.......
- [ File = ./tests/sysmon_eid_20.ini ] ---------
..
- [ File = ./tests/sysmon_eid_3.ini ] ---------
..........
- [ File = ./tests/sysmon_eid_7.ini ] ---------
......
- [ File = ./tests/sysmon_eid_8.ini ] ---------
....
- [ File = ./tests/systemd.ini ] ---------
..
- [ File = ./tests/test_expr_negation.ini ] ---------
........................................................
- [ File = ./tests/test_features.ini ] ---------
.......
- [ File = ./tests/test_osmatch_regex.ini ] ---------
......
- [ File = ./tests/test_osregex_regex.ini ] ---------
............................
- [ File = ./tests/test_pcre2_regex.ini ] ---------
.................................
- [ File = ./tests/test_static_filters.ini ] ---------
............................
- [ File = ./tests/unbound.ini ] ---------
- [ File = ./tests/vsftpd.ini ] ---------
....
- [ File = ./tests/vuln_detector.ini ] ---------
..
- [ File = ./tests/web_appsec.ini ] ---------
...............................
- [ File = ./tests/web_rules.ini ] ---------
..........
- [ File = ./tests/win_application.ini ] ---------
- [ File = ./tests/win_event_channel.ini ] ---------
........
```
|
test
|
release release candidate ruleset test packages tests information main release candidate issue version release candidate tag previous ruleset test sca os installed executed red hat enterprise linux ✅ ✅ debian liunux ✅ ✅ decoders and rules component tested total coverage rules decoders file passed failed status tests proftpd ini ✅ tests exim ini ✅ tests squid rules ini ✅ tests sysmon eid ini ✅ tests checkpoint ini ✅ tests iptables ini ✅ tests sysmon eid ini ✅ tests sonicwall ini ✅ tests panda paps ini ✅ tests fortiauth ini ✅ tests openldap ini ✅ tests big ip ini ✅ tests sophos ini ✅ tests opensmtpd ini ✅ tests netscreen ini ✅ tests rsh ini ✅ tests arbor ini ✅ tests web rules ini ✅ tests exchange ini ✅ tests vuln detector ini ✅ tests sysmon eid ini ✅ tests samba ini ✅ tests apparmor ini ✅ tests test osmatch regex ini ✅ tests test features ini ✅ tests pam ini ✅ tests apache ini ✅ tests fireeye ini ✅ tests sysmon ini ✅ tests dovecot ini ✅ tests web appsec ini ✅ tests nextcloud ini ✅ tests kernel usb ini ✅ tests paloalto ini ✅ tests sysmon eid ini ✅ tests ossec ini ✅ tests test osregex regex ini ✅ tests cisco ftd ini ✅ tests sysmon eid ini ✅ tests github ini ✅ tests gitlab ini ✅ tests named ini ✅ tests powershell ini ✅ tests openvpn ldap ini ✅ tests sysmon eid ini ✅ tests cloudflare waf ini ✅ tests huawei usg ini ✅ tests eset ini ✅ tests test expr negation ini ✅ tests gcp ini ✅ tests cpanel ini ✅ tests pfsense ini ✅ tests cisco asa ini ✅ tests systemd ini ✅ tests nginx ini ✅ tests sysmon eid ini ✅ tests cisco ios ini ✅ tests pix ini ✅ tests php ini ✅ tests ini ✅ tests fortigate ini ✅ tests fortimail ini ✅ tests audit scp ini ✅ tests win application ini ✅ tests fortiddos ini ✅ tests cimserver ini ✅ tests freepbx ini ✅ tests overwrite ini ✅ tests sshd ini ✅ tests win event channel ini ✅ tests test regex ini ✅ tests glpi ini ✅ tests api ini ✅ tests dropbear ini ✅ tests firewalld ini ✅ tests mailscanner ini ✅ tests owlh ini ✅ tests sysmon eid ini ✅ tests auditd ini ✅ tests 
mcafee epo ini ✅ tests doas ini ✅ tests junos ini ✅ tests sudo ini ✅ tests syslog ini ✅ tests sophos fw ini ✅ tests vsftpd ini ✅ tests postfix ini ✅ tests modsecurity ini ✅ tests su ini ✅ tests unbound ini ✅ tests aws access ini ✅ tests test static filters ini ✅ tests oscap ini ✅ python runtests py
| 1
|
268,685
| 28,761,207,409
|
IssuesEvent
|
2023-05-01 01:01:10
|
amaybaum-dev/vprofile-project3
|
https://api.github.com/repos/amaybaum-dev/vprofile-project3
|
opened
|
CVE-2018-15756 (High, reachable) detected in spring-web-4.3.7.RELEASE.jar
|
Mend: dependency security vulnerability
|
## CVE-2018-15756 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-web-4.3.7.RELEASE.jar</b></p></summary>
<p>Spring Web</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/4.3.7.RELEASE/spring-web-4.3.7.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-rabbit-1.7.1.RELEASE.jar (Root Library)
- :x: **spring-web-4.3.7.RELEASE.jar** (Vulnerable Library)
<p>Found in base branch: <b>vp-rem</b></p>
</p>
</details>
<p></p>
<details><summary> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> Reachability Analysis</summary>
<p>
This vulnerability is potentially used
```
com.visualpathit.account.controller.FileUploadController (Application)
-> org.springframework.web.multipart.support.StandardMultipartHttpServletRequest$StandardMultipartFile (Extension)
-> org.springframework.web.multipart.support.StandardMultipartHttpServletRequest (Extension)
-> org.springframework.http.HttpHeaders (Extension)
-> ❌ org.springframework.http.HttpRange (Vulnerable Component)
```
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Spring Framework, version 5.1, versions 5.0.x prior to 5.0.10, versions 4.3.x prior to 4.3.20, and older unsupported versions on the 4.2.x branch provide support for range requests when serving static resources through the ResourceHttpRequestHandler, or starting in 5.0 when an annotated controller returns an org.springframework.core.io.Resource. A malicious user (or attacker) can add a range header with a high number of ranges, or with wide ranges that overlap, or both, for a denial of service attack. This vulnerability affects applications that depend on either spring-webmvc or spring-webflux. Such applications must also have a registration for serving static resources (e.g. JS, CSS, images, and others), or have an annotated controller that returns an org.springframework.core.io.Resource. Spring Boot applications that depend on spring-boot-starter-web or spring-boot-starter-webflux are ready to serve static resources out of the box and are therefore vulnerable.
<p>Publish Date: 2018-10-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-15756>CVE-2018-15756</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://pivotal.io/security/cve-2018-15756">https://pivotal.io/security/cve-2018-15756</a></p>
<p>Release Date: 2018-10-18</p>
<p>Fix Resolution (org.springframework:spring-web): 4.3.20.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.amqp:spring-rabbit): 1.7.11.RELEASE</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
|
True
|
CVE-2018-15756 (High, reachable) detected in spring-web-4.3.7.RELEASE.jar - ## CVE-2018-15756 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-web-4.3.7.RELEASE.jar</b></p></summary>
<p>Spring Web</p>
<p>Library home page: <a href="https://github.com/spring-projects/spring-framework">https://github.com/spring-projects/spring-framework</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/springframework/spring-web/4.3.7.RELEASE/spring-web-4.3.7.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- spring-rabbit-1.7.1.RELEASE.jar (Root Library)
- :x: **spring-web-4.3.7.RELEASE.jar** (Vulnerable Library)
<p>Found in base branch: <b>vp-rem</b></p>
</p>
</details>
<p></p>
<details><summary> <img src='https://whitesource-resources.whitesourcesoftware.com/viaRed.png' width=19 height=20> Reachability Analysis</summary>
<p>
This vulnerability is potentially used
```
com.visualpathit.account.controller.FileUploadController (Application)
-> org.springframework.web.multipart.support.StandardMultipartHttpServletRequest$StandardMultipartFile (Extension)
-> org.springframework.web.multipart.support.StandardMultipartHttpServletRequest (Extension)
-> org.springframework.http.HttpHeaders (Extension)
-> ❌ org.springframework.http.HttpRange (Vulnerable Component)
```
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Spring Framework, version 5.1, versions 5.0.x prior to 5.0.10, versions 4.3.x prior to 4.3.20, and older unsupported versions on the 4.2.x branch provide support for range requests when serving static resources through the ResourceHttpRequestHandler, or starting in 5.0 when an annotated controller returns an org.springframework.core.io.Resource. A malicious user (or attacker) can add a range header with a high number of ranges, or with wide ranges that overlap, or both, for a denial of service attack. This vulnerability affects applications that depend on either spring-webmvc or spring-webflux. Such applications must also have a registration for serving static resources (e.g. JS, CSS, images, and others), or have an annotated controller that returns an org.springframework.core.io.Resource. Spring Boot applications that depend on spring-boot-starter-web or spring-boot-starter-webflux are ready to serve static resources out of the box and are therefore vulnerable.
<p>Publish Date: 2018-10-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-15756>CVE-2018-15756</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://pivotal.io/security/cve-2018-15756">https://pivotal.io/security/cve-2018-15756</a></p>
<p>Release Date: 2018-10-18</p>
<p>Fix Resolution (org.springframework:spring-web): 4.3.20.RELEASE</p>
<p>Direct dependency fix Resolution (org.springframework.amqp:spring-rabbit): 1.7.11.RELEASE</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
|
non_test
|
cve high reachable detected in spring web release jar cve high severity vulnerability vulnerable library spring web release jar spring web library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org springframework spring web release spring web release jar dependency hierarchy spring rabbit release jar root library x spring web release jar vulnerable library found in base branch vp rem reachability analysis this vulnerability is potentially used com visualpathit account controller fileuploadcontroller application org springframework web multipart support standardmultiparthttpservletrequest standardmultipartfile extension org springframework web multipart support standardmultiparthttpservletrequest extension org springframework http httpheaders extension ❌ org springframework http httprange vulnerable component vulnerability details spring framework version versions x prior to versions x prior to and older unsupported versions on the x branch provide support for range requests when serving static resources through the resourcehttprequesthandler or starting in when an annotated controller returns an org springframework core io resource a malicious user or attacker can add a range header with a high number of ranges or with wide ranges that overlap or both for a denial of service attack this vulnerability affects applications that depend on either spring webmvc or spring webflux such applications must also have a registration for serving static resources e g js css images and others or have an annotated controller that returns an org springframework core io resource spring boot applications that depend on spring boot starter web or spring boot starter webflux are ready to serve static resources out of the box and are therefore vulnerable publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none 
scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework spring web release direct dependency fix resolution org springframework amqp spring rabbit release rescue worker helmet automatic remediation is available for this issue
| 0
|
142,259
| 11,461,455,421
|
IssuesEvent
|
2020-02-07 11:59:13
|
LiskHQ/lisk-sdk
|
https://api.github.com/repos/LiskHQ/lisk-sdk
|
closed
|
Remove istanbul-middleware from the application
|
framework/http_api type: test
|
### Expected behavior
We should come up with a better approach to collect code coverage from functional tests.
We should also discuss if it's needed or does it give any value.
If it's really needed, maybe at least we could strip that code piece from production build.
### Actual behavior
We are using `istanbul-middleware` in the application source code.
### Steps to reproduce
### Which version(s) does this affect? (Environment, OS, etc...)
|
1.0
|
Remove istanbul-middleware from the application - ### Expected behavior
We should come up with a better approach to collect code coverage from functional tests.
We should also discuss if it's needed or does it give any value.
If it's really needed, maybe at least we could strip that code piece from production build.
### Actual behavior
We are using `istanbul-middleware` in the application source code.
### Steps to reproduce
### Which version(s) does this affect? (Environment, OS, etc...)
|
test
|
remove istanbul middleware from the application expected behavior we should come up with a better approach to collect code coverage from functional tests we should also discuss if it s needed or does it give any value if it s really needed maybe at least we could strip that code piece from production build actual behavior we are using istanbul middleware in the application source code steps to reproduce which version s does this affect environment os etc
| 1
|
196,266
| 14,852,699,500
|
IssuesEvent
|
2021-01-18 08:57:35
|
DiSSCo/ELViS
|
https://api.github.com/repos/DiSSCo/ELViS
|
closed
|
Questions and Feedback
|
bug resolved to test
|
#### Description
fill in the different sections and an error occurs when sending it
#### Steps to reproduce the issue
1. Open the section Questions and Feedback
2. Write the text in the field Subject an the explanation in the field comments o question
3. Press the button Submit
#### What's the expected result?
-Send the message
#### What's the actual result?
- None. Error message appears:
"Something went wrong uploading the form, please try again"
#### Additional details / screenshot

|
1.0
|
Questions and Feedback - #### Description
fill in the different sections and an error occurs when sending it
#### Steps to reproduce the issue
1. Open the section Questions and Feedback
2. Write the text in the field Subject an the explanation in the field comments o question
3. Press the button Submit
#### What's the expected result?
-Send the message
#### What's the actual result?
- None. Error message appears:
"Something went wrong uploading the form, please try again"
#### Additional details / screenshot

|
test
|
questions and feedback description fill in the different sections and an error occurs when sending it steps to reproduce the issue open the section questions and feedback write the text in the field subject an the explanation in the field comments o question press the button submit what s the expected result send the message what s the actual result none error message appears something went wrong uploading the form please try again additional details screenshot
| 1
|
333,079
| 29,508,053,316
|
IssuesEvent
|
2023-06-03 15:08:27
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix array_contents.test_numpy_isposinf
|
NumPy Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|
1.0
|
Fix array_contents.test_numpy_isposinf - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5071698123/jobs/9108411237" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|
test
|
fix array contents test numpy isposinf tensorflow img src torch img src numpy img src jax img src paddle img src
| 1
|
553,829
| 16,383,294,452
|
IssuesEvent
|
2021-05-17 07:16:56
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
closed
|
[0.9.3.1] Weird behaviour when creating a split deed
|
Category: Gameplay Priority: High Regression Squad: Wild Turkey Type: Bug
|
So I used an old habit of mine to create a split deed - I had a full store and house on one plot of land, I clicked "edit" and claimed some land somewhere else. I got a warning that this will split my deed, which was expected. What followed however:
- My deed was split as expected
- My original deed was renamed to a default name
- My residency was moved to the new part of the deed (empty land), rather than defaulting to staying where it was. Housing XP didn't change however (probably handled on a different process), nor was I kicked out of my government position for being in the wrong district (again, probably different process) until a few minutes later when I have re-set my residency.
- I got a notification that my store stopped working since I didn't have access to its custom account (wut?). I had to re-set the account and currency to proper ones
I think that feature should be a bit safer to execute:
- Residency should stay with the half of the deed that has the highest housing value
- Stores should remain unaffected
[Player.log](https://github.com/StrangeLoopGames/EcoIssues/files/6355632/Player.log)
To reproduce:
- Have a well developed land with a store, house, etc.
- Click edit
- Claim area that is not connected to the original deed
- Confirm
- Panic
|
1.0
|
[0.9.3.1] Weird behaviour when creating a split deed - So I used an old habit of mine to create a split deed - I had a full store and house on one plot of land, I clicked "edit" and claimed some land somewhere else. I got a warning that this will split my deed, which was expected. What followed however:
- My deed was split as expected
- My original deed was renamed to a default name
- My residency was moved to the new part of the deed (empty land), rather than defaulting to staying where it was. Housing XP didn't change however (probably handled on a different process), nor was I kicked out of my government position for being in the wrong district (again, probably different process) until a few minutes later when I have re-set my residency.
- I got a notification that my store stopped working since I didn't have access to its custom account (wut?). I had to re-set the account and currency to proper ones
I think that feature should be a bit safer to execute:
- Residency should stay with the half of the deed that has the highest housing value
- Stores should remain unaffected
[Player.log](https://github.com/StrangeLoopGames/EcoIssues/files/6355632/Player.log)
To reproduce:
- Have a well developed land with a store, house, etc.
- Click edit
- Claim area that is not connected to the original deed
- Confirm
- Panic
|
non_test
|
weird behaviour when creating a split deed so i used an old habit of mine to create a split deed i had a full store and house on one plot of land i clicked edit and claimed some land somewhere else i got a warning that this will split my deed which was expected what followed however my deed was split as expected my original deed was renamed to a default name my residency was moved to the new part of the deed empty land rather than defaulting to staying where it was housing xp didn t change however probably handled on a different process nor was i kicked out of my government position for being in the wrong district again probably different process until a few minutes later when i have re set my residency i got a notification that my store stopped working since i didn t have access to its custom account wut i had to re set the account and currency to proper ones i think that feature should be a bit safer to execute residency should stay with the half of the deed that has the highest housing value stores should remain unaffected to reproduce have a well developed land with a store house etc click edit claim area that is not connected to the original deed confirm panic
| 0
|
198,601
| 14,988,588,492
|
IssuesEvent
|
2021-01-29 01:37:41
|
celo-org/celo-monorepo
|
https://api.github.com/repos/celo-org/celo-monorepo
|
opened
|
[FLAKEY TEST] mobile-test -> mobile -> checkWeb3SyncProgress -> reports web3 status correctly
|
FLAKEY mobile mobile-test
|
Discovered at commit c19f6705001360b5f498d94c2b40b166442ad82a
Attempt No. 1:
SagaTestError:
put expectation unmet:
Expected
--------
{ '@@redux-saga/IO': true,
combinator: false,
type: 'PUT',
payload:
{ channel: undefined,
action: { type: 'WEB3/COMPLETE_WEB3_SYNC', latestBlockNumber: 200 } } }
at new SagaTestError (/home/circleci/app/node_modules/redux-saga-test-plan/lib/shared/SagaTestError.js:17:57)
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/expectations.js:63:13
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:554:7
at Array.forEach (<anonymous>)
at checkExpectations (/home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:553:18)
at tryCallOne (/home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:37:12)
at /home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:123:15
at flush (/home/circleci/app/node_modules/asap/raw.js:50:29)
at process._tickCallback (internal/process/next_tick.js:61:11)
Attempt No. 2:
SagaTestError:
put expectation unmet:
Expected
--------
{ '@@redux-saga/IO': true,
combinator: false,
type: 'PUT',
payload:
{ channel: undefined,
action:
{ type: 'WEB3/UPDATE_WEB3_SYNC_PROGRESS',
payload: { startingBlock: 0, currentBlock: 10, highestBlock: 100 } } } }
Actual:
------
1. { '@@redux-saga/IO': true,
combinator: false,
type: 'PUT',
payload:
{ channel: undefined,
action: { type: 'WEB3/COMPLETE_WEB3_SYNC', latestBlockNumber: 200 } } }
at new SagaTestError (/home/circleci/app/node_modules/redux-saga-test-plan/lib/shared/SagaTestError.js:17:57)
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/expectations.js:63:13
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:554:7
at Array.forEach (<anonymous>)
at checkExpectations (/home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:553:18)
at tryCallOne (/home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:37:12)
at /home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:123:15
at flush (/home/circleci/app/node_modules/asap/raw.js:50:29)
at process._tickCallback (internal/process/next_tick.js:61:11)
Attempt No. 3:
Test Passed!
|
1.0
|
[FLAKEY TEST] mobile-test -> mobile -> checkWeb3SyncProgress -> reports web3 status correctly - Discovered at commit c19f6705001360b5f498d94c2b40b166442ad82a
Attempt No. 1:
SagaTestError:
put expectation unmet:
Expected
--------
{ '@@redux-saga/IO': true,
combinator: false,
type: 'PUT',
payload:
{ channel: undefined,
action: { type: 'WEB3/COMPLETE_WEB3_SYNC', latestBlockNumber: 200 } } }
at new SagaTestError (/home/circleci/app/node_modules/redux-saga-test-plan/lib/shared/SagaTestError.js:17:57)
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/expectations.js:63:13
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:554:7
at Array.forEach (<anonymous>)
at checkExpectations (/home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:553:18)
at tryCallOne (/home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:37:12)
at /home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:123:15
at flush (/home/circleci/app/node_modules/asap/raw.js:50:29)
at process._tickCallback (internal/process/next_tick.js:61:11)
Attempt No. 2:
SagaTestError:
put expectation unmet:
Expected
--------
{ '@@redux-saga/IO': true,
combinator: false,
type: 'PUT',
payload:
{ channel: undefined,
action:
{ type: 'WEB3/UPDATE_WEB3_SYNC_PROGRESS',
payload: { startingBlock: 0, currentBlock: 10, highestBlock: 100 } } } }
Actual:
------
1. { '@@redux-saga/IO': true,
combinator: false,
type: 'PUT',
payload:
{ channel: undefined,
action: { type: 'WEB3/COMPLETE_WEB3_SYNC', latestBlockNumber: 200 } } }
at new SagaTestError (/home/circleci/app/node_modules/redux-saga-test-plan/lib/shared/SagaTestError.js:17:57)
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/expectations.js:63:13
at /home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:554:7
at Array.forEach (<anonymous>)
at checkExpectations (/home/circleci/app/node_modules/redux-saga-test-plan/lib/expectSaga/index.js:553:18)
at tryCallOne (/home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:37:12)
at /home/circleci/app/node_modules/react-native/node_modules/promise/lib/core.js:123:15
at flush (/home/circleci/app/node_modules/asap/raw.js:50:29)
at process._tickCallback (internal/process/next_tick.js:61:11)
Attempt No. 3:
Test Passed!
|
test
|
mobile test mobile reports status correctly discovered at commit attempt no sagatesterror put expectation unmet expected redux saga io true combinator false type put payload channel undefined action type complete sync latestblocknumber at new sagatesterror home circleci app node modules redux saga test plan lib shared sagatesterror js at home circleci app node modules redux saga test plan lib expectsaga expectations js at home circleci app node modules redux saga test plan lib expectsaga index js at array foreach at checkexpectations home circleci app node modules redux saga test plan lib expectsaga index js at trycallone home circleci app node modules react native node modules promise lib core js at home circleci app node modules react native node modules promise lib core js at flush home circleci app node modules asap raw js at process tickcallback internal process next tick js attempt no sagatesterror put expectation unmet expected redux saga io true combinator false type put payload channel undefined action type update sync progress payload startingblock currentblock highestblock actual redux saga io true combinator false type put payload channel undefined action type complete sync latestblocknumber at new sagatesterror home circleci app node modules redux saga test plan lib shared sagatesterror js at home circleci app node modules redux saga test plan lib expectsaga expectations js at home circleci app node modules redux saga test plan lib expectsaga index js at array foreach at checkexpectations home circleci app node modules redux saga test plan lib expectsaga index js at trycallone home circleci app node modules react native node modules promise lib core js at home circleci app node modules react native node modules promise lib core js at flush home circleci app node modules asap raw js at process tickcallback internal process next tick js attempt no test passed
| 1
|
435
| 2,868,698,817
|
IssuesEvent
|
2015-06-05 20:29:17
|
gremau/NMEG_fluxproc_testing
|
https://api.github.com/repos/gremau/NMEG_fluxproc_testing
|
closed
|
Standardize relative humidity data to percent (0-100)
|
Ameriflux files QC Process
|
Ameriflux requests data in percent (0-100)
There is some inconsistency in this among files and processing code that should be fixed.
This will require changes in `UNM_Ameriflux_prepare_output_data` and `UNM_RemoveBadData` at a minimum
|
1.0
|
Standardize relative humidity data to percent (0-100) - Ameriflux requests data in percent (0-100)
There is some inconsistency in this among files and processing code that should be fixed.
This will require changes in `UNM_Ameriflux_prepare_output_data` and `UNM_RemoveBadData` at a minimum
|
non_test
|
standardize relative humidity data to percent ameriflux requests data in percent there is some inconsistency in this among files and processing code that should be fixed this will require changes in unm ameriflux prepare output data and unm removebaddata at a minimum
| 0
|
40,297
| 12,758,142,209
|
IssuesEvent
|
2020-06-29 01:04:02
|
kenferrara/WebGoat
|
https://api.github.com/repos/kenferrara/WebGoat
|
opened
|
CVE-2018-11696 (High) detected in node-sass-4.11.0.tgz
|
security vulnerability
|
## CVE-2018-11696 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.11.0.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/WebGoat/docs/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/WebGoat/docs/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- gulp-sass-4.0.2.tgz (Root Library)
- :x: **node-sass-4.11.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.4. A NULL pointer dereference was found in the function Sass::Inspect::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11696>CVE-2018-11696</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sass/libsass/issues/2665">https://github.com/sass/libsass/issues/2665</a></p>
<p>Release Date: 2018-06-04</p>
<p>Fix Resolution: Libsass:3.5.5, Node-sass:4.14.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.11.0","isTransitiveDependency":true,"dependencyTree":"gulp-sass:4.0.2;node-sass:4.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Libsass:3.5.5, Node-sass:4.14.0"}],"vulnerabilityIdentifier":"CVE-2018-11696","vulnerabilityDetails":"An issue was discovered in LibSass through 3.5.4. A NULL pointer dereference was found in the function Sass::Inspect::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11696","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2018-11696 (High) detected in node-sass-4.11.0.tgz - ## CVE-2018-11696 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.11.0.tgz</b></p></summary>
<p>Wrapper around libsass</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.11.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/WebGoat/docs/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/WebGoat/docs/node_modules/node-sass/package.json</p>
<p>
Dependency Hierarchy:
- gulp-sass-4.0.2.tgz (Root Library)
- :x: **node-sass-4.11.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in LibSass through 3.5.4. A NULL pointer dereference was found in the function Sass::Inspect::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.
<p>Publish Date: 2018-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11696>CVE-2018-11696</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sass/libsass/issues/2665">https://github.com/sass/libsass/issues/2665</a></p>
<p>Release Date: 2018-06-04</p>
<p>Fix Resolution: Libsass:3.5.5, Node-sass:4.14.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.11.0","isTransitiveDependency":true,"dependencyTree":"gulp-sass:4.0.2;node-sass:4.11.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Libsass:3.5.5, Node-sass:4.14.0"}],"vulnerabilityIdentifier":"CVE-2018-11696","vulnerabilityDetails":"An issue was discovered in LibSass through 3.5.4. A NULL pointer dereference was found in the function Sass::Inspect::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11696","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve high detected in node sass tgz cve high severity vulnerability vulnerable library node sass tgz wrapper around libsass library home page a href path to dependency file tmp ws scm webgoat docs package json path to vulnerable library tmp ws scm webgoat docs node modules node sass package json dependency hierarchy gulp sass tgz root library x node sass tgz vulnerable library vulnerability details an issue was discovered in libsass through a null pointer dereference was found in the function sass inspect operator which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass node sass isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails an issue was discovered in libsass through a null pointer dereference was found in the function sass inspect operator which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact vulnerabilityurl
| 0
|
17,356
| 3,605,385,281
|
IssuesEvent
|
2016-02-04 04:44:56
|
red/red
|
https://api.github.com/repos/red/red
|
closed
|
`foreach` always iterates over 3 pixels
|
Red status.built status.tested type.bug
|
`foreach` always iterates over 3 pixels of an image, regardless of the size.
```
red>> foreach i make image! 100x100 [probe i]
0.0.0.255
0.0.0.255
0.0.0.255
== 0.0.0.255
```
```
red>> foreach i make image! 1x1 [probe i]
0.0.0.255
none
none
== none
```
|
1.0
|
`foreach` always iterates over 3 pixels - `foreach` always iterates over 3 pixels of an image, regardless of the size.
```
red>> foreach i make image! 100x100 [probe i]
0.0.0.255
0.0.0.255
0.0.0.255
== 0.0.0.255
```
```
red>> foreach i make image! 1x1 [probe i]
0.0.0.255
none
none
== none
```
|
test
|
foreach always iterates over pixels foreach always iterates over pixels of an image regardless of the size red foreach i make image red foreach i make image none none none
| 1
|
255,964
| 21,993,605,092
|
IssuesEvent
|
2022-05-26 02:22:08
|
saxbophone/arby
|
https://api.github.com/repos/saxbophone/arby
|
opened
|
Run unit tests with valgrind on CI
|
testing
|
Probably doing this on Linux is easiest —there's manual memory management in the code!
|
1.0
|
Run unit tests with valgrind on CI - Probably doing this on Linux is easiest —there's manual memory management in the code!
|
test
|
run unit tests with valgrind on ci probably doing this on linux is easiest —there s manual memory management in the code
| 1
|
37,588
| 5,126,217,667
|
IssuesEvent
|
2017-01-10 00:52:01
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
Running `flutter test` in checked mode causes assertion failures.
|
dev: tests
|
Specifically, the assertion failure [is here](https://github.com/flutter/flutter/blob/1fa8a254a308f517ec133fbba5cc9fdb39698a72/packages/flutter_tools/lib/src/test/flutter_platform.dart#L166). We should also be running the tests in checked mode.
```
unhandled error during test:
'package:flutter_tools/src/test/flutter_platform.dart': Failed assertion: line 166 pos 18: '!controllerSinkClosed' is not true.
#0 _AssertionError._throwNew (dart:core-patch/errors_patch.dart:24)
#1 _AssertionError._checkAssertion (dart:core-patch/errors_patch.dart:31)
#2 FlutterPlatform._startTest.<_startTest_async_body> (package:flutter_tools/src/test/flutter_platform.dart:166:18)
#3 _asyncThenWrapperHelper.<anonymous closure> (dart:async-patch/async_patch.dart:27)
#4 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#5 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#6 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#7 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#8 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#9 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#10 _rootRunUnary (dart:async/zone.dart:1158)
#11 _CustomZone.runUnary (dart:async/zone.dart:1037)
#12 _FutureListener.handleValue (dart:async/future_impl.dart:131)
#13 _Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:637)
#14 _Future._propagateToListeners (dart:async/future_impl.dart:667)
#15 _Future._complete (dart:async/future_impl.dart:467)
#16 _SyncCompleter.complete (dart:async/future_impl.dart:52)
#17 Future.any.<anonymous closure> (dart:async/future.dart:356)
#18 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#19 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#20 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#21 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#22 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#23 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#24 _rootRunUnary (dart:async/zone.dart:1158)
#25 _CustomZone.runUnary (dart:async/zone.dart:1037)
#26 _FutureListener.handleValue (dart:async/future_impl.dart:131)
#27 _Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:637)
#28 _Future._propagateToListeners (dart:async/future_impl.dart:667)
#29 _Future._completeWithValue (dart:async/future_impl.dart:477)
#30 _Future._asyncComplete.<anonymous closure> (dart:async/future_impl.dart:528)
#31 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#32 StackZoneSpecification._registerCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:97:48)
#33 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#34 StackZoneSpecification._registerCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:97:48)
#35 _rootRun (dart:async/zone.dart:1150)
#36 _CustomZone.run (dart:async/zone.dart:1026)
#37 _CustomZone.runGuarded (dart:async/zone.dart:924)
#38 _CustomZone.bindCallback.<anonymous closure> (dart:async/zone.dart:951)
#39 _microtaskLoop (dart:async/schedule_microtask.dart:41)
#40 _startMicrotaskLoop (dart:async/schedule_microtask.dart:50)
#41 _Timer._runTimers (dart:isolate-patch/timer_impl.dart:394)
#42 _Timer._handleMessage (dart:isolate-patch/timer_impl.dart:414)
#43 _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:148)
```
|
1.0
|
Running `flutter test` in checked mode causes assertion failures. - Specifically, the assertion failure [is here](https://github.com/flutter/flutter/blob/1fa8a254a308f517ec133fbba5cc9fdb39698a72/packages/flutter_tools/lib/src/test/flutter_platform.dart#L166). We should also be running the tests in checked mode.
```
unhandled error during test:
'package:flutter_tools/src/test/flutter_platform.dart': Failed assertion: line 166 pos 18: '!controllerSinkClosed' is not true.
#0 _AssertionError._throwNew (dart:core-patch/errors_patch.dart:24)
#1 _AssertionError._checkAssertion (dart:core-patch/errors_patch.dart:31)
#2 FlutterPlatform._startTest.<_startTest_async_body> (package:flutter_tools/src/test/flutter_platform.dart:166:18)
#3 _asyncThenWrapperHelper.<anonymous closure> (dart:async-patch/async_patch.dart:27)
#4 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#5 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#6 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#7 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#8 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#9 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#10 _rootRunUnary (dart:async/zone.dart:1158)
#11 _CustomZone.runUnary (dart:async/zone.dart:1037)
#12 _FutureListener.handleValue (dart:async/future_impl.dart:131)
#13 _Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:637)
#14 _Future._propagateToListeners (dart:async/future_impl.dart:667)
#15 _Future._complete (dart:async/future_impl.dart:467)
#16 _SyncCompleter.complete (dart:async/future_impl.dart:52)
#17 Future.any.<anonymous closure> (dart:async/future.dart:356)
#18 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#19 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#20 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#21 StackZoneSpecification._registerUnaryCallback.<anonymous closure>.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:26)
#22 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#23 StackZoneSpecification._registerUnaryCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:107:14)
#24 _rootRunUnary (dart:async/zone.dart:1158)
#25 _CustomZone.runUnary (dart:async/zone.dart:1037)
#26 _FutureListener.handleValue (dart:async/future_impl.dart:131)
#27 _Future._propagateToListeners.handleValueCallback (dart:async/future_impl.dart:637)
#28 _Future._propagateToListeners (dart:async/future_impl.dart:667)
#29 _Future._completeWithValue (dart:async/future_impl.dart:477)
#30 _Future._asyncComplete.<anonymous closure> (dart:async/future_impl.dart:528)
#31 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#32 StackZoneSpecification._registerCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:97:48)
#33 StackZoneSpecification._run (package:stack_trace/src/stack_zone_specification.dart:185:15)
#34 StackZoneSpecification._registerCallback.<anonymous closure> (package:stack_trace/src/stack_zone_specification.dart:97:48)
#35 _rootRun (dart:async/zone.dart:1150)
#36 _CustomZone.run (dart:async/zone.dart:1026)
#37 _CustomZone.runGuarded (dart:async/zone.dart:924)
#38 _CustomZone.bindCallback.<anonymous closure> (dart:async/zone.dart:951)
#39 _microtaskLoop (dart:async/schedule_microtask.dart:41)
#40 _startMicrotaskLoop (dart:async/schedule_microtask.dart:50)
#41 _Timer._runTimers (dart:isolate-patch/timer_impl.dart:394)
#42 _Timer._handleMessage (dart:isolate-patch/timer_impl.dart:414)
#43 _RawReceivePortImpl._handleMessage (dart:isolate-patch/isolate_patch.dart:148)
```
|
test
|
running flutter test in checked mode causes assertion failures specifically the assertion failure we should also be running the tests in checked mode unhandled error during test package flutter tools src test flutter platform dart failed assertion line pos controllersinkclosed is not true assertionerror thrownew dart core patch errors patch dart assertionerror checkassertion dart core patch errors patch dart flutterplatform starttest package flutter tools src test flutter platform dart asyncthenwrapperhelper dart async patch async patch dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart stackzonespecification run package stack trace src stack zone specification dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart stackzonespecification run package stack trace src stack zone specification dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart rootrununary dart async zone dart customzone rununary dart async zone dart futurelistener handlevalue dart async future impl dart future propagatetolisteners handlevaluecallback dart async future impl dart future propagatetolisteners dart async future impl dart future complete dart async future impl dart synccompleter complete dart async future impl dart future any dart async future dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart stackzonespecification run package stack trace src stack zone specification dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart stackzonespecification run package stack trace src stack zone specification dart stackzonespecification registerunarycallback package stack trace src stack zone specification dart rootrununary dart async zone dart customzone rununary dart async zone dart futurelistener handlevalue dart async future impl dart future propagatetolisteners handlevaluecallback dart async future impl dart future propagatetolisteners dart async future impl dart future completewithvalue dart async future impl dart future asynccomplete dart async future impl dart stackzonespecification run package stack trace src stack zone specification dart stackzonespecification registercallback package stack trace src stack zone specification dart stackzonespecification run package stack trace src stack zone specification dart stackzonespecification registercallback package stack trace src stack zone specification dart rootrun dart async zone dart customzone run dart async zone dart customzone runguarded dart async zone dart customzone bindcallback dart async zone dart microtaskloop dart async schedule microtask dart startmicrotaskloop dart async schedule microtask dart timer runtimers dart isolate patch timer impl dart timer handlemessage dart isolate patch timer impl dart rawreceiveportimpl handlemessage dart isolate patch isolate patch dart
| 1
|
321,326
| 27,520,931,356
|
IssuesEvent
|
2023-03-06 15:00:08
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix view_tensor.test_view_tensor_asin
|
PyTorch Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4132051526/jobs/7140321500" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4132051526/jobs/7140317679" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4132051526/jobs/7140322185" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
|
1.0
|
Fix view_tensor.test_view_tensor_asin - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4132051526/jobs/7140321500" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4132051526/jobs/7140317679" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4132051526/jobs/7140322185" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
|
test
|
fix view tensor test view tensor asin tensorflow img src torch img src numpy img src jax img src not found not found not found not found not found not found
| 1
|
69
| 2,497,821,084
|
IssuesEvent
|
2015-01-07 11:17:31
|
Financial-Times/o-fonts
|
https://api.github.com/repos/Financial-Times/o-fonts
|
opened
|
New cut of Financier and Metric fonts
|
design
|
This is a placeholder ticket for now to contain any potential discrepancies, queries, pointers and issues with the new fonts. (hopefully it all goes smoothly and they work with now issues though)
|
1.0
|
New cut of Financier and Metric fonts - This is a placeholder ticket for now to contain any potential discrepancies, queries, pointers and issues with the new fonts. (hopefully it all goes smoothly and they work with now issues though)
|
non_test
|
new cut of financier and metric fonts this is a placeholder ticket for now to contain any potential discrepancies queries pointers and issues with the new fonts hopefully it all goes smoothly and they work with now issues though
| 0
|
166,863
| 12,974,610,253
|
IssuesEvent
|
2020-07-21 15:40:57
|
matplotlib/matplotlib
|
https://api.github.com/repos/matplotlib/matplotlib
|
opened
|
MEP22: Improve test coverage
|
MEP22 Testing
|
toolmanager test coverage should be increased before making MEP22 the default.
For example currently `examples/user_interfaces/toolmanager.py` is broken on Qt. The fix is easy (it's just adding a check for None), but having coverage for that would have been better.
|
1.0
|
MEP22: Improve test coverage - toolmanager test coverage should be increased before making MEP22 the default.
For example currently `examples/user_interfaces/toolmanager.py` is broken on Qt. The fix is easy (it's just adding a check for None), but having coverage for that would have been better.
|
test
|
improve test coverage toolmanager test coverage should be increased before making the default for example currently examples user interfaces toolmanager py is broken on qt the fix is easy it s just adding a check for none but having coverage for that would have been better
| 1
|
67,266
| 12,893,079,367
|
IssuesEvent
|
2020-07-13 20:52:29
|
pnp/pnpjs
|
https://api.github.com/repos/pnp/pnpjs
|
closed
|
Graph.me.drive.getItemById(<Id>).getContent() => Error : Failed to fetch
|
area: code status: complete type: enhancement
|
Hi All !
### Category
- [ ] Enhancement
- [x] Bug
- [x] Question
- [ ] Documentation gap/issue
### Version
Please specify what version of the library you are using: [ 2.0.5 ]
Please specify what version(s) of SharePoint you are targeting: [ SPO ]
### Expected / Desired Behavior / Question
graph.me.drive.getItemById(<fileId>).getContent() returns me the content of the file
### Observed Behavior
I have an exception with Error : Failed to fetch
### Steps to Reproduce
Create a new SPFX Webpart
Install pnp graph
Execut the following code with a valid file path
```
this.props.graphClient
.api('/me/drive/root:/MyFolder/MyFile.json')
.version('v1.0')
.get((error, response: any, rawResponse?: any) => {
graph.me.drive.getItemById(response.id).getContent()
.then(file => {
})
.catch(err => {
debugger;
});
});
```
I correctly set up permissions with Files.Read and User.Read.
I correctly obtain file infos.
I correctly obtain a working download url with field @microsoft.graph.downloadUrl .
For information, I did not manage to have Microsoft Graph API working with Graph explorer with
`/me/drive/root:/MyFolder/MyFile.json:/content`
This call is endlessly loading with no answser or error.
### More information
The underlying reason I tried to use this method is explained here.
I am trying to get a JSon file content stored in my Office 365 OneDrive to process it in an SPFX Webpart.
I tryed many method and none of them worked until now.
Any advice welcome to succeed in something I thought was trivial !
|
1.0
|
Graph.me.drive.getItemById(<Id>).getContent() => Error : Failed to fetch - Hi All !
### Category
- [ ] Enhancement
- [x] Bug
- [x] Question
- [ ] Documentation gap/issue
### Version
Please specify what version of the library you are using: [ 2.0.5 ]
Please specify what version(s) of SharePoint you are targeting: [ SPO ]
### Expected / Desired Behavior / Question
graph.me.drive.getItemById(<fileId>).getContent() returns me the content of the file
### Observed Behavior
I have an exception with Error : Failed to fetch
### Steps to Reproduce
Create a new SPFX Webpart
Install pnp graph
Execut the following code with a valid file path
```
this.props.graphClient
.api('/me/drive/root:/MyFolder/MyFile.json')
.version('v1.0')
.get((error, response: any, rawResponse?: any) => {
graph.me.drive.getItemById(response.id).getContent()
.then(file => {
})
.catch(err => {
debugger;
});
});
```
I correctly set up permissions with Files.Read and User.Read.
I correctly obtain file infos.
I correctly obtain a working download url with field @microsoft.graph.downloadUrl .
For information, I did not manage to have Microsoft Graph API working with Graph explorer with
`/me/drive/root:/MyFolder/MyFile.json:/content`
This call is endlessly loading with no answser or error.
### More information
The underlying reason I tried to use this method is explained here.
I am trying to get a JSon file content stored in my Office 365 OneDrive to process it in an SPFX Webpart.
I tryed many method and none of them worked until now.
Any advice welcome to succeed in something I thought was trivial !
|
non_test
|
graph me drive getitembyid getcontent error failed to fetch hi all category enhancement bug question documentation gap issue version please specify what version of the library you are using please specify what version s of sharepoint you are targeting expected desired behavior question graph me drive getitembyid getcontent returns me the content of the file observed behavior i have an exception with error failed to fetch steps to reproduce create a new spfx webpart install pnp graph execut the following code with a valid file path this props graphclient api me drive root myfolder myfile json version get error response any rawresponse any graph me drive getitembyid response id getcontent then file catch err debugger i correctly set up permissions with files read and user read i correctly obtain file infos i correctly obtain a working download url with field microsoft graph downloadurl for information i did not manage to have microsoft graph api working with graph explorer with me drive root myfolder myfile json content this call is endlessly loading with no answser or error more information the underlying reason i tried to use this method is explained here i am trying to get a json file content stored in my office onedrive to process it in an spfx webpart i tryed many method and none of them worked until now any advice welcome to succeed in something i thought was trivial
| 0
|
227,724
| 17,397,500,154
|
IssuesEvent
|
2021-08-02 15:06:06
|
aHub-Tech/twitch-bot-wildoverflow
|
https://api.github.com/repos/aHub-Tech/twitch-bot-wildoverflow
|
opened
|
Documentação: Criar arquivo de CONTRIBUIÇÃO
|
documentation
|
Deverá ser criado um arquivo de contribuição para o projeto.
|
1.0
|
Documentação: Criar arquivo de CONTRIBUIÇÃO - Deverá ser criado um arquivo de contribuição para o projeto.
|
non_test
|
documentação criar arquivo de contribuição deverá ser criado um arquivo de contribuição para o projeto
| 0
|
302,448
| 26,144,722,054
|
IssuesEvent
|
2022-12-30 01:40:29
|
ubcuas/ACOM
|
https://api.github.com/repos/ubcuas/ACOM
|
opened
|
Add Tests
|
Priority: High Status: Available Type: Testing
|
As we have added features to ACOM appropriate tests have not been added. Add missing tests where possible.
|
1.0
|
Add Tests - As we have added features to ACOM appropriate tests have not been added. Add missing tests where possible.
|
test
|
add tests as we have added features to acom appropriate tests have not been added add missing tests where possible
| 1
|
152,174
| 23,925,465,382
|
IssuesEvent
|
2022-09-09 21:56:43
|
bcgov/cas-public-reporting
|
https://api.github.com/repos/bcgov/cas-public-reporting
|
closed
|
Schedule Industry & Academia research analysis session with ES
|
Service Design
|
Research analysis is best done with more than one set of eyes! This ticket documents analyzing results from Industry & Academia interviews to find common themes, key user personas, and possibly problem statements.
**Acceptance Criteria:**
- [x] Identify the goals of session
- [x] Schedule time with ES
|
1.0
|
Schedule Industry & Academia research analysis session with ES - Research analysis is best done with more than one set of eyes! This ticket documents analyzing results from Industry & Academia interviews to find common themes, key user personas, and possibly problem statements.
**Acceptance Criteria:**
- [x] Identify the goals of session
- [x] Schedule time with ES
|
non_test
|
schedule industry academia research analysis session with es research analysis is best done with more than one set of eyes this ticket documents analyzing results from industry academia interviews to find common themes key user personas and possibly problem statements acceptance criteria identify the goals of session schedule time with es
| 0
|
15,609
| 3,477,704,181
|
IssuesEvent
|
2015-12-28 04:08:26
|
CANTUS-Project/abbot
|
https://api.github.com/repos/CANTUS-Project/abbot
|
opened
|
Add SEARCH integration tests
|
testing
|
Arising from #80. I realized there are no integration tests for SimpleHandler.search() or ComplexHandler.search().
|
1.0
|
Add SEARCH integration tests - Arising from #80. I realized there are no integration tests for SimpleHandler.search() or ComplexHandler.search().
|
test
|
add search integration tests arising from i realized there are no integration tests for simplehandler search or complexhandler search
| 1
|
226,653
| 18,043,761,706
|
IssuesEvent
|
2021-09-18 14:14:13
|
logicmoo/logicmoo_workspace
|
https://api.github.com/repos/logicmoo/logicmoo_workspace
|
opened
|
logicmoo.pfc.test.sanity_base.MT_01C_0A JUnit
|
Test_9999 logicmoo.pfc.test.sanity_base unit_test MT_01C_0A
|
(cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01c_0a.pl)
GH_MASTER_ISSUE_FINFO=
GH_MASTER_ISSUE_ID=# ''
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
This: https://jenkins.logicmoo.org/job/logicmoo_workspace/63/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/e25701286bb075746fc41e292c4a711c800a3806
https://github.com/logicmoo/logicmoo_workspace/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/e25701286bb075746fc41e292c4a711c800a3806
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01C_0A
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl'),
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
% :- rtrace.
:- expects_dialect(pfc).
:- must(is_pfc_file).
sHOW_MUST_go_on_failed_F__A__I__L_(baseKB:is_pfc_file)
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl#L17
%~ error( sHOW_MUST_go_on_failed_F__A__I__L_( baseKB : is_pfc_file))
%~ FILE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl#L17
```
totalTime=10
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
This: https://jenkins.logicmoo.org/job/logicmoo_workspace/63/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/e25701286bb075746fc41e292c4a711c800a3806
https://github.com/logicmoo/logicmoo_workspace/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/e25701286bb075746fc41e292c4a711c800a3806
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01C_0A
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k mt_01c_0a.pl (returned 137)
|
3.0
|
logicmoo.pfc.test.sanity_base.MT_01C_0A JUnit - (cd /var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base ; timeout --foreground --preserve-status -s SIGKILL -k 10s 10s lmoo-clif mt_01c_0a.pl)
GH_MASTER_ISSUE_FINFO=
GH_MASTER_ISSUE_ID=# ''
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
This: https://jenkins.logicmoo.org/job/logicmoo_workspace/63/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/e25701286bb075746fc41e292c4a711c800a3806
https://github.com/logicmoo/logicmoo_workspace/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/e25701286bb075746fc41e292c4a711c800a3806
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01C_0A
```
%
running('/var/lib/jenkins/workspace/logicmoo_workspace/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl'),
%~ this_test_might_need( :-( use_module( library(logicmoo_plarkc))))
%~ this_test_might_need( :-( expects_dialect(pfc)))
% :- rtrace.
:- expects_dialect(pfc).
:- must(is_pfc_file).
sHOW_MUST_go_on_failed_F__A__I__L_(baseKB:is_pfc_file)
%~ FIlE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl#L17
%~ error( sHOW_MUST_go_on_failed_F__A__I__L_( baseKB : is_pfc_file))
%~ FILE: * https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/blob/master/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl#L17
```
totalTime=10
Latest: https://jenkins.logicmoo.org/job/logicmoo_workspace/lastBuild/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
This: https://jenkins.logicmoo.org/job/logicmoo_workspace/63/testReport/logicmoo.pfc.test.sanity_base/MT_01C_0A/logicmoo_pfc_test_sanity_base_MT_01C_0A_JUnit/
GITHUB: https://github.com/logicmoo/logicmoo_workspace/commit/e25701286bb075746fc41e292c4a711c800a3806
https://github.com/logicmoo/logicmoo_workspace/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
GITLAB: https://logicmoo.org:2082/gitlab/logicmoo/logicmoo_workspace/-/commit/e25701286bb075746fc41e292c4a711c800a3806
https://gitlab.logicmoo.org/gitlab/logicmoo/logicmoo_workspace/-/blob/e25701286bb075746fc41e292c4a711c800a3806/packs_sys/pfc/t/sanity_base/mt_01c_0a.pl
ISSUE_SEARCH: https://github.com/logicmoo/logicmoo_workspace/issues?q=is%3Aissue+label%3AMT_01C_0A
FAILED: /var/lib/jenkins/workspace/logicmoo_workspace/bin/lmoo-junit-minor -k mt_01c_0a.pl (returned 137)
|
test
|
logicmoo pfc test sanity base mt junit cd var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base timeout foreground preserve status s sigkill k lmoo clif mt pl gh master issue finfo gh master issue id latest this github gitlab issue search running var lib jenkins workspace logicmoo workspace packs sys pfc t sanity base mt pl this test might need use module library logicmoo plarkc this test might need expects dialect pfc rtrace expects dialect pfc must is pfc file show must go on failed f a i l basekb is pfc file file error show must go on failed f a i l basekb is pfc file file totaltime latest this github gitlab issue search failed var lib jenkins workspace logicmoo workspace bin lmoo junit minor k mt pl returned
| 1
|