Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 844 | labels stringlengths 4 721 | body stringlengths 1 261k | index stringclasses 12 values | text_combine stringlengths 96 261k | label stringclasses 2 values | text stringlengths 96 248k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
1,310 | 3,575,435,539 | IssuesEvent | 2016-01-27 15:57:39 | mesosphere/marathon | https://api.github.com/repos/mesosphere/marathon | closed | Event callbacks hold on to unresponsive hosts | bug in progress service | When using event callbacks, an unresponsive host will be tried six times for every event. We'd like the option of a specified number of failed endpoint connections or a specified period of endpoint unavailability to trigger the removal of that endpoint. | 1.0 | Event callbacks hold on to unresponsive hosts - When using event callbacks, an unresponsive host will be tried six times for every event. We'd like the option of a specified number of failed endpoint connections or a specified period of endpoint unavailability to trigger the removal of that endpoint. | non_priority | event callbacks hold on to unresponsive hosts when using event callbacks an unresponsive host will be tried six times for every event we d like the option of a specified number of failed endpoint connections or a specified period of endpoint unavailability to trigger the removal of that endpoint | 0 |
40,028 | 20,380,371,037 | IssuesEvent | 2022-02-21 20:49:32 | mratsim/constantine | https://api.github.com/repos/mratsim/constantine | closed | Karabina's compressed cyclotomic squaring | performance :checkered_flag: | The paper
- Squaring in Cyclotomic Subgroups\
Koray Karabina, 2010\
https://eprint.iacr.org/2010/542.pdf
proposes 5 cyclotomic squaring formulae with various speed tradeoffs as they use alternate compressed coordinates and so require compression/decompression overhead which or some involves extension field inversion.
When we can stay in the compressed domain for squaring, i.e. when the exponent has very low Haming Weight for example for BN254-Nogami which has 55 squarings in a row and maybe for BLS12-381 with 32 squarings, the tradeoff are worthwhile and might lead to 20~25% perf boost vs Granger Scott based cyclotomic exponentiation.

| True | Karabina's compressed cyclotomic squaring - The paper
- Squaring in Cyclotomic Subgroups\
Koray Karabina, 2010\
https://eprint.iacr.org/2010/542.pdf
proposes 5 cyclotomic squaring formulae with various speed tradeoffs as they use alternate compressed coordinates and so require compression/decompression overhead which or some involves extension field inversion.
When we can stay in the compressed domain for squaring, i.e. when the exponent has very low Haming Weight for example for BN254-Nogami which has 55 squarings in a row and maybe for BLS12-381 with 32 squarings, the tradeoff are worthwhile and might lead to 20~25% perf boost vs Granger Scott based cyclotomic exponentiation.

| non_priority | karabina s compressed cyclotomic squaring the paper squaring in cyclotomic subgroups koray karabina proposes cyclotomic squaring formulae with various speed tradeoffs as they use alternate compressed coordinates and so require compression decompression overhead which or some involves extension field inversion when we can stay in the compressed domain for squaring i e when the exponent has very low haming weight for example for nogami which has squarings in a row and maybe for with squarings the tradeoff are worthwhile and might lead to perf boost vs granger scott based cyclotomic exponentiation | 0 |
172,941 | 13,358,708,452 | IssuesEvent | 2020-08-31 12:11:49 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | kv/kvserver: TestStoreMetrics failed | C-test-failure O-robot branch-master | [(kv/kvserver).TestStoreMetrics failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2145499&tab=buildLog) on [master@1cbffd204d90523b770d7c90c35cb34a045b50e7](https://github.com/cockroachdb/cockroach/commits/1cbffd204d90523b770d7c90c35cb34a045b50e7):
```
=== RUN TestStoreMetrics
TestStoreMetrics: test_log_scope.go:85: test logs captured to: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestStoreMetrics654905305
TestStoreMetrics: test_log_scope.go:58: use -show-logs to present logs inline
TestStoreMetrics: client_metrics_test.go:296: verifyStats failed, aborting test.
test logs left over in: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestStoreMetrics654905305
TestStoreMetrics: testing.go:906: race detected during execution of test
--- FAIL: TestStoreMetrics (1.41s)
```
<details><summary>More</summary><p>
Parameters:
- TAGS=
- GOFLAGS=-race -parallel=2
```
make stressrace TESTS=TestStoreMetrics PKG=./pkg/kv/kvserver TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestStoreMetrics.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| 1.0 | kv/kvserver: TestStoreMetrics failed - [(kv/kvserver).TestStoreMetrics failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2145499&tab=buildLog) on [master@1cbffd204d90523b770d7c90c35cb34a045b50e7](https://github.com/cockroachdb/cockroach/commits/1cbffd204d90523b770d7c90c35cb34a045b50e7):
```
=== RUN TestStoreMetrics
TestStoreMetrics: test_log_scope.go:85: test logs captured to: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestStoreMetrics654905305
TestStoreMetrics: test_log_scope.go:58: use -show-logs to present logs inline
TestStoreMetrics: client_metrics_test.go:296: verifyStats failed, aborting test.
test logs left over in: /go/src/github.com/cockroachdb/cockroach/artifacts/logTestStoreMetrics654905305
TestStoreMetrics: testing.go:906: race detected during execution of test
--- FAIL: TestStoreMetrics (1.41s)
```
<details><summary>More</summary><p>
Parameters:
- TAGS=
- GOFLAGS=-race -parallel=2
```
make stressrace TESTS=TestStoreMetrics PKG=./pkg/kv/kvserver TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestStoreMetrics.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| non_priority | kv kvserver teststoremetrics failed on run teststoremetrics teststoremetrics test log scope go test logs captured to go src github com cockroachdb cockroach artifacts teststoremetrics test log scope go use show logs to present logs inline teststoremetrics client metrics test go verifystats failed aborting test test logs left over in go src github com cockroachdb cockroach artifacts teststoremetrics testing go race detected during execution of test fail teststoremetrics more parameters tags goflags race parallel make stressrace tests teststoremetrics pkg pkg kv kvserver testtimeout stressflags timeout powered by | 0 |
247,921 | 20,988,436,042 | IssuesEvent | 2022-03-29 06:59:25 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | roachtest: disk-stalled/log=false,data=true failed | C-test-failure O-robot O-roachtest branch-master release-blocker T-storage | roachtest.disk-stalled/log=false,data=true [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4713654&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4713654&tab=artifacts#/disk-stalled/log=false,data=true) on master @ [29716850b181718594663889ddb5f479fef7a305](https://github.com/cockroachdb/cockroach/commits/29716850b181718594663889ddb5f479fef7a305):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /artifacts/disk-stalled/log=false_data=true/run_1
disk_stall.go:132,disk_stall.go:44,test_runner.go:875: unexpected output: Non-zero exit code: 134
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/storage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*disk-stalled/log=false,data=true.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-14302 | 2.0 | roachtest: disk-stalled/log=false,data=true failed - roachtest.disk-stalled/log=false,data=true [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=4713654&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=4713654&tab=artifacts#/disk-stalled/log=false,data=true) on master @ [29716850b181718594663889ddb5f479fef7a305](https://github.com/cockroachdb/cockroach/commits/29716850b181718594663889ddb5f479fef7a305):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /artifacts/disk-stalled/log=false_data=true/run_1
disk_stall.go:132,disk_stall.go:44,test_runner.go:875: unexpected output: Non-zero exit code: 134
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/storage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*disk-stalled/log=false,data=true.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
Jira issue: CRDB-14302 | non_priority | roachtest disk stalled log false data true failed roachtest disk stalled log false data true with on master the test failed on branch master cloud gce test artifacts and logs in artifacts disk stalled log false data true run disk stall go disk stall go test runner go unexpected output non zero exit code help see see cc cockroachdb storage jira issue crdb | 0 |
47,759 | 10,145,764,111 | IssuesEvent | 2019-08-05 06:00:03 | oppia/oppia | https://api.github.com/repos/oppia/oppia | opened | Add lint checks to ensure that there are valid spaces and newlines | code-health talk-to: @kevinlee12 | The following file previously had an invalid newline in line 213/214:
https://github.com/oppia/oppia/blob/develop/schema_utils.py.
We need to make a linter that ensures that all newlines are valid (ie. no carriage returns, etc). | 1.0 | Add lint checks to ensure that there are valid spaces and newlines - The following file previously had an invalid newline in line 213/214:
https://github.com/oppia/oppia/blob/develop/schema_utils.py.
We need to make a linter that ensures that all newlines are valid (ie. no carriage returns, etc). | non_priority | add lint checks to ensure that there are valid spaces and newlines the following file previously had an invalid newline in line we need to make a linter that ensures that all newlines are valid ie no carriage returns etc | 0 |
7,660 | 8,027,584,545 | IssuesEvent | 2018-07-27 09:35:49 | Microsoft/vsts-tasks | https://api.github.com/repos/Microsoft/vsts-tasks | closed | VSTS Task "Service Fabric Application Deployment" : Could not ping any of the provided Service Fabric gateway endpoints | Area: Release Area: ServiceFabric Status: Resolved | Hi,
Started about 3 weeks ago, all my deployment on Service Fabric failed with a message "**##[error]Could not ping any of the provided Service Fabric gateway endpoints.**" during "**Copying application to image store...**" step.
I deploy from a VSTS 2017 hosted agent on multiple Service Fabric and it's the same result for all.
I tried to increase timeout (but seems to not be applied for this step) in the task, tried the diff package option, revert cluster in older version, but nothing work.
My release definition is a bunch of deploy service fabric app, for each VSTS replace token in the parameter file then deploy. Sometimes the error happen at the first deploy, sometimes, at the last one, sometimes in the middle, i can't find what happen and why.
The only thing i found, is the version of the "**Service Fabric Application Deployment**" used.
the last deployment successfull i have, it was this header in the task :
> ==============================================================================
> 2018-06-06T08:44:32.8059777Z Task : Service Fabric Application Deployment
> 2018-06-06T08:44:32.8062058Z Description : Deploy a Service Fabric application to a cluster.
> **2018-06-06T08:44:32.8064320Z Version : ***.7.9
> 2018-06-06T08:44:32.8066149Z Author : Microsoft Corporation
> 2018-06-06T08:44:32.8068541Z Help : [More Information](https://go.microsoft.com/fwlink/?LinkId=8***0***8)
> 2018-06-06T08:44:32.8071185Z ==============================================================================
Now it's
> ==============================================================================
> 2018-06-20T19:12:29.0249577Z Task : Service Fabric Application Deployment
> 2018-06-20T19:12:29.0252588Z Description : Deploy a Service Fabric application to a cluster.
> 2018-06-20T19:12:29.0254786Z Version : '***.7.***'
> 2018-06-20T19:12:29.0257108Z Author : Microsoft Corporation
> 2018-06-20T19:12:29.0259591Z Help : [More Information](https://go.microsoft.com/fwlink/?LinkId=8***0***8)
> 2018-06-20T19:12:29.0262517Z ==============================================================================
In attachment the full log of the error with system.debug on the release.
[log-system.debug.txt](https://github.com/Microsoft/vsts-tasks/files/2123161/log-system.debug.txt)
| 1.0 | VSTS Task "Service Fabric Application Deployment" : Could not ping any of the provided Service Fabric gateway endpoints - Hi,
Started about 3 weeks ago, all my deployment on Service Fabric failed with a message "**##[error]Could not ping any of the provided Service Fabric gateway endpoints.**" during "**Copying application to image store...**" step.
I deploy from a VSTS 2017 hosted agent on multiple Service Fabric and it's the same result for all.
I tried to increase timeout (but seems to not be applied for this step) in the task, tried the diff package option, revert cluster in older version, but nothing work.
My release definition is a bunch of deploy service fabric app, for each VSTS replace token in the parameter file then deploy. Sometimes the error happen at the first deploy, sometimes, at the last one, sometimes in the middle, i can't find what happen and why.
The only thing i found, is the version of the "**Service Fabric Application Deployment**" used.
the last deployment successfull i have, it was this header in the task :
> ==============================================================================
> 2018-06-06T08:44:32.8059777Z Task : Service Fabric Application Deployment
> 2018-06-06T08:44:32.8062058Z Description : Deploy a Service Fabric application to a cluster.
> **2018-06-06T08:44:32.8064320Z Version : ***.7.9
> 2018-06-06T08:44:32.8066149Z Author : Microsoft Corporation
> 2018-06-06T08:44:32.8068541Z Help : [More Information](https://go.microsoft.com/fwlink/?LinkId=8***0***8)
> 2018-06-06T08:44:32.8071185Z ==============================================================================
Now it's
> ==============================================================================
> 2018-06-20T19:12:29.0249577Z Task : Service Fabric Application Deployment
> 2018-06-20T19:12:29.0252588Z Description : Deploy a Service Fabric application to a cluster.
> 2018-06-20T19:12:29.0254786Z Version : '***.7.***'
> 2018-06-20T19:12:29.0257108Z Author : Microsoft Corporation
> 2018-06-20T19:12:29.0259591Z Help : [More Information](https://go.microsoft.com/fwlink/?LinkId=8***0***8)
> 2018-06-20T19:12:29.0262517Z ==============================================================================
In attachment the full log of the error with system.debug on the release.
[log-system.debug.txt](https://github.com/Microsoft/vsts-tasks/files/2123161/log-system.debug.txt)
| non_priority | vsts task service fabric application deployment could not ping any of the provided service fabric gateway endpoints hi started about weeks ago all my deployment on service fabric failed with a message could not ping any of the provided service fabric gateway endpoints during copying application to image store step i deploy from a vsts hosted agent on multiple service fabric and it s the same result for all i tried to increase timeout but seems to not be applied for this step in the task tried the diff package option revert cluster in older version but nothing work my release definition is a bunch of deploy service fabric app for each vsts replace token in the parameter file then deploy sometimes the error happen at the first deploy sometimes at the last one sometimes in the middle i can t find what happen and why the only thing i found is the version of the service fabric application deployment used the last deployment successfull i have it was this header in the task task service fabric application deployment description deploy a service fabric application to a cluster version author microsoft corporation help now it s task service fabric application deployment description deploy a service fabric application to a cluster version author microsoft corporation help in attachment the full log of the error with system debug on the release | 0 |
86,157 | 15,755,371,209 | IssuesEvent | 2021-03-31 01:39:32 | renfei/FLibs | https://api.github.com/repos/renfei/FLibs | opened | CVE-2021-23337 (High) detected in multiple libraries | security vulnerability | ## CVE-2021-23337 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.17.11.tgz</b>, <b>lodash-3.10.1.tgz</b>, <b>lodash-2.4.2.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.17.11.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz</a></p>
<p>Path to dependency file: /FLibs/libs/jquery/3.1.1/fork/package.json</p>
<p>Path to vulnerable library: FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-compare-size-0.4.2.tgz (Root Library)
- :x: **lodash-4.17.11.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: /FLibs/libs/jquery/3.2.1/package.json</p>
<p>Path to vulnerable library: FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-1.0.0.tgz (Root Library)
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-2.4.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, & extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p>
<p>Path to dependency file: /FLibs/libs/jquery/3.2.0/fork/package.json</p>
<p>Path to vulnerable library: FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-npmcopy-0.1.0.tgz (Root Library)
- :x: **lodash-2.4.2.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23337>CVE-2021-23337</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c">https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash - 4.17.21</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-23337 (High) detected in multiple libraries - ## CVE-2021-23337 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.17.11.tgz</b>, <b>lodash-3.10.1.tgz</b>, <b>lodash-2.4.2.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.17.11.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz</a></p>
<p>Path to dependency file: /FLibs/libs/jquery/3.1.1/fork/package.json</p>
<p>Path to vulnerable library: FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-compare-size-0.4.2.tgz (Root Library)
- :x: **lodash-4.17.11.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-3.10.1.tgz</b></p></summary>
<p>The modern build of lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz">https://registry.npmjs.org/lodash/-/lodash-3.10.1.tgz</a></p>
<p>Path to dependency file: /FLibs/libs/jquery/3.2.1/package.json</p>
<p>Path to vulnerable library: FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/insight/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-contrib-watch-1.0.0.tgz (Root Library)
- :x: **lodash-3.10.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-2.4.2.tgz</b></p></summary>
<p>A utility library delivering consistency, customization, performance, & extras.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz">https://registry.npmjs.org/lodash/-/lodash-2.4.2.tgz</a></p>
<p>Path to dependency file: /FLibs/libs/jquery/3.2.0/fork/package.json</p>
<p>Path to vulnerable library: FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json,FLibs/libs/jquery/3.1.1/fork/node_modules/grunt-npmcopy/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- grunt-npmcopy-0.1.0.tgz (Root Library)
- :x: **lodash-2.4.2.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23337>CVE-2021-23337</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.2</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c">https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash - 4.17.21</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz lodash tgz lodash modular utilities library home page a href path to dependency file flibs libs jquery fork package json path to vulnerable library flibs libs jquery fork node modules lodash package json flibs libs jquery fork node modules lodash package json flibs libs jquery fork node modules lodash package json flibs libs jquery fork node modules lodash package json flibs libs jquery fork node modules lodash package json dependency hierarchy grunt compare size tgz root library x lodash tgz vulnerable library lodash tgz the modern build of lodash modular utilities library home page a href path to dependency file flibs libs jquery package json path to vulnerable library flibs libs jquery fork node modules insight node modules lodash package json flibs libs jquery fork node modules insight node modules lodash package json flibs libs jquery fork node modules insight node modules lodash package json flibs libs jquery fork node modules insight node modules lodash package json flibs libs jquery fork node modules insight node modules lodash package json dependency hierarchy grunt contrib watch tgz root library x lodash tgz vulnerable library lodash tgz a utility library delivering consistency customization performance extras library home page a href path to dependency file flibs libs jquery fork package json path to vulnerable library flibs libs jquery fork node modules grunt npmcopy node modules lodash package json flibs libs jquery fork node modules grunt npmcopy node modules lodash package json flibs libs jquery fork node modules grunt npmcopy node modules lodash package json flibs libs jquery fork node modules grunt npmcopy node modules lodash package json flibs libs jquery fork node modules grunt npmcopy node modules lodash package json dependency hierarchy grunt npmcopy tgz root library x lodash tgz vulnerable library vulnerability details lodash versions prior to are vulnerable to command injection via the template function publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash step up your open source security game with whitesource | 0 |
261,491 | 27,809,783,232 | IssuesEvent | 2023-03-18 01:43:13 | madhans23/linux-4.1.15 | https://api.github.com/repos/madhans23/linux-4.1.15 | closed | CVE-2019-12456 (High) detected in linux-stable-rtv4.1.33 - autoclosed | Mend: dependency security vulnerability | ## CVE-2019-12456 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/madhans23/linux-4.1.15/commit/f9d19044b0eef1965f9bc412d7d9e579b74ec968">f9d19044b0eef1965f9bc412d7d9e579b74ec968</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/mpt3sas/mpt3sas_ctl.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/mpt3sas/mpt3sas_ctl.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** DISPUTED ** An issue was discovered in the MPT3COMMAND case in _ctl_ioctl_main in drivers/scsi/mpt3sas/mpt3sas_ctl.c in the Linux kernel through 5.1.5. It allows local users to cause a denial of service or possibly have unspecified other impact by changing the value of ioc_number between two kernel reads of that value, aka a "double fetch" vulnerability. NOTE: a third party reports that this is unexploitable because the doubly fetched value is not used.
<p>Publish Date: 2019-05-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-12456>CVE-2019-12456</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-12456 (High) detected in linux-stable-rtv4.1.33 - autoclosed - ## CVE-2019-12456 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/madhans23/linux-4.1.15/commit/f9d19044b0eef1965f9bc412d7d9e579b74ec968">f9d19044b0eef1965f9bc412d7d9e579b74ec968</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/mpt3sas/mpt3sas_ctl.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/scsi/mpt3sas/mpt3sas_ctl.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
** DISPUTED ** An issue was discovered in the MPT3COMMAND case in _ctl_ioctl_main in drivers/scsi/mpt3sas/mpt3sas_ctl.c in the Linux kernel through 5.1.5. It allows local users to cause a denial of service or possibly have unspecified other impact by changing the value of ioc_number between two kernel reads of that value, aka a "double fetch" vulnerability. NOTE: a third party reports that this is unexploitable because the doubly fetched value is not used.
<p>Publish Date: 2019-05-30
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-12456>CVE-2019-12456</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in linux stable autoclosed cve high severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files drivers scsi ctl c drivers scsi ctl c vulnerability details disputed an issue was discovered in the case in ctl ioctl main in drivers scsi ctl c in the linux kernel through it allows local users to cause a denial of service or possibly have unspecified other impact by changing the value of ioc number between two kernel reads of that value aka a double fetch vulnerability note a third party reports that this is unexploitable because the doubly fetched value is not used publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with mend | 0 |
233,515 | 19,006,884,403 | IssuesEvent | 2021-11-23 01:54:35 | pingcap/tidb | https://api.github.com/repos/pingcap/tidb | closed | CI unstable test mysql `auto_increment` failed | type/bug component/test sig/sql-infra severity/major | ## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
in ci https://ci.pingcap.net/blue/organizations/jenkins/tidb_ghpr_integration_common_test/detail/tidb_ghpr_integration_common_test/7787/pipeline/144/
may because the `CREATE TABLE t1 (c1 DOUBLE NOT NULL AUTO_INCREMENT, c2 INT, PRIMARY KEY (c1))\nENGINE=InnoDB;` user test database, but above use auto_increment database, conflict with other cases.
<!-- a step by step guide for reproducing the bug. -->
### 2. What did you expect to see? (Required)
### 3. What did you see instead (Required)
### 4. What is your TiDB version? (Required)
<!-- Paste the output of SELECT tidb_version() -->
| 1.0 | CI unstable test mysql `auto_increment` failed - ## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
in ci https://ci.pingcap.net/blue/organizations/jenkins/tidb_ghpr_integration_common_test/detail/tidb_ghpr_integration_common_test/7787/pipeline/144/
may because the `CREATE TABLE t1 (c1 DOUBLE NOT NULL AUTO_INCREMENT, c2 INT, PRIMARY KEY (c1))\nENGINE=InnoDB;` user test database, but above use auto_increment database, conflict with other cases.
<!-- a step by step guide for reproducing the bug. -->
### 2. What did you expect to see? (Required)
### 3. What did you see instead (Required)
### 4. What is your TiDB version? (Required)
<!-- Paste the output of SELECT tidb_version() -->
| non_priority | ci unstable test mysql auto increment failed bug report please answer these questions before submitting your issue thanks minimal reproduce step required in ci may because the create table double not null auto increment int primary key nengine innodb user test database but above use auto increment database conflict with other cases what did you expect to see required what did you see instead required what is your tidb version required | 0 |
19,002 | 4,328,884,677 | IssuesEvent | 2016-07-26 15:15:34 | quintel/etmoses | https://api.github.com/repos/quintel/etmoses | closed | Explanation of units | Documentation Enhancement | Many people have problems with getting the units rights in the topology when using Units and children (for capacities etc.). Some text is added for this, but an calculation example would be better to explain this. | 1.0 | Explanation of units - Many people have problems with getting the units rights in the topology when using Units and children (for capacities etc.). Some text is added for this, but an calculation example would be better to explain this. | non_priority | explanation of units many people have problems with getting the units rights in the topology when using units and children for capacities etc some text is added for this but an calculation example would be better to explain this | 0 |
1,889 | 21,480,463,020 | IssuesEvent | 2022-04-26 17:12:54 | cds-snc/platform-forms-client | https://api.github.com/repos/cds-snc/platform-forms-client | closed | Review github branching strategy to provide quick and easy rollbacks | :vertical_traffic_light: Severe development reliability | **Context**
Action item coming from the incident reports:
- https://docs.google.com/document/d/1Cd0Y2PAAjNhVkxvTetQTEHUto_3WMk2P2nLZYkheI7s/edit#
- https://docs.google.com/document/d/1z-cmpUkDonTcmDO9uFd1apacNH6IbdekKID8pMnaXf0/edit#
**Scope**
- [ ] Review the current infrastructure requirements to identify an easy rollback approach.
- [ ] Review production release process to mitigate issues during release
- [ ] ... | True | Review github branching strategy to provide quick and easy rollbacks - **Context**
Action item coming from the incident reports:
- https://docs.google.com/document/d/1Cd0Y2PAAjNhVkxvTetQTEHUto_3WMk2P2nLZYkheI7s/edit#
- https://docs.google.com/document/d/1z-cmpUkDonTcmDO9uFd1apacNH6IbdekKID8pMnaXf0/edit#
**Scope**
- [ ] Review the current infrastructure requirements to identify an easy rollback approach.
- [ ] Review production release process to mitigate issues during release
- [ ] ... | non_priority | review github branching strategy to provide quick and easy rollbacks context action item coming from the incident reports scope review the current infrastructure requirements to identify an easy rollback approach review production release process to mitigate issues during release | 0 |
157,151 | 12,360,437,877 | IssuesEvent | 2020-05-17 15:16:34 | urapadmin/kiosk | https://api.github.com/repos/urapadmin/kiosk | closed | pagination problems in file repository | bug file-repository test-stage | The file repository does not generating pages of images properly.
In testing if the file repository for KHPP produced proper results, you and I had different experiences with the return of locus relation images and I think this is why.
I searched D09.1. It said 249 images and I fetched them. I carefully scrolled through the images and did not see a single locus relationship image. I searched D09.1 with locus relations defined as the record type and saw that I should have seen 7 such images. You could not reproduce this problem. The issue is that at the TOP of the screen it says there are three pages of photos:

But at the BOTTOM (this is just scrolling down, no reloading or anything) it says there are only two pages of photos:

So I had scrolled through two pages of photos and not seen a single locus relationship photo, and when I was at the bottom it seemed to me that there was no additional page of images to look at. But if I go up and choose the third page, which I can only do from the top of the screen, lo and behold I find all the locus relationship photos.
That's definitely the problem in #617, too - it shows two pages of images at the top, but at the bottom only one and so I cannot click to the second page from the bottom and assumed there was not one. | 1.0 | pagination problems in file repository - The file repository does not generating pages of images properly.
In testing if the file repository for KHPP produced proper results, you and I had different experiences with the return of locus relation images and I think this is why.
I searched D09.1. It said 249 images and I fetched them. I carefully scrolled through the images and did not see a single locus relationship image. I searched D09.1 with locus relations defined as the record type and saw that I should have seen 7 such images. You could not reproduce this problem. The issue is that at the TOP of the screen it says there are three pages of photos:

But at the BOTTOM (this is just scrolling down, no reloading or anything) it says there are only two pages of photos:

So I had scrolled through two pages of photos and not seen a single locus relationship photo, and when I was at the bottom it seemed to me that there was no additional page of images to look at. But if I go up and choose the third page, which I can only do from the top of the screen, lo and behold I find all the locus relationship photos.
That's definitely the problem in #617, too - it shows two pages of images at the top, but at the bottom only one and so I cannot click to the second page from the bottom and assumed there was not one. | non_priority | pagination problems in file repository the file repository does not generating pages of images properly in testing if the file repository for khpp produced proper results you and i had different experiences with the return of locus relation images and i think this is why i searched it said images and i fetched them i carefully scrolled through the images and did not see a single locus relationship image i searched with locus relations defined as the record type and saw that i should have seen such images you could not reproduce this problem the issue is that at the top of the screen it says there are three pages of photos but at the bottom this is just scrolling down no reloading or anything it says there are only two pages of photos so i had scrolled through two pages of photos and not seen a single locus relationship photo and when i was at the bottom it seemed to me that there was no additional page of images to look at but if i go up and choose the third page which i can only do from the top of the screen lo and behold i find all the locus relationship photos that s definitely the problem in too it shows two pages of images at the top but at the bottom only one and so i cannot click to the second page from the bottom and assumed there was not one | 0 |
265,648 | 20,106,070,683 | IssuesEvent | 2022-02-07 10:36:36 | alphagov/govuk-prototype-kit | https://api.github.com/repos/alphagov/govuk-prototype-kit | closed | Create release notes for GOV.UK Prototype Kit v12.0.1 | documentation 🕔 hours release | ## What
Create release notes for GOV.UK Prototype v12.0.1 (fix release)
## Why
To tell users about the GOV.UK Frontend v4.0.1 fixes and potential impact on prototypes
## Who needs to work on this
Technical writer aka content designer, Developer
## Who needs to review this
Developer
## When
By end Monday 7 Feb
## Done when
- [x] Tech writer / content designer drafts release notes
- [x] Developer reviews release notes
- [x] PR raised to update release notes in Changelog
- [x] Release notes pass 2i
| 1.0 | Create release notes for GOV.UK Prototype Kit v12.0.1 - ## What
Create release notes for GOV.UK Prototype v12.0.1 (fix release)
## Why
To tell users about the GOV.UK Frontend v4.0.1 fixes and potential impact on prototypes
## Who needs to work on this
Technical writer aka content designer, Developer
## Who needs to review this
Developer
## When
By end Monday 7 Feb
## Done when
- [x] Tech writer / content designer drafts release notes
- [x] Developer reviews release notes
- [x] PR raised to update release notes in Changelog
- [x] Release notes pass 2i
| non_priority | create release notes for gov uk prototype kit what create release notes for gov uk prototype fix release why to tell users about the gov uk frontend fixes and potential impact on prototypes who needs to work on this technical writer aka content designer developer who needs to review this developer when by end monday feb done when tech writer content designer drafts release notes developer reviews release notes pr raised to update release notes in changelog release notes pass | 0 |
70,239 | 8,514,616,442 | IssuesEvent | 2018-10-31 19:04:48 | decred/dcrdesign | https://api.github.com/repos/decred/dcrdesign | closed | Decrediton: Governance view refinements | Decrediton interaction design visual design | Refine the governance view as much where possible based on the initial concept design and prepare for implementation. | 2.0 | Decrediton: Governance view refinements - Refine the governance view as much where possible based on the initial concept design and prepare for implementation. | non_priority | decrediton governance view refinements refine the governance view as much where possible based on the initial concept design and prepare for implementation | 0 |
249,843 | 21,194,783,467 | IssuesEvent | 2022-04-08 22:19:02 | IntellectualSites/FastAsyncWorldEdit | https://api.github.com/repos/IntellectualSites/FastAsyncWorldEdit | opened | Could not pass event PlayerJoinEvent & PlayerTeleportEvent | Requires Testing | ### Server Implementation
Paper
### Server Version
1.18.2
### Describe the bug
After setting up `dynamic-chunk-rendering`, an error is dump when a player join.
The feature however seems to work.
### To Reproduce
- Setting `dynamic-chunk-rendering` to 32 while `view-distance` in `server.properties` is set to 3
- Join the server
### Expected behaviour
Since it's an experimental feature, I wasn't expecting this to run without any issue, this bug report is only there to share these errors
### Screenshots / Videos
_No response_
### Error log (if applicable)
https://paste.gg/p/anonymous/332ab17295a0495ca6de83d161ef95d0
### Fawe Debugpaste
https://athion.net/ISPaster/paste/view/e6d3196ed6614752b47bd51c8c285f85
### Fawe Version
FastAsyncWorldEdit version 2.1.2-SNAPSHOT-151;2483eac
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists.
### Anything else?
_No response_ | 1.0 | Could not pass event PlayerJoinEvent & PlayerTeleportEvent - ### Server Implementation
Paper
### Server Version
1.18.2
### Describe the bug
After setting up `dynamic-chunk-rendering`, an error is dump when a player join.
The feature however seems to work.
### To Reproduce
- Setting `dynamic-chunk-rendering` to 32 while `view-distance` in `server.properties` is set to 3
- Join the server
### Expected behaviour
Since it's an experimental feature, I wasn't expecting this to run without any issue, this bug report is only there to share these errors
### Screenshots / Videos
_No response_
### Error log (if applicable)
https://paste.gg/p/anonymous/332ab17295a0495ca6de83d161ef95d0
### Fawe Debugpaste
https://athion.net/ISPaster/paste/view/e6d3196ed6614752b47bd51c8c285f85
### Fawe Version
FastAsyncWorldEdit version 2.1.2-SNAPSHOT-151;2483eac
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit/ and the issue still persists.
### Anything else?
_No response_ | non_priority | could not pass event playerjoinevent playerteleportevent server implementation paper server version describe the bug after setting up dynamic chunk rendering an error is dump when a player join the feature however seems to work to reproduce setting dynamic chunk rendering to while view distance in server properties is set to join the server expected behaviour since it s an experimental feature i wasn t expecting this to run without any issue this bug report is only there to share these errors screenshots videos no response error log if applicable fawe debugpaste fawe version fastasyncworldedit version snapshot checklist i have included a fawe debugpaste i am using the newest build from and the issue still persists anything else no response | 0 |
166,412 | 20,718,492,964 | IssuesEvent | 2022-03-13 01:55:30 | n-devs/reactJSTest | https://api.github.com/repos/n-devs/reactJSTest | opened | CVE-2021-32803 (High) detected in tar-2.2.1.tgz | security vulnerability | ## CVE-2021-32803 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.1.tgz">https://registry.npmjs.org/tar/-/tar-2.2.1.tgz</a></p>
<p>Path to dependency file: /reactJSTest/package.json</p>
<p>Path to vulnerable library: /node_modules/grpc/node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-1.0.10.tgz (Root Library)
- fsevents-1.1.2.tgz
- node-pre-gyp-0.6.36.tgz
- :x: **tar-2.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution (tar): 3.2.3</p>
<p>Direct dependency fix Resolution (react-scripts): 1.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-32803 (High) detected in tar-2.2.1.tgz - ## CVE-2021-32803 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-2.2.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-2.2.1.tgz">https://registry.npmjs.org/tar/-/tar-2.2.1.tgz</a></p>
<p>Path to dependency file: /reactJSTest/package.json</p>
<p>Path to vulnerable library: /node_modules/grpc/node_modules/tar/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-1.0.10.tgz (Root Library)
- fsevents-1.1.2.tgz
- node-pre-gyp-0.6.36.tgz
- :x: **tar-2.2.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 6.1.2, 5.0.7, 4.4.15, and 3.2.3 has an arbitrary File Creation/Overwrite vulnerability via insufficient symlink protection. `node-tar` aims to guarantee that any file whose location would be modified by a symbolic link is not extracted. This is, in part, achieved by ensuring that extracted directories are not symlinks. Additionally, in order to prevent unnecessary `stat` calls to determine whether a given path is a directory, paths are cached when directories are created. This logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory. This order of operations resulted in the directory being created and added to the `node-tar` directory cache. When a directory is present in the directory cache, subsequent calls to mkdir for that directory are skipped. However, this is also where `node-tar` checks for symlinks occur. By first creating a directory, and then replacing that directory with a symlink, it was thus possible to bypass `node-tar` symlink checks on directories, essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.3, 4.4.15, 5.0.7 and 6.1.2.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32803>CVE-2021-32803</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw">https://github.com/npm/node-tar/security/advisories/GHSA-r628-mhmh-qjhw</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution (tar): 3.2.3</p>
<p>Direct dependency fix Resolution (react-scripts): 1.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href path to dependency file reactjstest package json path to vulnerable library node modules grpc node modules tar package json dependency hierarchy react scripts tgz root library fsevents tgz node pre gyp tgz x tar tgz vulnerable library vulnerability details the npm package tar aka node tar before versions and has an arbitrary file creation overwrite vulnerability via insufficient symlink protection node tar aims to guarantee that any file whose location would be modified by a symbolic link is not extracted this is in part achieved by ensuring that extracted directories are not symlinks additionally in order to prevent unnecessary stat calls to determine whether a given path is a directory paths are cached when directories are created this logic was insufficient when extracting tar files that contained both a directory and a symlink with the same name as the directory this order of operations resulted in the directory being created and added to the node tar directory cache when a directory is present in the directory cache subsequent calls to mkdir for that directory are skipped however this is also where node tar checks for symlinks occur by first creating a directory and then replacing that directory with a symlink it was thus possible to bypass node tar symlink checks on directories essentially allowing an untrusted tar file to symlink into an arbitrary location and subsequently extracting arbitrary files into that location thus allowing arbitrary file creation and overwrite this issue was addressed in releases and publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar direct dependency fix resolution react scripts step up your open source security game with whitesource | 0 |
13,923 | 10,548,271,331 | IssuesEvent | 2019-10-03 05:02:11 | dotnet/roslyn | https://api.github.com/repos/dotnet/roslyn | opened | Update System.Collections.Immutable to 1.2.4.0 in VS | Area-Infrastructure | #38809 updated System.Collections.Immutable to take advantage of new API but had to be reverted due to issues integrating into VS.
At this point, we get type load exceptions in test runs. It's possible that some more projects in VS need to have binding redirects added. Once we get that figured out by making a test insertion in a dev branch we can recreate #38809 and pull into master again.
/cc @agocke @tmat @ryzngard | 1.0 | Update System.Collections.Immutable to 1.2.4.0 in VS - #38809 updated System.Collections.Immutable to take advantage of new API but had to be reverted due to issues integrating into VS.
At this point, we get type load exceptions in test runs. It's possible that some more projects in VS need to have binding redirects added. Once we get that figured out by making a test insertion in a dev branch we can recreate #38809 and pull into master again.
/cc @agocke @tmat @ryzngard | non_priority | update system collections immutable to in vs updated system collections immutable to take advantage of new api but had to be reverted due to issues integrating into vs at this point we get type load exceptions in test runs it s possible that some more projects in vs need to have binding redirects added once we get that figured out by making a test insertion in a dev branch we can recreate and pull into master again cc agocke tmat ryzngard | 0 |
91,169 | 11,473,063,417 | IssuesEvent | 2020-02-09 20:55:38 | codeforpdx/dwellingly-app | https://api.github.com/repos/codeforpdx/dwellingly-app | closed | Branding: Update Logo | design good first issue on hold | There is no logo in the product. We need to add the Dwellingly logo where appropriate.
| 1.0 | Branding: Update Logo - There is no logo in the product. We need to add the Dwellingly logo where appropriate.
| non_priority | branding update logo there is no logo in the product we need to add the dwellingly logo where appropriate | 0 |
30,614 | 8,568,037,511 | IssuesEvent | 2018-11-10 17:40:14 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | compile from source fails | stat:awaiting response type:build/install | **System information**
- Ubuntu 18.04
- From source / r1.12
- TensorFlow version:
- Python 3.6.6
- Inside virtualenv
- Bazel 0.18.0
- gcc 6.4.0
- Cuda 9 / Cudnn 7
**The problem**
Building from source, I end up with:
```
ERROR: /home/dev/tensorflow/tensorflow/core/BUILD:319:1: undeclared inclusion(s) in rule '//tensorflow/core:platform_base':
this rule is missing dependency declarations for the following files included by 'tensorflow/core/platform/env_time.cc':
'/usr/lib/gcc/x86_64-linux-gnu/6/include/stdint.h'
'/usr/lib/gcc/x86_64-linux-gnu/6/include/stddef.h'
'/usr/lib/gcc/x86_64-linux-gnu/6/include/stdarg.h'
Target //tensorflow/tools/pip_package:build_pip_package failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 0.460s, Critical Path: 0.26s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
```
This is when I run:
`bazel build --config=opt --config=cuda //tensorflow/tools/pip_package:build_pip_package`
Also note that if I repeat the command several times, I don't always get the same error message. After I do `bazel clean` I get:
```
ERROR: /home/.cache/bazel/_bazel_nnnnn/75c0d842b4eca8fbdb48dc37e31275de/external/protobuf_archive/BUILD:659:1: C++ compilation of rule '@protobuf_archive//:python/google/protobuf/pyext/_message.so' failed (Exit 1)
In file included from external/protobuf_archive/python/google/protobuf/pyext/map_container.cc:33:0:
external/protobuf_archive/python/google/protobuf/pyext/map_container.h:34:20: fatal error: Python.h: No such file or directory
#include <Python.h>
^
compilation terminated.
```
| 1.0 | compile from source fails - **System information**
- Ubuntu 18.04
- From source / r1.12
- TensorFlow version:
- Python 3.6.6
- Inside virtualenv
- Bazel 0.18.0
- gcc 6.4.0
- Cuda 9 / Cudnn 7
**The problem**
Building from source, I end up with:
```
ERROR: /home/dev/tensorflow/tensorflow/core/BUILD:319:1: undeclared inclusion(s) in rule '//tensorflow/core:platform_base':
this rule is missing dependency declarations for the following files included by 'tensorflow/core/platform/env_time.cc':
'/usr/lib/gcc/x86_64-linux-gnu/6/include/stdint.h'
'/usr/lib/gcc/x86_64-linux-gnu/6/include/stddef.h'
'/usr/lib/gcc/x86_64-linux-gnu/6/include/stdarg.h'
Target //tensorflow/tools/pip_package:build_pip_package failed to build
Use --verbose_failures to see the command lines of failed build steps.
INFO: Elapsed time: 0.460s, Critical Path: 0.26s
INFO: 0 processes.
FAILED: Build did NOT complete successfully
```
This is when I run:
`bazel build --config=opt --config=cuda //tensorflow/tools/pip_package:build_pip_package`
Also note that if I repeat the command several times, I don't always get the same error message. After I do `bazel clean` I get:
```
ERROR: /home/.cache/bazel/_bazel_nnnnn/75c0d842b4eca8fbdb48dc37e31275de/external/protobuf_archive/BUILD:659:1: C++ compilation of rule '@protobuf_archive//:python/google/protobuf/pyext/_message.so' failed (Exit 1)
In file included from external/protobuf_archive/python/google/protobuf/pyext/map_container.cc:33:0:
external/protobuf_archive/python/google/protobuf/pyext/map_container.h:34:20: fatal error: Python.h: No such file or directory
#include <Python.h>
^
compilation terminated.
```
| non_priority | compile from source fails system information ubuntu from source tensorflow version python inside virtualenv bazel gcc cuda cudnn the problem building from source i end up with error home dev tensorflow tensorflow core build undeclared inclusion s in rule tensorflow core platform base this rule is missing dependency declarations for the following files included by tensorflow core platform env time cc usr lib gcc linux gnu include stdint h usr lib gcc linux gnu include stddef h usr lib gcc linux gnu include stdarg h target tensorflow tools pip package build pip package failed to build use verbose failures to see the command lines of failed build steps info elapsed time critical path info processes failed build did not complete successfully this is when i run bazel build config opt config cuda tensorflow tools pip package build pip package also note that if i repeat the command several times i don t always get the same error message after i do bazel clean i get error home cache bazel bazel nnnnn external protobuf archive build c compilation of rule protobuf archive python google protobuf pyext message so failed exit in file included from external protobuf archive python google protobuf pyext map container cc external protobuf archive python google protobuf pyext map container h fatal error python h no such file or directory include compilation terminated | 0 |
275,421 | 23,914,274,718 | IssuesEvent | 2022-09-09 11:09:50 | Kuadrant/testsuite | https://api.github.com/repos/Kuadrant/testsuite | opened | Add test for authentication source priority | test-case Kuadrant | You can assign priority to each authentication source. | 1.0 | Add test for authentication source priority - You can assign priority to each authentication source. | non_priority | add test for authentication source priority you can assign priority to each authentication source | 0 |
443,924 | 30,987,673,824 | IssuesEvent | 2023-08-09 00:01:53 | aws/aws-sdk-js-v3 | https://api.github.com/repos/aws/aws-sdk-js-v3 | closed | BUG: there is no way to search for output object (such as GetObjectCommandOutput) | documentation bug closed-for-staleness p2 | ### Describe the issue
The new documentation is a major step backwards. To find the output for a command I have to:
1. Search for the package that command is in (S3 in this example)
2. Search for the command itself (GetObjectCommand)
3. Click on the result as the preview does not show the output
4. Scroll down to see the output that is listed at the bottom
5. Click on the reference to the command output object
6. Now click in to the linked object
None of which is very discoverable nor usable. I should be able to search for `GetObjectCommandOutput` from the front page and it _tell ME_ which package it is in and which command it comes from.
and _still_ it does not show a reference to `__WithSdkStreamMixin` and the `transformToByteArray` method.
### Links
https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/client/s3/ | 1.0 | BUG: there is no way to search for output object (such as GetObjectCommandOutput) - ### Describe the issue
The new documentation is a major step backwards. To find the output for a command I have to:
1. Search for the package that command is in (S3 in this example)
2. Search for the command itself (GetObjectCommand)
3. Click on the result as the preview does not show the output
4. Scroll down to see the output that is listed at the bottom
5. Click on the reference to the command output object
6. Now click in to the linked object
None of which is very discoverable nor usable. I should be able to search for `GetObjectCommandOutput` from the front page and it _tell ME_ which package it is in and which command it comes from.
and _still_ it does not show a reference to `__WithSdkStreamMixin` and the `transformToByteArray` method.
### Links
https://docs.aws.amazon.com/AWSJavaScriptSDK/v3/latest/client/s3/ | non_priority | bug there is no way to search for output object such as getobjectcommandoutput describe the issue the new documentation is a major step backwards to find the output for a command i have to search for the package that command is in in this example search for the command itself getobjectcommand click on the result as the preview does not show the output scroll down to see the output that is listed at the bottom click on the reference to the command output object now click in to the linked object none of which is very discoverable nor usable i should be able to search for getobjectcommandoutput from the front page and it tell me which package it is in and which command it comes from and still it does not show a reference to withsdkstreammixin and the transformtobytearray method links | 0 |
16,214 | 4,028,674,099 | IssuesEvent | 2016-05-18 07:36:34 | deeplearning4j/Canova | https://api.github.com/repos/deeplearning4j/Canova | closed | any good scala step by step tutorials on images with Canova? | documentation | hey,
are you planning to make a good explanatory tutorial vectorizing images with canova? I dont understand the current tutorial that well and hope there was something in addition too. | 1.0 | any good scala step by step tutorials on images with Canova? - hey,
are you planning to make a good explanatory tutorial vectorizing images with canova? I dont understand the current tutorial that well and hope there was something in addition too. | non_priority | any good scala step by step tutorials on images with canova hey are you planning to make a good explanatory tutorial vectorizing images with canova i dont understand the current tutorial that well and hope there was something in addition too | 0 |
6,383 | 2,841,825,636 | IssuesEvent | 2015-05-28 04:03:32 | ajency/short-film-window | https://api.github.com/repos/ajency/short-film-window | closed | Header - Social icons are loaded with a delay of few seconds compared to the rest of the Site | Bug Functionality Tested | 
| 1.0 | Header - Social icons are loaded with a delay of few seconds compared to the rest of the Site - 
| non_priority | header social icons are loaded with a delay of few seconds compared to the rest of the site | 0 |
107,360 | 23,397,450,480 | IssuesEvent | 2022-08-12 02:23:34 | iree-org/iree | https://api.github.com/repos/iree-org/iree | closed | Add EfficientNet-Lite0-quant and MobileBert-quant to RISC-V benchmarking infrastructure | codegen codegen/llvm infrastructure/benchmark codegen/riscv | ### Request description
We are currently tracking the performance of the following benchmarks locally for RISC-V:
- MobileNetV1-float
- MobileBert-float
- DeepLabV3-float
- EfficientNet-Lite0-quant
- MobileBert-quant
- PersonDetect-quant
We should add EfficientNet-Lite0-quant and MobileBert-quant to our RISC-V benchmarking infrastructure.
### What component(s) does this issue relate to?
Compiler
### Additional context
_No response_ | 3.0 | Add EfficientNet-Lite0-quant and MobileBert-quant to RISC-V benchmarking infrastructure - ### Request description
We are currently tracking the performance of the following benchmarks locally for RISC-V:
- MobileNetV1-float
- MobileBert-float
- DeepLabV3-float
- EfficientNet-Lite0-quant
- MobileBert-quant
- PersonDetect-quant
We should add EfficientNet-Lite0-quant and MobileBert-quant to our RISC-V benchmarking infrastructure.
### What component(s) does this issue relate to?
Compiler
### Additional context
_No response_ | non_priority | add efficientnet quant and mobilebert quant to risc v benchmarking infrastructure request description we are currently tracking the performance of the following benchmarks locally for risc v float mobilebert float float efficientnet quant mobilebert quant persondetect quant we should add efficientnet quant and mobilebert quant to our risc v benchmarking infrastructure what component s does this issue relate to compiler additional context no response | 0 |
412,109 | 27,846,714,173 | IssuesEvent | 2023-03-20 15:55:51 | VoHongKhang/Nhom10_CCPTPM | https://api.github.com/repos/VoHongKhang/Nhom10_CCPTPM | closed | Tìm hiểu đề tài và tìm hiểu giao diện trang web https://www.talkdesk.com/cloud-contact-center/customer-experience-analytics/ | documentation | - [x] Ngày 1: Tìm hiểu đề tài
- [x] Ngày 2: Phân tích đề tài
- [x] Ngày 3: Tìm hiểu trang web
- [x] Ngày 4: Phân tích giao diện và cách hoạt động của trang web
- [x] Ngày 5: Phân tích các thành phần của header
- [x] Ngày 6: Phân tích các thành phần của footer
- [x] Ngày 7: Phân tích content chứa nội dung của từng phần
- [x] Ngày 7: Thiết giao diện cơ bản bằng figma | 1.0 | Tìm hiểu đề tài và tìm hiểu giao diện trang web https://www.talkdesk.com/cloud-contact-center/customer-experience-analytics/ - - [x] Ngày 1: Tìm hiểu đề tài
- [x] Ngày 2: Phân tích đề tài
- [x] Ngày 3: Tìm hiểu trang web
- [x] Ngày 4: Phân tích giao diện và cách hoạt động của trang web
- [x] Ngày 5: Phân tích các thành phần của header
- [x] Ngày 6: Phân tích các thành phần của footer
- [x] Ngày 7: Phân tích content chứa nội dung của từng phần
- [x] Ngày 7: Thiết giao diện cơ bản bằng figma | non_priority | tìm hiểu đề tài và tìm hiểu giao diện trang web ngày tìm hiểu đề tài ngày phân tích đề tài ngày tìm hiểu trang web ngày phân tích giao diện và cách hoạt động của trang web ngày phân tích các thành phần của header ngày phân tích các thành phần của footer ngày phân tích content chứa nội dung của từng phần ngày thiết giao diện cơ bản bằng figma | 0 |
22,567 | 3,963,301,120 | IssuesEvent | 2016-05-02 19:57:00 | log2timeline/plaso | https://api.github.com/repos/log2timeline/plaso | opened | End-to-end tests fail on travis | bug testing | Due to changes in the test files the output order of the end-to-end tests run on travis are not consistent:
* [ ] update end-to-end test reference files: https://codereview.appspot.com/297140043/
* [ ] fix sorting order of events to be consistent | 1.0 | End-to-end tests fail on travis - Due to changes in the test files the output order of the end-to-end tests run on travis are not consistent:
* [ ] update end-to-end test reference files: https://codereview.appspot.com/297140043/
* [ ] fix sorting order of events to be consistent | non_priority | end to end tests fail on travis due to changes in the test files the output order of the end to end tests run on travis are not consistent update end to end test reference files fix sorting order of events to be consistent | 0 |
74,416 | 15,349,975,643 | IssuesEvent | 2021-03-01 01:00:27 | jtimberlake/stripe-payments-demo | https://api.github.com/repos/jtimberlake/stripe-payments-demo | opened | CVE-2020-28500 (Medium) detected in lodash-4.17.15.tgz | security vulnerability | ## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.15.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p>
<p>Path to dependency file: stripe-payments-demo/package.json</p>
<p>Path to vulnerable library: stripe-payments-demo/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- ngrok-3.2.7.tgz (Root Library)
- request-promise-native-1.0.8.tgz
- request-promise-core-1.1.3.tgz
- :x: **lodash-4.17.15.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
All versions of package lodash; all versions of package org.fujion.webjars:lodash are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions. Steps to reproduce (provided by reporter Liyuan Chen): var lo = require('lodash'); function build_blank (n) { var ret = "1" for (var i = 0; i < n; i++) { ret += " " } return ret + "1"; } var s = build_blank(50000) var time0 = Date.now(); lo.trim(s) var time_cost0 = Date.now() - time0; console.log("time_cost0: " + time_cost0) var time1 = Date.now(); lo.toNumber(s) var time_cost1 = Date.now() - time1; console.log("time_cost1: " + time_cost1) var time2 = Date.now(); lo.trimEnd(s) var time_cost2 = Date.now() - time2; console.log("time_cost2: " + time_cost2)
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7">https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash - 4.17.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.15","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"ngrok:3.2.7;request-promise-native:1.0.8;request-promise-core:1.1.3;lodash:4.17.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28500","vulnerabilityDetails":"All versions of package lodash; all versions of package org.fujion.webjars:lodash are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions. Steps to reproduce (provided by reporter Liyuan Chen): var lo \u003d require(\u0027lodash\u0027); function build_blank (n) { var ret \u003d \"1\" for (var i \u003d 0; i \u003c n; i++) { ret +\u003d \" \" } return ret + \"1\"; } var s \u003d build_blank(50000) var time0 \u003d Date.now(); lo.trim(s) var time_cost0 \u003d Date.now() - time0; console.log(\"time_cost0: \" + time_cost0) var time1 \u003d Date.now(); lo.toNumber(s) var time_cost1 \u003d Date.now() - time1; console.log(\"time_cost1: \" + time_cost1) var time2 \u003d Date.now(); lo.trimEnd(s) var time_cost2 \u003d Date.now() - time2; console.log(\"time_cost2: \" + time_cost2)","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-28500 (Medium) detected in lodash-4.17.15.tgz - ## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.15.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p>
<p>Path to dependency file: stripe-payments-demo/package.json</p>
<p>Path to vulnerable library: stripe-payments-demo/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- ngrok-3.2.7.tgz (Root Library)
- request-promise-native-1.0.8.tgz
- request-promise-core-1.1.3.tgz
- :x: **lodash-4.17.15.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
All versions of package lodash; all versions of package org.fujion.webjars:lodash are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions. Steps to reproduce (provided by reporter Liyuan Chen): var lo = require('lodash'); function build_blank (n) { var ret = "1" for (var i = 0; i < n; i++) { ret += " " } return ret + "1"; } var s = build_blank(50000) var time0 = Date.now(); lo.trim(s) var time_cost0 = Date.now() - time0; console.log("time_cost0: " + time_cost0) var time1 = Date.now(); lo.toNumber(s) var time_cost1 = Date.now() - time1; console.log("time_cost1: " + time_cost1) var time2 = Date.now(); lo.trimEnd(s) var time_cost2 = Date.now() - time2; console.log("time_cost2: " + time_cost2)
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7">https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash - 4.17.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.15","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"ngrok:3.2.7;request-promise-native:1.0.8;request-promise-core:1.1.3;lodash:4.17.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-28500","vulnerabilityDetails":"All versions of package lodash; all versions of package org.fujion.webjars:lodash are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions. Steps to reproduce (provided by reporter Liyuan Chen): var lo \u003d require(\u0027lodash\u0027); function build_blank (n) { var ret \u003d \"1\" for (var i \u003d 0; i \u003c n; i++) { ret +\u003d \" \" } return ret + \"1\"; } var s \u003d build_blank(50000) var time0 \u003d Date.now(); lo.trim(s) var time_cost0 \u003d Date.now() - time0; console.log(\"time_cost0: \" + time_cost0) var time1 \u003d Date.now(); lo.toNumber(s) var time_cost1 \u003d Date.now() - time1; console.log(\"time_cost1: \" + time_cost1) var time2 \u003d Date.now(); lo.trimEnd(s) var time_cost2 \u003d Date.now() - time2; console.log(\"time_cost2: \" + time_cost2)","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve medium detected in lodash tgz cve medium severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file stripe payments demo package json path to vulnerable library stripe payments demo node modules lodash package json dependency hierarchy ngrok tgz root library request promise native tgz request promise core tgz x lodash tgz vulnerable library vulnerability details all versions of package lodash all versions of package org fujion webjars lodash are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions steps to reproduce provided by reporter liyuan chen var lo require lodash function build blank n var ret for var i i n i ret return ret var s build blank var date now lo trim s var time date now console log time time var date now lo tonumber s var time date now console log time time var date now lo trimend s var time date now console log time time publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree ngrok request promise native request promise core lodash isminimumfixversionavailable true minimumfixversion lodash basebranches vulnerabilityidentifier cve vulnerabilitydetails all versions of package lodash all versions of package org fujion webjars lodash are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions steps to reproduce provided by reporter liyuan chen var lo require function build blank n var ret for var i i n i ret return ret var s build blank var date now lo trim s var time date now console log time time var date now lo tonumber s var time date now console log time time var date now lo trimend s var time date now console log time time vulnerabilityurl | 0 |
428,013 | 29,923,261,841 | IssuesEvent | 2023-06-22 01:37:54 | afterpay/sdk-android | https://api.github.com/repos/afterpay/sdk-android | closed | Bug: Documents can be written incorrectly | documentation | I trembled with laughter as even a large company's documents could be written incorrectly,Owl and scream~~~
https://afterpay.github.io/sdk-android/
| 1.0 | Bug: Documents can be written incorrectly - I trembled with laughter as even a large company's documents could be written incorrectly,Owl and scream~~~
https://afterpay.github.io/sdk-android/
| non_priority | bug documents can be written incorrectly i trembled with laughter as even a large company s documents could be written incorrectly owl and scream | 0 |
150,972 | 19,648,140,229 | IssuesEvent | 2022-01-10 01:01:44 | snowdensb/vets-website | https://api.github.com/repos/snowdensb/vets-website | closed | WS-2022-0007 (Medium) detected in node-forge-0.10.0.tgz - autoclosed | security vulnerability | ## WS-2022-0007 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary>
<p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p>
<p>Path to dependency file: /node_modules/node-forge/package.json</p>
<p>Path to vulnerable library: /vets-website/node_modules/node-forge/package.json</p>
<p>
Dependency Hierarchy:
- webpack-dev-server-3.11.0.tgz (Root Library)
- selfsigned-1.10.7.tgz
- :x: **node-forge-0.10.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In node-forge before 1.0.0 he regex used for the forge.util.parseUrl API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.
<p>Publish Date: 2022-01-08
<p>URL: <a href=https://github.com/digitalbazaar/forge/commit/db8016c805371e72b06d8e2edfe0ace0df934a5e>WS-2022-0007</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-gf8q-jrpm-jvxq">https://github.com/advisories/GHSA-gf8q-jrpm-jvxq</a></p>
<p>Release Date: 2022-01-08</p>
<p>Fix Resolution: node-forge - 1.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-forge","packageVersion":"0.10.0","packageFilePaths":["/node_modules/node-forge/package.json"],"isTransitiveDependency":true,"dependencyTree":"webpack-dev-server:3.11.0;selfsigned:1.10.7;node-forge:0.10.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"node-forge - 1.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2022-0007","vulnerabilityDetails":"In node-forge before 1.0.0 he regex used for the forge.util.parseUrl API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.","vulnerabilityUrl":"https://github.com/digitalbazaar/forge/commit/db8016c805371e72b06d8e2edfe0ace0df934a5e","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | WS-2022-0007 (Medium) detected in node-forge-0.10.0.tgz - autoclosed - ## WS-2022-0007 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-forge-0.10.0.tgz</b></p></summary>
<p>JavaScript implementations of network transports, cryptography, ciphers, PKI, message digests, and various utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz">https://registry.npmjs.org/node-forge/-/node-forge-0.10.0.tgz</a></p>
<p>Path to dependency file: /node_modules/node-forge/package.json</p>
<p>Path to vulnerable library: /vets-website/node_modules/node-forge/package.json</p>
<p>
Dependency Hierarchy:
- webpack-dev-server-3.11.0.tgz (Root Library)
- selfsigned-1.10.7.tgz
- :x: **node-forge-0.10.0.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In node-forge before 1.0.0 he regex used for the forge.util.parseUrl API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.
<p>Publish Date: 2022-01-08
<p>URL: <a href=https://github.com/digitalbazaar/forge/commit/db8016c805371e72b06d8e2edfe0ace0df934a5e>WS-2022-0007</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-gf8q-jrpm-jvxq">https://github.com/advisories/GHSA-gf8q-jrpm-jvxq</a></p>
<p>Release Date: 2022-01-08</p>
<p>Fix Resolution: node-forge - 1.0.0</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-forge","packageVersion":"0.10.0","packageFilePaths":["/node_modules/node-forge/package.json"],"isTransitiveDependency":true,"dependencyTree":"webpack-dev-server:3.11.0;selfsigned:1.10.7;node-forge:0.10.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"node-forge - 1.0.0","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2022-0007","vulnerabilityDetails":"In node-forge before 1.0.0 he regex used for the forge.util.parseUrl API would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior.","vulnerabilityUrl":"https://github.com/digitalbazaar/forge/commit/db8016c805371e72b06d8e2edfe0ace0df934a5e","cvss3Severity":"medium","cvss3Score":"5.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_priority | ws medium detected in node forge tgz autoclosed ws medium severity vulnerability vulnerable library node forge tgz javascript implementations of network transports cryptography ciphers pki message digests and various utilities library home page a href path to dependency file node modules node forge package json path to vulnerable library vets website node modules node forge package json dependency hierarchy webpack dev server tgz root library selfsigned tgz x node forge tgz vulnerable library found in base branch master vulnerability details in node forge before he regex used for the forge util parseurl api would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution node forge isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree webpack dev server selfsigned node forge isminimumfixversionavailable true minimumfixversion node forge isbinary false basebranches vulnerabilityidentifier ws vulnerabilitydetails in node forge before he regex used for the forge util parseurl api would not properly parse certain inputs resulting in a parsed data structure that could lead to undesired behavior vulnerabilityurl | 0 |
41,066 | 10,606,846,033 | IssuesEvent | 2019-10-11 01:10:13 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | closed | `ATen/Functions.h` does not exist after calling `scripts/build_android.sh` | module: android module: build triaged | ## 🐛 Bug
Trying to compile pytorch for android, calling `scripts/build_android.sh` for that. Afterwards when trying to use the libraries (torch, c10 etc) via the header `torch/script.h` inside an app I'm getting an error:
aten/src/ATen/ATen.h:12:10: fatal error: 'ATen/Functions.h' file not found
And indeed, using `find` to search for `Functions.h` only results in:
./tools/autograd/templates/Functions.h
./aten/src/ATen/templates/Functions.h
Remember this is *after* calling `scripts/build_android.sh` (so I would have hoped to find it somewhere under `build_android/install`).
It is requested because `torch/script.h` calls `torch/csrc/api/include/torch/types.h` calls `torch/csrc/api/include/torch/types.h` which calls `ATen/ATen.h` which has a line `#include <ATen/Functions.h>`.
## To Reproduce
Steps to reproduce the behavior:
1. git clone etc.
2. scripts/build_android.sh
From here it should already be clear that `ATen/Functions.h` does not exist but is required. To actually get an error:
3. Try and build app in AndroidStudio with `torch/script.h` and relevant libraries linked.
## Expected behavior
I expect `ATen/Functions.h` to exist after calling `scripts/build_android.sh`.
## Environment
```
PyTorch version: 1.3.0a0+f433ee1
Is debug build: No
CUDA used to build PyTorch: 10.1.243
OS: Ubuntu 16.04.6 LTS
GCC version: (Ubuntu 5.4.0-6ubuntu1~16.04.11) 5.4.0 20160609
CMake version: version 3.14.2
Python version: 3.6
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration:
GPU 0: GeForce GTX 1080 Ti
GPU 1: GeForce GT 1030
GPU 2: GeForce GTX 1080 Ti
Nvidia driver version: 418.87.00
cuDNN version: Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.7.3.1
/usr/local/cuda-10.1/targets/x86_64-linux/lib/libcudnn.so.7
/usr/local/cuda-9.1/targets/x86_64-linux/lib/libcudnn.so.7
Versions of relevant libraries:
[pip3] msgpack-numpy==0.4.3.2
[pip3] numpy==1.16.1
[pip3] torch==1.3.0a0+f433ee1
[pip3] torchvision==0.3.0a0+250bac8
[conda] Could not collect
```
| 1.0 | `ATen/Functions.h` does not exist after calling `scripts/build_android.sh` - ## 🐛 Bug
Trying to compile pytorch for android, calling `scripts/build_android.sh` for that. Afterwards when trying to use the libraries (torch, c10 etc) via the header `torch/script.h` inside an app I'm getting an error:
aten/src/ATen/ATen.h:12:10: fatal error: 'ATen/Functions.h' file not found
And indeed, using `find` to search for `Functions.h` only results in:
./tools/autograd/templates/Functions.h
./aten/src/ATen/templates/Functions.h
Remember this is *after* calling `scripts/build_android.sh` (so I would have hoped to find it somewhere under `build_android/install`).
It is requested because `torch/script.h` calls `torch/csrc/api/include/torch/types.h` calls `torch/csrc/api/include/torch/types.h` which calls `ATen/ATen.h` which has a line `#include <ATen/Functions.h>`.
## To Reproduce
Steps to reproduce the behavior:
1. git clone etc.
2. scripts/build_android.sh
From here it should already be clear that `ATen/Functions.h` does not exist but is required. To actually get an error:
3. Try and build app in AndroidStudio with `torch/script.h` and relevant libraries linked.
## Expected behavior
I expect `ATen/Functions.h` to exist after calling `scripts/build_android.sh`.
## Environment
```
PyTorch version: 1.3.0a0+f433ee1
Is debug build: No
CUDA used to build PyTorch: 10.1.243
OS: Ubuntu 16.04.6 LTS
GCC version: (Ubuntu 5.4.0-6ubuntu1~16.04.11) 5.4.0 20160609
CMake version: version 3.14.2
Python version: 3.6
Is CUDA available: Yes
CUDA runtime version: Could not collect
GPU models and configuration:
GPU 0: GeForce GTX 1080 Ti
GPU 1: GeForce GT 1030
GPU 2: GeForce GTX 1080 Ti
Nvidia driver version: 418.87.00
cuDNN version: Probably one of the following:
/usr/lib/x86_64-linux-gnu/libcudnn.so.7.3.1
/usr/local/cuda-10.1/targets/x86_64-linux/lib/libcudnn.so.7
/usr/local/cuda-9.1/targets/x86_64-linux/lib/libcudnn.so.7
Versions of relevant libraries:
[pip3] msgpack-numpy==0.4.3.2
[pip3] numpy==1.16.1
[pip3] torch==1.3.0a0+f433ee1
[pip3] torchvision==0.3.0a0+250bac8
[conda] Could not collect
```
| non_priority | aten functions h does not exist after calling scripts build android sh 🐛 bug trying to compile pytorch for android calling scripts build android sh for that afterwards when trying to use the libraries torch etc via the header torch script h inside an app i m getting an error aten src aten aten h fatal error aten functions h file not found and indeed using find to search for functions h only results in tools autograd templates functions h aten src aten templates functions h remember this is after calling scripts build android sh so i would have hoped to find it somewhere under build android install it is requested because torch script h calls torch csrc api include torch types h calls torch csrc api include torch types h which calls aten aten h which has a line include to reproduce steps to reproduce the behavior git clone etc scripts build android sh from here it should already be clear that aten functions h does not exist but is required to actually get an error try and build app in androidstudio with torch script h and relevant libraries linked expected behavior i expect aten functions h to exist after calling scripts build android sh environment pytorch version is debug build no cuda used to build pytorch os ubuntu lts gcc version ubuntu cmake version version python version is cuda available yes cuda runtime version could not collect gpu models and configuration gpu geforce gtx ti gpu geforce gt gpu geforce gtx ti nvidia driver version cudnn version probably one of the following usr lib linux gnu libcudnn so usr local cuda targets linux lib libcudnn so usr local cuda targets linux lib libcudnn so versions of relevant libraries msgpack numpy numpy torch torchvision could not collect | 0 |
84,795 | 16,553,268,737 | IssuesEvent | 2021-05-28 11:03:47 | ESPD/ESPD-EDM | https://api.github.com/repos/ESPD/ESPD-EDM | closed | Code list CriteriaTaxonomy (CL name TBD) | V3.0.0 codelists | As presented in the OUC community of 3rd of February, the current **CriteriaTaxonomy** code list will be updated for next ESPD release 3.0.0.
The main objective of the update of the code list is to make it non-domain specific and allow its reuse behind the procurement domain.
The main changes will be:
- Change name: from CriteriaTaxonomy to criteria or criterion (TBD).
o The change of name is due to the objective of making the code list non-domain specific.
- Code Change to define non-specific domain criteria (final code changes to be provided in March)
- Adapt concepts when needed
- Code list to be maintained by EU Vocabularies (mid-2021)
Find here the file with the proposals shown during the meeting.
[criterion_cl_sample.xlsx](https://github.com/ESPD/ESPD-EDM/files/5932480/criterion_cl_sample.xlsx)
If you have any additional comment, please do not hesitate to comment on this issue.
| 1.0 | Code list CriteriaTaxonomy (CL name TBD) - As presented in the OUC community of 3rd of February, the current **CriteriaTaxonomy** code list will be updated for next ESPD release 3.0.0.
The main objective of the update of the code list is to make it non-domain specific and allow its reuse behind the procurement domain.
The main changes will be:
- Change name: from CriteriaTaxonomy to criteria or criterion (TBD).
o The change of name is due to the objective of making the code list non-domain specific.
- Code Change to define non-specific domain criteria (final code changes to be provided in March)
- Adapt concepts when needed
- Code list to be maintained by EU Vocabularies (mid-2021)
Find here the file with the proposals shown during the meeting.
[criterion_cl_sample.xlsx](https://github.com/ESPD/ESPD-EDM/files/5932480/criterion_cl_sample.xlsx)
If you have any additional comment, please do not hesitate to comment on this issue.
| non_priority | code list criteriataxonomy cl name tbd as presented in the ouc community of of february the current criteriataxonomy code list will be updated for next espd release the main objective of the update of the code list is to make it non domain specific and allow its reuse behind the procurement domain the main changes will be change name from criteriataxonomy to criteria or criterion tbd o the change of name is due to the objective of making the code list non domain specific code change to define non specific domain criteria final code changes to be provided in march adapt concepts when needed code list to be maintained by eu vocabularies mid find here the file with the proposals shown during the meeting if you have any additional comment please do not hesitate to comment on this issue | 0 |
60,120 | 25,006,166,588 | IssuesEvent | 2022-11-03 12:02:34 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | Missing sample for Managed Identity request | cognitive-services/svc triaged assigned-to-author doc-enhancement Pri2 |
[Enter feedback here]
For all kind of invocations a sample is included. However, for MI one is not included or clearly specify what to use:
https://docs.microsoft.com/en-us/azure/cognitive-services/authentication?context=%2Fazure%2Fapplied-ai-services%2Fform-recognizer%2Fcontext%2Fcontext&tabs=powershell#authorize-access-to-managed-identities
The key point is that in previous samples you can see what token and how to get it to be indicated on a rest call. For MI scenario it is not.
I guess we might use a similar approach as in this sample:
https://docs.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=portal%2Cjava
But it would be great to have it
Thanks
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 1218be14-5001-1458-f5e9-821e38730e20
* Version Independent ID: 45f58de5-cd30-0455-907f-ef1820615d7e
* Content: [Authentication in Azure Cognitive Services - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/authentication?context=%2Fazure%2Fapplied-ai-services%2Fform-recognizer%2Fcontext%2Fcontext&tabs=powershell)
* Content Source: [articles/cognitive-services/authentication.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/cognitive-services/authentication.md)
* Service: **cognitive-services**
* GitHub Login: @PatrickFarley
* Microsoft Alias: **pafarley** | 1.0 | Missing sample for Managed Identity request -
[Enter feedback here]
For all kind of invocations a sample is included. However, for MI one is not included or clearly specify what to use:
https://docs.microsoft.com/en-us/azure/cognitive-services/authentication?context=%2Fazure%2Fapplied-ai-services%2Fform-recognizer%2Fcontext%2Fcontext&tabs=powershell#authorize-access-to-managed-identities
The key point is that in previous samples you can see what token and how to get it to be indicated on a rest call. For MI scenario it is not.
I guess we might use a similar approach as in this sample:
https://docs.microsoft.com/en-us/azure/app-service/overview-managed-identity?tabs=portal%2Cjava
But it would be great to have it
Thanks
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 1218be14-5001-1458-f5e9-821e38730e20
* Version Independent ID: 45f58de5-cd30-0455-907f-ef1820615d7e
* Content: [Authentication in Azure Cognitive Services - Azure Cognitive Services](https://docs.microsoft.com/en-us/azure/cognitive-services/authentication?context=%2Fazure%2Fapplied-ai-services%2Fform-recognizer%2Fcontext%2Fcontext&tabs=powershell)
* Content Source: [articles/cognitive-services/authentication.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/cognitive-services/authentication.md)
* Service: **cognitive-services**
* GitHub Login: @PatrickFarley
* Microsoft Alias: **pafarley** | non_priority | missing sample for managed identity request for all kind of invocations a sample is included however for mi one is not included or clearly specify what to use the key point is that in previous samples you can see what token and how to get it to be indicated on a rest call for mi scenario it is not i guess we might use a similar approach as in this sample but it would be great to have it thanks document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service cognitive services github login patrickfarley microsoft alias pafarley | 0 |
442,320 | 30,828,884,607 | IssuesEvent | 2023-08-01 22:53:09 | TheUpperPart/leaguehub-backend | https://api.github.com/repos/TheUpperPart/leaguehub-backend | closed | Docs: Swagger UI API 작성 | documentation refactor | ## 📝 문서 작업 사항
<!-- 어떤 문서 작업을 진행했는지 알려주세요. -->
- [x] ParticipantController
- [x] 해당 컨트롤러에 사용하는 Dto 예제 추가
## 📖 참고 사항
<!-- 레퍼런스, 스크린샷 등을 넣어 주세요. --> | 1.0 | Docs: Swagger UI API 작성 - ## 📝 문서 작업 사항
<!-- 어떤 문서 작업을 진행했는지 알려주세요. -->
- [x] ParticipantController
- [x] 해당 컨트롤러에 사용하는 Dto 예제 추가
## 📖 참고 사항
<!-- 레퍼런스, 스크린샷 등을 넣어 주세요. --> | non_priority | docs swagger ui api 작성 📝 문서 작업 사항 participantcontroller 해당 컨트롤러에 사용하는 dto 예제 추가 📖 참고 사항 | 0 |
445,210 | 31,199,095,523 | IssuesEvent | 2023-08-18 00:17:33 | projectLEMDO/lemdoIssues | https://api.github.com/repos/projectLEMDO/lemdoIssues | closed | Standardize reporting of signatures in `@n` value of `<pb>` | documentation encoding go for it fix committed | We've been having trouble processing the values of signatures and folio numbers appearing in `pb/@n` because they're not standard; some use lower-case letters, some have leading stuff which is not part of the sig (see issue #145), and others use multiple letters (Aa instead of 2A). We need to standardize this, document it, and enforce it with Schematron. | 1.0 | Standardize reporting of signatures in `@n` value of `<pb>` - We've been having trouble processing the values of signatures and folio numbers appearing in `pb/@n` because they're not standard; some use lower-case letters, some have leading stuff which is not part of the sig (see issue #145), and others use multiple letters (Aa instead of 2A). We need to standardize this, document it, and enforce it with Schematron. | non_priority | standardize reporting of signatures in n value of we ve been having trouble processing the values of signatures and folio numbers appearing in pb n because they re not standard some use lower case letters some have leading stuff which is not part of the sig see issue and others use multiple letters aa instead of we need to standardize this document it and enforce it with schematron | 0 |
49,658 | 20,829,699,310 | IssuesEvent | 2022-03-19 08:02:51 | Azure/azure-sdk-for-net | https://api.github.com/repos/Azure/azure-sdk-for-net | closed | [QUERY] Azure.Storage.Files.DataLake on behalf of user authentication | Service Attention Client needs-author-feedback customer-reported feature-request Data Lake Storage Gen2 no-recent-activity | **Query/Question**
Issue I'm having with this sdk is that I cant figure out how to authN user against my datalake. There are so many options on TokenCredentials, but none to generate new token from existing one. Went through documentation and could not figure out. I'm trying to achieve this on WebAPI where I already have access token for my own system from AD, authenticated for this app in my tenant.
***Why is this not a Bug or a feature Request?***
I'm unsure if there is way to do this.
**Setup (please complete the following information if applicable):**
- Running .NET Core 3.1
- Azure.Storage.Files.DataLake 12.0.0 preview 8
| 1.0 | [QUERY] Azure.Storage.Files.DataLake on behalf of user authentication - **Query/Question**
Issue I'm having with this sdk is that I cant figure out how to authN user against my datalake. There are so many options on TokenCredentials, but none to generate new token from existing one. Went through documentation and could not figure out. I'm trying to achieve this on WebAPI where I already have access token for my own system from AD, authenticated for this app in my tenant.
***Why is this not a Bug or a feature Request?***
I'm unsure if there is way to do this.
**Setup (please complete the following information if applicable):**
- Running .NET Core 3.1
- Azure.Storage.Files.DataLake 12.0.0 preview 8
| non_priority | azure storage files datalake on behalf of user authentication query question issue i m having with this sdk is that i cant figure out how to authn user against my datalake there are so many options on tokencredentials but none to generate new token from existing one went through documentation and could not figure out i m trying to achieve this on webapi where i already have access token for my own system from ad authenticated for this app in my tenant why is this not a bug or a feature request i m unsure if there is way to do this setup please complete the following information if applicable running net core azure storage files datalake preview | 0 |
10,444 | 4,063,654,794 | IssuesEvent | 2016-05-26 00:59:41 | ReikaKalseki/Reika_Mods_Issues | https://api.github.com/repos/ReikaKalseki/Reika_Mods_Issues | closed | [ChromatiCraft] ChromaHelpHUD reads invalid block metadata after looked at block changes | Bug ChromatiCraft Crash Mod Interaction Stupid Code Uncertain Fixability | This is related to an [issue in Glenn's Gases' issue tracker](https://bitbucket.org/jamieswhiteshirt/glenns-gases/issues/43/glenns-gases-causes-crash-with). It is causing a [crash](http://pastebin.com/k9mDmDmE).
What is happening: The player mines a ChromatiCraft ore block. Because Glenn's Gases recognizes this as a stone-type material block, it places dust in place of the ore. This dust uses metadata to track its own volume, and is therefore affecting the metadata in [ChromaHelpHUDs queries in the subsequent ticks](https://github.com/ReikaKalseki/ChromatiCraft/blob/master/Auxiliary/ChromaHelpHUD.java#L70).
I think this might be a general problem in any situation in which a block the player is looking at changes. | 1.0 | [ChromatiCraft] ChromaHelpHUD reads invalid block metadata after looked at block changes - This is related to an [issue in Glenn's Gases' issue tracker](https://bitbucket.org/jamieswhiteshirt/glenns-gases/issues/43/glenns-gases-causes-crash-with). It is causing a [crash](http://pastebin.com/k9mDmDmE).
What is happening: The player mines a ChromatiCraft ore block. Because Glenn's Gases recognizes this as a stone-type material block, it places dust in place of the ore. This dust uses metadata to track its own volume, and is therefore affecting the metadata in [ChromaHelpHUDs queries in the subsequent ticks](https://github.com/ReikaKalseki/ChromatiCraft/blob/master/Auxiliary/ChromaHelpHUD.java#L70).
I think this might be a general problem in any situation in which a block the player is looking at changes. | non_priority | chromahelphud reads invalid block metadata after looked at block changes this is related to an it is causing a what is happening the player mines a chromaticraft ore block because glenn s gases recognizes this as a stone type material block it places dust in place of the ore this dust uses metadata to track its own volume and is therefore affecting the metadata in i think this might be a general problem in any situation in which a block the player is looking at changes | 0 |
4,816 | 2,875,626,953 | IssuesEvent | 2015-06-09 09:22:28 | hazelcast/hazelcast | https://api.github.com/repos/hazelcast/hazelcast | closed | ReplicatedMap documentation page does not mention that it is in beta stage | Team: Documentation | ReplicatedMap interface is marked with @Beta annotation, however the documentation does not warn the users that it might not be fully stable. | 1.0 | ReplicatedMap documentation page does not mention that it is in beta stage - ReplicatedMap interface is marked with @Beta annotation, however the documentation does not warn the users that it might not be fully stable. | non_priority | replicatedmap documentation page does not mention that it is in beta stage replicatedmap interface is marked with beta annotation however the documentation does not warn the users that it might not be fully stable | 0 |
351,388 | 31,999,374,725 | IssuesEvent | 2023-09-21 11:16:03 | unifyai/ivy | https://api.github.com/repos/unifyai/ivy | reopened | Fix array.test_jax__ge_ | JAX Frontend Sub Task Failing Test | | | |
|---|---|
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6260848522"><img src=https://img.shields.io/badge/-failure-red></a>
| 1.0 | Fix array.test_jax__ge_ - | | |
|---|---|
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/6260222171"><img src=https://img.shields.io/badge/-success-success></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6260848522"><img src=https://img.shields.io/badge/-failure-red></a>
| non_priority | fix array test jax ge numpy a href src jax a href src tensorflow a href src torch a href src paddle a href src | 0 |
271,401 | 29,488,836,645 | IssuesEvent | 2023-06-02 11:58:32 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | RCS 2.0: Don't query for field list when configuring FLS against a remote cluster | bug Team:Security Feature:Users/Roles/API Keys | RCS 2.0 introduces a new section to the role management screen, allowing administrators to configure index privileges against a properly configured remote cluster. This includes the ability to specify field-level security ("FLS").
When activating the FLS controls on the role management page, we query the local cluster to populate the list of available fields to include/exclude from the role. We need to ensure that we **do not** make this call when configuring remote cluster privileges:
1) Querying the local cluster makes no sense
2) Querying the remote cluster is out of scope at this time, and may be problematic from an authorization perspective. | True | RCS 2.0: Don't query for field list when configuring FLS against a remote cluster - RCS 2.0 introduces a new section to the role management screen, allowing administrators to configure index privileges against a properly configured remote cluster. This includes the ability to specify field-level security ("FLS").
When activating the FLS controls on the role management page, we query the local cluster to populate the list of available fields to include/exclude from the role. We need to ensure that we **do not** make this call when configuring remote cluster privileges:
1) Querying the local cluster makes no sense
2) Querying the remote cluster is out of scope at this time, and may be problematic from an authorization perspective. | non_priority | rcs don t query for field list when configuring fls against a remote cluster rcs introduces a new section to the role management screen allowing administrators to configure index privileges against a properly configured remote cluster this includes the ability to specify field level security fls when activating the fls controls on the role management page we query the local cluster to populate the list of available fields to include exclude from the role we need to ensure that we do not make this call when configuring remote cluster privileges querying the local cluster makes no sense querying the remote cluster is out of scope at this time and may be problematic from an authorization perspective | 0 |
26,519 | 26,904,629,587 | IssuesEvent | 2023-02-06 18:03:26 | usds/justice40-tool | https://api.github.com/repos/usds/justice40-tool | closed | Change below to not above on all the thresholds in the side panel | frontend usability 1.1 | **Describe the bug**
It is causing some confusion that it says "below" when the value is below the threshold. It can be interpreted as the value needs to be below. I think changing it to "not above" is an improvement.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'Explore the map'
2. Click on 'a census tract'
3. Scroll down to 'side panel'
4. See error
**Expected behavior**
Change all the below to be not above.
**Screenshots**
If applicable, add screenshots to help explain your problem.
<img width="253" alt="Screen Shot 2022-12-06 at 11 14 53 AM" src="https://user-images.githubusercontent.com/77702996/206001518-c5001d98-0acb-4bfa-bc38-8474efe0080b.png">
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.
| True | Change below to not above on all the thresholds in the side panel - **Describe the bug**
It is causing some confusion that it says "below" when the value is below the threshold. It can be interpreted as the value needs to be below. I think changing it to "not above" is an improvement.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'Explore the map'
2. Click on 'a census tract'
3. Scroll down to 'side panel'
4. See error
**Expected behavior**
Change all the below to be not above.
**Screenshots**
If applicable, add screenshots to help explain your problem.
<img width="253" alt="Screen Shot 2022-12-06 at 11 14 53 AM" src="https://user-images.githubusercontent.com/77702996/206001518-c5001d98-0acb-4bfa-bc38-8474efe0080b.png">
**Desktop (please complete the following information):**
- OS: [e.g. iOS]
- Browser [e.g. chrome, safari]
- Version [e.g. 22]
**Smartphone (please complete the following information):**
- Device: [e.g. iPhone6]
- OS: [e.g. iOS8.1]
- Browser [e.g. stock browser, safari]
- Version [e.g. 22]
**Additional context**
Add any other context about the problem here.
| non_priority | change below to not above on all the thresholds in the side panel describe the bug it is causing some confusion that it says below when the value is below the threshold it can be interpreted as the value needs to be below i think changing it to not above is an improvement to reproduce steps to reproduce the behavior go to explore the map click on a census tract scroll down to side panel see error expected behavior change all the below to be not above screenshots if applicable add screenshots to help explain your problem img width alt screen shot at am src desktop please complete the following information os browser version smartphone please complete the following information device os browser version additional context add any other context about the problem here | 0 |
139,042 | 20,763,238,811 | IssuesEvent | 2022-03-15 18:04:38 | MetaMask/design-tokens | https://api.github.com/repos/MetaMask/design-tokens | opened | Input/Componentize IA Nav Redesign | design-consistency | ### **Description**
Create Input component
### **Technical Details**
- Create Figma component
- Create documentation
### **Acceptance Criteria**
- [ ] - Color/Text token applied
- [ ] - Figma component
- [ ] - Figma component's grouping/naming is standardized
- [ ] - large screen & small screen
- [ ] - Documentation
### **References**
https://www.figma.com/file/HKpPKij9V3TpsyMV1TpV7C/?node-id=1093%3A8539
| 1.0 | Input/Componentize IA Nav Redesign - ### **Description**
Create Input component
### **Technical Details**
- Create Figma component
- Create documentation
### **Acceptance Criteria**
- [ ] - Color/Text token applied
- [ ] - Figma component
- [ ] - Figma component's grouping/naming is standardized
- [ ] - large screen & small screen
- [ ] - Documentation
### **References**
https://www.figma.com/file/HKpPKij9V3TpsyMV1TpV7C/?node-id=1093%3A8539
| non_priority | input componentize ia nav redesign description create input component technical details create figma component create documentation acceptance criteria color text token applied figma component figma component s grouping naming is standardized large screen small screen documentation references | 0 |
226,486 | 24,947,215,124 | IssuesEvent | 2022-11-01 02:01:02 | kedacore/external-scaler-azure-cosmos-db | https://api.github.com/repos/kedacore/external-scaler-azure-cosmos-db | closed | CVE-2017-0249 (High) detected in system.net.http.4.3.0.nupkg - autoclosed | security vulnerability | ## CVE-2017-0249 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>system.net.http.4.3.0.nupkg</b></p></summary>
<p>Provides a programming interface for modern HTTP applications, including HTTP client components that...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/system.net.http.4.3.0.nupkg">https://api.nuget.org/packages/system.net.http.4.3.0.nupkg</a></p>
<p>Path to dependency file: /src/Scaler.Tests/Keda.CosmosDb.Scaler.Tests.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/system.net.http/4.3.0/system.net.http.4.3.0.nupkg</p>
<p>
Dependency Hierarchy:
- xunit.2.4.2.nupkg (Root Library)
- xunit.assert.2.4.2.nupkg
- netstandard.library.1.6.1.nupkg
- :x: **system.net.http.4.3.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kedacore/external-scaler-azure-cosmos-db/commit/208c73830a79844b58ff4ae9ee5915696f9d9299">208c73830a79844b58ff4ae9ee5915696f9d9299</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An elevation of privilege vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0249>CVE-2017-0249</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.3.1;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2017-0249 (High) detected in system.net.http.4.3.0.nupkg - autoclosed - ## CVE-2017-0249 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>system.net.http.4.3.0.nupkg</b></p></summary>
<p>Provides a programming interface for modern HTTP applications, including HTTP client components that...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/system.net.http.4.3.0.nupkg">https://api.nuget.org/packages/system.net.http.4.3.0.nupkg</a></p>
<p>Path to dependency file: /src/Scaler.Tests/Keda.CosmosDb.Scaler.Tests.csproj</p>
<p>Path to vulnerable library: /home/wss-scanner/.nuget/packages/system.net.http/4.3.0/system.net.http.4.3.0.nupkg</p>
<p>
Dependency Hierarchy:
- xunit.2.4.2.nupkg (Root Library)
- xunit.assert.2.4.2.nupkg
- netstandard.library.1.6.1.nupkg
- :x: **system.net.http.4.3.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/kedacore/external-scaler-azure-cosmos-db/commit/208c73830a79844b58ff4ae9ee5915696f9d9299">208c73830a79844b58ff4ae9ee5915696f9d9299</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An elevation of privilege vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0249>CVE-2017-0249</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: System.Text.Encodings.Web - 4.0.1,4.3.1;System.Net.Http - 4.1.2,4.3.2;System.Net.Http.WinHttpHandler - 4.0.2,4.3.1;System.Net.Security - 4.0.1,4.3.1;System.Net.WebSockets.Client - 4.0.1,4.3.1;Microsoft.AspNetCore.Mvc - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Abstractions - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ApiExplorer - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor.Host - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Razor - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in system net http nupkg autoclosed cve high severity vulnerability vulnerable library system net http nupkg provides a programming interface for modern http applications including http client components that library home page a href path to dependency file src scaler tests keda cosmosdb scaler tests csproj path to vulnerable library home wss scanner nuget packages system net http system net http nupkg dependency hierarchy xunit nupkg root library xunit assert nupkg netstandard library nupkg x system net http nupkg vulnerable library found in head commit a href found in base branch main vulnerability details an elevation of privilege vulnerability exists when the asp net core fails to properly sanitize web requests publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact low for more information on scores click a href suggested fix type upgrade version release date fix resolution system text encodings web system net http system net http winhttphandler system net security system net websockets client microsoft aspnetcore mvc microsoft aspnetcore mvc core microsoft aspnetcore mvc abstractions microsoft aspnetcore mvc apiexplorer microsoft aspnetcore mvc cors microsoft aspnetcore mvc dataannotations microsoft aspnetcore mvc formatters json microsoft aspnetcore mvc formatters xml microsoft aspnetcore mvc localization microsoft aspnetcore mvc razor host microsoft aspnetcore mvc razor microsoft aspnetcore mvc taghelpers microsoft aspnetcore mvc viewfeatures microsoft aspnetcore mvc webapicompatshim step up your open source security game with mend | 0 |
125,373 | 17,836,135,134 | IssuesEvent | 2021-09-03 01:31:36 | iVipz/WebGoat | https://api.github.com/repos/iVipz/WebGoat | opened | CVE-2021-32804 (High) detected in tar-4.4.1.tgz | security vulnerability | ## CVE-2021-32804 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.1.tgz">https://registry.npmjs.org/tar/-/tar-4.4.1.tgz</a></p>
<p>
Dependency Hierarchy:
- browser-sync-2.26.3.tgz (Root Library)
- chokidar-2.0.4.tgz
- fsevents-1.2.4.tgz
- node-pre-gyp-0.10.0.tgz
- :x: **tar-4.4.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has a arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804>CVE-2021-32804</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9">https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution: tar - 3.2.2, 4.4.14, 5.0.6, 6.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-32804 (High) detected in tar-4.4.1.tgz - ## CVE-2021-32804 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tar-4.4.1.tgz</b></p></summary>
<p>tar for node</p>
<p>Library home page: <a href="https://registry.npmjs.org/tar/-/tar-4.4.1.tgz">https://registry.npmjs.org/tar/-/tar-4.4.1.tgz</a></p>
<p>
Dependency Hierarchy:
- browser-sync-2.26.3.tgz (Root Library)
- chokidar-2.0.4.tgz
- fsevents-1.2.4.tgz
- node-pre-gyp-0.10.0.tgz
- :x: **tar-4.4.1.tgz** (Vulnerable Library)
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The npm package "tar" (aka node-tar) before versions 6.1.1, 5.0.6, 4.4.14, and 3.3.2 has a arbitrary File Creation/Overwrite vulnerability due to insufficient absolute path sanitization. node-tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the `preservePaths` flag is not set to `true`. This is achieved by stripping the absolute path root from any absolute file paths contained in a tar file. For example `/home/user/.bashrc` would turn into `home/user/.bashrc`. This logic was insufficient when file paths contained repeated path roots such as `////home/user/.bashrc`. `node-tar` would only strip a single path root from such paths. When given an absolute file path with repeating path roots, the resulting path (e.g. `///home/user/.bashrc`) would still resolve to an absolute path, thus allowing arbitrary file creation and overwrite. This issue was addressed in releases 3.2.2, 4.4.14, 5.0.6 and 6.1.1. Users may work around this vulnerability without upgrading by creating a custom `onentry` method which sanitizes the `entry.path` or a `filter` method which removes entries with absolute paths. See referenced GitHub Advisory for details. Be aware of CVE-2021-32803 which fixes a similar bug in later versions of tar.
<p>Publish Date: 2021-08-03
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-32804>CVE-2021-32804</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9">https://github.com/npm/node-tar/security/advisories/GHSA-3jfq-g458-7qm9</a></p>
<p>Release Date: 2021-08-03</p>
<p>Fix Resolution: tar - 3.2.2, 4.4.14, 5.0.6, 6.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in tar tgz cve high severity vulnerability vulnerable library tar tgz tar for node library home page a href dependency hierarchy browser sync tgz root library chokidar tgz fsevents tgz node pre gyp tgz x tar tgz vulnerable library found in base branch develop vulnerability details the npm package tar aka node tar before versions and has a arbitrary file creation overwrite vulnerability due to insufficient absolute path sanitization node tar aims to prevent extraction of absolute file paths by turning absolute paths into relative paths when the preservepaths flag is not set to true this is achieved by stripping the absolute path root from any absolute file paths contained in a tar file for example home user bashrc would turn into home user bashrc this logic was insufficient when file paths contained repeated path roots such as home user bashrc node tar would only strip a single path root from such paths when given an absolute file path with repeating path roots the resulting path e g home user bashrc would still resolve to an absolute path thus allowing arbitrary file creation and overwrite this issue was addressed in releases and users may work around this vulnerability without upgrading by creating a custom onentry method which sanitizes the entry path or a filter method which removes entries with absolute paths see referenced github advisory for details be aware of cve which fixes a similar bug in later versions of tar publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tar step up your open source security game with whitesource | 0 |
28,376 | 4,099,877,696 | IssuesEvent | 2016-06-03 14:17:49 | DIYbiosphere/sphere.dir | https://api.github.com/repos/DIYbiosphere/sphere.dir | opened | Create a Sort and Filter pop up in list of initiatives | database design development feature request | In the table view of the initiatives. There should be a basic option for checking and unchecking what you want to display.
I will try to come up with the design | 1.0 | Create a Sort and Filter pop up in list of initiatives - In the table view of the initiatives. There should be a basic option for checking and unchecking what you want to display.
I will try to come up with the design | non_priority | create a sort and filter pop up in list of initiatives in the table view of the initiatives there should be a basic option for checking and unchecking what you want to display i will try to come up with the design | 0 |
37,689 | 8,353,134,667 | IssuesEvent | 2018-10-02 09:05:02 | makandra/consul | https://api.github.com/repos/makandra/consul | closed | Bang methods should return the scope when successful, not just true | enhancement help wanted needs code needs tests | Hey there,
I'm really liking Consul so far, however, there's an aspect of the auto-generated bang methods that I find rather frustrating.
I often-times want, in an index action,. to check if the user has access the resource being indexed; if they do, I want to populate the relevant variable with the list of resource instances they have access to but if they don't I want to raise Consul::Powerless so I can handle it in a rescue_from handler. Normally, I'd expect it to work like this:
``` ruby
@resources = current_power.resources!
```
...but what I actually need is more like:
``` ruby
@resources = current_power.resources if current_power.resources!
```
I can't really use
``` ruby
@resources = current_power.resources
```
alone, either, because if they don't have access to that resource at all (that is, the method returns nil rather than simply an empty set), then I'll get a NoMethodError in the view when I try to iterate over `@resources`.
I can probably write up a pull request for this, but I thought I'd ask for your opinion on the matter first; there might be a logic to the current behavior that I don't understand, and I didn't want to code up a pull request for something that turns out to be a bad idea.
| 1.0 | Bang methods should return the scope when successful, not just true - Hey there,
I'm really liking Consul so far, however, there's an aspect of the auto-generated bang methods that I find rather frustrating.
I often-times want, in an index action,. to check if the user has access the resource being indexed; if they do, I want to populate the relevant variable with the list of resource instances they have access to but if they don't I want to raise Consul::Powerless so I can handle it in a rescue_from handler. Normally, I'd expect it to work like this:
``` ruby
@resources = current_power.resources!
```
...but what I actually need is more like:
``` ruby
@resources = current_power.resources if current_power.resources!
```
I can't really use
``` ruby
@resources = current_power.resources
```
alone, either, because if they don't have access to that resource at all (that is, the method returns nil rather than simply an empty set), then I'll get a NoMethodError in the view when I try to iterate over `@resources`.
I can probably write up a pull request for this, but I thought I'd ask for your opinion on the matter first; there might be a logic to the current behavior that I don't understand, and I didn't want to code up a pull request for something that turns out to be a bad idea.
| non_priority | bang methods should return the scope when successful not just true hey there i m really liking consul so far however there s an aspect of the auto generated bang methods that i find rather frustrating i often times want in an index action to check if the user has access the resource being indexed if they do i want to populate the relevant variable with the list of resource instances they have access to but if they don t i want to raise consul powerless so i can handle it in a rescue from handler normally i d expect it to work like this ruby resources current power resources but what i actually need is more like ruby resources current power resources if current power resources i can t really use ruby resources current power resources alone either because if they don t have access to that resource at all that is the method returns nil rather than simply an empty set then i ll get a nomethoderror in the view when i try to iterate over resources i can probably write up a pull request for this but i thought i d ask for your opinion on the matter first there might be a logic to the current behavior that i don t understand and i didn t want to code up a pull request for something that turns out to be a bad idea | 0 |
288,912 | 31,931,001,215 | IssuesEvent | 2023-09-19 07:25:09 | Trinadh465/linux-4.1.15_CVE-2023-4128 | https://api.github.com/repos/Trinadh465/linux-4.1.15_CVE-2023-4128 | opened | CVE-2019-20422 (Medium) detected in linuxlinux-4.6 | Mend: dependency security vulnerability | ## CVE-2019-20422 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/linux-4.1.15_CVE-2023-4128/commit/0c6c8d8c809f697cd5fc581c6c08e9ad646c55a8">0c6c8d8c809f697cd5fc581c6c08e9ad646c55a8</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv6/ip6_fib.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv6/ip6_fib.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel before 5.3.4, fib6_rule_lookup in net/ipv6/ip6_fib.c mishandles the RT6_LOOKUP_F_DST_NOREF flag in a reference-count decision, leading to (for example) a crash that was identified by syzkaller, aka CID-7b09c2d052db.
<p>Publish Date: 2020-01-27
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-20422>CVE-2019-20422</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-20422">https://www.linuxkernelcves.com/cves/CVE-2019-20422</a></p>
<p>Release Date: 2020-03-13</p>
<p>Fix Resolution: v5.4-rc1,v5.3.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-20422 (Medium) detected in linuxlinux-4.6 - ## CVE-2019-20422 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/Trinadh465/linux-4.1.15_CVE-2023-4128/commit/0c6c8d8c809f697cd5fc581c6c08e9ad646c55a8">0c6c8d8c809f697cd5fc581c6c08e9ad646c55a8</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv6/ip6_fib.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/net/ipv6/ip6_fib.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel before 5.3.4, fib6_rule_lookup in net/ipv6/ip6_fib.c mishandles the RT6_LOOKUP_F_DST_NOREF flag in a reference-count decision, leading to (for example) a crash that was identified by syzkaller, aka CID-7b09c2d052db.
<p>Publish Date: 2020-01-27
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-20422>CVE-2019-20422</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-20422">https://www.linuxkernelcves.com/cves/CVE-2019-20422</a></p>
<p>Release Date: 2020-03-13</p>
<p>Fix Resolution: v5.4-rc1,v5.3.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in linuxlinux cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch main vulnerable source files net fib c net fib c vulnerability details in the linux kernel before rule lookup in net fib c mishandles the lookup f dst noref flag in a reference count decision leading to for example a crash that was identified by syzkaller aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
271,874 | 20,720,326,974 | IssuesEvent | 2022-03-13 09:34:35 | o108minmin/halberd | https://api.github.com/repos/o108minmin/halberd | closed | かっこいいアイコンの追加 | documentation | **課題 or やりたいこと**
- かっこいいアイコンが欲しい
**やること**
- [x] アイコンの形式を調べる
- [x] アイコンを作ってcommitする
| 1.0 | かっこいいアイコンの追加 - **課題 or やりたいこと**
- かっこいいアイコンが欲しい
**やること**
- [x] アイコンの形式を調べる
- [x] アイコンを作ってcommitする
| non_priority | かっこいいアイコンの追加 課題 or やりたいこと かっこいいアイコンが欲しい やること アイコンの形式を調べる アイコンを作ってcommitする | 0 |
103,975 | 13,017,665,008 | IssuesEvent | 2020-07-26 13:40:24 | jupyterlab/jupyterlab-git | https://api.github.com/repos/jupyterlab/jupyterlab-git | closed | Provide feedback on git command execution | tag:Design and UX type:Enhancement | This is a follow-up of #564.
Proposal:
1. Allow retry timeout configurability for acquiring the index.lock.
2. Suspend UI interactions until pending command complete (e.g., a modal with a spinner). I believe this should be straightforward, as we just need to enable a UI element until receiving an HTTP response.
The need for the latter is to prevent/dissuade the user from closing the JupyterLab server before pending commands have had a chance to complete (e.g., before an `index.lock` file is removed and `git add && git commit` run, thus potentially leading to the discarding of user changes).
_Originally posted by @kgryte in https://github.com/jupyterlab/jupyterlab-git/pull/564#issuecomment-612141351_ | 1.0 | Provide feedback on git command execution - This is a follow-up of #564.
Proposal:
1. Allow retry timeout configurability for acquiring the index.lock.
2. Suspend UI interactions until pending command complete (e.g., a modal with a spinner). I believe this should be straightforward, as we just need to enable a UI element until receiving an HTTP response.
The need for the latter is to prevent/dissuade the user from closing the JupyterLab server before pending commands have had a chance to complete (e.g., before an `index.lock` file is removed and `git add && git commit` run, thus potentially leading to the discarding of user changes).
_Originally posted by @kgryte in https://github.com/jupyterlab/jupyterlab-git/pull/564#issuecomment-612141351_ | non_priority | provide feedback on git command execution this is a follow up of proposal allow retry timeout configurability for acquiring the index lock suspend ui interactions until pending command complete e g a modal with a spinner i believe this should be straightforward as we just need to enable a ui element until receiving an http response the need for the latter is to prevent dissuade the user from closing the jupyterlab server before pending commands have had a chance to complete e g before an index lock file is removed and git add git commit run thus potentially leading to the discarding of user changes originally posted by kgryte in | 0 |
235,843 | 25,962,070,419 | IssuesEvent | 2022-12-19 01:03:15 | invisiblehats/acts_as_scope | https://api.github.com/repos/invisiblehats/acts_as_scope | opened | CVE-2022-23519 (Medium) detected in rails-html-sanitizer-1.4.2.gem | security vulnerability | ## CVE-2022-23519 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>rails-html-sanitizer-1.4.2.gem</b></p></summary>
<p>HTML sanitization for Rails applications</p>
<p>Library home page: <a href="https://rubygems.org/gems/rails-html-sanitizer-1.4.2.gem">https://rubygems.org/gems/rails-html-sanitizer-1.4.2.gem</a></p>
<p>Path to dependency file: /Gemfile.lock</p>
<p>Path to vulnerable library: /home/wss-scanner/.gem/ruby/2.7.0/cache/rails-html-sanitizer-1.4.2.gem</p>
<p>
Dependency Hierarchy:
- invisible_standards-0.1.2.gem (Root Library)
- rails-7.0.2.3.gem
- actionmailer-7.0.2.3.gem
- actionpack-7.0.2.3.gem
- :x: **rails-html-sanitizer-1.4.2.gem** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
rails-html-sanitizer is responsible for sanitizing HTML fragments in Rails applications. Prior to version 1.4.4, a possible XSS vulnerability with certain configurations of Rails::Html::Sanitizer may allow an attacker to inject content if the application developer has overridden the sanitizer's allowed tags in either of the following ways: allow both "math" and "style" elements, or allow both "svg" and "style" elements. Code is only impacted if allowed tags are being overridden. . This issue is fixed in version 1.4.4. All users overriding the allowed tags to include "math" or "svg" and "style" should either upgrade or use the following workaround immediately: Remove "style" from the overridden allowed tags, or remove "math" and "svg" from the overridden allowed tags.
<p>Publish Date: 2022-12-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23519>CVE-2022-23519</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/rails-html-sanitizer/security/advisories/GHSA-9h9g-93gc-623h">https://github.com/rails/rails-html-sanitizer/security/advisories/GHSA-9h9g-93gc-623h</a></p>
<p>Release Date: 2022-12-14</p>
<p>Fix Resolution: rails-html-sanitizer - 1.4.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-23519 (Medium) detected in rails-html-sanitizer-1.4.2.gem - ## CVE-2022-23519 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>rails-html-sanitizer-1.4.2.gem</b></p></summary>
<p>HTML sanitization for Rails applications</p>
<p>Library home page: <a href="https://rubygems.org/gems/rails-html-sanitizer-1.4.2.gem">https://rubygems.org/gems/rails-html-sanitizer-1.4.2.gem</a></p>
<p>Path to dependency file: /Gemfile.lock</p>
<p>Path to vulnerable library: /home/wss-scanner/.gem/ruby/2.7.0/cache/rails-html-sanitizer-1.4.2.gem</p>
<p>
Dependency Hierarchy:
- invisible_standards-0.1.2.gem (Root Library)
- rails-7.0.2.3.gem
- actionmailer-7.0.2.3.gem
- actionpack-7.0.2.3.gem
- :x: **rails-html-sanitizer-1.4.2.gem** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
rails-html-sanitizer is responsible for sanitizing HTML fragments in Rails applications. Prior to version 1.4.4, a possible XSS vulnerability with certain configurations of Rails::Html::Sanitizer may allow an attacker to inject content if the application developer has overridden the sanitizer's allowed tags in either of the following ways: allow both "math" and "style" elements, or allow both "svg" and "style" elements. Code is only impacted if allowed tags are being overridden. . This issue is fixed in version 1.4.4. All users overriding the allowed tags to include "math" or "svg" and "style" should either upgrade or use the following workaround immediately: Remove "style" from the overridden allowed tags, or remove "math" and "svg" from the overridden allowed tags.
<p>Publish Date: 2022-12-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-23519>CVE-2022-23519</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/rails/rails-html-sanitizer/security/advisories/GHSA-9h9g-93gc-623h">https://github.com/rails/rails-html-sanitizer/security/advisories/GHSA-9h9g-93gc-623h</a></p>
<p>Release Date: 2022-12-14</p>
<p>Fix Resolution: rails-html-sanitizer - 1.4.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in rails html sanitizer gem cve medium severity vulnerability vulnerable library rails html sanitizer gem html sanitization for rails applications library home page a href path to dependency file gemfile lock path to vulnerable library home wss scanner gem ruby cache rails html sanitizer gem dependency hierarchy invisible standards gem root library rails gem actionmailer gem actionpack gem x rails html sanitizer gem vulnerable library found in base branch master vulnerability details rails html sanitizer is responsible for sanitizing html fragments in rails applications prior to version a possible xss vulnerability with certain configurations of rails html sanitizer may allow an attacker to inject content if the application developer has overridden the sanitizer s allowed tags in either of the following ways allow both math and style elements or allow both svg and style elements code is only impacted if allowed tags are being overridden this issue is fixed in version all users overriding the allowed tags to include math or svg and style should either upgrade or use the following workaround immediately remove style from the overridden allowed tags or remove math and svg from the overridden allowed tags publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rails html sanitizer step up your open source security game with mend | 0 |
113,894 | 9,668,351,142 | IssuesEvent | 2019-05-21 14:58:56 | phetsims/circuit-construction-kit-common | https://api.github.com/repos/phetsims/circuit-construction-kit-common | closed | CT numberZigZags must be an integer | type:automated-testing | ```
circuit-construction-kit-dc : xss-fuzz : load
Query: brand=phet&ea&fuzz&stringTest=xss&memoryLimit=1000
Uncaught Error: Assertion failed: numberZigZags must be an integer: 11.9947
Error: Assertion failed: numberZigZags must be an integer: 11.9947
at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/assert/js/assert.js:22:13)
at Shape.zigZagToPoint (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/kite/js/Shape.js?bust=1557957750764:353:17)
at new FuseNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/FuseNode.js?bust=1557957750764:63:10)
at CircuitElementToolFactory.createFuseToolNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/CircuitElementToolFactory.js?bust=1557957750764:324:9)
at new IntroScreenView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/view/IntroScreenView.js?bust=1557957750764:41:35)
at IntroScreen.createView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/IntroScreen.js?bust=1557957750764:64:18)
at IntroScreen.initializeView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Screen.js?bust=1557957750764:261:25)
at Array.<anonymous> (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:803:18)
at https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:811:27
id: Bayes Chrome
Approximately 5/15/2019, 1:38:18 PM
circuit-construction-kit-dc : xss-fuzz : run
Query: brand=phet&ea&fuzz&stringTest=xss&memoryLimit=1000
Uncaught Error: Assertion failed: numberZigZags must be an integer: 11.9947
Error: Assertion failed: numberZigZags must be an integer: 11.9947
at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/assert/js/assert.js:22:13)
at Shape.zigZagToPoint (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/kite/js/Shape.js?bust=1557957750764:353:17)
at new FuseNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/FuseNode.js?bust=1557957750764:63:10)
at CircuitElementToolFactory.createFuseToolNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/CircuitElementToolFactory.js?bust=1557957750764:324:9)
at new IntroScreenView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/view/IntroScreenView.js?bust=1557957750764:41:35)
at IntroScreen.createView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/IntroScreen.js?bust=1557957750764:64:18)
at IntroScreen.initializeView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Screen.js?bust=1557957750764:261:25)
at Array.<anonymous> (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:803:18)
at https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:811:27
id: Bayes Chrome
Approximately 5/15/2019, 1:38:18 PM
``` | 1.0 | CT numberZigZags must be an integer - ```
circuit-construction-kit-dc : xss-fuzz : load
Query: brand=phet&ea&fuzz&stringTest=xss&memoryLimit=1000
Uncaught Error: Assertion failed: numberZigZags must be an integer: 11.9947
Error: Assertion failed: numberZigZags must be an integer: 11.9947
at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/assert/js/assert.js:22:13)
at Shape.zigZagToPoint (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/kite/js/Shape.js?bust=1557957750764:353:17)
at new FuseNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/FuseNode.js?bust=1557957750764:63:10)
at CircuitElementToolFactory.createFuseToolNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/CircuitElementToolFactory.js?bust=1557957750764:324:9)
at new IntroScreenView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/view/IntroScreenView.js?bust=1557957750764:41:35)
at IntroScreen.createView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/IntroScreen.js?bust=1557957750764:64:18)
at IntroScreen.initializeView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Screen.js?bust=1557957750764:261:25)
at Array.<anonymous> (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:803:18)
at https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:811:27
id: Bayes Chrome
Approximately 5/15/2019, 1:38:18 PM
circuit-construction-kit-dc : xss-fuzz : run
Query: brand=phet&ea&fuzz&stringTest=xss&memoryLimit=1000
Uncaught Error: Assertion failed: numberZigZags must be an integer: 11.9947
Error: Assertion failed: numberZigZags must be an integer: 11.9947
at window.assertions.assertFunction (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/assert/js/assert.js:22:13)
at Shape.zigZagToPoint (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/kite/js/Shape.js?bust=1557957750764:353:17)
at new FuseNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/FuseNode.js?bust=1557957750764:63:10)
at CircuitElementToolFactory.createFuseToolNode (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-common/js/view/CircuitElementToolFactory.js?bust=1557957750764:324:9)
at new IntroScreenView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/view/IntroScreenView.js?bust=1557957750764:41:35)
at IntroScreen.createView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/circuit-construction-kit-dc/js/intro/IntroScreen.js?bust=1557957750764:64:18)
at IntroScreen.initializeView (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Screen.js?bust=1557957750764:261:25)
at Array.<anonymous> (https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:803:18)
at https://bayes.colorado.edu/continuous-testing/snapshot-1557949098039/joist/js/Sim.js?bust=1557957750764:811:27
id: Bayes Chrome
Approximately 5/15/2019, 1:38:18 PM
``` | non_priority | ct numberzigzags must be an integer circuit construction kit dc xss fuzz load query brand phet ea fuzz stringtest xss memorylimit uncaught error assertion failed numberzigzags must be an integer error assertion failed numberzigzags must be an integer at window assertions assertfunction at shape zigzagtopoint at new fusenode at circuitelementtoolfactory createfusetoolnode at new introscreenview at introscreen createview at introscreen initializeview at array at id bayes chrome approximately pm circuit construction kit dc xss fuzz run query brand phet ea fuzz stringtest xss memorylimit uncaught error assertion failed numberzigzags must be an integer error assertion failed numberzigzags must be an integer at window assertions assertfunction at shape zigzagtopoint at new fusenode at circuitelementtoolfactory createfusetoolnode at new introscreenview at introscreen createview at introscreen initializeview at array at id bayes chrome approximately pm | 0 |
1,789 | 6,575,881,306 | IssuesEvent | 2017-09-11 17:41:33 | ansible/ansible-modules-core | https://api.github.com/repos/ansible/ansible-modules-core | closed | ec2 badly handles non ec2 instance related limits | affects_2.1 aws bug_report cloud waiting_on_maintainer | <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
cloud/amazon/ec2.py
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
Using:
ansible 2.1.0.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
But, I have checked devel branch in this repo and the issue seems still not handled
```
##### CONFIGURATION
<!---
Mention any settings you have changed/added/removed in ansible.cfg
(or using the ANSIBLE_* environment variables).
-->
##### OS / ENVIRONMENT
<!---
Mention the OS you are running Ansible from, and the OS you are
managing, or say “N/A” for anything that is not platform-specific.
-->
GNU/Linux (Ubuntu 14.04.x and 16.04.x) 64-bit.
##### SUMMARY
<!--- Explain the problem briefly -->
`ec2.py` module does not report well failures coming from **non instance** related **limits**
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
Just try creating `N` instances with `M` EBS volumes each, making sure to exceed volume related limit.
<!--- Paste example playbooks or commands between quotes below -->
```
- name: create nodes
local_action:
module: ec2
params, args
count: <positive int>
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
At the least I need the invocation to fail with meaningful error:
1. fail the module invocation, and print error message explaining which limit was actually exceeded.
In a perfect world, I would also expect playbook to gracefully fail with "rollback":
1. destroy already created nodes
2. print failure message as explained before
3. fail the task
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
```
20:56:39 TASK [my-aws-bootstrap : create cluster nodes] *******************************
20:56:39 task path: /var/lib/jenkins/jobs/lab-start/workspace/mypipeline/playbooks/roles/aws-bootstrap/tasks/main.yml:32
20:56:39 <localhost> ESTABLISH LOCAL CONNECTION FOR USER: jenkins
20:56:39 <localhost> EXEC /bin/sh -c 'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 python && sleep 0'
21:16:42 fatal: [localhost]: FAILED! => {"changed": false, "failed": true, "invocation": {"module_args": {"assign_public_ip": false, "aws_access_key": null, "aws_secret_key": null, "count": 16, "count_tag": null, "ebs_optimized": false, "ec2_url": null, "exact_count": null, "group": ["group1"], "group_id": null, "id": null, "image": "ami-xxxxxxxx", "instance_ids": null, "instance_profile_name": null, "instance_tags": {"Environment": "pipeline-lab", "Name": "node", "lab_id": "lab-100", "user": "user1"}, "instance_type": "i2.8xlarge", "kernel": null, "key_name": "userkey", "monitoring": true, "network_interfaces": null, "placement_group": null, "private_ip": null, "profile": null, "ramdisk": null, "region": "us-east-1", "security_token": null, "source_dest_check": true, "spot_launch_group": null, "spot_price": null, "spot_type": "one-time", "spot_wait_timeout": "600", "state": "present", "tenancy": "default", "termination_protection": false, "user_data": null, "validate_certs": true, "volumes": [{"delete_on_termination": true, "device_name": "/dev/xvda", "volume_size": 1000}, {"delete_on_termination": true, "device_name": "/dev/xvdl", "device_type": "gp2", "volume_size": "100"}, {"delete_on_termination": true, "device_name": "/dev/xvdm", "device_type": "gp2", "volume_size": "100"}, {"delete_on_termination": true, "device_name": "/dev/xvdn", "device_type": "gp2", "volume_size": "100"}, {"device_name": "/dev/xvdd", "ephemeral": "ephemeral0"}, {"device_name": "/dev/xvde", "ephemeral": "ephemeral1"}, {"device_name": "/dev/xvdf", "ephemeral": "ephemeral2"}, {"device_name": "/dev/xvdg", "ephemeral": "ephemeral3"}, {"device_name": "/dev/xvdh", "ephemeral": "ephemeral4"}, {"device_name": "/dev/xvdi", "ephemeral": "ephemeral5"}, {"device_name": "/dev/xvdj", "ephemeral": "ephemeral6"}, {"device_name": "/dev/xvdk", "ephemeral": "ephemeral7"}], "vpc_subnet_id": "subnet-xxxxxx", "wait": true, "wait_timeout": "1200", "zone": "us-east-1a"}, "module_name": "ec2"}, "msg": "wait for instances running timeout on Mon Sep 26 21:16:42 2016"}
```
<!--- Paste verbatim command output between quotes below -->
```
```
| True | ec2 badly handles non ec2 instance related limits - <!--- Verify first that your issue/request is not already reported in GitHub -->
##### ISSUE TYPE
<!--- Pick one below and delete the rest: -->
- Bug Report
##### COMPONENT NAME
<!--- Name of the plugin/module/task -->
cloud/amazon/ec2.py
##### ANSIBLE VERSION
<!--- Paste verbatim output from “ansible --version” between quotes below -->
```
Using:
ansible 2.1.0.0
config file = /etc/ansible/ansible.cfg
configured module search path = Default w/o overrides
But, I have checked devel branch in this repo and the issue seems still not handled
```
##### CONFIGURATION
<!---
Mention any settings you have changed/added/removed in ansible.cfg
(or using the ANSIBLE_* environment variables).
-->
##### OS / ENVIRONMENT
<!---
Mention the OS you are running Ansible from, and the OS you are
managing, or say “N/A” for anything that is not platform-specific.
-->
GNU/Linux (Ubuntu 14.04.x and 16.04.x) 64-bit.
##### SUMMARY
<!--- Explain the problem briefly -->
`ec2.py` module does not report well failures coming from **non instance** related **limits**
##### STEPS TO REPRODUCE
<!---
For bugs, show exactly how to reproduce the problem.
For new features, show how the feature would be used.
-->
Just try creating `N` instances with `M` EBS volumes each, making sure to exceed volume related limit.
<!--- Paste example playbooks or commands between quotes below -->
```
- name: create nodes
local_action:
module: ec2
params, args
count: <positive int>
```
<!--- You can also paste gist.github.com links for larger files -->
##### EXPECTED RESULTS
<!--- What did you expect to happen when running the steps above? -->
At the least I need the invocation to fail with meaningful error:
1. fail the module invocation, and print error message explaining which limit was actually exceeded.
In a perfect world, I would also expect playbook to gracefully fail with "rollback":
1. destroy already created nodes
2. print failure message as explained before
3. fail the task
##### ACTUAL RESULTS
<!--- What actually happened? If possible run with extra verbosity (-vvvv) -->
```
20:56:39 TASK [my-aws-bootstrap : create cluster nodes] *******************************
20:56:39 task path: /var/lib/jenkins/jobs/lab-start/workspace/mypipeline/playbooks/roles/aws-bootstrap/tasks/main.yml:32
20:56:39 <localhost> ESTABLISH LOCAL CONNECTION FOR USER: jenkins
20:56:39 <localhost> EXEC /bin/sh -c 'LANG=en_US.UTF-8 LC_ALL=en_US.UTF-8 LC_MESSAGES=en_US.UTF-8 python && sleep 0'
21:16:42 fatal: [localhost]: FAILED! => {"changed": false, "failed": true, "invocation": {"module_args": {"assign_public_ip": false, "aws_access_key": null, "aws_secret_key": null, "count": 16, "count_tag": null, "ebs_optimized": false, "ec2_url": null, "exact_count": null, "group": ["group1"], "group_id": null, "id": null, "image": "ami-xxxxxxxx", "instance_ids": null, "instance_profile_name": null, "instance_tags": {"Environment": "pipeline-lab", "Name": "node", "lab_id": "lab-100", "user": "user1"}, "instance_type": "i2.8xlarge", "kernel": null, "key_name": "userkey", "monitoring": true, "network_interfaces": null, "placement_group": null, "private_ip": null, "profile": null, "ramdisk": null, "region": "us-east-1", "security_token": null, "source_dest_check": true, "spot_launch_group": null, "spot_price": null, "spot_type": "one-time", "spot_wait_timeout": "600", "state": "present", "tenancy": "default", "termination_protection": false, "user_data": null, "validate_certs": true, "volumes": [{"delete_on_termination": true, "device_name": "/dev/xvda", "volume_size": 1000}, {"delete_on_termination": true, "device_name": "/dev/xvdl", "device_type": "gp2", "volume_size": "100"}, {"delete_on_termination": true, "device_name": "/dev/xvdm", "device_type": "gp2", "volume_size": "100"}, {"delete_on_termination": true, "device_name": "/dev/xvdn", "device_type": "gp2", "volume_size": "100"}, {"device_name": "/dev/xvdd", "ephemeral": "ephemeral0"}, {"device_name": "/dev/xvde", "ephemeral": "ephemeral1"}, {"device_name": "/dev/xvdf", "ephemeral": "ephemeral2"}, {"device_name": "/dev/xvdg", "ephemeral": "ephemeral3"}, {"device_name": "/dev/xvdh", "ephemeral": "ephemeral4"}, {"device_name": "/dev/xvdi", "ephemeral": "ephemeral5"}, {"device_name": "/dev/xvdj", "ephemeral": "ephemeral6"}, {"device_name": "/dev/xvdk", "ephemeral": "ephemeral7"}], "vpc_subnet_id": "subnet-xxxxxx", "wait": true, "wait_timeout": "1200", "zone": "us-east-1a"}, "module_name": "ec2"}, "msg": "wait for instances running timeout on Mon Sep 26 21:16:42 2016"}
```
<!--- Paste verbatim command output between quotes below -->
```
```
| non_priority | badly handles non instance related limits issue type bug report component name cloud amazon py ansible version using ansible config file etc ansible ansible cfg configured module search path default w o overrides but i have checked devel branch in this repo and the issue seems still not handled configuration mention any settings you have changed added removed in ansible cfg or using the ansible environment variables os environment mention the os you are running ansible from and the os you are managing or say “n a” for anything that is not platform specific gnu linux ubuntu x and x bit summary py module does not report well failures coming from non instance related limits steps to reproduce for bugs show exactly how to reproduce the problem for new features show how the feature would be used just try creating n instances with m ebs volumes each making sure to exceed volume related limit name create nodes local action module params args count expected results at the least i need the invocation to fail with meaningful error fail the module invocation and print error message explaining which limit was actually exceeded in a perfect world i would also expect playbook to gracefully fail with rollback destroy already created nodes print failure message as explained before fail the task actual results task task path var lib jenkins jobs lab start workspace mypipeline playbooks roles aws bootstrap tasks main yml establish local connection for user jenkins exec bin sh c lang en us utf lc all en us utf lc messages en us utf python sleep fatal failed changed false failed true invocation module args assign public ip false aws access key null aws secret key null count count tag null ebs optimized false url null exact count null group group id null id null image ami xxxxxxxx instance ids null instance profile name null instance tags environment pipeline lab name node lab id lab user instance type kernel null key name userkey monitoring true network interfaces null placement group null private ip null profile null ramdisk null region us east security token null source dest check true spot launch group null spot price null spot type one time spot wait timeout state present tenancy default termination protection false user data null validate certs true volumes vpc subnet id subnet xxxxxx wait true wait timeout zone us east module name msg wait for instances running timeout on mon sep | 0 |
128,856 | 10,553,864,943 | IssuesEvent | 2019-10-03 18:10:31 | moby/moby | https://api.github.com/repos/moby/moby | closed | Flaky test: Integration test failure on Windows: TestJSONFileLoggerWithOpts | area/testing status/failing-ci | https://jenkins.dockerproject.org/job/Docker-PRs-WoW-RS1/20312/console
```
21:04:56 --- FAIL: TestJSONFileLoggerWithOpts (0.00s) 21:04:56 jsonfilelog_test.go:170: open C:\Users\ContainerAdministrator\AppData\Local\Temp\docker-logger-216840801\container.log.1: The process cannot access the file because it is being used by another process. 21:04:56 FAIL
```
cc @johnstep | 1.0 | Flaky test: Integration test failure on Windows: TestJSONFileLoggerWithOpts - https://jenkins.dockerproject.org/job/Docker-PRs-WoW-RS1/20312/console
```
21:04:56 --- FAIL: TestJSONFileLoggerWithOpts (0.00s) 21:04:56 jsonfilelog_test.go:170: open C:\Users\ContainerAdministrator\AppData\Local\Temp\docker-logger-216840801\container.log.1: The process cannot access the file because it is being used by another process. 21:04:56 FAIL
```
cc @johnstep | non_priority | flaky test integration test failure on windows testjsonfileloggerwithopts fail testjsonfileloggerwithopts jsonfilelog test go open c users containeradministrator appdata local temp docker logger container log the process cannot access the file because it is being used by another process fail cc johnstep | 0 |
12,007 | 14,738,270,940 | IssuesEvent | 2021-01-07 04:16:30 | kdjstudios/SABillingGitlab | https://api.github.com/repos/kdjstudios/SABillingGitlab | closed | Site Cycles - Invoice number link to report | anc-ops anc-report anp-1 ant-enhancement grt-ui processes has attachment | In GitLab by @kdjstudios on May 12, 2018, 10:46
I believe there may be an issue with how we handle the linking to reports on the Site cycle page.
When clicking on the number of invoices it goes to the Invoices by site reprot. This report however is only good for finalized invoices as you can see from the screen shot below the metrics/results do not match that when we run the Review draft invoices report:

[draft_invoice_detail____2_.pdf](/uploads/8cfe63fa1e23c4d7ddcf6520edeea501/draft_invoice_detail____2_.pdf)
Possible solution?
- We should have only completed/finalized cycle link to the invoices by site report, while non finalized should link to the draft invoices, and set the date range and site accordingly. | 1.0 | Site Cycles - Invoice number link to report - In GitLab by @kdjstudios on May 12, 2018, 10:46
I believe there may be an issue with how we handle the linking to reports on the Site cycle page.
When clicking on the number of invoices it goes to the Invoices by site reprot. This report however is only good for finalized invoices as you can see from the screen shot below the metrics/results do not match that when we run the Review draft invoices report:

[draft_invoice_detail____2_.pdf](/uploads/8cfe63fa1e23c4d7ddcf6520edeea501/draft_invoice_detail____2_.pdf)
Possible solution?
- We should have only completed/finalized cycle link to the invoices by site report, while non finalized should link to the draft invoices, and set the date range and site accordingly. | non_priority | site cycles invoice number link to report in gitlab by kdjstudios on may i believe there may be an issue with how we handle the linking to reports on the site cycle page when clicking on the number of invoices it goes to the invoices by site reprot this report however is only good for finalized invoices as you can see from the screen shot below the metrics results do not match that when we run the review draft invoices report uploads image png uploads draft invoice detail pdf possible solution we should have only completed finalized cycle link to the invoices by site report while non finalized should link to the draft invoices and set the date range and site accordingly | 0 |
399,710 | 27,252,070,609 | IssuesEvent | 2023-02-22 08:54:12 | Mastercard/flow | https://api.github.com/repos/Mastercard/flow | opened | Documented property correctness | documentation java | There's at least one system property in the documentation that is incorrect.
Let's add a test that scans the docs for things that look like system properties and ensures that they are accurate. | 1.0 | Documented property correctness - There's at least one system property in the documentation that is incorrect.
Let's add a test that scans the docs for things that look like system properties and ensures that they are accurate. | non_priority | documented property correctness there s at least one system property in the documentation that is incorrect let s add a test that scans the docs for things that look like system properties and ensures that they are accurate | 0 |
12,576 | 7,962,898,248 | IssuesEvent | 2018-07-13 15:40:08 | atilaneves/dpp | https://api.github.com/repos/atilaneves/dpp | closed | dub support | enhancement help wanted usability | I started adding `dub` support, see https://github.com/John-Colvin/dub/tree/dpp_support and read the stuff at the top of the README. All it does is just treat `.dpp` files the same as `.d` if `dppSupport true` is set in `dub.sdl`, but that should be enough to make it work.
Missing pieces in `dpp` to make it work (bringing it closer to a drop-in for the compilers it’s wrapping) are:
1) support for 0 and >1 `dpp` files. Per file: output placed in same location as input (so imports work). 0 is needed so dub can get build information
2) support for `.rsp` files like dmd. | True | dub support - I started adding `dub` support, see https://github.com/John-Colvin/dub/tree/dpp_support and read the stuff at the top of the README. All it does is just treat `.dpp` files the same as `.d` if `dppSupport true` is set in `dub.sdl`, but that should be enough to make it work.
Missing pieces in `dpp` to make it work (bringing it closer to a drop-in for the compilers it’s wrapping) are:
1) support for 0 and >1 `dpp` files. Per file: output placed in same location as input (so imports work). 0 is needed so dub can get build information
2) support for `.rsp` files like dmd. | non_priority | dub support i started adding dub support see and read the stuff at the top of the readme all it does is just treat dpp files the same as d if dppsupport true is set in dub sdl but that should be enough to make it work missing pieces in dpp to make it work bringing it closer to a drop in for the compilers it’s wrapping are support for and dpp files per file output placed in same location as input so imports work is needed so dub can get build information support for rsp files like dmd | 0 |
345,480 | 30,816,802,997 | IssuesEvent | 2023-08-01 13:57:11 | metabase/metabase | https://api.github.com/repos/metabase/metabase | closed | Look into Flaky Actions-on-Dashboards e2e Tests | .CI & Tests flaky-test-fix .Team/PixelPolice :police_officer: | These tests seem to be flaking pretty frequently

| 2.0 | Look into Flaky Actions-on-Dashboards e2e Tests - These tests seem to be flaking pretty frequently

| non_priority | look into flaky actions on dashboards tests these tests seem to be flaking pretty frequently | 0 |
148,836 | 11,867,680,636 | IssuesEvent | 2020-03-26 07:36:28 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | closed | SnapshotIT testCreateSnapshot failure in 6.8 CI | :Distributed/Snapshot/Restore >test-failure | The SnapshotIT#testCreateSnapshot test failed in 6.8 CI due to a missing snapshot. I was unable to reproduce the issue locally.
Reproduce line:
```
./gradlew ':client:rest-high-level:integTestRunner' \
-Dtests.seed=7C237C053DB8334 \
-Dtests.class=org.elasticsearch.client.SnapshotIT \
-Dtests.method="testCreateSnapshot" \
-Dtests.security.manager=true \
-Dtests.locale=en-KY \
-Dtests.timezone=Etc/GMT-11 \
-Dcompiler.java=12 \
-Druntime.java=12
```
```
ElasticsearchStatusException[Elasticsearch exception [type=snapshot_missing_exception, reason=[test_repository:test_snapshot] is missing]]
at __randomizedtesting.SeedInfo.seed([7C237C053DB8334:F9A4A2736568712D]:0)
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:177)
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:2053)
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:2030)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1777)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1734)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1696)
at org.elasticsearch.client.SnapshotClient.delete(SnapshotClient.java:296)
at org.elasticsearch.client.ESRestHighLevelClientTestCase.execute(ESRestHighLevelClientTestCase.java:88)
at org.elasticsearch.client.ESRestHighLevelClientTestCase.execute(ESRestHighLevelClientTestCase.java:79)
at org.elasticsearch.client.SnapshotIT.testCreateSnapshot(SnapshotIT.java:149)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:835)
Suppressed: org.elasticsearch.client.ResponseException: method [DELETE], host [http://[::1]:40681], URI [/_snapshot/test_repository/test_snapshot?master_timeout=30s], status line [HTTP/1.1 404 Not Found]
{"error":{"root_cause":[{"type":"snapshot_missing_exception","reason":"[test_repository:test_snapshot] is missing"}],"type":"snapshot_missing_exception","reason":"[test_repository:test_snapshot] is missing"},"status":404}
at org.elasticsearch.client.RestClient$SyncResponseListener.get(RestClient.java:936)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:233)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1764)
```
Build scan: https://gradle-enterprise.elastic.co/s/u3csmaokg7iak | 1.0 | SnapshotIT testCreateSnapshot failure in 6.8 CI - The SnapshotIT#testCreateSnapshot test failed in 6.8 CI due to a missing snapshot. I was unable to reproduce the issue locally.
Reproduce line:
```
./gradlew ':client:rest-high-level:integTestRunner' \
-Dtests.seed=7C237C053DB8334 \
-Dtests.class=org.elasticsearch.client.SnapshotIT \
-Dtests.method="testCreateSnapshot" \
-Dtests.security.manager=true \
-Dtests.locale=en-KY \
-Dtests.timezone=Etc/GMT-11 \
-Dcompiler.java=12 \
-Druntime.java=12
```
```
ElasticsearchStatusException[Elasticsearch exception [type=snapshot_missing_exception, reason=[test_repository:test_snapshot] is missing]]
at __randomizedtesting.SeedInfo.seed([7C237C053DB8334:F9A4A2736568712D]:0)
at org.elasticsearch.rest.BytesRestResponse.errorFromXContent(BytesRestResponse.java:177)
at org.elasticsearch.client.RestHighLevelClient.parseEntity(RestHighLevelClient.java:2053)
at org.elasticsearch.client.RestHighLevelClient.parseResponseException(RestHighLevelClient.java:2030)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1777)
at org.elasticsearch.client.RestHighLevelClient.performRequest(RestHighLevelClient.java:1734)
at org.elasticsearch.client.RestHighLevelClient.performRequestAndParseEntity(RestHighLevelClient.java:1696)
at org.elasticsearch.client.SnapshotClient.delete(SnapshotClient.java:296)
at org.elasticsearch.client.ESRestHighLevelClientTestCase.execute(ESRestHighLevelClientTestCase.java:88)
at org.elasticsearch.client.ESRestHighLevelClientTestCase.execute(ESRestHighLevelClientTestCase.java:79)
at org.elasticsearch.client.SnapshotIT.testCreateSnapshot(SnapshotIT.java:149)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.base/java.lang.reflect.Method.invoke(Method.java:567)
at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750)
at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938)
at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974)
at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:49)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at org.apache.lucene.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:48)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468)
at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947)
at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832)
at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883)
at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894)
at org.apache.lucene.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:45)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:41)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at org.apache.lucene.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53)
at org.apache.lucene.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:47)
at org.apache.lucene.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:64)
at org.apache.lucene.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:54)
at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36)
at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368)
at java.base/java.lang.Thread.run(Thread.java:835)
Suppressed: org.elasticsearch.client.ResponseException: method [DELETE], host [http://[::1]:40681], URI [/_snapshot/test_repository/test_snapshot?master_timeout=30s], status line [HTTP/1.1 404 Not Found]
{"error":{"root_cause":[{"type":"snapshot_missing_exception","reason":"[test_repository:test_snapshot] is missing"}],"type":"snapshot_missing_exception","reason":"[test_repository:test_snapshot] is missing"},"status":404}
at org.elasticsearch.client.RestClient$SyncResponseListener.get(RestClient.java:936)
at org.elasticsearch.client.RestClient.performRequest(RestClient.java:233)
at org.elasticsearch.client.RestHighLevelClient.internalPerformRequest(RestHighLevelClient.java:1764)
```
Build scan: https://gradle-enterprise.elastic.co/s/u3csmaokg7iak | non_priority | snapshotit testcreatesnapshot failure in ci the snapshotit testcreatesnapshot test failed in ci due to a missing snapshot i was unable to reproduce the issue locally reproduce line gradlew client rest high level integtestrunner dtests seed dtests class org elasticsearch client snapshotit dtests method testcreatesnapshot dtests security manager true dtests locale en ky dtests timezone etc gmt dcompiler java druntime java elasticsearchstatusexception is missing at randomizedtesting seedinfo seed at org elasticsearch rest bytesrestresponse errorfromxcontent bytesrestresponse java at org elasticsearch client resthighlevelclient parseentity resthighlevelclient java at org elasticsearch client resthighlevelclient parseresponseexception resthighlevelclient java at org elasticsearch client resthighlevelclient internalperformrequest resthighlevelclient java at org elasticsearch client resthighlevelclient performrequest resthighlevelclient java at org elasticsearch client resthighlevelclient performrequestandparseentity resthighlevelclient java at org elasticsearch client snapshotclient delete snapshotclient java at org elasticsearch client esresthighlevelclienttestcase execute esresthighlevelclienttestcase java at org elasticsearch client esresthighlevelclienttestcase execute esresthighlevelclienttestcase java at org elasticsearch client snapshotit testcreatesnapshot snapshotit java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene util abstractbeforeafterrule evaluate abstractbeforeafterrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testrulestoreclassname evaluate testrulestoreclassname java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene util testruleassertionsrequired evaluate testruleassertionsrequired java at org apache lucene util testrulemarkfailure evaluate testrulemarkfailure java at org apache lucene util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene util testruleignoretestsuites evaluate testruleignoretestsuites java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run threadleakcontrol java at java base java lang thread run thread java suppressed org elasticsearch client responseexception method host uri status line error root cause is missing type snapshot missing exception reason is missing status at org elasticsearch client restclient syncresponselistener get restclient java at org elasticsearch client restclient performrequest restclient java at org elasticsearch client resthighlevelclient internalperformrequest resthighlevelclient java build scan | 0 |
44,424 | 5,823,900,453 | IssuesEvent | 2017-05-07 07:21:09 | Microsoft/visualfsharp | https://api.github.com/repos/Microsoft/visualfsharp | closed | Inefficient `string` function implementation | Area-Library Resolution-By Design | This line of code
```fsharp
let s = string 22
```
compiles to this:
```csharp
object obj = 22;
if (obj != null)
{
IFormattable formattable = obj as IFormattable;
if (formattable != null)
{
IFormattable formattable2 = formattable;
formattable2.ToString(null, (IFormatProvider)CultureInfo.InvariantCulture);
}
else
{
object obj2 = obj;
obj2.ToString();
}
}
```
1. Why check for `null` after boxing?
2. Why we need to ask for `IFormatProvider` for ints?
3. Why we need to format it with `InvariantCulture`?
4. Why intermediate `obj2` variable?
As I understand the function is defined here https://github.com/Microsoft/visualfsharp/blob/master/src/fsharp/FSharp.Core/prim-types.fs#L4485
```fsharp
[<CompiledName("ToString")>]
let inline string (x: ^T) =
anyToString "" x
// since we have static optimization conditionals for ints below, we need to special-case Enums.
// This way we'll print their symbolic value, as opposed to their integral one (Eg., "A", rather than "1")
when ^T struct = anyToString "" x
when ^T : float = (# "" x : float #).ToString("g",CultureInfo.InvariantCulture)
when ^T : float32 = (# "" x : float32 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int64 = (# "" x : int64 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int32 = (# "" x : int32 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int16 = (# "" x : int16 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : nativeint = (# "" x : nativeint #).ToString()
when ^T : sbyte = (# "" x : sbyte #).ToString("g",CultureInfo.InvariantCulture)
when ^T : uint64 = (# "" x : uint64 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : uint32 = (# "" x : uint32 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int16 = (# "" x : int16 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : unativeint = (# "" x : unativeint #).ToString()
when ^T : byte = (# "" x : byte #).ToString("g",CultureInfo.InvariantCulture)
``` | 1.0 | Inefficient `string` function implementation - This line of code
```fsharp
let s = string 22
```
compiles to this:
```csharp
object obj = 22;
if (obj != null)
{
IFormattable formattable = obj as IFormattable;
if (formattable != null)
{
IFormattable formattable2 = formattable;
formattable2.ToString(null, (IFormatProvider)CultureInfo.InvariantCulture);
}
else
{
object obj2 = obj;
obj2.ToString();
}
}
```
1. Why check for `null` after boxing?
2. Why we need to ask for `IFormatProvider` for ints?
3. Why we need to format it with `InvariantCulture`?
4. Why intermediate `obj2` variable?
As I understand the function is defined here https://github.com/Microsoft/visualfsharp/blob/master/src/fsharp/FSharp.Core/prim-types.fs#L4485
```fsharp
[<CompiledName("ToString")>]
let inline string (x: ^T) =
anyToString "" x
// since we have static optimization conditionals for ints below, we need to special-case Enums.
// This way we'll print their symbolic value, as opposed to their integral one (Eg., "A", rather than "1")
when ^T struct = anyToString "" x
when ^T : float = (# "" x : float #).ToString("g",CultureInfo.InvariantCulture)
when ^T : float32 = (# "" x : float32 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int64 = (# "" x : int64 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int32 = (# "" x : int32 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int16 = (# "" x : int16 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : nativeint = (# "" x : nativeint #).ToString()
when ^T : sbyte = (# "" x : sbyte #).ToString("g",CultureInfo.InvariantCulture)
when ^T : uint64 = (# "" x : uint64 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : uint32 = (# "" x : uint32 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : int16 = (# "" x : int16 #).ToString("g",CultureInfo.InvariantCulture)
when ^T : unativeint = (# "" x : unativeint #).ToString()
when ^T : byte = (# "" x : byte #).ToString("g",CultureInfo.InvariantCulture)
``` | non_priority | inefficient string function implementation this line of code fsharp let s string compiles to this csharp object obj if obj null iformattable formattable obj as iformattable if formattable null iformattable formattable tostring null iformatprovider cultureinfo invariantculture else object obj tostring why check for null after boxing why we need to ask for iformatprovider for ints why we need to format it with invariantculture why intermediate variable as i understand the function is defined here fsharp let inline string x t anytostring x since we have static optimization conditionals for ints below we need to special case enums this way we ll print their symbolic value as opposed to their integral one eg a rather than when t struct anytostring x when t float x float tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t nativeint x nativeint tostring when t sbyte x sbyte tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t x tostring g cultureinfo invariantculture when t unativeint x unativeint tostring when t byte x byte tostring g cultureinfo invariantculture | 0 |
1,155 | 13,434,913,744 | IssuesEvent | 2020-09-07 12:08:00 | akka/akka | https://api.github.com/repos/akka/akka | closed | Chunked messages in reliable delivery | 1 - triaged t:reliable-delivery t:typed | To avoid head of line blocking from serialization and transfer of large messages | True | Chunked messages in reliable delivery - To avoid head of line blocking from serialization and transfer of large messages | non_priority | chunked messages in reliable delivery to avoid head of line blocking from serialization and transfer of large messages | 0 |
187,978 | 15,112,998,899 | IssuesEvent | 2021-02-08 22:48:44 | openmsupply/application-manager-web-app | https://api.github.com/repos/openmsupply/application-manager-web-app | closed | Diagram for Review status changes | Effort: 2 Epic #20: Review of applications Sprint18 Type: Documentation | Make similar Diagram as in #269 for the Review creation and status update workflow | 1.0 | Diagram for Review status changes - Make similar Diagram as in #269 for the Review creation and status update workflow | non_priority | diagram for review status changes make similar diagram as in for the review creation and status update workflow | 0 |
142,855 | 19,103,360,677 | IssuesEvent | 2021-11-30 02:33:19 | farooqmir/React-Redux-Demonstration-with-api | https://api.github.com/repos/farooqmir/React-Redux-Demonstration-with-api | opened | WS-2020-0443 (Medium) detected in socket.io-2.1.1.tgz | security vulnerability | ## WS-2020-0443 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p></summary>
<p>node.js realtime framework server</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p>
<p>Path to dependency file: /React-Redux-Demonstration-with-api/package.json</p>
<p>Path to vulnerable library: React-Redux-Demonstration-with-api/node_modules/socket.io/package.json</p>
<p>
Dependency Hierarchy:
- browser-sync-2.26.5.tgz (Root Library)
- :x: **socket.io-2.1.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In socket.io in versions 1.0.0 to 2.3.0 is vulnerable to Cross-Site Websocket Hijacking, it allows an attacker to bypass origin protection using special symbols include "`" and "$".
<p>Publish Date: 2020-02-20
<p>URL: <a href=https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7>WS-2020-0443</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/931197">https://hackerone.com/reports/931197</a></p>
<p>Release Date: 2020-02-20</p>
<p>Fix Resolution: socket.io - 2.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2020-0443 (Medium) detected in socket.io-2.1.1.tgz - ## WS-2020-0443 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>socket.io-2.1.1.tgz</b></p></summary>
<p>node.js realtime framework server</p>
<p>Library home page: <a href="https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz">https://registry.npmjs.org/socket.io/-/socket.io-2.1.1.tgz</a></p>
<p>Path to dependency file: /React-Redux-Demonstration-with-api/package.json</p>
<p>Path to vulnerable library: React-Redux-Demonstration-with-api/node_modules/socket.io/package.json</p>
<p>
Dependency Hierarchy:
- browser-sync-2.26.5.tgz (Root Library)
- :x: **socket.io-2.1.1.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In socket.io in versions 1.0.0 to 2.3.0 is vulnerable to Cross-Site Websocket Hijacking, it allows an attacker to bypass origin protection using special symbols include "`" and "$".
<p>Publish Date: 2020-02-20
<p>URL: <a href=https://github.com/socketio/socket.io/commit/f78a575f66ab693c3ea96ea88429ddb1a44c86c7>WS-2020-0443</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://hackerone.com/reports/931197">https://hackerone.com/reports/931197</a></p>
<p>Release Date: 2020-02-20</p>
<p>Fix Resolution: socket.io - 2.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws medium detected in socket io tgz ws medium severity vulnerability vulnerable library socket io tgz node js realtime framework server library home page a href path to dependency file react redux demonstration with api package json path to vulnerable library react redux demonstration with api node modules socket io package json dependency hierarchy browser sync tgz root library x socket io tgz vulnerable library vulnerability details in socket io in versions to is vulnerable to cross site websocket hijacking it allows an attacker to bypass origin protection using special symbols include and publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution socket io step up your open source security game with whitesource | 0 |
120,352 | 12,067,501,861 | IssuesEvent | 2020-04-16 13:29:27 | 30-seconds/integration-tools | https://api.github.com/repos/30-seconds/integration-tools | opened | Integration Tools v2.0 | documentation enhancement | As many changes have been rolled out in other repos, a `v2.0` is in order. Changes will include:
- Deprecation of some of the `.github` templates as they just slow down procedures.
- Deprecation of the archive parsing system.
- Possible combination of the standard snippet parser with the 30code one, cleanup in relevant repos.
- Deprecation of the short snippet list JSON output files. | 1.0 | Integration Tools v2.0 - As many changes have been rolled out in other repos, a `v2.0` is in order. Changes will include:
- Deprecation of some of the `.github` templates as they just slow down procedures.
- Deprecation of the archive parsing system.
- Possible combination of the standard snippet parser with the 30code one, cleanup in relevant repos.
- Deprecation of the short snippet list JSON output files. | non_priority | integration tools as many changes have been rolled out in other repos a is in order changes will include deprecation of some of the github templates as they just slow down procedures deprecation of the archive parsing system possible combination of the standard snippet parser with the one cleanup in relevant repos deprecation of the short snippet list json output files | 0 |
777 | 10,476,349,002 | IssuesEvent | 2019-09-23 18:23:36 | microsoft/BotFramework-DirectLineJS | https://api.github.com/repos/microsoft/BotFramework-DirectLineJS | opened | Happy path: OAuth with magic code | 0 Reliability 0 Streaming Extensions | 1. Start a conversation
1. Start an OAuth without enhanced token
Make sure:
- After OAuth completed, magic code should be presented
- After magic code is presented, make sure the bot can receive a working token and able to notify the client | True | Happy path: OAuth with magic code - 1. Start a conversation
1. Start an OAuth without enhanced token
Make sure:
- After OAuth completed, magic code should be presented
- After magic code is presented, make sure the bot can receive a working token and able to notify the client | non_priority | happy path oauth with magic code start a conversation start an oauth without enhanced token make sure after oauth completed magic code should be presented after magic code is presented make sure the bot can receive a working token and able to notify the client | 0 |
21,048 | 16,487,679,789 | IssuesEvent | 2021-05-24 20:38:50 | kubernetes/kubernetes | https://api.github.com/repos/kubernetes/kubernetes | closed | Change the users of IsQualifiedName to ValidateQualifiedName | good first issue help wanted needs-triage sig/usability | After #98277 got merged, we have added new helper function `ValidateQualifiedName` here: https://github.com/kubernetes/kubernetes/blob/master/pkg/apis/core/validation/validation.go#L126.
It will be good for us to use this simpler helper function to replace previous `IsQualifiedName` function.
/good-first-issue
| True | Change the users of IsQualifiedName to ValidateQualifiedName - After #98277 got merged, we have added new helper function `ValidateQualifiedName` here: https://github.com/kubernetes/kubernetes/blob/master/pkg/apis/core/validation/validation.go#L126.
It will be good for us to use this simpler helper function to replace previous `IsQualifiedName` function.
/good-first-issue
| non_priority | change the users of isqualifiedname to validatequalifiedname after got merged we have added new helper function validatequalifiedname here it will be good for us to use this simpler helper function to replace previous isqualifiedname function good first issue | 0 |
181,453 | 14,020,777,914 | IssuesEvent | 2020-10-29 20:12:38 | dask/distributed | https://api.github.com/repos/dask/distributed | opened | Flaky test_close_gracefully | flaky test | `test_close_gracefully` [failed on CI build](https://github.com/dask/distributed/runs/1324615732#step:8:116) over in https://github.com/dask/distributed/pull/4192 which contained seemly unrelated changes
<details>
<summary>Traceback:</summary>
```
2020-10-29T04:49:56.4843472Z ____________________________ test_close_gracefully ____________________________
2020-10-29T04:49:56.4843705Z
2020-10-29T04:49:56.4844237Z def test_func():
2020-10-29T04:49:56.4844564Z result = None
2020-10-29T04:49:56.4844855Z workers = []
2020-10-29T04:49:56.4845243Z with clean(timeout=active_rpc_timeout, **clean_kwargs) as loop:
2020-10-29T04:49:56.4845816Z
2020-10-29T04:49:56.4846289Z async def coro():
2020-10-29T04:49:56.4847778Z with dask.config.set(config):
2020-10-29T04:49:56.4848155Z s = False
2020-10-29T04:49:56.4848442Z for i in range(5):
2020-10-29T04:49:56.4848724Z try:
2020-10-29T04:49:56.4849002Z s, ws = await start_cluster(
2020-10-29T04:49:56.4849356Z nthreads,
2020-10-29T04:49:56.4849673Z scheduler,
2020-10-29T04:49:56.4849969Z loop,
2020-10-29T04:49:56.4850291Z security=security,
2020-10-29T04:49:56.4850651Z Worker=Worker,
2020-10-29T04:49:56.4851060Z scheduler_kwargs=scheduler_kwargs,
2020-10-29T04:49:56.4851514Z worker_kwargs=worker_kwargs,
2020-10-29T04:49:56.4851979Z )
2020-10-29T04:49:56.4852296Z except Exception as e:
2020-10-29T04:49:56.4852657Z logger.error(
2020-10-29T04:49:56.4853081Z "Failed to start gen_cluster, retrying",
2020-10-29T04:49:56.4853480Z exc_info=True,
2020-10-29T04:49:56.4853748Z )
2020-10-29T04:49:56.4854070Z await asyncio.sleep(1)
2020-10-29T04:49:56.4854418Z else:
2020-10-29T04:49:56.4854660Z workers[:] = ws
2020-10-29T04:49:56.4854972Z args = [s] + workers
2020-10-29T04:49:56.4855260Z break
2020-10-29T04:49:56.4855549Z if s is False:
2020-10-29T04:49:56.4856004Z raise Exception("Could not start cluster")
2020-10-29T04:49:56.4856561Z if client:
2020-10-29T04:49:56.4857046Z c = await Client(
2020-10-29T04:49:56.4857364Z s.address,
2020-10-29T04:49:56.4857618Z loop=loop,
2020-10-29T04:49:56.4857951Z security=security,
2020-10-29T04:49:56.4858330Z asynchronous=True,
2020-10-29T04:49:56.4858685Z **client_kwargs,
2020-10-29T04:49:56.4858962Z )
2020-10-29T04:49:56.4859226Z args = [c] + args
2020-10-29T04:49:56.4859494Z try:
2020-10-29T04:49:56.4859788Z future = func(*args)
2020-10-29T04:49:56.4860060Z if timeout:
2020-10-29T04:49:56.4860962Z future = asyncio.wait_for(future, timeout)
2020-10-29T04:49:56.4861619Z result = await future
2020-10-29T04:49:56.4861962Z if s.validate:
2020-10-29T04:49:56.4862307Z s.validate_state()
2020-10-29T04:49:56.4862624Z finally:
2020-10-29T04:49:56.4863017Z if client and c.status not in ("closing", "closed"):
2020-10-29T04:49:56.4863530Z await c._close(fast=s.status == Status.closed)
2020-10-29T04:49:56.4863951Z await end_cluster(s, workers)
2020-10-29T04:49:56.4864428Z await asyncio.wait_for(cleanup_global_workers(), 1)
2020-10-29T04:49:56.4864809Z
2020-10-29T04:49:56.4865045Z try:
2020-10-29T04:49:56.4865358Z c = await default_client()
2020-10-29T04:49:56.4865728Z except ValueError:
2020-10-29T04:49:56.4866108Z pass
2020-10-29T04:49:56.4866410Z else:
2020-10-29T04:49:56.4866738Z await c._close(fast=True)
2020-10-29T04:49:56.4867035Z
2020-10-29T04:49:56.4867315Z def get_unclosed():
2020-10-29T04:49:56.4867744Z return [c for c in Comm._instances if not c.closed()] + [
2020-10-29T04:49:56.4868130Z c
2020-10-29T04:49:56.4868495Z for c in _global_clients.values()
2020-10-29T04:49:56.4868899Z if c.status != "closed"
2020-10-29T04:49:56.4869190Z ]
2020-10-29T04:49:56.4869372Z
2020-10-29T04:49:56.4869608Z try:
2020-10-29T04:49:56.4869889Z start = time()
2020-10-29T04:49:56.4870205Z while time() < start + 5:
2020-10-29T04:49:56.4870901Z gc.collect()
2020-10-29T04:49:56.4871385Z if not get_unclosed():
2020-10-29T04:49:56.4871668Z break
2020-10-29T04:49:56.4871973Z await asyncio.sleep(0.05)
2020-10-29T04:49:56.4872313Z else:
2020-10-29T04:49:56.4872780Z if allow_unclosed:
2020-10-29T04:49:56.4873165Z print(f"Unclosed Comms: {get_unclosed()}")
2020-10-29T04:49:56.4873509Z else:
2020-10-29T04:49:56.4873910Z raise RuntimeError("Unclosed Comms", get_unclosed())
2020-10-29T04:49:56.4874317Z finally:
2020-10-29T04:49:56.4874658Z Comm._instances.clear()
2020-10-29T04:49:56.4875046Z _global_clients.clear()
2020-10-29T04:49:56.4875662Z
2020-10-29T04:49:56.4875972Z return result
2020-10-29T04:49:56.4876244Z
2020-10-29T04:49:56.4876538Z > result = loop.run_sync(
2020-10-29T04:49:56.4876976Z coro, timeout=timeout * 2 if timeout else timeout
2020-10-29T04:49:56.4877340Z )
2020-10-29T04:49:56.4877483Z
2020-10-29T04:49:56.4877807Z distributed\utils_test.py:953:
2020-10-29T04:49:56.4878148Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
2020-10-29T04:49:56.4879367Z C:\Miniconda3\envs\dask-distributed\lib\site-packages\tornado\ioloop.py:532: in run_sync
2020-10-29T04:49:56.4880323Z return future_cell[0].result()
2020-10-29T04:49:56.4880862Z distributed\utils_test.py:912: in coro
2020-10-29T04:49:56.4881322Z result = await future
2020-10-29T04:49:56.4881967Z C:\Miniconda3\envs\dask-distributed\lib\asyncio\tasks.py:491: in wait_for
2020-10-29T04:49:56.4882612Z return fut.result()
2020-10-29T04:49:56.4883043Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
2020-10-29T04:49:56.4883363Z
2020-10-29T04:49:56.4883772Z c = <Client: not connected>
2020-10-29T04:49:56.4884424Z s = <Scheduler: "tcp://127.0.0.1:56676" processes: 0 cores: 0>
2020-10-29T04:49:56.4885172Z a = <Worker: 'tcp://127.0.0.1:56677', 0, Status.closed, stored: 1, running: -1/1, ready: 0, comm: 0, waiting: 0>
2020-10-29T04:49:56.4886279Z b = <Worker: 'tcp://127.0.0.1:56679', 1, Status.closed, stored: 2, running: 2/2, ready: 96, comm: 0, waiting: 0>
2020-10-29T04:49:56.4886942Z
2020-10-29T04:49:56.4887385Z @gen_cluster(client=True)
2020-10-29T04:49:56.4887907Z async def test_close_gracefully(c, s, a, b):
2020-10-29T04:49:56.4888634Z futures = c.map(slowinc, range(200), delay=0.1)
2020-10-29T04:49:56.4889306Z while not b.data:
2020-10-29T04:49:56.4889773Z await asyncio.sleep(0.1)
2020-10-29T04:49:56.4890187Z
2020-10-29T04:49:56.4890509Z mem = set(b.data)
2020-10-29T04:49:56.4891067Z proc = [ts for ts in b.tasks.values() if ts.state == "executing"]
2020-10-29T04:49:56.4891584Z
2020-10-29T04:49:56.4892279Z await b.close_gracefully()
2020-10-29T04:49:56.4893608Z
2020-10-29T04:49:56.4894190Z assert b.status == Status.closed
2020-10-29T04:49:56.4894753Z assert b.address not in s.workers
2020-10-29T04:49:56.4895312Z assert mem.issubset(set(a.data))
2020-10-29T04:49:56.4895749Z for ts in proc:
2020-10-29T04:49:56.4896333Z > assert ts.state in ("processing", "memory")
2020-10-29T04:49:56.4897037Z E AssertionError: assert 'executing' in ('processing', 'memory')
2020-10-29T04:49:56.4898093Z E + where 'executing' = <Task 'slowinc-a373197c122b45e71acceb95cd7cb992' executing>.state
2020-10-29T04:49:56.4899687Z
2020-10-29T04:49:56.4900293Z distributed\tests\test_worker.py:1574: AssertionError
2020-10-29T04:49:56.4900923Z ---------------------------- Captured stderr call -----------------------------
2020-10-29T04:49:56.4901550Z distributed.scheduler - INFO - Clear task state
2020-10-29T04:49:56.4902359Z distributed.scheduler - INFO - Scheduler at: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4903196Z distributed.scheduler - INFO - dashboard at: 127.0.0.1:8787
2020-10-29T04:49:56.4904078Z distributed.worker - INFO - Start worker at: tcp://127.0.0.1:56677
2020-10-29T04:49:56.4905095Z distributed.worker - INFO - Listening to: tcp://127.0.0.1:56677
2020-10-29T04:49:56.4905807Z distributed.worker - INFO - dashboard at: 127.0.0.1:56678
2020-10-29T04:49:56.4907957Z distributed.worker - INFO - Waiting to connect to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4908728Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4909379Z distributed.worker - INFO - Threads: 1
2020-10-29T04:49:56.4910397Z distributed.worker - INFO - Memory: 7.52 GB
2020-10-29T04:49:56.4911480Z distributed.worker - INFO - Local Directory: D:\a\distributed\distributed\dask-worker-space\worker-b2i2kj7f
2020-10-29T04:49:56.4912410Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4913416Z distributed.worker - INFO - Start worker at: tcp://127.0.0.1:56679
2020-10-29T04:49:56.4914250Z distributed.worker - INFO - Listening to: tcp://127.0.0.1:56679
2020-10-29T04:49:56.4915004Z distributed.worker - INFO - dashboard at: 127.0.0.1:56680
2020-10-29T04:49:56.4915768Z distributed.worker - INFO - Waiting to connect to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4916927Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4917587Z distributed.worker - INFO - Threads: 2
2020-10-29T04:49:56.4918290Z distributed.worker - INFO - Memory: 7.52 GB
2020-10-29T04:49:56.4919199Z distributed.worker - INFO - Local Directory: D:\a\distributed\distributed\dask-worker-space\worker-xvpum_ry
2020-10-29T04:49:56.4920137Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4921130Z distributed.scheduler - INFO - Register worker <Worker 'tcp://127.0.0.1:56677', name: 0, memory: 0, processing: 0>
2020-10-29T04:49:56.4922485Z distributed.scheduler - INFO - Starting worker compute stream, tcp://127.0.0.1:56677
2020-10-29T04:49:56.4923428Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4924229Z distributed.worker - INFO - Registered to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4925860Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4927836Z distributed.scheduler - INFO - Register worker <Worker 'tcp://127.0.0.1:56679', name: 1, memory: 0, processing: 0>
2020-10-29T04:49:56.4928907Z distributed.scheduler - INFO - Starting worker compute stream, tcp://127.0.0.1:56679
2020-10-29T04:49:56.4929793Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4930737Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4931601Z distributed.worker - INFO - Registered to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4932466Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4933182Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4934569Z distributed.scheduler - INFO - Receive client connection: Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4936055Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4936921Z distributed.worker - INFO - Closing worker gracefully: tcp://127.0.0.1:56679
2020-10-29T04:49:56.4937692Z distributed.worker - INFO - Comm closed
2020-10-29T04:49:56.4938568Z distributed.scheduler - INFO - Retire workers {<Worker 'tcp://127.0.0.1:56679', name: 1, memory: 1, processing: 99>}
2020-10-29T04:49:56.4939523Z distributed.scheduler - INFO - Moving 1 keys to other workers
2020-10-29T04:49:56.4940339Z distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:56679
2020-10-29T04:49:56.4941303Z distributed.scheduler - INFO - Remove worker <Worker 'tcp://127.0.0.1:56679', name: 1, memory: 1, processing: 99>
2020-10-29T04:49:56.4942379Z distributed.core - INFO - Removing comms to tcp://127.0.0.1:56679
2020-10-29T04:49:56.4943384Z distributed.scheduler - INFO - Remove client Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4944621Z distributed.scheduler - INFO - Remove client Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4945929Z distributed.scheduler - INFO - Close client connection: Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4947067Z distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:56677
2020-10-29T04:49:56.4948022Z distributed.scheduler - INFO - Remove worker <Worker 'tcp://127.0.0.1:56677', name: 0, memory: 0, processing: 0>
2020-10-29T04:49:56.4948947Z distributed.core - INFO - Removing comms to tcp://127.0.0.1:56677
2020-10-29T04:49:56.4949701Z distributed.scheduler - INFO - Lost all workers
2020-10-29T04:49:56.4950440Z distributed.worker - INFO - Comm closed
2020-10-29T04:49:56.4951171Z distributed.scheduler - INFO - Scheduler closing...
2020-10-29T04:49:56.5213428Z distributed.scheduler - INFO - Scheduler closing all comms
```
</details> | 1.0 | Flaky test_close_gracefully - `test_close_gracefully` [failed on CI build](https://github.com/dask/distributed/runs/1324615732#step:8:116) over in https://github.com/dask/distributed/pull/4192 which contained seemly unrelated changes
<details>
<summary>Traceback:</summary>
```
2020-10-29T04:49:56.4843472Z ____________________________ test_close_gracefully ____________________________
2020-10-29T04:49:56.4843705Z
2020-10-29T04:49:56.4844237Z def test_func():
2020-10-29T04:49:56.4844564Z result = None
2020-10-29T04:49:56.4844855Z workers = []
2020-10-29T04:49:56.4845243Z with clean(timeout=active_rpc_timeout, **clean_kwargs) as loop:
2020-10-29T04:49:56.4845816Z
2020-10-29T04:49:56.4846289Z async def coro():
2020-10-29T04:49:56.4847778Z with dask.config.set(config):
2020-10-29T04:49:56.4848155Z s = False
2020-10-29T04:49:56.4848442Z for i in range(5):
2020-10-29T04:49:56.4848724Z try:
2020-10-29T04:49:56.4849002Z s, ws = await start_cluster(
2020-10-29T04:49:56.4849356Z nthreads,
2020-10-29T04:49:56.4849673Z scheduler,
2020-10-29T04:49:56.4849969Z loop,
2020-10-29T04:49:56.4850291Z security=security,
2020-10-29T04:49:56.4850651Z Worker=Worker,
2020-10-29T04:49:56.4851060Z scheduler_kwargs=scheduler_kwargs,
2020-10-29T04:49:56.4851514Z worker_kwargs=worker_kwargs,
2020-10-29T04:49:56.4851979Z )
2020-10-29T04:49:56.4852296Z except Exception as e:
2020-10-29T04:49:56.4852657Z logger.error(
2020-10-29T04:49:56.4853081Z "Failed to start gen_cluster, retrying",
2020-10-29T04:49:56.4853480Z exc_info=True,
2020-10-29T04:49:56.4853748Z )
2020-10-29T04:49:56.4854070Z await asyncio.sleep(1)
2020-10-29T04:49:56.4854418Z else:
2020-10-29T04:49:56.4854660Z workers[:] = ws
2020-10-29T04:49:56.4854972Z args = [s] + workers
2020-10-29T04:49:56.4855260Z break
2020-10-29T04:49:56.4855549Z if s is False:
2020-10-29T04:49:56.4856004Z raise Exception("Could not start cluster")
2020-10-29T04:49:56.4856561Z if client:
2020-10-29T04:49:56.4857046Z c = await Client(
2020-10-29T04:49:56.4857364Z s.address,
2020-10-29T04:49:56.4857618Z loop=loop,
2020-10-29T04:49:56.4857951Z security=security,
2020-10-29T04:49:56.4858330Z asynchronous=True,
2020-10-29T04:49:56.4858685Z **client_kwargs,
2020-10-29T04:49:56.4858962Z )
2020-10-29T04:49:56.4859226Z args = [c] + args
2020-10-29T04:49:56.4859494Z try:
2020-10-29T04:49:56.4859788Z future = func(*args)
2020-10-29T04:49:56.4860060Z if timeout:
2020-10-29T04:49:56.4860962Z future = asyncio.wait_for(future, timeout)
2020-10-29T04:49:56.4861619Z result = await future
2020-10-29T04:49:56.4861962Z if s.validate:
2020-10-29T04:49:56.4862307Z s.validate_state()
2020-10-29T04:49:56.4862624Z finally:
2020-10-29T04:49:56.4863017Z if client and c.status not in ("closing", "closed"):
2020-10-29T04:49:56.4863530Z await c._close(fast=s.status == Status.closed)
2020-10-29T04:49:56.4863951Z await end_cluster(s, workers)
2020-10-29T04:49:56.4864428Z await asyncio.wait_for(cleanup_global_workers(), 1)
2020-10-29T04:49:56.4864809Z
2020-10-29T04:49:56.4865045Z try:
2020-10-29T04:49:56.4865358Z c = await default_client()
2020-10-29T04:49:56.4865728Z except ValueError:
2020-10-29T04:49:56.4866108Z pass
2020-10-29T04:49:56.4866410Z else:
2020-10-29T04:49:56.4866738Z await c._close(fast=True)
2020-10-29T04:49:56.4867035Z
2020-10-29T04:49:56.4867315Z def get_unclosed():
2020-10-29T04:49:56.4867744Z return [c for c in Comm._instances if not c.closed()] + [
2020-10-29T04:49:56.4868130Z c
2020-10-29T04:49:56.4868495Z for c in _global_clients.values()
2020-10-29T04:49:56.4868899Z if c.status != "closed"
2020-10-29T04:49:56.4869190Z ]
2020-10-29T04:49:56.4869372Z
2020-10-29T04:49:56.4869608Z try:
2020-10-29T04:49:56.4869889Z start = time()
2020-10-29T04:49:56.4870205Z while time() < start + 5:
2020-10-29T04:49:56.4870901Z gc.collect()
2020-10-29T04:49:56.4871385Z if not get_unclosed():
2020-10-29T04:49:56.4871668Z break
2020-10-29T04:49:56.4871973Z await asyncio.sleep(0.05)
2020-10-29T04:49:56.4872313Z else:
2020-10-29T04:49:56.4872780Z if allow_unclosed:
2020-10-29T04:49:56.4873165Z print(f"Unclosed Comms: {get_unclosed()}")
2020-10-29T04:49:56.4873509Z else:
2020-10-29T04:49:56.4873910Z raise RuntimeError("Unclosed Comms", get_unclosed())
2020-10-29T04:49:56.4874317Z finally:
2020-10-29T04:49:56.4874658Z Comm._instances.clear()
2020-10-29T04:49:56.4875046Z _global_clients.clear()
2020-10-29T04:49:56.4875662Z
2020-10-29T04:49:56.4875972Z return result
2020-10-29T04:49:56.4876244Z
2020-10-29T04:49:56.4876538Z > result = loop.run_sync(
2020-10-29T04:49:56.4876976Z coro, timeout=timeout * 2 if timeout else timeout
2020-10-29T04:49:56.4877340Z )
2020-10-29T04:49:56.4877483Z
2020-10-29T04:49:56.4877807Z distributed\utils_test.py:953:
2020-10-29T04:49:56.4878148Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
2020-10-29T04:49:56.4879367Z C:\Miniconda3\envs\dask-distributed\lib\site-packages\tornado\ioloop.py:532: in run_sync
2020-10-29T04:49:56.4880323Z return future_cell[0].result()
2020-10-29T04:49:56.4880862Z distributed\utils_test.py:912: in coro
2020-10-29T04:49:56.4881322Z result = await future
2020-10-29T04:49:56.4881967Z C:\Miniconda3\envs\dask-distributed\lib\asyncio\tasks.py:491: in wait_for
2020-10-29T04:49:56.4882612Z return fut.result()
2020-10-29T04:49:56.4883043Z _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
2020-10-29T04:49:56.4883363Z
2020-10-29T04:49:56.4883772Z c = <Client: not connected>
2020-10-29T04:49:56.4884424Z s = <Scheduler: "tcp://127.0.0.1:56676" processes: 0 cores: 0>
2020-10-29T04:49:56.4885172Z a = <Worker: 'tcp://127.0.0.1:56677', 0, Status.closed, stored: 1, running: -1/1, ready: 0, comm: 0, waiting: 0>
2020-10-29T04:49:56.4886279Z b = <Worker: 'tcp://127.0.0.1:56679', 1, Status.closed, stored: 2, running: 2/2, ready: 96, comm: 0, waiting: 0>
2020-10-29T04:49:56.4886942Z
2020-10-29T04:49:56.4887385Z @gen_cluster(client=True)
2020-10-29T04:49:56.4887907Z async def test_close_gracefully(c, s, a, b):
2020-10-29T04:49:56.4888634Z futures = c.map(slowinc, range(200), delay=0.1)
2020-10-29T04:49:56.4889306Z while not b.data:
2020-10-29T04:49:56.4889773Z await asyncio.sleep(0.1)
2020-10-29T04:49:56.4890187Z
2020-10-29T04:49:56.4890509Z mem = set(b.data)
2020-10-29T04:49:56.4891067Z proc = [ts for ts in b.tasks.values() if ts.state == "executing"]
2020-10-29T04:49:56.4891584Z
2020-10-29T04:49:56.4892279Z await b.close_gracefully()
2020-10-29T04:49:56.4893608Z
2020-10-29T04:49:56.4894190Z assert b.status == Status.closed
2020-10-29T04:49:56.4894753Z assert b.address not in s.workers
2020-10-29T04:49:56.4895312Z assert mem.issubset(set(a.data))
2020-10-29T04:49:56.4895749Z for ts in proc:
2020-10-29T04:49:56.4896333Z > assert ts.state in ("processing", "memory")
2020-10-29T04:49:56.4897037Z E AssertionError: assert 'executing' in ('processing', 'memory')
2020-10-29T04:49:56.4898093Z E + where 'executing' = <Task 'slowinc-a373197c122b45e71acceb95cd7cb992' executing>.state
2020-10-29T04:49:56.4899687Z
2020-10-29T04:49:56.4900293Z distributed\tests\test_worker.py:1574: AssertionError
2020-10-29T04:49:56.4900923Z ---------------------------- Captured stderr call -----------------------------
2020-10-29T04:49:56.4901550Z distributed.scheduler - INFO - Clear task state
2020-10-29T04:49:56.4902359Z distributed.scheduler - INFO - Scheduler at: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4903196Z distributed.scheduler - INFO - dashboard at: 127.0.0.1:8787
2020-10-29T04:49:56.4904078Z distributed.worker - INFO - Start worker at: tcp://127.0.0.1:56677
2020-10-29T04:49:56.4905095Z distributed.worker - INFO - Listening to: tcp://127.0.0.1:56677
2020-10-29T04:49:56.4905807Z distributed.worker - INFO - dashboard at: 127.0.0.1:56678
2020-10-29T04:49:56.4907957Z distributed.worker - INFO - Waiting to connect to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4908728Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4909379Z distributed.worker - INFO - Threads: 1
2020-10-29T04:49:56.4910397Z distributed.worker - INFO - Memory: 7.52 GB
2020-10-29T04:49:56.4911480Z distributed.worker - INFO - Local Directory: D:\a\distributed\distributed\dask-worker-space\worker-b2i2kj7f
2020-10-29T04:49:56.4912410Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4913416Z distributed.worker - INFO - Start worker at: tcp://127.0.0.1:56679
2020-10-29T04:49:56.4914250Z distributed.worker - INFO - Listening to: tcp://127.0.0.1:56679
2020-10-29T04:49:56.4915004Z distributed.worker - INFO - dashboard at: 127.0.0.1:56680
2020-10-29T04:49:56.4915768Z distributed.worker - INFO - Waiting to connect to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4916927Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4917587Z distributed.worker - INFO - Threads: 2
2020-10-29T04:49:56.4918290Z distributed.worker - INFO - Memory: 7.52 GB
2020-10-29T04:49:56.4919199Z distributed.worker - INFO - Local Directory: D:\a\distributed\distributed\dask-worker-space\worker-xvpum_ry
2020-10-29T04:49:56.4920137Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4921130Z distributed.scheduler - INFO - Register worker <Worker 'tcp://127.0.0.1:56677', name: 0, memory: 0, processing: 0>
2020-10-29T04:49:56.4922485Z distributed.scheduler - INFO - Starting worker compute stream, tcp://127.0.0.1:56677
2020-10-29T04:49:56.4923428Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4924229Z distributed.worker - INFO - Registered to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4925860Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4927836Z distributed.scheduler - INFO - Register worker <Worker 'tcp://127.0.0.1:56679', name: 1, memory: 0, processing: 0>
2020-10-29T04:49:56.4928907Z distributed.scheduler - INFO - Starting worker compute stream, tcp://127.0.0.1:56679
2020-10-29T04:49:56.4929793Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4930737Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4931601Z distributed.worker - INFO - Registered to: tcp://127.0.0.1:56676
2020-10-29T04:49:56.4932466Z distributed.worker - INFO - -------------------------------------------------
2020-10-29T04:49:56.4933182Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4934569Z distributed.scheduler - INFO - Receive client connection: Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4936055Z distributed.core - INFO - Starting established connection
2020-10-29T04:49:56.4936921Z distributed.worker - INFO - Closing worker gracefully: tcp://127.0.0.1:56679
2020-10-29T04:49:56.4937692Z distributed.worker - INFO - Comm closed
2020-10-29T04:49:56.4938568Z distributed.scheduler - INFO - Retire workers {<Worker 'tcp://127.0.0.1:56679', name: 1, memory: 1, processing: 99>}
2020-10-29T04:49:56.4939523Z distributed.scheduler - INFO - Moving 1 keys to other workers
2020-10-29T04:49:56.4940339Z distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:56679
2020-10-29T04:49:56.4941303Z distributed.scheduler - INFO - Remove worker <Worker 'tcp://127.0.0.1:56679', name: 1, memory: 1, processing: 99>
2020-10-29T04:49:56.4942379Z distributed.core - INFO - Removing comms to tcp://127.0.0.1:56679
2020-10-29T04:49:56.4943384Z distributed.scheduler - INFO - Remove client Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4944621Z distributed.scheduler - INFO - Remove client Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4945929Z distributed.scheduler - INFO - Close client connection: Client-1c9286f8-19a2-11eb-9780-000d3ae52c36
2020-10-29T04:49:56.4947067Z distributed.worker - INFO - Stopping worker at tcp://127.0.0.1:56677
2020-10-29T04:49:56.4948022Z distributed.scheduler - INFO - Remove worker <Worker 'tcp://127.0.0.1:56677', name: 0, memory: 0, processing: 0>
2020-10-29T04:49:56.4948947Z distributed.core - INFO - Removing comms to tcp://127.0.0.1:56677
2020-10-29T04:49:56.4949701Z distributed.scheduler - INFO - Lost all workers
2020-10-29T04:49:56.4950440Z distributed.worker - INFO - Comm closed
2020-10-29T04:49:56.4951171Z distributed.scheduler - INFO - Scheduler closing...
2020-10-29T04:49:56.5213428Z distributed.scheduler - INFO - Scheduler closing all comms
```
</details> | non_priority | flaky test close gracefully test close gracefully over in which contained seemly unrelated changes traceback test close gracefully def test func result none workers with clean timeout active rpc timeout clean kwargs as loop async def coro with dask config set config s false for i in range try s ws await start cluster nthreads scheduler loop security security worker worker scheduler kwargs scheduler kwargs worker kwargs worker kwargs except exception as e logger error failed to start gen cluster retrying exc info true await asyncio sleep else workers ws args workers break if s is false raise exception could not start cluster if client c await client s address loop loop security security asynchronous true client kwargs args args try future func args if timeout future asyncio wait for future timeout result await future if s validate s validate state finally if client and c status not in closing closed await c close fast s status status closed await end cluster s workers await asyncio wait for cleanup global workers try c await default client except valueerror pass else await c close fast true def get unclosed return c for c in global clients values if c status closed try start time while time start gc collect if not get unclosed break await asyncio sleep else if allow unclosed print f unclosed comms get unclosed else raise runtimeerror unclosed comms get unclosed finally comm instances clear global clients clear return result result loop run sync coro timeout timeout if timeout else timeout distributed utils test py c envs dask distributed lib site packages tornado ioloop py in run sync return future cell result distributed utils test py in coro result await future c envs dask distributed lib asyncio tasks py in wait for return fut result c s a b gen cluster client true async def test close gracefully c s a b futures c map slowinc range delay while not b data await asyncio sleep mem set b data proc await b close gracefully assert b status status closed assert b address not in s workers assert mem issubset set a data for ts in proc assert ts state in processing memory e assertionerror assert executing in processing memory e where executing state distributed tests test worker py assertionerror captured stderr call distributed scheduler info clear task state distributed scheduler info scheduler at tcp distributed scheduler info dashboard at distributed worker info start worker at tcp distributed worker info listening to tcp distributed worker info dashboard at distributed worker info waiting to connect to tcp distributed worker info distributed worker info threads distributed worker info memory gb distributed worker info local directory d a distributed distributed dask worker space worker distributed worker info distributed worker info start worker at tcp distributed worker info listening to tcp distributed worker info dashboard at distributed worker info waiting to connect to tcp distributed worker info distributed worker info threads distributed worker info memory gb distributed worker info local directory d a distributed distributed dask worker space worker xvpum ry distributed worker info distributed scheduler info register worker distributed scheduler info starting worker compute stream tcp distributed core info starting established connection distributed worker info registered to tcp distributed worker info distributed scheduler info register worker distributed scheduler info starting worker compute stream tcp distributed core info starting established connection distributed core info starting established connection distributed worker info registered to tcp distributed worker info distributed core info starting established connection distributed scheduler info receive client connection client distributed core info starting established connection distributed worker info closing worker gracefully tcp distributed worker info comm closed distributed scheduler info retire workers distributed scheduler info moving keys to other workers distributed worker info stopping worker at tcp distributed scheduler info remove worker distributed core info removing comms to tcp distributed scheduler info remove client client distributed scheduler info remove client client distributed scheduler info close client connection client distributed worker info stopping worker at tcp distributed scheduler info remove worker distributed core info removing comms to tcp distributed scheduler info lost all workers distributed worker info comm closed distributed scheduler info scheduler closing distributed scheduler info scheduler closing all comms | 0 |
151,323 | 12,032,353,441 | IssuesEvent | 2020-04-13 11:56:27 | MachoThemes/strong-testimonials | https://api.github.com/repos/MachoThemes/strong-testimonials | closed | Fields menu entry should be renamed to Forms | enhancement need testing | Even though we don't have the Multiple Forms extension installed, it should be changed. It's very misleading right now and confusing for users. | 1.0 | Fields menu entry should be renamed to Forms - Even though we don't have the Multiple Forms extension installed, it should be changed. It's very misleading right now and confusing for users. | non_priority | fields menu entry should be renamed to forms even though we don t have the multiple forms extension installed it should be changed it s very misleading right now and confusing for users | 0 |
6,012 | 3,320,761,584 | IssuesEvent | 2015-11-09 02:16:46 | mozilla/metrics-graphics | https://api.github.com/repos/mozilla/metrics-graphics | closed | code review: markers.js | code quality test | Same type of task as #533, following the precedent being set by #534. This on should be a bit easier.
- [x] decompose code
- [x] refactor components
- [x] make code format uniform
- [x] write tests for components / refactor accordingly | 1.0 | code review: markers.js - Same type of task as #533, following the precedent being set by #534. This on should be a bit easier.
- [x] decompose code
- [x] refactor components
- [x] make code format uniform
- [x] write tests for components / refactor accordingly | non_priority | code review markers js same type of task as following the precedent being set by this on should be a bit easier decompose code refactor components make code format uniform write tests for components refactor accordingly | 0 |
283,968 | 30,913,575,741 | IssuesEvent | 2023-08-05 02:17:20 | hshivhare67/kernel_v4.19.72 | https://api.github.com/repos/hshivhare67/kernel_v4.19.72 | reopened | CVE-2019-19074 (High) detected in linuxlinux-4.19.282 | Mend: dependency security vulnerability | ## CVE-2019-19074 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.282</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/kernel_v4.19.72/commit/139c4e073703974ca0b05255c4cff6dcd52a8e31">139c4e073703974ca0b05255c4cff6dcd52a8e31</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A memory leak in the ath9k_wmi_cmd() function in drivers/net/wireless/ath/ath9k/wmi.c in the Linux kernel through 5.3.11 allows attackers to cause a denial of service (memory consumption), aka CID-728c1e2a05e4.
<p>Publish Date: 2019-11-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19074>CVE-2019-19074</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074</a></p>
<p>Release Date: 2019-11-18</p>
<p>Fix Resolution: v5.4-rc1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-19074 (High) detected in linuxlinux-4.19.282 - ## CVE-2019-19074 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.282</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/kernel_v4.19.72/commit/139c4e073703974ca0b05255c4cff6dcd52a8e31">139c4e073703974ca0b05255c4cff6dcd52a8e31</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A memory leak in the ath9k_wmi_cmd() function in drivers/net/wireless/ath/ath9k/wmi.c in the Linux kernel through 5.3.11 allows attackers to cause a denial of service (memory consumption), aka CID-728c1e2a05e4.
<p>Publish Date: 2019-11-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19074>CVE-2019-19074</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19074</a></p>
<p>Release Date: 2019-11-18</p>
<p>Fix Resolution: v5.4-rc1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in linuxlinux cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files vulnerability details a memory leak in the wmi cmd function in drivers net wireless ath wmi c in the linux kernel through allows attackers to cause a denial of service memory consumption aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
6,552 | 2,850,548,471 | IssuesEvent | 2015-05-31 17:30:51 | MeteorCode/Pathway | https://api.github.com/repos/MeteorCode/Pathway | closed | Silence log messages during testing | testing | Currently, the [console spam](https://travis-ci.org/MeteorCode/Pathway/builds/61902801#L266) from `Context` during testing is way obnoxious; it prints almost 2000 lines of log message. This makes reading the build log a pain. Need to make it stop. | 1.0 | Silence log messages during testing - Currently, the [console spam](https://travis-ci.org/MeteorCode/Pathway/builds/61902801#L266) from `Context` during testing is way obnoxious; it prints almost 2000 lines of log message. This makes reading the build log a pain. Need to make it stop. | non_priority | silence log messages during testing currently the from context during testing is way obnoxious it prints almost lines of log message this makes reading the build log a pain need to make it stop | 0 |
154,871 | 24,362,260,477 | IssuesEvent | 2022-10-03 12:42:02 | SwissDataScienceCenter/renku-ui | https://api.github.com/repos/SwissDataScienceCenter/renku-ui | opened | Pagination fails for GitLab APIs with more than 10000 items | bug needs design | ## Description
GitLab "traditional" pagination (i.e. offset-based pagination) partially fails when returning more than 10'000 items ([docs reference](https://docs.gitlab.com/ee/user/gitlab_com/index.html#pagination-response-headers)). In that case, some headers are missing, among which `x-total` and `x-total-pages` ([docs reference](https://docs.gitlab.com/ee/api/index.html#pagination-response-headers)).
Since we use those fields for our pagination system, it currently fails on >10'000 items
## Reproduce
Admins on renkulab.io already get this error when listing all projects https://renkulab.io/projects/all
All the users will soon have this problem since we already have >9'500 public projects.

## Possible solutions
I haven't checked it thoroughly yet, but here are 2 possible solutions:
- Manually adding the missing fields/data in those cases where they are missing and we still get items (e.g. if we get items but no `x-total`, we give for granted there are more than 10'000 items). In that case, we can still show the pages up to `10000 / <single_page_size>` and specify on the last page that some results are missing and a more specific search is required.
- Try the keyset pagination ([docs reference](https://docs.gitlab.com/ee/api/#keyset-based-pagination)).
I would go for the quickest option since this might be solved in a better way with our KG-based search.
| 1.0 | Pagination fails for GitLab APIs with more than 10000 items - ## Description
GitLab "traditional" pagination (i.e. offset-based pagination) partially fails when returning more than 10'000 items ([docs reference](https://docs.gitlab.com/ee/user/gitlab_com/index.html#pagination-response-headers)). In that case, some headers are missing, among which `x-total` and `x-total-pages` ([docs reference](https://docs.gitlab.com/ee/api/index.html#pagination-response-headers)).
Since we use those fields for our pagination system, it currently fails on >10'000 items
## Reproduce
Admins on renkulab.io already get this error when listing all projects https://renkulab.io/projects/all
All the users will soon have this problem since we already have >9'500 public projects.

## Possible solutions
I haven't checked it thoroughly yet, but here are 2 possible solutions:
- Manually adding the missing fields/data in those cases where they are missing and we still get items (e.g. if we get items but no `x-total`, we give for granted there are more than 10'000 items). In that case, we can still show the pages up to `10000 / <single_page_size>` and specify on the last page that some results are missing and a more specific search is required.
- Try the keyset pagination ([docs reference](https://docs.gitlab.com/ee/api/#keyset-based-pagination)).
I would go for the quickest option since this might be solved in a better way with our KG-based search.
| non_priority | pagination fails for gitlab apis with more than items description gitlab traditional pagination i e offset based pagination partially fails when returning more than items in that case some headers are missing among which x total and x total pages since we use those fields for our pagination system it currently fails on items reproduce admins on renkulab io already get this error when listing all projects all the users will soon have this problem since we already have public projects possible solutions i haven t checked it thoroughly yet but here are possible solutions manually adding the missing fields data in those cases where they are missing and we still get items e g if we get items but no x total we give for granted there are more than items in that case we can still show the pages up to and specify on the last page that some results are missing and a more specific search is required try the keyset pagination i would go for the quickest option since this might be solved in a better way with our kg based search | 0 |
139,348 | 11,258,640,266 | IssuesEvent | 2020-01-13 05:36:56 | microsoft/AzureStorageExplorer | https://api.github.com/repos/microsoft/AzureStorageExplorer | opened | Update 'mean time' to 'meantime' in Release Notes Known Issues part | 🧪 testing | **Storage Explorer Version:** 1.12.0
**Build:** [20200113.2](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3379477)
**Branch:** rel/1.12.0
**Platform/OS:** Windows 10/ Linux Ubuntu 18.04/ MacOS High Sierra
**Architecture:** ia32/x64
**Regression From:** Not a regression
**Steps to reproduce:**
1. Launch Storage Explorer -> Open Release Notes.
2. Check the Known Issues part in Release Notes.
**Expect Experience:**
Show 'meantime' in the descriptions of the know issue.
**Actual Experience:**
Show 'mean time' in the descriptions of the know issue.

| 1.0 | Update 'mean time' to 'meantime' in Release Notes Known Issues part - **Storage Explorer Version:** 1.12.0
**Build:** [20200113.2](https://devdiv.visualstudio.com/DevDiv/_build/results?buildId=3379477)
**Branch:** rel/1.12.0
**Platform/OS:** Windows 10/ Linux Ubuntu 18.04/ MacOS High Sierra
**Architecture:** ia32/x64
**Regression From:** Not a regression
**Steps to reproduce:**
1. Launch Storage Explorer -> Open Release Notes.
2. Check the Known Issues part in Release Notes.
**Expect Experience:**
Show 'meantime' in the descriptions of the know issue.
**Actual Experience:**
Show 'mean time' in the descriptions of the know issue.

| non_priority | update mean time to meantime in release notes known issues part storage explorer version build branch rel platform os windows linux ubuntu macos high sierra architecture regression from not a regression steps to reproduce launch storage explorer open release notes check the known issues part in release notes expect experience show meantime in the descriptions of the know issue actual experience show mean time in the descriptions of the know issue | 0 |
49,124 | 26,001,136,023 | IssuesEvent | 2022-12-20 15:24:28 | rancher/dashboard | https://api.github.com/repos/rancher/dashboard | opened | Improve `getGroups` performance | area/performance | - `./shell/layouts/default.vue` `getGroups` fn is called often and can sometimes take a long time to complete (approaching a second)
- We've seen in some environments a change of namespace results in two executions of `getGroups`
- We should bring up a cluster with ~1000 namespaces and prove the performance hit of `getGroups`, improve it's performance and test again
- Places to consider
- reduce number of calls to getGroups with a debounce
- `Object.keys(namespacesObject)`
- `addObject` (are these called on large sets of data)
- `type-map/allTypes` and `type-map/getTree`
- `replaceWith(this.groups, ...sortBy(out, ['weight:desc', 'label']))` takes to run | True | Improve `getGroups` performance - - `./shell/layouts/default.vue` `getGroups` fn is called often and can sometimes take a long time to complete (approaching a second)
- We've seen in some environments a change of namespace results in two executions of `getGroups`
- We should bring up a cluster with ~1000 namespaces and prove the performance hit of `getGroups`, improve it's performance and test again
- Places to consider
- reduce number of calls to getGroups with a debounce
- `Object.keys(namespacesObject)`
- `addObject` (are these called on large sets of data)
- `type-map/allTypes` and `type-map/getTree`
- `replaceWith(this.groups, ...sortBy(out, ['weight:desc', 'label']))` takes to run | non_priority | improve getgroups performance shell layouts default vue getgroups fn is called often and can sometimes take a long time to complete approaching a second we ve seen in some environments a change of namespace results in two executions of getgroups we should bring up a cluster with namespaces and prove the performance hit of getgroups improve it s performance and test again places to consider reduce number of calls to getgroups with a debounce object keys namespacesobject addobject are these called on large sets of data type map alltypes and type map gettree replacewith this groups sortby out takes to run | 0 |
312,687 | 23,439,135,600 | IssuesEvent | 2022-08-15 13:17:34 | Sage-Bionetworks/challenge-registry | https://api.github.com/repos/Sage-Bionetworks/challenge-registry | opened | Disable unnecessary email notifications | documentation dev/env | The motivation is to enhance our development environment, especially by removing email notifications that are sent outside of working hours.
Tasks:
- [ ] Identify the types of email notifications that we are currently receiving.
- [ ] Identify the ones we want to disable.
- [ ] Document what developers need to do to silence email notifications.
cc: @rrchai @vpchung @chepyle | 1.0 | Disable unnecessary email notifications - The motivation is to enhance our development environment, especially by removing email notifications that are sent outside of working hours.
Tasks:
- [ ] Identify the types of email notifications that we are currently receiving.
- [ ] Identify the ones we want to disable.
- [ ] Document what developers need to do to silence email notifications.
cc: @rrchai @vpchung @chepyle | non_priority | disable unnecessary email notifications the motivation is to enhance our development environment especially by removing email notifications that are sent outside of working hours tasks identify the types of email notifications that we are currently receiving identify the ones we want to disable document what developers need to do to silence email notifications cc rrchai vpchung chepyle | 0 |
44,777 | 23,769,032,309 | IssuesEvent | 2022-09-01 14:53:00 | python/cpython | https://api.github.com/repos/python/cpython | closed | speedup for / while / if with better bytecode | performance interpreter-core pending | BPO | [2459](https://bugs.python.org/issue2459)
--- | :---
Nosy | @rhettinger, @gpshead, @jcea, @pitrou, @giampaolo, @tiran, @djc, @phsilva
Files | <li>[predict_loop.diff](https://bugs.python.org/file9877/predict_loop.diff "Uploaded as text/plain at 2008-03-27.16:21:48 by @arigo")</li><li>[for_iter.patch](https://bugs.python.org/file12923/for_iter.patch "Uploaded as text/plain at 2009-02-02.20:36:52 by @pitrou")</li><li>[trunk-opt-loop.patch](https://bugs.python.org/file13227/trunk-opt-loop.patch "Uploaded as text/plain at 2009-03-02.06:11:39 by jyasskin"): for_iter.patch merged with python/issues-test-cpython#4715</li>
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2008-03-22.22:25:03.988>
labels = ['interpreter-core', 'performance']
title = 'speedup for / while / if with better bytecode'
updated_at = <Date 2014-06-22.12:04:07.520>
user = 'https://github.com/pitrou'
```
bugs.python.org fields:
```python
activity = <Date 2014-06-22.12:04:07.520>
actor = 'BreamoreBoy'
assignee = 'jyasskin'
closed = False
closed_date = None
closer = None
components = ['Interpreter Core']
creation = <Date 2008-03-22.22:25:03.988>
creator = 'pitrou'
dependencies = []
files = ['9877', '12923', '13227']
hgrepos = []
issue_num = 2459
keywords = ['patch']
message_count = 31.0
messages = ['64343', '64359', '64364', '64367', '64383', '64506', '64537', '64572', '64573', '64577', '64599', '64603', '66027', '68687', '80591', '80593', '80598', '80995', '81029', '81958', '81962', '81976', '82556', '82557', '83003', '83012', '83022', '83088', '106883', '192575', '221249']
nosy_count = 15.0
nosy_names = ['nnorwitz', 'collinwinter', 'rhettinger', 'gregory.p.smith', 'jcea', 'tzot', 'pitrou', 'giampaolo.rodola', 'christian.heimes', 'jyasskin', 'stutzbach', 'lauromoura', 'djc', 'phsilva', 'BreamoreBoy']
pr_nums = []
priority = 'normal'
resolution = None
stage = None
status = 'languishing'
superseder = None
type = 'performance'
url = 'https://bugs.python.org/issue2459'
versions = ['Python 2.7', 'Python 3.3', 'Python 3.4']
```
</p></details>
| True | speedup for / while / if with better bytecode - BPO | [2459](https://bugs.python.org/issue2459)
--- | :---
Nosy | @rhettinger, @gpshead, @jcea, @pitrou, @giampaolo, @tiran, @djc, @phsilva
Files | <li>[predict_loop.diff](https://bugs.python.org/file9877/predict_loop.diff "Uploaded as text/plain at 2008-03-27.16:21:48 by @arigo")</li><li>[for_iter.patch](https://bugs.python.org/file12923/for_iter.patch "Uploaded as text/plain at 2009-02-02.20:36:52 by @pitrou")</li><li>[trunk-opt-loop.patch](https://bugs.python.org/file13227/trunk-opt-loop.patch "Uploaded as text/plain at 2009-03-02.06:11:39 by jyasskin"): for_iter.patch merged with python/issues-test-cpython#4715</li>
<sup>*Note: these values reflect the state of the issue at the time it was migrated and might not reflect the current state.*</sup>
<details><summary>Show more details</summary><p>
GitHub fields:
```python
assignee = None
closed_at = None
created_at = <Date 2008-03-22.22:25:03.988>
labels = ['interpreter-core', 'performance']
title = 'speedup for / while / if with better bytecode'
updated_at = <Date 2014-06-22.12:04:07.520>
user = 'https://github.com/pitrou'
```
bugs.python.org fields:
```python
activity = <Date 2014-06-22.12:04:07.520>
actor = 'BreamoreBoy'
assignee = 'jyasskin'
closed = False
closed_date = None
closer = None
components = ['Interpreter Core']
creation = <Date 2008-03-22.22:25:03.988>
creator = 'pitrou'
dependencies = []
files = ['9877', '12923', '13227']
hgrepos = []
issue_num = 2459
keywords = ['patch']
message_count = 31.0
messages = ['64343', '64359', '64364', '64367', '64383', '64506', '64537', '64572', '64573', '64577', '64599', '64603', '66027', '68687', '80591', '80593', '80598', '80995', '81029', '81958', '81962', '81976', '82556', '82557', '83003', '83012', '83022', '83088', '106883', '192575', '221249']
nosy_count = 15.0
nosy_names = ['nnorwitz', 'collinwinter', 'rhettinger', 'gregory.p.smith', 'jcea', 'tzot', 'pitrou', 'giampaolo.rodola', 'christian.heimes', 'jyasskin', 'stutzbach', 'lauromoura', 'djc', 'phsilva', 'BreamoreBoy']
pr_nums = []
priority = 'normal'
resolution = None
stage = None
status = 'languishing'
superseder = None
type = 'performance'
url = 'https://bugs.python.org/issue2459'
versions = ['Python 2.7', 'Python 3.3', 'Python 3.4']
```
</p></details>
| non_priority | speedup for while if with better bytecode bpo nosy rhettinger gpshead jcea pitrou giampaolo tiran djc phsilva files uploaded as text plain at by arigo uploaded as text plain at by pitrou uploaded as text plain at by jyasskin for iter patch merged with python issues test cpython note these values reflect the state of the issue at the time it was migrated and might not reflect the current state show more details github fields python assignee none closed at none created at labels title speedup for while if with better bytecode updated at user bugs python org fields python activity actor breamoreboy assignee jyasskin closed false closed date none closer none components creation creator pitrou dependencies files hgrepos issue num keywords message count messages nosy count nosy names pr nums priority normal resolution none stage none status languishing superseder none type performance url versions | 0 |
189,360 | 15,186,590,997 | IssuesEvent | 2021-02-15 12:36:37 | microcks/microcks | https://api.github.com/repos/microcks/microcks | closed | Write MQTT support documentation | component/documentation kind/task | We should add complete documentation (installation, usage, tests) of the MQTT support feature in Microks. | 1.0 | Write MQTT support documentation - We should add complete documentation (installation, usage, tests) of the MQTT support feature in Microks. | non_priority | write mqtt support documentation we should add complete documentation installation usage tests of the mqtt support feature in microks | 0 |
10,952 | 12,964,986,029 | IssuesEvent | 2020-07-20 21:28:22 | Melonslise/Locks | https://api.github.com/repos/Melonslise/Locks | closed | It doesn't load on 1.14.4 | compatibility | Here's the [latest.log ](https://gist.github.com/SepSol/49327ee978279ad8f112b60882efdcf4)and [debug.log](https://gist.github.com/SepSol/8ca9aebdc02fb2dd502eb1b42ee1d986)

Minecraft 1.14.4
Forge 28.1.70
OptiFine HD U F4
Locks 2.3.1.1
| True | It doesn't load on 1.14.4 - Here's the [latest.log ](https://gist.github.com/SepSol/49327ee978279ad8f112b60882efdcf4)and [debug.log](https://gist.github.com/SepSol/8ca9aebdc02fb2dd502eb1b42ee1d986)

Minecraft 1.14.4
Forge 28.1.70
OptiFine HD U F4
Locks 2.3.1.1
| non_priority | it doesn t load on here s the minecraft forge optifine hd u locks | 0 |
151,862 | 19,665,435,365 | IssuesEvent | 2022-01-10 21:53:33 | tyhal/crie | https://api.github.com/repos/tyhal/crie | closed | CVE-2018-16875 (High) detected in github.com/moby/moby-v1.13.1 | security vulnerability | ## CVE-2018-16875 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/moby/moby-v1.13.1</b></p></summary>
<p>Moby Project - a collaborative project for the container ecosystem to assemble container-based systems</p>
<p>
Dependency Hierarchy:
- :x: **github.com/moby/moby-v1.13.1** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/tyhal/crie/commit/304e2783e903eb495b8bc99cd892d467bea7f95a">304e2783e903eb495b8bc99cd892d467bea7f95a</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The crypto/x509 package of Go before 1.10.6 and 1.11.x before 1.11.3 does not limit the amount of work performed for each chain verification, which might allow attackers to craft pathological inputs leading to a CPU denial of service. Go TLS servers accepting client certificates and TLS clients are affected.
<p>Publish Date: 2018-12-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16875>CVE-2018-16875</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-16875">https://nvd.nist.gov/vuln/detail/CVE-2018-16875</a></p>
<p>Release Date: 2018-12-14</p>
<p>Fix Resolution: 1.10.6,1.11.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-16875 (High) detected in github.com/moby/moby-v1.13.1 - ## CVE-2018-16875 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>github.com/moby/moby-v1.13.1</b></p></summary>
<p>Moby Project - a collaborative project for the container ecosystem to assemble container-based systems</p>
<p>
Dependency Hierarchy:
- :x: **github.com/moby/moby-v1.13.1** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/tyhal/crie/commit/304e2783e903eb495b8bc99cd892d467bea7f95a">304e2783e903eb495b8bc99cd892d467bea7f95a</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The crypto/x509 package of Go before 1.10.6 and 1.11.x before 1.11.3 does not limit the amount of work performed for each chain verification, which might allow attackers to craft pathological inputs leading to a CPU denial of service. Go TLS servers accepting client certificates and TLS clients are affected.
<p>Publish Date: 2018-12-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-16875>CVE-2018-16875</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-16875">https://nvd.nist.gov/vuln/detail/CVE-2018-16875</a></p>
<p>Release Date: 2018-12-14</p>
<p>Fix Resolution: 1.10.6,1.11.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in github com moby moby cve high severity vulnerability vulnerable library github com moby moby moby project a collaborative project for the container ecosystem to assemble container based systems dependency hierarchy x github com moby moby vulnerable library found in head commit a href vulnerability details the crypto package of go before and x before does not limit the amount of work performed for each chain verification which might allow attackers to craft pathological inputs leading to a cpu denial of service go tls servers accepting client certificates and tls clients are affected publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
166,358 | 20,718,446,079 | IssuesEvent | 2022-03-13 01:43:32 | directoryxx/Laravel-Jenkins-Docker | https://api.github.com/repos/directoryxx/Laravel-Jenkins-Docker | opened | CVE-2022-0691 (High) detected in url-parse-1.4.7.tgz | security vulnerability | ## CVE-2022-0691 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /Laravel-Jenkins-Docker/package.json</p>
<p>Path to vulnerable library: /node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- laravel-mix-2.1.14.tgz (Root Library)
- webpack-dev-server-2.11.5.tgz
- sockjs-client-1.1.5.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.9.
<p>Publish Date: 2022-02-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0691>CVE-2022-0691</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691</a></p>
<p>Release Date: 2022-02-21</p>
<p>Fix Resolution (url-parse): 1.5.9</p>
<p>Direct dependency fix Resolution (laravel-mix): 3.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-0691 (High) detected in url-parse-1.4.7.tgz - ## CVE-2022-0691 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /Laravel-Jenkins-Docker/package.json</p>
<p>Path to vulnerable library: /node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- laravel-mix-2.1.14.tgz (Root Library)
- webpack-dev-server-2.11.5.tgz
- sockjs-client-1.1.5.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.9.
<p>Publish Date: 2022-02-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0691>CVE-2022-0691</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691</a></p>
<p>Release Date: 2022-02-21</p>
<p>Fix Resolution (url-parse): 1.5.9</p>
<p>Direct dependency fix Resolution (laravel-mix): 3.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in url parse tgz cve high severity vulnerability vulnerable library url parse tgz small footprint url parser that works seamlessly across node js and browser environments library home page a href path to dependency file laravel jenkins docker package json path to vulnerable library node modules url parse package json dependency hierarchy laravel mix tgz root library webpack dev server tgz sockjs client tgz x url parse tgz vulnerable library vulnerability details authorization bypass through user controlled key in npm url parse prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution url parse direct dependency fix resolution laravel mix step up your open source security game with whitesource | 0 |
121,220 | 25,941,023,600 | IssuesEvent | 2022-12-16 18:23:55 | pybricks/support | https://api.github.com/repos/pybricks/support | closed | [Feature] Rename app description in meta data | enhancement software: pybricks-code | **Is your feature request related to a problem? Please describe.**
When sharing a link to Pybricks Code you usually get a preview like this:

But not everyone knows what Powered Up and perhaps IDE means. It is not well known that Powered Up also spans SPIKE and MINDSTORMS.
**Describe the solution you'd like / ideas**
Pybricks: MicroPython coding for LEGO® hubs.
MicroPython coding for LEGO® hubs.
MicroPython coding for robotics.
| 1.0 | [Feature] Rename app description in meta data - **Is your feature request related to a problem? Please describe.**
When sharing a link to Pybricks Code you usually get a preview like this:

But not everyone knows what Powered Up and perhaps IDE means. It is not well known that Powered Up also spans SPIKE and MINDSTORMS.
**Describe the solution you'd like / ideas**
Pybricks: MicroPython coding for LEGO® hubs.
MicroPython coding for LEGO® hubs.
MicroPython coding for robotics.
| non_priority | rename app description in meta data is your feature request related to a problem please describe when sharing a link to pybricks code you usually get a preview like this but not everyone knows what powered up and perhaps ide means it is not well known that powered up also spans spike and mindstorms describe the solution you d like ideas pybricks micropython coding for lego® hubs micropython coding for lego® hubs micropython coding for robotics | 0 |
184,338 | 31,860,293,417 | IssuesEvent | 2023-09-15 10:23:28 | livechat/design-system | https://api.github.com/repos/livechat/design-system | closed | [Color tokens] - Update avatars colors | design | Please update avatars colors for both all themes:
--surface-avatar-3: #328DFF
--surface-avatar-5: #9146FF
--surface-avatar-7: #0066FF
--surface-avatar-9: #B08C00
--surface-avatar-10: #00A449 | 1.0 | [Color tokens] - Update avatars colors - Please update avatars colors for both all themes:
--surface-avatar-3: #328DFF
--surface-avatar-5: #9146FF
--surface-avatar-7: #0066FF
--surface-avatar-9: #B08C00
--surface-avatar-10: #00A449 | non_priority | update avatars colors please update avatars colors for both all themes surface avatar surface avatar surface avatar surface avatar surface avatar | 0 |
61,464 | 12,191,283,874 | IssuesEvent | 2020-04-29 10:50:40 | kwk/test-llvm-bz-import-5 | https://api.github.com/repos/kwk/test-llvm-bz-import-5 | closed | x86_64 va_arg codegen asserts on typedef'd floating point types | BZ-BUG-STATUS: RESOLVED BZ-RESOLUTION: FIXED clang/LLVM Codegen dummy import from bugzilla | This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=6433. | 1.0 | x86_64 va_arg codegen asserts on typedef'd floating point types - This issue was imported from Bugzilla https://bugs.llvm.org/show_bug.cgi?id=6433. | non_priority | va arg codegen asserts on typedef d floating point types this issue was imported from bugzilla | 0 |
11,112 | 13,957,681,387 | IssuesEvent | 2020-10-24 08:07:24 | alexanderkotsev/geoportal | https://api.github.com/repos/alexanderkotsev/geoportal | opened | DE: request for a new harvesting | DE - Germany Geoportal Harvesting process | Dear Geoportal Helpdesk,
As mentioned in Roberts Mail from 2020/03/02 we would like to initiate a new push of our metadata records to the EU Geoportal. For this reason we kindly ask you to start a new harvesting of our catalogue instance and publish them for us in the Geoportal harvesting "sandbox", please.
Also we kindly ask you again, if you could provide us two or three original csw-requests (for an internal validation/review on our side), which you are using to get the metadata records from our catalogue instance.
Thanks in advance and best regards,
Anja (on behalf of Coordination Office SDI Germany) | 1.0 | DE: request for a new harvesting - Dear Geoportal Helpdesk,
As mentioned in Roberts Mail from 2020/03/02 we would like to initiate a new push of our metadata records to the EU Geoportal. For this reason we kindly ask you to start a new harvesting of our catalogue instance and publish them for us in the Geoportal harvesting "sandbox", please.
Also we kindly ask you again, if you could provide us two or three original csw-requests (for an internal validation/review on our side), which you are using to get the metadata records from our catalogue instance.
Thanks in advance and best regards,
Anja (on behalf of Coordination Office SDI Germany) | non_priority | de request for a new harvesting dear geoportal helpdesk as mentioned in roberts mail from we would like to initiate a new push of our metadata records to the eu geoportal for this reason we kindly ask you to start a new harvesting of our catalogue instance and publish them for us in the geoportal harvesting quot sandbox quot please also we kindly ask you again if you could provide us two or three original csw requests for an internal validation review on our side which you are using to get the metadata records from our catalogue instance thanks in advance and best regards anja on behalf of coordination office sdi germany | 0 |
211,585 | 16,327,896,985 | IssuesEvent | 2021-05-12 05:07:25 | openmsupply/msupply-cold-chain | https://api.github.com/repos/openmsupply/msupply-cold-chain | closed | TESTING: Sync settings | test template | # Sync settings tests
## Setup
- TODO.
## Test cases
- TODO.
| 1.0 | TESTING: Sync settings - # Sync settings tests
## Setup
- TODO.
## Test cases
- TODO.
| non_priority | testing sync settings sync settings tests setup todo test cases todo | 0 |
203,118 | 23,123,562,396 | IssuesEvent | 2022-07-28 01:37:48 | pustovitDmytro/node-package-tester | https://api.github.com/repos/pustovitDmytro/node-package-tester | closed | CVE-2021-35065 (High) detected in glob-parent-5.1.2.tgz - autoclosed | security vulnerability | ## CVE-2021-35065 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-5.1.2.tgz</b></p></summary>
<p>Extract the non-magic parent path from a glob string.</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- cli-7.16.8.tgz (Root Library)
- chokidar-3.5.3.tgz
- :x: **glob-parent-5.1.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/pustovitDmytro/node-package-tester/commit/e9f92222fae5f192fc141e650dbd0468173de3f5">e9f92222fae5f192fc141e650dbd0468173de3f5</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package glob-parent before 6.0.1 are vulnerable to Regular Expression Denial of Service (ReDoS)
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35065>CVE-2021-35065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-cj88-88mr-972w">https://github.com/advisories/GHSA-cj88-88mr-972w</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: glob-parent - 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-35065 (High) detected in glob-parent-5.1.2.tgz - autoclosed - ## CVE-2021-35065 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>glob-parent-5.1.2.tgz</b></p></summary>
<p>Extract the non-magic parent path from a glob string.</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- cli-7.16.8.tgz (Root Library)
- chokidar-3.5.3.tgz
- :x: **glob-parent-5.1.2.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/pustovitDmytro/node-package-tester/commit/e9f92222fae5f192fc141e650dbd0468173de3f5">e9f92222fae5f192fc141e650dbd0468173de3f5</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package glob-parent before 6.0.1 are vulnerable to Regular Expression Denial of Service (ReDoS)
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35065>CVE-2021-35065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-cj88-88mr-972w">https://github.com/advisories/GHSA-cj88-88mr-972w</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: glob-parent - 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in glob parent tgz autoclosed cve high severity vulnerability vulnerable library glob parent tgz extract the non magic parent path from a glob string library home page a href path to dependency file package json path to vulnerable library node modules glob parent package json dependency hierarchy cli tgz root library chokidar tgz x glob parent tgz vulnerable library found in head commit a href found in base branch master vulnerability details the package glob parent before are vulnerable to regular expression denial of service redos publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent step up your open source security game with mend | 0 |
13,133 | 3,114,742,691 | IssuesEvent | 2015-09-03 10:39:19 | mysociety/pombola | https://api.github.com/repos/mysociety/pombola | opened | ZA: Images for Facebook ads | copy design ZA | We're tentatively experimenting with some Facebook ads. I'd like to explore images that we could use to make a more appealing ad around messages such as 'Do you know what your representatives are doing in your name'; 'find contact details for your representative' and 'making Parliament easy to follow/understand".
Guidance for these ads is here: https://www.facebook.com/business/products/ads/how-ads-show/ and https://www.facebook.com/ads/tools/text_overlay
On discussion with @zarino, we decided that the PA team are best situated to discern what kind of imagery and messaging is best for the audience. @paullenz or @JenMysoc , would you be able to lead that discussion? I believe that the design team can give access to eg a stock photography library from which suitable images could be identified. | 1.0 | ZA: Images for Facebook ads - We're tentatively experimenting with some Facebook ads. I'd like to explore images that we could use to make a more appealing ad around messages such as 'Do you know what your representatives are doing in your name'; 'find contact details for your representative' and 'making Parliament easy to follow/understand".
Guidance for these ads is here: https://www.facebook.com/business/products/ads/how-ads-show/ and https://www.facebook.com/ads/tools/text_overlay
On discussion with @zarino, we decided that the PA team are best situated to discern what kind of imagery and messaging is best for the audience. @paullenz or @JenMysoc , would you be able to lead that discussion? I believe that the design team can give access to eg a stock photography library from which suitable images could be identified. | non_priority | za images for facebook ads we re tentatively experimenting with some facebook ads i d like to explore images that we could use to make a more appealing ad around messages such as do you know what your representatives are doing in your name find contact details for your representative and making parliament easy to follow understand guidance for these ads is here and on discussion with zarino we decided that the pa team are best situated to discern what kind of imagery and messaging is best for the audience paullenz or jenmysoc would you be able to lead that discussion i believe that the design team can give access to eg a stock photography library from which suitable images could be identified | 0 |
156,667 | 24,625,019,223 | IssuesEvent | 2022-10-16 12:06:12 | dotnet/efcore | https://api.github.com/repos/dotnet/efcore | closed | Incorrect syntax in the script generated by command "migrations script --idempotent" | closed-by-design | Hello,
I have a project with multiple migrations.
One of the migration have multiple procedures like this :
`private static string SCRIPT_TableUserSynchronizationRepository_GetFirstPendingUserSynchronizations =
@"CREATE PROCEDURE procedure_GetFirstPendingUserSynchronizations
[...]
END";`
When i execute the command line "_dotnet ef migrations script_" everything work fine !
But when i execute the code generated by the command "_dotnet ef migrations script --idempotent_" i have several errors !
EF generate this kind of code :
`IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20171124083142_addStoredProcedures')
BEGIN
CREATE PROCEDURE procedure_GetFirstPendingUserSynchronizations
[...]
END
END;
GO`
And the error is "Incorrect syntax near the keyword 'Procedure'".
I have found a workaround by putting the "CREATE PROCEDURE" in "EXE(' [...]').
https://stackoverflow.com/questions/16729520/incorrect-syntax-near-the-keyword-procedure.
I use SQL Server 2016 (localdb)
Is it an EF Core bug ?
| 1.0 | Incorrect syntax in the script generated by command "migrations script --idempotent" - Hello,
I have a project with multiple migrations.
One of the migration have multiple procedures like this :
`private static string SCRIPT_TableUserSynchronizationRepository_GetFirstPendingUserSynchronizations =
@"CREATE PROCEDURE procedure_GetFirstPendingUserSynchronizations
[...]
END";`
When i execute the command line "_dotnet ef migrations script_" everything work fine !
But when i execute the code generated by the command "_dotnet ef migrations script --idempotent_" i have several errors !
EF generate this kind of code :
`IF NOT EXISTS(SELECT * FROM [__EFMigrationsHistory] WHERE [MigrationId] = N'20171124083142_addStoredProcedures')
BEGIN
CREATE PROCEDURE procedure_GetFirstPendingUserSynchronizations
[...]
END
END;
GO`
And the error is "Incorrect syntax near the keyword 'Procedure'".
I have found a workaround by putting the "CREATE PROCEDURE" in "EXE(' [...]').
https://stackoverflow.com/questions/16729520/incorrect-syntax-near-the-keyword-procedure.
I use SQL Server 2016 (localdb)
Is it an EF Core bug ?
| non_priority | incorrect syntax in the script generated by command migrations script idempotent hello i have a project with multiple migrations one of the migration have multiple procedures like this private static string script tableusersynchronizationrepository getfirstpendingusersynchronizations create procedure procedure getfirstpendingusersynchronizations end when i execute the command line dotnet ef migrations script everything work fine but when i execute the code generated by the command dotnet ef migrations script idempotent i have several errors ef generate this kind of code if not exists select from where n addstoredprocedures begin create procedure procedure getfirstpendingusersynchronizations end end go and the error is incorrect syntax near the keyword procedure i have found a workaround by putting the create procedure in exe i use sql server localdb is it an ef core bug | 0 |
3,465 | 13,787,103,250 | IssuesEvent | 2020-10-09 03:51:08 | bandprotocol/bandchain | https://api.github.com/repos/bandprotocol/bandchain | closed | Delegate UI test case | automation scan | Let's implement cypress script for testing on delegation flow.
Things should be tested.
- Check the submit button are disabled before input the value on delegation modal.
- The input bar on delegation modal can be inputed.
- Delegate transaction should be send successfully.
- The user amount and the validator's bonded amount should be updated to correct one. (Need to ask @pzshine if you don't know how to do it).
- If you have another test case, feel free to add it more. | 1.0 | Delegate UI test case - Let's implement cypress script for testing on delegation flow.
Things should be tested.
- Check the submit button are disabled before input the value on delegation modal.
- The input bar on delegation modal can be inputed.
- Delegate transaction should be send successfully.
- The user amount and the validator's bonded amount should be updated to correct one. (Need to ask @pzshine if you don't know how to do it).
- If you have another test case, feel free to add it more. | non_priority | delegate ui test case let s implement cypress script for testing on delegation flow things should be tested check the submit button are disabled before input the value on delegation modal the input bar on delegation modal can be inputed delegate transaction should be send successfully the user amount and the validator s bonded amount should be updated to correct one need to ask pzshine if you don t know how to do it if you have another test case feel free to add it more | 0 |
33,366 | 7,700,556,927 | IssuesEvent | 2018-05-20 03:01:22 | joomla/joomla-cms | https://api.github.com/repos/joomla/joomla-cms | closed | [4.0] Frontend login page eye not styled | J4 Issue No Code Attached Yet | ### Steps to reproduce the issue
install Joomla! 4.0.0-alpha3 Alpha [ Amani ] 12-May-2018 15:23 GMT
install demo data in admin
go to frontend
click login button (without entering user/pass)
you get to the login page at
http://example.com/index.php/author-login
### Expected result
eye is styled
### Actual result
eye is not styled
<img width="562" alt="screen shot 2018-05-12 at 21 17 02" src="https://user-images.githubusercontent.com/400092/39961229-d7a1b4ae-5629-11e8-8714-f08325d95a4c.png">
### System information (as much as possible)
### Additional comments
| 1.0 | [4.0] Frontend login page eye not styled - ### Steps to reproduce the issue
install Joomla! 4.0.0-alpha3 Alpha [ Amani ] 12-May-2018 15:23 GMT
install demo data in admin
go to frontend
click login button (without entering user/pass)
you get to the login page at
http://example.com/index.php/author-login
### Expected result
eye is styled
### Actual result
eye is not styled
<img width="562" alt="screen shot 2018-05-12 at 21 17 02" src="https://user-images.githubusercontent.com/400092/39961229-d7a1b4ae-5629-11e8-8714-f08325d95a4c.png">
### System information (as much as possible)
### Additional comments
| non_priority | frontend login page eye not styled steps to reproduce the issue install joomla alpha may gmt install demo data in admin go to frontend click login button without entering user pass you get to the login page at expected result eye is styled actual result eye is not styled img width alt screen shot at src system information as much as possible additional comments | 0 |
264,804 | 20,033,364,580 | IssuesEvent | 2022-02-02 09:16:21 | RalfKoban/MiKo-Analyzers | https://api.github.com/repos/RalfKoban/MiKo-Analyzers | opened | XML documentation texts should not end with exclamation or question mark | feature Area: analyzer Area: documentation | Following should trigger a violation:
```c#
/// <summary>
/// Some text!
/// </summary>
```
The reason is that if it is important, it should be placed inside a `<note type="important">` tag, so that the user, when reading the comment (assuming the website), can immediately see that there is something to be aware of.
Following should NOT trigger a violation:
```c#
/// <summary>
/// Some text.
/// </summary>
/// <remarks>
/// <note type="important">
/// Some warning!
/// </note>
/// </remarks>
```
The same is valid for any other tag except of `<note>` or `<para>` below a `<note>`. | 1.0 | XML documentation texts should not end with exclamation or question mark - Following should trigger a violation:
```c#
/// <summary>
/// Some text!
/// </summary>
```
The reason is that if it is important, it should be placed inside a `<note type="important">` tag, so that the user, when reading the comment (assuming the website), can immediately see that there is something to be aware of.
Following should NOT trigger a violation:
```c#
/// <summary>
/// Some text.
/// </summary>
/// <remarks>
/// <note type="important">
/// Some warning!
/// </note>
/// </remarks>
```
The same is valid for any other tag except of `<note>` or `<para>` below a `<note>`. | non_priority | xml documentation texts should not end with exclamation or question mark following should trigger a violation c some text the reason is that if it is important it should be placed inside a tag so that the user when reading the comment assuming the website can immediately see that there is something to be aware of following should not trigger a violation c some text some warning the same is valid for any other tag except of or below a | 0 |
131,990 | 28,070,962,771 | IssuesEvent | 2023-03-29 19:02:11 | CMPUT301W23T16/AllGasNoBrakes | https://api.github.com/repos/CMPUT301W23T16/AllGasNoBrakes | closed | Fix camera permissions and status | bug Game QR Codes | When is used in the app, it remains on/in use.
Also remains on when the app is in background activity. | 1.0 | Fix camera permissions and status - When is used in the app, it remains on/in use.
Also remains on when the app is in background activity. | non_priority | fix camera permissions and status when is used in the app it remains on in use also remains on when the app is in background activity | 0 |
71,974 | 9,546,254,893 | IssuesEvent | 2019-05-01 19:23:55 | spacetelescope/monitor-framework | https://api.github.com/repos/spacetelescope/monitor-framework | opened | Need to mention that `store_results` is a required method in the docs | documentation | Forgot to include that if the database features are not used, `store_results` must be overridden in a new monitor | 1.0 | Need to mention that `store_results` is a required method in the docs - Forgot to include that if the database features are not used, `store_results` must be overridden in a new monitor | non_priority | need to mention that store results is a required method in the docs forgot to include that if the database features are not used store results must be overridden in a new monitor | 0 |
347,497 | 24,888,958,586 | IssuesEvent | 2022-10-28 10:12:15 | guowei42/ped | https://api.github.com/repos/guowei42/ped | opened | Overall Impression with the UG | type.DocumentationBug severity.Low | I think it's a good idea overall. However, regarding the UG, I think it would be better if the wording could be more targeted to the target audience. For example before the quick start, there can be a at a glance or some sorts that give a general description of each feature, and how it can be used to manage hackathons in particular.
Otherwise, I think would be better to arrange the functionalities into the respective tabs in the address book. For example, Features can be separated into "members" and "tasks" and then under each are the commands that affect them. Though I think in this case, it would be good to mention all commands can be used in any tabs.
Then the Saving the data and Editing the data file could be outside these sections or under "others".
<!--session: 1666948081231-4b686cbb-bf83-4263-8013-893836c0bec3-->
<!--Version: Web v3.4.4--> | 1.0 | Overall Impression with the UG - I think it's a good idea overall. However, regarding the UG, I think it would be better if the wording could be more targeted to the target audience. For example before the quick start, there can be a at a glance or some sorts that give a general description of each feature, and how it can be used to manage hackathons in particular.
Otherwise, I think would be better to arrange the functionalities into the respective tabs in the address book. For example, Features can be separated into "members" and "tasks" and then under each are the commands that affect them. Though I think in this case, it would be good to mention all commands can be used in any tabs.
Then the Saving the data and Editing the data file could be outside these sections or under "others".
<!--session: 1666948081231-4b686cbb-bf83-4263-8013-893836c0bec3-->
<!--Version: Web v3.4.4--> | non_priority | overall impression with the ug i think it s a good idea overall however regarding the ug i think it would be better if the wording could be more targeted to the target audience for example before the quick start there can be a at a glance or some sorts that give a general description of each feature and how it can be used to manage hackathons in particular otherwise i think would be better to arrange the functionalities into the respective tabs in the address book for example features can be separated into members and tasks and then under each are the commands that affect them though i think in this case it would be good to mention all commands can be used in any tabs then the saving the data and editing the data file could be outside these sections or under others | 0 |
435,100 | 30,485,631,433 | IssuesEvent | 2023-07-18 01:47:06 | v6d-io/v6d | https://api.github.com/repos/v6d-io/v6d | closed | Filling missing docstrings in Python and C++ source code | documentation component:client | <!--
Thanks for your contribution! please review https://github.com/v6d-io/v6d/blob/main/CONTRIBUTING.rst before opening an issue.
-->
Describe your problem
---------------------
as titled.
| 1.0 | Filling missing docstrings in Python and C++ source code - <!--
Thanks for your contribution! please review https://github.com/v6d-io/v6d/blob/main/CONTRIBUTING.rst before opening an issue.
-->
Describe your problem
---------------------
as titled.
| non_priority | filling missing docstrings in python and c source code thanks for your contribution please review before opening an issue describe your problem as titled | 0 |
70,072 | 15,051,844,814 | IssuesEvent | 2021-02-03 14:34:08 | fuzzdbunit/fuzzdbunit | https://api.github.com/repos/fuzzdbunit/fuzzdbunit | closed | CVE-2020-10672 (High) detected in jackson-databind-2.9.8.jar | security vulnerability | ## CVE-2020-10672 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: fuzzdbunit/samples/sample-selenium-test/build.gradle</p>
<p>Path to vulnerable library: /tmp/ws-ua_20210130101853_TOITPP/downloadResource_JFIXEQ/20210130102045/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- selenium-jupiter-3.3.5.jar (Root Library)
- docker-client-8.16.0.jar
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fuzzdbunit/fuzzdbunit/commit/b022277c3252b3c8a3bb63a75b4a428a09550e30">b022277c3252b3c8a3bb63a75b4a428a09550e30</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.aries.transaction.jms.internal.XaPooledConnectionFactory (aka aries.transaction.jms).
<p>Publish Date: 2020-03-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10672>CVE-2020-10672</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-10672">https://nvd.nist.gov/vuln/detail/CVE-2020-10672</a></p>
<p>Release Date: 2020-03-18</p>
<p>Fix Resolution: jackson-databind-2.9.10.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-10672 (High) detected in jackson-databind-2.9.8.jar - ## CVE-2020-10672 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: fuzzdbunit/samples/sample-selenium-test/build.gradle</p>
<p>Path to vulnerable library: /tmp/ws-ua_20210130101853_TOITPP/downloadResource_JFIXEQ/20210130102045/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- selenium-jupiter-3.3.5.jar (Root Library)
- docker-client-8.16.0.jar
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fuzzdbunit/fuzzdbunit/commit/b022277c3252b3c8a3bb63a75b4a428a09550e30">b022277c3252b3c8a3bb63a75b4a428a09550e30</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to org.apache.aries.transaction.jms.internal.XaPooledConnectionFactory (aka aries.transaction.jms).
<p>Publish Date: 2020-03-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10672>CVE-2020-10672</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2020-10672">https://nvd.nist.gov/vuln/detail/CVE-2020-10672</a></p>
<p>Release Date: 2020-03-18</p>
<p>Fix Resolution: jackson-databind-2.9.10.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file fuzzdbunit samples sample selenium test build gradle path to vulnerable library tmp ws ua toitpp downloadresource jfixeq jackson databind jar dependency hierarchy selenium jupiter jar root library docker client jar x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org apache aries transaction jms internal xapooledconnectionfactory aka aries transaction jms publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jackson databind step up your open source security game with whitesource | 0 |
202,582 | 23,077,551,811 | IssuesEvent | 2022-07-26 02:11:01 | lyubov888L/long-term-system | https://api.github.com/repos/lyubov888L/long-term-system | opened | CVE-2021-35065 (High) detected in glob-parent-3.1.0.tgz, glob-parent-5.1.1.tgz | security vulnerability | ## CVE-2021-35065 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-5.1.1.tgz</b></p></summary>
<p>
<details><summary><b>glob-parent-3.1.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-stream/node_modules/glob-parent/package.json,/node_modules/glob-watcher/node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- gulp-4.0.2.tgz (Root Library)
- vinyl-fs-3.0.3.tgz
- glob-stream-6.1.0.tgz
- :x: **glob-parent-3.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>glob-parent-5.1.1.tgz</b></p></summary>
<p>Extract the non-magic parent path from a glob string.</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- nodemon-2.0.2.tgz (Root Library)
- chokidar-3.3.1.tgz
- :x: **glob-parent-5.1.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/lyubov888L/long-term-system/commit/70ceacee09bde0d9b6f809a77010ad55db64e593">70ceacee09bde0d9b6f809a77010ad55db64e593</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package glob-parent before 6.0.1 are vulnerable to Regular Expression Denial of Service (ReDoS)
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35065>CVE-2021-35065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-cj88-88mr-972w">https://github.com/advisories/GHSA-cj88-88mr-972w</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: glob-parent - 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-35065 (High) detected in glob-parent-3.1.0.tgz, glob-parent-5.1.1.tgz - ## CVE-2021-35065 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>glob-parent-3.1.0.tgz</b>, <b>glob-parent-5.1.1.tgz</b></p></summary>
<p>
<details><summary><b>glob-parent-3.1.0.tgz</b></p></summary>
<p>Strips glob magic from a string to provide the parent directory path</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-3.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-stream/node_modules/glob-parent/package.json,/node_modules/glob-watcher/node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- gulp-4.0.2.tgz (Root Library)
- vinyl-fs-3.0.3.tgz
- glob-stream-6.1.0.tgz
- :x: **glob-parent-3.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>glob-parent-5.1.1.tgz</b></p></summary>
<p>Extract the non-magic parent path from a glob string.</p>
<p>Library home page: <a href="https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz">https://registry.npmjs.org/glob-parent/-/glob-parent-5.1.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/glob-parent/package.json</p>
<p>
Dependency Hierarchy:
- nodemon-2.0.2.tgz (Root Library)
- chokidar-3.3.1.tgz
- :x: **glob-parent-5.1.1.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/lyubov888L/long-term-system/commit/70ceacee09bde0d9b6f809a77010ad55db64e593">70ceacee09bde0d9b6f809a77010ad55db64e593</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package glob-parent before 6.0.1 are vulnerable to Regular Expression Denial of Service (ReDoS)
<p>Publish Date: 2021-06-22
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35065>CVE-2021-35065</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-cj88-88mr-972w">https://github.com/advisories/GHSA-cj88-88mr-972w</a></p>
<p>Release Date: 2021-06-22</p>
<p>Fix Resolution: glob-parent - 6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in glob parent tgz glob parent tgz cve high severity vulnerability vulnerable libraries glob parent tgz glob parent tgz glob parent tgz strips glob magic from a string to provide the parent directory path library home page a href path to dependency file package json path to vulnerable library node modules glob stream node modules glob parent package json node modules glob watcher node modules glob parent package json dependency hierarchy gulp tgz root library vinyl fs tgz glob stream tgz x glob parent tgz vulnerable library glob parent tgz extract the non magic parent path from a glob string library home page a href path to dependency file package json path to vulnerable library node modules glob parent package json dependency hierarchy nodemon tgz root library chokidar tgz x glob parent tgz vulnerable library found in head commit a href found in base branch main vulnerability details the package glob parent before are vulnerable to regular expression denial of service redos publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution glob parent step up your open source security game with mend | 0 |
417,925 | 28,111,768,816 | IssuesEvent | 2023-03-31 07:47:05 | MrTwit99/ped | https://api.github.com/repos/MrTwit99/ped | opened | Error in documentation of `help` command in UG section `Information on commands’ parameters` | type.DocumentationBug severity.Low | User typing `:help 123` does not perform the same result as `:help`. This contradicts what they have written in the UG.
Image of UG section `Information on commands' parameters`

Output image of user typing `:help 123`

<!--session: 1680242555203-e25b12ab-9efb-4987-9e9b-f87f09a07b21-->
<!--Version: Web v3.4.7--> | 1.0 | Error in documentation of `help` command in UG section `Information on commands’ parameters` - User typing `:help 123` does not perform the same result as `:help`. This contradicts what they have written in the UG.
Image of UG section `Information on commands' parameters`

Output image of user typing `:help 123`

<!--session: 1680242555203-e25b12ab-9efb-4987-9e9b-f87f09a07b21-->
<!--Version: Web v3.4.7--> | non_priority | error in documentation of help command in ug section information on commands’ parameters user typing help does not perform the same result as help this contradicts what they have written in the ug image of ug section information on commands parameters output image of user typing help | 0 |
267,680 | 20,240,723,525 | IssuesEvent | 2022-02-14 09:02:44 | scikit-learn/scikit-learn | https://api.github.com/repos/scikit-learn/scikit-learn | closed | Clarification around `cross_val_predict` cross validator. | Documentation Needs Triage | ### Describe the issue linked to the documentation
This is not necessarily true for [cross_val_predict](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.cross_val_predict.html), right?:
_Each sample belongs to exactly one test set..._
For example, what if your cv was a version of `ShuffleSplit`?
### Suggest a potential alternative/fix
I think either this should be omitted entirely or refer to the fact that the `cv` kwarg dicates what's in which test set. | 1.0 | Clarification around `cross_val_predict` cross validator. - ### Describe the issue linked to the documentation
This is not necessarily true for [cross_val_predict](https://scikit-learn.org/stable/modules/generated/sklearn.model_selection.cross_val_predict.html), right?:
_Each sample belongs to exactly one test set..._
For example, what if your cv was a version of `ShuffleSplit`?
### Suggest a potential alternative/fix
I think either this should be omitted entirely or refer to the fact that the `cv` kwarg dicates what's in which test set. | non_priority | clarification around cross val predict cross validator describe the issue linked to the documentation this is not necessarily true for right each sample belongs to exactly one test set for example what if your cv was a version of shufflesplit suggest a potential alternative fix i think either this should be omitted entirely or refer to the fact that the cv kwarg dicates what s in which test set | 0 |
34,629 | 7,458,142,347 | IssuesEvent | 2018-03-30 08:52:29 | kerdokullamae/test_koik_issued | https://api.github.com/repos/kerdokullamae/test_koik_issued | closed | Tarne pakki lisatud kasutajatel puuduvad õigused | C: AIS P: highest R: fixed T: defect | **Reported by sven syld on 4 Oct 2014 16:32 UTC**
Täpsemalt siis kasutajad, kes lisatakse baasi, kui rakendus nullist üles pannakse; kontrollida üle, kas neil on ikka õigused adminnida muudatusettepanekuid jne, st kõik avar3-3.1 lisandunud võimalused.
Fail on app/propel/sql/10-init-data.sql või midagi sarnast. | 1.0 | Tarne pakki lisatud kasutajatel puuduvad õigused - **Reported by sven syld on 4 Oct 2014 16:32 UTC**
Täpsemalt siis kasutajad, kes lisatakse baasi, kui rakendus nullist üles pannakse; kontrollida üle, kas neil on ikka õigused adminnida muudatusettepanekuid jne, st kõik avar3-3.1 lisandunud võimalused.
Fail on app/propel/sql/10-init-data.sql või midagi sarnast. | non_priority | tarne pakki lisatud kasutajatel puuduvad õigused reported by sven syld on oct utc täpsemalt siis kasutajad kes lisatakse baasi kui rakendus nullist üles pannakse kontrollida üle kas neil on ikka õigused adminnida muudatusettepanekuid jne st kõik lisandunud võimalused fail on app propel sql init data sql või midagi sarnast | 0 |
323,590 | 27,738,408,186 | IssuesEvent | 2023-03-15 12:51:04 | pytorch/pytorch | https://api.github.com/repos/pytorch/pytorch | opened | DISABLED test_variant_consistency_jit_addcmul_cpu_complex64 (__main__.TestJitCPU) | module: flaky-tests skipped module: unknown | Platforms: win, windows
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_variant_consistency_jit_addcmul_cpu_complex64&suite=TestJitCPU) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/12011585407).
Over the past 3 hours, it has been determined flaky in 3 workflow(s) with 3 failures and 3 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_variant_consistency_jit_addcmul_cpu_complex64`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
Test file path: `test_ops_jit.py` | 1.0 | DISABLED test_variant_consistency_jit_addcmul_cpu_complex64 (__main__.TestJitCPU) - Platforms: win, windows
This test was disabled because it is failing in CI. See [recent examples](https://hud.pytorch.org/flakytest?name=test_variant_consistency_jit_addcmul_cpu_complex64&suite=TestJitCPU) and the most recent trunk [workflow logs](https://github.com/pytorch/pytorch/runs/12011585407).
Over the past 3 hours, it has been determined flaky in 3 workflow(s) with 3 failures and 3 successes.
**Debugging instructions (after clicking on the recent samples link):**
DO NOT ASSUME THINGS ARE OKAY IF THE CI IS GREEN. We now shield flaky tests from developers so CI will thus be green but it will be harder to parse the logs.
To find relevant log snippets:
1. Click on the workflow logs linked above
2. Click on the Test step of the job so that it is expanded. Otherwise, the grepping will not work.
3. Grep for `test_variant_consistency_jit_addcmul_cpu_complex64`
4. There should be several instances run (as flaky tests are rerun in CI) from which you can study the logs.
Test file path: `test_ops_jit.py` | non_priority | disabled test variant consistency jit addcmul cpu main testjitcpu platforms win windows this test was disabled because it is failing in ci see and the most recent trunk over the past hours it has been determined flaky in workflow s with failures and successes debugging instructions after clicking on the recent samples link do not assume things are okay if the ci is green we now shield flaky tests from developers so ci will thus be green but it will be harder to parse the logs to find relevant log snippets click on the workflow logs linked above click on the test step of the job so that it is expanded otherwise the grepping will not work grep for test variant consistency jit addcmul cpu there should be several instances run as flaky tests are rerun in ci from which you can study the logs test file path test ops jit py | 0 |
26,624 | 20,362,405,020 | IssuesEvent | 2022-02-20 21:37:09 | APSIMInitiative/ApsimX | https://api.github.com/repos/APSIMInitiative/ApsimX | closed | Solute UI components need refactoring - SWIM and SoilWater | interface/infrastructure refactor | Need to be suitable for both models | 1.0 | Solute UI components need refactoring - SWIM and SoilWater - Need to be suitable for both models | non_priority | solute ui components need refactoring swim and soilwater need to be suitable for both models | 0 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.