Dataset schema (column, dtype, and value range or number of distinct classes):

| Column | Dtype | Range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | string | 1 class |
| created_at | string | length 19 – 19 |
| repo | string | length 7 – 112 |
| repo_url | string | length 36 – 141 |
| action | string | 3 classes |
| title | string | length 1 – 744 |
| labels | string | length 4 – 574 |
| body | string | length 9 – 211k |
| index | string | 10 classes |
| text_combine | string | length 96 – 211k |
| label | string | 2 classes |
| text | string | length 96 – 188k |
| binary_label | int64 | 0 – 1 |

The sample rows below are shown one record per block, with each column labelled; the long text fields (body, text_combine, text) are reproduced verbatim.
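As a minimal sketch of how rows with this schema could be loaded and split by class (the file name `issues.csv`, the CSV format, and the use of pandas are assumptions for illustration; the preview does not say how the dump is stored):

```python
import pandas as pd

# Assumed file name and format; the preview does not specify how the dump is stored.
df = pd.read_csv("issues.csv")

# Columns listed in the schema above.
cols = ["Unnamed: 0", "id", "type", "created_at", "repo", "repo_url", "action",
        "title", "labels", "body", "index", "text_combine", "label", "text",
        "binary_label"]
df = df[cols]

# `label` is the string class ("process" / "non_process"); `binary_label` is its 0/1 encoding.
process_rows = df[df["binary_label"] == 1]
non_process_rows = df[df["binary_label"] == 0]
print(len(process_rows), "process rows,", len(non_process_rows), "non_process rows")
```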
***
Unnamed: 0: 125,104
id: 26,595,078,535
type: IssuesEvent
created_at: 2023-01-23 11:50:24
repo: sourcegraph/sourcegraph
repo_url: https://api.github.com/repos/sourcegraph/sourcegraph
action: closed
title: insights: export datapoints including truncated points
labels: team/code-insights backend strategic insights-data-retention
body:
Follow up from https://github.com/sourcegraph/sourcegraph/issues/45745
We will want to support the ability to export data as if from a user context. This data should include both data available on the chart (series_points / series_points_snapshots) as well as any truncated data for the series. Exporting in CSV is probably simple enough, and I think we do that elsewhere in the product.
/cc @joelkw @felixfbecker @vovakulikov
index: 1.0
text_combine:
insights: export datapoints including truncated points - Follow up from https://github.com/sourcegraph/sourcegraph/issues/45745
We will want to support the ability to export data as if from a user context. This data should include both data available on the chart (series_points / series_points_snapshots) as well as any truncated data for the series. Exporting in CSV is probably simple enough, and I think we do that elsewhere in the product.
/cc @joelkw @felixfbecker @vovakulikov
label: non_process
text:
insights export datapoints including truncated points follow up from we will want to support the ability to export data as if from a user context this data should include both data available on the chart series points series points snapshots as well as any truncated data for the series exporting in csv is probably simple enough and i think we do that elsewhere in the product cc joelkw felixfbecker vovakulikov
binary_label: 0
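Comparing `text_combine` with `text` in the record above, `text` reads as a lowercased copy with URLs, markup, digits, and punctuation removed. The exact preprocessing behind the `text` column is not documented in this preview, so the following is only a rough, assumed approximation of that normalization:

```python
import re

def normalize(text_combine: str) -> str:
    """Rough, assumed approximation of the `text` column (not the dataset's actual pipeline)."""
    s = text_combine.lower()
    s = re.sub(r"https?://\S+", " ", s)    # drop URLs
    s = re.sub(r"<[^>]+>", " ", s)         # drop HTML/markup tags
    s = re.sub(r"[\W\d_]+", " ", s)        # replace punctuation, digits, and underscores with spaces
    return re.sub(r"\s+", " ", s).strip()  # collapse whitespace
```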
***
Unnamed: 0: 18,560
id: 24,555,669,779
type: IssuesEvent
created_at: 2022-10-12 15:40:56
repo: GoogleCloudPlatform/fda-mystudies
repo_url: https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
action: closed
title: [Android] [Offline indicator] 'You are offline' error message is not getting displayed when participant clicks on the 'Notifications icon
labels: Bug P1 Android Process: Fixed Process: Tested QA Process: Tested dev
body:
**AR:** Blank screen is getting displayed and the 'You are offline' error message is not getting displayed when the participant clicks on the 'Notifications' icon, when participant's are offline
**ER:** 'You are offline' error message should get displayed

index: 3.0
text_combine:
[Android] [Offline indicator] 'You are offline' error message is not getting displayed when participant clicks on the 'Notifications icon - **AR:** Blank screen is getting displayed and the 'You are offline' error message is not getting displayed when the participant clicks on the 'Notifications' icon, when participant's are offline
**ER:** 'You are offline' error message should get displayed

label: process
text:
you are offline error message is not getting displayed when participant clicks on the notifications icon ar blank screen is getting displayed and the you are offline error message is not getting displayed when the participant clicks on the notifications icon when participant s are offline er you are offline error message should get displayed
binary_label: 1
***
Unnamed: 0: 176,551
id: 21,411,766,014
type: IssuesEvent
created_at: 2022-04-22 06:57:06
repo: AlexRogalskiy/java-patterns
repo_url: https://api.github.com/repos/AlexRogalskiy/java-patterns
action: opened
title: CVE-2021-43138 (High) detected in multiple libraries
labels: security vulnerability
body:
## CVE-2021-43138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>async-1.5.2.tgz</b>, <b>async-1.0.0.tgz</b>, <b>async-0.2.10.tgz</b></p></summary>
<p>
<details><summary><b>async-1.5.2.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.5.2.tgz">https://registry.npmjs.org/async/-/async-1.5.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/portscanner/node_modules/async/package.json,/node_modules/npmi/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/package.json,/node_modules/nconf/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- gitbook-cli-2.3.2.tgz (Root Library)
- npmi-1.0.1.tgz
- npm-2.15.12.tgz
- request-2.74.0.tgz
- form-data-1.0.0-rc4.tgz
- :x: **async-1.5.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-1.0.0.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.0.0.tgz">https://registry.npmjs.org/async/-/async-1.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/winston/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- dockerfile_lint-0.3.4.tgz (Root Library)
- winston-2.4.5.tgz
- :x: **async-1.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-0.2.10.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.2.10.tgz">https://registry.npmjs.org/async/-/async-0.2.10.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/prompt/node_modules/async/package.json,/node_modules/utile/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- jscs-3.0.7.tgz (Root Library)
- prompt-0.2.14.tgz
- utile-0.2.1.tgz
- :x: **async-0.2.10.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/java-patterns/commit/0e3f838823fb09cc237bb3fc8f2e2651a2d0f0e6">0e3f838823fb09cc237bb3fc8f2e2651a2d0f0e6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2) , which could let a malicious user obtain privileges via the mapValues() method.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution: async - v3.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
text_combine:
CVE-2021-43138 (High) detected in multiple libraries - ## CVE-2021-43138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>async-1.5.2.tgz</b>, <b>async-1.0.0.tgz</b>, <b>async-0.2.10.tgz</b></p></summary>
<p>
<details><summary><b>async-1.5.2.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.5.2.tgz">https://registry.npmjs.org/async/-/async-1.5.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/portscanner/node_modules/async/package.json,/node_modules/npmi/node_modules/npm/node_modules/request/node_modules/form-data/node_modules/async/package.json,/node_modules/nconf/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- gitbook-cli-2.3.2.tgz (Root Library)
- npmi-1.0.1.tgz
- npm-2.15.12.tgz
- request-2.74.0.tgz
- form-data-1.0.0-rc4.tgz
- :x: **async-1.5.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-1.0.0.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.0.0.tgz">https://registry.npmjs.org/async/-/async-1.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/winston/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- dockerfile_lint-0.3.4.tgz (Root Library)
- winston-2.4.5.tgz
- :x: **async-1.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-0.2.10.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-0.2.10.tgz">https://registry.npmjs.org/async/-/async-0.2.10.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/prompt/node_modules/async/package.json,/node_modules/utile/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- jscs-3.0.7.tgz (Root Library)
- prompt-0.2.14.tgz
- utile-0.2.1.tgz
- :x: **async-0.2.10.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/java-patterns/commit/0e3f838823fb09cc237bb3fc8f2e2651a2d0f0e6">0e3f838823fb09cc237bb3fc8f2e2651a2d0f0e6</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2) , which could let a malicious user obtain privileges via the mapValues() method.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution: async - v3.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
text:
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries async tgz async tgz async tgz async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules portscanner node modules async package json node modules npmi node modules npm node modules request node modules form data node modules async package json node modules nconf node modules async package json dependency hierarchy gitbook cli tgz root library npmi tgz npm tgz request tgz form data tgz x async tgz vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules winston node modules async package json dependency hierarchy dockerfile lint tgz root library winston tgz x async tgz vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules prompt node modules async package json node modules utile node modules async package json dependency hierarchy jscs tgz root library prompt tgz utile tgz x async tgz vulnerable library found in head commit a href found in base branch master vulnerability details a vulnerability exists in async through fixed in which could let a malicious user obtain privileges via the mapvalues method publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution async step up your open source security game with whitesource
binary_label: 0
***
Unnamed: 0: 12,164
id: 9,582,641,269
type: IssuesEvent
created_at: 2019-05-08 01:38:46
repo: microsoft/vscode-cpptools
repo_url: https://api.github.com/repos/microsoft/vscode-cpptools
action: closed
title: cppStandard no longer seems to be respected on insiders version
labels: Language Service bug fixed (release pending) insiders quick fix regression
body:
**Type: LanguageService**
**Describe the bug**
- OS and Version: Windows 1803(?) update
- VS Code Version: 1.34.0-insiders
- C/C++ Extension Version: 0.23.0-insiders
**To Reproduce**
1. Have the above installed.
2. Create a blank directory and open it in Code Insiders
3. Create a new cpp file and write the contents:
```c++
#include <string_view>
std::string_view s = "foo";
```
There will be red squiggles on `std::string_view`. You can edit the `c_cpp_properties.json` file to include the line `"cppStandard": "c++17"`, but that doesn't seem to help. No matter what the deduced C++ version is always less than C++17 (C++14?).
**Expected behavior**
Once adding `"cppStandard": "c++17"` to the configuration file, C++17 features should not give intellisense errors.
index: 1.0
text_combine:
cppStandard no longer seems to be respected on insiders version - **Type: LanguageService**
**Describe the bug**
- OS and Version: Windows 1803(?) update
- VS Code Version: 1.34.0-insiders
- C/C++ Extension Version: 0.23.0-insiders
**To Reproduce**
1. Have the above installed.
2. Create a blank directory and open it in Code Insiders
3. Create a new cpp file and write the contents:
```c++
#include <string_view>
std::string_view s = "foo";
```
There will be red squiggles on `std::string_view`. You can edit the `c_cpp_properties.json` file to include the line `"cppStandard": "c++17"`, but that doesn't seem to help. No matter what the deduced C++ version is always less than C++17 (C++14?).
**Expected behavior**
Once adding `"cppStandard": "c++17"` to the configuration file, C++17 features should not give intellisense errors.
label: non_process
text:
cppstandard no longer seems to be respected on insiders version type languageservice describe the bug os and version windows update vs code version insiders c c extension version insiders to reproduce have the above installed create a blank directory and open it in code insiders create a new cpp file and write the contents c include std string view s foo there will be red squiggles on std string view you can edit the c cpp properties json file to include the line cppstandard c but that doesn t seem to help no matter what the deduced c version is always less than c c expected behavior once adding cppstandard c to the configuration file c features should not give intellisense errors
binary_label: 0
***
Unnamed: 0: 8,710
id: 11,851,142,896
type: IssuesEvent
created_at: 2020-03-24 17:40:00
repo: MHRA/products
repo_url: https://api.github.com/repos/MHRA/products
action: closed
title: Create worker panics when connection to the SFTP server fails
labels: BUG :bug: EPIC - Auto Batch Process :oncoming_automobile:
body:
It looks like the create worker will panic when it can't connect to the SFTP server.
```
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Os { code: 61, kind: ConnectionRefused, message: "Connection refused" }', src/create_manager/sftp_client.rs:10:15
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```
After this, the create worker ceases to respond to messages.
To reproduce, set `SENTINEL_SFTP_SERVER` to a server which doesn't exit.
_Originally posted by @craiga in https://github.com/MHRA/products/issues/427#issuecomment-602729275_
index: 1.0
text_combine:
Create worker panics when connection to the SFTP server fails - It looks like the create worker will panic when it can't connect to the SFTP server.
```
thread 'main' panicked at 'called `Result::unwrap()` on an `Err` value: Os { code: 61, kind: ConnectionRefused, message: "Connection refused" }', src/create_manager/sftp_client.rs:10:15
note: run with `RUST_BACKTRACE=1` environment variable to display a backtrace
```
After this, the create worker ceases to respond to messages.
To reproduce, set `SENTINEL_SFTP_SERVER` to a server which doesn't exit.
_Originally posted by @craiga in https://github.com/MHRA/products/issues/427#issuecomment-602729275_
label: process
text:
create worker panics when connection to the sftp server fails it looks like the create worker will panic when it can t connect to the sftp server thread main panicked at called result unwrap on an err value os code kind connectionrefused message connection refused src create manager sftp client rs note run with rust backtrace environment variable to display a backtrace after this the create worker ceases to respond to messages to reproduce set sentinel sftp server to a server which doesn t exit originally posted by craiga in
binary_label: 1
***
Unnamed: 0: 775
id: 3,257,746,844
type: IssuesEvent
created_at: 2015-10-20 19:11:41
repo: nodejs/node
repo_url: https://api.github.com/repos/nodejs/node
action: closed
title: Undocumented options for child_process.exec and execSync?
labels: child_process doc good first contribution
body:
I assume that `exec` and `execSync` have the same set of options, but in the documentation some are missing:
The `shell` option mentioned in [`child_process.exec`](https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback) is absent from the [`execSync`](https://nodejs.org/api/child_process.html#child_process_child_process_execsync_command_options) documentation.
~~Conversely, the `input` and `stdio` options mentioned in `child_process.execSync` are absent from the `exec` documentation.~~ *Edit:* Hm, I'm not sure if these are actually supported by `exec`. Strange.
index: 1.0
text_combine:
Undocumented options for child_process.exec and execSync? - I assume that `exec` and `execSync` have the same set of options, but in the documentation some are missing:
The `shell` option mentioned in [`child_process.exec`](https://nodejs.org/api/child_process.html#child_process_child_process_exec_command_options_callback) is absent from the [`execSync`](https://nodejs.org/api/child_process.html#child_process_child_process_execsync_command_options) documentation.
~~Conversely, the `input` and `stdio` options mentioned in `child_process.execSync` are absent from the `exec` documentation.~~ *Edit:* Hm, I'm not sure if these are actually supported by `exec`. Strange.
label: process
text:
undocumented options for child process exec and execsync i assume that exec and execsync have the same set of options but in the documentation some are missing the shell option mentioned in is absent from the documentation conversely the input and stdio options mentioned in child process execsync are absent from the exec documentation edit hm i m not sure if these are actually supported by exec strange
binary_label: 1
***
Unnamed: 0: 17,639
id: 3,012,784,626
type: IssuesEvent
created_at: 2015-07-29 02:34:06
repo: yawlfoundation/yawl
repo_url: https://api.github.com/repos/yawlfoundation/yawl
action: closed
title: Change of data type is not reflected at runtime
labels: auto-migrated Category-ResService Priority-High Type-Defect
body:
```
In the attached example there is an element 'price' (part of the complex
type 'item') which used to have the type 'xs:decimal". I changed it
to 'listofitems' which is defined as a list of strings. At runtime the
price field is still expecting a decimal.
```
Original issue reported on code.google.com by `arthurte...@gmail.com` on 27 Aug 2008 at 7:27
Attachments:
* [new79.ywl](https://storage.googleapis.com/google-code-attachments/yawl/issue-127/comment-0/new79.ywl)
* [new79.xml](https://storage.googleapis.com/google-code-attachments/yawl/issue-127/comment-0/new79.xml)
* [new79.backup](https://storage.googleapis.com/google-code-attachments/yawl/issue-127/comment-0/new79.backup)
index: 1.0
text_combine:
Change of data type is not reflected at runtime - ```
In the attached example there is an element 'price' (part of the complex
type 'item') which used to have the type 'xs:decimal". I changed it
to 'listofitems' which is defined as a list of strings. At runtime the
price field is still expecting a decimal.
```
Original issue reported on code.google.com by `arthurte...@gmail.com` on 27 Aug 2008 at 7:27
Attachments:
* [new79.ywl](https://storage.googleapis.com/google-code-attachments/yawl/issue-127/comment-0/new79.ywl)
* [new79.xml](https://storage.googleapis.com/google-code-attachments/yawl/issue-127/comment-0/new79.xml)
* [new79.backup](https://storage.googleapis.com/google-code-attachments/yawl/issue-127/comment-0/new79.backup)
label: non_process
text:
change of data type is not reflected at runtime in the attached example there is an element price part of the complex type item which used to have the type xs decimal i changed it to listofitems which is defined as a list of strings at runtime the price field is still expecting a decimal original issue reported on code google com by arthurte gmail com on aug at attachments
binary_label: 0
***
Unnamed: 0: 8,871
id: 11,964,894,118
type: IssuesEvent
created_at: 2020-04-05 21:17:28
repo: arcum-omni/Coquo
repo_url: https://api.github.com/repos/arcum-omni/Coquo
action: closed
title: Setup Branch Protection Rules
labels: dev process
body:
Require status checks to pass before merging
Require branches to be up to date before merging
Build Project/Solution
index: 1.0
text_combine:
Setup Branch Protection Rules - Require status checks to pass before merging
Require branches to be up to date before merging
Build Project/Solution
label: process
text:
setup branch protection rules require status checks to pass before merging require branches to be up to date before merging build project solution
binary_label: 1
***
Unnamed: 0: 6,439
id: 9,544,249,969
type: IssuesEvent
created_at: 2019-05-01 13:38:40
repo: cypress-io/cypress
repo_url: https://api.github.com/repos/cypress-io/cypress
action: closed
title: Detect forked pull request on AppVeyor CI
labels: CI: appveyor process: tests stage: work in progress type: chore
body:
We have special logic to detect forked pull requests on CircleCI and run tests without recording even when the CI file says `cypress run --record`. (we do show a warning in the terminal)
I have noticed that we don't detect this situation on AppVeyor, see https://ci.appveyor.com/project/cypress-io/cypress-example-kitchensink/builds/24174895 for example
```
run-if npm run test:ci:record
Platform win32 is allowed
Arch ia32 is allowed
> cypress-example-kitchensink@0.0.0-development test:ci:record C:\projects\cypress-example-kitchensink
> run-p --race start:ci e2e:record
> cypress-example-kitchensink@0.0.0-development start:ci C:\projects\cypress-example-kitchensink
> http-server app -c-1 --silent
> cypress-example-kitchensink@0.0.0-development e2e:record C:\projects\cypress-example-kitchensink
> cypress run --record
You passed the --record flag but did not provide us your Record Key.
You can pass us your Record Key like this:
cypress run --record --key <record_key>
You can also set the key as an environment variable with the name CYPRESS_RECORD_KEY.
https://on.cypress.io/how-do-i-record-runs
npm ERR! code ELIFECYCLE
npm ERR! errno 1
```
Environment variables to use to detect this situation
```
print-env APPVEYOR
APPVEYOR=True
APPVEYOR_ACCOUNT_NAME=cypress-io
APPVEYOR_API_URL=http://localhost:1031/
APPVEYOR_BUILD_AGENT_HYPERV_NIC_CONFIGURED=true
APPVEYOR_BUILD_FOLDER=C:\projects\cypress-example-kitchensink
APPVEYOR_BUILD_ID=24174895
APPVEYOR_BUILD_NUMBER=1125
APPVEYOR_BUILD_VERSION=1.0.1125
APPVEYOR_BUILD_WORKER_IMAGE=Visual Studio 2015
APPVEYOR_JOB_ID=vq2vocqjxhwwik0r
APPVEYOR_JOB_NUMBER=1
APPVEYOR_PROJECT_ID=369103
APPVEYOR_PROJECT_NAME=cypress-example-kitchensink
APPVEYOR_PROJECT_SLUG=cypress-example-kitchensink
APPVEYOR_PULL_REQUEST_HEAD_COMMIT=0f79966c703a3864cb74a993a4e25ded91fbed61
APPVEYOR_PULL_REQUEST_HEAD_REPO_BRANCH=patch-2
APPVEYOR_PULL_REQUEST_HEAD_REPO_NAME=mguery/cypress-example-kitchensink
APPVEYOR_PULL_REQUEST_NUMBER=241
APPVEYOR_PULL_REQUEST_TITLE=Update index.html
APPVEYOR_REPO_BRANCH=master
APPVEYOR_REPO_COMMIT=1fed0b1223afaa50c5df2a4586e1c9e67dcca90b
APPVEYOR_REPO_COMMIT_AUTHOR=Marjy Guery
APPVEYOR_REPO_COMMIT_AUTHOR_EMAIL=marjy@cypress.io
APPVEYOR_REPO_COMMIT_MESSAGE=Update index.html
APPVEYOR_REPO_COMMIT_TIMESTAMP=2019-04-29T13:21:01.0000000Z
APPVEYOR_REPO_NAME=cypress-io/cypress-example-kitchensink
APPVEYOR_REPO_PROVIDER=gitHub
APPVEYOR_REPO_SCM=git
APPVEYOR_REPO_TAG=false
APPVEYOR_URL=https://ci.appveyor.com
commit-message-install --else "npm ci"
```
index: 1.0
text_combine:
Detect forked pull request on AppVeyor CI - We have special logic to detect forked pull requests on CircleCI and run tests without recording even when the CI file says `cypress run --record`. (we do show a warning in the terminal)
I have noticed that we don't detect this situation on AppVeyor, see https://ci.appveyor.com/project/cypress-io/cypress-example-kitchensink/builds/24174895 for example
```
run-if npm run test:ci:record
Platform win32 is allowed
Arch ia32 is allowed
> cypress-example-kitchensink@0.0.0-development test:ci:record C:\projects\cypress-example-kitchensink
> run-p --race start:ci e2e:record
> cypress-example-kitchensink@0.0.0-development start:ci C:\projects\cypress-example-kitchensink
> http-server app -c-1 --silent
> cypress-example-kitchensink@0.0.0-development e2e:record C:\projects\cypress-example-kitchensink
> cypress run --record
You passed the --record flag but did not provide us your Record Key.
You can pass us your Record Key like this:
cypress run --record --key <record_key>
You can also set the key as an environment variable with the name CYPRESS_RECORD_KEY.
https://on.cypress.io/how-do-i-record-runs
npm ERR! code ELIFECYCLE
npm ERR! errno 1
```
Environment variables to use to detect this situation
```
print-env APPVEYOR
APPVEYOR=True
APPVEYOR_ACCOUNT_NAME=cypress-io
APPVEYOR_API_URL=http://localhost:1031/
APPVEYOR_BUILD_AGENT_HYPERV_NIC_CONFIGURED=true
APPVEYOR_BUILD_FOLDER=C:\projects\cypress-example-kitchensink
APPVEYOR_BUILD_ID=24174895
APPVEYOR_BUILD_NUMBER=1125
APPVEYOR_BUILD_VERSION=1.0.1125
APPVEYOR_BUILD_WORKER_IMAGE=Visual Studio 2015
APPVEYOR_JOB_ID=vq2vocqjxhwwik0r
APPVEYOR_JOB_NUMBER=1
APPVEYOR_PROJECT_ID=369103
APPVEYOR_PROJECT_NAME=cypress-example-kitchensink
APPVEYOR_PROJECT_SLUG=cypress-example-kitchensink
APPVEYOR_PULL_REQUEST_HEAD_COMMIT=0f79966c703a3864cb74a993a4e25ded91fbed61
APPVEYOR_PULL_REQUEST_HEAD_REPO_BRANCH=patch-2
APPVEYOR_PULL_REQUEST_HEAD_REPO_NAME=mguery/cypress-example-kitchensink
APPVEYOR_PULL_REQUEST_NUMBER=241
APPVEYOR_PULL_REQUEST_TITLE=Update index.html
APPVEYOR_REPO_BRANCH=master
APPVEYOR_REPO_COMMIT=1fed0b1223afaa50c5df2a4586e1c9e67dcca90b
APPVEYOR_REPO_COMMIT_AUTHOR=Marjy Guery
APPVEYOR_REPO_COMMIT_AUTHOR_EMAIL=marjy@cypress.io
APPVEYOR_REPO_COMMIT_MESSAGE=Update index.html
APPVEYOR_REPO_COMMIT_TIMESTAMP=2019-04-29T13:21:01.0000000Z
APPVEYOR_REPO_NAME=cypress-io/cypress-example-kitchensink
APPVEYOR_REPO_PROVIDER=gitHub
APPVEYOR_REPO_SCM=git
APPVEYOR_REPO_TAG=false
APPVEYOR_URL=https://ci.appveyor.com
commit-message-install --else "npm ci"
```
label: process
text:
detect forked pull request on appveyor ci we have special logic to detect forked pull requests on circleci and run tests without recording even when the ci file says cypress run record we do show a warning in the terminal i have noticed that we don t detect this situation on appveyor see for example run if npm run test ci record platform is allowed arch is allowed cypress example kitchensink development test ci record c projects cypress example kitchensink run p race start ci record cypress example kitchensink development start ci c projects cypress example kitchensink http server app c silent cypress example kitchensink development record c projects cypress example kitchensink cypress run record you passed the record flag but did not provide us your record key you can pass us your record key like this cypress run record key you can also set the key as an environment variable with the name cypress record key npm err code elifecycle npm err errno environment variables to use to detect this situation print env appveyor appveyor true appveyor account name cypress io appveyor api url appveyor build agent hyperv nic configured true appveyor build folder c projects cypress example kitchensink appveyor build id appveyor build number appveyor build version appveyor build worker image visual studio appveyor job id appveyor job number appveyor project id appveyor project name cypress example kitchensink appveyor project slug cypress example kitchensink appveyor pull request head commit appveyor pull request head repo branch patch appveyor pull request head repo name mguery cypress example kitchensink appveyor pull request number appveyor pull request title update index html appveyor repo branch master appveyor repo commit appveyor repo commit author marjy guery appveyor repo commit author email marjy cypress io appveyor repo commit message update index html appveyor repo commit timestamp appveyor repo name cypress io cypress example kitchensink appveyor repo provider github appveyor repo scm git appveyor repo tag false appveyor url commit message install else npm ci
binary_label: 1
***
Unnamed: 0: 553,185
id: 16,359,603,650
type: IssuesEvent
created_at: 2021-05-14 07:18:38
repo: ita-social-projects/TeachUA
repo_url: https://api.github.com/repos/ita-social-projects/TeachUA
action: closed
title: [Мій профіль] The club's categories are not shown on the club's card
labels: Priority: Medium bug
body:
**Environment:** Windows 10, Google Chrome, 88.0.4324.182, (64-bit).
**Reproducible:** always
**Build found:** last commit from https://speak-ukrainian.org.ua/dev
**Steps to reproduce**
1. Go to 'https://speak-ukrainian.org.ua/dev' and login as 'Керівник'
2. Click on 'Мій профіль'
3. Look at the club's card
**Actual result**

**Expected result**

**User story and test case links**
User story #TBD
**Labels to be added**
"Bug", Priority ("medium"), Severity ("major"), Type ("Functional").
index: 1.0
text_combine:
[Мій профіль] The club's categories are not shown on the club's card - **Environment:** Windows 10, Google Chrome, 88.0.4324.182, (64-bit).
**Reproducible:** always
**Build found:** last commit from https://speak-ukrainian.org.ua/dev
**Steps to reproduce**
1. Go to 'https://speak-ukrainian.org.ua/dev' and login as 'Керівник'
2. Click on 'Мій профіль'
3. Look at the club's card
**Actual result**

**Expected result**

**User story and test case links**
User story #TBD
**Labels to be added**
"Bug", Priority ("medium"), Severity ("major"), Type ("Functional").
label: non_process
text:
the club s categories are not shown on the club s card environment windows google chrome bit reproducible always build found last commit from steps to reproduce go to and login as керівник click on мій профіль look at the club s card actual result expected result user story and test case links user story tbd labels to be added bug priority medium severity major type functional
binary_label: 0
***
Unnamed: 0: 26,123
id: 7,785,287,173
type: IssuesEvent
created_at: 2018-06-06 15:27:52
repo: USDA-FSA/fsa-design-system
repo_url: https://api.github.com/repos/USDA-FSA/fsa-design-system
action: closed
title: Update font usage
labels: Category: Site Design & Build P2
body:
Scrub through doc site SCSS for implications of removing serif (#166, #164) and adjust as needed.
index: 1.0
text_combine:
Update font usage - Scrub through doc site SCSS for implications of removing serif (#166, #164) and adjust as needed.
label: non_process
text:
update font usage scrub through doc site scss for implications of removing serif and adjust as needed
binary_label: 0
***
Unnamed: 0: 232,736
id: 25,603,833,153
type: IssuesEvent
created_at: 2022-12-01 22:59:13
repo: amplify-education/tmp_SAST_eval_WebGoat
repo_url: https://api.github.com/repos/amplify-education/tmp_SAST_eval_WebGoat
action: opened
title: commons-io-2.6.jar: 1 vulnerabilities (highest severity is: 4.8)
labels: security vulnerability
body:
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-io-2.6.jar</b></p></summary>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/amplify-education/tmp_SAST_eval_WebGoat/commit/320c43c0f5a8ea47b0ef17801fb70028d38a8e14">320c43c0f5a8ea47b0ef17801fb70028d38a8e14</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (commons-io version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2021-29425](https://www.mend.io/vulnerability-database/CVE-2021-29425) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.8 | commons-io-2.6.jar | Direct | 2.7 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-29425</summary>
### Vulnerable Library - <b>commons-io-2.6.jar</b></p>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>
Dependency Hierarchy:
- :x: **commons-io-2.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/amplify-education/tmp_SAST_eval_WebGoat/commit/320c43c0f5a8ea47b0ef17801fb70028d38a8e14">320c43c0f5a8ea47b0ef17801fb70028d38a8e14</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Apache Commons IO before 2.7, When invoking the method FileNameUtils.normalize with an improper input string, like "//../foo", or "\\..\foo", the result would be the same value, thus possibly providing access to files in the parent directory, but not further above (thus "limited" path traversal), if the calling code would use the result to construct a path value.
<p>Publish Date: 2021-04-13
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-29425>CVE-2021-29425</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425</a></p>
<p>Release Date: 2021-04-13</p>
<p>Fix Resolution: 2.7</p>
</p>
<p></p>
</details>
index: True
text_combine:
commons-io-2.6.jar: 1 vulnerabilities (highest severity is: 4.8) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-io-2.6.jar</b></p></summary>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/amplify-education/tmp_SAST_eval_WebGoat/commit/320c43c0f5a8ea47b0ef17801fb70028d38a8e14">320c43c0f5a8ea47b0ef17801fb70028d38a8e14</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (commons-io version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2021-29425](https://www.mend.io/vulnerability-database/CVE-2021-29425) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 4.8 | commons-io-2.6.jar | Direct | 2.7 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-29425</summary>
### Vulnerable Library - <b>commons-io-2.6.jar</b></p>
<p>The Apache Commons IO library contains utility classes, stream implementations, file filters,
file comparators, endian transformation classes, and much more.</p>
<p>
Dependency Hierarchy:
- :x: **commons-io-2.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/amplify-education/tmp_SAST_eval_WebGoat/commit/320c43c0f5a8ea47b0ef17801fb70028d38a8e14">320c43c0f5a8ea47b0ef17801fb70028d38a8e14</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Apache Commons IO before 2.7, When invoking the method FileNameUtils.normalize with an improper input string, like "//../foo", or "\\..\foo", the result would be the same value, thus possibly providing access to files in the parent directory, but not further above (thus "limited" path traversal), if the calling code would use the result to construct a path value.
<p>Publish Date: 2021-04-13
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-29425>CVE-2021-29425</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>4.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-29425</a></p>
<p>Release Date: 2021-04-13</p>
<p>Fix Resolution: 2.7</p>
</p>
<p></p>
</details>
label: non_process
text:
commons io jar vulnerabilities highest severity is vulnerable library commons io jar the apache commons io library contains utility classes stream implementations file filters file comparators endian transformation classes and much more found in head commit a href vulnerabilities cve severity cvss dependency type fixed in commons io version remediation available medium commons io jar direct details cve vulnerable library commons io jar the apache commons io library contains utility classes stream implementations file filters file comparators endian transformation classes and much more dependency hierarchy x commons io jar vulnerable library found in head commit a href found in base branch develop vulnerability details in apache commons io before when invoking the method filenameutils normalize with an improper input string like foo or foo the result would be the same value thus possibly providing access to files in the parent directory but not further above thus limited path traversal if the calling code would use the result to construct a path value publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution
binary_label: 0
***
Unnamed: 0: 15,201
id: 19,025,638,119
type: IssuesEvent
created_at: 2021-11-24 02:54:26
repo: streamnative/flink
repo_url: https://api.github.com/repos/streamnative/flink
action: closed
title: [BUG] Not Confirmed Yet: PulsarSourceSplitReader tests shows initial seeking is not working.
labels: platform/data-processing type/bug
body:
```
@Test
@Disabled
void consumeMessageCreatedBeforeHandleSplitsChangesAndUseSecondMessageIdCursor() {
PulsarPartitionSplitReaderBase<String> splitReader = splitReader();
String topicName = randomAlphabetic(10);
operator().setupTopic(topicName, STRING, () -> randomAlphabetic(10));
seekStartPositionAndHandleSplit(splitReader, topicName, 0, new MessageIdImpl(0, 1, -1));
fetchedMessages(splitReader, NUM_RECORDS_PER_PARTITION - 1, true);
}
```
When we seek startPosition to MessageId(0, 1, -1), we expect the first message would be skiped and only consume 19 messages, but when executed it will return 20 messsages. Root cause not found, might be a false negative. Still digging
index: 1.0
text_combine:
[BUG] Not Confirmed Yet: PulsarSourceSplitReader tests shows initial seeking is not working. -
```
@Test
@Disabled
void consumeMessageCreatedBeforeHandleSplitsChangesAndUseSecondMessageIdCursor() {
PulsarPartitionSplitReaderBase<String> splitReader = splitReader();
String topicName = randomAlphabetic(10);
operator().setupTopic(topicName, STRING, () -> randomAlphabetic(10));
seekStartPositionAndHandleSplit(splitReader, topicName, 0, new MessageIdImpl(0, 1, -1));
fetchedMessages(splitReader, NUM_RECORDS_PER_PARTITION - 1, true);
}
```
When we seek startPosition to MessageId(0, 1, -1), we expect the first message would be skiped and only consume 19 messages, but when executed it will return 20 messsages. Root cause not found, might be a false negative. Still digging
label: process
text:
not confirmed yet pulsarsourcesplitreader tests shows initial seeking is not working test disabled void consumemessagecreatedbeforehandlesplitschangesandusesecondmessageidcursor pulsarpartitionsplitreaderbase splitreader splitreader string topicname randomalphabetic operator setuptopic topicname string randomalphabetic seekstartpositionandhandlesplit splitreader topicname new messageidimpl fetchedmessages splitreader num records per partition true when we seek startposition to messageid we expect the first message would be skiped and only consume messages but when executed it will return messsages root cause not found might be a false negative still digging
binary_label: 1
***
Unnamed: 0: 77,911
id: 9,637,990,613
type: IssuesEvent
created_at: 2019-05-16 10:02:19
repo: Darkov3/forum-project
repo_url: https://api.github.com/repos/Darkov3/forum-project
action: closed
title: Create favicons and all other relevant files
labels: design/ui feature front-end
body:
They can easily be created using this site:
https://realfavicongenerator.net/
I will use the forum default avatar from the SC theme.
index: 1.0
text_combine:
Create favicons and all other relevant files - They can easily be created using this site:
https://realfavicongenerator.net/
I will use the forum default avatar from the SC theme.
label: non_process
text:
create favicons and all other relevant files they can easily be created using this site i will use the forum default avatar from the sc theme
binary_label: 0
***
Unnamed: 0: 3,869
id: 6,808,647,454
type: IssuesEvent
created_at: 2017-11-04 06:10:00
repo: Great-Hill-Corporation/quickBlocks
repo_url: https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
action: reopened
title: Separate monitored accounts into individual binary files and merge them on the fly
labels: monitors-all status-inprocess type-bug
body:
These data files must be hidden from the user. Also--if the user changes the list of accounts that goes into the hash, then the hash must either be re-written entirely (if there are new accounts) or cleaned up, if there are accounts that have been removed.
From https://github.com/Great-Hill-Corporation/ethslurp/issues/143
index: 1.0
text_combine:
Separate monitored accounts into individual binary files and merge them on the fly - These data files must be hidden from the user. Also--if the user changes the list of accounts that goes into the hash, then the hash must either be re-written entirely (if there are new accounts) or cleaned up, if there are accounts that have been removed.
From https://github.com/Great-Hill-Corporation/ethslurp/issues/143
label: process
text:
separate monitored accounts into individual binary files and merge them on the fly these data files must be hidden from the user also if the user changes the list of accounts that goes into the hash then the hash must either be re written entirely if there are new accounts or cleaned up if there are accounts that have been removed from
binary_label: 1
***
Unnamed: 0: 98,167
id: 16,361,442,745
type: IssuesEvent
created_at: 2021-05-14 10:05:29
repo: Galaxy-Software-Service/Maven_Pom_Demo
repo_url: https://api.github.com/repos/Galaxy-Software-Service/Maven_Pom_Demo
action: opened
title: CVE-2013-2251 (High) detected in struts2-core-2.3.15.jar
labels: security vulnerability
body:
## CVE-2013-2251 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>struts2-core-2.3.15.jar</b></p></summary>
<p>Apache Struts 2</p>
<p>Path to dependency file: Maven_Pom_Demo/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/org/apache/struts/struts2-core/2.3.15/struts2-core-2.3.15.jar</p>
<p>
Dependency Hierarchy:
- :x: **struts2-core-2.3.15.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Galaxy-Software-Service/Maven_Pom_Demo/commit/69cce4bac0c1b37088c48547695b174bd6149c5c">69cce4bac0c1b37088c48547695b174bd6149c5c</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Struts 2.0.0 through 2.3.15 allows remote attackers to execute arbitrary OGNL expressions via a parameter with a crafted (1) action:, (2) redirect:, or (3) redirectAction: prefix.
<p>Publish Date: 2013-07-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-2251>CVE-2013-2251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>9.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-2251">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-2251</a></p>
<p>Release Date: 2013-07-20</p>
<p>Fix Resolution: 2.3.16</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.struts","packageName":"struts2-core","packageVersion":"2.3.15","packageFilePaths":["/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.struts:struts2-core:2.3.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.3.16"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2013-2251","vulnerabilityDetails":"Apache Struts 2.0.0 through 2.3.15 allows remote attackers to execute arbitrary OGNL expressions via a parameter with a crafted (1) action:, (2) redirect:, or (3) redirectAction: prefix.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-2251","cvss2Severity":"high","cvss2Score":"9.3","extraData":{}}</REMEDIATE> -->
index: True
text_combine:
CVE-2013-2251 (High) detected in struts2-core-2.3.15.jar - ## CVE-2013-2251 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>struts2-core-2.3.15.jar</b></p></summary>
<p>Apache Struts 2</p>
<p>Path to dependency file: Maven_Pom_Demo/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/org/apache/struts/struts2-core/2.3.15/struts2-core-2.3.15.jar</p>
<p>
Dependency Hierarchy:
- :x: **struts2-core-2.3.15.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Galaxy-Software-Service/Maven_Pom_Demo/commit/69cce4bac0c1b37088c48547695b174bd6149c5c">69cce4bac0c1b37088c48547695b174bd6149c5c</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Struts 2.0.0 through 2.3.15 allows remote attackers to execute arbitrary OGNL expressions via a parameter with a crafted (1) action:, (2) redirect:, or (3) redirectAction: prefix.
<p>Publish Date: 2013-07-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-2251>CVE-2013-2251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>9.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-2251">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2013-2251</a></p>
<p>Release Date: 2013-07-20</p>
<p>Fix Resolution: 2.3.16</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.struts","packageName":"struts2-core","packageVersion":"2.3.15","packageFilePaths":["/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.struts:struts2-core:2.3.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.3.16"}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2013-2251","vulnerabilityDetails":"Apache Struts 2.0.0 through 2.3.15 allows remote attackers to execute arbitrary OGNL expressions via a parameter with a crafted (1) action:, (2) redirect:, or (3) redirectAction: prefix.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2013-2251","cvss2Severity":"high","cvss2Score":"9.3","extraData":{}}</REMEDIATE> -->
label: non_process
text:
cve high detected in core jar cve high severity vulnerability vulnerable library core jar apache struts path to dependency file maven pom demo pom xml path to vulnerable library canner repository org apache struts core core jar dependency hierarchy x core jar vulnerable library found in head commit a href found in base branch main vulnerability details apache struts through allows remote attackers to execute arbitrary ognl expressions via a parameter with a crafted action redirect or redirectaction prefix publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree org apache struts core isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails apache struts through allows remote attackers to execute arbitrary ognl expressions via a parameter with a crafted action redirect or redirectaction prefix vulnerabilityurl
binary_label: 0
***
Unnamed: 0: 1,363
id: 3,923,276,524
type: IssuesEvent
created_at: 2016-04-22 10:30:35
repo: e-government-ua/iBP
repo_url: https://api.github.com/repos/e-government-ua/iBP
action: closed
title: Дніпропетровська область - Рішення про продовження терміну дії рішення міської ради
labels: In process of testing in work
body:
Розкрити/створити послугу на наступні міста Дніпропетровської області:
- [ ] Нікополь - існує два варіанта цієї послуги для цього населеного пункту
Процес створювати чи рефакторити з використанням SubjectOrganJoinAttribute (атрибути будуть заведені централізовано на всі райони - окремо їх створювати не потрібно) та використанням уніфікованних змінних:
sNameOrgan
sWorkTime
sPhoneOrgan
sAddress
sMailClerk
sArea
nArea
sShapka
nID_Department_visitDay
контакти відповідальних осіб у [файлі](https://docs.google.com/spreadsheets/d/10epKJ_lkok-hCNzbTkU-7G8GbWGs5mzjgGFWBl-ONPQ/edit#gid=0)
index: 1.0
text_combine:
Дніпропетровська область - Рішення про продовження терміну дії рішення міської ради - Розкрити/створити послугу на наступні міста Дніпропетровської області:
- [ ] Нікополь - існує два варіанта цієї послуги для цього населеного пункту
Процес створювати чи рефакторити з використанням SubjectOrganJoinAttribute (атрибути будуть заведені централізовано на всі райони - окремо їх створювати не потрібно) та використанням уніфікованних змінних:
sNameOrgan
sWorkTime
sPhoneOrgan
sAddress
sMailClerk
sArea
nArea
sShapka
nID_Department_visitDay
контакти відповідальних осіб у [файлі](https://docs.google.com/spreadsheets/d/10epKJ_lkok-hCNzbTkU-7G8GbWGs5mzjgGFWBl-ONPQ/edit#gid=0)
label: process
text:
дніпропетровська область рішення про продовження терміну дії рішення міської ради розкрити створити послугу на наступні міста дніпропетровської області нікополь існує два варіанта цієї послуги для цього населеного пункту процес створювати чи рефакторити з використанням subjectorganjoinattribute атрибути будуть заведені централізовано на всі райони окремо їх створювати не потрібно та використанням уніфікованних змінних snameorgan sworktime sphoneorgan saddress smailclerk sarea narea sshapka nid department visitday контакти відповідальних осіб у
binary_label: 1
***
Unnamed: 0: 225,916
id: 24,912,206,435
type: IssuesEvent
created_at: 2022-10-30 01:10:12
repo: ams0/openhack-containers
repo_url: https://api.github.com/repos/ams0/openhack-containers
action: opened
title: CVE-2020-1108 (High) detected in microsoft.netcore.app.2.1.0.nupkg
labels: security vulnerability
body:
## CVE-2020-1108 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.netcore.app.2.1.0.nupkg</b></p></summary>
<p>A set of .NET API's that are included in the default .NET Core application model.
caa7b7e2bad98e56a687fb5cbaf60825500800f7
When using NuGet 3.x this package requires at least version 3.4.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg">https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg</a></p>
<p>Path to dependency file: /src/tripviewer/web/TripViewer.csproj</p>
<p>Path to vulnerable library: /es/microsoft.netcore.app/2.1.0/microsoft.netcore.app.2.1.0.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.netcore.app.2.1.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ams0/openhack-containers/commit/fbc3a9665f7473faa96484a3fa9b058ad82d7e60">fbc3a9665f7473faa96484a3fa9b058ad82d7e60</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A denial of service vulnerability exists when .NET Core or .NET Framework improperly handles web requests, aka '.NET Core & .NET Framework Denial of Service Vulnerability'.
<p>Publish Date: 2020-05-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-1108>CVE-2020-1108</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-3w5p-jhp5-c29q">https://github.com/advisories/GHSA-3w5p-jhp5-c29q</a></p>
<p>Release Date: 2020-05-21</p>
<p>Fix Resolution: Microsoft.NETCore.App - 2.1.18, Microsoft.NETCore.App.Runtime - 3.1.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-1108 (High) detected in microsoft.netcore.app.2.1.0.nupkg - ## CVE-2020-1108 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>microsoft.netcore.app.2.1.0.nupkg</b></p></summary>
<p>A set of .NET API's that are included in the default .NET Core application model.
caa7b7e2bad98e56a687fb5cbaf60825500800f7
When using NuGet 3.x this package requires at least version 3.4.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg">https://api.nuget.org/packages/microsoft.netcore.app.2.1.0.nupkg</a></p>
<p>Path to dependency file: /src/tripviewer/web/TripViewer.csproj</p>
<p>Path to vulnerable library: /es/microsoft.netcore.app/2.1.0/microsoft.netcore.app.2.1.0.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **microsoft.netcore.app.2.1.0.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ams0/openhack-containers/commit/fbc3a9665f7473faa96484a3fa9b058ad82d7e60">fbc3a9665f7473faa96484a3fa9b058ad82d7e60</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A denial of service vulnerability exists when .NET Core or .NET Framework improperly handles web requests, aka '.NET Core & .NET Framework Denial of Service Vulnerability'.
<p>Publish Date: 2020-05-21
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-1108>CVE-2020-1108</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-3w5p-jhp5-c29q">https://github.com/advisories/GHSA-3w5p-jhp5-c29q</a></p>
<p>Release Date: 2020-05-21</p>
<p>Fix Resolution: Microsoft.NETCore.App - 2.1.18, Microsoft.NETCore.App.Runtime - 3.1.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in microsoft netcore app nupkg cve high severity vulnerability vulnerable library microsoft netcore app nupkg a set of net api s that are included in the default net core application model when using nuget x this package requires at least version library home page a href path to dependency file src tripviewer web tripviewer csproj path to vulnerable library es microsoft netcore app microsoft netcore app nupkg dependency hierarchy x microsoft netcore app nupkg vulnerable library found in head commit a href found in base branch master vulnerability details a denial of service vulnerability exists when net core or net framework improperly handles web requests aka net core net framework denial of service vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution microsoft netcore app microsoft netcore app runtime step up your open source security game with mend
| 0
|
15,929
| 20,147,578,374
|
IssuesEvent
|
2022-02-09 09:12:58
|
CMPT756-A5-Org-Patel-Dhruv/MYC756PROJECT
|
https://api.github.com/repos/CMPT756-A5-Org-Patel-Dhruv/MYC756PROJECT
|
opened
|
Find age group based on the tenure based on months
|
preprocessing 6 months dataset
|
Write Python code to find out the age group based on the tenure (in months)
<img width="65" alt="age" src="https://user-images.githubusercontent.com/97414622/153162946-3d181118-84c1-4784-9fb6-aeae5f5ddcab.PNG">
|
1.0
|
Find age group based on the tenure based on months - Write Python code to find out the age group based on the tenure (in months)
<img width="65" alt="age" src="https://user-images.githubusercontent.com/97414622/153162946-3d181118-84c1-4784-9fb6-aeae5f5ddcab.PNG">
|
process
|
find age group based on the tenure based on months write a python code to find out the age group based on the tenure img width alt age src
| 1
|
3,145
| 6,200,970,956
|
IssuesEvent
|
2017-07-06 03:40:51
|
gaocegege/Processing.R
|
https://api.github.com/repos/gaocegege/Processing.R
|
opened
|
Welcome to have a try on Processing.R
|
community/processing
|
Processing.R has had 5 releases, and you can get the mode from [github.com/gaocegege/Processing.R/releases/download/v1.0.3/RLangMode.tar.gz](https://github.com/gaocegege/Processing.R/releases/download/v1.0.3/RLangMode.tar.gz).
The features in Processing.R now include:
### Built-in functions in Processing
We have incomplete documentation at [https://processing-r.github.io/Processing.R-docs/](https://processing-r.github.io/Processing.R-docs/). All the functions are theoretically supported in Processing.R, but we are not sure whether there are bugs in some of them.
### Libraries in Processing
Processing has libraries that enrich its functionality, and Processing.R has been tested with one of them: [peasycam](http://mrfeinberg.com/peasycam/). Processing.R adds a new function `importLibrary` to import libraries manually. There is an example of `peasycam` below; before you try the code, you need to install the corresponding `peasycam` library.
```r
P3D <- "processing.opengl.PGraphics3D"
settings <- function() {
importLibrary("peasycam")
size(200, 200, P3D)
}
setup <- function() {
cam = PeasyCam$new(processing, 100)
cam$setMinimumDistance(50)
cam$setMaximumDistance(500)
}
draw <- function() {
rotateX(-.5)
rotateY(-.5)
background(0)
fill(255, 0, 0)
box(30)
pushMatrix()
translate(0, 0, 20)
fill(0, 0, 255)
box(5)
popMatrix()
}
```
### R Packages
Processing.R currently has limited support for R packages. There is [a list of the packages](http://packages.renjin.org/) which are theoretically supported in Processing.R. Processing.R will automatically download the R packages, so you can use them directly. Here is an example using the `foreach` package:
```r
library(foreach)
foreach(i=1:3) %do%
print(sqrt(i))
```
## Limitations in Processing.R
Processing.R is in active development and is still experimental. Built-in constants such as `P3D` and `PI` are not supported yet; if you want to play with 3D sketches in Processing.R, you have to define `P3D <- "processing.opengl.PGraphics3D"` on your own.
Processing.R also does not have good support for static/active/mixed mode detection. We recommend defining `settings`, `setup` and `draw` rather than writing your logic directly at the top level. For example, we prefer
```R
draw <- function() {
line(1,2,3,4)
}
```
over `line(1,2,3,4)`. The latter may cause some additional bugs.
You are welcome to try our experimental mode and give us your feedback :) And if you want to contribute to this mode, there are [issues for new contributors](https://github.com/gaocegege/Processing.R/issues?q=is%3Aissue+is%3Aopen+label%3Afor-new-contributors) and the [architecture documentation](https://github.com/gaocegege/Processing.R/blob/master/raw-docs/architecture.md).
If you have any problem with the mode, I am waiting for you at [the gitter channel](https://gitter.im/gaocegege/Processing.R) :tada:
|
1.0
|
Welcome to have a try on Processing.R - Processing.R has had 5 releases, and you can get the mode from [github.com/gaocegege/Processing.R/releases/download/v1.0.3/RLangMode.tar.gz](https://github.com/gaocegege/Processing.R/releases/download/v1.0.3/RLangMode.tar.gz).
The features in Processing.R now include:
### Built-in functions in Processing
We have incomplete documentation at [https://processing-r.github.io/Processing.R-docs/](https://processing-r.github.io/Processing.R-docs/). All the functions are theoretically supported in Processing.R, but we are not sure whether there are bugs in some of them.
### Libraries in Processing
Processing has libraries that enrich its functionality, and Processing.R has been tested with one of them: [peasycam](http://mrfeinberg.com/peasycam/). Processing.R adds a new function `importLibrary` to import libraries manually. There is an example of `peasycam` below; before you try the code, you need to install the corresponding `peasycam` library.
```r
P3D <- "processing.opengl.PGraphics3D"
settings <- function() {
importLibrary("peasycam")
size(200, 200, P3D)
}
setup <- function() {
cam = PeasyCam$new(processing, 100)
cam$setMinimumDistance(50)
cam$setMaximumDistance(500)
}
draw <- function() {
rotateX(-.5)
rotateY(-.5)
background(0)
fill(255, 0, 0)
box(30)
pushMatrix()
translate(0, 0, 20)
fill(0, 0, 255)
box(5)
popMatrix()
}
```
### R Packages
Processing.R currently has limited support for R packages. There is [a list of the packages](http://packages.renjin.org/) which are theoretically supported in Processing.R. Processing.R will automatically download the R packages, so you can use them directly. Here is an example using the `foreach` package:
```r
library(foreach)
foreach(i=1:3) %do%
print(sqrt(i))
```
## Limitations in Processing.R
Processing.R is in active development and is still experimental. Built-in constants such as `P3D` and `PI` are not supported yet; if you want to play with 3D sketches in Processing.R, you have to define `P3D <- "processing.opengl.PGraphics3D"` on your own.
Processing.R also does not have good support for static/active/mixed mode detection. We recommend defining `settings`, `setup` and `draw` rather than writing your logic directly at the top level. For example, we prefer
```R
draw <- function() {
line(1,2,3,4)
}
```
over `line(1,2,3,4)`. The latter may cause some additional bugs.
You are welcome to try our experimental mode and give us your feedback :) And if you want to contribute to this mode, there are [issues for new contributors](https://github.com/gaocegege/Processing.R/issues?q=is%3Aissue+is%3Aopen+label%3Afor-new-contributors) and the [architecture documentation](https://github.com/gaocegege/Processing.R/blob/master/raw-docs/architecture.md).
If you have any problem with the mode, I am waiting for you at [the gitter channel](https://gitter.im/gaocegege/Processing.R) :tada:
|
process
|
welcome to have a try on processing r processing r has been released versions and you can get the mode from now the features in processing r includes built in functions in processing we have a incomplete documentation in all the functions are theoretically supported in processing r but we are not sure whether there is some bugs about them libraries in processing processing has some libraries to enrich the functionality of processing processing r has been tested with one library processing r imports a new function importlibrary to import libraries manually there is an example of peasycam in processing r before you try the code you need to install the corresponding library peasycam r processing opengl settings function importlibrary peasycam size setup function cam peasycam new processing cam setminimumdistance cam setmaximumdistance draw function rotatex rotatey background fill box pushmatrix translate fill box popmatrix r packages processing r has limited support for r packages now there is which are theoretically supported in processing r processing r will automatically download the r packages so you could use the packages directly there is an example about foreach package r library foreach foreach i do print sqrt i limitations in processing r now processing r is in active development it is just an experimental version now we don t support built in constants such as and pi if you want to play with sketches in processing r you have to define processing opengl on your own and processing r doesn t have a good support for static active mix mode detection we recommend you to use define settings setup and draw rather than write your logic directly for exampe we prefer r draw function line than line the latter may cause some additional bugs welcome to have a try on our experimental mode and give us your feedback and if you want to contribute to this mode there are and the if you have any problem abou the mode i am waiting for you at tada
| 1
|
398,601
| 27,203,150,112
|
IssuesEvent
|
2023-02-20 11:07:40
|
department-for-transport-BODS/bods-data-extractor
|
https://api.github.com/repos/department-for-transport-BODS/bods-data-extractor
|
closed
|
Add high level overview of concepts, data and relationships to documentation (diagram)
|
documentation feature request
|
**Is your feature request related to a problem? Please describe.**
As a data consumer, I would like to be able to understand the structure of the data being queried and returned.
**Describe the solution you'd like**
It would be useful if there was a diagram to help visualise the concepts, data and relationships encountered with this package.
|
1.0
|
Add high level overview of concepts, data and relationships to documentation (diagram) - **Is your feature request related to a problem? Please describe.**
As a data consumer, I would like to be able to understand the structure of the data being queried and returned.
**Describe the solution you'd like**
It would be useful if there was a diagram to help visualise the concepts, data and relationships encountered with this package.
|
non_process
|
add high level overview of concepts data and relationships to documentation diagram is your feature request related to a problem please describe as a data consumer i would like to be able to understand the structure of the data being queried and returned describe the solution you d like it would be useful if there was a diagram to help visualise the concepts data and relationships encountered with this package
| 0
|
139,791
| 31,777,843,791
|
IssuesEvent
|
2023-09-12 15:24:31
|
fwouts/previewjs
|
https://api.github.com/repos/fwouts/previewjs
|
closed
|
[Windows 10 not WSL] Error loading webview
|
blocked vscode windows
|
### Describe the bug
After I managed to install Preview.js in my VSCode and tried to open a preview of my component, it is showing an error like this.

OS: **Windows 10**
VSCode Version: **1.70.0**
Node Version: **16.14.2**
### Reproduction
1. Install Preview.js from the extensions tab.
2. Restart VSCode IDE
3. Create a new ReactJS from CRA or ViteJS
4. Click the `Open App.jsx in Preview.js`
### Preview.js version
v1.11.0
### Framework
React 18.0.0
### System Info
```shell
System:
OS: Windows 10 10.0.19043
CPU: (12) x64 AMD Ryzen 5 3600 6-Core Processor
Memory: 12.99 GB / 31.93 GB
Binaries:
Node: 16.14.2 - ~\.nvm\versions\node\v16.14.2\bin\node.EXE
Yarn: 1.22.18 - ~\.nvm\versions\node\v16.14.2\bin\yarn.CMD
npm: 8.15.1 - ~\.nvm\versions\node\v16.14.2\bin\npm.CMD
IDEs:
Android Studio: AI-212.5712.43.2112.8609683
VSCode: 1.70.0 - C:\Users\Jazzi\AppData\Local\Programs\Microsoft VS Code\bin\code.CMD
Visual Studio: 16.11.32228.343 (Visual Studio Community 2019)
Browsers:
Edge: Spartan (44.19041.1266.0)
Internet Explorer: 11.0.19041.1566
```
### Used Package Manager
npm
### Extension logs (useful for crashes)
_No response_
### Preview logs (useful for rendering errors)
_No response_
### Repo link (if available)
_No response_
### Anything else?
_No response_
|
1.0
|
[Windows 10 not WSL] Error loading webview - ### Describe the bug
After I managed to install Preview.js in my VSCode and tried to open a preview of my component, it is showing an error like this.

OS: **Windows 10**
VSCode Version: **1.70.0**
Node Version: **16.14.2**
### Reproduction
1. Install Preview.js from the extensions tab.
2. Restart VSCode IDE
3. Create a new ReactJS from CRA or ViteJS
4. Click the `Open App.jsx in Preview.js`
### Preview.js version
v1.11.0
### Framework
React 18.0.0
### System Info
```shell
System:
OS: Windows 10 10.0.19043
CPU: (12) x64 AMD Ryzen 5 3600 6-Core Processor
Memory: 12.99 GB / 31.93 GB
Binaries:
Node: 16.14.2 - ~\.nvm\versions\node\v16.14.2\bin\node.EXE
Yarn: 1.22.18 - ~\.nvm\versions\node\v16.14.2\bin\yarn.CMD
npm: 8.15.1 - ~\.nvm\versions\node\v16.14.2\bin\npm.CMD
IDEs:
Android Studio: AI-212.5712.43.2112.8609683
VSCode: 1.70.0 - C:\Users\Jazzi\AppData\Local\Programs\Microsoft VS Code\bin\code.CMD
Visual Studio: 16.11.32228.343 (Visual Studio Community 2019)
Browsers:
Edge: Spartan (44.19041.1266.0)
Internet Explorer: 11.0.19041.1566
```
### Used Package Manager
npm
### Extension logs (useful for crashes)
_No response_
### Preview logs (useful for rendering errors)
_No response_
### Repo link (if available)
_No response_
### Anything else?
_No response_
|
non_process
|
error loading webview describe the bug after i managed to install preview js in my vscode and tried to open a preview of my component it is showing an error like this os windows vscode version node version reproduction install preview js from the extensions tab restart vscode ide create a new reactjs from cra or vitejs click the open app jsx in preview js preview js version framework react system info shell system os windows cpu amd ryzen core processor memory gb gb binaries node nvm versions node bin node exe yarn nvm versions node bin yarn cmd npm nvm versions node bin npm cmd ides android studio ai vscode c users jazzi appdata local programs microsoft vs code bin code cmd visual studio visual studio community browsers edge spartan internet explorer used package manager npm extension logs useful for crashes no response preview logs useful for rendering errors no response repo link if available no response anything else no response
| 0
|
14,449
| 17,514,696,382
|
IssuesEvent
|
2021-08-11 04:36:12
|
googleapis/python-spanner
|
https://api.github.com/repos/googleapis/python-spanner
|
closed
|
Refactor systests into separate modules, using pytest fixtures / idioms
|
api: spanner type: process
|
See #438, #439, ..., #469, #470.
The current setup / teardown code in `tests/system/` is ancient, creaky, and not hardened well against quota restrictions / throttling.
- [x] Split the `tests/system/test_system.py` monolith up into more focused modules.
- [x] Re-do its setup / teardown using pytest fixtures, taking care to harden against 429 / 503 errors.
- [x] Likewise for `tests/system/tests_system_dbapi.py`.
|
1.0
|
Refactor systests into separate modules, using pytest fixtures / idioms - See #438, #439, ..., #469, #470.
The current setup / teardown code in `tests/system/` is ancient, creaky, and not hardened well against quota restrictions / throttling.
- [x] Split the `tests/system/test_system.py` monolith up into more focused modules.
- [x] Re-do its setup / teardown using pytest fixtures, taking care to harden against 429 / 503 errors.
- [x] Likewise for `tests/system/tests_system_dbapi.py`.
|
process
|
refactor systests into separate modules using pytest fixtures idioms see the current setup teardown code in tests system is ancient creaky and not hardened well against quota restrictions throttling split the tests system test system py monolith up into more focused modules re do its setup teardown using pytest fixtures taking care to harden against errors likewise for tests system tests system dbapi py
| 1
|
294
| 2,732,220,783
|
IssuesEvent
|
2015-04-17 03:01:43
|
mitchellh/packer
|
https://api.github.com/repos/mitchellh/packer
|
closed
|
Atlas post-processor not setting the correct artifact type
|
bug post-processor/atlas
|
Using these post-processors:
```json
"post-processors": [
{
"output": "build/{{.Provider}}/fb-ubuntu-14.04.{{user `build_number`}}.box",
"type": "vagrant"
},
{
"access_token": "{{user `vagrant_cloud_token`}}",
"box_download_url": "https://somecompany.com/boxes/fb-ubuntu-14.04.{{user `build_number`}}.box",
"box_tag": "somecompany/ubuntu-14.04",
"type": "vagrant-cloud",
"version": "0.1.{{user `build_number`}}"
}
],
```
This produces:
```text
==> ubuntu-14.04-fusion: Compacting the disk image
==> ubuntu-14.04-fusion: Running post-processor: vagrant
==> ubuntu-14.04-fusion (vagrant): Creating Vagrant box for 'vmware' provider
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/disk.vmdk
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.nvram
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.vmsd
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.vmx
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.vmxf
ubuntu-14.04-fusion (vagrant): Compressing: Vagrantfile
ubuntu-14.04-fusion (vagrant): Compressing: disk.vmdk
ubuntu-14.04-fusion (vagrant): Compressing: metadata.json
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.nvram
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.vmsd
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.vmx
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.vmxf
==> ubuntu-14.04-fusion: Running post-processor: vagrant-cloud
Build 'ubuntu-14.04-fusion' errored: 1 error(s) occurred:
* Post-processor failed: Unknown artifact type, requires box from vagrant post-processor: mitchellh.vmware
```
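Not part of the original report, but for context: in Packer's JSON templates, top-level entries of `post-processors` each receive the builder's artifact independently, so `vagrant-cloud` never sees the `.box` produced by `vagrant`. Chaining them is normally done by nesting both definitions in a single sequence (an inner array), roughly like the sketch below, which reuses the values from the report:
```json
{
  "post-processors": [
    [
      {
        "output": "build/{{.Provider}}/fb-ubuntu-14.04.{{user `build_number`}}.box",
        "type": "vagrant"
      },
      {
        "access_token": "{{user `vagrant_cloud_token`}}",
        "box_download_url": "https://somecompany.com/boxes/fb-ubuntu-14.04.{{user `build_number`}}.box",
        "box_tag": "somecompany/ubuntu-14.04",
        "type": "vagrant-cloud",
        "version": "0.1.{{user `build_number`}}"
      }
    ]
  ]
}
```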
|
1.0
|
Atlas post-processor not setting the correct artifact type - Using these post-processors:
```json
"post-processors": [
{
"output": "build/{{.Provider}}/fb-ubuntu-14.04.{{user `build_number`}}.box",
"type": "vagrant"
},
{
"access_token": "{{user `vagrant_cloud_token`}}",
"box_download_url": "https://somecompany.com/boxes/fb-ubuntu-14.04.{{user `build_number`}}.box",
"box_tag": "somecompany/ubuntu-14.04",
"type": "vagrant-cloud",
"version": "0.1.{{user `build_number`}}"
}
],
```
This produces:
```text
==> ubuntu-14.04-fusion: Compacting the disk image
==> ubuntu-14.04-fusion: Running post-processor: vagrant
==> ubuntu-14.04-fusion (vagrant): Creating Vagrant box for 'vmware' provider
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/disk.vmdk
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.nvram
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.vmsd
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.vmx
ubuntu-14.04-fusion (vagrant): Copying: output-ubuntu-14.04-fusion/packer-ubuntu-14.04-fusion.vmxf
ubuntu-14.04-fusion (vagrant): Compressing: Vagrantfile
ubuntu-14.04-fusion (vagrant): Compressing: disk.vmdk
ubuntu-14.04-fusion (vagrant): Compressing: metadata.json
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.nvram
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.vmsd
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.vmx
ubuntu-14.04-fusion (vagrant): Compressing: packer-ubuntu-14.04-fusion.vmxf
==> ubuntu-14.04-fusion: Running post-processor: vagrant-cloud
Build 'ubuntu-14.04-fusion' errored: 1 error(s) occurred:
* Post-processor failed: Unknown artifact type, requires box from vagrant post-processor: mitchellh.vmware
```
|
process
|
atlas post processor not setting the correct artifact type using these post processors json post processors output build provider fb ubuntu user build number box type vagrant access token user vagrant cloud token box download url build number box box tag somecompany ubuntu type vagrant cloud version user build number this produces text ubuntu fusion compacting the disk image ubuntu fusion running post processor vagrant ubuntu fusion vagrant creating vagrant box for vmware provider ubuntu fusion vagrant copying output ubuntu fusion disk vmdk ubuntu fusion vagrant copying output ubuntu fusion packer ubuntu fusion nvram ubuntu fusion vagrant copying output ubuntu fusion packer ubuntu fusion vmsd ubuntu fusion vagrant copying output ubuntu fusion packer ubuntu fusion vmx ubuntu fusion vagrant copying output ubuntu fusion packer ubuntu fusion vmxf ubuntu fusion vagrant compressing vagrantfile ubuntu fusion vagrant compressing disk vmdk ubuntu fusion vagrant compressing metadata json ubuntu fusion vagrant compressing packer ubuntu fusion nvram ubuntu fusion vagrant compressing packer ubuntu fusion vmsd ubuntu fusion vagrant compressing packer ubuntu fusion vmx ubuntu fusion vagrant compressing packer ubuntu fusion vmxf ubuntu fusion running post processor vagrant cloud build ubuntu fusion errored error s occurred post processor failed unknown artifact type requires box from vagrant post processor mitchellh vmware
| 1
|
18,550
| 24,555,389,247
|
IssuesEvent
|
2022-10-12 15:29:14
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Android][Scheduling] Runs count is wrong for monthly regular survey
|
Bug P2 Android Process: Fixed Process: Tested QA Process: Tested dev
|
Steps:-
1. Schedule an activity with regular-monthly frequency in SB and publish it
2. Install and login into the Android app
3. Enroll into study and navigate into study activities screen
4. Observe the run count for the activity
A/R:- Currently, activity is displaying with run count of 8
E/R:- Run count should be displayed as 10
Note:-
Issue happened for below activity
**App ID** - GCPMOB001
**Study ID** - CopyOfImportedM
**Activity ID** - q5


|
3.0
|
[Android][Scheduling] Runs count is wrong for monthly regular survey - Steps:-
1. Schedule an activity with regular-monthly frequency in SB and publish it
2. Install and login into the Android app
3. Enroll into study and navigate into study activities screen
4. Observe the run count for the activity
A/R:- Currently, activity is displaying with run count of 8
E/R:- Run count should be displayed as 10
Note:-
Issue happened for below activity
**App ID** - GCPMOB001
**Study ID** - CopyOfImportedM
**Activity ID** - q5


|
process
|
runs count is wrong for monthly regular survey steps schedule a activity with regular monthly frequency in sb and publish install and login into the android app enroll into study and navigate into study activities screen observe the run count for the activity a r currently activity is displaying with run count of e r run count should be displayed as note issue happened for below activity app id study id copyofimportedm activity id
| 1
|
180,220
| 21,625,602,253
|
IssuesEvent
|
2022-05-05 01:24:16
|
JMD60260/fetchmeaband
|
https://api.github.com/repos/JMD60260/fetchmeaband
|
closed
|
WS-2015-0024 (High) detected in uglify-js-2.3.6.tgz, uglify-js-1.2.5.tgz - autoclosed
|
security vulnerability
|
## WS-2015-0024 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>uglify-js-2.3.6.tgz</b>, <b>uglify-js-1.2.5.tgz</b></p></summary>
<p>
<details><summary><b>uglify-js-2.3.6.tgz</b></p></summary>
<p>JavaScript parser, mangler/compressor and beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-2.3.6.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-2.3.6.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- assemble-0.4.42.tgz (Root Library)
- assemble-handlebars-0.2.6.tgz
- handlebars-1.3.0.tgz
- :x: **uglify-js-2.3.6.tgz** (Vulnerable Library)
</details>
<details><summary><b>uglify-js-1.2.5.tgz</b></p></summary>
<p>JavaScript parser and compressor/beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-1.2.5.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-1.2.5.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/fetchmeaband/public/vendor/jquery-countdown/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/fetchmeaband/public/vendor/jquery-countdown/node_modules/socket.io-client/node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- karma-0.12.37.tgz (Root Library)
- socket.io-0.9.16.tgz
- socket.io-client-0.9.16.tgz
- :x: **uglify-js-1.2.5.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/JMD60260/fetchmeaband/commit/430b5f2947d45ada69dc047ea870d3c988006344">430b5f2947d45ada69dc047ea870d3c988006344</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
UglifyJS versions 2.4.23 and earlier are affected by a vulnerability which allows a specially crafted Javascript file to have altered functionality after minification.
<p>Publish Date: 2015-08-24
<p>URL: <a href=https://github.com/mishoo/UglifyJS2/issues/751>WS-2015-0024</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/mishoo/UglifyJS2/commit/905b6011784ca60d41919ac1a499962b7c1d4b02">https://github.com/mishoo/UglifyJS2/commit/905b6011784ca60d41919ac1a499962b7c1d4b02</a></p>
<p>Release Date: 2017-01-31</p>
<p>Fix Resolution: v2.4.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2015-0024 (High) detected in uglify-js-2.3.6.tgz, uglify-js-1.2.5.tgz - autoclosed - ## WS-2015-0024 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>uglify-js-2.3.6.tgz</b>, <b>uglify-js-1.2.5.tgz</b></p></summary>
<p>
<details><summary><b>uglify-js-2.3.6.tgz</b></p></summary>
<p>JavaScript parser, mangler/compressor and beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-2.3.6.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-2.3.6.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/fetchmeaband/public/vendor/owl.carousel/node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- assemble-0.4.42.tgz (Root Library)
- assemble-handlebars-0.2.6.tgz
- handlebars-1.3.0.tgz
- :x: **uglify-js-2.3.6.tgz** (Vulnerable Library)
</details>
<details><summary><b>uglify-js-1.2.5.tgz</b></p></summary>
<p>JavaScript parser and compressor/beautifier toolkit</p>
<p>Library home page: <a href="https://registry.npmjs.org/uglify-js/-/uglify-js-1.2.5.tgz">https://registry.npmjs.org/uglify-js/-/uglify-js-1.2.5.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/fetchmeaband/public/vendor/jquery-countdown/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/fetchmeaband/public/vendor/jquery-countdown/node_modules/socket.io-client/node_modules/uglify-js/package.json</p>
<p>
Dependency Hierarchy:
- karma-0.12.37.tgz (Root Library)
- socket.io-0.9.16.tgz
- socket.io-client-0.9.16.tgz
- :x: **uglify-js-1.2.5.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/JMD60260/fetchmeaband/commit/430b5f2947d45ada69dc047ea870d3c988006344">430b5f2947d45ada69dc047ea870d3c988006344</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
UglifyJS versions 2.4.23 and earlier are affected by a vulnerability which allows a specially crafted Javascript file to have altered functionality after minification.
<p>Publish Date: 2015-08-24
<p>URL: <a href=https://github.com/mishoo/UglifyJS2/issues/751>WS-2015-0024</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>8.3</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/mishoo/UglifyJS2/commit/905b6011784ca60d41919ac1a499962b7c1d4b02">https://github.com/mishoo/UglifyJS2/commit/905b6011784ca60d41919ac1a499962b7c1d4b02</a></p>
<p>Release Date: 2017-01-31</p>
<p>Fix Resolution: v2.4.24</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws high detected in uglify js tgz uglify js tgz autoclosed ws high severity vulnerability vulnerable libraries uglify js tgz uglify js tgz uglify js tgz javascript parser mangler compressor and beautifier toolkit library home page a href path to dependency file tmp ws scm fetchmeaband public vendor owl carousel package json path to vulnerable library tmp ws scm fetchmeaband public vendor owl carousel node modules uglify js package json dependency hierarchy assemble tgz root library assemble handlebars tgz handlebars tgz x uglify js tgz vulnerable library uglify js tgz javascript parser and compressor beautifier toolkit library home page a href path to dependency file tmp ws scm fetchmeaband public vendor jquery countdown package json path to vulnerable library tmp ws scm fetchmeaband public vendor jquery countdown node modules socket io client node modules uglify js package json dependency hierarchy karma tgz root library socket io tgz socket io client tgz x uglify js tgz vulnerable library found in head commit a href vulnerability details uglifyjs versions and earlier are affected by a vulnerability which allows a specially crafted javascript file to have altered functionality after minification publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
35,728
| 7,987,390,305
|
IssuesEvent
|
2018-07-19 07:36:58
|
wurmf/OpenBooth
|
https://api.github.com/repos/wurmf/OpenBooth
|
closed
|
remove all cases of log-and-rethrow
|
clean code
|
log-and-rethrow is an anti-pattern, which leads to logging exceptions multiple times and therefore cluttering the log file with unnecessary duplicate information.
log-and-rethrow typically looks like this:
```java
try {
    // do stuff
} catch (SQLException e) {
    LOG.error("I found an exception!");
    throw new CustomException(e);  // the same failure will typically be logged again upstream
}
```
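Not from the original ticket, just an illustrative sketch of the preferred shape: the catch block either handles-and-logs or wraps-and-rethrows, never both, so the exception is logged exactly once near the top of the call stack. `CustomException` here stands in for the wrapper type from the snippet above.
```java
import java.sql.SQLException;

public class RethrowWithoutLogging {

    static class CustomException extends RuntimeException {
        CustomException(Throwable cause) { super(cause); }
    }

    static void doStuff() throws SQLException {
        throw new SQLException("simulated failure");
    }

    static void service() {
        try {
            doStuff();
        } catch (SQLException e) {
            // wrap and rethrow only - no LOG.error here
            throw new CustomException(e);
        }
    }

    public static void main(String[] args) {
        try {
            service();
        } catch (CustomException e) {
            // the single place where the failure is logged
            System.err.println("operation failed: " + e.getCause());
        }
    }
}
```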
|
1.0
|
remove all cases of log-and-rethrow - log-and-rethrow is an anti-pattern, which leads to logging exceptions multiple times and therefore cluttering the log file with unnecessary duplicate information.
log-and-rethrow typically looks like this:
```java
try {
    // do stuff
} catch (SQLException e) {
    LOG.error("I found an exception!");
    throw new CustomException(e);  // the same failure will typically be logged again upstream
}
```
|
non_process
|
remove all cases of log and rethrow log and rethrow is an anti pattern which leads to logging exceptions multiple times and therefore cluttering the log file with unnecessary duplicate information log and rethrow typically looks like this try do stuff catch sqlexception e log error i found an exception throw new customexception e
| 0
|
20,796
| 27,542,977,909
|
IssuesEvent
|
2023-03-07 09:49:57
|
MrZeRo2000/odeon-wss
|
https://api.github.com/repos/MrZeRo2000/odeon-wss
|
closed
|
Repeated movies import fails after Movies Load
|
bug Processing
|
Duplicate key null (attempted merging values MediaFile{id=94, artifact=Artifact{id=86, artifactType=not initialized, artist=null, title='Крепкий орешек', year=null, duration=13, size=1319066, insertDate=null, tracks=not initialized, mediaFiles=not initialized}, name='Die.Hard.1988.BDRip.720p.stimtoo.mkv', format='matroska,webm', size=1319066, bitrate=791, duration=13} and MediaFile{id=95, artifact=Artifact{id=87, artifactType=not initialized, artist=null, title='Лицензия на убийство', year=null, duration=13, size=1320799, insertDate=null, tracks=not initialized, mediaFiles=not initialized}, name='Licence to Kill (HD).m4v', format='mov,mp4,m4a,3gp,3g2,mj2', size=1320799, bitrate=792, duration=13})
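Editorial note, not from the original issue: "Duplicate key ... (attempted merging values ...)" is the message `java.util.stream.Collectors.toMap` throws when two elements map to the same key and no merge function is supplied; here the key is apparently `null` for both movies on re-import. A generic sketch of the usual workaround follows; the entity and field names are assumptions, not the project's real code.
```java
import java.util.List;
import java.util.Map;
import java.util.stream.Collectors;

public class ToMapMergeSketch {

    record MediaFile(Long artifactId, String name) {}

    public static void main(String[] args) {
        // Two files whose key resolves to the same value (null), as in the error message.
        List<MediaFile> files = List.of(
                new MediaFile(null, "Die.Hard.1988.BDRip.720p.stimtoo.mkv"),
                new MediaFile(null, "Licence to Kill (HD).m4v"));

        // The three-argument toMap keeps the stream from failing on duplicate keys;
        // making the key mapper return unique, non-null keys is the real cure.
        Map<Long, MediaFile> byArtifact = files.stream()
                .collect(Collectors.toMap(MediaFile::artifactId,
                                          f -> f,
                                          (first, second) -> first));

        System.out.println(byArtifact);
    }
}
```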
|
1.0
|
Repeated movies import fails after Movies Load - Duplicate key null (attempted merging values MediaFile{id=94, artifact=Artifact{id=86, artifactType=not initialized, artist=null, title='Крепкий орешек', year=null, duration=13, size=1319066, insertDate=null, tracks=not initialized, mediaFiles=not initialized}, name='Die.Hard.1988.BDRip.720p.stimtoo.mkv', format='matroska,webm', size=1319066, bitrate=791, duration=13} and MediaFile{id=95, artifact=Artifact{id=87, artifactType=not initialized, artist=null, title='Лицензия на убийство', year=null, duration=13, size=1320799, insertDate=null, tracks=not initialized, mediaFiles=not initialized}, name='Licence to Kill (HD).m4v', format='mov,mp4,m4a,3gp,3g2,mj2', size=1320799, bitrate=792, duration=13})
|
process
|
repeated movies import fails after movies load duplicate key null attempted merging values mediafile id artifact artifact id artifacttype not initialized artist null title крепкий орешек year null duration size insertdate null tracks not initialized mediafiles not initialized name die hard bdrip stimtoo mkv format matroska webm size bitrate duration and mediafile id artifact artifact id artifacttype not initialized artist null title лицензия на убийство year null duration size insertdate null tracks not initialized mediafiles not initialized name licence to kill hd format mov size bitrate duration
| 1
|
869
| 2,582,294,836
|
IssuesEvent
|
2015-02-15 02:28:49
|
TheCricket/Chisel-2
|
https://api.github.com/repos/TheCricket/Chisel-2
|
closed
|
getHarvestLevel and getHarvestTool not returning correct values
|
bug code complete
|
Hey,
I'm using Chisel2_DEV-2.3.5.7f88e05 with Forge 10.13.2.30 and when calling either of these methods with marble or limestone (probably other, I just checked those two) these methods return the following:
getHarvestLevel: -1
getHarvestTool: null
I've also tried calling ForgeHooks.canToolHarvestBlock with a pickaxe but that returns false.
Should these return better values? Or am I just trying to use the wrong methods?
|
1.0
|
getHarvestLevel and getHarvestTool not returning correct values - Hey,
I'm using Chisel2_DEV-2.3.5.7f88e05 with Forge 10.13.2.30 and when calling either of these methods with marble or limestone (probably other, I just checked those two) these methods return the following:
getHarvestLevel: -1
getHarvestTool: null
I've also tried calling ForgeHooks.canToolHarvestBlock with a pickaxe but that returns false.
Should these return better values? Or am I just trying to use the wrong methods?
|
non_process
|
getharvestlevel and getharvesttool not returning correct values hey i m using dev with forge and when calling either of these methods with marble or limestone probably other i just checked those two these methods return the following getharvestlevel getharvesttool null i ve also tried calling forgehooks cantoolharvestblock with a pickaxe but that returns false should these return better values or am i just trying to use the wrong methods
| 0
|
170,527
| 13,191,838,236
|
IssuesEvent
|
2020-08-13 12:50:57
|
cs-shadow/buildstream
|
https://api.github.com/repos/cs-shadow/buildstream
|
closed
|
Test that an error is raised when staging to the elements build directory
|
enhancement tests
|
In GitLab by @tristanvb on Oct 22, 2017, 08:57
I had to fix the recent branch staging sources to the build directory.
The branch was checking this for *every source*, which is wrong, it needs to check once while staging *all sources* of an element to a non-empty element build directory (top level *only*), this is fixed.
After this I found that:
* A test was added to tests/sources, poking around in Source APIs, please *never do this*, that directory has to go away, and tests should be *only* using the CLI fixtures, with only a few exceptions, for instance for testing the `_yaml` module
* Deadcode was added to buildstream, files were added to tests/frontend/project, which look like they may be useful for testing this exact thing - however, these added files were unused. Please never add deadcode.
|
1.0
|
Test that an error is raised when staging to the elements build directory - In GitLab by @tristanvb on Oct 22, 2017, 08:57
I had to fix the recent branch staging sources to the build directory.
The branch was checking this for *every source*, which is wrong, it needs to check once while staging *all sources* of an element to a non-empty element build directory (top level *only*), this is fixed.
After this I found that:
* A test was added to tests/sources, poking around in Source APIs, please *never do this*, that directory has to go away, and tests should be *only* using the CLI fixtures, with only a few exceptions, for instance for testing the `_yaml` module
* Deadcode was added to buildstream, files were added to tests/frontend/project, which look like they may be useful for testing this exact thing - however, these added files were unused. Please never add deadcode.
|
non_process
|
test that an error is raised when staging to the elements build directory in gitlab by tristanvb on oct i had to fix the recent branch staging sources to the build directory the branch was checking this for every source which is wrong it needs to check once while staging all sources of an element to a non empty element build directory top level only this is fixed after this i found that a test was added to tests sources poking around in source apis please never do this that directory has to go away and tests should be only using the cli fixtures with only a few exceptions for instance for testing the yaml module deadcode was added to buildstream files were added to tests frontend project which look like they may be useful for testing this exact thing however these added files were unused please never add deadcode
| 0
|
24,621
| 6,557,323,135
|
IssuesEvent
|
2017-09-06 16:59:36
|
LukasKalbertodt/luten
|
https://api.github.com/repos/LukasKalbertodt/luten
|
opened
|
Add passwords to `add_dummies` executable
|
E-easy K-improvement W-code
|
All dummy users created by `add_dummies` could have a default password "dummy".
|
1.0
|
Add passwords to `add_dummies` executable - All dummy users created by `add_dummies` could have a default password "dummy".
|
non_process
|
add passwords to add dummies executable all dummy users created by add dummies could have a default password dummy
| 0
|
19,501
| 25,809,547,080
|
IssuesEvent
|
2022-12-11 18:23:16
|
brucemiller/LaTeXML
|
https://api.github.com/repos/brucemiller/LaTeXML
|
closed
|
\bibitem and \cite seems to be case-insensitive
|
bug postprocessing
|
In LaTeX, `\cite` is case-sensitive, but in LaTeXML, `\cite` seems to be case-insensitive.
A minimal example is [cite-test.zip](https://github.com/brucemiller/LaTeXML/files/7285455/cite-test.zip), which results in “[1] vs [2]” in LaTeX, but “[2] vs [2]” in LaTeXML.
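For reference (reconstructed here, not the actual attachment), a self-contained test in the same spirit could be:
```latex
\documentclass{article}
\begin{document}
% Two keys differing only in case; LaTeX resolves them to different entries,
% so this line should read "[1] vs [2]".
\cite{key} vs \cite{Key}

\begin{thebibliography}{9}
\bibitem{key} First reference.
\bibitem{Key} Second reference.
\end{thebibliography}
\end{document}
```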
|
1.0
|
\bibitem and \cite seems to be case-insensitive - In LaTeX, `\cite` is case-sensitive, but in LaTeXML, `\cite` seems to be case-insensitive.
A minimal example is [cite-test.zip](https://github.com/brucemiller/LaTeXML/files/7285455/cite-test.zip), which results in “[1] vs [2]” in LaTeX, but “[2] vs [2]” in LaTeXML.
|
process
|
bibitem and cite seems to be case insensitive in latex cite is case sensitive but for latexml cite seems to be case insensitive a minimal example is which results in “ vs ” in latex but “ vs ” in latexml
| 1
|
10,742
| 13,535,948,397
|
IssuesEvent
|
2020-09-16 08:20:56
|
panther-labs/panther
|
https://api.github.com/repos/panther-labs/panther
|
closed
|
SQS source messages having wrong source_id/source_label and failing classification
|
bug p0 team:data processing
|
### Describe the bug
When creating more than 1 SQS sources, Panther doesn't classify events properly.
### Steps to reproduce
Steps to reproduce the behavior:
1. Go to Panther UI
2. Onboard more than one SQS source, each with a different log type
3. Push some data logs. You will see some misclassification errors
### Expected behavior
Every event processed through SQS should be classified properly. Also, every message sent through SQS should have the right source id and source label.
|
1.0
|
SQS source messages having wrong source_id/source_label and failing classification - ### Describe the bug
When creating more than 1 SQS sources, Panther doesn't classify events properly.
### Steps to reproduce
Steps to reproduce the behavior:
1. Go to Panther UI
2. Onboard more than one SQS source, each with a different log type
3. Push some data logs. You will see some misclassification errors
### Expected behavior
Every event processed through SQS should be classified properly. Also, every message sent through SQS should have the right source id and source label.
|
process
|
sqs source messages having wrong source id source label and failing classification describe the bug when creating more than sqs sources panther doesn t classify events properly steps to reproduce steps to reproduce the behavior go to panther ui onboard more than one sqs sources each with different log type push some data logs you will see some misclassification errors expected behavior every event processed through sqs should be classified properly also every message sent through sqs should have the right source id and source label
| 1
|
17,711
| 12,512,263,278
|
IssuesEvent
|
2020-06-02 22:19:48
|
APSIMInitiative/ApsimX
|
https://api.github.com/repos/APSIMInitiative/ApsimX
|
reopened
|
Experiment in Agroforestry - factor only applied to first zone
|
bug interface/infrastructure
|
Attached is a single tree agroforestry simulation in an experiment with one factor of several N rates. The N rate factor is applied only to the first zone of the simulation, but should be applied to all zones.
This problem might exist for the Windbreak agroforestry system also.
[WorkingFolder.zip](https://github.com/APSIMInitiative/ApsimX/files/4699124/WorkingFolder.zip)
|
1.0
|
Experiment in Agroforestry - factor only applied to first zone - Attached is a single tree agroforestry simulation in an experiment with one factor of several N rates. The N rate factor is applied only to the first zone of the simulation, but should be applied to all zones.
This problem might exist for the Windbreak agroforestry system also.
[WorkingFolder.zip](https://github.com/APSIMInitiative/ApsimX/files/4699124/WorkingFolder.zip)
|
non_process
|
experiment in agroforestry factor only applied to first zone attached is a single tree agroforestry simulation in an experiment with one factor of several n rates the n rate factor is applied only to the first zone of the simulation but should be applied to all zones this problem might exist for the windbreak agroforestry system also
| 0
|
152,233
| 23,935,820,122
|
IssuesEvent
|
2022-09-11 07:54:06
|
jonallamas/simple-wallet
|
https://api.github.com/repos/jonallamas/simple-wallet
|
closed
|
[Dev] Sending tokens to other wallets
|
enhancement design development in progress
|
- [x] Design the view for switching between tokens

- [x] Implement the design
- [x] Implement the send functionality
|
1.0
|
[Dev] Sending tokens to other wallets - - [x] Design the view for switching between tokens

- [x] Implement the design
- [x] Implement the send functionality
|
non_process
|
envío de tokens a otras wallets diseñar vista de cómo cambiar entre tokens implementar diseño implementar funcionalidad de envío
| 0
|
194,255
| 22,261,918,305
|
IssuesEvent
|
2022-06-10 01:51:12
|
panasalap/linux-4.19.72_1
|
https://api.github.com/repos/panasalap/linux-4.19.72_1
|
reopened
|
CVE-2021-26931 (Medium) detected in linux-yoctov5.4.51
|
security vulnerability
|
## CVE-2021-26931 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/c5a08fe8179013aad614165d792bc5b436591df6">c5a08fe8179013aad614165d792bc5b436591df6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/block/xen-blkback/blkback.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/block/xen-blkback/blkback.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in the Linux kernel 2.6.39 through 5.10.16, as used in Xen. Block, net, and SCSI backends consider certain errors a plain bug, deliberately causing a kernel crash. For errors potentially being at least under the influence of guests (such as out of memory conditions), it isn't correct to assume a plain bug. Memory allocations potentially causing such crashes occur only when Linux is running in PV mode, though. This affects drivers/block/xen-blkback/blkback.c and drivers/xen/xen-scsiback.c.
<p>Publish Date: 2021-02-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26931>CVE-2021-26931</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-26931">https://nvd.nist.gov/vuln/detail/CVE-2021-26931</a></p>
<p>Release Date: 2021-02-17</p>
<p>Fix Resolution: linux-libc-headers - 5.13;linux-yocto - 5.4.20+gitAUTOINC+c11911d4d1_f4d7dbafb1,4.8.26+gitAUTOINC+1c60e003c7_27efc3ba68</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-26931 (Medium) detected in linux-yoctov5.4.51 - ## CVE-2021-26931 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yoctov5.4.51</b></p></summary>
<p>
<p>Yocto Linux Embedded kernel</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto>https://git.yoctoproject.org/git/linux-yocto</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.19.72/commit/c5a08fe8179013aad614165d792bc5b436591df6">c5a08fe8179013aad614165d792bc5b436591df6</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/block/xen-blkback/blkback.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/block/xen-blkback/blkback.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
An issue was discovered in the Linux kernel 2.6.39 through 5.10.16, as used in Xen. Block, net, and SCSI backends consider certain errors a plain bug, deliberately causing a kernel crash. For errors potentially being at least under the influence of guests (such as out of memory conditions), it isn't correct to assume a plain bug. Memory allocations potentially causing such crashes occur only when Linux is running in PV mode, though. This affects drivers/block/xen-blkback/blkback.c and drivers/xen/xen-scsiback.c.
<p>Publish Date: 2021-02-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-26931>CVE-2021-26931</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-26931">https://nvd.nist.gov/vuln/detail/CVE-2021-26931</a></p>
<p>Release Date: 2021-02-17</p>
<p>Fix Resolution: linux-libc-headers - 5.13;linux-yocto - 5.4.20+gitAUTOINC+c11911d4d1_f4d7dbafb1,4.8.26+gitAUTOINC+1c60e003c7_27efc3ba68</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linux cve medium severity vulnerability vulnerable library linux yocto linux embedded kernel library home page a href found in head commit a href found in base branch master vulnerable source files drivers block xen blkback blkback c drivers block xen blkback blkback c vulnerability details an issue was discovered in the linux kernel through as used in xen block net and scsi backends consider certain errors a plain bug deliberately causing a kernel crash for errors potentially being at least under the influence of guests such as out of memory conditions it isn t correct to assume a plain bug memory allocations potentially causing such crashes occur only when linux is running in pv mode though this affects drivers block xen blkback blkback c and drivers xen xen scsiback c publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution linux libc headers linux yocto gitautoinc gitautoinc step up your open source security game with mend
| 0
|
112,233
| 17,080,655,180
|
IssuesEvent
|
2021-07-08 04:21:15
|
MohamedElashri/Zotero-Docker
|
https://api.github.com/repos/MohamedElashri/Zotero-Docker
|
closed
|
WS-2019-0164 (Medium) detected in decompress-zip-0.3.0.tgz
|
security vulnerability
|
## WS-2019-0164 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>decompress-zip-0.3.0.tgz</b></p></summary>
<p>Extract files from a ZIP archive</p>
<p>Library home page: <a href="https://registry.npmjs.org/decompress-zip/-/decompress-zip-0.3.0.tgz">https://registry.npmjs.org/decompress-zip/-/decompress-zip-0.3.0.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/decompress-zip/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- :x: **decompress-zip-0.3.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MohamedElashri/Zotero-Docker/commit/6f83bd2bfd0767b41690936c81cfaee93c63820e">6f83bd2bfd0767b41690936c81cfaee93c63820e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
decompress-zip 0.2.x before 0.2.2 and 0.3.x before 0.3.2 has a Zip-Slip vulnerability, an arbitrary file write vulnerability.
<p>Publish Date: 2019-01-16
<p>URL: <a href=https://github.com/bower/decompress-zip/commit/9a908bd30ec9d9b2009110691cfcbe2b96f07c95>WS-2019-0164</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/777">https://www.npmjs.com/advisories/777</a></p>
<p>Release Date: 2019-07-15</p>
<p>Fix Resolution: 0.2.2,0.3.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2019-0164 (Medium) detected in decompress-zip-0.3.0.tgz - ## WS-2019-0164 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>decompress-zip-0.3.0.tgz</b></p></summary>
<p>Extract files from a ZIP archive</p>
<p>Library home page: <a href="https://registry.npmjs.org/decompress-zip/-/decompress-zip-0.3.0.tgz">https://registry.npmjs.org/decompress-zip/-/decompress-zip-0.3.0.tgz</a></p>
<p>Path to dependency file: Zotero-Docker/client/zotero-build/xpi/package.json</p>
<p>Path to vulnerable library: Zotero-Docker/client/zotero-build/xpi/node_modules/decompress-zip/package.json</p>
<p>
Dependency Hierarchy:
- jpm-1.2.2.tgz (Root Library)
- :x: **decompress-zip-0.3.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MohamedElashri/Zotero-Docker/commit/6f83bd2bfd0767b41690936c81cfaee93c63820e">6f83bd2bfd0767b41690936c81cfaee93c63820e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
decompress-zip 0.2.x before 0.2.2 and 0.3.x before 0.3.2 has a Zip-Slip vulnerability, an arbitrary file write vulnerability.
<p>Publish Date: 2019-01-16
<p>URL: <a href=https://github.com/bower/decompress-zip/commit/9a908bd30ec9d9b2009110691cfcbe2b96f07c95>WS-2019-0164</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.0</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/advisories/777">https://www.npmjs.com/advisories/777</a></p>
<p>Release Date: 2019-07-15</p>
<p>Fix Resolution: 0.2.2,0.3.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in decompress zip tgz ws medium severity vulnerability vulnerable library decompress zip tgz extract files from a zip archive library home page a href path to dependency file zotero docker client zotero build xpi package json path to vulnerable library zotero docker client zotero build xpi node modules decompress zip package json dependency hierarchy jpm tgz root library x decompress zip tgz vulnerable library found in head commit a href found in base branch main vulnerability details decompress zip x before and x before has a zip slip vulnerability an arbitrary file write vulnerability publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
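The decompress-zip advisory in the record above is an instance of the generic Zip-Slip pattern: an archive entry whose name contains "../" segments escapes the extraction directory and overwrites arbitrary files. A minimal Python sketch of the usual mitigation follows; it is a generic illustration of the path check, not the decompress-zip fix itself, and the paths in the usage comment are made up.

```python
import os
import zipfile

def safe_extract(zip_path: str, dest_dir: str) -> None:
    """Extract a ZIP archive while rejecting Zip-Slip style entry names."""
    dest_root = os.path.realpath(dest_dir)
    with zipfile.ZipFile(zip_path) as archive:
        for entry in archive.namelist():
            target = os.path.realpath(os.path.join(dest_root, entry))
            # An entry such as "../../etc/passwd" resolves outside dest_root.
            if target != dest_root and not target.startswith(dest_root + os.sep):
                raise ValueError(f"blocked path traversal entry: {entry}")
        archive.extractall(dest_root)

# Hypothetical usage with made-up paths:
# safe_extract("addon.xpi.zip", "build/unpacked")
```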
|
166,255
| 12,941,933,078
|
IssuesEvent
|
2020-07-18 00:10:02
|
NakiNorton/refactor-tractor-fitlitA
|
https://api.github.com/repos/NakiNorton/refactor-tractor-fitlitA
|
opened
|
Fix bug(s) in Sleep-test-js
|
bug sleep class testing
|
- [ ] Sleep-test is currently not working as it should, we need to go in and find the issue
|
1.0
|
Fix bug(s) in Sleep-test-js - - [ ] Sleep-test is currently not working as it should, we need to go in and find the issue
|
non_process
|
fix bug s in sleep test js sleep test is currently not working as it should we need to go in and find the issue
| 0
|
12,066
| 14,739,747,398
|
IssuesEvent
|
2021-01-07 07:50:57
|
kdjstudios/SABillingGitlab
|
https://api.github.com/repos/kdjstudios/SABillingGitlab
|
closed
|
Consolidate Posting Scripts
|
anc-process anp-not prioritized ant-enhancement
|
In GitLab by @tim.traylor on Sep 18, 2018, 09:32
Please add a new 'posting_script' field to the sites collection and we would then use that field to indicate which posting script should run for that site. This way we can use a single posting script file for all of the VCC sites.
Also, please add an editable field for this to the UI, but only visible to admins.
I estimate this to be about 3 or 4 hours.
|
1.0
|
Consolidate Posting Scripts - In GitLab by @tim.traylor on Sep 18, 2018, 09:32
Please add a new 'posting_script' field to the sites collection and we would then use that field to indicate which posting script should run for that site. This way we can use a single posting script file for all of the VCC sites.
Also, please add an editable field for this to the UI, but only visible to admins.
I estimate this to be about 3 or 4 hours.
|
process
|
consolidate posting scripts in gitlab by tim traylor on sep please add a new posting script field to the sites collection and we would then use that field to indicate which posting script should run for that site this way we can use a single posting script file for all of the vcc sites also please add an editable field for this to the ui but only visible to admins i estimate this to be about or hours
| 1
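The record above describes driving a single shared posting script from a per-site 'posting_script' field. A minimal Python sketch of that dispatch idea follows; the site documents, field values, and handler names are invented for illustration and do not come from the SABilling codebase.

```python
# Hypothetical site documents as they might look once the field exists.
sites = [
    {"name": "vcc-site-a", "posting_script": "vcc_shared"},
    {"name": "old-site", "posting_script": "legacy"},
]

def post_vcc_shared(site):
    print(f"running the shared VCC posting logic for {site['name']}")

def post_legacy(site):
    print(f"running the legacy posting logic for {site['name']}")

# One registry maps the stored field value to the code that should run.
POSTING_SCRIPTS = {
    "vcc_shared": post_vcc_shared,
    "legacy": post_legacy,
}

for site in sites:
    handler = POSTING_SCRIPTS.get(site.get("posting_script"), post_legacy)
    handler(site)
```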
|
20,238
| 26,845,099,351
|
IssuesEvent
|
2023-02-03 06:02:13
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Remaining geometry-issues in dataset after applying "fix geometries"-processing tool
|
Feedback stale Data Provider Processing Bug
|
### What is the bug or the crash?
The "fix-geometries"-processing-tool is expected to remove and repair all geometry-issues from a dataset
After applying the tool to the sample dataset "ewf5.shp", QGIS shows that all geometries are valid.

But we could not import the dataset into a 3rd-party software which uses GIS data.
To narrow down the issue, we checked the dataset with ArcMap (CheckGeometry tool).
It still showed "intersection with itself" issues (and a spatial index issue).

The fix-geometries tool in ArcMap had to be run twice to fix the issues of the sample dataset. But in the end the new, valid dataset (export_output_3.shp) could be used in the 3rd-party software.
Sample-data (shp-file with 2 invalid polygons) can be downloaded from [https://databox0230.krz.de/public/download-shares/fm9EPlb5JdrEEjWBLbgduiTOTa6T3QX7](url)
### Steps to reproduce the issue
1) Take dataset "ewf5.shp" (from (https://databox0230.krz.de/public/download-shares/fm9EPlb5JdrEEjWBLbgduiTOTa6T3QX7) and apply "processing-fix-geometries" tool
1.1) Settings: Edit in place
1.2) Apply with repair-method "structure"
2) Take fixed dataset and apply "processing-fix-geometries" tool
2.1) Settings: Edit in place
2.2) Apply with repair-method "linework"
3) Check validity of dataset in QGIS ("Check validity-processing-tool")
4) Check validity of dataset / repair dataset in ArcMap ("GeometryChecker")
### Versions
QGIS version | 3.28.0-Firenze | QGIS code revision | ed3ad0430f3
-- | -- | -- | --
Qt version | 5.15.3
Python version | 3.9.5
GDAL/OGR version | 3.5.2
PROJ version | 9.1.0
EPSG Registry database version | v10.074 (2022-08-01)
GEOS version | 3.10.3-CAPI-1.16.1
SQLite version | 3.39.4
PDAL version | 2.4.3
PostgreSQL client version | unknown
SpatiaLite version | 5.0.1
QWT version | 6.1.6
QScintilla2 version | 2.13.1
OS version | Windows 10 Version 2009
| | |
Active Python plugins
db_manager | 0.1.20
grassprovider | 2.12.99
MetaSearch | 0.3.6
processing | 2.12.99
sagaprovider | 2.12.99
qfieldsync | v4.2.0
QGIS version
3.28.0-Firenze
QGIS code revision
[ed3ad0430f3](https://github.com/qgis/QGIS/commit/ed3ad0430f3)
Qt version
5.15.3
Python version
3.9.5
GDAL/OGR version
3.5.2
PROJ version
9.1.0
EPSG Registry database version
v10.074 (2022-08-01)
GEOS version
3.10.3-CAPI-1.16.1
SQLite version
3.39.4
PDAL version
2.4.3
PostgreSQL client version
unknown
SpatiaLite version
5.0.1
QWT version
6.1.6
QScintilla2 version
2.13.1
OS version
Windows 10 Version 2009
Active Python plugins
db_manager
0.1.20
grassprovider
2.12.99
MetaSearch
0.3.6
processing
2.12.99
sagaprovider
2.12.99
qfieldsync
v4.2.0
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
|
1.0
|
Remaining geometry-issues in dataset after applying "fix geometries"-processing tool - ### What is the bug or the crash?
The "fix-geometries"-processing-tool is expected to remove and repair all geometry-issues from a dataset
After applying the tool to the sample dataset "ewf5.shp", QGIS shows that all geometries are valid.

But we could not import the dataset into a 3rd-party software which uses GIS data.
To narrow down the issue, we checked the dataset with ArcMap (CheckGeometry tool).
It still showed "intersection with itself" issues (and a spatial index issue).

The fix-geometries tool in ArcMap had to be run twice to fix the issues of the sample dataset. But in the end the new, valid dataset (export_output_3.shp) could be used in the 3rd-party software.
Sample-data (shp-file with 2 invalid polygons) can be downloaded from [https://databox0230.krz.de/public/download-shares/fm9EPlb5JdrEEjWBLbgduiTOTa6T3QX7](url)
### Steps to reproduce the issue
1) Take dataset "ewf5.shp" (from (https://databox0230.krz.de/public/download-shares/fm9EPlb5JdrEEjWBLbgduiTOTa6T3QX7) and apply "processing-fix-geometries" tool
1.1) Settings: Edit in place
1.2) Apply with repair-method "structure"
2) Take fixed dataset and apply "processing-fix-geometries" tool
2.1) Settings: Edit in place
2.2) Apply with repair-method "linework"
3) Check validity of dataset in QGIS ("Check validity-processing-tool")
4) Check validity of dataset / repair dataset in ArcMap ("GeometryChecker")
### Versions
QGIS version | 3.28.0-Firenze | QGIS code revision | ed3ad0430f3
-- | -- | -- | --
Qt version | 5.15.3
Python version | 3.9.5
GDAL/OGR version | 3.5.2
PROJ version | 9.1.0
EPSG Registry database version | v10.074 (2022-08-01)
GEOS version | 3.10.3-CAPI-1.16.1
SQLite version | 3.39.4
PDAL version | 2.4.3
PostgreSQL client version | unknown
SpatiaLite version | 5.0.1
QWT version | 6.1.6
QScintilla2 version | 2.13.1
OS version | Windows 10 Version 2009
| | |
Active Python plugins
db_manager | 0.1.20
grassprovider | 2.12.99
MetaSearch | 0.3.6
processing | 2.12.99
sagaprovider | 2.12.99
qfieldsync | v4.2.0
QGIS version
3.28.0-Firenze
QGIS code revision
[ed3ad0430f3](https://github.com/qgis/QGIS/commit/ed3ad0430f3)
Qt version
5.15.3
Python version
3.9.5
GDAL/OGR version
3.5.2
PROJ version
9.1.0
EPSG Registry database version
v10.074 (2022-08-01)
GEOS version
3.10.3-CAPI-1.16.1
SQLite version
3.39.4
PDAL version
2.4.3
PostgreSQL client version
unknown
SpatiaLite version
5.0.1
QWT version
6.1.6
QScintilla2 version
2.13.1
OS version
Windows 10 Version 2009
Active Python plugins
db_manager
0.1.20
grassprovider
2.12.99
MetaSearch
0.3.6
processing
2.12.99
sagaprovider
2.12.99
qfieldsync
v4.2.0
### Supported QGIS version
- [X] I'm running a supported QGIS version according to the roadmap.
### New profile
- [X] I tried with a new QGIS profile
### Additional context
_No response_
|
process
|
remaining geometry issues in dataset after applying fix geometries processing tool what is the bug or the crash the fix geometries processing tool is expected to remove and repair all geometry issues from a dataset after applying the tool to the sample dataset shp qgis shows that all geometries are valid but in a party software which uses gis data we could not import the dataset to narrow down the issue we checked the dataset with arcmap checkgeometry tool it showed still intersection with itself issues and a spatial index issue the fix geometies tool in arcmap had to be run twice to fix the issues of the sample dataset but in the end the new valid dataset export output shp coud be used in party software sample data shp file with invalid polygons can be downloaded from url steps to reproduce the issue take dataset shp from and apply processing fix geometries tool settings edit in place apply with repair method structure take fixed dataset and apply processing fix geometries tool settings edit in place apply with repair method linework check validity of dataset in qgis check validity processing tool check validity of dataset repair dataset in arcmap geometrychecker versions doctype html public dtd html en p li white space pre wrap qgis version firenze qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version unknown spatialite version qwt version version os version windows version active python plugins db manager grassprovider metasearch processing sagaprovider qfieldsync qgis version firenze qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version pdal version postgresql client version unknown spatialite version qwt version version os version windows version active python plugins db manager grassprovider metasearch processing sagaprovider qfieldsync supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
| 1
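The QGIS report above can be scripted for a reproducible check. The sketch below, intended for the QGIS Python console, runs the fix-geometries algorithm and then the validity check on the result; the algorithm ids and parameter names are quoted from memory and may differ between QGIS releases, and the input path is a placeholder.

```python
# Run inside the QGIS Python console; algorithm ids and parameter names are
# quoted from memory and may differ slightly between QGIS releases.
import processing

fixed = processing.run(
    "native:fixgeometries",
    {"INPUT": "/data/ewf5.shp", "OUTPUT": "memory:fixed"},
)["OUTPUT"]

report = processing.run(
    "qgis:checkvalidity",
    {
        "INPUT_LAYER": fixed,
        "METHOD": 2,  # assumed: 2 = GEOS check, 1 = QGIS internal check
        "VALID_OUTPUT": "memory:valid",
        "INVALID_OUTPUT": "memory:invalid",
        "ERROR_OUTPUT": "memory:errors",
    },
)
# The result dict should expose counts such as INVALID_COUNT.
print("invalid features after fixing:", report.get("INVALID_COUNT"))
```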
|
3,793
| 6,775,330,545
|
IssuesEvent
|
2017-10-27 13:56:30
|
syndesisio/syndesis-ui
|
https://api.github.com/repos/syndesisio/syndesis-ui
|
opened
|
Improve scripts; support macOS
|
dev process
|
For local development.
- Check for `google-chrome` in `./scripts/open-browser.sh`, if not use `open` to open browser in the `open-browser` shell script.
- Alternatively, use npm/yarn to handle these. Must be able to handle env arguments.
|
1.0
|
Improve scripts; support macOS - For local development.
- Check for `google-chrome` in `./scripts/open-browser.sh`, if not use `open` to open browser in the `open-browser` shell script.
- Alternatively, use npm/yarn to handle these. Must be able to handle env arguments.
|
process
|
improve scripts support macos for local development check for google chrome in scripts open browser sh if not use open to open browser in the open browser shell script alternatively use npm yarn to handle these must be able to handle env arguments
| 1
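The syndesis-ui record above asks for a google-chrome check with a macOS `open` fallback. The original script is shell, but the same logic can be sketched in Python as below; the dev-server URL is a placeholder and the fallback commands are assumptions about the target platforms.

```python
import shutil
import subprocess
import sys

def open_browser(url: str) -> None:
    """Prefer google-chrome when it is on PATH, otherwise fall back."""
    chrome = shutil.which("google-chrome")
    if chrome:
        subprocess.Popen([chrome, url])
    elif sys.platform == "darwin":
        subprocess.Popen(["open", url])      # macOS default-browser fallback
    else:
        subprocess.Popen(["xdg-open", url])  # generic Linux fallback

open_browser("http://localhost:4200")  # placeholder dev-server URL
```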
|
55,965
| 14,075,667,333
|
IssuesEvent
|
2020-11-04 09:22:37
|
teena24/WebGoat
|
https://api.github.com/repos/teena24/WebGoat
|
closed
|
CVE-2020-10719 (Medium) detected in undertow-core-2.0.28.Final.jar - autoclosed
|
security vulnerability
|
## CVE-2020-10719 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>undertow-core-2.0.28.Final.jar</b></p></summary>
<p>Undertow</p>
<p>Library home page: <a href="http://www.jboss.org/">http://www.jboss.org/</a></p>
<p>Path to dependency file: WebGoat/webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/undertow/undertow-core/2.0.28.Final/undertow-core-2.0.28.Final.jar,/home/wss-scanner/.m2/repository/io/undertow/undertow-core/2.0.28.Final/undertow-core-2.0.28.Final.jar,/home/wss-scanner/.m2/repository/io/undertow/undertow-core/2.0.28.Final/undertow-core-2.0.28.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-undertow-2.2.2.RELEASE.jar (Root Library)
- :x: **undertow-core-2.0.28.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/teena24/WebGoat/commit/b8a568f6e08fcde3c08370e69ce7236fef395ad5">b8a568f6e08fcde3c08370e69ce7236fef395ad5</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in Undertow in versions before 2.1.1.Final, regarding the processing of invalid HTTP requests with large chunk sizes. This flaw allows an attacker to take advantage of HTTP request smuggling.
<p>Publish Date: 2020-05-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10719>CVE-2020-10719</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10719">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10719</a></p>
<p>Release Date: 2020-05-26</p>
<p>Fix Resolution: io.undertow:undertow-core:2.1.1.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.undertow","packageName":"undertow-core","packageVersion":"2.0.28.Final","isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-undertow:2.2.2.RELEASE;io.undertow:undertow-core:2.0.28.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.undertow:undertow-core:2.1.1.Final"}],"vulnerabilityIdentifier":"CVE-2020-10719","vulnerabilityDetails":"A flaw was found in Undertow in versions before 2.1.1.Final, regarding the processing of invalid HTTP requests with large chunk sizes. This flaw allows an attacker to take advantage of HTTP request smuggling.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10719","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-10719 (Medium) detected in undertow-core-2.0.28.Final.jar - autoclosed - ## CVE-2020-10719 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>undertow-core-2.0.28.Final.jar</b></p></summary>
<p>Undertow</p>
<p>Library home page: <a href="http://www.jboss.org/">http://www.jboss.org/</a></p>
<p>Path to dependency file: WebGoat/webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/io/undertow/undertow-core/2.0.28.Final/undertow-core-2.0.28.Final.jar,/home/wss-scanner/.m2/repository/io/undertow/undertow-core/2.0.28.Final/undertow-core-2.0.28.Final.jar,/home/wss-scanner/.m2/repository/io/undertow/undertow-core/2.0.28.Final/undertow-core-2.0.28.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-undertow-2.2.2.RELEASE.jar (Root Library)
- :x: **undertow-core-2.0.28.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/teena24/WebGoat/commit/b8a568f6e08fcde3c08370e69ce7236fef395ad5">b8a568f6e08fcde3c08370e69ce7236fef395ad5</a></p>
<p>Found in base branch: <b>develop</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A flaw was found in Undertow in versions before 2.1.1.Final, regarding the processing of invalid HTTP requests with large chunk sizes. This flaw allows an attacker to take advantage of HTTP request smuggling.
<p>Publish Date: 2020-05-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10719>CVE-2020-10719</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10719">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10719</a></p>
<p>Release Date: 2020-05-26</p>
<p>Fix Resolution: io.undertow:undertow-core:2.1.1.Final</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.undertow","packageName":"undertow-core","packageVersion":"2.0.28.Final","isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-undertow:2.2.2.RELEASE;io.undertow:undertow-core:2.0.28.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.undertow:undertow-core:2.1.1.Final"}],"vulnerabilityIdentifier":"CVE-2020-10719","vulnerabilityDetails":"A flaw was found in Undertow in versions before 2.1.1.Final, regarding the processing of invalid HTTP requests with large chunk sizes. This flaw allows an attacker to take advantage of HTTP request smuggling.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10719","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_process
|
cve medium detected in undertow core final jar autoclosed cve medium severity vulnerability vulnerable library undertow core final jar undertow library home page a href path to dependency file webgoat webwolf pom xml path to vulnerable library home wss scanner repository io undertow undertow core final undertow core final jar home wss scanner repository io undertow undertow core final undertow core final jar home wss scanner repository io undertow undertow core final undertow core final jar dependency hierarchy spring boot starter undertow release jar root library x undertow core final jar vulnerable library found in head commit a href found in base branch develop vulnerability details a flaw was found in undertow in versions before final regarding the processing of invalid http requests with large chunk sizes this flaw allows an attacker to take advantage of http request smuggling publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io undertow undertow core final isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails a flaw was found in undertow in versions before final regarding the processing of invalid http requests with large chunk sizes this flaw allows an attacker to take advantage of http request smuggling vulnerabilityurl
| 0
|
136,096
| 30,475,167,580
|
IssuesEvent
|
2023-07-17 15:59:36
|
Rothamsted/knetminer
|
https://api.github.com/repos/Rothamsted/knetminer
|
reopened
|
Render Gene view using JSON
|
code review UX & UI
|
Part 2 of [3](https://github.com/Rothamsted/knetminer/issues/692). See 1 [here](https://github.com/Rothamsted/knetminer/issues/734).
Gene view to be updated to parse the returned JSON.
- [x] 1. @marco-brandizi - ensure the code is in order (is the [code](https://github.com/Rothamsted/knetminer/issues/655) from #655 on Master?)
- [x] 2. @marco-brandizi make Distance and P-value numeric formats.
- [x] 3. @marco-brandizi to ensure Distance and P-values are being returned for each evidence.
- [x] 4. @marco-brandizi please explain to @lawal-olaotan here how to call the API with the flag.
- [ ] 5. @lawal-olaotan to update the frontend. Since the new format (see #655) will now include "Distance" and "P-Value", we can use those to work on [Gene Neighbourhood functionality](https://github.com/Rothamsted/knetminer/issues/692).
|
1.0
|
Render Gene view using JSON - Part 2 of [3](https://github.com/Rothamsted/knetminer/issues/692). See 1 [here](https://github.com/Rothamsted/knetminer/issues/734).
Gene view to be updated to parse the returned JSON.
- [x] 1. @marco-brandizi - ensure the code is in order (is the [code](https://github.com/Rothamsted/knetminer/issues/655) from #655 on Master?)
- [x] 2. @marco-brandizi make Distance and P-value numeric formats.
- [x] 3. @marco-brandizi to ensure Distance and P-values are being returned for each evidence.
- [x] 4. @marco-brandizi please explain to @lawal-olaotan here how to call the API with the flag.
- [ ] 5. @lawal-olaotan to update the frontend. Since the new format (see #655) will now include "Distance" and "P-Value", we can use those to work on [Gene Neighbourhood functionality](https://github.com/Rothamsted/knetminer/issues/692).
|
non_process
|
render gene view using json part of see gene view to be updated to parse the returned json marco brandizi ensure the code is in order is the from on master marco brandizi make distance and p value numeric formats marco brandizi to ensure distance and p values are being returned for each evidence marco brandizi please explain to lawal olaotan here how to call the api with the flag lawal olaotan to update the frontend since the new format see will now include distance and p value we can use those to work on
| 0
|
344,535
| 24,817,896,766
|
IssuesEvent
|
2022-10-25 14:23:12
|
gatsbyjs/gatsby
|
https://api.github.com/repos/gatsbyjs/gatsby
|
closed
|
SOUBHAGYA SEKHAR
|
type: documentation status: triage needed
|
### Preliminary Checks
- [X] This issue is not a duplicate. Before opening a new issue, please search existing issues: https://github.com/gatsbyjs/gatsby/issues
- [X] This issue is not a question, feature request, RFC, or anything other than a bug report. Please post those things in GitHub Discussions: https://github.com/gatsbyjs/gatsby/discussions
### Summary
Hello
### Steps to Resolve this Issue
1.
2.
3.
...
Good
|
1.0
|
SOUBHAGYA SEKHAR - ### Preliminary Checks
- [X] This issue is not a duplicate. Before opening a new issue, please search existing issues: https://github.com/gatsbyjs/gatsby/issues
- [X] This issue is not a question, feature request, RFC, or anything other than a bug report. Please post those things in GitHub Discussions: https://github.com/gatsbyjs/gatsby/discussions
### Summary
Hello
### Steps to Resolve this Issue
1.
2.
3.
...
Good
|
non_process
|
soubhagya sekhar preliminary checks this issue is not a duplicate before opening a new issue please search existing issues this issue is not a question feature request rfc or anything other than a bug report please post those things in github discussions summary hello steps to resolve this issue good
| 0
|
305,374
| 9,368,468,395
|
IssuesEvent
|
2019-04-03 08:46:53
|
radical-cybertools/radical.pilot
|
https://api.github.com/repos/radical-cybertools/radical.pilot
|
closed
|
Bootstrapper should ensure that essential modules are loaded.
|
comp:agent:bootstrapper comp:agent:executor priority:medium topic:resource type:enhancement
|
e.g., should ensure that launch methods are found - that check should be made fatal.
|
1.0
|
Bootstrapper should ensure that essential modules are loaded. - e.g., should ensure that launch methods are found - that check should be made fatal.
|
non_process
|
bootstrapper should ensure that essential modules are loaded e g should ensure that launch methods are found that check should be made fatal
| 0
|
365,537
| 25,541,019,287
|
IssuesEvent
|
2022-11-29 15:18:50
|
Plutonomicon/cardano-transaction-lib
|
https://api.github.com/repos/Plutonomicon/cardano-transaction-lib
|
closed
|
Better visibility for purescript-bridge fork
|
documentation stage 5
|
Let's rename purescript-bridge to something like ctl-purescript-bridge and add a relevant section to our docs.
|
1.0
|
Better visibility for purescript-bridge fork - Let's rename purescript-bridge to something like ctl-purescript-bridge and add a relevant section to our docs.
|
non_process
|
better visibility for purescript bridge fork let s rename purescript bridge to something like ctl purescript bridge and add a relevant section to our docs
| 0
|
9,774
| 3,317,341,002
|
IssuesEvent
|
2015-11-06 21:11:08
|
cfpb/cfgov-refresh
|
https://api.github.com/repos/cfpb/cfgov-refresh
|
closed
|
Add directions for updating sheer
|
depreciated by cms move documentation
|
Do we currently have it documented anywhere what to do when sheer is outdated on an existing installation of cfgov-refresh? It would help to have some steps to update sheer documented. Perhaps in https://github.com/cfpb/cfgov-refresh/blob/flapjack/README.md#3-launch-sheer-to-serve-the-site
Also, https://github.com/cfpb/cfgov-refresh/blob/flapjack/README.md#updating-all-dependencies, probably shouldn't say "all" since sheer isn't updated there.
|
1.0
|
Add directions for updating sheer - Do we currently have it documented anywhere what to do when sheer is outdated on an existing installation of cfgov-refresh? It would help to have some steps to update sheer documented. Perhaps in https://github.com/cfpb/cfgov-refresh/blob/flapjack/README.md#3-launch-sheer-to-serve-the-site
Also, https://github.com/cfpb/cfgov-refresh/blob/flapjack/README.md#updating-all-dependencies, probably shouldn't say "all" since sheer isn't updated there.
|
non_process
|
add directions for updating sheer do we currently have it documented anywhere what to do when sheer is outdated on an existing installation of cfgov refresh it would help to have some steps to update sheer documented perhaps in also probably shouldn t say all since sheer isn t updated there
| 0
|
16,769
| 21,944,226,641
|
IssuesEvent
|
2022-05-23 21:43:38
|
hashgraph/hedera-json-rpc-relay
|
https://api.github.com/repos/hashgraph/hedera-json-rpc-relay
|
opened
|
Add liveness and readiness endpoint to relay
|
enhancement P2 process
|
### Problem
When deployed in a k8s instance a liveness and readiness endpoint is necessary for correct deployment coordination.
No such endpoints exist right now
### Solution
Add a `/liveness` and `/readiness` endpoint
- `/liveness` - return HTTP 200 post startup
- `/readiness` - return HTTP 200 post simple check e.g. valid chainId is set
### Alternatives
_No response_
|
1.0
|
Add liveness and readiness endpoint to relay - ### Problem
When deployed in a k8s instance a liveness and readiness endpoint is necessary for correct deployment coordination.
No such endpoints exist right now
### Solution
Add a `/liveness` and `/readiness` endpoint
- `/liveness` - return HTTP 200 post startup
- `/readiness` - return HTTP 200 post simple check e.g. valid chainId is set
### Alternatives
_No response_
|
process
|
add liveness and readiness endpoint to relay problem when deployed in a instance a liveness and readiness endpoint is necessary for correct deployment coordination no such endpoints exist right now solution add a liveness and readiness endpoint liveness return http post startup readiness return http post simple check e g valid chainid is set alternatives no response
| 1
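The relay issue above calls for `/liveness` and `/readiness` endpoints that return HTTP 200. A minimal Python/Flask sketch of that shape follows; the real relay is a Node.js project, and the chain-id value and port here are placeholders, so treat this only as an illustration of the liveness-versus-readiness split.

```python
from flask import Flask

app = Flask(__name__)

CHAIN_ID = "0x12a"  # placeholder for whatever config the relay validates

@app.route("/liveness")
def liveness():
    # Liveness only means the process is up and able to answer HTTP.
    return "OK", 200

@app.route("/readiness")
def readiness():
    # Readiness adds a cheap sanity check, e.g. a configured chain id.
    if CHAIN_ID:
        return "OK", 200
    return "chain id not configured", 503

if __name__ == "__main__":
    app.run(port=7546)  # port chosen arbitrarily for the sketch
```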
|
55,670
| 3,074,229,703
|
IssuesEvent
|
2015-08-20 05:15:02
|
pavel-pimenov/flylinkdc-r5xx
|
https://api.github.com/repos/pavel-pimenov/flylinkdc-r5xx
|
closed
|
Chat room (чат-комната)
|
enhancement imported Priority-Low Usability
|
_From [maks_vla...@mail.ru](https://code.google.com/u/109160490397369351897/) on January 06, 2011 09:12:20_
Why not do it the way FgLink does and "create a chat room (mini-hub)"? http://www.image123.net/tlmjoy5do9uypic.html And this thing there also amuses me: http://www.image123.net/uhtn50xgnsiwpic.html
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=272_
|
1.0
|
Chat room (чат-комната) - _From [maks_vla...@mail.ru](https://code.google.com/u/109160490397369351897/) on January 06, 2011 09:12:20_
Why not do it the way FgLink does and "create a chat room (mini-hub)"? http://www.image123.net/tlmjoy5do9uypic.html And this thing there also amuses me: http://www.image123.net/uhtn50xgnsiwpic.html
_Original issue: http://code.google.com/p/flylinkdc/issues/detail?id=272_
|
non_process
|
чат комната from on january почему бы не сделать как в fglink создать чат комнату минихаб и еще меня прикалывает эта штука там же original issue
| 0
|
21,091
| 28,044,160,632
|
IssuesEvent
|
2023-03-28 21:04:11
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
opened
|
Track PR merge times
|
Process
|
**Proposal**
With the recent publication of [Zephyr's Contributor and Reviewer Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html), it would be valuable to measure and track PR merge times. This data could inform whether policy changes are helping or hindering the velocity of the project.
[LFX insights](https://insights-v2.lfx.linuxfoundation.org/zep/trends) provides some of this data, but I don't know the methodology used, and the numbers don't look correct. The times seem far too low and the most recent data might be incomplete.


[GitHub's Insights](https://github.com/zephyrproject-rtos/zephyr/graphs/commit-activity) view only shows the total number of commits per week, without any data for merge and review times.
Tracking the median time for PRs to get the first review and to merge would be a good place to start. This tracking could later be enhanced to distinguish between bug fixes and new features. The expectation is that bug fixes will merge fairly quickly, while new features go through additional review.
|
1.0
|
Track PR merge times - **Proposal**
With the recent publication of [Zephyr's Contributor and Reviewer Expectations](https://docs.zephyrproject.org/latest/contribute/contributor_expectations.html), it would be valuable to measure and track PR merge times. This data could inform whether policy changes are helping or hindering the velocity of the project.
[LFX insights](https://insights-v2.lfx.linuxfoundation.org/zep/trends) provides some of this data, but I don't know the methodology used, and the numbers don't look correct. The times seem far too low and the most recent data might be incomplete.


[GitHub's Insights](https://github.com/zephyrproject-rtos/zephyr/graphs/commit-activity) view only shows the total number of commits per week, without any data for merge and review times.
Tracking the median time for PRs to get the first review and to merge would be a good place to start. This tracking could later be enhanced to distinguish between bug fixes and new features. The expectation is that bug fixes will merge fairly quickly, while new features go through additional review.
|
process
|
track pr merge times proposal with the recent publication of it would be valuable to measure and track pr merge times this data could inform whether policy changes are helping or hindering the velocity of the project provides some of this data but i don t know the methodology used and the numbers don t look correct the times seem far too low and the most recent data might be incomplete view only shows the total number of commits per week without any data for merge and review times tracking the median time for prs to get the first review and to merge would be a good place to start this tracking could later be enhanced to distinguish between bug fixes and new features the expectation is that bug fixes will merge fairly quickly while new features go through additional review
| 1
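The Zephyr proposal above amounts to computing a median over (merged_at - created_at) for merged pull requests. A small Python sketch against the public GitHub REST API follows; it is unauthenticated (so heavily rate limited), looks at only a few pages of recently closed PRs, and does not yet distinguish bug fixes from features.

```python
import statistics
from datetime import datetime

import requests

def median_merge_hours(owner: str, repo: str, pages: int = 3) -> float:
    """Median hours from PR creation to merge over recently closed PRs."""
    durations = []
    for page in range(1, pages + 1):
        resp = requests.get(
            f"https://api.github.com/repos/{owner}/{repo}/pulls",
            params={"state": "closed", "per_page": 100, "page": page},
            timeout=30,
        )
        resp.raise_for_status()
        for pr in resp.json():
            if not pr.get("merged_at"):
                continue  # closed without being merged
            created = datetime.fromisoformat(pr["created_at"].rstrip("Z"))
            merged = datetime.fromisoformat(pr["merged_at"].rstrip("Z"))
            durations.append((merged - created).total_seconds() / 3600)
    return statistics.median(durations)

print(median_merge_hours("zephyrproject-rtos", "zephyr"))
```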
|
148,887
| 11,870,239,154
|
IssuesEvent
|
2020-03-26 12:26:51
|
eclipse/openj9
|
https://api.github.com/repos/eclipse/openj9
|
opened
|
Windows javac code.Symbol.complete NullPointerException
|
test failure
|
https://ci.eclipse.org/openj9/job/Test_openjdk14_j9_extended.system_x86-64_windows_Nightly/46
SharedClasses.SCM01.SingleCL_0
```
CSC 26/03/20 00:43:28: Compiling java files
CSC Compilation failed
CSC stderr An exception has occurred in the compiler (14-internal). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program, the following diagnostic, and the parameters passed to the Java compiler in your report. Thank you.
CSC stderr java.lang.NullPointerException
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol.complete(Symbol.java:670)
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol$PackageSymbol.members(Symbol.java:1164)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$ImportsPhase.resolveImports(TypeEnter.java:356)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$ImportsPhase.runPhase(TypeEnter.java:324)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$Phase.doCompleteEnvs(TypeEnter.java:285)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$Phase.completeEnvs(TypeEnter.java:254)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter.complete(TypeEnter.java:201)
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol.complete(Symbol.java:670)
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol$ClassSymbol.complete(Symbol.java:1383)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.Enter.complete(Enter.java:584)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.Enter.main(Enter.java:561)
CSC stderr at jdk.compiler/com.sun.tools.javac.main.JavaCompiler.enterTrees(JavaCompiler.java:1071)
CSC stderr at jdk.compiler/com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:936)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.lambda$doCall$0(JavacTaskImpl.java:104)
CSC stderr at com.sun.tools.javac.api.JavacTaskImpl$$Lambda$41/0000000000000000.call(Unknown Source)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.handleExceptions(JavacTaskImpl.java:147)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:100)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:94)
CSC stderr at net.openj9.test.sc.JavaGen.compileJavas(JavaGen.java:376)
CSC stderr at net.openj9.test.sc.JavaGen.go(JavaGen.java:102)
CSC stderr at net.openj9.test.sc.JavaGen.main(JavaGen.java:94)
STF 00:43:39.757 - **FAILED** Process CSC ended with exit code (1) and not the expected exit code/s (0)
```
All subsequent SCM tests failed
`**FAILED** at step 1 (Copy sharedClasses jar). Expected return value=1 Actual=0 at C:/Users/jenkins/workspace/Test_openjdk14_j9_extended.system_x86-64_windows_Nightly/openjdk-tests/\TKG\test_output_15851931988770\SharedClasses.SCM01.MultiCL_0/20200326-004342-SharedClasses/setUp.pl line 51.`
SharedClasses.SCM01.MultiCL_0
SharedClasses.SCM01.MultiThread_0
SharedClasses.SCM01.MultiThreadMultiCL_0
SharedClasses.SCM23.SingleCL_0
SharedClasses.SCM23.MultiCL_0
SharedClasses.SCM23.MultiThread_0
SharedClasses.SCM23.MultiThreadMultiCL_0
|
1.0
|
Windows javac code.Symbol.complete NullPointerException - https://ci.eclipse.org/openj9/job/Test_openjdk14_j9_extended.system_x86-64_windows_Nightly/46
SharedClasses.SCM01.SingleCL_0
```
CSC 26/03/20 00:43:28: Compiling java files
CSC Compilation failed
CSC stderr An exception has occurred in the compiler (14-internal). Please file a bug against the Java compiler via the Java bug reporting page (http://bugreport.java.com) after checking the Bug Database (http://bugs.java.com) for duplicates. Include your program, the following diagnostic, and the parameters passed to the Java compiler in your report. Thank you.
CSC stderr java.lang.NullPointerException
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol.complete(Symbol.java:670)
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol$PackageSymbol.members(Symbol.java:1164)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$ImportsPhase.resolveImports(TypeEnter.java:356)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$ImportsPhase.runPhase(TypeEnter.java:324)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$Phase.doCompleteEnvs(TypeEnter.java:285)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter$Phase.completeEnvs(TypeEnter.java:254)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.TypeEnter.complete(TypeEnter.java:201)
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol.complete(Symbol.java:670)
CSC stderr at jdk.compiler/com.sun.tools.javac.code.Symbol$ClassSymbol.complete(Symbol.java:1383)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.Enter.complete(Enter.java:584)
CSC stderr at jdk.compiler/com.sun.tools.javac.comp.Enter.main(Enter.java:561)
CSC stderr at jdk.compiler/com.sun.tools.javac.main.JavaCompiler.enterTrees(JavaCompiler.java:1071)
CSC stderr at jdk.compiler/com.sun.tools.javac.main.JavaCompiler.compile(JavaCompiler.java:936)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.lambda$doCall$0(JavacTaskImpl.java:104)
CSC stderr at com.sun.tools.javac.api.JavacTaskImpl$$Lambda$41/0000000000000000.call(Unknown Source)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.handleExceptions(JavacTaskImpl.java:147)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.doCall(JavacTaskImpl.java:100)
CSC stderr at jdk.compiler/com.sun.tools.javac.api.JavacTaskImpl.call(JavacTaskImpl.java:94)
CSC stderr at net.openj9.test.sc.JavaGen.compileJavas(JavaGen.java:376)
CSC stderr at net.openj9.test.sc.JavaGen.go(JavaGen.java:102)
CSC stderr at net.openj9.test.sc.JavaGen.main(JavaGen.java:94)
STF 00:43:39.757 - **FAILED** Process CSC ended with exit code (1) and not the expected exit code/s (0)
```
All subsequent SCM tests failed
`**FAILED** at step 1 (Copy sharedClasses jar). Expected return value=1 Actual=0 at C:/Users/jenkins/workspace/Test_openjdk14_j9_extended.system_x86-64_windows_Nightly/openjdk-tests/\TKG\test_output_15851931988770\SharedClasses.SCM01.MultiCL_0/20200326-004342-SharedClasses/setUp.pl line 51.`
SharedClasses.SCM01.MultiCL_0
SharedClasses.SCM01.MultiThread_0
SharedClasses.SCM01.MultiThreadMultiCL_0
SharedClasses.SCM23.SingleCL_0
SharedClasses.SCM23.MultiCL_0
SharedClasses.SCM23.MultiThread_0
SharedClasses.SCM23.MultiThreadMultiCL_0
|
non_process
|
windows javac code symbol complete nullpointerexception sharedclasses singlecl csc compiling java files csc compilation failed csc stderr an exception has occurred in the compiler internal please file a bug against the java compiler via the java bug reporting page after checking the bug database for duplicates include your program the following diagnostic and the parameters passed to the java compiler in your report thank you csc stderr java lang nullpointerexception csc stderr at jdk compiler com sun tools javac code symbol complete symbol java csc stderr at jdk compiler com sun tools javac code symbol packagesymbol members symbol java csc stderr at jdk compiler com sun tools javac comp typeenter importsphase resolveimports typeenter java csc stderr at jdk compiler com sun tools javac comp typeenter importsphase runphase typeenter java csc stderr at jdk compiler com sun tools javac comp typeenter phase docompleteenvs typeenter java csc stderr at jdk compiler com sun tools javac comp typeenter phase completeenvs typeenter java csc stderr at jdk compiler com sun tools javac comp typeenter complete typeenter java csc stderr at jdk compiler com sun tools javac code symbol complete symbol java csc stderr at jdk compiler com sun tools javac code symbol classsymbol complete symbol java csc stderr at jdk compiler com sun tools javac comp enter complete enter java csc stderr at jdk compiler com sun tools javac comp enter main enter java csc stderr at jdk compiler com sun tools javac main javacompiler entertrees javacompiler java csc stderr at jdk compiler com sun tools javac main javacompiler compile javacompiler java csc stderr at jdk compiler com sun tools javac api javactaskimpl lambda docall javactaskimpl java csc stderr at com sun tools javac api javactaskimpl lambda call unknown source csc stderr at jdk compiler com sun tools javac api javactaskimpl handleexceptions javactaskimpl java csc stderr at jdk compiler com sun tools javac api javactaskimpl docall javactaskimpl java csc stderr at jdk compiler com sun tools javac api javactaskimpl call javactaskimpl java csc stderr at net test sc javagen compilejavas javagen java csc stderr at net test sc javagen go javagen java csc stderr at net test sc javagen main javagen java stf failed process csc ended with exit code and not the expected exit code s all subsequent scm tests failed failed at step copy sharedclasses jar expected return value actual at c users jenkins workspace test extended system windows nightly openjdk tests tkg test output sharedclasses multicl sharedclasses setup pl line sharedclasses multicl sharedclasses multithread sharedclasses multithreadmulticl sharedclasses singlecl sharedclasses multicl sharedclasses multithread sharedclasses multithreadmulticl
| 0
|
21,586
| 29,953,997,732
|
IssuesEvent
|
2023-06-23 05:35:14
|
oasis-tcs/sarif-spec
|
https://api.github.com/repos/oasis-tcs/sarif-spec
|
closed
|
Errata01 20230619 Section 1.2 Terminology - Entry result management system - artifacts link broken
|
process 2.1.0-erratum editorial
|
When accepting all changes the link with the text artifacts in the first para of the result management system entry is broken
> software system that consumes the [log files] produced by [analysis tools], produces reports that enable engineering teams to assess the quality of their software >>>[artifacts]<<< at a point in time and to observe trends in the quality over time, and performs functions such as filing bugs and displaying information about individual [results]
Could be an application specific problem.
Proposal: If possible make them work.
Reference bundle <https://www.oasis-open.org/committees/document.php?document_id=71131&wg_abbrev=sarif> and file `sarif-v2.1.0-errata01-csd01-redlined.docx` (section 1.2 page 20 (on my machine) first para of entry result management system)
|
1.0
|
Errata01 20230619 Section 1.2 Terminology - Entry result management system - artifacts link broken - When accepting all changes the link with the text artifacts in the first para of the result management system entry is broken
> software system that consumes the [log files] produced by [analysis tools], produces reports that enable engineering teams to assess the quality of their software >>>[artifacts]<<< at a point in time and to observe trends in the quality over time, and performs functions such as filing bugs and displaying information about individual [results]
Could be an application specific problem.
Proposal: If possible make them work.
Reference bundle <https://www.oasis-open.org/committees/document.php?document_id=71131&wg_abbrev=sarif> and file `sarif-v2.1.0-errata01-csd01-redlined.docx` (section 1.2 page 20 (on my machine) first para of entry result management system)
|
process
|
section terminology entry result management system artifacts link broken when accepting all changes the link with the text artifacts in the first para of the result management system entry is broken software system that consumes the produced by produces reports that enable engineering teams to assess the quality of their software at a point in time and to observe trends in the quality over time and performs functions such as filing bugs and displaying information about individual could be an application specific problem proposal if possible make them work reference bundle and file sarif redlined docx section page on my machine first para of entry result management system
| 1
|
17,196
| 22,773,380,540
|
IssuesEvent
|
2022-07-08 12:16:39
|
gradle/gradle
|
https://api.github.com/repos/gradle/gradle
|
closed
|
Incremental cache lost after a compilation error when using Micronaut
|
a:bug in:annotation-processing
|
The cache is lost after any compilation error when Micronaut-inject is shipped with the project, so for example :
1- Build this project for the first time
2- Make a build error, for example by removing a parenthesis on line 12 ( `log.debug` ) of the `test.Application` class in the `/grails-app/init` folder. Please note that Micronaut injection must be triggered in the code; that's why I added the first line `TriggerMicronaut.getDeclaredField("name")`, which calls the reflection API
3- Build the project, so the build will fail
4- Correct the code so that the next compilation succeeds
5- Rebuild the project, here, instead of having an incremental build (just one file), we got the message :
```
> Task :compileGroovy
Build cache key for task ':compileGroovy' is 8155b9162e73ad3dbcf1e6e45d38c585
Task ':compileGroovy' is not up-to-date because:
Task has failed previously.
The input changes require a full rebuild for incremental task ':compileGroovy'.
Groovy compilation avoidance is an incubating feature.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
```
I am using Gradle 7.3.3 ( latest version ), with open JDK 16
[micronaut-cache-conflict.zip](https://github.com/gradle/gradle/files/7791258/micronaut-cache-conflict.zip)
|
1.0
|
Incremental cache lost after a compilation error when using Micronaut - The cache is lost after any compilation error when Micronaut-inject is shipped with the project, so for example :
1- Build this project for the first time
2- Make a build error, for example by removing a parenthesis on line 12 ( `log.debug` ) of the `test.Application` class in the `/grails-app/init` folder. Please note that Micronaut injection must be triggered in the code; that's why I added the first line `TriggerMicronaut.getDeclaredField("name")`, which calls the reflection API
3- Build the project, so the build will fail
4- Correct the code so that the next compilation succeeds
5- Rebuild the project, here, instead of having an incremental build (just one file), we got the message :
```
> Task :compileGroovy
Build cache key for task ':compileGroovy' is 8155b9162e73ad3dbcf1e6e45d38c585
Task ':compileGroovy' is not up-to-date because:
Task has failed previously.
The input changes require a full rebuild for incremental task ':compileGroovy'.
Groovy compilation avoidance is an incubating feature.
Full recompilation is required because no incremental change information is available. This is usually caused by clean builds or changing compiler arguments.
```
I am using Gradle 7.3.3 ( latest version ), with open JDK 16
[micronaut-cache-conflict.zip](https://github.com/gradle/gradle/files/7791258/micronaut-cache-conflict.zip)
|
process
|
incremental cache lost after a compilation error when using micronaut the cache is lost after any compilation error when micronaut inject is shipped with the project so for example build this project a first time make a build error for example by removing any parentheses on the line log debug of the test application class in the grails app init folder please note that micronaut injection must be triggered in the code that s why i added the first line triggermicronaut getdeclaredfield name by calling reflection api build the project so the build will fails correct the code to succeed the next compilation rebuild the project here instead of having an incremental build just one file we got the message task compilegroovy build cache key for task compilegroovy is task compilegroovy is not up to date because task has failed previously the input changes require a full rebuild for incremental task compilegroovy groovy compilation avoidance is an incubating feature full recompilation is required because no incremental change information is available this is usually caused by clean builds or changing compiler arguments i am using gradle latest version with open jdk
| 1
|
320,910
| 27,492,868,042
|
IssuesEvent
|
2023-03-04 20:54:49
|
PalisadoesFoundation/talawa-admin
|
https://api.github.com/repos/PalisadoesFoundation/talawa-admin
|
closed
|
Increase the code coverage for OrganizationPeople.tsx
|
bug good first issue test
|
**Describe the bug**
- Increase the code coverage for `/screens/OrganizationPeople/OrganizationPeople.tsx` to 100%.
- Add required tests or alter the existing tests in the `/screens/OrganizationPeople/OrganizationPeople.test.tsx` file.
**Potential internship candidates**
Please read this if you are planning to apply for a Palisadoes Foundation internship https://github.com/PalisadoesFoundation/talawa/issues/359
|
1.0
|
Increase the code coverage for OrganizationPeople.tsx - **Describe the bug**
- Increase the code coverage for `/screens/OrganizationPeople/OrganizationPeople.tsx` to 100%.
- Add required tests or alter the existing tests in the `/screens/OrganizationPeople/OrganizationPeople.test.tsx` file.
**Potential internship candidates**
Please read this if you are planning to apply for a Palisadoes Foundation internship https://github.com/PalisadoesFoundation/talawa/issues/359
|
non_process
|
increase the code coverage for organizationpeople tsx describe the bug increase the code coverage for screens organizationpeople organizationpeople tsx to add required tests or alter the existing tests in the screens organizationpeople organizationpeople test tsx file potential internship candidates please read this if you are planning to apply for a palisadoes foundation internship
| 0
|
1,704
| 4,349,987,519
|
IssuesEvent
|
2016-07-30 23:26:44
|
AkkadianGames/Nanoshooter
|
https://api.github.com/repos/AkkadianGames/Nanoshooter
|
opened
|
Issue migration — Nanoshooter framework issues to Susa
|
Process Ready
|
## Criteria
- [ ] All framework-related issues in Nanoshooter are migrated to the Susa project.
|
1.0
|
Issue migration — Nanoshooter framework issues to Susa - ## Criteria
- [ ] All framework-related issues in Nanoshooter are migrated to the Susa project.
|
process
|
issue migration — nanoshooter framework issues to susa criteria all framework related issues in nanoshooter are migrated to the susa project
| 1
|
11,054
| 13,889,100,302
|
IssuesEvent
|
2020-10-19 07:24:01
|
zerolab-fe/awesome-nodejs
|
https://api.github.com/repos/zerolab-fe/awesome-nodejs
|
closed
|
pm2
|
Process management
|
Fill in the package name in the 👆 Title field and add the following information:
```json
{
"repoUrl": "https://github.com/Unitech/pm2",
"description": "内置负载均衡的 node 进程管理器"
}
```
|
1.0
|
pm2 - Fill in the package name in the 👆 Title field and add the following information:
```json
{
"repoUrl": "https://github.com/Unitech/pm2",
"description": "内置负载均衡的 node 进程管理器"
}
```
|
process
|
在👆 title 处填写包名,并补充下面信息: json repourl description 内置负载均衡的 node 进程管理器
| 1
|
20,002
| 26,476,409,115
|
IssuesEvent
|
2023-01-17 11:28:47
|
0xPolygonMiden/miden-vm
|
https://api.github.com/repos/0xPolygonMiden/miden-vm
|
closed
|
Tracking issue: Migrate to Rescue Prime Optimized hash function
|
processor air v0.4
|
### Goal(s)
Switch the Miden VM's native hash function to RPO.
@grjte edit: I've added the goal and working group information, but kept the details from Bobbin's original description.
### Details
Currently, the native hash function of the VM is a non-standard variation of Rescue Prime. Besides being non-standard, it has several disadvantages which we can remedy by switching to Rescue Prime Optimized (RPO), which was designed by the original authors of Rescue/Rescue Prime. We already have a Rust implementation of RPO [here](https://github.com/0xPolygonMiden/crypto/blob/main/src/hash/rpo/mod.rs).
The main difference between Rescue Prime (in the variant that we are using) and Rescue Prime Optimized are as follows:
* **Overwrite mode**: during the absorption step, instead of adding new elements to the elements in the rate portion of the state, we can simply overwrite them (see the sketch after this list).
* **Better padding rule**: the padding rule used in the variant of Rescue Prime is one of the things that make it non-standard. This padding rule has questionable security. The padding rule described in RPO specifications is much better.
* **No inverse MDS matrix**: because of a slightly different arrangement of operations within a round, multiplication by an inverse MDS matrix is no longer needed for evaluating AIR constraints.
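Not part of the original description: a schematic Python sketch of the absorption difference referenced above. The modulus matches Miden's 64-bit base field, but the permutation is a stand-in and the rate width is illustrative only.
```python
P = 2**64 - 2**32 + 1   # Miden's 64-bit base field modulus
RATE = 8                # illustrative rate width

def permutation(state):
    # stand-in for the real Rescue / RPO permutation
    return state

def absorb_add(state, block):
    """Rescue Prime variant: add the block into the rate portion of the state."""
    for i, x in enumerate(block[:RATE]):
        state[i] = (state[i] + x) % P
    return permutation(state)

def absorb_overwrite(state, block):
    """RPO: simply overwrite the rate portion of the state with the block."""
    state[:RATE] = [x % P for x in block[:RATE]]
    return permutation(state)
```
Both variants leave the capacity portion of the state untouched during absorption; only how the rate portion takes in new data differs.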
To replace Rescue Prime with RPO, we'll need to:
* Update how the processor works. Specifically, the hasher chiplet but probably a few other things too (e.g., chiplet bus, decoder).
* Update AIR constraints for the affected components.
One nice thing is that these updates should actually reduce the overall complexity.
### Must have
- [x] Update how the processor works. Specifically, the hasher chiplet but probably a few other things too (e.g., chiplet bus, decoder).
- [x] Update AIR constraints for the affected components.
### Nice to have
- [ ]
## Working group:
@grjte, @al-kindi, @tohrnii
<details>
<summary>Workflow</summary>
- Discussion should happen here or in the related sub-issues.
- PRs should only be merged by the coordinator, to ensure everyone is able to review.
- Aim to complete reviews within 24 hours.
- When a related sub-issue is opened:
- add it to the list of sub-issues in this tracking issue
- When opening a related PR:
- request review from everyone in this working group
- When a sub-issue is completed:
- close the related issue with a comment that links to the PR where the work was completed
</details>
### Coordinator: @grjte
<details>
<summary>Workflow</summary>
The working group coordinator ensures scope & progress tracking are transparent and accurate. They will:
- Merge approved PRs after all working group members have completed their reviews.
- add the PR # to the relevant section of the current tracking PR.
- close any completed sub-issue(s) with a comment that links to the PR where the work was completed
- Monitor workflow items and complete anything that slips through the cracks.
- Monitor scope to see if anything is untracked or unclear. Create missing sub-issues or initiate discussion as required.
- Monitor progress to see if there's anything which isn't moving forward. Initiate discussion as required.
- Identify PRs with especially significant changes and add @grjte and @bobbinth for review.
</details>
|
1.0
|
Tracking issue: Migrate to Rescue Prime Optimized hash function - ### Goal(s)
Switch the Miden VM's native hash function to RPO.
@grjte edit: I've added the goal and working group information, but kept the details from Bobbin's original description.
### Details
Currently, the native hash function of the VM is a non-standard variation of Rescue Prime. Besides being non-standard, it has several disadvantages which we can remedy by switching to Rescue Prime Optimized (RPO), which was designed by the original authors of Rescue/Rescue Prime. We already have a Rust implementation of RPO [here](https://github.com/0xPolygonMiden/crypto/blob/main/src/hash/rpo/mod.rs).
The main difference between Rescue Prime (in the variant that we are using) and Rescue Prime Optimized are as follows:
* **Overwrite mode**: during the absorption step, instead of adding new elements to the elements in rate portion of the state, we can just overwrite them.
* **Better padding rule**: the padding rule used in the variant of Rescue Prime is one of the things that make it non-standard. This padding rule has questionable security. The padding rule described in RPO specifications is much better.
* **No inverse MDS matrix**: because of a slightly different arrangement of operations within a round, multiplication by an inverse MDS matrix is no longer needed for evaluating AIR constraints.
To replace Rescue Prime with RPO, we'll need to:
* Update how the processor works. Specifically, the hasher chiplet but probably a few other things too (e.g., chiplet bus, decoder).
* Update AIR constraints for the affected components.
One nice thing is that these updates should actually reduce the overall complexity.
### Must have
- [x] Update how the processor works. Specifically, the hasher chiplet but probably a few other things too (e.g., chiplet bus, decoder).
- [x] Update AIR constraints for the affected components.
### Nice to have
- [ ]
## Working group:
@grjte, @al-kindi, @tohrnii
<details>
<summary>Workflow</summary>
- Discussion should happen here or in the related sub-issues.
- PRs should only be merged by the coordinator, to ensure everyone is able to review.
- Aim to complete reviews within 24 hours.
- When a related sub-issue is opened:
- add it to the list of sub-issues in this tracking issue
- When opening a related PR:
- request review from everyone in this working group
- When a sub-issue is completed:
- close the related issue with a comment that links to the PR where the work was completed
</details>
### Coordinator: @grjte
<details>
<summary>Workflow</summary>
The working group coordinator ensures scope & progress tracking are transparent and accurate. They will:
- Merge approved PRs after all working group members have completed their reviews.
- add the PR # to the relevant section of the current tracking PR.
- close any completed sub-issue(s) with a comment that links to the PR where the work was completed
- Monitor workflow items and complete anything that slips through the cracks.
- Monitor scope to see if anything is untracked or unclear. Create missing sub-issues or initiate discussion as required.
- Monitor progress to see if there's anything which isn't moving forward. Initiate discussion as required.
- Identify PRs with especially significant changes and add @grjte and @bobbinth for review.
</details>
|
process
|
tracking issue migrate to rescue prime optimized hash function goal s switch the miden vm s native hash function to rpo grjte edit i ve added the goal and working group information but kept the details from bobbin s original description details currently the native hash function of the vm is a non standard variation of rescue prime besides being non standard it has several disadvantages which we can remedy by switching to rescue prime optimized rpo which was designed by the original authors of rescue rescue prime we already have a rust implementation of rpo the main difference between rescue prime in the variant that we are using and rescue prime optimized are as follows overwrite mode during the absorption step instead of adding new elements to the elements in rate portion of the state we can just overwrite them better padding rule the padding rule used in the variant of rescue prime is one of the things that make it non standard this padding rule has questionable security the padding rule described in rpo specifications is much better no inverse mds matrix because of a slightly different arrangement of operations within a round multiplication by an inverse mds matrix is no longer needed for evaluating air constraints to replace rescue prime with rpo we ll need to update how the processor works specifically the hasher chiplet but probably a few other things too e g chiplet bus decoder update air constraints for the affected components one nice thing is that these updates should actually reduce the overall complexity must have update how the processor works specifically the hasher chiplet but probably a few other things too e g chiplet bus decoder update air constraints for the affected components nice to have working group grjte al kindi tohrnii workflow discussion should happen here or in the related sub issues prs should only be merged by the coordinator to ensure everyone is able to review aim to complete reviews within hours when a related sub issue is opened add it to the list of sub issues in this tracking issue when opening a related pr request review from everyone in this working group when a sub issue is completed close the related issue with a comment that links to the pr where the work was completed coordinator grjte workflow the working group coordinator ensures scope progress tracking are transparent and accurate they will merge approved prs after all working group members have completed their reviews add the pr to the relevant section of the current tracking pr close any completed sub issue s with a comment that links to the pr where the work was completed monitor workflow items and complete anything that slips through the cracks monitor scope to see if anything is untracked or unclear create missing sub issues or initiate discussion as required monitor progress to see if there s anything which isn t moving forward initiate discussion as required identify prs with especially significant changes and add grjte and bobbinth for review
| 1
|
7,958
| 11,137,566,926
|
IssuesEvent
|
2019-12-20 19:43:02
|
openopps/openopps-platform
|
https://api.github.com/repos/openopps/openopps-platform
|
closed
|
Bring work experience sort order over from USAJOBS
|
Apply Process Requirements Ready State Dept.
|
Who: Student internship applicants
What: default sort order for work experience
Why: in order to bring data over from USAJOBS in the same order
Acceptance Criteria:
USAJOBS now allows for the user to sort their work experience on One Profile.
- Pull the sort order from USAJOBS
- The work experience should default display in the same order as it comes over from USAJOBS
Screen shot of USAJOBS one profile:

Related Tickets:
10/29/19 - TFS 39119 is in production which adds the sort on USAJOBS but it's not available yet in the API. 39267 should add it to the API in the next release
|
1.0
|
Bring work experience sort order over from USAJOBS - Who: Student internship applicants
What: default sort order for work experience
Why: in order to bring data over from USAJOBS in the same order
Acceptance Criteria:
USAJOBS now allows for the user to sort their work experience on One Profile.
- Pull the sort order from USAJOBS
- The work experience should default display in the same order as it comes over from USAJOBS
Screen shot of USAJOBS one profile:

Related Tickets:
10/29/19 - TFS 39119 is in production which adds the sort on USAJOBS but it's not available yet in the API. 39267 should add it to the API in the next release
|
process
|
bring work experience sort order over from usajobs who student internship applicants what default sort order for work experience why in order to bring data over from usajobs in the same order acceptance criteria usajobs now allows for the user to sort their work experience on one profile pull the sort order from usajobs the work experience should default display in the same order as it comes over from usajobs screen shot of usajobs one profile related tickets tfs is in production which adds the sort on usajobs but it s not available yet in the api should add it to the api in the next release
| 1
|
316,224
| 9,638,963,384
|
IssuesEvent
|
2019-05-16 12:30:51
|
oceanprotocol/commons
|
https://api.github.com/repos/oceanprotocol/commons
|
closed
|
Error during the consuming flow in Nile
|
bug priority:high
|
# Prerequisites
## Expected Behavior
When I try to consume/download an asset on Nile network I can do it without any issue.
## Current Behavior
Currently when I'm trying to download one asset, the flow is stuck showing the message:
""Decrypting file URL, please sign...""
## Failure Information (for bugs)
Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.
## Step to Reproduce the problem
Please provide detailed steps for reproducing the issue.
1. Go to Nile
2. Get some ETH using the Faucet
3. Go to https://commons.oceanprotocol.com/asset/did:op:be2ab056265d44a6b406785702a1f654984c174d949c451f99e41f8fbea57a71
4. Click on Download
5. Confirm 2 times in Metamask

|
1.0
|
Error during the consuming flow in Nile - # Prerequisites
## Expected Behavior
When I try to consume/download an asset on Nile network I can do it without any issue.
## Current Behavior
Currently when I'm trying to download one asset, the flow is stuck showing the message:
""Decrypting file URL, please sign...""
## Failure Information (for bugs)
Please help provide information about the failure if this is a bug. If it is not a bug, please remove the rest of this template.
## Step to Reproduce the problem
Please provide detailed steps for reproducing the issue.
1. Go to Nile
2. Get some ETH using the Faucet
3. Go to https://commons.oceanprotocol.com/asset/did:op:be2ab056265d44a6b406785702a1f654984c174d949c451f99e41f8fbea57a71
4. Click on Download
5. Confirm 2 times in Metamask

|
non_process
|
error during the consuming flow in nile prerequisites expected behavior when i try to consume download an asset on nile network i can do it without any issue current behavior currently when i m trying to download one asset the flow is stuck showing the message decrypting file url please sign failure information for bugs please help provide information about the failure if this is a bug if it is not a bug please remove the rest of this template step to reproduce the problem please provide detailed steps for reproducing the issue go to nile get some eth using the faucet go to click on download confirm times in metamask
| 0
|
98,757
| 16,389,453,362
|
IssuesEvent
|
2021-05-17 14:27:36
|
Thanraj/linux-1
|
https://api.github.com/repos/Thanraj/linux-1
|
opened
|
CVE-2019-19816 (High) detected in linuxv5.0
|
security vulnerability
|
## CVE-2019-19816 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.0</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Thanraj/linux-1/commits/9738d89d33cb0f3ac708908509b82eafc007d557">9738d89d33cb0f3ac708908509b82eafc007d557</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux-1/fs/btrfs/volumes.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux-1/fs/btrfs/volumes.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel 5.0.21, mounting a crafted btrfs filesystem image and performing some operations can cause slab-out-of-bounds write access in __btrfs_map_block in fs/btrfs/volumes.c, because a value of 1 for the number of data stripes is mishandled.
<p>Publish Date: 2019-12-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19816>CVE-2019-19816</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19816">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19816</a></p>
<p>Release Date: 2019-12-17</p>
<p>Fix Resolution: v5.5-rc1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-19816 (High) detected in linuxv5.0 - ## CVE-2019-19816 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.0</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Thanraj/linux-1/commits/9738d89d33cb0f3ac708908509b82eafc007d557">9738d89d33cb0f3ac708908509b82eafc007d557</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux-1/fs/btrfs/volumes.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux-1/fs/btrfs/volumes.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel 5.0.21, mounting a crafted btrfs filesystem image and performing some operations can cause slab-out-of-bounds write access in __btrfs_map_block in fs/btrfs/volumes.c, because a value of 1 for the number of data stripes is mishandled.
<p>Publish Date: 2019-12-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-19816>CVE-2019-19816</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19816">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19816</a></p>
<p>Release Date: 2019-12-17</p>
<p>Fix Resolution: v5.5-rc1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in cve high severity vulnerability vulnerable library linux kernel source tree library home page a href found in head commit a href found in base branch master vulnerable source files linux fs btrfs volumes c linux fs btrfs volumes c vulnerability details in the linux kernel mounting a crafted btrfs filesystem image and performing some operations can cause slab out of bounds write access in btrfs map block in fs btrfs volumes c because a value of for the number of data stripes is mishandled publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
2,057
| 4,864,871,858
|
IssuesEvent
|
2016-11-14 19:13:58
|
Sage-Bionetworks/Genie
|
https://api.github.com/repos/Sage-Bionetworks/Genie
|
opened
|
incorrect gene symbols in BED
|
data processing NKI pending release VICC
|
NKI uploaded a new file for release 0.3
VICC says to ignore symbols containing "WG".
|
1.0
|
incorrect gene symbols in BED - NKI uploaded a new file for release 0.3
VICC says to ignore symbols containing "WG".
|
process
|
incorrect gene symbols in bed nki uploaded a new file for release vicc says to ignore symbols containing wg
| 1
|
8,242
| 3,147,255,849
|
IssuesEvent
|
2015-09-15 06:49:55
|
jMonkeyEngine-Contributions/Lemur
|
https://api.github.com/repos/jMonkeyEngine-Contributions/Lemur
|
opened
|
Docs:TextEntryComponent
|
documentation
|
Need to add content to the TextEntryComponent section of the GUI Components page.
|
1.0
|
Docs:TextEntryComponent - Need to add content to the TextEntryComponent section of the GUI Components page.
|
non_process
|
docs textentrycomponent need to add content to the textentrycomponent section of the gui components page
| 0
|
17,742
| 23,657,330,047
|
IssuesEvent
|
2022-08-26 12:32:52
|
mdsreq-fga-unb/2022.1-GDS
|
https://api.github.com/repos/mdsreq-fga-unb/2022.1-GDS
|
closed
|
Processo de Requisitos
|
Processo de Requisitos
|
**Description**
The team presents only a list of activities and does not organize them into a process.
|
1.0
|
Processo de Requisitos - **Description**
The team presents only a list of activities and does not organize them into a process.
|
process
|
processo de requisitos descrição a equipe apresenta apenas uma lista de atividades não organizam elas em um processo
| 1
|
286,537
| 24,759,781,387
|
IssuesEvent
|
2022-10-21 21:59:55
|
kendallm360/smart-tech
|
https://api.github.com/repos/kendallm360/smart-tech
|
closed
|
Category Testing
|
Testing
|
- [x] Test for all elements in Category component
- [x] Add tests for router when selecting a category
- [x] Add tests for number of expected elements
- [x] Add tests for title
- [x] Add fixture
|
1.0
|
Category Testing - - [x] Test for all elements in Category component
- [x] Add tests for router when selecting a category
- [x] Add tests for number of expected elements
- [x] Add tests for title
- [x] Add fixture
|
non_process
|
category testing test for all elements in category component add tests for router when selecting a category add tests for number of expected elements add tests for title add fixture
| 0
|
88,962
| 8,182,496,560
|
IssuesEvent
|
2018-08-29 05:26:57
|
MrBlizzard/RCAdmins-Tracker
|
https://api.github.com/repos/MrBlizzard/RCAdmins-Tracker
|
closed
|
[Crate Catcher] Find a way to disable autofishing
|
Testing Needed Upload Needed priority:low
|
Possibly make a prompt every 5 fish caught to type something in chat, with a few separate responses, so it requires a person to be active to use it.
|
1.0
|
[Crate Catcher] Find a way to disable autofishing - Possibly make a prompt every 5 fish caught to type something in chat, with a few separate responses, so it requires a person to be active to use it.
|
non_process
|
find a way to disable autofishing possibly make a prompt every fish caught to type something in chat a few separate responses so ti requires a person to be active to use it
| 0
|
9,326
| 12,339,525,608
|
IssuesEvent
|
2020-05-14 18:16:48
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
ctx.files linearizes nested sets
|
P3 team-Starlark type: process
|
This code:
https://github.com/bazelbuild/bazel/blob/master/src/main/java/com/google/devtools/build/lib/rules/SkylarkRuleContext.java#L330
linearizes nested sets and as such, unnecessarily increases memory use.
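Not from the original issue: a minimal Starlark sketch (Python syntax, with invented rule and attribute names) contrasting the eagerly flattened list with a lazily merged depset.
```python
def _impl(ctx):
    # ctx.files.srcs is a flat Python list of File objects; building it forces
    # the underlying nested sets to be linearized eagerly. Shown only for contrast.
    flat_files = ctx.files.srcs

    # Keeping depsets instead merges the nested sets lazily, without flattening.
    nested = depset(transitive = [t[DefaultInfo].files for t in ctx.attr.srcs])
    return [DefaultInfo(files = nested)]

my_rule = rule(
    implementation = _impl,
    attrs = {"srcs": attr.label_list(allow_files = True)},
)
```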
|
1.0
|
ctx.files linearizes nested sets - This code:
https://github.com/bazelbuild/bazel/blob/master/src/main/java/com/google/devtools/build/lib/rules/SkylarkRuleContext.java#L330
linearizes nested sets and as such, unnecessarily increases memory use.
|
process
|
ctx files linearizes nested sets this code linearizes nested sets and as such unnecessarily increases memory use
| 1
|
30,922
| 2,729,656,404
|
IssuesEvent
|
2015-04-16 09:59:29
|
jkall/qgis-midvatten-plugin
|
https://api.github.com/repos/jkall/qgis-midvatten-plugin
|
opened
|
allow obsid duplicates by introducing another ID as primary key
|
enhancement Priority-High
|
Introduce a unique observation id that is independent of obsid and name. This will allow the existence of duplicates among obsid.
Major code revisions are needed. Several security checks are needed during imports and also, when obsid duplicates are found, user interaction to distinguish between observations with the same obsid.
Probably time to introduce a main table with an id for each observation as the primary key. This table will hold all observations, no matter whether they are points or lines.
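Not part of the original issue: a hedged SQLite sketch, with invented table and column names, of the surrogate-key idea, i.e. a unique integer id as the primary key so that obsid values may repeat.
```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute(
    """
    CREATE TABLE observations (
        obs_pk INTEGER PRIMARY KEY,  -- unique surrogate id for every observation
        obsid  TEXT NOT NULL,        -- no longer required to be unique
        name   TEXT
    )
    """
)
# Two observations sharing the same obsid are now allowed.
con.execute("INSERT INTO observations (obsid, name) VALUES ('W1', 'first point')")
con.execute("INSERT INTO observations (obsid, name) VALUES ('W1', 'second point')")
print(con.execute("SELECT obs_pk, obsid, name FROM observations").fetchall())
```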
|
1.0
|
allow obsid duplicates by introducing another ID as primary key - Introduce a unique observation id that is independent of obsid and name. This will allow the existence of duplicates among obsid.
Major code revisions are needed. Several security checks are needed during imports and also, when obsid duplicates are found, user interaction to distinguish between observations with the same obsid.
Probably time to introduce a main table with an id for each observation as the primary key. This table will hold all observations, no matter whether they are points or lines.
|
non_process
|
allow obsid dubplicates by introducing another id as primary key introduce a unique observation id that is independent of obsid and name this will allow the existance of duplicates among obsid major code revisions are needed several security checks are needed during imports and also when obsid duplicates are found user interaction to distinguish between observations of same obsid probably time to introduce a main table with id for each observation as the primary key this table will hold all observations no matter if they are points or lines
| 0
|
21,711
| 30,211,055,438
|
IssuesEvent
|
2023-07-05 12:49:54
|
GIScience/sketch-map-tool
|
https://api.github.com/repos/GIScience/sketch-map-tool
|
opened
|
Issue with OpenCV and too many descriptors in matching
|
bug component:upload-processing
|
When uploading an A0 sketch map, the following error message appears. Probably due to too many descriptors (cf. https://stackoverflow.com/questions/20432403/error-215-traindesccollectioniidx-python-opencv)
```
Arguments: {'hostname': 'celery@H0108', 'id': 'a3a9bc42-27a9-4792-a65d-9cce615c278b', 'name': 'sketch_map_tool.tasks.digitize_sketches', 'exc': 'error("OpenCV(4.7.0) /io/opencv/modules/features2d/src/matchers.cpp:860: error: (-215:Assertion failed) trainDescCollection[iIdx].rows < IMGIDX_ONE in function 'knnMatchImpl'\n")', 'traceback': 'Traceback (most recent call last):
File "/home/matthias/.cache/pypoetry/virtualenvs/sketch-map-tool-SFkv235P-py3.11/lib/python3.11/site-packages/celery/app/trace.py", line 451, in trace_task
R = retval = fun(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/__init__.py", line 46, in __call__
return self.run(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/tasks.py", line 146, in digitize_sketches
[process(file_id, name) for file_id, name in zip(file_ids, file_names)]
File "/home/matthias/work/smt/sketch_map_tool/tasks.py", line 146, in <listcomp>
[process(file_id, name) for file_id, name in zip(file_ids, file_names)]
^^^^^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/tasks.py", line 132, in process
r = clip(r, map_frame)
^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/upload_processing/clip.py", line 28, in clip
matches = list(matcher.match(desc1, desc2, None))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
cv2.error: OpenCV(4.7.0) /io/opencv/modules/features2d/src/matchers.cpp:860: error: (-215:Assertion failed) trainDescCollection[iIdx].rows < IMGIDX_ONE in function 'knnMatchImpl'
', 'args': "([14], ['Screenshot_2023-07-05_at_14-35-05_untitled_-_sketch-map-18.pdf_copy_4.png'], array([[[154, 204, 169],
[199, 232, 217],
[232, 253, 253],
...,
[118, 118, 118],
[118, 118, 118],
[ 55, 55, 55]],
[[154, 204, 169],
[217, 246, 247],
[232, 253, 253],
...,
[255, 255, 255],
[255, 255, 255],
[119, 119, 119]],
[[170, 196, 197],
[217, 246, 247],
[232, 253, 253],
...,
[255, 255, 255],
[255, 255, 255],
[119, 119, 119]],
...,
[[199, 206, 214],
[199, 206, 214],
[201, 208, 216],
...,
[201, 208, 216],
[201, 208, 216],
[201, 208, 216]],
[[199, 206, 214],
[199, 206, 214],
[199, 206, 214],
...,
[201, 208, 216],
[201, 208, 216],
[201, 208, 216]],
[[199, 206, 214],
[201, 208, 216],
[199, 206, 214],
...,
[201, 208, 216],
[201, 208, 216],
[201, 208, 216]]], dtype=uint8), Bbox(lon_min=964668.3060772641, lat_min=6343605.938051963, lon_max=967431.3455614758, lat_max=6345832.086898304))", 'kwargs': '{}', 'description': 'raised unexpected', 'internal': False}
```
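Not part of the report: a hedged workaround sketch. The assertion is raised when a descriptor matcher is handed more train descriptors than its internal per-image index can address (roughly 2**18 rows), so capping the number of keypoints keeps the descriptor matrix below that limit. The detector choice and the nfeatures value are assumptions, not the tool's actual pipeline.
```python
import cv2

# Cap the number of keypoints so the descriptor matrix stays below the
# matcher's per-image row limit that triggers the IMGIDX_ONE assertion.
orb = cv2.ORB_create(nfeatures=10000)  # illustrative value

img1 = cv2.imread("map_frame.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("sketch.png", cv2.IMREAD_GRAYSCALE)

kp1, desc1 = orb.detectAndCompute(img1, None)
kp2, desc2 = orb.detectAndCompute(img2, None)

matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(desc1, desc2), key=lambda m: m.distance)
```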
|
1.0
|
Issue with OpenCV and too many descriptors in matching - When uploading an A0 sketch map, the following error message appears. Probably due to too many descriptors (cf. https://stackoverflow.com/questions/20432403/error-215-traindesccollectioniidx-python-opencv)
```
Arguments: {'hostname': 'celery@H0108', 'id': 'a3a9bc42-27a9-4792-a65d-9cce615c278b', 'name': 'sketch_map_tool.tasks.digitize_sketches', 'exc': 'error("OpenCV(4.7.0) /io/opencv/modules/features2d/src/matchers.cpp:860: error: (-215:Assertion failed) trainDescCollection[iIdx].rows < IMGIDX_ONE in function 'knnMatchImpl'\n")', 'traceback': 'Traceback (most recent call last):
File "/home/matthias/.cache/pypoetry/virtualenvs/sketch-map-tool-SFkv235P-py3.11/lib/python3.11/site-packages/celery/app/trace.py", line 451, in trace_task
R = retval = fun(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/__init__.py", line 46, in __call__
return self.run(*args, **kwargs)
^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/tasks.py", line 146, in digitize_sketches
[process(file_id, name) for file_id, name in zip(file_ids, file_names)]
File "/home/matthias/work/smt/sketch_map_tool/tasks.py", line 146, in <listcomp>
[process(file_id, name) for file_id, name in zip(file_ids, file_names)]
^^^^^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/tasks.py", line 132, in process
r = clip(r, map_frame)
^^^^^^^^^^^^^^^^^^
File "/home/matthias/work/smt/sketch_map_tool/upload_processing/clip.py", line 28, in clip
matches = list(matcher.match(desc1, desc2, None))
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
cv2.error: OpenCV(4.7.0) /io/opencv/modules/features2d/src/matchers.cpp:860: error: (-215:Assertion failed) trainDescCollection[iIdx].rows < IMGIDX_ONE in function 'knnMatchImpl'
', 'args': "([14], ['Screenshot_2023-07-05_at_14-35-05_untitled_-_sketch-map-18.pdf_copy_4.png'], array([[[154, 204, 169],
[199, 232, 217],
[232, 253, 253],
...,
[118, 118, 118],
[118, 118, 118],
[ 55, 55, 55]],
[[154, 204, 169],
[217, 246, 247],
[232, 253, 253],
...,
[255, 255, 255],
[255, 255, 255],
[119, 119, 119]],
[[170, 196, 197],
[217, 246, 247],
[232, 253, 253],
...,
[255, 255, 255],
[255, 255, 255],
[119, 119, 119]],
...,
[[199, 206, 214],
[199, 206, 214],
[201, 208, 216],
...,
[201, 208, 216],
[201, 208, 216],
[201, 208, 216]],
[[199, 206, 214],
[199, 206, 214],
[199, 206, 214],
...,
[201, 208, 216],
[201, 208, 216],
[201, 208, 216]],
[[199, 206, 214],
[201, 208, 216],
[199, 206, 214],
...,
[201, 208, 216],
[201, 208, 216],
[201, 208, 216]]], dtype=uint8), Bbox(lon_min=964668.3060772641, lat_min=6343605.938051963, lon_max=967431.3455614758, lat_max=6345832.086898304))", 'kwargs': '{}', 'description': 'raised unexpected', 'internal': False}
```
|
process
|
issue with opencv and too many descriptors in matching when uploading an sketch map the following error message appears probably due to too many descriptors cf arguments hostname celery id name sketch map tool tasks digitize sketches exc error opencv io opencv modules src matchers cpp error assertion failed traindesccollection rows imgidx one in function knnmatchimpl n traceback traceback most recent call last file home matthias cache pypoetry virtualenvs sketch map tool lib site packages celery app trace py line in trace task r retval fun args kwargs file home matthias work smt sketch map tool init py line in call return self run args kwargs file home matthias work smt sketch map tool tasks py line in digitize sketches file home matthias work smt sketch map tool tasks py line in file home matthias work smt sketch map tool tasks py line in process r clip r map frame file home matthias work smt sketch map tool upload processing clip py line in clip matches list matcher match none error opencv io opencv modules src matchers cpp error assertion failed traindesccollection rows imgidx one in function knnmatchimpl args array dtype bbox lon min lat min lon max lat max kwargs description raised unexpected internal false
| 1
|
243,641
| 18,720,355,197
|
IssuesEvent
|
2021-11-03 11:03:10
|
AY2122S1-CS2103-W14-3/tp
|
https://api.github.com/repos/AY2122S1-CS2103-W14-3/tp
|
closed
|
[PE-D] Mismatch description in UserGuide
|
v1.4 Documentation
|
In the User Guide, the delete class command should have the functionality to delete a class.
However, in the User Guide, it gives the wrong description as shown in the screenshot below.

<!--session: 1635495494432-101e1f68-06d2-48b1-85de-8f408ccf73a0-->
<!--Version: Web v3.4.1-->
-------------
Labels: `severity.Low` `type.DocumentationBug`
original: xiongjya/ped#8
|
1.0
|
[PE-D] Mismatch description in UserGuide - In the User Guide, the delete class command should have the functionality to delete a class.
However, in the User Guide, it gives the wrong description as shown in the screenshot below.

<!--session: 1635495494432-101e1f68-06d2-48b1-85de-8f408ccf73a0-->
<!--Version: Web v3.4.1-->
-------------
Labels: `severity.Low` `type.DocumentationBug`
original: xiongjya/ped#8
|
non_process
|
mismatch description in userguide in the user guide the delete class command should have the functionality to delete a class however in the user guide it gives the wrong description as shown in the screenshot below labels severity low type documentationbug original xiongjya ped
| 0
|
160,309
| 6,086,216,227
|
IssuesEvent
|
2017-06-17 22:16:00
|
lambdan/Splits
|
https://api.github.com/repos/lambdan/Splits
|
opened
|
change save format to wsplit
|
enhancement highpriority
|
The more I think about it, the less necessary I think it is for us to save in the custom YAML format. Saving in WSplit format would make it much easier to upload to splits.io and use with WSplit on Windows.
|
1.0
|
change save format to wsplit - The more I think about it, the less necessary I think it is for us to save in the custom YAML format. Saving in WSplit format would make it much easier to upload to splits.io and use with WSplit on Windows.
|
non_process
|
change save format to wsplit the more i think about it the less necessary i think it is for us to save in the custom yaml format saving in wsplit format would make it much easier to upload to splits io and use with wsplit on windows
| 0
|
20,875
| 27,662,508,129
|
IssuesEvent
|
2023-03-12 17:31:19
|
LLazyEmail/nomoretogo_email_template
|
https://api.github.com/repos/LLazyEmail/nomoretogo_email_template
|
closed
|
resolve an issue with button2 component
|
in process
|
```
// import { button } from 'nmtg-template-mailerlite-typography';
import {
buttonComponent,
// buttonComponent2,
} from 'nmtg-template-mailerlite-typography';
buttonComponent2({id: '12', href: 'google.com'});
```
|
1.0
|
resolve an issue with button2 component - ```
// import { button } from 'nmtg-template-mailerlite-typography';
import {
buttonComponent,
// buttonComponent2,
} from 'nmtg-template-mailerlite-typography';
buttonComponent2({id: '12', href: 'google.com'});
```
|
process
|
resolve an issue with component import button from nmtg template mailerlite typography import buttoncomponent from nmtg template mailerlite typography id href google com
| 1
|
653,535
| 21,605,862,998
|
IssuesEvent
|
2022-05-04 02:48:51
|
ballerina-platform/ballerina-extended-library
|
https://api.github.com/repos/ballerina-platform/ballerina-extended-library
|
opened
|
[Task]: `SOQL` and `SOSL` querying in Salesforce
|
Priority/Normal Type/Task Team/Connector Component/Connector
|
### Connector Name
module/salesforce (Salesforce)
### Task Description
- `SOQL` (Salesforce Object Query Language) is a query language specific to Salesforce which could be used to read information stored in your org’s database. SOQL is syntactically similar to SQL (Structured Query Language).
- `SOSL` (Salesforce Object Search Language) is a language that performs text searches in records. Unlike SOQL, SOSL can query multiple types of objects at the same time. SOSL can also use a word match to match fields, while SOQL needs the exact phrase.
**Deliverables**
- A summary of the concepts of SOQL and SOSL querying, governor limits, and best practices
- Milestone plan for next steps
|
1.0
|
[Task]: `SOQL` and `SOSL` querying in Salesforce - ### Connector Name
module/salesforce (Salesforce)
### Task Description
- `SOQL` (Salesforce Object Query Language) is a query language specific to Salesforce which could be used to read information stored in your org’s database. SOQL is syntactically similar to SQL (Structured Query Language).
- `SOSL` (Salesforce Object Search Language) is a language that performs text searches in records. Unlike SOQL, SOSL can query multiple types of objects at the same time. SOSL can also use a word match to match fields, while SOQL needs the exact phrase.
**Deliverables**
- A summary of the concepts of SOQL and SOSL querying, governor limits, and best practices
- Milestone plan for next steps
|
non_process
|
soql and sosl querying in salesforce connector name module salesforce salesforce task description soql salesforce object query language is a query language specific to salesforce which could be used to read information stored in your org’s database soql is syntactically similar to sql structured query language sosl salesforce object search language is a language that performs text searches in records unlike soql sosl can query multiple types of objects at the same time sosl can also use a word match to match fields while soql needs the exact phrase deliverables a summary of concepts of soql and sosl queriying governer limits and best practices milestone plan for next steps
| 0
|
15,155
| 18,908,722,102
|
IssuesEvent
|
2021-11-16 11:54:15
|
streamnative/pulsar-flink
|
https://api.github.com/repos/streamnative/pulsar-flink
|
closed
|
Reorganize code directory to a better structure
|
type/cleanup platform/data-processing
|
The code structure is hard to understand and may include some of the code for the new connector, which is now hosted in a separate repo. After the code cleanup, we need to reorganize the directory to reduce the barrier to understanding it.
```
── connector
│ └── pulsar
│ └── source
│ ├── enumerator
│ ├── offset
│ ├── reader
│ ├── split
│ ├── subscription
│ └── util
├── formats
│ ├── atomic
│ └── protobufnative
├── streaming
│ ├── connectors
│ │ └── pulsar
│ │ ├── config
│ │ ├── internal
│ │ │ └── metrics
│ │ ├── table
│ │ └── util
│ └── util
│ └── serialization
├── table
│ ├── catalog
│ │ └── pulsar
│ │ ├── descriptors
│ │ └── factories
│ └── descriptors
└── util
```
|
1.0
|
Reorganize code directory to a better structure - The code structure is hard to understand and may include some of the code for the new connector, which is now hosted in a separate repo. After the code cleanup, we need to reorganize the directory to reduce the barrier to understanding it.
```
── connector
│ └── pulsar
│ └── source
│ ├── enumerator
│ ├── offset
│ ├── reader
│ ├── split
│ ├── subscription
│ └── util
├── formats
│ ├── atomic
│ └── protobufnative
├── streaming
│ ├── connectors
│ │ └── pulsar
│ │ ├── config
│ │ ├── internal
│ │ │ └── metrics
│ │ ├── table
│ │ └── util
│ └── util
│ └── serialization
├── table
│ ├── catalog
│ │ └── pulsar
│ │ ├── descriptors
│ │ └── factories
│ └── descriptors
└── util
```
|
process
|
reorganize code directory to a better structure the code structure is hard to understand and may include some of the code for new connector which is now hosted in a separate repo after the code clean we need to reorganize the directory to reduce the barrier to understand it ── connector │ └── pulsar │ └── source │ ├── enumerator │ ├── offset │ ├── reader │ ├── split │ ├── subscription │ └── util ├── formats │ ├── atomic │ └── protobufnative ├── streaming │ ├── connectors │ │ └── pulsar │ │ ├── config │ │ ├── internal │ │ │ └── metrics │ │ ├── table │ │ └── util │ └── util │ └── serialization ├── table │ ├── catalog │ │ └── pulsar │ │ ├── descriptors │ │ └── factories │ └── descriptors └── util
| 1
|
7,697
| 10,781,006,023
|
IssuesEvent
|
2019-11-04 14:08:11
|
radis/radis
|
https://api.github.com/repos/radis/radis
|
closed
|
issue with algebra of >2 spectra
|
bug post-process
|
The shortcut to `SerialSlabs(s1, s2, s3)` written as `s1 > s2 > s3` does not work as expected as Python evaluates it to `s1 > s2 and s2 > s3`
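Not from the original report: a minimal, self-contained illustration of the chained-comparison behaviour described above (the Spectrum class here is a stand-in, not the radis API).
```python
class Spectrum:
    def __init__(self, name):
        self.name = name

    def __gt__(self, other):
        # intended as an operator shortcut for SerialSlabs(self, other)
        print(f"SerialSlabs({self.name}, {other.name})")
        return Spectrum(f"({self.name}>{other.name})")

s1, s2, s3 = Spectrum("s1"), Spectrum("s2"), Spectrum("s3")

# Python parses this as (s1 > s2) and (s2 > s3); the first result is only
# used as a truth value, so the three slabs are never combined into one.
s1 > s2 > s3
```
Calling SerialSlabs(s1, s2, s3) explicitly, or parenthesising as (s1 > s2) > s3, avoids the surprise.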
|
1.0
|
issue with algebra of >2 spectra - The shortcut to `SerialSlabs(s1, s2, s3)` written as `s1 > s2 > s3` does not work as expected as Python evaluates it to `s1 > s2 and s2 > s3`
|
process
|
issue with algrebra of spectra the shortcut to serialslabs written as does not work as expected as python evaluates it to and
| 1
|
810,249
| 30,233,190,596
|
IssuesEvent
|
2023-07-06 08:26:58
|
googleapis/cloud-profiler-nodejs
|
https://api.github.com/repos/googleapis/cloud-profiler-nodejs
|
opened
|
Is there an option to stop the profiler?
|
type: question priority: p3
|
I would like to start and stop the profiler for a configured duration of time, stopping it once that time is reached. Is there an API to stop the profiler?
|
1.0
|
Is there an option to stop the profiler? - I would like to start and stop the profiler for a configured duration of time, stopping it once that time is reached. Is there an API to stop the profiler?
|
non_process
|
is there an option to stop the profiler i would like to start and the stop the profiler for a configured duration of time and stop it once the time reached is there an api to stop the profiler
| 0
|
22,566
| 31,789,568,989
|
IssuesEvent
|
2023-09-13 01:26:23
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
execv error executing PEX in custom rule
|
P4 type: support / not a bug (process) team-Rules-Python stale
|
### Description of the problem / feature request:
> Attempting to write a `rule` to use [pex](https://github.com/pantsbuild/pex/releases/download/v2.1.67/pex) to build PEX's.
>
> Getting a strange error from `ctx.actions.run` that does not manifest with a `genrule` or direct invocation outside of bazel.
### Feature requests: what underlying problem are you trying to solve with this feature?
> Define a simple `pex_binary` rule.
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
> Minimal reproduction: https://github.com/nickbreen/bazel_pex
### What operating system are you running Bazel on?
> Fedora 34, Python 3.6 and 3.9
### What's the output of `bazel info release`?
> release 4.2.0
### Have you found anything relevant by searching the web?
> I've found a number of repos and issues on github for bazel and rules_python, etc; all of which appear to be inconclusive or appear ancient/abandoned:
> - https://github.com/bazelbuild/bazel/issues/11931
> - https://github.com/borancar/mishmash/commit/928cd7bb75c93472c3931f3f8f56941786ec6116#diff-6b9d2d187ff982cd6100e3a5e0813e450351d5164262d0c1d9acc43522d994db
> - https://github.com/benley/bazel_rules_pex
> - https://gist.github.com/simeonf/062af826e79259bc7686
> - https://www.shearn89.com/2021/04/15/pex-file-creation
### Any other information, logs, or outputs that you want to share?
> Please see https://github.com/nickbreen/bazel_pex
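Not taken from the linked reproduction: a rough sketch of the shape such a rule usually takes, to give the ctx.actions.run call some context. The attribute names, the tool label, and the pex arguments are assumptions, not the reproduction's actual code.
```python
def _pex_binary_impl(ctx):
    out = ctx.actions.declare_file(ctx.label.name + ".pex")
    ctx.actions.run(
        outputs = [out],
        inputs = ctx.files.srcs,
        executable = ctx.executable._pex_tool,   # the downloaded pex PEX
        arguments = ["-o", out.path],            # real invocations also pass sources/deps
        mnemonic = "BuildPex",
    )
    return [DefaultInfo(executable = out)]

pex_binary = rule(
    implementation = _pex_binary_impl,
    executable = True,
    attrs = {
        "srcs": attr.label_list(allow_files = [".py"]),
        "_pex_tool": attr.label(
            default = "//tools:pex",
            executable = True,
            cfg = "exec",
        ),
    },
)
```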
|
1.0
|
execv error executing PEX in custom rule - ### Description of the problem / feature request:
> Attempting to write a `rule` to use [pex](https://github.com/pantsbuild/pex/releases/download/v2.1.67/pex) to build PEX's.
>
> Getting a strange error from `ctx.actions.run` that does not manifest with a `genrule` or direct invocation outside of bazel.
### Feature requests: what underlying problem are you trying to solve with this feature?
> Define a simple `pex_binary` rule.
### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible.
> Minimal reproduction: https://github.com/nickbreen/bazel_pex
### What operating system are you running Bazel on?
> Fedora 34, Python 3.6 and 3.9
### What's the output of `bazel info release`?
> release 4.2.0
### Have you found anything relevant by searching the web?
> I've found a number of repos and issues on github for bazel and rules_python, etc; all of which appear to be inconclusive or appear ancient/abandoned:
> - https://github.com/bazelbuild/bazel/issues/11931
> - https://github.com/borancar/mishmash/commit/928cd7bb75c93472c3931f3f8f56941786ec6116#diff-6b9d2d187ff982cd6100e3a5e0813e450351d5164262d0c1d9acc43522d994db
> - https://github.com/benley/bazel_rules_pex
> - https://gist.github.com/simeonf/062af826e79259bc7686
> - https://www.shearn89.com/2021/04/15/pex-file-creation
### Any other information, logs, or outputs that you want to share?
> Please see https://github.com/nickbreen/bazel_pex
|
process
|
execv error executing pex in custom rule description of the problem feature request attempting to write a rule to use to build pex s getting a strange error from ctx actions run that does not manifest with a genrule or direct invocation outside of bazel feature requests what underlying problem are you trying to solve with this feature define a simple pex binary rule bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible minimal reproduction what operating system are you running bazel on fedora python and what s the output of bazel info release release have you found anything relevant by searching the web i ve found a number of repos and issues on github for bazel and rules python etc all of which appear to be inconclusive or appear ancient abandoned any other information logs or outputs that you want to share please see
| 1
|
277,408
| 21,042,019,105
|
IssuesEvent
|
2022-03-31 13:09:24
|
stacks-network/docs
|
https://api.github.com/repos/stacks-network/docs
|
closed
|
Remove regtest references from docs
|
good first issue chore documentation
|
https://github.com/blockstack/docs/search?q=regtest
cc @pgray-hiro - should we replicate this issue for the Hiro docs too?
|
1.0
|
Remove regtest references from docs - https://github.com/blockstack/docs/search?q=regtest
cc @pgray-hiro - should we replicate this issue for the Hiro docs too?
|
non_process
|
remove regtest references from docs cc pgray hiro should we replicate this issue for the hiro docs too
| 0
|
38,842
| 5,011,679,450
|
IssuesEvent
|
2016-12-13 08:51:42
|
pythonapis/6ZJYP2PXGY5CWP2LWTZZFRIL
|
https://api.github.com/repos/pythonapis/6ZJYP2PXGY5CWP2LWTZZFRIL
|
reopened
|
I+UbKUBVkKv2hpI5mQ3mPEy4cODEZN+Slx79lJoFkqa3epNu7SnLwzWmDNaF1LGnxx0AVFKn8XgMP9HtmmrecbwG2GwiOfID/7xcewynoC55Y4KDd+03C7ooYIvZ9pNaPp4oSqRUd2ILjFDVe/1UBDZLlhPj3sVTngiFCtyFsn4=
|
design
|
kcS2crLTpvFeFHi/lNqT7P//v1EGTgBiN5AIzRTfiIGNJnmDtDGQLSiu9yOX+bRnQ7woZ4XR2RBcDchxaTNdE7ffz1Io+MOr95F/TLO9ALN+LJUyGvxoBM1H7dj0LlVsjYMl6QUrIJm6ZfwaZAm+6psRI1NfApAwFtVddyy23jbZmf+rONkHnKgbpNLxGZXPX344tXedpmFoyDVaCWRvPU8jDs4JWEVMUe3rV0xeOr6NOCIPys3aBq+gQwDDHcWKCcw22wFGJugB5T7xcQm7oMDQhMLXn+w59wZYf106KqRaCH7QxOkufjkflqBxBl+igO2B0Uk+asRc+vPeM9kNUy+Hd3BeJoGnN/DU2AkblP7pypg3TZnOchDONHGHG3uBVbMs7TKF9NWxgqmxzzk5FeR1YJLjRoMG4A4FxkUMD64CH+HtVcARJg2BgYH/1BBxry+FuWyRXJI+aS+sK+5oBvvfqJM6e2Tefaov+CxFDW5T+vaFVhqhsba2G6WSfFJ9SISwY8rh7elnIt8GkvBxkw4ohl98Ryvfce2Le+eeqOeMlivPadvMl3REP/lB02K4p3sdPzOiOVeHdZWDC5jTC/GlhRW/Fn4Y94DLkGai1FjA0ITC15/sOfcGWH9dOiqkWgh+0MTpLn45H5agcQZfooDtgdFJPmrEXPrz3jPZDVMvh3dwXiaBpzfw1NgJG5T+Fmr2EozVikruKdSAjszwd73+TM2FiVvpRs2+AsWp25nZmf+rONkHnKgbpNLxGZXPX344tXedpmFoyDVaCWRvPU8jDs4JWEVMUe3rV0xeOr6NOCIPys3aBq+gQwDDHcWKoUWQJWXJcGoApq2hRnIbJZUj8iS3M9jRFz+wwXUbDEj4Dy98MphGLBK7gnnnDYHqpfXJPZ74WJBIKvWrvQHKgxNSwwkJWwYquCPXhhlLYt5Yqi0A5cPO7531bksMFQBmFS16iCxxYLvs/RX735UWX6CLqrZqSSoFNON516PvQmojRc+MSJnNBHEU9SlTZo4tD8rFRk4QfSZdx9AHqvLr57yFty4+kvjrfJOqr0NkkNWZYfEnJeWUTXLWBBYV6YqSdTkRFM2XEiAOcd9XFbHxMbqgd7x5M8bEgv2iRw1pw2cyaNjz48iWRM8SipkBCkCqr+hK1cqZdpznPVGfdMC3Zzw8Lqp6WiVGcf1UjAaW27aHUs/Tsv21QTAH7H/hRMjn1+eEY+wpOZr0/PTgbN0YR/AwGyyTnAoaUlxlv9qux9XFfXcyUBLe+ttMLFzcZ6xRXyZ2HVQ4aoJcULVa3qb1IhUteogscWC77P0V+9+VFl+gi6q2akkqBTTjedej70JqI0XPjEiZzQRxFPUpU2aOLQ/KxUZOEH0mXcfQB6ry6+e8hbcuPpL463yTqq9DZJDVnNtqJ0/VEnJ+UBDEaksaapJIVDZvGkhudEjtnV8+RvCmuTbc0CmkMJZRvp5seNkaU5OrQAXiMVThUDQSrcWInc+BEV9Y6AJ2FEgv1hE/8080mlTiSpj49YicSKrR4pYQlSPyJLcz2NEXP7DBdRsMSPgPL3wymEYsEruCeecNgeql9ck9nvhYkEgq9au9AcqDE1LDCQlbBiq4I9eGGUti3nUG8aVGs8ZRsIwySmTeGgYOpLxpyYzBjQGeE0rEaL5sjSZ5g7QxkC0orvcjl/m0Z0O8KGeF0dkQXA3IcWkzXRO3389SKPjDq/eRf0yzvQCzY8N2Ho5fSa/fQrxYkC+sKAn4Trd6XDDg8WknC0X8csvZmf+rONkHnKgbpNLxGZXPX344tXedpmFoyDVaCWRvPU8jDs4JWEVMUe3rV0xeOr6NOCIPys3aBq+gQwDDHcWKa3kXo37ARMoJbsnnGhaEBrSkWDhNBEI94Yry91EczMNLmWw9bE9fp98f3bpluD3Tp9oxyoc2WzD5Jk0XULRPdS9iKdMOqnSm6fete0Vf2BzTQxI00YSLklxv2zYZVtRZwNCEwtef7Dn3Blh/XToqpFoIftDE6S5+OR+WoHEGX6KA7YHRST5qxFz6894z2Q1TL4d3cF4mgac38NTYCRuU/kLr28Ccdv+RiXSuEFinqBNbKricJfOxQ0Yidn0KPqGckkhUNm8aSG50SO2dXz5G8Ka5NtzQKaQwllG+nmx42RpTk6tABeIxVOFQNBKtxYidz4ERX1joAnYUSC/WET/zT4qHXKa4iJlYAFTKqd+mLD8/3EBzXRjjc034FTSOcDwZ2Zn/qzjZB5yoG6TS8RmVz19+OLV3naZhaMg1Wglkbz1PIw7OCVhFTFHt61dMXjq+jTgiD8rN2gavoEMAwx3Fij996F0c7VwupXXj2mWhFHI+fqeY4VYY52rON5C+VMKPdTkRFM2XEiAOcd9XFbHxMbqgd7x5M8bEgv2iRw1pw2cyaNjz48iWRM8SipkBCkCqr+hK1cqZdpznPVGfdMC3Z5vgOef0r9b9PtWMc3S40Hs/3EBzXRjjc034FTSOcDwZ2Zn/qzjZB5yoG6TS8RmVz19+OLV3naZhaMg1Wglkbz1PIw7OCVhFTFHt61dMXjq+jTgiD8rN2gavoEMAwx3Filg3UxNd1S0JJJq/15rqEd9bKricJfOxQ0Yidn0KPqGckkhUNm8aSG50SO2dXz5G8Ka5NtzQKaQwllG+nmx42RpTk6tABeIxVOFQNBKtxYidz4ERX1joAnYUSC/WET/zT8Vw9lt72lGhLTiiCog00EF/goxXGJM7tKfYdgaGrVHdfJ/ucM7yfe2xoqiH6LLhkmAMq7LCFubS3TY3XiqkE6IAvs0Gswu0hpKBnMmDP8uPEaY/0Q5tGBXsc1cNftJZTlIxJ8hiHh0kPckj3jfo+XRTJzEBqtieQY0TkoDxnSHtoIuqtmpJKgU043nXo+9CaiNFz4xImc0EcRT1KVNmji0PysVGThB9Jl3H0Aeq8uvn6B6RsKJH21wV6MqMl6s/qmb4EsM20W7DTFmrt0Chgz4xNAr6GYi5tDBO+/KdL97Bpgoh1vWyxBgTdOD5Ak8qNjHS/esNhvkfcQ1AS5BuFX2AMZWY5uTNYNeMh7Px9cKLG3Atm/3L672Tuo8yMc9C71PB/DjI9YOhL9iziL54xV5IhLBjyuHt6Wci3waS8HGTDiiGX3xHK99x7Yt7556o54yWK89p28yXdEQ/+UHTYrinex0/M6I5V4d1lYMLmNMLXrhl3+6YoKy4PdFbr8lBod2LKqn8quR1E45BiZT3/YHZmf+rONkHnKgbpNLxGZXPX344tXedpmFoyDVaCWRvPU8jDs4JWEVMUe3rV0xeOr7vnrpXDb7CqK647GyMIFsBbdiWxX6/10okl8Rdm2Bx/nsRronMIystHowWRF/6aJ2GLS3rk1aWu8fotl6gBQRa
|
1.0
|
|
724
| 3,212,002,997
|
IssuesEvent
|
2015-10-06 13:48:29
|
superroma/testcafe-hammerhead
|
https://api.github.com/repos/superroma/testcafe-hammerhead
|
closed
|
The URL attribute is set to an empty string on the client for the second time (T295078)
|
!IMPORTANT! AREA: client BROWSER: Firefox SYSTEM: resource processing SYSTEM: sandbox TYPE: bug
|
That's why Firefox reloads the iframe and removes Hammerhead from it.
The Firefox scenario:
* an iframe without a src is being processed on the server
```html
<iframe src="" src-hammerhead-stored-value=""></iframe>
```
* ```window.Hammerhead``` is initialized in the iframe on the client
* ```src``` is again set to an empty string on the client
* the iframe is reloaded (i.e. ```window.Hammerhead``` is removed)
* finally, we try to execute the task script with eval() and get the ```TypeError: Hammerhead is undefined error```
|
1.0
|
The URL attribute is set to an empty string on the client for the second time (T295078) - That's why Firefox reloads the iframe and removes Hammerhead from it.
The Firefox scenario:
* an iframe without a src is being processed on the server
```html
<iframe src="" src-hammerhead-stored-value=""></iframe>
```
* ```window.Hammerhead``` is initialized in the iframe on the client
* ```src``` is again set to an empty string on the client
* the iframe is reloaded (i.e. ```window.Hammerhead``` is removed)
* finally, we try to execute the task script with eval() and get the ```TypeError: Hammerhead is undefined error```
|
process
|
the url attribute is set to an empty string on the client for the second time that s why firefox reloads the iframe and removes hammerhead from it the firefox scenario an iframe without a src is being processed on the server html window hammerhead is initialized in the iframe on the client src is again set to an empty string on the client the iframe is reloaded i e window hammerhead is removed finally we try to execute the task script with eval and get the typeerror hammerhead is undefined error
| 1
|
99,995
| 30,595,259,108
|
IssuesEvent
|
2023-07-21 21:12:53
|
rpopuc/gha-build-homolog
|
https://api.github.com/repos/rpopuc/gha-build-homolog
|
closed
|
Via create-issue:
|
build-homolog
|
## Description
Realiza deploy automatizado da aplicação.
## Environments
environment_1
## Branches
feat/mainha
|
1.0
|
Via create-issue: - ## Description
Realiza deploy automatizado da aplicação.
## Environments
environment_1
## Branches
feat/mainha
|
non_process
|
via create issue description realiza deploy automatizado da aplicação environments environment branches feat mainha
| 0
|
5,540
| 8,392,425,037
|
IssuesEvent
|
2018-10-09 17:35:34
|
AlexsLemonade/refinebio
|
https://api.github.com/repos/AlexsLemonade/refinebio
|
closed
|
tximport RDS object should be at the transcript level
|
RNA-seq bug processor
|
### Context
Looking into #508 & downloading files
### Problem or idea
In #278, I made a mistake in asking @dongbohu to change the parameter below to `FALSE`.
https://github.com/AlexsLemonade/refinebio/blob/79d9d9032caaa7ffda6ad898862da16b82edcbab/workers/data_refinery_workers/processors/tximport.R#L49
This means that the `txi_out.RDS` file is at the gene-level rather than the desired transcript-level.
### Solution or next step
This parameter should be set to `TRUE`
### New Issue Checklist
- [x] The title is short and descriptive.
- [x] You have explained the context that led you to write this issue.
- [x] You have reported a problem or idea.
- [x] You have proposed a solution or next step.
|
1.0
|
tximport RDS object should be at the transcript level - ### Context
Looking into #508 & downloading files
### Problem or idea
In #278, I made a mistake in asking @dongbohu to change the parameter below to `FALSE`.
https://github.com/AlexsLemonade/refinebio/blob/79d9d9032caaa7ffda6ad898862da16b82edcbab/workers/data_refinery_workers/processors/tximport.R#L49
This means that the `txi_out.RDS` file is at the gene-level rather than the desired transcript-level.
### Solution or next step
This parameter should be set to `TRUE`
### New Issue Checklist
- [x] The title is short and descriptive.
- [x] You have explained the context that led you to write this issue.
- [x] You have reported a problem or idea.
- [x] You have proposed a solution or next step.
|
process
|
tximport rds object should be at the transcript level context looking into downloading files problem or idea in i made a mistake in asking dongbohu to change the parameter below to false this means that the txi out rds file is at the gene level rather than the desired transcript level solution or next step this parameter should set to true new issue checklist the title is short and descriptive you have explained the context that led you to write this issue you have reported a problem or idea you have proposed a solution or next step
| 1
|
14,741
| 18,012,706,164
|
IssuesEvent
|
2021-09-16 10:26:49
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
closed
|
Change term label 'GO:0039579 suppression by virus of host ISG15 activity'
|
multi-species process
|
- GO:0039579 suppression by virus of host ISG15 activity -> change label to 'suppression by virus of host ISG15-protein conjugation'
- OLD def: Any process in which a virus stops, prevents, or reduces the frequency, rate or extent of host ubiquitin-like protein ISG15 activity. ISG15 is a ubiquitin-like protein that is conjugated to lysine residues on various target proteins. Viruses escape from the antiviral activity of ISG15 by using different mechanisms; the influenza B virus NS1 protein for instance blocks the covalent linkage of ISG15 to its target proteins by directly interacting with ISG15. The papain-like protease from the coronavirus cleaves ISG15 derivatives.
- NEW def: Any process in which a virus stops, prevents, or reduces the frequency, rate or extent of host ubiquitin-like protein ISG15 **conjugation**. (rest of the def OK)
- remove parent ' suppression by virus of host molecular function'
|
1.0
|
Change term label 'GO:0039579 suppression by virus of host ISG15 activity' - - GO:0039579 suppression by virus of host ISG15 activity -> change label to 'suppression by virus of host ISG15-protein conjugation'
- OLD def: Any process in which a virus stops, prevents, or reduces the frequency, rate or extent of host ubiquitin-like protein ISG15 activity. ISG15 is a ubiquitin-like protein that is conjugated to lysine residues on various target proteins. Viruses escape from the antiviral activity of ISG15 by using different mechanisms; the influenza B virus NS1 protein for instance blocks the covalent linkage of ISG15 to its target proteins by directly interacting with ISG15. The papain-like protease from the coronavirus cleaves ISG15 derivatives.
- NEW def: Any process in which a virus stops, prevents, or reduces the frequency, rate or extent of host ubiquitin-like protein ISG15 **conjugation**. (rest of the def OK)
- remove parent ' suppression by virus of host molecular function'
|
process
|
change term label go suppression by virus of host activity go suppression by virus of host activity change label to suppression by virus of host protein conjugation old def any process in which a virus stops prevents or reduces the frequency rate or extent of host ubiquitin like protein activity is a ubiquitin like protein that is conjugated to lysine residues on various target proteins viruses escape from the antiviral activity of by using different mechanisms the influenza b virus protein for instance blocks the covalent linkage of to its target proteins by directly interacting with the papain like protease from the coronavirus cleaves derivatives new def any process in which a virus stops prevents or reduces the frequency rate or extent of host ubiquitin like protein conjugation rest of the def ok remove parent suppression by virus of host molecular function
| 1
|
395,262
| 11,682,687,777
|
IssuesEvent
|
2020-03-05 00:55:57
|
AIcrowd/AIcrowd
|
https://api.github.com/repos/AIcrowd/AIcrowd
|
closed
|
Delete submission does not delete s3 assets
|
High Priority
|
_From @seanfcarroll on October 27, 2018 21:23_
The delete submission link deletes the submission and recalculates the leaderboard, but does not delete data from S3. It should be modified to delete S3 data, and we should probably do an audit of such data at the same time.
_Copied from original issue: crowdAI/crowdai#1047_
|
1.0
|
Delete submission does not delete s3 assets - _From @seanfcarroll on October 27, 2018 21:23_
The delete submission link deletes the submission and recalculates the leaderboard, but does not delete data from S3. It should be modified to delete S3 data, and we should probably do an audit of such data at the same time.
_Copied from original issue: crowdAI/crowdai#1047_
|
non_process
|
delete submission does not delete assets from seanfcarroll on october the delete submission link deletes the submission and recalculates the leaderboard but does not delete data from it should be modified to delete data and we should probably an audit of such at the same time copied from original issue crowdai crowdai
| 0
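The AIcrowd record above asks that deleting a submission also remove its assets from S3. As a hedged illustration only, here is a minimal Python sketch of that clean-up step using boto3; the bucket name and key prefix are invented placeholders, not values from the issue.
```python
import boto3

def delete_submission_assets(bucket, prefix):
    """Delete every S3 object under a submission's prefix; returns how many were removed."""
    s3 = boto3.client("s3")
    deleted = 0
    for page in s3.get_paginator("list_objects_v2").paginate(Bucket=bucket, Prefix=prefix):
        for obj in page.get("Contents", []):
            s3.delete_object(Bucket=bucket, Key=obj["Key"])
            deleted += 1
    return deleted

# Hypothetical call site, e.g. from the submission-deletion handler:
# delete_submission_assets("crowdai-submissions", "submissions/1047/")
```
Listing by prefix before deleting keeps the handler idempotent: re-running it on an already-deleted submission simply finds nothing to remove.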
|
21,705
| 30,203,819,239
|
IssuesEvent
|
2023-07-05 08:04:47
|
mrdoob/three.js
|
https://api.github.com/repos/mrdoob/three.js
|
closed
|
OutputPass: shader does not compile if renderer.toneMapping is set
|
Post-processing
|
### Description
Setting, for example,
```js
renderer.toneMapping = THREE.ACESFilmicToneMapping;
```
will cause `OutputPass` shader to fail to compile.
### Reproduction steps
1. Add `renderer.toneMapping = THREE.ACESFilmicToneMapping` to an example using `OutputPass`.
### Code
```js
renderer.toneMapping = THREE.ACESFilmicToneMapping
```
### Live example
* [jsfiddle-latest-release](https://jsfiddle.net/g3atw6k5/)
* [jsfiddle-dev](https://jsfiddle.net/hjqw94c5/)
### Screenshots
_No response_
### Version
r154dev
### Device
_No response_
### Browser
_No response_
### OS
_No response_
|
1.0
|
OutputPass: shader does not compile if renderer.toneMapping is set - ### Description
Setting, for example,
```js
renderer.toneMapping = THREE.ACESFilmicToneMapping;
```
will cause `OutputPass` shader to fail to compile.
### Reproduction steps
1. Add `renderer.toneMapping = THREE.ACESFilmicToneMapping` to an example using `OutputPass`.
### Code
```js
renderer.toneMapping = THREE.ACESFilmicToneMapping
```
### Live example
* [jsfiddle-latest-release](https://jsfiddle.net/g3atw6k5/)
* [jsfiddle-dev](https://jsfiddle.net/hjqw94c5/)
### Screenshots
_No response_
### Version
r154dev
### Device
_No response_
### Browser
_No response_
### OS
_No response_
|
process
|
outputpass shader does not compile if renderer tonemapping is set description setting for example js renderer tonemapping three acesfilmictonemapping will cause outputpass shader to fail to compile reproduction steps add renderer tonemapping three acesfilmictonemapping to an example using outputpass code js renderer tonemapping three acesfilmictonemapping live example screenshots no response version device no response browser no response os no response
| 1
|
8,392
| 11,564,131,053
|
IssuesEvent
|
2020-02-20 07:57:46
|
dzhw/zofar
|
https://api.github.com/repos/dzhw/zofar
|
closed
|
Webstart Replacement: Updating automated project creation process
|
5 category: secondary.cockpit category: secondary.exporter category: technical.processes duplicate prio: 1 status: discussion type: backlog.item
|
As a Developer i want to generate Configurations for Exporter and Cockpit while automated process of project creation
|
1.0
|
Webstart Replacement: Updating automated project creation process - As a Developer i want to generate Configurations for Exporter and Cockpit while automated process of project creation
|
process
|
webstart replacement updating automated project creation process as a developer i want to generate configurations for exporter and cockpit while automated process of project creation
| 1
|
707,208
| 24,298,633,218
|
IssuesEvent
|
2022-09-29 12:14:36
|
ut-issl/c2a-core
|
https://api.github.com/repos/ut-issl/c2a-core
|
closed
|
DriverSuper の各種構造体のメンバが並列に詰まってるのがわかりにくいので階層化する
|
enhancement priority::medium
|
## 概要
DriverSuper の各種構造体のメンバが並列に詰まってるのがわかりにくいので階層化する
## 詳細
ここらへん
https://github.com/ut-issl/c2a-core/blob/9f5154ddc5684f8e626b77c8f8bc7c8215c0e096/Drivers/Super/driver_super.h#L232-L335
ユーザー設定値なのか,内部で自動記録されるメトリックなのか,内部で使う一時変数なのかわかりにくい.
setterのみ,setter / getter あり,accessorなし,などで分けたい
## close条件
できたら
|
1.0
|
DriverSuper の各種構造体のメンバが並列に詰まってるのがわかりにくいので階層化する - ## 概要
DriverSuper の各種構造体のメンバが並列に詰まってるのがわかりにくいので階層化する
## 詳細
ここらへん
https://github.com/ut-issl/c2a-core/blob/9f5154ddc5684f8e626b77c8f8bc7c8215c0e096/Drivers/Super/driver_super.h#L232-L335
ユーザー設定値なのか,内部で自動記録されるメトリックなのか,内部で使う一時変数なのかわかりにくい.
setterのみ,setter / getter あり,accessorなし,などで分けたい
## close条件
できたら
|
non_process
|
driversuper の各種構造体のメンバが並列に詰まってるのがわかりにくいので階層化する 概要 driversuper の各種構造体のメンバが並列に詰まってるのがわかりにくいので階層化する 詳細 ここらへん ユーザー設定値なのか,内部で自動記録されるメトリックなのか,内部で使う一時変数なのかわかりにくい. setterのみ,setter getter あり,accessorなし,などで分けたい close条件 できたら
| 0
|
351,714
| 10,522,561,573
|
IssuesEvent
|
2019-09-30 09:01:03
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.washingtonpost.com - site is not usable
|
browser-fenix engine-gecko priority-important
|
<!-- @browser: Firefox Mobile 70.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 10; Mobile; rv:70.0) Gecko/70.0 Firefox/70.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.washingtonpost.com/sports/2019/09/25/nfls-search-next-sean-mcvay-has-created-new-role-head-coach-defense/
**Browser / Version**: Firefox Mobile 70.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: can't read articles with tracking protection on
**Steps to Reproduce**:
Loaded an article
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
Submitted in the name of `@uvayankee`
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.washingtonpost.com - site is not usable - <!-- @browser: Firefox Mobile 70.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 10; Mobile; rv:70.0) Gecko/70.0 Firefox/70.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.washingtonpost.com/sports/2019/09/25/nfls-search-next-sean-mcvay-has-created-new-role-head-coach-defense/
**Browser / Version**: Firefox Mobile 70.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Site is not usable
**Description**: can't read articles with tracking protection on
**Steps to Reproduce**:
Loaded an article
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
Submitted in the name of `@uvayankee`
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_process
|
site is not usable url browser version firefox mobile operating system android tested another browser yes problem type site is not usable description can t read articles with tracking protection on steps to reproduce loaded an article browser configuration none submitted in the name of uvayankee from with ❤️
| 0
|
32,165
| 4,329,301,521
|
IssuesEvent
|
2016-07-26 16:23:46
|
LLK/scratch-blocks
|
https://api.github.com/repos/LLK/scratch-blocks
|
closed
|
Font spec
|
design
|
We need the font to be well-defined for blocks with text, including face, size, weight.
|
1.0
|
Font spec - We need the font to be well-defined for blocks with text, including face, size, weight.
|
non_process
|
font spec we need the font to be well defined for blocks with text including face size weight
| 0
|
38,945
| 10,268,831,438
|
IssuesEvent
|
2019-08-23 07:31:56
|
yodaos-project/yodart
|
https://api.github.com/repos/yodaos-project/yodart
|
closed
|
flaky build: c client sdk dependency on yoda-api-c
|
bug build
|
* **Version**: next
* **Platform**: all
* **Subsystem**: build
Target `yodaosclient_c` does depend on output of target `yodart-api-c` yet did not declare its dependency on the target `yodart-api-c`. This may cause build issue when parallelling.
|
1.0
|
flaky build: c client sdk dependency on yoda-api-c - * **Version**: next
* **Platform**: all
* **Subsystem**: build
Target `yodaosclient_c` does depend on output of target `yodart-api-c` yet did not declare its dependency on the target `yodart-api-c`. This may cause build issue when parallelling.
|
non_process
|
flaky build c client sdk dependency on yoda api c version next platform all subsystem build target yodaosclient c does depend on output of target yodart api c yet did not declare its dependency on the target yodart api c this may cause build issue when parallelling
| 0
|
6,858
| 9,994,867,508
|
IssuesEvent
|
2019-07-11 18:45:51
|
TorXakis/TorXakis
|
https://api.github.com/repos/TorXakis/TorXakis
|
opened
|
Improve error handling in build environments
|
development-process
|
Recent issues, like with the stack lock files, show that our error handling is not always in place. All scripts should be reviewed to ensure that an error is reported when something unexpected happens. Some minimal error handling like described in https://www.pdq.com/blog/error-handling-with-powershell/ seems to be the bare minimum.
|
1.0
|
Improve error handling in build environments - Recent issues, like with the stack lock files, show that our error handling is not always in place. All scripts should be reviewed to ensure that an error is reported when something unexpected happens. Some minimal error handling like described in https://www.pdq.com/blog/error-handling-with-powershell/ seems to be the bare minimum.
|
process
|
improve error handling in build environments recent issues like with the stack lock files show that our error handling is not always in place all scripts should be reviewed to ensure that an error is reported when something unexpected happens some minimal error handling like described in seems to be the bare minimum
| 1
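The TorXakis record above calls for build scripts that report errors instead of silently continuing. The project's scripts are PowerShell/shell, so the snippet below is only a language-neutral sketch in Python of the fail-fast pattern; the step names and commands are made up.
```python
import subprocess
import sys

def run_step(description, *cmd):
    """Run one build step and abort the whole script on any non-zero exit code."""
    print(f"-- {description}: {' '.join(cmd)}")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        sys.exit(f"error: step '{description}' failed with exit code {result.returncode}")

# Hypothetical step; a real CI script would invoke its stack/cabal targets here.
run_step("show toolchain", sys.executable, "--version")
```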
|
197,156
| 14,910,501,045
|
IssuesEvent
|
2021-01-22 09:40:45
|
input-output-hk/jormungandr
|
https://api.github.com/repos/input-output-hk/jormungandr
|
closed
|
Develop Performance/Load test for tallying big amount of votes
|
Tests VOTE
|
Develop tests which exercises tallying operation:
**Chain Libs**
- [x] develop test which measure tally time needed to do tallying of 1k votes
**Integration tests**
- [x] performance test which tally 1k-10k votes
- [x] memory consumption during tally 1k- 10k votes
|
1.0
|
Develop Performance/Load test for tallying big amount of votes - Develop tests which exercises tallying operation:
**Chain Libs**
- [x] develop test which measure tally time needed to do tallying of 1k votes
**Integration tests**
- [x] performance test which tally 1k-10k votes
- [x] memory consumption during tally 1k- 10k votes
|
non_process
|
develop performance load test for tallying big amount of votes develop tests which exercises tallying operation chain libs develop test which measure tally time needed to do tallying of votes integration tests performance test which tally votes memory consumption during tally votes
| 0
|
4,151
| 7,103,357,843
|
IssuesEvent
|
2018-01-16 04:26:21
|
bojanrajkovic/Volley
|
https://api.github.com/repos/bojanrajkovic/Volley
|
closed
|
Generate manifest file for homepage to pull artifacts from
|
Release Process
|
The manifest should include the artifact URL, a hash, and either a link to a detached signature or an indication that the artifact is "signed" internally (for RPM/DEB artifacts).
It would be nice to do this in CI, but due to the disparate nature of the CI systems, it'll be easiest to run the generation by hand.
|
1.0
|
Generate manifest file for homepage to pull artifacts from - The manifest should include the artifact URL, a hash, and either a link to a detached signature or an indication that the artifact is "signed" internally (for RPM/DEB artifacts).
It would be nice to do this in CI, but due to the disparate nature of the CI systems, it'll be easiest to run the generation by hand.
|
process
|
generate manifest file for homepage to pull artifacts from the manifest should include the artifact url a hash and either a link to a detached signature or an indication that the artifact is signed internally for rpm deb artifacts it would be nice to do this in ci but due to the disparate nature of the ci systems it ll be easiest to run the generation by hand
| 1
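The Volley record above describes a release manifest carrying each artifact's URL, a hash, and either a detached-signature link or a note that the artifact is signed internally. A rough sketch of generating such a manifest by hand, assuming SHA-256 as the hash and using placeholder paths and URLs:
```python
import hashlib
import json
from pathlib import Path

def sha256_of(path):
    digest = hashlib.sha256()
    with Path(path).open("rb") as fh:
        for chunk in iter(lambda: fh.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def build_manifest(artifacts):
    """artifacts: dicts with 'path', 'url', and optionally 'signature' (a detached-signature URL)."""
    entries = [
        {
            "url": art["url"],
            "sha256": sha256_of(art["path"]),
            "signature": art.get("signature", "signed internally"),  # e.g. RPM/DEB artifacts
        }
        for art in artifacts
    ]
    return json.dumps({"artifacts": entries}, indent=2)

# Placeholder entry only; real paths and URLs would come from the release pipeline.
# print(build_manifest([{"path": "dist/volley.nupkg", "url": "https://example.org/volley.nupkg"}]))
```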
|
14,975
| 18,493,741,858
|
IssuesEvent
|
2021-10-19 05:59:07
|
juzi5201314/maop
|
https://api.github.com/repos/juzi5201314/maop
|
opened
|
改进password的配置
|
http processing
|
为了不储存密码明文,将密码hash后放到data path中。
1. 在终端与用户交互获取密码
2. 通过--no-password启动,禁用身份验证(禁用需要验证的操作
3. 通过pwd子命令手动设置密码
|
1.0
|
改进password的配置 - 为了不储存密码明文,将密码hash后放到data path中。
1. 在终端与用户交互获取密码
2. 通过--no-password启动,禁用身份验证(禁用需要验证的操作
3. 通过pwd子命令手动设置密码
|
process
|
改进password的配置 为了不储存密码明文,将密码hash后放到data path中。 在终端与用户交互获取密码 通过 no password启动,禁用身份验证 禁用需要验证的操作 通过pwd子命令手动设置密码
| 1
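The maop record above (written in Chinese) proposes storing only a password hash under the data path, prompting for the password in the terminal, adding a --no-password switch, and setting the password via a pwd subcommand. The project is not Python; the sketch below merely illustrates the hash-then-verify idea with an invented storage format (16-byte salt followed by the digest).
```python
import hashlib
import os

def hash_password(password, salt=None):
    """Derive a salted digest so only the hash, never the plaintext, is written to disk."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000)
    return salt + digest   # assumed storage format: 16-byte salt followed by the digest

def verify_password(password, stored):
    salt, digest = stored[:16], stored[16:]
    return hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 200_000) == digest

stored = hash_password("hunter2")          # what the pwd subcommand might persist
assert verify_password("hunter2", stored)
assert not verify_password("wrong", stored)
```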
|
19,572
| 25,891,961,692
|
IssuesEvent
|
2022-12-14 18:44:01
|
PyCQA/pylint
|
https://api.github.com/repos/PyCQA/pylint
|
closed
|
pylint/2.15.8: python/3.11.1: debian: tests/test_check_parallel.py tests fail/hang indefinitely
|
Bug :beetle: Blocker 🙅 topic-multiprocessing
|
### Bug description
Running tests with python3.11, makes some tests/test_check_parallel.py either fail or hang indefinitely:
```
I: pybuild base:240: cd /build/pylint-2.15.8/.pybuild/cpython3_3.11/build; python3.11 -m pytest -vvvv -k 'not test_pkginfo and not test_do_not_import_files_from_local_directory and not test_import_plugin_from_local_directory_if_pythonpath_cwd and not test_can_list_directo
ries_without_dunder_init and not test_fail_on_exit_code and not test__test_environ_pythonpath_no_arg'
============================= test session starts ==============================
platform linux -- Python 3.11.1, pytest-7.1.2, pluggy-1.0.0+repack -- /usr/bin/python3.11
cachedir: .pytest_cache
benchmark: 3.2.2 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /build/pylint-2.15.8, configfile: setup.cfg
plugins: benchmark-3.2.2
collecting ... collected 2057 items / 15 deselected / 2042 selected
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize FAILED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize_pickling PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_uninitialised PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_no_checkers FAILED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_linter_with_unpickleable_plugins_is_pickleable PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_sequential_checker FAILED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_sequential_checkers_work Terminated
```
the last had to be `kill`ed
the same code, ran with python 3.10 works fine:
```
I: pybuild base:240: cd /build/pylint-2.15.8/.pybuild/cpython3_3.10/build; python3.10 -m pytest -vvvv -k 'not test_pkginfo and not test_do_not_import_files_from_local_directory and not test_import_plugin_from_local_directory_if_pythonpath_cwd and not test_can_list_directo
ries_without_dunder_init and not test_fail_on_exit_code and not test__test_environ_pythonpath_no_arg'
============================= test session starts ==============================
platform linux -- Python 3.10.9, pytest-7.1.2, pluggy-1.0.0+repack -- /usr/bin/python3.10
cachedir: .pytest_cache
benchmark: 3.2.2 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /build/pylint-2.15.8, configfile: setup.cfg
plugins: benchmark-3.2.2
collecting ... collected 2056 items / 15 deselected / 2041 selected
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize_pickling PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_uninitialised PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_no_checkers PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_linter_with_unpickleable_plugins_is_pickleable PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_sequential_checker PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_sequential_checkers_work PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_invoke_single_job PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_compare_workers_to_single_proc[1-2-1] PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_compare_workers_to_single_proc[1-2-2] PASSED [ 0%]
...
```
this is running in debian unstable, with astroid/2.12.13
### Configuration
_No response_
### Command used
```shell
pytest command and switches reported above
```
### Pylint output
```shell
n/a
```
### Expected behavior
unittests to work
### Pylint version
```shell
pylint 2.15.8
astroid 2.12.13
python 3.11.1
```
### OS / Environment
Debian unstable
### Additional dependencies
_No response_
|
1.0
|
pylint/2.15.8: python/3.11.1: debian: tests/test_check_parallel.py tests fail/hang indefinitely - ### Bug description
Running tests with python3.11, makes some tests/test_check_parallel.py either fail or hang indefinitely:
```
I: pybuild base:240: cd /build/pylint-2.15.8/.pybuild/cpython3_3.11/build; python3.11 -m pytest -vvvv -k 'not test_pkginfo and not test_do_not_import_files_from_local_directory and not test_import_plugin_from_local_directory_if_pythonpath_cwd and not test_can_list_directo
ries_without_dunder_init and not test_fail_on_exit_code and not test__test_environ_pythonpath_no_arg'
============================= test session starts ==============================
platform linux -- Python 3.11.1, pytest-7.1.2, pluggy-1.0.0+repack -- /usr/bin/python3.11
cachedir: .pytest_cache
benchmark: 3.2.2 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /build/pylint-2.15.8, configfile: setup.cfg
plugins: benchmark-3.2.2
collecting ... collected 2057 items / 15 deselected / 2042 selected
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize FAILED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize_pickling PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_uninitialised PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_no_checkers FAILED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_linter_with_unpickleable_plugins_is_pickleable PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_sequential_checker FAILED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_sequential_checkers_work Terminated
```
the last had to be `kill`ed
the same code, ran with python 3.10 works fine:
```
I: pybuild base:240: cd /build/pylint-2.15.8/.pybuild/cpython3_3.10/build; python3.10 -m pytest -vvvv -k 'not test_pkginfo and not test_do_not_import_files_from_local_directory and not test_import_plugin_from_local_directory_if_pythonpath_cwd and not test_can_list_directo
ries_without_dunder_init and not test_fail_on_exit_code and not test__test_environ_pythonpath_no_arg'
============================= test session starts ==============================
platform linux -- Python 3.10.9, pytest-7.1.2, pluggy-1.0.0+repack -- /usr/bin/python3.10
cachedir: .pytest_cache
benchmark: 3.2.2 (defaults: timer=time.perf_counter disable_gc=False min_rounds=5 min_time=0.000005 max_time=1.0 calibration_precision=10 warmup=False warmup_iterations=100000)
rootdir: /build/pylint-2.15.8, configfile: setup.cfg
plugins: benchmark-3.2.2
collecting ... collected 2056 items / 15 deselected / 2041 selected
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_initialize_pickling PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_uninitialised PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_single_file_no_checkers PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_linter_with_unpickleable_plugins_is_pickleable PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallelFramework::test_worker_check_sequential_checker PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_sequential_checkers_work PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_invoke_single_job PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_compare_workers_to_single_proc[1-2-1] PASSED [ 0%]
tests/test_check_parallel.py::TestCheckParallel::test_compare_workers_to_single_proc[1-2-2] PASSED [ 0%]
...
```
this is running in debian unstable, with astroid/2.12.13
### Configuration
_No response_
### Command used
```shell
pytest command and switches reported above
```
### Pylint output
```shell
n/a
```
### Expected behavior
unittests to work
### Pylint version
```shell
pylint 2.15.8
astroid 2.12.13
python 3.11.1
```
### OS / Environment
Debian unstable
### Additional dependencies
_No response_
|
process
|
pylint python debian tests test check parallel py tests fail hang indefinitely bug description running tests with makes some tests test check parallel py either fail or hang indefinitely i pybuild base cd build pylint pybuild build m pytest vvvv k not test pkginfo and not test do not import files from local directory and not test import plugin from local directory if pythonpath cwd and not test can list directo ries without dunder init and not test fail on exit code and not test test environ pythonpath no arg test session starts platform linux python pytest pluggy repack usr bin cachedir pytest cache benchmark defaults timer time perf counter disable gc false min rounds min time max time calibration precision warmup false warmup iterations rootdir build pylint configfile setup cfg plugins benchmark collecting collected items deselected selected tests test check parallel py testcheckparallelframework test worker initialize failed tests test check parallel py testcheckparallelframework test worker initialize pickling passed tests test check parallel py testcheckparallelframework test worker check single file uninitialised passed tests test check parallel py testcheckparallelframework test worker check single file no checkers failed tests test check parallel py testcheckparallelframework test linter with unpickleable plugins is pickleable passed tests test check parallel py testcheckparallelframework test worker check sequential checker failed tests test check parallel py testcheckparallel test sequential checkers work terminated the last had to be kill ed the same code ran with python works fine i pybuild base cd build pylint pybuild build m pytest vvvv k not test pkginfo and not test do not import files from local directory and not test import plugin from local directory if pythonpath cwd and not test can list directo ries without dunder init and not test fail on exit code and not test test environ pythonpath no arg test session starts platform linux python pytest pluggy repack usr bin cachedir pytest cache benchmark defaults timer time perf counter disable gc false min rounds min time max time calibration precision warmup false warmup iterations rootdir build pylint configfile setup cfg plugins benchmark collecting collected items deselected selected tests test check parallel py testcheckparallelframework test worker initialize passed tests test check parallel py testcheckparallelframework test worker initialize pickling passed tests test check parallel py testcheckparallelframework test worker check single file uninitialised passed tests test check parallel py testcheckparallelframework test worker check single file no checkers passed tests test check parallel py testcheckparallelframework test linter with unpickleable plugins is pickleable passed tests test check parallel py testcheckparallelframework test worker check sequential checker passed tests test check parallel py testcheckparallel test sequential checkers work passed tests test check parallel py testcheckparallel test invoke single job passed tests test check parallel py testcheckparallel test compare workers to single proc passed tests test check parallel py testcheckparallel test compare workers to single proc passed this is running in debian unstable with astroid configuration no response command used shell pytest command and switches reported above pylint output shell n a expected behavior unittests to work pylint version shell pylint astroid python os environment debian unstable additional dependencies no response
| 1
|
62,251
| 7,566,730,109
|
IssuesEvent
|
2018-04-22 00:06:18
|
tpportugal/tpp_registo_de_feeds
|
https://api.github.com/repos/tpportugal/tpp_registo_de_feeds
|
closed
|
Adaptar SCSS atual a novo SCSS conforme o utilizado em TPP.pt
|
design
|
É necessário adaptar o projeto para utilizar o SCCS de https://github.com/BlackrockDigital/startbootstrap-new-age, o que implica uma atualização para Bootstrap 4 do Bootstrap 3 que é atualizado atualmente.
Os ficheiros de estilo estão localizados [aqui](https://github.com/tpportugal/tpp_registo_de_feeds/tree/tpp/app/styles)
Os ícones e imagens, [aqui](https://github.com/tpportugal/tpp_registo_de_feeds/tree/tpp/public/assets/images)
Imagens/logótipos do projeto TPP deverão ser requisitados a @Rui-Santos ou a @glaand
|
1.0
|
Adaptar SCSS atual a novo SCSS conforme o utilizado em TPP.pt - É necessário adaptar o projeto para utilizar o SCCS de https://github.com/BlackrockDigital/startbootstrap-new-age, o que implica uma atualização para Bootstrap 4 do Bootstrap 3 que é atualizado atualmente.
Os ficheiros de estilo estão localizados [aqui](https://github.com/tpportugal/tpp_registo_de_feeds/tree/tpp/app/styles)
Os ícones e imagens, [aqui](https://github.com/tpportugal/tpp_registo_de_feeds/tree/tpp/public/assets/images)
Imagens/logótipos do projeto TPP deverão ser requisitados a @Rui-Santos ou a @glaand
|
non_process
|
adaptar scss atual a novo scss conforme o utilizado em tpp pt é necessário adaptar o projeto para utilizar o sccs de o que implica uma atualização para bootstrap do bootstrap que é atualizado atualmente os ficheiros de estilo estão localizados os ícones e imagens imagens logótipos do projeto tpp deverão ser requisitados a rui santos ou a glaand
| 0
|
14,458
| 17,536,704,693
|
IssuesEvent
|
2021-08-12 07:22:13
|
aiidateam/aiida-core
|
https://api.github.com/repos/aiidateam/aiida-core
|
closed
|
Unrelated builders interfere with each other
|
type/bug priority/important topic/processes
|
### Describe the bug
The `ProcessBuilderNamespace` of unrelated processes can interfere with each other, because their properties are set on the `ProcessBuilderNamespace` class instead of the specific instance.
### Steps to reproduce
WorkChain code:
```python
class First(WorkChain):
@classmethod
def define(cls, spec):
super().define(spec)
spec.input('a', default=lambda: orm.Int(1))
spec.input('b')
class Second(WorkChain):
@classmethod
def define(cls, spec):
super().define(spec)
spec.input('a', default=lambda: orm.Int(2))
spec.input('c')
```
Inspect builder behavior:
```python
In [1]: from test_proc import First, Second
In [2]: builder = First.get_builder()
In [3]: Second.get_builder() # We simply construct this builder, and do nothing with it.
In [5]: builder.a() # The default for 'a' has changed to the lambda defined in `Second`.
Out[5]: <Int: uuid: fb6a1556-9f34-4d32-b7ea-cc2800ee00ef (unstored) value: 2>
In [6]: builder.c # This should raise, since `First` doesn't have a 'c' input.
```
**EDIT:** When actually _running_ the workchain, the default is populated correctly, and validation is also correct. I guess this slightly lowers the severity, but depending on how the user manipulates the builder it could still lead to incorrect inputs.
### Your environment
- Operating system: Ubuntu-18.04 on WSL2
- Python version: 3.7.3
- aiida-core version: `develop` branch
### Additional context
This is a follow-up from a discussion from https://github.com/aiidateam/aiida-core/pull/4419
|
1.0
|
Unrelated builders interfere with each other - ### Describe the bug
The `ProcessBuilderNamespace` of unrelated processes can interfere with each other, because their properties are set on the `ProcessBuilderNamespace` class instead of the specific instance.
### Steps to reproduce
WorkChain code:
```python
class First(WorkChain):
@classmethod
def define(cls, spec):
super().define(spec)
spec.input('a', default=lambda: orm.Int(1))
spec.input('b')
class Second(WorkChain):
@classmethod
def define(cls, spec):
super().define(spec)
spec.input('a', default=lambda: orm.Int(2))
spec.input('c')
```
Inspect builder behavior:
```python
In [1]: from test_proc import First, Second
In [2]: builder = First.get_builder()
In [3]: Second.get_builder() # We simply construct this builder, and do nothing with it.
In [5]: builder.a() # The default for 'a' has changed to the lambda defined in `Second`.
Out[5]: <Int: uuid: fb6a1556-9f34-4d32-b7ea-cc2800ee00ef (unstored) value: 2>
In [6]: builder.c # This should raise, since `First` doesn't have a 'c' input.
```
**EDIT:** When actually _running_ the workchain, the default is populated correctly, and validation is also correct. I guess this slightly lowers the severity, but depending on how the user manipulates the builder it could still lead to incorrect inputs.
### Your environment
- Operating system: Ubuntu-18.04 on WSL2
- Python version: 3.7.3
- aiida-core version: `develop` branch
### Additional context
This is a follow-up from a discussion from https://github.com/aiidateam/aiida-core/pull/4419
|
process
|
unrelated builders interfere with each other describe the bug the processbuildernamespace of unrelated processes can interfere with each other because their properties are set on the processbuildernamespace class instead of the specific instance steps to reproduce workchain code python class first workchain classmethod def define cls spec super define spec spec input a default lambda orm int spec input b class second workchain classmethod def define cls spec super define spec spec input a default lambda orm int spec input c inspect builder behavior python in from test proc import first second in builder first get builder in second get builder we simply construct this builder and do nothing with it in builder a the default for a has changed to the lambda defined in second out in builder c this should raise since first doesn t have a c input edit when actually running the workchain the default is populated correctly and validation is also correct i guess this slightly lowers the severity but depending on how the user manipulates the builder it could still lead to incorrect inputs your environment operating system ubuntu on python version aiida core version develop branch additional context this is a follow up from a discussion from
| 1
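The aiida-core record above attributes the interference to properties being set on the ProcessBuilderNamespace class rather than on each instance. The toy classes below are invented purely to demonstrate that failure mode; they are not aiida's actual implementation.
```python
class LeakyNamespace:
    """Registers defaults on the class, so every builder shares (and overwrites) them."""
    def register(self, name, default):
        setattr(type(self), name, default)   # class-level attribute: shared by all instances

class IsolatedNamespace:
    """Registers defaults on the instance, so each builder keeps its own values."""
    def register(self, name, default):
        setattr(self, name, default)         # instance-level attribute: private to this builder

first, second = LeakyNamespace(), LeakyNamespace()
first.register("a", 1)
second.register("a", 2)
print(first.a)   # prints 2 -- the second builder silently changed the first one's default

first, second = IsolatedNamespace(), IsolatedNamespace()
first.register("a", 1)
second.register("a", 2)
print(first.a)   # prints 1 -- no interference between unrelated builders
```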
|
288,643
| 31,861,531,166
|
IssuesEvent
|
2023-09-15 11:17:21
|
nidhi7598/linux-v4.19.72_CVE-2022-3564
|
https://api.github.com/repos/nidhi7598/linux-v4.19.72_CVE-2022-3564
|
opened
|
CVE-2023-1990 (Medium) detected in linuxlinux-4.19.294
|
Mend: dependency security vulnerability
|
## CVE-2023-1990 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.294</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-v4.19.72_CVE-2022-3564/commit/9ffee08efa44c7887e2babb8f304df0fa1094efb">9ffee08efa44c7887e2babb8f304df0fa1094efb</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/nfc/st-nci/ndlc.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/nfc/st-nci/ndlc.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A use-after-free flaw was found in ndlc_remove in drivers/nfc/st-nci/ndlc.c in the Linux Kernel. This flaw could allow an attacker to crash the system due to a race problem.
<p>Publish Date: 2023-04-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-1990>CVE-2023-1990</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2023-1990">https://www.linuxkernelcves.com/cves/CVE-2023-1990</a></p>
<p>Release Date: 2023-04-12</p>
<p>Fix Resolution: v4.14.311,v4.19.279,v5.4.238,v5.10.176,v5.15.104,v6.1.21,v6.2.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2023-1990 (Medium) detected in linuxlinux-4.19.294 - ## CVE-2023-1990 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.294</b></p></summary>
<p>
<p>The Linux Kernel</p>
<p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p>
<p>Found in HEAD commit: <a href="https://github.com/nidhi7598/linux-v4.19.72_CVE-2022-3564/commit/9ffee08efa44c7887e2babb8f304df0fa1094efb">9ffee08efa44c7887e2babb8f304df0fa1094efb</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/nfc/st-nci/ndlc.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/nfc/st-nci/ndlc.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
A use-after-free flaw was found in ndlc_remove in drivers/nfc/st-nci/ndlc.c in the Linux Kernel. This flaw could allow an attacker to crash the system due to a race problem.
<p>Publish Date: 2023-04-12
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-1990>CVE-2023-1990</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2023-1990">https://www.linuxkernelcves.com/cves/CVE-2023-1990</a></p>
<p>Release Date: 2023-04-12</p>
<p>Fix Resolution: v4.14.311,v4.19.279,v5.4.238,v5.10.176,v5.15.104,v6.1.21,v6.2.8</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve medium detected in linuxlinux cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch main vulnerable source files drivers nfc st nci ndlc c drivers nfc st nci ndlc c vulnerability details a use after free flaw was found in ndlc remove in drivers nfc st nci ndlc c in the linux kernel this flaw could allow an attacker to crash the system due to a race problem publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
10,477
| 13,251,979,732
|
IssuesEvent
|
2020-08-20 03:51:24
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
opened
|
coprocessor: redact log
|
component/security sig/coprocessor type/enhancement
|
## Development Task
In #7758 we are adding the `security.redact-info-log` option to TiKV, to hide user data from printing into the info log. The option provides a way for users to avoid leaking sensitive data into the info log when they want to transfer the info log out of the storage node for use in other systems. However, the PR only handles MVCC and storage engine KVs. It does not handle coprocessor logs. We need to redact logs for the coprocessor module when the option is on.
The task is part of https://github.com/pingcap/tidb/issues/18566 and we target to include it in 5.0 release.
|
1.0
|
coprocessor: redact log - ## Development Task
In #7758 we are adding the `security.redact-info-log` option to TiKV, to hide user data from printing into the info log. The option provides a way for users to avoid leaking sensitive data into the info log when they want to transfer the info log out of the storage node for use in other systems. However, the PR only handles MVCC and storage engine KVs. It does not handle coprocessor logs. We need to redact logs for the coprocessor module when the option is on.
The task is part of https://github.com/pingcap/tidb/issues/18566 and we target to include it in 5.0 release.
|
process
|
coprocessor redact log development task in we are adding ecurity redact info log option to tikv to hide user data from printing into info log the option provide a way for users to avoid leaking sensitive data into info log when they want to transfer the into log out of the storage node to use in other systems however the pr only handles mvcc and storage engine kvs it does not handle coprocessor logs we need to redact log for the coprocessor module when the option is on the task is part of and we target to include it in release
| 1
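The TiKV record above is about honouring security.redact-info-log in coprocessor logging. TiKV is written in Rust, so the following is only an illustrative Python sketch of the redaction idea, with invented names and a "?" placeholder standing in for redacted values.
```python
import logging

REDACT_INFO_LOG = True  # would be driven by the security.redact-info-log setting

def redact(value: bytes) -> str:
    """Return a placeholder instead of user data when redaction is enabled."""
    return "?" if REDACT_INFO_LOG else value.hex()

logging.basicConfig(level=logging.INFO)
start_key, end_key = b"user_table_row_1", b"user_table_row_9"
logging.info("coprocessor scan range=[%s, %s)", redact(start_key), redact(end_key))
```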
|
40,528
| 8,798,130,200
|
IssuesEvent
|
2018-12-24 04:47:49
|
MovingBlocks/DestinationSol
|
https://api.github.com/repos/MovingBlocks/DestinationSol
|
opened
|
Research and implement a better frame limiter if appropriate
|
Enhancement-code quality Good First Issue help wanted
|
Follow-up to #360 which upgraded DS to use LWJGL on the backend. This broke the old framerate-limiter and @BenjaminAmos provided a quick effort re-implementation so we could move forward.
The goal of this issue will be to research whether a better implementation could be chosen, or if we should stick with what we have. I imagine plenty of LibGDX-LWJGL3 projects have implemented framerate-limiters, maybe one is hiding in an extension or side project somewhere?
See `SolApplication.java` and search for `//HACK: A crude and primitive frame-limiter...` to find the current replacement
|
1.0
|
Research and implement a better frame limiter if appropriate - Follow-up to #360 which upgraded DS to use LWJGL on the backend. This broke the old framerate-limiter and @BenjaminAmos provided a quick effort re-implementation so we could move forward.
The goal of this issue will be to research whether a better implementation could be chosen, or if we should stick with what we have. I imagine plenty of LibGDX-LWJGL3 projects have implemented framerate-limiters, maybe one is hiding in an extension or side project somewhere?
See `SolApplication.java` and search for `//HACK: A crude and primitive frame-limiter...` to find the current replacement
|
non_process
|
research and implement a better frame limiter if appropriate follow up to which upgraded ds to use lwjgl on the backend this broke the old framerate limiter and benjaminamos provided a quick effort re implementation so we could move forward the goal of this issue will be to research whether a better implementation could be chosen or if we should stick with what we have i imagine plenty of libgdx projects have implemented framerate limiters maybe one is hiding in an extension or side project somewhere see solapplication java and search for hack a crude and primitive frame limiter to find the current replacement
| 0
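The Destination Sol record above asks whether a better frame limiter exists than the stop-gap one. The game is Java/LibGDX; as a neutral illustration of the usual sleep-until-deadline approach, here is a small Python sketch with an assumed 60 FPS target.
```python
import time

TARGET_FPS = 60
FRAME_TIME = 1.0 / TARGET_FPS

def run_frames(render, frames):
    """Sleep away the unused part of each frame budget instead of busy-waiting."""
    deadline = time.perf_counter()
    for _ in range(frames):
        render()
        deadline += FRAME_TIME
        remaining = deadline - time.perf_counter()
        if remaining > 0:
            time.sleep(remaining)
        else:
            deadline = time.perf_counter()  # fell behind: reset the schedule instead of spiralling

run_frames(lambda: None, frames=3)  # the lambda stands in for the game's render call
```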
|
28,473
| 8,149,296,865
|
IssuesEvent
|
2018-08-22 09:11:21
|
bazelbuild/continuous-integration
|
https://api.github.com/repos/bazelbuild/continuous-integration
|
opened
|
Windows workers disconnect frequently
|
P1 buildkite infra-flakes
|
Windows workers keep disconnecting from buildkite automatically, which block all our CI jobs.
I had to delete all GCE windows workers and let them to be recreated.
/cc @philwo @buchgr, do you know why windows workers keep going offline?
|
1.0
|
Windows workers disconnect frequently - Windows workers keep disconnecting from buildkite automatically, which block all our CI jobs.
I had to delete all GCE windows workers and let them to be recreated.
/cc @philwo @buchgr, do you know why windows workers keep going offline?
|
non_process
|
windows workers disconnect frequently windows workers keep disconnecting from buildkite automatically which block all our ci jobs i had to delete all gce windows workers and let them to be recreated cc philwo buchgr do you know why windows workers keep going offline
| 0
|