Columns (dtype and observed range, per the dataset viewer):
- Unnamed: 0 — int64, 0 to 832k
- id — float64, 2.49B to 32.1B
- type — stringclasses, 1 value
- created_at — stringlengths, 19 to 19
- repo — stringlengths, 7 to 112
- repo_url — stringlengths, 36 to 141
- action — stringclasses, 3 values
- title — stringlengths, 1 to 744
- labels — stringlengths, 4 to 574
- body — stringlengths, 9 to 211k
- index — stringclasses, 10 values
- text_combine — stringlengths, 96 to 211k
- label — stringclasses, 2 values
- text — stringlengths, 96 to 188k
- binary_label — int64, 0 to 1

| Unnamed: 0 | id | type | created_at | repo | repo_url | action | title | labels | body | index | text_combine | label | text | binary_label |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
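The `label`/`binary_label` pair in the records below is redundant by construction: every `process` row carries `binary_label` 1 and every `non_process` row carries 0. A minimal sketch of that mapping, reconstructed from the rows rather than taken from the dataset's own code:

```python
def to_binary_label(label: str) -> int:
    """Map the string `label` column to `binary_label` as observed in the
    records: "process" -> 1, "non_process" -> 0."""
    mapping = {"process": 1, "non_process": 0}
    return mapping[label]
```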
35,390
| 12,323,198,970
|
IssuesEvent
|
2020-05-13 11:44:38
|
slaff/Sming
|
https://api.github.com/repos/slaff/Sming
|
closed
|
WS-2020-0068 (Medium) detected in yargs-parser-3.2.0.tgz
|
security vulnerability
|
## WS-2020-0068 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>yargs-parser-3.2.0.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-3.2.0.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-3.2.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/Sming/samples/HttpServer_ConfigNetwork/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/Sming/samples/HttpServer_ConfigNetwork/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- yargs-5.0.0.tgz (Root Library)
- :x: **yargs-parser-3.2.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/slaff/Sming/commit/0db2f6cd99e3779bc346af00b59d7d2365e14faa">0db2f6cd99e3779bc346af00b59d7d2365e14faa</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of yargs-parser are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects. Parsing the argument `--foo.__proto__.bar baz` adds a `bar` property with value `baz` to all objects. This is only exploitable if attackers have control over the arguments being passed to yargs-parser.
<p>Publish Date: 2020-05-01
<p>URL: <a href=https://www.npmjs.com/advisories/1500>WS-2020-0068</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/package/yargs-parser">https://www.npmjs.com/package/yargs-parser</a></p>
<p>Release Date: 2020-05-04</p>
<p>Fix Resolution: https://www.npmjs.com/package/yargs-parser/v/18.1.2,https://www.npmjs.com/package/yargs-parser/v/15.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2020-0068 (Medium) detected in yargs-parser-3.2.0.tgz - ## WS-2020-0068 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>yargs-parser-3.2.0.tgz</b></p></summary>
<p>the mighty option parser used by yargs</p>
<p>Library home page: <a href="https://registry.npmjs.org/yargs-parser/-/yargs-parser-3.2.0.tgz">https://registry.npmjs.org/yargs-parser/-/yargs-parser-3.2.0.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/Sming/samples/HttpServer_ConfigNetwork/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/Sming/samples/HttpServer_ConfigNetwork/node_modules/yargs-parser/package.json</p>
<p>
Dependency Hierarchy:
- yargs-5.0.0.tgz (Root Library)
- :x: **yargs-parser-3.2.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/slaff/Sming/commit/0db2f6cd99e3779bc346af00b59d7d2365e14faa">0db2f6cd99e3779bc346af00b59d7d2365e14faa</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Affected versions of yargs-parser are vulnerable to prototype pollution. Arguments are not properly sanitized, allowing an attacker to modify the prototype of Object, causing the addition or modification of an existing property that will exist on all objects. Parsing the argument `--foo.__proto__.bar baz` adds a `bar` property with value `baz` to all objects. This is only exploitable if attackers have control over the arguments being passed to yargs-parser.
<p>Publish Date: 2020-05-01
<p>URL: <a href=https://www.npmjs.com/advisories/1500>WS-2020-0068</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.npmjs.com/package/yargs-parser">https://www.npmjs.com/package/yargs-parser</a></p>
<p>Release Date: 2020-05-04</p>
<p>Fix Resolution: https://www.npmjs.com/package/yargs-parser/v/18.1.2,https://www.npmjs.com/package/yargs-parser/v/15.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in yargs parser tgz ws medium severity vulnerability vulnerable library yargs parser tgz the mighty option parser used by yargs library home page a href path to dependency file tmp ws scm sming samples httpserver confignetwork package json path to vulnerable library tmp ws scm sming samples httpserver confignetwork node modules yargs parser package json dependency hierarchy yargs tgz root library x yargs parser tgz vulnerable library found in head commit a href vulnerability details affected versions of yargs parser are vulnerable to prototype pollution arguments are not properly sanitized allowing an attacker to modify the prototype of object causing the addition or modification of an existing property that will exist on all objects parsing the argument foo proto bar baz adds a bar property with value baz to all objects this is only exploitable if attackers have control over the arguments being passed to yargs parser publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
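The pollution mechanism in the advisory above hinges on dotted argument keys being expanded into nested objects. A sketch of that expansion step (written in Python, whose plain dicts have no prototype chain; in JavaScript the same expansion lets a `__proto__` segment mutate `Object.prototype`):

```python
def set_dotted(store: dict, dotted_key: str, value: str) -> None:
    """Expand a dotted key like "foo.bar" into nested dicts, the way an
    option parser turns "--foo.bar baz" into {"foo": {"bar": "baz"}}."""
    parts = dotted_key.split(".")
    node = store
    for part in parts[:-1]:
        node = node.setdefault(part, {})
    node[parts[-1]] = value
```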
1,872
| 4,697,988,013
|
IssuesEvent
|
2016-10-12 11:18:57
|
CERNDocumentServer/cds
|
https://api.github.com/repos/CERNDocumentServer/cds
|
opened
|
Process: test the process workflow
|
avc_processing
|
Test the process workflow including the Sorenson server. We should be able, using webhooks, to:
* download a file from url
* transcoding
* thumbnail extraction
|
1.0
|
Process: test the process workflow - Test the process workflow including the Sorenson server. We should be able, using webhooks, to:
* download a file from url
* transcoding
* thumbnail extraction
|
process
|
process test the process workflow test the process workflow including soreson server we should be able using webhooks to download a file from url transcoding thumbnail extraction
| 1
|
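The lower-cased `text` column in these records appears to be derived from `text_combine` by lowercasing, stripping HTML, URLs, digits, and punctuation, and collapsing whitespace. A rough reconstruction of that cleaning, inferred from comparing the columns rather than taken from the dataset's actual preprocessing code:

```python
import re

def clean_text(raw: str) -> str:
    """Approximate the normalisation that seems to produce the `text`
    column: lowercase, drop markup/URLs/digits/punctuation, collapse
    whitespace. (A reconstruction, not the dataset's own pipeline.)"""
    text = raw.lower()
    text = re.sub(r"<[^>]+>", " ", text)        # strip HTML tags
    text = re.sub(r"https?://\S+", " ", text)   # strip bare URLs
    text = re.sub(r"[^a-z\s]", " ", text)       # keep letters only
    return re.sub(r"\s+", " ", text).strip()
```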
113,889
| 9,667,949,188
|
IssuesEvent
|
2019-05-21 14:14:44
|
microsoft/appcenter
|
https://api.github.com/repos/microsoft/appcenter
|
closed
|
Please add Samsung galaxy J7 to your list of supported devices
|
feature request test
|
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
|
1.0
|
Please add Samsung galaxy J7 to your list of supported devices - **Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.
|
non_process
|
please add samsung galaxy to your list of supported devices describe the solution you d like a clear and concise description of what you want to happen describe alternatives you ve considered a clear and concise description of any alternative solutions or features you ve considered additional context add any other context or screenshots about the feature request here
| 0
|
14,037
| 16,844,469,070
|
IssuesEvent
|
2021-06-19 07:05:10
|
darktable-org/darktable
|
https://api.github.com/repos/darktable-org/darktable
|
closed
|
export with style: "replace history" adds history and "append history" does nothing
|
bug: pending scope: DAM scope: UI scope: image processing
|
**Introduction**
I have a style to export in scene-linear space from dt.
When selecting "append history" for the style in the export panel, the style is not appended; the export is the same as exporting without a style.
When selecting "replace history", the history gets appended (the expected behavior for "append history").
**Describe the bug/issue**
with a style selected
"append history" in the export panel does nothing
"replace history" in the export panel appends the history
[the style i used](https://drive.google.com/file/d/1JgYQ1Ag6nzNB-mLybrfkvLcszNYJbeuz/view?usp=sharing)
**To Reproduce**
1. import the style
2. select an image with filmic enabled
3. go to the export panel
3. select my style as the export style to use
4. select mode "append history"
4. observe that the exported image does not look like it has filmic disabled.
**Expected behavior**
"append history" actually appends the history
"replace history" replaces history
also another minor issue:
the naming between the export panel and the styles panel isn't consistent
overwrite in styles panel is replace history in export panel
|
1.0
|
export with style: "replace history" adds history and "append history" does nothing - **Introduction**
I have a style to export in scene-linear space from dt.
When selecting "append history" for the style in the export panel, the style is not appended; the export is the same as exporting without a style.
When selecting "replace history", the history gets appended (the expected behavior for "append history").
**Describe the bug/issue**
with a style selected
"append history" in the export panel does nothing
"replace history" in the export panel appends the history
[the style i used](https://drive.google.com/file/d/1JgYQ1Ag6nzNB-mLybrfkvLcszNYJbeuz/view?usp=sharing)
**To Reproduce**
1. import the style
2. select an image with filmic enabled
3. go to the export panel
3. select my style as the export style to use
4. select mode "append history"
4. observe that the exported image does not look like it has filmic disabled.
**Expected behavior**
"append history" actually appends the history
"replace history" replaces history
also another minor issue:
the naming between the export panel and the styles panel isn't consistent
overwrite in styles panel is replace history in export panel
|
process
|
export with style replace history adds history and append history does nothing introduction i have a style to export in scene liner space from dt when selecting append history for the style in the export panel the style is not appended the export is the same as exporting without a style when selecting replace history the history gets appended the expected behavior for append history describe the bug issue with a style selected append history in the export panel does nothing replace history in the export panel appends the history to reproduce import the style select an image with filmic enabled go to the export panel select my style as the export style to use select mode append history observe that the exported image does not look like it has filmic disabled expected behavior append history actually appends the history replace history replaces history also another minor issue the naming between the export panel and the styles panel isn t consistent overwrite in styles panel is replace history in export panel
| 1
|
4,757
| 7,620,684,240
|
IssuesEvent
|
2018-05-03 04:34:36
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
Resolver issues when library name matches class name in same library
|
bug critical parse-tree-processing
|
Who knew it was legal in COM, but the `RefEdit` library also has a `RefEdit` class.
The only way to get VBA to declare a variable of type `RefEdit` is to qualify the class name with the library name:
``` vb
Dim r As RefEdit.RefEdit
```
That seems to confuse the resolver into thinking that _every_ usage of `RefEdit` is a reference to the project:

|
1.0
|
Resolver issues when library name matches class name in same library - Who knew it was legal in COM, but the `RefEdit` library also has a `RefEdit` class.
The only way to get VBA to declare a variable of type `RefEdit` is to qualify the class name with the library name:
``` vb
Dim r As RefEdit.RefEdit
```
That seems to confuse the resolver into thinking that _every_ usage of `RefEdit` is a reference to the project:

|
process
|
resolver issues when library name matches class name in same library who knew it was legal in com but the refedit library also has a refedit class the only way to get vba to declare a variable of type refedit is to qualify the class name with the library name vb dim r as refedit refedit that seems to confuse the resolver into thinking that every usage of refedit is a reference to the project
| 1
|
104,090
| 22,588,772,228
|
IssuesEvent
|
2022-06-28 17:39:10
|
sourcegraph/sourcegraph
|
https://api.github.com/repos/sourcegraph/sourcegraph
|
opened
|
insights: support only snapshots for compute type insight
|
team/code-insights backend compute-insight
|
Note: this is for a retrofit of our current API
The mechanics of the first version of the compute type insight are such that we want only the current (within a day) values, and only a single value. Since the current API returns timeseries, we will want to purge the old data each day so that the response can be naively inspected to construct categorical charts (bar, pie, etc). For this purpose, we will reuse the snapshot functionality.
When we create a compute type insight we will need to do the following:
1. Prevent any backfilling other than the current time (mark backfilled completed, and possibly queue up a snapshot)
2. Disable interval recording (set the next_interval value to something impossibly far in the future?)
|
1.0
|
insights: support only snapshots for compute type insight - Note: this is for a retrofit of our current API
The mechanics of the first version of the compute type insight are such that we want only the current (within a day) values, and only a single value. Since the current API returns timeseries, we will want to purge the old data each day so that the response can be naively inspected to construct categorical charts (bar, pie, etc). For this purpose, we will reuse the snapshot functionality.
When we create a compute type insight we will need to do the following:
1. Prevent any backfilling other than the current time (mark backfilled completed, and possibly queue up a snapshot)
2. Disable interval recording (set the next_interval value to something impossibly far in the future?)
|
non_process
|
insights support only snapshots for compute type insight note this is for a retrofit of our current api the mechanics of the first version of the compute type insight are such that we want only the current within a day values and only a single value since the current api returns timeseries we will want to purge the old data each day so that the response can be naively inspected to construct categorical charts bar pie etc for this purpose we will reuse the snapshot functionality when we create a compute type insight we will need to do the following prevent any backfilling other than the current time mark backfilled completed and possibly queue up a snapshot disable interval recording set the next interval value to something impossibly far in the future
| 0
|
7,551
| 10,675,506,080
|
IssuesEvent
|
2019-10-21 11:51:11
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
Quasi-support ShellExec for UWP
|
area-System.Diagnostics.Process enhancement os-windows-uwp up-for-grabs
|
We didn't ask for ShellExecuteEx to be added to the WACK so for UWP we should make some attempt to use Windows.System.Launcher and if it fails fall back to Process.Start. That ought to give us at least URL support.
@JeremyKuhne
|
1.0
|
Quasi-support ShellExec for UWP - We didn't ask for ShellExecuteEx to be added to the WACK so for UWP we should make some attempt to use Windows.System.Launcher and if it fails fall back to Process.Start. That ought to give us at least URL support.
@JeremyKuhne
|
process
|
quasi support shellexec for uwp we didn t ask for shellexecuteex to be added to the wack so for uwp we should make some attempt to use windows system launcher and if it fails fall back to process start that ought to give us at least url support jeremykuhne
| 1
|
7,143
| 10,288,907,706
|
IssuesEvent
|
2019-08-27 09:28:23
|
vaerilius/angular8-course
|
https://api.github.com/repos/vaerilius/angular8-course
|
closed
|
Section 21: Dynamic Components
|
inProcess
|
- [x] Adding an Alert Modal Component
- [x] Understanding the DifferentApproaches
- [x] Using ngIf
- [x] Preparing Programmatic Creation
- [x] Creating a Component Programmatically
- [x] Understanding entryComponents
- [x] Data Binding & Event Binding
- [x] Wrap Up
- [x] Useful Resources & Links
|
1.0
|
Section 21: Dynamic Components - - [x] Adding an Alert Modal Component
- [x] Understanding the DifferentApproaches
- [x] Using ngIf
- [x] Preparing Programmatic Creation
- [x] Creating a Component Programmatically
- [x] Understanding entryComponents
- [x] Data Binding & Event Binding
- [x] Wrap Up
- [x] Useful Resources & Links
|
process
|
section dynamic components adding an alert modal component understanding the differentapproaches using ngif preparing programmatic creation creating a component programmatically understanding entrycomponents data binding event binding wrap up useful resources links
| 1
|
10,620
| 13,439,143,072
|
IssuesEvent
|
2020-09-07 20:09:59
|
timberio/vector
|
https://api.github.com/repos/timberio/vector
|
opened
|
New `now` function
|
domain: mapping domain: processing type: feature
|
The `now` function generates a new timestamp for the current time with the provided [IANA time zone](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List).
## Examples
```
.timestamp = now("UTC")
.timestamp = now("US/Eastern")
.timestamp = now("Etc/GMT+4")
```
|
1.0
|
New `now` function - The `now` function generates a new timestamp for the current time with the provided [IANA time zone](https://en.wikipedia.org/wiki/List_of_tz_database_time_zones#List).
## Examples
```
.timestamp = now("UTC")
.timestamp = now("US/Eastern")
.timestamp = now("Etc/GMT+4")
```
|
process
|
new now function the now function generates a new timestamp for the current time with the provided examples timestamp now utc timestamp now us eastern timestamp now etc gmt
| 1
|
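The `now` function proposed above belongs to Vector's mapping language, not Python; its semantics (a current timestamp in a named IANA time zone) can be sketched with Python's standard-library `zoneinfo` module, as an analogy rather than Vector's implementation:

```python
from datetime import datetime
from zoneinfo import ZoneInfo

def now(tz_name: str) -> datetime:
    """Current timestamp localized to the given IANA time zone name,
    e.g. "UTC", "US/Eastern", "Etc/GMT+4"."""
    return datetime.now(ZoneInfo(tz_name))
```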
103,681
| 16,603,673,476
|
IssuesEvent
|
2021-06-01 23:36:27
|
hygieia/hygieia-common
|
https://api.github.com/repos/hygieia/hygieia-common
|
opened
|
WS-2020-0293 (Medium) detected in spring-security-web-4.2.15.RELEASE.jar
|
security vulnerability
|
## WS-2020-0293 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-security-web-4.2.15.RELEASE.jar</b></p></summary>
<p>spring-security-web</p>
<p>Library home page: <a href="https://spring.io/spring-security">https://spring.io/spring-security</a></p>
<p>Path to dependency file: hygieia-common/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/org/springframework/security/spring-security-web/4.2.15.RELEASE/spring-security-web-4.2.15.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- :x: **spring-security-web-4.2.15.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/hygieia/hygieia-common/commits/b8fbfc18552132520e52029d9b0fc0a1db09f115">b8fbfc18552132520e52029d9b0fc0a1db09f115</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Spring Security before 5.2.9, 5.3.7, and 5.4.3 vulnerable to side-channel attacks. Vulnerable versions of Spring Security don't use constant time comparisons for CSRF tokens.
<p>Publish Date: 2020-12-17
<p>URL: <a href=https://github.com/spring-projects/spring-security/commit/40e027c56d11b9b4c5071360bfc718165c937784>WS-2020-0293</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/spring-projects/spring-security/issues/9291">https://github.com/spring-projects/spring-security/issues/9291</a></p>
<p>Release Date: 2020-12-17</p>
<p>Fix Resolution: org.springframework.security:spring-security-web:5.2.9,5.3.7,5.4.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
WS-2020-0293 (Medium) detected in spring-security-web-4.2.15.RELEASE.jar - ## WS-2020-0293 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-security-web-4.2.15.RELEASE.jar</b></p></summary>
<p>spring-security-web</p>
<p>Library home page: <a href="https://spring.io/spring-security">https://spring.io/spring-security</a></p>
<p>Path to dependency file: hygieia-common/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/org/springframework/security/spring-security-web/4.2.15.RELEASE/spring-security-web-4.2.15.RELEASE.jar</p>
<p>
Dependency Hierarchy:
- :x: **spring-security-web-4.2.15.RELEASE.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/hygieia/hygieia-common/commits/b8fbfc18552132520e52029d9b0fc0a1db09f115">b8fbfc18552132520e52029d9b0fc0a1db09f115</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Spring Security before 5.2.9, 5.3.7, and 5.4.3 vulnerable to side-channel attacks. Vulnerable versions of Spring Security don't use constant time comparisons for CSRF tokens.
<p>Publish Date: 2020-12-17
<p>URL: <a href=https://github.com/spring-projects/spring-security/commit/40e027c56d11b9b4c5071360bfc718165c937784>WS-2020-0293</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/spring-projects/spring-security/issues/9291">https://github.com/spring-projects/spring-security/issues/9291</a></p>
<p>Release Date: 2020-12-17</p>
<p>Fix Resolution: org.springframework.security:spring-security-web:5.2.9,5.3.7,5.4.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
ws medium detected in spring security web release jar ws medium severity vulnerability vulnerable library spring security web release jar spring security web library home page a href path to dependency file hygieia common pom xml path to vulnerable library canner repository org springframework security spring security web release spring security web release jar dependency hierarchy x spring security web release jar vulnerable library found in head commit a href found in base branch main vulnerability details spring security before and vulnerable to side channel attacks vulnerable versions of spring security don t use constant time comparisons for csrf tokens publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org springframework security spring security web step up your open source security game with whitesource
| 0
|
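The advisory above describes token comparison that is not constant-time: a naive equality check can return early at the first mismatching character, leaking information through timing. The fix class it points at looks like this in Python's standard library, shown as an illustration rather than Spring Security's actual patch:

```python
import hmac

def csrf_tokens_match(expected: str, presented: str) -> bool:
    """Constant-time comparison: runtime does not depend on the position
    of the first mismatching character, unlike an early-exit equality."""
    return hmac.compare_digest(expected, presented)
```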
6,209
| 9,116,887,531
|
IssuesEvent
|
2019-02-22 10:13:46
|
decidim/decidim
|
https://api.github.com/repos/decidim/decidim
|
closed
|
Invalid options shown to participatory process collaborator users
|
space: processes type: bug
|
**Describe the bug**
For a collaborator user of a participatory process, when accessing the edit page through the participatory front page, the admin main menu offers options that the user is not allowed to perform.
**To Reproduce**
Steps to reproduce the behavior:
1. Add a collaborator user for a participatory process
2. Login with that user
3. Visit the participatory process page
4. Click on the Edit link located on the upper right of the screen
**Expected behavior**
When modifying a participatory process, the menu shouldn't offer additional options if the user can't perform them.
**Screenshots**
In these screenshots the user is collaborator in "Proceso 1" and administrator in "Proceso 2".
When editing a participatory process, the main menu shows collaborators additional options that can't be accessed, because the user doesn't have rights to use them.


**Stacktrace**
If applicable, add the error stacktrace to help explain your problem.
**Extra data (please complete the following information):**
- Device: Desktop
- Device OS: Ubuntu 16.04
- Browser: Firefox
- Decidim Version: 0.16
- Decidim installation: local development
**Additional context**
Add any other context about the problem here.
|
1.0
|
Invalid options shown to participatory process collaborator users - **Describe the bug**
For a collaborator user of a participatory process, when accessing the edit page through the participatory front page, the admin main menu offers options that the user is not allowed to perform.
**To Reproduce**
Steps to reproduce the behavior:
1. Add a collaborator user for a participatory process
2. Login with that user
3. Visit the participatory process page
4. Click on the Edit link located on the upper right of the screen
**Expected behavior**
When modifying a participatory process, the menu shouldn't offer additional options if the user can't perform them.
**Screenshots**
In these screenshots the user is collaborator in "Proceso 1" and administrator in "Proceso 2".
When editing a participatory process, the main menu shows collaborators additional options that can't be accessed, because the user doesn't have rights to use them.


**Stacktrace**
If applicable, add the error stacktrace to help explain your problem.
**Extra data (please complete the following information):**
- Device: Desktop
- Device OS: Ubuntu 16.04
- Browser: Firefox
- Decidim Version: 0.16
- Decidim installation: local development
**Additional context**
Add any other context about the problem here.
|
process
|
invalid options shown to participatory process collaborator users describe the bug for a collaborator user of a participatory processes when accessing to the edit page throught the participatory front page the admin main menu offers options that the user is not allowed to perform to reproduce steps to reproduce the behavior add a collaborator user for a participatory process login with that user visit the participatory process page click on the edit link located on the upper right of the screen expected behavior when modifying a participatory process the menu shouldn t offer additional options if the user can t perform them screenshots in these screenshots the user is collaborator in proceso and administrator in proceso when editing a participatory process the main menu shows to collaborators additional options that can t be accessed because the user doesn t has rights to use them stacktrace if applicable add the error stacktrace to help explain your problem extra data please complete the following information device desktop device os ubuntu browser firefox decidim version decidim installation local development additional context add any other context about the problem here
| 1
|
22,097
| 30,624,863,455
|
IssuesEvent
|
2023-07-24 10:44:35
|
EBIvariation/eva-opentargets
|
https://api.github.com/repos/EBIvariation/eva-opentargets
|
opened
|
Manual curation for 2023.09 release
|
Processing
|
Refer to [documentation](https://github.com/EBIvariation/eva-opentargets/tree/master/docs/manual-curation) for full description of steps.
Minimal curation round to update to OLS4 (i.e. more recent version of EFO).
**Checklist:**
- [ ] Step 1 — Process
- [ ] Step 2 — Curate
- [ ] Curation
- [ ] Review 1
- [ ] Review 2
- [ ] Step 3 — Export
- [ ] Step 4 — EFO feedback
|
1.0
|
Manual curation for 2023.09 release - Refer to [documentation](https://github.com/EBIvariation/eva-opentargets/tree/master/docs/manual-curation) for full description of steps.
Minimal curation round to update to OLS4 (i.e. more recent version of EFO).
**Checklist:**
- [ ] Step 1 — Process
- [ ] Step 2 — Curate
- [ ] Curation
- [ ] Review 1
- [ ] Review 2
- [ ] Step 3 — Export
- [ ] Step 4 — EFO feedback
|
process
|
manual curation for release refer to for full description of steps minimal curation round to update to i e more recent version of efo checklist step — process step — curate curation review review step — export step — efo feedback
| 1
|
18,118
| 24,150,277,481
|
IssuesEvent
|
2022-09-21 23:26:06
|
googleapis/repo-automation-bots
|
https://api.github.com/repos/googleapis/repo-automation-bots
|
closed
|
Migrate bots to Node 14
|
type: process
|
We should update bots to Node 14 to update dependencies. But, we need to be careful when upgrading bots that it does not break when deploying to GCF.
|
1.0
|
Migrate bots to Node 14 - We should update bots to Node 14 to update dependencies. But, we need to be careful when upgrading bots that it does not break when deploying to GCF.
|
process
|
migrate bots to node we should update bots to node to update dependencies but we need to be careful when upgrading bots that it does not break when deploying to gcf
| 1
|
30,274
| 5,774,532,444
|
IssuesEvent
|
2017-04-28 07:28:29
|
facette/facette
|
https://api.github.com/repos/facette/facette
|
closed
|
Build fails on Gentoo Linux
|
Documentation Question
|
Trying to build 0.4alpha on Gentoo, I've got all the prerequisites (Node, NPM, RRDTool (including headers), Go, pandoc, and pkg-config), and it chokes when trying to build the assets. I've tried this a couple of times now from a clean clone of the repository.
Based on the output, it looks like the makefile is expecting something to be there that isn't in the repo, isn't generated by the makefile itself, and has no instructions for manual generation either.
Build output:
```
build: Preparing build directory...
result: ok
build: Building binaries...
github.com/lib/pq/oid
github.com/mattn/go-colorable
github.com/influxdata/influxdb/influxql/neldermead
github.com/influxdata/influxdb/pkg/escape
facette/worker
facette/mapper
github.com/facette/sliceutil
github.com/go-sql-driver/mysql
github.com/jinzhu/inflection
github.com/lib/pq
golang.org/x/net/context
github.com/pkg/errors
github.com/fatih/set
github.com/mgutz/ansi
github.com/hashicorp/go-uuid
github.com/mattn/go-sqlite3
github.com/facette/httputil
github.com/influxdata/influxdb/models
facette/template
github.com/facette/logger
github.com/gogo/protobuf/proto
github.com/ziutek/rrd
facette/timerange
gopkg.in/yaml.v2
github.com/cosiner/flag
github.com/influxdata/influxdb/client/v2
github.com/facette/httproute
github.com/facette/jsonutil
github.com/facette/natsort
gopkg.in/tylerb/graceful.v1
facette/yamlutil
github.com/influxdata/influxdb/influxql/internal
github.com/influxdata/influxdb/influxql
facette/orm
facette/backend
facette/catalog
facette/plot
facette/connector
cmd/facette
cmd/facettectl
result: ok
build: Building assets...
/bin/sh: node_modules/.bin/gulp: No such file or directory
result: fail
make: *** [Makefile:92: build-assets] Error 1
```
|
1.0
|
Build fails on Gentoo Linux - Trying to build 0.4alpha on Gentoo, I've got all the prerequisites (Node, NPM, RRDTool (including headers), Go, pandoc, and pkg-config), and it chokes when trying to build the assets. I've tried this a couple of times now from a clean clone of the repository.
Based on the output, it looks like the makefile is expecting something to be there that isn't in the repo, isn't generated by the makefile itself, and has no instructions for manual generation either.
Build output:
```
build: Preparing build directory...
result: ok
build: Building binaries...
github.com/lib/pq/oid
github.com/mattn/go-colorable
github.com/influxdata/influxdb/influxql/neldermead
github.com/influxdata/influxdb/pkg/escape
facette/worker
facette/mapper
github.com/facette/sliceutil
github.com/go-sql-driver/mysql
github.com/jinzhu/inflection
github.com/lib/pq
golang.org/x/net/context
github.com/pkg/errors
github.com/fatih/set
github.com/mgutz/ansi
github.com/hashicorp/go-uuid
github.com/mattn/go-sqlite3
github.com/facette/httputil
github.com/influxdata/influxdb/models
facette/template
github.com/facette/logger
github.com/gogo/protobuf/proto
github.com/ziutek/rrd
facette/timerange
gopkg.in/yaml.v2
github.com/cosiner/flag
github.com/influxdata/influxdb/client/v2
github.com/facette/httproute
github.com/facette/jsonutil
github.com/facette/natsort
gopkg.in/tylerb/graceful.v1
facette/yamlutil
github.com/influxdata/influxdb/influxql/internal
github.com/influxdata/influxdb/influxql
facette/orm
facette/backend
facette/catalog
facette/plot
facette/connector
cmd/facette
cmd/facettectl
result: ok
build: Building assets...
/bin/sh: node_modules/.bin/gulp: No such file or directory
result: fail
make: *** [Makefile:92: build-assets] Error 1
```
|
non_process
|
build fails on gentoo linux trying to build on gentoo i ve got all the prerequisites node npm rrdtool including headers go pandoc and pkg config and it chokes when trying to build the assets i ve tried this a couple of times now from a clean clone of the repository based on the output it looks like the makefile is expecting something to be there that isn t in the repo isn t generated by the makefile itself and has no instructions for manual generation either build output build preparing build directory result ok build building binaries github com lib pq oid github com mattn go colorable github com influxdata influxdb influxql neldermead github com influxdata influxdb pkg escape facette worker facette mapper github com facette sliceutil github com go sql driver mysql github com jinzhu inflection github com lib pq golang org x net context github com pkg errors github com fatih set github com mgutz ansi github com hashicorp go uuid github com mattn go github com facette httputil github com influxdata influxdb models facette template github com facette logger github com gogo protobuf proto github com ziutek rrd facette timerange gopkg in yaml github com cosiner flag github com influxdata influxdb client github com facette httproute github com facette jsonutil github com facette natsort gopkg in tylerb graceful facette yamlutil github com influxdata influxdb influxql internal github com influxdata influxdb influxql facette orm facette backend facette catalog facette plot facette connector cmd facette cmd facettectl result ok build building assets bin sh node modules bin gulp no such file or directory result fail make error
| 0
|
14,442
| 17,498,340,113
|
IssuesEvent
|
2021-08-10 05:50:06
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[Android] Unable to Share the Consent PDF for the studies which contains the '/ ' in the Study name
|
Bug P1 Android Process: Fixed Process: Tested QA
|
A/R:- Getting error message and unable to share the Consent PDF
E/R:- Consent PDF should be able to share without any error

|
2.0
|
[Android] Unable to Share the Consent PDF for the studies which contains the '/ ' in the Study name - A/R:- Getting error message and unable to share the Consent PDF
E/R:- Consent PDF should be able to share without any error

|
process
|
unable to share the consent pdf for the studies which contains the in the study name a r getting error message and unable to share the consent pdf e r consent pdf should be able to share without any error
| 1
|
10,694
| 13,490,439,410
|
IssuesEvent
|
2020-09-11 15:07:30
|
jgraley/inferno-cpp2v
|
https://api.github.com/repos/jgraley/inferno-cpp2v
|
closed
|
Remove the side info
|
Constraint Processing
|
Enable `AndRuleEngine` to determine evaluators and normal and abnormal links from just the couplings combined with the "stiff" data structures. The latter may need embellishing. Then remove all mention of `side_info` from the CSP stuff.
|
1.0
|
Remove the side info - Enable `AndRuleEngine` to determine evaluators and normal and abnormal links from just the couplings combined with the "stiff" data structures. The latter may need embellishing. Then remove all mention of `side_info` from the CSP stuff.
|
process
|
remove the side info enable andruleengine to determine evaluators and normal and abnormal links from just the couplings combined with the stiff data structures the latter may need embellishing then remove all mention of side info from the csp stuff
| 1
|
390,262
| 26,855,758,246
|
IssuesEvent
|
2023-02-03 14:30:26
|
xpsi-group/xpsi
|
https://api.github.com/repos/xpsi-group/xpsi
|
closed
|
Extra info for library installation
|
documentation hackweek2023
|
Add extra guidelines to install other library dependencies (BLAS, LAPACK, etc) for beginners.
|
1.0
|
Extra info for library installation - Add extra guidelines to install other library dependencies (BLAS, LAPACK, etc) for beginners.
|
non_process
|
extra info for library installation add extra guidelines to install other library dependencies blas lapack etc for beginners
| 0
|
18,240
| 24,308,629,448
|
IssuesEvent
|
2022-09-29 19:51:40
|
benthosdev/benthos
|
https://api.github.com/repos/benthosdev/benthos
|
closed
|
Support oracle db in sql components
|
enhancement processors outputs
|
Hi, how long will it take for benthos to support oracle db?
|
1.0
|
Support oracle db in sql components - Hi, how long will it take for benthos to support oracle db?
|
process
|
support oracle db in sql components hi how long will it take for benthos to support oracle db
| 1
|
14,765
| 18,044,486,770
|
IssuesEvent
|
2021-09-18 16:51:03
|
Leviatan-Analytics/LA-data-processing
|
https://api.github.com/repos/Leviatan-Analytics/LA-data-processing
|
closed
|
Implement player positioning missing datapoints curation techniques [1]
|
Data Processing Week 3 Sprint 4
|
Start implementing algorithm to enhance the recognition when player names overlap.
|
1.0
|
Implement player positioning missing datapoints curation techniques [1] - Start implementing algorithm to enhance the recognition when player names overlap.
|
process
|
implement player positioning missing datapoints curation techniques start implementing algorithm to enhance the recognition when player names overlap
| 1
|
7,013
| 10,164,283,485
|
IssuesEvent
|
2019-08-07 11:17:15
|
qgis/QGIS
|
https://api.github.com/repos/qgis/QGIS
|
closed
|
Processing : execute sql alg does not work with input2 ... inputN parameters
|
Bug Processing
|
Author Name: **jd lom** (@lejedi76)
Original Redmine Issue: [16129](https://issues.qgis.org/issues/16129)
Affected QGIS version: 3.5(master)
Redmine category:processing/qgis
---
In ExecuteSQL.py, the layerIdx variable is not incremented.
Best regards,
---
Related issue(s): #28487 (duplicates)
Redmine related issue(s): [20667](https://issues.qgis.org/issues/20667)
---
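A minimal illustration of the bug class described above (the names are made up; this is not the actual ExecuteSQL.py code): with the increment in place, each extra layer gets its own distinct inputN alias, which is exactly what fails when the counter is never incremented.

```python
# Illustrative layer names, not real QGIS data.
layers = ["roads", "rivers", "parcels"]

aliases = {}
layer_idx = 1
for layer in layers:
    aliases["input%d" % layer_idx] = layer
    layer_idx += 1  # the increment the issue reports as missing

# With the increment, every layer is registered under a unique alias.
assert aliases == {"input1": "roads", "input2": "rivers", "input3": "parcels"}
```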
|
1.0
|
Processing : execute sql alg does not work with input2 ... inputN parameters - Author Name: **jd lom** (@lejedi76)
Original Redmine Issue: [16129](https://issues.qgis.org/issues/16129)
Affected QGIS version: 3.5(master)
Redmine category:processing/qgis
---
In ExecuteSQL.py, the layerIdx variable is not incremented.
Best regards,
---
Related issue(s): #28487 (duplicates)
Redmine related issue(s): [20667](https://issues.qgis.org/issues/20667)
---
|
process
|
processing execute sql alg does not work with inputn parameters author name jd lom original redmine issue affected qgis version master redmine category processing qgis in executesql py the layeridx variable is not incremented best regards related issue s duplicates redmine related issue s
| 1
|
6,925
| 10,084,376,195
|
IssuesEvent
|
2019-07-25 15:32:14
|
geneontology/go-ontology
|
https://api.github.com/repos/geneontology/go-ontology
|
opened
|
NTRs, fork processing-related PMID:30667359
|
New term request PomBase cell cycle and DNA processes community curation recombination
|
A PomBase community curator, Matt Whitby, has requested two new terms for PMID:30667359:
1. negative regulation of template switch recombination
For this one, I suggest:
name: negative regulation of template switch recombination involved in replication fork processing
def: "Any process that stops, prevents or reduces the frequency, rate or extent of recombination that results in template switching associated with recombination-dependent replication downstream of a replication fork barrier that occurs as part of replication fork processing."
is_a: GO:1903221 ! regulation of mitotic recombination involved in replication fork processing
I gather that template switching in this context is just a thing that happens at some frequency, and is usually deleterious for the cell/organism - I'm not at all sure it's a bona fide genetically programmed "biological process" as GO now defines them, or the adaptive role of the gene products that happen to cause it or let it take place. So I don't feel an immediate need to add a template switching term just so GO:new1 can have a "regulates GO:bp" link. The is_a GO:1903221 link will suffice. For similar reasons, I also have no need for the generic "regulation" or "positive regulation" terms.
2. positive regulation of recombination-dependent DNA replication
In this case, the regulated process may be essentially the same as GO:1990426 ! mitotic recombination-dependent replication fork processing; again, I'm not sure, but if it is the same thing then I'd be happy to use 'positive regulation of mitotic recombination-dependent replication fork processing' as the name and put the suggested text in as a synonym.
Matt suggested "A cellular process in which the restart of DNA replication by homologous recombination is promoted" as a definition; the mention of "restart" does fit with the fork processing context. I am far from an expert in this area, so if you have questions about this or any of the rest of the ticket, I probably won't be able to answer, but will happily forward any to Matt.
For both terms, PMID:28586299 and PMID:31149897 also have some relevant information and could be cited for the defs.
|
1.0
|
NTRs, fork processing-related PMID:30667359 - A PomBase community curator, Matt Whitby, has requested two new terms for PMID:30667359:
1. negative regulation of template switch recombination
For this one, I suggest:
name: negative regulation of template switch recombination involved in replication fork processing
def: "Any process that stops, prevents or reduces the frequency, rate or extent of recombination that results in template switching associated with recombination-dependent replication downstream of a replication fork barrier that occurs as part of replication fork processing."
is_a: GO:1903221 ! regulation of mitotic recombination involved in replication fork processing
I gather that template switching in this context is just a thing that happens at some frequency, and is usually deleterious for the cell/organism - I'm not at all sure it's a bona fide genetically programmed "biological process" as GO now defines them, or the adaptive role of the gene products that happen to cause it or let it take place. So I don't feel an immediate need to add a template switching term just so GO:new1 can have a "regulates GO:bp" link. The is_a GO:1903221 link will suffice. For similar reasons, I also have no need for the generic "regulation" or "positive regulation" terms.
2. positive regulation of recombination-dependent DNA replication
In this case, the regulated process may be essentially the same as GO:1990426 ! mitotic recombination-dependent replication fork processing; again, I'm not sure, but if it is the same thing then I'd be happy to use 'positive regulation of mitotic recombination-dependent replication fork processing' as the name and put the suggested text in as a synonym.
Matt suggested "A cellular process in which the restart of DNA replication by homologous recombination is promoted" as a definition; the mention of "restart" does fit with the fork processing context. I am far from an expert in this area, so if you have questions about this or any of the rest of the ticket, I probably won't be able to answer, but will happily forward any to Matt.
For both terms, PMID:28586299 and PMID:31149897 also have some relevant information and could be cited for the defs.
|
process
|
ntrs fork processing related pmid a pombase community curator matt whitby has requested two new terms for pmid negative regulation of template switch recombination for this one i suggest name negative regulation of template switch recombination involved in replication fork processing def any process that stops prevents or reduces the frequency rate or extent of recombination that results in template switching associated with recombination dependent replication downstream of a replication fork barrier that occurs as part of replication fork processing is a go regulation of mitotic recombination involved in replication fork processing i gather that template switching in this context is just a thing that happens at some frequency and is usually deleterious for the cell organism i m not at all sure it s a bona fide genetically programmed biological process as go now defines them or the adaptive role of the gene products that happen to cause it or let it take place so i don t feel an immediate need to add a template switching term just so go can have a regulates go bp link the is a go link will suffice for similar reasons i also have no need for the generic regulation or positive regulation terms positive regulation of recombination dependent dna replication in this case the regulated process may be essentially the same as go mitotic recombination dependent replication fork processing again i m not sure but if it is the same thing then i d be happy to use positive regulation of mitotic recombination dependent replication fork processing as the name and put the suggested text in as a synonym matt suggested a cellular process in which the restart of dna replication by homologous recombination is promoted as a definition the mention of restart does fit with the fork processing context i am far from an expert in this area so if you have questions about this or any of the rest of the ticket i probably won t be able to answer but will happily forward any to matt for both 
terms pmid and pmid also have some relevant information and could be cited for the defs
| 1
|
9,004
| 12,120,728,677
|
IssuesEvent
|
2020-04-22 08:10:11
|
threefoldtech/jumpscaleX_core
|
https://api.github.com/repos/threefoldtech/jumpscaleX_core
|
closed
|
Error starting a halted 3bot container using 3sdk
|
process_wontfix type_bug
|
branch development commit c99a93e849f64e65ac91f393a5e3473d60386977
```
3sdk> container start
@param server=True will start 3bot server
3bot
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/container.py", line 120, in start
c = _containers.get(name=name)
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/lib/SDKContainers.py", line 74, in get
if not docker.executor.exists("/sandbox/cfg/.configured"):
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/InstallTools.py", line 5402, in executor
if not self._executor:
AttributeError: 'DockerContainer' object has no attribute '_executor'
'DockerContainer' object has no attribute '_executor'
3sdk> container list
list the containers
- 3bot : localhost : threefoldtech/3bot2 (sshport:9000)
3sdk> container kosmos
start kosmos shell
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/container.py", line 104, in kosmos
c = _containers.get(name=name)
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/lib/SDKContainers.py", line 74, in get
if not docker.executor.exists("/sandbox/cfg/.configured"):
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/InstallTools.py", line 5402, in executor
if not self._executor:
AttributeError: 'DockerContainer' object has no attribute '_executor'
'DockerContainer' object has no attribute '_executor'
```
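For illustration, here is a minimal sketch of the failure mode and one defensive fix — the class and attribute names mirror the traceback, but the body is hypothetical and not the real InstallTools code: `if not self._executor` raises AttributeError when `__init__` never set the attribute, while `getattr` with a default tolerates it.

```python
class DockerContainer:
    """Sketch of the failure mode: a property reads an attribute
    that __init__ never set (names mirror the traceback only)."""

    @property
    def executor(self):
        # getattr with a default tolerates the missing attribute,
        # where `if not self._executor` would raise AttributeError.
        if getattr(self, "_executor", None) is None:
            self._executor = object()  # lazily build the real executor here
        return self._executor

c = DockerContainer()
assert c.executor is c._executor
```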
|
1.0
|
Error starting a halted 3bot container using 3sdk - branch development commit c99a93e849f64e65ac91f393a5e3473d60386977
```
3sdk> container start
@param server=True will start 3bot server
3bot
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/container.py", line 120, in start
c = _containers.get(name=name)
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/lib/SDKContainers.py", line 74, in get
if not docker.executor.exists("/sandbox/cfg/.configured"):
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/InstallTools.py", line 5402, in executor
if not self._executor:
AttributeError: 'DockerContainer' object has no attribute '_executor'
'DockerContainer' object has no attribute '_executor'
3sdk> container list
list the containers
- 3bot : localhost : threefoldtech/3bot2 (sshport:9000)
3sdk> container kosmos
start kosmos shell
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/container.py", line 104, in kosmos
c = _containers.get(name=name)
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/lib/SDKContainers.py", line 74, in get
if not docker.executor.exists("/sandbox/cfg/.configured"):
File "/Users/rob/sandbox/code/github/threefoldtech/jumpscaleX_core/install/threesdk/InstallTools.py", line 5402, in executor
if not self._executor:
AttributeError: 'DockerContainer' object has no attribute '_executor'
'DockerContainer' object has no attribute '_executor'
```
|
process
|
error starting a halted container using branch development commit container start param server true will start server traceback most recent call last file line in file users rob sandbox code github threefoldtech jumpscalex core install threesdk container py line in start c containers get name name file users rob sandbox code github threefoldtech jumpscalex core install threesdk lib sdkcontainers py line in get if not docker executor exists sandbox cfg configured file users rob sandbox code github threefoldtech jumpscalex core install threesdk installtools py line in executor if not self executor attributeerror dockercontainer object has no attribute executor dockercontainer object has no attribute executor container list list the containers localhost threefoldtech sshport container kosmos start kosmos shell traceback most recent call last file line in file users rob sandbox code github threefoldtech jumpscalex core install threesdk container py line in kosmos c containers get name name file users rob sandbox code github threefoldtech jumpscalex core install threesdk lib sdkcontainers py line in get if not docker executor exists sandbox cfg configured file users rob sandbox code github threefoldtech jumpscalex core install threesdk installtools py line in executor if not self executor attributeerror dockercontainer object has no attribute executor dockercontainer object has no attribute executor
| 1
|
217,651
| 7,326,791,772
|
IssuesEvent
|
2018-03-04 00:37:55
|
owtf/owtf
|
https://api.github.com/repos/owtf/owtf
|
opened
|
Update and complete Sphinx documentation
|
Easy Fix Priority High help wanted
|
OWTF uses Sphinx to generate internal API and user docs. Currently, the docs do not correctly reflect the current installation methods and vast changes to OWTF since the early releases.
The documentation lives under `doc` directory and can be generated using the Makefile - `make docs`.
|
1.0
|
Update and complete Sphinx documentation - OWTF uses Sphinx to generate internal API and user docs. Currently, the docs do not correctly reflect the current installation methods and vast changes to OWTF since the early releases.
The documentation lives under `doc` directory and can be generated using the Makefile - `make docs`.
|
non_process
|
update and complete sphinx documentation owtf uses sphinx to generate internal api and user docs currently the docs do not correctly reflect the current installation methods and vast changes to owtf since the early releases the documentation lives under doc directory and can be generated using the makefile make docs
| 0
|
13,504
| 16,044,465,304
|
IssuesEvent
|
2021-04-22 12:05:47
|
SpongePowered/Mixin
|
https://api.github.com/repos/SpongePowered/Mixin
|
closed
|
Refmap processor does not generate subdirectories
|
annotation processor enhancement
|
**MixinGradle version:** 0.7-SNAPSHOT
**Mixin version:** 0.8.2
I use the following path structure for my refmap `mixins/$modid/refmap.json`, but Mixin does not resolve the parent directories, and in turn silently fails to create the refmap. This results in a `FileNotFoundException` later in compilation as Mixin then cannot find the refmap.
---
This is not high priority as I have a workaround, but it would be nice to have it resolved eventually:
```kotlin
doLast {
buildDir.resolve("tmp/compileJava/$mixinRefmap").parentFile.mkdirs()
}
```
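The same workaround can be sketched outside Gradle; this illustrative Python snippet (paths are made up) shows the missing step the issue describes — creating parent directories before writing the nested refmap:

```python
import os
import tempfile

# Illustrative path: a refmap nested under directories that do not
# exist yet, like mixins/$modid/refmap.json in the report.
out = os.path.join(tempfile.mkdtemp(), "mixins", "mymod", "refmap.json")

# The missing step: create parent directories before writing.
os.makedirs(os.path.dirname(out), exist_ok=True)
with open(out, "w") as f:
    f.write("{}")

assert os.path.isfile(out)
```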
|
1.0
|
Refmap processor does not generate subdirectories - **MixinGradle version:** 0.7-SNAPSHOT
**Mixin version:** 0.8.2
I use the following path structure for my refmap `mixins/$modid/refmap.json`, but Mixin does not resolve the parent directories, and in turn silently fails to create the refmap. This results in a `FileNotFoundException` later in compilation as Mixin then cannot find the refmap.
---
This is not high priority as I have a workaround, but it would be nice to have it resolved eventually:
```kotlin
doLast {
buildDir.resolve("tmp/compileJava/$mixinRefmap").parentFile.mkdirs()
}
```
|
process
|
refmap processor does not generate subdirectories mixingradle version snapshot mixin version i use the following path structure for my refmap mixins modid refmap json but mixin does not resolve the parent directories and in turn silently fails to create the refmap this results in a filenotfoundexception later in compilation as mixin then cannot find the refmap this is not high priority as i have a workaround but it would be nice to have it resolved eventually kotlin dolast builddir resolve tmp compilejava mixinrefmap parentfile mkdirs
| 1
|
3,298
| 6,395,587,374
|
IssuesEvent
|
2017-08-04 13:38:13
|
pelias/api
|
https://api.github.com/repos/pelias/api
|
closed
|
[security] add tests to ensure that the api_key is never returned in the meta data
|
help wanted low hanging fruit processed
|
[security] add tests to ensure that the api_key is never returned in the meta data.
this is especially relevant for upstream caches.
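As a sketch of what such a test could assert (the response shape and helper are hypothetical, not pelias's actual API), the property under test is simply that the key never reaches the echoed metadata:

```python
def response_metadata(params):
    """Hypothetical helper: echo query params back into response
    metadata with the api_key stripped."""
    return {"query": {k: v for k, v in params.items() if k != "api_key"}}

meta = response_metadata({"text": "london", "api_key": "secret-123"})

# The key never appears in the metadata, so upstream caches
# cannot leak it.
assert "api_key" not in meta["query"]
assert meta["query"]["text"] == "london"
```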
|
1.0
|
[security] add tests to ensure that the api_key is never returned in the meta data - [security] add tests to ensure that the api_key is never returned in the meta data.
this is especially relevant for upstream caches.
|
process
|
add tests to ensure that the api key is never returned in the meta data add tests to ensure that the api key is never returned in the meta data this is especially relevant for upstream caches
| 1
|
383,224
| 11,352,829,199
|
IssuesEvent
|
2020-01-24 14:26:30
|
qutebrowser/qutebrowser
|
https://api.github.com/repos/qutebrowser/qutebrowser
|
closed
|
Add prefers-color-scheme CSS media feature to qutebrowser settings
|
component: QtWebEngine priority: 3 - wishlist qt: 5.14
|
It would be nice if https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme was configurable in the settings.
|
1.0
|
Add prefers-color-scheme CSS media feature to qutebrowser settings - It would be nice if https://developer.mozilla.org/en-US/docs/Web/CSS/@media/prefers-color-scheme was configurable in the settings.
|
non_process
|
add prefers color scheme css media feature to qutebrowser settings it would be nice if was configurable in the settings
| 0
|
157,798
| 6,016,006,955
|
IssuesEvent
|
2017-06-07 05:03:07
|
javaee/grizzly
|
https://api.github.com/repos/javaee/grizzly
|
closed
|
Grizzly should warn when the HTTP specification is violated
|
Component: http Priority: Major Type: Improvement
|
Take [https://java.net/jira/browse/JERSEY-845](https://java.net/jira/browse/JERSEY-845) for example: if a user tries returning an entity body along with HTTP 304, Grizzly should either log a warning or throw an exception. Either way, the user should be informed so he/she can fix the defective code.
#### Affected Versions
[2.3.6]
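A sketch of the suggested behaviour (hypothetical helper, not Grizzly's API): warn and drop the body when the status code forbids one, per HTTP/1.1 message-body rules.

```python
import warnings

# Statuses that must not carry an entity body (RFC 7230 §3.3.3).
BODILESS = {204, 304}

def set_body(status, body):
    """Hypothetical helper: warn and drop the body for 204/304."""
    if status in BODILESS and body:
        warnings.warn("HTTP %d must not include an entity body" % status)
        return b""
    return body

assert set_body(304, b"payload") == b""
assert set_body(200, b"payload") == b"payload"
```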
|
1.0
|
Grizzly should warn when the HTTP specification is violated - Take [https://java.net/jira/browse/JERSEY-845](https://java.net/jira/browse/JERSEY-845) for example: if a user tries returning an entity body along with HTTP 304, Grizzly should either log a warning or throw an exception. Either way, the user should be informed so he/she can fix the defective code.
#### Affected Versions
[2.3.6]
|
non_process
|
grizzly should warn when the http specification is violated take for example if a user tries returning an entity body along with http grizzly should either log a warning or throw an exception either way the user should be informed so he she can fix the defective code affected versions
| 0
|
6,151
| 9,024,398,451
|
IssuesEvent
|
2019-02-07 10:26:18
|
DevExpress/testcafe-hammerhead
|
https://api.github.com/repos/DevExpress/testcafe-hammerhead
|
closed
|
Css source map regex is commenting code
|
SYSTEM: resource processing TYPE: bug
|
This [regular expression](https://github.com/DevExpress/testcafe-hammerhead/blob/master/src/processing/style.js#L11) will match all characters of the source map comment even the `*/`, leaving `/*` which can lead to the rest of the style being commented if someone has imported a stylesheet that also contains a source map comment.
Changing it to `/#\s*sourceMappingURL\s*=\s*[^\s]+?(\s|\*\/)/i` fix the issue. I tested the behavior of this Regex with: https://regexr.com/.
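To sanity-check the corrected pattern, here is a small sketch (the sample stylesheet is made up; the regex is the one proposed above) showing that the lazy quantifier stops before the closing `*/`, so stripping the source-map comment no longer swallows the delimiter:

```python
import re

# The corrected pattern from the issue: the lazy [^\s]+? stops at the
# first whitespace or "*/", leaving the comment's closing delimiter.
fixed = re.compile(r"#\s*sourceMappingURL\s*=\s*[^\s]+?(\s|\*/)", re.I)

css = "/*# sourceMappingURL=style.css.map */\n.button { color: red; }"
cleaned = fixed.sub("", css)

# The closing */ survives, so the rule below is not commented out.
assert "*/" in cleaned
assert ".button { color: red; }" in cleaned
```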
|
1.0
|
Css source map regex is commenting code - This [regular expression](https://github.com/DevExpress/testcafe-hammerhead/blob/master/src/processing/style.js#L11) will match all characters of the source map comment even the `*/`, leaving `/*` which can lead to the rest of the style being commented if someone has imported a stylesheet that also contains a source map comment.
Changing it to `/#\s*sourceMappingURL\s*=\s*[^\s]+?(\s|\*\/)/i` fix the issue. I tested the behavior of this Regex with: https://regexr.com/.
|
process
|
css source map regex is commenting code this will match all characters of the source map comment even the leaving which can lead to the rest of the style being commented if someone has imported a stylesheet that also contains a source map comment changing it to s sourcemappingurl s s s i fix the issue i tested the behavior of this regex with
| 1
|
8,772
| 11,888,514,161
|
IssuesEvent
|
2020-03-28 08:55:29
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Missing headers in table results
|
Priority:P1 Querying/Native Querying/Processor Type:Bug
|
**Describe the bug**
Missing headers in table results; this only happens when we refresh the page
**Logs**
<img width="513" alt="Screenshot 2020-03-20 at 12 56 50" src="https://user-images.githubusercontent.com/4472857/77165468-4605ef00-6aaa-11ea-9698-34bef7a84b13.png">
**Screenshots**
<img width="1145" alt="Screenshot 2020-03-20 at 12 55 18" src="https://user-images.githubusercontent.com/4472857/77165402-1a830480-6aaa-11ea-92b6-47a59eb83c5f.png">
**Information about your Metabase Installation:**
You can get this information by going to Admin -> Troubleshooting.
{
"browser-info": {
"language": "en-GB",
"platform": "MacIntel",
"userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36",
"vendor": "Google Inc."
},
"system-info": {
"java.runtime.name": "OpenJDK Runtime Environment",
"java.runtime.version": "11.0.6+10-post-Debian-1bpo91",
"java.vendor": "Debian",
"java.vendor.url": "https://tracker.debian.org/openjdk-11",
"java.version": "11.0.6",
"java.vm.name": "OpenJDK 64-Bit Server VM",
"java.vm.version": "11.0.6+10-post-Debian-1bpo91",
"os.name": "Linux",
"os.version": "4.9.0-7-amd64",
"user.language": "en",
"user.timezone": "UTC"
},
"metabase-info": {
"databases": [
"postgres",
"exasol",
"bigquery",
"h2"
],
"hosting-env": "unknown",
"application-database": "postgres",
"application-database-details": {
"database": {
"name": "PostgreSQL",
"version": "9.6.16"
},
"jdbc-driver": {
"name": "PostgreSQL JDBC Driver",
"version": "42.2.8"
}
},
"run-mode": "prod",
"version": {
"date": "2020-03-16",
"tag": "v0.35.0-rc1",
"branch": "master",
"hash": "022d0ca"
},
"settings": {
"report-timezone": "UTC"
}
}
}
|
1.0
|
Missing headers in table results - **Describe the bug**
Missing headers in table results, only happens when we refresh the page
**Logs**
<img width="513" alt="Screenshot 2020-03-20 at 12 56 50" src="https://user-images.githubusercontent.com/4472857/77165468-4605ef00-6aaa-11ea-9698-34bef7a84b13.png">
**Screenshots**
<img width="1145" alt="Screenshot 2020-03-20 at 12 55 18" src="https://user-images.githubusercontent.com/4472857/77165402-1a830480-6aaa-11ea-92b6-47a59eb83c5f.png">
**Information about your Metabase Installation:**
You can get this information by going to Admin -> Troubleshooting.
{
"browser-info": {
"language": "en-GB",
"platform": "MacIntel",
"userAgent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_3) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/80.0.3987.132 Safari/537.36",
"vendor": "Google Inc."
},
"system-info": {
"java.runtime.name": "OpenJDK Runtime Environment",
"java.runtime.version": "11.0.6+10-post-Debian-1bpo91",
"java.vendor": "Debian",
"java.vendor.url": "https://tracker.debian.org/openjdk-11",
"java.version": "11.0.6",
"java.vm.name": "OpenJDK 64-Bit Server VM",
"java.vm.version": "11.0.6+10-post-Debian-1bpo91",
"os.name": "Linux",
"os.version": "4.9.0-7-amd64",
"user.language": "en",
"user.timezone": "UTC"
},
"metabase-info": {
"databases": [
"postgres",
"exasol",
"bigquery",
"h2"
],
"hosting-env": "unknown",
"application-database": "postgres",
"application-database-details": {
"database": {
"name": "PostgreSQL",
"version": "9.6.16"
},
"jdbc-driver": {
"name": "PostgreSQL JDBC Driver",
"version": "42.2.8"
}
},
"run-mode": "prod",
"version": {
"date": "2020-03-16",
"tag": "v0.35.0-rc1",
"branch": "master",
"hash": "022d0ca"
},
"settings": {
"report-timezone": "UTC"
}
}
}
|
process
|
missing headers in table results describe the bug missing headers in table results only happens when we refresh the page logs img width alt screenshot at src screenshots img width alt screenshot at src information about your metabase installation you can get this information by going to admin troubleshooting browser info language en gb platform macintel useragent mozilla macintosh intel mac os x applewebkit khtml like gecko chrome safari vendor google inc system info java runtime name openjdk runtime environment java runtime version post debian java vendor debian java vendor url java version java vm name openjdk bit server vm java vm version post debian os name linux os version user language en user timezone utc metabase info databases postgres exasol bigquery hosting env unknown application database postgres application database details database name postgresql version jdbc driver name postgresql jdbc driver version run mode prod version date tag branch master hash settings report timezone utc
| 1
|
13,520
| 16,057,044,316
|
IssuesEvent
|
2021-04-23 07:11:27
|
ropensci/software-review-meta
|
https://api.github.com/repos/ropensci/software-review-meta
|
closed
|
Generate package code statistics on submission
|
automation process
|
It would be useful to get package code statistics via `cloc` on submission. A bonus would be if `cloc` could distinguish between raw comments (often indicators of "ghost code") and roxygen comments (which may indicate well-developed documentation).
|
1.0
|
Generate package code statistics on submission - It would be useful to get package code statistics via `cloc` on submission. A bonus would be if `cloc` could distinguish between raw comments (often indicators of "ghost code") and roxygen comments (which may indicate well-developed documentation).
|
process
|
generate package code statistics on submission it would be useful to get package code statistics via cloc on submission a bonus would be if cloc could distinguish between raw comments often indicators of ghost code and roxygen comments which may indicate well developed documentation
| 1
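The roxygen-vs-plain-comment split requested in the record above can be approximated in a few lines. This is a sketch with a hypothetical helper name, not what `cloc` itself does — a real counter would also handle `#` inside strings and other edge cases:

```python
def comment_stats(lines):
    """Split R source lines into roxygen comments (#'), plain comments (#),
    and code. A rough sketch: ignores '#' inside strings and other edge
    cases a real tool such as cloc would handle."""
    stripped = [l.lstrip() for l in lines]
    roxygen = sum(1 for l in stripped if l.startswith("#'"))
    plain = sum(1 for l in stripped
                if l.startswith("#") and not l.startswith("#'"))
    code = sum(1 for l in stripped if l and not l.startswith("#"))
    return {"roxygen": roxygen, "comment": plain, "code": code}

stats = comment_stats([
    "#' @param x a number",      # roxygen: likely real documentation
    "# old_f <- function(x) x",  # plain comment: possible "ghost code"
    "f <- function(x) x + 1",
    "",
])
```

A high plain-comment count relative to roxygen lines would be the "ghost code" indicator the issue mentions.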
|
7,101
| 10,255,875,090
|
IssuesEvent
|
2019-08-21 16:19:02
|
ESMValGroup/ESMValTool
|
https://api.github.com/repos/ESMValGroup/ESMValTool
|
closed
|
Running a batch of (many) recipes
|
enhancement preprocessor
|
Just had an issue raised by @alistairsellar and previously asked by @ledm about running more than one recipe - Alistair asked since there is the case when a model changes and there is the need to run a bunch of the recipes that use that model but with the new model version - it would be inconvenient to run each of them individually. Is there anything in the pipes for this or we should think about approaching such an issue? Also, @mattiarighi could you please add @alistairsellar to the github contributors list (I don't know if I can do it myself) - Alistair is the chief scientific coordinator of the UKESM project and one of the Autoassess originals
|
1.0
|
Running a batch of (many) recipes - Just had an issue raised by @alistairsellar and previously asked by @ledm about running more than one recipe - Alistair asked since there is the case when a model changes and there is the need to run a bunch of the recipes that use that model but with the new model version - it would be inconvenient to run each of them individually. Is there anything in the pipes for this or we should think about approaching such an issue? Also, @mattiarighi could you please add @alistairsellar to the github contributors list (I don't know if I can do it myself) - Alistair is the chief scientific coordinator of the UKESM project and one of the Autoassess originals
|
process
|
running a batch of many recipes just had an issue raised by alistairsellar and previously asked by ledm about running more than one recipe alistair asked since there is the case when a model changes and there is the need to run a bunch of the recipes that use that model but with the new model version it would be inconvenient to run each of them individually is there anything in the pipes for this or we should think about approaching such an issue also mattiarighi could you please add alistairsellar to the github contributors list i don t know if i can do it myself alistair is the chief scientific coordinator of the ukesm project and one of the autoassess originals
| 1
|
21,966
| 30,462,806,244
|
IssuesEvent
|
2023-07-17 08:15:53
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
[MLv2] Migrate app to MLv2
|
.Epic .metabase-lib .Team/QueryProcessor :hammer_and_wrench:
|
`metabase-lib/metadata` contains a set of classes like `Database`, `Schema`, `Table`, `Field`, etc. Each of them is used independently and has its own set of methods. Also, all of them are used to build a single shared `Metadata` class instance.
I believe we shouldn't migrate MLv1 metadata to methods to MLv1 1:1. IMO, we should look into problems they're used for, and look for higher-level methods to address the same problems.
The tasklist groups bigger areas of Metabase working with MLv1 metadata. Crossing out every issue in the list wouldn't let us do `rm -rf metabase-lib/metadata`, but we should be about 99% there. The remaining bit is expected to be really tiny
```[tasklist]
# Tasks
- [ ] [MLv2] Migrate visualizations to MLv2
- [ ] [MLv2] Migrate QB data reference to MLv2
- [ ] [MLv2] Migrate metadata info components to MLv2
- [ ] [MLv2] Migrate model metadata editor to MLv2
- [ ] [MLv2] Migrate dashboards to MLv2
- [ ] [MLv2] Migrate Metabot to MLv2
- [ ] [MLv2] Migrate admin "Table Metadata" page to MLv2
- [ ] [MLv2] Migrate admin "Databases" page to MLv2
- [ ] [MLv2] Migrate admin "Data permissions" page to MLv2
- [ ] [MLv2] Migrate admin CSV uploads settings page to MLv2
- [ ] [MLv2] Migrate `DataSelector` and/or `DataPicker` to MLv2
- [ ] [MLv2] Migrate the Browse page to MLv2
- [ ] [MLv2] Migrate the model detail page to MLv2
- [ ] [MLv2] Migrate query builder to MLv2
- [ ] [MLv2] Update entity loaders for databases, schemas, tables, and fields
- [ ] [MLv2] Replace the `Metadata` object
```
|
1.0
|
[MLv2] Migrate app to MLv2 - `metabase-lib/metadata` contains a set of classes like `Database`, `Schema`, `Table`, `Field`, etc. Each of them is used independently and has its own set of methods. Also, all of them are used to build a single shared `Metadata` class instance.
I believe we shouldn't migrate MLv1 metadata to methods to MLv1 1:1. IMO, we should look into problems they're used for, and look for higher-level methods to address the same problems.
The tasklist groups bigger areas of Metabase working with MLv1 metadata. Crossing out every issue in the list wouldn't let us do `rm -rf metabase-lib/metadata`, but we should be about 99% there. The remaining bit is expected to be really tiny
```[tasklist]
# Tasks
- [ ] [MLv2] Migrate visualizations to MLv2
- [ ] [MLv2] Migrate QB data reference to MLv2
- [ ] [MLv2] Migrate metadata info components to MLv2
- [ ] [MLv2] Migrate model metadata editor to MLv2
- [ ] [MLv2] Migrate dashboards to MLv2
- [ ] [MLv2] Migrate Metabot to MLv2
- [ ] [MLv2] Migrate admin "Table Metadata" page to MLv2
- [ ] [MLv2] Migrate admin "Databases" page to MLv2
- [ ] [MLv2] Migrate admin "Data permissions" page to MLv2
- [ ] [MLv2] Migrate admin CSV uploads settings page to MLv2
- [ ] [MLv2] Migrate `DataSelector` and/or `DataPicker` to MLv2
- [ ] [MLv2] Migrate the Browse page to MLv2
- [ ] [MLv2] Migrate the model detail page to MLv2
- [ ] [MLv2] Migrate query builder to MLv2
- [ ] [MLv2] Update entity loaders for databases, schemas, tables, and fields
- [ ] [MLv2] Replace the `Metadata` object
```
|
process
|
migrate app to metabase lib metadata contains a set of classes like database schema table field etc each of them is used independently and has its own set of methods also all of them are used to build a single shared metadata class instance i believe we shouldn t migrate metadata to methods to imo we should look into problems they re used for and look for higher level methods to address the same problems the tasklist groups bigger areas of metabase working with metadata crossing out every issue in the list wouldn t let us do rm rf metabase lib metadata but we should be about there the remaining bit is expected to be really tiny tasks migrate visualizations to migrate qb data reference to migrate metadata info components to migrate model metadata editor to migrate dashboards to migrate metabot to migrate admin table metadata page to migrate admin databases page to migrate admin data permissions page to migrate admin csv uploads settings page to migrate dataselector and or datapicker to migrate the browse page to migrate the model detail page to migrate query builder to update entity loaders for databases schemas tables and fields replace the metadata object
| 1
|
211,478
| 16,244,358,585
|
IssuesEvent
|
2021-05-07 13:15:08
|
ValveSoftware/steam-for-linux
|
https://api.github.com/repos/ValveSoftware/steam-for-linux
|
closed
|
steam client update broke login on linux mint.
|
Distro Family: Ubuntu Need Retest Steam client
|
#### Your system information
* Steam client version (build number or date):
* Distribution (e.g. Ubuntu):
* Opted into Steam client beta?: [Yes/No]
* Have you checked for system updates?: [Yes/No]
#### Please describe your issue in as much detail as possible:
Describe what you _expected_ should happen and what _did_ happen. Please link any large code pastes as a [Github Gist](https://gist.github.com/)
#### Steps for reproducing this issue:
1.
2.
3.
reset password 3 times plus reset by support 2 times. still tells incorrect password trying to login only happened after last update. been playing star conflict since 2015 with linux mint with no issues & spend $ that I can no longer benefit from.
|
1.0
|
steam client update broke login on linux mint. - #### Your system information
* Steam client version (build number or date):
* Distribution (e.g. Ubuntu):
* Opted into Steam client beta?: [Yes/No]
* Have you checked for system updates?: [Yes/No]
#### Please describe your issue in as much detail as possible:
Describe what you _expected_ should happen and what _did_ happen. Please link any large code pastes as a [Github Gist](https://gist.github.com/)
#### Steps for reproducing this issue:
1.
2.
3.
reset password 3 times plus reset by support 2 times. still tells incorrect password trying to login only happened after last update. been playing star conflict since 2015 with linux mint with no issues & spend $ that I can no longer benefit from.
|
non_process
|
steam client update broke login on linux mint your system information steam client version build number or date distribution e g ubuntu opted into steam client beta have you checked for system updates please describe your issue in as much detail as possible describe what you expected should happen and what did happen please link any large code pastes as a steps for reproducing this issue reset password times plus reset by support times still tells incorrect password trying to login only happened after last update been playing star conflict since with linux mint with no issues spend that i can no longer benefit from
| 0
|
7,228
| 10,361,587,727
|
IssuesEvent
|
2019-09-06 10:23:38
|
prisma/lift
|
https://api.github.com/repos/prisma/lift
|
closed
|
`prisma lift save` should not create a database - `prisma lift up` should
|
bug/2-confirmed kind/bug process/next-milestone
|
Should `lift save` create the database or be stateless and let `lift up` do the mutative work?
Currently, for SQLite, running `lift up` without a DB throws.
|
1.0
|
`prisma lift save` should not create a database - `prisma lift up` should - Should `lift save` create the database or be stateless and let `lift up` do the mutative work?
Currently, for SQLite, running `lift up` without a DB throws.
|
process
|
prisma lift save should not create a database prisma lift up should should lift save create the database or be stateless and let lift up do the mutative work currently for sqlite running lift up without a db throws
| 1
|
21,999
| 30,501,372,726
|
IssuesEvent
|
2023-07-18 14:09:16
|
kulturpass-de/kulturpass-app
|
https://api.github.com/repos/kulturpass-de/kulturpass-app
|
closed
|
Remove Facebook Flipper Tracker
|
In Process
|
Please do not surveil us and submit this data to US companies
See report:
https://reports.exodus-privacy.eu.org/en/reports/de.bkm.kulturpass/latest/
Description of the Tracker
https://reports.exodus-privacy.eu.org/en/trackers/392/
i believe it is needed for #1
|
1.0
|
Remove Facebook Flipper Tracker - Please do not surveil us and submit this data to US companies
See report:
https://reports.exodus-privacy.eu.org/en/reports/de.bkm.kulturpass/latest/
Description of the Tracker
https://reports.exodus-privacy.eu.org/en/trackers/392/
i believe it is needed for #1
|
process
|
remove facebook flipper tracker please do not surveillance us and submit this data to us companies see report description of the tracker i believe it is needed for
| 1
|
413,799
| 27,969,989,749
|
IssuesEvent
|
2023-03-25 00:25:24
|
etdds/esp-idf-lvgl-displays
|
https://api.github.com/repos/etdds/esp-idf-lvgl-displays
|
closed
|
Typo in documentation
|
documentation
|
Thanks for the library !
You have a typo in the documentation, in the Direct Component and Submodule Component instructions

|
1.0
|
Typo in documentation - Thanks for the library !
You have a type in the documentation, for Direct Component and Submodule Component instructions

|
non_process
|
typo in documentation thanks for the library you have a type in the documentation for direct component and submodule component instructions
| 0
|
6,950
| 10,113,524,470
|
IssuesEvent
|
2019-07-30 16:56:09
|
material-components/material-components-ios
|
https://api.github.com/repos/material-components/material-components-ios
|
closed
|
[BottomAppBar] Define shadows for Bottom App Bar cut out layer
|
[BottomAppBar] skill:Quartz Core type:Process
|
Design needs to define shadows for Bottom App Bar cut out area.
<!-- Auto-generated content below, do not modify -->
---
#### Internal data
- Associated internal bug: [b/117179585](http://b/117179585)
|
1.0
|
[BottomAppBar] Define shadows for Bottom App Bar cut out layer - Design needs to define shadows for Bottom App Bar cut out area.
<!-- Auto-generated content below, do not modify -->
---
#### Internal data
- Associated internal bug: [b/117179585](http://b/117179585)
|
process
|
define shadows for bottom app bar cut out layer design needs to define shadows for bottom app bar cut out area internal data associated internal bug
| 1
|
211,915
| 7,209,346,402
|
IssuesEvent
|
2018-02-07 08:20:08
|
minio/minio
|
https://api.github.com/repos/minio/minio
|
closed
|
Uploading file to default region results in a vague exception
|
priority: medium
|
As suggested by the reply [to my issue on the boto repo](https://github.com/boto/boto3/issues/1425), I'm copying over the issue here. The reply suggests that `minio` is not accurately mimicking the AWS S3 response when no region is provided in the `boto` resource constructor.
# Error description
Trying to upload a file to the wrong S3 region results in a vague exception regarding an "invalid literal". This happens when the region is not specified in the Boto resource constructor. Specifying the region solves this error. If possible, it would be helpful to the user if this exception was made clearer.
# Environment
I'm using Python 3.5, with the following `boto` versions. The server is running `minio` ([Github repo](https://github.com/minio/minio)).
```
> pip3.5 list | grep boto
boto (2.48.0)
boto3 (1.5.20)
botocore (1.8.34)
```
# Exception trace
```python
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 397, in _update_chunk_length
self.chunk_left = int(line, 16)
ValueError: invalid literal for int() with base 16: b''
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "uploader.py", line 54, in <module>
s3.Bucket("my-bucket").put_object(Key=file_key, Body=open(f, 'rb'), ContentType='image/png', ACL="public-read")
File "/usr/local/lib/python3.5/dist-packages/boto3/resources/factory.py", line 520, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 317, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 602, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 143, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 172, in _send_request
success_response, exception):
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 265, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 227, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 210, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 269, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 213, in _get_response
proxies=self.proxies, timeout=self.timeout)
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/sessions.py", line 605, in send
r.content
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/models.py", line 750, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/models.py", line 673, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 303, in stream
for line in self.read_chunked(amt, decode_content=decode_content):
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 447, in read_chunked
self._update_chunk_length()
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 401, in _update_chunk_length
raise httplib.IncompleteRead(line)
http.client.IncompleteRead: IncompleteRead(0 bytes read)
```
## Your Environment
```
VERSION
2018-01-18T20:33:21Z
MEMORY
Used: 3.4 MB | Allocated: 1.1 GB | Used-Heap: 3.4 MB | Allocated-Heap: 10 MB
PLATFORM
Host: c0eee334ae9d | OS: linux | Arch: amd64
RUNTIME
Version: go1.9.1 | CPUs: 8
```
|
1.0
|
Uploading file to default region results in a vague exception - As suggested by the reply [to my issue on the boto repo](https://github.com/boto/boto3/issues/1425), I'm copying over the issue here. The reply suggests that `minio` is not accurately mimicking the AWS S3 response when no region is provided in the `boto` resource constructor.
# Error description
Trying to upload a file to the wrong S3 region results in a vague exception regarding an "invalid literal". This happens when the region is not specified in the Boto resource constructor. Specifying the region solves this error. If possible, it would be helpful to the user if this exception was made clearer.
# Environment
I'm using Python 3.5, with the following `boto` versions. The server is running `minio` ([Github repo](https://github.com/minio/minio)).
```
> pip3.5 list | grep boto
boto (2.48.0)
boto3 (1.5.20)
botocore (1.8.34)
```
# Exception trace
```python
Traceback (most recent call last):
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 397, in _update_chunk_length
self.chunk_left = int(line, 16)
ValueError: invalid literal for int() with base 16: b''
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "uploader.py", line 54, in <module>
s3.Bucket("my-bucket").put_object(Key=file_key, Body=open(f, 'rb'), ContentType='image/png', ACL="public-read")
File "/usr/local/lib/python3.5/dist-packages/boto3/resources/factory.py", line 520, in do_action
response = action(self, *args, **kwargs)
File "/usr/local/lib/python3.5/dist-packages/boto3/resources/action.py", line 83, in __call__
response = getattr(parent.meta.client, operation_name)(**params)
File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 317, in _api_call
return self._make_api_call(operation_name, kwargs)
File "/usr/local/lib/python3.5/dist-packages/botocore/client.py", line 602, in _make_api_call
operation_model, request_dict)
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 143, in make_request
return self._send_request(request_dict, operation_model)
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 172, in _send_request
success_response, exception):
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 265, in _needs_retry
caught_exception=caught_exception, request_dict=request_dict)
File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 227, in emit
return self._emit(event_name, kwargs)
File "/usr/local/lib/python3.5/dist-packages/botocore/hooks.py", line 210, in _emit
response = handler(**kwargs)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 183, in __call__
if self._checker(attempts, response, caught_exception):
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 251, in __call__
caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 269, in _should_retry
return self._checker(attempt_number, response, caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 317, in __call__
caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 223, in __call__
attempt_number, caught_exception)
File "/usr/local/lib/python3.5/dist-packages/botocore/retryhandler.py", line 359, in _check_caught_exception
raise caught_exception
File "/usr/local/lib/python3.5/dist-packages/botocore/endpoint.py", line 213, in _get_response
proxies=self.proxies, timeout=self.timeout)
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/sessions.py", line 605, in send
r.content
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/models.py", line 750, in content
self._content = bytes().join(self.iter_content(CONTENT_CHUNK_SIZE)) or bytes()
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/models.py", line 673, in generate
for chunk in self.raw.stream(chunk_size, decode_content=True):
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 303, in stream
for line in self.read_chunked(amt, decode_content=decode_content):
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 447, in read_chunked
self._update_chunk_length()
File "/usr/local/lib/python3.5/dist-packages/botocore/vendored/requests/packages/urllib3/response.py", line 401, in _update_chunk_length
raise httplib.IncompleteRead(line)
http.client.IncompleteRead: IncompleteRead(0 bytes read)
```
## Your Environment
```
VERSION
2018-01-18T20:33:21Z
MEMORY
Used: 3.4 MB | Allocated: 1.1 GB | Used-Heap: 3.4 MB | Allocated-Heap: 10 MB
PLATFORM
Host: c0eee334ae9d | OS: linux | Arch: amd64
RUNTIME
Version: go1.9.1 | CPUs: 8
```
|
non_process
|
uploading file to default region results in a vague exception as suggested by the reply i m copying over the issue here the reply suggests that minio is not accurately mimicking the aws response when no region is provided in the boto resource constructor error description trying to upload a file to the wrong region results in a vague exception regarding an invalid literal this happens when the region is not specified in the boto resource constructor specifying the region solves this error if possible it would be helpful to the user if this exception was made clearer environment i m using python with the following boto versions the server is running minio list grep boto boto botocore exception trace python traceback most recent call last file usr local lib dist packages botocore vendored requests packages response py line in update chunk length self chunk left int line valueerror invalid literal for int with base b during handling of the above exception another exception occurred traceback most recent call last file uploader py line in bucket my bucket put object key file key body open f rb contenttype image png acl public read file usr local lib dist packages resources factory py line in do action response action self args kwargs file usr local lib dist packages resources action py line in call response getattr parent meta client operation name params file usr local lib dist packages botocore client py line in api call return self make api call operation name kwargs file usr local lib dist packages botocore client py line in make api call operation model request dict file usr local lib dist packages botocore endpoint py line in make request return self send request request dict operation model file usr local lib dist packages botocore endpoint py line in send request success response exception file usr local lib dist packages botocore endpoint py line in needs retry caught exception caught exception request dict request dict file usr local lib dist packages 
botocore hooks py line in emit return self emit event name kwargs file usr local lib dist packages botocore hooks py line in emit response handler kwargs file usr local lib dist packages botocore retryhandler py line in call if self checker attempts response caught exception file usr local lib dist packages botocore retryhandler py line in call caught exception file usr local lib dist packages botocore retryhandler py line in should retry return self checker attempt number response caught exception file usr local lib dist packages botocore retryhandler py line in call caught exception file usr local lib dist packages botocore retryhandler py line in call attempt number caught exception file usr local lib dist packages botocore retryhandler py line in check caught exception raise caught exception file usr local lib dist packages botocore endpoint py line in get response proxies self proxies timeout self timeout file usr local lib dist packages botocore vendored requests sessions py line in send r content file usr local lib dist packages botocore vendored requests models py line in content self content bytes join self iter content content chunk size or bytes file usr local lib dist packages botocore vendored requests models py line in generate for chunk in self raw stream chunk size decode content true file usr local lib dist packages botocore vendored requests packages response py line in stream for line in self read chunked amt decode content decode content file usr local lib dist packages botocore vendored requests packages response py line in read chunked self update chunk length file usr local lib dist packages botocore vendored requests packages response py line in update chunk length raise httplib incompleteread line http client incompleteread incompleteread bytes read your environment version memory used mb allocated gb used heap mb allocated heap mb platform host os linux arch runtime version cpus
| 0
|
15,831
| 20,021,302,537
|
IssuesEvent
|
2022-02-01 16:37:37
|
jessestewart1/nrn-rrn
|
https://api.github.com/repos/jessestewart1/nrn-rrn
|
closed
|
Process SK 2021
|
complete processing
|
**Description of tasks**
Process SK 2021 data for release as an NRN product.
- [x] update field mapping yaml(s)
- [x] process SK 2021 data
- [x] update release notes and sphinx documentation
- [x] copy updated yamls to `src/export/distribution_docs`
- [x] copy updated rsts to `docs/source`
- [x] copy data to server
- [x] confirm WMS updates and publication to Open Maps
|
1.0
|
Process SK 2021 - **Description of tasks**
Process SK 2021 data for release as an NRN product.
- [x] update field mapping yaml(s)
- [x] process SK 2021 data
- [x] update release notes and sphinx documentation
- [x] copy updated yamls to `src/export/distribution_docs`
- [x] copy updated rsts to `docs/source`
- [x] copy data to server
- [x] confirm WMS updates and publication to Open Maps
|
process
|
process sk description of tasks process sk data for release as an nrn product update field mapping yaml s process sk data update release notes and sphinx documentation copy updated yamls to src export distribution docs copy updated rsts to docs source copy data to server confirm wms updates and publication to open maps
| 1
|
16,268
| 20,862,997,535
|
IssuesEvent
|
2022-03-22 02:09:32
|
streamnative/flink
|
https://api.github.com/repos/streamnative/flink
|
closed
|
[FLINK-26160][DOC][BUG] 1.14 Pulsar Source setUnboundedStartCursor description should be changed
|
compute/data-processing
|
Only when partition discovery is disabled, the source will stop at the specified stopCursor. The Java Doc needs to be updated
|
1.0
|
[FLINK-26160][DOC][BUG] 1.14 Pulsar Source setUnboundedStartCursor description should be changed - Only when partition discovery is disabled, the source will stop at the specified stopCursor. The Java Doc needs to be updated
|
process
|
pulsar source setunboundedstartcursor description should be changed only when partition discovery is disabled the source will stop at the specified stopcursor the java doc needs to be updated
| 1
|
243,379
| 7,857,184,505
|
IssuesEvent
|
2018-06-21 09:56:29
|
python/mypy
|
https://api.github.com/repos/python/mypy
|
closed
|
Invalid type inferred for list indexing with Any
|
bug false-positive priority-0-high topic-overloads
|
The revealed type for the following program is `List[Any]`, while the correct type would be `Any` -- looks like the wrong overload item is picked:
```py
from typing import Any, List
i: Any
a: List[Any]
reveal_type(a[i]) # List[Any], but should be Any
```
Here is a self-contained repro:
```py
from typing import Any, overload, TypeVar, Generic
T = TypeVar('T')
class A(Generic[T]):
@overload
def f(self, x: int) -> T: ...
@overload
def f(self, x: slice) -> A[T]: ...
def f(self, x): ...
i: Any
a: A[Any]
reveal_type(a.f(i)) # A[Any], but should be Any
```
|
1.0
|
Invalid type inferred for list indexing with Any - The revealed type for the following program is `List[Any]`, while the correct type would be `Any` -- looks like the wrong overload item is picked:
```py
from typing import Any, List
i: Any
a: List[Any]
reveal_type(a[i]) # List[Any], but should be Any
```
Here is a self-contained repro:
```py
from typing import Any, overload, TypeVar, Generic
T = TypeVar('T')
class A(Generic[T]):
@overload
def f(self, x: int) -> T: ...
@overload
def f(self, x: slice) -> A[T]: ...
def f(self, x): ...
i: Any
a: A[Any]
reveal_type(a.f(i)) # A[Any], but should be Any
```
|
non_process
|
invalid type inferred for list indexing with any the revealed type for the following program is list while the correct type would be any looks like the wrong overload item is picked py from typing import any list i any a list reveal type a list but should be any here is a self contained repro py from typing import any overload typevar generic t typevar t class a generic overload def f self x int t overload def f self x slice a def f self x i any a a reveal type a f i a but should be any
| 0
|
479,812
| 13,805,916,681
|
IssuesEvent
|
2020-10-11 15:35:45
|
AY2021S1-CS2113T-F14-2/tp
|
https://api.github.com/repos/AY2021S1-CS2113T-F14-2/tp
|
closed
|
Add ListCommand
|
priority.High type.Story
|
As a careful student, I want to view all expenses individually so that I can trace each expense easily.
|
1.0
|
Add ListCommand - As a careful student, I want to view all expenses individually so that I can trace each expense easily.
|
non_process
|
add listcommand as a careful student i want to view all expenses individually so that i can trace each expense easily
| 0
|
20,151
| 26,701,865,329
|
IssuesEvent
|
2023-01-27 15:00:18
|
ORNL-AMO/AMO-Tools-Desktop
|
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
|
closed
|
Bug in Process Heating - Cooling
|
bug Process Heating important WebAssembly
|
Not sure what is the problem exactly, but things aren't hooked up right.
Example
Web

vs Desktop

Should be zero
Web

vs desktop

|
1.0
|
Bug in Process Heating - Cooling - Not sure what is the problem exactly, but things aren't hooked up right.
Example
Web

vs Desktop

Should be zero
Web

vs desktop

|
process
|
bug in process heating cooling not sure what is the problem exactly but things aren t hooked up right example web vs desktop should be zero web vs desktop
| 1
|
22,326
| 30,913,294,881
|
IssuesEvent
|
2023-08-05 01:35:01
|
h4sh5/pypi-auto-scanner
|
https://api.github.com/repos/h4sh5/pypi-auto-scanner
|
opened
|
pih 1.48036 has 2 GuardDog issues
|
guarddog typosquatting silent-process-execution
|
https://pypi.org/project/pih
https://inspector.pypi.io/project/pih
```{
"dependency": "pih",
"version": "1.48036",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pip, pid",
"silent-process-execution": [
{
"location": "pih-1.48036/pih/tools.py:781",
"code": " result = subprocess.run(command, stdin=subprocess.DEVNULL, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpnwbps_tp/pih"
}
}```
|
1.0
|
pih 1.48036 has 2 GuardDog issues - https://pypi.org/project/pih
https://inspector.pypi.io/project/pih
```{
"dependency": "pih",
"version": "1.48036",
"result": {
"issues": 2,
"errors": {},
"results": {
"typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: pip, pid",
"silent-process-execution": [
{
"location": "pih-1.48036/pih/tools.py:781",
"code": " result = subprocess.run(command, stdin=subprocess.DEVNULL, stdout=subprocess.DEVNULL, stderr=subprocess.DEVNULL)",
"message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null"
}
]
},
"path": "/tmp/tmpnwbps_tp/pih"
}
}```
|
process
|
pih has guarddog issues dependency pih version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt pip pid silent process execution location pih pih tools py code result subprocess run command stdin subprocess devnull stdout subprocess devnull stderr subprocess devnull message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmpnwbps tp pih
| 1
|
15,474
| 19,684,921,751
|
IssuesEvent
|
2022-01-11 20:55:26
|
MicrosoftDocs/azure-devops-docs
|
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
|
closed
|
Add a warning about not being able to reference pipelines in an artifact
|
doc-enhancement devops/prod Pri2 devops-cicd-process/tech
|
In section **Insert a template** the second paragraph states:
`Template files need to exist on your filesystem at the start of a pipeline run. You can't reference templates in an artifact`
I think this should be made more explicit by changing it to either an information or warning box, as this information is very useful when starting to work with pipeline resources and could save a lot of time.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6724abea-bbdc-bf66-ed5e-3214fa6c3e66
* Version Independent ID: 4f8dab21-3f0e-da32-cc0e-1d85c13c0065
* Content: [Templates - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops)
* Content Source: [docs/pipelines/process/templates.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/templates.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
1.0
|
Add a warning about not being able to reference pipelines in an artifact -
In section **Insert a template** the second paragraph states:
`Template files need to exist on your filesystem at the start of a pipeline run. You can't reference templates in an artifact`
I think this should be made more explicit by changing it to either an information or warning box, as this information is very useful when starting to work with pipeline resources and could save a lot of time.
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6724abea-bbdc-bf66-ed5e-3214fa6c3e66
* Version Independent ID: 4f8dab21-3f0e-da32-cc0e-1d85c13c0065
* Content: [Templates - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops)
* Content Source: [docs/pipelines/process/templates.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/templates.md)
* Product: **devops**
* Technology: **devops-cicd-process**
* GitHub Login: @juliakm
* Microsoft Alias: **jukullam**
|
process
|
add a warning about not being able to reference pipelines in an artifact in section insert a template the second paragraph states template files need to exist on your filesystem at the start of a pipeline run you can t reference templates in an artifact i think this should be made more explicit by changing it to either an information or warning box as this information is very useful when starting to work with pipeline resources and could save a lot of time document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id bbdc version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
| 1
|
12,409
| 14,917,270,268
|
IssuesEvent
|
2021-01-22 19:34:57
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
api to get arm architecture of the binary as in the download page...
|
feature request help wanted process
|
<!--
Thank you for reporting an issue.
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
- **Version**: v6.3.0
- **Platform**: linux-arm\*
- **Subsystem**: API?
<!-- Enter your issue details below this comment. -->
I couldn't find any api to get the architecture of the binary itself,
`process.arch` just gives `arm`
`uname -a` would give for the host.. not the binary
`node --v8-options` gives `target arm v7..` , and a lot of output..
I want it for the binary as it is on the download page.. `arm64` `armv6l` `armv7l`
I dumped and grepped the `process` object and found.. `process.config.variables.arm_version` among others...
Checking the api doc for `process.config` says it is not read-only and gives some warning about some packages changing it..
Could an api be added for this, or should we just parse this from `process.config.variables.arm_*` or `--v8-options` or any other suggestions..
|
1.0
|
api to get arm architecture of the binary as in the download page... - <!--
Thank you for reporting an issue.
Please fill in as much of the template below as you're able.
Version: output of `node -v`
Platform: output of `uname -a` (UNIX), or version and 32 or 64-bit (Windows)
Subsystem: if known, please specify affected core module name
If possible, please provide code that demonstrates the problem, keeping it as
simple and free of external dependencies as you are able.
-->
- **Version**: v6.3.0
- **Platform**: linux-arm\*
- **Subsystem**: API?
<!-- Enter your issue details below this comment. -->
I couldn't find any api to get the architecture of the binary itself,
`process.arch` just gives `arm`
`uname -a` would give for the host.. not the binary
`node --v8-options` gives `target arm v7..` , and a lot of output..
I want it for the binary as it is on the download page.. `arm64` `armv6l` `armv7l`
I dumped and grepped the `process` object and found.. `process.config.variables.arm_version` among others...
Checking the api doc for `process.config` says it is not read-only and gives some warning about some packages changing it..
Could an api be added for this, or should we just parse this from `process.config.variables.arm_*` or `--v8-options` or any other suggestions..
|
process
|
api to get arm architecture of the binary as in the download page thank you for reporting an issue please fill in as much of the template below as you re able version output of node v platform output of uname a unix or version and or bit windows subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version platform linux arm subsystem api i couldn t find any api to get the architecture of the binary itself process arch just gives arm uname a would give for the host not the binary node options gives target arm and a lot of output i want it for the binary as it is on the download page i dumped and grepped the process object and found process config variables arm version among others checking the api doc for process config says it is not read only and gives some warning about some packages changing it could an api be added for this or should we just parse this from process config variables arm or options or any other suggestions
| 1
|
14,940
| 18,389,267,244
|
IssuesEvent
|
2021-10-12 01:54:24
|
CodeForPhilly/paws-data-pipeline
|
https://api.github.com/repos/CodeForPhilly/paws-data-pipeline
|
closed
|
nginx exposes the need for app restructuring
|
Async processes
|
**Abstract**: _We need to separate handling web requests from the long-running processes._
In testing Dan's [nginx_setup](https://github.com/CodeForPhilly/paws-data-pipeline/tree/nginx_setup) branch, I'm seeing timeouts from **nginx** when running `/api/execute` on the full data set. **nginx** has a timeout setting of 60 seconds for the response from the server, then sends a 504:
```<html>
<head><title>504 Gateway Time-out</title></head>
<body>
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.19.6</center>
</body>
</html>
```
`handleExecute()` is expecting JSON back and doesn't check the response status before trying to parse the html, resulting in an uncaught error.
The bigger issue is that the execute takes so long to run (right at an hour on my machine for 95937 rows). There's probably room for improvements in matching speed but it's likely to always be minutes at best, and it's bad practice to leave web requests hanging out that long. Dialing up the nginx timeout could only be a temporary hack as we need to provide some feedback to the user on what's going on. (We can watch the logs when running locally but the user will have no way to know if things are still running or not.)
Here's one idea based on something I've done before:
- Move the execute (and probably intake as well) process into its own container, 'processing'
- Set up a work queue using redis and [RQ](https://python-rq.org/)
- Pass intake/execute jobs from the API server to the processing container. To avoid shared storage, the API server can drop the uploaded files as blobs into the DB.
- The process in the processing container can pull jobs from the work queue, grab the files from the DB, and start working.
- The processing container can have a message queue going back to the API server, and the React client can poll for status, at which point the API server sends back the latest status ('**34% done**') to display to the user.
I'm happy to sketch out in more detail.
|
1.0
|
nginx exposes the need for app restructuring - **Abstract**: _We need to separate handling web requests from the long-running processes._
In testing Dan's [nginx_setup](https://github.com/CodeForPhilly/paws-data-pipeline/tree/nginx_setup) branch, I'm seeing timeouts from **nginx** when running `/api/execute` on the full data set. **nginx** has a timeout setting of 60 seconds for the response from the server, then sends a 504:
```<html>
<head><title>504 Gateway Time-out</title></head>
<body>
<center><h1>504 Gateway Time-out</h1></center>
<hr><center>nginx/1.19.6</center>
</body>
</html>
```
`handleExecute()` is expecting JSON back and doesn't check the response status before trying to parse the html, resulting in an uncaught error.
The bigger issue is that the execute takes so long to run (right at an hour on my machine for 95937 rows). There's probably room for improvements in matching speed but it's likely to always be minutes at best, and it's bad practice to leave web requests hanging out that long. Dialing up the nginx timeout could only be a temporary hack as we need to provide some feedback to the user on what's going on. (We can watch the logs when running locally but the user will have no way to know if things are still running or not.)
Here's one idea based on something I've done before:
- Move the execute (and probably intake as well) process into its own container, 'processing'
- Set up a work queue using redis and [RQ](https://python-rq.org/)
- Pass intake/execute jobs from the API server to the processing container. To avoid shared storage, the API server can drop the uploaded files as blobs into the DB.
- The process in the processing container can pull jobs from the work queue, grab the files from the DB, and start working.
- The processing container can have a message queue going back to the API server, and the React client can poll for status, at which point the API server sends back the latest status ('**34% done**') to display to the user.
I'm happy to sketch out in more detail.
|
process
|
nginx exposes the need for app restructuring abstract we need to separate handling web requests from the long running processes in testing dan s branch i m seeing timeouts from nginx when running api execute on the full data set nginx has a timeout setting of seconds for the response from the server then sends a gateway time out gateway time out nginx handleexecute is expecting json back and doesn t check the response status before trying to parse the html resulting in an uncaught error the bigger issue is that the execute takes so long to run right at an hour on my machine for rows there s probably room for improvements in matching speed but it s likely to always be minutes at best and it s bad practice to leave web requests hanging out that long dialing up the nginx timeout could only be a temporary hack as we need to provide some feedback to the user on what s going on we can watch the logs when running locally but the user will have no way to know if things are still running or not here s one idea based on something i ve done before move the execute and probably intake as well process into its own container processing set up a work queue using redis and pass intake execute jobs from the api server to the processing container to avoid shared storage the api server can drop the uploaded files as blobs into the db the process in the processing container can pull jobs from the work queue grab the files from the db and start working the processing container can have a message queue going back to the api server and the react client can poll for status at which point the api server sends back the latest status done to display to the user i m happy to sketch out in more detail
| 1
|
690,810
| 23,673,199,715
|
IssuesEvent
|
2022-08-27 17:34:59
|
thoth-station/user-api
|
https://api.github.com/repos/thoth-station/user-api
|
reopened
|
Provide license information for each package on API endpoints
|
kind/feature priority/important-soon lifecycle/rotten triage/accepted
|
**Is your feature request related to a problem? Please describe.**
As a user of Thoth services, I would like to consume information about licensing from user API endpoints.
**Describe the solution you'd like**
- [ ] the license information should be reported based on the thoth-license-solver run report
- [ ] extend thoth-storages to provide a method that queries the database for license information
- [ ] provide an API endpoint that will return license information for the given (package name, package version, index url) triplet
|
1.0
|
Provide license information for each package on API endpoints - **Is your feature request related to a problem? Please describe.**
As a user of Thoth services, I would like to consume information about licensing from user API endpoints.
**Describe the solution you'd like**
- [ ] the license information should be reported based on the thoth-license-solver run report
- [ ] extend thoth-storages to provide a method that queries the database for license information
- [ ] provide an API endpoint that will return license information for the given (package name, package version, index url) triplet
|
non_process
|
provide license information for each package on api endpoints is your feature request related to a problem please describe as a user of thoth services i would like to consume information about licensing from user api endpoints describe the solution you d like the license information should be reported based on the thoth license solver run report extend thoth storages to provide a method that queries the database for license information provide an api endpoint that will return license information for the given package name package version index url triplet
| 0
|
6,081
| 2,582,916,530
|
IssuesEvent
|
2015-02-15 19:57:27
|
Krasnyanskiy/jrsh
|
https://api.github.com/repos/Krasnyanskiy/jrsh
|
closed
|
As User I want to see username credentials in the password prompt for replication command
|
enhancement high priority in progress
|
#### Current implementation
```bash
Please enter the password for JRS-3 JRS: *********
```
Expected implementation
```bash
Please enter the password for <username> at <profile name> environment: *********
```
|
1.0
|
As User I want to see username credentials in the password prompt for replication command - #### Current implementation
```bash
Please enter the password for JRS-3 JRS: *********
```
Expected implementation
```bash
Please enter the password for <username> at <profile name> environment: *********
```
|
non_process
|
as user i want to see username credentials in the password prompt for replication command current implementation bash please enter the password for jrs jrs expected implementation bash please enter the password for at environment
| 0
|
178,390
| 21,509,383,478
|
IssuesEvent
|
2022-04-28 01:35:11
|
bsbtd/Teste
|
https://api.github.com/repos/bsbtd/Teste
|
closed
|
CVE-2012-0881 (High) detected in xercesImpl-2.9.1.jar - autoclosed
|
security vulnerability
|
## CVE-2012-0881 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xercesImpl-2.9.1.jar</b></p></summary>
<p>Xerces2 is the next generation of high performance, fully compliant XML parsers in the
Apache Xerces family. This new version of Xerces introduces the Xerces Native Interface (XNI),
a complete framework for building parser components and configurations that is extremely
modular and easy to program.</p>
<p>Path to vulnerable library: /.9.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **xercesImpl-2.9.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/bsbtd/Teste/commit/64dde89c50c07496423c4d4a865f2e16b92399ad">64dde89c50c07496423c4d4a865f2e16b92399ad</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Xerces2 Java Parser before 2.12.0 allows remote attackers to cause a denial of service (CPU consumption) via a crafted message to an XML service, which triggers hash table collisions.
<p>Publish Date: 2017-10-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2012-0881>CVE-2012-0881</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-0881">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-0881</a></p>
<p>Release Date: 2017-10-30</p>
<p>Fix Resolution: 2.12.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2012-0881 (High) detected in xercesImpl-2.9.1.jar - autoclosed - ## CVE-2012-0881 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xercesImpl-2.9.1.jar</b></p></summary>
<p>Xerces2 is the next generation of high performance, fully compliant XML parsers in the
Apache Xerces family. This new version of Xerces introduces the Xerces Native Interface (XNI),
a complete framework for building parser components and configurations that is extremely
modular and easy to program.</p>
<p>Path to vulnerable library: /.9.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **xercesImpl-2.9.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/bsbtd/Teste/commit/64dde89c50c07496423c4d4a865f2e16b92399ad">64dde89c50c07496423c4d4a865f2e16b92399ad</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache Xerces2 Java Parser before 2.12.0 allows remote attackers to cause a denial of service (CPU consumption) via a crafted message to an XML service, which triggers hash table collisions.
<p>Publish Date: 2017-10-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2012-0881>CVE-2012-0881</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-0881">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2012-0881</a></p>
<p>Release Date: 2017-10-30</p>
<p>Fix Resolution: 2.12.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_process
|
cve high detected in xercesimpl jar autoclosed cve high severity vulnerability vulnerable library xercesimpl jar is the next generation of high performance fully compliant xml parsers in the apache xerces family this new version of xerces introduces the xerces native interface xni a complete framework for building parser components and configurations that is extremely modular and easy to program path to vulnerable library jar dependency hierarchy x xercesimpl jar vulnerable library found in head commit a href vulnerability details apache java parser before allows remote attackers to cause a denial of service cpu consumption via a crafted message to an xml service which triggers hash table collisions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
16,766
| 21,940,111,847
|
IssuesEvent
|
2022-05-23 17:10:24
|
jgraley/inferno-cpp2v
|
https://api.github.com/repos/jgraley/inferno-cpp2v
|
closed
|
forces to be passed into solver using `set`
|
Constraint Processing
|
not `vector`. This is more correct, since their order plays no role, and will make `ReferenceSolver::Plan::DeduceVariables()` and `AndRuleEngine::Plan::Plan()` a bit simpler.
|
1.0
|
forces to be passed into solver using `set` - not `vector`. This is more correct, since their order plays no role, and will make `ReferenceSolver::Plan::DeduceVariables()` and `AndRuleEngine::Plan::Plan()` a bit simpler.
|
process
|
forces to be passed into solver using set not vector this is more correct since their order plays no role and will make referencesolver plan deducevariables and andruleengine plan plan a bit simpler
| 1
|
12,417
| 3,074,386,451
|
IssuesEvent
|
2015-08-20 06:50:10
|
excelsior-oss/restler
|
https://api.github.com/repos/excelsior-oss/restler
|
closed
|
Rename org.restler.http.Executor into RequestExecutor
|
cosmetics design
|
org.restler.http.Executor clashes with java.util.concurrent.Executor, which introduces some confusion in the project.
|
1.0
|
Rename org.restler.http.Executor into RequestExecutor - org.restler.http.Executor clashes with java.util.concurrent.Executor, which introduces some confusion in the project.
|
non_process
|
rename org restler http executor into requestexecutor org restler http executor clashes with java util concurrent executor which introduces some confusion in the project
| 0
|
7,164
| 10,311,060,370
|
IssuesEvent
|
2019-08-29 16:26:40
|
prisma/prisma2
|
https://api.github.com/repos/prisma/prisma2
|
closed
|
Updating data in M:N relation
|
bug/2-confirmed kind/bug process/candidate
|
I have two models:
```
model Post {
id String @id @default(cuid())
title String
categories Category[]
}
model Category {
id String @id @default(cuid())
name String
posts Post[]
}
```
Category is defined before the post is created. Creating post itself is fine, the connections in `_CategoryToPost` table are created. The problem arises when trying to update the post categories. What happens is that the old ones are not deleted and the new ones are added to the `_CategoryToPost` table. So if I update post and I don't make any change to the post categories, they duplicate in the `_CategoryToPost` table.
The code for updating is following:
```
const updatedPost = await photon.posts.update({
where: { id: args.id },
data: {
title: args.title,
categories: {
connect: args.categories.map(categoryId => ({
id: categoryId
}))
}
}
});
```
|
1.0
|
Updating data in M:N relation - I have two models:
```
model Post {
id String @id @default(cuid())
title String
categories Category[]
}
model Category {
id String @id @default(cuid())
name String
posts Post[]
}
```
Category is defined before the post is created. Creating post itself is fine, the connections in `_CategoryToPost` table are created. The problem arises when trying to update the post categories. What happens is that the old ones are not deleted and the new ones are added to the `_CategoryToPost` table. So if I update post and I don't make any change to the post categories, they duplicate in the `_CategoryToPost` table.
The code for updating is following:
```
const updatedPost = await photon.posts.update({
where: { id: args.id },
data: {
title: args.title,
categories: {
connect: args.categories.map(categoryId => ({
id: categoryId
}))
}
}
});
```
|
process
|
updating data in m n relation i have two models model post id string id default cuid title string categories category model category id string id default cuid name string posts post category is defined before the post is created creating post itself is fine the connections in categorytopost table are created the problem arises when trying to update the post categories what happens is that the old ones are not deleted and the new ones are added to the categorytopost table so if i update post and i don t make any change to the post categories they duplicate in the categorytopost table the code for updating is following const updatedpost await photon posts update where id args id data title args title categories connect args categories map categoryid id categoryid
| 1
|
45,357
| 7,179,114,601
|
IssuesEvent
|
2018-01-31 18:36:13
|
caseyyee/unity-webvr-export
|
https://api.github.com/repos/caseyyee/unity-webvr-export
|
closed
|
stuck at loading screen
|
documentation
|
I am using unity 5.6. I followed the first half of the ReadMe and ended up stuck at loading screen after hitting "build and run". In addition I had to add webGLTemplates to my assets because it originally wasn't there.
Can anyone assist with this?
|
1.0
|
stuck at loading screen - I am using unity 5.6. I followed the first half of the ReadMe and ended up stuck at loading screen after hitting "build and run". In addition I had to add webGLTemplates to my assets because it originally wasn't there.
Can anyone assist with this?
|
non_process
|
stuck at loading screen i am using unity i followed the first half of the readme and ended up stuck at loading screen after hitting build and run in addition i had to add webgltemplates to my assets because it originally wasn t there can anyone assist with this
| 0
|
131,734
| 10,708,012,762
|
IssuesEvent
|
2019-10-24 18:43:54
|
dotnet/corefx
|
https://api.github.com/repos/dotnet/corefx
|
closed
|
RegionInfoPropertyTests fails on non-US machines
|
area-System.Globalization test bug test-run-core
|
Repro:
1. Add Czech language pack in Windows language settings
2. Run System.Globalization.Tests
Result:
```
System.Globalization.Tests.RegionInfoPropertyTests.DisplayName(name: "en-US", expected: "United States") [FAIL]
Microsoft.DotNet.RemoteExecutor.RemoteExecutionException : Remote process failed with an unhandled exception.
Stack Trace:
Child exception:
Xunit.Sdk.EqualException: Assert.Equal() Failure
(pos 0)
Expected: United States
Actual: Spojené státy
(pos 0)
D:\corefx\src\System.Globalization\tests\System\Globalization\RegionInfoTests.cs(78,0): at System.Globalization.Tests.RegionInfoPropertyTests.<>c.<DisplayName>b__5_0(String _name, String _expected)
```
|
2.0
|
RegionInfoPropertyTests fails on non-US machines - Repro:
1. Add Czech language pack in Windows language settings
2. Run System.Globalization.Tests
Result:
```
System.Globalization.Tests.RegionInfoPropertyTests.DisplayName(name: "en-US", expected: "United States") [FAIL]
Microsoft.DotNet.RemoteExecutor.RemoteExecutionException : Remote process failed with an unhandled exception.
Stack Trace:
Child exception:
Xunit.Sdk.EqualException: Assert.Equal() Failure
(pos 0)
Expected: United States
Actual: Spojené státy
(pos 0)
D:\corefx\src\System.Globalization\tests\System\Globalization\RegionInfoTests.cs(78,0): at System.Globalization.Tests.RegionInfoPropertyTests.<>c.<DisplayName>b__5_0(String _name, String _expected)
```
|
non_process
|
regioninfopropertytests fails on non us machines repro add czech language pack in windows language settings run system globalization tests result system globalization tests regioninfopropertytests displayname name en us expected united states microsoft dotnet remoteexecutor remoteexecutionexception remote process failed with an unhandled exception stack trace child exception xunit sdk equalexception assert equal failure pos expected united states actual spojené státy pos d corefx src system globalization tests system globalization regioninfotests cs at system globalization tests regioninfopropertytests c b string name string expected
| 0
|
45,509
| 9,780,782,407
|
IssuesEvent
|
2019-06-07 17:54:00
|
Tribler/tribler
|
https://api.github.com/repos/Tribler/tribler
|
closed
|
Copying a torrent with a non-ascii chars from another channel fails
|
AllChannel 2.0 bug unicode
|
Latest `devel`.
Trying to add the torrent with non-ascii name from another channel fails
```python
byte 0xe3 in position 9: ordinal not in range(128)
/home/vader/.local/lib/python2.7/site-packages/twisted/web/http.py:2070:_finishRequestBody
/home/vader/.local/lib/python2.7/site-packages/twisted/web/http.py:2145:allContentReceived
/home/vader/.local/lib/python2.7/site-packages/twisted/web/http.py:890:requestReceived
/home/vader/src/TRIBLER/Tribler/Tribler/Core/Modules/restapi/rest_manager.py:118:process
--- <exception caught here> ---
/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py:197:process
/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py:257:render
/home/vader/.local/lib/python2.7/site-packages/twisted/web/resource.py:250:render
<auto generated wrapper of render_PUT() function>:2:render_PUT
/home/vader/.local/lib/python2.7/site-packages/pony/orm/core.py:528:new_func
/home/vader/src/TRIBLER/Tribler/Tribler/Core/Modules/restapi/mychannel_endpoint.py:271:render_PUT
/home/vader/src/TRIBLER/Tribler/Tribler/Core/Utilities/utilities.py:127:parse_magnetlink
]
Traceback (most recent call last):
File "/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py", line 197, in process
self.render(resrc)
File "/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py", line 257, in render
body = resrc.render(self)
File "/home/vader/.local/lib/python2.7/site-packages/twisted/web/resource.py", line 250, in render
return m(request)
File "<auto generated wrapper of render_PUT() function>", line 2, in render_PUT
File "/home/vader/.local/lib/python2.7/site-packages/pony/orm/core.py", line 528, in new_func
result = func(*args, **kwargs)
File "/home/vader/src/TRIBLER/Tribler/Tribler/Core/Modules/restapi/mychannel_endpoint.py", line 271, in render_PUT
_, xt, _ = parse_magnetlink(uri)
File "/home/vader/src/TRIBLER/Tribler/Tribler/Core/Utilities/utilities.py", line 127, in parse_magnetlink
dn = value.decode() if not isinstance(value, six.text_type) else value
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe3 in position 9: ordinal not in range(128)
```
|
1.0
|
Copying a torrent with a non-ascii chars from another channel fails - Latest `devel`.
Trying to add the torrent with non-ascii name from another channel fails
```python
byte 0xe3 in position 9: ordinal not in range(128)
/home/vader/.local/lib/python2.7/site-packages/twisted/web/http.py:2070:_finishRequestBody
/home/vader/.local/lib/python2.7/site-packages/twisted/web/http.py:2145:allContentReceived
/home/vader/.local/lib/python2.7/site-packages/twisted/web/http.py:890:requestReceived
/home/vader/src/TRIBLER/Tribler/Tribler/Core/Modules/restapi/rest_manager.py:118:process
--- <exception caught here> ---
/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py:197:process
/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py:257:render
/home/vader/.local/lib/python2.7/site-packages/twisted/web/resource.py:250:render
<auto generated wrapper of render_PUT() function>:2:render_PUT
/home/vader/.local/lib/python2.7/site-packages/pony/orm/core.py:528:new_func
/home/vader/src/TRIBLER/Tribler/Tribler/Core/Modules/restapi/mychannel_endpoint.py:271:render_PUT
/home/vader/src/TRIBLER/Tribler/Tribler/Core/Utilities/utilities.py:127:parse_magnetlink
]
Traceback (most recent call last):
File "/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py", line 197, in process
self.render(resrc)
File "/home/vader/.local/lib/python2.7/site-packages/twisted/web/server.py", line 257, in render
body = resrc.render(self)
File "/home/vader/.local/lib/python2.7/site-packages/twisted/web/resource.py", line 250, in render
return m(request)
File "<auto generated wrapper of render_PUT() function>", line 2, in render_PUT
File "/home/vader/.local/lib/python2.7/site-packages/pony/orm/core.py", line 528, in new_func
result = func(*args, **kwargs)
File "/home/vader/src/TRIBLER/Tribler/Tribler/Core/Modules/restapi/mychannel_endpoint.py", line 271, in render_PUT
_, xt, _ = parse_magnetlink(uri)
File "/home/vader/src/TRIBLER/Tribler/Tribler/Core/Utilities/utilities.py", line 127, in parse_magnetlink
dn = value.decode() if not isinstance(value, six.text_type) else value
UnicodeDecodeError: 'ascii' codec can't decode byte 0xe3 in position 9: ordinal not in range(128)
```
|
non_process
|
copying a torrent with a non ascii chars from another channel fails latest devel trying to add the torrent with non ascii name from another channel fails python byte in position ordinal not in range home vader local lib site packages twisted web http py finishrequestbody home vader local lib site packages twisted web http py allcontentreceived home vader local lib site packages twisted web http py requestreceived home vader src tribler tribler tribler core modules restapi rest manager py process home vader local lib site packages twisted web server py process home vader local lib site packages twisted web server py render home vader local lib site packages twisted web resource py render render put home vader local lib site packages pony orm core py new func home vader src tribler tribler tribler core modules restapi mychannel endpoint py render put home vader src tribler tribler tribler core utilities utilities py parse magnetlink traceback most recent call last file home vader local lib site packages twisted web server py line in process self render resrc file home vader local lib site packages twisted web server py line in render body resrc render self file home vader local lib site packages twisted web resource py line in render return m request file line in render put file home vader local lib site packages pony orm core py line in new func result func args kwargs file home vader src tribler tribler tribler core modules restapi mychannel endpoint py line in render put xt parse magnetlink uri file home vader src tribler tribler tribler core utilities utilities py line in parse magnetlink dn value decode if not isinstance value six text type else value unicodedecodeerror ascii codec can t decode byte in position ordinal not in range
| 0
|
31,332
| 11,908,618,164
|
IssuesEvent
|
2020-03-31 01:28:58
|
fufunoyu/example-maven-travis
|
https://api.github.com/repos/fufunoyu/example-maven-travis
|
opened
|
CVE-2020-10673 (High) detected in jackson-databind-2.9.8.jar
|
security vulnerability
|
## CVE-2020-10673 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/example-maven-travis/pom.xml</p>
<p>Path to vulnerable library: 20200331003401/downloadResource_db272a21-79af-4cc1-b7b3-efce982319ab/20200331010614/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fufunoyu/example-maven-travis/commit/27e07981ec452fec78a868417e5f94000e3c6e3d">27e07981ec452fec78a868417e5f94000e3c6e3d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to com.caucho.config.types.ResourceRef (aka caucho-quercus).
<p>Publish Date: 2020-03-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10673>CVE-2020-10673</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/commit/1645efbd392989cf015f459a91c999e59c921b15">https://github.com/FasterXML/jackson-databind/commit/1645efbd392989cf015f459a91c999e59c921b15</a></p>
<p>Release Date: 2020-03-18</p>
<p>Fix Resolution: Replace or update the following files: SubTypeValidator.java, VERSION-2.x</p>
</p>
</details>
<p></p>
|
True
|
CVE-2020-10673 (High) detected in jackson-databind-2.9.8.jar - ## CVE-2020-10673 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/example-maven-travis/pom.xml</p>
<p>Path to vulnerable library: 20200331003401/downloadResource_db272a21-79af-4cc1-b7b3-efce982319ab/20200331010614/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/fufunoyu/example-maven-travis/commit/27e07981ec452fec78a868417e5f94000e3c6e3d">27e07981ec452fec78a868417e5f94000e3c6e3d</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to com.caucho.config.types.ResourceRef (aka caucho-quercus).
<p>Publish Date: 2020-03-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10673>CVE-2020-10673</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/commit/1645efbd392989cf015f459a91c999e59c921b15">https://github.com/FasterXML/jackson-databind/commit/1645efbd392989cf015f459a91c999e59c921b15</a></p>
<p>Release Date: 2020-03-18</p>
<p>Fix Resolution: Replace or update the following files: SubTypeValidator.java, VERSION-2.x</p>
</p>
</details>
<p></p>
|
non_process
|
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws scm example maven travis pom xml path to vulnerable library downloadresource jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com caucho config types resourceref aka caucho quercus publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type change files origin a href release date fix resolution replace or update the following files subtypevalidator java version x
| 0
|
273,803
| 23,786,375,046
|
IssuesEvent
|
2022-09-02 10:28:28
|
pingcap/tidb
|
https://api.github.com/repos/pingcap/tidb
|
closed
|
unstable test TestRelease
|
type/bug component/test sig/execution severity/major affects-6.2
|
## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
```
=== RUN TestRelease
tracker_test.go:117:
Error Trace: /home/jenkins/.tidb/tmp/04446c229c5a73c16deb3edddcb4db34/sandbox/processwrapper-sandbox/2612/execroot/__main__/bazel-out/k8-fastbuild/bin/util/memory/memory_test_/memory_test.runfiles/__main__/util/memory/tracker_test.go:117
Error: Not equal:
expected: 0
actual : 100
Test: TestReleas
```
https://prow.pingcap.net/view/gs/pingcapprow/logs/bazel_test_tidb/1554466791571329024
<!-- a step by step guide for reproducing the bug. -->
### 2. What did you expect to see? (Required)
### 3. What did you see instead (Required)
### 4. What is your TiDB version? (Required)
<!-- Paste the output of SELECT tidb_version() -->
|
1.0
|
unstable test TestRelease - ## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
```
=== RUN TestRelease
tracker_test.go:117:
Error Trace: /home/jenkins/.tidb/tmp/04446c229c5a73c16deb3edddcb4db34/sandbox/processwrapper-sandbox/2612/execroot/__main__/bazel-out/k8-fastbuild/bin/util/memory/memory_test_/memory_test.runfiles/__main__/util/memory/tracker_test.go:117
Error: Not equal:
expected: 0
actual : 100
Test: TestReleas
```
https://prow.pingcap.net/view/gs/pingcapprow/logs/bazel_test_tidb/1554466791571329024
<!-- a step by step guide for reproducing the bug. -->
### 2. What did you expect to see? (Required)
### 3. What did you see instead (Required)
### 4. What is your TiDB version? (Required)
<!-- Paste the output of SELECT tidb_version() -->
|
non_process
|
unstable test testrelease bug report please answer these questions before submitting your issue thanks minimal reproduce step required run testrelease tracker test go error trace home jenkins tidb tmp sandbox processwrapper sandbox execroot main bazel out fastbuild bin util memory memory test memory test runfiles main util memory tracker test go error not equal expected actual test testreleas what did you expect to see required what did you see instead required what is your tidb version required
| 0
|
5,519
| 8,381,042,579
|
IssuesEvent
|
2018-10-07 20:43:55
|
MichiganDataScienceTeam/googleanalytics
|
https://api.github.com/repos/MichiganDataScienceTeam/googleanalytics
|
opened
|
Preprocess :u'device.mobileDeviceBranding', u'device.mobileDeviceInfo', u'device.mobileDeviceMarketingName', u'device.mobileDeviceModel', u'device.mobileInputSelector',
|
easy preprocessing
|
Preprocess the following features:
u'device.mobileDeviceBranding',
u'device.mobileDeviceInfo',
u'device.mobileDeviceMarketingName',
u'device.mobileDeviceModel',
u'device.mobileInputSelector',
1. Standardization: [http://scikit-learn.org/stable/modules/preprocessing.html#standardization-or-mean-removal-and-variance-scaling](http://scikit-learn.org/stable/modules/preprocessing.html#standardization-or-mean-removal-and-variance-scaling)
2. Impute missing values: [http://scikit-learn.org/stable/modules/impute.html](http://scikit-learn.org/stable/modules/impute.html)
3. Normalization: [http://scikit-learn.org/stable/modules/preprocessing.html#normalization](http://scikit-learn.org/stable/modules/preprocessing.html#normalization)
4. Encode categorical features (optional): [http://scikit-learn.org/stable/modules/preprocessing.html#encoding-categorical-features](http://scikit-learn.org/stable/modules/preprocessing.html#encoding-categorical-features)
5. Discretization (optional): [http://scikit-learn.org/stable/modules/preprocessing.html#discretization](http://scikit-learn.org/stable/modules/preprocessing.html#discretization)
[http://scikit-learn.org/stable/modules/preprocessing.html](http://scikit-learn.org/stable/modules/preprocessing.html)
|
1.0
|
Preprocess :u'device.mobileDeviceBranding', u'device.mobileDeviceInfo', u'device.mobileDeviceMarketingName', u'device.mobileDeviceModel', u'device.mobileInputSelector', - Preprocess the following features:
u'device.mobileDeviceBranding',
u'device.mobileDeviceInfo',
u'device.mobileDeviceMarketingName',
u'device.mobileDeviceModel',
u'device.mobileInputSelector',
1. Standardization: [http://scikit-learn.org/stable/modules/preprocessing.html#standardization-or-mean-removal-and-variance-scaling](http://scikit-learn.org/stable/modules/preprocessing.html#standardization-or-mean-removal-and-variance-scaling)
2. Impute missing values: [http://scikit-learn.org/stable/modules/impute.html](http://scikit-learn.org/stable/modules/impute.html)
3. Normalization: [http://scikit-learn.org/stable/modules/preprocessing.html#normalization](http://scikit-learn.org/stable/modules/preprocessing.html#normalization)
4. Encode categorical features (optional): [http://scikit-learn.org/stable/modules/preprocessing.html#encoding-categorical-features](http://scikit-learn.org/stable/modules/preprocessing.html#encoding-categorical-features)
5. Discretization (optional): [http://scikit-learn.org/stable/modules/preprocessing.html#discretization](http://scikit-learn.org/stable/modules/preprocessing.html#discretization)
[http://scikit-learn.org/stable/modules/preprocessing.html](http://scikit-learn.org/stable/modules/preprocessing.html)
|
process
|
preprocess u device mobiledevicebranding u device mobiledeviceinfo u device mobiledevicemarketingname u device mobiledevicemodel u device mobileinputselector preprocess the following features u device mobiledevicebranding u device mobiledeviceinfo u device mobiledevicemarketingname u device mobiledevicemodel u device mobileinputselector standardization impute missing values normalization encode categorical features optional discretization optional
| 1
|
39,449
| 2,855,399,375
|
IssuesEvent
|
2015-06-02 09:16:33
|
YaccConstructor/YaccConstructor
|
https://api.github.com/repos/YaccConstructor/YaccConstructor
|
closed
|
YC.Core package is incorrect
|
1_Priority bug
|
@VereshchaginaE Execution of YC.YaccConstructor.exe failed with next message
```
D:\...\sbsql\SqlMigration\SqlParser\packages\YC.Core.0.3.0.1-On-branch--local\lib\net40>YC.YaccConstructor.0.3.0.1-On-branch--local.exe -af -ag
Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Mono.Addins, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0738eb9f132ed756' or one of
its dependencies. The system cannot find the file specified.
at <StartupCode$YC-YaccConstructor>.$YaccConstructor.Program.main@()
```
Seems, that mono.addins ddls are missed.
|
1.0
|
YC.Core package is incorrect - @VereshchaginaE Execution of YC.YaccConstructor.exe failed with next message
```
D:\...\sbsql\SqlMigration\SqlParser\packages\YC.Core.0.3.0.1-On-branch--local\lib\net40>YC.YaccConstructor.0.3.0.1-On-branch--local.exe -af -ag
Unhandled Exception: System.IO.FileNotFoundException: Could not load file or assembly 'Mono.Addins, Version=1.0.0.0, Culture=neutral, PublicKeyToken=0738eb9f132ed756' or one of
its dependencies. The system cannot find the file specified.
at <StartupCode$YC-YaccConstructor>.$YaccConstructor.Program.main@()
```
Seems, that mono.addins ddls are missed.
|
non_process
|
yc core package is incorrect vereshchaginae execution of yc yaccconstructor exe failed with next message d sbsql sqlmigration sqlparser packages yc core on branch local lib yc yaccconstructor on branch local exe af ag unhandled exception system io filenotfoundexception could not load file or assembly mono addins version culture neutral publickeytoken or one of its dependencies the system cannot find the file specified at yaccconstructor program main seems that mono addins ddls are missed
| 0
|
1,477
| 4,054,541,828
|
IssuesEvent
|
2016-05-24 12:48:46
|
moxie-lean/ng-cms
|
https://api.github.com/repos/moxie-lean/ng-cms
|
closed
|
WordPress Admin Bar
|
process
|
We need a way to replicate the WP admin bar. We will need to determine if a user is logged in and what capabilities they have in order to show the relevant options.
|
1.0
|
WordPress Admin Bar - We need a way to replicate the WP admin bar. We will need to determine if a user is logged in and what capabilities they have in order to show the relevant options.
|
process
|
wordpress admin bar we need a way to replicate the wp admin bar we will need to determine if a user is logged in and what capabilities they have in order to show the relevant options
| 1
|
141,776
| 21,607,683,665
|
IssuesEvent
|
2022-05-04 06:32:18
|
milesmcc/atlos
|
https://api.github.com/repos/milesmcc/atlos
|
closed
|
Layer location icon below map container
|
design change
|
The location pin on media pages is popping up above the white "Location" container, can you layer it underneath? Thanks!
<img width="436" alt="Screen Shot 2022-05-03 at 19 17 38" src="https://user-images.githubusercontent.com/100018299/166615303-4b2bf22b-ce1a-406f-ae7e-63d6f4efcc42.png">
|
1.0
|
Layer location icon below map container - The location pin on media pages is popping up above the white "Location" container, can you layer it underneath? Thanks!
<img width="436" alt="Screen Shot 2022-05-03 at 19 17 38" src="https://user-images.githubusercontent.com/100018299/166615303-4b2bf22b-ce1a-406f-ae7e-63d6f4efcc42.png">
|
non_process
|
layer location icon below map container the location pin on media pages is popping up above the white location container can you layer it underneath thanks img width alt screen shot at src
| 0
|
118,743
| 4,755,106,576
|
IssuesEvent
|
2016-10-24 09:43:06
|
bespokeinteractive/mchapp
|
https://api.github.com/repos/bespokeinteractive/mchapp
|
opened
|
ANC: There's no internal referral to Maternity triage for an ANC patient
|
bug Medium Priority
|
ANC: There's no internal referral to Maternity triage for an ANC patient
|
1.0
|
ANC: There's no internal referral to Maternity triage for an ANC patient - ANC: There's no internal referral to Maternity triage for an ANC patient
|
non_process
|
anc there s no internal referral to maternity triage for an anc patient anc there s no internal referral to maternity triage for an anc patient
| 0
|
117,709
| 4,726,854,348
|
IssuesEvent
|
2016-10-18 11:40:38
|
CS2103AUG2016-W11-C2/main
|
https://api.github.com/repos/CS2103AUG2016-W11-C2/main
|
closed
|
Delete/mark/unmark multiple tasks at a time
|
priority.medium type.enhancement
|
I will need to enter fewer commands if I am making similar changes
|
1.0
|
Delete/mark/unmark multiple tasks at a time - I will need to enter fewer commands if I am making similar changes
|
non_process
|
delete mark unmark multiple tasks at a time i will need to enter fewer commands if i am making similar changes
| 0
|
10,117
| 13,044,162,232
|
IssuesEvent
|
2020-07-29 03:47:30
|
tikv/tikv
|
https://api.github.com/repos/tikv/tikv
|
closed
|
UCP: Migrate scalar function `MonthName` from TiDB
|
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
|
## Description
Port the scalar function `MonthName` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @iosmanthus
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
2.0
|
UCP: Migrate scalar function `MonthName` from TiDB -
## Description
Port the scalar function `MonthName` from TiDB to coprocessor.
## Score
* 50
## Mentor(s)
* @iosmanthus
## Recommended Skills
* Rust programming
## Learning Materials
Already implemented expressions ported from TiDB
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr)
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
|
process
|
ucp migrate scalar function monthname from tidb description port the scalar function monthname from tidb to coprocessor score mentor s iosmanthus recommended skills rust programming learning materials already implemented expressions ported from tidb
| 1
|
245,002
| 26,498,381,896
|
IssuesEvent
|
2023-01-18 08:19:50
|
RonenDabach/npm2nd
|
https://api.github.com/repos/RonenDabach/npm2nd
|
opened
|
hibernate-core-3.6.6.Final.jar: 8 vulnerabilities (highest severity is: 9.8)
|
security vulnerability
|
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hibernate-core-3.6.6.Final.jar</b></p></summary>
<p>The core functionality of Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/3.6.6.Final/hibernate-core-3.6.6.Final.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (hibernate-core version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2017-15708](https://www.mend.io/vulnerability-database/CVE-2017-15708) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | commons-collections-3.1.jar | Transitive | 4.1.1.Final | ✅ |
| [CVE-2019-13116](https://www.mend.io/vulnerability-database/CVE-2019-13116) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | commons-collections-3.1.jar | Transitive | 4.1.1.Final | ✅ |
| [CVE-2015-7501](https://www.mend.io/vulnerability-database/CVE-2015-7501) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 9.8 | commons-collections-3.1.jar | Transitive | 4.1.1.Final | ✅ |
| [CVE-2015-4852](https://www.mend.io/vulnerability-database/CVE-2015-4852) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | commons-collections-3.1.jar | Transitive | 4.0.0.Final | ✅ |
| [CVE-2015-6420](https://www.mend.io/vulnerability-database/CVE-2015-6420) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | commons-collections-3.1.jar | Transitive | 4.1.1.Final | ✅ |
| [CVE-2018-1000632](https://www.mend.io/vulnerability-database/CVE-2018-1000632) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | dom4j-1.6.1.jar | Transitive | 4.2.0.Final | ✅ |
| [CVE-2020-25638](https://www.mend.io/vulnerability-database/CVE-2020-25638) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.4 | hibernate-core-3.6.6.Final.jar | Direct | 3.6.9-intact.3 | ✅ |
| [CVE-2019-14900](https://www.mend.io/vulnerability-database/CVE-2019-14900) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.5 | hibernate-core-3.6.6.Final.jar | Direct | 5.1.10.Final | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2017-15708</summary>
### Vulnerable Library - <b>commons-collections-3.1.jar</b></p>
<p>Types that extend and augment the Java Collections Framework.</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.1/commons-collections-3.1.jar</p>
<p>
Dependency Hierarchy:
- hibernate-core-3.6.6.Final.jar (Root Library)
- :x: **commons-collections-3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Apache Synapse, by default no authentication is required for Java Remote Method Invocation (RMI). So Apache Synapse 3.0.1 or all previous releases (3.0.0, 2.1.0, 2.0.0, 1.2, 1.1.2, 1.1.1) allows remote code execution attacks that can be performed by injecting specially crafted serialized objects. And the presence of Apache Commons Collections 3.2.1 (commons-collections-3.2.1.jar) or previous versions in Synapse distribution makes this exploitable. To mitigate the issue, we need to limit RMI access to trusted users only. Further upgrading to 3.0.1 version will eliminate the risk of having said Commons Collection version. In Synapse 3.0.1, Commons Collection has been updated to 3.2.2 version.
<p>Publish Date: 2017-12-11
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-15708>CVE-2017-15708</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-15708">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-15708</a></p>
<p>Release Date: 2017-12-11</p>
<p>Fix Resolution (commons-collections:commons-collections): 3.2.2</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.1.1.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
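The suggested fix upgrades hibernate-core so that it pulls in the patched commons-collections. If the hibernate-core upgrade cannot be taken immediately, the transitive artifact can be pinned directly in the project's `pom.xml` via Maven's `dependencyManagement`; a hypothetical fragment (coordinates taken from the report above, module layout is an assumption):

```xml
<!-- Sketch: force the patched commons-collections 3.2.2 even while
     hibernate-core 3.6.6.Final remains the direct dependency. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>commons-collections</groupId>
      <artifactId>commons-collections</artifactId>
      <version>3.2.2</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```

After adding the override, `mvn dependency:tree -Dincludes=commons-collections` should resolve to 3.2.2 instead of 3.1.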
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2019-13116</summary>
### Vulnerable Library - <b>commons-collections-3.1.jar</b></p>
<p>Types that extend and augment the Java Collections Framework.</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.1/commons-collections-3.1.jar</p>
<p>
Dependency Hierarchy:
- hibernate-core-3.6.6.Final.jar (Root Library)
- :x: **commons-collections-3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The MuleSoft Mule Community Edition runtime engine before 3.8 allows remote attackers to execute arbitrary code because of Java Deserialization, related to Apache Commons Collections.
<p>Publish Date: 2019-10-16
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-13116>CVE-2019-13116</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-13116">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-13116</a></p>
<p>Release Date: 2019-10-16</p>
<p>Fix Resolution (commons-collections:commons-collections): 3.2.2</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.1.1.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2015-7501</summary>
### Vulnerable Library - <b>commons-collections-3.1.jar</b></p>
<p>Types that extend and augment the Java Collections Framework.</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.1/commons-collections-3.1.jar</p>
<p>
Dependency Hierarchy:
- hibernate-core-3.6.6.Final.jar (Root Library)
- :x: **commons-collections-3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Red Hat JBoss A-MQ 6.x; BPM Suite (BPMS) 6.x; BRMS 6.x and 5.x; Data Grid (JDG) 6.x; Data Virtualization (JDV) 6.x and 5.x; Enterprise Application Platform 6.x, 5.x, and 4.3.x; Fuse 6.x; Fuse Service Works (FSW) 6.x; Operations Network (JBoss ON) 3.x; Portal 6.x; SOA Platform (SOA-P) 5.x; Web Server (JWS) 3.x; Red Hat OpenShift/xPAAS 3.x; and Red Hat Subscription Asset Manager 1.3 allow remote attackers to execute arbitrary commands via a crafted serialized Java object, related to the Apache Commons Collections (ACC) library.
<p>Publish Date: 2017-11-09
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-7501>CVE-2015-7501</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://bugzilla.redhat.com/show_bug.cgi?id=1279330">https://bugzilla.redhat.com/show_bug.cgi?id=1279330</a></p>
<p>Release Date: 2017-11-09</p>
<p>Fix Resolution (commons-collections:commons-collections): 3.2.2</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.1.1.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
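The commons-collections CVEs above all hinge on attacker-controlled serialized objects reaching `ObjectInputStream`. Beyond upgrading the library, JDK 9+ deserialization filters (JEP 290) can reject unexpected gadget classes at the stream boundary. A minimal, self-contained sketch; the allow-list pattern and class name are illustrative, not taken from this project:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputFilter;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class DeserializationFilterDemo {
    public static void main(String[] args) throws Exception {
        // Serialize a harmless object, standing in for untrusted input bytes.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new java.util.ArrayList<>(java.util.List.of("ok")));
        }

        // JEP 290 pattern filter: allow java.util/java.lang, reject everything
        // else (e.g. org.apache.commons.collections gadget classes).
        ObjectInputFilter filter = ObjectInputFilter.Config.createFilter(
                "java.util.*;java.lang.*;!*");

        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            ois.setObjectInputFilter(filter);
            Object o = ois.readObject();  // would throw InvalidClassException for a rejected class
            System.out.println("deserialized: " + o);
        }
    }
}
```

A process-wide default can also be set with the `jdk.serialFilter` system property instead of per-stream filters.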
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2015-4852</summary>
### Vulnerable Library - <b>commons-collections-3.1.jar</b></p>
<p>Types that extend and augment the Java Collections Framework.</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.1/commons-collections-3.1.jar</p>
<p>
Dependency Hierarchy:
- hibernate-core-3.6.6.Final.jar (Root Library)
- :x: **commons-collections-3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The WLS Security component in Oracle WebLogic Server 10.3.6.0, 12.1.2.0, 12.1.3.0, and 12.2.1.0 allows remote attackers to execute arbitrary commands via a crafted serialized Java object in T3 protocol traffic to TCP port 7001, related to oracle_common/modules/com.bea.core.apache.commons.collections.jar. NOTE: the scope of this CVE is limited to the WebLogic Server product.
<p>Publish Date: 2015-11-18
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-4852>CVE-2015-4852</a></p>
</p>
<p></p>
### CVSS 2 Score Details (<b>7.5</b>)
<p>
Base Score Metrics not available</p>
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.openwall.com/lists/oss-security/2015/11/17/19">https://www.openwall.com/lists/oss-security/2015/11/17/19</a></p>
<p>Release Date: 2015-11-18</p>
<p>Fix Resolution (commons-collections:commons-collections): 3.1-NODEP</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.0.0.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2015-6420</summary>
### Vulnerable Library - <b>commons-collections-3.1.jar</b></p>
<p>Types that extend and augment the Java Collections Framework.</p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-collections/commons-collections/3.1/commons-collections-3.1.jar</p>
<p>
Dependency Hierarchy:
- hibernate-core-3.6.6.Final.jar (Root Library)
- :x: **commons-collections-3.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Serialized-object interfaces in certain Cisco Collaboration and Social Media; Endpoint Clients and Client Software; Network Application, Service, and Acceleration; Network and Content Security Devices; Network Management and Provisioning; Routing and Switching - Enterprise and Service Provider; Unified Computing; Voice and Unified Communications Devices; Video, Streaming, TelePresence, and Transcoding Devices; Wireless; and Cisco Hosted Services products allow remote attackers to execute arbitrary commands via a crafted serialized Java object, related to the Apache Commons Collections (ACC) library.
<p>Publish Date: 2015-12-15
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-6420>CVE-2015-6420</a></p>
</p>
<p></p>
### CVSS 2 Score Details (<b>7.5</b>)
<p>
Base Score Metrics not available</p>
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2015-12-15</p>
<p>Fix Resolution (commons-collections:commons-collections): 3.2.2</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.1.1.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2018-1000632</summary>
### Vulnerable Library - <b>dom4j-1.6.1.jar</b></p>
<p>dom4j: the flexible XML framework for Java</p>
<p>Library home page: <a href="http://dom4j.org">http://dom4j.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/dom4j/dom4j/1.6.1/dom4j-1.6.1.jar</p>
<p>
Dependency Hierarchy:
- hibernate-core-3.6.6.Final.jar (Root Library)
- :x: **dom4j-1.6.1.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
dom4j version prior to version 2.1.1 contains a CWE-91: XML Injection vulnerability in Class: Element. Methods: addElement, addAttribute that can result in an attacker tampering with XML documents through XML injection. This attack appear to be exploitable via an attacker specifying attributes or elements in the XML document. This vulnerability appears to have been fixed in 2.1.1 or later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution (dom4j:dom4j): 20040902.021138</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.2.0.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-25638</summary>
### Vulnerable Library - <b>hibernate-core-3.6.6.Final.jar</b></p>
<p>The core functionality of Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/3.6.6.Final/hibernate-core-3.6.6.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **hibernate-core-3.6.6.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in hibernate-core in versions prior to and including 5.4.23.Final. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SQL comments of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks. The highest threat from this vulnerability is to data confidentiality and integrity.
<p>Publish Date: 2020-12-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-25638>CVE-2020-25638</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/">https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/</a></p>
<p>Release Date: 2020-12-02</p>
<p>Fix Resolution: 3.6.9-intact.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
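The flaw description says unsanitized literals passed through the JPA Criteria API could be spliced into the generated SQL text. Independent of the upgrade, the general remediation is to send user input as bind parameters rather than as query text. A minimal sketch of the contrast using plain JPQL strings; the `User` entity and queries are hypothetical, not from this project:

```java
public class CriteriaLiteralDemo {
    // Unsafe: user input becomes part of the query text itself, so a crafted
    // value rewrites the WHERE clause (the pattern the CVE describes).
    static String unsafeJpql(String name) {
        return "SELECT u FROM User u WHERE u.name = '" + name + "'";
    }

    // Safe: input travels as a bind parameter; the query text is constant.
    static String safeJpql() {
        return "SELECT u FROM User u WHERE u.name = :name";
    }

    public static void main(String[] args) {
        String payload = "x' OR '1'='1";
        System.out.println(unsafeJpql(payload));
        System.out.println(safeJpql());
    }
}
```

With a real `EntityManager` the safe form would be executed via `query.setParameter("name", input)`, keeping untrusted data out of the SQL text entirely.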
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-14900</summary>
### Vulnerable Library - <b>hibernate-core-3.6.6.Final.jar</b></p>
<p>The core functionality of Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/3.6.6.Final/hibernate-core-3.6.6.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **hibernate-core-3.6.6.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in Hibernate ORM in versions before 5.3.18, 5.4.18 and 5.5.0.Beta1. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SELECT or GROUP BY parts of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks.
<p>Publish Date: 2020-07-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-14900>CVE-2019-14900</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900</a></p>
<p>Release Date: 2020-07-06</p>
<p>Fix Resolution: 5.1.10.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
True
|
</p>
<p></p>
### Vulnerability Details
<p>
dom4j versions prior to 2.1.1 contain a CWE-91 (XML Injection) vulnerability in the Element class: the addElement and addAttribute methods can allow an attacker to tamper with XML documents through XML injection. The attack appears to be exploitable by an attacker specifying attribute or element names in the XML document. This vulnerability is fixed in 2.1.1 and later.
<p>Publish Date: 2018-08-20
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-1000632>CVE-2018-1000632</a></p>
</p>
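The underlying issue is that pre-2.1.1 dom4j accepted attacker-controlled element and attribute names without validating them as XML names. As a JDK-only illustration of the kind of validation dom4j 2.1.1 added (this sketch uses the standard `javax.xml` DOM API, not dom4j itself):

```java
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.DOMException;
import org.w3c.dom.Document;

public class XmlNameValidationDemo {
    /**
     * Returns true if the DOM implementation accepts `candidate` as an element name.
     * A validating API rejects names that carry markup, which blocks this class
     * of XML injection.
     */
    public static boolean isAcceptedAsElementName(String candidate) {
        try {
            Document doc = DocumentBuilderFactory.newInstance()
                    .newDocumentBuilder().newDocument();
            doc.createElement(candidate);
            return true;
        } catch (DOMException e) {
            return false; // INVALID_CHARACTER_ERR: not a well-formed XML name
        } catch (Exception e) {
            return false; // parser configuration problems also count as rejection here
        }
    }

    public static void main(String[] args) {
        System.out.println(isAcceptedAsElementName("user"));                      // true
        System.out.println(isAcceptedAsElementName("user\"><admin role=\"root")); // false
    }
}
```

An API that skips this check writes the injected markup straight into the serialized document, letting the attacker add elements or attributes.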
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-1000632</a></p>
<p>Release Date: 2018-08-20</p>
<p>Fix Resolution (dom4j:dom4j): 20040902.021138</p>
<p>Direct dependency fix Resolution (org.hibernate:hibernate-core): 4.2.0.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2020-25638</summary>
### Vulnerable Library - <b>hibernate-core-3.6.6.Final.jar</b></p>
<p>The core functionality of Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/3.6.6.Final/hibernate-core-3.6.6.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **hibernate-core-3.6.6.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in hibernate-core in versions prior to and including 5.4.23.Final. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SQL comments of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks. The highest threat from this vulnerability is to data confidentiality and integrity.
<p>Publish Date: 2020-12-02
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-25638>CVE-2020-25638</a></p>
</p>
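The vulnerable pattern is a query builder that splices a literal into a SQL comment, letting an attacker close the comment early and append SQL of their own. The framework-free sketch below illustrates the mechanism and the bind-parameter alternative; it is a conceptual example, not Hibernate code:

```java
public class CommentLiteralDemo {
    /** Naive rendering: the literal is inlined into a SQL comment (the vulnerable pattern). */
    public static String withInlinedLiteral(String literal) {
        return "SELECT id FROM users /* " + literal + " */";
    }

    /** Safe rendering: the value travels as a bind parameter, never as SQL text. */
    public static String withBindParameter() {
        return "SELECT id FROM users WHERE name = ?";
    }

    public static void main(String[] args) {
        // Closing the comment with "*/" lets extra SQL escape into the statement.
        System.out.println(withInlinedLiteral("x */ OR 1=1 --"));
    }
}
```

With the bind-parameter form, the attacker-controlled value never becomes part of the statement text, so the `*/` sequence is inert.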
<p></p>
### CVSS 3 Score Details (<b>7.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/">https://in.relation.to/2020/11/19/hibernate-orm-5424-final-release/</a></p>
<p>Release Date: 2020-12-02</p>
<p>Fix Resolution: 3.6.9-intact.3</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-14900</summary>
### Vulnerable Library - <b>hibernate-core-3.6.6.Final.jar</b></p>
<p>The core functionality of Hibernate</p>
<p>Library home page: <a href="http://hibernate.org">http://hibernate.org</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/hibernate/hibernate-core/3.6.6.Final/hibernate-core-3.6.6.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **hibernate-core-3.6.6.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/RonenDabach/npm2nd/commit/db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd">db900dbff6cbbd2605a4f3fbc68c40f8eeae86fd</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A flaw was found in Hibernate ORM in versions before 5.3.18, 5.4.18 and 5.5.0.Beta1. A SQL injection in the implementation of the JPA Criteria API can permit unsanitized literals when a literal is used in the SELECT or GROUP BY parts of the query. This flaw could allow an attacker to access unauthorized information or possibly conduct further attacks.
<p>Publish Date: 2020-07-06
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-14900>CVE-2019-14900</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-14900</a></p>
<p>Release Date: 2020-07-06</p>
<p>Fix Resolution: 5.1.10.Final</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p>
|
non_process
|
hibernate core final jar vulnerabilities highest severity is vulnerable library hibernate core final jar the core functionality of hibernate library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org hibernate hibernate core final hibernate core final jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in hibernate core version remediation available high commons collections jar transitive final high commons collections jar transitive final high commons collections jar transitive final high commons collections jar transitive final high commons collections jar transitive final high jar transitive final high hibernate core final jar direct intact medium hibernate core final jar direct final details cve vulnerable library commons collections jar types that extend and augment the java collections framework path to dependency file pom xml path to vulnerable library home wss scanner repository commons collections commons collections commons collections jar dependency hierarchy hibernate core final jar root library x commons collections jar vulnerable library found in head commit a href found in base branch main vulnerability details in apache synapse by default no authentication is required for java remote method invocation rmi so apache synapse or all previous releases allows remote code execution attacks that can be performed by injecting specially crafted serialized objects and the presence of apache commons collections commons collections jar or previous versions in synapse distribution makes this exploitable to mitigate the issue we need to limit rmi access to trusted users only further upgrading to version will eliminate the risk of having said commons collection version in synapse commons collection has been updated to version publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required 
none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons collections commons collections direct dependency fix resolution org hibernate hibernate core final rescue worker helmet automatic remediation is available for this issue cve vulnerable library commons collections jar types that extend and augment the java collections framework path to dependency file pom xml path to vulnerable library home wss scanner repository commons collections commons collections commons collections jar dependency hierarchy hibernate core final jar root library x commons collections jar vulnerable library found in head commit a href found in base branch main vulnerability details the mulesoft mule community edition runtime engine before allows remote attackers to execute arbitrary code because of java deserialization related to apache commons collections publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons collections commons collections direct dependency fix resolution org hibernate hibernate core final rescue worker helmet automatic remediation is available for this issue cve vulnerable library commons collections jar types that extend and augment the java collections framework path to dependency file pom xml path to vulnerable library home wss scanner repository commons collections commons collections commons collections jar dependency hierarchy hibernate core final jar root library x commons collections jar vulnerable library 
found in head commit a href found in base branch main vulnerability details red hat jboss a mq x bpm suite bpms x brms x and x data grid jdg x data virtualization jdv x and x enterprise application platform x x and x fuse x fuse service works fsw x operations network jboss on x portal x soa platform soa p x web server jws x red hat openshift xpaas x and red hat subscription asset manager allow remote attackers to execute arbitrary commands via a crafted serialized java object related to the apache commons collections acc library publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons collections commons collections direct dependency fix resolution org hibernate hibernate core final rescue worker helmet automatic remediation is available for this issue cve vulnerable library commons collections jar types that extend and augment the java collections framework path to dependency file pom xml path to vulnerable library home wss scanner repository commons collections commons collections commons collections jar dependency hierarchy hibernate core final jar root library x commons collections jar vulnerable library found in head commit a href found in base branch main vulnerability details the wls security component in oracle weblogic server and allows remote attackers to execute arbitrary commands via a crafted serialized java object in protocol traffic to tcp port related to oracle common modules com bea core apache commons collections jar note the scope of this cve is limited to the weblogic server product publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin 
a href release date fix resolution commons collections commons collections nodep direct dependency fix resolution org hibernate hibernate core final rescue worker helmet automatic remediation is available for this issue cve vulnerable library commons collections jar types that extend and augment the java collections framework path to dependency file pom xml path to vulnerable library home wss scanner repository commons collections commons collections commons collections jar dependency hierarchy hibernate core final jar root library x commons collections jar vulnerable library found in head commit a href found in base branch main vulnerability details serialized object interfaces in certain cisco collaboration and social media endpoint clients and client software network application service and acceleration network and content security devices network management and provisioning routing and switching enterprise and service provider unified computing voice and unified communications devices video streaming telepresence and transcoding devices wireless and cisco hosted services products allow remote attackers to execute arbitrary commands via a crafted serialized java object related to the apache commons collections acc library publish date url a href cvss score details base score metrics not available suggested fix type upgrade version release date fix resolution commons collections commons collections direct dependency fix resolution org hibernate hibernate core final rescue worker helmet automatic remediation is available for this issue cve vulnerable library jar the flexible xml framework for java library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository jar dependency hierarchy hibernate core final jar root library x jar vulnerable library found in head commit a href found in base branch main vulnerability details version prior to version contains a cwe xml injection vulnerability in class element methods 
addelement addattribute that can result in an attacker tampering with xml documents through xml injection this attack appear to be exploitable via an attacker specifying attributes or elements in the xml document this vulnerability appears to have been fixed in or later publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution direct dependency fix resolution org hibernate hibernate core final rescue worker helmet automatic remediation is available for this issue cve vulnerable library hibernate core final jar the core functionality of hibernate library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org hibernate hibernate core final hibernate core final jar dependency hierarchy x hibernate core final jar vulnerable library found in head commit a href found in base branch main vulnerability details a flaw was found in hibernate core in versions prior to and including final a sql injection in the implementation of the jpa criteria api can permit unsanitized literals when a literal is used in the sql comments of the query this flaw could allow an attacker to access unauthorized information or possibly conduct further attacks the highest threat from this vulnerability is to data confidentiality and integrity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a 
href release date fix resolution intact rescue worker helmet automatic remediation is available for this issue cve vulnerable library hibernate core final jar the core functionality of hibernate library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository org hibernate hibernate core final hibernate core final jar dependency hierarchy x hibernate core final jar vulnerable library found in head commit a href found in base branch main vulnerability details a flaw was found in hibernate orm in versions before and a sql injection in the implementation of the jpa criteria api can permit unsanitized literals when a literal is used in the select or group by parts of the query this flaw could allow an attacker to access unauthorized information or possibly conduct further attacks publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution final rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue
| 0
|
386,410
| 26,681,603,379
|
IssuesEvent
|
2023-01-26 18:09:40
|
S-V-23-BootCamp-Team-F/docker
|
https://api.github.com/repos/S-V-23-BootCamp-Team-F/docker
|
closed
|
[chore] : Write the Prometheus & Grafana docker-compose
|
documentation
|
---
name: Environment setup
about: Development environment setup
title: "[chore]"
labels: "환경설정"
assignees: ''
---
## ✨ Environment to set up
Please briefly describe the environment to set up!
name: Monitoring environment setup
about: Prometheus & Grafana
title: "[chore] : Write the Prometheus & Grafana docker-compose"
labels: "환경설정"
assignees: 'jiyoon0701'
<br>
### 📕 References
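A minimal sketch of the requested compose file might look like the following; the image tags, published ports, and the `./prometheus.yml` mount path are assumptions to be adjusted for the project:

```yaml
version: "3.8"

services:
  prometheus:
    image: prom/prometheus:latest   # pin a specific tag in practice
    volumes:
      - ./prometheus.yml:/etc/prometheus/prometheus.yml
    ports:
      - "9090:9090"

  grafana:
    image: grafana/grafana:latest   # pin a specific tag in practice
    ports:
      - "3000:3000"
    depends_on:
      - prometheus
```

Grafana would then be pointed at `http://prometheus:9090` as its data source, since both services share the default compose network.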
|
1.0
|
[chore] : Write the Prometheus & Grafana docker-compose - ---
name: Environment setup
about: Development environment setup
title: "[chore]"
labels: "환경설정"
assignees: ''
---
## ✨ Environment to set up
Please briefly describe the environment to set up!
name: Monitoring environment setup
about: Prometheus & Grafana
title: "[chore] : Write the Prometheus & Grafana docker-compose"
labels: "환경설정"
assignees: 'jiyoon0701'
<br>
### 📕 References
|
non_process
|
prometheus grafana docker compose writing name environment setup about development environment setup title labels 환경설정 assignees ✨ environment to set up please briefly describe the environment to set up name monitoring environment setup about prometheus grafana title prometheus grafana docker compose writing labels 환경설정 assignees 📕 references
| 0
|
341,461
| 30,587,000,976
|
IssuesEvent
|
2023-07-21 14:06:22
|
eclipse-openj9/openj9
|
https://api.github.com/repos/eclipse-openj9/openj9
|
opened
|
jdk_nio_1_FAILED java/nio/file/FileStore/Basic.java RuntimeException: Assertion failed
|
test failure
|
Failure link
------------
From [an internal build](https://hyc-runtimes-jenkins.swg-devops.com/job/Test_openjdk11_j9_extended.openjdk_x86-64_linux/152/)(`rhel9x86-rt1-1`):
```
java version "11.0.20" 2023-07-18
IBM Semeru Runtime Certified Edition 11.0.20.0-rc1 (build 11.0.20+8)
Eclipse OpenJ9 VM 11.0.20.0-rc1 (build v0.40.0-release-b681a676a, JRE 11 Linux amd64-64-Bit Compressed References 20230719_670 (JIT enabled, AOT enabled)
OpenJ9 - b681a676a
OMR - e80bff83b
JCL - 89ad412ebb based on jdk-11.0.20+8)
```
[Rerun in Grinder](https://hyc-runtimes-jenkins.swg-devops.com/job/Grinder/parambuild/?SDK_RESOURCE=customized&TARGET=jdk_nio_1&TEST_FLAG=&UPSTREAM_TEST_JOB_NAME=&DOCKER_REQUIRED=false&ACTIVE_NODE_TIMEOUT=0&VENDOR_TEST_DIRS=&EXTRA_DOCKER_ARGS=&TKG_OWNER_BRANCH=adoptium%3Amaster&OPENJ9_SYSTEMTEST_OWNER_BRANCH=eclipse%3Amaster&PLATFORM=x86-64_linux&GENERATE_JOBS=true&KEEP_REPORTDIR=true&PERSONAL_BUILD=false&DOCKER_REGISTRY_DIR=&RERUN_ITERATIONS=0&ADOPTOPENJDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Faqa-tests.git&SETUP_JCK_RUN=false&DOCKER_REGISTRY_URL_CREDENTIAL_ID=&LABEL=&EXTRA_OPTIONS=&CUSTOMIZED_SDK_URL=https%3A%2F%2Fna-public.artifactory.swg-devops.com%2Fartifactory%2Fsys-rt-generic-local%2Fhyc-runtimes-jenkins.swg-devops.com%2Fbuild-scripts%2Fjobs%2Fjdk11u%2Fjdk11u-linux-x64-openj9-IBM%2F670%2Fibm-semeru-certified-jdk_x64_linux_11.0.20.0-rc1.tar.gz+https%3A%2F%2Fna-public.artifactory.swg-devops.com%2Fartifactory%2Fsys-rt-generic-local%2Fhyc-runtimes-jenkins.swg-devops.com%2Fbuild-scripts%2Fjobs%2Fjdk11u%2Fjdk11u-linux-x64-openj9-IBM%2F670%2Fibm-semeru-certified-testimage_x64_linux_11.0.20.0-rc1.tar.gz&BUILD_IDENTIFIER=&JENKINS_KEY=Jenkins+4096+key&ADOPTOPENJDK_BRANCH=v0.9.8-release&LIGHT_WEIGHT_CHECKOUT=true&USE_JRE=false&ARTIFACTORY_SERVER=na.artifactory.swg-devops&KEEP_WORKSPACE=false&USER_CREDENTIALS_ID=83181e25-eea4-4f55-8b3e-e79615733226&JDK_VERSION=11&DOCKER_REGISTRY_URL=&ITERATIONS=1&VENDOR_TEST_REPOS=&JDK_REPO=git%40github.com%3Aibmruntimes%2Fopenj9-openjdk-jdk11&JCK_GIT_BRANCH=main&RELEASE_TAG=v0.40.0-release&OPENJ9_BRANCH=v0.40.0-release&OPENJ9_SHA=&JCK_GIT_REPO=&VENDOR_TEST_BRANCHES=&OPENJ9_REPO=https%3A%2F%2Fgithub.com%2Feclipse-openj9%2Fopenj9.git&UPSTREAM_JOB_NAME=&CLOUD_PROVIDER=&CUSTOM_TARGET=&VENDOR_TEST_SHAS=&JDK_BRANCH=v0.40.0-release&LABEL_ADDITION=&ARTIFACTORY_REPO=sys-rt-generic-local%2Chyc-runtimes-team-restricted-oracle-sdk-generic-local&ARTIFACTORY_ROOT_DIR=&UPSTREAM_TEST_JOB_NUMBER=&DOCKERIMAGE_TAG=&JDK_IMPL=openj9&TEST_TIME=&SSH_AGENT_CREDENTIAL=83181e25-eea4-4f55-8b3e-e79615733226&AUTO_DETECT=true&SLACK_CHANNEL=%23rt-jenkins&DYNAMIC_COMPILE=false&RELATED_NODES=&ADOPTOPENJDK_SYSTEMTEST_OWNER_BRANCH=adoptium%3Amaster&APPLICATION_OPTIONS=&CUSTOMIZED_SDK_URL_CREDENTIAL_ID=7c1c2c28-650f-49e0-afd1-ca6b60479546&ARCHIVE_TEST_RESULTS=false&NUM_MACHINES=&OPENJDK_SHA=&TRSS_URL=http%3A%2F%2Ftrss1.fyre.ibm.com&USE_TESTENV_PROPERTIES=true&BUILD_LIST=openjdk&ADDITIONAL_ARTIFACTS_REQUIRED=&UPSTREAM_JOB_NUMBER=&STF_OWNER_BRANCH=adoptium%3Amaster&TIME_LIMIT=20&JVM_OPTIONS=&PARALLEL=None) - Change TARGET to run only the failed test targets.
Optional info
-------------
Failure output (captured from console output)
---------------------------------------------
```
[2023-07-19T19:41:54.013Z] variation: Mode650
[2023-07-19T19:41:54.013Z] JVM_OPTIONS: -XX:-UseCompressedOops
[2023-07-19T19:56:39.738Z] TEST: java/nio/file/FileStore/Basic.java
[2023-07-19T19:56:39.751Z] STDERR:
[2023-07-19T19:56:39.751Z] java.lang.RuntimeException: Assertion failed
[2023-07-19T19:56:39.751Z] at Basic.assertTrue(Basic.java:57)
[2023-07-19T19:56:39.751Z] at Basic.doTests(Basic.java:161)
[2023-07-19T19:56:39.751Z] at Basic.main(Basic.java:49)
[2023-07-19T19:56:39.751Z] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[2023-07-19T19:56:39.751Z] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[2023-07-19T19:56:39.751Z] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2023-07-19T19:56:39.751Z] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[2023-07-19T19:56:39.751Z] at com.sun.javatest.regtest.agent.MainActionHelper$AgentVMRunnable.run(MainActionHelper.java:312)
[2023-07-19T19:56:39.751Z] at java.base/java.lang.Thread.run(Thread.java:839)
[2023-07-19T19:56:39.751Z]
[2023-07-19T19:56:39.751Z] JavaTest Message: Test threw exception: java.lang.RuntimeException
[2023-07-19T19:56:39.751Z] JavaTest Message: shutting down test
[2023-07-19T19:56:39.751Z]
[2023-07-19T19:56:39.751Z]
[2023-07-19T19:56:39.751Z] TEST RESULT: Failed. Execution failed: `main' threw exception: java.lang.RuntimeException: Assertion failed
[2023-07-19T19:56:39.751Z] --------------------------------------------------
[2023-07-19T20:13:38.100Z] Test results: passed: 390; failed: 1
[2023-07-19T20:13:38.100Z] Report written to /home/jenkins/workspace/Test_openjdk11_j9_extended.openjdk_x86-64_linux/aqa-tests/TKG/output_16897925209163/jdk_nio_1/report/html/report.html
[2023-07-19T20:13:38.100Z] Results written to /home/jenkins/workspace/Test_openjdk11_j9_extended.openjdk_x86-64_linux/aqa-tests/TKG/output_16897925209163/jdk_nio_1/work
[2023-07-19T20:13:38.100Z] Error: Some tests failed or other problems occurred.
[2023-07-19T20:13:38.100Z] -----------------------------------
[2023-07-19T20:13:38.100Z] jdk_nio_1_FAILED
```
[50x grinder](https://hyc-runtimes-jenkins.swg-devops.com/job/Grinder/34089/)
|
1.0
|
jdk_nio_1_FAILED java/nio/file/FileStore/Basic.java RuntimeException: Assertion failed - Failure link
------------
From [an internal build](https://hyc-runtimes-jenkins.swg-devops.com/job/Test_openjdk11_j9_extended.openjdk_x86-64_linux/152/)(`rhel9x86-rt1-1`):
```
java version "11.0.20" 2023-07-18
IBM Semeru Runtime Certified Edition 11.0.20.0-rc1 (build 11.0.20+8)
Eclipse OpenJ9 VM 11.0.20.0-rc1 (build v0.40.0-release-b681a676a, JRE 11 Linux amd64-64-Bit Compressed References 20230719_670 (JIT enabled, AOT enabled)
OpenJ9 - b681a676a
OMR - e80bff83b
JCL - 89ad412ebb based on jdk-11.0.20+8)
```
[Rerun in Grinder](https://hyc-runtimes-jenkins.swg-devops.com/job/Grinder/parambuild/?SDK_RESOURCE=customized&TARGET=jdk_nio_1&TEST_FLAG=&UPSTREAM_TEST_JOB_NAME=&DOCKER_REQUIRED=false&ACTIVE_NODE_TIMEOUT=0&VENDOR_TEST_DIRS=&EXTRA_DOCKER_ARGS=&TKG_OWNER_BRANCH=adoptium%3Amaster&OPENJ9_SYSTEMTEST_OWNER_BRANCH=eclipse%3Amaster&PLATFORM=x86-64_linux&GENERATE_JOBS=true&KEEP_REPORTDIR=true&PERSONAL_BUILD=false&DOCKER_REGISTRY_DIR=&RERUN_ITERATIONS=0&ADOPTOPENJDK_REPO=https%3A%2F%2Fgithub.com%2Fadoptium%2Faqa-tests.git&SETUP_JCK_RUN=false&DOCKER_REGISTRY_URL_CREDENTIAL_ID=&LABEL=&EXTRA_OPTIONS=&CUSTOMIZED_SDK_URL=https%3A%2F%2Fna-public.artifactory.swg-devops.com%2Fartifactory%2Fsys-rt-generic-local%2Fhyc-runtimes-jenkins.swg-devops.com%2Fbuild-scripts%2Fjobs%2Fjdk11u%2Fjdk11u-linux-x64-openj9-IBM%2F670%2Fibm-semeru-certified-jdk_x64_linux_11.0.20.0-rc1.tar.gz+https%3A%2F%2Fna-public.artifactory.swg-devops.com%2Fartifactory%2Fsys-rt-generic-local%2Fhyc-runtimes-jenkins.swg-devops.com%2Fbuild-scripts%2Fjobs%2Fjdk11u%2Fjdk11u-linux-x64-openj9-IBM%2F670%2Fibm-semeru-certified-testimage_x64_linux_11.0.20.0-rc1.tar.gz&BUILD_IDENTIFIER=&JENKINS_KEY=Jenkins+4096+key&ADOPTOPENJDK_BRANCH=v0.9.8-release&LIGHT_WEIGHT_CHECKOUT=true&USE_JRE=false&ARTIFACTORY_SERVER=na.artifactory.swg-devops&KEEP_WORKSPACE=false&USER_CREDENTIALS_ID=83181e25-eea4-4f55-8b3e-e79615733226&JDK_VERSION=11&DOCKER_REGISTRY_URL=&ITERATIONS=1&VENDOR_TEST_REPOS=&JDK_REPO=git%40github.com%3Aibmruntimes%2Fopenj9-openjdk-jdk11&JCK_GIT_BRANCH=main&RELEASE_TAG=v0.40.0-release&OPENJ9_BRANCH=v0.40.0-release&OPENJ9_SHA=&JCK_GIT_REPO=&VENDOR_TEST_BRANCHES=&OPENJ9_REPO=https%3A%2F%2Fgithub.com%2Feclipse-openj9%2Fopenj9.git&UPSTREAM_JOB_NAME=&CLOUD_PROVIDER=&CUSTOM_TARGET=&VENDOR_TEST_SHAS=&JDK_BRANCH=v0.40.0-release&LABEL_ADDITION=&ARTIFACTORY_REPO=sys-rt-generic-local%2Chyc-runtimes-team-restricted-oracle-sdk-generic-local&ARTIFACTORY_ROOT_DIR=&UPSTREAM_TEST_JOB_NUMBER=&DOCKERIMAGE_TAG=&JDK_IMPL=openj9&TEST_TIME=&SSH_AGENT_CREDENTIAL=83181e25-eea4-4f55-8b3e-e79615733226&AUTO_DETECT=true&SLACK_CHANNEL=%23rt-jenkins&DYNAMIC_COMPILE=false&RELATED_NODES=&ADOPTOPENJDK_SYSTEMTEST_OWNER_BRANCH=adoptium%3Amaster&APPLICATION_OPTIONS=&CUSTOMIZED_SDK_URL_CREDENTIAL_ID=7c1c2c28-650f-49e0-afd1-ca6b60479546&ARCHIVE_TEST_RESULTS=false&NUM_MACHINES=&OPENJDK_SHA=&TRSS_URL=http%3A%2F%2Ftrss1.fyre.ibm.com&USE_TESTENV_PROPERTIES=true&BUILD_LIST=openjdk&ADDITIONAL_ARTIFACTS_REQUIRED=&UPSTREAM_JOB_NUMBER=&STF_OWNER_BRANCH=adoptium%3Amaster&TIME_LIMIT=20&JVM_OPTIONS=&PARALLEL=None) - Change TARGET to run only the failed test targets.
Optional info
-------------
Failure output (captured from console output)
---------------------------------------------
```
[2023-07-19T19:41:54.013Z] variation: Mode650
[2023-07-19T19:41:54.013Z] JVM_OPTIONS: -XX:-UseCompressedOops
[2023-07-19T19:56:39.738Z] TEST: java/nio/file/FileStore/Basic.java
[2023-07-19T19:56:39.751Z] STDERR:
[2023-07-19T19:56:39.751Z] java.lang.RuntimeException: Assertion failed
[2023-07-19T19:56:39.751Z] at Basic.assertTrue(Basic.java:57)
[2023-07-19T19:56:39.751Z] at Basic.doTests(Basic.java:161)
[2023-07-19T19:56:39.751Z] at Basic.main(Basic.java:49)
[2023-07-19T19:56:39.751Z] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
[2023-07-19T19:56:39.751Z] at java.base/jdk.internal.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
[2023-07-19T19:56:39.751Z] at java.base/jdk.internal.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
[2023-07-19T19:56:39.751Z] at java.base/java.lang.reflect.Method.invoke(Method.java:566)
[2023-07-19T19:56:39.751Z] at com.sun.javatest.regtest.agent.MainActionHelper$AgentVMRunnable.run(MainActionHelper.java:312)
[2023-07-19T19:56:39.751Z] at java.base/java.lang.Thread.run(Thread.java:839)
[2023-07-19T19:56:39.751Z]
[2023-07-19T19:56:39.751Z] JavaTest Message: Test threw exception: java.lang.RuntimeException
[2023-07-19T19:56:39.751Z] JavaTest Message: shutting down test
[2023-07-19T19:56:39.751Z]
[2023-07-19T19:56:39.751Z]
[2023-07-19T19:56:39.751Z] TEST RESULT: Failed. Execution failed: `main' threw exception: java.lang.RuntimeException: Assertion failed
[2023-07-19T19:56:39.751Z] --------------------------------------------------
[2023-07-19T20:13:38.100Z] Test results: passed: 390; failed: 1
[2023-07-19T20:13:38.100Z] Report written to /home/jenkins/workspace/Test_openjdk11_j9_extended.openjdk_x86-64_linux/aqa-tests/TKG/output_16897925209163/jdk_nio_1/report/html/report.html
[2023-07-19T20:13:38.100Z] Results written to /home/jenkins/workspace/Test_openjdk11_j9_extended.openjdk_x86-64_linux/aqa-tests/TKG/output_16897925209163/jdk_nio_1/work
[2023-07-19T20:13:38.100Z] Error: Some tests failed or other problems occurred.
[2023-07-19T20:13:38.100Z] -----------------------------------
[2023-07-19T20:13:38.100Z] jdk_nio_1_FAILED
```
[50x grinder](https://hyc-runtimes-jenkins.swg-devops.com/job/Grinder/34089/)
|
non_process
|
jdk nio failed java nio file filestore basic java runtimeexception assertion failed failure link from java version ibm semeru runtime certified edition build eclipse vm build release jre linux bit compressed references jit enabled aot enabled omr jcl based on jdk change target to run only the failed test targets optional info failure output captured from console output variation jvm options xx usecompressedoops test java nio file filestore basic java stderr java lang runtimeexception assertion failed at basic asserttrue basic java at basic dotests basic java at basic main basic java at java base jdk internal reflect nativemethodaccessorimpl native method at java base jdk internal reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at java base jdk internal reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java base java lang reflect method invoke method java at com sun javatest regtest agent mainactionhelper agentvmrunnable run mainactionhelper java at java base java lang thread run thread java javatest message test threw exception java lang runtimeexception javatest message shutting down test test result failed execution failed main threw exception java lang runtimeexception assertion failed test results passed failed report written to home jenkins workspace test extended openjdk linux aqa tests tkg output jdk nio report html report html results written to home jenkins workspace test extended openjdk linux aqa tests tkg output jdk nio work error some tests failed or other problems occurred jdk nio failed
| 0
|
88,244
| 3,775,325,560
|
IssuesEvent
|
2016-03-17 13:08:15
|
mantidproject/autoreduce
|
https://api.github.com/repos/mantidproject/autoreduce
|
opened
|
WISH script is writing certain run numbers out wrong
|
Bugfix High Priority
|
Pascal reported that for run numbers ending with zero, like 33280, such run numbers get saved as 3328 in this instance for example.
|
1.0
|
WISH script is writing certain run numbers out wrong - Pascal reported that for run numbers ending with zero, like 33280, such run numbers get saved as 3328 in this instance for example.
|
non_process
|
wish script is writing certain run numbers out wrong pascal reported that for run numbers ending with zero like such run numbers get saved as in this instance for example
| 0
|
20,001
| 26,475,696,322
|
IssuesEvent
|
2023-01-17 10:56:41
|
quark-engine/quark-engine
|
https://api.github.com/repos/quark-engine/quark-engine
|
closed
|
Update analysis library for Rizin v0.4.x
|
issue-processing-state-06
|
**Is your feature request related to a problem? Please describe.**
The upcoming release of Rizin v0.4.x has included features for Dex files, such as cross-reference supports and APIs to get the class name of a specified method.
Those features build up the foundation of the analysis library. They help construct a more accurate analysis result with Rizin. Thus, we should update our library to include those features.
**Describe the solution you'd like**
Update the Rizin-based library to work with the current development version of Rizin v0.4.x (commit: [de8a5cac5532845643a52d1231b17a7b34feb50a](https://github.com/rizinorg/rizin/commit/de8a5cac5532845643a52d1231b17a7b34feb50a)).
|
1.0
|
Update analysis library for Rizin v0.4.x - **Is your feature request related to a problem? Please describe.**
The upcoming release of Rizin v0.4.x has included features for Dex files, such as cross-reference supports and APIs to get the class name of a specified method.
Those features build up the foundation of the analysis library. They help construct a more accurate analysis result with Rizin. Thus, we should update our library to include those features.
**Describe the solution you'd like**
Update the Rizin-based library to work with the current development version of Rizin v0.4.x (commit: [de8a5cac5532845643a52d1231b17a7b34feb50a](https://github.com/rizinorg/rizin/commit/de8a5cac5532845643a52d1231b17a7b34feb50a)).
|
process
|
update analysis library for rizin x is your feature request related to a problem please describe the upcoming release of rizin x has included features for dex files such as cross reference supports and apis to get the class name of a specified method those features build up the foundation of the analysis library they help construct a more accurate analysis result with rizin thus we should update our library to include those features describe the solution you d like update the rizin based library to work with the current development version of rizin x commit
| 1
|
17,490
| 23,303,655,215
|
IssuesEvent
|
2022-08-07 17:52:03
|
triathematician/covid-analytics
|
https://api.github.com/repos/triathematician/covid-analytics
|
opened
|
Setup CI github action
|
process
|
Should run maven build to ensure compile is successful. Run automatically for pull requests.
|
1.0
|
Setup CI github action - Should run maven build to ensure compile is successful. Run automatically for pull requests.
|
process
|
setup ci github action should run maven build to ensure compile is successful run automatically for pull requests
| 1
|
6,486
| 14,643,463,050
|
IssuesEvent
|
2020-12-25 16:39:02
|
ArisiaInc/a21-remote-site
|
https://api.github.com/repos/ArisiaInc/a21-remote-site
|
closed
|
Figure out how to do "interstitials"
|
architecture
|
Between two items in programming, we might want to show some sort of interstitials -- music, slide-show, that sort of thing -- in Zoom.
How can we best make this happen? It can certainly be done with screen sharing: can we do something more automated, that doesn't take so many people points.
|
1.0
|
Figure out how to do "interstitials" - Between two items in programming, we might want to show some sort of interstitials -- music, slide-show, that sort of thing -- in Zoom.
How can we best make this happen? It can certainly be done with screen sharing: can we do something more automated, that doesn't take so many people points.
|
non_process
|
figure out how to do interstitials between two items in programming we might want to show some sort of interstitials music slide show that sort of thing in zoom how can we best make this happen it can certainly be done with screen sharing can we do something more automated that doesn t take so many people points
| 0
|
13,812
| 16,574,668,856
|
IssuesEvent
|
2021-05-31 01:22:52
|
Leviatan-Analytics/LA-data-processing
|
https://api.github.com/repos/Leviatan-Analytics/LA-data-processing
|
closed
|
Test pre processing image colors for feature matching [2]
|
Data Processing Sprint 2 Week 1
|
Filter image by pixel color in order to extract wards from it.
Output: Document with the output images and simple script for automating this task.
|
1.0
|
Test pre processing image colors for feature matching [2] - Filter image by pixel color in order to extract wards from it.
Output: Document with the output images and simple script for automating this task.
|
process
|
test pre processing image colors for feature matching filter image by pixel color in order to extract wards from it output document with the output images and simple script for automating this task
| 1
|
15,228
| 19,100,646,765
|
IssuesEvent
|
2021-11-29 22:04:04
|
Scott-Collier/MA5851_SP86_2021
|
https://api.github.com/repos/Scott-Collier/MA5851_SP86_2021
|
opened
|
Fix Original Price and Discount Column when there is no Discount (NaN)
|
Processing
|
When there is no Discount, Price and Original Price should be the same, and Discount should be 0.
Currently when there is no discount, Price = x, Original Price = NaN, Discount=NaN
|
1.0
|
Fix Original Price and Discount Column when there is no Discount (NaN) - When there is no Discount, Price and Original Price should be the same, and Discount should be 0.
Currently when there is no discount, Price = x, Original Price = NaN, Discount=NaN
|
process
|
fix original price and discount column when there is no discount nan when there is no discount price and original price should be the same and discount should be currently when there is no discount price x original price nan discount nan
| 1
|
248,480
| 7,931,669,288
|
IssuesEvent
|
2018-07-07 03:19:04
|
ngageoint/hootenanny
|
https://api.github.com/repos/ngageoint/hootenanny
|
opened
|
Curves turning into areas
|
Category: Core Category: Translation Priority: Medium Status: In Progress Type: Bug in progress
|
Some line features are getting turned into areas when exporting as TDSv61.
* use=transportation_support
* There will be others
NOTE: This is the first to be fixed. A follow-on ticket will cover others as they are found through testing..
|
1.0
|
Curves turning into areas - Some line features are getting turned into areas when exporting as TDSv61.
* use=transportation_support
* There will be others
NOTE: This is the first to be fixed. A follow-on ticket will cover others as they are found through testing..
|
non_process
|
curves turning into areas some line features are getting turned into areas when exporting as use transportation support there will be others note this is the first to be fixed a follow on ticket will cover others as they are found through testing
| 0
|
10,594
| 13,401,350,115
|
IssuesEvent
|
2020-09-03 17:10:18
|
jgraley/inferno-cpp2v
|
https://api.github.com/repos/jgraley/inferno-cpp2v
|
closed
|
Don't throw in SimpleCompare
|
Constraint Processing
|
Will be used by the CSP code, which is basically nothrow.
Add a method to `AndRuleEngine` which wraps `SimpleCompare` and does throw. Call it `CouplingCompare()` and let this function establish policy for couplings in one single place. See #121
|
1.0
|
Don't throw in SimpleCompare - Will be used by the CSP code, which is basically nothrow.
Add a method to `AndRuleEngine` which wraps `SimpleCompare` and does throw. Call it `CouplingCompare()` and let this function establish policy for couplings in one single place. See #121
|
process
|
don t throw in simplecompare will be used by the csp code which is basically nothrow add a method to andruleengine which wraps simplecompare and does throw call it couplingcompare and let this function establish policy for couplings in one single place see
| 1
|
9,723
| 12,718,306,005
|
IssuesEvent
|
2020-06-24 07:16:42
|
prisma/vscode
|
https://api.github.com/repos/prisma/vscode
|
closed
|
Publish lsp server to npm
|
kind/improvement process/next-milestone team/typescript
|
We recently wrote a LSP server implementation for our vscode extension.
It would be very useful if we can publish that to npm so that other project can also use the lsp implementation. That way if we do bug fixes here they will also get them.
|
1.0
|
Publish lsp server to npm - We recently wrote a LSP server implementation for our vscode extension.
It would be very useful if we can publish that to npm so that other project can also use the lsp implementation. That way if we do bug fixes here they will also get them.
|
process
|
publish lsp server to npm we recently wrote a lsp server implementation for our vscode extension it would be very useful if we can publish that to npm so that other project can also use the lsp implementation that way if we do bug fixes here they will also get them
| 1
|
6,417
| 9,515,560,677
|
IssuesEvent
|
2019-04-26 06:09:35
|
dita-ot/dita-ot
|
https://api.github.com/repos/dita-ot/dita-ot
|
closed
|
@chunk destination folder
|
enhancement preprocess/chunking priority/medium stale
|
I have a project, which consists of the following files:
- ` my-project.ditamap`, which is stored in the `/my-project` folder.
- `first-topic.dita`, which is included as a topicref in `my-project.ditamap` and is stored in the nested `concepts` folder (`/my-project/concepts`).
If I set the chunk attribute for the `first-topic.dita` topicref to "by-topic", the output chunked files are generated to the `/output/my-project` folder and not to `/output/my-project/concepts`.
Does this match the expected behavior or is it an error and the files should be generated to `/output/my-project/concepts`?
|
1.0
|
@chunk destination folder - I have a project, which consists of the following files:
- ` my-project.ditamap`, which is stored in the `/my-project` folder.
- `first-topic.dita`, which is included as a topicref in `my-project.ditamap` and is stored in the nested `concepts` folder (`/my-project/concepts`).
If I set the chunk attribute for the `first-topic.dita` topicref to "by-topic", the output chunked files are generated to the `/output/my-project` folder and not to `/output/my-project/concepts`.
Does this match the expected behavior or is it an error and the files should be generated to `/output/my-project/concepts`?
|
process
|
chunk destination folder i have a project which consists of the following files my project ditamap which is stored in the my project folder first topic dita which is included as a topicref in my project ditamap and is stored in the nested concepts folder my project concepts if i set the chunk attribute for the first topic dita topicref to by topic the output chunked files are generated to the output my project folder and not to output my project concepts does this match the expected behavior or is it an error and the files should be generated to output my project concepts
| 1
|
13,225
| 15,691,312,592
|
IssuesEvent
|
2021-03-25 17:43:01
|
microsoft/react-native-windows
|
https://api.github.com/repos/microsoft/react-native-windows
|
closed
|
RNW 0.64 Release Discussion
|
Area: Release Process Discussion enhancement
|
WIP Changelog here: https://github.com/microsoft/react-native-windows/releases/tag/react-native-windows_v0.64.0-preview.1
## Milestones
- **11/25:** Release 0.64.0-preview.1
- **1/15:** Changes in 0.64-stable require triage
- **1/29:** Release Target
## Blocking Issues
- re-generate API docs for 0.64 #6806 (asklar)
- Integrate final bits from react-native@0.64.0
## Checklist
**Legend**
- [ ] ⁉ Needs driver
- [ ] Work not started
- [ ] 🏃♂️ Work in progress
- [x] Work completed
**Before Preview**
- [x] Draft GitHub release notes from commit log (NickGerleman)
- [x] Promote canary build to preview using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [x] Smoke test of functionality (NickGerleman)
- Some discussion here https://github.com/microsoft/react-native-windows/issues/5326
- [x] Poke at RNTester
- [x] Test new C++ app with run-windows
- [x] Test new C++ app with VS
- [x] Test new C# app with run-windows
- [x] Test new C# app with VS
- [x] Test new C++ app with Hermes
- [x] Push build to 0.63-stable branch and publish (NickGerleman )
- [x] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [x] Post release notes internally (NickGerleman)
-----
**After Preview**
- [x] Send mail to the team reminding of dates + restrictions (NickGerleman)
- [x] Move most issues targeting 0.64 to 0.65 (chrisglein)
- [x] Add "Blocking" label to any known showstopper regressions (chrisglein)
- [x] Test updated gallery app (stmoy)
- [x] Test updated samples (jonthysell)
- [x] Do a pass on API Docs (ngerlem)
- [x] Integrate patch/prerelease releases for React Native (NickGerleman)
- [x] Modify CODEOWNERS in 0.63-stable to require changes go through @microsoft/react-native-windows-backport-triage (NickGerleman)
- [x] Send reminder mail to the team about backport triage (NickGerleman)
-----
**Before Release**
- [x] Ensure any community typing changes happen (rectified95?)
- [x] Test samples against newest version (jonthysell)
- [x] Ensure doc issues are addressed (NickGerleman)
- [x] Promote `latest` build to `legacy` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [x] Ensure all issues marked with "Blocking Label" are fixed (chrisglein)
-----
**Release**
- [x] Update preview release notes with any changes from cherry-picked PRs (NickGerleman)
- [x] Update sample repos to new version (jonthysell)
- [x] Update gallery to a new version (chiara)
- [x] Smoke test of functionality (NickGerleman)
- [x] Poke at RNTester
- [x] Test new C++ app with run-windows
- [x] Test new C++ app with VS
- [x] Test new C# app with run-windows
- [x] Test new C# app with VS
- [x] Test new C++ app with Hermes
- [x] Promote `preview` build to `latest` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [x] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [x] Flip docs site to 0.64 (kikisaints/jonthysell)
- [x] Send out internal release announcement (NickGerleman)
- [x] Send out external release announcement (kikisaints)
|
1.0
|
RNW 0.64 Release Discussion - WIP Changelog here: https://github.com/microsoft/react-native-windows/releases/tag/react-native-windows_v0.64.0-preview.1
## Milestones
- **11/25:** Release 0.64.0-preview.1
- **1/15:** Changes in 0.64-stable require triage
- **1/29:** Release Target
## Blocking Issues
- re-generate API docs for 0.64 #6806 (asklar)
- Integrate final bits from react-native@0.64.0
## Checklist
**Legend**
- [ ] ⁉ Needs driver
- [ ] Work not started
- [ ] 🏃♂️ Work in progress
- [x] Work completed
**Before Preview**
- [x] Draft GitHub release notes from commit log (NickGerleman)
- [x] Promote canary build to preview using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [x] Smoke test of functionality (NickGerleman)
- Some discussion here https://github.com/microsoft/react-native-windows/issues/5326
- [x] Poke at RNTester
- [x] Test new C++ app with run-windows
- [x] Test new C++ app with VS
- [x] Test new C# app with run-windows
- [x] Test new C# app with VS
- [x] Test new C++ app with Hermes
- [x] Push build to 0.63-stable branch and publish (NickGerleman )
- [x] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [x] Post release notes internally (NickGerleman)
-----
**After Preview**
- [x] Send mail to the team reminding of dates + restrictions (NickGerleman)
- [x] Move most issues targeting 0.64 to 0.65 (chrisglein)
- [x] Add "Blocking" label to any known showstopper regressions (chrisglein)
- [x] Test updated gallery app (stmoy)
- [x] Test updated samples (jonthysell)
- [x] Do a pass on API Docs (ngerlem)
- [x] Integrate patch/prerelease releases for React Native (NickGerleman)
- [x] Modify CODEOWNERS in 0.63-stable to require changes go through @microsoft/react-native-windows-backport-triage (NickGerleman)
- [x] Send reminder mail to the team about backport triage (NickGerleman)
-----
**Before Release**
- [x] Ensure any community typing changes happen (rectified95?)
- [x] Test samples against newest version (jonthysell)
- [x] Ensure doc issues are addressed (NickGerleman)
- [x] Promote `latest` build to `legacy` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [x] Ensure all issues marked with "Blocking Label" are fixed (chrisglein)
-----
**Release**
- [x] Update preview release notes with any changes from cherry-picked PRs (NickGerleman)
- [x] Update sample repos to new version (jonthysell)
- [x] Update gallery to a new version (chiara)
- [x] Smoke test of functionality (NickGerleman)
- [x] Poke at RNTester
- [x] Test new C++ app with run-windows
- [x] Test new C++ app with VS
- [x] Test new C# app with run-windows
- [x] Test new C# app with VS
- [x] Test new C++ app with Hermes
- [x] Promote `preview` build to `latest` using [wiki instructions](https://github.com/microsoft/react-native-windows/wiki/How-to-promote-a-release) (NickGerleman)
- [x] Update GitHub release notes to use manually curated notes instead of a changelog (NickGerleman)
- [x] Flip docs site to 0.64 (kikisaints/jonthysell)
- [x] Send out internal release announcement (NickGerleman)
- [x] Send out external release announcement (kikisaints)
|
process
|
rnw release discussion wip changelog here milestones release preview changes in stable require triage release target blocking issues re generate api docs for asklar integrate final bits from react native checklist legend ⁉ needs driver work not started 🏃♂️ work in progress work completed before preview draft github release notes from commit log nickgerleman promote canary build to preview using nickgerleman smoke test of functionality nickgerleman some discussion here poke at rntester test new c app with run windows test new c app with vs test new c app with run windows test new c app with vs test new c app with hermes push build to stable branch and publish nickgerleman update github release notes to use manually curated notes instead of a changelog nickgerleman post release notes internally nickgerleman after preview send mail to the team reminding of dates restrictions nickgerleman move most issues targeting to chrisglein add blocking label to any known showstopper regressions chrisglein test updated gallery app stmoy test updated samples jonthysell do a pass on api docs ngerlem integrate patch prerelease releases for react native nickgerleman modify codeowners in stable to require changes go through microsoft react native windows backport triage nickgerleman send reminder mail to the team about backport triage nickgerleman before release ensure any community typing changes happen test samples against newest version jonthysell ensure doc issues are addressed nickgerleman promote latest build to legacy using nickgerleman ensure all issues marked with blocking label are fixed chrisglein release update preview release notes with any changes from cherry picked prs nickgerleman update sample repos to new version jonthysell update gallery to a new version chiara smoke test of functionality nickgerleman poke at rntester test new c app with run windows test new c app with vs test new c app with run windows test new c app with vs test new c app with hermes promote preview build to latest using nickgerleman update github release notes to use manually curated notes instead of a changelog nickgerleman flip docs site to kikisaints jonthysell send out internal release announcement nickgerleman send out external release announcement kikisaints
| 1
|
2,505
| 5,277,587,394
|
IssuesEvent
|
2017-02-07 03:56:00
|
rubberduck-vba/Rubberduck
|
https://api.github.com/repos/rubberduck-vba/Rubberduck
|
closed
|
Local variable 'Mid$' is not declared
|
bug critical parse-tree-processing
|
Local variable 'Mid$' is not declared
It's a VBA function. Elsewhere it reports Replace function `Mid` with existing typed function
|
1.0
|
Local variable 'Mid$' is not declared - Local variable 'Mid$' is not declared
It's a VBA function. Elsewhere it reports Replace function `Mid` with existing typed function
|
process
|
local variable mid is not declared local variable mid is not declared it s a vba function elsewhere it reports replace function mid with existing typed function
| 1
|
4,379
| 7,261,579,925
|
IssuesEvent
|
2018-02-18 22:07:24
|
alphagov/employee-id
|
https://api.github.com/repos/alphagov/employee-id
|
opened
|
ID Checkers may not be able to use mobile to check id
|
ID Checker Major Issue Process Insight
|
**Why?**
- Accessibility issues
- Technological constraints
- Usability issues scanning biometrics and documents
|
1.0
|
ID Checkers may not be able to use mobile to check id - **Why?**
- Accessibility issues
- Technological constraints
- Usability issues scanning biometrics and documents
|
process
|
id checkers may not be able to use mobile to check id why accessibility issues technological constraints usability issues scanning biometrics and documents
| 1
|
8,348
| 11,499,006,881
|
IssuesEvent
|
2020-02-12 13:10:18
|
prisma/prisma2
|
https://api.github.com/repos/prisma/prisma2
|
opened
|
Improve .raw API with parameters
|
kind/docs kind/improvement process/candidate topic: prisma-client
|
The current .raw API is too limited (no parameters allowed)
## Current status
```ts
const result = await prisma.raw`SELECT * FROM USER;`
// result = [
// { "id":1, "email":"sarah@prisma.io", "name":"Sarah" },
// { "id":2, "email":"alice@prisma.io", "name":"Alice" }
// ]
```
## Current limitation
It's currently not possible to pass any arguments to the SQL statement:
```ts
// not possible
const table = `USER`
const result = await prisma.raw`SELECT * FROM ${table};`
```
https://github.com/prisma/prisma2/blob/docs/preview022/docs/prisma-client-js/api.md#raw-database-access
We could use https://github.com/blakeembrey/sql-template-tag
|
1.0
|
Improve .raw API with parameters - The current .raw API is too limited (no parameters allowed)
## Current status
```ts
const result = await prisma.raw`SELECT * FROM USER;`
// result = [
// { "id":1, "email":"sarah@prisma.io", "name":"Sarah" },
// { "id":2, "email":"alice@prisma.io", "name":"Alice" }
// ]
```
## Current limitation
It's currently not possible to pass any arguments to the SQL statement:
```ts
// not possible
const table = `USER`
const result = await prisma.raw`SELECT * FROM ${table};`
```
https://github.com/prisma/prisma2/blob/docs/preview022/docs/prisma-client-js/api.md#raw-database-access
We could use https://github.com/blakeembrey/sql-template-tag
|
process
|
improve raw api with parameters the current raw api is too limited no parameters allowed current status ts const result await prisma raw select from user result id email sarah prisma io name sarah id email alice prisma io name alice current limitation it s currently not possible to pass any arguments to the sql statement ts not possible const table user const result await prisma raw select from table we could use
| 1
|
19,299
| 25,466,465,601
|
IssuesEvent
|
2022-11-25 05:14:46
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[IDP] [PM] Participant details screen > UI Issue
|
Bug P2 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
|
UI breakage is observed in the participant details screen

|
3.0
|
[IDP] [PM] Participant details screen > UI Issue - UI breakage is observed in the participant details screen

|
process
|
participant details screen ui issue ui breakage is observed in the participant details screen
| 1
|
92,579
| 8,369,018,760
|
IssuesEvent
|
2018-10-04 16:08:30
|
STEllAR-GROUP/phylanx
|
https://api.github.com/repos/STEllAR-GROUP/phylanx
|
closed
|
tests.unit.plugins.matrixops.random_distributions buildbot test failing
|
category: continuous integration category: tests compiler: gcc platform: linux
|
...again, another test that only fails for the buildbot user.
```
[buildbot@delphi phylanx-Release]$ ctest -V -R tests.unit.plugins.matrixops.random_distributions
UpdateCTestConfiguration from :/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
Parse Config file:/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
UpdateCTestConfiguration from :/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
Parse Config file:/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
Test project /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 126
Start 126: tests.unit.plugins.matrixops.random_distributions
126: Test command: /packages/python/3.6.4-ssl/bin/python3 "/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/phylanxrun.py" "/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/random_distributions_test" "-e" "0" "-l" "1" "-t" "1" "-v" "--"
126: Test timeout computed to be: 1500
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::normal_distribution<double>]': '[0.580775, -0.222945, -0.574052, 0.0846135, 2.06277, -1.49428, -0.372216, 0.814404, -0.418164, -0.431843, 1.23158, 0.496183, 0.175568, 0.473541, -0.505594, -1.04171, -0.668944, 0.592764, 0.733453, 1.82618, 0.155324, 0.804811, -0.545873, -0.421071, 0.0123856, -0.93878, -1.17346, -1.11744, -0.152928, -0.735588, -0.205178, 0.580281]' != '[[0.580775], [-0.222945], [-0.574052], [0.0846135], [2.06277], [-1.49428], [-0.372216], [0.814404], [-0.418164], [-0.431843], [1.23158], [0.496183], [0.175568], [0.473541], [-0.505594], [-1.04171], [-0.668944], [0.592764], [0.733453], [1.82618], [0.155324], [0.804811], [-0.545873], [-0.421071], [0.0123856], [-0.93878], [-1.17346], [-1.11744], [-0.152928], [-0.735588], [-0.205178], [0.580281]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::uniform_real_distribution<double>]': '[0.742667, 0.619634, 0.14849, 0.908298, 0.755534, 0.716926, 0.692001, 0.357867, 0.587831, 0.787731, 0.35707, 0.987934, 0.0315901, 0.989657, 0.785355, 0.804331, 0.112768, 0.519823, 0.107913, 0.654452, 0.743991, 0.0174458, 0.429576, 0.563064, 0.156636, 0.684107, 0.361534, 0.643585, 0.0101308, 0.339076, 0.735305, 0.778844]' != '[[0.742667], [0.619634], [0.14849], [0.908298], [0.755534], [0.716926], [0.692001], [0.357867], [0.587831], [0.787731], [0.35707], [0.987934], [0.0315901], [0.989657], [0.785355], [0.804331], [0.112768], [0.519823], [0.107913], [0.654452], [0.743991], [0.0174458], [0.429576], [0.563064], [0.156636], [0.684107], [0.361534], [0.643585], [0.0101308], [0.339076], [0.735305], [0.778844]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::uniform_real_distribution<double>]': '[2.12055, 3.59671, 2.16695, 3.55361, 3.85889, 3.0245, 3.81361, 2.54431, 2.94064, 3.90978, 3.58663, 2.58878, 2.22568, 3.66061, 3.7666, 3.60083, 2.02072, 3.68688, 3.78987, 3.30958, 3.17404, 3.12245, 2.68987, 2.86562, 2.14594, 2.38964, 2.40754, 3.68882, 3.49536, 2.12937, 2.1939, 3.80057]' != '[[2.12055], [3.59671], [2.16695], [3.55361], [3.85889], [3.0245], [3.81361], [2.54431], [2.94064], [3.90978], [3.58663], [2.58878], [2.22568], [3.66061], [3.7666], [3.60083], [2.02072], [3.68688], [3.78987], [3.30958], [3.17404], [3.12245], [2.68987], [2.86562], [2.14594], [2.38964], [2.40754], [3.68882], [3.49536], [2.12937], [2.1939], [3.80057]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = unsigned char; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::bernoulli_distribution]': '[true, true, true, false, true, true, true, true, false, false, false, false, true, false, true, false, false, true, true, false, true, false, true, true, true, true, false, false, true, true, true, false]' != '[[true], [true], [true], [false], [true], [true], [true], [true], [false], [false], [false], [false], [true], [false], [true], [false], [false], [true], [true], [false], [true], [false], [true], [true], [true], [true], [false], [false], [true], [true], [true], [false]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = unsigned char; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::bernoulli_distribution]': '[true, true, true, true, true, true, true, true, true, true, true, true, true, true, true, false, true, true, true, true, true, true, false, true, true, false, true, true, true, true, true, true]' != '[[true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [false], [true], [true], [true], [true], [true], [true], [false], [true], [true], [false], [true], [true], [true], [true], [true], [true]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::binomial_distribution<int>]': '[1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0]' != '[[1], [1], [0], [0], [1], [0], [0], [0], [0], [1], [1], [1], [0], [0], [1], [1], [1], [0], [0], [0], [0], [0], [1], [0], [1], [1], [1], [0], [0], [0], [1], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::binomial_distribution<int>]': '[9, 5, 7, 9, 9, 7, 7, 6, 9, 9, 9, 10, 7, 9, 4, 7, 9, 9, 8, 8, 9, 9, 9, 9, 8, 7, 9, 6, 10, 6, 8, 7]' != '[[9], [5], [7], [9], [9], [7], [7], [6], [9], [9], [9], [10], [7], [9], [4], [7], [9], [9], [8], [8], [9], [9], [9], [9], [8], [7], [9], [6], [10], [6], [8], [7]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::negative_binomial_distribution<int>]': '[0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 2, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 3, 0, 3, 0]' != '[[0], [0], [0], [0], [1], [0], [1], [1], [0], [0], [2], [0], [1], [0], [1], [1], [0], [0], [1], [1], [1], [0], [0], [1], [1], [0], [1], [1], [3], [0], [3], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::negative_binomial_distribution<int>]': '[3, 4, 7, 6, 2, 1, 2, 3, 3, 0, 0, 3, 2, 2, 3, 2, 2, 0, 2, 2, 3, 3, 1, 4, 3, 2, 1, 1, 1, 2, 3, 5]' != '[[3], [4], [7], [6], [2], [1], [2], [3], [3], [0], [0], [3], [2], [2], [3], [2], [2], [0], [2], [2], [3], [3], [1], [4], [3], [2], [1], [1], [1], [2], [3], [5]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::geometric_distribution<int>]': '[0, 1, 1, 0, 3, 1, 1, 2, 0, 2, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 3, 3, 1, 1, 0, 0, 2, 0, 1, 0, 0, 0]' != '[[0], [1], [1], [0], [3], [1], [1], [2], [0], [2], [1], [0], [0], [0], [0], [0], [0], [0], [1], [1], [3], [3], [1], [1], [0], [0], [2], [0], [1], [0], [0], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::geometric_distribution<int>]': '[0, 0, 0, 0, 0, 1, 0, 0, 2, 1, 0, 2, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1]' != '[[0], [0], [0], [0], [0], [1], [0], [0], [2], [1], [0], [2], [0], [0], [0], [0], [0], [0], [0], [1], [0], [0], [0], [0], [0], [0], [0], [0], [0], [0], [0], [1]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::poisson_distribution<int>]': '[3, 1, 1, 2, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 1, 4, 0, 0, 0, 0, 4, 1, 1, 0, 1, 0, 0, 1, 1, 1, 2, 0]' != '[[3], [1], [1], [2], [1], [0], [0], [0], [1], [1], [0], [0], [1], [0], [1], [4], [0], [0], [0], [0], [4], [1], [1], [0], [1], [0], [0], [1], [1], [1], [2], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::poisson_distribution<int>]': '[0, 3, 7, 3, 0, 0, 2, 8, 4, 5, 4, 2, 5, 6, 5, 2, 4, 6, 4, 2, 8, 3, 5, 3, 5, 3, 6, 3, 3, 6, 4, 3]' != '[[0], [3], [7], [3], [0], [0], [2], [8], [4], [5], [4], [2], [5], [6], [5], [2], [4], [6], [4], [2], [8], [3], [5], [3], [5], [3], [6], [3], [3], [6], [4], [3]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::exponential_distribution<double>]': '[0.441249, 0.360445, 1.57391, 0.044464, 0.471982, 0.0571889, 0.44295, 0.520508, 0.710618, 0.348814, 1.23563, 0.737762, 2.99639, 1.15965, 5.66962, 1.26029, 0.081411, 3.34859, 0.614425, 1.96163, 0.301867, 0.0854607, 1.5064, 1.12086, 0.434801, 0.450939, 0.474579, 0.912921, 0.731756, 0.426238, 0.239172, 0.607869]' != '[[0.441249], [0.360445], [1.57391], [0.044464], [0.471982], [0.0571889], [0.44295], [0.520508], [0.710618], [0.348814], [1.23563], [0.737762], [2.99639], [1.15965], [5.66962], [1.26029], [0.081411], [3.34859], [0.614425], [1.96163], [0.301867], [0.0854607], [1.5064], [1.12086], [0.434801], [0.450939], [0.474579], [0.912921], [0.731756], [0.426238], [0.239172], [0.607869]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::exponential_distribution<double>]': '[0.0334396, 0.803989, 0.247484, 0.0393031, 0.366866, 0.583387, 0.0858922, 0.71767, 0.209726, 0.335191, 0.199114, 0.135269, 0.389417, 3.18786, 0.422753, 0.404251, 0.810837, 0.0939662, 0.124138, 1.59893, 0.380253, 0.241605, 0.198512, 0.912944, 0.418796, 0.0976327, 0.228427, 0.891203, 0.350173, 0.353009, 0.431047, 0.622711]' != '[[0.0334396], [0.803989], [0.247484], [0.0393031], [0.366866], [0.583387], [0.0858922], [0.71767], [0.209726], [0.335191], [0.199114], [0.135269], [0.389417], [3.18786], [0.422753], [0.404251], [0.810837], [0.0939662], [0.124138], [1.59893], [0.380253], [0.241605], [0.198512], [0.912944], [0.418796], [0.0976327], [0.228427], [0.891203], [0.350173], [0.353009], [0.431047], [0.622711]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::gamma_distribution<double>]': '[3.04805, 1.01703, 0.369166, 4.20253, 0.456231, 0.0389874, 1.41212, 0.141813, 0.415449, 1.28596, 2.35636, 1.10022, 1.38354, 0.739289, 0.528738, 0.172282, 0.813243, 0.860792, 0.136252, 0.335831, 0.312319, 1.14246, 1.12018, 0.294114, 0.334605, 0.243317, 1.05703, 0.921451, 1.10247, 1.27456, 0.0845061, 0.29469]' != '[[3.04805], [1.01703], [0.369166], [4.20253], [0.456231], [0.0389874], [1.41212], [0.141813], [0.415449], [1.28596], [2.35636], [1.10022], [1.38354], [0.739289], [0.528738], [0.172282], [0.813243], [0.860792], [0.136252], [0.335831], [0.312319], [1.14246], [1.12018], [0.294114], [0.334605], [0.243317], [1.05703], [0.921451], [1.10247], [1.27456], [0.0845061], [0.29469]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::gamma_distribution<double>]': '[2.73296, 0.169933, 1.11026, 1.98885, 1.54986, 1.82989, 0.630756, 0.638347, 2.5013, 0.321717, 0.460992, 0.0207866, 0.561135, 0.00652426, 0.443876, 0.0914916, 0.0945904, 2.00634, 0.400218, 1.3536, 0.481853, 0.537912, 0.936446, 0.127696, 0.0132419, 0.929784, 1.13712, 0.121129, 0.84093, 1.80036, 0.847195, 0.524633]' != '[[2.73296], [0.169933], [1.11026], [1.98885], [1.54986], [1.82989], [0.630756], [0.638347], [2.5013], [0.321717], [0.460992], [0.0207866], [0.561135], [0.00652426], [0.443876], [0.0914916], [0.0945904], [2.00634], [0.400218], [1.3536], [0.481853], [0.537912], [0.936446], [0.127696], [0.0132419], [0.929784], [1.13712], [0.121129], [0.84093], [1.80036], [0.847195], [0.524633]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::weibull_distribution<double>]': '[1.10981, 0.804423, 0.100616, 0.792442, 1.33133, 0.593407, 0.798701, 1.57409, 0.209308, 4.30468, 0.713114, 1.65488, 0.383299, 0.130778, 0.271248, 0.031098, 0.208086, 0.234041, 2.48486, 0.652427, 1.06235, 0.348106, 2.58656, 3.47026, 2.1864, 0.672827, 0.0201862, 2.1433, 1.41454, 0.531608, 0.244781, 1.68774]' != '[[1.10981], [0.804423], [0.100616], [0.792442], [1.33133], [0.593407], [0.798701], [1.57409], [0.209308], [4.30468], [0.713114], [1.65488], [0.383299], [0.130778], [0.271248], [0.031098], [0.208086], [0.234041], [2.48486], [0.652427], [1.06235], [0.348106], [2.58656], [3.47026], [2.1864], [0.672827], [0.0201862], [2.1433], [1.41454], [0.531608], [0.244781], [1.68774]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::weibull_distribution<double>]': '[0.16957, 1.58668, 2.11305, 0.0985175, 0.135229, 1.31656, 0.998673, 0.593808, 1.0826, 1.06404, 5.13932, 0.0204485, 0.863749, 2.19823, 0.692746, 0.217269, 1.77603, 1.28825, 0.322166, 0.394591, 0.00443096, 0.773148, 0.166752, 0.816493, 1.26777, 0.00839181, 1.79449, 0.0365519, 0.0198886, 0.379715, 1.96983, 0.474994]' != '[[0.16957], [1.58668], [2.11305], [0.0985175], [0.135229], [1.31656], [0.998673], [0.593808], [1.0826], [1.06404], [5.13932], [0.0204485], [0.863749], [2.19823], [0.692746], [0.217269], [1.77603], [1.28825], [0.322166], [0.394591], [0.00443096], [0.773148], [0.166752], [0.816493], [1.26777], [0.00839181], [1.79449], [0.0365519], [0.0198886], [0.379715], [1.96983], [0.474994]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::extreme_value_distribution<double>]': '[-0.620882, -0.0408931, -0.0382084, 0.581281, -0.641354, 0.0877751, 1.59767, 2.2097, -0.760962, 6.525, 1.52044, 0.667624, -0.148803, 0.238236, 0.97295, -0.784118, 0.927888, -1.24832, -0.961255, -0.987016, -0.0695678, 0.0157826, 1.51406, 2.64704, 3.32845, 0.36354, 1.44469, 1.57353, -0.125315, -1.3586, -1.00915, 1.35265]' != '[[-0.620882], [-0.0408931], [-0.0382084], [0.581281], [-0.641354], [0.0877751], [1.59767], [2.2097], [-0.760962], [6.525], [1.52044], [0.667624], [-0.148803], [0.238236], [0.97295], [-0.784118], [0.927888], [-1.24832], [-0.961255], [-0.987016], [-0.0695678], [0.0157826], [1.51406], [2.64704], [3.32845], [0.36354], [1.44469], [1.57353], [-0.125315], [-1.3586], [-1.00915], [1.35265]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::extreme_value_distribution<double>]': '[2.35096, 1.23928, 2.85486, 1.25168, 0.994236, 2.21358, 0.319051, 3.46316, 0.65825, -0.270331, 0.45591, -0.696375, 4.12916, 0.558531, 1.35472, -0.360451, 1.34939, 1.88428, 3.14817, 4.49902, 3.26288, 0.874278, 1.51886, 1.21411, 6.00252, 2.14116, 3.47951, 0.595031, 1.73225, 0.545339, 1.1155, 0.188919]' != '[[2.35096], [1.23928], [2.85486], [1.25168], [0.994236], [2.21358], [0.319051], [3.46316], [0.65825], [-0.270331], [0.45591], [-0.696375], [4.12916], [0.558531], [1.35472], [-0.360451], [1.34939], [1.88428], [3.14817], [4.49902], [3.26288], [0.874278], [1.51886], [1.21411], [6.00252], [2.14116], [3.47951], [0.595031], [1.73225], [0.545339], [1.1155], [0.188919]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::normal_distribution<double>]': '[1.3082, -0.0436525, 0.0684711, -0.560638, 0.396876, -1.16241, 1.74973, 0.723097, 1.32924, 1.55811, 0.706739, -0.050114, 0.798095, 1.16372, -0.576982, 0.543285, -1.70122, -0.104704, -0.0440302, -0.928694, -0.472609, 0.395739, 0.199102, 0.582667, 0.635211, 0.00934715, 0.624397, -0.916282, -0.0280736, 1.43846, -0.514995, -0.707437]' != '[[1.3082], [-0.0436525], [0.0684711], [-0.560638], [0.396876], [-1.16241], [1.74973], [0.723097], [1.32924], [1.55811], [0.706739], [-0.050114], [0.798095], [1.16372], [-0.576982], [0.543285], [-1.70122], [-0.104704], [-0.0440302], [-0.928694], [-0.472609], [0.395739], [0.199102], [0.582667], [0.635211], [0.00934715], [0.624397], [-0.916282], [-0.0280736], [1.43846], [-0.514995], [-0.707437]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::normal_distribution<double>]': '[-0.140159, -1.03536, 1.39168, 0.51729, -0.154849, -0.871759, -1.24514, 0.782894, -2.05631, 1.91532, 1.69309, 1.32105, 1.28103, -0.429313, 2.37565, 2.16937, -0.444257, 0.16928, 0.986921, -0.499736, -0.79068, -0.0734999, -0.735586, 1.64528, 2.11168, -1.38041, 2.40139, -0.783996, -0.520166, 0.493709, 1.59525, 0.00457382]' != '[[-0.140159], [-1.03536], [1.39168], [0.51729], [-0.154849], [-0.871759], [-1.24514], [0.782894], [-2.05631], [1.91532], [1.69309], [1.32105], [1.28103], [-0.429313], [2.37565], [2.16937], [-0.444257], [0.16928], [0.986921], [-0.499736], [-0.79068], [-0.0734999], [-0.735586], [1.64528], [2.11168], [-1.38041], [2.40139], [-0.783996], [-0.520166], [0.493709], [1.59525], [0.00457382]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::lognormal_distribution<double>]': '[0.860827, 4.18056, 3.55645, 0.396451, 0.557386, 2.38753, 0.77263, 0.312595, 0.448058, 0.433529, 1.61283, 2.37389, 0.345489, 0.902997, 0.0827094, 2.87135, 0.171438, 0.315273, 6.77384, 14.0847, 0.249311, 0.569379, 1.98996, 0.610394, 0.473587, 0.961963, 1.89314, 4.34582, 0.772119, 9.16956, 0.483967, 0.530826]' != '[[0.860827], [4.18056], [3.55645], [0.396451], [0.557386], [2.38753], [0.77263], [0.312595], [0.448058], [0.433529], [1.61283], [2.37389], [0.345489], [0.902997], [0.0827094], [2.87135], [0.171438], [0.315273], [6.77384], [14.0847], [0.249311], [0.569379], [1.98996], [0.610394], [0.473587], [0.961963], [1.89314], [4.34582], [0.772119], [9.16956], [0.483967], [0.530826]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::lognormal_distribution<double>]': '[1.62227, 0.715042, 13.0949, 31.746, 4.04411, 3.31193, 5.32273, 0.517709, 2.01362, 1.16328, 5.95381, 0.554248, 1.80286, 0.452744, 11.9641, 2.07492, 2.41038, 2.04415, 1.3424, 2.94606, 1.52166, 3.33338, 0.352988, 2.56686, 3.37492, 0.358663, 0.82673, 9.99841, 23.8595, 1.32725, 1.40484, 0.207622]' != '[[1.62227], [0.715042], [13.0949], [31.746], [4.04411], [3.31193], [5.32273], [0.517709], [2.01362], [1.16328], [5.95381], [0.554248], [1.80286], [0.452744], [11.9641], [2.07492], [2.41038], [2.04415], [1.3424], [2.94606], [1.52166], [3.33338], [0.352988], [2.56686], [3.37492], [0.358663], [0.82673], [9.99841], [23.8595], [1.32725], [1.40484], [0.207622]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::chi_squared_distribution<double>]': '[1.42984, 1.49091, 1.40647, 0.396118, 0.577295, 2.28275, 0.0856282, 3.56941, 0.385625, 1.17481, 0.00880951, 0.034031, 5.88921, 0.200072, 0.10435, 0.0262528, 1.39731, 0.0594532, 0.177828, 2.79915, 1.09432, 2.1747, 2.75281, 0.140004, 0.267484, 3.21147, 4.82852, 0.00120714, 1.47989, 0.281869, 1.20138, 1.0662]' != '[[1.42984], [1.49091], [1.40647], [0.396118], [0.577295], [2.28275], [0.0856282], [3.56941], [0.385625], [1.17481], [0.00880951], [0.034031], [5.88921], [0.200072], [0.10435], [0.0262528], [1.39731], [0.0594532], [0.177828], [2.79915], [1.09432], [2.1747], [2.75281], [0.140004], [0.267484], [3.21147], [4.82852], [0.00120714], [1.47989], [0.281869], [1.20138], [1.0662]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::chi_squared_distribution<double>]': '[0.0796135, 0.691332, 0.0893177, 1.00825e-05, 0.29323, 1.91392e-06, 3.26029, 0.149117, 0.195619, 0.0111404, 0.00973953, 0.0222336, 0.304152, 0.00404961, 2.49364, 0.284685, 1.05187, 0.48581, 0.254013, 0.0207869, 0.00382051, 0.274803, 0.148412, 0.246646, 5.55691, 0.501669, 0.00432683, 0.00568291, 0.292395, 0.169626, 0.114401, 0.0191466]' != '[[0.0796135], [0.691332], [0.0893177], [1.00825e-05], [0.29323], [1.91392e-06], [3.26029], [0.149117], [0.195619], [0.0111404], [0.00973953], [0.0222336], [0.304152], [0.00404961], [2.49364], [0.284685], [1.05187], [0.48581], [0.254013], [0.0207869], [0.00382051], [0.274803], [0.148412], [0.246646], [5.55691], [0.501669], [0.00432683], [0.00568291], [0.292395], [0.169626], [0.114401], [0.0191466]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::cauchy_distribution<double>]': '[8.33413, 0.455176, 0.183511, 0.639067, 0.262028, -3.86088, 1.03842, -5.83709, 0.822275, 1.98294, -0.160261, 1.14923, -1.08816, 0.427882, 1.74867, 0.609678, -7.31566, 0.137309, -2.80172, -1.07143, -0.014274, 0.0753825, 5.86, -0.464413, 0.374365, -0.17064, 0.70278, -0.562572, 0.333654, 1.83319, -2.4691, -4.59024]' != '[[8.33413], [0.455176], [0.183511], [0.639067], [0.262028], [-3.86088], [1.03842], [-5.83709], [0.822275], [1.98294], [-0.160261], [1.14923], [-1.08816], [0.427882], [1.74867], [0.609678], [-7.31566], [0.137309], [-2.80172], [-1.07143], [-0.014274], [0.0753825], [5.86], [-0.464413], [0.374365], [-0.17064], [0.70278], [-0.562572], [0.333654], [1.83319], [-2.4691], [-4.59024]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::cauchy_distribution<double>]': '[0.582921, -0.729551, -2.68902, -3.80884, 0.112827, 2.113, 0.706772, -315.632, 1.53036, 0.316498, 2.22017, 0.498353, 0.0173711, 0.604996, -1.47971, -0.00757937, -0.147558, -13.3748, 1.23047, 1.27099, 1.66713, -0.671168, -973.845, -0.367857, 0.710038, -1.25236, -0.0689386, 1.53094, 0.502147, 2.05363, -4.29208, -5.33235]' != '[[0.582921], [-0.729551], [-2.68902], [-3.80884], [0.112827], [2.113], [0.706772], [-315.632], [1.53036], [0.316498], [2.22017], [0.498353], [0.0173711], [0.604996], [-1.47971], [-0.00757937], [-0.147558], [-13.3748], [1.23047], [1.27099], [1.66713], [-0.671168], [-973.845], [-0.367857], [0.710038], [-1.25236], [-0.0689386], [1.53094], [0.502147], [2.05363], [-4.29208], [-5.33235]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::fisher_f_distribution<double>]': '[0.985495, 37.0159, 0.0297462, 0.482242, 0.644119, 0.863621, 0.0262075, 0.0242743, 18.1769, 2.5946, 61.3344, 0.318418, 2.20676, 5.69156, 1.22312, 0.638308, 0.0431261, 2.35496e-05, 0.944331, 7.78522, 13.5137, 265.892, 359.913, 0.0196853, 2.31688, 1719.26, 0.88807, 0.356582, 0.426315, 0.943793, 0.0332548, 90.5657]' != '[[0.985495], [37.0159], [0.0297462], [0.482242], [0.644119], [0.863621], [0.0262075], [0.0242743], [18.1769], [2.5946], [61.3344], [0.318418], [2.20676], [5.69156], [1.22312], [0.638308], [0.0431261], [2.35496e-05], [0.944331], [7.78522], [13.5137], [265.892], [359.913], [0.0196853], [2.31688], [1719.26], [0.88807], [0.356582], [0.426315], [0.943793], [0.0332548], [90.5657]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::fisher_f_distribution<double>]': '[0.000536954, 1.00784, 0.637966, 0.228904, 0.00924382, 1.13754, 0.0845261, 1.04096e+06, 246.984, 0.0323901, 0.00531824, 0.00358034, 8.29665, 1.40917, 1006.77, 3.50985, 4.60031, 0.360804, 0.961582, 0.212368, 0.406729, 988.625, 0.254808, 4.8013, 0.546344, 2.27266, 3.28646, 0.000278716, 0.0144525, 8.23121, 5.12869, 1.85015]' != '[[0.000536954], [1.00784], [0.637966], [0.228904], [0.00924382], [1.13754], [0.0845261], [1.04096e+06], [246.984], [0.0323901], [0.00531824], [0.00358034], [8.29665], [1.40917], [1006.77], [3.50985], [4.60031], [0.360804], [0.961582], [0.212368], [0.406729], [988.625], [0.254808], [4.8013], [0.546344], [2.27266], [3.28646], [0.000278716], [0.0144525], [8.23121], [5.12869], [1.85015]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::student_t_distribution<double>]': '[-0.774231, -0.153183, 0.0372515, 1.90229, 1.92155, 1.15424, 3.34977, -7.02299, -22.0139, -5.26465, 0.538061, -5.54663, 6.42244, -0.423953, -0.690078, -1.60318, -0.157842, -0.681753, 0.182721, -0.348519, 0.465761, 1.81663, -0.566082, 2.98389, -7.46501, 1.56704, -2.77097, 0.067053, 1.10687, -0.825478, 0.930204, 0.4337]' != '[[-0.774231], [-0.153183], [0.0372515], [1.90229], [1.92155], [1.15424], [3.34977], [-7.02299], [-22.0139], [-5.26465], [0.538061], [-5.54663], [6.42244], [-0.423953], [-0.690078], [-1.60318], [-0.157842], [-0.681753], [0.182721], [-0.348519], [0.465761], [1.81663], [-0.566082], [2.98389], [-7.46501], [1.56704], [-2.77097], [0.067053], [1.10687], [-0.825478], [0.930204], [0.4337]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::student_t_distribution<double>]': '[-0.0327367, -6.96851, -1.04806, 3.45067, 1.36556, -0.729179, 0.809656, 8.57641, -0.938686, -0.844625, -1.67079, -0.216872, 1.15745, -0.178408, -0.0101031, -0.329908, -7.56527, 0.35976, 0.351581, -0.0261269, 59.5263, 57.0646, -3.1862, 1.37227, -1.8856, 0.709767, 1.13008, -0.138157, 0.297846, 4.13691, -4.78464, -0.136079]' != '[[-0.0327367], [-6.96851], [-1.04806], [3.45067], [1.36556], [-0.729179], [0.809656], [8.57641], [-0.938686], [-0.844625], [-1.67079], [-0.216872], [1.15745], [-0.178408], [-0.0101031], [-0.329908], [-7.56527], [0.35976], [0.351581], [-0.0261269], [59.5263], [57.0646], [-3.1862], [1.37227], [-1.8856], [0.709767], [1.13008], [-0.138157], [0.297846], [4.13691], [-4.78464], [-0.136079]]'
126: 0 sanity checks and 33 tests failed.
126: Base command is "/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/random_distributions_test --hpx:threads=1 --hpx:localities=1"
126: Executing command: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/random_distributions_test --hpx:threads=1 --hpx:localities=1 --hpx:node=0
1/1 Test #126: tests.unit.plugins.matrixops.random_distributions ...***Failed 0.55 sec
0% tests passed, 1 tests failed out of 1
Total Test time (real) = 0.59 sec
The following tests FAILED:
126 - tests.unit.plugins.matrixops.random_distributions (Failed)
Errors while running CTest
```
built on x86_64 linux with gcc 7.1
tests.unit.plugins.matrixops.random_distributions buildbot test failing

...again, another test that fails only for the buildbot user.
```
[buildbot@delphi phylanx-Release]$ ctest -V -R tests.unit.plugins.matrixops.random_distributions
UpdateCTestConfiguration from :/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
Parse Config file:/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
UpdateCTestConfiguration from :/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
Parse Config file:/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/DartConfiguration.tcl
Test project /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release
Constructing a list of tests
Done constructing a list of tests
Updating test list for fixtures
Added 0 tests to meet fixture requirements
Checking test dependency graph...
Checking test dependency graph end
test 126
Start 126: tests.unit.plugins.matrixops.random_distributions
126: Test command: /packages/python/3.6.4-ssl/bin/python3 "/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/phylanxrun.py" "/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/random_distributions_test" "-e" "0" "-l" "1" "-t" "1" "-v" "--"
126: Test timeout computed to be: 1500
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::normal_distribution<double>]': '[0.580775, -0.222945, -0.574052, 0.0846135, 2.06277, -1.49428, -0.372216, 0.814404, -0.418164, -0.431843, 1.23158, 0.496183, 0.175568, 0.473541, -0.505594, -1.04171, -0.668944, 0.592764, 0.733453, 1.82618, 0.155324, 0.804811, -0.545873, -0.421071, 0.0123856, -0.93878, -1.17346, -1.11744, -0.152928, -0.735588, -0.205178, 0.580281]' != '[[0.580775], [-0.222945], [-0.574052], [0.0846135], [2.06277], [-1.49428], [-0.372216], [0.814404], [-0.418164], [-0.431843], [1.23158], [0.496183], [0.175568], [0.473541], [-0.505594], [-1.04171], [-0.668944], [0.592764], [0.733453], [1.82618], [0.155324], [0.804811], [-0.545873], [-0.421071], [0.0123856], [-0.93878], [-1.17346], [-1.11744], [-0.152928], [-0.735588], [-0.205178], [0.580281]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::uniform_real_distribution<double>]': '[0.742667, 0.619634, 0.14849, 0.908298, 0.755534, 0.716926, 0.692001, 0.357867, 0.587831, 0.787731, 0.35707, 0.987934, 0.0315901, 0.989657, 0.785355, 0.804331, 0.112768, 0.519823, 0.107913, 0.654452, 0.743991, 0.0174458, 0.429576, 0.563064, 0.156636, 0.684107, 0.361534, 0.643585, 0.0101308, 0.339076, 0.735305, 0.778844]' != '[[0.742667], [0.619634], [0.14849], [0.908298], [0.755534], [0.716926], [0.692001], [0.357867], [0.587831], [0.787731], [0.35707], [0.987934], [0.0315901], [0.989657], [0.785355], [0.804331], [0.112768], [0.519823], [0.107913], [0.654452], [0.743991], [0.0174458], [0.429576], [0.563064], [0.156636], [0.684107], [0.361534], [0.643585], [0.0101308], [0.339076], [0.735305], [0.778844]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::uniform_real_distribution<double>]': '[2.12055, 3.59671, 2.16695, 3.55361, 3.85889, 3.0245, 3.81361, 2.54431, 2.94064, 3.90978, 3.58663, 2.58878, 2.22568, 3.66061, 3.7666, 3.60083, 2.02072, 3.68688, 3.78987, 3.30958, 3.17404, 3.12245, 2.68987, 2.86562, 2.14594, 2.38964, 2.40754, 3.68882, 3.49536, 2.12937, 2.1939, 3.80057]' != '[[2.12055], [3.59671], [2.16695], [3.55361], [3.85889], [3.0245], [3.81361], [2.54431], [2.94064], [3.90978], [3.58663], [2.58878], [2.22568], [3.66061], [3.7666], [3.60083], [2.02072], [3.68688], [3.78987], [3.30958], [3.17404], [3.12245], [2.68987], [2.86562], [2.14594], [2.38964], [2.40754], [3.68882], [3.49536], [2.12937], [2.1939], [3.80057]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = unsigned char; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::bernoulli_distribution]': '[true, true, true, false, true, true, true, true, false, false, false, false, true, false, true, false, false, true, true, false, true, false, true, true, true, true, false, false, true, true, true, false]' != '[[true], [true], [true], [false], [true], [true], [true], [true], [false], [false], [false], [false], [true], [false], [true], [false], [false], [true], [true], [false], [true], [false], [true], [true], [true], [true], [false], [false], [true], [true], [true], [false]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = unsigned char; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::bernoulli_distribution]': '[true, true, true, true, true, true, true, true, true, true, true, true, true, true, true, false, true, true, true, true, true, true, false, true, true, false, true, true, true, true, true, true]' != '[[true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [true], [false], [true], [true], [true], [true], [true], [true], [false], [true], [true], [false], [true], [true], [true], [true], [true], [true]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::binomial_distribution<int>]': '[1, 1, 0, 0, 1, 0, 0, 0, 0, 1, 1, 1, 0, 0, 1, 1, 1, 0, 0, 0, 0, 0, 1, 0, 1, 1, 1, 0, 0, 0, 1, 0]' != '[[1], [1], [0], [0], [1], [0], [0], [0], [0], [1], [1], [1], [0], [0], [1], [1], [1], [0], [0], [0], [0], [0], [1], [0], [1], [1], [1], [0], [0], [0], [1], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::binomial_distribution<int>]': '[9, 5, 7, 9, 9, 7, 7, 6, 9, 9, 9, 10, 7, 9, 4, 7, 9, 9, 8, 8, 9, 9, 9, 9, 8, 7, 9, 6, 10, 6, 8, 7]' != '[[9], [5], [7], [9], [9], [7], [7], [6], [9], [9], [9], [10], [7], [9], [4], [7], [9], [9], [8], [8], [9], [9], [9], [9], [8], [7], [9], [6], [10], [6], [8], [7]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::negative_binomial_distribution<int>]': '[0, 0, 0, 0, 1, 0, 1, 1, 0, 0, 2, 0, 1, 0, 1, 1, 0, 0, 1, 1, 1, 0, 0, 1, 1, 0, 1, 1, 3, 0, 3, 0]' != '[[0], [0], [0], [0], [1], [0], [1], [1], [0], [0], [2], [0], [1], [0], [1], [1], [0], [0], [1], [1], [1], [0], [0], [1], [1], [0], [1], [1], [3], [0], [3], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::negative_binomial_distribution<int>]': '[3, 4, 7, 6, 2, 1, 2, 3, 3, 0, 0, 3, 2, 2, 3, 2, 2, 0, 2, 2, 3, 3, 1, 4, 3, 2, 1, 1, 1, 2, 3, 5]' != '[[3], [4], [7], [6], [2], [1], [2], [3], [3], [0], [0], [3], [2], [2], [3], [2], [2], [0], [2], [2], [3], [3], [1], [4], [3], [2], [1], [1], [1], [2], [3], [5]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::geometric_distribution<int>]': '[0, 1, 1, 0, 3, 1, 1, 2, 0, 2, 1, 0, 0, 0, 0, 0, 0, 0, 1, 1, 3, 3, 1, 1, 0, 0, 2, 0, 1, 0, 0, 0]' != '[[0], [1], [1], [0], [3], [1], [1], [2], [0], [2], [1], [0], [0], [0], [0], [0], [0], [0], [1], [1], [3], [3], [1], [1], [0], [0], [2], [0], [1], [0], [0], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::geometric_distribution<int>]': '[0, 0, 0, 0, 0, 1, 0, 0, 2, 1, 0, 2, 0, 0, 0, 0, 0, 0, 0, 1, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 0, 1]' != '[[0], [0], [0], [0], [0], [1], [0], [0], [2], [1], [0], [2], [0], [0], [0], [0], [0], [0], [0], [1], [0], [0], [0], [0], [0], [0], [0], [0], [0], [0], [0], [1]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::poisson_distribution<int>]': '[3, 1, 1, 2, 1, 0, 0, 0, 1, 1, 0, 0, 1, 0, 1, 4, 0, 0, 0, 0, 4, 1, 1, 0, 1, 0, 0, 1, 1, 1, 2, 0]' != '[[3], [1], [1], [2], [1], [0], [0], [0], [1], [1], [0], [0], [1], [0], [1], [4], [0], [0], [0], [0], [4], [1], [1], [0], [1], [0], [0], [1], [1], [1], [2], [0]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::poisson_distribution<int>]': '[0, 3, 7, 3, 0, 0, 2, 8, 4, 5, 4, 2, 5, 6, 5, 2, 4, 6, 4, 2, 8, 3, 5, 3, 5, 3, 6, 3, 3, 6, 4, 3]' != '[[0], [3], [7], [3], [0], [0], [2], [8], [4], [5], [4], [2], [5], [6], [5], [2], [4], [6], [4], [2], [8], [3], [5], [3], [5], [3], [6], [3], [3], [6], [4], [3]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::exponential_distribution<double>]': '[0.441249, 0.360445, 1.57391, 0.044464, 0.471982, 0.0571889, 0.44295, 0.520508, 0.710618, 0.348814, 1.23563, 0.737762, 2.99639, 1.15965, 5.66962, 1.26029, 0.081411, 3.34859, 0.614425, 1.96163, 0.301867, 0.0854607, 1.5064, 1.12086, 0.434801, 0.450939, 0.474579, 0.912921, 0.731756, 0.426238, 0.239172, 0.607869]' != '[[0.441249], [0.360445], [1.57391], [0.044464], [0.471982], [0.0571889], [0.44295], [0.520508], [0.710618], [0.348814], [1.23563], [0.737762], [2.99639], [1.15965], [5.66962], [1.26029], [0.081411], [3.34859], [0.614425], [1.96163], [0.301867], [0.0854607], [1.5064], [1.12086], [0.434801], [0.450939], [0.474579], [0.912921], [0.731756], [0.426238], [0.239172], [0.607869]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::exponential_distribution<double>]': '[0.0334396, 0.803989, 0.247484, 0.0393031, 0.366866, 0.583387, 0.0858922, 0.71767, 0.209726, 0.335191, 0.199114, 0.135269, 0.389417, 3.18786, 0.422753, 0.404251, 0.810837, 0.0939662, 0.124138, 1.59893, 0.380253, 0.241605, 0.198512, 0.912944, 0.418796, 0.0976327, 0.228427, 0.891203, 0.350173, 0.353009, 0.431047, 0.622711]' != '[[0.0334396], [0.803989], [0.247484], [0.0393031], [0.366866], [0.583387], [0.0858922], [0.71767], [0.209726], [0.335191], [0.199114], [0.135269], [0.389417], [3.18786], [0.422753], [0.404251], [0.810837], [0.0939662], [0.124138], [1.59893], [0.380253], [0.241605], [0.198512], [0.912944], [0.418796], [0.0976327], [0.228427], [0.891203], [0.350173], [0.353009], [0.431047], [0.622711]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::gamma_distribution<double>]': '[3.04805, 1.01703, 0.369166, 4.20253, 0.456231, 0.0389874, 1.41212, 0.141813, 0.415449, 1.28596, 2.35636, 1.10022, 1.38354, 0.739289, 0.528738, 0.172282, 0.813243, 0.860792, 0.136252, 0.335831, 0.312319, 1.14246, 1.12018, 0.294114, 0.334605, 0.243317, 1.05703, 0.921451, 1.10247, 1.27456, 0.0845061, 0.29469]' != '[[3.04805], [1.01703], [0.369166], [4.20253], [0.456231], [0.0389874], [1.41212], [0.141813], [0.415449], [1.28596], [2.35636], [1.10022], [1.38354], [0.739289], [0.528738], [0.172282], [0.813243], [0.860792], [0.136252], [0.335831], [0.312319], [1.14246], [1.12018], [0.294114], [0.334605], [0.243317], [1.05703], [0.921451], [1.10247], [1.27456], [0.0845061], [0.29469]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::gamma_distribution<double>]': '[2.73296, 0.169933, 1.11026, 1.98885, 1.54986, 1.82989, 0.630756, 0.638347, 2.5013, 0.321717, 0.460992, 0.0207866, 0.561135, 0.00652426, 0.443876, 0.0914916, 0.0945904, 2.00634, 0.400218, 1.3536, 0.481853, 0.537912, 0.936446, 0.127696, 0.0132419, 0.929784, 1.13712, 0.121129, 0.84093, 1.80036, 0.847195, 0.524633]' != '[[2.73296], [0.169933], [1.11026], [1.98885], [1.54986], [1.82989], [0.630756], [0.638347], [2.5013], [0.321717], [0.460992], [0.0207866], [0.561135], [0.00652426], [0.443876], [0.0914916], [0.0945904], [2.00634], [0.400218], [1.3536], [0.481853], [0.537912], [0.936446], [0.127696], [0.0132419], [0.929784], [1.13712], [0.121129], [0.84093], [1.80036], [0.847195], [0.524633]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::weibull_distribution<double>]': '[1.10981, 0.804423, 0.100616, 0.792442, 1.33133, 0.593407, 0.798701, 1.57409, 0.209308, 4.30468, 0.713114, 1.65488, 0.383299, 0.130778, 0.271248, 0.031098, 0.208086, 0.234041, 2.48486, 0.652427, 1.06235, 0.348106, 2.58656, 3.47026, 2.1864, 0.672827, 0.0201862, 2.1433, 1.41454, 0.531608, 0.244781, 1.68774]' != '[[1.10981], [0.804423], [0.100616], [0.792442], [1.33133], [0.593407], [0.798701], [1.57409], [0.209308], [4.30468], [0.713114], [1.65488], [0.383299], [0.130778], [0.271248], [0.031098], [0.208086], [0.234041], [2.48486], [0.652427], [1.06235], [0.348106], [2.58656], [3.47026], [2.1864], [0.672827], [0.0201862], [2.1433], [1.41454], [0.531608], [0.244781], [1.68774]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::weibull_distribution<double>]': '[0.16957, 1.58668, 2.11305, 0.0985175, 0.135229, 1.31656, 0.998673, 0.593808, 1.0826, 1.06404, 5.13932, 0.0204485, 0.863749, 2.19823, 0.692746, 0.217269, 1.77603, 1.28825, 0.322166, 0.394591, 0.00443096, 0.773148, 0.166752, 0.816493, 1.26777, 0.00839181, 1.79449, 0.0365519, 0.0198886, 0.379715, 1.96983, 0.474994]' != '[[0.16957], [1.58668], [2.11305], [0.0985175], [0.135229], [1.31656], [0.998673], [0.593808], [1.0826], [1.06404], [5.13932], [0.0204485], [0.863749], [2.19823], [0.692746], [0.217269], [1.77603], [1.28825], [0.322166], [0.394591], [0.00443096], [0.773148], [0.166752], [0.816493], [1.26777], [0.00839181], [1.79449], [0.0365519], [0.0198886], [0.379715], [1.96983], [0.474994]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::extreme_value_distribution<double>]': '[-0.620882, -0.0408931, -0.0382084, 0.581281, -0.641354, 0.0877751, 1.59767, 2.2097, -0.760962, 6.525, 1.52044, 0.667624, -0.148803, 0.238236, 0.97295, -0.784118, 0.927888, -1.24832, -0.961255, -0.987016, -0.0695678, 0.0157826, 1.51406, 2.64704, 3.32845, 0.36354, 1.44469, 1.57353, -0.125315, -1.3586, -1.00915, 1.35265]' != '[[-0.620882], [-0.0408931], [-0.0382084], [0.581281], [-0.641354], [0.0877751], [1.59767], [2.2097], [-0.760962], [6.525], [1.52044], [0.667624], [-0.148803], [0.238236], [0.97295], [-0.784118], [0.927888], [-1.24832], [-0.961255], [-0.987016], [-0.0695678], [0.0157826], [1.51406], [2.64704], [3.32845], [0.36354], [1.44469], [1.57353], [-0.125315], [-1.3586], [-1.00915], [1.35265]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::extreme_value_distribution<double>]': '[2.35096, 1.23928, 2.85486, 1.25168, 0.994236, 2.21358, 0.319051, 3.46316, 0.65825, -0.270331, 0.45591, -0.696375, 4.12916, 0.558531, 1.35472, -0.360451, 1.34939, 1.88428, 3.14817, 4.49902, 3.26288, 0.874278, 1.51886, 1.21411, 6.00252, 2.14116, 3.47951, 0.595031, 1.73225, 0.545339, 1.1155, 0.188919]' != '[[2.35096], [1.23928], [2.85486], [1.25168], [0.994236], [2.21358], [0.319051], [3.46316], [0.65825], [-0.270331], [0.45591], [-0.696375], [4.12916], [0.558531], [1.35472], [-0.360451], [1.34939], [1.88428], [3.14817], [4.49902], [3.26288], [0.874278], [1.51886], [1.21411], [6.00252], [2.14116], [3.47951], [0.595031], [1.73225], [0.545339], [1.1155], [0.188919]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::normal_distribution<double>]': '[1.3082, -0.0436525, 0.0684711, -0.560638, 0.396876, -1.16241, 1.74973, 0.723097, 1.32924, 1.55811, 0.706739, -0.050114, 0.798095, 1.16372, -0.576982, 0.543285, -1.70122, -0.104704, -0.0440302, -0.928694, -0.472609, 0.395739, 0.199102, 0.582667, 0.635211, 0.00934715, 0.624397, -0.916282, -0.0280736, 1.43846, -0.514995, -0.707437]' != '[[1.3082], [-0.0436525], [0.0684711], [-0.560638], [0.396876], [-1.16241], [1.74973], [0.723097], [1.32924], [1.55811], [0.706739], [-0.050114], [0.798095], [1.16372], [-0.576982], [0.543285], [-1.70122], [-0.104704], [-0.0440302], [-0.928694], [-0.472609], [0.395739], [0.199102], [0.582667], [0.635211], [0.00934715], [0.624397], [-0.916282], [-0.0280736], [1.43846], [-0.514995], [-0.707437]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::normal_distribution<double>]': '[-0.140159, -1.03536, 1.39168, 0.51729, -0.154849, -0.871759, -1.24514, 0.782894, -2.05631, 1.91532, 1.69309, 1.32105, 1.28103, -0.429313, 2.37565, 2.16937, -0.444257, 0.16928, 0.986921, -0.499736, -0.79068, -0.0734999, -0.735586, 1.64528, 2.11168, -1.38041, 2.40139, -0.783996, -0.520166, 0.493709, 1.59525, 0.00457382]' != '[[-0.140159], [-1.03536], [1.39168], [0.51729], [-0.154849], [-0.871759], [-1.24514], [0.782894], [-2.05631], [1.91532], [1.69309], [1.32105], [1.28103], [-0.429313], [2.37565], [2.16937], [-0.444257], [0.16928], [0.986921], [-0.499736], [-0.79068], [-0.0734999], [-0.735586], [1.64528], [2.11168], [-1.38041], [2.40139], [-0.783996], [-0.520166], [0.493709], [1.59525], [0.00457382]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::lognormal_distribution<double>]': '[0.860827, 4.18056, 3.55645, 0.396451, 0.557386, 2.38753, 0.77263, 0.312595, 0.448058, 0.433529, 1.61283, 2.37389, 0.345489, 0.902997, 0.0827094, 2.87135, 0.171438, 0.315273, 6.77384, 14.0847, 0.249311, 0.569379, 1.98996, 0.610394, 0.473587, 0.961963, 1.89314, 4.34582, 0.772119, 9.16956, 0.483967, 0.530826]' != '[[0.860827], [4.18056], [3.55645], [0.396451], [0.557386], [2.38753], [0.77263], [0.312595], [0.448058], [0.433529], [1.61283], [2.37389], [0.345489], [0.902997], [0.0827094], [2.87135], [0.171438], [0.315273], [6.77384], [14.0847], [0.249311], [0.569379], [1.98996], [0.610394], [0.473587], [0.961963], [1.89314], [4.34582], [0.772119], [9.16956], [0.483967], [0.530826]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::lognormal_distribution<double>]': '[1.62227, 0.715042, 13.0949, 31.746, 4.04411, 3.31193, 5.32273, 0.517709, 2.01362, 1.16328, 5.95381, 0.554248, 1.80286, 0.452744, 11.9641, 2.07492, 2.41038, 2.04415, 1.3424, 2.94606, 1.52166, 3.33338, 0.352988, 2.56686, 3.37492, 0.358663, 0.82673, 9.99841, 23.8595, 1.32725, 1.40484, 0.207622]' != '[[1.62227], [0.715042], [13.0949], [31.746], [4.04411], [3.31193], [5.32273], [0.517709], [2.01362], [1.16328], [5.95381], [0.554248], [1.80286], [0.452744], [11.9641], [2.07492], [2.41038], [2.04415], [1.3424], [2.94606], [1.52166], [3.33338], [0.352988], [2.56686], [3.37492], [0.358663], [0.82673], [9.99841], [23.8595], [1.32725], [1.40484], [0.207622]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::chi_squared_distribution<double>]': '[1.42984, 1.49091, 1.40647, 0.396118, 0.577295, 2.28275, 0.0856282, 3.56941, 0.385625, 1.17481, 0.00880951, 0.034031, 5.88921, 0.200072, 0.10435, 0.0262528, 1.39731, 0.0594532, 0.177828, 2.79915, 1.09432, 2.1747, 2.75281, 0.140004, 0.267484, 3.21147, 4.82852, 0.00120714, 1.47989, 0.281869, 1.20138, 1.0662]' != '[[1.42984], [1.49091], [1.40647], [0.396118], [0.577295], [2.28275], [0.0856282], [3.56941], [0.385625], [1.17481], [0.00880951], [0.034031], [5.88921], [0.200072], [0.10435], [0.0262528], [1.39731], [0.0594532], [0.177828], [2.79915], [1.09432], [2.1747], [2.75281], [0.140004], [0.267484], [3.21147], [4.82852], [0.00120714], [1.47989], [0.281869], [1.20138], [1.0662]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::chi_squared_distribution<double>]': '[0.0796135, 0.691332, 0.0893177, 1.00825e-05, 0.29323, 1.91392e-06, 3.26029, 0.149117, 0.195619, 0.0111404, 0.00973953, 0.0222336, 0.304152, 0.00404961, 2.49364, 0.284685, 1.05187, 0.48581, 0.254013, 0.0207869, 0.00382051, 0.274803, 0.148412, 0.246646, 5.55691, 0.501669, 0.00432683, 0.00568291, 0.292395, 0.169626, 0.114401, 0.0191466]' != '[[0.0796135], [0.691332], [0.0893177], [1.00825e-05], [0.29323], [1.91392e-06], [3.26029], [0.149117], [0.195619], [0.0111404], [0.00973953], [0.0222336], [0.304152], [0.00404961], [2.49364], [0.284685], [1.05187], [0.48581], [0.254013], [0.0207869], [0.00382051], [0.274803], [0.148412], [0.246646], [5.55691], [0.501669], [0.00432683], [0.00568291], [0.292395], [0.169626], [0.114401], [0.0191466]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::cauchy_distribution<double>]': '[8.33413, 0.455176, 0.183511, 0.639067, 0.262028, -3.86088, 1.03842, -5.83709, 0.822275, 1.98294, -0.160261, 1.14923, -1.08816, 0.427882, 1.74867, 0.609678, -7.31566, 0.137309, -2.80172, -1.07143, -0.014274, 0.0753825, 5.86, -0.464413, 0.374365, -0.17064, 0.70278, -0.562572, 0.333654, 1.83319, -2.4691, -4.59024]' != '[[8.33413], [0.455176], [0.183511], [0.639067], [0.262028], [-3.86088], [1.03842], [-5.83709], [0.822275], [1.98294], [-0.160261], [1.14923], [-1.08816], [0.427882], [1.74867], [0.609678], [-7.31566], [0.137309], [-2.80172], [-1.07143], [-0.014274], [0.0753825], [5.86], [-0.464413], [0.374365], [-0.17064], [0.70278], [-0.562572], [0.333654], [1.83319], [-2.4691], [-4.59024]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::cauchy_distribution<double>]': '[0.582921, -0.729551, -2.68902, -3.80884, 0.112827, 2.113, 0.706772, -315.632, 1.53036, 0.316498, 2.22017, 0.498353, 0.0173711, 0.604996, -1.47971, -0.00757937, -0.147558, -13.3748, 1.23047, 1.27099, 1.66713, -0.671168, -973.845, -0.367857, 0.710038, -1.25236, -0.0689386, 1.53094, 0.502147, 2.05363, -4.29208, -5.33235]' != '[[0.582921], [-0.729551], [-2.68902], [-3.80884], [0.112827], [2.113], [0.706772], [-315.632], [1.53036], [0.316498], [2.22017], [0.498353], [0.0173711], [0.604996], [-1.47971], [-0.00757937], [-0.147558], [-13.3748], [1.23047], [1.27099], [1.66713], [-0.671168], [-973.845], [-0.367857], [0.710038], [-1.25236], [-0.0689386], [1.53094], [0.502147], [2.05363], [-4.29208], [-5.33235]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::fisher_f_distribution<double>]': '[0.985495, 37.0159, 0.0297462, 0.482242, 0.644119, 0.863621, 0.0262075, 0.0242743, 18.1769, 2.5946, 61.3344, 0.318418, 2.20676, 5.69156, 1.22312, 0.638308, 0.0431261, 2.35496e-05, 0.944331, 7.78522, 13.5137, 265.892, 359.913, 0.0196853, 2.31688, 1719.26, 0.88807, 0.356582, 0.426315, 0.943793, 0.0332548, 90.5657]' != '[[0.985495], [37.0159], [0.0297462], [0.482242], [0.644119], [0.863621], [0.0262075], [0.0242743], [18.1769], [2.5946], [61.3344], [0.318418], [2.20676], [5.69156], [1.22312], [0.638308], [0.0431261], [2.35496e-05], [0.944331], [7.78522], [13.5137], [265.892], [359.913], [0.0196853], [2.31688], [1719.26], [0.88807], [0.356582], [0.426315], [0.943793], [0.0332548], [90.5657]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::fisher_f_distribution<double>]': '[0.000536954, 1.00784, 0.637966, 0.228904, 0.00924382, 1.13754, 0.0845261, 1.04096e+06, 246.984, 0.0323901, 0.00531824, 0.00358034, 8.29665, 1.40917, 1006.77, 3.50985, 4.60031, 0.360804, 0.961582, 0.212368, 0.406729, 988.625, 0.254808, 4.8013, 0.546344, 2.27266, 3.28646, 0.000278716, 0.0144525, 8.23121, 5.12869, 1.85015]' != '[[0.000536954], [1.00784], [0.637966], [0.228904], [0.00924382], [1.13754], [0.0845261], [1.04096e+06], [246.984], [0.0323901], [0.00531824], [0.00358034], [8.29665], [1.40917], [1006.77], [3.50985], [4.60031], [0.360804], [0.961582], [0.212368], [0.406729], [988.625], [0.254808], [4.8013], [0.546344], [2.27266], [3.28646], [0.000278716], [0.0144525], [8.23121], [5.12869], [1.85015]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::student_t_distribution<double>]': '[-0.774231, -0.153183, 0.0372515, 1.90229, 1.92155, 1.15424, 3.34977, -7.02299, -22.0139, -5.26465, 0.538061, -5.54663, 6.42244, -0.423953, -0.690078, -1.60318, -0.157842, -0.681753, 0.182721, -0.348519, 0.465761, 1.81663, -0.566082, 2.98389, -7.46501, 1.56704, -2.77097, 0.067053, 1.10687, -0.825478, 0.930204, 0.4337]' != '[[-0.774231], [-0.153183], [0.0372515], [1.90229], [1.92155], [1.15424], [3.34977], [-7.02299], [-22.0139], [-5.26465], [0.538061], [-5.54663], [6.42244], [-0.423953], [-0.690078], [-1.60318], [-0.157842], [-0.681753], [0.182721], [-0.348519], [0.465761], [1.81663], [-0.566082], [2.98389], [-7.46501], [1.56704], [-2.77097], [0.067053], [1.10687], [-0.825478], [0.930204], [0.4337]]'
126: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tests/unit/plugins/matrixops/random_distributions.cpp(94): test 'phylanx::ir::node_data<T>(std::move(v)) == phylanx::execution_tree::extract_node_data<T>(result)' failed in function 'void generate_1d(const phylanx::execution_tree::compiler::function&, Gen&, Dist&) [with T = double; Gen = std::mersenne_twister_engine<long unsigned int, 32, 624, 397, 31, 2567483615, 11, 4294967295, 7, 2636928640, 15, 4022730752, 18, 1812433253>; Dist = std::student_t_distribution<double>]': '[-0.0327367, -6.96851, -1.04806, 3.45067, 1.36556, -0.729179, 0.809656, 8.57641, -0.938686, -0.844625, -1.67079, -0.216872, 1.15745, -0.178408, -0.0101031, -0.329908, -7.56527, 0.35976, 0.351581, -0.0261269, 59.5263, 57.0646, -3.1862, 1.37227, -1.8856, 0.709767, 1.13008, -0.138157, 0.297846, 4.13691, -4.78464, -0.136079]' != '[[-0.0327367], [-6.96851], [-1.04806], [3.45067], [1.36556], [-0.729179], [0.809656], [8.57641], [-0.938686], [-0.844625], [-1.67079], [-0.216872], [1.15745], [-0.178408], [-0.0101031], [-0.329908], [-7.56527], [0.35976], [0.351581], [-0.0261269], [59.5263], [57.0646], [-3.1862], [1.37227], [-1.8856], [0.709767], [1.13008], [-0.138157], [0.297846], [4.13691], [-4.78464], [-0.136079]]'
126: 0 sanity checks and 33 tests failed.
126: Base command is "/var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/random_distributions_test --hpx:threads=1 --hpx:localities=1"
126: Executing command: /var/lib/buildbot/slaves/phylanx/x86_64-gcc7-release/build/tools/buildbot/build-delphi-x86_64-Linux-gcc/phylanx-Release/bin/random_distributions_test --hpx:threads=1 --hpx:localities=1 --hpx:node=0
1/1 Test #126: tests.unit.plugins.matrixops.random_distributions ...***Failed 0.55 sec
0% tests passed, 1 tests failed out of 1
Total Test time (real) = 0.59 sec
The following tests FAILED:
126 - tests.unit.plugins.matrixops.random_distributions (Failed)
Errors while running CTest
```
built on x86_64 linux with gcc 7.1
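The failing assertions above all show the same pattern: a 1-D sequence of values compared against a column of one-element rows. A minimal sketch of that shape mismatch, using numpy as a stand-in for phylanx's node_data (an assumption; phylanx uses its own tensor types internally):

```python
import numpy as np

expected = np.array([0.0796135, 0.691332, 0.0893177])  # 1-D, shape (3,)
result = expected.reshape(-1, 1)                       # column matrix, shape (3, 1)

# The element values agree, but the shapes differ, so any strict
# shape-aware equality check fails -- exactly the '[a, b]' != '[[a], [b]]'
# pattern in the log.
assert expected.shape != result.shape
assert np.array_equal(expected, result.ravel())
```

This suggests the test binary produced an n×1 matrix where the reference data is an n-element vector (or vice versa), rather than any numeric discrepancy in the sampled values.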
|
non_process
|
tests unit plugins matrixops random distributions buildbot test failing again another test that only fails for the buildbot user ctest v r tests unit plugins matrixops random distributions updatectestconfiguration from var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release dartconfiguration tcl parse config file var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release dartconfiguration tcl updatectestconfiguration from var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release dartconfiguration tcl parse config file var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release dartconfiguration tcl test project var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release constructing a list of tests done constructing a list of tests updating test list for fixtures added tests to meet fixture requirements checking test dependency graph checking test dependency graph end test start tests unit plugins matrixops random distributions test command packages python ssl bin var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release bin phylanxrun py var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release bin random distributions test e l t v test timeout computed to be var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler 
function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit 
plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v 
phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate 
const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves 
phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist var lib buildbot slaves phylanx release build tests unit plugins matrixops random distributions cpp test phylanx ir node data std move v phylanx execution tree extract node data result failed in function void generate const phylanx execution tree compiler function gen dist sanity checks and tests failed base command is var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release bin random distributions test hpx threads hpx localities executing command var lib buildbot slaves phylanx release build tools buildbot build delphi linux gcc phylanx release bin random distributions test hpx threads hpx localities hpx node test tests unit plugins matrixops random distributions failed sec tests passed tests failed out of total test time real sec the following tests failed tests unit plugins matrixops random distributions failed errors while running ctest built on linux with gcc
| 0
|
97,385
| 12,231,677,226
|
IssuesEvent
|
2020-05-04 08:15:46
|
tesshucom/jpsonic
|
https://api.github.com/repos/tesshucom/jpsonic
|
opened
|
Elimination of position increment problem
|
in : search status: pending-design-work
|
Common problems with phrase searches.
Consideration of how to eliminate the case where records with many delimiters and stop words are missing from search results. Consider ways that limit the loss of search accuracy and minimize data growth.
- This problem is unlikely to occur in Japanese, because this case was anticipated in advance.
- The search method can be changed to legacy mode (a method with many false positives but no omissions).
- In fact, it does not happen that often.
Therefore, Jpsonic will release this new search feature despite this issue.
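The position-increment problem described above can be sketched with a toy tokenizer (hypothetical code, not Jpsonic's actual Lucene analyzer):

```python
STOP_WORDS = {"the", "of", "a"}

def positions(text):
    """Map each non-stop-word token to its original position indexes."""
    out = {}
    for i, tok in enumerate(text.lower().split()):
        if tok not in STOP_WORDS:
            out.setdefault(tok, []).append(i)
    return out

idx = positions("sound of the music")
# "sound" sits at position 0 and "music" at position 3: the removed stop
# words leave a gap, so a phrase query that expects adjacent positions
# misses the record even though both terms are present.
assert idx["music"][0] - idx["sound"][0] == 3
```

Preserving position increments across removed tokens (as Lucene analyzers can be configured to do) trades some index growth for fewer missed phrase matches, which is the accuracy/data-growth balance this issue weighs.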
|
1.0
|
Elimination of position increment problem - Common problems with phrase searches.
Consideration of how to eliminate the case where records with many delimiters and stop words are missing from search results. Consider ways that limit the loss of search accuracy and minimize data growth.
- This problem is unlikely to occur in Japanese, because this case was anticipated in advance.
- The search method can be changed to legacy mode (a method with many false positives but no omissions).
- In fact, it does not happen that often.
Therefore, Jpsonic will release this new search feature despite this issue.
|
non_process
|
elimination of position increment problem common problems with phrase searches consideration on how to eliminate the case where records with many delimiters and stop words are missing from the search consider ways to reduce search accuracy and minimize data growth this problem is unlikely to occur in japanese because it was assumed in advance search method can be changed to legacy mode a method with many false searches but no omissions in fact it s not that often therefore jpsonic will be release this new search feature despite this issue
| 0
|
15,339
| 19,480,234,644
|
IssuesEvent
|
2021-12-25 04:58:19
|
emily-writes-poems/emily-writes-poems-processing
|
https://api.github.com/repos/emily-writes-poems/emily-writes-poems-processing
|
closed
|
add confirmation that select/create feature worked
|
processing refinement
|
similar to emily-writes-poems/emily-writes-poems-scripts#7 - add a confirmation message that a feature was set or created successfully (displaying a message that disappears after a while)
|
1.0
|
add confirmation that select/create feature worked - similar to emily-writes-poems/emily-writes-poems-scripts#7 - add a confirmation message that a feature was set or created successfully (displaying a message that disappears after a while)
|
process
|
add confirmation that select create feature worked similar to emily writes poems emily writes poems scripts add a confirmation message that a feature was set or created successfully displaying a message that disappears after a while
| 1
|
207,214
| 15,797,593,816
|
IssuesEvent
|
2021-04-02 17:01:18
|
Slimefun/Slimefun4
|
https://api.github.com/repos/Slimefun/Slimefun4
|
opened
|
mcMMO AbilityBuff bug
|
🎯 Needs testing 🐞 Bug Report
|
## :round_pushpin: Description (REQUIRED)
When cargo transports an item, Slimefun doesn't check whether the item has an AbilityBuff.
For example. We can keep the dig_speed enchantment from Super Breaker Skill like this:
## :bookmark_tabs: Steps to reproduce the Issue (REQUIRED)
1. Activate Super Breaker Skill with a pickaxe
2. Put the pickaxe in an item frame
3. Wait for the skill to end
4. Remove the pickaxe from the frame. Let it go into a hopper.
5. Transport the pickaxe into an Auto-Disenchanter by cargo.
6. The dig_speed buff enchantment isn't removed. Then just disenchant it to make the enchantment permanent.
7. put the pickaxe and the enchant_book in Auto-Enchanter.
8. Repeat the operation. You can get unlimited enchantment level.
## :bulb: Expected behavior (REQUIRED)
Run SkillUtils#removeAbilityBuff when transporting an item,
or handle it somehow in the machine.
## :scroll: Server Log
None
## :open_file_folder: /error-reports/ Folder
None
## :compass: Environment (REQUIRED)
Tuinity git-Tuinity-"1b0d783" (MC: 1.16.5)
Slimefun DEV - 853 (git f4bded94)
Metrics-Module #21
Java 11
Installed Addons: (3)
SlimefunItemId v1.0
Residence v4.9.3.3
CMI v8.8.2.2
|
1.0
|
mcMMO AbilityBuff bug - ## :round_pushpin: Description (REQUIRED)
When cargo transports an item, Slimefun doesn't check whether the item has an AbilityBuff.
For example. We can keep the dig_speed enchantment from Super Breaker Skill like this:
## :bookmark_tabs: Steps to reproduce the Issue (REQUIRED)
1. Activate Super Breaker Skill with a pickaxe
2. Put the pickaxe in an item frame
3. Wait for the skill to end
4. Remove the pickaxe from the frame. Let it go into a hopper.
5. Transport the pickaxe into an Auto-Disenchanter by cargo.
6. The dig_speed buff enchantment isn't removed. Then just disenchant it to make the enchantment permanent.
7. put the pickaxe and the enchant_book in Auto-Enchanter.
8. Repeat the operation. You can get unlimited enchantment level.
## :bulb: Expected behavior (REQUIRED)
Run SkillUtils#removeAbilityBuff when transporting an item,
or handle it somehow in the machine.
## :scroll: Server Log
None
## :open_file_folder: /error-reports/ Folder
None
## :compass: Environment (REQUIRED)
Tuinity git-Tuinity-"1b0d783" (MC: 1.16.5)
Slimefun DEV - 853 (git f4bded94)
Metrics-Module #21
Java 11
Installed Addons: (3)
SlimefunItemId v1.0
Residence v4.9.3.3
CMI v8.8.2.2
|
non_process
|
mcmmo abilitybuff bug round pushpin description required when cargo transports a item slimefun doesn t check if the item has a abilitybuff for example we can keep the dig speed enchantment from super breaker skill like this bookmark tabs steps to reproduce the issue required activate super breaker skill with a pickaxe put the pickaxe in a item frame wait for the skill to end remove the pickaxe from the frame let it goes into a hopper transport the pickaxe into a auto disenchanter by cargo the dig speed buff enchantment doesn t remove then just disenchant it to stable the enchantment put the pickaxe and the enchant book in auto enchanter repeat the operation you can get unlimited enchantment level bulb expected behavior required run skillutils removeabilitybuff when transporting a item or handlering something in machine scroll server log none open file folder error reports folder none compass environment required tuinity git tuinity mc slimefun dev git metrics module java installed addons slimefunitemid residence cmi
| 0
|
19,381
| 25,518,864,997
|
IssuesEvent
|
2022-11-28 18:38:51
|
kubernetes/minikube
|
https://api.github.com/repos/kubernetes/minikube
|
closed
|
kicbase automation: include the changelog in the PR since last stable kic release
|
kind/feature priority/important-soon lifecycle/frozen kind/process co/kic-base
|
I think we can steal the Git logic from the dependabot code
example https://github.com/kubernetes/minikube/pull/10561
https://github.com/dependabot/dependabot-core
|
1.0
|
kicbase automation: include the changelog in the PR since last stable kic release - I think we can steal the Git logic from the dependabot code
example https://github.com/kubernetes/minikube/pull/10561
https://github.com/dependabot/dependabot-core
|
process
|
kicbase automation include the changelog in the pr since last stable kic release i think we can still the git logic from the dependbot code example
| 1
|
17,857
| 23,805,697,318
|
IssuesEvent
|
2022-09-04 01:52:19
|
WForst-Breeze/glassplus-developerkit
|
https://api.github.com/repos/WForst-Breeze/glassplus-developerkit
|
closed
|
Known bug: Some commands cannot provide execution feedback
|
BUG 漏洞 PROCESSED 已处理
|
Some commands cannot provide execution feedback, and such feedback is very necessary.
For example: /sp, /tsd, /tsno, /tsni, /tsmn etc.
|
1.0
|
Known bug: Some commands cannot provide execution feedback - Some commands cannot provide execution feedback, and such feedback is very necessary.
For example: /sp, /tsd, /tsno, /tsni, /tsmn etc.
|
process
|
known bug some commands cannot provide execution feedback some commands cannot provide execution feedback this is very necessary for example sp tsd tsno tsni tsmn etc
| 1
|
6,026
| 8,824,606,295
|
IssuesEvent
|
2019-01-02 17:46:07
|
googleapis/nodejs-error-reporting
|
https://api.github.com/repos/googleapis/nodejs-error-reporting
|
closed
|
Move samples to this repo
|
help wanted priority: p2 type: process
|
Hi there, our docs-samples repo still has some [error-reporting](https://github.com/GoogleCloudPlatform/nodejs-docs-samples/tree/master/error-reporting) samples. For Node.js, all samples live with their client library if they have one. Can you move the samples over, please? Thanks! cc @ofrobots
|
1.0
|
Move samples to this repo - Hi there, our docs-samples repo still has some [error-reporting](https://github.com/GoogleCloudPlatform/nodejs-docs-samples/tree/master/error-reporting) samples. For Node.js, all samples live with their client library if they have one. Can you move the samples over, please? Thanks! cc @ofrobots
|
process
|
move samples to this repo hi there our docs samples repo still have some samples for node js all samples live with their client library if they have one can you move the samples over please thanks cc ofrobots
| 1
|
289,519
| 8,871,930,823
|
IssuesEvent
|
2019-01-11 14:08:01
|
liam2/liam2
|
https://api.github.com/repos/liam2/liam2
|
closed
|
add support for pyqt5 or pyside
|
bug enhancement inprogress priority: high
|
Since we recommend installing using anaconda and anaconda bundles Qt5 by default, this could be considered a bug. I guess it is safer to use qtpy so that we stay compatible with Qt4.
|
1.0
|
add support for pyqt5 or pyside - Since we recommend installing using anaconda and anaconda bundles Qt5 by default, this could be considered a bug. I guess it is safer to use qtpy so that we stay compatible with Qt4.
|
non_process
|
add support for or pyside since we recommend installing using anaconda and anaconda bundles by default this could be considered a bug i guess it is safer to use qtpy so that we stay compatible with
| 0
|
2,990
| 8,670,249,676
|
IssuesEvent
|
2018-11-29 16:05:15
|
poanetwork/blockscout
|
https://api.github.com/repos/poanetwork/blockscout
|
opened
|
Import block reward from beneficiaries
|
enhancement priority: high team: architecture
|
We are querying the Parity `block_trace` call but are not extracting the block rewards from each block. We should start importing this data into the `block_rewards` table. This table should also include a `fetched_at` time which indicates if this process has been completed. `NULL` should be used if we have not received a block reward for a particular block.
The catchup fetcher should run async looking for any missed block rewards.
|
1.0
|
Import block reward from beneficiaries - We are querying the Parity `block_trace` call but are not extracting the block rewards from each block. We should start importing this data into the `block_rewards` table. This table should also include a `fetched_at` time which indicates if this process has been completed. `NULL` should be used if we have not received a block reward for a particular block.
The catchup fetcher should run async looking for any missed block rewards.
|
non_process
|
import block reward from beneficiaries we are querying the parity block trace call but are not extracting the block rewards from each block we should start importing this data into the block rewards table this table should also include a fetched at time which indicates if this process has been completed null should be used if we have not received a block reward for a particular block the catchup fetcher should run async looking for any missed block rewards
| 0
|
18,217
| 24,275,460,243
|
IssuesEvent
|
2022-09-28 13:34:09
|
googleapis/python-bigquery
|
https://api.github.com/repos/googleapis/python-bigquery
|
closed
|
blacken samples using templated noxfile
|
api: bigquery type: process samples
|
This repository has three historical samples locations (see https://github.com/googleapis/python-bigquery/issues/790 for the issue to migrate them all to the most recent layout, supported by templates), but this issue only refers to the samples in (3). The samples in (1, 2) should remain blackened by the existing mechanism:
1. `docs/snippets.py`
2. `samples/*.py` + `samples/tests/*.py`
3. `samples/snippets/*.py`, `samples/geography/*.py`, `samples/magics/*.py`, and possibly more in the future, see: https://github.com/googleapis/python-bigquery/issues/1352.
For this issue, I anticipate the following updates:
1. Update the blacken session in noxfile.py to only blacken the samples in legacy directories. That is https://github.com/googleapis/python-bigquery/blob/34a3f5cf34d4a08889fe1407f4ad6ce3c9d93838/noxfile.py#L28 should be updated to `BLACK_PATHS = ("docs", "google", "samples/*.py", "samples/tests/*.py", "tests", "noxfile.py", "setup.py")` or similar.
2. Update owlbot config to make sure the code samples in `samples/snippets|geography|etc` stay blackened. We still want to run `nox -s blacken` in the root directory but then we also want to run `nox -s blacken` in each samples directory that has a `noxfile.py`. See the implementation in python-bigquery-datatransfer for reference: https://github.com/googleapis/python-bigquery-datatransfer/blob/822223fc02b1e2ff6f52b834c51f4bf46924e2e8/owlbot.py#L60-L61
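The per-directory blacken runs in step 2 could be sketched like this (a hypothetical helper; the real change lives in owlbot.py, and the paths are assumptions based on the layout described above):

```python
import pathlib
import subprocess

def blacken_targets(root):
    """Samples directories that ship their own noxfile.py and therefore
    need their own `nox -s blacken` run (hypothetical helper)."""
    return sorted(p.parent for p in pathlib.Path(root).glob("samples/*/noxfile.py"))

def blacken_all(root):
    # Run the root session first, then one per templated samples directory.
    subprocess.run(["nox", "-s", "blacken"], cwd=root, check=True)
    for directory in blacken_targets(root):
        subprocess.run(["nox", "-s", "blacken"], cwd=directory, check=True)
```

Legacy directories without their own noxfile.py (e.g. `samples/tests`) are skipped here and stay covered by the root session's `BLACK_PATHS`, matching the split the issue proposes.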
|
1.0
|
blacken samples using templated noxfile - This repository has three historical samples locations (see https://github.com/googleapis/python-bigquery/issues/790 for the issue to migrate them all to the most recent layout, supported by templates), but this issue only refers to the samples in (3). The samples in (1, 2) should remain blackened by the existing mechanism:
1. `docs/snippets.py`
2. `samples/*.py` + `samples/tests/*.py`
3. `samples/snippets/*.py`, `samples/geography/*.py`, `samples/magics/*.py`, and possibly more in the future, see: https://github.com/googleapis/python-bigquery/issues/1352.
For this issue, I anticipate the following updates:
1. Update the blacken session in noxfile.py to only blacken the samples in legacy directories. That is https://github.com/googleapis/python-bigquery/blob/34a3f5cf34d4a08889fe1407f4ad6ce3c9d93838/noxfile.py#L28 should be updated to `BLACK_PATHS = ("docs", "google", "samples/*.py", "samples/tests/*.py", "tests", "noxfile.py", "setup.py")` or similar.
2. Update owlbot config to make sure the code samples in `samples/snippets|geography|etc` stay blackened. We still want to run `nox -s blacken` in the root directory but then we also want to run `nox -s blacken` in each samples directory that has a `noxfile.py`. See the implementation in python-bigquery-datatransfer for reference: https://github.com/googleapis/python-bigquery-datatransfer/blob/822223fc02b1e2ff6f52b834c51f4bf46924e2e8/owlbot.py#L60-L61
|
process
|
blacken samples using templated noxfile this repository has three historical samples locations see for the issue to migrate them all to the most recent layout supported by templates but this issue only refers to the samples in the samples in should remain blackened by the existing mechanism docs snippets py samples py samples tests py samples snippets py samples geography py samples magics py and possibly more in the future see for this issue i anticipate the following updates update the blacken session in noxfile py to only blacken the samples in legacy directories that is should be updated to black paths docs google samples py samples tests py tests noxfile py setup py or similar update owlbot config to make sure the code samples in samples snippets geography etc stay blackened we still want to run nox s blacken in the root directory but then we also want to run nox s blacken in each samples directory that has a noxfile py see the implementation in python bigquery datatransfer for reference
| 1
|
10,865
| 13,634,822,860
|
IssuesEvent
|
2020-09-25 01:01:38
|
conventional-commits/conventionalcommits.org
|
https://api.github.com/repos/conventional-commits/conventionalcommits.org
|
opened
|
October Meeting
|
process
|
## Proposal
I would like to suggest that we have our first monthly meeting in October, where:
1. start to figure out some governance for this project.
2. divide up some of the work triaging Pull Requests, and commenting on issues.
This issue will be updated with a Hangout link, agenda, and a date/time once we can agree on one.
## Call to action
Join the conventional commits channel in [this chat room](http://devtoolscommunity.herokuapp.com/) if you're interested in joining the meeting; and we can pick a time.
|
1.0
|
October Meeting - ## Proposal
I would like to suggest that we have our first monthly meeting in October, where:
1. start to figure out some governance for this project.
2. divide up some of the work triaging Pull Requests, and commenting on issues.
This issue will be updated with a Hangout link, agenda, and a date/time once we can agree on one.
## Call to action
Join the conventional commits channel in [this chat room](http://devtoolscommunity.herokuapp.com/) if you're interested in joining the meeting; and we can pick a time.
|
process
|
october meeting proposal i would like to suggest that we have our first monthly meeting in october where start to figure out some governance for this project divide up some of the work triaging pull requests and commenting on issues this issue will be updated with a hangout link agenda and a date time once we can agree on one call to action join the conventional commits channel in if you re interested in joining the meeting and we can pick a time
| 1
|
9,373
| 12,374,270,542
|
IssuesEvent
|
2020-05-19 01:03:15
|
allinurl/goaccess
|
https://api.github.com/repos/allinurl/goaccess
|
closed
|
slow processing when using db files as the database grows (Database currently around 8GB)
|
log-processing on-disk question
|
Hello,
I used goaccess to parse the access log from multiple load balanced webservers. I have data from Feb until July of 2019 imported using the following command line but it feels like things are getting slower as the database gets bigger:
```
goaccess LOG_FILE --keep-db-files --load-from-disk --db-path=/home/logs/db/ --log-format=COMBINED -o /var/www/html/report/index.html
```
```
[root@las-elk01 temp]# du -hs /home/logs/db/
8.1G /home/logs/db/
```
This is a test processing a file with 200 lines of logs
```
[root@las-elk01 temp]# goaccess test.log --keep-db-files --load-from-disk --db-path=/home/logs/db/ --log-format=COMBINED -o /var/www/html/report/test.html
[root@las-elk01 temp]# time goaccess test.log --keep-db-files --load-from-disk --db-path=/home/logs/db/ --log-format=COMBINED -o /var/www/html/report/test.html
real 0m32.128s
user 0m28.479s
sys 0m2.042s
```
Is this performance degradation normal as the db data gets larger? Is there anything I can do to speed up the processing of new data?
Thanks.
|
1.0
|
slow processing when using db files as the database grows (Database currently around 8GB) - Hello,
I used goaccess to parse the access log from multiple load balanced webservers. I have data from Feb until July of 2019 imported using the following command line but it feels like things are getting slower as the database gets bigger:
```
goaccess LOG_FILE --keep-db-files --load-from-disk --db-path=/home/logs/db/ --log-format=COMBINED -o /var/www/html/report/index.html
```
```
[root@las-elk01 temp]# du -hs /home/logs/db/
8.1G /home/logs/db/
```
This is a test processing a file with 200 lines of logs
```
[root@las-elk01 temp]# goaccess test.log --keep-db-files --load-from-disk --db-path=/home/logs/db/ --log-format=COMBINED -o /var/www/html/report/test.html
[root@las-elk01 temp]# time goaccess test.log --keep-db-files --load-from-disk --db-path=/home/logs/db/ --log-format=COMBINED -o /var/www/html/report/test.html
real 0m32.128s
user 0m28.479s
sys 0m2.042s
```
Is this performance degradation normal as the db data gets larger? Is there anything I can do to speed up the processing of new data?
Thanks.
|
process
|
slow processing when using db files as the database grow database currently around hello i used goaccess to parse the access log from multiple load balanced webservers i have data from feb until july of imported using the following command line but it feels like things are getting slower as the database gets bigger goaccess log file keep db files load from disk db path home logs db log format combined o var www html report index html du hs home logs db home logs db this is a test processing a file with lines of logs goaccess test log keep db files load from disk db path home logs db log format combined o var www html report test html time goaccess test log keep db files load from disk db path home logs db log format combined o var www html report test html real user sys is this performance degradation normal as the db data gets larger is there anything i can do to speed the process of new data thanks
| 1
|
8,133
| 11,319,741,003
|
IssuesEvent
|
2020-01-21 01:02:36
|
SharryChoo/blog-gittalk
|
https://api.github.com/repos/SharryChoo/blog-gittalk
|
opened
|
Android System Architecture: Starting the SystemServer Process - Sharry's blog
|
Gitalk android-source-systemserver-process-start
|
https://sharrychoo.github.io/blog/android-source/systemserver-process-start
Preface: While analyzing the startup of Zygote, we noticed that it calls the ZygoteInit.forkSystemServer function to create the system server process. Here we trace the creation of the system server process; before that, let's first review how the SystemServer process is launched.
|
1.0
|
Android System Architecture: Starting the SystemServer Process - Sharry's blog - https://sharrychoo.github.io/blog/android-source/systemserver-process-start
Preface: While analyzing the startup of Zygote, we noticed that it calls the ZygoteInit.forkSystemServer function to create the system server process. Here we trace the creation of the system server process; before that, let's first review how the SystemServer process is launched.
|
process
|
android system architecture starting the systemserver process sharry s blog preface while analyzing the startup of zygote we noticed that it calls the zygoteinit forksystemserver function to create the system server process here we trace the creation of the system server process before that let s first review how the systemserver process is launched
| 1
|
19,298
| 25,466,454,374
|
IssuesEvent
|
2022-11-25 05:13:41
|
GoogleCloudPlatform/fda-mystudies
|
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
|
closed
|
[IDP] [PM] Getting an error message in add new admin screen
|
Bug P0 Participant manager Process: Fixed Process: Tested QA Process: Tested dev
|
**Steps:**
1. Login to PM
2. Click on 'Admins' tab
3. Click on 'Add new admin' button and Verify
**AR:** Getting an error message as 'User does not exist' in add new admin screen
**ER:** Error message should not get displayed

|
3.0
|
[IDP] [PM] Getting an error message in add new admin screen - **Steps:**
1. Login to PM
2. Click on 'Admins' tab
3. Click on 'Add new admin' button and Verify
**AR:** Getting an error message as 'User does not exist' in add new admin screen
**ER:** Error message should not get displayed

|
process
|
getting an error message in add new admin screen steps login to pm click on admins tab click on add new admin button and verify ar getting an error message as user does not exist in add new admin screen er error message should not get displayed
| 1
|
708,101
| 24,330,177,577
|
IssuesEvent
|
2022-09-30 18:36:52
|
vmware-tanzu/tanzu-framework
|
https://api.github.com/repos/vmware-tanzu/tanzu-framework
|
closed
|
`make build-cli` fails to build with Go 1.18
|
area/cli priority/important-soon area/release kind/bug area/iam
|
**Bug description**
Repo fails to build when `make` targets are invoked with Go 1.18.
```
$ make build-cli-local
BUILD_TAGS set to 'embedproviders'
go mod download
go mod tidy
EMBED_PROVIDERS_TAG=embedproviders
...
...
./hack/embed-pinniped-binary.sh go darwin amd64 v0.4.4 v0.12.1
+ GO=go
+ shift
+ GOOS=darwin
+ shift
+ GOARCH=amd64
+ shift
+ (( 2 ))
+ pinniped_version=v0.4.4
+ pinniped_binary=cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4
+ echo 'embed-pinniped-binary.sh: building pinniped version '\''v0.4.4'\'' to '\''cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4'\'''
embed-pinniped-binary.sh: building pinniped version 'v0.4.4' to 'cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4'
+ pushd pinniped
+ git checkout v0.4.4
Note: switching to 'v0.4.4'.
You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.
If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
git switch -c <new-branch-name>
Or undo this operation with:
git switch -
Turn off this advice by setting config variable advice.detachedHead to false
HEAD is now at f46de56b Fix broken upstream OIDC discovery timeout added in previous commit
+ GOARCH=amd64
+ GOOS=darwin
+ go build -o ../cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4 ./cmd/pinniped
# golang.org/x/sys/unix
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/syscall_darwin.1_13.go:29:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.1_13.go:27:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.1_13.go:40:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:28:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:43:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:59:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:75:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:90:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:105:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:121:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:121:3: too many errors
make: *** [Makefile:280: build-cli-local-darwin-amd64] Error 2
```
The solution is for `pinniped` to update its `golang.org/x/sys` dependency to latest (to support Go 1.18) and for framework to check out the `pinniped` ref. The solution is confirmed by the following workaround, which builds the binaries correctly:
```diff
diff --git a/hack/embed-pinniped-binary.sh b/hack/embed-pinniped-binary.sh
index 3f0bfacb..ed4b340e 100755
--- a/hack/embed-pinniped-binary.sh
+++ b/hack/embed-pinniped-binary.sh
@@ -27,7 +27,9 @@ while (( "$#" )); do
pushd pinniped >/dev/null
git checkout "$pinniped_version"
+ ${GO} get -u golang.org/x/sys
GOARCH=${GOARCH} GOOS=${GOOS} ${GO} build -o "../${pinniped_binary}" ./cmd/pinniped
+ git checkout -- go.mod go.sum
popd >/dev/null
git update-index --assume-unchanged "$pinniped_binary"
```
**Affected product area (please put an X in all that apply)**
- [ ] APIs
- [ ] Addons
- [x] CLI
- [ ] Docs
- [ ] IAM
- [ ] Installation
- [ ] Plugin
- [ ] Security
- [x] Test and Release
- [ ] User Experience
**Expected behavior**
`make build-cli-*` targets successfully build all the binaries.
**Steps to reproduce the bug**
Described above.
**Version** (include the SHA if the version is not obvious)
```
$ tanzu version
version: v0.18.0-dev
buildDate: 2022-03-17
sha: 75dbb80c
```
**Environment where the bug was observed (cloud, OS, etc)**
**Relevant Debug Output (Logs, manifests, etc)**
|
1.0
|
`make build-cli` fails to build with Go 1.18 - **Bug description**
Repo fails to build when `make` targets are invoked with Go 1.18.
```
$ make build-cli-local
BUILD_TAGS set to 'embedproviders'
go mod download
go mod tidy
EMBED_PROVIDERS_TAG=embedproviders
...
...
./hack/embed-pinniped-binary.sh go darwin amd64 v0.4.4 v0.12.1
+ GO=go
+ shift
+ GOOS=darwin
+ shift
+ GOARCH=amd64
+ shift
+ (( 2 ))
+ pinniped_version=v0.4.4
+ pinniped_binary=cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4
+ echo 'embed-pinniped-binary.sh: building pinniped version '\''v0.4.4'\'' to '\''cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4'\'''
embed-pinniped-binary.sh: building pinniped version 'v0.4.4' to 'cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4'
+ pushd pinniped
+ git checkout v0.4.4
Note: switching to 'v0.4.4'.
You are in 'detached HEAD' state. You can look around, make experimental
changes and commit them, and you can discard any commits you make in this
state without impacting any branches by switching back to a branch.
If you want to create a new branch to retain commits you create, you may
do so (now or later) by using -c with the switch command. Example:
git switch -c <new-branch-name>
Or undo this operation with:
git switch -
Turn off this advice by setting config variable advice.detachedHead to false
HEAD is now at f46de56b Fix broken upstream OIDC discovery timeout added in previous commit
+ GOARCH=amd64
+ GOOS=darwin
+ go build -o ../cmd/cli/plugin/pinniped-auth/asset/pinniped-v0.4.4 ./cmd/pinniped
# golang.org/x/sys/unix
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/syscall_darwin.1_13.go:29:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.1_13.go:27:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.1_13.go:40:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:28:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:43:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:59:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:75:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:90:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:105:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:121:3: //go:linkname must refer to declared function or variable
../../../../../pkg/mod/golang.org/x/sys@v0.0.0-20201112073958-5cba982894dd/unix/zsyscall_darwin_amd64.go:121:3: too many errors
make: *** [Makefile:280: build-cli-local-darwin-amd64] Error 2
```
The solution is for `pinniped` to update its `golang.org/x/sys` dependency to latest (to support Go 1.18) and for framework to check out the `pinniped` ref. The solution is confirmed by the following workaround, which builds the binaries correctly:
```diff
diff --git a/hack/embed-pinniped-binary.sh b/hack/embed-pinniped-binary.sh
index 3f0bfacb..ed4b340e 100755
--- a/hack/embed-pinniped-binary.sh
+++ b/hack/embed-pinniped-binary.sh
@@ -27,7 +27,9 @@ while (( "$#" )); do
pushd pinniped >/dev/null
git checkout "$pinniped_version"
+ ${GO} get -u golang.org/x/sys
GOARCH=${GOARCH} GOOS=${GOOS} ${GO} build -o "../${pinniped_binary}" ./cmd/pinniped
+ git checkout -- go.mod go.sum
popd >/dev/null
git update-index --assume-unchanged "$pinniped_binary"
```
**Affected product area (please put an X in all that apply)**
- [ ] APIs
- [ ] Addons
- [x] CLI
- [ ] Docs
- [ ] IAM
- [ ] Installation
- [ ] Plugin
- [ ] Security
- [x] Test and Release
- [ ] User Experience
**Expected behavior**
`make build-cli-*` targets successfully build all the binaries.
**Steps to reproduce the bug**
Described above.
**Version** (include the SHA if the version is not obvious)
```
$ tanzu version
version: v0.18.0-dev
buildDate: 2022-03-17
sha: 75dbb80c
```
**Environment where the bug was observed (cloud, OS, etc)**
**Relevant Debug Output (Logs, manifests, etc)**
|
non_process
|
make build cli fails to build with go bug description repo fails to build when make targets are invoked with go make build cli local build tags set to embedproviders go mod download go mod tidy embed providers tag embedproviders hack embed pinniped binary sh go darwin go go shift goos darwin shift goarch shift pinniped version pinniped binary cmd cli plugin pinniped auth asset pinniped echo embed pinniped binary sh building pinniped version to cmd cli plugin pinniped auth asset pinniped embed pinniped binary sh building pinniped version to cmd cli plugin pinniped auth asset pinniped pushd pinniped git checkout note switching to you are in detached head state you can look around make experimental changes and commit them and you can discard any commits you make in this state without impacting any branches by switching back to a branch if you want to create a new branch to retain commits you create you may do so now or later by using c with the switch command example git switch c or undo this operation with git switch turn off this advice by setting config variable advice detachedhead to false head is now at fix broken upstream oidc discovery timeout added in previous commit goarch goos darwin go build o cmd cli plugin pinniped auth asset pinniped cmd pinniped golang org x sys unix pkg mod golang org x sys unix syscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must 
refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go go linkname must refer to declared function or variable pkg mod golang org x sys unix zsyscall darwin go too many errors make error the solution is for pinniped to update it s golang org x sys dependency to latest to support go and framework to checkout the pinniped ref the solution is confirmed by the following workaround which builds the binaries correctly diff diff git a hack embed pinniped binary sh b hack embed pinniped binary sh index a hack embed pinniped binary sh b hack embed pinniped binary sh while do pushd pinniped dev null git checkout pinniped version go get u golang org x sys goarch goarch goos goos go build o pinniped binary cmd pinniped git checkout go mod go sum popd dev null git update index assume unchanged pinniped binary affected product area please put an x in all that apply apis addons cli docs iam installation plugin security test and release user experience expected behavior make build cli targets successfully builds all the binaries steps to reproduce the bug described above version include the sha if the version is not obvious tanzu version version dev builddate sha environment where the bug was observed cloud os etc relevant debug output logs manifests etc
| 0
|
10,442
| 13,221,452,218
|
IssuesEvent
|
2020-08-17 14:04:34
|
keep-network/keep-core
|
https://api.github.com/repos/keep-network/keep-core
|
closed
|
Update KEEP token dashboard for top-ups
|
:old_key: token dashboard process & client team
|
https://github.com/keep-network/keep-core/pull/1893 introduces the ability for a staker to top up an existing delegation. Adding tokens to an existing delegation is a two-step process: first, the staker initiates a top-up, locking new KEEPs in the staking contract. Once the initialization period is over, the staker (or anyone else) can commit the top-up, increasing the operator's stake.
There is one limitation. We expect the same source of tokens for a top-up as was used for the initial delegation. If the initial delegation was done from a grant, the same grant has to be used for a top-up to the operator. If the initial delegation was done using the owner's liquid tokens, liquid tokens from the same owner are required for a top-up.
An implication of this change is that tokens staked from a grant, then undelegated and recovered, no longer come back to the grant contract. Instead, they are deposited in a special escrow contract from which they can be redelegated or withdrawn, according to the grant unlocking schedule.
We need to update the KEEP token dashboard to reflect this change.
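The two-step flow described above can be modeled in plain Python (a toy sketch only; the names, the timing check, and the escrow behavior of the real Solidity contracts are assumptions):

```python
# Toy model of the two-step top-up: step 1 locks new tokens as pending,
# step 2 commits them into the stake once the initialization period is over.
# All names and the timing mechanism are illustrative assumptions.
import time


class Delegation:
    def __init__(self, stake, init_period_s=600):
        self.stake = stake
        self.init_period_s = init_period_s
        self.pending = None  # (amount, initiated_at)

    def initiate_topup(self, amount, now=None):
        # Step 1: lock new KEEPs; they do not count toward the stake yet.
        self.pending = (amount, time.time() if now is None else now)

    def commit_topup(self, now=None):
        # Step 2: anyone may commit after the initialization period elapses.
        if self.pending is None:
            raise ValueError("no pending top-up")
        amount, t0 = self.pending
        now = time.time() if now is None else now
        if now - t0 < self.init_period_s:
            raise ValueError("initialization period not over")
        self.stake += amount
        self.pending = None
```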
|
1.0
|
Update KEEP token dashboard for top-ups - https://github.com/keep-network/keep-core/pull/1893 introduces the ability for a staker to top up an existing delegation. Adding tokens to an existing delegation is a two-step process: first, the staker initiates a top-up, locking new KEEPs in the staking contract. Once the initialization period is over, the staker (or anyone else) can commit the top-up, increasing the operator's stake.
There is one limitation. We expect the same source of tokens for a top-up as was used for the initial delegation. If the initial delegation was done from a grant, the same grant has to be used for a top-up to the operator. If the initial delegation was done using the owner's liquid tokens, liquid tokens from the same owner are required for a top-up.
An implication of this change is that tokens staked from a grant, then undelegated and recovered, no longer come back to the grant contract. Instead, they are deposited in a special escrow contract from which they can be redelegated or withdrawn, according to the grant unlocking schedule.
We need to update the KEEP token dashboard to reflect this change.
|
process
|
update keep token dashboard for top ups introduces the ability for staker to top up the existing delegation adding tokens to existing delegation is a two step process first staker initiates a top up locking new keeps in the staking contract once the initialization period is over staker or anyone else can commit the top up increasing operator s stake there is one limitation we expect the same source of tokens for a top up as used for the initial delegation if the initial delegation was done from a grant the same grant has to be used for a top up to the operator if the initial delegation was done using owner s liquid tokens liquid tokens from the same owner are required for a top up an implication of this change is that tokens staked from a grant undelegated and recovered does no longer come back to the grant contract instead they are deposited in a special escrow contract from which they can be redelegated or withdrawn according to grant unlocking schedule we need to update keep token dashboard to reflect this change
| 1
|
21,622
| 30,022,545,842
|
IssuesEvent
|
2023-06-27 01:36:44
|
bazelbuild/bazel
|
https://api.github.com/repos/bazelbuild/bazel
|
closed
|
Support android_sdk toolchains created in skylark
|
P3 type: support / not a bug (process) team-Configurability stale
|
### Description of the problem / feature request:
I'm trying to declare a custom `android_sdk` toolchain in skylark without having to use the `android_sdk_repository` rule, by replicating the logic in [AndroidSdkRepositoryFunction.java](https://sourcegraph.com/github.com/bazelbuild/bazel/-/blob/src/main/java/com/google/devtools/build/lib/bazel/rules/android/AndroidSdkRepositoryFunction.java) in Skylark. Essentially we have a repo rule that downloads an SDK from GCS, unpacks it, and creates the toolchains by calling `create_android_sdk_rules`, and after that registering the toolchain with `register_toolchains("@snap_tool_android_sdk//dist:sdk-30-toolchain")`, but building `bazel build $tgt --android_sdk=@snap_tool_android_sdk//dist:sdk-30` yields `failed; build aborted: no matching toolchains found for types @bazel_tools//tools/android:sdk_toolchain_type`. Is this expected? Is there a way to create an android_sdk toolchain from skylark?
### What operating system are you running Bazel on?
Mac OS
### What's the output of `bazel info release`?
release 4.2.2.0
|
1.0
|
Support android_sdk toolchains created in skylark - ### Description of the problem / feature request:
I'm trying to declare a custom `android_sdk` toolchain in skylark without having to use the `android_sdk_repository` rule, by replicating the logic in [AndroidSdkRepositoryFunction.java](https://sourcegraph.com/github.com/bazelbuild/bazel/-/blob/src/main/java/com/google/devtools/build/lib/bazel/rules/android/AndroidSdkRepositoryFunction.java) in Skylark. Essentially we have a repo rule that downloads an SDK from GCS, unpacks it, and creates the toolchains by calling `create_android_sdk_rules`, and after that registering the toolchain with `register_toolchains("@snap_tool_android_sdk//dist:sdk-30-toolchain")`, but building `bazel build $tgt --android_sdk=@snap_tool_android_sdk//dist:sdk-30` yields `failed; build aborted: no matching toolchains found for types @bazel_tools//tools/android:sdk_toolchain_type`. Is this expected? Is there a way to create an android_sdk toolchain from skylark?
### What operating system are you running Bazel on?
Mac OS
### What's the output of `bazel info release`?
release 4.2.2.0
|
process
|
support android sdk toolchains created in skylark description of the problem feature request i m trying to declare a custom android sdk toolchain in skylark withouth having to use the android sdk repository rule y replicating the logic in in skylark essentially we have a repo rule that downloads an sdk from gcs unpacks it and creates the toolchains calling create android sdk rules and after that registering the toolchain register toolchains snap tool android sdk dist sdk toolchain but building bazel build tgt android sdk snap tool android sdk dist sdk yields failed build aborted no matching toolchains found for types bazel tools tools android sdk toolchain type is this expected is there a way to create an android sdk toolchain from skylark what operating system are you running bazel on mac os what s the output of bazel info release release
| 1
|