Dataset columns (dtype, value range or class count):

Column         Dtype           Stats
Unnamed: 0     int64           0 to 832k
id             float64         2.49B to 32.1B
type           stringclasses   1 value
created_at     stringlengths   19 to 19
repo           stringlengths   7 to 112
repo_url       stringlengths   36 to 141
action         stringclasses   3 values
title          stringlengths   1 to 744
labels         stringlengths   4 to 574
body           stringlengths   9 to 211k
index          stringclasses   10 values
text_combine   stringlengths   96 to 211k
label          stringclasses   2 values
text           stringlengths   96 to 188k
binary_label   int64           0 to 1
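The `label` column takes two string classes (`process` / `non_process`) and `binary_label` is its integer encoding (1 / 0), as the records below show. A minimal sketch of that mapping, assuming the records are available as plain Python dicts keyed by the column names above (the dump does not name a file or on-disk format, so no loading code is shown):

```python
# Hedged sketch: records are modeled as plain dicts with the columns above,
# since the dump does not specify the dataset's filename or storage format.
LABEL_MAP = {"process": 1, "non_process": 0}

def add_binary_label(records):
    """Attach binary_label, derived from the two-class `label` column."""
    for rec in records:
        rec["binary_label"] = LABEL_MAP[rec["label"]]
    return records
```

This matches the visible pairings in the records: `process` rows carry `binary_label` 1 and `non_process` rows carry 0.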
Unnamed: 0: 17,168
id: 22,743,747,611
type: IssuesEvent
created_at: 2022-07-07 07:17:12
repo: geneontology/go-ontology
repo_url: https://api.github.com/repos/geneontology/go-ontology
action: closed
title: Obsoletion notice: GO:0052018 modulation by symbiont of RNA levels in host & children
labels: multi-species process
body:
Dear all, The proposal has been made to obsolete GO:0052018 modulation by symbiont of RNA levels in host GO:1990209 negative regulation by symbiont of RNA levels in host GO:1990208 positive regulation by symbiont of RNA levels in host as well as GO:0052430 modulation by host of symbiont RNA levels There reason for obsoletion is that these are readouts. There was a single annotation to HopAB2 (Q8RSY1) from PMID:18703740 by JCVI. This has been changed to 'GO:0052034 effector-mediated suppression of host pattern-triggered immunity'. There are no mappings to these terms, these terms are not present in any subsets. You can comment on the ticket: Thanks, Pascale
index: 1.0
label: process
text:
obsoletion notice go modulation by symbiont of rna levels in host children dear all the proposal has been made to obsolete go modulation by symbiont of rna levels in host go negative regulation by symbiont of rna levels in host go positive regulation by symbiont of rna levels in host as well as go modulation by host of symbiont rna levels there reason for obsoletion is that these are readouts there was a single annotation to from pmid by jcvi this has been changed to go effector mediated suppression of host pattern triggered immunity there are no mappings to these terms these terms are not present in any subsets you can comment on the ticket thanks pascale
binary_label: 1
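The `text_combine` column is simply the record's `title` and `body` joined with " - ". A sketch of that derivation (the function name is mine, not the dataset's):

```python
def combine(title: str, body: str) -> str:
    """text_combine appears to be `title + " - " + body`, verbatim."""
    return f"{title} - {body}"
```

Every record in this dump is consistent with that rule, so `text_combine` carries no information beyond `title` and `body`.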
Unnamed: 0: 165,010
id: 12,825,793,946
type: IssuesEvent
created_at: 2020-07-06 15:29:53
repo: connext/indra
repo_url: https://api.github.com/repos/connext/indra
action: opened
title: [tests] TPS Measurement Updates
labels: Chore p1 Bugs/Tests/Blockers
body:
TPS service should be updated to give us a better idea of our limitations: - should treat the transfer rate as a function of the number of clients - clients should be pre-collateralized - should run against staging node
index: 1.0
label: non_process
text:
tps measurement updates tps service should be updated to give us a better idea of our limitations should treat the transfer rate as a function of the number of clients clients should be pre collateralized should run against staging node
binary_label: 0
Unnamed: 0: 58,611
id: 6,609,870,527
type: IssuesEvent
created_at: 2017-09-19 15:48:45
repo: LDMW/cms
repo_url: https://api.github.com/repos/LDMW/cms
action: closed
title: UI - Consistent merging throughout resource (inc why would you recommend box)
labels: please-test T25m
body:
and everything in alignment with the paragraph of body text above
index: 1.0
label: non_process
text:
ui consistent merging throughout resource inc why would you recommend box and everything in alignment with the paragraph of body text above
binary_label: 0
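The `text` column looks like a normalization of `text_combine`: URLs dropped, everything lowercased, punctuation split out, and any token containing a digit removed. The sketch below is my reconstruction from the visible examples, not a documented preprocessing recipe; the regexes and function name are assumptions:

```python
import re

# Assumed cleaning rules, reverse-engineered from the text column:
URL_RE = re.compile(r"https?://\S+")     # strip URLs before tokenizing
TOKEN_RE = re.compile(r"[a-z0-9]+")      # split on everything else

def normalize(text_combine: str) -> str:
    """Reconstructed cleaning: strip URLs, lowercase, tokenize, and
    drop any token that contains a digit (GO IDs, versions, years)."""
    no_urls = URL_RE.sub(" ", text_combine)
    tokens = TOKEN_RE.findall(no_urls.lower())
    return " ".join(t for t in tokens if not any(c.isdigit() for c in t))
```

Applied to the record above, this reproduces its `text` value exactly; it also explains why identifiers such as GO:0052018, version numbers, and years are absent from the `text` column.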
Unnamed: 0: 97,786
id: 16,244,192,590
type: IssuesEvent
created_at: 2021-05-07 13:05:13
repo: hydrogen-dev/SDK
repo_url: https://api.github.com/repos/hydrogen-dev/SDK
action: closed
title: CVE-2018-14042 (Medium) detected in bootstrap-3.3.4.min.js - autoclosed
labels: security vulnerability
body:
## CVE-2018-14042 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.4.min.js</b></p></summary> <p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.4/js/bootstrap.min.js</a></p> <p>Path to vulnerable library: SDK/web-component/php/vendor/phpunit/php-code-coverage/src/CodeCoverage/Report/HTML/Renderer/Template/js/bootstrap.min.js</p> <p> Dependency Hierarchy: - :x: **bootstrap-3.3.4.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/hydrogen-dev/SDK/commit/f00fb70b3e31390c5d44dd81bc95979b46b2055e">f00fb70b3e31390c5d44dd81bc95979b46b2055e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip. <p>Publish Date: 2018-07-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042>CVE-2018-14042</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twbs/bootstrap/pull/26630">https://github.com/twbs/bootstrap/pull/26630</a></p> <p>Release Date: 2018-07-13</p> <p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"twitter-bootstrap","packageVersion":"3.3.4","packageFilePaths":[],"isTransitiveDependency":false,"dependencyTree":"twitter-bootstrap:3.3.4","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2018-14042","vulnerabilityDetails":"In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
index: True
label: non_process
text:
cve medium detected in bootstrap min js autoclosed cve medium severity vulnerability vulnerable library bootstrap min js the most popular front end framework for developing responsive mobile first projects on the web library home page a href path to vulnerable library sdk web component php vendor phpunit php code coverage src codecoverage report html renderer template js bootstrap min js dependency hierarchy x bootstrap min js vulnerable library found in head commit a href found in base branch master vulnerability details in bootstrap before xss is possible in the data container property of tooltip publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org webjars npm bootstrap org webjars bootstrap isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree twitter bootstrap isminimumfixversionavailable true minimumfixversion org webjars npm bootstrap org webjars bootstrap basebranches vulnerabilityidentifier cve vulnerabilitydetails in bootstrap before xss is possible in the data container property of tooltip vulnerabilityurl
binary_label: 0
Unnamed: 0: 7,664
id: 10,756,831,476
type: IssuesEvent
created_at: 2019-10-31 12:04:45
repo: kubeflow/kubeflow
repo_url: https://api.github.com/repos/kubeflow/kubeflow
action: opened
title: Qualify 0.7.0 with kfctl_existing_arrikto.yaml
labels: area/kfctl kind/process priority/p0
body:
We are now on 0.7.0RC7 for kfctl https://github.com/kubeflow/kubeflow/releases There are currently no known P0 issues. https://github.com/orgs/kubeflow/projects/22?card_filter_query=label%3Apriority%2Fp0 Opening this bug to track qualifying the `kfctl_existing_arrikto.yaml` config. Ideally we'd like to aim to finalize 0.7.0 today. So it would be great to do run through the deployment and identify and fix any issues that come up. Related to: #4249
index: 1.0
label: process
text:
qualify with kfctl existing arrikto yaml we are now on for kfctl there are currently no known issues opening this bug to track qualifying the kfctl existing arrikto yaml config ideally we d like to aim to finalize today so it would be great to do run through the deployment and identify and fix any issues that come up related to
binary_label: 1
Unnamed: 0: 4,339
id: 7,245,220,790
type: IssuesEvent
created_at: 2018-02-14 17:21:50
repo: w3c/w3process
repo_url: https://api.github.com/repos/w3c/w3process
action: closed
title: Process 2019: TAG Election / Appointment Process , Role of the Director
labels: ABProcess2019Candidate
body:
These need review for 2019.
index: 1.0
label: process
text:
process tag election appointment process role of the director these need review for
binary_label: 1
Unnamed: 0: 17,696
id: 23,546,822,837
type: IssuesEvent
created_at: 2022-08-21 08:32:06
repo: benthosdev/benthos
repo_url: https://api.github.com/repos/benthosdev/benthos
action: closed
title: Update Clickhouse driver, so it will support insert of experimental semi-structured JSON feature + insert DateTime from string
labels: enhancement processors inputs outputs effort: lower
body:
For now, to implement such pipeline, I had to add temporary table with raw JSON and insert from Benthos to it + add materialized view with transformation.
index: 1.0
label: process
text:
update clickhouse driver so it will support insert of experimental semi structured json feature insert datetime from string for now to implement such pipeline i had to add temporary table with raw json and insert from benthos to it add materialized view with transformation
binary_label: 1
Unnamed: 0: 5,112
id: 7,886,980,816
type: IssuesEvent
created_at: 2018-06-27 16:55:57
repo: metabase/metabase
repo_url: https://api.github.com/repos/metabase/metabase
action: closed
title: New Question on a BiqQuery standard SQL view issue
labels: Database/BigQuery Priority/P1 Query Processor
body:
I'm trying to create a New Question in Metabase on a BigQuery view, which was created on standard SQL, then I got this message: > ### There was a problem with your question > Most of the time this is caused by an invalid selection or bad input value. Double check your inputs and retry your query. > Show error details > ### Here's the full error message > ``` > Cannot reference a standard SQL view in a legacy SQL query. > ``` From this issue #3012, I understand that in SQL editor, Metabase already supports standard SQL. But in my case, how can it generate standard SQL?
index: 1.0
label: process
text:
new question on a biqquery standard sql view issue i m trying to create a new question in metabase on a bigquery view which was created on standard sql then i got this message there was a problem with your question most of the time this is caused by an invalid selection or bad input value double check your inputs and retry your query show error details here s the full error message cannot reference a standard sql view in a legacy sql query from this issue i understand that in sql editor metabase already supports standard sql but in my case how can it generate standard sql
binary_label: 1
Unnamed: 0: 14,690
id: 17,836,642,823
type: IssuesEvent
created_at: 2021-09-03 02:42:12
repo: monetr/rest-api
repo_url: https://api.github.com/repos/monetr/rest-api
action: opened
title: spending: Create a job that periodically checks for "behind" spending and updates them.
labels: bug enhancement good first issue Expenses Goals Funding Schedules Job Processing
body:
Steps to reproduce: - Create a funding schedule that does not occur next until the day after tomorrow. (_+2 days_) - Create an expense for that funding schedule that is due tomorrow. (_+1 days_) After midnight on the morning of the next day (+1) when the expense is due. It should no longer be marked as "behind" as it now has a chance to be funded before it is due again. Also if the spending object has not been "spent from" recently, the next occurance of the spending does not reset. ![image](https://user-images.githubusercontent.com/37967690/131942388-acec21b9-fad9-420d-9562-a8d1bfed6219.png) ----- Create a job that will periodically (maybe once a midnight in the user's timezone - This might conflict with the funding schedule job though so maybe do them both in one go?) check all of the spending objects for an account and update their next occurrence date, as well as their behind status.
index: 1.0
label: process
text:
spending create a job that periodically checks for behind spending and updates them steps to reproduce create a funding schedule that does not occur next until the day after tomorrow days create an expense for that funding schedule that is due tomorrow days after midnight on the morning of the next day when the expense is due it should no longer be marked as behind as it now has a chance to be funded before it is due again also if the spending object has not been spent from recently the next occurance of the spending does not reset create a job that will periodically maybe once a midnight in the user s timezone this might conflict with the funding schedule job though so maybe do them both in one go check all of the spending objects for an account and update their next occurrence date as well as their behind status
binary_label: 1
Unnamed: 0: 4,786
id: 7,661,380,771
type: IssuesEvent
created_at: 2018-05-11 14:04:07
repo: SharePoint/PnP-PowerShell
repo_url: https://api.github.com/repos/SharePoint/PnP-PowerShell
action: closed
title: get-pnpField does not retrieve fields defined at root web
labels: To be processed
body:
###Notice: many issues / bugs reported are actually related to the PnP Core Library which is used behind the scenes. Consider carefully where to report an issue:### 1. **Are you using ```Apply-SPOProvisioningTemplate``` or ```Get-SPOProvisioningTemplate```**? The issue is most likely related to the Provisioning Engine. The Provisioning engine is _not_ located in the PowerShell repo. Please report the issue here: https://github.com/officedev/PnP-Sites-Core/issues. 2. **Is the issue related to the cmdlet itself, its parameters, the syntax, or do you suspect it is the code of the cmdlet that is causing the issue?** Then please continue reporting the issue in this repo. 3. **If you think that the functionality might be related to the underlying libraries that the cmdlet is calling** (We realize that that might be difficult to determine), please first double check the code of the cmdlet, which can be found here: https://github.com/OfficeDev/PnP-PowerShell/tree/master/Commands. If related to the cmdlet, continue reporting the issue here, otherwise report the issue at https://github.com/officedev/PnP-Sites-Core/issues ### Reporting an Issue or Missing Feature Get-pnpField does not retrieve site columns from root web. Adding site columns defined at the root web to content types defined at the subweb is not possible. But at the SharePoint Frontend I can add site columns from the root web to a content type from the subweb. ### Expected behavior Site fields from root web can be accessed at subweb simular to contentTypes with the cmdlets "Get-PnPContentType -InSiteHierarchy". Furthermore I can add these site columns to content types from the subweb. ### Actual behavior Only the site columns defined at the subweb are shown. Try to add a site columns with another web parameter throws an error: The object is used in the context different from the one associated with the object. 
### Steps to reproduce behavior Get-PnPField //only shows site fields defined at simular web Add-PnPFieldToContentType -Field "FieldName" -ContentType "ContentTypeName" // error field not existing Connect-PnPOnline //connect to root web $field1 = Get-PnPField -Web $customerWeb -Identity "FieldName" Connect-PnPOnline //connect to subweb Add-PnPFieldToContentType -Field $Field1 -ContentType "ContentTypeName" // error wrong context ### Which version of the PnP-PowerShell Cmdlets are you using? - [ ] PnP PowerShell for SharePoint 2013 - [ ] PnP PowerShell for SharePoint 2016 - [ X] PnP PowerShell for SharePoint Online ### What is the version of the Cmdlet module you are running? (you can retrieve this by executing ```Get-Module -Name *pnppowershell* -ListAvailable```) ### How did you install the PnP-PowerShell Cmdlets? - [ ] MSI Installed downloaded from GitHub - [ ] Installed through the PowerShell Gallery with Install-Module - [ ] Other means
index: 1.0
label: process
text:
get pnpfield does not retrieve fields defined at root web notice many issues bugs reported are actually related to the pnp core library which is used behind the scenes consider carefully where to report an issue are you using apply spoprovisioningtemplate or get spoprovisioningtemplate the issue is most likely related to the provisioning engine the provisioning engine is not located in the powershell repo please report the issue here is the issue related to the cmdlet itself its parameters the syntax or do you suspect it is the code of the cmdlet that is causing the issue then please continue reporting the issue in this repo if you think that the functionality might be related to the underlying libraries that the cmdlet is calling we realize that that might be difficult to determine please first double check the code of the cmdlet which can be found here if related to the cmdlet continue reporting the issue here otherwise report the issue at reporting an issue or missing feature get pnpfield does not retrieve site columns from root web adding site columns defined at the root web to content types defined at the subweb is not possible but at the sharepoint frontend i can add site columns from the root web to a content type from the subweb expected behavior site fields from root web can be accessed at subweb simular to contenttypes with the cmdlets get pnpcontenttype insitehierarchy furthermore i can add these site columns to content types from the subweb actual behavior only the site columns defined at the subweb are shown try to add a site columns with another web parameter throws an error the object is used in the context different from the one associated with the object notice many issues bugs reported are actually related to the pnp core library which is used behind the scenes consider carefully where to report an issue are you using apply spoprovisioningtemplate or get spoprovisioningtemplate the issue is most likely related to the provisioning engine the 
provisioning engine is not located in the powershell repo please report the issue here is the issue related to the cmdlet itself its parameters the syntax or do you suspect it is the code of the cmdlet that is causing the issue then please continue reporting the issue in this repo if you think that the functionality might be related to the underlying libraries that the cmdlet is calling we realize that that might be difficult to determine please first double check the code of the cmdlet which can be found here if related to the cmdlet continue reporting the issue here otherwise report the issue at reporting an issue or missing feature get pnpfield does not retrieve site columns from root web adding site columns defined at the root web to content types defined at the subweb is not possible but at the sharepoint frontend i can add site columns from the root web to a content type from the subweb expected behavior site fields from root web can be accessed at subweb simular to contenttypes with the cmdlets get pnpcontenttype insitehierarchy furthermore i can add these site columns to content types from the subweb actual behavior only the site columns defined at the subweb are shown try to add a site columns with another web parameter throws an error the object is used in the context different from the one associated with the object steps to reproduce behavior get pnpfield only shows site fields defined at simular web add pnpfieldtocontenttype field fieldname contenttype contenttypename error field not existing connect pnponline connect to root web get pnpfield web customerweb identity fieldname connect pnponline connect to subweb add pnpfieldtocontenttype field contenttype contenttypename error wrong context which version of the pnp powershell cmdlets are you using pnp powershell for sharepoint pnp powershell for sharepoint pnp powershell for sharepoint online what is the version of the cmdlet module you are running you can retrieve this by executing get module name 
pnppowershell listavailable how did you install the pnp powershell cmdlets msi installed downloaded from github installed through the powershell gallery with install module other means
1
69,063
8,373,301,852
IssuesEvent
2018-10-05 09:55:01
okfn/ckanext-lacounts
https://api.github.com/repos/okfn/ckanext-lacounts
closed
Add config option for showcased picture in the frontpage
WP1 (Design)
Copy the logic for uploading the logo image in the admin section, and add a config option for uploading the featured image. @smth can you advise on what size should be recommend?
1.0
Add config option for showcased picture in the frontpage - Copy the logic for uploading the logo image in the admin section, and add a config option for uploading the featured image. @smth can you advise on what size should be recommend?
non_process
add config option for showcased picture in the frontpage copy the logic for uploading the logo image in the admin section and add a config option for uploading the featured image smth can you advise on what size should be recommend
0
15,049
18,762,866,276
IssuesEvent
2021-11-05 18:43:56
googleapis/python-bigquery-storage
https://api.github.com/repos/googleapis/python-bigquery-storage
closed
Samples that depend on BigQuery are not compatible with Python 3.10
type: process api: bigquerystorage
`google-cloud-bigquery` does not yet support Python 3.10. When it does, the 3.10 samples check should turn green. For now, it is OK to merge PRs with a failing 3.10 samples check (the status check is intentionally optional). See https://github.com/googleapis/python-bigquery/issues/1006 for the status of python 3.10 support. Additionally, 3.10 checks will fail with a kokoro config error until #331 is merged
1.0
Samples that depend on BigQuery are not compatible with Python 3.10 - `google-cloud-bigquery` does not yet support Python 3.10. When it does, the 3.10 samples check should turn green. For now, it is OK to merge PRs with a failing 3.10 samples check (the status check is intentionally optional). See https://github.com/googleapis/python-bigquery/issues/1006 for the status of python 3.10 support. Additionally, 3.10 checks will fail with a kokoro config error until #331 is merged
process
samples that depend on bigquery are not compatible with python google cloud bigquery does not yet support python when it does the samples check should turn green for now it is ok to merge prs with a failing samples check the status check is intentionally optional see for the status of python support additionally checks will fail with a kokoro config error until is merged
1
160,357
13,786,888,934
IssuesEvent
2020-10-09 03:12:54
dinhanhx/deep_fried_meme
https://api.github.com/repos/dinhanhx/deep_fried_meme
closed
Refactor this repos to become a package
documentation enhancement maintainer's work
- [x] A package - [x] A simple documentation - [x] A handler for url input - [x] A handler for file input and output
1.0
Refactor this repos to become a package - - [x] A package - [x] A simple documentation - [x] A handler for url input - [x] A handler for file input and output
non_process
refactor this repos to become a package a package a simple documentation a handler for url input a handler for file input and output
0
32,413
4,363,375,442
IssuesEvent
2016-08-03 00:07:11
MozillaFoundation/Mozfest2016_production
https://api.github.com/repos/MozillaFoundation/Mozfest2016_production
closed
Meet with "issues team" to discuss design
Design Leaders -> Leaders
Coordinate a meeting between the designers and Sam, Paul and Kevin to discuss how the 5 key issues can be represented in the festival's overall design. cc @xmatthewx
1.0
Meet with "issues team" to discuss design - Coordinate a meeting between the designers and Sam, Paul and Kevin to discuss how the 5 key issues can be represented in the festival's overall design. cc @xmatthewx
non_process
meet with issues team to discuss design coordinate a meeting between the designers and sam paul and kevin to discuss how the key issues can be represented in the festival s overall design cc xmatthewx
0
128,182
17,460,023,905
IssuesEvent
2021-08-06 09:07:45
Altinn/altinn-studio
https://api.github.com/repos/Altinn/altinn-studio
closed
Validation: connect validations to fields in GUI
solution/studio/designer area/logic kind/user-story org/ssb
Functional architect/designer: @-mention Technical architect: @-mention Description As a service developer I need to connect my validation to a field (as a GUI element) in the service, so that I can define where the validation is triggered. A service developer needs to connect their validation to a given field. Needs to specified: -Can a validation rule be connected to a field before the field is associated to the datamodel? -What types of data elements can be associated with a validation rule? -Can a service developer work in paralell, e.g. can they start with code and then connect to a field, or do they have to choose an element before they start to work on the validation? Sketch (if relevant) (Screenshot and link to Figma, make sure your sketch is public!) //Needs sketch **Navigation from/to (if relevant)** This functionality is reached from... **Technical considerations** Input (beyond tasks) on how the user story should be solved can be put here. **Acceptance criterea** - What is allowed/not allowed - Validations - Error messages and warnings - ... **Tasks** - [ ] Example task
1.0
Validation: connect validations to fields in GUI - Functional architect/designer: @-mention Technical architect: @-mention Description As a service developer I need to connect my validation to a field (as a GUI element) in the service, so that I can define where the validation is triggered. A service developer needs to connect their validation to a given field. Needs to specified: -Can a validation rule be connected to a field before the field is associated to the datamodel? -What types of data elements can be associated with a validation rule? -Can a service developer work in paralell, e.g. can they start with code and then connect to a field, or do they have to choose an element before they start to work on the validation? Sketch (if relevant) (Screenshot and link to Figma, make sure your sketch is public!) //Needs sketch **Navigation from/to (if relevant)** This functionality is reached from... **Technical considerations** Input (beyond tasks) on how the user story should be solved can be put here. **Acceptance criterea** - What is allowed/not allowed - Validations - Error messages and warnings - ... **Tasks** - [ ] Example task
non_process
validation connect validations to fields in gui functional architect designer mention technical architect mention description as a service developer i need to connect my validation to a field as a gui element in the service so that i can define where the validation is triggered a service developer needs to connect their validation to a given field needs to specified can a validation rule be connected to a field before the field is associated to the datamodel what types of data elements can be associated with a validation rule can a service developer work in paralell e g can they start with code and then connect to a field or do they have to choose an element before they start to work on the validation sketch if relevant screenshot and link to figma make sure your sketch is public needs sketch navigation from to if relevant this functionality is reached from technical considerations input beyond tasks on how the user story should be solved can be put here acceptance criterea what is allowed not allowed validations error messages and warnings tasks example task
0
266,767
8,374,891,935
IssuesEvent
2018-10-05 14:53:35
ropensci/rrricanes
https://api.github.com/repos/ropensci/rrricanes
closed
Add func `get_storm_list`
Features Medium Priority
Currently, `rrricanes` uses [get_storms](https://github.com/ropensci/rrricanes/blob/master/R/get_storms.R#L43) to retrieve a list of storms for one or more years and one or more basins. The return is a dataframe object that includes a link for each cyclone to the cyclone's respective archive page. This value is passed to product functions that return data for the specific storm. Per #113, in a move to implement the FTP server, this functionality would change quite a bit. Depending on the year(s) requested, the function would access different URLs. It appears at this moment the "default" URL would be for the current or most recent year of tropical cyclones: ftp://ftp.nhc.noaa.gov/atcf/index/. Several files depending on basin exist in this directory. For the time being, the concern would only be collecting AL and EP cyclones. Each basin has several text files depending on the weather office issuing the text products. For example, at this moment, there are three text files for AL: * AL_storms.txt.hpc - a list of storms (real cyclones and test cyclones) as issued by the Hydrometeorological Prediction Center. `rrricanes` does not access HPC products anyway so these can likely be ignored. * AL_storms.txt.nhc - a list of storms (real and test) as issued by the NHC. These, we would want. * AL_storms.txt.wpc - storms with advisories issued from the Weather Prediction Center. Additional stations for other basins are: * CPHC - Central Pacific Hurricane Center (may cover both EP and CP cyclones) A master list exists that appears to list each cyclone's `Key` and `Name`, but no additional details contained in the other text files. However, storm_list.txt exists which appears to be a consolidated text file of all cyclones going back to 1851, for all basins. The caveat of all of these text files is that TEST, INVEST, and GENESIS cyclones are listed (TEST are not true cyclones and only for testing purposes. 
INVEST and GENESIS would not have the text products currently covered in `rrricanes`). All text files are comma-delimited, so could easily be imported into rrricanes using `readr::read_csv()`. According to [NRL](https://www.nrlmry.navy.mil/atcf_web/docs/database/new/descriptive.html), the columns are: STORM NAME, RE, X, R2, R3,R4, R5, CY, YYYY, TY, I, YYY1MMDDHH, YYY2MMDDHH, SIZE, GENESIS_NUM, PAR1, PAR2, PRIORITY, STORM_STATE, WT_NUMBER, STORMID The definitions are: * STORM NAME = Literal storm name, "INVEST", or "GENESISxxx" where xxx is a number * RE = Region (basin) code: WP, IO, SH, CP, EP, AL, LS. (See [4. Data Format](https://www.nrlmry.navy.mil/atcf_web/docs/database/new/database.html#dataoverview)) * X = Subregion code: W, A, B, S, P, C, E, L, Q. In [Storm History Record Format](https://www.nrlmry.navy.mil/atcf_web/docs/database/new/abdeck.txt), these are listed as: + A - Arabian Sea + B - Bay of Bengal + C - Central Pacific + E - Eastern Pacific + L - Atlantic + P - South Pacific (135E - 120W) + Q - South Atlantic + S - South IO (20E - 135E) + W - Western Pacific * R2 = Region 2 code: WP, IO, SH, CP, or EP. This and R3-R5 are codes for basins entered subsequent to the original basin where the storm was generated. * R3 = Region 3 code: WP, IO, SH, CP, or EP. * R4 = Region 4 code: WP, IO, SH, CP, or EP. * R5 = Region 5 code: WP, IO, SH, CP, or EP. * CY = Annual cyclone number: 01 through 99. * YYYY = Cyclone Year: 0000 through 9999. * TY = Highest level of tc development: TD, TS, TY, ST, TC, HU, SH, XX (unknown). * I = S, R, O; straight mover, recurver, odd mover. * YYY1MMDDHH = Starting DTG: 0000010100 through 9999123123. * YYY2MMDDHH = Ending DTG: 0000010100 through 9999123123. * SIZE = Storm Size (MIDG (midget) , GIAN (giant), etc.). * GENESIS_NUM = Annual genesis number: 001 through 999. * PAR1 = UNUSED. * PAR2 = UNUSED. * PRIORITY = Priority for model runs (e.g., GFDN, GFDL, COAMPS-TC, H-WRF): 1-9. 
* STORM_STATE = Storm state: METWATCH,TCFA,WARNING or ARCHIVE * WT_NUMBER = Minute of warning or TCFA (00-59) * STORMID (or `Key`) = Storm ID composed of basin designator and annual cyclone number (e.g. wp081993) For example, the current AL_storms.txt.nhc contains the row ``` ARLENE, AL, L, , , , , 01, 2017, EX, S, 2017041606, 2017042112, , , , , 8, , 1, AL012017 ``` * STORM NAME: ARLENE * RE: AL Atlantic * X: L Atlantic * R2: NA * R3: NA * R4: NA * R5: NA * CY: 01 * YYYY: 2017 * TY: EX * I: S * YYY1MMDDHH: 2017041601 * YYY2MMDDHH: 2017042112 * SIZE: NA * GENESIS_NUM: NA * PAR1: NA * PAR2: NA * PRIORITY: 8 * STORM_STATE: NA * WT_NUMBER: 1 * STORMID: AL012017
1.0
Add func `get_storm_list` - Currently, `rrricanes` uses [get_storms](https://github.com/ropensci/rrricanes/blob/master/R/get_storms.R#L43) to retrieve a list of storms for one or more years and one or more basins. The return is a dataframe object that includes a link for each cyclone to the cyclone's respective archive page. This value is passed to product functions that return data for the specific storm. Per #113, in a move to implement the FTP server, this functionality would change quite a bit. Depending on the year(s) requested, the function would access different URLs. It appears at this moment the "default" URL would be for the current or most recent year of tropical cyclones: ftp://ftp.nhc.noaa.gov/atcf/index/. Several files depending on basin exist in this directory. For the time being, the concern would only be collecting AL and EP cyclones. Each basin has several text files depending on the weather office issuing the text products. For example, at this moment, there are three text files for AL: * AL_storms.txt.hpc - a list of storms (real cyclones and test cyclones) as issued by the Hydrometeorological Prediction Center. `rrricanes` does not access HPC products anyway so these can likely be ignored. * AL_storms.txt.nhc - a list of storms (real and test) as issued by the NHC. These, we would want. * AL_storms.txt.wpc - storms with advisories issued from the Weather Prediction Center. Additional stations for other basins are: * CPHC - Central Pacific Hurricane Center (may cover both EP and CP cyclones) A master list exists that appears to list each cyclone's `Key` and `Name`, but no additional details contained in the other text files. However, storm_list.txt exists which appears to be a consolidated text file of all cyclones going back to 1851, for all basins. The caveat of all of these text files is that TEST, INVEST, and GENESIS cyclones are listed (TEST are not true cyclones and only for testing purposes. 
INVEST and GENESIS would not have the text products currently covered in `rrricanes`). All text files are comma-delimited, so could easily be imported into rrricanes using `readr::read_csv()`. According to [NRL](https://www.nrlmry.navy.mil/atcf_web/docs/database/new/descriptive.html), the columns are: STORM NAME, RE, X, R2, R3,R4, R5, CY, YYYY, TY, I, YYY1MMDDHH, YYY2MMDDHH, SIZE, GENESIS_NUM, PAR1, PAR2, PRIORITY, STORM_STATE, WT_NUMBER, STORMID The definitions are: * STORM NAME = Literal storm name, "INVEST", or "GENESISxxx" where xxx is a number * RE = Region (basin) code: WP, IO, SH, CP, EP, AL, LS. (See [4. Data Format](https://www.nrlmry.navy.mil/atcf_web/docs/database/new/database.html#dataoverview)) * X = Subregion code: W, A, B, S, P, C, E, L, Q. In [Storm History Record Format](https://www.nrlmry.navy.mil/atcf_web/docs/database/new/abdeck.txt), these are listed as: + A - Arabian Sea + B - Bay of Bengal + C - Central Pacific + E - Eastern Pacific + L - Atlantic + P - South Pacific (135E - 120W) + Q - South Atlantic + S - South IO (20E - 135E) + W - Western Pacific * R2 = Region 2 code: WP, IO, SH, CP, or EP. This and R3-R5 are codes for basins entered subsequent to the original basin where the storm was generated. * R3 = Region 3 code: WP, IO, SH, CP, or EP. * R4 = Region 4 code: WP, IO, SH, CP, or EP. * R5 = Region 5 code: WP, IO, SH, CP, or EP. * CY = Annual cyclone number: 01 through 99. * YYYY = Cyclone Year: 0000 through 9999. * TY = Highest level of tc development: TD, TS, TY, ST, TC, HU, SH, XX (unknown). * I = S, R, O; straight mover, recurver, odd mover. * YYY1MMDDHH = Starting DTG: 0000010100 through 9999123123. * YYY2MMDDHH = Ending DTG: 0000010100 through 9999123123. * SIZE = Storm Size (MIDG (midget) , GIAN (giant), etc.). * GENESIS_NUM = Annual genesis number: 001 through 999. * PAR1 = UNUSED. * PAR2 = UNUSED. * PRIORITY = Priority for model runs (e.g., GFDN, GFDL, COAMPS-TC, H-WRF): 1-9. 
* STORM_STATE = Storm state: METWATCH,TCFA,WARNING or ARCHIVE * WT_NUMBER = Minute of warning or TCFA (00-59) * STORMID (or `Key`) = Storm ID composed of basin designator and annual cyclone number (e.g. wp081993) For example, the current AL_storms.txt.nhc contains the row ``` ARLENE, AL, L, , , , , 01, 2017, EX, S, 2017041606, 2017042112, , , , , 8, , 1, AL012017 ``` * STORM NAME: ARLENE * RE: AL Atlantic * X: L Atlantic * R2: NA * R3: NA * R4: NA * R5: NA * CY: 01 * YYYY: 2017 * TY: EX * I: S * YYY1MMDDHH: 2017041601 * YYY2MMDDHH: 2017042112 * SIZE: NA * GENESIS_NUM: NA * PAR1: NA * PAR2: NA * PRIORITY: 8 * STORM_STATE: NA * WT_NUMBER: 1 * STORMID: AL012017
non_process
add func get storm list currently rrricanes uses to retrieve a list of storms for one or more years and one or more basins the return is a dataframe object that includes a link for each cyclone to the cyclone s respective archive page this value is passed to product functions that return data for the specific storm per in a move to implement the ftp server this functionality would change quite a bit depending on the year s requested the function would access different urls it appears at this moment the default url would be for the current or most recent year of tropical cyclones ftp ftp nhc noaa gov atcf index several files depending on basin exist in this directory for the time being the concern would only be collecting al and ep cyclones each basin has several text files depending on the weather office issuing the text products for example at this moment there are three text files for al al storms txt hpc a list of storms real cyclones and test cyclones as issued by the hydrometeorological prediction center rrricanes does not access hpc products anyway so these can likely be ignored al storms txt nhc a list of storms real and test as issued by the nhc these we would want al storms txt wpc storms with advisories issued from the weather prediction center additional stations for other basins are cphc central pacific hurricane center may cover both ep and cp cyclones a master list exists that appears to list each cyclone s key and name but no additional details contained in the other text files however storm list txt exists which appears to be a consolidated text file of all cyclones going back to for all basins the caveat of all of these text files is that test invest and genesis cyclones are listed test are not true cyclones and only for testing purposes invest and genesis would not have the text products currently covered in rrricanes all text files are comma delimited so could easily be imported into rrricanes using readr read csv according to the columns are 
storm name re x cy yyyy ty i size genesis num priority storm state wt number stormid the definitions are storm name literal storm name invest or genesisxxx where xxx is a number re region basin code wp io sh cp ep al ls see x subregion code w a b s p c e l q in these are listed as a arabian sea b bay of bengal c central pacific e eastern pacific l atlantic p south pacific q south atlantic s south io w western pacific region code wp io sh cp or ep this and are codes for basins entered subsequent to the original basin where the storm was generated region code wp io sh cp or ep region code wp io sh cp or ep region code wp io sh cp or ep cy annual cyclone number through yyyy cyclone year through ty highest level of tc development td ts ty st tc hu sh xx unknown i s r o straight mover recurver odd mover starting dtg through ending dtg through size storm size midg midget gian giant etc genesis num annual genesis number through unused unused priority priority for model runs e g gfdn gfdl coamps tc h wrf storm state storm state metwatch tcfa warning or archive wt number minute of warning or tcfa stormid or key storm id composed of basin designator and annual cyclone number e g for example the current al storms txt nhc contains the row arlene al l ex s storm name arlene re al atlantic x l atlantic na na na na cy yyyy ty ex i s size na genesis num na na na priority storm state na wt number stormid
0
7,025
10,173,820,401
IssuesEvent
2019-08-08 13:53:52
googleapis/google-cloud-python
https://api.github.com/repos/googleapis/google-cloud-python
opened
PubSub: Release the first GA version
api: pubsub release blocking type: process
If no major issues are discovered during the ongoing baking period, the PubSub client should be released as GA. Releasing a new version is done with the [releasetool](https://github.com/googleapis/releasetool). Prior to the GA release, the following must be done: - Go through the checklist from the internal GA release doc, make sure everything is in order. - Changes in the client lib files: - Update PubSub Python lib README to indicate semver just like Java library has - Update beta tag in PubSub Python lib README - Update beta tag in Python libs root - Change beta label in code samples in clients page in Google Cloud website - Add to release notes (talk to Kir)
1.0
PubSub: Release the first GA version - If no major issues are discovered during the ongoing baking period, the PubSub client should be released as GA. Releasing a new version is done with the [releasetool](https://github.com/googleapis/releasetool). Prior to the GA release, the following must be done: - Go through the checklist from the internal GA release doc, make sure everything is in order. - Changes in the client lib files: - Update PubSub Python lib README to indicate semver just like Java library has - Update beta tag in PubSub Python lib README - Update beta tag in Python libs root - Change beta label in code samples in clients page in Google Cloud website - Add to release notes (talk to Kir)
process
pubsub release the first ga version if no major issues are discovered during the ongoing baking period the pubsub client should be released as ga releasing a new version is done with the prior to the ga release the following must be done go through the checklist from the internal ga release doc make sure everything is in order changes in the client lib files update pubsub python lib readme to indicate semver just like java library has update beta tag in pubsub python lib readme update beta tag in python libs root change beta label in code samples in clients page in google cloud website add to release notes talk to kir
1
324,509
27,809,099,262
IssuesEvent
2023-03-18 00:11:08
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix meta.test_maml_step_overlapping_vars
Sub Task Failing Test
| | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
1.0
Fix meta.test_maml_step_overlapping_vars - | | | |---|---| |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a> |jax|<a href="https://github.com/unifyai/ivy/actions/runs/4352192142/jobs/7604670253" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
non_process
fix meta test maml step overlapping vars tensorflow img src torch img src numpy img src jax img src
0
285,923
31,155,773,763
IssuesEvent
2023-08-16 13:02:54
Trinadh465/linux-4.1.15_CVE-2018-5873
https://api.github.com/repos/Trinadh465/linux-4.1.15_CVE-2018-5873
opened
CVE-2015-8104 (Medium) detected in linuxlinux-4.1.52
Mend: dependency security vulnerability
## CVE-2015-8104 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.1.52</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/Trinadh465/linux-4.1.15_CVE-2018-5873/commit/32145daf0c96b012284199f23418243e0168269f">32145daf0c96b012284199f23418243e0168269f</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/arch/x86/kvm/svm.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> The KVM subsystem in the Linux kernel through 4.2.6, and Xen 4.3.x through 4.6.x, allows guest OS users to cause a denial of service (host OS panic or hang) by triggering many #DB (aka Debug) exceptions, related to svm.c. 
<p>Publish Date: 2015-11-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-8104>CVE-2015-8104</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2015-8104">https://www.linuxkernelcves.com/cves/CVE-2015-8104</a></p> <p>Release Date: 2015-11-16</p> <p>Fix Resolution: v4.4-rc1,v3.12.51,v3.16.35,v3.2.74,v4.1.17,v4.3.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2015-8104 (Medium) detected in linuxlinux-4.1.52 - ## CVE-2015-8104 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.1.52</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/Trinadh465/linux-4.1.15_CVE-2018-5873/commit/32145daf0c96b012284199f23418243e0168269f">32145daf0c96b012284199f23418243e0168269f</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/arch/x86/kvm/svm.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> The KVM subsystem in the Linux kernel through 4.2.6, and Xen 4.3.x through 4.6.x, allows guest OS users to cause a denial of service (host OS panic or hang) by triggering many #DB (aka Debug) exceptions, related to svm.c. 
<p>Publish Date: 2015-11-16 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2015-8104>CVE-2015-8104</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2015-8104">https://www.linuxkernelcves.com/cves/CVE-2015-8104</a></p> <p>Release Date: 2015-11-16</p> <p>Fix Resolution: v4.4-rc1,v3.12.51,v3.16.35,v3.2.74,v4.1.17,v4.3.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in linuxlinux cve medium severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files arch kvm svm c vulnerability details the kvm subsystem in the linux kernel through and xen x through x allows guest os users to cause a denial of service host os panic or hang by triggering many db aka debug exceptions related to svm c publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
0
20,063
26,553,030,419
IssuesEvent
2023-01-20 09:36:13
prisma/prisma
https://api.github.com/repos/prisma/prisma
opened
Unreachable code in migration engine. [sql-migration-connector/src/sql_destructive_change_checker.rs:199:84]
bug/1-unconfirmed kind/bug process/candidate topic: error reporting team/schema
<!-- If required, please update the title to be clear and descriptive --> Command: `prisma db push` Version: `4.8.1` Binary Version: `d6e67a83f971b175a593ccc12e15c4a757f93ffe` Report: https://prisma-errors.netlify.app/report/14506 OS: `arm64 darwin 21.3.0` JS Stacktrace: ``` Error: Error in migration engine. Reason: [migration-engine/connectors/sql-migration-connector/src/sql_destructive_change_checker.rs:199:84] internal error: entered unreachable code ``` Rust Stacktrace: ``` Starting migration engine RPC server [migration-engine/connectors/sql-migration-connector/src/sql_destructive_change_checker.rs:199:84] internal error: entered unreachable code ```
1.0
Unreachable code in migration engine. [sql-migration-connector/src/sql_destructive_change_checker.rs:199:84] - <!-- If required, please update the title to be clear and descriptive --> Command: `prisma db push` Version: `4.8.1` Binary Version: `d6e67a83f971b175a593ccc12e15c4a757f93ffe` Report: https://prisma-errors.netlify.app/report/14506 OS: `arm64 darwin 21.3.0` JS Stacktrace: ``` Error: Error in migration engine. Reason: [migration-engine/connectors/sql-migration-connector/src/sql_destructive_change_checker.rs:199:84] internal error: entered unreachable code ``` Rust Stacktrace: ``` Starting migration engine RPC server [migration-engine/connectors/sql-migration-connector/src/sql_destructive_change_checker.rs:199:84] internal error: entered unreachable code ```
process
unreachable code in migration engine command prisma db push version binary version report os darwin js stacktrace error error in migration engine reason internal error entered unreachable code rust stacktrace starting migration engine rpc server internal error entered unreachable code
1
245,558
7,887,727,569
IssuesEvent
2018-06-27 19:29:37
GoogleCloudPlatform/google-cloud-python
https://api.github.com/repos/GoogleCloudPlatform/google-cloud-python
closed
ValueError thrown in django view
api: vision priority: p2 status: awaiting information type: bug
This happens on 0.29.0, and 0.30.1 Steps to reproduce: the following code works within the django shell environment, and in a python prompt and from a command-line script, but gives the traceback below when encapsulated in a view and run through the dev server (python manage.py runserver), or through uwsgi <pre> image_file = open(tmpfile, 'rb') img = types.Image(content=bytes(image_file.read())) features = [types.Feature(type=types.Feature.SAFE_SEARCH_DETECTION, max_results=8), types.Feature(type=types.Feature.LABEL_DETECTION, max_results=8)] gclient = vision.ImageAnnotatorClient() resp = gclient.annotate_image({'image' : img, 'features' : features}) </pre> and the traceback from running inside a view <pre> Traceback (most recent call last): File "/usr/local/lib64/python2.7/site-packages/django/core/handlers/exception.py", line 41, in inner response = get_response(request) File "/usr/local/lib64/python2.7/site-packages/django/core/handlers/base.py", line 187, in _get_response response = self.process_exception_by_middleware(e, request) File "/usr/local/lib64/python2.7/site-packages/django/core/handlers/base.py", line 185, in _get_response response = wrapped_callback(request, *callback_args, **callback_kwargs) File "/usr/local/lib64/python2.7/site-packages/django/contrib/auth/decorators.py", line 23, in _wrapped_view return view_func(request, *args, **kwargs) File "/home/website/beta_uwsgi/git/swym_django/app/views.py", line 276, in vision_process resp = gclient.annotate_image({'image' : img, 'features' : features}) File "/usr/local/lib/python2.7/site-packages/google/cloud/vision_helpers/__init__.py", line 67, in annotate_image r = self.batch_annotate_images([request], retry=retry, timeout=timeout) File "/usr/local/lib/python2.7/site-packages/google/cloud/vision_v1/gapic/image_annotator_client.py", line 156, in batch_annotate_images requests=requests) ValueError: Field name must be a string </pre> a more full code example inside a functional view: <pre> if 
request.method == 'POST': f = request.FILES['image_data'] tmpfile = "/tmp/upload-" + str(uuid.uuid4()) + ".jpg" opf = open(tmpfile, 'wb+') for chunk in f.chunks(): opf.write(chunk) opf.close() image_file = open(tmpfile, 'rb') image = bytes(image_file.read()) img = types.Image(content=image) features = [types.Feature(type=types.Feature.SAFE_SEARCH_DETECTION, max_results=8), types.Feature(type=types.Feature.LABEL_DETECTION, max_results=8)] gclient = vision.ImageAnnotatorClient() resp = gclient.annotate_image({'image' : img, 'features' : features}) </pre> What makes this even stranger is that <pre> resp = gclient.label_detection(image=img) </pre> will work just fine in views
1.0
ValueError thrown in django view - This happens on 0.29.0, and 0.30.1 Steps to reproduce: the following code works within the django shell environment, and in a python prompt and from a command-line script, but gives the traceback below when encapsulated in a view and run through the dev server (python manage.py runserver), or through uwsgi <pre> image_file = open(tmpfile, 'rb') img = types.Image(content=bytes(image_file.read())) features = [types.Feature(type=types.Feature.SAFE_SEARCH_DETECTION, max_results=8), types.Feature(type=types.Feature.LABEL_DETECTION, max_results=8)] gclient = vision.ImageAnnotatorClient() resp = gclient.annotate_image({'image' : img, 'features' : features}) </pre> and the traceback from running inside a view <pre> Traceback (most recent call last): File "/usr/local/lib64/python2.7/site-packages/django/core/handlers/exception.py", line 41, in inner response = get_response(request) File "/usr/local/lib64/python2.7/site-packages/django/core/handlers/base.py", line 187, in _get_response response = self.process_exception_by_middleware(e, request) File "/usr/local/lib64/python2.7/site-packages/django/core/handlers/base.py", line 185, in _get_response response = wrapped_callback(request, *callback_args, **callback_kwargs) File "/usr/local/lib64/python2.7/site-packages/django/contrib/auth/decorators.py", line 23, in _wrapped_view return view_func(request, *args, **kwargs) File "/home/website/beta_uwsgi/git/swym_django/app/views.py", line 276, in vision_process resp = gclient.annotate_image({'image' : img, 'features' : features}) File "/usr/local/lib/python2.7/site-packages/google/cloud/vision_helpers/__init__.py", line 67, in annotate_image r = self.batch_annotate_images([request], retry=retry, timeout=timeout) File "/usr/local/lib/python2.7/site-packages/google/cloud/vision_v1/gapic/image_annotator_client.py", line 156, in batch_annotate_images requests=requests) ValueError: Field name must be a string </pre> a more full code example inside a 
functional view: <pre> if request.method == 'POST': f = request.FILES['image_data'] tmpfile = "/tmp/upload-" + str(uuid.uuid4()) + ".jpg" opf = open(tmpfile, 'wb+') for chunk in f.chunks(): opf.write(chunk) opf.close() image_file = open(tmpfile, 'rb') image = bytes(image_file.read()) img = types.Image(content=image) features = [types.Feature(type=types.Feature.SAFE_SEARCH_DETECTION, max_results=8), types.Feature(type=types.Feature.LABEL_DETECTION, max_results=8)] gclient = vision.ImageAnnotatorClient() resp = gclient.annotate_image({'image' : img, 'features' : features}) </pre> What makes this even stranger is that <pre> resp = gclient.label_detection(image=img) </pre> will work just fine in views
non_process
valueerror thrown in django view this happens on and steps to reproduce the following code works within the django shell environment and in a python prompt and from a command line script but gives the traceback below when encapsulated in a view and run through the dev server python manage py runserver or through uwsgi image file open tmpfile rb img types image content bytes image file read features types feature type types feature safe search detection max results types feature type types feature label detection max results gclient vision imageannotatorclient resp gclient annotate image image img features features and the traceback from running inside a view traceback most recent call last file usr local site packages django core handlers exception py line in inner response get response request file usr local site packages django core handlers base py line in get response response self process exception by middleware e request file usr local site packages django core handlers base py line in get response response wrapped callback request callback args callback kwargs file usr local site packages django contrib auth decorators py line in wrapped view return view func request args kwargs file home website beta uwsgi git swym django app views py line in vision process resp gclient annotate image image img features features file usr local lib site packages google cloud vision helpers init py line in annotate image r self batch annotate images retry retry timeout timeout file usr local lib site packages google cloud vision gapic image annotator client py line in batch annotate images requests requests valueerror field name must be a string a more full code example inside a functional view if request method post f request files tmpfile tmp upload str uuid jpg opf open tmpfile wb for chunk in f chunks opf write chunk opf close image file open tmpfile rb image bytes image file read img types image content image features types feature type types feature safe search 
detection max results types feature type types feature label detection max results gclient vision imageannotatorclient resp gclient annotate image image img features features what makes this even stranger is that resp gclient label detection image img will work just fine in views
0
232,677
7,673,924,569
IssuesEvent
2018-05-15 00:46:50
StrangeLoopGames/EcoIssues
https://api.github.com/repos/StrangeLoopGames/EcoIssues
closed
FPS drop (to 10-12fps) after open a chest/stockpile (with many items) and it does not recover after closing till restarting the game.
High Priority Optimization
The problem occurs only when opening the boxes / stockpiles and it does not recover after closing till restarting the game. Before that I have a fps of 80-90. But this makes the endgame unplayeable for me. Even if I have all settings on very low.
1.0
FPS drop (to 10-12fps) after open a chest/stockpile (with many items) and it does not recover after closing till restarting the game. - The problem occurs only when opening the boxes / stockpiles and it does not recover after closing till restarting the game. Before that I have a fps of 80-90. But this makes the endgame unplayeable for me. Even if I have all settings on very low.
non_process
fps drop to after open a chest stockpile with many items and it does not recover after closing till restarting the game the problem occurs only when opening the boxes stockpiles and it does not recover after closing till restarting the game before that i have a fps of but this makes the endgame unplayeable for me even if i have all settings on very low
0
487,514
14,047,734,268
IssuesEvent
2020-11-02 07:42:29
wso2/product-apim
https://api.github.com/repos/wso2/product-apim
opened
Authentication failure when 'Allow selected' is selected in Key Manager Configuration
Priority/Highest Type/Bug
Setup more than 2 external keymanagers and enable couple of them for the API. When the API is invoked, you get following error `TID: [-1234] [] [2020-11-02 12:01:36,739] ERROR {org.wso2.carbon.apimgt.gateway.handlers.security.APIAuthenticationHandler} - API authentication failure due to Unclassified Authentication Failure org.wso2.carbon.apimgt.gateway.handlers.security.APISecurityException: Error while accessing backend services for API key validation at org.wso2.carbon.apimgt.gateway.handlers.security.APIAuthenticationHandler.isAuthenticate_aroundBody42(APIAuthenticationHandler.java:438)`
1.0
Authentication failure when 'Allow selected' is selected in Key Manager Configuration - Setup more than 2 external keymanagers and enable couple of them for the API. When the API is invoked, you get following error `TID: [-1234] [] [2020-11-02 12:01:36,739] ERROR {org.wso2.carbon.apimgt.gateway.handlers.security.APIAuthenticationHandler} - API authentication failure due to Unclassified Authentication Failure org.wso2.carbon.apimgt.gateway.handlers.security.APISecurityException: Error while accessing backend services for API key validation at org.wso2.carbon.apimgt.gateway.handlers.security.APIAuthenticationHandler.isAuthenticate_aroundBody42(APIAuthenticationHandler.java:438)`
non_process
authentication failure when allow selected is selected in key manager configuration setup more than external keymanagers and enable couple of them for the api when the api is invoked you get following error tid error org carbon apimgt gateway handlers security apiauthenticationhandler api authentication failure due to unclassified authentication failure org carbon apimgt gateway handlers security apisecurityexception error while accessing backend services for api key validation at org carbon apimgt gateway handlers security apiauthenticationhandler isauthenticate apiauthenticationhandler java
0
20,097
26,629,998,380
IssuesEvent
2023-01-24 17:06:25
NCAR/GECKO-A-2022
https://api.github.com/repos/NCAR/GECKO-A-2022
closed
unlabel DO loops in parse_chem_module
Postprocessor For later
../LIB/parse_chem_module.f90:565:23: 565 | DO 55 j=i+1,ik | 1 Warning: Fortran 2018 obsolescent feature: Labeled DO statement at (1) ... and several other instances
1.0
unlabel DO loops in parse_chem_module - ../LIB/parse_chem_module.f90:565:23: 565 | DO 55 j=i+1,ik | 1 Warning: Fortran 2018 obsolescent feature: Labeled DO statement at (1) ... and several other instances
process
unlabel do loops in parse chem module lib parse chem module do j i ik warning fortran obsolescent feature labeled do statement at and several other instances
1
3,937
6,876,682,544
IssuesEvent
2017-11-20 02:41:41
PaddlePaddle/models
https://api.github.com/repos/PaddlePaddle/models
closed
add data preprocess and trained model scripts for generating Chinese poetry.
model in process
- Please use data from this project https://github.com/lcy-seso/chinese-poetry and add a reference to this projection. - Add the trained model.
1.0
add data preprocess and trained model scripts for generating Chinese poetry. - - Please use data from this project https://github.com/lcy-seso/chinese-poetry and add a reference to this projection. - Add the trained model.
process
add data preprocess and trained model scripts for generating chinese poetry please use data from this project and add a reference to this projection add the trained model
1
96,938
10,967,413,910
IssuesEvent
2019-11-28 09:30:08
fnielsen/scholia
https://api.github.com/repos/fnielsen/scholia
closed
Write a summary of progress towards Goal 8: Scholia for Wikipedia and beyond
documentation
Integrate access to Scholia into Wikipedia articles and elsewhere https://github.com/fnielsen/scholia/projects/32
1.0
Write a summary of progress towards Goal 8: Scholia for Wikipedia and beyond - Integrate access to Scholia into Wikipedia articles and elsewhere https://github.com/fnielsen/scholia/projects/32
non_process
write a summary of progress towards goal scholia for wikipedia and beyond integrate access to scholia into wikipedia articles and elsewhere
0
83,046
10,319,426,159
IssuesEvent
2019-08-30 17:29:07
reactioncommerce/reaction
https://api.github.com/repos/reactioncommerce/reaction
closed
Bulk Tag/Product Feature: Show isVisible/Hidden circle in Product Table
design-complete reaction-admin
<img width="149" alt="Screen Shot 2019-07-30 at 7 06 13 PM" src="https://user-images.githubusercontent.com/3673236/62178229-217fbb80-b2fd-11e9-9ec7-e1f2699fce15.png">
1.0
Bulk Tag/Product Feature: Show isVisible/Hidden circle in Product Table - <img width="149" alt="Screen Shot 2019-07-30 at 7 06 13 PM" src="https://user-images.githubusercontent.com/3673236/62178229-217fbb80-b2fd-11e9-9ec7-e1f2699fce15.png">
non_process
bulk tag product feature show isvisible hidden circle in product table img width alt screen shot at pm src
0
21,583
29,953,800,118
IssuesEvent
2023-06-23 05:20:10
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
closed
Inconsistency behaviour on filter processor
bug Stale priority:p2 processor/filter closed as inactive
### Component(s) processor/filter ### What happened? ## Description I have been trying to setup a filter to drop a few spans from my traces, matching the following conditions (`AND`) - with attribute http.host should match regexp `.*\.blob\.core\.windows\.net` - with `service.name` == `cardiolib` - with `otel.library.name` == `opentelemetry.instrumentation.requests` I have been following the documentation and tried to ways to achieve this ```yaml processors: filter/cardiolib-skip-blob-requests: traces: span: - 'IsMatch(attributes["http.host"], ".*\\.blob\\.core\\.windows\\.net") == true and attributes["otel.library.name"] == "opentelemetry.instrumentation.requests" and resource.attributes["service.name"] == "cardiolib"' ``` With this filter, somehow, **everything** is dropped (from all services), I can't explain why. If I change the span filter to this one instead ```yaml 'attributes["otel.library.name"] == "opentelemetry.instrumentation.requests" and resource.attributes["service.name"] == "cardiolib"' ``` Then nothing gets dropped, I don't get how this IsMatch can affect the whole pipeline as it looks like it's not even looking forn the rest of the conditions ? The documentations says: > If any condition is met, the telemetry is dropped (each condition is ORed together) But from what I understand, this concerns the list of conditions under the `span` attribute and not the conditions in the list element itself ? (If so, the `and` statement is useless?) 
I also tried to use the attribute filters this way but it doesn't have any effect either (with full fqdn instead of regexp to test if that was not a regexp issue) ```yaml config: processors: filter: spans: exclude: match_type: regexp services: - cardiolib attributes: - Key: http.host Value: "xxx.blob.core.windows.net" libraries: - Name: opentelemetry.instrumentation.requests ``` ## Steps to Reproduce ## Expected Result ## Actual Result ### Collector version otel/opentelemetry-collector-contrib:0.67.0 ### Environment information ## Environment Running on Kubernetes, datadog upstream ### OpenTelemetry Collector configuration ```yaml config: receivers: prometheus: config: scrape_configs: - job_name: 'otelcol' scrape_interval: 10s static_configs: - targets: ['0.0.0.0:8888'] processors: filter/test1: spans: exclude: match_type: regexp services: - cardiolib attributes: - key: http.host value: xxx.blob.core.windows.net - key: http.method value: PUT filter/test2: traces: span: - 'attributes["otel.library.name"] == "opentelemetry.instrumentation.requests" and resource.attributes["service.name"] == "cardiolib"' exporters: otlp: endpoint: "${K8S_HOST_IP}:14317" tls: insecure: true service: pipelines: traces: receivers: [otlp] processors: - filter/test2 - batch exporters: [otlp] ``` ### Log output _No response_ ### Additional context _No response_
1.0
Inconsistency behaviour on filter processor - ### Component(s) processor/filter ### What happened? ## Description I have been trying to setup a filter to drop a few spans from my traces, matching the following conditions (`AND`) - with attribute http.host should match regexp `.*\.blob\.core\.windows\.net` - with `service.name` == `cardiolib` - with `otel.library.name` == `opentelemetry.instrumentation.requests` I have been following the documentation and tried to ways to achieve this ```yaml processors: filter/cardiolib-skip-blob-requests: traces: span: - 'IsMatch(attributes["http.host"], ".*\\.blob\\.core\\.windows\\.net") == true and attributes["otel.library.name"] == "opentelemetry.instrumentation.requests" and resource.attributes["service.name"] == "cardiolib"' ``` With this filter, somehow, **everything** is dropped (from all services), I can't explain why. If I change the span filter to this one instead ```yaml 'attributes["otel.library.name"] == "opentelemetry.instrumentation.requests" and resource.attributes["service.name"] == "cardiolib"' ``` Then nothing gets dropped, I don't get how this IsMatch can affect the whole pipeline as it looks like it's not even looking forn the rest of the conditions ? The documentations says: > If any condition is met, the telemetry is dropped (each condition is ORed together) But from what I understand, this concerns the list of conditions under the `span` attribute and not the conditions in the list element itself ? (If so, the `and` statement is useless?) 
I also tried to use the attribute filters this way but it doesn't have any effect either (with full fqdn instead of regexp to test if that was not a regexp issue) ```yaml config: processors: filter: spans: exclude: match_type: regexp services: - cardiolib attributes: - Key: http.host Value: "xxx.blob.core.windows.net" libraries: - Name: opentelemetry.instrumentation.requests ``` ## Steps to Reproduce ## Expected Result ## Actual Result ### Collector version otel/opentelemetry-collector-contrib:0.67.0 ### Environment information ## Environment Running on Kubernetes, datadog upstream ### OpenTelemetry Collector configuration ```yaml config: receivers: prometheus: config: scrape_configs: - job_name: 'otelcol' scrape_interval: 10s static_configs: - targets: ['0.0.0.0:8888'] processors: filter/test1: spans: exclude: match_type: regexp services: - cardiolib attributes: - key: http.host value: xxx.blob.core.windows.net - key: http.method value: PUT filter/test2: traces: span: - 'attributes["otel.library.name"] == "opentelemetry.instrumentation.requests" and resource.attributes["service.name"] == "cardiolib"' exporters: otlp: endpoint: "${K8S_HOST_IP}:14317" tls: insecure: true service: pipelines: traces: receivers: [otlp] processors: - filter/test2 - batch exporters: [otlp] ``` ### Log output _No response_ ### Additional context _No response_
process
inconsistency behaviour on filter processor component s processor filter what happened description i have been trying to setup a filter to drop a few spans from my traces matching the following conditions and with attribute http host should match regexp blob core windows net with service name cardiolib with otel library name opentelemetry instrumentation requests i have been following the documentation and tried to ways to achieve this yaml processors filter cardiolib skip blob requests traces span ismatch attributes blob core windows net true and attributes opentelemetry instrumentation requests and resource attributes cardiolib with this filter somehow everything is dropped from all services i can t explain why if i change the span filter to this one instead yaml attributes opentelemetry instrumentation requests and resource attributes cardiolib then nothing gets dropped i don t get how this ismatch can affect the whole pipeline as it looks like it s not even looking forn the rest of the conditions the documentations says if any condition is met the telemetry is dropped each condition is ored together but from what i understand this concerns the list of conditions under the span attribute and not the conditions in the list element itself if so the and statement is useless i also tried to use the attribute filters this way but it doesn t have any effect either with full fqdn instead of regexp to test if that was not a regexp issue yaml config processors filter spans exclude match type regexp services cardiolib attributes key http host value xxx blob core windows net libraries name opentelemetry instrumentation requests steps to reproduce expected result actual result collector version otel opentelemetry collector contrib environment information environment running on kubernetes datadog upstream opentelemetry collector configuration yaml config receivers prometheus config scrape configs job name otelcol scrape interval static configs targets processors filter spans 
exclude match type regexp services cardiolib attributes key http host value xxx blob core windows net key http method value put filter traces span attributes opentelemetry instrumentation requests and resource attributes cardiolib exporters otlp endpoint host ip tls insecure true service pipelines traces receivers processors filter batch exporters log output no response additional context no response
1
17,782
23,711,215,222
IssuesEvent
2022-08-30 08:01:48
GoogleCloudPlatform/dotnet-docs-samples
https://api.github.com/repos/GoogleCloudPlatform/dotnet-docs-samples
closed
chore: snippet-bot full scan
type: process samples priority: p3
null<!-- probot comment [9602930]--> ## snippet-bot scan result Life is too short to manually check unmatched region tags. Here is the result: Great job! No unmatching region tags found! --- Report generated by [snippet-bot](https://github.com/apps/snippet-bot). If you find problems with this result, please file an issue at: https://github.com/googleapis/repo-automation-bots/issues.
1.0
chore: snippet-bot full scan - null<!-- probot comment [9602930]--> ## snippet-bot scan result Life is too short to manually check unmatched region tags. Here is the result: Great job! No unmatching region tags found! --- Report generated by [snippet-bot](https://github.com/apps/snippet-bot). If you find problems with this result, please file an issue at: https://github.com/googleapis/repo-automation-bots/issues.
process
chore snippet bot full scan null snippet bot scan result life is too short to manually check unmatched region tags here is the result great job no unmatching region tags found report generated by if you find problems with this result please file an issue at
1
695,605
23,865,729,856
IssuesEvent
2022-09-07 10:51:09
magento/magento2
https://api.github.com/repos/magento/magento2
closed
Grouped product with tax throws exception with PHP 8.1
Issue: Confirmed Progress: PR Created Reproduced on 2.4.x Progress: PR in progress Priority: P2 Area: Tax Reported on 2.4.4 Adobe Commerce
### Preconditions and environment - Magento 2.4.4 - PHP 8.1 (though 8.0 is probably the same) - Tax applied (in my case a blanket 20% tax) ### Steps to reproduce 1. Create a simple product 2. Create a grouped product and add the simple product to it, add the grouped product to a category 3. Ensure tax will be applied (in my case, I added a new tax zone for the UK, all regions and zip codes, at a rate of 20%. I applied this to the Retail Customer class, and set all customer groups to use it) 4. Set both options to "Including and Excluding Tax" under Store > Configuration > Sales > Tax > Price Display Settings 5. Run bin/magento setup:upgrade (not entirely sure this is needed), bin/magento c:f and bin/magento index:reindex 6. Navigate to the category the grouped product belongs to on frontend ### Expected result A list of products belonging to the category should be displayed. ### Actual result An exception is thrown: `main.CRITICAL: Exception: Deprecated Functionality: ucfirst(): Passing null to parameter #1 ($string) of type string is deprecated in /var/www/m24/vendor/magento/module-tax/Pricing/Render/Adjustment.php on line 188 in /var/www/m24/vendor/magento/framework/App/ErrorHandler.php:61 Stack trace: #0 [internal function]: Magento\Framework\App\ErrorHandler->handler() #1 /var/www/m24/vendor/magento/module-tax/Pricing/Render/Adjustment.php(188): ucfirst() #2 /var/www/m24/vendor/magento/module-tax/view/base/templates/pricing/adjustment.phtml(15): Magento\Tax\Pricing\Render\Adjustment->getDataPriceType() #3 /var/www/m24/vendor/magento/framework/View/TemplateEngine/Php.php(71): include('...') #4 /var/www/m24/vendor/magento/framework/View/Element/Template.php(263): Magento\Framework\View\TemplateEngine\Php->render() #5 /var/www/m24/vendor/magento/framework/View/Element/Template.php(293): Magento\Framework\View\Element\Template->fetchView() #6 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1095): 
# Grouped product with tax throws exception with PHP 8.1

### Preconditions and environment
- Magento 2.4.4
- PHP 8.1 (though 8.0 is probably the same)
- Tax applied (in my case a blanket 20% tax)

### Steps to reproduce
1. Create a simple product
2. Create a grouped product and add the simple product to it, add the grouped product to a category
3. Ensure tax will be applied (in my case, I added a new tax zone for the UK, all regions and zip codes, at a rate of 20%. I applied this to the Retail Customer class, and set all customer groups to use it)
4. Set both options to "Including and Excluding Tax" under Store > Configuration > Sales > Tax > Price Display Settings
5. Run `bin/magento setup:upgrade` (not entirely sure this is needed), `bin/magento c:f` and `bin/magento index:reindex`
6. Navigate to the category the grouped product belongs to on frontend

### Expected result
A list of products belonging to the category should be displayed.

### Actual result
An exception is thrown:

`main.CRITICAL: Exception: Deprecated Functionality: ucfirst(): Passing null to parameter #1 ($string) of type string is deprecated in /var/www/m24/vendor/magento/module-tax/Pricing/Render/Adjustment.php on line 188 in /var/www/m24/vendor/magento/framework/App/ErrorHandler.php:61 Stack trace: #0 [internal function]: Magento\Framework\App\ErrorHandler->handler() #1 /var/www/m24/vendor/magento/module-tax/Pricing/Render/Adjustment.php(188): ucfirst() #2 /var/www/m24/vendor/magento/module-tax/view/base/templates/pricing/adjustment.phtml(15): Magento\Tax\Pricing\Render\Adjustment->getDataPriceType() #3 /var/www/m24/vendor/magento/framework/View/TemplateEngine/Php.php(71): include('...') #4 /var/www/m24/vendor/magento/framework/View/Element/Template.php(263): Magento\Framework\View\TemplateEngine\Php->render() #5 /var/www/m24/vendor/magento/framework/View/Element/Template.php(293): Magento\Framework\View\Element\Template->fetchView() #6 
/var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1095): Magento\Framework\View\Element\Template->_toHtml() #7 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1099): Magento\Framework\View\Element\AbstractBlock->Magento\Framework\View\Element\{closure}() #8 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(660): Magento\Framework\View\Element\AbstractBlock->_loadCache() #9 /var/www/m24/vendor/magento/module-tax/Pricing/Render/Adjustment.php(64): Magento\Framework\View\Element\AbstractBlock->toHtml() #10 /var/www/m24/vendor/magento/framework/Pricing/Render/AbstractAdjustment.php(57): Magento\Tax\Pricing\Render\Adjustment->apply() #11 /var/www/m24/vendor/magento/framework/Pricing/Render/Amount.php(205): Magento\Framework\Pricing\Render\AbstractAdjustment->render() #12 /var/www/m24/vendor/magento/framework/Pricing/Render/Amount.php(176): Magento\Framework\Pricing\Render\Amount->getAdjustments() #13 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1095): Magento\Framework\Pricing\Render\Amount->_toHtml() #14 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1099): Magento\Framework\View\Element\AbstractBlock->Magento\Framework\View\Element\{closure}() #15 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(660): Magento\Framework\View\Element\AbstractBlock->_loadCache() #16 /var/www/m24/vendor/magento/module-grouped-product/view/base/templates/product/price/final_price.phtml(30): Magento\Framework\View\Element\AbstractBlock->toHtml() #17 /var/www/m24/vendor/magento/framework/View/TemplateEngine/Php.php(71): include('...') #18 /var/www/m24/vendor/magento/framework/View/Element/Template.php(263): Magento\Framework\View\TemplateEngine\Php->render() #19 /var/www/m24/generated/code/Magento/Catalog/Pricing/Render/FinalPriceBox/Interceptor.php(194): Magento\Framework\View\Element\Template->fetchView() #20 
/var/www/m24/vendor/magento/framework/View/Element/Template.php(293): Magento\Catalog\Pricing\Render\FinalPriceBox\Interceptor->fetchView() #21 /var/www/m24/vendor/magento/framework/Pricing/Render/PriceBox.php(69): Magento\Framework\View\Element\Template->_toHtml() #22 /var/www/m24/vendor/magento/module-catalog/Pricing/Render/FinalPriceBox.php(71): Magento\Framework\Pricing\Render\PriceBox->_toHtml() #23 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1095): Magento\Catalog\Pricing\Render\FinalPriceBox->_toHtml() #24 /var/www/m24/vendor/magento/framework/Cache/LockGuardedCacheLoader.php(136): Magento\Framework\View\Element\AbstractBlock->Magento\Framework\View\Element\{closure}() #25 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1117): Magento\Framework\Cache\LockGuardedCacheLoader->lockedLoadData() #26 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(660): Magento\Framework\View\Element\AbstractBlock->_loadCache() #27 /var/www/m24/generated/code/Magento/Catalog/Pricing/Render/FinalPriceBox/Interceptor.php(410): Magento\Framework\View\Element\AbstractBlock->toHtml() #28 /var/www/m24/vendor/magento/framework/Pricing/Render.php(97): Magento\Catalog\Pricing\Render\FinalPriceBox\Interceptor->toHtml() #29 /var/www/m24/vendor/magento/module-catalog/Block/Product/ListProduct.php(411): Magento\Framework\Pricing\Render->render() #30 /var/www/m24/generated/code/Magento/Catalog/Block/Product/ListProduct/Interceptor.php(131): Magento\Catalog\Block\Product\ListProduct->getProductPrice() #31 /var/www/m24/vendor/magento/module-catalog/view/frontend/templates/product/list.phtml(77): Magento\Catalog\Block\Product\ListProduct\Interceptor->getProductPrice() #32 /var/www/m24/vendor/magento/framework/View/TemplateEngine/Php.php(71): include('...') #33 /var/www/m24/vendor/magento/framework/View/Element/Template.php(263): Magento\Framework\View\TemplateEngine\Php->render() #34 
/var/www/m24/generated/code/Magento/Catalog/Block/Product/ListProduct/Interceptor.php(392): Magento\Framework\View\Element\Template->fetchView() #35 /var/www/m24/vendor/magento/framework/View/Element/Template.php(293): Magento\Catalog\Block\Product\ListProduct\Interceptor->fetchView() #36 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1095): Magento\Framework\View\Element\Template->_toHtml() #37 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1099): Magento\Framework\View\Element\AbstractBlock->Magento\Framework\View\Element\{closure}() #38 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(660): Magento\Framework\View\Element\AbstractBlock->_loadCache() #39 /var/www/m24/generated/code/Magento/Catalog/Block/Product/ListProduct/Interceptor.php(617): Magento\Framework\View\Element\AbstractBlock->toHtml() #40 /var/www/m24/vendor/magento/framework/View/Layout.php(578): Magento\Catalog\Block\Product\ListProduct\Interceptor->toHtml() #41 /var/www/m24/vendor/magento/framework/View/Layout.php(555): Magento\Framework\View\Layout->_renderBlock() #42 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #43 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #44 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #45 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(507): Magento\Framework\View\Layout\Interceptor->renderElement() #46 /var/www/m24/vendor/magento/module-catalog/Block/Category/View.php(100): Magento\Framework\View\Element\AbstractBlock->getChildHtml() #47 /var/www/m24/vendor/magento/module-catalog/view/frontend/templates/category/products.phtml(15): Magento\Catalog\Block\Category\View->getProductListHtml() #48 
/var/www/m24/vendor/magento/framework/View/TemplateEngine/Php.php(71): include('...') #49 /var/www/m24/vendor/magento/framework/View/Element/Template.php(263): Magento\Framework\View\TemplateEngine\Php->render() #50 /var/www/m24/vendor/magento/framework/View/Element/Template.php(293): Magento\Framework\View\Element\Template->fetchView() #51 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1095): Magento\Framework\View\Element\Template->_toHtml() #52 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(1099): Magento\Framework\View\Element\AbstractBlock->Magento\Framework\View\Element\{closure}() #53 /var/www/m24/vendor/magento/framework/View/Element/AbstractBlock.php(660): Magento\Framework\View\Element\AbstractBlock->_loadCache() #54 /var/www/m24/vendor/magento/framework/View/Layout.php(578): Magento\Framework\View\Element\AbstractBlock->toHtml() #55 /var/www/m24/vendor/magento/framework/View/Layout.php(555): Magento\Framework\View\Layout->_renderBlock() #56 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #57 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #58 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #59 /var/www/m24/vendor/magento/framework/View/Layout.php(606): Magento\Framework\View\Layout\Interceptor->renderElement() #60 /var/www/m24/vendor/magento/framework/View/Layout.php(557): Magento\Framework\View\Layout->_renderContainer() #61 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #62 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #63 
/var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #64 /var/www/m24/vendor/magento/framework/View/Layout.php(606): Magento\Framework\View\Layout\Interceptor->renderElement() #65 /var/www/m24/vendor/magento/framework/View/Layout.php(557): Magento\Framework\View\Layout->_renderContainer() #66 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #67 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #68 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #69 /var/www/m24/vendor/magento/framework/View/Layout.php(606): Magento\Framework\View\Layout\Interceptor->renderElement() #70 /var/www/m24/vendor/magento/framework/View/Layout.php(557): Magento\Framework\View\Layout->_renderContainer() #71 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #72 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #73 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #74 /var/www/m24/vendor/magento/framework/View/Layout.php(606): Magento\Framework\View\Layout\Interceptor->renderElement() #75 /var/www/m24/vendor/magento/framework/View/Layout.php(557): Magento\Framework\View\Layout->_renderContainer() #76 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #77 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #78 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): 
Magento\Framework\View\Layout->renderElement() #79 /var/www/m24/vendor/magento/framework/View/Layout.php(606): Magento\Framework\View\Layout\Interceptor->renderElement() #80 /var/www/m24/vendor/magento/framework/View/Layout.php(557): Magento\Framework\View\Layout->_renderContainer() #81 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #82 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #83 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #84 /var/www/m24/vendor/magento/framework/View/Layout.php(606): Magento\Framework\View\Layout\Interceptor->renderElement() #85 /var/www/m24/vendor/magento/framework/View/Layout.php(557): Magento\Framework\View\Layout->_renderContainer() #86 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(149): Magento\Framework\View\Layout->renderNonCachedElement() #87 /var/www/m24/vendor/magento/framework/View/Layout.php(510): Magento\Framework\View\Layout\Interceptor->renderNonCachedElement() #88 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(140): Magento\Framework\View\Layout->renderElement() #89 /var/www/m24/vendor/magento/framework/View/Layout.php(975): Magento\Framework\View\Layout\Interceptor->renderElement() #90 /var/www/m24/vendor/magento/framework/Interception/Interceptor.php(58): Magento\Framework\View\Layout->getOutput() #91 /var/www/m24/vendor/magento/framework/Interception/Interceptor.php(138): Magento\Framework\View\Layout\Interceptor->___callParent() #92 /var/www/m24/vendor/magento/framework/Interception/Interceptor.php(153): Magento\Framework\View\Layout\Interceptor->Magento\Framework\Interception\{closure}() #93 /var/www/m24/generated/code/Magento/Framework/View/Layout/Interceptor.php(347): 
Magento\Framework\View\Layout\Interceptor->___callPlugins() #94 /var/www/m24/vendor/magento/framework/View/Result/Page.php(260): Magento\Framework\View\Layout\Interceptor->getOutput() #95 /var/www/m24/vendor/magento/framework/View/Result/Layout.php(171): Magento\Framework\View\Result\Page->render() #96 /var/www/m24/vendor/magento/framework/Interception/Interceptor.php(58): Magento\Framework\View\Result\Layout->renderResult() #97 /var/www/m24/vendor/magento/framework/Interception/Interceptor.php(138): Magento\Framework\View\Result\Page\Interceptor->___callParent() #98 /var/www/m24/vendor/magento/framework/Interception/Interceptor.php(153): Magento\Framework\View\Result\Page\Interceptor->Magento\Framework\Interception\{closure}() #99 /var/www/m24/generated/code/Magento/Framework/View/Result/Page/Interceptor.php(95): Magento\Framework\View\Result\Page\Interceptor->___callPlugins() #100 /var/www/m24/vendor/magento/framework/App/Http.php(120): Magento\Framework\View\Result\Page\Interceptor->renderResult() #101 /var/www/m24/generated/code/Magento/Framework/App/Http/Interceptor.php(23): Magento\Framework\App\Http->launch() #102 /var/www/m24/vendor/magento/framework/App/Bootstrap.php(264): Magento\Framework\App\Http\Interceptor->launch() #103 /var/www/m24/pub/index.php(30): Magento\Framework\App\Bootstrap->run() #104 {main} [] [] `

### Additional information
For whatever reason the priceType returned by amountRender is null, which is not a valid parameter for ucfirst in PHP 8. I suspect this has something to do with how grouped products calculate their display price but haven't tracked it down yet.

`vendor/magento/module-tax/Pricing/Render/Adjustment.php` line 184:

```php
public function getDataPriceType(): string
{
    return $this->amountRender->getPriceType() === 'finalPrice'
        ? 'basePrice'
        : 'base' . ucfirst($this->amountRender->getPriceType());
}
```

### Release note
_No response_

### Triage and priority
- [X] Severity: **S0** _- Affects critical data or functionality and leaves users without workaround._
- [ ] Severity: **S1** _- Affects critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S2** _- Affects non-critical data or functionality and forces users to employ a workaround._
- [ ] Severity: **S3** _- Affects non-critical data or functionality and does not force users to employ a workaround._
- [ ] Severity: **S4** _- Affects aesthetics, professional look and feel, “quality” or “usability”._
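To illustrate the failure mode outside of Magento: the sketch below is a standalone re-creation of the expression in `getDataPriceType()`, not the real class method (the real code reads `$this->amountRender->getPriceType()`; here the price type is passed in as a hypothetical parameter). It shows that casting the possibly-null price type to string sidesteps the PHP 8.1 deprecation while leaving all non-null results unchanged.

```php
<?php
declare(strict_types=1);

// Standalone sketch, NOT the actual Magento\Tax\Pricing\Render\Adjustment
// method. The nullable $priceType parameter stands in for the value
// returned by $this->amountRender->getPriceType(), which can apparently
// be null for grouped products.
function getDataPriceType(?string $priceType): string
{
    // ucfirst(null) triggers "Passing null to parameter #1 ($string)"
    // on PHP 8.1; casting to string (equivalently, $priceType ?? '')
    // avoids the deprecation notice.
    return $priceType === 'finalPrice'
        ? 'basePrice'
        : 'base' . ucfirst((string) $priceType);
}
```

With the cast in place, a null price type yields `'base'` instead of raising the deprecation; whether `'base'` is a sensible data attribute for that case, or whether the null `getPriceType()` should be fixed upstream in the grouped-product price rendering, is a question for the maintainers.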
magento framework view layout interceptor rendernoncachedelement var www generated code magento framework view layout interceptor php magento framework view layout renderelement var www vendor magento framework view layout php magento framework view layout interceptor renderelement var www vendor magento framework view layout php magento framework view layout rendercontainer var www generated code magento framework view layout interceptor php magento framework view layout rendernoncachedelement var www vendor magento framework view layout php magento framework view layout interceptor rendernoncachedelement var www generated code magento framework view layout interceptor php magento framework view layout renderelement var www vendor magento framework view layout php magento framework view layout interceptor renderelement var www vendor magento framework view layout php magento framework view layout rendercontainer var www generated code magento framework view layout interceptor php magento framework view layout rendernoncachedelement var www vendor magento framework view layout php magento framework view layout interceptor rendernoncachedelement var www generated code magento framework view layout interceptor php magento framework view layout renderelement var www vendor magento framework view layout php magento framework view layout interceptor renderelement var www vendor magento framework interception interceptor php magento framework view layout getoutput var www vendor magento framework interception interceptor php magento framework view layout interceptor callparent var www vendor magento framework interception interceptor php magento framework view layout interceptor magento framework interception closure var www generated code magento framework view layout interceptor php magento framework view layout interceptor callplugins var www vendor magento framework view result page php magento framework view layout interceptor getoutput var www vendor magento 
framework view result layout php magento framework view result page render var www vendor magento framework interception interceptor php magento framework view result layout renderresult var www vendor magento framework interception interceptor php magento framework view result page interceptor callparent var www vendor magento framework interception interceptor php magento framework view result page interceptor magento framework interception closure var www generated code magento framework view result page interceptor php magento framework view result page interceptor callplugins var www vendor magento framework app http php magento framework view result page interceptor renderresult var www generated code magento framework app http interceptor php magento framework app http launch var www vendor magento framework app bootstrap php magento framework app http interceptor launch var www pub index php magento framework app bootstrap run main additional information for whatever reason the pricetype returned by amountrender is null which is not a valid parameter for ucfirst in php i suspect this has something to do with how grouped products calculate their display price but haven t tracked it down yet vendor magento module tax pricing render adjustment php line public function getdatapricetype string return this amountrender getpricetype finalprice baseprice base ucfirst this amountrender getpricetype release note no response triage and priority severity affects critical data or functionality and leaves users without workaround severity affects critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and forces users to employ a workaround severity affects non critical data or functionality and does not force users to employ a workaround severity affects aesthetics professional look and feel “quality” or “usability”
0
15,749
19,911,622,140
IssuesEvent
2022-01-25 17:44:00
input-output-hk/high-assurance-legacy
https://api.github.com/repos/input-output-hk/high-assurance-legacy
closed
Make the Isabelle code work with Isabelle2019
language: isabelle topic: process calculus topic: ouroboros topic: examples type: improvement topic: utilities
Currently our Isabelle code is tested with Isabelle2018. However, Isabelle2019 was released in June 2019. Our goal is to make our code work with Isabelle2019.
1.0
Make the Isabelle code work with Isabelle2019 - Currently our Isabelle code is tested with Isabelle2018. However, Isabelle2019 was released in June 2019. Our goal is to make our code work with Isabelle2019.
process
make the isabelle code work with currently our isabelle code is tested with however was released in june  our goal is to make our code work with
1
30,532
8,558,403,905
IssuesEvent
2018-11-08 18:09:28
tensorflow/tensorflow
https://api.github.com/repos/tensorflow/tensorflow
closed
TensorFlow 1.11 CUDA 9.0 with CUDNN7.3 CUDA_ERROR_LAUNCH_TIMEOUT
stat:awaiting response type:build/install
Here is my information first: - cuda: CUDA 9.0 - cudnn: cudnn7.3 for CUDA9.0 - Nvidia GTX 1080Ti - TensorFlow 1.11 from pip3 with GPU support The library can work and tensorflow successfully import from python. **but everytime inference or train a dataset only a while it will got this error**: ``` 2018-10-20 16:47:07.055721: E tensorflow/stream_executor/cuda/cuda_driver.cc:981] failed to synchronize the stop event: CUDA_ERROR_LAUNCH_TIMEOUT: the launch timed out and was terminated 2018-10-20 16:47:07.055749: E tensorflow/stream_executor/cuda/cuda_timer.cc:55] Internal: error destroying CUDA event in context 0xff40eb0: CUDA_ERROR_LAUNCH_TIMEOUT: the launch timed out and was terminated 2018-10-20 16:47:07.055755: E tensorflow/stream_executor/cuda/cuda_timer.cc:60] Internal: error destroying CUDA event in context 0xff40eb0: CUDA_ERROR_LAUNCH_TIMEOUT: the launch timed out and was terminated 2018-10-20 16:47:07.055789: F tensorflow/stream_executor/cuda/cuda_dnn.cc:211] Check failed: status == CUDNN_STATUS_SUCCESS (7 vs. 0)Failed to set cuDNN stream. [1] 3276 abort (core dumped) python3 demo.py ``` Is this a bug or the driver issue? I have ever got this error before, if anyone got this error leave a comment below to let me know!! I am stuck here right now
1.0
TensorFlow 1.11 CUDA 9.0 with CUDNN7.3 CUDA_ERROR_LAUNCH_TIMEOUT - Here is my information first: - cuda: CUDA 9.0 - cudnn: cudnn7.3 for CUDA9.0 - Nvidia GTX 1080Ti - TensorFlow 1.11 from pip3 with GPU support The library can work and tensorflow successfully import from python. **but everytime inference or train a dataset only a while it will got this error**: ``` 2018-10-20 16:47:07.055721: E tensorflow/stream_executor/cuda/cuda_driver.cc:981] failed to synchronize the stop event: CUDA_ERROR_LAUNCH_TIMEOUT: the launch timed out and was terminated 2018-10-20 16:47:07.055749: E tensorflow/stream_executor/cuda/cuda_timer.cc:55] Internal: error destroying CUDA event in context 0xff40eb0: CUDA_ERROR_LAUNCH_TIMEOUT: the launch timed out and was terminated 2018-10-20 16:47:07.055755: E tensorflow/stream_executor/cuda/cuda_timer.cc:60] Internal: error destroying CUDA event in context 0xff40eb0: CUDA_ERROR_LAUNCH_TIMEOUT: the launch timed out and was terminated 2018-10-20 16:47:07.055789: F tensorflow/stream_executor/cuda/cuda_dnn.cc:211] Check failed: status == CUDNN_STATUS_SUCCESS (7 vs. 0)Failed to set cuDNN stream. [1] 3276 abort (core dumped) python3 demo.py ``` Is this a bug or the driver issue? I have ever got this error before, if anyone got this error leave a comment below to let me know!! I am stuck here right now
non_process
tensorflow cuda with cuda error launch timeout here is my information first cuda cuda cudnn for nvidia gtx tensorflow from with gpu support the library can work and tensorflow successfully import from python but everytime inference or train a dataset only a while it will got this error e tensorflow stream executor cuda cuda driver cc failed to synchronize the stop event cuda error launch timeout the launch timed out and was terminated e tensorflow stream executor cuda cuda timer cc internal error destroying cuda event in context cuda error launch timeout the launch timed out and was terminated e tensorflow stream executor cuda cuda timer cc internal error destroying cuda event in context cuda error launch timeout the launch timed out and was terminated f tensorflow stream executor cuda cuda dnn cc check failed status cudnn status success vs failed to set cudnn stream abort core dumped demo py is this a bug or the driver issue i have ever got this error before if anyone got this error leave a comment below to let me know i am stuck here right now
0
140,557
18,903,132,273
IssuesEvent
2021-11-16 05:03:59
scriptex/react-accordion-ts
https://api.github.com/repos/scriptex/react-accordion-ts
closed
CVE-2021-3918 (High) detected in json-schema-0.2.3.tgz
security vulnerability
## CVE-2021-3918 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-schema-0.2.3.tgz</b></p></summary> <p>JSON Schema validation and specifications</p> <p>Library home page: <a href="https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz">https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz</a></p> <p>Path to dependency file: react-accordion-ts/demo/package.json</p> <p>Path to vulnerable library: react-accordion-ts/demo/node_modules/json-schema/package.json</p> <p> Dependency Hierarchy: - parcel-2.0.1.tgz (Root Library) - config-default-2.0.1.tgz - optimizer-htmlnano-2.0.1.tgz - htmlnano-1.1.1.tgz - uncss-0.17.3.tgz - request-2.88.2.tgz - http-signature-1.2.0.tgz - jsprim-1.4.1.tgz - :x: **json-schema-0.2.3.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/scriptex/react-accordion-ts/commit/ef6be9d7d4efcf1a498d1ac69b10d27eebd4564d">ef6be9d7d4efcf1a498d1ac69b10d27eebd4564d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> json-schema is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') <p>Publish Date: 2021-11-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3918>CVE-2021-3918</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: 
High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-3918 (High) detected in json-schema-0.2.3.tgz - ## CVE-2021-3918 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>json-schema-0.2.3.tgz</b></p></summary> <p>JSON Schema validation and specifications</p> <p>Library home page: <a href="https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz">https://registry.npmjs.org/json-schema/-/json-schema-0.2.3.tgz</a></p> <p>Path to dependency file: react-accordion-ts/demo/package.json</p> <p>Path to vulnerable library: react-accordion-ts/demo/node_modules/json-schema/package.json</p> <p> Dependency Hierarchy: - parcel-2.0.1.tgz (Root Library) - config-default-2.0.1.tgz - optimizer-htmlnano-2.0.1.tgz - htmlnano-1.1.1.tgz - uncss-0.17.3.tgz - request-2.88.2.tgz - http-signature-1.2.0.tgz - jsprim-1.4.1.tgz - :x: **json-schema-0.2.3.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/scriptex/react-accordion-ts/commit/ef6be9d7d4efcf1a498d1ac69b10d27eebd4564d">ef6be9d7d4efcf1a498d1ac69b10d27eebd4564d</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> json-schema is vulnerable to Improperly Controlled Modification of Object Prototype Attributes ('Prototype Pollution') <p>Publish Date: 2021-11-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3918>CVE-2021-3918</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality 
Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in json schema tgz cve high severity vulnerability vulnerable library json schema tgz json schema validation and specifications library home page a href path to dependency file react accordion ts demo package json path to vulnerable library react accordion ts demo node modules json schema package json dependency hierarchy parcel tgz root library config default tgz optimizer htmlnano tgz htmlnano tgz uncss tgz request tgz http signature tgz jsprim tgz x json schema tgz vulnerable library found in head commit a href found in base branch master vulnerability details json schema is vulnerable to improperly controlled modification of object prototype attributes prototype pollution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with whitesource
0
9,644
12,604,833,774
IssuesEvent
2020-06-11 15:32:59
prisma/prisma
https://api.github.com/repos/prisma/prisma
closed
Tighten Version Check for P1 and P1.1 further
kind/improvement process/candidate team/engines topic: introspection
Use the fact that P1 and P1.1 only used very specific length limitations on char and varchar columns. Use the fact that P1 and P1.1 never used database level default values.
1.0
Tighten Version Check for P1 and P1.1 further - Use the fact that P1 and P1.1 only used very specific length limitations on char and varchar columns. Use the fact that P1 and P1.1 never used database level default values.
process
tighten version check for and further use the fact that and only used very specific length limitations on char and varchar columns use the fact that and never used database level default values
1
198,014
6,967,866,362
IssuesEvent
2017-12-10 14:29:47
krishraghuram/mess
https://api.github.com/repos/krishraghuram/mess
closed
Need a simple and scalable way to export data into Excel sheets from django admin
backport needed for 1.x low-priority
At the end of every month, HAB makes several reports on the mess systems. Thus, the Caretakers, Gensec HAB and other HAB office people need a way to export data from the django admin. The method for exporting must be scalable with the software, loosely coupled to other parts of the project, and must be user friendly(should not require knowledge of command line, python or django). Proposed solution : https://github.com/django-import-export/django-import-export https://simpleisbetterthancomplex.com/packages/2016/08/11/django-import-export.html
1.0
Need a simple and scalable way to export data into Excel sheets from django admin - At the end of every month, HAB makes several reports on the mess systems. Thus, the Caretakers, Gensec HAB and other HAB office people need a way to export data from the django admin. The method for exporting must be scalable with the software, loosely coupled to other parts of the project, and must be user friendly(should not require knowledge of command line, python or django). Proposed solution : https://github.com/django-import-export/django-import-export https://simpleisbetterthancomplex.com/packages/2016/08/11/django-import-export.html
non_process
need a simple and scalable way to export data into excel sheets from django admin at the end of every month hab makes several reports on the mess systems thus the caretakers gensec hab and other hab office people need a way to export data from the django admin the method for exporting must be scalable with the software loosely coupled to other parts of the project and must be user friendly should not require knowledge of command line python or django proposed solution
0
6,851
9,992,091,893
IssuesEvent
2019-07-11 12:44:07
googleapis/google-cloud-cpp
https://api.github.com/repos/googleapis/google-cloud-cpp
closed
Make internal Jenkins builds hermetic.
status: will not fix type: process
The builds for Travis and AppVeyor install all the build dependencies (compilers, cmake, make, etc) as part of their configuration steps. In Jenkins we assume that those dependencies are installed, which requires manual administration of the Jenkins servers when they change. We should change the Jenkins builds to also install all their dependencies, for example, using one of the Docker plugins to create a Docker image with all the necessary tools. We already do this for Travis and works (mostly) well. This would increase the build times a bit. @sduskis added as FYI in case you are wondering why the builds broke: we added a dependency for the storage/ directory.
1.0
Make internal Jenkins builds hermetic. - The builds for Travis and AppVeyor install all the build dependencies (compilers, cmake, make, etc) as part of their configuration steps. In Jenkins we assume that those dependencies are installed, which requires manual administration of the Jenkins servers when they change. We should change the Jenkins builds to also install all their dependencies, for example, using one of the Docker plugins to create a Docker image with all the necessary tools. We already do this for Travis and works (mostly) well. This would increase the build times a bit. @sduskis added as FYI in case you are wondering why the builds broke: we added a dependency for the storage/ directory.
process
make internal jenkins builds hermetic the builds for travis and appveyor install all the build dependencies compilers cmake make etc as part of their configuration steps in jenkins we assume that those dependencies are installed which requires manual administration of the jenkins servers when they change we should change the jenkins builds to also install all their dependencies for example using one of the docker plugins to create a docker image with all the necessary tools we already do this for travis and works mostly well this would increase the build times a bit sduskis added as fyi in case you are wondering why the builds broke we added a dependency for the storage directory
1
4,098
2,971,888,222
IssuesEvent
2015-07-14 10:08:36
Yakindu/statecharts
https://api.github.com/repos/Yakindu/statecharts
closed
Xpand Generator SGen doesn't generate code.
All CodeGenerators bug Priority-Medium
This test is defined in the manual click test T8.4 and T8.4.1: "Select TestProject in Package Explorer Select File->New->Other (CTRL+N) from the menu Select Yakindu->Code Generator Model Click on Next button Set the name to StaircaseXpand.sgen Click on Next button Select Custom Xpand-based Generator and check Staircase.sct Click on Finish button" "Open file StaircaseXpand.sgen and set following features: - targetFolder = "src-gen-xpand" - templateProject = "TestProject.generator.xpand" - templatePath = "testproject::generator::xpand::TextGenerator::main"" Generating Code Artifacts on the SGen file does not have any effect. ![issue_sgen_xpand_01](https://cloud.githubusercontent.com/assets/12081541/8645606/27968d10-2949-11e5-8b1a-b99bdbfa1e52.PNG)
1.0
Xpand Generator SGen doesn't generate code. - This test is defined in the manual click test T8.4 and T8.4.1: "Select TestProject in Package Explorer Select File->New->Other (CTRL+N) from the menu Select Yakindu->Code Generator Model Click on Next button Set the name to StaircaseXpand.sgen Click on Next button Select Custom Xpand-based Generator and check Staircase.sct Click on Finish button" "Open file StaircaseXpand.sgen and set following features: - targetFolder = "src-gen-xpand" - templateProject = "TestProject.generator.xpand" - templatePath = "testproject::generator::xpand::TextGenerator::main"" Generating Code Artifacts on the SGen file does not have any effect. ![issue_sgen_xpand_01](https://cloud.githubusercontent.com/assets/12081541/8645606/27968d10-2949-11e5-8b1a-b99bdbfa1e52.PNG)
non_process
xpand generator sgen doesn t generate code this test is defined in the manual click test and select testproject in package explorer select file new other ctrl n from the menu select yakindu code generator model click on next button set the name to staircasexpand sgen click on next button select custom xpand based generator and check staircase sct click on finish button open file staircasexpand sgen and set following features targetfolder src gen xpand templateproject testproject generator xpand templatepath testproject generator xpand textgenerator main generating code artifacts on the sgen file does not have any effect
0
2,534
5,290,808,701
IssuesEvent
2017-02-08 20:51:35
MikePopoloski/slang
https://api.github.com/repos/MikePopoloski/slang
closed
Start of line check for directives
area-preprocessor bug easy
Most preprocessor directives require that they start on their own line. Currently this isn't checked or enforced.
1.0
Start of line check for directives - Most preprocessor directives require that they start on their own line. Currently this isn't checked or enforced.
process
start of line check for directives most preprocessor directives require that they start on their own line currently this isn t checked or enforced
1
14,533
17,630,841,641
IssuesEvent
2021-08-19 07:46:06
tokio-rs/tokio
https://api.github.com/repos/tokio-rs/tokio
closed
implement TryFrom so that std::process:ChildStdxx can be converted to tokio::process:ChildStdxx
A-tokio M-process C-feature-request
**Is your feature request related to a problem? Please describe.** Both `std` and `tokio` have `Stdin`/`Stdout`/`Stderr` type, IMO `tokio` ones are more powerful because they implements `AsyncRead`. In *tokio/src/process/unix/mod.rs*, tokio's `Stdxx` are wrappers around `std::process::Stdxx`: ``` pub(crate) type ChildStdin = PollEvented<Fd<std::process::ChildStdin>>; pub(crate) type ChildStdout = PollEvented<Fd<std::process::ChildStdout>>; pub(crate) type ChildStderr = PollEvented<Fd<std::process::ChildStderr>>; ``` So it should be really easy to add `impl TryFrom<std::process::ChildStdin> for ChildStdin` for example, not to mention tokio has function like `fn stdio<T>(option: Option<T>) -> io::Result<Option<PollEvented<Fd<T>>>>`. **Describe the solution you'd like** Add `impl` `From` or `TryFrom` so that it is possible to convert `std::process::Stdxx` into `tokio::process::Stdxx`. **Additional context** Right now it is impossible (from external API) to construct `tokio::process::Stdxx` without `execve` (via `Command`). it would be valuable to construct them with `nix::unistd::fork`. Since `std::process::Stdxx` is just a thin-wrapper around `FileDesc`, thanks to rust's zero cost abstraction, they're identical to `c_int` (32-bit, or *HANDLE*/64-bit on Windows). But this is not true for `tokio::process::Stdxx`, if we can convert `std::process::Stdxx` into `tokio::process::Stdxx`, then we can do IO redirections with `unistd::fork`.
1.0
implement TryFrom so that std::process:ChildStdxx can be converted to tokio::process:ChildStdxx - **Is your feature request related to a problem? Please describe.** Both `std` and `tokio` have `Stdin`/`Stdout`/`Stderr` type, IMO `tokio` ones are more powerful because they implements `AsyncRead`. In *tokio/src/process/unix/mod.rs*, tokio's `Stdxx` are wrappers around `std::process::Stdxx`: ``` pub(crate) type ChildStdin = PollEvented<Fd<std::process::ChildStdin>>; pub(crate) type ChildStdout = PollEvented<Fd<std::process::ChildStdout>>; pub(crate) type ChildStderr = PollEvented<Fd<std::process::ChildStderr>>; ``` So it should be really easy to add `impl TryFrom<std::process::ChildStdin> for ChildStdin` for example, not to mention tokio has function like `fn stdio<T>(option: Option<T>) -> io::Result<Option<PollEvented<Fd<T>>>>`. **Describe the solution you'd like** Add `impl` `From` or `TryFrom` so that it is possible to convert `std::process::Stdxx` into `tokio::process::Stdxx`. **Additional context** Right now it is impossible (from external API) to construct `tokio::process::Stdxx` without `execve` (via `Command`). it would be valuable to construct them with `nix::unistd::fork`. Since `std::process::Stdxx` is just a thin-wrapper around `FileDesc`, thanks to rust's zero cost abstraction, they're identical to `c_int` (32-bit, or *HANDLE*/64-bit on Windows). But this is not true for `tokio::process::Stdxx`, if we can convert `std::process::Stdxx` into `tokio::process::Stdxx`, then we can do IO redirections with `unistd::fork`.
process
implement tryfrom so that std process childstdxx can be converted to tokio process childstdxx is your feature request related to a problem please describe both std and tokio have stdin stdout stderr type imo tokio ones are more powerful because they implements asyncread in tokio src process unix mod rs tokio s stdxx are wrappers around std process stdxx pub crate type childstdin pollevented pub crate type childstdout pollevented pub crate type childstderr pollevented so it should be really easy to add impl tryfrom for childstdin for example not to mention tokio has function like fn stdio option option io result describe the solution you d like add impl from or tryfrom so that it is possible to convert std process stdxx into tokio process stdxx additional context right now it is impossible from external api to construct tokio process stdxx without execve via command it would be valuable to construct them with nix unistd fork since std process stdxx is just a thin wrapper around filedesc thanks to rust s zero cost abstraction they re identical to c int bit or handle bit on windows but this is not true for tokio process stdxx if we can convert std process stdxx into tokio process stdxx then we can do io redirections with unistd fork
1
12,411
14,919,506,559
IssuesEvent
2021-01-23 00:22:08
hashgraph/hedera-mirror-node
https://api.github.com/repos/hashgraph/hedera-mirror-node
closed
HTS acceptance tests can fail with TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED
P1 bug process test
**Detailed Description** HTS Acceptance Tests occasionally fail with `TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED`, this is because the payer account is set as the treasury account and eventually hits the limit of associated tokens after many successive runs. **Actual Behavior** Steps to reproduce the behavior: 1. Run HTS Acceptance Tests many times over a prolonged period 2. Eventually an attempt to create a token will fail with `TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED` **Expected Behavior** Create token transactions shouldn't fail with `TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED` **Suggested solution** - Dissociate payer from treasury account. Ensure sender is appropriately funded to do following token transfers - Create a new treasury account per test session **Environment:** - Version: v0.26.0-rc1 **Additional Context**
1.0
HTS acceptance tests can fail with TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED - **Detailed Description** HTS Acceptance Tests occasionally fail with `TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED`, this is because the payer account is set as the treasury account and eventually hits the limit of associated tokens after many successive runs. **Actual Behavior** Steps to reproduce the behavior: 1. Run HTS Acceptance Tests many times over a prolonged period 2. Eventually an attempt to create a token will fail with `TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED` **Expected Behavior** Create token transactions shouldn't fail with `TOKENS_PER_ACCOUNT_LIMIT_EXCEEDED` **Suggested solution** - Dissociate payer from treasury account. Ensure sender is appropriately funded to do following token transfers - Create a new treasury account per test session **Environment:** - Version: v0.26.0-rc1 **Additional Context**
process
hts acceptance tests can fail with tokens per account limit exceeded detailed description hts acceptance tests occasionally fail with tokens per account limit exceeded this is because the payer account is set as the treasury account and eventually hits the limit of associated tokens after many successive runs actual behavior steps to reproduce the behavior run hts acceptance tests many times over a prolonged period eventually an attempt to create a token will fail with tokens per account limit exceeded expected behavior create token transactions shouldn t fail with tokens per account limit exceeded suggested solution dissociate payer from treasury account ensure sender is appropriately funded to do following token transfers create a new treasury account per test session environment version additional context
1
698,905
23,996,272,123
IssuesEvent
2022-09-14 07:49:23
thoth-station/thamos
https://api.github.com/repos/thoth-station/thamos
closed
Thoth container image name and version are not provided in `thamos images` command
priority/critical-urgent kind/bug sig/stack-guidance
## Bug description When running `thamos images` with `thamos v1.27.5`, Thoth container images name and version are not printed in the command output. After verification, those values are `None` for all container images listed. ### Steps to Reproduce Upgrade thamos if necessary and run `thamos images`. ### Expected behavior Container images name and version are found and displayed in the output of the command. ### Additional context ![Screenshot from 2022-04-13 14-49-19](https://user-images.githubusercontent.com/66788861/163184143-1f961ec1-60be-4520-afbf-9ca74a7bd42a.png)
1.0
Thoth container image name and version are not provided in `thamos images` command - ## Bug description When running `thamos images` with `thamos v1.27.5`, Thoth container images name and version are not printed in the command output. After verification, those values are `None` for all container images listed. ### Steps to Reproduce Upgrade thamos if necessary and run `thamos images`. ### Expected behavior Container images name and version are found and displayed in the output of the command. ### Additional context ![Screenshot from 2022-04-13 14-49-19](https://user-images.githubusercontent.com/66788861/163184143-1f961ec1-60be-4520-afbf-9ca74a7bd42a.png)
non_process
thoth container image name and version are not provided in thamos images command bug description when running thamos images with thamos thoth container images name and version are not printed in the command output after verification those values are none for all container images listed steps to reproduce upgrade thamos if necessary and run thamos images expected behavior container images name and version are found and displayed in the output of the command additional context
0
72,095
31,160,068,940
IssuesEvent
2023-08-16 15:24:30
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
It wasn't clear what to do here: In Visual Studio Code in the browser, open config/connection.js in the explorer. In the getConnectionInfo function, see that the app settings you created earlier for the MongoDB connection are used (DATABASE_URL and DATABASE_NAME).
app-service/svc triaged assigned-to-author doc-enhancement Pri2
[Enter feedback here] --- #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: d4ce9745-5765-dae3-2d5b-2a24cfd787bb * Version Independent ID: 8d21fd6b-15a4-dc2a-c8c2-fc0e04c15f7f * Content: [Deploy a Node.js web app using MongoDB to Azure - Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/tutorial-nodejs-mongodb-app) * Content Source: [articles/app-service/tutorial-nodejs-mongodb-app.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/app-service/tutorial-nodejs-mongodb-app.md) * Service: **app-service** * GitHub Login: @cephalin * Microsoft Alias: **cephalin**
1.0
It wasn't clear what to do here: In Visual Studio Code in the browser, open config/connection.js in the explorer. In the getConnectionInfo function, see that the app settings you created earlier for the MongoDB connection are used (DATABASE_URL and DATABASE_NAME). - [Enter feedback here] --- #### Document Details ⚠ *Do not edit this section. It is required for learn.microsoft.com ➟ GitHub issue linking.* * ID: d4ce9745-5765-dae3-2d5b-2a24cfd787bb * Version Independent ID: 8d21fd6b-15a4-dc2a-c8c2-fc0e04c15f7f * Content: [Deploy a Node.js web app using MongoDB to Azure - Azure App Service](https://learn.microsoft.com/en-us/azure/app-service/tutorial-nodejs-mongodb-app) * Content Source: [articles/app-service/tutorial-nodejs-mongodb-app.md](https://github.com/MicrosoftDocs/azure-docs/blob/main/articles/app-service/tutorial-nodejs-mongodb-app.md) * Service: **app-service** * GitHub Login: @cephalin * Microsoft Alias: **cephalin**
non_process
it wasn t clear what to do here in visual studio code in the browser open config connection js in the explorer in the getconnectioninfo function see that the app settings you created earlier for the mongodb connection are used database url and database name document details ⚠ do not edit this section it is required for learn microsoft com ➟ github issue linking id version independent id content content source service app service github login cephalin microsoft alias cephalin
0
15,616
19,753,463,304
IssuesEvent
2022-01-15 10:22:14
Jeffail/benthos
https://api.github.com/repos/Jeffail/benthos
closed
Support interpolation in mongo processor
enhancement processors
Hi, I was wondering if it is possible to support [interpolations](https://www.benthos.dev/docs/configuration/interpolation/) in the mongo processor. For example, I wanted to be able to do something like this: Depending on the country, the name of the collection changes. ``` - mongodb: url: ${DB_MONGO_URL} database: ${DB_MONGO_NAME} collection: '${! meta("country") }-property' username: ${DB_USER} password: ${DB_PASS} operation: find-one filter_map: |- root.id = this.id hint_map: "" write_concern: w: "" j: false w_timeout: "60s" ``` Error: `{"@timestamp":"2021-11-30T17:13:47-03:00","@service":"benthos","component":"benthos.pipeline.processor.2.0","json":"{\"error\":\"(InvalidNamespace) Invalid collection name specified 'ads.${! meta(\\\"country\\\")}-property\",\"is_error\":true}","meta":"{\"action\":\"delete\",\"country\":\"co\",\"entity\":\"property\",\"gcp_pubsub_publish_time_unix\":\"1638303226\",\"key\":\"1458513\"}","level":"DEBUG","message":" Avro data parsed"}` Benthos version ``` Version: 3.59.0 Date: 2021-11-26T04:48:24Z ``` Thank you for everything. It's a great project ❤️
1.0
Support interpolation in mongo processor - Hi, I was wondering if it is possible to support [interpolations](https://www.benthos.dev/docs/configuration/interpolation/) in the mongo processor. For example, I wanted to be able to do something like this: Depending on the country, the name of the collection changes. ``` - mongodb: url: ${DB_MONGO_URL} database: ${DB_MONGO_NAME} collection: '${! meta("country") }-property' username: ${DB_USER} password: ${DB_PASS} operation: find-one filter_map: |- root.id = this.id hint_map: "" write_concern: w: "" j: false w_timeout: "60s" ``` Error: `{"@timestamp":"2021-11-30T17:13:47-03:00","@service":"benthos","component":"benthos.pipeline.processor.2.0","json":"{\"error\":\"(InvalidNamespace) Invalid collection name specified 'ads.${! meta(\\\"country\\\")}-property\",\"is_error\":true}","meta":"{\"action\":\"delete\",\"country\":\"co\",\"entity\":\"property\",\"gcp_pubsub_publish_time_unix\":\"1638303226\",\"key\":\"1458513\"}","level":"DEBUG","message":" Avro data parsed"}` Benthos version ``` Version: 3.59.0 Date: 2021-11-26T04:48:24Z ``` Thank you for everything. It's a great project ❤️
process
support interpolation in mongo processor hi i was wondering if it is possible to support in the mongo processor for example i wanted to be able to do something like this depending on the country the name of the collection changes mongodb url db mongo url database db mongo name collection meta country property username db user password db pass operation find one filter map root id this id hint map write concern w j false w timeout error timestamp service benthos component benthos pipeline processor json error invalidnamespace invalid collection name specified ads meta country property is error true meta action delete country co entity property gcp pubsub publish time unix key level debug message avro data parsed benthos version version date thank you for everything it s a great project ❤️
1
2,136
4,974,561,020
IssuesEvent
2016-12-06 07:12:17
opentrials/opentrials
https://api.github.com/repos/opentrials/opentrials
closed
Rebase on continuous dedup/linking
Processors
Based on high-level requirements to `explorer` we will be need to separate some `processors` responsible for doing trial merge/unmerge, publications linking etc. For example if curator will mark two records as the same trail - than some long-living process should react on it and merge records. The same problem is actual for other areas like processors run order etc (pubmed will be linked to existent trial but new trial can't check pubmed publications etc). So `processors` basically ready to do this step (extracting dedup/linking processors to work in background independently) - we could do it on one of the next iterations. Related to #105, #106 --- Example of other use case: - added trial with `nct123` and scientific title `some name` - added trial with `isrctn123` and scientific title `some name plus` - added trial with facts `nct123` and `isrctn123` - now we know that all 3 is the same but current system could merge third only with first or second. Continuous deduplication eventually merge all of them.
1.0
Rebase on continuous dedup/linking - Based on high-level requirements to `explorer` we will be need to separate some `processors` responsible for doing trial merge/unmerge, publications linking etc. For example if curator will mark two records as the same trail - than some long-living process should react on it and merge records. The same problem is actual for other areas like processors run order etc (pubmed will be linked to existent trial but new trial can't check pubmed publications etc). So `processors` basically ready to do this step (extracting dedup/linking processors to work in background independently) - we could do it on one of the next iterations. Related to #105, #106 --- Example of other use case: - added trial with `nct123` and scientific title `some name` - added trial with `isrctn123` and scientific title `some name plus` - added trial with facts `nct123` and `isrctn123` - now we know that all 3 is the same but current system could merge third only with first or second. Continuous deduplication eventually merge all of them.
process
rebase on continuous dedup linking based on high level requirements to explorer we will be need to separate some processors responsible for doing trial merge unmerge publications linking etc for example if curator will mark two records as the same trail than some long living process should react on it and merge records the same problem is actual for other areas like processors run order etc pubmed will be linked to existent trial but new trial can t check pubmed publications etc so processors basically ready to do this step extracting dedup linking processors to work in background independently we could do it on one of the next iterations related to example of other use case added trial with and scientific title some name added trial with and scientific title some name plus added trial with facts and now we know that all is the same but current system could merge third only with first or second continuous deduplication eventually merge all of them
1
2,964
5,960,465,951
IssuesEvent
2017-05-29 14:10:42
orbardugo/Hahot-Hameshulash
https://api.github.com/repos/orbardugo/Hahot-Hameshulash
closed
To combine between two or more queries
Development difficulty 2 in process priorty 3
**User Story:** As the client of the app i would like to make couple of queries at the same time, to example: I would like to get all the males that lives in jerusalem, at ages 18-20. **Test senarios:** 1. click on gender and add male/female. 2. click on age and choose range ages. 3. click on city and choose specific city. **Expected results:** 1. See results on query result labels
1.0
To combine between two or more queries - **User Story:** As the client of the app i would like to make couple of queries at the same time, to example: I would like to get all the males that lives in jerusalem, at ages 18-20. **Test senarios:** 1. click on gender and add male/female. 2. click on age and choose range ages. 3. click on city and choose specific city. **Expected results:** 1. See results on query result labels
process
to combine between two or more queries user story as the client of the app i would like to make couple of queries at the same time to example i would like to get all the males that lives in jerusalem at ages test senarios click on gender and add male female click on age and choose range ages click on city and choose specific city expected results see results on query result labels
1
14,911
18,296,394,205
IssuesEvent
2021-10-05 20:53:57
dtcenter/MET
https://api.github.com/repos/dtcenter/MET
opened
Fix MADIS2NC to work with changes in variables in the mesonet output
type: bug component: user support priority: high alert: NEED MORE DEFINITION alert: NEED ACCOUNT KEY alert: NEED PROJECT ASSIGNMENT component: external dependency requestor: METplus Team MET: PreProcessing Tools (Point)
## Describe the Problem ## The issue arose from [METplus Discussion #1189 ](https://github.com/dtcenter/METplus/discussions/1189). The madis2nc tool rejects certain mesonet data from MADIS due to a change in variables/format to the NetCDF files. Sometime in 2016, a change was made to the mesonet output files where they no longer contain variables such as precip3hr, precip6hr, etc. The MADIS support team and confirmed this was an intentional update and said that they do add/change variables to the raw data from time to time, which complicates the current methods for identifying MADIS data type. The change in data files can be demonstrated in the following files. On sencea in /d1/projects/METplus/discussions/1189, I ran the following ``` wget https://madis-data.ncep.noaa.gov/madisPublic1/data/archive/2016/01/01/LDAD/mesonet/netCDF/20160101_0000.gz gunzip 20160101_0000.gz ../../../MET/MET_releases/met-10.1.0-beta2/bin/madis2nc 20160101_0000 20160101_0000.nc -type mesonet ``` and confirmed this produced a successful run. ``` wget https://madis-data.ncep.noaa.gov/madisPublic1/data/archive/2017/01/01/LDAD/mesonet/netCDF/20170101_0000.gz gunzip 20170101_0000.gz ../../../MET/MET_releases/met-10.1.0-beta2/bin/madis2nc 20170101_0000 20170101_0000.nc -type mesonet ``` and confirmed that this produced the following error and warnings: ``` DEBUG 1: Reading MADIS File: 20170101_0000 ERROR : ERROR : process_madis_mesonet() Please check if the input is a MESONET. ERROR : WARNING: missing variable: precip3hr WARNING: missing variable: precip6hr WARNING: missing variable: precip12hr WARNING: missing variable: precip10min ``` We want to fix this for MET-10.1.0-beta4 and do a bugfix in MET-10.0.1. ### Expected Behavior ### MADIS2NC should be able to read various mesonet data from MADIS. ### Environment ### Describe your runtime environment: *1. Machine: (e.g. HPC name, Linux Workstation, Mac Laptop)* Seneca *2. OS: (e.g. RedHat Linux, MacOS)* Linux *3. 
Software version number(s)* ### To Reproduce ### Describe the steps to reproduce the behavior: **(see above)** *1. Go to '...'* *2. Click on '....'* *3. Scroll down to '....'* *4. See error* *Post relevant sample data following these instructions:* *https://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp* ### Relevant Deadlines ### MET-10.1.0-beta4 (11/15/21) ### Funding Source ### *TBD at start time* ## Define the Metadata ## ### Assignee ### - [x] Select **engineer(s)** or **no engineer** required - [ ] Select **scientist(s)** or **no scientist** required ### Labels ### - [x] Select **component(s)** - [x] Select **priority** - [x] Select **requestor(s)** ### Projects and Milestone ### - [x] Select **Organization** level **Project** for support of the current coordinated release - [x] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label - [x] Select **Milestone** as the next bugfix version ## Define Related Issue(s) ## Consider the impact to the other METplus components. - [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) ## Bugfix Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**. - [ ] Fork this repository or create a branch of **main_\<Version>**. Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>` - [ ] Fix the bug and test your changes. 
- [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **main_\<Version>**. Pull request: `bugfix <Issue Number> main_<Version> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Linked issues** Select: **Organization** level software support **Project** for the current coordinated release Select: **Milestone** as the next bugfix version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Complete the steps above to fix the bug on the **develop** branch. Branch name: `bugfix_<Issue Number>_develop_<Description>` Pull request: `bugfix <Issue Number> develop <Description>` Select: **Reviewer(s)** and **Linked issues** Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Close this issue.
1.0
Fix MADIS2NC to work with changes in variables in the mesonet output - ## Describe the Problem ## The issue arose from [METplus Discussion #1189 ](https://github.com/dtcenter/METplus/discussions/1189). The madis2nc tool rejects certain mesonet data from MADIS due to a change in variables/format to the NetCDF files. Sometime in 2016, a change was made to the mesonet output files where they no longer contain variables such as precip3hr, precip6hr, etc. The MADIS support team and confirmed this was an intentional update and said that they do add/change variables to the raw data from time to time, which complicates the current methods for identifying MADIS data type. The change in data files can be demonstrated in the following files. On sencea in /d1/projects/METplus/discussions/1189, I ran the following ``` wget https://madis-data.ncep.noaa.gov/madisPublic1/data/archive/2016/01/01/LDAD/mesonet/netCDF/20160101_0000.gz gunzip 20160101_0000.gz ../../../MET/MET_releases/met-10.1.0-beta2/bin/madis2nc 20160101_0000 20160101_0000.nc -type mesonet ``` and confirmed this produced a successful run. ``` wget https://madis-data.ncep.noaa.gov/madisPublic1/data/archive/2017/01/01/LDAD/mesonet/netCDF/20170101_0000.gz gunzip 20170101_0000.gz ../../../MET/MET_releases/met-10.1.0-beta2/bin/madis2nc 20170101_0000 20170101_0000.nc -type mesonet ``` and confirmed that this produced the following error and warnings: ``` DEBUG 1: Reading MADIS File: 20170101_0000 ERROR : ERROR : process_madis_mesonet() Please check if the input is a MESONET. ERROR : WARNING: missing variable: precip3hr WARNING: missing variable: precip6hr WARNING: missing variable: precip12hr WARNING: missing variable: precip10min ``` We want to fix this for MET-10.1.0-beta4 and do a bugfix in MET-10.0.1. ### Expected Behavior ### MADIS2NC should be able to read various mesonet data from MADIS. ### Environment ### Describe your runtime environment: *1. Machine: (e.g. HPC name, Linux Workstation, Mac Laptop)* Seneca *2. 
OS: (e.g. RedHat Linux, MacOS)* Linux *3. Software version number(s)* ### To Reproduce ### Describe the steps to reproduce the behavior: **(see above)** *1. Go to '...'* *2. Click on '....'* *3. Scroll down to '....'* *4. See error* *Post relevant sample data following these instructions:* *https://dtcenter.org/community-code/model-evaluation-tools-met/met-help-desk#ftp* ### Relevant Deadlines ### MET-10.1.0-beta4 (11/15/21) ### Funding Source ### *TBD at start time* ## Define the Metadata ## ### Assignee ### - [x] Select **engineer(s)** or **no engineer** required - [ ] Select **scientist(s)** or **no scientist** required ### Labels ### - [x] Select **component(s)** - [x] Select **priority** - [x] Select **requestor(s)** ### Projects and Milestone ### - [x] Select **Organization** level **Project** for support of the current coordinated release - [x] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label - [x] Select **Milestone** as the next bugfix version ## Define Related Issue(s) ## Consider the impact to the other METplus components. - [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) ## Bugfix Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**. - [ ] Fork this repository or create a branch of **main_\<Version>**. 
Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>` - [ ] Fix the bug and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. - [ ] Submit a pull request to merge into **main_\<Version>**. Pull request: `bugfix <Issue Number> main_<Version> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Linked issues** Select: **Organization** level software support **Project** for the current coordinated release Select: **Milestone** as the next bugfix version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Complete the steps above to fix the bug on the **develop** branch. Branch name: `bugfix_<Issue Number>_develop_<Description>` Pull request: `bugfix <Issue Number> develop <Description>` Select: **Reviewer(s)** and **Linked issues** Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Close this issue.
process
fix to work with changes in variables in the mesonet output describe the problem the issue arose from the tool rejects certain mesonet data from madis due to a change in variables format to the netcdf files sometime in a change was made to the mesonet output files where they no longer contain variables such as etc the madis support team and confirmed this was an intentional update and said that they do add change variables to the raw data from time to time which complicates the current methods for identifying madis data type the change in data files can be demonstrated in the following files on sencea in projects metplus discussions i ran the following wget gunzip gz met met releases met bin nc type mesonet and confirmed this produced a successful run wget gunzip gz met met releases met bin nc type mesonet and confirmed that this produced the following error and warnings debug reading madis file error error process madis mesonet please check if the input is a mesonet error warning missing variable warning missing variable warning missing variable warning missing variable we want to fix this for met and do a bugfix in met expected behavior should be able to read various mesonet data from madis environment describe your runtime environment machine e g hpc name linux workstation mac laptop seneca os e g redhat linux macos linux software version number s to reproduce describe the steps to reproduce the behavior see above go to click on scroll down to see error post relevant sample data following these instructions relevant deadlines met funding source tbd at start time define the metadata assignee select engineer s or no engineer required select scientist s or no scientist required labels select component s select priority select requestor s projects and milestone select organization level project for support of the current coordinated release select repository level project for development toward the next official release or add alert need project assignment label 
select milestone as the next bugfix version define related issue s consider the impact to the other metplus components bugfix checklist see the for details complete the issue definition above including the time estimate and funding source fork this repository or create a branch of main branch name bugfix main fix the bug and test your changes add update log messages for easier debugging add update unit tests add update documentation push local changes to github submit a pull request to merge into main pull request bugfix main define the pull request metadata as permissions allow select reviewer s and linked issues select organization level software support project for the current coordinated release select milestone as the next bugfix version iterate until the reviewer s accept and merge your changes delete your fork or branch complete the steps above to fix the bug on the develop branch branch name bugfix develop pull request bugfix develop select reviewer s and linked issues select repository level development cycle project for the next official release select milestone as the next official version close this issue
1
388,264
26,757,292,752
IssuesEvent
2023-01-31 02:01:40
dotnetcore/BootstrapBlazor
https://api.github.com/repos/dotnetcore/BootstrapBlazor
closed
doc: update the table component toolbar demos
documentation
### Document describing which component ```Table``` component, ```Toolbar``` Demos update to new code block mode.
1.0
doc: update the table component toolbar demos - ### Document describing which component ```Table``` component, ```Toolbar``` Demos update to new code block mode.
non_process
doc update the table component toolbar demos document describing which component table component toolbar demos update to new code block mode
0
195,713
6,917,476,254
IssuesEvent
2017-11-29 08:42:06
DjangoChained/comanche
https://api.github.com/repos/DjangoChained/comanche
opened
Traitement des requêtes
enhancement high priority
1. Parser la requête ; 1. Renvoyer `505 HTTP Version Not Supported` si la requête n'est pas en `HTTP/1.1` ; 1. Renvoyer `405 Method Not Allowed` si la requête n'est pas de type GET ; 1. Rechercher la première projection correspondante dans le fichier de configuration ; 1. Si aucune projection ne correspond, renvoyer `404 Not Found` ; 1. Si une projection statique correspond, exécuter le traitement de projection statique ; 1. Si une projection dynamque correspond, exécuter le traitement de projection dynamique.
1.0
Traitement des requêtes - 1. Parser la requête ; 1. Renvoyer `505 HTTP Version Not Supported` si la requête n'est pas en `HTTP/1.1` ; 1. Renvoyer `405 Method Not Allowed` si la requête n'est pas de type GET ; 1. Rechercher la première projection correspondante dans le fichier de configuration ; 1. Si aucune projection ne correspond, renvoyer `404 Not Found` ; 1. Si une projection statique correspond, exécuter le traitement de projection statique ; 1. Si une projection dynamque correspond, exécuter le traitement de projection dynamique.
non_process
traitement des requêtes parser la requête renvoyer http version not supported si la requête n est pas en http renvoyer method not allowed si la requête n est pas de type get rechercher la première projection correspondante dans le fichier de configuration si aucune projection ne correspond renvoyer not found si une projection statique correspond exécuter le traitement de projection statique si une projection dynamque correspond exécuter le traitement de projection dynamique
0
28
2,499,742,681
IssuesEvent
2015-01-08 05:03:59
patrickcate/patrickc8t-portfolio
https://api.github.com/repos/patrickcate/patrickc8t-portfolio
closed
Add picturefill.js
graphics js performance
Add picturefill.js and different image srcset image sizes so browsers can download the smallest image files needed.
True
Add picturefill.js - Add picturefill.js and different image srcset image sizes so browsers can download the smallest image files needed.
non_process
add picturefill js add picturefill js and different image srcset image sizes so browsers can download the smallest image files needed
0
26,410
5,251,560,305
IssuesEvent
2017-02-02 00:09:20
BonnyCI/projman
https://api.github.com/repos/BonnyCI/projman
closed
Document the BonnyCI workflow
MVP: Documentation
Describe what happens when a pull request is opened and how it gets tested, approved, gated, merged. Crib from the upstream OpenStack zuul/gerrit developer docs.
1.0
Document the BonnyCI workflow - Describe what happens when a pull request is opened and how it gets tested, approved, gated, merged. Crib from the upstream OpenStack zuul/gerrit developer docs.
non_process
document the bonnyci workflow describe what happens when a pull request is opened and how it gets tested approved gated merged crib from the upstream openstack zuul gerrit developer docs
0
1,350
3,908,238,334
IssuesEvent
2016-04-19 15:18:15
BEP-store/project-plan
https://api.github.com/repos/BEP-store/project-plan
closed
Project organization
Process Product
An outline of the project organization in terms of different repositories and IP concerns should be included in the project plan.
1.0
Project organization - An outline of the project organization in terms of different repositories and IP concerns should be included in the project plan.
process
project organization an outline of the project organization in terms of different repositories and ip concerns should be included in the project plan
1
16,441
21,317,070,092
IssuesEvent
2022-04-16 13:16:27
dita-ot/dita-ot
https://api.github.com/repos/dita-ot/dita-ot
closed
Handling of link to topic referenced with @copy-to
bug preprocess stale
I'm attaching a small sample project. I'm publishing to PDF using DITA OT 2.4.6. Basically it's a topic referenced with @copy-to: <topicref href="my_topic.dita" copy-to="prefix_my_topic.dita"/> and another topic has a link to it. In my opinion the xref should point directly to the newer @copy-to value: <xref href="prefix_my_topic.dita#my_topic"/> With DITA OT 1.8 the generated PDF will contain a functioning link with a page number attached to it. With DITA OT 2.4.6 the publishing breaks with a NullPointerException. - I believe this to be a bug, not a question about using DITA-OT. - I read the [CONTRIBUTING][] file. ## Expected Behavior The link to the topic referenced with @copy-to should work. ## Actual Behavior The publishing breaks with a NullPointerException. ## Steps to Reproduce Publish the attached DITA Map to PDF. [rellinkCopyTo.zip](https://github.com/dita-ot/dita-ot/files/1091362/rellinkCopyTo.zip) ## Copy of the error message, log file or stack trace ``` D:\projects\eXml\frameworks\dita\DITA-OT2.x\build.xml:45: The following error occurred while executing this line: D:\projects\eXml\frameworks\dita\DITA-OT2.x\plugins\org.dita.base\build_preprocess.xml:290: java.lang.NullPointerException at org.dita.dost.module.CopyToModule.performCopytoTask(CopyToModule.java:154) at org.dita.dost.module.CopyToModule.execute(CopyToModule.java:67) at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:70) at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:222) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at org.apache.tools.ant.Target.execute(Target.java:435) ``` ## Environment * DITA-OT version: 
2.4.6 * Operating system and version _(Linux, macOS, Windows)_: Windows * How did you run DITA-OT? oXygen * Transformation type _(HTML5, PDF, custom, etc.)_: PDF [CONTRIBUTING]: https://github.com/dita-ot/dita-ot/blob/develop/.github/CONTRIBUTING.md
1.0
Handling of link to topic referenced with @copy-to - I'm attaching a small sample project. I'm publishing to PDF using DITA OT 2.4.6. Basically it's a topic referenced with @copy-to: <topicref href="my_topic.dita" copy-to="prefix_my_topic.dita"/> and another topic has a link to it. In my opinion the xref should point directly to the newer @copy-to value: <xref href="prefix_my_topic.dita#my_topic"/> With DITA OT 1.8 the generated PDF will contain a functioning link with a page number attached to it. With DITA OT 2.4.6 the publishing breaks with a NullPointerException. - I believe this to be a bug, not a question about using DITA-OT. - I read the [CONTRIBUTING][] file. ## Expected Behavior The link to the topic referenced with @copy-to should work. ## Actual Behavior The publishing breaks with a NullPointerException. ## Steps to Reproduce Publish the attached DITA Map to PDF. [rellinkCopyTo.zip](https://github.com/dita-ot/dita-ot/files/1091362/rellinkCopyTo.zip) ## Copy of the error message, log file or stack trace ``` D:\projects\eXml\frameworks\dita\DITA-OT2.x\build.xml:45: The following error occurred while executing this line: D:\projects\eXml\frameworks\dita\DITA-OT2.x\plugins\org.dita.base\build_preprocess.xml:290: java.lang.NullPointerException at org.dita.dost.module.CopyToModule.performCopytoTask(CopyToModule.java:154) at org.dita.dost.module.CopyToModule.execute(CopyToModule.java:67) at org.dita.dost.pipeline.PipelineFacade.execute(PipelineFacade.java:70) at org.dita.dost.invoker.ExtensibleAntInvoker.execute(ExtensibleAntInvoker.java:222) at org.apache.tools.ant.UnknownElement.execute(UnknownElement.java:293) at sun.reflect.GeneratedMethodAccessor6.invoke(Unknown Source) at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) at java.lang.reflect.Method.invoke(Unknown Source) at org.apache.tools.ant.dispatch.DispatchUtils.execute(DispatchUtils.java:106) at org.apache.tools.ant.Task.perform(Task.java:348) at 
org.apache.tools.ant.Target.execute(Target.java:435) ``` ## Environment * DITA-OT version: 2.4.6 * Operating system and version _(Linux, macOS, Windows)_: Windows * How did you run DITA-OT? oXygen * Transformation type _(HTML5, PDF, custom, etc.)_: PDF [CONTRIBUTING]: https://github.com/dita-ot/dita-ot/blob/develop/.github/CONTRIBUTING.md
process
handling of link to topic referenced with copy to i m attaching a small sample project i m publishing to pdf using dita ot basically it s a topic referenced with copy to and another topic has a link to it in my opinion the xref should point directly to the newer copy to value with dita ot the generated pdf will contain a functioning link with a page number attached to it with dita ot the publishing breaks with a nullpointerexception i believe this to be a bug not a question about using dita ot i read the file expected behavior the link to the topic referenced with copy to should work actual behavior the publishing breaks with a nullpointerexception steps to reproduce publish the attached dita map to pdf copy of the error message log file or stack trace d projects exml frameworks dita dita x build xml the following error occurred while executing this line d projects exml frameworks dita dita x plugins org dita base build preprocess xml java lang nullpointerexception at org dita dost module copytomodule performcopytotask copytomodule java at org dita dost module copytomodule execute copytomodule java at org dita dost pipeline pipelinefacade execute pipelinefacade java at org dita dost invoker extensibleantinvoker execute extensibleantinvoker java at org apache tools ant unknownelement execute unknownelement java at sun reflect invoke unknown source at sun reflect delegatingmethodaccessorimpl invoke unknown source at java lang reflect method invoke unknown source at org apache tools ant dispatch dispatchutils execute dispatchutils java at org apache tools ant task perform task java at org apache tools ant target execute target java environment dita ot version operating system and version linux macos windows windows how did you run dita ot oxygen transformation type pdf custom etc pdf
1
179,458
14,704,647,974
IssuesEvent
2021-01-04 16:48:54
SketchUp/api-issue-tracker
https://api.github.com/repos/SketchUp/api-issue-tracker
closed
#active_path documentation contradiction
Ruby API SketchUp documentation
1. SketchUp/LayOut Version: Pro 2020 2. OS Platform: all Excerpt: > The [#active_path=] method is used to open a given instance path for editing. > It is expected that no entities are modified in an operation that opens/closes instances. open for edit but do not modify !
1.0
#active_path documentation contradiction - 1. SketchUp/LayOut Version: Pro 2020 2. OS Platform: all Excerpt: > The [#active_path=] method is used to open a given instance path for editing. > It is expected that no entities are modified in an operation that opens/closes instances. open for edit but do not modify !
non_process
active path documentation contradiction sketchup layout version pro os platform all excerpt the method is used to open a given instance path for editing it is expected that no entities are modified in an operation that opens closes instances open for edit but do not modify
0
431,047
12,474,454,329
IssuesEvent
2020-05-29 09:40:52
Optiboot/optiboot
https://api.github.com/repos/Optiboot/optiboot
closed
Comment in boards-1.6.txt about Pro Micro probably was intended to be Pro Mini
Component-Docs No-binary-change Priority-Low
I noticed in the 8.0 release zip boards.txt and in boards-1.6.txt there is this comment: ## Optiboot on 32pin (SMT) CPUs (Nano, Pro Micro, etc.) I think perhaps that was meant to say Pro Mini instead of Pro Micro.
1.0
Comment in boards-1.6.txt about Pro Micro probably was intended to be Pro Mini - I noticed in the 8.0 release zip boards.txt and in boards-1.6.txt there is this comment: ## Optiboot on 32pin (SMT) CPUs (Nano, Pro Micro, etc.) I think perhaps that was meant to say Pro Mini instead of Pro Micro.
non_process
comment in boards txt about pro micro probably was intended to be pro mini i noticed in the release zip boards txt and in boards txt there is this comment optiboot on smt cpus nano pro micro etc i think perhaps that was meant to say pro mini instead of pro micro
0
21,373
29,202,227,785
IssuesEvent
2023-05-21 00:36:45
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
[Remoto] DevOps (Pleno) na Coodesh
SALVADOR PJ PYTHON PLENO NODE.JS DEVOPS AWS REQUISITOS REMOTO TDD PROCESSOS GITHUB CI CD SEGURANÇA UMA C CLEAN ECS TERRAFORM MANUTENÇÃO APP INTELIGÊNCIA ARTIFICIAL ARQUITETURA DE SOFTWARE API GATEWAY RDS Stale
## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/devops-pleno-164955703?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A <strong>UPlanner</strong> está em busca de <strong><ins>DevOps Pleno</ins></strong> para compor seu time!</p> <p><br>A UPlanner é uma empresa de planejamento financeiro pessoal, a primeira a ter uma solução digital completa, que está transformando a vida financeira de seus clientes. Contamos com um time de planejadores financeiros, que utilizam nossa plataforma para potencializar o desenvolvimento e educação financeira dos clientes.</p> <p>Estamos a todo vapor no nosso processo de expansão, contratando em todos os setores. Nosso time de desenvolvimento precisa de reforço. Estamos em busca de um(a) DevOps que chegue para servir nossos engenheiros de software e manter nossa infra impecável.</p> <p></p> <p><strong>Responsabilidades:</strong></p> <ul> <li>Você será 100% responsável por introduzir processos, ferramentas e metodologias para equilibrar as necessidades ao longo de todo o ciclo de vida do desenvolvimento de nossa plataforma, desde a criação do código e a implantação até as etapas de manutenção e atualização;</li> <li>Sua missão será reduzir a complexidade entre a integração do trabalho dos devs, preenchendo as lacunas entre as ações necessárias para mudar uma aplicação rapidamente e as tarefas para manter a confiabilidade e segurança.</li> </ul> ## UPlanner - Planejamento Financeiro Pessoal: <p>A Uplanner conta com uma equipe de planejadores financeiros e um super APP que usa inteligência artificial, para auxiliar na gestão financeira diária. 
Tudo isso, somado a um plano de ação que auxilia cada um, na realização de seus planos e na potencialização de seus resultados.</p> </p> ## Habilidades: - Github Actions - Amazon AWS - Terraform - Python - Node.js - Bash ## Local: 100% Remoto ## Requisitos: - CI/CD (Github Actions, Code Magic); - Foco em performance, disponibilidade, flexibilidade e segurança; - Experiência com Python, Node e Bash; - Experiência comprovada com serviços AWS (ECS, RDS, API Gateway, CodeDeploy, CodePipeline, IAM, VPC, Security Groups); - Experiência comprovada em IaaS utilizando Terraform. ## Diferenciais: - Experiência em projetos fintech; - Test Driven Development (TDD); - Arquitetura de software (clean architecture, event-driven, Microsserviços). ## Benefícios: - Plano de saúde; - Seguro de vida; - Auxílio alimentação; - Planejamento financeiro. ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [DevOps (Pleno) na UPlanner - Planejamento Financeiro Pessoal](https://coodesh.com/vagas/devops-pleno-164955703?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime PJ #### Categoria DevOps
1.0
[Remoto] DevOps (Pleno) na Coodesh - ## Descrição da vaga: Esta é uma vaga de um parceiro da plataforma Coodesh, ao candidatar-se você terá acesso as informações completas sobre a empresa e benefícios. Fique atento ao redirecionamento que vai te levar para uma url [https://coodesh.com](https://coodesh.com/vagas/devops-pleno-164955703?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) com o pop-up personalizado de candidatura. 👋 <p>A <strong>UPlanner</strong> está em busca de <strong><ins>DevOps Pleno</ins></strong> para compor seu time!</p> <p><br>A UPlanner é uma empresa de planejamento financeiro pessoal, a primeira a ter uma solução digital completa, que está transformando a vida financeira de seus clientes. Contamos com um time de planejadores financeiros, que utilizam nossa plataforma para potencializar o desenvolvimento e educação financeira dos clientes.</p> <p>Estamos a todo vapor no nosso processo de expansão, contratando em todos os setores. Nosso time de desenvolvimento precisa de reforço. Estamos em busca de um(a) DevOps que chegue para servir nossos engenheiros de software e manter nossa infra impecável.</p> <p></p> <p><strong>Responsabilidades:</strong></p> <ul> <li>Você será 100% responsável por introduzir processos, ferramentas e metodologias para equilibrar as necessidades ao longo de todo o ciclo de vida do desenvolvimento de nossa plataforma, desde a criação do código e a implantação até as etapas de manutenção e atualização;</li> <li>Sua missão será reduzir a complexidade entre a integração do trabalho dos devs, preenchendo as lacunas entre as ações necessárias para mudar uma aplicação rapidamente e as tarefas para manter a confiabilidade e segurança.</li> </ul> ## UPlanner - Planejamento Financeiro Pessoal: <p>A Uplanner conta com uma equipe de planejadores financeiros e um super APP que usa inteligência artificial, para auxiliar na gestão financeira diária. 
Tudo isso, somado a um plano de ação que auxilia cada um, na realização de seus planos e na potencialização de seus resultados.</p> </p> ## Habilidades: - Github Actions - Amazon AWS - Terraform - Python - Node.js - Bash ## Local: 100% Remoto ## Requisitos: - CI/CD (Github Actions, Code Magic); - Foco em performance, disponibilidade, flexibilidade e segurança; - Experiência com Python, Node e Bash; - Experiência comprovada com serviços AWS (ECS, RDS, API Gateway, CodeDeploy, CodePipeline, IAM, VPC, Security Groups); - Experiência comprovada em IaaS utilizando Terraform. ## Diferenciais: - Experiência em projetos fintech; - Test Driven Development (TDD); - Arquitetura de software (clean architecture, event-driven, Microsserviços). ## Benefícios: - Plano de saúde; - Seguro de vida; - Auxílio alimentação; - Planejamento financeiro. ## Como se candidatar: Candidatar-se exclusivamente através da plataforma Coodesh no link a seguir: [DevOps (Pleno) na UPlanner - Planejamento Financeiro Pessoal](https://coodesh.com/vagas/devops-pleno-164955703?utm_source=github&utm_medium=devssa-onde-codar-em-salvador&modal=open) Após candidatar-se via plataforma Coodesh e validar o seu login, você poderá acompanhar e receber todas as interações do processo por lá. Utilize a opção **Pedir Feedback** entre uma etapa e outra na vaga que se candidatou. Isso fará com que a pessoa **Recruiter** responsável pelo processo na empresa receba a notificação. ## Labels #### Alocação Remoto #### Regime PJ #### Categoria DevOps
process
devops pleno na coodesh descrição da vaga esta é uma vaga de um parceiro da plataforma coodesh ao candidatar se você terá acesso as informações completas sobre a empresa e benefícios fique atento ao redirecionamento que vai te levar para uma url com o pop up personalizado de candidatura 👋 a uplanner está em busca de devops pleno para compor seu time a uplanner é uma empresa de planejamento financeiro pessoal a primeira a ter uma solução digital completa que está transformando a vida financeira de seus clientes contamos com um time de planejadores financeiros que utilizam nossa plataforma para potencializar o desenvolvimento e educação financeira dos clientes estamos a todo vapor no nosso processo de expansão contratando em todos os setores nosso time de desenvolvimento precisa de reforço estamos em busca de um a devops que chegue para servir nossos engenheiros de software e manter nossa infra impecável responsabilidades você será responsável por introduzir processos ferramentas e metodologias para equilibrar as necessidades ao longo de todo o ciclo de vida do desenvolvimento de nossa plataforma desde a criação do código e a implantação até as etapas de manutenção e atualização sua missão será reduzir a complexidade entre a integração do trabalho dos devs preenchendo as lacunas entre as ações necessárias para mudar uma aplicação rapidamente e as tarefas para manter a confiabilidade e segurança uplanner planejamento financeiro pessoal a uplanner conta com uma equipe de planejadores financeiros e um super app que usa inteligência artificial para auxiliar na gestão financeira diária tudo isso somado a um plano de ação que auxilia cada um na realização de seus planos e na potencialização de seus resultados habilidades github actions amazon aws terraform python node js bash local remoto requisitos ci cd github actions code magic foco em performance disponibilidade flexibilidade e segurança experiência com python node e bash experiência comprovada com serviços aws ecs rds 
api gateway codedeploy codepipeline iam vpc security groups experiência comprovada em iaas utilizando terraform diferenciais experiência em projetos fintech test driven development tdd arquitetura de software clean architecture event driven microsserviços benefícios plano de saúde seguro de vida auxílio alimentação planejamento financeiro como se candidatar candidatar se exclusivamente através da plataforma coodesh no link a seguir após candidatar se via plataforma coodesh e validar o seu login você poderá acompanhar e receber todas as interações do processo por lá utilize a opção pedir feedback entre uma etapa e outra na vaga que se candidatou isso fará com que a pessoa recruiter responsável pelo processo na empresa receba a notificação labels alocação remoto regime pj categoria devops
1
3,297
6,395,363,042
IssuesEvent
2017-08-04 13:04:28
pelias/api
https://api.github.com/repos/pelias/api
closed
Should warn if any unexpected parameters are found in query.
help wanted in progress in review low hanging fruit processed
# error when specifying singular of 'sources' or 'layers' I frequently make the mistake of setting 'source' instead of 'sources'; same for 'layer'. The API responds without warnings or errors, which causes me much confusion until I catch my typo. We should at the very least `warn`, but ideally `error` in this case so we `fail fast, fail early`
1.0
Should warn if any unexpected parameters are found in query. - # error when specifying singular of 'sources' or 'layers' I frequently make the mistake of setting 'source' instead of 'sources'; same for 'layer'. The API responds without warnings or errors, which causes me much confusion until I catch my typo. We should at the very least `warn`, but ideally `error` in this case so we `fail fast, fail early`
process
should warn if any unexpected parameters are found in query error when specifying singular of sources or layers i frequently make the mistake of setting source instead of sources same for layer the api responds without warnings or errors which causes me much confusion until i catch my typo we should at the very least warn but ideally error in this case so we fail fast fail early
1
314,056
23,504,394,778
IssuesEvent
2022-08-18 11:18:42
rhiannonmcn/the-book-nook
https://api.github.com/repos/rhiannonmcn/the-book-nook
opened
DOCUMENTATION: README.md Update
documentation must have
- [ ] Finish Future Features section in README.md - [ ] Finish Technologies used section in README.md - [ ] Finish Credits section in README.md - [ ] Finish Acknowledgements section in README.md
1.0
DOCUMENTATION: README.md Update - - [ ] Finish Future Features section in README.md - [ ] Finish Technologies used section in README.md - [ ] Finish Credits section in README.md - [ ] Finish Acknowledgements section in README.md
non_process
documentation readme md update finish future features section in readme md finish technologies used section in readme md finish credits section in readme md finish acknowledgements section in readme md
0
220,175
24,564,716,491
IssuesEvent
2022-10-13 01:05:43
akshat702/cart-ionic
https://api.github.com/repos/akshat702/cart-ionic
opened
CVE-2022-37599 (Medium) detected in loader-utils-1.1.0.tgz
security vulnerability
## CVE-2022-37599 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.1.0.tgz</b></p></summary> <p>utils for webpack loaders</p> <p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz</a></p> <p>Path to dependency file: /cart/package.json</p> <p>Path to vulnerable library: /cart/e2e/node_modules/loader-utils/package.json,/cart/e2e/node_modules/loader-utils/package.json</p> <p> Dependency Hierarchy: - build-angular-0.12.4.tgz (Root Library) - :x: **loader-utils-1.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/akshat702/cart-ionic/commits/3fdf7c1cc5dbc0b605cbc948ab32ca2ee1f0e49f">3fdf7c1cc5dbc0b605cbc948ab32ca2ee1f0e49f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Regular expression denial of service (ReDoS) flaw was found in Function interpolateName in interpolateName.js in webpack loader-utils 2.0.0 via the resourcePath variable in interpolateName.js. 
<p>Publish Date: 2022-10-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37599>CVE-2022-37599</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-37599 (Medium) detected in loader-utils-1.1.0.tgz - ## CVE-2022-37599 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>loader-utils-1.1.0.tgz</b></p></summary> <p>utils for webpack loaders</p> <p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.1.0.tgz</a></p> <p>Path to dependency file: /cart/package.json</p> <p>Path to vulnerable library: /cart/e2e/node_modules/loader-utils/package.json,/cart/e2e/node_modules/loader-utils/package.json</p> <p> Dependency Hierarchy: - build-angular-0.12.4.tgz (Root Library) - :x: **loader-utils-1.1.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/akshat702/cart-ionic/commits/3fdf7c1cc5dbc0b605cbc948ab32ca2ee1f0e49f">3fdf7c1cc5dbc0b605cbc948ab32ca2ee1f0e49f</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Regular expression denial of service (ReDoS) flaw was found in Function interpolateName in interpolateName.js in webpack loader-utils 2.0.0 via the resourcePath variable in interpolateName.js. 
<p>Publish Date: 2022-10-11 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37599>CVE-2022-37599</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in loader utils tgz cve medium severity vulnerability vulnerable library loader utils tgz utils for webpack loaders library home page a href path to dependency file cart package json path to vulnerable library cart node modules loader utils package json cart node modules loader utils package json dependency hierarchy build angular tgz root library x loader utils tgz vulnerable library found in head commit a href vulnerability details a regular expression denial of service redos flaw was found in function interpolatename in interpolatename js in webpack loader utils via the resourcepath variable in interpolatename js publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href step up your open source security game with mend
0
21,976
30,468,563,138
IssuesEvent
2023-07-17 12:11:48
q191201771/lal
https://api.github.com/repos/q191201771/lal
closed
多路同时推、拉流,报Broken pipe
#Question *In process
用如下命令推流: ffmpeg -f libk_video -wh 1920x1080 -sensor 3 -i video="test " -f alsa -ac 2 -ar 32000 -i hw:0 -idr_freq 25 -vcodec libk_h264 -acodec aac -f rtsp rtsp://10.20.1.55:5544/xxxx/com28 ffmpeg version 4.4 Copyright (c) 2000-2021 the FFmpeg developers built with gcc 7.3.0 (2019-11-20_nds64le-linux-glibc-v5d-6c120106e03) configuration: --cross-prefix=riscv64-linux- --enable-cross-compile --target-os=linux --cc=riscv64-linux-gcc --arch=riscv64 --extra-ldflags=-L./ --extra-ldflags=-ldl --extra-ldflags='-Wl,-rpath .' --enable-static --enable-libk_video --enable-libk_h264 --enable-libk_jpeg --enable-alsa --disable-autodetect --disable-ffplay --disable-ffprobe --disable-doc --enable-audio3a --enable-indev=v4l2 libavutil 56. 70.100 / 56. 70.100 libavcodec 58.134.100 / 58.134.100 libavformat 58. 76.100 / 58. 76.100 libavdevice 58. 13.100 / 58. 13.100 libavfilter 7.110.100 / 7.110.100 libswscale 5. 9.100 / 5. 9.100 libswresample 3. 9.100 / 3. 9.100 0x970E00f4: from 0x00550000 to 0x00110000 0x970E00f8: from 0x00000000 to 0x00770000 0x970E00fc: from 0x0fffff00 to 0x0fffff00 0x99900290: from 0x00000133 to 0x00000110 0x9990028c: from 0x00000001 to 0x00000000 0x9990038c: from 0x00000003 to 0x00000000 0x99900388: from 0x80000501 to 0x80000707 0x98000504: from 0x0001ffff to 0x00010303 k_video_read_header>w 1920, h 1080, stride = 1920 alloc_memory>phy_addr 0x1aebd000, size 68431872 k_video_read_header>isp_buf_paddr 0x1aebd000, isp_buf_vaddr 0x565000, isp_buf_size 68428800 isp_video ds0 block alloc:0x1aabc000,size:4194304,align 4096 ds0_out_addr =0x1aabc000 twod block alloc:0x182bb000,size:41943040,align 4096 isp_info.ds1_addr is 1aebd000 isp_info.ds1_stride is 780 video_set_mipicsi start! set_vi_params set_isp_params run_video!! Open struct isp_device addr = 0x25c0b0,vi = 0x25c150,mmio_base = 0x92620700 [ 18.682787] mipi_coner_init done, pvt code 0x1ffff [ 19.126175] lcd init ok lcd is 1 ---------------------------------!!! 
[ 19.133179] mipi_dsi_init done [ 19.136259] mipi_rx_dphy_init done! Call cmd ISP_CMD_MIPI_DSI_INIT s[ 19.139902] isp_act_sensor_rst success uccss system is ready to Release rst_n system rst_n has all release system rst_n has all release i2c_num = 0 imx219_i2c_init config done Isp_f2k_Init start Isp_f2k_Init end video_in_Init start vi_wrap_config start struct isp_device = 0x25c0b0,vi= 0x25c150 vi_wrap_rst struct isp_device 0x25c0b0,VI_WRAP_SWRST_CTL = 0x34f vi_wrap_config end video_in_Init end Imx219_1080p30_init reg is 100 val is 1 reg is 30eb val is 0 reg is 30eb val is 0 reg is 300a val is ff reg is 300b val is ff reg is 30eb val is 0 reg is 30eb val is 0 reg is 114 val is 1 reg is 128 val is 0 reg is 12a val is 18 reg is 12b val is 0 reg is 160 val is 4 reg is 161 val is 8e reg is 162 val is d reg is 163 val is 94 reg is 164 val is 2 reg is 165 val is a8 reg is 166 val is a reg is 167 val is 27 reg is 168 val is 2 reg is 169 val is b4 reg is 16a val is 6 reg is 16b val is eb reg is 16c val is 7 reg is 16d val is 80 reg is 16e val is 4 reg is 16f val is 38 reg is 170 val is 1 reg is 171 val is 1 reg is 174 val is 0 reg is 175 val is 0 reg is 301 val is 5 reg is 303 val is 1 reg is 304 val is 3 reg is 305 val is 3 reg is 306 val is 0 reg is 307 val is 26 reg is 30b val is 1 reg is 30c val is 0 reg is 30d val is 30 reg is 624 val is 7 reg is 625 val is 80 reg is 626 val is 4 reg is 627 val is 38 reg is 455e val is 0 reg is 471e val is 0 reg is 4767 val is 0 reg is 4750 val is 0 reg is 4540 val is 0 reg is 47b4 val is 0 reg is 4713 val is 0 reg is 478b val is 0 reg is 478f val is 0 reg is 4793 val is 0 reg is 4797 val is 0 reg is 479b val is 0 reg is 157 val is 40 reg is 158 val is 1 reg is 159 val is 0 reg is 15a val is 3 reg is 15b val is e8 reg is 100 val is 1 isp_f2k_core_table_init start! Isp2K RGB Gamma TABLE config done! Isp2K YUV Gamma TABLE config done! VO VCoef Config done![ 23.393613] plat->ds1_addr is1aebd000 VO HCoef Config done! 
VO GA[ 23.397728] plat->ds1_buf_cut is 14 MMA Coef Config done! Call cmd [ 23.404124] ISP_CMD_SET_DS1_SIZE fram_uv_addr is 1fa400 ISP_CMD_DS1_ADDR succss Call cmd ISP_CMD_DS1_BUFF_COUNT succss Call cmd ISP_CMD_SET_DS1_SIZE succss k_video_read_header>fd_isp: 0x7 isp_ouput Input #0, libk_video, from 'video=test': Duration: N/A, start: 0.033333, bitrate: 746496 kb/s Stream #0:0: Video: rawvideo (NV12 / 0x3231564E), nv12, 1920x1080, 746496 kb/s, 30 tbr, 30 tbn, 30 tbc Guessed Channel Layout for Input Stream #1.0 : stereo Input #1, alsa, from 'hw:0': Duration: N/A, start: 1650597527.574041, bitrate: 1024 kb/s Stream #1:0: Audio: pcm_s16le, 32000 Hz, stereo, s16, 1024 kb/s Stream mapping: Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libk_h264)) Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native)) Press [q] to stop, [?] for help alloc_memory>phy_addr 0x17ccb000, size 6221824 k_h264_encode_init>yuv_vAddr 0x2007338000, yuv_phyAddr 0x17ccb000, yuv_size 6220800 Encoder Settings: width : 1920 height : 1080 level : 42 profile : 2 FreqIDR : 25 gopLen : 25 FrameRate : 30 rcMode : 1 SliceQP : 25 bitrate : 4000000 maxbitrate : 4000000 AL_ShareMemAlloc_Create>fd_ddr 0xe, fd_share_memory 0xd ---- FPGA board is ready ---- Board UID : 30AB6E51 Board HW ID : 620000E0 Board rev. 
: DC4054E7 Board date : 20191115 ----------------------------- Create_OutBuffers>count 4, size 3172352 VideoEncoder_Create>ok, hEnc 0x19c6eb0 pic: format 23, linesize 1920, 1920, 0, pts 0 pic data 0x1d255800, 0x1d44fc00, (nil), (nil), (nil), (nil), (nil), (nil) [alsa @ 0x19aa580] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8) Output #0, rtsp, to 'rtsp://10.20.1.55:5544/xxxx/com28': Metadata: encoder : Lavf58.76.100 Stream #0:0: Video: h264, nv12(progressive), 1920x1080, q=2-31, 30 fps, 90k tbn Metadata: encoder : Lavc58.134.100 libk_h264 Stream #0:1: Audio: aac (LC), 32000 Hz, stereo, fltp, 128 kb/s Metadata: encoder : Lavc58.134.100 aac [libk_video @ 0x19a63d0] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8) av_interleaved_write_frame(): Broken pipe00:00:15.60 bitrate=N/A speed=0.995x Receive NULL pic Last message repeated 1 times Error writing trailer of rtsp://10.20.1.55:5544/xxxx/com28: Broken pipe frame= 482 fps= 30 q=-0.0 Lsize=N/A time=00:00:16.09 bitrate=N/A speed=0.994x video:7870kB audio:253kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown k_h264_encode_close> VideoEncoder_Destroy>ch 0 ok [aac @ 0x19c1ec0] Qavg: 161.536 QoS restore k_video_read_close> Conversion failed! Then I pull the stream in VLC. Pushing and pulling a single stream is fairly stable, but with multiple streams, e.g. pushing and pulling 3 streams at the same time, the Broken pipe error above shows up after a while. The deployment is started directly with docker. Did I do something wrong somewhere?
1.0
Broken pipe when pushing and pulling multiple streams at the same time - Push the stream with the following command: ffmpeg -f libk_video -wh 1920x1080 -sensor 3 -i video="test " -f alsa -ac 2 -ar 32000 -i hw:0 -idr_freq 25 -vcodec libk_h264 -acodec aac -f rtsp rtsp://10.20.1.55:5544/xxxx/com28 ffmpeg version 4.4 Copyright (c) 2000-2021 the FFmpeg developers built with gcc 7.3.0 (2019-11-20_nds64le-linux-glibc-v5d-6c120106e03) configuration: --cross-prefix=riscv64-linux- --enable-cross-compile --target-os=linux --cc=riscv64-linux-gcc --arch=riscv64 --extra-ldflags=-L./ --extra-ldflags=-ldl --extra-ldflags='-Wl,-rpath .' --enable-static --enable-libk_video --enable-libk_h264 --enable-libk_jpeg --enable-alsa --disable-autodetect --disable-ffplay --disable-ffprobe --disable-doc --enable-audio3a --enable-indev=v4l2 libavutil 56. 70.100 / 56. 70.100 libavcodec 58.134.100 / 58.134.100 libavformat 58. 76.100 / 58. 76.100 libavdevice 58. 13.100 / 58. 13.100 libavfilter 7.110.100 / 7.110.100 libswscale 5. 9.100 / 5. 9.100 libswresample 3. 9.100 / 3. 9.100 0x970E00f4: from 0x00550000 to 0x00110000 0x970E00f8: from 0x00000000 to 0x00770000 0x970E00fc: from 0x0fffff00 to 0x0fffff00 0x99900290: from 0x00000133 to 0x00000110 0x9990028c: from 0x00000001 to 0x00000000 0x9990038c: from 0x00000003 to 0x00000000 0x99900388: from 0x80000501 to 0x80000707 0x98000504: from 0x0001ffff to 0x00010303 k_video_read_header>w 1920, h 1080, stride = 1920 alloc_memory>phy_addr 0x1aebd000, size 68431872 k_video_read_header>isp_buf_paddr 0x1aebd000, isp_buf_vaddr 0x565000, isp_buf_size 68428800 isp_video ds0 block alloc:0x1aabc000,size:4194304,align 4096 ds0_out_addr =0x1aabc000 twod block alloc:0x182bb000,size:41943040,align 4096 isp_info.ds1_addr is 1aebd000 isp_info.ds1_stride is 780 video_set_mipicsi start! set_vi_params set_isp_params run_video!! Open struct isp_device addr = 0x25c0b0,vi = 0x25c150,mmio_base = 0x92620700 [ 18.682787] mipi_coner_init done, pvt code 0x1ffff [ 19.126175] lcd init ok lcd is 1 ---------------------------------!!!
[ 19.133179] mipi_dsi_init done [ 19.136259] mipi_rx_dphy_init done! Call cmd ISP_CMD_MIPI_DSI_INIT s[ 19.139902] isp_act_sensor_rst success uccss system is ready to Release rst_n system rst_n has all release system rst_n has all release i2c_num = 0 imx219_i2c_init config done Isp_f2k_Init start Isp_f2k_Init end video_in_Init start vi_wrap_config start struct isp_device = 0x25c0b0,vi= 0x25c150 vi_wrap_rst struct isp_device 0x25c0b0,VI_WRAP_SWRST_CTL = 0x34f vi_wrap_config end video_in_Init end Imx219_1080p30_init reg is 100 val is 1 reg is 30eb val is 0 reg is 30eb val is 0 reg is 300a val is ff reg is 300b val is ff reg is 30eb val is 0 reg is 30eb val is 0 reg is 114 val is 1 reg is 128 val is 0 reg is 12a val is 18 reg is 12b val is 0 reg is 160 val is 4 reg is 161 val is 8e reg is 162 val is d reg is 163 val is 94 reg is 164 val is 2 reg is 165 val is a8 reg is 166 val is a reg is 167 val is 27 reg is 168 val is 2 reg is 169 val is b4 reg is 16a val is 6 reg is 16b val is eb reg is 16c val is 7 reg is 16d val is 80 reg is 16e val is 4 reg is 16f val is 38 reg is 170 val is 1 reg is 171 val is 1 reg is 174 val is 0 reg is 175 val is 0 reg is 301 val is 5 reg is 303 val is 1 reg is 304 val is 3 reg is 305 val is 3 reg is 306 val is 0 reg is 307 val is 26 reg is 30b val is 1 reg is 30c val is 0 reg is 30d val is 30 reg is 624 val is 7 reg is 625 val is 80 reg is 626 val is 4 reg is 627 val is 38 reg is 455e val is 0 reg is 471e val is 0 reg is 4767 val is 0 reg is 4750 val is 0 reg is 4540 val is 0 reg is 47b4 val is 0 reg is 4713 val is 0 reg is 478b val is 0 reg is 478f val is 0 reg is 4793 val is 0 reg is 4797 val is 0 reg is 479b val is 0 reg is 157 val is 40 reg is 158 val is 1 reg is 159 val is 0 reg is 15a val is 3 reg is 15b val is e8 reg is 100 val is 1 isp_f2k_core_table_init start! Isp2K RGB Gamma TABLE config done! Isp2K YUV Gamma TABLE config done! VO VCoef Config done![ 23.393613] plat->ds1_addr is1aebd000 VO HCoef Config done! 
VO GA[ 23.397728] plat->ds1_buf_cut is 14 MMA Coef Config done! Call cmd [ 23.404124] ISP_CMD_SET_DS1_SIZE fram_uv_addr is 1fa400 ISP_CMD_DS1_ADDR succss Call cmd ISP_CMD_DS1_BUFF_COUNT succss Call cmd ISP_CMD_SET_DS1_SIZE succss k_video_read_header>fd_isp: 0x7 isp_ouput Input #0, libk_video, from 'video=test': Duration: N/A, start: 0.033333, bitrate: 746496 kb/s Stream #0:0: Video: rawvideo (NV12 / 0x3231564E), nv12, 1920x1080, 746496 kb/s, 30 tbr, 30 tbn, 30 tbc Guessed Channel Layout for Input Stream #1.0 : stereo Input #1, alsa, from 'hw:0': Duration: N/A, start: 1650597527.574041, bitrate: 1024 kb/s Stream #1:0: Audio: pcm_s16le, 32000 Hz, stereo, s16, 1024 kb/s Stream mapping: Stream #0:0 -> #0:0 (rawvideo (native) -> h264 (libk_h264)) Stream #1:0 -> #0:1 (pcm_s16le (native) -> aac (native)) Press [q] to stop, [?] for help alloc_memory>phy_addr 0x17ccb000, size 6221824 k_h264_encode_init>yuv_vAddr 0x2007338000, yuv_phyAddr 0x17ccb000, yuv_size 6220800 Encoder Settings: width : 1920 height : 1080 level : 42 profile : 2 FreqIDR : 25 gopLen : 25 FrameRate : 30 rcMode : 1 SliceQP : 25 bitrate : 4000000 maxbitrate : 4000000 AL_ShareMemAlloc_Create>fd_ddr 0xe, fd_share_memory 0xd ---- FPGA board is ready ---- Board UID : 30AB6E51 Board HW ID : 620000E0 Board rev. 
: DC4054E7 Board date : 20191115 ----------------------------- Create_OutBuffers>count 4, size 3172352 VideoEncoder_Create>ok, hEnc 0x19c6eb0 pic: format 23, linesize 1920, 1920, 0, pts 0 pic data 0x1d255800, 0x1d44fc00, (nil), (nil), (nil), (nil), (nil), (nil) [alsa @ 0x19aa580] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8) Output #0, rtsp, to 'rtsp://10.20.1.55:5544/xxxx/com28': Metadata: encoder : Lavf58.76.100 Stream #0:0: Video: h264, nv12(progressive), 1920x1080, q=2-31, 30 fps, 90k tbn Metadata: encoder : Lavc58.134.100 libk_h264 Stream #0:1: Audio: aac (LC), 32000 Hz, stereo, fltp, 128 kb/s Metadata: encoder : Lavc58.134.100 aac [libk_video @ 0x19a63d0] Thread message queue blocking; consider raising the thread_queue_size option (current value: 8) av_interleaved_write_frame(): Broken pipe00:00:15.60 bitrate=N/A speed=0.995x Receive NULL pic Last message repeated 1 times Error writing trailer of rtsp://10.20.1.55:5544/xxxx/com28: Broken pipe frame= 482 fps= 30 q=-0.0 Lsize=N/A time=00:00:16.09 bitrate=N/A speed=0.994x video:7870kB audio:253kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: unknown k_h264_encode_close> VideoEncoder_Destroy>ch 0 ok [aac @ 0x19c1ec0] Qavg: 161.536 QoS restore k_video_read_close> Conversion failed! Then I pull the stream in VLC. Pushing and pulling a single stream is fairly stable, but with multiple streams, e.g. pushing and pulling 3 streams at the same time, the Broken pipe error above shows up after a while. The deployment is started directly with docker. Did I do something wrong somewhere?
process
多路同时推、拉流,报broken pipe 用如下命令推流: ffmpeg f libk video wh sensor i video test f alsa ac ar i hw idr freq vcodec libk acodec aac f rtsp rtsp xxxx ffmpeg version copyright c the ffmpeg developers built with gcc linux glibc configuration cross prefix linux enable cross compile target os linux cc linux gcc arch extra ldflags l extra ldflags ldl extra ldflags wl rpath enable static enable libk video enable libk enable libk jpeg enable alsa disable autodetect disable ffplay disable ffprobe disable doc enable enable indev libavutil libavcodec libavformat libavdevice libavfilter libswscale libswresample from to from to from to from to from to from to from to from to k video read header w h stride alloc memory phy addr size k video read header isp buf paddr isp buf vaddr isp buf size isp video block alloc size align out addr twod block alloc size align isp info addr is isp info stride is video set mipicsi start set vi params set isp params run video open struct isp device addr vi mmio base mipi coner init done pvt code lcd init ok lcd is mipi dsi init done mipi rx dphy init done call cmd isp cmd mipi dsi init s isp act sensor rst success uccss system is ready to release rst n system rst n has all release system rst n has all release num init config done isp init start isp init end video in init start vi wrap config start struct isp device vi vi wrap rst struct isp device vi wrap swrst ctl vi wrap config end video in init end init reg is val is reg is val is reg is val is reg is val is ff reg is val is ff reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is d reg is val is reg is val is reg is val is reg is val is a reg is val is reg is val is reg is val is reg is val is reg is val is eb reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is 
reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is reg is val is isp core table init start! rgb gamma table config done yuv gamma table config done vo vcoef config done plat addr vo hcoef config done vo ga plat buf cut is mma coef config done call cmd isp cmd set size fram uv addr is isp cmd addr succss call cmd isp cmd buff count succss call cmd isp cmd set size succss k video read header fd isp isp ouput input libk video from video test duration n a start bitrate kb s stream video rawvideo kb s tbr tbn tbc guessed channel layout for input stream stereo input alsa from hw duration n a start bitrate kb s stream audio pcm hz stereo kb s stream mapping stream rawvideo native libk stream pcm native aac native press to stop for help alloc memory phy addr size k encode init yuv vaddr yuv phyaddr yuv size encoder settings width height level profile freqidr goplen framerate rcmode sliceqp bitrate maxbitrate al sharememalloc create fd ddr fd share memory fpga board is ready board uid board hw id board rev board date create outbuffers count size videoencoder create ok henc pic format linesize pts pic data nil nil nil nil nil nil thread message queue blocking consider raising the thread queue size option current value output rtsp to rtsp xxxx metadata encoder stream video progressive q fps tbn metadata encoder libk stream audio aac lc hz stereo fltp kb s metadata encoder aac thread message queue blocking consider raising the thread queue size option current value av interleaved write frame broken bitrate n a speed receive null pic last message repeated times error writing trailer of rtsp xxxx broken pipe frame fps q lsize n a time bitrate n a speed video audio subtitle other streams global headers muxing overhead unknown k 
encode close videoencoder destroy ch ok qavg qos restore k video read close conversion failed 然后在vlc里拉流,单独推、拉,还是比较稳定的,如果多路,比如同时推、 路,过一会就会报上面的 broken pipe 部署是直接用docker启动的。 是不是我哪里做错了?
1
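The ffmpeg record above ends with `av_interleaved_write_frame(): Broken pipe` once several streams are pushed and pulled at once. Nothing in the log pins down the root cause, so the sketch below only shows a common workaround, under the assumption that the RTSP server is dropping connections under load: re-launch the push command whenever it exits non-zero. `run_with_restart` is an illustrative helper, not part of ffmpeg or the libk_* stack.

```python
import subprocess
import time

def run_with_restart(cmd, max_attempts=5, delay_s=2.0):
    """Re-launch cmd whenever it exits non-zero (e.g. after a Broken pipe).

    Returns True if a run finished with exit code 0, False once the
    attempt budget is exhausted.
    """
    for attempt in range(1, max_attempts + 1):
        if subprocess.call(cmd) == 0:
            return True
        if attempt < max_attempts:
            time.sleep(delay_s)
    return False

# Hypothetical usage: pass the full push command from the report as a list,
# e.g. ["ffmpeg", "-f", "libk_video", <remaining options>,
#       "rtsp://10.20.1.55:5544/xxxx/com28"]
```

This masks the symptom rather than fixing whatever closes the socket on the server side, but it keeps a multi-stream deployment limping along while the real cause is investigated.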
9,100
12,178,609,277
IssuesEvent
2020-04-28 09:17:06
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Loss of contour data when using 'Rasterize'
Bug Feedback Processing
**Describe the bug** A shapefile showing contour data was produced. This showed the legend for the contour lines and also the zero/datum line (zero). When rasterized the zero datum line was missing, also the legend for the contour lines. **How to Reproduce** Add the spatialite database to the project. The connection to the data has been established. 'Right-Click' on the 'soundings' layer to save a Shapefile. 'Export' 'Save Features As...' 'Right-Click' again on the Shapefile. 'Zoom to Layer' shows you the location of the soundings. This demonstrates the density of soundings required if contouring is going to be successful. The survey lines are about 10 metres apart. Select 'Vector','Contour' from the menu at the top of the screen. The input is the shapefile of soundings. 'depth' is the data value. If you go through the list of options and set things up correctly QGIS will work out min/max depth and you can choose a suitable contour interval. I needed to play with the number of contours but eventually a list appeared in the text box. It is possible to apply colours but probably better to use a single colour. The contours appear. To make the contours display better 'Right-Click'. Black will show the contours best. We now add labels to the contours. 'Properties' The shapefile is in vector format. It has to be converted to raster before it can be exported as a geotiff. **Raster output is shown. Some of the contours are faded but that will not affect their display. Unfortunately contour labels disappear. Also the datum line (0)** ![13](https://user-images.githubusercontent.com/2537240/80456878-3ccc3580-8926-11ea-912f-31f90fd136cd.png) ![19](https://user-images.githubusercontent.com/2537240/80456877-3ccc3580-8926-11ea-9be6-8927d1fba3e3.png) **QGIS and OS versions** ![qgis](https://user-images.githubusercontent.com/2537240/80457575-718cbc80-8927-11ea-8b5e-66c89d777390.png) **Additional context** Shapefile before rasterizing is here ... 
(Remove the .pdf) [dandy.hole.shp.pdf](https://github.com/qgis/QGIS/files/4543946/dandy.hole.shp.pdf) Brilliant program.
1.0
Loss of contour data when using 'Rasterize' - **Describe the bug** A shapefile showing contour data was produced. This showed the legend for the contour lines and also the zero/datum line (zero). When rasterized the zero datum line was missing, also the legend for the contour lines. **How to Reproduce** Add the spatialite database to the project. The connection to the data has been established. 'Right-Click' on the 'soundings' layer to save a Shapefile. 'Export' 'Save Features As...' 'Right-Click' again on the Shapefile. 'Zoom to Layer' shows you the location of the soundings. This demonstrates the density of soundings required if contouring is going to be successful. The survey lines are about 10 metres apart. Select 'Vector','Contour' from the menu at the top of the screen. The input is the shapefile of soundings. 'depth' is the data value. If you go through the list of options and set things up correctly QGIS will work out min/max depth and you can choose a suitable contour interval. I needed to play with the number of contours but eventually a list appeared in the text box. It is possible to apply colours but probably better to use a single colour. The contours appear. To make the contours display better 'Right-Click'. Black will show the contours best. We now add labels to the contours. 'Properties' The shapefile is in vector format. It has to be converted to raster before it can be exported as a geotiff. **Raster output is shown. Some of the contours are faded but that will not affect their display. Unfortunately contour labels disappear. 
Also the datum line (0)** ![13](https://user-images.githubusercontent.com/2537240/80456878-3ccc3580-8926-11ea-912f-31f90fd136cd.png) ![19](https://user-images.githubusercontent.com/2537240/80456877-3ccc3580-8926-11ea-9be6-8927d1fba3e3.png) **QGIS and OS versions** ![qgis](https://user-images.githubusercontent.com/2537240/80457575-718cbc80-8927-11ea-8b5e-66c89d777390.png) **Additional context** Shapefile before rasterizing is here ... (Remove the .pdf) [dandy.hole.shp.pdf](https://github.com/qgis/QGIS/files/4543946/dandy.hole.shp.pdf) Brilliant program.
process
loss of contour data when using rasterize describe the bug a shapefile showing contour data was produced this showed the legend for the contour lines and also the zero datum line zero when rasterized the zero datum line was missing also the legend for the contour lines how to reproduce add the spatialite database to the project the connection to the data has been established right click on the soundings layer to save a shapefile export save features as right click again on the shapefile zoom to layer shows you the location of the soundings this demonstrates the density of soundings required if contouring is going to be successful the survey lines are about metres apart select vector contour from the menu at the top of the screen the input is the shapefile of soundings depth is the data value if you go through the list of options and set things up correctly qgis will work out min max depth and you can choose a suitable contour interval i needed to play with the number of contours but eventually a list appeared in the text box it is possible to apply colours but probably better to use a single colour the contours appear to make the contours display better right click black will show the contours best we now add labels to the contours properties the shapefile is in vector format it has to be converted to raster before it can be exported as a geotiff raster output is shown some of the contours are faded but that will not affect their display unfortunately contour labels disappear also the datum line qgis and os versions additional context shapefile before rasterizing is here remove the pdf brilliant program
1
12,634
15,016,548,759
IssuesEvent
2021-02-01 09:43:54
threefoldtech/js-sdk
https://api.github.com/repos/threefoldtech/js-sdk
closed
when kubeapps deployed, it's showing triple deployments?
process_duplicate process_wontfix type_question
<img width="1363" alt="Screenshot 2021-01-28 at 11 12 19" src="https://user-images.githubusercontent.com/43240801/106123184-e82a4500-6159-11eb-89df-ea7a99352b4b.png"> is it because apps are deployed inside kubeapps? <img width="1081" alt="Screenshot 2021-01-28 at 11 13 13" src="https://user-images.githubusercontent.com/43240801/106123152-dcd71980-6159-11eb-906f-2ded7013ec4d.png">
2.0
when kubeapps deployed, it's showing triple deployments? - <img width="1363" alt="Screenshot 2021-01-28 at 11 12 19" src="https://user-images.githubusercontent.com/43240801/106123184-e82a4500-6159-11eb-89df-ea7a99352b4b.png"> is it because apps are deployed inside kubeapps? <img width="1081" alt="Screenshot 2021-01-28 at 11 13 13" src="https://user-images.githubusercontent.com/43240801/106123152-dcd71980-6159-11eb-906f-2ded7013ec4d.png">
process
when kubeapps deployed its showing triple deployments img width alt screenshot at src is it because apps are deployed inside kubeapps img width alt screenshot at src
1
101,985
12,735,968,750
IssuesEvent
2020-06-25 16:07:30
peopledoc/procrastinate
https://api.github.com/repos/peopledoc/procrastinate
reopened
Handle closed connections gracefully
Contains: Exploration & Design decisions Contains: Only Python Type: Bug
On a defer operation, if the db connection obtained from the pool is actually closed (e.g. because of a call to `pg_terminate_backend` on the Postgres side), the defer operation will fail with this error: ``` File ".../lib/python3.8/site-packages/aiopg/connection.py", line 106, in _ready state = self._conn.poll() psycopg2.OperationalError: server closed the connection unexpectedly This probably means the server terminated abnormally before or while processing the request. ``` The next defer operation will succeed though, probably because aiopg didn't add the (closed) connection that caused the exception back to the pool. I am wondering if Procrastinate should detect these cases and raise a specific exception, to give the application an opportunity to handle this as it sees fit. A similar problem may exist on the worker side as well.
1.0
Handle closed connections gracefully - On a defer operation, if the db connection obtained from the pool is actually closed (e.g. because of a call to `pg_terminate_backend` on the Postgres side), the defer operation will fail with this error: ``` File ".../lib/python3.8/site-packages/aiopg/connection.py", line 106, in _ready state = self._conn.poll() psycopg2.OperationalError: server closed the connection unexpectedly This probably means the server terminated abnormally before or while processing the request. ``` The next defer operation will succeed though, probably because aiopg didn't add the (closed) connection that caused the exception back to the pool. I am wondering if Procrastinate should detect these cases and raise a specific exception, to give the application an opportunity to handle this as it sees fit. A similar problem may exist on the worker side as well.
non_process
handle closed connections gracefully on a defer operation if the db connection obtained from the pool is actually closed e g because of a call to pg terminate backend on the postgres side the defer operation will fail with this error file lib site packages aiopg connection py line in ready state self conn poll operationalerror server closed the connection unexpectedly this probably means the server terminated abnormally before or while processing the request the next defer operation will succeed though probably because aiopg didn t add the closed connection that caused the exception back to the pool i am wondering if procrastinate should detect these cases and raise a specific exception to give the application an opportunity to handle this as it sees fit a similar problem may exist on the worker side as well
0
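The Procrastinate record above asks for exactly the pattern an application can already apply by hand: catch the closed-connection error, discard that connection, and retry the defer with a fresh one from the pool. A minimal sketch follows; `ConnectionClosedError`, `get_conn`, and `with_retry` are hypothetical stand-ins (in real code the error would be `psycopg2.OperationalError` and the connections would come from aiopg's pool), not part of Procrastinate's API.

```python
class ConnectionClosedError(Exception):
    """Stand-in for psycopg2.OperationalError on a dead pooled connection."""

def with_retry(get_conn, op, retries=1):
    """Run op(conn); on a closed-connection error, fetch a fresh connection
    and retry.

    get_conn: callable returning a (possibly stale) connection from the pool.
    op: callable doing the actual work, e.g. the defer INSERT.
    """
    for attempt in range(retries + 1):
        conn = get_conn()
        try:
            return op(conn)
        except ConnectionClosedError:
            # The pool handed us a connection the server had already closed
            # (e.g. after pg_terminate_backend); drop it and try again.
            if attempt == retries:
                raise
```

This mirrors what the reporter observed: the first defer fails on the stale connection, the pool does not return it, and the next attempt succeeds on a fresh one.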
14,893
18,289,701,591
IssuesEvent
2021-10-05 14:04:40
jessestewart1/nrn-rrn
https://api.github.com/repos/jessestewart1/nrn-rrn
closed
Process NS 2021
complete processing
**Description of tasks** Process NS 2021 data for release as an NRN product. - [x] update field mapping yaml(s) - [x] process NS 2021 data - [x] update release notes and sphinx documentation - [x] copy updated yamls to `src/stage_5/distribution_docs` - [x] copy updated rsts to `docs/source` - [x] build updated Sphinx documentation via command: `sphinx-build -b html nrn-rrn/docs/source nrn-rrn/docs/_build` - [x] copy data to server - [x] confirm WMS updates and publication to Open Maps
1.0
Process NS 2021 - **Description of tasks** Process NS 2021 data for release as an NRN product. - [x] update field mapping yaml(s) - [x] process NS 2021 data - [x] update release notes and sphinx documentation - [x] copy updated yamls to `src/stage_5/distribution_docs` - [x] copy updated rsts to `docs/source` - [x] build updated Sphinx documentation via command: `sphinx-build -b html nrn-rrn/docs/source nrn-rrn/docs/_build` - [x] copy data to server - [x] confirm WMS updates and publication to Open Maps
process
process ns description of tasks process ns data for release as an nrn product update field mapping yaml s process ns data update release notes and sphinx documentation copy updated yamls to src stage distribution docs copy updated rsts to docs source build updated sphinx documentation via command sphinx build b html nrn rrn docs source nrn rrn docs build copy data to server confirm wms updates and publication to open maps
1
43
2,507,997,305
IssuesEvent
2015-01-12 22:14:46
bcambel/hackersome
https://api.github.com/repos/bcambel/hackersome
opened
Index Project Names, Languages
Batch Processing ElasticSearch
into ElasticSearch During the Batch Processing, index/submit all the into ElasticSearch
1.0
Index Project Names, Languages - into ElasticSearch During the Batch Processing, index/submit all the into ElasticSearch
process
index project names languages into elasticsearch during the batch processing index submit all the into elasticsearch
1
2,268
5,102,447,410
IssuesEvent
2017-01-04 18:20:30
LazyTroll/WikiCode
https://api.github.com/repos/LazyTroll/WikiCode
closed
Replacing WikiTree with the new WikiFileTree module
introduction process task
Replace the entire old file-tree module with the new one. Keep the whole frontend as it is for now.
1.0
Replacing WikiTree with the new WikiFileTree module - Replace the entire old file-tree module with the new one. Keep the whole frontend as it is for now.
process
замена wikitree на новый модуль wikifiletree заменить весь старый модуль работы с файловым деревом на новый весь фронтенд пока оставить прежним
1
4,207
7,166,281,010
IssuesEvent
2018-01-29 16:46:05
GeographicaGS/AquaGIS
https://api.github.com/repos/GeographicaGS/AquaGIS
closed
Document the leak-detection engine
processing
The leak-detection engine that was developed needs to be documented so that it can be used in the justification.
1.0
Document the leak-detection engine - The leak-detection engine that was developed needs to be documented so that it can be used in the justification.
process
documentar motor de detección de fugas es necesario documentar el motor de detección de fugas desarrollado para poder usarlo en la justificación
1
123,200
17,772,186,711
IssuesEvent
2021-08-30 14:50:04
kapseliboi/zotero
https://api.github.com/repos/kapseliboi/zotero
opened
CVE-2020-11022 (Medium) detected in jquery-2.0.2.min.js, jquery-1.7.1.min.js
security vulnerability
## CVE-2020-11022 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-2.0.2.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary> <p> <details><summary><b>jquery-2.0.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js</a></p> <p>Path to dependency file: zotero/node_modules/ace-builds/demo/whitespace .html</p> <p>Path to vulnerable library: /node_modules/ace-builds/demo/whitespace .html</p> <p> Dependency Hierarchy: - :x: **jquery-2.0.2.min.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.7.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p> <p>Path to dependency file: zotero/node_modules/vm-browserify/example/run/index.html</p> <p>Path to vulnerable library: /node_modules/vm-browserify/example/run/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/kapseliboi/zotero/commit/3cf45924442ab2512d5d382270f6e567cbbf4be9">3cf45924442ab2512d5d382270f6e567cbbf4be9</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. 
This problem is patched in jQuery 3.5.0. <p>Publish Date: 2020-04-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jQuery - 3.5.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-11022 (Medium) detected in jquery-2.0.2.min.js, jquery-1.7.1.min.js - ## CVE-2020-11022 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jquery-2.0.2.min.js</b>, <b>jquery-1.7.1.min.js</b></p></summary> <p> <details><summary><b>jquery-2.0.2.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/2.0.2/jquery.min.js</a></p> <p>Path to dependency file: zotero/node_modules/ace-builds/demo/whitespace .html</p> <p>Path to vulnerable library: /node_modules/ace-builds/demo/whitespace .html</p> <p> Dependency Hierarchy: - :x: **jquery-2.0.2.min.js** (Vulnerable Library) </details> <details><summary><b>jquery-1.7.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.7.1/jquery.min.js</a></p> <p>Path to dependency file: zotero/node_modules/vm-browserify/example/run/index.html</p> <p>Path to vulnerable library: /node_modules/vm-browserify/example/run/index.html</p> <p> Dependency Hierarchy: - :x: **jquery-1.7.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/kapseliboi/zotero/commit/3cf45924442ab2512d5d382270f6e567cbbf4be9">3cf45924442ab2512d5d382270f6e567cbbf4be9</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In jQuery versions greater than or equal to 1.2 and before 3.5.0, passing HTML from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. 
.html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0. <p>Publish Date: 2020-04-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11022>CVE-2020-11022</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/">https://blog.jquery.com/2020/04/10/jquery-3-5-0-released/</a></p> <p>Release Date: 2020-04-29</p> <p>Fix Resolution: jQuery - 3.5.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in jquery min js jquery min js cve medium severity vulnerability vulnerable libraries jquery min js jquery min js jquery min js javascript library for dom operations library home page a href path to dependency file zotero node modules ace builds demo whitespace html path to vulnerable library node modules ace builds demo whitespace html dependency hierarchy x jquery min js vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file zotero node modules vm browserify example run index html path to vulnerable library node modules vm browserify example run index html dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details in jquery versions greater than or equal to and before passing html from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource
0
4,573
7,397,684,833
IssuesEvent
2018-03-19 00:46:03
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
"Azure Scheduler jobs" link
app-service assigned-to-author doc-bug in-process triaged
"Azure Scheduler jobs" link at the start of the document isn't pointing anywhere. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 7c1407af-81a5-52e7-a800-79d57704a876 * Version Independent ID: 5ec30406-731f-6198-99a1-9d4ee5c67475 * Content: [Develop and deploy WebJobs using Visual Studio - Azure](https://docs.microsoft.com/en-us/azure/app-service/websites-dotnet-deploy-webjobs#scheduler) * Content Source: [articles/app-service/websites-dotnet-deploy-webjobs.md](https://github.com/Microsoft/azure-docs/blob/master/articles/app-service/websites-dotnet-deploy-webjobs.md) * Service: **app-service** * GitHub Login: @ggailey777 * Microsoft Alias: **glenga;david.ebbo;suwatch;pbatum;naren.soni**
1.0
"Azure Scheduler jobs" link - "Azure Scheduler jobs" link at the start of the document isn't pointing anywhere. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 7c1407af-81a5-52e7-a800-79d57704a876 * Version Independent ID: 5ec30406-731f-6198-99a1-9d4ee5c67475 * Content: [Develop and deploy WebJobs using Visual Studio - Azure](https://docs.microsoft.com/en-us/azure/app-service/websites-dotnet-deploy-webjobs#scheduler) * Content Source: [articles/app-service/websites-dotnet-deploy-webjobs.md](https://github.com/Microsoft/azure-docs/blob/master/articles/app-service/websites-dotnet-deploy-webjobs.md) * Service: **app-service** * GitHub Login: @ggailey777 * Microsoft Alias: **glenga;david.ebbo;suwatch;pbatum;naren.soni**
process
azure scheduler jobs link azure scheduler jobs link at the start of the document isn t pointing anywhere document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service app service github login microsoft alias glenga david ebbo suwatch pbatum naren soni
1
1,955
4,774,676,164
IssuesEvent
2016-10-27 07:46:04
nodejs/node
https://api.github.com/repos/nodejs/node
opened
child_process: add public API to unref ipc channel
child_process feature request
* **Version**: all * **Platform**: n/a * **Subsystem**: child_process It would be nice to have a public API to `unref()` a child process's ipc channel. My use case for this is that I spawn a child process, send some messages back and forth via ipc, then at some point I want to detach the child process. `child.unref()` is not enough because that only unrefs the C++ ProcessWrap handle. Currently I have to resort to also doing `child._channel.unref()`, but I do not like using/relying on underscore-prefixed properties. I am not sure if this should be done automatically with `child.unref()` or if it should be a separate function or what.
1.0
child_process: add public API to unref ipc channel - * **Version**: all * **Platform**: n/a * **Subsystem**: child_process It would be nice to have a public API to `unref()` a child process's ipc channel. My use case for this is that I spawn a child process, send some messages back and forth via ipc, then at some point I want to detach the child process. `child.unref()` is not enough because that only unrefs the C++ ProcessWrap handle. Currently I have to resort to also doing `child._channel.unref()`, but I do not like using/relying on underscore-prefixed properties. I am not sure if this should be done automatically with `child.unref()` or if it should be a separate function or what.
process
child process add public api to unref ipc channel version all platform n a subsystem child process it would be nice to have a public api to unref a child process s ipc channel my use case for this is that i spawn a child process send some messages back and forth via ipc then at some point i want to detach the child process child unref is not enough because that only unrefs the c processwrap handle currently i have to resort to also doing child channel unref but i do not like using relying on underscore prefixed properties i am not sure if this should be done automatically with child unref or if it should be a separate function or what
1
15,746
19,911,531,117
IssuesEvent
2022-01-25 17:38:38
input-output-hk/high-assurance-legacy
https://api.github.com/repos/input-output-hk/high-assurance-legacy
closed
Reflect the renaming of our process calculus in the code base
language: isabelle language: haskell topic: process calculus type: improvement
We recently changed the name of our process calculus from “χ-calculus” to “♮-calculus”, because there is already another process calculus named “χ-calculus”. Our goal is to change the source code, file names, and documentation in both the Haskell and the Isabelle part of the code base to reflect this change.
1.0
Reflect the renaming of our process calculus in the code base - We recently changed the name of our process calculus from “χ-calculus” to “♮-calculus”, because there is already another process calculus named “χ-calculus”. Our goal is to change the source code, file names, and documentation in both the Haskell and the Isabelle part of the code base to reflect this change.
process
reflect the renaming of our process calculus in the code base we recently changed the name of our process calculus from “χ calculus” to “♮ calculus” because there is already another process calculus named “χ calculus” our goal is to change the source code file names and documentation in both the haskell and the isabelle part of the code base to reflect this change
1
20,995
27,860,127,461
IssuesEvent
2023-03-21 05:06:24
googleapis/google-cloud-node
https://api.github.com/repos/googleapis/google-cloud-node
closed
Your .repo-metadata.json files have a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json files: Result of scan 📈: * api_shortname '{{name}}' invalid in packages/gapic-node-templating/templates/bootstrap-templates/.repo-metadata.json * api_shortname 'clouddms' invalid in packages/google-cloud-clouddms/.repo-metadata.json * release_level must be equal to one of the allowed values in packages/google-cloud-contentwarehouse/.repo-metadata.json * api_shortname field missing from packages/google-cloud-dataform/.repo-metadata.json * api_shortname 'filestore' invalid in packages/google-cloud-filestore/.repo-metadata.json * api_shortname field missing from packages/google-cloud-run/.repo-metadata.json * api_shortname field missing from packages/google-cloud-security-publicca/.repo-metadata.json * api_shortname field missing from packages/google-cloud-video-stitcher/.repo-metadata.json * api_shortname 'routing' invalid in packages/google-maps-routing/.repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json files have a problem 🤒 - You have a problem with your .repo-metadata.json files: Result of scan 📈: * api_shortname '{{name}}' invalid in packages/gapic-node-templating/templates/bootstrap-templates/.repo-metadata.json * api_shortname 'clouddms' invalid in packages/google-cloud-clouddms/.repo-metadata.json * release_level must be equal to one of the allowed values in packages/google-cloud-contentwarehouse/.repo-metadata.json * api_shortname field missing from packages/google-cloud-dataform/.repo-metadata.json * api_shortname 'filestore' invalid in packages/google-cloud-filestore/.repo-metadata.json * api_shortname field missing from packages/google-cloud-run/.repo-metadata.json * api_shortname field missing from packages/google-cloud-security-publicca/.repo-metadata.json * api_shortname field missing from packages/google-cloud-video-stitcher/.repo-metadata.json * api_shortname 'routing' invalid in packages/google-maps-routing/.repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json files have a problem 🤒 you have a problem with your repo metadata json files result of scan 📈 api shortname name invalid in packages gapic node templating templates bootstrap templates repo metadata json api shortname clouddms invalid in packages google cloud clouddms repo metadata json release level must be equal to one of the allowed values in packages google cloud contentwarehouse repo metadata json api shortname field missing from packages google cloud dataform repo metadata json api shortname filestore invalid in packages google cloud filestore repo metadata json api shortname field missing from packages google cloud run repo metadata json api shortname field missing from packages google cloud security publicca repo metadata json api shortname field missing from packages google cloud video stitcher repo metadata json api shortname routing invalid in packages google maps routing repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
1
336,897
30,226,328,471
IssuesEvent
2023-07-06 01:02:26
opensearch-project/OpenSearch
https://api.github.com/repos/opensearch-project/OpenSearch
closed
[BUG] org.opensearch.remotestore.SegmentReplicationRemoteStoreIT.testPressureServiceStats is flaky
bug durability flaky-test v2.9.0
**Describe the bug** org.opensearch.remotestore.SegmentReplicationRemoteStoreIT.testPressureServiceStats is flaky ``` org.opensearch.remotestore.SegmentReplicationRemoteStoreIT > testPressureServiceStats FAILED java.lang.AssertionError: expected:<0> but was:<1> at __randomizedtesting.SeedInfo.seed([6FBC0417BD6F56E8:B70016A383AEB076]:0) at org.junit.Assert.fail(Assert.java:89) at org.junit.Assert.failNotEquals(Assert.java:835) at org.junit.Assert.assertEquals(Assert.java:647) at org.junit.Assert.assertEquals(Assert.java:633) at org.opensearch.indices.replication.SegmentReplicationIT.testPressureServiceStats(SegmentReplicationIT.java:835) at java.****/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) at java.****/java.lang.reflect.Method.invoke(Method.java:578) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at 
com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at 
org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.****/java.lang.Thread.run(Thread.java:1589) ``` **To Reproduce** ``` ./gradlew ':server:internalClusterTest' --tests "org.opensearch.remotestore.SegmentReplicationRemoteStoreIT.testPressureServiceStats" -Dtests.seed=6FBC0417BD6F56E8 -Dtests.security.manager=true -Dtests.jvm.argline="-XX:TieredStopAtLevel=1 -XX:ReservedCodeCacheSize=64m" -Dtests.locale=ar-LY -Dtests.timezone=Africa/Sao_Tome -Druntime.java=19 ``` **Expected behavior** Test should pass **Plugins** None **Additional context** - https://build.ci.opensearch.org/job/gradle-check/15525/consoleFull
1.0
[BUG] org.opensearch.remotestore.SegmentReplicationRemoteStoreIT.testPressureServiceStats is flaky - **Describe the bug** org.opensearch.remotestore.SegmentReplicationRemoteStoreIT.testPressureServiceStats is flaky ``` org.opensearch.remotestore.SegmentReplicationRemoteStoreIT > testPressureServiceStats FAILED java.lang.AssertionError: expected:<0> but was:<1> at __randomizedtesting.SeedInfo.seed([6FBC0417BD6F56E8:B70016A383AEB076]:0) at org.junit.Assert.fail(Assert.java:89) at org.junit.Assert.failNotEquals(Assert.java:835) at org.junit.Assert.assertEquals(Assert.java:647) at org.junit.Assert.assertEquals(Assert.java:633) at org.opensearch.indices.replication.SegmentReplicationIT.testPressureServiceStats(SegmentReplicationIT.java:835) at java.****/jdk.internal.reflect.DirectMethodHandleAccessor.invoke(DirectMethodHandleAccessor.java:104) at java.****/java.lang.reflect.Method.invoke(Method.java:578) at com.carrotsearch.randomizedtesting.RandomizedRunner.invoke(RandomizedRunner.java:1750) at com.carrotsearch.randomizedtesting.RandomizedRunner$8.evaluate(RandomizedRunner.java:938) at com.carrotsearch.randomizedtesting.RandomizedRunner$9.evaluate(RandomizedRunner.java:974) at com.carrotsearch.randomizedtesting.RandomizedRunner$10.evaluate(RandomizedRunner.java:988) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at org.apache.lucene.tests.util.TestRuleSetupTeardownChained$1.evaluate(TestRuleSetupTeardownChained.java:48) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleThreadAndTestName$1.evaluate(TestRuleThreadAndTestName.java:45) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at 
org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at com.carrotsearch.randomizedtesting.ThreadLeakControl.forkTimeoutingTask(ThreadLeakControl.java:817) at com.carrotsearch.randomizedtesting.ThreadLeakControl$3.evaluate(ThreadLeakControl.java:468) at com.carrotsearch.randomizedtesting.RandomizedRunner.runSingleTest(RandomizedRunner.java:947) at com.carrotsearch.randomizedtesting.RandomizedRunner$5.evaluate(RandomizedRunner.java:832) at com.carrotsearch.randomizedtesting.RandomizedRunner$6.evaluate(RandomizedRunner.java:883) at com.carrotsearch.randomizedtesting.RandomizedRunner$7.evaluate(RandomizedRunner.java:894) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleStoreClassName$1.evaluate(TestRuleStoreClassName.java:38) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.NoShadowingOrOverridesOnMethodsRule$1.evaluate(NoShadowingOrOverridesOnMethodsRule.java:40) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at org.apache.lucene.tests.util.TestRuleAssertionsRequired$1.evaluate(TestRuleAssertionsRequired.java:53) at org.apache.lucene.tests.util.AbstractBeforeAfterRule$1.evaluate(AbstractBeforeAfterRule.java:43) at org.apache.lucene.tests.util.TestRuleMarkFailure$1.evaluate(TestRuleMarkFailure.java:44) at org.apache.lucene.tests.util.TestRuleIgnoreAfterMaxFailures$1.evaluate(TestRuleIgnoreAfterMaxFailures.java:60) at 
org.apache.lucene.tests.util.TestRuleIgnoreTestSuites$1.evaluate(TestRuleIgnoreTestSuites.java:47) at org.junit.rules.RunRules.evaluate(RunRules.java:20) at com.carrotsearch.randomizedtesting.rules.StatementAdapter.evaluate(StatementAdapter.java:36) at com.carrotsearch.randomizedtesting.ThreadLeakControl$StatementRunner.run(ThreadLeakControl.java:368) at java.****/java.lang.Thread.run(Thread.java:1589) ``` **To Reproduce** ``` ./gradlew ':server:internalClusterTest' --tests "org.opensearch.remotestore.SegmentReplicationRemoteStoreIT.testPressureServiceStats" -Dtests.seed=6FBC0417BD6F56E8 -Dtests.security.manager=true -Dtests.jvm.argline="-XX:TieredStopAtLevel=1 -XX:ReservedCodeCacheSize=64m" -Dtests.locale=ar-LY -Dtests.timezone=Africa/Sao_Tome -Druntime.java=19 ``` **Expected behavior** Test should pass **Plugins** None **Additional context** - https://build.ci.opensearch.org/job/gradle-check/15525/consoleFull
non_process
org opensearch remotestore segmentreplicationremotestoreit testpressureservicestats is flaky describe the bug org opensearch remotestore segmentreplicationremotestoreit testpressureservicestats is flaky org opensearch remotestore segmentreplicationremotestoreit testpressureservicestats failed java lang assertionerror expected but was at randomizedtesting seedinfo seed at org junit assert fail assert java at org junit assert failnotequals assert java at org junit assert assertequals assert java at org junit assert assertequals assert java at org opensearch indices replication segmentreplicationit testpressureservicestats segmentreplicationit java at java jdk internal reflect directmethodhandleaccessor invoke directmethodhandleaccessor java at java java lang reflect method invoke method java at com carrotsearch randomizedtesting randomizedrunner invoke randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org junit rules runrules evaluate runrules java at org apache lucene tests util testrulesetupteardownchained evaluate testrulesetupteardownchained java at org apache lucene tests util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene tests util testrulethreadandtestname evaluate testrulethreadandtestname java at org apache lucene tests util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene tests util testrulemarkfailure evaluate testrulemarkfailure java at org junit rules runrules evaluate runrules java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run 
threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol forktimeoutingtask threadleakcontrol java at com carrotsearch randomizedtesting threadleakcontrol evaluate threadleakcontrol java at com carrotsearch randomizedtesting randomizedrunner runsingletest randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at com carrotsearch randomizedtesting randomizedrunner evaluate randomizedrunner java at org apache lucene tests util abstractbeforeafterrule evaluate abstractbeforeafterrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene tests util testrulestoreclassname evaluate testrulestoreclassname java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules noshadowingoroverridesonmethodsrule evaluate noshadowingoroverridesonmethodsrule java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at org apache lucene tests util testruleassertionsrequired evaluate testruleassertionsrequired java at org apache lucene tests util abstractbeforeafterrule evaluate abstractbeforeafterrule java at org apache lucene tests util testrulemarkfailure evaluate testrulemarkfailure java at org apache lucene tests util testruleignoreaftermaxfailures evaluate testruleignoreaftermaxfailures java at org apache lucene tests util testruleignoretestsuites evaluate testruleignoretestsuites java at org junit rules runrules evaluate runrules java at com carrotsearch randomizedtesting rules statementadapter evaluate statementadapter java at com carrotsearch randomizedtesting threadleakcontrol statementrunner run 
threadleakcontrol java at java java lang thread run thread java to reproduce gradlew server internalclustertest tests org opensearch remotestore segmentreplicationremotestoreit testpressureservicestats dtests seed dtests security manager true dtests jvm argline xx tieredstopatlevel xx reservedcodecachesize dtests locale ar ly dtests timezone africa sao tome druntime java expected behavior test should pass plugins none additional context
0
9,525
12,500,274,397
IssuesEvent
2020-06-01 21:53:13
googleapis/google-api-nodejs-client
https://api.github.com/repos/googleapis/google-api-nodejs-client
closed
store discovery docs in fixtures for tests
type: process
It would be good to store the discovery docs we use to exercise libraries in unit tests (they're currently integration tests). See: https://github.com/googleapis/google-api-nodejs-client/pull/1846 It would also potentially be good to look at adding back some of the tests we removed when urlshortner was retired.
1.0
store discovery docs in fixtures for tests - It would be good to store the discovery docs we use to exercise libraries in unit tests (they're currently integration tests). See: https://github.com/googleapis/google-api-nodejs-client/pull/1846 It would also potentially be good to look at adding back some of the tests we removed when urlshortner was retired.
process
store discovery docs in fixtures for tests it would be good to store the discovery docs we use to exercise libraries in unit tests they re currently integration tests see it would also potentially be good to look at adding back some of the tests we removed when urlshortner was retired
1
125,894
10,368,241,689
IssuesEvent
2019-09-07 15:22:41
microsoft/appcenter
https://api.github.com/repos/microsoft/appcenter
closed
Android tests occasionally seem to crash from internal exceptions not related to our application
bug test
**What App Center service does this affect?** App Center Android UI tests **Describe the bug** Occasionally we are seeing crashes on random UI tests with crash logs that don't seem to correlate to any of our code https://appcenter.ms/orgs/xtc-Xamarin-Forms/apps/AndroidControlGallery-1/test/series/pull-7032/runs/b826dc16-ee86-49f1-b5c4-0367526b5359/170-2-0/18824604-e78c-49aa-afc5-e384ed8305e6/logs https://appcenter.ms/orgs/xtc-Xamarin-Forms/apps/AndroidControlGallery-1/test/series/pull-6762/runs/5f8151a9-35a0-4b15-bda9-62ca0a42bb16/83-0-0/18824604-e78c-49aa-afc5-e384ed8305e6/logs It's never a consistent test that causes this exception and 100% of the time I can rerun the exact same tests and it'll eventually (usually on the first rerun) pass I checked the device logs as well and there are no Unhandled Exceptions in there. The stack trace almost looks like the UI Test code is determining that the app isn't running so it kills the app but then it doesn't restart it fast enough for the UI tests to run? **Expected behavior** It seems like these crashes shouldn't be happening and aren't a fault of the application. If it is a fault of the application then the exception should have more detail about what in the application is crashing. **Smartphone (please complete the following information):** - Device: Android **Additional context** https://xamarinhq.slack.com/archives/C24DVRZ7U/p1565633285117300
1.0
Android tests occasionally seem to crash from internal exceptions not related to our application - **What App Center service does this affect?** App Center Android UI tests **Describe the bug** Occasionally we are seeing crashes on random UI tests with crash logs that don't seem to correlate to any of our code https://appcenter.ms/orgs/xtc-Xamarin-Forms/apps/AndroidControlGallery-1/test/series/pull-7032/runs/b826dc16-ee86-49f1-b5c4-0367526b5359/170-2-0/18824604-e78c-49aa-afc5-e384ed8305e6/logs https://appcenter.ms/orgs/xtc-Xamarin-Forms/apps/AndroidControlGallery-1/test/series/pull-6762/runs/5f8151a9-35a0-4b15-bda9-62ca0a42bb16/83-0-0/18824604-e78c-49aa-afc5-e384ed8305e6/logs It's never a consistent test that causes this exception and 100% of the time I can rerun the exact same tests and it'll eventually (usually on the first rerun) pass I checked the device logs as well and there are no Unhandled Exceptions in there. The stack trace almost looks like the UI Test code is determining that the app isn't running so it kills the app but then it doesn't restart it fast enough for the UI tests to run? **Expected behavior** It seems like these crashes shouldn't be happening and aren't a fault of the application. If it is a fault of the application then the exception should have more detail about what in the application is crashing. **Smartphone (please complete the following information):** - Device: Android **Additional context** https://xamarinhq.slack.com/archives/C24DVRZ7U/p1565633285117300
non_process
android tests occasionally seem to crash from internal exceptions not related to our application what app center service does this affect app center android ui tests describe the bug occasionally we are seeing crashes on random ui tests with crash logs that don t seem to correlate to any of our code it s never a consistent test that causes this exception and of the time i can rerun the exact same tests and it ll eventually usually on the first rerun pass i checked the device logs as well and there are no unhandled exceptions in there the stack trace almost looks like the ui test code is determining that the app isn t running so it kills the app but then it doesn t restart it fast enough for the ui tests to run expected behavior it seems like these crashes shouldn t be happening and aren t a fault of the application if it is a fault of the application then the exception should have more detail about what in the application is crashing smartphone please complete the following information device android additional context
0
113,864
17,169,610,886
IssuesEvent
2021-07-15 01:03:16
rsoreq/grafana
https://api.github.com/repos/rsoreq/grafana
opened
WS-2017-3770 (Medium) detected in autolinker-0.28.1.tgz
security vulnerability
## WS-2017-3770 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>autolinker-0.28.1.tgz</b></p></summary> <p>Utility to automatically link the URLs, email addresses, and Twitter handles in a given block of text/HTML</p> <p>Library home page: <a href="https://registry.npmjs.org/autolinker/-/autolinker-0.28.1.tgz">https://registry.npmjs.org/autolinker/-/autolinker-0.28.1.tgz</a></p> <p>Path to dependency file: grafana/emails/yarn.lock</p> <p>Path to vulnerable library: grafana/emails/node_modules/autolinker/package.json</p> <p> Dependency Hierarchy: - grunt-assemble-0.6.3.tgz (Root Library) - assemble-handlebars-0.4.1.tgz - handlebars-helpers-0.8.4.tgz - helper-markdown-0.2.2.tgz - remarkable-1.7.4.tgz - :x: **autolinker-0.28.1.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Cross-site Scripting (XSS) vulnerability was found in autolinker before 3.14.0. User input passed to the innerHTML tags isn't sanitized. <p>Publish Date: 2017-02-15 <p>URL: <a href=https://github.com/gregjacobs/Autolinker.js/commit/f21ea015366cfa62c4e45d4bd117681e82e9b2bf>WS-2017-3770</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/gregjacobs/Autolinker.js/releases/tag/v3.14.0">https://github.com/gregjacobs/Autolinker.js/releases/tag/v3.14.0</a></p> <p>Release Date: 2017-02-15</p> <p>Fix Resolution: autolinker - 3.14.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"autolinker","packageVersion":"0.28.1","packageFilePaths":["/emails/yarn.lock"],"isTransitiveDependency":true,"dependencyTree":"grunt-assemble:0.6.3;assemble-handlebars:0.4.1;handlebars-helpers:0.8.4;helper-markdown:0.2.2;remarkable:1.7.4;autolinker:0.28.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"autolinker - 3.14.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2017-3770","vulnerabilityDetails":"Cross-site Scripting (XSS) vulnerability was found in autolinker before 3.14.0. User input passed to the innerHTML tags isn\u0027t sanitized.","vulnerabilityUrl":"https://github.com/gregjacobs/Autolinker.js/commit/f21ea015366cfa62c4e45d4bd117681e82e9b2bf","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
WS-2017-3770 (Medium) detected in autolinker-0.28.1.tgz - ## WS-2017-3770 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>autolinker-0.28.1.tgz</b></p></summary> <p>Utility to automatically link the URLs, email addresses, and Twitter handles in a given block of text/HTML</p> <p>Library home page: <a href="https://registry.npmjs.org/autolinker/-/autolinker-0.28.1.tgz">https://registry.npmjs.org/autolinker/-/autolinker-0.28.1.tgz</a></p> <p>Path to dependency file: grafana/emails/yarn.lock</p> <p>Path to vulnerable library: grafana/emails/node_modules/autolinker/package.json</p> <p> Dependency Hierarchy: - grunt-assemble-0.6.3.tgz (Root Library) - assemble-handlebars-0.4.1.tgz - handlebars-helpers-0.8.4.tgz - helper-markdown-0.2.2.tgz - remarkable-1.7.4.tgz - :x: **autolinker-0.28.1.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Cross-site Scripting (XSS) vulnerability was found in autolinker before 3.14.0. User input passed to the innerHTML tags isn't sanitized. 
<p>Publish Date: 2017-02-15 <p>URL: <a href=https://github.com/gregjacobs/Autolinker.js/commit/f21ea015366cfa62c4e45d4bd117681e82e9b2bf>WS-2017-3770</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/gregjacobs/Autolinker.js/releases/tag/v3.14.0">https://github.com/gregjacobs/Autolinker.js/releases/tag/v3.14.0</a></p> <p>Release Date: 2017-02-15</p> <p>Fix Resolution: autolinker - 3.14.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"autolinker","packageVersion":"0.28.1","packageFilePaths":["/emails/yarn.lock"],"isTransitiveDependency":true,"dependencyTree":"grunt-assemble:0.6.3;assemble-handlebars:0.4.1;handlebars-helpers:0.8.4;helper-markdown:0.2.2;remarkable:1.7.4;autolinker:0.28.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"autolinker - 3.14.0"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2017-3770","vulnerabilityDetails":"Cross-site Scripting (XSS) vulnerability was found in autolinker before 3.14.0. 
User input passed to the innerHTML tags isn\u0027t sanitized.","vulnerabilityUrl":"https://github.com/gregjacobs/Autolinker.js/commit/f21ea015366cfa62c4e45d4bd117681e82e9b2bf","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_process
ws medium detected in autolinker tgz ws medium severity vulnerability vulnerable library autolinker tgz utility to automatically link the urls email addresses and twitter handles in a given block of text html library home page a href path to dependency file grafana emails yarn lock path to vulnerable library grafana emails node modules autolinker package json dependency hierarchy grunt assemble tgz root library assemble handlebars tgz handlebars helpers tgz helper markdown tgz remarkable tgz x autolinker tgz vulnerable library found in base branch master vulnerability details cross site scripting xss vulnerability was found in autolinker before user input passed to the innerhtml tags isn t sanitized publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution autolinker isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt assemble assemble handlebars handlebars helpers helper markdown remarkable autolinker isminimumfixversionavailable true minimumfixversion autolinker basebranches vulnerabilityidentifier ws vulnerabilitydetails cross site scripting xss vulnerability was found in autolinker before user input passed to the innerhtml tags isn sanitized vulnerabilityurl
0
189,966
22,047,167,089
IssuesEvent
2022-05-30 04:01:51
nanopathi/linux-4.19.72_CVE-2021-32399
https://api.github.com/repos/nanopathi/linux-4.19.72_CVE-2021-32399
closed
CVE-2019-11597 (High) detected in linuxlinux-4.19.236 - autoclosed
security vulnerability
## CVE-2019-11597 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.236</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/nanopathi/linux-4.19.72_CVE-2021-32399/commit/03cb3c6f0e0b62b5cbcd747df63781fbb2a6ef66">03cb3c6f0e0b62b5cbcd747df63781fbb2a6ef66</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/core/uverbs_main.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/core/uverbs_main.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In ImageMagick 7.0.8-43 Q16, there is a heap-based buffer over-read in the function WriteTIFFImage of coders/tiff.c, which allows an attacker to cause a denial of service or possibly information disclosure via a crafted image file. 
<p>Publish Date: 2019-04-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11597>CVE-2019-11597</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11599">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11599</a></p> <p>Release Date: 2020-08-19</p> <p>Fix Resolution: v5.1-rc6</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-11597 (High) detected in linuxlinux-4.19.236 - autoclosed - ## CVE-2019-11597 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.19.236</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in HEAD commit: <a href="https://github.com/nanopathi/linux-4.19.72_CVE-2021-32399/commit/03cb3c6f0e0b62b5cbcd747df63781fbb2a6ef66">03cb3c6f0e0b62b5cbcd747df63781fbb2a6ef66</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/core/uverbs_main.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/infiniband/core/uverbs_main.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In ImageMagick 7.0.8-43 Q16, there is a heap-based buffer over-read in the function WriteTIFFImage of coders/tiff.c, which allows an attacker to cause a denial of service or possibly information disclosure via a crafted image file. 
<p>Publish Date: 2019-04-29 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-11597>CVE-2019-11597</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11599">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-11599</a></p> <p>Release Date: 2020-08-19</p> <p>Fix Resolution: v5.1-rc6</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in linuxlinux autoclosed cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in head commit a href found in base branch master vulnerable source files drivers infiniband core uverbs main c drivers infiniband core uverbs main c vulnerability details in imagemagick there is a heap based buffer over read in the function writetiffimage of coders tiff c which allows an attacker to cause a denial of service or possibly information disclosure via a crafted image file publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
11,915
14,701,313,570
IssuesEvent
2021-01-04 11:39:44
Altinn/altinn-studio
https://api.github.com/repos/Altinn/altinn-studio
closed
Update messagebox to support feedback task
area/message-box area/process kind/user-story org/skd/sirius status/won't fix
## Description Sirius will have a task where SKD will provide feedback. When the process is in feedback we need to present information in the messagebox about that status. This is blocked by #2729 ## Considerations - We need to support any type of text in status. Check with @Febakke if this is ok - Should the element be clickable? If so we would need to support a view for this - Where to store title on task type (Suggestion to store in standard text that can be overwritten by app developer) ![image](https://user-images.githubusercontent.com/13309071/85395129-f2df8480-b54f-11ea-8b38-29c3af85325a.png) ## Acceptance criteria - The name of the task needs to be controlled by the app itself - Language support ## Specification tasks - [ ] Analyse how what needs to be changed in messagebox - [ ] Analyse how app developer define the title of the task to present in the messagebox ## Development tasks - [ ] Implement support of any task name in messagebox view (at least for tjeneste 3.0 elements) - [ ] Implement configuration of title of task so that storage component can return title. Language support needed. This will be set on [MessageBoxInstance](https://github.com/Altinn/altinn-studio/blob/master/src/Altinn.Platform/Altinn.Platform.Storage/Storage/Helpers/MessageBoxInstance.cs)
1.0
Update messagebox to support feedback task - ## Description Sirius will have a task where SKD will provide feedback. When the process is in feedback we need to present information in the messagebox about that status. This is blocked by #2729 ## Considerations - We need to support any type of text in status. Check with @Febakke if this is ok - Should the element be clickable? If so we would need to support a view for this - Where to store title on task type (Suggestion to store in standard text that can be overwritten by app developer) ![image](https://user-images.githubusercontent.com/13309071/85395129-f2df8480-b54f-11ea-8b38-29c3af85325a.png) ## Acceptance criteria - The name of the task needs to be controlled by the app itself - Language support ## Specification tasks - [ ] Analyse how what needs to be changed in messagebox - [ ] Analyse how app developer define the title of the task to present in the messagebox ## Development tasks - [ ] Implement support of any task name in messagebox view (at least for tjeneste 3.0 elements) - [ ] Implement configuration of title of task so that storage component can return title. Language support needed. This will be set on [MessageBoxInstance](https://github.com/Altinn/altinn-studio/blob/master/src/Altinn.Platform/Altinn.Platform.Storage/Storage/Helpers/MessageBoxInstance.cs)
process
update messagebox to support feedback task description sirius will have a task where skd will provide feedback when the process is in feedback we need to present information in the messagebox about that status this is blocked by considerations we need to support any type of text in status check with febakke if this is ok should the element be clickable if so we would need to support a view for this where to store title on task type suggestion to store in standard text that can be overwritten by app developer acceptance criteria the name of the task needs to be controlled by the app itself language support specification tasks analyse how what needs to be changed in messagebox analyse how app developer define the title of the task to present in the messagebox development tasks implement support of any task name in messagebox view at least for tjeneste elements implement configuration of title of task so that storage component can return title language support needed this will be set on
1
3,018
6,024,968,612
IssuesEvent
2017-06-08 07:21:02
bazelbuild/bazel
https://api.github.com/repos/bazelbuild/bazel
closed
Release 0.5.1
category: misc > release / binary P1 Release blocker type: process
> You're moving too fast for me > And I can't keep up with you > Maybe if you slowed down for me > I could see you're only telling that we already started working on our 0.5.1 release :)
1.0
Release 0.5.1 - > You're moving too fast for me > And I can't keep up with you > Maybe if you slowed down for me > I could see you're only telling that we already started working on our 0.5.1 release :)
process
release you re moving too fast for me and i can t keep up with you maybe if you slowed down for me i could see you re only telling that we already started working on our release
1
11,398
14,234,708,803
IssuesEvent
2020-11-18 13:57:00
MicrosoftDocs/azure-devops-docs
https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
closed
Conditions based on resources
Pri2 devops-cicd-process/tech devops/prod product-feedback
I have 2 Git repositories: one with the source code of the application, one for the DevOps pipelines. For the second stage of the pipeline, I want to check if the branch that triggered the pipeline is master. But the problem is the variable **Build.SourceBranchName** has the information of the DevOps repository, not the repository containing the source code (repository named _hidden_). I don't see any variable that can be used for that purpose. ``` resources: repositories: - repository: 'hidden' type: git name: 'hidden/hidden' trigger: - master - develop trigger: none stages: - stage: Build displayName: Build stage variables: - name: BuildConfiguration value: 'Release' - name: ApiSolution value: 'Company.WebApi.csproj' jobs: - job: BuildAPI displayName: 'Build API' pool: vmImage: 'ubuntu-latest' steps: - checkout: self - checkout: hidden - script: dir $(Build.SourcesDirectory) - task: Bash@3 inputs: targetType: 'inline' script: 'env | sort' [...] - stage: DeployDev displayName: 'Deploy dev stage' dependsOn: Build condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'master')) jobs: - deployment: DeployServiceApp displayName: 'Deploy system to the dev environment job' workspace: clean: all environment: name: 'xyz' resourceType: 'VirtualMachine' strategy: runOnce: deploy: steps: [...] ``` --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3f151218-9a11-0078-e038-f96198a76143 * Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9 * Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/conditions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
Conditions based on resources - I have 2 Git repositories: one with the source code of the application, one for the DevOps pipelines. For the second stage of the pipeline, I want to check if the branch that triggered the pipeline is master. But the problem is the variable **Build.SourceBranchName** has the information of the DevOps repository, not the repository containing the source code (repository named _hidden_). I don't see any variable that can be used for that purpose. ``` resources: repositories: - repository: 'hidden' type: git name: 'hidden/hidden' trigger: - master - develop trigger: none stages: - stage: Build displayName: Build stage variables: - name: BuildConfiguration value: 'Release' - name: ApiSolution value: 'Company.WebApi.csproj' jobs: - job: BuildAPI displayName: 'Build API' pool: vmImage: 'ubuntu-latest' steps: - checkout: self - checkout: hidden - script: dir $(Build.SourcesDirectory) - task: Bash@3 inputs: targetType: 'inline' script: 'env | sort' [...] - stage: DeployDev displayName: 'Deploy dev stage' dependsOn: Build condition: and(succeeded(), eq(variables['Build.SourceBranchName'], 'master')) jobs: - deployment: DeployServiceApp displayName: 'Deploy system to the dev environment job' workspace: clean: all environment: name: 'xyz' resourceType: 'VirtualMachine' strategy: runOnce: deploy: steps: [...] ``` --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3f151218-9a11-0078-e038-f96198a76143 * Version Independent ID: 09c4d032-62f3-d97c-79d7-6fbfd89910e9 * Content: [Conditions - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/conditions?view=azure-devops&tabs=yaml) * Content Source: [docs/pipelines/process/conditions.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/master/docs/pipelines/process/conditions.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
conditions based on resources i have git repositories one with the source code of the application one for the devops pipelines for the second stage of the pipeline i want to check if the branch that triggered the pipeline is master but the problem is the variable build sourcebranchname has the information of the devops repository not the repository containing the source code repository named hidden i don t see any variable that can be used for that purpose resources repositories repository hidden type git name hidden hidden trigger master develop trigger none stages stage build displayname build stage variables name buildconfiguration value release name apisolution value company webapi csproj jobs job buildapi displayname build api pool vmimage ubuntu latest steps checkout self checkout hidden script dir build sourcesdirectory task bash inputs targettype inline script env sort stage deploydev displayname deploy dev stage dependson build condition and succeeded eq variables master jobs deployment deployserviceapp displayname deploy system to the dev environment job workspace clean all environment name xyz resourcetype virtualmachine strategy runonce deploy steps document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
37,735
18,750,615,622
IssuesEvent
2021-11-05 01:04:25
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
Optimize number processing and validation while reading JSON token
enhancement area-System.Text.Json tenet-performance up-for-grabs no recent activity
Consider removing the `ConsumeNumberResult` internal enum and optimize the implementation of `TryGetNumber`. The way it is implemented currently is not fast enough, likely due to some redundant or unnecessary work. The helper methods may need to be marked as aggressively inlined. As part of this work, try to avoid potential redundant search for end of number in `ConsumeNumber`.
True
Optimize number processing and validation while reading JSON token - Consider removing the `ConsumeNumberResult` internal enum and optimize the implementation of `TryGetNumber`. The way it is implemented currently is not fast enough, likely due to some redundant or unnecessary work. The helper methods may need to be marked as aggressively inlined. As part of this work, try to avoid potential redundant search for end of number in `ConsumeNumber`.
non_process
optimize number processing and validation while reading json token consider removing the consumenumberresult internal enum and optimize the implementation of trygetnumber the way it is implemented currently is not fast enough likely due to some redundant or unnecessary work the helper methods may need to be marked as aggressively inlined as part of this work try to avoid potential redundant search for end of number in consumenumber
0
46,310
7,247,109,805
IssuesEvent
2018-02-15 00:41:01
project8/cicada
https://api.github.com/repos/project8/cicada
opened
Missing documentation
documentation
This issue should track the missing documentation: as such, it is unlikely it will ever be closed... - [ ] Missing fields in the Documentation/ObjectsStructure.rst file
1.0
Missing documentation - This issue should track the missing documentation: as such, it is unlikely it will ever be closed... - [ ] Missing fields in the Documentation/ObjectsStructure.rst file
non_process
missing documentation this issue should track the missing documentation as such it is unlikely it will ever be closed missing fields in the documentation objectsstructure rst file
0
11,869
14,671,383,026
IssuesEvent
2020-12-30 08:01:06
threefoldtech/js-sdk
https://api.github.com/repos/threefoldtech/js-sdk
closed
PeerTube: was able to install it but I don't get to run it
process_wontfix type_bug
PeerTube generates plenty of errors within the application when trying to create an acount and then logging in: - bad gateway - client is invalid <img width="967" alt="Screenshot 2020-12-29 at 16 06 34" src="https://user-images.githubusercontent.com/30384423/103293306-e69d1e00-49ef-11eb-97b6-5c433f8267b2.png"> <img width="1271" alt="Screenshot 2020-12-29 at 16 04 02" src="https://user-images.githubusercontent.com/30384423/103293321-ef8def80-49ef-11eb-8ca3-e00e248b2fd4.png"> Linked to configuration ? On MacOS, Safari web browser.
1.0
PeerTube: was able to install it but I don't get to run it - PeerTube generates plenty of errors within the application when trying to create an acount and then logging in: - bad gateway - client is invalid <img width="967" alt="Screenshot 2020-12-29 at 16 06 34" src="https://user-images.githubusercontent.com/30384423/103293306-e69d1e00-49ef-11eb-97b6-5c433f8267b2.png"> <img width="1271" alt="Screenshot 2020-12-29 at 16 04 02" src="https://user-images.githubusercontent.com/30384423/103293321-ef8def80-49ef-11eb-8ca3-e00e248b2fd4.png"> Linked to configuration ? On MacOS, Safari web browser.
process
peertube was able to install it but i don t get to run it peertube generates plenty of errors within the application when trying to create an acount and then logging in bad gateway client is invalid img width alt screenshot at src img width alt screenshot at src linked to configuration on macos safari web browser
1
4,170
7,107,919,013
IssuesEvent
2018-01-16 21:45:54
18F/product-guide
https://api.github.com/repos/18F/product-guide
closed
SECTION UPDATE (Handoffs & Renewals) - Removing cloud.gov quota for closed projects
process change question
Cloud.gov currently charges based on quota and not actual usage. If a product leaves the cloud.gov platform (the project is canceled or migrated off the platform), you must request that the Infrastructure team remove the it from the platform. Deleting all applications and services is not enough. More info on quota-based billing here: https://docs.cloud.gov/intro/pricing/quotas/ @mogul, @NoahKunin
1.0
SECTION UPDATE (Handoffs & Renewals) - Removing cloud.gov quota for closed projects - Cloud.gov currently charges based on quota and not actual usage. If a product leaves the cloud.gov platform (the project is canceled or migrated off the platform), you must request that the Infrastructure team remove the it from the platform. Deleting all applications and services is not enough. More info on quota-based billing here: https://docs.cloud.gov/intro/pricing/quotas/ @mogul, @NoahKunin
process
section update handoffs renewals removing cloud gov quota for closed projects cloud gov currently charges based on quota and not actual usage if a product leaves the cloud gov platform the project is canceled or migrated off the platform you must request that the infrastructure team remove the it from the platform deleting all applications and services is not enough more info on quota based billing here mogul noahkunin
1
17,038
22,412,893,472
IssuesEvent
2022-06-19 01:41:31
Arch666Angel/mods
https://api.github.com/repos/Arch666Angel/mods
opened
[Enhancement] Paper
Impact: Enhancement Angels Bio Processing
Disable wood-Wooden Board Change wooden board to fibre board Add bleach process for NaOCl - Tie into phenolic board from wood process Buff processing (recipe time output quantity and productivity interaction)
1.0
[Enhancement] Paper - Disable wood-Wooden Board Change wooden board to fibre board Add bleach process for NaOCl - Tie into phenolic board from wood process Buff processing (recipe time output quantity and productivity interaction)
process
paper disable wood wooden board change wooden board to fibre board add bleach process for naocl tie into phenolic board from wood process buff processing recipe time output quantity and productivity interaction
1
110,223
4,423,758,604
IssuesEvent
2016-08-16 09:47:15
The-Compiler/qutebrowser
https://api.github.com/repos/The-Compiler/qutebrowser
closed
Add ability to jump between completion sections
component: completion easy priority: 2 - low
For people having many quickmarks (or searchengines if we decide to show them in the completions) it would be great to be able to jump between the different sections (quickmarks, bookmarks, history, searchengines). Keybinding proposal: tab : next entry shift-tab : prev entry ctrl-tab : next section shift-ctrl-tab : prev section
1.0
Add ability to jump between completion sections - For people having many quickmarks (or searchengines if we decide to show them in the completions) it would be great to be able to jump between the different sections (quickmarks, bookmarks, history, searchengines). Keybinding proposal: tab : next entry shift-tab : prev entry ctrl-tab : next section shift-ctrl-tab : prev section
non_process
add ability to jump between completion sections for people having many quickmarks or searchengines if we decide to show them in the completions it would be great to be able to jump between the different sections quickmarks bookmarks history searchengines keybinding proposal tab next entry shift tab prev entry ctrl tab next section shift ctrl tab prev section
0
272,033
29,794,974,569
IssuesEvent
2023-06-16 01:01:11
billmcchesney1/pacbot
https://api.github.com/repos/billmcchesney1/pacbot
closed
CVE-2022-41854 (Medium) detected in multiple libraries - autoclosed
Mend: dependency security vulnerability
## CVE-2022-41854 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snakeyaml-1.15.jar</b>, <b>snakeyaml-1.19.jar</b>, <b>snakeyaml-1.17.jar</b></p></summary> <p> <details><summary><b>snakeyaml-1.15.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /jobs/pacman-cloud-notifications/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.15/snakeyaml-1.15.jar</p> <p> Dependency Hierarchy: - elasticsearch-5.6.2.jar (Root Library) - :x: **snakeyaml-1.15.jar** (Vulnerable Library) </details> <details><summary><b>snakeyaml-1.19.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /api/pacman-api-notifications/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-data-rest-2.0.4.RELEASE.jar (Root Library) - spring-boot-starter-2.0.4.RELEASE.jar - :x: **snakeyaml-1.19.jar** (Vulnerable Library) </details> 
<details><summary><b>snakeyaml-1.17.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /jobs/pacman-rule-engine-2.0/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.17/snakeyaml-1.17.jar</p> <p> Dependency Hierarchy: - elasticsearch-5.6.8.jar (Root Library) - :x: **snakeyaml-1.17.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/pacbot/commit/acf9a0620c1a37cee4f2896d71e1c3731c5c7b06">acf9a0620c1a37cee4f2896d71e1c3731c5c7b06</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> Those using Snakeyaml to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack. <p>Publish Date: 2022-11-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p> <p>Release Date: 2022-11-11</p> <p>Fix Resolution: org.yaml:snakeyaml:1.32</p> </p> </details> <p></p>
True
CVE-2022-41854 (Medium) detected in multiple libraries - autoclosed - ## CVE-2022-41854 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>snakeyaml-1.15.jar</b>, <b>snakeyaml-1.19.jar</b>, <b>snakeyaml-1.17.jar</b></p></summary> <p> <details><summary><b>snakeyaml-1.15.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /jobs/pacman-cloud-notifications/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.15/snakeyaml-1.15.jar</p> <p> Dependency Hierarchy: - elasticsearch-5.6.2.jar (Root Library) - :x: **snakeyaml-1.15.jar** (Vulnerable Library) </details> <details><summary><b>snakeyaml-1.19.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /api/pacman-api-notifications/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.19/snakeyaml-1.19.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-data-rest-2.0.4.RELEASE.jar (Root Library) - spring-boot-starter-2.0.4.RELEASE.jar - :x: 
**snakeyaml-1.19.jar** (Vulnerable Library) </details> <details><summary><b>snakeyaml-1.17.jar</b></p></summary> <p>YAML 1.1 parser and emitter for Java</p> <p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p> <p>Path to dependency file: /jobs/pacman-rule-engine-2.0/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.17/snakeyaml-1.17.jar</p> <p> Dependency Hierarchy: - elasticsearch-5.6.8.jar (Root Library) - :x: **snakeyaml-1.17.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/pacbot/commit/acf9a0620c1a37cee4f2896d71e1c3731c5c7b06">acf9a0620c1a37cee4f2896d71e1c3731c5c7b06</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> Those using Snakeyaml to parse untrusted YAML files may be vulnerable to Denial of Service attacks (DOS). If the parser is running on user supplied input, an attacker may supply content that causes the parser to crash by stack overflow. This effect may support a denial of service attack. <p>Publish Date: 2022-11-11 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-41854>CVE-2022-41854</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://bitbucket.org/snakeyaml/snakeyaml/issues/531/">https://bitbucket.org/snakeyaml/snakeyaml/issues/531/</a></p> <p>Release Date: 2022-11-11</p> <p>Fix Resolution: org.yaml:snakeyaml:1.32</p> </p> </details> <p></p>
non_process
cve medium detected in multiple libraries autoclosed cve medium severity vulnerability vulnerable libraries snakeyaml jar snakeyaml jar snakeyaml jar snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file jobs pacman cloud notifications pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy elasticsearch jar root library x snakeyaml jar vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file api pacman api notifications pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy spring boot starter data rest release jar root library spring boot starter release jar x snakeyaml jar vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file jobs pacman rule engine pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy elasticsearch jar root library x snakeyaml jar vulnerable library found in head commit a href found in base branch master vulnerability details those using snakeyaml to parse untrusted yaml files may be vulnerable to denial of service attacks dos if the parser is running on user supplied input an attacker may supply content that causes the parser to crash by stack overflow this effect may support a denial of service attack publish date url a 
href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org yaml snakeyaml
0
15,164
18,920,253,193
IssuesEvent
2021-11-17 00:14:31
dtcenter/MET
https://api.github.com/repos/dtcenter/MET
opened
Madis2NC time summary output writes bad data for the output level value.
type: bug priority: high requestor: Community alert: NEED ACCOUNT KEY required: FOR OFFICIAL RELEASE MET: PreProcessing Tools (Point)
## Describe the Problem ## This issue arose via METplus Discussions dtcenter/METplus#1232. While the user was able to run madis2nc to compute time summaries, he was NOT able to get Point-Stat to read them to verify forecasts of daily temperature min/max. I was able to replicate the problem using the sample data he provided in this [comment](https://github.com/dtcenter/METplus/discussions/1232#discussioncomment-1653826). Close inspection reveals that madis2nc is writing the output level values as bad data. Next I inspected the output from the nightly build and found the same to be true there. ``` # on kiowa ncdump -v obs_lvl NB20211116/MET-develop/test_output/madis2nc/metar_20120409_time_summary.nc obs_lvl = _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, ``` Point-Stat has no way of processing observations with a bad level value. ### Expected Behavior ### The level value reported to the time summary output should be consistent with the level information found in the input observations. Be sure to check the behavior of all the other tools that compute time summaries (pb2nc, madis2nc, ascii2nc, ioda2nc) and confirm that all write sensible level values to the output. Consider fixing for both MET-10.1.0-beta5 AND MET-10.0.1 bugfix release. ### Environment ### Describe your runtime environment: 1. Demonstrated on kiowa. ### To Reproduce ### Describe the steps to reproduce the behavior: *1.
Run unit_madis2nc.xml* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ### Funding Source ### *Define the source of funding and account keys here or state NONE.* ## Define the Metadata ## ### Assignee ### - [x] Select **engineer(s)** or **no engineer** required: @hsoh-u - [x] Select **scientist(s)** or **no scientist** required: none required ### Labels ### - [x] Select **component(s)** - [x] Select **priority** - [x] Select **requestor(s)** ### Projects and Milestone ### - [x] Select **Organization** level **Project** for support of the current coordinated release - [x] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label - [x] Select **Milestone** as the next bugfix version ## Define Related Issue(s) ## Consider the impact to the other METplus components. - [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) No impacts. ## Bugfix Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**. - [ ] Fork this repository or create a branch of **main_\<Version>**. Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>` - [ ] Fix the bug and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. 
- [ ] Submit a pull request to merge into **main_\<Version>**. Pull request: `bugfix <Issue Number> main_<Version> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Linked issues** Select: **Organization** level software support **Project** for the current coordinated release Select: **Milestone** as the next bugfix version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Complete the steps above to fix the bug on the **develop** branch. Branch name: `bugfix_<Issue Number>_develop_<Description>` Pull request: `bugfix <Issue Number> develop <Description>` Select: **Reviewer(s)** and **Linked issues** Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Close this issue.
1.0
Madis2NC time summary output writes bad data for the output level value. - ## Describe the Problem ## This issue arose via METplus Discussions dtcenter/METplus#1232. While the user was able to run madis2nc to compute time summaries, he was NOT able to get Point-Stat to read them to verify forecasts of daily temperature min/max. I was able to replicate the problem using the sample data he provided in this [comment](https://github.com/dtcenter/METplus/discussions/1232#discussioncomment-1653826). Close inspection reveals that madis2nc is writing the output level values as bad data. Next I inspected the output from the nightly build and found the same to be true there. ``` # on kiowa ncdump -v obs_lvl NB20211116/MET-develop/test_output/madis2nc/metar_20120409_time_summary.nc obs_lvl = _, _, _, _, _, _, _, _, _, _, _, _, _, _, _, ``` Point-Stat has no way of processing observations with a bad level value. ### Expected Behavior ### The level value reported to the time summary output should be consistent with the level information found in the input observations. Be sure to check the behavior of all the other tools that compute time summaries (pb2nc, madis2nc, ascii2nc, ioda2nc) and confirm that all write sensible level values to the output. Consider fixing for both MET-10.1.0-beta5 AND MET-10.0.1 bugfix release. ### Environment ### Describe your runtime environment: 1. Demonstrated on kiowa. ### To Reproduce ### Describe the steps to reproduce the behavior: *1.
Run unit_madis2nc.xml* ### Relevant Deadlines ### *List relevant project deadlines here or state NONE.* ### Funding Source ### *Define the source of funding and account keys here or state NONE.* ## Define the Metadata ## ### Assignee ### - [x] Select **engineer(s)** or **no engineer** required: @hsoh-u - [x] Select **scientist(s)** or **no scientist** required: none required ### Labels ### - [x] Select **component(s)** - [x] Select **priority** - [x] Select **requestor(s)** ### Projects and Milestone ### - [x] Select **Organization** level **Project** for support of the current coordinated release - [x] Select **Repository** level **Project** for development toward the next official release or add **alert: NEED PROJECT ASSIGNMENT** label - [x] Select **Milestone** as the next bugfix version ## Define Related Issue(s) ## Consider the impact to the other METplus components. - [x] [METplus](https://github.com/dtcenter/METplus/issues/new/choose), [MET](https://github.com/dtcenter/MET/issues/new/choose), [METdatadb](https://github.com/dtcenter/METdatadb/issues/new/choose), [METviewer](https://github.com/dtcenter/METviewer/issues/new/choose), [METexpress](https://github.com/dtcenter/METexpress/issues/new/choose), [METcalcpy](https://github.com/dtcenter/METcalcpy/issues/new/choose), [METplotpy](https://github.com/dtcenter/METplotpy/issues/new/choose) No impacts. ## Bugfix Checklist ## See the [METplus Workflow](https://metplus.readthedocs.io/en/latest/Contributors_Guide/github_workflow.html) for details. - [ ] Complete the issue definition above, including the **Time Estimate** and **Funding Source**. - [ ] Fork this repository or create a branch of **main_\<Version>**. Branch name: `bugfix_<Issue Number>_main_<Version>_<Description>` - [ ] Fix the bug and test your changes. - [ ] Add/update log messages for easier debugging. - [ ] Add/update unit tests. - [ ] Add/update documentation. - [ ] Push local changes to GitHub. 
- [ ] Submit a pull request to merge into **main_\<Version>**. Pull request: `bugfix <Issue Number> main_<Version> <Description>` - [ ] Define the pull request metadata, as permissions allow. Select: **Reviewer(s)** and **Linked issues** Select: **Organization** level software support **Project** for the current coordinated release Select: **Milestone** as the next bugfix version - [ ] Iterate until the reviewer(s) accept and merge your changes. - [ ] Delete your fork or branch. - [ ] Complete the steps above to fix the bug on the **develop** branch. Branch name: `bugfix_<Issue Number>_develop_<Description>` Pull request: `bugfix <Issue Number> develop <Description>` Select: **Reviewer(s)** and **Linked issues** Select: **Repository** level development cycle **Project** for the next official release Select: **Milestone** as the next official version - [ ] Close this issue.
process
time summary output writes bad data for the output level value describe the problem this issue arose via metplus discussions dtcenter metplus while the user was able to run to compute time summaries he was not able to get point stat to read them to verify forecasts of daily temperature min max i was able to replicate the problem using the sample data he provided in this close inspection reveals that is writing the output level values as bad data next i inspected the output from the nightly build and found the same to be true there on kiowa ncdump v obs lvl met develop test output metar time summary nc obs lvl point stat has no way of processing observations with a bad level value expected behavior the level value reported to the time summary output should be consistent with the level information found in the input observations be sure to check the behavior of all the other tool that compute time summaries and confirm that all write sensible level values to the output consider fixing for both met and met bugfix release environment describe your runtime environment demonstrated on kiowa to reproduce describe the steps to reproduce the behavior run unit xml relevant deadlines list relevant project deadlines here or state none funding source define the source of funding and account keys here or state none define the metadata assignee select engineer s or no engineer required hsoh u select scientist s or no scientist required none required labels select component s select priority select requestor s projects and milestone select organization level project for support of the current coordinated release select repository level project for development toward the next official release or add alert need project assignment label select milestone as the next bugfix version define related issue s consider the impact to the other metplus components no impacts bugfix checklist see the for details complete the issue definition above including the time estimate and funding source 
fork this repository or create a branch of main branch name bugfix main fix the bug and test your changes add update log messages for easier debugging add update unit tests add update documentation push local changes to github submit a pull request to merge into main pull request bugfix main define the pull request metadata as permissions allow select reviewer s and linked issues select organization level software support project for the current coordinated release select milestone as the next bugfix version iterate until the reviewer s accept and merge your changes delete your fork or branch complete the steps above to fix the bug on the develop branch branch name bugfix develop pull request bugfix develop select reviewer s and linked issues select repository level development cycle project for the next official release select milestone as the next official version close this issue
1
109,969
9,421,575,032
IssuesEvent
2019-04-11 07:12:53
beeldengeluid/labo-components
https://api.github.com/repos/beeldengeluid/labo-components
closed
As a user I want an issue with the layout fixed
Tested bug
Issue: In the Single Search recipe, in the block shared between the Facets and the Result list, at some width ranges they overlap each other (see screenshot) ![Screenshot 2019-04-01 at 18.04.26.png](https://images.zenhubusercontent.com/5a1e745d8a75884b90892ac4/5cd61414-df13-4ff9-afe4-1c1a5f29618d)
1.0
As a user I want an issue with the layout fixed - Issue: In the Single Search recipe, in the block shared between the Facets and the Result list, at some width ranges they overlap each other (see screenshot) ![Screenshot 2019-04-01 at 18.04.26.png](https://images.zenhubusercontent.com/5a1e745d8a75884b90892ac4/5cd61414-df13-4ff9-afe4-1c1a5f29618d)
non_process
as a user i want an issue with the layout fixed issue in the single search recipe in the block shared between facets and the result list in some width ranges they overlap each other see screenshot
0
6,670
9,785,577,093
IssuesEvent
2019-06-09 08:45:13
stoyicker/test-accessors
https://api.github.com/repos/stoyicker/test-accessors
closed
Match that a class exists instead of looking at the classpath
bug processor-java processor-kotlin
Match junit.runner.BaseTestRunner for JUnit 4 and org.testng.TestNG for TestNG
2.0
Match that a class exists instead of looking at the classpath - Match junit.runner.BaseTestRunner for JUnit 4 and org.testng.TestNG for TestNG
process
match that a class exists instead of looking at the classpath match junit runner basetestrunner for junit and org testng testng for testng
1
7,407
10,526,145,634
IssuesEvent
2019-09-30 16:26:52
liskcenterutrecht/lisk.bike
https://api.github.com/repos/liskcenterutrecht/lisk.bike
closed
Prepare Lock
Process Flow
The lock needs to be prepared and connected to the Virtual Lock Server: https://github.com/bartwr/commonbike-site/blob/feature/testing/zandbak/testbt10/index.js - [x] Charge lock - [x] Remove SIM PIN - [x] Disable SIM voicemail - [x] Send SMS to self via SIM - [x] Put SIM in lock - [x] Send SMS to lock containing 'bladiebla-server-ip'
1.0
Prepare Lock - The lock needs to be prepared and connected to the Virtual Lock Server: https://github.com/bartwr/commonbike-site/blob/feature/testing/zandbak/testbt10/index.js - [x] Charge lock - [x] Remove SIM PIN - [x] Disable SIM voicemail - [x] Send SMS to self via SIM - [x] Put SIM in lock - [x] Send SMS to lock containing 'bladiebla-server-ip'
process
prepare lock the lock needs to be prepared and connected to the virtual lock server charge lock remove sim pin disable sim voicemail send sms to self via sim put sim in lock send sms to lock containing bladiebla server ip
1
101,478
16,512,282,513
IssuesEvent
2021-05-26 06:27:33
valtech-ch/microservice-kubernetes-cluster
https://api.github.com/repos/valtech-ch/microservice-kubernetes-cluster
opened
CVE-2020-10969 (High) detected in jackson-databind-2.9.8.jar
security vulnerability
## CVE-2020-10969 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: microservice-kubernetes-cluster/functions/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.8/11283f21cc480aa86c4df7a0a3243ec508372ed2/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - spring-cloud-function-adapter-azure-3.1.2.jar (Root Library) - spring-cloud-function-context-3.1.2.jar - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/valtech-ch/microservice-kubernetes-cluster/commit/eb274179a823f7d17154880d5a503973bae259a0">eb274179a823f7d17154880d5a503973bae259a0</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to javax.swing.JEditorPane. 
<p>Publish Date: 2020-03-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10969>CVE-2020-10969</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969</a></p> <p>Release Date: 2020-03-26</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.8.11.6;com.fasterxml.jackson.core:jackson-databind:2.7.9.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-10969 (High) detected in jackson-databind-2.9.8.jar - ## CVE-2020-10969 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.8.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: microservice-kubernetes-cluster/functions/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.8/11283f21cc480aa86c4df7a0a3243ec508372ed2/jackson-databind-2.9.8.jar</p> <p> Dependency Hierarchy: - spring-cloud-function-adapter-azure-3.1.2.jar (Root Library) - spring-cloud-function-context-3.1.2.jar - :x: **jackson-databind-2.9.8.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/valtech-ch/microservice-kubernetes-cluster/commit/eb274179a823f7d17154880d5a503973bae259a0">eb274179a823f7d17154880d5a503973bae259a0</a></p> <p>Found in base branch: <b>develop</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to javax.swing.JEditorPane. 
<p>Publish Date: 2020-03-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10969>CVE-2020-10969</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10969</a></p> <p>Release Date: 2020-03-26</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.8.11.6;com.fasterxml.jackson.core:jackson-databind:2.7.9.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file microservice kubernetes cluster functions build gradle path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring cloud function adapter azure jar root library spring cloud function context jar x jackson databind jar vulnerable library found in head commit a href found in base branch develop vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to javax swing jeditorpane publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
2,579
5,342,720,158
IssuesEvent
2017-02-17 09:08:57
QCoDeS/Qcodes
https://api.github.com/repos/QCoDeS/Qcodes
opened
MultiProcessing removal
mulitprocessing
The next version, and thus the master branch will ship without the multiprocessing code. It's huge and we just don't want to keep maintaining it.
1.0
MultiProcessing removal - The next version, and thus the master branch will ship without the multiprocessing code. It's huge and we just don't want to keep maintaining it.
process
multiprocessing removal the next version and thus the master branch will ship without the multiprocessing code it s huge and we just don t want to keep maintaining it
1
84,571
16,517,900,188
IssuesEvent
2021-05-26 11:43:35
joomla/joomla-cms
https://api.github.com/repos/joomla/joomla-cms
closed
mod_custom Module class not added in html code
No Code Attached Yet
### Steps to reproduce the issue Create a new mod_custom module In advanced tab add module class Publish it on a location that is active Check html in frontend to see if this is added Tried Cassiopeia theme and my own theme ### Expected result I expect that there is a class added in the mod_custom div or on other location ### Actual result I do not see the class coming back in the frontend. ### System information (as much as possible) Joomla 4 beta 7 Php 7.4 ### Additional comments
1.0
mod_custom Module class not added in html code - ### Steps to reproduce the issue Create a new mod_custom module In advanced tab add module class Publish it on a location that is active Check html in frontend to see if this is added Tried Cassiopeia theme and my own theme ### Expected result I expect that there is a class added in the mod_custom div or on other location ### Actual result I do not see the class coming back in the frontend. ### System information (as much as possible) Joomla 4 beta 7 Php 7.4 ### Additional comments
non_process
mod custom module class not added in html code steps to reproduce the issue create a new mod custom module in advanced tab add module class publish it on a location that is active check html in frontend to see if this is added tried cassiopeia theme and my own theme expected result i expect that there is a class added in the mod custom div or on other location actual result i do not see the class coming back in the frontend system information as much as possible joomla beta php additional comments
0
7,089
10,237,598,223
IssuesEvent
2019-08-19 14:13:24
AlexsLemonade/refinebio
https://api.github.com/repos/AlexsLemonade/refinebio
closed
Run tximport before full RNA-SEQ experiment is ready
RNA-seq backlog processor salmon
### Context We're having trouble making all the RNA-Seq data we have (mostly) processed available because we don't run tximport until the entire experiment is processed. So if an experiment has 100 samples and 99 of them have been processed but one failed, we can't let anyone get any of the samples because tximport hasn't been run. ### Problem or idea We could find some kind of heuristic for a cutoff point and run tximport after that's been reached. If additional samples are processed after that, we could just rerun tximport and replace the previously generated data with the new stuff. ### Solution or next step Is this something we want to do? If so what would our heuristic be? @jaclyn-taroni said in the demo day meeting that she would start thinking about these questions. ### New Issue Checklist - [x] The title is short and descriptive. - [x] You have explained the context that led you to write this issue. - [x] You have reported a problem or idea. - [x] You have proposed a solution or next step.
1.0
Run tximport before full RNA-SEQ experiment is ready - ### Context We're having trouble making all the RNA-Seq data we have (mostly) processed available because we don't run tximport until the entire experiment is processed. So if an experiment has 100 samples and 99 of them have been processed but one failed, we can't let anyone get any of the samples because tximport hasn't been run. ### Problem or idea We could find some kind of heuristic for a cutoff point and run tximport after that's been reached. If additional samples are processed after that, we could just rerun tximport and replace the previously generated data with the new stuff. ### Solution or next step Is this something we want to do? If so what would our heuristic be? @jaclyn-taroni said in the demo day meeting that she would start thinking about these questions. ### New Issue Checklist - [x] The title is short and descriptive. - [x] You have explained the context that led you to write this issue. - [x] You have reported a problem or idea. - [x] You have proposed a solution or next step.
process
run tximport before full rna seq experiment is ready context we re having trouble making all the rna seq data we have mostly processed available because we don t run tximport until the entire experiment is processed so if an experiment has samples and of them have been processed but one failed we can t let anyone get any of the samples because tximport hasn t been run problem or idea we could find some kind of heuristic for a cutoff point and run tximport after that s been reached if additional samples are processed after that we could just rerun tximport and replace the previously generated data with the new stuff solution or next step is this something we want to do if so what would our heuristic be jaclyn taroni said in the demo day meeting that she would start thinking about these questions new issue checklist the title is short and descriptive you have explained the context that led you to write this issue you have reported a problem or idea you have proposed a solution or next step
1
56,981
23,980,819,705
IssuesEvent
2022-09-13 14:55:56
hashicorp/nomad
https://api.github.com/repos/hashicorp/nomad
closed
services: add support for check_restart in nomad service checks
type/enhancement theme/service-discovery/nomad
Nomad v1.4 will add support for checks in services provisioned in Nomad's native service provider. Not all features supported by the Consul integration are supported yet, including `check_restart`. Let's add support to make `check_restart` functionality work with Nomad service checks.
1.0
services: add support for check_restart in nomad service checks - Nomad v1.4 will add support for checks in services provisioned in Nomad's native service provider. Not all features supported by the Consul integration are supported yet, including `check_restart`. Let's add support to make `check_restart` functionality work with Nomad service checks.
non_process
services add support for check restart in nomad service checks nomad will add support for checks in services provisioned in nomad s native service provider not all features supported by the consul integration are supported yet including check restart let s add support to make check restart functionality work with nomad service checks
0
13,126
15,526,357,689
IssuesEvent
2021-03-13 00:56:39
googleapis/env-tests-logging
https://api.github.com/repos/googleapis/env-tests-logging
closed
Use Python 3.7 for nox tests
type: process
Context: Kokoro supports only up to GCP Ubuntu 16.04, which means the Node/Go/Java images that use `apt-get` install Python 3.7, which is the majority. This means that to integrate with env-tests-logging, we have to edit all the devrel-test images to install Python 3.8 [from source](https://github.com/googleapis/testing-infra-docker/blob/ced75535439bb0223f31b54d05b16f3ff68db206/python/googleapis/python-multi/Dockerfile#L123-L146) or via pyenv, or create/maintain new images. We'd do this for each version of the languages we want to run the tests in. Though an argument can be made that we *should* update all the images anyway to use pyenv.
1.0
Use Python 3.7 for nox tests - Context: Kokoro supports only up to GCP Ubuntu 16.04, which means the Node/Go/Java images that use `apt-get` install Python 3.7, which is the majority. This means that to integrate with env-tests-logging, we have to edit all the devrel-test images to install Python 3.8 [from source](https://github.com/googleapis/testing-infra-docker/blob/ced75535439bb0223f31b54d05b16f3ff68db206/python/googleapis/python-multi/Dockerfile#L123-L146) or via pyenv, or create/maintain new images. We'd do this for each version of the languages we want to run the tests in. Though an argument can be made that we *should* update all the images anyway to use pyenv.
process
use python for nox tests context kokoro supports only up to gcp ubuntu which means node go java images that use apt get installs version which is the majority this means to integrate with env tests logging we have to edit all the devrel test images to install python or pyenv or create maintain new images we d do this for each version of the languages we want to run the tests in though an argument can be made we should update all the images anyway to use pyenv
1
62,901
8,648,295,414
IssuesEvent
2018-11-26 16:15:11
adokter/bioRad
https://api.github.com/repos/adokter/bioRad
closed
Decide what to do with intro_ppi vignette
documentation
There was an `intro_ppi.Rmd` vignette written by @plieper in 2017. Because it eventually contained outdated function names, its [last version](https://github.com/adokter/bioRad/blob/42ee484ce491fdbc94080991061682fcee875407/vignettes/intro_ppi.Rmd) was removed in 8f638d59b9148b6b6e37ae781088b1b1d4fd1926. It should be checked if examples of [plot(ppi)](https://github.com/adokter/bioRad/blob/master/R/plot.ppi.R) cover what was described in the vignette.
1.0
Decide what to do with intro_ppi vignette - There was an `intro_ppi.Rmd` vignette written by @plieper in 2017. Because it eventually contained outdated function names, its [last version](https://github.com/adokter/bioRad/blob/42ee484ce491fdbc94080991061682fcee875407/vignettes/intro_ppi.Rmd) was removed in 8f638d59b9148b6b6e37ae781088b1b1d4fd1926. It should be checked if examples of [plot(ppi)](https://github.com/adokter/bioRad/blob/master/R/plot.ppi.R) cover what was described in the vignette.
non_process
decide what to do with intro ppi vignette there was an intro ppi rmd vignette written by plieper in because it eventually contained outdated function names its was removed in it should be checked if examples of cover what was described in the vignette
0
21,767
30,282,326,373
IssuesEvent
2023-07-08 08:08:55
creasico/creasi
https://api.github.com/repos/creasico/creasi
closed
Auto Deployment Preview for every Pull Request
business process chore deployment
This task essentially continues https://github.com/creasico/laravel-project/issues/6 and should be applicable to all of our future projects without significant changes. ### Clear and concise description of the task The main motivation is to simplify the review process for every completed task. Technically this feature is similar to what [Heroku](https://devcenter.heroku.com/articles/github-integration-review-apps) and [Vercel](https://vercel.com/docs/concepts/deployments/preview-deployments) offer, but (at least for now) they are not an option here for cost & billing management reasons. Another consideration is using this as a distribution method for every product we will develop, and yes, there is a plan to distribute to [Docker Hub](https://hub.docker.com/), but that is a topic for another time; for now, focus on the [Google](https://cloud.google.com/container-registry) and [GitHub](https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry) Container Registry during development. ### Requirements - [ ] Deployment is triggered from a Pull Request via GitHub Actions.
- [ ] The environment name listed [here](https://github.com/creasico/creasi/deployments) is `preview` - [ ] Each `preview` can have a different `url` matching its pull request, e.g. `<project-name>-<pr-number>.creasi.dev` - [ ] Each `preview` is deleted when its Pull Request is `merged` or `closed` ### Suggested solution The most affordable options for now are the services we already use, namely - Google Cloud Platform with [Cloud Run](https://cloud.google.com/run) This means every project will be distributed as a docker container, either via [Google Container Registry](https://cloud.google.com/container-registry/) or [GitHub Package Registry](https://docs.github.com/en/packages/working-with-a-github-packages-registry). - IDCloudHost with [Cloud VPS](https://idcloudhost.com/cloud-vps/) Unlike GCP, where we can spin up a server whenever needed, with this option we are limited to a single server to host all deployment previews. Even the use of Docker containers themselves may be limited, so make good use of [Deployer](https://deployer.org/) here. Regarding the server we use, I have already configured it so that every project deployed to `var/www/<project-name>` automatically gets the url `<project-name>.creasi.dev`. For experiments you can use your own home folder on the server and configure nginx to read it as a vhost. ### Bottom Lines Feel free to discuss
1.0
Auto Deployment Preview for every Pull Request - This task essentially continues https://github.com/creasico/laravel-project/issues/6 and should be applicable to all of our future projects without significant changes. ### Clear and concise description of the task The main motivation is to simplify the review process for every completed task. Technically this feature is similar to what [Heroku](https://devcenter.heroku.com/articles/github-integration-review-apps) and [Vercel](https://vercel.com/docs/concepts/deployments/preview-deployments) offer, but (at least for now) they are not an option here for cost & billing management reasons. Another consideration is using this as a distribution method for every product we will develop, and yes, there is a plan to distribute to [Docker Hub](https://hub.docker.com/), but that is a topic for another time; for now, focus on the [Google](https://cloud.google.com/container-registry) and [GitHub](https://docs.github.com/en/packages/working-with-a-github-packages-registry/working-with-the-container-registry) Container Registry during development. ### Requirements - [ ] Deployment is triggered from a Pull Request via GitHub Actions.
- [ ] The environment name listed [here](https://github.com/creasico/creasi/deployments) is `preview` - [ ] Each `preview` can have a different `url` matching its pull request, e.g. `<project-name>-<pr-number>.creasi.dev` - [ ] Each `preview` is deleted when its Pull Request is `merged` or `closed` ### Suggested solution The most affordable options for now are the services we already use, namely - Google Cloud Platform with [Cloud Run](https://cloud.google.com/run) This means every project will be distributed as a docker container, either via [Google Container Registry](https://cloud.google.com/container-registry/) or [GitHub Package Registry](https://docs.github.com/en/packages/working-with-a-github-packages-registry). - IDCloudHost with [Cloud VPS](https://idcloudhost.com/cloud-vps/) Unlike GCP, where we can spin up a server whenever needed, with this option we are limited to a single server to host all deployment previews. Even the use of Docker containers themselves may be limited, so make good use of [Deployer](https://deployer.org/) here. Regarding the server we use, I have already configured it so that every project deployed to `var/www/<project-name>` automatically gets the url `<project-name>.creasi.dev`. For experiments you can use your own home folder on the server and configure nginx to read it as a vhost. ### Bottom Lines Feel free to discuss
process
auto deployment preview for every pull request this task essentially continues and should be applicable to all of our future projects without significant changes clear and concise description of the task the main motivation is to simplify the review process for every completed task technically this feature is similar to what and offer but at least for now they are not an option here for cost billing management reasons another consideration is using this as a distribution method for every product we will develop and yes there is a plan to distribute to but that is a topic for another time for now focus on the and container registry during development requirements deployment is triggered from a pull request via github actions the environment name listed is preview each preview can have a different url matching its pull request e g creasi dev each preview is deleted when its pull request is merged or closed suggested solution the most affordable options for now are the services we already use namely google cloud platform with this means every project will be distributed as a docker container either via or idcloudhost with unlike gcp where we can spin up a server whenever needed with this option we are limited to a single server to host all deployment previews even the use of docker containers themselves may be limited so make good use of here regarding the server we use i have already configured it so that every project deployed to var www automatically gets the url creasi dev for experiments you can use your own home folder on the server and configure nginx to read it as a vhost bottom lines feel free to discuss
1
12,169
14,741,650,708
IssuesEvent
2021-01-07 10:57:09
emacs-ess/ESS
https://api.github.com/repos/emacs-ess/ESS
opened
Eldoc timeouts with generic functions
bug lang:r process
The time for querying the arguments of generic functions increases with the number of attached packages. This causes eldoc timeouts while moving the cursor within generic functions. ```r library(vctrs) system.time(.ess_funargs("vec_cast")) #> user system elapsed #> 0.439 0.002 0.443 library(tidyverse) system.time(.ess_funargs("vec_cast")) #> user system elapsed #> 1.061 0.005 1.073 ``` This doesn't concern non-generic functions: ```r system.time(.ess_funargs("vec_slice")) #> user system elapsed #> 0.031 0.000 0.032 ```
1.0
Eldoc timeouts with generic functions - The time for querying the arguments of generic functions increases with the number of attached packages. This causes eldoc timeouts while moving the cursor within generic functions. ```r library(vctrs) system.time(.ess_funargs("vec_cast")) #> user system elapsed #> 0.439 0.002 0.443 library(tidyverse) system.time(.ess_funargs("vec_cast")) #> user system elapsed #> 1.061 0.005 1.073 ``` This doesn't concern non-generic functions: ```r system.time(.ess_funargs("vec_slice")) #> user system elapsed #> 0.031 0.000 0.032 ```
process
eldoc timeouts with generic functions the time for querying the arguments of generic functions increases with the number of attached packages this causes eldoc timeouts while moving the cursor within generic functions r library vctrs system time ess funargs vec cast user system elapsed library tidyverse system time ess funargs vec cast user system elapsed this doesn t concern non generic functions r system time ess funargs vec slice user system elapsed
1
512,124
14,888,578,308
IssuesEvent
2021-01-20 20:03:59
freeorion/freeorion
https://api.github.com/repos/freeorion/freeorion
closed
Unowned monsters and planets not shooting
category:bug component:game mechanic priority:high
Bug Report ========== Environment ----------- * **FreeOrion Version**: b86e635 * **Operating System**: Windows 8.1 * **Fetched as** * Weekly development build Description ----------- Unowned monsters and planets with defenses are not shooting at my ships. This breaks balance for galaxy settings Monster, Specials and Natives. It's rather gamebreaking. Check out system Vega alpha in the attached save game. [monsters_not_shooting.zip](https://github.com/freeorion/freeorion/files/5826992/monsters_not_shooting.zip)
1.0
Unowned monsters and planets not shooting - Bug Report ========== Environment ----------- * **FreeOrion Version**: b86e635 * **Operating System**: Windows 8.1 * **Fetched as** * Weekly development build Description ----------- Unowned monsters and planets with defenses are not shooting at my ships. This breaks balance for galaxy settings Monster, Specials and Natives. It's rather gamebreaking. Check out system Vega alpha in the attached save game. [monsters_not_shooting.zip](https://github.com/freeorion/freeorion/files/5826992/monsters_not_shooting.zip)
non_process
unowned monsters and planets not shooting bug report environment freeorion version operating system windows fetched as weekly development build description unowned monsters and planets with defenses are not shooting at my ships this breaks balance for galaxy settings monster specials and natives it s rather gamebreaking check out system vega alpha in the attached save game
0