Column summary:

| column | dtype | values |
|---|---|---|
| Unnamed: 0 | int64 | 0 – 832k |
| id | float64 | 2.49B – 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 – 19 |
| repo | stringlengths | 7 – 112 |
| repo_url | stringlengths | 36 – 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 – 744 |
| labels | stringlengths | 4 – 574 |
| body | stringlengths | 9 – 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 – 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 – 188k |
| binary_label | int64 | 0 – 1 |
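The summary above follows the convention of a Hugging Face `datasets` preview, where `stringlengths` reports the min/max character length of a string column and `stringclasses` reports its number of distinct values. A minimal sketch of reproducing those statistics with pandas, using a hypothetical two-row sample (column names come from the summary; the values are illustrative only):

```python
import pandas as pd

# Hypothetical two-row sample mirroring a few of the columns summarized above.
df = pd.DataFrame({
    "type": ["IssuesEvent", "IssuesEvent"],
    "created_at": ["2023-07-07 19:22:10", "2020-03-23 22:02:55"],
    "label": ["process", "non_process"],
    "binary_label": [1, 0],
})

# "stringclasses": number of distinct values in a string column.
print(df["type"].nunique())    # 1 value
print(df["label"].nunique())   # 2 values

# "stringlengths": min/max character length of a string column.
lengths = df["created_at"].str.len()
print(lengths.min(), lengths.max())  # 19 19 for this fixed-width timestamp format
```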
---
Unnamed: 0: 21,756
id: 30,275,659,981
type: IssuesEvent
created_at: 2023-07-07 19:22:10
repo: microsoft/vscode
repo_url: https://api.github.com/repos/microsoft/vscode
action: closed
title: Installing auto replies will force the pty host to spawn
labels: bug terminal-process
body:
Repro: 1. Open VS Code 2. Close all terminal 3. Exit VS Code 4. Open VS Code (a terminal is not restored) 5. Set: ``` "terminal.integrated.autoReplies": { "Terminate batch job (Y/N)": "Y\r" } ``` 6. Open process explorer, 🐛 the pty host process should not be in the process tree
index: 1.0
text_combine:
Installing auto replies will force the pty host to spawn - Repro: 1. Open VS Code 2. Close all terminal 3. Exit VS Code 4. Open VS Code (a terminal is not restored) 5. Set: ``` "terminal.integrated.autoReplies": { "Terminate batch job (Y/N)": "Y\r" } ``` 6. Open process explorer, 🐛 the pty host process should not be in the process tree
label: process
text:
installing auto replies will force the pty host to spawn repro open vs code close all terminal exit vs code open vs code a terminal is not restored set terminal integrated autoreplies terminate batch job y n y r open process explorer 🐛 the pty host process should not be in the process tree
binary_label: 1
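Across the rows in this dump, `binary_label` is a direct encoding of `label` (`process` → 1, `non_process` → 0). A minimal sketch of that encoding, assuming the mapping observed in these samples holds for the whole dataset:

```python
def encode_label(label: str) -> int:
    """Map the string label to the binary_label seen in the sample rows."""
    mapping = {"process": 1, "non_process": 0}
    return mapping[label]

print(encode_label("process"))      # 1
print(encode_label("non_process"))  # 0
```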
---
Unnamed: 0: 30,697
id: 11,842,026,060
type: IssuesEvent
created_at: 2020-03-23 22:02:55
repo: Mohib-hub/karate
repo_url: https://api.github.com/repos/Mohib-hub/karate
action: opened
title: CVE-2019-9515 (High) detected in netty-codec-http2-4.1.32.Final.jar
labels: security vulnerability
body:
## CVE-2019-9515 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http2-4.1.32.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/netty-codec-http2/">http://netty.io/netty-codec-http2/</a></p> <p>Path to vulnerable library: /tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212750/netty-codec-http2-4.1.32.Final.jar,/tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212750/netty-codec-http2-4.1.32.Final.jar</p> <p> Dependency Hierarchy: - karate-gatling-0.9.5.jar (Root Library) - gatling-charts-highcharts-3.0.2.jar - gatling-app-3.0.2.jar - gatling-http-3.0.2.jar - gatling-http-client-3.0.2.jar - :x: **netty-codec-http2-4.1.32.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/karate/commit/c8766c8277306046ef9c6f01148b98b0d2bafe02">c8766c8277306046ef9c6f01148b98b0d2bafe02</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Some HTTP/2 implementations are vulnerable to a settings flood, potentially leading to a denial of service. The attacker sends a stream of SETTINGS frames to the peer. Since the RFC requires that the peer reply with one acknowledgement per SETTINGS frame, an empty SETTINGS frame is almost equivalent in behavior to a ping. Depending on how efficiently this data is queued, this can consume excess CPU, memory, or both. 
<p>Publish Date: 2019-08-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9515>CVE-2019-9515</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-9515">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-9515</a></p> <p>Release Date: 2019-08-13</p> <p>Fix Resolution: 7.1.7,8.0.4</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http2","packageVersion":"4.1.32.Final","isTransitiveDependency":true,"dependencyTree":"com.intuit.karate:karate-gatling:0.9.5;io.gatling.highcharts:gatling-charts-highcharts:3.0.2;io.gatling:gatling-app:3.0.2;io.gatling:gatling-http:3.0.2;io.gatling:gatling-http-client:3.0.2;io.netty:netty-codec-http2:4.1.32.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"7.1.7,8.0.4"}],"vulnerabilityIdentifier":"CVE-2019-9515","vulnerabilityDetails":"Some HTTP/2 implementations are vulnerable to a settings flood, potentially leading to a denial of service. The attacker sends a stream of SETTINGS frames to the peer. 
Since the RFC requires that the peer reply with one acknowledgement per SETTINGS frame, an empty SETTINGS frame is almost equivalent in behavior to a ping. Depending on how efficiently this data is queued, this can consume excess CPU, memory, or both.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9515","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
index: True
text_combine:
CVE-2019-9515 (High) detected in netty-codec-http2-4.1.32.Final.jar - ## CVE-2019-9515 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http2-4.1.32.Final.jar</b></p></summary> <p>Netty is an asynchronous event-driven network application framework for rapid development of maintainable high performance protocol servers and clients.</p> <p>Library home page: <a href="http://netty.io/netty-codec-http2/">http://netty.io/netty-codec-http2/</a></p> <p>Path to vulnerable library: /tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212750/netty-codec-http2-4.1.32.Final.jar,/tmp/ws-ua_20200323212715/downloadResource_20478f94-1633-47a1-ad79-827f8481d3e7/20200323212750/netty-codec-http2-4.1.32.Final.jar</p> <p> Dependency Hierarchy: - karate-gatling-0.9.5.jar (Root Library) - gatling-charts-highcharts-3.0.2.jar - gatling-app-3.0.2.jar - gatling-http-3.0.2.jar - gatling-http-client-3.0.2.jar - :x: **netty-codec-http2-4.1.32.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/karate/commit/c8766c8277306046ef9c6f01148b98b0d2bafe02">c8766c8277306046ef9c6f01148b98b0d2bafe02</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Some HTTP/2 implementations are vulnerable to a settings flood, potentially leading to a denial of service. The attacker sends a stream of SETTINGS frames to the peer. Since the RFC requires that the peer reply with one acknowledgement per SETTINGS frame, an empty SETTINGS frame is almost equivalent in behavior to a ping. Depending on how efficiently this data is queued, this can consume excess CPU, memory, or both. 
<p>Publish Date: 2019-08-13 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9515>CVE-2019-9515</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-9515">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-9515</a></p> <p>Release Date: 2019-08-13</p> <p>Fix Resolution: 7.1.7,8.0.4</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http2","packageVersion":"4.1.32.Final","isTransitiveDependency":true,"dependencyTree":"com.intuit.karate:karate-gatling:0.9.5;io.gatling.highcharts:gatling-charts-highcharts:3.0.2;io.gatling:gatling-app:3.0.2;io.gatling:gatling-http:3.0.2;io.gatling:gatling-http-client:3.0.2;io.netty:netty-codec-http2:4.1.32.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"7.1.7,8.0.4"}],"vulnerabilityIdentifier":"CVE-2019-9515","vulnerabilityDetails":"Some HTTP/2 implementations are vulnerable to a settings flood, potentially leading to a denial of service. The attacker sends a stream of SETTINGS frames to the peer. 
Since the RFC requires that the peer reply with one acknowledgement per SETTINGS frame, an empty SETTINGS frame is almost equivalent in behavior to a ping. Depending on how efficiently this data is queued, this can consume excess CPU, memory, or both.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-9515","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
label: non_process
text:
cve high detected in netty codec final jar cve high severity vulnerability vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to vulnerable library tmp ws ua downloadresource netty codec final jar tmp ws ua downloadresource netty codec final jar dependency hierarchy karate gatling jar root library gatling charts highcharts jar gatling app jar gatling http jar gatling http client jar x netty codec final jar vulnerable library found in head commit a href vulnerability details some http implementations are vulnerable to a settings flood potentially leading to a denial of service the attacker sends a stream of settings frames to the peer since the rfc requires that the peer reply with one acknowledgement per settings frame an empty settings frame is almost equivalent in behavior to a ping depending on how efficiently this data is queued this can consume excess cpu memory or both publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails some http implementations are vulnerable to a settings flood potentially leading to a denial of service the attacker sends a stream of settings frames to the peer since the rfc requires that the peer reply with one acknowledgement per settings frame an empty settings frame is almost equivalent in behavior to a ping depending on how efficiently this data is queued this can consume excess cpu memory or 
both vulnerabilityurl
binary_label: 0
---
Unnamed: 0: 50,150
id: 6,062,207,110
type: IssuesEvent
created_at: 2017-06-14 08:51:47
repo: mattbearman/lime
repo_url: https://api.github.com/repos/mattbearman/lime
action: opened
title: test
labels: question test
body:
## [View in BugMuncher Control Panel](https://app.bugmuncher.com/websites/a53d1139bceee3b3730c237aff8217d824ad1ccc/feedback/990b6796b1670dde25216fef582f1425c20f1d7b) ## ## Details ## **Submitted:** June 14, 2017 08:51 **Category:** Question **Sender Email:** **Website:** BugMuncher **URL:** https://www.bugmuncher.com/ **Operating System:** Linux **Browser:** Firefox 53.0 **Browser Size:** 1301 x 673 **User Agent:** Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:53.0) Gecko/20100101 Firefox/53.0 **Description:** ## Screenshot ## ![Screenshot](http://api.bugmuncher.com/feedback/990b6796b1670dde25216fef582f1425c20f1d7b/screenshot.png) 1. fuck this ## Browser Plugins ## Shockwave Flash
index: 1.0
text_combine:
test - ## [View in BugMuncher Control Panel](https://app.bugmuncher.com/websites/a53d1139bceee3b3730c237aff8217d824ad1ccc/feedback/990b6796b1670dde25216fef582f1425c20f1d7b) ## ## Details ## **Submitted:** June 14, 2017 08:51 **Category:** Question **Sender Email:** **Website:** BugMuncher **URL:** https://www.bugmuncher.com/ **Operating System:** Linux **Browser:** Firefox 53.0 **Browser Size:** 1301 x 673 **User Agent:** Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:53.0) Gecko/20100101 Firefox/53.0 **Description:** ## Screenshot ## ![Screenshot](http://api.bugmuncher.com/feedback/990b6796b1670dde25216fef582f1425c20f1d7b/screenshot.png) 1. fuck this ## Browser Plugins ## Shockwave Flash
label: non_process
text:
test details submitted june category question sender email website bugmuncher url operating system linux browser firefox browser size x user agent mozilla ubuntu linux rv gecko firefox description screenshot fuck this browser plugins shockwave flash
binary_label: 0
---
Unnamed: 0: 20,491
id: 27,146,979,281
type: IssuesEvent
created_at: 2023-02-16 20:49:08
repo: MicrosoftDocs/azure-devops-docs
repo_url: https://api.github.com/repos/MicrosoftDocs/azure-devops-docs
action: closed
title: Outdated instruction for using output from different stage
labels: devops/prod doc-bug Pri1 devops-cicd-process/tech
body:
This instruction in [Use outputs in a different stage](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#use-outputs-in-a-different-stage) does not seem to work: > At the stage level, the format for referencing variables from a different stage is dependencies.STAGE.outputs['JOB.TASK.VARIABLE'] The only way I could get it to work is to use `stageDependencies` at stage level too. ```yaml - stage: stage2 dependsOn: stage1 variables: finalResourceNamePrefix: $[ stageDependencies.stage1.jobA.outputs['prefixCalculator.FinalResourceNamePrefix'] ] # works finalResourceNamePrefix: $[ dependencies.stage1.outputs['jobA.prefixCalculator.FinalResourceNamePrefix'] ] # does not work ``` An outdated instruction, perhaps? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: dd7e0bd3-1f7d-d7b6-cc72-5ef63c31b46a * Version Independent ID: dae87abd-b73d-9120-bcdb-6097d4b40f2a * Content: [Define variables - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch) * Content Source: [docs/pipelines/process/variables.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/variables.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
index: 1.0
text_combine:
Outdated instruction for using output from different stage - This instruction in [Use outputs in a different stage](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch#use-outputs-in-a-different-stage) does not seem to work: > At the stage level, the format for referencing variables from a different stage is dependencies.STAGE.outputs['JOB.TASK.VARIABLE'] The only way I could get it to work is to use `stageDependencies` at stage level too. ```yaml - stage: stage2 dependsOn: stage1 variables: finalResourceNamePrefix: $[ stageDependencies.stage1.jobA.outputs['prefixCalculator.FinalResourceNamePrefix'] ] # works finalResourceNamePrefix: $[ dependencies.stage1.outputs['jobA.prefixCalculator.FinalResourceNamePrefix'] ] # does not work ``` An outdated instruction, perhaps? --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: dd7e0bd3-1f7d-d7b6-cc72-5ef63c31b46a * Version Independent ID: dae87abd-b73d-9120-bcdb-6097d4b40f2a * Content: [Define variables - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/variables?view=azure-devops&tabs=yaml%2Cbatch) * Content Source: [docs/pipelines/process/variables.md](https://github.com/MicrosoftDocs/azure-devops-docs/blob/main/docs/pipelines/process/variables.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
label: process
text:
outdated instruction for using output from different stage this instruction in does not seem to work at the stage level the format for referencing variables from a different stage is dependencies stage outputs the only way i could get it to work is to use stagedependencies at stage level too yaml stage dependson variables finalresourcenameprefix works finalresourcenameprefix does not work an outdated instruction perhaps document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id bcdb content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
binary_label: 1
---
Unnamed: 0: 802,918
id: 29,058,501,341
type: IssuesEvent
created_at: 2023-05-15 01:52:53
repo: certbot/certbot
repo_url: https://api.github.com/repos/certbot/certbot
action: closed
title: Delay deployment of certificates to mitigate client’s clock issues
labels: feature request area: cert management area: install priority: unplanned needs-update
body:
According to [this study](https://research.google/pubs/pub46359/) there is a non-negligible number of clients who have certificate errors because of their misconfigured clock. The study gives the numbers of 6.7% of clients whose the clock is more than 24h late and 0.05% whose the clock is more than 24h ahead (see part 7.1 of the study and figure 4). If I’m not mistaken, the notBefore attribute of Let’s Encrypt certificates is one hour before the current hour, probably to mitigate clock issues. Instead of proposing a change of the notBefore attributes of LE certificates (this is perhaps imposed by CA/Browser Forum or other security rules), this feature request proposes to improve the quality of the renewal process of certificates issued by certbot by **delaying the deployment of renewed certificates to mitigate much more clock issues**. Currently the newly-renewed certificate is deployed immediately. With this new scenario the newly-renewed certificate would be delayed a few days (e.g. 5 days) before being deployed and becoming active, obviously if the previous certificate is still valid during this delay (and a bit more to take into account clients whose the clock is ahead). The delay should be configurable and should be zero by default to keep the current scenario and to force sysadmins to consciously activate this delay, given it could be not expected.
index: 1.0
text_combine:
Delay deployment of certificates to mitigate client’s clock issues - According to [this study](https://research.google/pubs/pub46359/) there is a non-negligible number of clients who have certificate errors because of their misconfigured clock. The study gives the numbers of 6.7% of clients whose the clock is more than 24h late and 0.05% whose the clock is more than 24h ahead (see part 7.1 of the study and figure 4). If I’m not mistaken, the notBefore attribute of Let’s Encrypt certificates is one hour before the current hour, probably to mitigate clock issues. Instead of proposing a change of the notBefore attributes of LE certificates (this is perhaps imposed by CA/Browser Forum or other security rules), this feature request proposes to improve the quality of the renewal process of certificates issued by certbot by **delaying the deployment of renewed certificates to mitigate much more clock issues**. Currently the newly-renewed certificate is deployed immediately. With this new scenario the newly-renewed certificate would be delayed a few days (e.g. 5 days) before being deployed and becoming active, obviously if the previous certificate is still valid during this delay (and a bit more to take into account clients whose the clock is ahead). The delay should be configurable and should be zero by default to keep the current scenario and to force sysadmins to consciously activate this delay, given it could be not expected.
label: non_process
text:
delay deployment of certificates to mitigate client’s clock issues according to there is a non negligible number of clients who have certificate errors because of their misconfigured clock the study gives the numbers of of clients whose the clock is more than late and whose the clock is more than ahead see part of the study and figure if i’m not mistaken the notbefore attribute of let’s encrypt certificates is one hour before the current hour probably to mitigate clock issues instead of proposing a change of the notbefore attributes of le certificates this is perhaps imposed by ca browser forum or other security rules this feature request proposes to improve the quality of the renewal process of certificates issued by certbot by delaying the deployment of renewed certificates to mitigate much more clock issues currently the newly renewed certificate is deployed immediately with this new scenario the newly renewed certificate would be delayed a few days e g days before being deployed and becoming active obviously if the previous certificate is still valid during this delay and a bit more to take into account clients whose the clock is ahead the delay should be configurable and should be zero by default to keep the current scenario and to force sysadmins to consciously activate this delay given it could be not expected
binary_label: 0
---
Unnamed: 0: 104,852
id: 11,424,315,085
type: IssuesEvent
created_at: 2020-02-03 17:29:23
repo: broadinstitute/gatk
repo_url: https://api.github.com/repos/broadinstitute/gatk
action: closed
title: Inaccurate Definition NON_REF in gvcf files
labels: Documentation Vanilla
body:
The issue is that the vcf header NON_REF "Represents any possible alternative allele at this location" but it should instead be described "any allele that is neither the reference nor the observed alt alleles" User Report: Hi, There is a bug in how you define <NON_REF> in gvcf files. From your vcf header definition: ALT=<ID=NON_REF,Description="Represents any possible alternative allele at this location"> But, see this variant (from a previous post in your forum): 20 10000117 . C T,<NON_REF> 612.77 . BaseQRankSum=0.000;ClippingRankSum=-0.411;DP=38;MLEAC=1,0;MLEAF=0.500,0.00;MQ=221.39;MQ0=0;MQRankSum=-2.172;ReadPosRankSum=-0.235 GT:AD:DP:GQ:PL:SB 0/1:17,21,0:38:99:641,0,456,691,519,1210:6,11,11,10 As you can see, out of total depth of 38 (DP), the ref allele C has depth = 17, the non-ref T allele has depth = 21, and the <NON_REF> has depth of 0. Therefore, it seems that what NON_REF actually means is any allele that is neither the reference NOR any other observed non-ref allele. This is very different than how it is misrepresented in the VCF header as ANY non-reference allele. If it was really referring to any non-reference allele, then it should also have included allele 'T' in the above example. This should be fixed and clarified in your documentation, VCF header outputs, forums, etc. This Issue was generated from your [forums] [forums]: https://gatkforums.broadinstitute.org/gatk/discussion/24568/bug-in-definition-of-in-gvcf-files/p1
index: 1.0
text_combine:
Inaccurate Definition NON_REF in gvcf files - The issue is that the vcf header NON_REF "Represents any possible alternative allele at this location" but it should instead be described "any allele that is neither the reference nor the observed alt alleles" User Report: Hi, There is a bug in how you define <NON_REF> in gvcf files. From your vcf header definition: ALT=<ID=NON_REF,Description="Represents any possible alternative allele at this location"> But, see this variant (from a previous post in your forum): 20 10000117 . C T,<NON_REF> 612.77 . BaseQRankSum=0.000;ClippingRankSum=-0.411;DP=38;MLEAC=1,0;MLEAF=0.500,0.00;MQ=221.39;MQ0=0;MQRankSum=-2.172;ReadPosRankSum=-0.235 GT:AD:DP:GQ:PL:SB 0/1:17,21,0:38:99:641,0,456,691,519,1210:6,11,11,10 As you can see, out of total depth of 38 (DP), the ref allele C has depth = 17, the non-ref T allele has depth = 21, and the <NON_REF> has depth of 0. Therefore, it seems that what NON_REF actually means is any allele that is neither the reference NOR any other observed non-ref allele. This is very different than how it is misrepresented in the VCF header as ANY non-reference allele. If it was really referring to any non-reference allele, then it should also have included allele 'T' in the above example. This should be fixed and clarified in your documentation, VCF header outputs, forums, etc. This Issue was generated from your [forums] [forums]: https://gatkforums.broadinstitute.org/gatk/discussion/24568/bug-in-definition-of-in-gvcf-files/p1
label: non_process
text:
inaccurate definition non ref in gvcf files the issue is that the vcf header non ref represents any possible alternative allele at this location but it should instead be described any allele that is neither the reference nor the observed alt alleles user report hi there is a bug in how you define in gvcf files from your vcf header definition alt but see this variant from a previous post in your forum c t baseqranksum clippingranksum dp mleac mleaf mq mqranksum readposranksum gt ad dp gq pl sb as you can see out of total depth of dp the ref allele c has depth the non ref t allele has depth and the has depth of therefore it seems that what non ref actually means is any allele that is neither the reference nor any other observed non ref allele this is very different than how it is misrepresented in the vcf header as any non reference allele if it was really referring to any non reference allele then it should also have included allele t in the above example this should be fixed and clarified in your documentation vcf header outputs forums etc this issue was generated from your
binary_label: 0
---
Unnamed: 0: 66,372
id: 3,253,072,265
type: IssuesEvent
created_at: 2015-10-19 17:29:21
repo: NuGet/Home
repo_url: https://api.github.com/repos/NuGet/Home
action: closed
title: %temp%\NuGet has almost 4GB of files
labels: 2 - Working Area:CommandLine Area:VS.Client Priority:2 Type:Bug
body:
It seems that NuGet puts some temporary files there with extracted packages and never cleans up. It's worth noting that this isn't a year's worth of cache or anything - all the timestamps are from a 3-day period. I believe that NuGet should clean these files/folders up because they are apparently each used only once and never again. <!--- @huboard:{"order":802.0,"milestone_order":802,"custom_state":""} -->
index: 1.0
text_combine:
%temp%\NuGet has almost 4GB of files - It seems that NuGet puts some temporary files there with extracted packages and never cleans up. It's worth noting that this isn't a year's worth of cache or anything - all the timestamps are from a 3-day period. I believe that NuGet should clean these files/folders up because they are apparently each used only once and never again. <!--- @huboard:{"order":802.0,"milestone_order":802,"custom_state":""} -->
label: non_process
text:
temp nuget has almost of files it seems that nuget puts some temporary files there with extracted packages and never cleans up it s worth noting that this isn t a year s worth of cache or anything all the timestamps are from a day period i believe that nuget should clean these files folders up because they are apparently each used only once and never again huboard order milestone order custom state
binary_label: 0
---
Unnamed: 0: 467,946
id: 13,458,875,828
type: IssuesEvent
created_at: 2020-09-09 11:21:55
repo: knime-mpicbg/knime-scripting
repo_url: https://api.github.com/repos/knime-mpicbg/knime-scripting
action: closed
title: open in python should support linux
labels: Python feature request medium priority
body:
linux is reported as unsupported platform. is there any reason, why?
index: 1.0
text_combine:
open in python should support linux - linux is reported as unsupported platform. is there any reason, why?
label: non_process
text:
open in python should support linux linux is reported as unsupported platform is there any reason why
binary_label: 0
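Comparing the `text_combine` and `text` values in the rows above, `text` appears to be a lowercased copy with punctuation, digits, URLs, and markup stripped and whitespace collapsed (though some non-ASCII characters such as emoji survive). A rough sketch of such a normalization — the exact rules are an assumption inferred from the samples, not a documented pipeline:

```python
import re

def normalize(text: str) -> str:
    # Lowercase, keep only ASCII letters and whitespace, collapse runs of spaces.
    # Real preprocessing for this dataset evidently also preserves some emoji;
    # this sketch ignores that detail for simplicity.
    text = text.lower()
    text = re.sub(r"[^a-z\s]", " ", text)
    return re.sub(r"\s+", " ", text).strip()

print(normalize("open in python should support linux - linux is reported as unsupported platform."))
# → "open in python should support linux linux is reported as unsupported platform"
```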
---
Unnamed: 0: 3,252
id: 6,329,937,690
type: IssuesEvent
created_at: 2017-07-26 05:30:11
repo: nodejs/node
repo_url: https://api.github.com/repos/nodejs/node
action: closed
title: Shouldn't the default SIGINT handler trigger process.on('exit')?
labels: process
body:
Maybe it's a bug, maybe it's by design. Documentation doesn't mention anything about whether or not the "exit" event should fire, but it's a bit silly I have to write: ``` js process.on('SIGINT', function () { process.exit(somecode); // now the "exit" event will fire }); ``` Just because the built-in handler won't do it for me. It also means I can no longer depend on the default exit code that Node associates with SIGINT, which apparently is `128 + signal number` (according to the docs).
index: 1.0
text_combine:
Shouldn't the default SIGINT handler trigger process.on('exit')? - Maybe it's a bug, maybe it's by design. Documentation doesn't mention anything about whether or not the "exit" event should fire, but it's a bit silly I have to write: ``` js process.on('SIGINT', function () { process.exit(somecode); // now the "exit" event will fire }); ``` Just because the built-in handler won't do it for me. It also means I can no longer depend on the default exit code that Node associates with SIGINT, which apparently is `128 + signal number` (according to the docs).
label: process
text:
shouldn t the default sigint handler trigger process on exit maybe it s a bug maybe it s by design documentation doesn t mention anything about whether or not the exit event should fire but it s a bit silly i have to write js process on sigint function process exit somecode now the exit event will fire just because the built in handler won t do it for me it also means i can no longer depend on the default exit code that node associates with sigint which apparently is signal number according to the docs
binary_label: 1
---
Unnamed: 0: 14,663
id: 17,786,556,680
type: IssuesEvent
created_at: 2021-08-31 11:48:39
repo: bazelbuild/bazel
repo_url: https://api.github.com/repos/bazelbuild/bazel
action: closed
title: Parcel V2 sandbox issue
labels: type: support / not a bug (process) untriaged team-Local-Exec
body:
### Description of the problem / feature request: Bazel sandbox on Linux is causing problems with parcel V2 RC ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. I trying to run parcel V2 similiar to the example for parcel V1 at https://github.com/bazelbuild/rules_nodejs/tree/stable/examples/parcel replacing parcel.bzl with https://gist.github.com/kohlerm/b41c6b14f63db757341a618ed9bdb5de (still not completely working) it fails with **Error: Bad file descriptor Error: Bad file descriptor** when running with the optoin --spawn_strategy=standalone the message disappears. I suspect this is because parcel V2 is using memory mapped files for their cache see https://v2.parceljs.org/blog/rc0/ ### What operating system are you running Bazel on? Distributor ID: Kali Description: Kali GNU/Linux Rolling Release: 2020.3 Codename: kali-rolling WSL2 on Windows 10 ### What's the output of `bazel info release`? > Replace this line with your answer. release 3.7.2- (@non-git) installed via nix package manager > Replace this line with your answer. ### Have you found anything relevant by searching the web? No ### Any other information, logs, or outputs that you want to share? ``` bazel build --subcommands bundle INFO: Analyzed target //:bundle (0 packages loaded, 0 targets configured). INFO: Found 1 target... 
SUBCOMMAND: # //:bundle [action 'Bundling JavaScript bundle.js [parcel]', configuration: de16027a7878784780b32c21a23af2a4b61e42de13ca1bc3615d29f2d1aacb01, execution platform: @local_config_platform//:host] (cd /home/kohlerm/.cache/bazel/_bazel_kohlerm/bbfa98f54bdb6f11a7cff30094948c96/execroot/examples_parcel && \ exec env - \ BAZEL_NODE_MODULES_ROOTS='' \ COMPILATION_MODE=fastbuild \ bazel-out/host/bin/external/npm/parcel/bin/parcel.sh build foo.js --dist-dir bazel-out/k8-fastbuild/bin --cache-dir /tmp/cache '--bazel_node_modules_manifest=bazel-out/k8-fastbuild/bin/_bundle.module_mappings.json') ERROR: /home/kohlerm/bazel_parcel/BUILD.bazel:4:7: Bundling JavaScript bundle.js [parcel] failed (Exit 1): parcel.sh failed: error executing command bazel-out/host/bin/external/npm/parcel/bin/parcel.sh build foo.js --dist-dir bazel-out/k8-fastbuild/bin --cache-dir /tmp/cache ... (remaining 1 argument(s) skipped) Use --sandbox_debug to see verbose messages from the sandbox parcel.sh failed: error executing command bazel-out/host/bin/external/npm/parcel/bin/parcel.sh build foo.js --dist-dir bazel-out/k8-fastbuild/bin --cache-dir /tmp/cache ... (remaining 1 argument(s) skipped) Use --sandbox_debug to see verbose messages from the sandbox **Error: Bad file descriptor Error: Bad file descriptor** Building... Bundling... Packaging & Optimizing... ✨ Built in 783ms bazel-out/k8-fastbuild/bin/foo.js 60 B 157ms Target //:bundle failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 6.375s, Critical Path: 6.26s INFO: 2 processes: 2 internal. FAILED: Build did NOT complete successfully ```
1.0
Parcel V2 sandbox issue - ### Description of the problem / feature request: Bazel sandbox on Linux is causing problems with parcel V2 RC ### Bugs: what's the simplest, easiest way to reproduce this bug? Please provide a minimal example if possible. I am trying to run parcel V2 similar to the example for parcel V1 at https://github.com/bazelbuild/rules_nodejs/tree/stable/examples/parcel replacing parcel.bzl with https://gist.github.com/kohlerm/b41c6b14f63db757341a618ed9bdb5de (still not completely working) it fails with **Error: Bad file descriptor Error: Bad file descriptor** when running with the option --spawn_strategy=standalone the message disappears. I suspect this is because parcel V2 is using memory mapped files for their cache see https://v2.parceljs.org/blog/rc0/ ### What operating system are you running Bazel on? Distributor ID: Kali Description: Kali GNU/Linux Rolling Release: 2020.3 Codename: kali-rolling WSL2 on Windows 10 ### What's the output of `bazel info release`? > Replace this line with your answer. release 3.7.2- (@non-git) installed via nix package manager > Replace this line with your answer. ### Have you found anything relevant by searching the web? No ### Any other information, logs, or outputs that you want to share? ``` bazel build --subcommands bundle INFO: Analyzed target //:bundle (0 packages loaded, 0 targets configured). INFO: Found 1 target... 
SUBCOMMAND: # //:bundle [action 'Bundling JavaScript bundle.js [parcel]', configuration: de16027a7878784780b32c21a23af2a4b61e42de13ca1bc3615d29f2d1aacb01, execution platform: @local_config_platform//:host] (cd /home/kohlerm/.cache/bazel/_bazel_kohlerm/bbfa98f54bdb6f11a7cff30094948c96/execroot/examples_parcel && \ exec env - \ BAZEL_NODE_MODULES_ROOTS='' \ COMPILATION_MODE=fastbuild \ bazel-out/host/bin/external/npm/parcel/bin/parcel.sh build foo.js --dist-dir bazel-out/k8-fastbuild/bin --cache-dir /tmp/cache '--bazel_node_modules_manifest=bazel-out/k8-fastbuild/bin/_bundle.module_mappings.json') ERROR: /home/kohlerm/bazel_parcel/BUILD.bazel:4:7: Bundling JavaScript bundle.js [parcel] failed (Exit 1): parcel.sh failed: error executing command bazel-out/host/bin/external/npm/parcel/bin/parcel.sh build foo.js --dist-dir bazel-out/k8-fastbuild/bin --cache-dir /tmp/cache ... (remaining 1 argument(s) skipped) Use --sandbox_debug to see verbose messages from the sandbox parcel.sh failed: error executing command bazel-out/host/bin/external/npm/parcel/bin/parcel.sh build foo.js --dist-dir bazel-out/k8-fastbuild/bin --cache-dir /tmp/cache ... (remaining 1 argument(s) skipped) Use --sandbox_debug to see verbose messages from the sandbox **Error: Bad file descriptor Error: Bad file descriptor** Building... Bundling... Packaging & Optimizing... ✨ Built in 783ms bazel-out/k8-fastbuild/bin/foo.js 60 B 157ms Target //:bundle failed to build Use --verbose_failures to see the command lines of failed build steps. INFO: Elapsed time: 6.375s, Critical Path: 6.26s INFO: 2 processes: 2 internal. FAILED: Build did NOT complete successfully ```
process
parcel sandbox issue description of the problem feature request bazel sandbox on linux is causing problems with parcel rc bugs what s the simplest easiest way to reproduce this bug please provide a minimal example if possible i trying to run parcel similiar to the example for parcel at replacing parcel bzl with still not completely working it fails with error bad file descriptor error bad file descriptor when running with the optoin spawn strategy standalone the message disappears i suspect this is because parcel is using memory mapped files for their cache see what operating system are you running bazel on distributor id kali description kali gnu linux rolling release codename kali rolling on windows what s the output of bazel info release replace this line with your answer release non git installed via nix package manager replace this line with your answer have you found anything relevant by searching the web no any other information logs or outputs that you want to share bazel build subcommands bundle info analyzed target bundle packages loaded targets configured info found target subcommand bundle configuration execution platform local config platform host cd home kohlerm cache bazel bazel kohlerm execroot examples parcel exec env bazel node modules roots compilation mode fastbuild bazel out host bin external npm parcel bin parcel sh build foo js dist dir bazel out fastbuild bin cache dir tmp cache bazel node modules manifest bazel out fastbuild bin bundle module mappings json error home kohlerm bazel parcel build bazel bundling javascript bundle js failed exit parcel sh failed error executing command bazel out host bin external npm parcel bin parcel sh build foo js dist dir bazel out fastbuild bin cache dir tmp cache remaining argument s skipped use sandbox debug to see verbose messages from the sandbox parcel sh failed error executing command bazel out host bin external npm parcel bin parcel sh build foo js dist dir bazel out fastbuild bin cache dir tmp cache 
remaining argument s skipped use sandbox debug to see verbose messages from the sandbox error bad file descriptor error bad file descriptor building bundling packaging optimizing ✨ built in bazel out fastbuild bin foo js b target bundle failed to build use verbose failures to see the command lines of failed build steps info elapsed time critical path info processes internal failed build did not complete successfully
1
218,804
16,771,612,406
IssuesEvent
2021-06-14 15:24:00
prometheus-operator/prometheus-operator
https://api.github.com/repos/prometheus-operator/prometheus-operator
closed
Unable to add a new service to be monitored with kube-prometheus
good first issue hacktoberfest help wanted kind/documentation stale
**What did you do?** Created a new servicemonitor ``` apiVersion: monitoring.coreos.com/v1 kind: ServiceMonitor metadata: name: sanguine-butterfly-traefik-dashboard labels: app: sanguine-butterfly-traefik spec: selector: matchLabels: app: sanguine-butterfly-traefik endpoints: - port: 8080 interval: 10s ``` **What did you expect to see?** I expected to see it appear in the prometheus dashboard under targets **What did you see instead? Under which circumstances?** Nothing changed **Environment** * Kubernetes version information: 1.8.1 * Kubernetes cluster kind: kubespray * Manifests: ``` insert manifests relevant to the issue ``` * Prometheus Operator Logs: ``` E1106 21:21:42.632764 1 reflector.go:201] github.com/coreos/prometheus-operator/pkg/prometheus/operator.go:278: Failed to list *v1.ServiceMonitor: json: cannot unmarshal number into Go struct field Endpoint.port of type string ``` after seeing the operator logs I changed port to `-port: "8080"` . Then the error went away but I still did not see my target show up. 
New logs ``` Go struct field Endpoint.port of type string E1106 21:21:42.632764 1 reflector.go:201] github.com/coreos/prometheus-operator/pkg/prometheus/operator.go:278: Failed to list *v1.ServiceMonitor: json: cannot unmarshal number into Go struct field Endpoint.port of type string ts=2017-11-06T21:21:43Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:21:43Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:21:43Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:21:44Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:21:53Z caller=operator.go:321 component=alertmanageroperator msg="Alertmanager updated" key=monitoring/main ts=2017-11-06T21:21:53Z caller=operator.go:377 component=alertmanageroperator msg="sync alertmanager" key=monitoring/main ts=2017-11-06T21:23:20Z caller=operator.go:326 component=prometheusoperator msg="Prometheus updated" key=monitoring/k8s ts=2017-11-06T21:23:20Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:23:21Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:23:23Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:23:23Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:23:23Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:23:23Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:23:36Z caller=operator.go:321 component=alertmanageroperator msg="Alertmanager updated" 
key=monitoring/main ts=2017-11-06T21:23:36Z caller=operator.go:377 component=alertmanageroperator msg="sync alertmanager" key=monitoring/main ```
1.0
Unable to add a new service to be monitored with kube-prometheus - **What did you do?** Created a new servicemonitor ``` apiVersion: monitoring.coreos.com/v1 kind: ServiceMonitor metadata: name: sanguine-butterfly-traefik-dashboard labels: app: sanguine-butterfly-traefik spec: selector: matchLabels: app: sanguine-butterfly-traefik endpoints: - port: 8080 interval: 10s ``` **What did you expect to see?** I expected to see it appear in the prometheus dashboard under targets **What did you see instead? Under which circumstances?** Nothing changed **Environment** * Kubernetes version information: 1.8.1 * Kubernetes cluster kind: kubespray * Manifests: ``` insert manifests relevant to the issue ``` * Prometheus Operator Logs: ``` E1106 21:21:42.632764 1 reflector.go:201] github.com/coreos/prometheus-operator/pkg/prometheus/operator.go:278: Failed to list *v1.ServiceMonitor: json: cannot unmarshal number into Go struct field Endpoint.port of type string ``` after seeing the operator logs I changed port to `-port: "8080"` . Then the error went away but I still did not see my target show up. 
New logs ``` Go struct field Endpoint.port of type string E1106 21:21:42.632764 1 reflector.go:201] github.com/coreos/prometheus-operator/pkg/prometheus/operator.go:278: Failed to list *v1.ServiceMonitor: json: cannot unmarshal number into Go struct field Endpoint.port of type string ts=2017-11-06T21:21:43Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:21:43Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:21:43Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:21:44Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:21:53Z caller=operator.go:321 component=alertmanageroperator msg="Alertmanager updated" key=monitoring/main ts=2017-11-06T21:21:53Z caller=operator.go:377 component=alertmanageroperator msg="sync alertmanager" key=monitoring/main ts=2017-11-06T21:23:20Z caller=operator.go:326 component=prometheusoperator msg="Prometheus updated" key=monitoring/k8s ts=2017-11-06T21:23:20Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:23:21Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:23:23Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:23:23Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:23:23Z caller=operator.go:670 component=prometheusoperator msg="sync prometheus" key=monitoring/k8s ts=2017-11-06T21:23:23Z caller=operator.go:979 component=prometheusoperator msg="updating config skipped, no configuration change" ts=2017-11-06T21:23:36Z caller=operator.go:321 component=alertmanageroperator msg="Alertmanager updated" 
key=monitoring/main ts=2017-11-06T21:23:36Z caller=operator.go:377 component=alertmanageroperator msg="sync alertmanager" key=monitoring/main ```
non_process
unable to add a new service to be monitored with kube prometheus what did you do created a new servicemonitor apiversion monitoring coreos com kind servicemonitor metadata name sanguine butterfly traefik dashboard labels app sanguine butterfly traefik spec selector matchlabels app sanguine butterfly traefik endpoints port interval what did you expect to see i expected to see it appear in the prometheus dashboard under targets what did you see instead under which circumstances nothing changed environment kubernetes version information kubernetes cluster kind kubespray manifests insert manifests relevant to the issue prometheus operator logs reflector go github com coreos prometheus operator pkg prometheus operator go failed to list servicemonitor json cannot unmarshal number into go struct field endpoint port of type string after seeing the operator logs i changed port to port then the error went away but i still did not see my target show up new logs go struct field endpoint port of type string reflector go github com coreos prometheus operator pkg prometheus operator go failed to list servicemonitor json cannot unmarshal number into go struct field endpoint port of type string ts caller operator go component prometheusoperator msg sync prometheus key monitoring ts caller operator go component prometheusoperator msg updating config skipped no configuration change ts caller operator go component prometheusoperator msg sync prometheus key monitoring ts caller operator go component prometheusoperator msg updating config skipped no configuration change ts caller operator go component alertmanageroperator msg alertmanager updated key monitoring main ts caller operator go component alertmanageroperator msg sync alertmanager key monitoring main ts caller operator go component prometheusoperator msg prometheus updated key monitoring ts caller operator go component prometheusoperator msg sync prometheus key monitoring ts caller operator go component prometheusoperator msg 
updating config skipped no configuration change ts caller operator go component prometheusoperator msg sync prometheus key monitoring ts caller operator go component prometheusoperator msg updating config skipped no configuration change ts caller operator go component prometheusoperator msg sync prometheus key monitoring ts caller operator go component prometheusoperator msg updating config skipped no configuration change ts caller operator go component alertmanageroperator msg alertmanager updated key monitoring main ts caller operator go component alertmanageroperator msg sync alertmanager key monitoring main
0
12,091
14,740,073,316
IssuesEvent
2021-01-07 08:28:12
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Stockton - SA Billing - Late Fee Account List
anc-process anp-important ant-bug has attachment
In GitLab by @kdjstudios on Oct 3, 2018, 11:07 [Stockton.xlsx](/uploads/89f6e5bba4c020ae8abbd4f57234017f/Stockton.xlsx) HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-95123/conversation
1.0
Stockton - SA Billing - Late Fee Account List - In GitLab by @kdjstudios on Oct 3, 2018, 11:07 [Stockton.xlsx](/uploads/89f6e5bba4c020ae8abbd4f57234017f/Stockton.xlsx) HD: http://www.servicedesk.answernet.com/profiles/ticket/2018-10-03-95123/conversation
process
stockton sa billing late fee account list in gitlab by kdjstudios on oct uploads stockton xlsx hd
1
15,189
18,957,799,059
IssuesEvent
2021-11-18 22:42:09
slynch8/10x
https://api.github.com/repos/slynch8/10x
opened
support complex statements in preprocessor
bug Priority 3 preprocessor
such as: #define TEST (1 == (1 ? 1 : 0)) #if TEST
1.0
support complex statements in preprocessor - such as: #define TEST (1 == (1 ? 1 : 0)) #if TEST
process
support complex statements in preprocessor such as define test if test
1
13,693
16,449,700,554
IssuesEvent
2021-05-21 02:37:33
pycaret/pycaret
https://api.github.com/repos/pycaret/pycaret
closed
Anomaly Detection Methods
anomaly_detection classification enhancement no-issue-activity preprocessing regression
Hi there, I have a question about anomaly detection. Does it only support "pca" as "outlier_methods" now? Because I find that class Outlier could accept different methods including pca, knn and iso, but it is now hard-coded to pca, and there is no method parameter that can be used when I call setup(). Is there any way to use different methods for outlier detection now? Thanks in advance.
1.0
Anomaly Detection Methods - Hi there, I have a question about anomaly detection. Does it only support "pca" as "outlier_methods" now? Because I find that class Outlier could accept different methods including pca, knn and iso, but it is now hard-coded to pca, and there is no method parameter that can be used when I call setup(). Is there any way to use different methods for outlier detection now? Thanks in advance.
process
anomaly detection methods hi there i have a question about anomaly detection is it only supporting pca as outlier methods now because i find that class outlier could send different methods including pca knn and iso but now is hard coded with pca and there is not any method parameter could be used when i call setup is there any way to use different methods for outlier detection now thanks in advance
1
17,755
23,670,999,946
IssuesEvent
2022-08-27 10:53:59
googleapis/python-ndb
https://api.github.com/repos/googleapis/python-ndb
closed
Dependency Dashboard
type: process api: datastore
This issue lists Renovate updates and detected dependencies. Read the [Dependency Dashboard](https://docs.renovatebot.com/key-concepts/dashboard/) docs to learn more. ## Edited/Blocked These updates have been manually edited so Renovate will no longer make changes. To discard all commits and start over, click on a checkbox. - [ ] <!-- rebase-branch=renovate/distlib-0.x -->[chore(deps): update dependency distlib to v0.3.6](../pull/791) ## Ignored or Blocked These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below. - [ ] <!-- recreate-branch=renovate/click-8.x -->[chore(deps): update dependency click to v8.1.3](../pull/789) - [ ] <!-- recreate-branch=renovate/googleapis-common-protos-1.x -->[chore(deps): update dependency googleapis-common-protos to <1.56.5](../pull/784) - [ ] <!-- recreate-branch=renovate/grpcio-1.x -->[chore(deps): update dependency grpcio to <1.48](../pull/780) - [ ] <!-- recreate-branch=renovate/protobuf-3.x -->[chore(deps): update dependency protobuf to <3.21](../pull/762) - [ ] <!-- recreate-branch=renovate/setuptools-65.x -->[chore(deps): update dependency setuptools to v65.3.0](../pull/790) - [ ] <!-- recreate-branch=renovate/google-cloud-datastore-2.x -->[chore(deps): update dependency google-cloud-datastore to v2](../pull/748) - [ ] <!-- recreate-branch=renovate/protobuf-4.x -->[chore(deps): update dependency protobuf to v4](../pull/771) ## Detected dependencies <details><summary>dockerfile</summary> <blockquote> <details><summary>.kokoro/docker/docs/Dockerfile</summary> - `ubuntu 22.04` </details> </blockquote> </details> <details><summary>pip_requirements</summary> <blockquote> <details><summary>.kokoro/requirements.txt</summary> - `argcomplete ==2.0.0` - `attrs ==22.1.0` - `bleach ==5.0.1` - `cachetools ==5.2.0` - `certifi ==2022.6.15` - `cffi ==1.15.1` - `charset-normalizer ==2.1.1` - `click ==8.0.4` - `colorlog ==6.6.0` - `commonmark ==0.9.1` - `cryptography ==37.0.4` - `distlib ==0.3.5` - 
`docutils ==0.19` - `filelock ==3.8.0` - `gcp-docuploader ==0.6.3` - `gcp-releasetool ==1.8.6` - `google-api-core ==2.8.2` - `google-auth ==2.11.0` - `google-cloud-core ==2.3.2` - `google-cloud-storage ==2.5.0` - `google-crc32c ==1.3.0` - `google-resumable-media ==2.3.3` - `googleapis-common-protos ==1.56.4` - `idna ==3.3` - `importlib-metadata ==4.12.0` - `jeepney ==0.8.0` - `jinja2 ==3.1.2` - `keyring ==23.8.2` - `markupsafe ==2.1.1` - `nox ==2022.8.7` - `packaging ==21.3` - `pkginfo ==1.8.3` - `platformdirs ==2.5.2` - `protobuf ==3.20.1` - `py ==1.11.0` - `pyasn1 ==0.4.8` - `pyasn1-modules ==0.2.8` - `pycparser ==2.21` - `pygments ==2.13.0` - `pyjwt ==2.4.0` - `pyparsing ==3.0.9` - `pyperclip ==1.8.2` - `python-dateutil ==2.8.2` - `readme-renderer ==37.0` - `requests ==2.28.1` - `requests-toolbelt ==0.9.1` - `rfc3986 ==2.0.0` - `rich ==12.5.1` - `rsa ==4.9` - `secretstorage ==3.3.3` - `six ==1.16.0` - `twine ==4.0.1` - `typing-extensions ==4.3.0` - `urllib3 ==1.26.12` - `virtualenv ==20.16.3` - `webencodings ==0.5.1` - `wheel ==0.37.1` - `zipp ==3.8.1` - `setuptools ==65.2.0` </details> </blockquote> </details> <details><summary>pip_setup</summary> <blockquote> <details><summary>setup.py</summary> - `google-cloud-datastore >= 1.7.0, < 2.0.0dev` - `googleapis-common-protos < 1.53.0` - `grpcio < 1.40dev` - `protobuf < 3.18dev` </details> </blockquote> </details> --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
1.0
Dependency Dashboard - This issue lists Renovate updates and detected dependencies. Read the [Dependency Dashboard](https://docs.renovatebot.com/key-concepts/dashboard/) docs to learn more. ## Edited/Blocked These updates have been manually edited so Renovate will no longer make changes. To discard all commits and start over, click on a checkbox. - [ ] <!-- rebase-branch=renovate/distlib-0.x -->[chore(deps): update dependency distlib to v0.3.6](../pull/791) ## Ignored or Blocked These are blocked by an existing closed PR and will not be recreated unless you click a checkbox below. - [ ] <!-- recreate-branch=renovate/click-8.x -->[chore(deps): update dependency click to v8.1.3](../pull/789) - [ ] <!-- recreate-branch=renovate/googleapis-common-protos-1.x -->[chore(deps): update dependency googleapis-common-protos to <1.56.5](../pull/784) - [ ] <!-- recreate-branch=renovate/grpcio-1.x -->[chore(deps): update dependency grpcio to <1.48](../pull/780) - [ ] <!-- recreate-branch=renovate/protobuf-3.x -->[chore(deps): update dependency protobuf to <3.21](../pull/762) - [ ] <!-- recreate-branch=renovate/setuptools-65.x -->[chore(deps): update dependency setuptools to v65.3.0](../pull/790) - [ ] <!-- recreate-branch=renovate/google-cloud-datastore-2.x -->[chore(deps): update dependency google-cloud-datastore to v2](../pull/748) - [ ] <!-- recreate-branch=renovate/protobuf-4.x -->[chore(deps): update dependency protobuf to v4](../pull/771) ## Detected dependencies <details><summary>dockerfile</summary> <blockquote> <details><summary>.kokoro/docker/docs/Dockerfile</summary> - `ubuntu 22.04` </details> </blockquote> </details> <details><summary>pip_requirements</summary> <blockquote> <details><summary>.kokoro/requirements.txt</summary> - `argcomplete ==2.0.0` - `attrs ==22.1.0` - `bleach ==5.0.1` - `cachetools ==5.2.0` - `certifi ==2022.6.15` - `cffi ==1.15.1` - `charset-normalizer ==2.1.1` - `click ==8.0.4` - `colorlog ==6.6.0` - `commonmark ==0.9.1` - `cryptography ==37.0.4` 
- `distlib ==0.3.5` - `docutils ==0.19` - `filelock ==3.8.0` - `gcp-docuploader ==0.6.3` - `gcp-releasetool ==1.8.6` - `google-api-core ==2.8.2` - `google-auth ==2.11.0` - `google-cloud-core ==2.3.2` - `google-cloud-storage ==2.5.0` - `google-crc32c ==1.3.0` - `google-resumable-media ==2.3.3` - `googleapis-common-protos ==1.56.4` - `idna ==3.3` - `importlib-metadata ==4.12.0` - `jeepney ==0.8.0` - `jinja2 ==3.1.2` - `keyring ==23.8.2` - `markupsafe ==2.1.1` - `nox ==2022.8.7` - `packaging ==21.3` - `pkginfo ==1.8.3` - `platformdirs ==2.5.2` - `protobuf ==3.20.1` - `py ==1.11.0` - `pyasn1 ==0.4.8` - `pyasn1-modules ==0.2.8` - `pycparser ==2.21` - `pygments ==2.13.0` - `pyjwt ==2.4.0` - `pyparsing ==3.0.9` - `pyperclip ==1.8.2` - `python-dateutil ==2.8.2` - `readme-renderer ==37.0` - `requests ==2.28.1` - `requests-toolbelt ==0.9.1` - `rfc3986 ==2.0.0` - `rich ==12.5.1` - `rsa ==4.9` - `secretstorage ==3.3.3` - `six ==1.16.0` - `twine ==4.0.1` - `typing-extensions ==4.3.0` - `urllib3 ==1.26.12` - `virtualenv ==20.16.3` - `webencodings ==0.5.1` - `wheel ==0.37.1` - `zipp ==3.8.1` - `setuptools ==65.2.0` </details> </blockquote> </details> <details><summary>pip_setup</summary> <blockquote> <details><summary>setup.py</summary> - `google-cloud-datastore >= 1.7.0, < 2.0.0dev` - `googleapis-common-protos < 1.53.0` - `grpcio < 1.40dev` - `protobuf < 3.18dev` </details> </blockquote> </details> --- - [ ] <!-- manual job -->Check this box to trigger a request for Renovate to run again on this repository
process
dependency dashboard this issue lists renovate updates and detected dependencies read the docs to learn more edited blocked these updates have been manually edited so renovate will no longer make changes to discard all commits and start over click on a checkbox pull ignored or blocked these are blocked by an existing closed pr and will not be recreated unless you click a checkbox below pull pull pull pull pull pull pull detected dependencies dockerfile kokoro docker docs dockerfile ubuntu pip requirements kokoro requirements txt argcomplete attrs bleach cachetools certifi cffi charset normalizer click colorlog commonmark cryptography distlib docutils filelock gcp docuploader gcp releasetool google api core google auth google cloud core google cloud storage google google resumable media googleapis common protos idna importlib metadata jeepney keyring markupsafe nox packaging pkginfo platformdirs protobuf py modules pycparser pygments pyjwt pyparsing pyperclip python dateutil readme renderer requests requests toolbelt rich rsa secretstorage six twine typing extensions virtualenv webencodings wheel zipp setuptools pip setup setup py google cloud datastore googleapis common protos grpcio protobuf check this box to trigger a request for renovate to run again on this repository
1
21,700
30,195,267,106
IssuesEvent
2023-07-04 20:06:12
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Processing output as shapefile is missing CPG
Processing Bug
### What is the bug or the crash? ~~Model~~ Processing output as shapefile is missing CPG and PRJ. ### Steps to reproduce the issue 1. Run model [test.zip](https://github.com/qgis/QGIS/files/9819717/test.zip) with shapefile output ![image](https://user-images.githubusercontent.com/20856381/196670059-3c3ae66a-8e5e-4504-84d7-c4d1bab851c5.png) 2. See error: -> no CPG file is created: ![image](https://user-images.githubusercontent.com/20856381/196670386-c7c566b3-094e-488c-95a8-aa70f7ba2122.png) ### Versions QGIS version | 3.27.0-Master | QGIS code revision | e8dcb89988 -- | -- | -- | -- Qt version | 5.15.3 Python version | 3.10.6 GDAL/OGR version | 3.4.1 PROJ version | 8.2.1 EPSG Registry database version | v10.041 (2021-12-03) GEOS version | 3.10.2-CAPI-1.16.0 SQLite version | 3.37.2 PostgreSQL client version | unknown SpatiaLite version | 5.0.1 QWT version | 6.1.4 QScintilla2 version | 2.11.6 OS version | Ubuntu 22.04.1 LTS   |   |   |   Active Python plugins sagaprovider | 2.12.99 grassprovider | 2.12.99 MetaSearch | 0.3.6 processing | 2.12.99 db_manager | 0.1.20 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [X] I tried with a new QGIS profile ### Additional context _No response_
1.0
Processing output as shapefile is missing CPG - ### What is the bug or the crash? ~~Model~~ Processing output as shapefile is missing CPG and PRJ. ### Steps to reproduce the issue 1. Run model [test.zip](https://github.com/qgis/QGIS/files/9819717/test.zip) with shapefile output ![image](https://user-images.githubusercontent.com/20856381/196670059-3c3ae66a-8e5e-4504-84d7-c4d1bab851c5.png) 2. See error: -> no CPG file is created: ![image](https://user-images.githubusercontent.com/20856381/196670386-c7c566b3-094e-488c-95a8-aa70f7ba2122.png) ### Versions QGIS version | 3.27.0-Master | QGIS code revision | e8dcb89988 -- | -- | -- | -- Qt version | 5.15.3 Python version | 3.10.6 GDAL/OGR version | 3.4.1 PROJ version | 8.2.1 EPSG Registry database version | v10.041 (2021-12-03) GEOS version | 3.10.2-CAPI-1.16.0 SQLite version | 3.37.2 PostgreSQL client version | unknown SpatiaLite version | 5.0.1 QWT version | 6.1.4 QScintilla2 version | 2.11.6 OS version | Ubuntu 22.04.1 LTS   |   |   |   Active Python plugins sagaprovider | 2.12.99 grassprovider | 2.12.99 MetaSearch | 0.3.6 processing | 2.12.99 db_manager | 0.1.20 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [X] I tried with a new QGIS profile ### Additional context _No response_
process
processing output as shapefile is missing cpg what is the bug or the crash model processing output as shapefile is missing cpg and prj steps to reproduce the issue run model with shapefile output see error no cpg file is created versions qgis version master qgis code revision qt version python version gdal ogr version proj version epsg registry database version geos version capi sqlite version postgresql client version unknown spatialite version qwt version version os version ubuntu lts         active python plugins sagaprovider grassprovider metasearch processing db manager supported qgis version i m running a supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context no response
1
15,525
19,703,285,546
IssuesEvent
2022-01-12 18:53:35
googleapis/nodejs-resource-manager
https://api.github.com/repos/googleapis/nodejs-resource-manager
opened
Your .repo-metadata.json file has a problem 🤒
type: process repo-metadata: lint
You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'resource' invalid in .repo-metadata.json ☝️ Once you correct these problems, you can close this issue. Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'resource' invalid in .repo-metadata.json ☝️ Once you correct these problems, you can close this issue. Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 api shortname resource invalid in repo metadata json ☝️ once you correct these problems you can close this issue reach out to go github automation if you have any questions
1
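The lint failure recorded above amounts to a string check on one JSON field. A sketch of what such a check might look like; the accepted-value set here is invented for illustration, since the real scanner's allow-list lives in Google's internal tooling:

```python
import json

# Assumed stand-in for the scanner's real allow-list of API shortnames.
ACCEPTED_SHORTNAMES = {"cloudresourcemanager"}

def lint_repo_metadata(raw: str) -> list:
    """Return lint problems for a .repo-metadata.json payload (sketch)."""
    problems = []
    shortname = json.loads(raw).get("api_shortname")
    if shortname not in ACCEPTED_SHORTNAMES:
        problems.append(
            "api_shortname '%s' invalid in .repo-metadata.json" % shortname
        )
    return problems
```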
16,864
22,143,827,318
IssuesEvent
2022-06-03 09:45:16
camunda/zeebe
https://api.github.com/repos/camunda/zeebe
closed
Split grafana dashboard into two
kind/toil area/observability gameday team/distributed team/process-automation
**Description**

This came up as part of [the game day](https://confluence.camunda.com/pages/viewpage.action?spaceKey=ZEEBE&title=2022-03-23+Game+Day)

1. [x] To reduce the cognitive overload we should provide a dashboard with only a general overview. It should be easy for someone without internal knowledge of zeebe to understand the overall health of the system. For example, a support engineer can use this to get an initial understanding before delegating it to the right team.
2. The second dashboard will contain more detailed metrics which a zeebe engineer can make sense of. (Optionally, we can also split this further into multiple dashboards -- for example rocksdb metrics or latency metrics are mostly useful if we are investigating performance issues.) This would be helpful in maintaining the right balance between reducing the cognitive overload and providing the necessary metrics for detailed investigation.
1.0
Split grafana dashboard into two - **Description**

This came up as part of [the game day](https://confluence.camunda.com/pages/viewpage.action?spaceKey=ZEEBE&title=2022-03-23+Game+Day)

1. [x] To reduce the cognitive overload we should provide a dashboard with only a general overview. It should be easy for someone without internal knowledge of zeebe to understand the overall health of the system. For example, a support engineer can use this to get an initial understanding before delegating it to the right team.
2. The second dashboard will contain more detailed metrics which a zeebe engineer can make sense of. (Optionally, we can also split this further into multiple dashboards -- for example rocksdb metrics or latency metrics are mostly useful if we are investigating performance issues.) This would be helpful in maintaining the right balance between reducing the cognitive overload and providing the necessary metrics for detailed investigation.
process
split grafana dashboard into two description this came up as part of to reduce the cognitive overload we should provide a dashboard with only a general overview it should be easy for some one without internal knowledge of zeebe to understand the overall health of the system for example a support engineer can use this to get an initial understanding before delegating it to the right team the second dashboard will contain more detailed metrics which a zeebe engineer can make sense of optionally we can also split this further into multiple dashboards for example rocksdb metrics or latency metrics are mostly useful if we are investigating performance issues this would be helpful in maintaining the right balance between reducing the cognitive overload and providing necessary metrics for detailed investigation
1
303,764
9,310,365,369
IssuesEvent
2019-03-25 18:35:46
mozilla/addons-frontend
https://api.github.com/repos/mozilla/addons-frontend
closed
Improve handling of needsRestart hint with mozAddonManager
priority: p4 project: amo triaged
Steps to reproduce:

1. Search for a "requires restart" add-on on your device using AMO-dev, i.e. https://addons.allizom.org/en-US/android/addon/stylish/?src=hp-dl-featured
2. Tap the add-on installation button

Expected results:
A popup notifying the user that the browser needs to be restarted to apply changes is displayed.

Actual results:
2 popups are displayed: one that requires browser restart and one with "Your add-on is ready" message - as if the add-on was installed.

Notes/Issues:
- without `extensions.webapi.testing` set to "true", only the "browser needs to be restarted" popup is displayed.
- if both popups are dismissed the installation button appears to load continuously.

Verified on FF51 (Android 6.0.1). Issue is reproducing on AMO-dev and -stage.

Video for this issue: ![videotogif_2017 03 09_12 52 30](https://cloud.githubusercontent.com/assets/15685960/23748224/5e0020fa-04cb-11e7-813e-a68c26364cca.gif)
1.0
Improve handling of needsRestart hint with mozAddonManager - Steps to reproduce:

1. Search for a "requires restart" add-on on your device using AMO-dev, i.e. https://addons.allizom.org/en-US/android/addon/stylish/?src=hp-dl-featured
2. Tap the add-on installation button

Expected results:
A popup notifying the user that the browser needs to be restarted to apply changes is displayed.

Actual results:
2 popups are displayed: one that requires browser restart and one with "Your add-on is ready" message - as if the add-on was installed.

Notes/Issues:
- without `extensions.webapi.testing` set to "true", only the "browser needs to be restarted" popup is displayed.
- if both popups are dismissed the installation button appears to load continuously.

Verified on FF51 (Android 6.0.1). Issue is reproducing on AMO-dev and -stage.

Video for this issue: ![videotogif_2017 03 09_12 52 30](https://cloud.githubusercontent.com/assets/15685960/23748224/5e0020fa-04cb-11e7-813e-a68c26364cca.gif)
non_process
improve handling of needsrestart hint with mozaddonmanager steps to reproduce search for a requires restart add on on your device using amo dev i e tap the add on installation button expected results a popup notifying the user that the browser needs to be restarted to apply changes is displayed actual results popups are displayed one that requires browser restart and one with your add on is ready message as if the add on was installed notes issues without the extensions webapi testing set to true it only the browser needs to be restarted popup is displayed if the both popups are dismissed the installation button appears to load continuously verified on android issue is reproducing on amo dev and stage video for this issue
0
434,691
30,462,525,096
IssuesEvent
2023-07-17 08:04:09
vijayk3327/LWC
https://api.github.com/repos/vijayk3327/LWC
opened
Create record dynamically Using LWC Apex Framework based on database.insert in Salesforce
documentation question
In this post we are going to learn about [How to Insert record Uses of Apex Framework on standard object ](https://www.w3web.net/lwc-apex-framework-salesforce/)in Salesforce LWC. Apex is a strongly typed, object-oriented programming language that allows developers to execute flow and transaction control statements on Salesforce servers in conjunction with calls to the API. Using syntax that looks like Java and acts like database stored procedures, Apex enables developers to add business logic to most system events, including button clicks, related record updates, and Visualforce pages. Apex code can be initiated by Web service requests and from triggers on objects. **[→ Get source code live demo link:-](https://www.w3web.net/lwc-apex-framework-salesforce/)** <img src="https://www.w3web.net/wp-content/uploads/2023/02/createOpt-min.gif"/> **[Step 1:- Create Lightning Web Component : createOpt.html](https://www.w3web.net/lwc-apex-framework-salesforce/)** ` <template> <lightning-card title="Create Opportunity"> <form data-name="opptForm"> <div class="slds-grid slds-wrap"> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-input type="text" class="inpFld" label="Name" data-type="input-field" name="Name" value={optFormData.Name} required> </lightning-input> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-input type="date" class="inpFld" label="Close Date" value={optFormData.CloseDate} data-type="input-field" name="CloseDate" required> </lightning-input> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-record-edit-form object-api-name="Opportunity"> <lightning-input-field field-name="StageName" class="inpFld" data-type="input-field" value={optFormData.StageName} name="StageName" required> </lightning-input-field> </lightning-record-edit-form> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-record-edit-form object-api-name="Opportunity"> 
<lightning-input-field field-name="TotalOpportunityQuantity" class="inpFld" data-type="input-field" value={optFormData.TotalOpportunityQuantity} name="TotalOpportunityQuantity" required> </lightning-input-field> </lightning-record-edit-form> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-record-edit-form object-api-name="Opportunity"> <lightning-input-field field-name="Amount" class="inpFld" data-type="input-field" value={optFormData.Amount} name="Amount" required> </lightning-input-field> </lightning-record-edit-form> </div> </div> </form> <br/><br/> <footer class="slds-modal__footer"> <div class="slds-col_bump-left slds-text-align_center slds-p-horizontal_x-small slds-text-align_center"> <lightning-button variant="brand" label="Save" title="Save" class="slds-m-left_x-small slds-p-right_xx-small" onclick={saveButtonAction}> </lightning-button> </div> </footer> </lightning-card> <!--Start Spinner--> <template if:true={spinnerStatus}> <div class="slds-is-relative"> <section class="slds-modal slds-fade-in-open"> <lightning-spinner variant="brand" alternative-text="Loading..."></lightning-spinner> </section> <div class="slds-backdrop slds-backdrop_open"></div> </div> </template> <!--End Spinner--> </template>` **Step 2:- Create Lightning Web Component : createOpt.js** ` import { LightningElement,api,track } from 'lwc'; import { ShowToastEvent } from 'lightning/platformShowToastEvent'; import submitOptRecord from '@salesforce/apex/createOptCtrl.submitOptRecord'; import { NavigationMixin } from 'lightning/navigation'; export default class CreateOpt extends NavigationMixin (LightningElement) { @track recordId; @track optFormData = {}; @api spinnerStatus = false; toastEventFire(title, msg, variant, mode) { const e = new ShowToastEvent({ title: title, message: msg, variant: variant, mode: mode }); this.dispatchEvent(e); } connectedCallback() { this.alertElem = this.template.querySelector('[data-elem="alert-span"]'); // 
console.log(this.alertElem); } async saveButtonAction(event) { let flag = true; this.spinnerStatus=true; for (const elem of [...this.template.querySelectorAll('form[data-name="opptForm"] [data-type="input-field"]')]) { this.optFormData[elem.name] = elem.value; //console.log('aaaaa' , elem.value); } console.log('optFormData## ', this.optFormData); console.log('optFormDataStringyFy',JSON.stringify(this.optFormData)); const data = { optDataFyObj: this.optFormData, }; console.log('optDataFyObj## ',JSON.stringify(data)); if(flag){ const result = await submitOptRecord({ jsonDataStr: JSON.stringify(data) }); console.log('result## ' , result); const toastEvent = new ShowToastEvent({ title:'success', message:'Record created successfully', variant:'success' }); this.dispatchEvent(toastEvent); this.spinnerStatus=false; if (result.status == 200) { // naviagte to record page this.navigateToRecordPage(this.opportunityId); } else { return this.setFormError(result.message); } } } navigateToRecordPage(recordId) { this[NavigationMixin.GenerateUrl]({ type: 'standard__recordPage', attributes: { recordId: recordId, actionName: 'view', }, }).then(url => { window.location.href = url; }); } }` **Step 3:- Create Lightning Web Component : createOpt.js-meta.xml** ` <?xml version="1.0" encoding="UTF-8"?> <LightningComponentBundle xmlns="http://soap.sforce.com/2006/04/metadata"> <apiVersion>56.0</apiVersion> <isExposed>true</isExposed> <targets> <target>lightning__AppPage</target> <target>lightning__HomePage</target> <target>lightning__RecordPage</target> <target>lightning__RecordAction</target> <target>lightning__UtilityBar</target> <target>lightning__Tab</target> </targets> <targetConfigs> <targetConfig targets="lightning__RecordAction"> <actionType>ScreenAction</actionType> </targetConfig> </targetConfigs> </LightningComponentBundle>` **Step 4:- Create Apex Class : createOptCtrl.cls** ` public with sharing class createOptCtrl { @AuraEnabled public static Map<String, Object> 
submitOptRecord(String jsonDataStr) { Map<String, Object> result = new Map<String, Object>(); try { Map<String, Object> formDataMap = (Map<String, Object>)JSON.deserializeUntyped(jsonDataStr); System.debug('formDataMap ' + formDataMap); Map<String, Object> OptDataMap = (Map<String, Object>)formDataMap.get('optDataFyObj'); Opportunity optObj = new Opportunity(); optObj.Name = getStringValueFromMap(OptDataMap, 'Name'); optObj.CloseDate = getDateValueFromMap(OptDataMap, 'CloseDate'); optObj.StageName = getStringValueFromMap(OptDataMap, 'StageName'); optObj.TotalOpportunityQuantity = getIntValueFromMap(OptDataMap, 'TotalOpportunityQuantity'); optObj.Amount = getDecimalValueFromMap(OptDataMap, 'Amount'); system.debug('optObj### ' + optObj); List<Database.SaveResult> insertResult = Database.insert(new List<Opportunity>{optObj}); System.debug('insertResult ' + insertResult); }catch(Exception ex) { System.debug('Exception ' + ex.getMessage() + ',line' + ex.getLineNumber()); result.put('status', 500); result.put('message', 'Exception ' + ex.getMessage() + ',line' + ex.getLineNumber()); } return result; } public static String getStringValueFromMap(Map<String, Object> dataMap, String fieldName) { String value; try { if(dataMap.containsKey(fieldName)) { value = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(value) ? value : String.valueOf(value); } catch(Exception ex) { System.debug('Exception getValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } public static Date getDateValueFromMap(Map<String, Object> dataMap, String fieldName) { Date value; try { String str; if(dataMap.containsKey(fieldName)) { str = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(str) ? 
value : Date.valueOf(str); } catch(Exception ex) { System.debug('Exception getIntValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } public static Integer getIntValueFromMap(Map<String, Object> dataMap, String fieldName) { Integer value; try { String str; if(dataMap.containsKey(fieldName)) { str = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(str) ? value : Integer.valueOf(str); } catch(Exception ex) { System.debug('Exception getIntValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } public static Decimal getDecimalValueFromMap(Map<String, Object> dataMap, String fieldName) { Decimal value; try { String str; if(dataMap.containsKey(fieldName)) { str = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(str) ? value : Decimal.valueOf(str); } catch(Exception ex) { System.debug('Exception getIntValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } }` **[→ Get source code live demo link:-](https://www.w3web.net/lwc-apex-framework-salesforce/)**
1.0
Create record dynamically Using LWC Apex Framework based on database.insert in Salesforce - In this post we are going to learn about [How to Insert record Uses of Apex Framework on standard object ](https://www.w3web.net/lwc-apex-framework-salesforce/)in Salesforce LWC. Apex is a strongly typed, object-oriented programming language that allows developers to execute flow and transaction control statements on Salesforce servers in conjunction with calls to the API. Using syntax that looks like Java and acts like database stored procedures, Apex enables developers to add business logic to most system events, including button clicks, related record updates, and Visualforce pages. Apex code can be initiated by Web service requests and from triggers on objects. **[→ Get source code live demo link:-](https://www.w3web.net/lwc-apex-framework-salesforce/)** <img src="https://www.w3web.net/wp-content/uploads/2023/02/createOpt-min.gif"/> **[Step 1:- Create Lightning Web Component : createOpt.html](https://www.w3web.net/lwc-apex-framework-salesforce/)** ` <template> <lightning-card title="Create Opportunity"> <form data-name="opptForm"> <div class="slds-grid slds-wrap"> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-input type="text" class="inpFld" label="Name" data-type="input-field" name="Name" value={optFormData.Name} required> </lightning-input> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-input type="date" class="inpFld" label="Close Date" value={optFormData.CloseDate} data-type="input-field" name="CloseDate" required> </lightning-input> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-record-edit-form object-api-name="Opportunity"> <lightning-input-field field-name="StageName" class="inpFld" data-type="input-field" value={optFormData.StageName} name="StageName" required> </lightning-input-field> </lightning-record-edit-form> </div> <div class="slds-col 
slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-record-edit-form object-api-name="Opportunity"> <lightning-input-field field-name="TotalOpportunityQuantity" class="inpFld" data-type="input-field" value={optFormData.TotalOpportunityQuantity} name="TotalOpportunityQuantity" required> </lightning-input-field> </lightning-record-edit-form> </div> <div class="slds-col slds-size_3-of-12 slds-p-horizontal_x-small"> <lightning-record-edit-form object-api-name="Opportunity"> <lightning-input-field field-name="Amount" class="inpFld" data-type="input-field" value={optFormData.Amount} name="Amount" required> </lightning-input-field> </lightning-record-edit-form> </div> </div> </form> <br/><br/> <footer class="slds-modal__footer"> <div class="slds-col_bump-left slds-text-align_center slds-p-horizontal_x-small slds-text-align_center"> <lightning-button variant="brand" label="Save" title="Save" class="slds-m-left_x-small slds-p-right_xx-small" onclick={saveButtonAction}> </lightning-button> </div> </footer> </lightning-card> <!--Start Spinner--> <template if:true={spinnerStatus}> <div class="slds-is-relative"> <section class="slds-modal slds-fade-in-open"> <lightning-spinner variant="brand" alternative-text="Loading..."></lightning-spinner> </section> <div class="slds-backdrop slds-backdrop_open"></div> </div> </template> <!--End Spinner--> </template>` **Step 2:- Create Lightning Web Component : createOpt.js** ` import { LightningElement,api,track } from 'lwc'; import { ShowToastEvent } from 'lightning/platformShowToastEvent'; import submitOptRecord from '@salesforce/apex/createOptCtrl.submitOptRecord'; import { NavigationMixin } from 'lightning/navigation'; export default class CreateOpt extends NavigationMixin (LightningElement) { @track recordId; @track optFormData = {}; @api spinnerStatus = false; toastEventFire(title, msg, variant, mode) { const e = new ShowToastEvent({ title: title, message: msg, variant: variant, mode: mode }); this.dispatchEvent(e); } 
connectedCallback() { this.alertElem = this.template.querySelector('[data-elem="alert-span"]'); // console.log(this.alertElem); } async saveButtonAction(event) { let flag = true; this.spinnerStatus=true; for (const elem of [...this.template.querySelectorAll('form[data-name="opptForm"] [data-type="input-field"]')]) { this.optFormData[elem.name] = elem.value; //console.log('aaaaa' , elem.value); } console.log('optFormData## ', this.optFormData); console.log('optFormDataStringyFy',JSON.stringify(this.optFormData)); const data = { optDataFyObj: this.optFormData, }; console.log('optDataFyObj## ',JSON.stringify(data)); if(flag){ const result = await submitOptRecord({ jsonDataStr: JSON.stringify(data) }); console.log('result## ' , result); const toastEvent = new ShowToastEvent({ title:'success', message:'Record created successfully', variant:'success' }); this.dispatchEvent(toastEvent); this.spinnerStatus=false; if (result.status == 200) { // naviagte to record page this.navigateToRecordPage(this.opportunityId); } else { return this.setFormError(result.message); } } } navigateToRecordPage(recordId) { this[NavigationMixin.GenerateUrl]({ type: 'standard__recordPage', attributes: { recordId: recordId, actionName: 'view', }, }).then(url => { window.location.href = url; }); } }` **Step 3:- Create Lightning Web Component : createOpt.js-meta.xml** ` <?xml version="1.0" encoding="UTF-8"?> <LightningComponentBundle xmlns="http://soap.sforce.com/2006/04/metadata"> <apiVersion>56.0</apiVersion> <isExposed>true</isExposed> <targets> <target>lightning__AppPage</target> <target>lightning__HomePage</target> <target>lightning__RecordPage</target> <target>lightning__RecordAction</target> <target>lightning__UtilityBar</target> <target>lightning__Tab</target> </targets> <targetConfigs> <targetConfig targets="lightning__RecordAction"> <actionType>ScreenAction</actionType> </targetConfig> </targetConfigs> </LightningComponentBundle>` **Step 4:- Create Apex Class : createOptCtrl.cls** ` public 
with sharing class createOptCtrl { @AuraEnabled public static Map<String, Object> submitOptRecord(String jsonDataStr) { Map<String, Object> result = new Map<String, Object>(); try { Map<String, Object> formDataMap = (Map<String, Object>)JSON.deserializeUntyped(jsonDataStr); System.debug('formDataMap ' + formDataMap); Map<String, Object> OptDataMap = (Map<String, Object>)formDataMap.get('optDataFyObj'); Opportunity optObj = new Opportunity(); optObj.Name = getStringValueFromMap(OptDataMap, 'Name'); optObj.CloseDate = getDateValueFromMap(OptDataMap, 'CloseDate'); optObj.StageName = getStringValueFromMap(OptDataMap, 'StageName'); optObj.TotalOpportunityQuantity = getIntValueFromMap(OptDataMap, 'TotalOpportunityQuantity'); optObj.Amount = getDecimalValueFromMap(OptDataMap, 'Amount'); system.debug('optObj### ' + optObj); List<Database.SaveResult> insertResult = Database.insert(new List<Opportunity>{optObj}); System.debug('insertResult ' + insertResult); }catch(Exception ex) { System.debug('Exception ' + ex.getMessage() + ',line' + ex.getLineNumber()); result.put('status', 500); result.put('message', 'Exception ' + ex.getMessage() + ',line' + ex.getLineNumber()); } return result; } public static String getStringValueFromMap(Map<String, Object> dataMap, String fieldName) { String value; try { if(dataMap.containsKey(fieldName)) { value = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(value) ? value : String.valueOf(value); } catch(Exception ex) { System.debug('Exception getValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } public static Date getDateValueFromMap(Map<String, Object> dataMap, String fieldName) { Date value; try { String str; if(dataMap.containsKey(fieldName)) { str = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(str) ? 
value : Date.valueOf(str); } catch(Exception ex) { System.debug('Exception getIntValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } public static Integer getIntValueFromMap(Map<String, Object> dataMap, String fieldName) { Integer value; try { String str; if(dataMap.containsKey(fieldName)) { str = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(str) ? value : Integer.valueOf(str); } catch(Exception ex) { System.debug('Exception getIntValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } public static Decimal getDecimalValueFromMap(Map<String, Object> dataMap, String fieldName) { Decimal value; try { String str; if(dataMap.containsKey(fieldName)) { str = String.valueOf(dataMap.get(fieldName)); } value = String.isEmpty(str) ? value : Decimal.valueOf(str); } catch(Exception ex) { System.debug('Exception getIntValueFromMap : '+ ex.getMessage() + ' line ' + ex.getLineNumber()); } return value; } }` **[→ Get source code live demo link:-](https://www.w3web.net/lwc-apex-framework-salesforce/)**
non_process
create record dynamically using lwc apex framework based on database insert in salesforce in this post we are going to learn about salesforce lwc apex is a strongly typed object oriented programming language that allows developers to execute flow and transaction control statements on salesforce servers in conjunction with calls to the api using syntax that looks like java and acts like database stored procedures apex enables developers to add business logic to most system events including button clicks related record updates and visualforce pages apex code can be initiated by web service requests and from triggers on objects img src lightning input type text class inpfld label name data type input field name name value optformdata name required lightning input type date class inpfld label close date value optformdata closedate data type input field name closedate required lightning input field field name stagename class inpfld data type input field value optformdata stagename name stagename required lightning input field field name totalopportunityquantity class inpfld data type input field value optformdata totalopportunityquantity name totalopportunityquantity required lightning input field field name amount class inpfld data type input field value optformdata amount name amount required lightning button variant brand label save title save class slds m left x small slds p right xx small onclick savebuttonaction step create lightning web component createopt js import lightningelement api track from lwc import showtoastevent from lightning platformshowtoastevent import submitoptrecord from salesforce apex createoptctrl submitoptrecord import navigationmixin from lightning navigation export default class createopt extends navigationmixin lightningelement track recordid track optformdata api spinnerstatus false toasteventfire title msg variant mode const e new showtoastevent title title message msg variant variant mode mode this dispatchevent e connectedcallback this 
alertelem this template queryselector console log this alertelem async savebuttonaction event let flag true this spinnerstatus true for const elem of this optformdata elem value console log aaaaa elem value console log optformdata this optformdata console log optformdatastringyfy json stringify this optformdata const data optdatafyobj this optformdata console log optdatafyobj json stringify data if flag const result await submitoptrecord jsondatastr json stringify data console log result result const toastevent new showtoastevent title success message record created successfully variant success this dispatchevent toastevent this spinnerstatus false if result status naviagte to record page this navigatetorecordpage this opportunityid else return this setformerror result message navigatetorecordpage recordid this type standard recordpage attributes recordid recordid actionname view then url window location href url step create lightning web component createopt js meta xml lightningcomponentbundle xmlns true lightning apppage lightning homepage lightning recordpage lightning recordaction lightning utilitybar lightning tab screenaction step create apex class createoptctrl cls public with sharing class createoptctrl auraenabled public static map submitoptrecord string jsondatastr map result new map try map formdatamap map json deserializeuntyped jsondatastr system debug formdatamap formdatamap map optdatamap map formdatamap get optdatafyobj opportunity optobj new opportunity optobj name getstringvaluefrommap optdatamap name optobj closedate getdatevaluefrommap optdatamap closedate optobj stagename getstringvaluefrommap optdatamap stagename optobj totalopportunityquantity getintvaluefrommap optdatamap totalopportunityquantity optobj amount getdecimalvaluefrommap optdatamap amount system debug optobj optobj list insertresult database insert new list optobj system debug insertresult insertresult catch exception ex system debug exception ex getmessage line ex getlinenumber 
result put status result put message exception ex getmessage line ex getlinenumber return result public static string getstringvaluefrommap map datamap string fieldname string value try if datamap containskey fieldname value string valueof datamap get fieldname value string isempty value value string valueof value catch exception ex system debug exception getvaluefrommap ex getmessage line ex getlinenumber return value public static date getdatevaluefrommap map datamap string fieldname date value try string str if datamap containskey fieldname str string valueof datamap get fieldname value string isempty str value date valueof str catch exception ex system debug exception getintvaluefrommap ex getmessage line ex getlinenumber return value public static integer getintvaluefrommap map datamap string fieldname integer value try string str if datamap containskey fieldname str string valueof datamap get fieldname value string isempty str value integer valueof str catch exception ex system debug exception getintvaluefrommap ex getmessage line ex getlinenumber return value public static decimal getdecimalvaluefrommap map datamap string fieldname decimal value try string str if datamap containskey fieldname str string valueof datamap get fieldname value string isempty str value decimal valueof str catch exception ex system debug exception getintvaluefrommap ex getmessage line ex getlinenumber return value
0
9,522
12,499,706,721
IssuesEvent
2020-06-01 20:40:58
googleapis/google-cloud-cpp
https://api.github.com/repos/googleapis/google-cloud-cpp
closed
Ask DevRel to stop scanning old repo
api: spanner type: process
Ask the DevRel tooling folks to stop scanning the old repository, we will get duplicate samples otherwise.
1.0
Ask DevRel to stop scanning old repo - Ask the DevRel tooling folks to stop scanning the old repository, we will get duplicate samples otherwise.
process
ask devrel to stop scanning old repo ask the devrel tooling folks to stop scanning the old repository we will get duplicate samples otherwise
1
11,415
14,242,922,495
IssuesEvent
2020-11-19 03:02:36
aodn/imos-toolbox
https://api.github.com/repos/aodn/imos-toolbox
opened
Loading several CTD profiles may crash the session
Type:bug Unit:Processing Unit:Profile
There was a report at workshop day that loading a lot of CTD profiles in one toolbox session may slow down or crash the session.

- [ ] Collect some 20+ CTD files with(out) a proper database.
- [ ] Investigate the loading of 20+ CTD dataset profiles and try to reproduce the crash.
- [ ] monitor memory resources/requirements for bulk loading.
1.0
Loading several CTD profiles may crash the session - There was a report at workshop day that loading a lot of CTD profiles in one toolbox session may slow down or crash the session.

- [ ] Collect some 20+ CTD files with(out) a proper database.
- [ ] Investigate the loading of 20+ CTD dataset profiles and try to reproduce the crash.
- [ ] monitor memory resources/requirements for bulk loading.
process
loading several ctd profiles may crash the session there was a report at workshop day that loading a lot of ctd profiles in one toolbox session may slow down or crash the session collect some ctd files with out a proper database investigate the loading of ctd dataset profiles and try to reproduce the crash monitor memory resources requirements for bulk loading
1
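Each record above pairs raw issue text with a lowercased, punctuation-free copy. The preprocessing script is not part of this dump, but comparing the two columns suggests roughly: lowercase, drop URLs, turn punctuation into spaces, discard tokens containing digits, collapse whitespace. An approximate reconstruction under those assumptions (emoji handling differs: the dataset keeps them, this regex drops them):

```python
import re

def clean_issue_text(raw: str) -> str:
    """Approximate the dump's cleaned-text column (assumed pipeline)."""
    text = raw.lower()
    text = re.sub(r"https?://\S+", " ", text)   # drop URLs
    text = re.sub(r"[^a-z0-9\s]", " ", text)    # punctuation -> space
    # drop any token that still contains a digit (versions, ids, hashes)
    tokens = [t for t in text.split() if not any(c.isdigit() for c in t)]
    return " ".join(tokens)
```

For example, `clean_issue_text("Verified on FF51(Android 6.0.1).")` yields `verified on android`, matching the cleaned column of the Firefox record above.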
79,588
15,586,186,758
IssuesEvent
2021-03-18 01:22:11
ziednov007/JavaSpring
https://api.github.com/repos/ziednov007/JavaSpring
opened
CVE-2020-36183 (High) detected in jackson-databind-2.9.6.jar
security vulnerability
## CVE-2020-36183 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: JavaSpring/app/build.gradle</p> <p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar,/root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.0.5.RELEASE.jar (Root Library) - spring-boot-starter-json-2.0.5.RELEASE.jar - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.docx4j.org.apache.xalan.lib.sql.JNDIConnectionPool. 
<p>Publish Date: 2021-01-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36183>CVE-2020-36183</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/3003">https://github.com/FasterXML/jackson-databind/issues/3003</a></p> <p>Release Date: 2021-01-07</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-36183 (High) detected in jackson-databind-2.9.6.jar - ## CVE-2020-36183 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.9.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: JavaSpring/app/build.gradle</p> <p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar,/root/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.6/cfa4f316351a91bfd95cb0644c6a2c95f52db1fc/jackson-databind-2.9.6.jar</p> <p> Dependency Hierarchy: - spring-boot-starter-web-2.0.5.RELEASE.jar (Root Library) - spring-boot-starter-json-2.0.5.RELEASE.jar - :x: **jackson-databind-2.9.6.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to org.docx4j.org.apache.xalan.lib.sql.JNDIConnectionPool. 
<p>Publish Date: 2021-01-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36183>CVE-2020-36183</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/3003">https://github.com/FasterXML/jackson-databind/issues/3003</a></p> <p>Release Date: 2021-01-07</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file javaspring app build gradle path to vulnerable library root gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar root gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring boot starter web release jar root library spring boot starter json release jar x jackson databind jar vulnerable library vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to org org apache xalan lib sql jndiconnectionpool publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
7,072
5,841,698,326
IssuesEvent
2017-05-10 02:08:01
tensorflow/models
https://api.github.com/repos/tensorflow/models
closed
[slim] performance reduce when train cifarnet with multi-gpu
stat:awaiting tensorflower type:bug/performance
I want to train cifarnet on single machine with 4 gpus, but the performance reduces comparing with training with only one gpu. ## [slim] Train cifarnet using the default script slim/scripts/train_cifarnet_on_cifar10.sh When using the default script the speed is as follow: ``` INFO:tensorflow:global step 13900: loss = 0.7609 (0.06 sec/step) ``` ## Modify slim/scripts/train_cifarnet_on_cifar10.sh by set num_clones=4 The speed become slow (I also try change num_preprocessing_threads = 1/2/4/8/16, num_readers=4/8, useless) ``` INFO:tensorflow:global step 14000: loss = 0.7438 (0.26 sec/step) INFO:tensorflow:global step 14100: loss = 0.6690 (0.26 sec/step) ``` ## Hardware Four Titan X 02:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) 03:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) 06:00.0 VGA compatible controller: ASPEED Technology, Inc. ASPEED Graphics Family (rev 30) 82:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) 83:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) +------------------------------------------------------+ | NVIDIA-SMI 352.30 Driver Version: 352.30 | |-----------------------------------+----------------------+------------------------+ | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | |=========================+=================+=================| | 0 GeForce GTX TIT... On | 0000:02:00.0 Off | N/A | | 28% 67C P2 75W / 250W | 228MiB / 12287MiB | 0% Default | +----------------------------------+-----------------------+------------------------+ 。。。 32 processor each as follow : processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 63 model name : Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz ## Finally Could anyone give some advices? I read some issues about multi-gpu in this rep, but still can't solve this. 
I think it caused by IO, because I notice that when train on single gpu the GPU-Util is above 90%( when train with 4 gpus, the GPU-Util is about 20% ). And I don't think it's due to the hardware performance of my machine.
True
[slim] performance reduce when train cifarnet with multi-gpu - I want to train cifarnet on single machine with 4 gpus, but the performance reduces comparing with training with only one gpu. ## [slim] Train cifarnet using the default script slim/scripts/train_cifarnet_on_cifar10.sh When using the default script the speed is as follow: ``` INFO:tensorflow:global step 13900: loss = 0.7609 (0.06 sec/step) ``` ## Modify slim/scripts/train_cifarnet_on_cifar10.sh by set num_clones=4 The speed become slow (I also try change num_preprocessing_threads = 1/2/4/8/16, num_readers=4/8, useless) ``` INFO:tensorflow:global step 14000: loss = 0.7438 (0.26 sec/step) INFO:tensorflow:global step 14100: loss = 0.6690 (0.26 sec/step) ``` ## Hardware Four Titan X 02:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) 03:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) 06:00.0 VGA compatible controller: ASPEED Technology, Inc. ASPEED Graphics Family (rev 30) 82:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) 83:00.0 VGA compatible controller: NVIDIA Corporation Device 17c2 (rev a1) +------------------------------------------------------+ | NVIDIA-SMI 352.30 Driver Version: 352.30 | |-----------------------------------+----------------------+------------------------+ | GPU Name Persistence-M | Bus-Id Disp.A | Volatile Uncorr. ECC | | Fan Temp Perf Pwr:Usage/Cap | Memory-Usage | GPU-Util Compute M. | |=========================+=================+=================| | 0 GeForce GTX TIT... On | 0000:02:00.0 Off | N/A | | 28% 67C P2 75W / 250W | 228MiB / 12287MiB | 0% Default | +----------------------------------+-----------------------+------------------------+ 。。。 32 processor each as follow : processor : 0 vendor_id : GenuineIntel cpu family : 6 model : 63 model name : Intel(R) Xeon(R) CPU E5-2630 v3 @ 2.40GHz ## Finally Could anyone give some advices? I read some issues about multi-gpu in this rep, but still can't solve this. 
I think it caused by IO, because I notice that when train on single gpu the GPU-Util is above 90%( when train with 4 gpus, the GPU-Util is about 20% ). And I don't think it's due to the hardware performance of my machine.
non_process
performance reduce when train cifarnet with multi gpu i want to train cifarnet on single machine with gpus but the performance reduces comparing with training with only one gpu train cifarnet using the default script slim scripts train cifarnet on sh when using the default script the speed is as follow info tensorflow global step loss sec step modify slim scripts train cifarnet on sh by set num clones the speed become slow i also try change num preprocessing threads num readers useless info tensorflow global step loss sec step info tensorflow global step loss sec step hardware four titan x vga compatible controller nvidia corporation device rev vga compatible controller nvidia corporation device rev vga compatible controller aspeed technology inc aspeed graphics family rev vga compatible controller nvidia corporation device rev vga compatible controller nvidia corporation device rev nvidia smi driver version gpu name persistence m bus id disp a volatile uncorr ecc fan temp perf pwr usage cap memory usage gpu util compute m geforce gtx tit on off n a default 。。。 processor each as follow : processor vendor id genuineintel cpu family model model name intel r xeon r cpu finally could anyone give some advices i read some issues about multi gpu in this rep but still can t solve this i think it caused by io because i notice that when train on single gpu the gpu util is above when train with gpus the gpu util is about and i don t think it s due to the hardware performance of my machine
0
3,144
6,198,743,808
IssuesEvent
2017-07-05 19:54:32
hashicorp/packer
https://api.github.com/repos/hashicorp/packer
closed
Question: export virtualbox and qemu box while building vagrant box
post-processor/vagrant question
Hello, is there any way to export a virtualbox image and a qemu image while building the vagrant box or do I need to open the vagrant box after the build manually to get the images? My goal is that I have the following Output: > a virtualbox image > a qemu image > a virtualbox vagrant box > a qemu/libvirt vagrant box At the moment I just get the both vagrant boxes.
1.0
Question: export virtualbox and qemu box while building vagrant box - Hello, is there any way to export a virtualbox image and a qemu image while building the vagrant box or do I need to open the vagrant box after the build manually to get the images? My goal is that I have the following Output: > a virtualbox image > a qemu image > a virtualbox vagrant box > a qemu/libvirt vagrant box At the moment I just get the both vagrant boxes.
process
question export virtualbox and qemu box while building vagrant box hello is there any way to export a virtualbox image and a qemu image while building the vagrant box or do i need to open the vagrant box after the build manually to get the images my goal is that i have the following output a virtualbox image a qemu image a virtualbox vagrant box a qemu libvirt vagrant box at the moment i just get the both vagrant boxes
1
11,814
14,630,498,796
IssuesEvent
2020-12-23 17:50:32
KalikaKay/Author-Classification-Project
https://api.github.com/repos/KalikaKay/Author-Classification-Project
closed
Extract Authors
preprocessing
Extract the authors from the text. Include them in a separate field or somethin.
1.0
Extract Authors - Extract the authors from the text. Include them in a separate field or somethin.
process
extract authors extract the authors from the text include them in a separate field or somethin
1
578
3,054,613,624
IssuesEvent
2015-08-13 04:47:19
e-government-ua/i
https://api.github.com/repos/e-government-ua/i
closed
Display the service name in the request list (on the official's dashboard).
bug hi priority In process of testing test
![image](https://cloud.githubusercontent.com/assets/11442397/9172888/26f87c46-3f7f-11e5-9905-82028f7d17ec.png) Currently the Activiti business-process id is displayed. The full service name must be displayed instead.
1.0
Display the service name in the request list (on the official's dashboard). - ![image](https://cloud.githubusercontent.com/assets/11442397/9172888/26f87c46-3f7f-11e5-9905-82028f7d17ec.png) Currently the Activiti business-process id is displayed. The full service name must be displayed instead.
process
display the service name in the request list on the official s dashboard currently the activiti business process id is displayed the full service name must be displayed instead
1
2,221
5,071,297,862
IssuesEvent
2016-12-26 12:20:51
jlm2017/jlm-video-subtitles
https://api.github.com/repos/jlm2017/jlm-video-subtitles
opened
[Subtitles] [FR] MÉLENCHON : Réunion publique au Lamentin en Martinique
Language: French Process: Someone is working on this issue Process: [1] Writing in progress
# Video title MÉLENCHON : Réunion publique au Lamentin en Martinique # URL https://www.youtube.com/watch?v=4vhQIa0KAtw # Youtube subtitles language Français # Duration 1:21:49 # Subtitles URL https://www.youtube.com/timedtext_editor?lang=fr&ui=hd&action_mde_edit_form=1&tab=captions&v=4vhQIa0KAtw&bl=vmp&ref=player
2.0
[Subtitles] [FR] MÉLENCHON : Réunion publique au Lamentin en Martinique - # Video title MÉLENCHON : Réunion publique au Lamentin en Martinique # URL https://www.youtube.com/watch?v=4vhQIa0KAtw # Youtube subtitles language Français # Duration 1:21:49 # Subtitles URL https://www.youtube.com/timedtext_editor?lang=fr&ui=hd&action_mde_edit_form=1&tab=captions&v=4vhQIa0KAtw&bl=vmp&ref=player
process
mélenchon réunion publique au lamentin en martinique video title mélenchon réunion publique au lamentin en martinique url youtube subtitles language français duration subtitles url
1
17,299
23,115,478,178
IssuesEvent
2022-07-27 16:16:36
GoogleCloudPlatform/professional-services-data-validator
https://api.github.com/repos/GoogleCloudPlatform/professional-services-data-validator
closed
Implement uniform logging
type: process priority: p2 good first issue
DVT uses print statements and logging simultaneously. For example, we usually [print logs when verbose is enabled](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/data_validation/data_validation.py#L277), but in some cases we use the [python logging](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/data_validation/config_manager.py#L465). We should standardize to only use the [Python logger](https://docs.python.org/3/howto/logging.html#when-to-use-logging) and categorize logs by level (DEBUG, INFO, WARNING, etc.)
1.0
Implement uniform logging - DVT uses print statements and logging simultaneously. For example, we usually [print logs when verbose is enabled](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/data_validation/data_validation.py#L277), but in some cases we use the [python logging](https://github.com/GoogleCloudPlatform/professional-services-data-validator/blob/develop/data_validation/config_manager.py#L465). We should standardize to only use the [Python logger](https://docs.python.org/3/howto/logging.html#when-to-use-logging) and categorize logs by level (DEBUG, INFO, WARNING, etc.)
process
implement uniform logging dvt uses print statements and logging simultaneously for example we usually but in some cases we use the we should standardize to only use the and categorize logs by level debug info warning etc
1
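The DVT record above prescribes replacing mixed `print` statements with the standard Python logger, categorized by level. A minimal sketch of what that standardization might look like — the `data_validation` logger name and the `run_validation` helper are illustrative assumptions, not DVT's actual API:

```python
import logging

# Hypothetical module-level logger replacing ad-hoc print() calls.
logger = logging.getLogger("data_validation")

def run_validation(verbose: bool = False) -> None:
    # Instead of `if verbose: print(...)`, emit at DEBUG level and let the
    # logger's level decide whether the message is shown.
    logger.setLevel(logging.DEBUG if verbose else logging.INFO)
    logger.debug("Built validation config")   # formerly a verbose-only print
    logger.info("Starting validation run")    # formerly a plain print
    logger.warning("Column count mismatch")   # formerly an ad-hoc warning print
```

With this pattern the `--verbose` flag only adjusts the log level; no call sites need to branch on it.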
190,875
22,171,610,521
IssuesEvent
2022-06-06 01:47:24
KDWSS/dd-trace-java
https://api.github.com/repos/KDWSS/dd-trace-java
opened
CVE-2022-31023 (Medium) detected in play_2.12-2.6.25.jar, play_2.12-2.6.20.jar
security vulnerability
## CVE-2022-31023 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>play_2.12-2.6.25.jar</b>, <b>play_2.12-2.6.20.jar</b></p></summary> <p> <details><summary><b>play_2.12-2.6.25.jar</b></p></summary> <p>Play</p> <p>Path to dependency file: /dd-smoke-tests/play-2.6/play-2.6.gradle</p> <p>Path to vulnerable library: /caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.25/bf884d1cd287f962e6e7e533224fc746665282c/play_2.12-2.6.25.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.25/bf884d1cd287f962e6e7e533224fc746665282c/play_2.12-2.6.25.jar</p> <p> Dependency Hierarchy: - :x: **play_2.12-2.6.25.jar** (Vulnerable Library) </details> <details><summary><b>play_2.12-2.6.20.jar</b></p></summary> <p>Play</p> <p>Path to dependency file: /dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle</p> <p>Path to vulnerable library: /caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.20/3b98a80a2c850e8933234ee21e90c0619db14de7/play_2.12-2.6.20.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.20/3b98a80a2c850e8933234ee21e90c0619db14de7/play_2.12-2.6.20.jar</p> <p> Dependency Hierarchy: - :x: **play_2.12-2.6.20.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Play Framework is a web framework for Java and Scala. Verions prior to 2.8.16 are vulnerable to generation of error messages containing sensitive information. 
Play Framework, when run in dev mode, shows verbose errors for easy debugging, including an exception stack trace. Play does this by configuring its `DefaultHttpErrorHandler` to do so based on the application mode. In its Scala API Play also provides a static object `DefaultHttpErrorHandler` that is configured to always show verbose errors. This is used as a default value in some Play APIs, so it is possible to inadvertently use this version in production. It is also possible to improperly configure the `DefaultHttpErrorHandler` object instance as the injected error handler. Both of these situations could result in verbose errors displaying to users in a production application, which could expose sensitive information from the application. In particular, the constructor for `CORSFilter` and `apply` method for `CORSActionBuilder` use the static object `DefaultHttpErrorHandler` as a default value. This is patched in Play Framework 2.8.16. The `DefaultHttpErrorHandler` object has been changed to use the prod-mode behavior, and `DevHttpErrorHandler` has been introduced for the dev-mode behavior. A workaround is available. When constructing a `CORSFilter` or `CORSActionBuilder`, ensure that a properly-configured error handler is passed. Generally this should be done by using the `HttpErrorHandler` instance provided through dependency injection or through Play's `BuiltInComponents`. Ensure that the application is not using the `DefaultHttpErrorHandler` static object in any code that may be run in production. 
<p>Publish Date: 2022-06-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31023>CVE-2022-31023</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/playframework/playframework/security/advisories/GHSA-p9p4-97g9-wcrh">https://github.com/playframework/playframework/security/advisories/GHSA-p9p4-97g9-wcrh</a></p> <p>Release Date: 2022-06-02</p> <p>Fix Resolution: 2.8.16</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- 
<REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.typesafe.play","packageName":"play_2.12","packageVersion":"2.6.25","packageFilePaths":["/dd-smoke-tests/play-2.6/play-2.6.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.typesafe.play:play_2.12:2.6.25","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.16","isBinary":false},{"packageType":"Java","groupId":"com.typesafe.play","packageName":"play_2.12","packageVersion":"2.6.20","packageFilePaths":["/dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.typesafe.play:play_2.12:2.6.20","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.16","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-31023","vulnerabilityDetails":"Play Framework is a web framework for Java and Scala. Verions prior to 2.8.16 are vulnerable to generation of error messages containing sensitive information. Play Framework, when run in dev mode, shows verbose errors for easy debugging, including an exception stack trace. Play does this by configuring its `DefaultHttpErrorHandler` to do so based on the application mode. In its Scala API Play also provides a static object `DefaultHttpErrorHandler` that is configured to always show verbose errors. This is used as a default value in some Play APIs, so it is possible to inadvertently use this version in production. It is also possible to improperly configure the `DefaultHttpErrorHandler` object instance as the injected error handler. Both of these situations could result in verbose errors displaying to users in a production application, which could expose sensitive information from the application. In particular, the constructor for `CORSFilter` and `apply` method for `CORSActionBuilder` use the static object `DefaultHttpErrorHandler` as a default value. 
This is patched in Play Framework 2.8.16. The `DefaultHttpErrorHandler` object has been changed to use the prod-mode behavior, and `DevHttpErrorHandler` has been introduced for the dev-mode behavior. A workaround is available. When constructing a `CORSFilter` or `CORSActionBuilder`, ensure that a properly-configured error handler is passed. Generally this should be done by using the `HttpErrorHandler` instance provided through dependency injection or through Play\u0027s `BuiltInComponents`. Ensure that the application is not using the `DefaultHttpErrorHandler` static object in any code that may be run in production.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31023","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2022-31023 (Medium) detected in play_2.12-2.6.25.jar, play_2.12-2.6.20.jar - ## CVE-2022-31023 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>play_2.12-2.6.25.jar</b>, <b>play_2.12-2.6.20.jar</b></p></summary> <p> <details><summary><b>play_2.12-2.6.25.jar</b></p></summary> <p>Play</p> <p>Path to dependency file: /dd-smoke-tests/play-2.6/play-2.6.gradle</p> <p>Path to vulnerable library: /caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.25/bf884d1cd287f962e6e7e533224fc746665282c/play_2.12-2.6.25.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.25/bf884d1cd287f962e6e7e533224fc746665282c/play_2.12-2.6.25.jar</p> <p> Dependency Hierarchy: - :x: **play_2.12-2.6.25.jar** (Vulnerable Library) </details> <details><summary><b>play_2.12-2.6.20.jar</b></p></summary> <p>Play</p> <p>Path to dependency file: /dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle</p> <p>Path to vulnerable library: /caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.20/3b98a80a2c850e8933234ee21e90c0619db14de7/play_2.12-2.6.20.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.typesafe.play/play_2.12/2.6.20/3b98a80a2c850e8933234ee21e90c0619db14de7/play_2.12-2.6.20.jar</p> <p> Dependency Hierarchy: - :x: **play_2.12-2.6.20.jar** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Play Framework is a web framework for Java and Scala. 
Verions prior to 2.8.16 are vulnerable to generation of error messages containing sensitive information. Play Framework, when run in dev mode, shows verbose errors for easy debugging, including an exception stack trace. Play does this by configuring its `DefaultHttpErrorHandler` to do so based on the application mode. In its Scala API Play also provides a static object `DefaultHttpErrorHandler` that is configured to always show verbose errors. This is used as a default value in some Play APIs, so it is possible to inadvertently use this version in production. It is also possible to improperly configure the `DefaultHttpErrorHandler` object instance as the injected error handler. Both of these situations could result in verbose errors displaying to users in a production application, which could expose sensitive information from the application. In particular, the constructor for `CORSFilter` and `apply` method for `CORSActionBuilder` use the static object `DefaultHttpErrorHandler` as a default value. This is patched in Play Framework 2.8.16. The `DefaultHttpErrorHandler` object has been changed to use the prod-mode behavior, and `DevHttpErrorHandler` has been introduced for the dev-mode behavior. A workaround is available. When constructing a `CORSFilter` or `CORSActionBuilder`, ensure that a properly-configured error handler is passed. Generally this should be done by using the `HttpErrorHandler` instance provided through dependency injection or through Play's `BuiltInComponents`. Ensure that the application is not using the `DefaultHttpErrorHandler` static object in any code that may be run in production. 
<p>Publish Date: 2022-06-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31023>CVE-2022-31023</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/playframework/playframework/security/advisories/GHSA-p9p4-97g9-wcrh">https://github.com/playframework/playframework/security/advisories/GHSA-p9p4-97g9-wcrh</a></p> <p>Release Date: 2022-06-02</p> <p>Fix Resolution: 2.8.16</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- 
<REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.typesafe.play","packageName":"play_2.12","packageVersion":"2.6.25","packageFilePaths":["/dd-smoke-tests/play-2.6/play-2.6.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.typesafe.play:play_2.12:2.6.25","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.16","isBinary":false},{"packageType":"Java","groupId":"com.typesafe.play","packageName":"play_2.12","packageVersion":"2.6.20","packageFilePaths":["/dd-java-agent/benchmark-integration/play-perftest/play-perftest.gradle"],"isTransitiveDependency":false,"dependencyTree":"com.typesafe.play:play_2.12:2.6.20","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.16","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2022-31023","vulnerabilityDetails":"Play Framework is a web framework for Java and Scala. Verions prior to 2.8.16 are vulnerable to generation of error messages containing sensitive information. Play Framework, when run in dev mode, shows verbose errors for easy debugging, including an exception stack trace. Play does this by configuring its `DefaultHttpErrorHandler` to do so based on the application mode. In its Scala API Play also provides a static object `DefaultHttpErrorHandler` that is configured to always show verbose errors. This is used as a default value in some Play APIs, so it is possible to inadvertently use this version in production. It is also possible to improperly configure the `DefaultHttpErrorHandler` object instance as the injected error handler. Both of these situations could result in verbose errors displaying to users in a production application, which could expose sensitive information from the application. In particular, the constructor for `CORSFilter` and `apply` method for `CORSActionBuilder` use the static object `DefaultHttpErrorHandler` as a default value. 
This is patched in Play Framework 2.8.16. The `DefaultHttpErrorHandler` object has been changed to use the prod-mode behavior, and `DevHttpErrorHandler` has been introduced for the dev-mode behavior. A workaround is available. When constructing a `CORSFilter` or `CORSActionBuilder`, ensure that a properly-configured error handler is passed. Generally this should be done by using the `HttpErrorHandler` instance provided through dependency injection or through Play\u0027s `BuiltInComponents`. Ensure that the application is not using the `DefaultHttpErrorHandler` static object in any code that may be run in production.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31023","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"None","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in play jar play jar cve medium severity vulnerability vulnerable libraries play jar play jar play jar play path to dependency file dd smoke tests play play gradle path to vulnerable library caches modules files com typesafe play play play jar home wss scanner gradle caches modules files com typesafe play play play jar dependency hierarchy x play jar vulnerable library play jar play path to dependency file dd java agent benchmark integration play perftest play perftest gradle path to vulnerable library caches modules files com typesafe play play play jar home wss scanner gradle caches modules files com typesafe play play play jar dependency hierarchy x play jar vulnerable library found in head commit a href found in base branch master vulnerability details play framework is a web framework for java and scala verions prior to are vulnerable to generation of error messages containing sensitive information play framework when run in dev mode shows verbose errors for easy debugging including an exception stack trace play does this by configuring its defaulthttperrorhandler to do so based on the application mode in its scala api play also provides a static object defaulthttperrorhandler that is configured to always show verbose errors this is used as a default value in some play apis so it is possible to inadvertently use this version in production it is also possible to improperly configure the defaulthttperrorhandler object instance as the injected error handler both of these situations could result in verbose errors displaying to users in a production application which could expose sensitive information from the application in particular the constructor for corsfilter and apply method for corsactionbuilder use the static object defaulthttperrorhandler as a default value this is patched in play framework the defaulthttperrorhandler object has been changed to use the prod mode behavior and devhttperrorhandler has been introduced for the dev mode 
behavior a workaround is available when constructing a corsfilter or corsactionbuilder ensure that a properly configured error handler is passed generally this should be done by using the httperrorhandler instance provided through dependency injection or through play s builtincomponents ensure that the application is not using the defaulthttperrorhandler static object in any code that may be run in production publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree com typesafe play play isminimumfixversionavailable true minimumfixversion isbinary false packagetype java groupid com typesafe play packagename play packageversion packagefilepaths istransitivedependency false dependencytree com typesafe play play isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails play framework is a web framework for java and scala verions prior to are vulnerable to generation of error messages containing sensitive information play framework when run in dev mode shows verbose errors for easy debugging including an exception stack trace play does this by configuring its defaulthttperrorhandler to do so based on the application mode in its scala api play also provides a static object defaulthttperrorhandler that is configured to always show verbose errors this is used as a default value in some play apis so it is possible to inadvertently use this version in production it is also 
possible to improperly configure the defaulthttperrorhandler object instance as the injected error handler both of these situations could result in verbose errors displaying to users in a production application which could expose sensitive information from the application in particular the constructor for corsfilter and apply method for corsactionbuilder use the static object defaulthttperrorhandler as a default value this is patched in play framework the defaulthttperrorhandler object has been changed to use the prod mode behavior and devhttperrorhandler has been introduced for the dev mode behavior a workaround is available when constructing a corsfilter or corsactionbuilder ensure that a properly configured error handler is passed generally this should be done by using the httperrorhandler instance provided through dependency injection or through play builtincomponents ensure that the application is not using the defaulthttperrorhandler static object in any code that may be run in production vulnerabilityurl
0
172,945
21,088,929,433
IssuesEvent
2022-04-04 01:00:32
temporalio/subscription-workflow-project-template-java
https://api.github.com/repos/temporalio/subscription-workflow-project-template-java
opened
logback-classic-1.2.3.jar: 1 vulnerabilities (highest severity is: 6.6)
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>logback-classic-1.2.3.jar</b></p></summary> <p></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-core/1.2.3/864344400c3d4d92dfeb0a305dc87d953677c03c/logback-core-1.2.3.jar</p> <p> </details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2021-42550](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.6 | multiple | Transitive | N/A | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-42550</summary> ### Vulnerable Libraries - <b>logback-core-1.2.3.jar</b>, <b>logback-classic-1.2.3.jar</b></p> <p> ### <b>logback-core-1.2.3.jar</b></p> <p>logback-core module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-core/1.2.3/864344400c3d4d92dfeb0a305dc87d953677c03c/logback-core-1.2.3.jar</p> <p> Dependency Hierarchy: - logback-classic-1.2.3.jar (Root Library) - :x: **logback-core-1.2.3.jar** (Vulnerable Library) ### <b>logback-classic-1.2.3.jar</b></p> <p>logback-classic module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: 
/.qos.logback/logback-classic/1.2.3/7c4f3c474fb2c041d8028740440937705ebb473a/logback-classic-1.2.3.jar</p> <p> Dependency Hierarchy: - :x: **logback-classic-1.2.3.jar** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers. <p>Publish Date: 2021-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.6</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550</a></p> <p>Release Date: 2021-12-16</p> <p>Fix Resolution: ch.qos.logback:logback-classic:1.2.9;ch.qos.logback:logback-core:1.2.9</p> </p> <p></p> </details> <!-- <REMEDIATE>[{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"ch.qos.logback","packageName":"logback-core","packageVersion":"1.2.3","packageFilePaths":["/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"ch.qos.logback:logback-classic:1.2.3;ch.qos.logback:logback-core:1.2.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"ch.qos.logback:logback-classic:1.2.9;ch.qos.logback:logback-core:1.2.9","isBinary":false},{"packageType":"Java","groupId":"ch.qos.logback","packageName":"logback-classic","packageVersion":"1.2.3","packageFilePaths":["/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"ch.qos.logback:logback-classic:1.2.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"ch.qos.logback:logback-classic:1.2.9;ch.qos.logback:logback-core:1.2.9","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-42550","vulnerabilityDetails":"In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550","cvss3Severity":"medium","cvss3Score":"6.6","cvss3Metrics":{"A":"High","AC":"High","PR":"High","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> -->
True
logback-classic-1.2.3.jar: 1 vulnerabilities (highest severity is: 6.6) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>logback-classic-1.2.3.jar</b></p></summary> <p></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-core/1.2.3/864344400c3d4d92dfeb0a305dc87d953677c03c/logback-core-1.2.3.jar</p> <p> </details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2021-42550](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.6 | multiple | Transitive | N/A | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2021-42550</summary> ### Vulnerable Libraries - <b>logback-core-1.2.3.jar</b>, <b>logback-classic-1.2.3.jar</b></p> <p> ### <b>logback-core-1.2.3.jar</b></p> <p>logback-core module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: /build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/ch.qos.logback/logback-core/1.2.3/864344400c3d4d92dfeb0a305dc87d953677c03c/logback-core-1.2.3.jar</p> <p> Dependency Hierarchy: - logback-classic-1.2.3.jar (Root Library) - :x: **logback-core-1.2.3.jar** (Vulnerable Library) ### <b>logback-classic-1.2.3.jar</b></p> <p>logback-classic module</p> <p>Library home page: <a href="http://logback.qos.ch">http://logback.qos.ch</a></p> <p>Path to dependency file: /build.gradle</p> 
<p>Path to vulnerable library: /.qos.logback/logback-classic/1.2.3/7c4f3c474fb2c041d8028740440937705ebb473a/logback-classic-1.2.3.jar</p> <p> Dependency Hierarchy: - :x: **logback-classic-1.2.3.jar** (Vulnerable Library) <p>Found in base branch: <b>main</b></p> </p> <p></p> ### Vulnerability Details <p> In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers. <p>Publish Date: 2021-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550>CVE-2021-42550</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>6.6</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-42550</a></p> <p>Release Date: 2021-12-16</p> <p>Fix Resolution: ch.qos.logback:logback-classic:1.2.9;ch.qos.logback:logback-core:1.2.9</p> </p> <p></p> </details> <!-- <REMEDIATE>[{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"ch.qos.logback","packageName":"logback-core","packageVersion":"1.2.3","packageFilePaths":["/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"ch.qos.logback:logback-classic:1.2.3;ch.qos.logback:logback-core:1.2.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"ch.qos.logback:logback-classic:1.2.9;ch.qos.logback:logback-core:1.2.9","isBinary":false},{"packageType":"Java","groupId":"ch.qos.logback","packageName":"logback-classic","packageVersion":"1.2.3","packageFilePaths":["/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"ch.qos.logback:logback-classic:1.2.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"ch.qos.logback:logback-classic:1.2.9;ch.qos.logback:logback-core:1.2.9","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"CVE-2021-42550","vulnerabilityDetails":"In logback version 1.2.7 and prior versions, an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from LDAP servers.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-42550","cvss3Severity":"medium","cvss3Score":"6.6","cvss3Metrics":{"A":"High","AC":"High","PR":"High","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}]</REMEDIATE> -->
non_process
logback classic jar vulnerabilities highest severity is vulnerable library logback classic jar path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files ch qos logback logback core logback core jar vulnerabilities cve severity cvss dependency type fixed in remediation available medium multiple transitive n a details cve vulnerable libraries logback core jar logback classic jar logback core jar logback core module library home page a href path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files ch qos logback logback core logback core jar dependency hierarchy logback classic jar root library x logback core jar vulnerable library logback classic jar logback classic module library home page a href path to dependency file build gradle path to vulnerable library qos logback logback classic logback classic jar dependency hierarchy x logback classic jar vulnerable library found in base branch main vulnerability details in logback version and prior versions an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from ldap servers publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ch qos logback logback classic ch qos logback logback core istransitivedependency true dependencytree ch qos logback logback classic ch qos logback logback core isminimumfixversionavailable true minimumfixversion ch qos logback logback classic ch qos logback logback core isbinary false packagetype java groupid ch qos logback packagename logback classic 
packageversion packagefilepaths istransitivedependency false dependencytree ch qos logback logback classic isminimumfixversionavailable true minimumfixversion ch qos logback logback classic ch qos logback logback core isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails in logback version and prior versions an attacker with the required privileges to edit configurations files could craft a malicious configuration allowing to execute arbitrary code loaded from ldap servers vulnerabilityurl
0
412,756
27,871,212,639
IssuesEvent
2023-03-21 13:35:31
rooch-network/moveos
https://api.github.com/repos/rooch-network/moveos
opened
[CodeStyle] Define Error code style
documentation
Two Options: * The first letter is E, and the rest are all capitalized. * The first letter is E, and the following words use camel case.
1.0
[CodeStyle] Define Error code style - Two Options: * The first letter is E, and the rest are all capitalized. * The first letter is E, and the following words use camel case.
non_process
define error code style two options the first letter is e and the rest are all capitalized the first letter is e and the following words use camel case
0
220,624
7,369,144,717
IssuesEvent
2018-03-13 00:59:16
nco/nco
https://api.github.com/repos/nco/nco
opened
Find cause of core dumps with complex netCDF4 group hierarchies
high priority
NCO is unable to print all the contents of some netCDF4 files. This [file](http://dust.ess.uci.edu/tmp/s5p.nc) illustrates the problem: ``` zender@skyglow:~$ ncks -m s5p.nc > ~/foo 2>&1 *** Error in `ncks': free(): invalid next size (fast): 0x0000000000d67c00 *** ======= Backtrace: ========= /usr/lib64/libc.so.6(+0x7cbac)[0x154e2a3acbac] /usr/lib64/libc.so.6(+0x87a59)[0x154e2a3b7a59] /usr/lib64/libc.so.6(cfree+0x16e)[0x154e2a3bd3be] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_free+0xe)[0x154e2b9a7c0e] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_xtr_cf_var_add+0x2c8)[0x154e2b9854e8] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_xtr_cf_add+0x62)[0x154e2b985772] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_bld_trv_tbl+0x513)[0x154e2b99a103] ncks[0x404d9c] ``` `ncdump` has no problem printing this file (NB: sometime `ncdump` can print corrupt files that NCO chokes on because NCO tries to build a complete table of the CF relationships and dimension IDs involved in the file, and the netCDF4 library, at least the renaming functions, has bugs that can produce invalid entries such as duplicate IDs for the same dimension). Please investigate whether this is an NCO bug, or ... perhaps a corrupt file? The failure above seems to occur during the creation of the list of bounds/coordinates associated with variables. However, some variables include CF bounds/coordinates attributes that reference Out-of-Group (OOG) variables that are specified with an absolute path. This makes me wonder if NCO is robust to absolute path specifications and OOG coordinates/bounds in group hierarchies. I thought it was, but it has not been thoroughly tested. Thanks!
1.0
Find cause of core dumps with complex netCDF4 group hierarchies - NCO is unable to print all the contents of some netCDF4 files. This [file](http://dust.ess.uci.edu/tmp/s5p.nc) illustrates the problem: ``` zender@skyglow:~$ ncks -m s5p.nc > ~/foo 2>&1 *** Error in `ncks': free(): invalid next size (fast): 0x0000000000d67c00 *** ======= Backtrace: ========= /usr/lib64/libc.so.6(+0x7cbac)[0x154e2a3acbac] /usr/lib64/libc.so.6(+0x87a59)[0x154e2a3b7a59] /usr/lib64/libc.so.6(cfree+0x16e)[0x154e2a3bd3be] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_free+0xe)[0x154e2b9a7c0e] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_xtr_cf_var_add+0x2c8)[0x154e2b9854e8] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_xtr_cf_add+0x62)[0x154e2b985772] /home/zender/lib/libnco-4.7.4-alpha01.so(nco_bld_trv_tbl+0x513)[0x154e2b99a103] ncks[0x404d9c] ``` `ncdump` has no problem printing this file (NB: sometime `ncdump` can print corrupt files that NCO chokes on because NCO tries to build a complete table of the CF relationships and dimension IDs involved in the file, and the netCDF4 library, at least the renaming functions, has bugs that can produce invalid entries such as duplicate IDs for the same dimension). Please investigate whether this is an NCO bug, or ... perhaps a corrupt file? The failure above seems to occur during the creation of the list of bounds/coordinates associated with variables. However, some variables include CF bounds/coordinates attributes that reference Out-of-Group (OOG) variables that are specified with an absolute path. This makes me wonder if NCO is robust to absolute path specifications and OOG coordinates/bounds in group hierarchies. I thought it was, but it has not been thoroughly tested. Thanks!
non_process
find cause of core dumps with complex group hierarchies nco is unable to print all the contents of some files this illustrates the problem zender skyglow ncks m nc foo error in ncks free invalid next size fast backtrace usr libc so usr libc so usr libc so cfree home zender lib libnco so nco free home zender lib libnco so nco xtr cf var add home zender lib libnco so nco xtr cf add home zender lib libnco so nco bld trv tbl ncks ncdump has no problem printing this file nb sometime ncdump can print corrupt files that nco chokes on because nco tries to build a complete table of the cf relationships and dimension ids involved in the file and the library at least the renaming functions has bugs that can produce invalid entries such as duplicate ids for the same dimension please investigate whether this is an nco bug or perhaps a corrupt file the failure above seems to occur during the creation of the list of bounds coordinates associated with variables however some variables include cf bounds coordinates attributes that reference out of group oog variables that are specified with an absolute path this makes me wonder if nco is robust to absolute path specifications and oog coordinates bounds in group hierarchies i thought it was but it has not been thoroughly tested thanks
0
137,168
11,101,666,108
IssuesEvent
2019-12-16 21:56:36
linkerd/linkerd2
https://api.github.com/repos/linkerd/linkerd2
closed
Add tests for namespace button and filter functionality to dashboard unit tests
area/test area/web priority/P1
Followup from #3667. Add a test for the namespace select button, and for filter functionality in both namespace selection and filter-able `Table` components, to JS Unit tests. Also see [comment](https://github.com/linkerd/linkerd2/pull/3667#discussion_r342723228) from @grampelberg on possible filter Regex refactoring
1.0
Add tests for namespace button and filter functionality to dashboard unit tests - Followup from #3667. Add a test for the namespace select button, and for filter functionality in both namespace selection and filter-able `Table` components, to JS Unit tests. Also see [comment](https://github.com/linkerd/linkerd2/pull/3667#discussion_r342723228) from @grampelberg on possible filter Regex refactoring
non_process
add tests for namespace button and filter functionality to dashboard unit tests followup from add a test for the namespace select button and for filter functionality in both namespace selection and filter able table components to js unit tests also see from grampelberg on possible filter regex refactoring
0
14,099
16,988,821,354
IssuesEvent
2021-06-30 17:33:15
googleapis/python-api-core
https://api.github.com/repos/googleapis/python-api-core
closed
Undeprecate IAM factory helpers
type: process
In addition to deprecating legacy role assignments, fd47fda5e3f5eca63522c8d81cffa22bc2a29ab6 (https://github.com/googleapis/google-cloud-python/pull/9869) deprecated the `Policy.user`, `Policy.service_account`, `Policy.group`, `Policy.domain`, `Policy.all_users`, and `Policy.authenticated_users` entity factory helpers. ISTM that those helpers should *not* be deprecated: they hide spelling details from users, and were not part of the "binding assignments" bit being deprecated (the assignable `Policy.owners`, `Policy.editors`, `Policy.viewers` properties): one still has to be able to construct the correctly-spelled entity when using the expected `Policy.bindings[<ROLENAME>]` spelling.
1.0
Undeprecate IAM factory helpers - In addition to deprecating legacy role assignments, fd47fda5e3f5eca63522c8d81cffa22bc2a29ab6 (https://github.com/googleapis/google-cloud-python/pull/9869) deprecated the `Policy.user`, `Policy.service_account`, `Policy.group`, `Policy.domain`, `Policy.all_users`, and `Policy.authenticated_users` entity factory helpers. ISTM that those helpers should *not* be deprecated: they hide spelling details from users, and were not part of the "binding assignments" bit being deprecated (the assignable `Policy.owners`, `Policy.editors`, `Policy.viewers` properties): one still has to be able to construct the correctly-spelled entity when using the expected `Policy.bindings[<ROLENAME>]` spelling.
process
undeprecate iam factory helpers in addition to deprecating legacy role assignments deprecated the policy user policy service account policy group policy domain policy all users and policy authenticated users entity factory helpers istm that those helpers should not be deprecated they hide spelling details from users and were not part of the binding assignments bit being deprecated the assignable policy owners policy editors policy viewers properties one still has to be able to construct the correctly spelled entity when using the expected policy bindings spelling
1
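The record above argues that the entity factory helpers are worth keeping because they hide member-string spelling details from users. The following is a minimal illustrative sketch of that point. It is a hypothetical stand-in, not the actual `google.api_core.iam` implementation; the member-string formats shown (`user:`, `serviceAccount:`, and so on) are the standard IAM spellings assumed here for illustration:

```python
class Policy:
    """Hypothetical stand-in for an IAM policy object.

    It illustrates why the factory helpers are useful: each one hides
    the exact member-string spelling that IAM bindings expect.
    """

    def __init__(self):
        # role name -> set of member strings, i.e. Policy.bindings[<ROLENAME>]
        self.bindings = {}

    @staticmethod
    def user(email):
        return "user:" + email

    @staticmethod
    def service_account(email):
        return "serviceAccount:" + email

    @staticmethod
    def group(email):
        return "group:" + email

    @staticmethod
    def domain(name):
        return "domain:" + name

    @staticmethod
    def all_users():
        return "allUsers"

    @staticmethod
    def authenticated_users():
        return "allAuthenticatedUsers"


# With the helpers, a caller never spells "user:" by hand:
policy = Policy()
policy.bindings["roles/viewer"] = {Policy.user("alice@example.com")}
```

Without such helpers, every caller populating `Policy.bindings[<ROLENAME>]` would have to remember spellings like `serviceAccount:` (note the camel case), which is exactly the mistake the helpers prevent.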
67,310
20,961,604,486
IssuesEvent
2022-03-27 21:47:57
abedmaatalla/sipdroid
https://api.github.com/repos/abedmaatalla/sipdroid
closed
Doesn't allow changing Google account when creating PBX
Priority-Medium Type-Defect auto-migrated
``` When clicking "New PBX linked to my Google Voice" it automatically selects one of the Google accounts on my phone, but not the one with Google voice. There is no option to change it. There needs to be an option to select the proper account. Is there a workaround in the meantime? ``` Original issue reported on code.google.com by `danny.pi...@gmail.com` on 18 Jul 2014 at 1:47
1.0
Doesn't allow changing Google account when creating PBX - ``` When clicking "New PBX linked to my Google Voice" it automatically selects one of the Google accounts on my phone, but not the one with Google voice. There is no option to change it. There needs to be an option to select the proper account. Is there a workaround in the meantime? ``` Original issue reported on code.google.com by `danny.pi...@gmail.com` on 18 Jul 2014 at 1:47
non_process
doesn t allow changing google account when creating pbx when clicking new pbx linked to my google voice it automatically selects one of the google accounts on my phone but not the one with google voice there is no option to change it there needs to be an option to select the proper account is there a workaround in the meantime original issue reported on code google com by danny pi gmail com on jul at
0
142,228
19,083,641,661
IssuesEvent
2021-11-29 01:01:16
renfei/renfei-java-sdk
https://api.github.com/repos/renfei/renfei-java-sdk
opened
CVE-2021-39153 (High) detected in xstream-1.4.17.jar
security vulnerability
## CVE-2021-39153 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.17.jar</b></p></summary> <p></p> <p>Library home page: <a href="http://x-stream.github.io">http://x-stream.github.io</a></p> <p>Path to dependency file: renfei-java-sdk/pom.xml</p> <p>Path to vulnerable library: itory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar</p> <p> Dependency Hierarchy: - :x: **xstream-1.4.17.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> XStream is a simple library to serialize objects to XML and back again. In affected versions this vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream, if using the version out of the box with Java runtime version 14 to 8 or with JavaFX installed. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. XStream 1.4.18 uses no longer a blacklist by default, since it cannot be secured for general purpose. 
<p>Publish Date: 2021-08-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-39153>CVE-2021-39153</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-39153">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-39153</a></p> <p>Release Date: 2021-08-23</p> <p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.18</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-39153 (High) detected in xstream-1.4.17.jar - ## CVE-2021-39153 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>xstream-1.4.17.jar</b></p></summary> <p></p> <p>Library home page: <a href="http://x-stream.github.io">http://x-stream.github.io</a></p> <p>Path to dependency file: renfei-java-sdk/pom.xml</p> <p>Path to vulnerable library: itory/com/thoughtworks/xstream/xstream/1.4.17/xstream-1.4.17.jar</p> <p> Dependency Hierarchy: - :x: **xstream-1.4.17.jar** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> XStream is a simple library to serialize objects to XML and back again. In affected versions this vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream, if using the version out of the box with Java runtime version 14 to 8 or with JavaFX installed. No user is affected, who followed the recommendation to setup XStream's security framework with a whitelist limited to the minimal required types. XStream 1.4.18 uses no longer a blacklist by default, since it cannot be secured for general purpose. 
<p>Publish Date: 2021-08-23 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-39153>CVE-2021-39153</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Changed - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-39153">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-39153</a></p> <p>Release Date: 2021-08-23</p> <p>Fix Resolution: com.thoughtworks.xstream:xstream:1.4.18</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in xstream jar cve high severity vulnerability vulnerable library xstream jar library home page a href path to dependency file renfei java sdk pom xml path to vulnerable library itory com thoughtworks xstream xstream xstream jar dependency hierarchy x xstream jar vulnerable library found in base branch master vulnerability details xstream is a simple library to serialize objects to xml and back again in affected versions this vulnerability may allow a remote attacker to load and execute arbitrary code from a remote host only by manipulating the processed input stream if using the version out of the box with java runtime version to or with javafx installed no user is affected who followed the recommendation to setup xstream s security framework with a whitelist limited to the minimal required types xstream uses no longer a blacklist by default since it cannot be secured for general purpose publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com thoughtworks xstream xstream step up your open source security game with whitesource
0
9,801
12,814,767,165
IssuesEvent
2020-07-04 20:55:10
percybolmer/workflow
https://api.github.com/repos/percybolmer/workflow
closed
Change PropertyMap to work correctly
bug help wanted processor
Remove Requierd, Available fields Instead add AddProperty(name, desc string, req bool) SetProperty(name string, Value int{}) GetProperty(name string) Update ValidateProperties so that it instead checks If all Requierd properties is not nil
1.0
Change PropertyMap to work correctly - Remove Requierd, Available fields Instead add AddProperty(name, desc string, req bool) SetProperty(name string, Value int{}) GetProperty(name string) Update ValidateProperties so that it instead checks If all Requierd properties is not nil
process
change propertymap to work correctly remove requierd available fields instead add addproperty name desc string req bool setproperty name string value int getproperty name string update validateproperties so that it instead checks if all requierd properties is not nil
1
10,767
13,562,078,851
IssuesEvent
2020-09-18 06:08:20
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
closed
Script fails to load when accessing an website with TestCafe: error: (failed net::ERR_CONTENT_DECODING_FAILED)
AREA: server SYSTEM: resource processing TYPE: bug
### What is your Test Scenario? A javaScript file fails to load when accessing the website with TestCafe. The same javaScript file is loading correctly when accessing the website manually Steps to reproduce it: - For TestCafe - Run the attached code - Manually: it cannot be reproduced. ### What is the Current behavior? When accessing the test website with TestCafe one script from OneTrust is not loaded, instead is thrown an error: **failed net::ERR_CONTENT_DECODING_FAILED** ![image](https://user-images.githubusercontent.com/5627543/91850723-ace92000-ec66-11ea-9918-44588e5cd557.png) OneTrust cookie banner is not displayed: ![image](https://user-images.githubusercontent.com/5627543/91850769-c2f6e080-ec66-11ea-8646-ab1302cc6a85.png) ### What is the Expected behavior? When accessing the test website with TestCafe all scripts are loaded correctly. OneTrust Cookie banner is displayed: ![image](https://user-images.githubusercontent.com/5627543/91850887-f3d71580-ec66-11ea-8f29-a611de65cc94.png) ### What is your web application and your TestCafe test code? <summary>https://www.westrock.com/</summary> <summary>Your complete test code (or attach your test files):</summary> `` test('Visit the page', async t => { await t.navigateTo('https://www.westrock.com/'); })`` </details> <details> <summary>Your complete configuration file (if any):</summary> <!-- Paste your complete test config file here (even if it is huge): --> ``` { "screenshotPath": "screenshots", "browsers": "chrome", "screenshots": { "takeOnFails": true, "fullPage": true }, "debugOnFail": false, "skipJsErrors": true, "disablePageCaching": false, "color": true, "speed": 0.90, "pageLoadTimeout": 30000, "assertionTimeout": 30000, "selectorTimeout": 30000, "concurrency": 1 } ``` </details> <details> <summary>Screenshots:</summary> <!-- If applicable, add screenshots to help explain the issue. 
--> The Request headers for that Javascript file - using TestCafe: ![image](https://user-images.githubusercontent.com/5627543/91933519-c2eff280-ecf1-11ea-9c7d-d768c2d9c899.png) - with manual testing; ![image](https://user-images.githubusercontent.com/5627543/91933537-d00ce180-ecf1-11ea-815a-0d3f33612b4c.png) ``` ``` </details> ### Steps to Reproduce: 1. Run the attached code ### Your Environment details: * testcafe version: 1.9.1 * node.js version: v11.2.0 * command-line arguments: no additional parameters * browser name and version: Chrome - Version 84.0.4147.135 (Official Build) (64-bit) * platform and version: win 10
1.0
Script fails to load when accessing an website with TestCafe: error: (failed net::ERR_CONTENT_DECODING_FAILED) - ### What is your Test Scenario? A javaScript file fails to load when accessing the website with TestCafe. The same javaScript file is loading correctly when accessing the website manually Steps to reproduce it: - For TestCafe - Run the attached code - Manually: it cannot be reproduced. ### What is the Current behavior? When accessing the test website with TestCafe one script from OneTrust is not loaded, instead is thrown an error: **failed net::ERR_CONTENT_DECODING_FAILED** ![image](https://user-images.githubusercontent.com/5627543/91850723-ace92000-ec66-11ea-9918-44588e5cd557.png) OneTrust cookie banner is not displayed: ![image](https://user-images.githubusercontent.com/5627543/91850769-c2f6e080-ec66-11ea-8646-ab1302cc6a85.png) ### What is the Expected behavior? When accessing the test website with TestCafe all scripts are loaded correctly. OneTrust Cookie banner is displayed: ![image](https://user-images.githubusercontent.com/5627543/91850887-f3d71580-ec66-11ea-8f29-a611de65cc94.png) ### What is your web application and your TestCafe test code? <summary>https://www.westrock.com/</summary> <summary>Your complete test code (or attach your test files):</summary> `` test('Visit the page', async t => { await t.navigateTo('https://www.westrock.com/'); })`` </details> <details> <summary>Your complete configuration file (if any):</summary> <!-- Paste your complete test config file here (even if it is huge): --> ``` { "screenshotPath": "screenshots", "browsers": "chrome", "screenshots": { "takeOnFails": true, "fullPage": true }, "debugOnFail": false, "skipJsErrors": true, "disablePageCaching": false, "color": true, "speed": 0.90, "pageLoadTimeout": 30000, "assertionTimeout": 30000, "selectorTimeout": 30000, "concurrency": 1 } ``` </details> <details> <summary>Screenshots:</summary> <!-- If applicable, add screenshots to help explain the issue. 
--> The Request headers for that Javascript file - using TestCafe: ![image](https://user-images.githubusercontent.com/5627543/91933519-c2eff280-ecf1-11ea-9c7d-d768c2d9c899.png) - with manual testing; ![image](https://user-images.githubusercontent.com/5627543/91933537-d00ce180-ecf1-11ea-815a-0d3f33612b4c.png) ``` ``` </details> ### Steps to Reproduce: 1. Run the attached code ### Your Environment details: * testcafe version: 1.9.1 * node.js version: v11.2.0 * command-line arguments: no additional parameters * browser name and version: Chrome - Version 84.0.4147.135 (Official Build) (64-bit) * platform and version: win 10
process
script fails to load when accessing an website with testcafe error failed net err content decoding failed what is your test scenario a javascript file fails to load when accessing the website with testcafe the same javascript file is loading correctly when accessing the website manually steps to reproduce it for testcafe run the attached code manually it cannot be reproduced what is the current behavior when accessing the test website with testcafe one script from onetrust is not loaded instead is thrown an error failed net err content decoding failed onetrust cookie banner is not displayed what is the expected behavior when accessing the test website with testcafe all scripts are loaded correctly onetrust cookie banner is displayed what is your web application and your testcafe test code your complete test code or attach your test files test visit the page async t await t navigateto your complete configuration file if any screenshotpath screenshots browsers chrome screenshots takeonfails true fullpage true debugonfail false skipjserrors true disablepagecaching false color true speed pageloadtimeout assertiontimeout selectortimeout concurrency screenshots the request headers for that javascript file using testcafe with manual testing steps to reproduce run the attached code your environment details testcafe version node js version command line arguments no additional parameters browser name and version chrome version official build bit platform and version win
1
1,535
4,149,817,174
IssuesEvent
2016-06-15 15:30:16
symfony/symfony
https://api.github.com/repos/symfony/symfony
closed
(Process) Passing a new environment variable is problematic on Debian distros
Process
*Use case*: You want to use `Process` to launch a subprocess. The sub-process requires an extra environment variable (eg `CHILDVAR=somevalue`), but any other environment variables (such as `PATH`, `TMPDIR`, `SSH_AUTH_SOCK`, ad nauseum) should be passed-through without modification. In the contract for `Process::__construct()`, the only way to pass a new variable is to append it to `$_ENV`, e.g. ```php $p = new Process($command, $cwd, $_ENV + array("CHILDVAR"=>"somevalue")); ``` That is intuitive, but it does not work in Debian-based distros because it relies on `$_ENV`. In Debian's PHP packages, the `php.ini` sets `variables_order=GPCS` which kills `$_ENV`. (By contrast, upstream `php.net` defaults to `variables_order=EGPCS` which supports `$_ENV`.). To see this in action, run this on any PHP CLI: ```bash php -d variables_order=EGPCS -r 'print_r($_ENV);' php -d variables_order=GPCS -r 'print_r($_ENV);' ``` A few ideas for resolving this problem -- none of which seem pretty: 1. Downstream code should guard all references to `$_ENV` and generate warnings about `variables_order`. (Problem: That sucks for a large portion of the user community which runs on Debian/Ubuntu.) 2. Complain to debian.org, ubuntu.com, etal, and haggle with them about fixing the defaults. I suspect that they've confused `variables_order` with the inter-related option `request_order`. (Problem: Even if they agree to change it, there's a large install base on old/past releases.) 3. Complain to `php.net` and ask for an API that reliably provides all environment variables. (Problem: Install-base.) 4. Change the contract of `Process` to allow adding new variables. (Problem: Compatibility break?) 5. Distribute a new helper function/class to approximate `$_ENV`. E.g. ```php namespace Symfony\Process; class Environment { /** * Get a list of all environment variables. * * This is loosely equivalent to $_ENV; however, $_ENV does not work * in some common configurations. 
* * @return array * Example: $result['PATH'] === '/usr/local/bin:/usr/bin:/bin' */ public static function get() { // Wishful thinking: return $_ENV; // Some systems -- such as Debian and Ubuntu -- disable $_ENV by // default (by setting `variables_order=GPCS`), which makes it harder // to write well-behaved CLI scripts which propagate the environment // content. // This is silly policy and most likely an accident of history -- // e.g. `variables_order` was introduced in PHP 5.0; the better // alternative `request_order` didn't appear until PHP 5.3. // Regardless, the information is still available -- getenv($key) can // return values for specific variables, and `$_SERVER` contains an // amalgamation of 'environment' and 'server' variables. To make an // accurate(ish) list, we need to filter $_SERVER using getenv(). // See also: http://php.net/manual/en/ini.core.php#ini.variables-order $env = array(); foreach (array_keys($_SERVER) as $key) { $value = getenv($key); if ($value !== null && $value !== false) { $env[$key] = $value; } } return $env; } } print_r(Environment::get()); ```
1.0
(Process) Passing a new environment variable is problematic on Debian distros - *Use case*: You want to use `Process` to launch a subprocess. The sub-process requires an extra environment variable (eg `CHILDVAR=somevalue`), but any other environment variables (such as `PATH`, `TMPDIR`, `SSH_AUTH_SOCK`, ad nauseum) should be passed-through without modification. In the contract for `Process::__construct()`, the only way to pass a new variable is to append it to `$_ENV`, e.g. ```php $p = new Process($command, $cwd, $_ENV + array("CHILDVAR"=>"somevalue")); ``` That is intuitive, but it does not work in Debian-based distros because it relies on `$_ENV`. In Debian's PHP packages, the `php.ini` sets `variables_order=GPCS` which kills `$_ENV`. (By contrast, upstream `php.net` defaults to `variables_order=EGPCS` which supports `$_ENV`.). To see this in action, run this on any PHP CLI: ```bash php -d variables_order=EGPCS -r 'print_r($_ENV);' php -d variables_order=GPCS -r 'print_r($_ENV);' ``` A few ideas for resolving this problem -- none of which seem pretty: 1. Downstream code should guard all references to `$_ENV` and generate warnings about `variables_order`. (Problem: That sucks for a large portion of the user community which runs on Debian/Ubuntu.) 2. Complain to debian.org, ubuntu.com, etal, and haggle with them about fixing the defaults. I suspect that they've confused `variables_order` with the inter-related option `request_order`. (Problem: Even if they agree to change it, there's a large install base on old/past releases.) 3. Complain to `php.net` and ask for an API that reliably provides all environment variables. (Problem: Install-base.) 4. Change the contract of `Process` to allow adding new variables. (Problem: Compatibility break?) 5. Distribute a new helper function/class to approximate `$_ENV`. E.g. ```php namespace Symfony\Process; class Environment { /** * Get a list of all environment variables. 
* * This is loosely equivalent to $_ENV; however, $_ENV does not work * in some common configurations. * * @return array * Example: $result['PATH'] === '/usr/local/bin:/usr/bin:/bin' */ public static function get() { // Wishful thinking: return $_ENV; // Some systems -- such as Debian and Ubuntu -- disable $_ENV by // default (by setting `variables_order=GPCS`), which makes it harder // to write well-behaved CLI scripts which propagate the environment // content. // This is silly policy and most likely an accident of history -- // e.g. `variables_order` was introduced in PHP 5.0; the better // alternative `request_order` didn't appear until PHP 5.3. // Regardless, the information is still available -- getenv($key) can // return values for specific variables, and `$_SERVER` contains an // amalgamation of 'environment' and 'server' variables. To make an // accurate(ish) list, we need to filter $_SERVER using getenv(). // See also: http://php.net/manual/en/ini.core.php#ini.variables-order $env = array(); foreach (array_keys($_SERVER) as $key) { $value = getenv($key); if ($value !== null && $value !== false) { $env[$key] = $value; } } return $env; } } print_r(Environment::get()); ```
process
process passing a new environment variable is problematic on debian distros use case you want to use process to launch a subprocess the sub process requires an extra environment variable eg childvar somevalue but any other environment variables such as path tmpdir ssh auth sock ad nauseum should be passed through without modification in the contract for process construct the only way to pass a new variable is to append it to env e g php p new process command cwd env array childvar somevalue that is intuitive but it does not work in debian based distros because it relies on env in debian s php packages the php ini sets variables order gpcs which kills env by contrast upstream php net defaults to variables order egpcs which supports env to see this in action run this on any php cli bash php d variables order egpcs r print r env php d variables order gpcs r print r env a few ideas for resolving this problem none of which seem pretty downstream code should guard all references to env and generate warnings about variables order problem that sucks for a large portion of the user community which runs on debian ubuntu complain to debian org ubuntu com etal and haggle with them about fixing the defaults i suspect that they ve confused variables order with the inter related option request order problem even if they agree to change it there s a large install base on old past releases complain to php net and ask for an api that reliably provides all environment variables problem install base change the contract of process to allow adding new variables problem compatibility break distribute a new helper function class to approximate env e g php namespace symfony process class environment get a list of all environment variables this is loosely equivalent to env however env does not work in some common configurations return array example result usr local bin usr bin bin public static function get wishful thinking return env some systems such as debian and ubuntu disable env by 
default by setting variables order gpcs which makes it harder to write well behaved cli scripts which propagate the environment content this is silly policy and most likely an accident of history e g variables order was introduced in php the better alternative request order didn t appear until php regardless the information is still available getenv key can return values for specific variables and server contains an amalgamation of environment and server variables to make an accurate ish list we need to filter server using getenv see also env array foreach array keys server as key value getenv key if value null value false env value return env print r environment get
1
239,128
19,822,583,938
IssuesEvent
2022-01-20 00:21:55
JHS-Viking-Robotics/FRC-2022
https://api.github.com/repos/JHS-Viking-Robotics/FRC-2022
closed
Hopper: refine commands and update button bindings
Category: Hopper Type: Refactor Priority: In Testing
The commands for the Hopper subsystem, as well as the safety override mode, need to be polished up and bound to buttons on the driver controller. The following need to be set up: - Unload sequence - Go up on button hold, expel balls for 2 seconds on release (Y button?) - Load - Go down and run intake (A button?) - Default - Not sure yet what this needs to be - Manual - Use joystick for the lift, and buttons for the intake Give each command some polishing, and check that the buttons are configured correctly.
1.0
Hopper: refine commands and update button bindings - The commands for the Hopper subsystem, as well as the safety override mode, need to be polished up and bound to buttons on the driver controller. The following need to be set up: - Unload sequence - Go up on button hold, expel balls for 2 seconds on release (Y button?) - Load - Go down and run intake (A button?) - Default - Not sure yet what this needs to be - Manual - Use joystick for the lift, and buttons for the intake Give each command some polishing, and check that the buttons are configured correctly.
non_process
hopper refine commands and update button bindings the commands for the hopper subsystem as well as the safety override mode need to be polished up and bound to buttons on the driver controller the following need to be set up unload sequence go up on button hold expel balls for seconds on release y button load go down and run intake a button default not sure yet what this needs to be manual use joystick for the lift and buttons for the intake give each command some polishing and check that the buttons are configured correctly
0
20,500
27,165,861,619
IssuesEvent
2023-02-17 15:17:35
SpikeInterface/spikeinterface
https://api.github.com/repos/SpikeInterface/spikeinterface
closed
error during whitenning
bug preprocessing
Hello, I was trying to add whitenning in my preprocess chain, but I got the following error: Traceback (most recent call last): File "<string>", line 1, in <module> File "C:\Users\juventin\AppData\Local\Programs\Python\Python310\lib\multiprocessing\spawn.py", line 116, in spawn_main exitcode = _main(fd, parent_sentinel) File "C:\Users\juventin\AppData\Local\Programs\Python\Python310\lib\multiprocessing\spawn.py", line 126, in _main self = reduction.pickle.load(from_parent) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\core\base.py", line 370, in from_dict extractor = _load_extractor_from_dict(d) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\core\base.py", line 844, in _load_extractor_from_dict kwargs[k] = _load_extractor_from_dict(v) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\core\base.py", line 861, in _load_extractor_from_dict extractor = cls(**kwargs) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\preprocessing\whiten.py", line 58, in __init__ rec_segment = WhitenRecordingSegment(parent_segment, W, M, dtype_) UnboundLocalError: local variable 'M' referenced before assignment The issue is associated to the following code: the recording is a SpikeGLXRecordingExtractor: 384 channels - 1 segments - 30.0kHz - 608.451s ` recording = spikeinterface.preprocessing.bandpass_filter(recording) recording = spikeinterface.preprocessing.phase_shift(recording) recording = spikeinterface.preprocessing.whiten(recording) recording = spikeinterface.preprocessing.common_reference(recording, reference='global', operator='median') peaks = detect_peaks(recording, method='locally_exclusive', **job_kwargs) ` Surprisingly, if the replace the detect_peaks by a recording.get_traces(), I don't get the error.
1.0
error during whitenning - Hello, I was trying to add whitenning in my preprocess chain, but I got the following error: Traceback (most recent call last): File "<string>", line 1, in <module> File "C:\Users\juventin\AppData\Local\Programs\Python\Python310\lib\multiprocessing\spawn.py", line 116, in spawn_main exitcode = _main(fd, parent_sentinel) File "C:\Users\juventin\AppData\Local\Programs\Python\Python310\lib\multiprocessing\spawn.py", line 126, in _main self = reduction.pickle.load(from_parent) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\core\base.py", line 370, in from_dict extractor = _load_extractor_from_dict(d) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\core\base.py", line 844, in _load_extractor_from_dict kwargs[k] = _load_extractor_from_dict(v) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\core\base.py", line 861, in _load_extractor_from_dict extractor = cls(**kwargs) File "D:\local_code_library\spike_sorters\spikeinterface\spikeinterface\preprocessing\whiten.py", line 58, in __init__ rec_segment = WhitenRecordingSegment(parent_segment, W, M, dtype_) UnboundLocalError: local variable 'M' referenced before assignment The issue is associated to the following code: the recording is a SpikeGLXRecordingExtractor: 384 channels - 1 segments - 30.0kHz - 608.451s ` recording = spikeinterface.preprocessing.bandpass_filter(recording) recording = spikeinterface.preprocessing.phase_shift(recording) recording = spikeinterface.preprocessing.whiten(recording) recording = spikeinterface.preprocessing.common_reference(recording, reference='global', operator='median') peaks = detect_peaks(recording, method='locally_exclusive', **job_kwargs) ` Surprisingly, if the replace the detect_peaks by a recording.get_traces(), I don't get the error.
process
error during whitenning hello i was trying to add whitenning in my preprocess chain but i got the following error traceback most recent call last file line in file c users juventin appdata local programs python lib multiprocessing spawn py line in spawn main exitcode main fd parent sentinel file c users juventin appdata local programs python lib multiprocessing spawn py line in main self reduction pickle load from parent file d local code library spike sorters spikeinterface spikeinterface core base py line in from dict extractor load extractor from dict d file d local code library spike sorters spikeinterface spikeinterface core base py line in load extractor from dict kwargs load extractor from dict v file d local code library spike sorters spikeinterface spikeinterface core base py line in load extractor from dict extractor cls kwargs file d local code library spike sorters spikeinterface spikeinterface preprocessing whiten py line in init rec segment whitenrecordingsegment parent segment w m dtype unboundlocalerror local variable m referenced before assignment the issue is associated to the following code the recording is a spikeglxrecordingextractor channels segments recording spikeinterface preprocessing bandpass filter recording recording spikeinterface preprocessing phase shift recording recording spikeinterface preprocessing whiten recording recording spikeinterface preprocessing common reference recording reference global operator median peaks detect peaks recording method locally exclusive job kwargs surprisingly if the replace the detect peaks by a recording get traces i don t get the error
1
17,583
23,392,568,091
IssuesEvent
2022-08-11 19:22:40
apache/arrow-rs
https://api.github.com/repos/apache/arrow-rs
closed
Fix 1.63.0 Clippy Lints
enhancement development-process help wanted
**Is your feature request related to a problem or challenge? Please describe what you are trying to do.** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] (This section helps Arrow developers understand the context and *why* for this feature, in addition to the *what*) --> Rust 1.63.0 has been released, and this has brought along new lints which are now failing on master :disappointed: **Describe the solution you'd like** <!-- A clear and concise description of what you want to happen. --> We should fix these lints to unblock the pipes **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
1.0
Fix 1.63.0 Clippy Lints - **Is your feature request related to a problem or challenge? Please describe what you are trying to do.** <!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] (This section helps Arrow developers understand the context and *why* for this feature, in addition to the *what*) --> Rust 1.63.0 has been released, and this has brought along new lints which are now failing on master :disappointed: **Describe the solution you'd like** <!-- A clear and concise description of what you want to happen. --> We should fix these lints to unblock the pipes **Describe alternatives you've considered** <!-- A clear and concise description of any alternative solutions or features you've considered. --> **Additional context** <!-- Add any other context or screenshots about the feature request here. -->
process
fix clippy lints is your feature request related to a problem or challenge please describe what you are trying to do a clear and concise description of what the problem is ex i m always frustrated when this section helps arrow developers understand the context and why for this feature in addition to the what rust has been released and this has brought along new lints which are now failing on master disappointed describe the solution you d like a clear and concise description of what you want to happen we should fix these lints to unblock the pipes describe alternatives you ve considered a clear and concise description of any alternative solutions or features you ve considered additional context add any other context or screenshots about the feature request here
1
475,097
13,686,870,748
IssuesEvent
2020-09-30 09:18:15
StrangeLoopGames/EcoIssues
https://api.github.com/repos/StrangeLoopGames/EcoIssues
opened
[0.9.0 develop-68] Update information for Fill Types tooltips
Category: UI Priority: Low
![image](https://user-images.githubusercontent.com/45708377/94666872-b376c000-0316-11eb-9649-f3249947f2f8.png) you can describe what needs to be done to place a specific Fill type
1.0
[0.9.0 develop-68] Update information for Fill Types tooltips - ![image](https://user-images.githubusercontent.com/45708377/94666872-b376c000-0316-11eb-9649-f3249947f2f8.png) you can describe what needs to be done to place a specific Fill type
non_process
update information for fill types tooltips you can describe what needs to be done to place a specific fill type
0
179,675
30,284,528,098
IssuesEvent
2023-07-08 13:56:04
DO-NOTTO-DO/iOS-NOTTODO
https://api.github.com/repos/DO-NOTTO-DO/iOS-NOTTODO
closed
[Fix] Fix the date-selection UI on the create/edit view and the date text when switching screens via swipe
Fix Design 윤서
## 🍏 Issue <!-- Briefly describe the issue --> - Keep the selected date when the toggle is opened - Fix the date text when editing a NOTTODO from the swipe view - Fix the navigationTitle when editing a NOTTODO ## 📝 To-do <!-- List the tasks to be done --> - [ ] Keep the selected date when the toggle is opened - [ ] Fix the date text when editing a NOTTODO from the swipe view - [ ] Fix the navigationTitle when editing a NOTTODO
1.0
[Fix] Fix the date-selection UI on the create/edit view and the date text when switching screens via swipe - ## 🍏 Issue <!-- Briefly describe the issue --> - Keep the selected date when the toggle is opened - Fix the date text when editing a NOTTODO from the swipe view - Fix the navigationTitle when editing a NOTTODO ## 📝 To-do <!-- List the tasks to be done --> - [ ] Keep the selected date when the toggle is opened - [ ] Fix the date text when editing a NOTTODO from the swipe view - [ ] Fix the navigationTitle when editing a NOTTODO
non_process
fix the date selection ui on the create edit view and the date text when switching screens via swipe 🍏 issue keep the selected date when the toggle is opened fix the date text when editing a nottodo from the swipe view fix the navigationtitle when editing a nottodo 📝 to do keep the selected date when the toggle is opened fix the date text when editing a nottodo from the swipe view fix the navigationtitle when editing a nottodo
0
13,229
15,701,997,536
IssuesEvent
2021-03-26 12:00:42
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
CRS issue with GRASS raster tools
Bug Feedback Processing
Dear developer team I found a strange issue regarding the CRS of raster output data in GRASS tools. I explain that bug on an often reproduced case: I am processing raster data (10x10m)for Austria in EPSG:31287 (MGI /Austria Lambert) and all input data as well as the project itself have this CRS properly defined. When I run different raster tools from the GRASS toolbox (in my case mostly r.reclass and r.neighbors) in QGIS (I tested it on version 3.12, 3.14 and 3.16 as well as on MacOS and Windows - on all setups with the same issue) the CRS of the output file gets a user defined CRS named USER:100029 plus a lot of projection parameters (those parameters are ident to MGI /Austria Lambert specifications plus WGS84 to MGI transformation parameters). The raster cells of this layers are visually slightly shifted in the map view (see images below). When I change the CRS in the layer properties > source to EPSG:31287 or assign EPSG:31287 as projection (with GDAL > Raster projections > Assign projection) everything is fine. See images below for further details on that issue. The user defined CRS in the select list: <img width="1410" alt="HWbqH" src="https://user-images.githubusercontent.com/29124423/112622644-99ccb600-8e2b-11eb-967d-ff1f0ca26f30.png"> The slight cell shift on the left and in place after the assingment of EPSG:31287 <img width="1073" alt="rVGmH" src="https://user-images.githubusercontent.com/29124423/112622660-a0f3c400-8e2b-11eb-9853-e451dd96e5b4.png"> Thank you, Thomas
1.0
CRS issue with GRASS raster tools - Dear developer team I found a strange issue regarding the CRS of raster output data in GRASS tools. I explain that bug on an often reproduced case: I am processing raster data (10x10m)for Austria in EPSG:31287 (MGI /Austria Lambert) and all input data as well as the project itself have this CRS properly defined. When I run different raster tools from the GRASS toolbox (in my case mostly r.reclass and r.neighbors) in QGIS (I tested it on version 3.12, 3.14 and 3.16 as well as on MacOS and Windows - on all setups with the same issue) the CRS of the output file gets a user defined CRS named USER:100029 plus a lot of projection parameters (those parameters are ident to MGI /Austria Lambert specifications plus WGS84 to MGI transformation parameters). The raster cells of this layers are visually slightly shifted in the map view (see images below). When I change the CRS in the layer properties > source to EPSG:31287 or assign EPSG:31287 as projection (with GDAL > Raster projections > Assign projection) everything is fine. See images below for further details on that issue. The user defined CRS in the select list: <img width="1410" alt="HWbqH" src="https://user-images.githubusercontent.com/29124423/112622644-99ccb600-8e2b-11eb-967d-ff1f0ca26f30.png"> The slight cell shift on the left and in place after the assingment of EPSG:31287 <img width="1073" alt="rVGmH" src="https://user-images.githubusercontent.com/29124423/112622660-a0f3c400-8e2b-11eb-9853-e451dd96e5b4.png"> Thank you, Thomas
process
crs issue with grass raster tools dear developer team i found a strange issue regarding the crs of raster output data in grass tools i explain that bug on an often reproduced case i am processing raster data for austria in epsg mgi austria lambert and all input data as well as the project itself have this crs properly defined when i run different raster tools from the grass toolbox in my case mostly r reclass and r neighbors in qgis i tested it on version and as well as on macos and windows on all setups with the same issue the crs of the output file gets a user defined crs named user plus a lot of projection parameters those parameters are ident to mgi austria lambert specifications plus to mgi transformation parameters the raster cells of this layers are visually slightly shifted in the map view see images below when i change the crs in the layer properties source to epsg or assign epsg as projection with gdal raster projections assign projection everything is fine see images below for further details on that issue the user defined crs in the select list img width alt hwbqh src the slight cell shift on the left and in place after the assingment of epsg img width alt rvgmh src thank you thomas
1
218,538
24,376,045,267
IssuesEvent
2022-10-04 01:03:11
BrianMcDonaldWS/atlas
https://api.github.com/repos/BrianMcDonaldWS/atlas
opened
CVE-2022-42003 (Medium) detected in jackson-databind-2.10.2.jar
security vulnerability
## CVE-2022-42003 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.10.2.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.10.2.jar,/root/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.10.2.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.10.2.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In FasterXML jackson-databind before 2.14.0-rc1, resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting, when the UNWRAP_SINGLE_VALUE_ARRAYS feature is enabled. <p>Publish Date: 2022-10-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-42003>CVE-2022-42003</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p>
True
CVE-2022-42003 (Medium) detected in jackson-databind-2.10.2.jar - ## CVE-2022-42003 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.10.2.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to vulnerable library: /cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.10.2.jar,/root/.ivy2/cache/com.fasterxml.jackson.core/jackson-databind/bundles/jackson-databind-2.10.2.jar</p> <p> Dependency Hierarchy: - :x: **jackson-databind-2.10.2.jar** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In FasterXML jackson-databind before 2.14.0-rc1, resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting, when the UNWRAP_SINGLE_VALUE_ARRAYS feature is enabled. <p>Publish Date: 2022-10-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-42003>CVE-2022-42003</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p>
non_process
cve medium detected in jackson databind jar cve medium severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to vulnerable library cache com fasterxml jackson core jackson databind bundles jackson databind jar root cache com fasterxml jackson core jackson databind bundles jackson databind jar dependency hierarchy x jackson databind jar vulnerable library vulnerability details in fasterxml jackson databind before resource exhaustion can occur because of a lack of a check in primitive value deserializers to avoid deep wrapper array nesting when the unwrap single value arrays feature is enabled publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href
0
408,349
11,945,462,309
IssuesEvent
2020-04-03 05:51:49
Western-Health-Covid19-Collaboration/wh_covid19_app
https://api.github.com/repos/Western-Health-Covid19-Collaboration/wh_covid19_app
closed
Build out well designed Disclaimer & Conditions of use screen
#2 Priority Ready for Dev enhancement
The Disclaimer & Conditions of use screen needs to present a disclaimer. It is accessible from two places: 1. When the user first starts the app it is the first thing they see 2. You can access it again from a button in the information screen **We need to build build out a detailed version of the disclaimer & conditions of use screen** @greggmiller Can you please provide some information in this issue on what should be in this section? I think you had some copy? Once that is done assign the issue to Marc Edwards. @marcedwards Once greg has provided info, can you provide some design guidance in this issue in on how it should look. Once this is done, can you label this issue as "Ready for Dev" so that one of the devs can pick it up.
1.0
Build out well designed Disclaimer & Conditions of use screen - The Disclaimer & Conditions of use screen needs to present a disclaimer. It is accessible from two places: 1. When the user first starts the app it is the first thing they see 2. You can access it again from a button in the information screen **We need to build build out a detailed version of the disclaimer & conditions of use screen** @greggmiller Can you please provide some information in this issue on what should be in this section? I think you had some copy? Once that is done assign the issue to Marc Edwards. @marcedwards Once greg has provided info, can you provide some design guidance in this issue in on how it should look. Once this is done, can you label this issue as "Ready for Dev" so that one of the devs can pick it up.
non_process
build out well designed disclaimer conditions of use screen the disclaimer conditions of use screen needs to present a disclaimer it is accessible from two places when the user first starts the app it is the first thing they see you can access it again from a button in the information screen we need to build build out a detailed version of the disclaimer conditions of use screen greggmiller can you please provide some information in this issue on what should be in this section i think you had some copy once that is done assign the issue to marc edwards marcedwards once greg has provided info can you provide some design guidance in this issue in on how it should look once this is done can you label this issue as ready for dev so that one of the devs can pick it up
0
205,234
7,096,041,732
IssuesEvent
2018-01-14 00:21:01
twsswt/bug_buddy_jira_plugin
https://api.github.com/repos/twsswt/bug_buddy_jira_plugin
closed
Allow selection of multiple recommendation methods using command line arguments
feature enhancement priority:med
for example --method=skills --method=word-frequency --method=hybrid
1.0
Allow selection of multiple recommendation methods using command line arguments - for example --method=skills --method=word-frequency --method=hybrid
non_process
allow selection of multiple recommendation methods using command line arguments for example method skills method word frequency method hybrid
0
26,920
12,497,944,555
IssuesEvent
2020-06-01 17:23:21
Azure/azure-cli
https://api.github.com/repos/Azure/azure-cli
closed
Linux Runtime 'DOTNETCORE|3.1' is not supported.
Service Attention Web Apps
## Describe the bug **Command Name** `az webapp create` **Errors:** ``` Linux Runtime 'DOTNETCORE|3.1' is not supported.Please invoke 'list-runtimes' to cross check ``` ## To Reproduce: I have been working in azure portal Cloud shell and trying to create a Web App from command line az group create --name azlinux --location "west europe" az plan create --name azplan --resource-group azlinux --is-linux --sku F1 az webapp create --name "azwebapplinux" --resource-group azlinux --plan azplan --deployment-local-git --runtime "DOTNETCORE|3.1" when I used --runtime "DOTNETCORE|2.1" it was created succesfully in web GUI is 3.1 version listed and an app was succesfully created list-runtime did not listed neither DOTNETCORE|3.1 nor DOTNETCORE|2.1 (or any other core version) ## Expected Behavior creating an app with name azwebapplinux ## Environment Summary ``` Linux-4.15.0-1082-azure-x86_64-with-debian-stretch-sid Python 3.6.5 Installer: DEB azure-cli 2.5.1 * ``` ## Additional Context command-modules-nspkg 2.0.3 core 2.5.1 * nspkg 3.0.4 telemetry 1.0.4 Python location '/opt/az/bin/python3' Extensions directory '/home/ivan/.azure/cliextensions' Python (Linux) 3.6.5 (default, Apr 30 2020, 06:22:39) [GCC 5.4.0 20160609] Legal docs and information: aka.ms/AzureCliLegal You have 2 updates available. Consider updating your CLI installation with 'sudo apt-get update && sudo apt-get install --only-upgrade -y azure-cli'. Detailed instructions can be found at https://aka.ms/doc/UpdateAzureCliApt <!--Please don't remove this:--> <!--auto-generated-->
1.0
Linux Runtime 'DOTNETCORE|3.1' is not supported. - ## Describe the bug **Command Name** `az webapp create` **Errors:** ``` Linux Runtime 'DOTNETCORE|3.1' is not supported.Please invoke 'list-runtimes' to cross check ``` ## To Reproduce: I have been working in azure portal Cloud shell and trying to create a Web App from command line az group create --name azlinux --location "west europe" az plan create --name azplan --resource-group azlinux --is-linux --sku F1 az webapp create --name "azwebapplinux" --resource-group azlinux --plan azplan --deployment-local-git --runtime "DOTNETCORE|3.1" when I used --runtime "DOTNETCORE|2.1" it was created succesfully in web GUI is 3.1 version listed and an app was succesfully created list-runtime did not listed neither DOTNETCORE|3.1 nor DOTNETCORE|2.1 (or any other core version) ## Expected Behavior creating an app with name azwebapplinux ## Environment Summary ``` Linux-4.15.0-1082-azure-x86_64-with-debian-stretch-sid Python 3.6.5 Installer: DEB azure-cli 2.5.1 * ``` ## Additional Context command-modules-nspkg 2.0.3 core 2.5.1 * nspkg 3.0.4 telemetry 1.0.4 Python location '/opt/az/bin/python3' Extensions directory '/home/ivan/.azure/cliextensions' Python (Linux) 3.6.5 (default, Apr 30 2020, 06:22:39) [GCC 5.4.0 20160609] Legal docs and information: aka.ms/AzureCliLegal You have 2 updates available. Consider updating your CLI installation with 'sudo apt-get update && sudo apt-get install --only-upgrade -y azure-cli'. Detailed instructions can be found at https://aka.ms/doc/UpdateAzureCliApt <!--Please don't remove this:--> <!--auto-generated-->
non_process
linux runtime dotnetcore is not supported describe the bug command name az webapp create errors linux runtime dotnetcore is not supported please invoke list runtimes to cross check to reproduce i have been working in azure portal cloud shell and trying to create a web app from command line az group create name azlinux location west europe az plan create name azplan resource group azlinux is linux sku az webapp create name azwebapplinux resource group azlinux plan azplan deployment local git runtime dotnetcore when i used runtime dotnetcore it was created succesfully in web gui is version listed and an app was succesfully created list runtime did not listed neither dotnetcore nor dotnetcore or any other core version expected behavior creating an app with name azwebapplinux environment summary linux azure with debian stretch sid python installer deb azure cli additional context command modules nspkg core nspkg telemetry python location opt az bin extensions directory home ivan azure cliextensions python linux default apr legal docs and information aka ms azureclilegal you have updates available consider updating your cli installation with sudo apt get update sudo apt get install only upgrade y azure cli detailed instructions can be found at
0
69,815
3,315,419,046
IssuesEvent
2015-11-06 11:57:28
Woseseltops/signbank
https://api.github.com/repos/Woseseltops/signbank
closed
Create more space for field labels
bug top priority
When switching language to Dutch before logging in, the labels for the input fields become too large to fit in front of those fields: ![image](https://cloud.githubusercontent.com/assets/10399929/10131821/419f14c2-65d3-11e5-9d74-cbe26cd9b42c.png) (This is likely to cause problems elsewhere as well)
1.0
Create more space for field labels - When switching language to Dutch before logging in, the labels for the input fields become too large to fit in front of those fields: ![image](https://cloud.githubusercontent.com/assets/10399929/10131821/419f14c2-65d3-11e5-9d74-cbe26cd9b42c.png) (This is likely to cause problems elsewhere as well)
non_process
create more space for field labels when switching language to dutch before logging in the labels for the input fields become too large to fit in front of those fields this is likely to cause problems elsewhere as well
0
20,245
26,862,607,810
IssuesEvent
2023-02-03 19:53:46
USGS-WiM/StreamStats
https://api.github.com/repos/USGS-WiM/StreamStats
closed
BP: Create modal
Batch Processor
Part of #1455 - [x] Add a new button at the top right of the navigation bar called "Batch Processor" with an icon that makes sense (gear?) ![image](https://user-images.githubusercontent.com/40237491/200083330-0dab41f8-56f9-4e47-8316-6481cd8f5029.png) - [x] When the user clicks the button, open a new modal - [x] The modal title should be "StreamStats Batch Processor" - [x] The modal should have 2 tabs called "Submit Batch" and "Batch Status". To see an example of tabs, view the [gagepage.html](https://github.com/USGS-WiM/StreamStats/blob/gageplots/src/Views/gagepage.html) in the `gageplots` branch. Work for #1455 will go in the "Submit Batch" tab.
1.0
BP: Create modal - Part of #1455 - [x] Add a new button at the top right of the navigation bar called "Batch Processor" with an icon that makes sense (gear?) ![image](https://user-images.githubusercontent.com/40237491/200083330-0dab41f8-56f9-4e47-8316-6481cd8f5029.png) - [x] When the user clicks the button, open a new modal - [x] The modal title should be "StreamStats Batch Processor" - [x] The modal should have 2 tabs called "Submit Batch" and "Batch Status". To see an example of tabs, view the [gagepage.html](https://github.com/USGS-WiM/StreamStats/blob/gageplots/src/Views/gagepage.html) in the `gageplots` branch. Work for #1455 will go in the "Submit Batch" tab.
process
bp create modal part of add a new button at the top right of the navigation bar called batch processor with an icon that makes sense gear when the user clicks the button open a new modal the modal title should be streamstats batch processor the modal should have tabs called submit batch and batch status to see an example of tabs view the in the gageplots branch work for will go in the submit batch tab
1
10,264
13,112,057,988
IssuesEvent
2020-08-05 00:59:12
tokio-rs/tokio
https://api.github.com/repos/tokio-rs/tokio
closed
Document how to remotely kill child process
A-tokio C-maintenance E-easy E-help-wanted M-process T-docs
Add an example that looks roughly like this to the documentation. ```rust let (send, recv) = oneshot::channel::<()>(); tokio::select! { _ = &mut child => {}, _ = recv => { child.kill(); } } ``` See more [here](https://github.com/tokio-rs/tokio/pull/2512#issuecomment-663954076).
1.0
Document how to remotely kill child process - Add an example that looks roughly like this to the documentation. ```rust let (send, recv) = oneshot::channel::<()>(); tokio::select! { _ = &mut child => {}, _ = recv => { child.kill(); } } ``` See more [here](https://github.com/tokio-rs/tokio/pull/2512#issuecomment-663954076).
process
document how to remotely kill child process add an example that looks roughly like this to the documentation rust let send recv oneshot channel tokio select mut child recv child kill see more
1
18,820
24,719,056,622
IssuesEvent
2022-10-20 09:16:39
PyCQA/pylint
https://api.github.com/repos/PyCQA/pylint
closed
Crash with `-j 2` when linting empty file
Blocker 🙅 Regression Crash 💥 Needs investigation 🔬 topic-multiprocessing macOS
### Bug description Noticed this crash while linting an empty file with `-j 2`. It might be OS dependent as so far I can only reproduce it on MacOS. Linux seems fine. Bisected the issue to https://github.com/PyCQA/pylint/pull/7284. In particular this line: https://github.com/PyCQA/pylint/blob/77e8ae4f3902697babb186482f577a2c8b1a3725/pylint/lint/pylinter.py#L379 /CC: @daogilvie, @DanielNoord ### Configuration _No response_ ### Command used ```shell pylint -j 2 test.py ``` ### Pylint output ```shell (venv-310) $ pylint -j 2 test.py Traceback (most recent call last): File "/.../pylint/venv-310/bin/pylint", line 33, in <module> sys.exit(load_entry_point('pylint', 'console_scripts', 'pylint')()) File "/.../pylint/pylint/__init__.py", line 35, in run_pylint PylintRun(argv or sys.argv[1:]) File "/.../pylint/pylint/lint/run.py", line 207, in __init__ linter.check(args) File "/.../pylint/pylint/lint/pylinter.py", line 668, in check check_parallel( File "/.../pylint/pylint/lint/parallel.py", line 141, in check_parallel jobs, initializer=initializer, initargs=[dill.dumps(linter)] File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 304, in dumps dump(obj, file, protocol, byref, fmode, recurse, **kwds)#, strictio) File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 276, in dump Pickler(file, protocol, **_kwds).dump(obj) File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 498, in dump StockPickler.dump(self, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 487, in dump self.save(obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 603, in save self.save_reduce(obj=obj, *rv) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound 
method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File 
"/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File 
"/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File 
"/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 603, in save self.save_reduce(obj=obj, *rv) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 687, in save_reduce save(cls) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1439, in save_type StockPickler.save_global(pickler, obj, name=name) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 1076, in save_global raise PicklingError( _pickle.PicklingError: Can't pickle <class 'astroid.util.Uninferable'>: it's not the same object as astroid.util.Uninferable ``` ### Expected behavior _No crash_ ### Pylint version ```shell pylint 2.16.0-dev astroid 2.13.0-dev0 Python 3.10.8 (v3.10.8:aaaf517424, Oct 11 2022, 10:14:40) [Clang 13.0.0 (clang-1300.0.29.30)] ``` ### OS / Environment MacOS ### Additional dependencies _No response_
1.0
Crash with `-j 2` when linting empty file - ### Bug description Noticed this crash while linting an empty file with `-j 2`. It might be OS dependent as so far I can only reproduce it on MacOS. Linux seems fine. Bisected the issue to https://github.com/PyCQA/pylint/pull/7284. In particular this line: https://github.com/PyCQA/pylint/blob/77e8ae4f3902697babb186482f577a2c8b1a3725/pylint/lint/pylinter.py#L379 /CC: @daogilvie, @DanielNoord ### Configuration _No response_ ### Command used ```shell pylint -j 2 test.py ``` ### Pylint output ```shell (venv-310) $ pylint -j 2 test.py Traceback (most recent call last): File "/.../pylint/venv-310/bin/pylint", line 33, in <module> sys.exit(load_entry_point('pylint', 'console_scripts', 'pylint')()) File "/.../pylint/pylint/__init__.py", line 35, in run_pylint PylintRun(argv or sys.argv[1:]) File "/.../pylint/pylint/lint/run.py", line 207, in __init__ linter.check(args) File "/.../pylint/pylint/lint/pylinter.py", line 668, in check check_parallel( File "/.../pylint/pylint/lint/parallel.py", line 141, in check_parallel jobs, initializer=initializer, initargs=[dill.dumps(linter)] File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 304, in dumps dump(obj, file, protocol, byref, fmode, recurse, **kwds)#, strictio) File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 276, in dump Pickler(file, protocol, **_kwds).dump(obj) File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 498, in dump StockPickler.dump(self, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 487, in dump self.save(obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 603, in save self.save_reduce(obj=obj, *rv) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", 
line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in 
_batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in 
_batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in _batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1375, in save_module pickler.save_reduce(_import_module, (obj.__name__,), obj=obj, File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 717, in save_reduce save(state) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 990, in save_module_dict StockPickler.save_dict(pickler, obj) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 972, in save_dict self._batch_setitems(obj.items()) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 998, in 
_batch_setitems save(v) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 603, in save self.save_reduce(obj=obj, *rv) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 687, in save_reduce save(cls) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 560, in save f(self, obj) # Call unbound method with explicit self File "/.../pylint/venv-310/lib/python3.10/site-packages/dill/_dill.py", line 1439, in save_type StockPickler.save_global(pickler, obj, name=name) File "/Library/Frameworks/Python.framework/Versions/3.10/lib/python3.10/pickle.py", line 1076, in save_global raise PicklingError( _pickle.PicklingError: Can't pickle <class 'astroid.util.Uninferable'>: it's not the same object as astroid.util.Uninferable ``` ### Expected behavior _No crash_ ### Pylint version ```shell pylint 2.16.0-dev astroid 2.13.0-dev0 Python 3.10.8 (v3.10.8:aaaf517424, Oct 11 2022, 10:14:40) [Clang 13.0.0 (clang-1300.0.29.30)] ``` ### OS / Environment MacOS ### Additional dependencies _No response_
process
crash with j when linting empty file bug description noticed this crash while linting an empty file with j it might be os dependent as so far i can only reproduce it on macos linux seems fine bisected the issue to in particular this line cc daogilvie danielnoord configuration no response command used shell pylint j test py pylint output shell venv pylint j test py traceback most recent call last file pylint venv bin pylint line in sys exit load entry point pylint console scripts pylint file pylint pylint init py line in run pylint pylintrun argv or sys argv file pylint pylint lint run py line in init linter check args file pylint pylint lint pylinter py line in check check parallel file pylint pylint lint parallel py line in check parallel jobs initializer initializer initargs file pylint venv lib site packages dill dill py line in dumps dump obj file protocol byref fmode recurse kwds strictio file pylint venv lib site packages dill dill py line in dump pickler file protocol kwds dump obj file pylint venv lib site packages dill dill py line in dump stockpickler dump self obj file library frameworks python framework versions lib pickle py line in dump self save obj file library frameworks python framework versions lib pickle py line in save self save reduce obj obj rv file library frameworks python framework versions lib pickle py line in save reduce save state file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module dict stockpickler save dict pickler obj file library frameworks python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py 
line in save module dict stockpickler save dict pickler obj file library frameworks python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module pickler save reduce import module obj name obj obj file library frameworks python framework versions lib pickle py line in save reduce save state file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module dict stockpickler save dict pickler obj file library frameworks python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module pickler save reduce import module obj name obj obj file library frameworks python framework versions lib pickle py line in save reduce save state file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module dict stockpickler save dict pickler obj file library frameworks python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill 
dill py line in save module pickler save reduce import module obj name obj obj file library frameworks python framework versions lib pickle py line in save reduce save state file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module dict stockpickler save dict pickler obj file library frameworks python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module pickler save reduce import module obj name obj obj file library frameworks python framework versions lib pickle py line in save reduce save state file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module dict stockpickler save dict pickler obj file library frameworks python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module pickler save reduce import module obj name obj obj file library frameworks python framework versions lib pickle py line in save reduce save state file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save module dict stockpickler save dict pickler obj file library frameworks 
python framework versions lib pickle py line in save dict self batch setitems obj items file library frameworks python framework versions lib pickle py line in batch setitems save v file library frameworks python framework versions lib pickle py line in save self save reduce obj obj rv file library frameworks python framework versions lib pickle py line in save reduce save cls file library frameworks python framework versions lib pickle py line in save f self obj call unbound method with explicit self file pylint venv lib site packages dill dill py line in save type stockpickler save global pickler obj name name file library frameworks python framework versions lib pickle py line in save global raise picklingerror pickle picklingerror can t pickle it s not the same object as astroid util uninferable expected behavior no crash pylint version shell pylint dev astroid python oct os environment macos additional dependencies no response
1
139,341
5,367,681,649
IssuesEvent
2017-02-22 05:35:07
Cadasta/cadasta-platform
https://api.github.com/repos/Cadasta/cadasta-platform
closed
Tab key is not enabled for right drop-down menu in Organization page
bug First Contribution Friendly priority: low
### Steps to reproduce the error 1. Go to Organizations. 2. Select an organization from the list 3. Move between the components using tab key ![selection_053](https://cloud.githubusercontent.com/assets/10743861/21739884/0765e842-d4ce-11e6-8386-16863f23a503.png) ### Actual behavior Right drop down menu is not getting focus when move between the components using the tab key ### Expected behavior It should get focus
1.0
Tab key is not enabled for right drop-down menu in Organization page - ### Steps to reproduce the error 1. Go to Organizations. 2. Select an organization from the list 3. Move between the components using tab key ![selection_053](https://cloud.githubusercontent.com/assets/10743861/21739884/0765e842-d4ce-11e6-8386-16863f23a503.png) ### Actual behavior Right drop down menu is not getting focus when move between the components using the tab key ### Expected behavior It should get focus
non_process
tab key is not enabled for right drop down menu in organization page steps to reproduce the error go to organizations select an organization from the list move between the components using tab key actual behavior right drop down menu is not getting focus when move between the components using the tab key expected behavior it should get focus
0
17,590
23,411,364,140
IssuesEvent
2022-08-12 17:55:32
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
[API Proposal]: Add a property or method that can be used to check if there is an error in the process
api-suggestion area-System.Diagnostics.Process discussion
### Background and motivation Add a property or method that can be used to check if there is an error in the process https://github.com/Mr0N/ExampleProcessStart/blob/master/ExampleProcessStart/Program.cs ### API Proposal ```csharp static class Info { public static void CheckException(this Process process) { if (string.IsNullOrEmpty(process.StandardError.ReadToEnd())) throw new Exception(); } } ``` ### API Usage ```csharp var info = new ProcessStartInfo() { FileName = "cmd.exe", RedirectStandardError = true, RedirectStandardOutput = true, RedirectStandardInput = true }; var process = Process.Start(info); process.CheckException(); string text = process.StandardOutput.ReadToEnd(); ``` ### Alternative Designs _No response_ ### Risks _No response_
1.0
[API Proposal]: Add a property or method that can be used to check if there is an error in the process - ### Background and motivation Add a property or method that can be used to check if there is an error in the process https://github.com/Mr0N/ExampleProcessStart/blob/master/ExampleProcessStart/Program.cs ### API Proposal ```csharp static class Info { public static void CheckException(this Process process) { if (string.IsNullOrEmpty(process.StandardError.ReadToEnd())) throw new Exception(); } } ``` ### API Usage ```csharp var info = new ProcessStartInfo() { FileName = "cmd.exe", RedirectStandardError = true, RedirectStandardOutput = true, RedirectStandardInput = true }; var process = Process.Start(info); process.CheckException(); string text = process.StandardOutput.ReadToEnd(); ``` ### Alternative Designs _No response_ ### Risks _No response_
process
add a property or method that can be used to check if there is an error in the process background and motivation add a property or method that can be used to check if there is an error in the process api proposal csharp static class info public static void checkexception this process process if string isnullorempty process standarderror readtoend throw new exception api usage csharp var info new processstartinfo filename cmd exe redirectstandarderror true redirectstandardoutput true redirectstandardinput true var process process start info process checkexception string text process standardoutput readtoend alternative designs no response risks no response
1
59,606
14,427,007,190
IssuesEvent
2020-12-06 01:18:59
sourcegraph/sourcegraph
https://api.github.com/repos/sourcegraph/sourcegraph
opened
Move GitHub secrets to Vault
RFC-249-2 team/security
In order to use Vault as a single source of truth for secrets, we want to update our GitHub actions to fetch its secrets from Vault. We should be able to use [this HashiCorp provided action](https://github.com/marketplace/actions/vault-secrets) for implementation.
True
Move GitHub secrets to Vault - In order to use Vault as a single source of truth for secrets, we want to update our GitHub actions to fetch its secrets from Vault. We should be able to use [this HashiCorp provided action](https://github.com/marketplace/actions/vault-secrets) for implementation.
non_process
move github secrets to vault in order to use vault as a single source of truth for secrets we want to update our github actions to fetch its secrets from vault we should be able to use for implementation
0
34,094
7,343,249,313
IssuesEvent
2018-03-07 10:42:13
proarc/proarc
https://api.github.com/repos/proarc/proarc
closed
Nápovědní bubliny - číslo periodika
Priority-Low Release-3.4.1 Type-Defect auto-migrated
``` Prosíme opravit: 1. proč je u title info - type "hlavní název bez type"? 2. title - změnit název titulu periodika na "název čísla periodika" 3. subtitle - změnit podnázev titulu periodika na "podnázev čísla periodika" 4. name part - změnit nutno na "pokud je to možné" 5. role - role term - přidat odkaz na slovník rolí 6. origin info - vypadly názvy polí v závorkách 7. date issued - qualifier - chybí popis inferred, questionable 8. language term - přidat seznam kódů jazyků 9. language term - object part - přidat čísla polí summary - pole 041 $b table of contents - pole 041 $f accompanying material - pole 041 $g translation - pole 041 $h 10. language term - type - změnit popis na "použít hodnotu code" 11. physical description - unit - chybí popis 12. abstract - přidat popis "odpovídá poli 520 MARC 21" 13. subject - rozdělit pole name a name part, rozdělit i popisy polí 14. identifier - smazat popis "uvádějí se..." 15. identifier type - přidat popis "isbn - pokud existuje, převzít z katalogizačního záznamu z pole 020, podpole "a", "z" 16. physical location - přidat popis "neopakovatelný element" 17. part - přidat popis "element může..." ``` Original issue reported on code.google.com by `daneck...@knav.cz` on 12 Jun 2014 at 12:31
1.0
Nápovědní bubliny - číslo periodika - ``` Prosíme opravit: 1. proč je u title info - type "hlavní název bez type"? 2. title - změnit název titulu periodika na "název čísla periodika" 3. subtitle - změnit podnázev titulu periodika na "podnázev čísla periodika" 4. name part - změnit nutno na "pokud je to možné" 5. role - role term - přidat odkaz na slovník rolí 6. origin info - vypadly názvy polí v závorkách 7. date issued - qualifier - chybí popis inferred, questionable 8. language term - přidat seznam kódů jazyků 9. language term - object part - přidat čísla polí summary - pole 041 $b table of contents - pole 041 $f accompanying material - pole 041 $g translation - pole 041 $h 10. language term - type - změnit popis na "použít hodnotu code" 11. physical description - unit - chybí popis 12. abstract - přidat popis "odpovídá poli 520 MARC 21" 13. subject - rozdělit pole name a name part, rozdělit i popisy polí 14. identifier - smazat popis "uvádějí se..." 15. identifier type - přidat popis "isbn - pokud existuje, převzít z katalogizačního záznamu z pole 020, podpole "a", "z" 16. physical location - přidat popis "neopakovatelný element" 17. part - přidat popis "element může..." ``` Original issue reported on code.google.com by `daneck...@knav.cz` on 12 Jun 2014 at 12:31
non_process
nápovědní bubliny číslo periodika prosíme opravit proč je u title info type hlavní název bez type title změnit název titulu periodika na název čísla periodika subtitle změnit podnázev titulu periodika na podnázev čísla periodika name part změnit nutno na pokud je to možné role role term přidat odkaz na slovník rolí origin info vypadly názvy polí v závorkách date issued qualifier chybí popis inferred questionable language term přidat seznam kódů jazyků language term object part přidat čísla polí summary pole b table of contents pole f accompanying material pole g translation pole h language term type změnit popis na použít hodnotu code physical description unit chybí popis abstract přidat popis odpovídá poli marc subject rozdělit pole name a name part rozdělit i popisy polí identifier smazat popis uvádějí se identifier type přidat popis isbn pokud existuje převzít z katalogizačního záznamu z pole podpole a z physical location přidat popis neopakovatelný element part přidat popis element může original issue reported on code google com by daneck knav cz on jun at
0
369
2,813,816,462
IssuesEvent
2015-05-18 16:33:35
arduino/Arduino
https://api.github.com/repos/arduino/Arduino
closed
Bug: splitting define causes java regex error and arduino won't compile
Component: Preprocessor
This doesn't compile: #define SERIAL_BANNER "\ 1) Scan for sensors\n\ 2) Change sensor I2C address\n\ 3) Show readings for all sensors\n\ " It causes this: Exception in thread "Thread-6" java.lang.StackOverflowError at java.util.regex.Pattern$Branch.match(Pattern.java:4114) at java.util.regex.Pattern$GroupHead.match(Pattern.java:4168) [..] Changing it to: #define SERIAL_BANNER "1) Scan for sensors\n2) Change sensor I2C address\n3) Show readings for all sensors\n" makes it work... Marcin
1.0
Bug: splitting define causes java regex error and arduino won't compile - This doesn't compile: #define SERIAL_BANNER "\ 1) Scan for sensors\n\ 2) Change sensor I2C address\n\ 3) Show readings for all sensors\n\ " It causes this: Exception in thread "Thread-6" java.lang.StackOverflowError at java.util.regex.Pattern$Branch.match(Pattern.java:4114) at java.util.regex.Pattern$GroupHead.match(Pattern.java:4168) [..] Changing it to: #define SERIAL_BANNER "1) Scan for sensors\n2) Change sensor I2C address\n3) Show readings for all sensors\n" makes it work... Marcin
process
bug splitting define causes java regex error and arduino won t compile this doesn t compile define serial banner scan for sensors n change sensor address n show readings for all sensors n it causes this exception in thread thread java lang stackoverflowerror at java util regex pattern branch match pattern java at java util regex pattern grouphead match pattern java changing it to define serial banner scan for sensors change sensor address show readings for all sensors n makes it work marcin
1
70,268
3,321,810,885
IssuesEvent
2015-11-09 11:03:38
mantidproject/mantid
https://api.github.com/repos/mantidproject/mantid
closed
Move Mantid Third Party to Visual Studio 2015 on Windows
Component: Framework Priority: High
Visual Studio 2015 community edition will allow us to take advantage of newer C++ features. To do: - [x] Build scripts for all third party dependencies - [x] Add binaries to repository using Git LFS - [x] Reorganise CMake and allow it to download dependencies - [x] Changes to only use Python release libs in all configurations - [x] Recompile fortran modules - [x] Build ParaView locally and on servers - [x] Generate start-up scripts for developers to get correct environment - [x] Fix all warnings/errors + unit & system test failures
1.0
Move Mantid Third Party to Visual Studio 2015 on Windows - Visual Studio 2015 community edition will allow us to take advantage of newer C++ features. To do: - [x] Build scripts for all third party dependencies - [x] Add binaries to repository using Git LFS - [x] Reorganise CMake and allow it to download dependencies - [x] Changes to only use Python release libs in all configurations - [x] Recompile fortran modules - [x] Build ParaView locally and on servers - [x] Generate start-up scripts for developers to get correct environment - [x] Fix all warnings/errors + unit & system test failures
non_process
move mantid third party to visual studio on windows visual studio community edition will allow us to take advantage of newer c features to do build scripts for all third party dependencies add binaries to repository using git lfs reorganise cmake and allow it to download dependencies changes to only use python release libs in all configurations recompile fortran modules build paraview locally and on servers generate start up scripts for developers to get correct environment fix all warnings errors unit system test failures
0
19,734
26,084,921,680
IssuesEvent
2022-12-26 00:35:19
devssa/onde-codar-em-salvador
https://api.github.com/repos/devssa/onde-codar-em-salvador
closed
[REMOTO] [PRESENCIAL] [ESTÁGIO] Estágio RPA na [X-TESTING]
SALVADOR TESTE AUTOMATIZADO REMOTO DOCUMENTAÇÃO HELP WANTED feature_request AUTOMAÇÃO DE PROCESSOS Stale
<!-- ================================================== POR FAVOR, SÓ POSTE SE A VAGA FOR PARA SALVADOR E CIDADES VIZINHAS! Use: "Desenvolvedor Front-end" ao invés de "Front-End Developer" \o/ Exemplo: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]` ================================================== --> ## Descrição da vaga - 01 vaga para Estágio RPA - Home Office - 01 vaga para estágio RPA - Presencial - Cursando a partir do 4 semestre em T.I. ou áreas afins - Oferecemos o treinamento necessário para trabalhar com a ferramenta ## Local - Salvador ## Benefícios - Informações diretamente com o responsável/recrutador da vaga. ## Requisitos **Obrigatórios:** - Desenhar, desenvolver e testar automações de processos; - Integrar os "robôs" a diversas atividades de automação; - Realização de Testes e Criação de documentação técnica. **Desejáveis:** - Proativo; - Criativo; - Organizado; - Motivador **Diferenciais:** - Conhecimento em linguagens de programação. ## Contratação - a combinar ## Nossa empresa - Fundada em 2013, a X-Testing é uma empresa especializada em qualidade de software, processos, automação, inteligência artificial, desenvolvimento de produtos e inovação. Nossas principais características são: - Alta qualidade dos nossos serviços e produtos atestadas anualmente pelos nossos clientes e parceiros; - Independentes, profissionalismo e imparcialidade nas auditorias, avaliações e certificações; - Investimos fortemente na formação, capacitação e especialização dos nossos colaboradores criando desafios e oportunidades profissionais para todos; - Inovação é nosso carro chefe, acreditamos que a melhor forma de crescer e ampliar a nossa carteira de clientes é inovando. ## Como se candidatar - Por favor envie um email para rh@xtesting.com.br com seu CV anexado - enviar no assunto: ESTÁGIO RPA - HOME OFFICE ou ESTÁGIO RPA - PRESENCIAL
1.0
[REMOTO] [PRESENCIAL] [ESTÁGIO] Estágio RPA na [X-TESTING] - <!-- ================================================== POR FAVOR, SÓ POSTE SE A VAGA FOR PARA SALVADOR E CIDADES VIZINHAS! Use: "Desenvolvedor Front-end" ao invés de "Front-End Developer" \o/ Exemplo: `[JAVASCRIPT] [MYSQL] [NODE.JS] Desenvolvedor Front-End na [NOME DA EMPRESA]` ================================================== --> ## Descrição da vaga - 01 vaga para Estágio RPA - Home Office - 01 vaga para estágio RPA - Presencial - Cursando a partir do 4 semestre em T.I. ou áreas afins - Oferecemos o treinamento necessário para trabalhar com a ferramenta ## Local - Salvador ## Benefícios - Informações diretamente com o responsável/recrutador da vaga. ## Requisitos **Obrigatórios:** - Desenhar, desenvolver e testar automações de processos; - Integrar os "robôs" a diversas atividades de automação; - Realização de Testes e Criação de documentação técnica. **Desejáveis:** - Proativo; - Criativo; - Organizado; - Motivador **Diferenciais:** - Conhecimento em linguagens de programação. ## Contratação - a combinar ## Nossa empresa - Fundada em 2013, a X-Testing é uma empresa especializada em qualidade de software, processos, automação, inteligência artificial, desenvolvimento de produtos e inovação. Nossas principais características são: - Alta qualidade dos nossos serviços e produtos atestadas anualmente pelos nossos clientes e parceiros; - Independentes, profissionalismo e imparcialidade nas auditorias, avaliações e certificações; - Investimos fortemente na formação, capacitação e especialização dos nossos colaboradores criando desafios e oportunidades profissionais para todos; - Inovação é nosso carro chefe, acreditamos que a melhor forma de crescer e ampliar a nossa carteira de clientes é inovando. ## Como se candidatar - Por favor envie um email para rh@xtesting.com.br com seu CV anexado - enviar no assunto: ESTÁGIO RPA - HOME OFFICE ou ESTÁGIO RPA - PRESENCIAL
process
estágio rpa na por favor só poste se a vaga for para salvador e cidades vizinhas use desenvolvedor front end ao invés de front end developer o exemplo desenvolvedor front end na descrição da vaga vaga para estágio rpa home office vaga para estágio rpa presencial cursando a partir do semestre em t i ou áreas afins oferecemos o treinamento necessário para trabalhar com a ferramenta local salvador benefícios informações diretamente com o responsável recrutador da vaga requisitos obrigatórios desenhar desenvolver e testar automações de processos integrar os robôs a diversas atividades de automação realização de testes e criação de documentação técnica desejáveis proativo criativo organizado motivador diferenciais conhecimento em linguagens de programação contratação a combinar nossa empresa fundada em a x testing é uma empresa especializada em qualidade de software processos automação inteligência artificial desenvolvimento de produtos e inovação nossas principais características são alta qualidade dos nossos serviços e produtos atestadas anualmente pelos nossos clientes e parceiros independentes profissionalismo e imparcialidade nas auditorias avaliações e certificações investimos fortemente na formação capacitação e especialização dos nossos colaboradores criando desafios e oportunidades profissionais para todos inovação é nosso carro chefe acreditamos que a melhor forma de crescer e ampliar a nossa carteira de clientes é inovando como se candidatar por favor envie um email para rh xtesting com br com seu cv anexado enviar no assunto estágio rpa home office ou estágio rpa presencial
1
184,577
21,784,913,778
IssuesEvent
2022-05-14 01:47:10
n-devs/freebitco.in-mobile
https://api.github.com/repos/n-devs/freebitco.in-mobile
closed
WS-2019-0291 (High) detected in handlebars-4.1.2.tgz - autoclosed
security vulnerability
## WS-2019-0291 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.2.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/freebitco.in-mobile/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/freebitco.in-mobile/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.0.1.tgz (Root Library) - jest-24.7.1.tgz - jest-cli-24.8.0.tgz - core-24.8.0.tgz - reporters-24.8.0.tgz - istanbul-reports-2.2.6.tgz - :x: **handlebars-4.1.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/n-psk/freebitco.in-mobile/commit/03e0226af335ef1bcac923b1d013d2349ab2b2d3">03e0226af335ef1bcac923b1d013d2349ab2b2d3</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> handlebars before 4.3.0 is vulnerable to Prototype Pollution leading to Remote Code Execution. Templates may alter an Objects' __proto__ and __defineGetter__ properties, which may allow an attacker to execute arbitrary code through crafted payloads. 
<p>Publish Date: 2019-10-06 <p>URL: <a href=https://github.com/wycats/handlebars.js/issues/1558>WS-2019-0291</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.3</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1164">https://www.npmjs.com/advisories/1164</a></p> <p>Release Date: 2019-10-06</p> <p>Fix Resolution: 4.3.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2019-0291 (High) detected in handlebars-4.1.2.tgz - autoclosed - ## WS-2019-0291 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-4.1.2.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz">https://registry.npmjs.org/handlebars/-/handlebars-4.1.2.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/freebitco.in-mobile/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/freebitco.in-mobile/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.0.1.tgz (Root Library) - jest-24.7.1.tgz - jest-cli-24.8.0.tgz - core-24.8.0.tgz - reporters-24.8.0.tgz - istanbul-reports-2.2.6.tgz - :x: **handlebars-4.1.2.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/n-psk/freebitco.in-mobile/commit/03e0226af335ef1bcac923b1d013d2349ab2b2d3">03e0226af335ef1bcac923b1d013d2349ab2b2d3</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> handlebars before 4.3.0 is vulnerable to Prototype Pollution leading to Remote Code Execution. Templates may alter an Objects' __proto__ and __defineGetter__ properties, which may allow an attacker to execute arbitrary code through crafted payloads. 
<p>Publish Date: 2019-10-06 <p>URL: <a href=https://github.com/wycats/handlebars.js/issues/1558>WS-2019-0291</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.3</b>)</summary> <p> Base Score Metrics not available</p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://www.npmjs.com/advisories/1164">https://www.npmjs.com/advisories/1164</a></p> <p>Release Date: 2019-10-06</p> <p>Fix Resolution: 4.3.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
ws high detected in handlebars tgz autoclosed ws high severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file tmp ws scm freebitco in mobile package json path to vulnerable library tmp ws scm freebitco in mobile node modules handlebars package json dependency hierarchy react scripts tgz root library jest tgz jest cli tgz core tgz reporters tgz istanbul reports tgz x handlebars tgz vulnerable library found in head commit a href vulnerability details handlebars before is vulnerable to prototype pollution leading to remote code execution templates may alter an objects proto and definegetter properties which may allow an attacker to execute arbitrary code through crafted payloads publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
158,723
24,887,101,746
IssuesEvent
2022-10-28 08:43:50
knaw-huc/di-website
https://api.github.com/repos/knaw-huc/di-website
opened
Improve responsiveness of Twitter feed on News page
design
The Twitter feed takes quite a bit of time to load and render. Long enough that an impatient person might think the page isn't working. Can we add something like 'Page Loading..' placeholder before div with Twitter's widget.js loads? Or some way to pre-fetch, cache, and then render the cached content? I think this is the approach taken here: https://stackoverflow.com/questions/11206398/caching-api-twitter-calls-for-twitter-profile-widget
1.0
Improve responsiveness of Twitter feed on News page - The Twitter feed takes quite a bit of time to load and render. Long enough that an impatient person might think the page isn't working. Can we add something like 'Page Loading..' placeholder before div with Twitter's widget.js loads? Or some way to pre-fetch, cache, and then render the cached content? I think this is the approach taken here: https://stackoverflow.com/questions/11206398/caching-api-twitter-calls-for-twitter-profile-widget
non_process
improve responsiveness of twitter feed on news page the twitter feed takes quite a bit of time to load and render long enough that an impatient person might think the page isn t working can we add something like page loading placeholder before div with twitter s widget js loads or some way to pre fetch cache and then render the cached content i think this is the approach taken here
0
267,908
28,537,304,017
IssuesEvent
2023-04-20 01:06:36
ChoeMinji/redis-6.2.3
https://api.github.com/repos/ChoeMinji/redis-6.2.3
closed
CVE-2021-32761 (High) detected in redis6.2.6, redis6.2.6 - autoclosed
Mend: dependency security vulnerability
## CVE-2021-32761 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>redis6.2.6</b>, <b>redis6.2.6</b></p></summary> <p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Redis is an in-memory database that persists on disk. A vulnerability involving out-of-bounds read and integer overflow to buffer overflow exists starting with version 2.2 and prior to versions 5.0.13, 6.0.15, and 6.2.5. On 32-bit systems, Redis `*BIT*` command are vulnerable to integer overflow that can potentially be exploited to corrupt the heap, leak arbitrary heap contents or trigger remote code execution. The vulnerability involves changing the default `proto-max-bulk-len` configuration parameter to a very large value and constructing specially crafted commands bit commands. This problem only affects Redis on 32-bit platforms, or compiled as a 32-bit binary. Redis versions 5.0.`3m 6.0.15, and 6.2.5 contain patches for this issue. An additional workaround to mitigate the problem without patching the `redis-server` executable is to prevent users from modifying the `proto-max-bulk-len` configuration parameter. This can be done using ACL to restrict unprivileged users from using the CONFIG SET command. 
<p>Publish Date: 2021-07-21 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-32761>CVE-2021-32761</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/redis/redis/security/advisories/GHSA-8wxq-j7rp-g8wj">https://github.com/redis/redis/security/advisories/GHSA-8wxq-j7rp-g8wj</a></p> <p>Release Date: 2021-07-21</p> <p>Fix Resolution: redis - 5.0.13, 6.0.15, 6.2.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-32761 (High) detected in redis6.2.6, redis6.2.6 - autoclosed - ## CVE-2021-32761 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>redis6.2.6</b>, <b>redis6.2.6</b></p></summary> <p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Redis is an in-memory database that persists on disk. A vulnerability involving out-of-bounds read and integer overflow to buffer overflow exists starting with version 2.2 and prior to versions 5.0.13, 6.0.15, and 6.2.5. On 32-bit systems, Redis `*BIT*` command are vulnerable to integer overflow that can potentially be exploited to corrupt the heap, leak arbitrary heap contents or trigger remote code execution. The vulnerability involves changing the default `proto-max-bulk-len` configuration parameter to a very large value and constructing specially crafted commands bit commands. This problem only affects Redis on 32-bit platforms, or compiled as a 32-bit binary. Redis versions 5.0.`3m 6.0.15, and 6.2.5 contain patches for this issue. An additional workaround to mitigate the problem without patching the `redis-server` executable is to prevent users from modifying the `proto-max-bulk-len` configuration parameter. This can be done using ACL to restrict unprivileged users from using the CONFIG SET command. 
<p>Publish Date: 2021-07-21 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-32761>CVE-2021-32761</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/redis/redis/security/advisories/GHSA-8wxq-j7rp-g8wj">https://github.com/redis/redis/security/advisories/GHSA-8wxq-j7rp-g8wj</a></p> <p>Release Date: 2021-07-21</p> <p>Fix Resolution: redis - 5.0.13, 6.0.15, 6.2.5</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in autoclosed cve high severity vulnerability vulnerable libraries vulnerability details redis is an in memory database that persists on disk a vulnerability involving out of bounds read and integer overflow to buffer overflow exists starting with version and prior to versions and on bit systems redis bit command are vulnerable to integer overflow that can potentially be exploited to corrupt the heap leak arbitrary heap contents or trigger remote code execution the vulnerability involves changing the default proto max bulk len configuration parameter to a very large value and constructing specially crafted commands bit commands this problem only affects redis on bit platforms or compiled as a bit binary redis versions and contain patches for this issue an additional workaround to mitigate the problem without patching the redis server executable is to prevent users from modifying the proto max bulk len configuration parameter this can be done using acl to restrict unprivileged users from using the config set command publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution redis step up your open source security game with mend
0
785,716
27,623,555,678
IssuesEvent
2023-03-10 03:36:52
coral-xyz/backpack
https://api.github.com/repos/coral-xyz/backpack
closed
add mnemonic hot wallet after creating a ledger only account
priority 1 polish
Right now, if you create an account with a ledger only, you can't later create a mnemonic hot wallet.
1.0
add mnemonic hot wallet after creating a ledger only account - Right now, if you create an account with a ledger only, you can't later create a mnemonic hot wallet.
non_process
add mnemonic hot wallet after creating a ledger only account right now if you create an account with a ledger only you can t later create a mnemonic hot wallet
0
2,764
5,695,998,584
IssuesEvent
2017-04-16 06:01:37
AllenFang/react-bootstrap-table
https://api.github.com/repos/AllenFang/react-bootstrap-table
closed
Support Conjunction (&&) as Search Mode
help wanted inprocess
Currently, two very different search modes are supported: 1. A very exclusive one, that accepts only rows with the exact `searchText` contained as-is in any of their cells. (default) 2. A very inclusive one, that accepts all rows that contain at least one of the whitespace-separated terms in the `searchText`. (`multiColumnSearch: true`) One could argue whether "multiple columns" are any differentiation between these two cases, but that's not my point here. I merely want to point out that for some use cases the (default) **option (1) is too strict** and the alternative **option (2) is not strict enough**. Therefore I propose introducing an additional `strictSearch` prop opening up the way for four different search modes, i.e. two more options. ---- I'll demonstrate the difference with a single row as example: Col1 | Col2 -----|----- A B | C Search modes and whether they would include the above row for the given `searchText` case | `searchText` | `strict && !multiColumn` (existing option 1) | `!strict && !multiColumn` | `strict && multiColumn` | `!strict && multiColumn` (existing option 2) -----|-----|-----|-----|-----|----- part of the content | "A" | ✅ | ✅ | ✅ | ✅ whole content | "A B" | ✅ | ✅ | ✅ | ✅ wrong order | "B A" | ❌ | ✅ | ✅ | ✅ leading whitespace | " A" | ❌ | ✅ | ✅ | ✅ trailing whitespace | "B " | ❌ | ✅ | ✅ | ✅ not in one column | "B C" | ❌ | ❌ | ✅ | ✅ partially contained | "A D" | ❌ | ❌ | ❌ | ✅ not contained | "D" | ❌ | ❌ | ❌ | ❌
1.0
Support Conjunction (&&) as Search Mode - Currently, two very different search modes are supported: 1. A very exclusive one, that accepts only rows with the exact `searchText` contained as-is in any of their cells. (default) 2. A very inclusive one, that accepts all rows that contain at least one of the whitespace-separated terms in the `searchText`. (`multiColumnSearch: true`) One could argue whether "multiple columns" are any differentiation between these two cases, but that's not my point here. I merely want to point out that for some use cases the (default) **option (1) is too strict** and the alternative **option (2) is not strict enough**. Therefore I propose introducing an additional `strictSearch` prop opening up the way for four different search modes, i.e. two more options. ---- I'll demonstrate the difference with a single row as example: Col1 | Col2 -----|----- A B | C Search modes and whether they would include the above row for the given `searchText` case | `searchText` | `strict && !multiColumn` (existing option 1) | `!strict && !multiColumn` | `strict && multiColumn` | `!strict && multiColumn` (existing option 2) -----|-----|-----|-----|-----|----- part of the content | "A" | ✅ | ✅ | ✅ | ✅ whole content | "A B" | ✅ | ✅ | ✅ | ✅ wrong order | "B A" | ❌ | ✅ | ✅ | ✅ leading whitespace | " A" | ❌ | ✅ | ✅ | ✅ trailing whitespace | "B " | ❌ | ✅ | ✅ | ✅ not in one column | "B C" | ❌ | ❌ | ✅ | ✅ partially contained | "A D" | ❌ | ❌ | ❌ | ✅ not contained | "D" | ❌ | ❌ | ❌ | ❌
process
support conjunction as search mode currently two very different search modes are supported a very exclusive one that accepts only rows with the exact searchtext contained as is in any of their cells default a very inclusive one that accepts all rows that contain at least one of the whitespace separated terms in the searchtext multicolumnsearch true one could argue whether multiple columns are any differentiation between these two cases but that s not my point here i merely want to point out that for some use cases the default option is too strict and the alternative option is not strict enough therefore i propose introducing an additional strictsearch prop opening up the way for four different search modes i e two more options i ll demonstrate the difference with a single row as example a b c search modes and whether they would include the above row for the given searchtext case searchtext strict multicolumn existing option strict multicolumn strict multicolumn strict multicolumn existing option part of the content a ✅ ✅ ✅ ✅ whole content a b ✅ ✅ ✅ ✅ wrong order b a ❌ ✅ ✅ ✅ leading whitespace a ❌ ✅ ✅ ✅ trailing whitespace b ❌ ✅ ✅ ✅ not in one column b c ❌ ❌ ✅ ✅ partially contained a d ❌ ❌ ❌ ✅ not contained d ❌ ❌ ❌ ❌
1
6,832
9,975,295,233
IssuesEvent
2019-07-09 12:46:41
varietywalls/variety
https://api.github.com/repos/varietywalls/variety
opened
Run Black on all files, add black linting to CI (once we configure it)
dev process
@jlu5 I intend to adopt using Black for Variety (https://github.com/python/black) and run it on the whole codebase fairly soon. This will fix the current issue of way-too-inconsistent styling and too many long lines, and will also make it easier for external developers to contribute. What is your state, do you have any big non-merged branches that could result in huge conflicts? (though these could be easily resolved by running Black on those branches too before rebasing on master, or merging master into them).
1.0
Run Black on all files, add black linting to CI (once we configure it) - @jlu5 I intend to adopt using Black for Variety (https://github.com/python/black) and run it on the whole codebase fairly soon. This will fix the current issue of way-too-inconsistent styling and too many long lines, and will also make it easier for external developers to contribute. What is your state, do you have any big non-merged branches that could result in huge conflicts? (though these could be easily resolved by running Black on those branches too before rebasing on master, or merging master into them).
process
run black on all files add black linting to ci once we configure it i intend to adopt using black for variety and run it on the whole codebase fairly soon this will fix the current issue of way too inconsistent styling and too many long lines and will also make it easier for external developers to contribute what is your state do you have any big non merged branches that could result in huge conflicts though these could be easily resolved by running black on those branches too before rebasing on master or merging master into them
1
65,858
27,259,916,691
IssuesEvent
2023-02-22 14:14:44
red-hat-storage/ocs-ci
https://api.github.com/repos/red-hat-storage/ocs-ci
closed
verify_provider_topology failure
Managed Services
Post-installation check verify_provider_topology failed due to type mismatch: osd_count == size / 4 TypeError: unsupported operand type(s) for /: 'str' and 'int' The reason is that the size in the provider config file is a string, not an integer.
1.0
verify_provider_topology failure - Post-installation check verify_provider_topology failed due to type mismatch: osd_count == size / 4 TypeError: unsupported operand type(s) for /: 'str' and 'int' The reason is that the size in the provider config file is a string, not an integer.
non_process
verify provider topology failure post installation check verify provider topology failed due to type mismatch osd count size typeerror unsupported operand type s for str and int the reason is that the size in the provider config file is a string not an integer
0
724,687
24,939,132,977
IssuesEvent
2022-10-31 17:24:27
0xC0ncord/TURRPG2
https://api.github.com/repos/0xC0ncord/TURRPG2
opened
Rework magic weapon enchanting for Adrenaline Masters
type:enhancement priority:p2
Instead of randomly getting a modifier, Adrenaline masters select a desired modifier and must fill up some sort of gauge in order to receive the enchantment.
1.0
Rework magic weapon enchanting for Adrenaline Masters - Instead of randomly getting a modifier, Adrenaline masters select a desired modifier and must fill up some sort of gauge in order to receive the enchantment.
non_process
rework magic weapon enchanting for adrenaline masters instead of randomly getting a modifier adrenaline masters select a desired modifier and must fill up some sort of gauge in order to receive the enchantment
0
358,926
10,651,882,351
IssuesEvent
2019-10-17 11:24:06
kubernetes/website
https://api.github.com/repos/kubernetes/website
closed
Issue with k8s.io/docs/tutorials/kubernetes-basics/deploy-app/deploy-interactive/
kind/cleanup language/en priority/important-longterm
**This is a Bug Report** <!-- Thanks for filing an issue! Before submitting, please fill in the following information. --> <!-- See https://kubernetes.io/docs/contribute/start/ for guidance on writing an actionable issue description. --> <!--Required Information--> **Problem:** kubectl run kubernetes-bootcamp --image=gcr.io/google-samples/kubernetes-bootcamp:v1 --port=8080 kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead. deployment.apps/kubernetes-bootcamp created **Proposed Solution:** N/A **Page to Update:** https://kubernetes.io/... <!--Optional Information (remove the comment tags around information you would like to include)--> <Kubernetes Version:> 1.15.0 $ kubectl version Client Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.0", GitCommit:"e8462b5b5dc2584fdcd18e6bcfe9f1e4d970a529", GitTreeState:"clean", BuildDate:"2019-06-19T16:40:16Z", GoVersion:"go1.12.5", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.0", GitCommit:"e8462b5b5dc2584fdcd18e6bcfe9f1e4d970a529", GitTreeState:"clean", BuildDate:"2019-06-19T16:32:14Z", GoVersion:"go1.12.5", Compiler:"gc", Platform:"linux/amd64"} <!--Additional Information:-->
1.0
Issue with k8s.io/docs/tutorials/kubernetes-basics/deploy-app/deploy-interactive/ - **This is a Bug Report** <!-- Thanks for filing an issue! Before submitting, please fill in the following information. --> <!-- See https://kubernetes.io/docs/contribute/start/ for guidance on writing an actionable issue description. --> <!--Required Information--> **Problem:** kubectl run kubernetes-bootcamp --image=gcr.io/google-samples/kubernetes-bootcamp:v1 --port=8080 kubectl run --generator=deployment/apps.v1 is DEPRECATED and will be removed in a future version. Use kubectl run --generator=run-pod/v1 or kubectl create instead. deployment.apps/kubernetes-bootcamp created **Proposed Solution:** N/A **Page to Update:** https://kubernetes.io/... <!--Optional Information (remove the comment tags around information you would like to include)--> <Kubernetes Version:> 1.15.0 $ kubectl version Client Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.0", GitCommit:"e8462b5b5dc2584fdcd18e6bcfe9f1e4d970a529", GitTreeState:"clean", BuildDate:"2019-06-19T16:40:16Z", GoVersion:"go1.12.5", Compiler:"gc", Platform:"linux/amd64"} Server Version: version.Info{Major:"1", Minor:"15", GitVersion:"v1.15.0", GitCommit:"e8462b5b5dc2584fdcd18e6bcfe9f1e4d970a529", GitTreeState:"clean", BuildDate:"2019-06-19T16:32:14Z", GoVersion:"go1.12.5", Compiler:"gc", Platform:"linux/amd64"} <!--Additional Information:-->
non_process
issue with io docs tutorials kubernetes basics deploy app deploy interactive this is a bug report problem kubectl run kubernetes bootcamp image gcr io google samples kubernetes bootcamp port kubectl run generator deployment apps is deprecated and will be removed in a future version use kubectl run generator run pod or kubectl create instead deployment apps kubernetes bootcamp created proposed solution n a page to update kubectl version client version version info major minor gitversion gitcommit gittreestate clean builddate goversion compiler gc platform linux server version version info major minor gitversion gitcommit gittreestate clean builddate goversion compiler gc platform linux
0
16,186
20,626,546,089
IssuesEvent
2022-03-07 23:21:41
allinurl/goaccess
https://api.github.com/repos/allinurl/goaccess
closed
Add option for on-disk database size limit
log-processing on-disk
Hello! We are using configuration with database, load-from-disk and keep-db-files. In some cases database size can grow up to 8G or more. Is there any possibility to set maximum database size? So we can keep disk usage under control without removing goaccess database files and losing all of previously stored data. I could not find any options similar to "size limit", "database size" or "expires" in the official man page. Thank you!
1.0
Add option for on-disk database size limit - Hello! We are using configuration with database, load-from-disk and keep-db-files. In some cases database size can grow up to 8G or more. Is there any possibility to set maximum database size? So we can keep disk usage under control without removing goaccess database files and losing all of previously stored data. I could not find any options similar to "size limit", "database size" or "expires" in the official man page. Thank you!
process
add option for on disk database size limit hello we are using configuration with database load from disk and keep db files in some cases database size can grow up to or more is there any possibility to set maximum database size so we can keep disk usage under control without removing goaccess database files and losing all of previously stored data i could not find any options similar to size limit database size or expires in the official man page thank you
1
10,711
13,507,722,953
IssuesEvent
2020-09-14 06:32:06
symfony/symfony-docs
https://api.github.com/repos/symfony/symfony-docs
closed
[Process] allow setting options esp. "create_new_console" to detach a s…
Process
| Q | A | ------------ | --- | Feature PR | symfony/symfony#37519 | PR author(s) | @andrei0x309 | Merged in | 5.2-dev
1.0
[Process] allow setting options esp. "create_new_console" to detach a s… - | Q | A | ------------ | --- | Feature PR | symfony/symfony#37519 | PR author(s) | @andrei0x309 | Merged in | 5.2-dev
process
allow setting options esp create new console to detach a s… q a feature pr symfony symfony pr author s merged in dev
1
7,587
10,698,268,271
IssuesEvent
2019-10-23 18:19:18
processing/processing-docs
https://api.github.com/repos/processing/processing-docs
closed
subString should be substring in Handbook
processing handbook
### Issue description On page 143 of the second edition of the handbook (but first printing?), chapter 11, under "Syntax introduced" the String.substring() function is incorrectly capitalized as String.subString(). It is correct in the rest of that chapter. ### URL(s) of affected page(s) ### Proposed fix Change String.subString() to String.substring() in the "Syntax introduced" section of chapter 11.
1.0
subString should be substring in Handbook - ### Issue description On page 143 of the second edition of the handbook (but first printing?), chapter 11, under "Syntax introduced" the String.substring() function is incorrectly capitalized as String.subString(). It is correct in the rest of that chapter. ### URL(s) of affected page(s) ### Proposed fix Change String.subString() to String.substring() in the "Syntax introduced" section of chapter 11.
process
substring should be substring in handbook issue description on page of the second edition of the handbook but first printing chapter under syntax introduced the string substring function is incorrectly capitalized as string substring it is correct in the rest of that chapter url s of affected page s proposed fix change string substring to string substring in the syntax introduced section of chapter
1
15,344
19,491,831,742
IssuesEvent
2021-12-27 08:05:05
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
SAGA provider: "Duplicate algorithm name" warnings
Feedback Processing Bug
# SAGA provider: "Duplicate algorithm name" warnings

### What is the bug or the crash?

Starting QGIS 3.21.0-master (8498c55), installed using the OSGeo4W Network Installer on Windows 10, a long list of warnings appears in the Processing tab of the Log Messages panel:

```
2021-10-09T11:01:46 WARNING Duplicate algorithm name bioclimaticvariables for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name averagewithmask1 for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name averagewithmask2 for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name averagewiththereshold1 for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name averagewiththereshold2 for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name averagewiththereshold3 for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name destriping for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name destripingwithmask for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name directionalaverage for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name earthsorbitalparameters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name accumulatedcost for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name accumulationfunctions for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name aggregationindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name analyticalhierarchyprocess for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name covereddistance for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name crossclassificationandtabulation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name diversityofcategories for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fragmentationalternative for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fragmentationstandard for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fragmentationclassesfromdensityandconnectivity for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fuzzify for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fuzzyintersectionand for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fuzzyunionor for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name geometricfigures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gradientvectorfromcartesiantopolarcoordinates for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gradientvectorfrompolartocartesiancoordinates for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastersproduct for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterssum for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastercalculator for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterdifference for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterdivision for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastervolume for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name binaryerosionreconstruction for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name connectivityanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name dtmfilterslopebased for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name filterclumps for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gaussianfilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name geodesicmorphologicalreconstruction for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name angulardistanceweighted for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name bsplineapproximation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name cubicsplineapproximation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name changedatastorage for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name closegaps for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name closegapswithspline for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name closegapswithstepwiseresampling for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name closeonecellgaps for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name constantraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name croptodata for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterbuffer for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastercellindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastermasking for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterproximitybuffer for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name confusionmatrixpolygonsraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name confusionmatrixtworasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterskeletonization for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name automatedcloudcoverassessment for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name enhancedvegetationindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name imcorrfeaturetracking for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name inversedistanceweighted for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name invertdatanodata for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name invertraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name isodataclusteringforrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name kmeansclusteringforrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name kerneldensityestimation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name landsurfacetemperature for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name laplacianfilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name layerofextremevalue for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name leastcostpaths for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name maximumentropypresenceprediction for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name meshdenoise for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name metricconversions for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name mirrorraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name monthlyglobalbylatitude for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name morphologicalfilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multidirectionleefilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name naturalneighbour for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name nearestneighbour for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name patching for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name patternanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name geographiccoordinaterasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name proximityraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name randomfield for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name randomterrain for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rankfilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name resampling for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name resamplingfilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name seededregiongrowing for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name seedgeneration for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name featurestoraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name addrastervaluestopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name addrastervaluestofeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name cliprasterwithpolygon for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name contourlinesfromraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gradientvectorsfromdirectionalcomponents for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gradientvectorsfromdirectionandlength for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gradientvectorsfromsurface for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterstatisticsforpoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rasterstatisticsforpolygons for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastervaluestopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastervaluestopointsrandomly for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name localminimaandmaxima for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convertpointstolines for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convertpolygonstolines for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name linepolygonintersection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name lineproperties for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name linesimplification for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name linesmoothing for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name addcoordinatestopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name aggregatepointobservations for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name clippointswithpolygons for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convertlinestopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convertmultipointstopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convexhull for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name createpointraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name pointsfilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name pointsthinning for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name removeduplicatepoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name separatepointsbydirection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convertlinestopolygons for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convertpolygonlineverticestopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name difference for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flattenpolygonlayer for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name identity for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name intersect for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygonlineintersection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygonstoedgesandnodes for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygoncentroids for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygonclipping for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygonpartstoseparatepolygons for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygonselfintersection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polygonshapeindices for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name generatefeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name getfeaturesextents for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name landusescenariogenerator for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name mergelayers for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polartocartesiancoordinates for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name quadtreestructuretofeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name sharedpolygonedges for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name shrinkandexpand for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name simplefilter for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fireriskanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name simulation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name concentration for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name diffusepollutionrisk for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name diffusivehillslopeevolutionadi for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name diffusivehillslopeevolutionftcs for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fillsinksqmofesp for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationqmofesp for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name rastercombination for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name riverbasin for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name riverrastergeneration for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name snappointstoraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name snappointstolines for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name snappointstopoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name splitlinesatpoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name splitlineswithlines for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name splitfeatureslayerrandomly for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name categoricalcoincidence for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name directionalstatisticsforsingleraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fastrepresentativeness for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name globalmoransiforgrids for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multibandvariation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name radiusofvarianceraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name statisticsforrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name ordinarykriging for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name regressionkriging for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name simplekriging for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name minimumdistanceanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name spatialpointpatternanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gwrformultiplepredictorrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gwrforsinglepredictorrasterdedmodeloutput for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name gwrforsinglepredictorraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multiplelinearregressionanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multiplelinearregressionanalysisfeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multipleregressionanalysisrasterandpredictorrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multipleregressionanalysispointsandpredictorrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name polynomialregression for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name regressionanalysispointsandpredictorraster for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name successiveflowrouting for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name supervisedclassificationforrasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name surfaceandgradient for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name surfacegradientandconcentration for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name svmclassification for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name symmetricaldifference for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name tasseledcaptransformation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name channelnetwork for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name channelnetworkanddrainagebasins for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name overlandflowdistancetochannelnetwork for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name strahlerorder for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name basicterrainanalysis for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name cellbalance for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name edgecontamination for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationflowtracing for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationrecursive for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationtopdown for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flowpathlength for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flowwidthandspecificcatchmentarea for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name lakeflood for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name lsfactorfieldbased for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name lsfactor for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name maximumflowpathlength for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name sagawetnessindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name slopelength for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name slopelimitedflowaccumulation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name streampowerindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name analyticalhillshading for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name skyviewfactor for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convergenceindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name convergenceindexsearchradius for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name curvatureclassification for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name downslopedistancegradient for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name effectiveairflowheights for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fuzzylandformelementclassification for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name hypsometry for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name massbalanceindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name morphometricfeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name morphometricprotectionindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name multiresolutionindexofvalleybottomflatnessmrvbf for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name realsurfacearea for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name relativeheightsandslopepositions for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name slopeaspectcurvature for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name surfacespecificpoints for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name burnstreamnetworkintodem for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fillsinkswangliu for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name fillsinksxxlwangliu for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name flatdetection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name sinkdrainageroutedetection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name sinkremoval for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name crossprofiles for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name profilesfromlines for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name angmap for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name tcilow for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainruggednessindextri for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainsurfaceclassificationiwahashiandpike for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainsurfaceconvexity for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainsurfacetexture for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name thiessenpolygons for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name thinplatesplinetin for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name thinplatespline for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name thresholdbuffer for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name topofatmospherereflectance for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name topographiccorrection for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name topographicopenness for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name topographicpositionindextpi for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name topographicwetnessindextwi for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name tpibasedlandformclassification for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name transectthroughpolygonshapefile for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name transformfeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name transposerasters for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name triangulation for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name union for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name universalkriging for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name update for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name upslopeanddownslopecurvature for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name upslopearea for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name valleyandridgedetectiontophatapproach for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name valleydepth for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name variogramcloud for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name vectorisingrasterclasses for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name vectorruggednessmeasurevrm for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name vegetationindexdistancebased for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name vegetationindexslopebased for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name verticaldistancetochannelnetwork for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name warpingfeatures for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name waterretentioncapacity for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name watershedbasins for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name windexpositionindex for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name zonalrasterstatistics for provider saga
2021-10-09T11:01:46 WARNING Duplicate algorithm name zonalmultipleregressionanalysispointsandpredictorrasters for provider saga
```

### Steps to reproduce the issue

1. start QGIS
2. open the Processing tab of the Log Messages panel

### Versions

QGIS 3.21.0-master (8498c55) from OSGeo4W network installer on Windows 10

### Supported QGIS version

- [X] I'm running a supported QGIS version according to the roadmap.

### New profile

- [X] I tried with a new QGIS profile

### Additional context

Maybe related to https://github.com/qgis/QGIS/pull/43792. Maybe only occurring with the OSGeo4W installer on Windows.
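Each warning above corresponds to one algorithm name that was registered more than once for the same provider. The check itself can be sketched in plain Python (this is an illustrative stand-in, not the actual QGIS Processing registry code; the name list below is a hypothetical example):

```python
from collections import Counter

def duplicate_algorithm_names(names):
    """Return, sorted, the algorithm names that appear more than once."""
    counts = Counter(names)
    return sorted(name for name, n in counts.items() if n > 1)

# Hypothetical registration list where 'gaussianfilter' was added twice.
registered = ["gaussianfilter", "rastercalculator", "gaussianfilter"]
for name in duplicate_algorithm_names(registered):
    # Mirrors the shape of the log message reported above.
    print(f"WARNING Duplicate algorithm name {name} for provider saga")
```

In the live registry the same idea applies over the provider's registered algorithms, so every doubly-loaded SAGA description file yields one warning per algorithm.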
algorithm name riverbasin for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name riverrastergeneration for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name snappointstoraster for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name snappointstolines for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name snappointstopoints for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name splitlinesatpoints for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name splitlineswithlines for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name splitfeatureslayerrandomly for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name categoricalcoincidence for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name directionalstatisticsforsingleraster for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name fastrepresentativeness for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name globalmoransiforgrids for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name multibandvariation for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name radiusofvarianceraster for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name statisticsforrasters for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name ordinarykriging for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name regressionkriging for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name simplekriging for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name minimumdistanceanalysis for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name spatialpointpatternanalysis for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name gwrformultiplepredictorrasters for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name 
gwrforsinglepredictorrasterdedmodeloutput for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name gwrforsinglepredictorraster for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name multiplelinearregressionanalysis for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name multiplelinearregressionanalysisfeatures for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name multipleregressionanalysisrasterandpredictorrasters for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name multipleregressionanalysispointsandpredictorrasters for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name polynomialregression for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name regressionanalysispointsandpredictorraster for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name successiveflowrouting for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name supervisedclassificationforrasters for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name surfaceandgradient for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name surfacegradientandconcentration for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name svmclassification for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name symmetricaldifference for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name tasseledcaptransformation for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name channelnetwork for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name channelnetworkanddrainagebasins for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name overlandflowdistancetochannelnetwork for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name strahlerorder for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name basicterrainanalysis for provider saga 2021-10-09T11:01:46 WARNING 
Duplicate algorithm name cellbalance for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name edgecontamination for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationflowtracing for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationrecursive for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name flowaccumulationtopdown for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name flowpathlength for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name flowwidthandspecificcatchmentarea for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name lakeflood for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name lsfactorfieldbased for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name lsfactor for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name maximumflowpathlength for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name sagawetnessindex for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name slopelength for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name slopelimitedflowaccumulation for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name streampowerindex for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name analyticalhillshading for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name skyviewfactor for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name convergenceindex for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name convergenceindexsearchradius for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name curvatureclassification for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name downslopedistancegradient for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name effectiveairflowheights for provider saga 
2021-10-09T11:01:46 WARNING Duplicate algorithm name fuzzylandformelementclassification for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name hypsometry for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name massbalanceindex for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name morphometricfeatures for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name morphometricprotectionindex for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name multiresolutionindexofvalleybottomflatnessmrvbf for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name realsurfacearea for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name relativeheightsandslopepositions for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name slopeaspectcurvature for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name surfacespecificpoints for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name burnstreamnetworkintodem for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name fillsinkswangliu for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name fillsinksxxlwangliu for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name flatdetection for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name sinkdrainageroutedetection for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name sinkremoval for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name crossprofiles for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name profilesfromlines for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name angmap for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name tcilow for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainruggednessindextri for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name 
terrainsurfaceclassificationiwahashiandpike for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainsurfaceconvexity for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name terrainsurfacetexture for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name thiessenpolygons for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name thinplatesplinetin for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name thinplatespline for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name thresholdbuffer for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name topofatmospherereflectance for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name topographiccorrection for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name topographicopenness for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name topographicpositionindextpi for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name topographicwetnessindextwi for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name tpibasedlandformclassification for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name transectthroughpolygonshapefile for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name transformfeatures for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name transposerasters for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name triangulation for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name union for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name universalkriging for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name update for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name upslopeanddownslopecurvature for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name upslopearea for provider saga 2021-10-09T11:01:46 
WARNING Duplicate algorithm name valleyandridgedetectiontophatapproach for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name valleydepth for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name variogramcloud for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name vectorisingrasterclasses for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name vectorruggednessmeasurevrm for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name vegetationindexdistancebased for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name vegetationindexslopebased for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name verticaldistancetochannelnetwork for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name warpingfeatures for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name waterretentioncapacity for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name watershedbasins for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name windexpositionindex for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name zonalrasterstatistics for provider saga 2021-10-09T11:01:46 WARNING Duplicate algorithm name zonalmultipleregressionanalysispointsandpredictorrasters for provider saga ``` ### Steps to reproduce the issue 1. start QGIS 2. open the Processing tab of the Log Messages panel ### Versions QGIS 3.21.0-master (8498c55) from OSGeo4W network installer on Windows 10 ### Supported QGIS version - [X] I'm running a supported QGIS version according to the roadmap. ### New profile - [X] I tried with a new QGIS profile ### Additional context Maybe related to https://github.com/qgis/QGIS/pull/43792. Maybe only occurring with OSGeo4W installer on Windows.
process
saga provider duplicate algorithm name warnings what is the bug or the crash starting qgis master installed using the network installer on windows a long list of warnings appears in the processing tab of the log messages panel warning duplicate algorithm name bioclimaticvariables for provider saga warning duplicate algorithm name for provider saga warning duplicate algorithm name for provider saga warning duplicate algorithm name for provider saga warning duplicate algorithm name for provider saga warning duplicate algorithm name for provider saga warning duplicate algorithm name destriping for provider saga warning duplicate algorithm name destripingwithmask for provider saga warning duplicate algorithm name directionalaverage for provider saga warning duplicate algorithm name earthsorbitalparameters for provider saga warning duplicate algorithm name accumulatedcost for provider saga warning duplicate algorithm name accumulationfunctions for provider saga warning duplicate algorithm name aggregationindex for provider saga warning duplicate algorithm name analyticalhierarchyprocess for provider saga warning duplicate algorithm name covereddistance for provider saga warning duplicate algorithm name crossclassificationandtabulation for provider saga warning duplicate algorithm name diversityofcategories for provider saga warning duplicate algorithm name fragmentationalternative for provider saga warning duplicate algorithm name fragmentationstandard for provider saga warning duplicate algorithm name fragmentationclassesfromdensityandconnectivity for provider saga warning duplicate algorithm name fuzzify for provider saga warning duplicate algorithm name fuzzyintersectionand for provider saga warning duplicate algorithm name fuzzyunionor for provider saga warning duplicate algorithm name geometricfigures for provider saga warning duplicate algorithm name gradientvectorfromcartesiantopolarcoordinates for provider saga warning duplicate algorithm name 
gradientvectorfrompolartocartesiancoordinates for provider saga warning duplicate algorithm name rastersproduct for provider saga warning duplicate algorithm name rasterssum for provider saga warning duplicate algorithm name rastercalculator for provider saga warning duplicate algorithm name rasterdifference for provider saga warning duplicate algorithm name rasterdivision for provider saga warning duplicate algorithm name rastervolume for provider saga warning duplicate algorithm name binaryerosionreconstruction for provider saga warning duplicate algorithm name connectivityanalysis for provider saga warning duplicate algorithm name dtmfilterslopebased for provider saga warning duplicate algorithm name filterclumps for provider saga warning duplicate algorithm name gaussianfilter for provider saga warning duplicate algorithm name geodesicmorphologicalreconstruction for provider saga warning duplicate algorithm name angulardistanceweighted for provider saga warning duplicate algorithm name bsplineapproximation for provider saga warning duplicate algorithm name cubicsplineapproximation for provider saga warning duplicate algorithm name changedatastorage for provider saga warning duplicate algorithm name closegaps for provider saga warning duplicate algorithm name closegapswithspline for provider saga warning duplicate algorithm name closegapswithstepwiseresampling for provider saga warning duplicate algorithm name closeonecellgaps for provider saga warning duplicate algorithm name constantraster for provider saga warning duplicate algorithm name croptodata for provider saga warning duplicate algorithm name rasterbuffer for provider saga warning duplicate algorithm name rastercellindex for provider saga warning duplicate algorithm name rastermasking for provider saga warning duplicate algorithm name rasterproximitybuffer for provider saga warning duplicate algorithm name confusionmatrixpolygonsraster for provider saga warning duplicate algorithm name 
confusionmatrixtworasters for provider saga warning duplicate algorithm name rasterskeletonization for provider saga warning duplicate algorithm name automatedcloudcoverassessment for provider saga warning duplicate algorithm name enhancedvegetationindex for provider saga warning duplicate algorithm name imcorrfeaturetracking for provider saga warning duplicate algorithm name inversedistanceweighted for provider saga warning duplicate algorithm name invertdatanodata for provider saga warning duplicate algorithm name invertraster for provider saga warning duplicate algorithm name isodataclusteringforrasters for provider saga warning duplicate algorithm name kmeansclusteringforrasters for provider saga warning duplicate algorithm name kerneldensityestimation for provider saga warning duplicate algorithm name landsurfacetemperature for provider saga warning duplicate algorithm name laplacianfilter for provider saga warning duplicate algorithm name layerofextremevalue for provider saga warning duplicate algorithm name leastcostpaths for provider saga warning duplicate algorithm name maximumentropypresenceprediction for provider saga warning duplicate algorithm name meshdenoise for provider saga warning duplicate algorithm name metricconversions for provider saga warning duplicate algorithm name mirrorraster for provider saga warning duplicate algorithm name monthlyglobalbylatitude for provider saga warning duplicate algorithm name morphologicalfilter for provider saga warning duplicate algorithm name multidirectionleefilter for provider saga warning duplicate algorithm name naturalneighbour for provider saga warning duplicate algorithm name nearestneighbour for provider saga warning duplicate algorithm name patching for provider saga warning duplicate algorithm name patternanalysis for provider saga warning duplicate algorithm name geographiccoordinaterasters for provider saga warning duplicate algorithm name proximityraster for provider saga warning duplicate 
algorithm name randomfield for provider saga warning duplicate algorithm name randomterrain for provider saga warning duplicate algorithm name rankfilter for provider saga warning duplicate algorithm name resampling for provider saga warning duplicate algorithm name resamplingfilter for provider saga warning duplicate algorithm name seededregiongrowing for provider saga warning duplicate algorithm name seedgeneration for provider saga warning duplicate algorithm name featurestoraster for provider saga warning duplicate algorithm name addrastervaluestopoints for provider saga warning duplicate algorithm name addrastervaluestofeatures for provider saga warning duplicate algorithm name cliprasterwithpolygon for provider saga warning duplicate algorithm name contourlinesfromraster for provider saga warning duplicate algorithm name gradientvectorsfromdirectionalcomponents for provider saga warning duplicate algorithm name gradientvectorsfromdirectionandlength for provider saga warning duplicate algorithm name gradientvectorsfromsurface for provider saga warning duplicate algorithm name rasterstatisticsforpoints for provider saga warning duplicate algorithm name rasterstatisticsforpolygons for provider saga warning duplicate algorithm name rastervaluestopoints for provider saga warning duplicate algorithm name rastervaluestopointsrandomly for provider saga warning duplicate algorithm name localminimaandmaxima for provider saga warning duplicate algorithm name convertpointstolines for provider saga warning duplicate algorithm name convertpolygonstolines for provider saga warning duplicate algorithm name linepolygonintersection for provider saga warning duplicate algorithm name lineproperties for provider saga warning duplicate algorithm name linesimplification for provider saga warning duplicate algorithm name linesmoothing for provider saga warning duplicate algorithm name addcoordinatestopoints for provider saga warning duplicate algorithm name 
aggregatepointobservations for provider saga warning duplicate algorithm name clippointswithpolygons for provider saga warning duplicate algorithm name convertlinestopoints for provider saga warning duplicate algorithm name convertmultipointstopoints for provider saga warning duplicate algorithm name convexhull for provider saga warning duplicate algorithm name createpointraster for provider saga warning duplicate algorithm name pointsfilter for provider saga warning duplicate algorithm name pointsthinning for provider saga warning duplicate algorithm name removeduplicatepoints for provider saga warning duplicate algorithm name separatepointsbydirection for provider saga warning duplicate algorithm name convertlinestopolygons for provider saga warning duplicate algorithm name convertpolygonlineverticestopoints for provider saga warning duplicate algorithm name difference for provider saga warning duplicate algorithm name flattenpolygonlayer for provider saga warning duplicate algorithm name identity for provider saga warning duplicate algorithm name intersect for provider saga warning duplicate algorithm name polygonlineintersection for provider saga warning duplicate algorithm name polygonstoedgesandnodes for provider saga warning duplicate algorithm name polygoncentroids for provider saga warning duplicate algorithm name polygonclipping for provider saga warning duplicate algorithm name polygonpartstoseparatepolygons for provider saga warning duplicate algorithm name polygonselfintersection for provider saga warning duplicate algorithm name polygonshapeindices for provider saga warning duplicate algorithm name generatefeatures for provider saga warning duplicate algorithm name getfeaturesextents for provider saga warning duplicate algorithm name landusescenariogenerator for provider saga warning duplicate algorithm name mergelayers for provider saga warning duplicate algorithm name polartocartesiancoordinates for provider saga warning duplicate algorithm name 
quadtreestructuretofeatures for provider saga warning duplicate algorithm name sharedpolygonedges for provider saga warning duplicate algorithm name shrinkandexpand for provider saga warning duplicate algorithm name simplefilter for provider saga warning duplicate algorithm name fireriskanalysis for provider saga warning duplicate algorithm name simulation for provider saga warning duplicate algorithm name concentration for provider saga warning duplicate algorithm name diffusepollutionrisk for provider saga warning duplicate algorithm name diffusivehillslopeevolutionadi for provider saga warning duplicate algorithm name diffusivehillslopeevolutionftcs for provider saga warning duplicate algorithm name fillsinksqmofesp for provider saga warning duplicate algorithm name flowaccumulationqmofesp for provider saga warning duplicate algorithm name rastercombination for provider saga warning duplicate algorithm name riverbasin for provider saga warning duplicate algorithm name riverrastergeneration for provider saga warning duplicate algorithm name snappointstoraster for provider saga warning duplicate algorithm name snappointstolines for provider saga warning duplicate algorithm name snappointstopoints for provider saga warning duplicate algorithm name splitlinesatpoints for provider saga warning duplicate algorithm name splitlineswithlines for provider saga warning duplicate algorithm name splitfeatureslayerrandomly for provider saga warning duplicate algorithm name categoricalcoincidence for provider saga warning duplicate algorithm name directionalstatisticsforsingleraster for provider saga warning duplicate algorithm name fastrepresentativeness for provider saga warning duplicate algorithm name globalmoransiforgrids for provider saga warning duplicate algorithm name multibandvariation for provider saga warning duplicate algorithm name radiusofvarianceraster for provider saga warning duplicate algorithm name statisticsforrasters for provider saga warning duplicate 
algorithm name ordinarykriging for provider saga warning duplicate algorithm name regressionkriging for provider saga warning duplicate algorithm name simplekriging for provider saga warning duplicate algorithm name minimumdistanceanalysis for provider saga warning duplicate algorithm name spatialpointpatternanalysis for provider saga warning duplicate algorithm name gwrformultiplepredictorrasters for provider saga warning duplicate algorithm name gwrforsinglepredictorrasterdedmodeloutput for provider saga warning duplicate algorithm name gwrforsinglepredictorraster for provider saga warning duplicate algorithm name multiplelinearregressionanalysis for provider saga warning duplicate algorithm name multiplelinearregressionanalysisfeatures for provider saga warning duplicate algorithm name multipleregressionanalysisrasterandpredictorrasters for provider saga warning duplicate algorithm name multipleregressionanalysispointsandpredictorrasters for provider saga warning duplicate algorithm name polynomialregression for provider saga warning duplicate algorithm name regressionanalysispointsandpredictorraster for provider saga warning duplicate algorithm name successiveflowrouting for provider saga warning duplicate algorithm name supervisedclassificationforrasters for provider saga warning duplicate algorithm name surfaceandgradient for provider saga warning duplicate algorithm name surfacegradientandconcentration for provider saga warning duplicate algorithm name svmclassification for provider saga warning duplicate algorithm name symmetricaldifference for provider saga warning duplicate algorithm name tasseledcaptransformation for provider saga warning duplicate algorithm name channelnetwork for provider saga warning duplicate algorithm name channelnetworkanddrainagebasins for provider saga warning duplicate algorithm name overlandflowdistancetochannelnetwork for provider saga warning duplicate algorithm name strahlerorder for provider saga warning duplicate algorithm 
name basicterrainanalysis for provider saga warning duplicate algorithm name cellbalance for provider saga warning duplicate algorithm name edgecontamination for provider saga warning duplicate algorithm name flowaccumulationflowtracing for provider saga warning duplicate algorithm name flowaccumulationrecursive for provider saga warning duplicate algorithm name flowaccumulationtopdown for provider saga warning duplicate algorithm name flowpathlength for provider saga warning duplicate algorithm name flowwidthandspecificcatchmentarea for provider saga warning duplicate algorithm name lakeflood for provider saga warning duplicate algorithm name lsfactorfieldbased for provider saga warning duplicate algorithm name lsfactor for provider saga warning duplicate algorithm name maximumflowpathlength for provider saga warning duplicate algorithm name sagawetnessindex for provider saga warning duplicate algorithm name slopelength for provider saga warning duplicate algorithm name slopelimitedflowaccumulation for provider saga warning duplicate algorithm name streampowerindex for provider saga warning duplicate algorithm name analyticalhillshading for provider saga warning duplicate algorithm name skyviewfactor for provider saga warning duplicate algorithm name convergenceindex for provider saga warning duplicate algorithm name convergenceindexsearchradius for provider saga warning duplicate algorithm name curvatureclassification for provider saga warning duplicate algorithm name downslopedistancegradient for provider saga warning duplicate algorithm name effectiveairflowheights for provider saga warning duplicate algorithm name fuzzylandformelementclassification for provider saga warning duplicate algorithm name hypsometry for provider saga warning duplicate algorithm name massbalanceindex for provider saga warning duplicate algorithm name morphometricfeatures for provider saga warning duplicate algorithm name morphometricprotectionindex for provider saga warning duplicate 
algorithm name multiresolutionindexofvalleybottomflatnessmrvbf for provider saga warning duplicate algorithm name realsurfacearea for provider saga warning duplicate algorithm name relativeheightsandslopepositions for provider saga warning duplicate algorithm name slopeaspectcurvature for provider saga warning duplicate algorithm name surfacespecificpoints for provider saga warning duplicate algorithm name burnstreamnetworkintodem for provider saga warning duplicate algorithm name fillsinkswangliu for provider saga warning duplicate algorithm name fillsinksxxlwangliu for provider saga warning duplicate algorithm name flatdetection for provider saga warning duplicate algorithm name sinkdrainageroutedetection for provider saga warning duplicate algorithm name sinkremoval for provider saga warning duplicate algorithm name crossprofiles for provider saga warning duplicate algorithm name profilesfromlines for provider saga warning duplicate algorithm name angmap for provider saga warning duplicate algorithm name tcilow for provider saga warning duplicate algorithm name terrainruggednessindextri for provider saga warning duplicate algorithm name terrainsurfaceclassificationiwahashiandpike for provider saga warning duplicate algorithm name terrainsurfaceconvexity for provider saga warning duplicate algorithm name terrainsurfacetexture for provider saga warning duplicate algorithm name thiessenpolygons for provider saga warning duplicate algorithm name thinplatesplinetin for provider saga warning duplicate algorithm name thinplatespline for provider saga warning duplicate algorithm name thresholdbuffer for provider saga warning duplicate algorithm name topofatmospherereflectance for provider saga warning duplicate algorithm name topographiccorrection for provider saga warning duplicate algorithm name topographicopenness for provider saga warning duplicate algorithm name topographicpositionindextpi for provider saga warning duplicate algorithm name 
topographicwetnessindextwi for provider saga warning duplicate algorithm name tpibasedlandformclassification for provider saga warning duplicate algorithm name transectthroughpolygonshapefile for provider saga warning duplicate algorithm name transformfeatures for provider saga warning duplicate algorithm name transposerasters for provider saga warning duplicate algorithm name triangulation for provider saga warning duplicate algorithm name union for provider saga warning duplicate algorithm name universalkriging for provider saga warning duplicate algorithm name update for provider saga warning duplicate algorithm name upslopeanddownslopecurvature for provider saga warning duplicate algorithm name upslopearea for provider saga warning duplicate algorithm name valleyandridgedetectiontophatapproach for provider saga warning duplicate algorithm name valleydepth for provider saga warning duplicate algorithm name variogramcloud for provider saga warning duplicate algorithm name vectorisingrasterclasses for provider saga warning duplicate algorithm name vectorruggednessmeasurevrm for provider saga warning duplicate algorithm name vegetationindexdistancebased for provider saga warning duplicate algorithm name vegetationindexslopebased for provider saga warning duplicate algorithm name verticaldistancetochannelnetwork for provider saga warning duplicate algorithm name warpingfeatures for provider saga warning duplicate algorithm name waterretentioncapacity for provider saga warning duplicate algorithm name watershedbasins for provider saga warning duplicate algorithm name windexpositionindex for provider saga warning duplicate algorithm name zonalrasterstatistics for provider saga warning duplicate algorithm name zonalmultipleregressionanalysispointsandpredictorrasters for provider saga steps to reproduce the issue start qgis open the processing tab of the log messages panel versions qgis master from network installer on windows supported qgis version i m running a 
supported qgis version according to the roadmap new profile i tried with a new qgis profile additional context maybe related to maybe only occurring with installer on windows
1
171,799
20,996,413,486
IssuesEvent
2022-03-29 13:51:15
scumdestroy/docs
https://api.github.com/repos/scumdestroy/docs
closed
Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl: 1 vulnerabilities (highest severity is: 7.5) - autoclosed
security vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</b></p></summary> <p>An asynchronous networking framework written in Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /ci/requirements.txt</p> <p>Path to vulnerable library: /ci/requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/scumdestroy/docs/commit/4365b26096e64c91477237871b08ec9ab84069e1">4365b26096e64c91477237871b08ec9ab84069e1</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2022-21712](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21712) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl | Direct | Twisted - 22.1.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-21712</summary> ### Vulnerable Library - <b>Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</b></p> <p>An asynchronous networking framework written in Python</p> <p>Library home page: <a 
href="https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /ci/requirements.txt</p> <p>Path to vulnerable library: /ci/requirements.txt</p> <p> Dependency Hierarchy: - :x: **Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/scumdestroy/docs/commit/4365b26096e64c91477237871b08ec9ab84069e1">4365b26096e64c91477237871b08ec9ab84069e1</a></p> <p>Found in base branch: <b>develop</b></p> </p> <p></p> ### Vulnerability Details <p> twisted is an event-driven networking engine written in Python. In affected versions twisted exposes cookies and authorization headers when following cross-origin redirects. This issue is present in the `twited.web.RedirectAgent` and `twisted.web. BrowserLikeRedirectAgent` functions. Users are advised to upgrade. There are no known workarounds. <p>Publish Date: 2022-02-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21712>CVE-2022-21712</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twisted/twisted/security/advisories/GHSA-92x2-jw7w-xvvx">https://github.com/twisted/twisted/security/advisories/GHSA-92x2-jw7w-xvvx</a></p> <p>Release Date: 2022-02-07</p> <p>Fix Resolution: Twisted - 22.1.0</p> </p> <p></p> Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details> <!-- <REMEDIATE>[{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"Twisted","packageVersion":"20.3.0","packageFilePaths":["/ci/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"Twisted:20.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Twisted - 22.1.0","isBinary":false}],"baseBranches":["develop"],"vulnerabilityIdentifier":"CVE-2022-21712","vulnerabilityDetails":"twisted is an event-driven networking engine written in Python. In affected versions twisted exposes cookies and authorization headers when following cross-origin redirects. This issue is present in the `twited.web.RedirectAgent` and `twisted.web. BrowserLikeRedirectAgent` functions. Users are advised to upgrade. There are no known workarounds.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21712","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}]</REMEDIATE> -->
True
Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl: 1 vulnerabilities (highest severity is: 7.5) - autoclosed - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</b></p></summary> <p>An asynchronous networking framework written in Python</p> <p>Library home page: <a href="https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /ci/requirements.txt</p> <p>Path to vulnerable library: /ci/requirements.txt</p> <p> <p>Found in HEAD commit: <a href="https://github.com/scumdestroy/docs/commit/4365b26096e64c91477237871b08ec9ab84069e1">4365b26096e64c91477237871b08ec9ab84069e1</a></p></details> ## Vulnerabilities | CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available | | ------------- | ------------- | ----- | ----- | ----- | --- | --- | | [CVE-2022-21712](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21712) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> High | 7.5 | Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl | Direct | Twisted - 22.1.0 | &#10060; | ## Details <details> <summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> CVE-2022-21712</summary> ### Vulnerable Library - <b>Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</b></p> <p>An asynchronous networking framework written in Python</p> <p>Library home page: <a 
href="https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/b8/f9/489416dda6de8ae6419356bf003c10d1ce6fb8377b6a3207b02b3a39c42a/Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl</a></p> <p>Path to dependency file: /ci/requirements.txt</p> <p>Path to vulnerable library: /ci/requirements.txt</p> <p> Dependency Hierarchy: - :x: **Twisted-20.3.0-cp37-cp37m-manylinux1_x86_64.whl** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/scumdestroy/docs/commit/4365b26096e64c91477237871b08ec9ab84069e1">4365b26096e64c91477237871b08ec9ab84069e1</a></p> <p>Found in base branch: <b>develop</b></p> </p> <p></p> ### Vulnerability Details <p> twisted is an event-driven networking engine written in Python. In affected versions twisted exposes cookies and authorization headers when following cross-origin redirects. This issue is present in the `twited.web.RedirectAgent` and `twisted.web. BrowserLikeRedirectAgent` functions. Users are advised to upgrade. There are no known workarounds. <p>Publish Date: 2022-02-07 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21712>CVE-2022-21712</a></p> </p> <p></p> ### CVSS 3 Score Details (<b>7.5</b>) <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> <p></p> ### Suggested Fix <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/twisted/twisted/security/advisories/GHSA-92x2-jw7w-xvvx">https://github.com/twisted/twisted/security/advisories/GHSA-92x2-jw7w-xvvx</a></p> <p>Release Date: 2022-02-07</p> <p>Fix Resolution: Twisted - 22.1.0</p> </p> <p></p> Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) </details> <!-- <REMEDIATE>[{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Python","packageName":"Twisted","packageVersion":"20.3.0","packageFilePaths":["/ci/requirements.txt"],"isTransitiveDependency":false,"dependencyTree":"Twisted:20.3.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"Twisted - 22.1.0","isBinary":false}],"baseBranches":["develop"],"vulnerabilityIdentifier":"CVE-2022-21712","vulnerabilityDetails":"twisted is an event-driven networking engine written in Python. In affected versions twisted exposes cookies and authorization headers when following cross-origin redirects. This issue is present in the `twited.web.RedirectAgent` and `twisted.web. BrowserLikeRedirectAgent` functions. Users are advised to upgrade. There are no known workarounds.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-21712","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}]</REMEDIATE> -->
non_process
twisted whl vulnerabilities highest severity is autoclosed vulnerable library twisted whl an asynchronous networking framework written in python library home page a href path to dependency file ci requirements txt path to vulnerable library ci requirements txt found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available high twisted whl direct twisted details cve vulnerable library twisted whl an asynchronous networking framework written in python library home page a href path to dependency file ci requirements txt path to vulnerable library ci requirements txt dependency hierarchy x twisted whl vulnerable library found in head commit a href found in base branch develop vulnerability details twisted is an event driven networking engine written in python in affected versions twisted exposes cookies and authorization headers when following cross origin redirects this issue is present in the twited web redirectagent and twisted web browserlikeredirectagent functions users are advised to upgrade there are no known workarounds publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution twisted step up your open source security game with whitesource istransitivedependency false dependencytree twisted isminimumfixversionavailable true minimumfixversion twisted isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails twisted is an event driven networking engine written in python in affected versions twisted exposes cookies and authorization headers when following cross origin redirects this issue is present in the twited web redirectagent and twisted web 
browserlikeredirectagent functions users are advised to upgrade there are no known workarounds vulnerabilityurl
0
53
2,514,962,331
IssuesEvent
2015-01-15 15:38:19
Graylog2/graylog2-server
https://api.github.com/repos/Graylog2/graylog2-server
closed
Blacklistfilter do not commit journal messages
bug processing
If a message is filtered out it will not be purged from the journal, because its offset is not committed.
1.0
Blacklistfilter do not commit journal messages - If a message is filtered out it will not be purged from the journal, because its offset is not committed.
process
blacklistfilter do not commit journal messages if a message is filtered out it will not be purged from the journal because its offset is not committed
1
6,991
10,142,352,340
IssuesEvent
2019-08-03 23:25:11
bow-simulation/virtualbow
https://api.github.com/repos/bow-simulation/virtualbow
closed
Windows installer uses wrong default directory
area: software process platform: windows type: bug
Is `C:\VirtualBow` but should be `C:\Program Files\VirtualBow` or `C:\Program Files (x86)\VirtualBow`. It seems like the `iss` input file doesn't make use of the `ISS_PROGRAM_DIR` CMake variable.
1.0
Windows installer uses wrong default directory - Is `C:\VirtualBow` but should be `C:\Program Files\VirtualBow` or `C:\Program Files (x86)\VirtualBow`. It seems like the `iss` input file doesn't make use of the `ISS_PROGRAM_DIR` CMake variable.
process
windows installer uses wrong default directory is c virtualbow but should be c program files virtualbow or c program files virtualbow it seems like the iss input file doesn t make use of the iss program dir cmake variable
1
14,298
17,270,797,823
IssuesEvent
2021-07-22 19:33:01
RIOT-OS/RIOT
https://api.github.com/repos/RIOT-OS/RIOT
opened
gcoap_dtls: Selecting transport at run time is not possible
Area: security Process: API change Type: bug
<!-- ==================================== IF YOUR ISSUE IS RELATED TO SECURITY ==================================== please submit it to the security mailing-list security@riot-os.org. If your issue is a question related to the usage of RIOT, please submit it to our forum at https://forum.riot-os.org. --> #### Description Say I have a user interface where a URL is provided: `coap://example.org/sensor` or `coaps://example.org/sensor`. Currently, when `gcoap_dtls` is compiled in `gcoap_req_send()` does not provide me any capability to honor that URL input. It just selects DTLS (so always assumes `coaps://`, as far as I can tell). <!-- Example: Cannot build gnrc_networking application for samr21-xpro board. --> #### Steps to reproduce the issue Try to send an unencryptet CoAP request using `gcoap_req_send()` with `gcoap_dtls` compiled in. <!-- Try to describe as precisely as possible here the steps required to reproduce the issue. Here you can also describe your hardware configuration, the network setup, etc. --> #### Expected results The CoAP request is unencrypted and sent over a UDP sock. <!-- Example: The gnrc_networking application builds on samr21-xpro. --> #### Actual results The CoAP request is always encrypted and sent over a DTLS sock. <!-- Please paste or specifically describe the actual output. --> #### Versions Any RIOT version after 23a8659bdf35a0cc. <!-- Operating system: Mac OSX, Linux, Vagrant VM Build environment: GCC, CLang versions (you can run the following command from the RIOT base directory: make print-versions). --> <!-- Thanks for contributing! -->
1.0
gcoap_dtls: Selecting transport at run time is not possible - <!-- ==================================== IF YOUR ISSUE IS RELATED TO SECURITY ==================================== please submit it to the security mailing-list security@riot-os.org. If your issue is a question related to the usage of RIOT, please submit it to our forum at https://forum.riot-os.org. --> #### Description Say I have a user interface where a URL is provided: `coap://example.org/sensor` or `coaps://example.org/sensor`. Currently, when `gcoap_dtls` is compiled in `gcoap_req_send()` does not provide me any capability to honor that URL input. It just selects DTLS (so always assumes `coaps://`, as far as I can tell). <!-- Example: Cannot build gnrc_networking application for samr21-xpro board. --> #### Steps to reproduce the issue Try to send an unencryptet CoAP request using `gcoap_req_send()` with `gcoap_dtls` compiled in. <!-- Try to describe as precisely as possible here the steps required to reproduce the issue. Here you can also describe your hardware configuration, the network setup, etc. --> #### Expected results The CoAP request is unencrypted and sent over a UDP sock. <!-- Example: The gnrc_networking application builds on samr21-xpro. --> #### Actual results The CoAP request is always encrypted and sent over a DTLS sock. <!-- Please paste or specifically describe the actual output. --> #### Versions Any RIOT version after 23a8659bdf35a0cc. <!-- Operating system: Mac OSX, Linux, Vagrant VM Build environment: GCC, CLang versions (you can run the following command from the RIOT base directory: make print-versions). --> <!-- Thanks for contributing! -->
process
gcoap dtls selecting transport at run time is not possible if your issue is related to security please submit it to the security mailing list security riot os org if your issue is a question related to the usage of riot please submit it to our forum at description say i have a user interface where a url is provided coap example org sensor or coaps example org sensor currently when gcoap dtls is compiled in gcoap req send does not provide me any capability to honor that url input it just selects dtls so always assumes coaps as far as i can tell example cannot build gnrc networking application for xpro board steps to reproduce the issue try to send an unencryptet coap request using gcoap req send with gcoap dtls compiled in try to describe as precisely as possible here the steps required to reproduce the issue here you can also describe your hardware configuration the network setup etc expected results the coap request is unencrypted and sent over a udp sock example the gnrc networking application builds on xpro actual results the coap request is always encrypted and sent over a dtls sock please paste or specifically describe the actual output versions any riot version after operating system mac osx linux vagrant vm build environment gcc clang versions you can run the following command from the riot base directory make print versions
1
196,367
14,856,309,211
IssuesEvent
2021-01-18 13:59:19
elastic/kibana
https://api.github.com/repos/elastic/kibana
closed
Failing test: renders the data provider of a host dragged from the All Hosts widget on the hosts page - timeline data providers renders the data provider of a host dragged from the All Hosts widget on the hosts page
Team: SecuritySolution Team:SIEM failed-test fixed skipped-test v7.11.0 v8.0.0
A test failed on a tracked branch ``` CypressError: Timed out retrying: expected 'suricata-iowa' to equal 'host.name: "suricata-iowa"' at cypressErr (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:138644:9) at throwErr (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:138577:11) at Object.throwErrByPath (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:138625:3) at retry (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:132905:19) at onFailFn (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:121122:16) at tryCatcher (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:165465:23) at Promise._settlePromiseFromHandler (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163401:31) at Promise._settlePromise (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163458:18) at Promise._settlePromise0 (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163503:10) at Promise._settlePromises (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163578:18) at Async../node_modules/bluebird/js/release/async.js.Async._drainQueue (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:160190:16) at Async../node_modules/bluebird/js/release/async.js.Async._drainQueues (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:160200:10) at Async.drainQueues (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:160074:14) ``` First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/4088/) <!-- kibanaCiData = {"failed-test":{"test.class":"renders the data provider of a host dragged from the All Hosts widget on the hosts page","test.name":"timeline data providers renders the data provider of a host dragged from the All Hosts widget on the hosts 
page","test.failCount":15}} -->
2.0
Failing test: renders the data provider of a host dragged from the All Hosts widget on the hosts page - timeline data providers renders the data provider of a host dragged from the All Hosts widget on the hosts page - A test failed on a tracked branch ``` CypressError: Timed out retrying: expected 'suricata-iowa' to equal 'host.name: "suricata-iowa"' at cypressErr (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:138644:9) at throwErr (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:138577:11) at Object.throwErrByPath (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:138625:3) at retry (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:132905:19) at onFailFn (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:121122:16) at tryCatcher (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:165465:23) at Promise._settlePromiseFromHandler (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163401:31) at Promise._settlePromise (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163458:18) at Promise._settlePromise0 (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163503:10) at Promise._settlePromises (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:163578:18) at Async../node_modules/bluebird/js/release/async.js.Async._drainQueue (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:160190:16) at Async../node_modules/bluebird/js/release/async.js.Async._drainQueues (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:160200:10) at Async.drainQueues (http://elastic:changeme@localhost:61121/__cypress/runner/cypress_runner.js:160074:14) ``` First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/4088/) <!-- kibanaCiData = 
{"failed-test":{"test.class":"renders the data provider of a host dragged from the All Hosts widget on the hosts page","test.name":"timeline data providers renders the data provider of a host dragged from the All Hosts widget on the hosts page","test.failCount":15}} -->
non_process
failing test renders the data provider of a host dragged from the all hosts widget on the hosts page timeline data providers renders the data provider of a host dragged from the all hosts widget on the hosts page a test failed on a tracked branch cypresserror timed out retrying expected suricata iowa to equal host name suricata iowa at cypresserr at throwerr at object throwerrbypath at retry at onfailfn at trycatcher at promise settlepromisefromhandler at promise settlepromise at promise at promise settlepromises at async node modules bluebird js release async js async drainqueue at async node modules bluebird js release async js async drainqueues at async drainqueues first failure
0
10,369
13,189,125,694
IssuesEvent
2020-08-13 07:53:30
zammad/zammad
https://api.github.com/repos/zammad/zammad
closed
Make channels trusted
bug mail processing
### Infos: * Used Zammad version: 3.4 * Installation method (source, package, ..): all * Operating system: all * Database + version: all * Elasticsearch version: all * Browser + version: all ### Expected behavior: * Currently it is not possible to set a channel trusted ### Actual behavior: * You only can set the trusted flag for StdIn https://github.com/zammad/zammad/blob/develop/app/models/channel/driver/mail_stdin.rb#L17 ------------- It should be possible to set the channel trusted via rails console.
1.0
Make channels trusted - ### Infos: * Used Zammad version: 3.4 * Installation method (source, package, ..): all * Operating system: all * Database + version: all * Elasticsearch version: all * Browser + version: all ### Expected behavior: * Currently it is not possible to set a channel trusted ### Actual behavior: * You only can set the trusted flag for StdIn https://github.com/zammad/zammad/blob/develop/app/models/channel/driver/mail_stdin.rb#L17 ------------- It should be possible to set the channel trusted via rails console.
process
make channels trusted infos used zammad version installation method source package all operating system all database version all elasticsearch version all browser version all expected behavior currently it is not possible to set a channel trusted actual behavior you only can set the trusted flag for stdin it should be possible to set the channel trusted via rails console
1
5,311
8,127,318,747
IssuesEvent
2018-08-17 07:36:47
openvstorage/framework
https://api.github.com/repos/openvstorage/framework
closed
ovs remove nodes doesn't work
process_wontfix type_bug
ovs remove nodes doesn't work. It turns out the config arakoon is in a state where the clients can't find the master anymore they need to finish up the shrink sequence.
1.0
ovs remove nodes doesn't work - ovs remove nodes doesn't work. It turns out the config arakoon is in a state where the clients can't find the master anymore they need to finish up the shrink sequence.
process
ovs remove nodes doesn t work ovs remove nodes doesn t work it turns out the config arakoon is in a state where the clients can t find the master anymore they need to finish up the shrink sequence
1
741,488
25,798,752,504
IssuesEvent
2022-12-10 20:30:57
internetarchive/openlibrary
https://api.github.com/repos/internetarchive/openlibrary
closed
Sync IA ids latest data dump for December
Module: Import Priority: 2 Lead: @cdrini
Run Scott's tool to sync IA ids on the latest dump now that December has come around!
1.0
Sync IA ids latest data dump for December - Run Scott's tool to sync IA ids on the latest dump now that December has come around!
non_process
sync ia ids latest data dump for december run scott s tool to sync ia ids on the latest dump now that december has come around
0
19,853
26,254,692,053
IssuesEvent
2023-01-05 22:59:45
rusefi/rusefi_documentation
https://api.github.com/repos/rusefi/rusefi_documentation
closed
improve workflow that uploads wiki2& wiki3 to handle testing and dir structure change
wiki location & process change
Current situation: 1. "workflow that uploads wiki3 overwrites without removing what already exists in order to not make the wiki go down temporarily." from https://github.com/rusefi/rusefi_documentation/issues/123#issuecomment-1364537056 2. "no test of changes possible" from #249 3. avoid automatic sync of wiki2 when pre-requisites are unmet (details in https://github.com/rusefi/rusefi_documentation/pull/320#issuecomment-1368001890) Desired situation: 1. all requirements are handled correctly 2. The new process is documented and reviewed before it's implemented
1.0
improve workflow that uploads wiki2& wiki3 to handle testing and dir structure change - Current situation: 1. "workflow that uploads wiki3 overwrites without removing what already exists in order to not make the wiki go down temporarily." from https://github.com/rusefi/rusefi_documentation/issues/123#issuecomment-1364537056 2. "no test of changes possible" from #249 3. avoid automatic sync of wiki2 when pre-requisites are unmet (details in https://github.com/rusefi/rusefi_documentation/pull/320#issuecomment-1368001890) Desired situation: 1. all requirements are handled correctly 2. The new process is documented and reviewed before it's implemented
process
improve workflow that uploads to handle testing and dir structure change current situation workflow that uploads overwrites without removing what already exists in order to not make the wiki go down temporarily from no test of changes possible from avoid automatic sync of when pre requisites are unmet details in desired situation all requirements are handled correctly the new process is documented and reviewed before it s implemented
1
49,851
10,426,983,278
IssuesEvent
2019-09-16 18:51:24
microsoft/AdaptiveCards
https://api.github.com/repos/microsoft/AdaptiveCards
closed
[UWP][TestApp] Failures in test app
Bug Status-In Code Review Triage-Approved for Fix
# Platform * UWP # Details The UWP test app is currently reporting 9 failures across 4 files
1.0
[UWP][TestApp] Failures in test app - # Platform * UWP # Details The UWP test app is currently reporting 9 failures across 4 files
non_process
failures in test app platform uwp details the uwp test app is currently reporting failures across files
0
251,080
7,999,571,737
IssuesEvent
2018-07-22 03:41:32
zhengqunkoo/taxibros
https://api.github.com/repos/zhengqunkoo/taxibros
closed
Deploying with AWS
Priority: Low enhancement
Deploy app on AWS. Heroku has limited memory space and is fairly laggy. We would also like the app to be self-maintaining which is not possible with heroku as heroku is known to completely reset the deployment to its original state. AWS will also allow us to have more experience with servers and proper deployment.
1.0
Deploying with AWS - Deploy app on AWS. Heroku has limited memory space and is fairly laggy. We would also like the app to be self-maintaining which is not possible with heroku as heroku is known to completely reset the deployment to its original state. AWS will also allow us to have more experience with servers and proper deployment.
non_process
deploying with aws deploy app on aws heroku has limited memory space and is fairly laggy we would also like the app to be self maintaining which is not possible with heroku as heroku is known to completely reset the deployment to its original state aws will also allow us to have more experience with servers and proper deployment
0
7,500
10,585,442,911
IssuesEvent
2019-10-08 17:31:08
googleapis/google-cloud-python
https://api.github.com/repos/googleapis/google-cloud-python
closed
Synthesis failed for videointelligence
api: videointelligence autosynth failure type: process
Hello! Autosynth couldn't regenerate videointelligence. :broken_heart: Here's the output from running `synth.py`: ``` Cloning into 'working_repo'... Switched to branch 'autosynth-videointelligence' Running synthtool ['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--'] synthtool > Executing /tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py. synthtool > Ensuring dependencies. synthtool > Pulling artman image. latest: Pulling from googleapis/artman Digest: sha256:0d2f8d429110aeb8d82df6550ef4ede59d40df9062d260a1580fce688b0512bf Status: Image is up to date for googleapis/artman:latest synthtool > Cloning googleapis. Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module> main() File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main rv = self.invoke(ctx) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke return callback(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed File "/tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py", line 34, in <module> 
include_protos=True, File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library return self._generate_code(service, version, "python", **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code f"Unable to find configuration yaml file: {(googleapis / config_path)}." FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/artman_videointelligence_v1beta1.yaml. synthtool > Cleaned up 1 temporary directories. synthtool > Wrote metadata to synth.metadata. Synthesis failed ``` Google internal developers can see the full log [here](https://sponge/1e1314a6-ec45-4c8e-a82f-a9053780cb23).
1.0
Synthesis failed for videointelligence - Hello! Autosynth couldn't regenerate videointelligence. :broken_heart: Here's the output from running `synth.py`: ``` Cloning into 'working_repo'... Switched to branch 'autosynth-videointelligence' Running synthtool ['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', 'synth.py', '--'] synthtool > Executing /tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py. synthtool > Ensuring dependencies. synthtool > Pulling artman image. latest: Pulling from googleapis/artman Digest: sha256:0d2f8d429110aeb8d82df6550ef4ede59d40df9062d260a1580fce688b0512bf Status: Image is up to date for googleapis/artman:latest synthtool > Cloning googleapis. Traceback (most recent call last): File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 193, in _run_module_as_main "__main__", mod_spec) File "/home/kbuilder/.pyenv/versions/3.6.1/lib/python3.6/runpy.py", line 85, in _run_code exec(code, run_globals) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 87, in <module> main() File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 764, in __call__ return self.main(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 717, in main rv = self.invoke(ctx) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 956, in invoke return ctx.invoke(self.callback, **ctx.params) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 555, in invoke return callback(*args, **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 79, in main spec.loader.exec_module(synth_module) # type: ignore File "<frozen importlib._bootstrap_external>", line 678, in exec_module File "<frozen importlib._bootstrap>", line 205, in _call_with_frames_removed File 
"/tmpfs/src/git/autosynth/working_repo/videointelligence/synth.py", line 34, in <module> include_protos=True, File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 50, in py_library return self._generate_code(service, version, "python", **kwargs) File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/gcp/gapic_generator.py", line 121, in _generate_code f"Unable to find configuration yaml file: {(googleapis / config_path)}." FileNotFoundError: Unable to find configuration yaml file: /home/kbuilder/.cache/synthtool/googleapis/google/cloud/videointelligence/artman_videointelligence_v1beta1.yaml. synthtool > Cleaned up 1 temporary directories. synthtool > Wrote metadata to synth.metadata. Synthesis failed ``` Google internal developers can see the full log [here](https://sponge/1e1314a6-ec45-4c8e-a82f-a9053780cb23).
process
synthesis failed for videointelligence hello autosynth couldn t regenerate videointelligence broken heart here s the output from running synth py cloning into working repo switched to branch autosynth videointelligence running synthtool synthtool executing tmpfs src git autosynth working repo videointelligence synth py synthtool ensuring dependencies synthtool pulling artman image latest pulling from googleapis artman digest status image is up to date for googleapis artman latest synthtool cloning googleapis traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo videointelligence synth py line in include protos true file tmpfs src git autosynth env lib site packages synthtool gcp gapic generator py line in py library return self generate code service version python kwargs file tmpfs src git autosynth env lib site packages synthtool gcp gapic generator py line in generate code f unable to find configuration yaml file googleapis config path filenotfounderror unable to find configuration yaml file home kbuilder cache synthtool googleapis google cloud videointelligence artman 
videointelligence yaml synthtool cleaned up temporary directories synthtool wrote metadata to synth metadata synthesis failed google internal developers can see the full log
1
87,704
17,361,706,436
IssuesEvent
2021-07-29 21:44:22
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
[Mono] AOT compiler segfaults when building tests
area-Codegen-AOT-mono blocking-clean-ci
Discovered in https://github.com/dotnet/runtime/pull/56316 When we go to AOT our functional tests, the aot compiler crashes with a segmentation fault (11). When peeling back what's going on, the backtrace shows us: ``` * thread #1, name = 'tid_103', queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x44) frame #0: 0x00000001001487e1 mono-aot-cross`create_jit_info(cfg=<unavailable>, method_to_compile=<unavailable>) at mini.c:2595:18 [opt] 2592 int end_offset; 2593 if (ec->handler_offset + ec->handler_len < header->code_size) { 2594 tblock = cfg->cil_offset_to_bb [ec->handler_offset + ec->handler_len]; -> 2595 if (tblock->native_offset) { 2596 end_offset = tblock->native_offset; 2597 } else { 2598 int j, end; * thread #1, name = 'tid_103', queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x44) * frame #0: 0x00000001001487e1 mono-aot-cross`create_jit_info(cfg=<unavailable>, method_to_compile=<unavailable>) at mini.c:2595:18 [opt] frame #1: 0x000000010014792c mono-aot-cross`mini_method_compile(method=<unavailable>, opts=<unavailable>, flags=JIT_FLAG_AOT | JIT_FLAG_FULL_AOT | JIT_FLAG_LLVM, parts=0, aot_method_index=179) at mini.c:3924:2 [opt] frame #2: 0x00000001001d9f1d mono-aot-cross`compile_method(acfg=0x0000000101008c00, method=0x0000000101a0fca0) at aot-compiler.c:8884:8 [opt] frame #3: 0x00000001001ca8cb mono-aot-cross`mono_compile_assembly [inlined] compile_methods(acfg=0x0000000101008c00) at aot-compiler.c:12356:3 [opt] frame #4: 0x00000001001ca74c mono-aot-cross`mono_compile_assembly(ass=0x0000000100f09970, opts=<unavailable>, aot_options=<unavailable>, global_aot_state=0x00007ffeefbff068) at aot-compiler.c:14125:2 [opt] frame #5: 0x00000001001b962c mono-aot-cross`mono_main at driver.c:1434:10 [opt] frame #6: 0x00000001001b9598 mono-aot-cross`mono_main(argc=<unavailable>, argv=<unavailable>) at driver.c:2682:3 [opt] frame #7: 0x0000000100003a73 mono-aot-cross`main [inlined] 
mono_main_with_options(argc=<unavailable>, argv=<unavailable>) at main.c:54:9 [opt] frame #8: 0x0000000100003a5f mono-aot-cross`main(argc=5, argv=0x00007ffeefbff2a0) at main.c:397:9 [opt] frame #9: 0x00007fff203fbf5d libdyld.dylib`start + 1 thread #2, name = 'SGen worker' frame #0: 0x00007fff203adcde libsystem_kernel.dylib`__psynch_cvwait + 10 frame #1: 0x00007fff203e0e49 libsystem_pthread.dylib`_pthread_cond_wait + 1298 frame #2: 0x000000010013e5e3 mono-aot-cross`thread_func [inlined] mono_os_cond_wait(cond=<unavailable>, mutex=<unavailable>) at mono-os-mutex.h:219:8 [opt] frame #3: 0x000000010013e5c8 mono-aot-cross`thread_func at sgen-thread-pool.c:167:3 [opt] frame #4: 0x000000010013e4cb mono-aot-cross`thread_func(data=<unavailable>) at sgen-thread-pool.c:198:3 [opt] frame #5: 0x00007fff203e08fc libsystem_pthread.dylib`_pthread_start + 224 frame #6: 0x00007fff203dc443 libsystem_pthread.dylib`thread_start + 15 ``` If we go to the 2nd frame and print the important method details, we get: ``` method->name = "Trim" method->klass->name_space = "System.Buffers" method->klass->name = "TlsOverPerCoreLockedStacksArrayPool`1" ``` Which leads to this block that was changed in the PR #56316 https://github.com/dotnet/runtime/blob/96673df0d1ba24cf2be4ec9529a6ba54f7d97902/src/libraries/System.Private.CoreLib/src/System/Buffers/TlsOverPerCoreLockedStacksArrayPool.cs#L187-L280 @vargaz it looks like foreach may be causing problems under certain conditions.
1.0
[Mono] AOT compiler segfaults when building tests - Discovered in https://github.com/dotnet/runtime/pull/56316 When we go to AOT our functional tests, the aot compiler crashes with a segmentation fault (11). When peeling back what's going on, the backtrace shows us: ``` * thread #1, name = 'tid_103', queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x44) frame #0: 0x00000001001487e1 mono-aot-cross`create_jit_info(cfg=<unavailable>, method_to_compile=<unavailable>) at mini.c:2595:18 [opt] 2592 int end_offset; 2593 if (ec->handler_offset + ec->handler_len < header->code_size) { 2594 tblock = cfg->cil_offset_to_bb [ec->handler_offset + ec->handler_len]; -> 2595 if (tblock->native_offset) { 2596 end_offset = tblock->native_offset; 2597 } else { 2598 int j, end; * thread #1, name = 'tid_103', queue = 'com.apple.main-thread', stop reason = EXC_BAD_ACCESS (code=1, address=0x44) * frame #0: 0x00000001001487e1 mono-aot-cross`create_jit_info(cfg=<unavailable>, method_to_compile=<unavailable>) at mini.c:2595:18 [opt] frame #1: 0x000000010014792c mono-aot-cross`mini_method_compile(method=<unavailable>, opts=<unavailable>, flags=JIT_FLAG_AOT | JIT_FLAG_FULL_AOT | JIT_FLAG_LLVM, parts=0, aot_method_index=179) at mini.c:3924:2 [opt] frame #2: 0x00000001001d9f1d mono-aot-cross`compile_method(acfg=0x0000000101008c00, method=0x0000000101a0fca0) at aot-compiler.c:8884:8 [opt] frame #3: 0x00000001001ca8cb mono-aot-cross`mono_compile_assembly [inlined] compile_methods(acfg=0x0000000101008c00) at aot-compiler.c:12356:3 [opt] frame #4: 0x00000001001ca74c mono-aot-cross`mono_compile_assembly(ass=0x0000000100f09970, opts=<unavailable>, aot_options=<unavailable>, global_aot_state=0x00007ffeefbff068) at aot-compiler.c:14125:2 [opt] frame #5: 0x00000001001b962c mono-aot-cross`mono_main at driver.c:1434:10 [opt] frame #6: 0x00000001001b9598 mono-aot-cross`mono_main(argc=<unavailable>, argv=<unavailable>) at driver.c:2682:3 [opt] frame #7: 0x0000000100003a73 
mono-aot-cross`main [inlined] mono_main_with_options(argc=<unavailable>, argv=<unavailable>) at main.c:54:9 [opt] frame #8: 0x0000000100003a5f mono-aot-cross`main(argc=5, argv=0x00007ffeefbff2a0) at main.c:397:9 [opt] frame #9: 0x00007fff203fbf5d libdyld.dylib`start + 1 thread #2, name = 'SGen worker' frame #0: 0x00007fff203adcde libsystem_kernel.dylib`__psynch_cvwait + 10 frame #1: 0x00007fff203e0e49 libsystem_pthread.dylib`_pthread_cond_wait + 1298 frame #2: 0x000000010013e5e3 mono-aot-cross`thread_func [inlined] mono_os_cond_wait(cond=<unavailable>, mutex=<unavailable>) at mono-os-mutex.h:219:8 [opt] frame #3: 0x000000010013e5c8 mono-aot-cross`thread_func at sgen-thread-pool.c:167:3 [opt] frame #4: 0x000000010013e4cb mono-aot-cross`thread_func(data=<unavailable>) at sgen-thread-pool.c:198:3 [opt] frame #5: 0x00007fff203e08fc libsystem_pthread.dylib`_pthread_start + 224 frame #6: 0x00007fff203dc443 libsystem_pthread.dylib`thread_start + 15 ``` If we go to the 2nd frame and print the important method details, we get: ``` method->name = "Trim" method->klass->name_space = "System.Buffers" method->klass->name = "TlsOverPerCoreLockedStacksArrayPool`1" ``` Which leads to this block that was changed in the PR #56316 https://github.com/dotnet/runtime/blob/96673df0d1ba24cf2be4ec9529a6ba54f7d97902/src/libraries/System.Private.CoreLib/src/System/Buffers/TlsOverPerCoreLockedStacksArrayPool.cs#L187-L280 @vargaz it looks like foreach may be causing problems under certain conditions.
non_process
aot compiler segfaults when building tests discovered in when we go to aot our functional tests the aot compiler crashes with a segmentation fault when peeling back what s going on the backtrace shows us thread name tid queue com apple main thread stop reason exc bad access code address frame mono aot cross create jit info cfg method to compile at mini c int end offset if ec handler offset ec handler len code size tblock cfg cil offset to bb if tblock native offset end offset tblock native offset else int j end thread name tid queue com apple main thread stop reason exc bad access code address frame mono aot cross create jit info cfg method to compile at mini c frame mono aot cross mini method compile method opts flags jit flag aot jit flag full aot jit flag llvm parts aot method index at mini c frame mono aot cross compile method acfg method at aot compiler c frame mono aot cross mono compile assembly compile methods acfg at aot compiler c frame mono aot cross mono compile assembly ass opts aot options global aot state at aot compiler c frame mono aot cross mono main at driver c frame mono aot cross mono main argc argv at driver c frame mono aot cross main mono main with options argc argv at main c frame mono aot cross main argc argv at main c frame libdyld dylib start thread name sgen worker frame libsystem kernel dylib psynch cvwait frame libsystem pthread dylib pthread cond wait frame mono aot cross thread func mono os cond wait cond mutex at mono os mutex h frame mono aot cross thread func at sgen thread pool c frame mono aot cross thread func data at sgen thread pool c frame libsystem pthread dylib pthread start frame libsystem pthread dylib thread start if we go to the frame and print the important method details we get method name trim method klass name space system buffers method klass name tlsoverpercorelockedstacksarraypool which leads to this block that was changed in the pr vargaz it looks like foreach may be causing problems under certain conditions
0
22,669
31,896,096,974
IssuesEvent
2023-09-18 01:56:16
tdwg/dwc
https://api.github.com/repos/tdwg/dwc
closed
schemaLocation not pointing to valid path
bug Process - complete
https://github.com/tdwg/dwc/blob/03bc7e182fbce323ae66caedef6da3a1451ca6b8/docs/text/tdwg_dwc_text.xsd#L7 I guess XSD files were originally placed at `/xsd/`. Should them be moved back or perhaps placed under `/schema/` (possibly followed by subdirectories to classify by schema type)? Also noticed that https://rs.gbif.org/schema/xml.xsd fails to display on FF and Chrome, whereas the other schemas work fine. This URL is pointed by `tdwg_dwc_text.xsd`.
1.0
schemaLocation not pointing to valid path - https://github.com/tdwg/dwc/blob/03bc7e182fbce323ae66caedef6da3a1451ca6b8/docs/text/tdwg_dwc_text.xsd#L7 I guess XSD files were originally placed at `/xsd/`. Should them be moved back or perhaps placed under `/schema/` (possibly followed by subdirectories to classify by schema type)? Also noticed that https://rs.gbif.org/schema/xml.xsd fails to display on FF and Chrome, whereas the other schemas work fine. This URL is pointed by `tdwg_dwc_text.xsd`.
process
schemalocation not pointing to valid path i guess xsd files were originally placed at xsd should them be moved back or perhaps placed under schema possibly followed by subdirectories to classify by schema type also noticed that fails to display on ff and chrome whereas the other schemas work fine this url is pointed by tdwg dwc text xsd
1
19,418
25,565,499,542
IssuesEvent
2022-11-30 14:01:58
open-telemetry/opentelemetry-collector-contrib
https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
closed
Questions about opentelemetry-collector-contrib/processor/tailsamplingprocessor/
question processor/tailsampling
tailsamplingprocessor When the tailsamplingprocessor is sampling, will it only mark or discard the data?
1.0
Questions about opentelemetry-collector-contrib/processor/tailsamplingprocessor/ - tailsamplingprocessor When the tailsamplingprocessor is sampling, will it only mark or discard the data?
process
questions about opentelemetry collector contrib processor tailsamplingprocessor tailsamplingprocessor when the tailsamplingprocessor is sampling will it only mark or discard the data
1
277,058
24,045,470,205
IssuesEvent
2022-09-16 07:55:36
valory-xyz/open-autonomy
https://api.github.com/repos/valory-xyz/open-autonomy
closed
Remove acn staging dependency from e2e tests resulting in them being flaky
enhancement test
The registration e2e tests are using the `acn.staging.autonolas.tech` which results in them being flaky if an agent fails to connect. E.g.: [Fix constants in contracts main_workflow #4992](https://github.com/valory-xyz/open-autonomy/runs/8201556093?check_suite_focus=true) We may replace the dependency and set up a local p2p network for the e2e tests.
1.0
Remove acn staging dependency from e2e tests resulting in them being flaky - The registration e2e tests are using the `acn.staging.autonolas.tech` which results in them being flaky if an agent fails to connect. E.g.: [Fix constants in contracts main_workflow #4992](https://github.com/valory-xyz/open-autonomy/runs/8201556093?check_suite_focus=true) We may replace the dependency and set up a local p2p network for the e2e tests.
non_process
remove acn staging dependency from tests resulting in them being flaky the registration tests are using the acn staging autonolas tech which results in them being flaky if an agent fails to connect e g we may replace the dependency and set up a local network for the tests
0
519
2,994,310,888
IssuesEvent
2015-07-22 10:54:42
genomizer/genomizer-server
https://api.github.com/repos/genomizer/genomizer-server
closed
Conversion of wig to sgr is wrong
bug Data Storage High priority Processing
The sgr file coordinate should be the rounded average of the two corresponding coordinates in the wig file (any rounding is fine). A wig line like this: `chr2L 66 159 1.00` should be translated to a sgr line like this: `chr2L 113 1.00` Now its translated like this: `chr2L 66 1.00`
1.0
Conversion of wig to sgr is wrong - The sgr file coordinate should be the rounded average of the two corresponding coordinates in the wig file (any rounding is fine). A wig line like this: `chr2L 66 159 1.00` should be translated to a sgr line like this: `chr2L 113 1.00` Now its translated like this: `chr2L 66 1.00`
process
conversion of wig to sgr is wrong the sgr file coordinate should be the rounded average of the two corresponding coordinates in the wig file any rounding is fine a wig line like this should be translated to a sgr line like this now its translated like this
1
47,502
7,329,624,651
IssuesEvent
2018-03-05 06:17:31
PaddlePaddle/Paddle
https://api.github.com/repos/PaddlePaddle/Paddle
closed
Add guide for documentation-2-"安装与编译"
documentation
- **Task** Here is the reclassified documentation structure, please add a guide for "[安装与编译](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/index_cn.html)", introducing what is "安装与编译" , and the general idea/function/target reader of each part.("安装与运行"and“从源码编译”) 请注意:"安装与编译"其中的两篇文档随后会合并为一篇,请参考https://github.com/PaddlePaddle/Paddle/issues/8273, 最后,安装与编译应该包括三篇文档: 使用pip安装 使用Docker安装运行 从源码编译(docker环境) 概述应与文档结构保持一致~ - **What is guide?** It is an introduction of each part of one module, also, a brief introduction of this module is necessary. Take [TensorFlow ](https://www.tensorflow.org/get_started/)for example: ![image](https://user-images.githubusercontent.com/35982308/35895851-42d796f6-0bf4-11e8-87c6-23af07049500.png) - **Note** **Language:** Please write in **Chinese.** **Timeline:** Draft: 1st Mar Review: 8th Mar Deadline:15th Mar **Documentation structure mind map:** [mind map_cn](https://shimo.im/docs/nmHJeHAqOEkHqorN) [mind map_en](https://shimo.im/docs/EsuUqsyp2vUEQYov) **请注意** 直接改类似 doc/howto/capi/index_cn.rst的rst文件,不是新加一个md文件 - **Background** About guide, we have discussed in our [MRD](https://note.youdao.com/share/?token=83F0AB634A7749E6B4676F823D6EC343&gid=69922175#/), see page 6 Our goal is to make it clear and understandable for developers to use. English version is also necessary, however, our plan is writing Chinese version this time. Thank you so much! ------------------------------------------------------------------------------------------ **Project Background** Optimize documentation of PaddlePaddle, enhancing user experience -----Enhancement Plan: [Stage 1 MRD](https://note.youdao.com/share/?token=83F0AB634A7749E6B4676F823D6EC343&gid=69922175#/) -----[Enhancement Schedule(Update everyday)](https://shimo.im/sheet/VnyT2IKRHjoU1a6i/6HGyl// ) -----[Guide-adding Plan](https://shimo.im/sheet/OlPVSI2xoxsEilt3)
1.0
Add guide for documentation-2-"安装与编译" - - **Task** Here is the reclassified documentation structure, please add a guide for "[安装与编译](http://www.paddlepaddle.org/docs/develop/documentation/zh/build_and_install/index_cn.html)", introducing what is "安装与编译" , and the general idea/function/target reader of each part.("安装与运行"and“从源码编译”) 请注意:"安装与编译"其中的两篇文档随后会合并为一篇,请参考https://github.com/PaddlePaddle/Paddle/issues/8273, 最后,安装与编译应该包括三篇文档: 使用pip安装 使用Docker安装运行 从源码编译(docker环境) 概述应与文档结构保持一致~ - **What is guide?** It is an introduction of each part of one module, also, a brief introduction of this module is necessary. Take [TensorFlow ](https://www.tensorflow.org/get_started/)for example: ![image](https://user-images.githubusercontent.com/35982308/35895851-42d796f6-0bf4-11e8-87c6-23af07049500.png) - **Note** **Language:** Please write in **Chinese.** **Timeline:** Draft: 1st Mar Review: 8th Mar Deadline:15th Mar **Documentation structure mind map:** [mind map_cn](https://shimo.im/docs/nmHJeHAqOEkHqorN) [mind map_en](https://shimo.im/docs/EsuUqsyp2vUEQYov) **请注意** 直接改类似 doc/howto/capi/index_cn.rst的rst文件,不是新加一个md文件 - **Background** About guide, we have discussed in our [MRD](https://note.youdao.com/share/?token=83F0AB634A7749E6B4676F823D6EC343&gid=69922175#/), see page 6 Our goal is to make it clear and understandable for developers to use. English version is also necessary, however, our plan is writing Chinese version this time. Thank you so much! ------------------------------------------------------------------------------------------ **Project Background** Optimize documentation of PaddlePaddle, enhancing user experience -----Enhancement Plan: [Stage 1 MRD](https://note.youdao.com/share/?token=83F0AB634A7749E6B4676F823D6EC343&gid=69922175#/) -----[Enhancement Schedule(Update everyday)](https://shimo.im/sheet/VnyT2IKRHjoU1a6i/6HGyl// ) -----[Guide-adding Plan](https://shimo.im/sheet/OlPVSI2xoxsEilt3)
non_process
add guide for documentation 安装与编译 task here is the reclassified documentation structure please add a guide for introducing what is 安装与编译 and the general idea function target reader of each part 安装与运行 and“从源码编译” 请注意: 安装与编译 其中的两篇文档随后会合并为一篇,请参考 最后,安装与编译应该包括三篇文档: 使用pip安装 使用docker安装运行 从源码编译(docker环境) 概述应与文档结构保持一致 what is guide it is an introduction of each part of one module also a brief introduction of this module is necessary take example note language please write in chinese timeline draft mar review mar deadline mar documentation structure mind map 请注意 直接改类似 doc howto capi index cn rst的rst文件 不是新加一个md文件 background about guide we have discussed in our see page our goal is to make it clear and understandable for developers to use english version is also necessary however our plan is writing chinese version this time thank you so much project background optimize documentation of paddlepaddle enhancing user experience enhancement plan
0
10,016
13,043,911,531
IssuesEvent
2020-07-29 03:01:41
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `GreatestDecimal` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `GreatestDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `GreatestDecimal` from TiDB - ## Description Port the scalar function `GreatestDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @breeswish ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function greatestdecimal from tidb description port the scalar function greatestdecimal from tidb to coprocessor score mentor s breeswish recommended skills rust programming learning materials already implemented expressions ported from tidb
1
68,041
8,211,424,028
IssuesEvent
2018-09-04 13:47:40
super-maps-pointer/frontend
https://api.github.com/repos/super-maps-pointer/frontend
opened
Create a logo
Design
First and foremost we need visual identity to make the app looks like an app. Easy thing to do is a logo, anything more is also welcome
1.0
Create a logo - First and foremost we need visual identity to make the app looks like an app. Easy thing to do is a logo, anything more is also welcome
non_process
create a logo first and foremost we need visual identity to make the app looks like an app easy thing to do is a logo anything more is also welcome
0
3,212
6,266,817,760
IssuesEvent
2017-07-17 04:37:58
XENON1T/pax
https://api.github.com/repos/XENON1T/pax
closed
Pax multiprocessing crash ProcessBatchQueue
bug invalid processed data io
_From @lucrlom on November 24, 2016 17:5_ I had this problem at least in two runs 4696 and 4700 processing with `massive-cax --once --run 4xxx` with pax_v6.1.0 environment: ``` `cax_v4.10.1 - 2016-11-24 04:15:23,038 [CRITICAL] Exception caught from task ProcessBatchQueue Traceback (most recent call last): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/main.py", line 120, in main task.go(args.run) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/task.py", line 65, in go self.each_run() File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/tasks/process.py", line 211, in each_run ncpus) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/tasks/process.py", line 98, in _process parallel.multiprocess_locally(n_cpus=ncpus, **pax_kwargs) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/parallel.py", line 205, in multiprocess_locally traceback) pax.exceptions.EventBlockHeapSizeExceededException: Pax multiprocessing crashed due to exception in one of the workers. 
Dumping traceback: Traceback (most recent call last): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/parallel.py", line 356, in safe_processor Processor(**kwargs) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 186, in __init__ self.run() File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 310, in run total=self.number_of_events)): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/plugins/io/Queues.py", line 97, in get_events self.max_blocks_on_heap, block_id + 1)) pax.exceptions.EventBlockHeapSizeExceededException: We have received over 250 blocks without receiving the next block id (1739) in order. Likely one of the block producers has died without telling anyone. cax_v4.10.1 - 2016-11-24 04:15:23,149 [ERROR] Pax multiprocessing crashed due to exception in one of the workers. Dumping traceback: Traceback (most recent call last): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/parallel.py", line 356, in safe_processor Processor(**kwargs) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 186, in __init__ self.run() File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 310, in run total=self.number_of_events)): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/plugins/io/Queues.py", line 97, in get_events self.max_blocks_on_heap, block_id + 1)) pax.exceptions.EventBlockHeapSizeExceededException: We have received over 250 blocks without receiving the next block id (1739) in order. Likely one of the block producers has died without telling anyone` ``` _Copied from original issue: XENON1T/cax#60_
1.0
Pax multiprocessing crash ProcessBatchQueue - _From @lucrlom on November 24, 2016 17:5_ I had this problem at least in two runs 4696 and 4700 processing with `massive-cax --once --run 4xxx` with pax_v6.1.0 environment: ``` `cax_v4.10.1 - 2016-11-24 04:15:23,038 [CRITICAL] Exception caught from task ProcessBatchQueue Traceback (most recent call last): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/main.py", line 120, in main task.go(args.run) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/task.py", line 65, in go self.each_run() File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/tasks/process.py", line 211, in each_run ncpus) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/cax-4.10.1-py3.4.egg/cax/tasks/process.py", line 98, in _process parallel.multiprocess_locally(n_cpus=ncpus, **pax_kwargs) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/parallel.py", line 205, in multiprocess_locally traceback) pax.exceptions.EventBlockHeapSizeExceededException: Pax multiprocessing crashed due to exception in one of the workers. 
Dumping traceback: Traceback (most recent call last): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/parallel.py", line 356, in safe_processor Processor(**kwargs) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 186, in __init__ self.run() File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 310, in run total=self.number_of_events)): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/plugins/io/Queues.py", line 97, in get_events self.max_blocks_on_heap, block_id + 1)) pax.exceptions.EventBlockHeapSizeExceededException: We have received over 250 blocks without receiving the next block id (1739) in order. Likely one of the block producers has died without telling anyone. cax_v4.10.1 - 2016-11-24 04:15:23,149 [ERROR] Pax multiprocessing crashed due to exception in one of the workers. Dumping traceback: Traceback (most recent call last): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/parallel.py", line 356, in safe_processor Processor(**kwargs) File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 186, in __init__ self.run() File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/core.py", line 310, in run total=self.number_of_events)): File "/project/lgrandi/anaconda3/envs/pax_v6.1.0/lib/python3.4/site-packages/pax-6.1.0-py3.4.egg/pax/plugins/io/Queues.py", line 97, in get_events self.max_blocks_on_heap, block_id + 1)) pax.exceptions.EventBlockHeapSizeExceededException: We have received over 250 blocks without receiving the next block id (1739) in order. Likely one of the block producers has died without telling anyone` ``` _Copied from original issue: XENON1T/cax#60_
process
pax multiprocessing crash processbatchqueue from lucrlom on november i had this problem at least in two runs and processing with massive cax once run with pax environment cax exception caught from task processbatchqueue traceback most recent call last file project lgrandi envs pax lib site packages cax egg cax main py line in main task go args run file project lgrandi envs pax lib site packages cax egg cax task py line in go self each run file project lgrandi envs pax lib site packages cax egg cax tasks process py line in each run ncpus file project lgrandi envs pax lib site packages cax egg cax tasks process py line in process parallel multiprocess locally n cpus ncpus pax kwargs file project lgrandi envs pax lib site packages pax egg pax parallel py line in multiprocess locally traceback pax exceptions eventblockheapsizeexceededexception pax multiprocessing crashed due to exception in one of the workers dumping traceback traceback most recent call last file project lgrandi envs pax lib site packages pax egg pax parallel py line in safe processor processor kwargs file project lgrandi envs pax lib site packages pax egg pax core py line in init self run file project lgrandi envs pax lib site packages pax egg pax core py line in run total self number of events file project lgrandi envs pax lib site packages pax egg pax plugins io queues py line in get events self max blocks on heap block id pax exceptions eventblockheapsizeexceededexception we have received over blocks without receiving the next block id in order likely one of the block producers has died without telling anyone cax pax multiprocessing crashed due to exception in one of the workers dumping traceback traceback most recent call last file project lgrandi envs pax lib site packages pax egg pax parallel py line in safe processor processor kwargs file project lgrandi envs pax lib site packages pax egg pax core py line in init self run file project lgrandi envs pax lib site packages pax egg pax core py line 
in run total self number of events file project lgrandi envs pax lib site packages pax egg pax plugins io queues py line in get events self max blocks on heap block id pax exceptions eventblockheapsizeexceededexception we have received over blocks without receiving the next block id in order likely one of the block producers has died without telling anyone copied from original issue cax
1
1,593
4,188,000,707
IssuesEvent
2016-06-23 19:15:28
opentrials/opentrials
https://api.github.com/repos/opentrials/opentrials
opened
Trial has sponsor information but we're not displaying it
Data cleaning Processors
http://explorer.opentrials.net/trials/e02fbb1d-d732-43f7-8efe-a13ec6f16bc0 If you go into the NCT source at http://explorer.opentrials.net/trials/e02fbb1d-d732-43f7-8efe-a13ec6f16bc0/records/4e7e317e-a1bb-4616-a32f-9aa569287cbd, you'll see that the sponsor is Sanofi Pasteur, but we're not displaying it. It also has secondary ids that we're not grabbing.
1.0
Trial has sponsor information but we're not displaying it - http://explorer.opentrials.net/trials/e02fbb1d-d732-43f7-8efe-a13ec6f16bc0 If you go into the NCT source at http://explorer.opentrials.net/trials/e02fbb1d-d732-43f7-8efe-a13ec6f16bc0/records/4e7e317e-a1bb-4616-a32f-9aa569287cbd, you'll see that the sponsor is Sanofi Pasteur, but we're not displaying it. It also has secondary ids that we're not grabbing.
process
trial has sponsor information but we re not displaying it if you go into the nct source at you ll see that the sponsor is sanofi pasteur but we re not displaying it it also has secondary ids that we re not grabbing
1
13,578
16,112,604,450
IssuesEvent
2021-04-28 00:22:43
CodeForPittsburgh/food-access-map-data
https://api.github.com/repos/CodeForPittsburgh/food-access-map-data
closed
new food bank data
data processing
Migrating to a new food bank datasource to solve our updating issues. GPCFB staff contribute data on free food distribution sites daily to Allegheny County's [food distribution map](https://www.arcgis.com/apps/MapSeries/index.html?appid=abaca148492b47a7ad0d5a71f5d2c5e8). The [datasets](https://covid19-resource-mapping-alcogis.hub.arcgis.com/datasets/public-covid19-food-access/data?geometry=125.156%2C-88.438%2C-125.156%2C88.438&showData=true) themselves are available for us to consume. To start with let's work on food pantries and soup kitchens. @melynnduh I am a little unclear if they are separate datasets, can you clarify? @drewlevitt I'm assigning this to you but we can give it to someone else if you're busy.
1.0
new food bank data - Migrating to a new food bank datasource to solve our updating issues. GPCFB staff contribute data on free food distribution sites daily to Allegheny County's [food distribution map](https://www.arcgis.com/apps/MapSeries/index.html?appid=abaca148492b47a7ad0d5a71f5d2c5e8). The [datasets](https://covid19-resource-mapping-alcogis.hub.arcgis.com/datasets/public-covid19-food-access/data?geometry=125.156%2C-88.438%2C-125.156%2C88.438&showData=true) themselves are available for us to consume. To start with let's work on food pantries and soup kitchens. @melynnduh I am a little unclear if they are separate datasets, can you clarify? @drewlevitt I'm assigning this to you but we can give it to someone else if you're busy.
process
new food bank data migrating to a new food bank datasource to solve our updating issues gpcfb staff contribute data on free food distribution sites daily to allegheny county s the themselves are available for us to consume to start with let s work on food pantries and soup kitchens melynnduh i am a little unclear if they are separate datasets can you clarify drewlevitt i m assigning this to you but we can give it to someone else if you re busy
1
476,245
13,735,860,393
IssuesEvent
2020-10-05 10:49:30
threefoldtech/tfgateway
https://api.github.com/repos/threefoldtech/tfgateway
opened
Document proper DNS configuration for tfgateway deployment
priority_critical type_bug
Update the main README with a complete explanation of how to properly configure the tfgateway DNS for it to work properly
1.0
Document proper DNS configuration for tfgateway deployment - Update the main README with a complete explanation of how to properly configure the tfgateway DNS for it to work properly
non_process
document proper dns configuration for tfgateway deployment update the main readme with a complete explanation of how to properly configure the tfgateway dns for it to work properly
0
11,407
14,238,849,541
IssuesEvent
2020-11-18 19:14:39
pingcap/tidb
https://api.github.com/repos/pingcap/tidb
closed
select distinct datetime bug
component/coprocessor severity/major sig/execution status/future type/bug type/compatibility
## Bug Report

Please answer these questions before submitting your issue. Thanks!

1. What did you do? If possible, provide a recipe for reproducing the error.

sql :
```
tidb> CREATE TABLE tb (
    ->   id BIGINT(20) PRIMARY KEY NOT NULL AUTO_INCREMENT COMMENT '自增id',
    ->   effdt DATETIME NOT NULL
    -> );
Query OK, 0 rows affected (30.12 sec)

tidb> insert tb(effdt) values("1987-04-12 00:00:00");
Query OK, 1 row affected (0.03 sec)

tidb> select distinct(effdt) from tb;
ERROR 1105 (HY000): other error: [src/coprocessor/dag/executor/mod.rs:228]: [src/coprocessor/codec/mysql/time/mod.rs:93]: '1987-4-12 0:0:0.000000000' is not a valid datetime in specified time zone
```

2. What did you expect to see?

```
+----+---------------------+
| id | effdt               |
+----+---------------------+
|  1 | 1987-04-12 00:00:00 |
+----+---------------------+
```

3. What did you see instead?

```
ERROR 1105 (HY000): other error: [src/coprocessor/dag/executor/mod.rs:228]: [src/coprocessor/codec/mysql/time/mod.rs:93]: '1987-4-12 0:0:0.000000000' is not a valid datetime in specified time zone
```

4. What version of TiDB are you using (`tidb-server -V` or run `select tidb_version();` on TiDB)?

version : 5.7.25-TiDB-v3.0.0-beta.1
1.0
select distinct datetime bug - ## Bug Report

Please answer these questions before submitting your issue. Thanks!

1. What did you do? If possible, provide a recipe for reproducing the error.

sql :
```
tidb> CREATE TABLE tb (
    ->   id BIGINT(20) PRIMARY KEY NOT NULL AUTO_INCREMENT COMMENT '自增id',
    ->   effdt DATETIME NOT NULL
    -> );
Query OK, 0 rows affected (30.12 sec)

tidb> insert tb(effdt) values("1987-04-12 00:00:00");
Query OK, 1 row affected (0.03 sec)

tidb> select distinct(effdt) from tb;
ERROR 1105 (HY000): other error: [src/coprocessor/dag/executor/mod.rs:228]: [src/coprocessor/codec/mysql/time/mod.rs:93]: '1987-4-12 0:0:0.000000000' is not a valid datetime in specified time zone
```

2. What did you expect to see?

```
+----+---------------------+
| id | effdt               |
+----+---------------------+
|  1 | 1987-04-12 00:00:00 |
+----+---------------------+
```

3. What did you see instead?

```
ERROR 1105 (HY000): other error: [src/coprocessor/dag/executor/mod.rs:228]: [src/coprocessor/codec/mysql/time/mod.rs:93]: '1987-4-12 0:0:0.000000000' is not a valid datetime in specified time zone
```

4. What version of TiDB are you using (`tidb-server -V` or run `select tidb_version();` on TiDB)?

version : 5.7.25-TiDB-v3.0.0-beta.1
process
select distinct datetime bug bug report please answer these questions before submitting your issue thanks what did you do if possible provide a recipe for reproducing the error sql tidb create table tb id bigint primary key not null auto increment comment 自增id effdt datetime not null query ok rows affected sec tidb insert tb effdt values query ok row affected sec tidb select distinct effdt from tb error other error is not a valid datetime in specified time zone what did you expect to see id effdt what did you see instead error other error is not a valid datetime in specified time zone what version of tidb are you using tidb server v or run select tidb version on tidb version tidb beta
1
3,447
6,540,036,232
IssuesEvent
2017-09-01 13:58:22
itsyouonline/identityserver
https://api.github.com/repos/itsyouonline/identityserver
closed
Fix 'publicKeys' in user information api
process_wontfix type_bug
Every property currently doesn't have any capitals in its name except for publicKeys. Change it to `publickeys` (without capital K) to be consistent with the rest of the api.
1.0
Fix 'publicKeys' in user information api - Every property currently doesn't have any capitals in its name except for publicKeys. Change it to `publickeys` (without capital K) to be consistent with the rest of the api.
process
fix publickeys in user information api every property currently doesn t have any capitals in its name except for publickeys change it to publickeys without capital k to be consistent with the rest of the api
1