| column | dtype | stats |
|---|---|---|
| Unnamed: 0 | int64 | 0 … 832k |
| id | float64 | 2.49B … 32.1B |
| type | stringclasses | 1 value |
| created_at | stringlengths | 19 … 19 |
| repo | stringlengths | 7 … 112 |
| repo_url | stringlengths | 36 … 141 |
| action | stringclasses | 3 values |
| title | stringlengths | 1 … 744 |
| labels | stringlengths | 4 … 574 |
| body | stringlengths | 9 … 211k |
| index | stringclasses | 10 values |
| text_combine | stringlengths | 96 … 211k |
| label | stringclasses | 2 values |
| text | stringlengths | 96 … 188k |
| binary_label | int64 | 0 … 1 |
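The per-column statistics in the schema summary can be reproduced programmatically. A minimal pandas sketch follows; the frame below is constructed from placeholder values in the same shape as this dump (it is not read from the actual file, whose name is unknown), and covers only a subset of the columns:

```python
import pandas as pd

# Tiny illustrative frame mirroring the column layout summarized above.
# Values are placeholders shaped like the dump's rows, not actual records.
df = pd.DataFrame({
    "id": [12439718288.0, 5313338274.0],
    "type": ["IssuesEvent", "IssuesEvent"],
    "created_at": ["2020-05-26 10:36:55", "2017-02-13 11:52:48"],
    "action": ["opened", "closed"],
    "binary_label": [1, 0],
})

# The three kinds of stats shown in the summary table:
# dtype per column, distinct-value counts ("stringclasses"),
# and min/max string lengths ("stringlengths").
dtypes = {col: str(df[col].dtype) for col in df.columns}
n_classes = {col: df[col].nunique() for col in ["type", "action"]}
len_range = (df["created_at"].str.len().min(),
             df["created_at"].str.len().max())
```

With the full dataset loaded, the same three expressions would yield the figures in the table above (e.g. `created_at` is always 19 characters, `type` has a single distinct value).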
Unnamed: 0: 9,461
id: 12,439,718,288
type: IssuesEvent
created_at: 2020-05-26 10:36:55
repo: prisma/prisma-client-js
repo_url: https://api.github.com/repos/prisma/prisma-client-js
action: opened
title: PANIC: not implemented: Arrays are not supported for sqlite.
labels: bug/2-confirmed kind/bug process/candidate
body:
Hi Prisma Team! My Prisma Client just crashed. This is the report: ## Versions | Name | Version | |----------|--------------------| | Node | v12.16.3 | | OS | darwin | | Prisma | 2.0.0-alpha.1246 | ## Logs ``` prisma-client Client Version 2.0.0-alpha.1246 +0ms prisma-client Engine Version b6745bd0f67138b4b68e00f607c99483016924c8 +0ms prisma-client { prisma-client engineConfig: { prisma-client cwd: '/Users/tim/code/repros/raw-test/prisma', prisma-client debug: false, prisma-client datamodelPath: '/Users/tim/code/repros/raw-test/node_modules/.prisma/client/schema.prisma', prisma-client prismaPath: undefined, prisma-client generator: { prisma-client name: 'client', prisma-client provider: 'prisma-client-js', prisma-client output: '/Users/tim/code/repros/raw-test/node_modules/@prisma/client', prisma-client binaryTargets: [], prisma-client config: {} prisma-client }, prisma-client showColors: false, prisma-client logLevel: undefined, prisma-client logQueries: undefined, prisma-client flags: [], prisma-client clientVersion: '2.0.0-alpha.1246' prisma-client } prisma-client } +0ms prisma-client Prisma Client call: +3ms prisma-client prisma.raw(SELECT * FROM User WHERE id = $1, [[1]]) +0ms prisma-client Generated request: +2ms prisma-client mutation { prisma-client executeRaw( prisma-client query: "SELECT * FROM User WHERE id = $1" prisma-client parameters: "[[1]]" prisma-client ) prisma-client } prisma-client +0ms plusX Execution permissions of /Users/tim/code/repros/raw-test/node_modules/.prisma/client/query-engine-darwin are fine +0ms ```
label: 1.0
text_combine:
PANIC: not implemented: Arrays are not supported for sqlite. - Hi Prisma Team! My Prisma Client just crashed. This is the report: ## Versions | Name | Version | |----------|--------------------| | Node | v12.16.3 | | OS | darwin | | Prisma | 2.0.0-alpha.1246 | ## Logs ``` prisma-client Client Version 2.0.0-alpha.1246 +0ms prisma-client Engine Version b6745bd0f67138b4b68e00f607c99483016924c8 +0ms prisma-client { prisma-client engineConfig: { prisma-client cwd: '/Users/tim/code/repros/raw-test/prisma', prisma-client debug: false, prisma-client datamodelPath: '/Users/tim/code/repros/raw-test/node_modules/.prisma/client/schema.prisma', prisma-client prismaPath: undefined, prisma-client generator: { prisma-client name: 'client', prisma-client provider: 'prisma-client-js', prisma-client output: '/Users/tim/code/repros/raw-test/node_modules/@prisma/client', prisma-client binaryTargets: [], prisma-client config: {} prisma-client }, prisma-client showColors: false, prisma-client logLevel: undefined, prisma-client logQueries: undefined, prisma-client flags: [], prisma-client clientVersion: '2.0.0-alpha.1246' prisma-client } prisma-client } +0ms prisma-client Prisma Client call: +3ms prisma-client prisma.raw(SELECT * FROM User WHERE id = $1, [[1]]) +0ms prisma-client Generated request: +2ms prisma-client mutation { prisma-client executeRaw( prisma-client query: "SELECT * FROM User WHERE id = $1" prisma-client parameters: "[[1]]" prisma-client ) prisma-client } prisma-client +0ms plusX Execution permissions of /Users/tim/code/repros/raw-test/node_modules/.prisma/client/query-engine-darwin are fine +0ms ```
index: process
text:
panic not implemented arrays are not supported for sqlite hi prisma team my prisma client just crashed this is the report versions name version node os darwin prisma alpha logs prisma client client version alpha prisma client engine version prisma client prisma client engineconfig prisma client cwd users tim code repros raw test prisma prisma client debug false prisma client datamodelpath users tim code repros raw test node modules prisma client schema prisma prisma client prismapath undefined prisma client generator prisma client name client prisma client provider prisma client js prisma client output users tim code repros raw test node modules prisma client prisma client binarytargets prisma client config prisma client prisma client showcolors false prisma client loglevel undefined prisma client logqueries undefined prisma client flags prisma client clientversion alpha prisma client prisma client prisma client prisma client call prisma client prisma raw select from user where id prisma client generated request prisma client mutation prisma client executeraw prisma client query select from user where id prisma client parameters prisma client prisma client prisma client plusx execution permissions of users tim code repros raw test node modules prisma client query engine darwin are fine
binary_label: 1
Unnamed: 0: 137,617
id: 5,313,338,274
type: IssuesEvent
created_at: 2017-02-13 11:52:48
repo: ActionFPS/ActionFPS-Game
repo_url: https://api.github.com/repos/ActionFPS/ActionFPS-Game
action: closed
title: Authentication link leads to 'authentication failed' on Mac OS
labels: bug high-priority
body:
## Why this is an important problem Can't authenticate at all. Ketar had to make a new account (id=kkz) ## Expected behavior Clicking on the authentication link should start the game, set authentication data and connect successfully to the server. ## Actual behavior The server rejects the connection (authentication failed) ## Steps to reproduce behavior @ketarXVII please describe.
label: 1.0
text_combine:
Authentication link leads to 'authentication failed' on Mac OS - ## Why this is an important problem Can't authenticate at all. Ketar had to make a new account (id=kkz) ## Expected behavior Clicking on the authentication link should start the game, set authentication data and connect successfully to the server. ## Actual behavior The server rejects the connection (authentication failed) ## Steps to reproduce behavior @ketarXVII please describe.
index: non_process
text:
authentication link leads to authentication failed on mac os why this is an important problem can t authenticate at all ketar had to make a new account id kkz expected behavior clicking on the authentication link should start the game set authentication data and connect successfully to the server actual behavior the server rejects the connection authentication failed steps to reproduce behavior ketarxvii please describe
binary_label: 0
Unnamed: 0: 221,057
id: 24,590,574,507
type: IssuesEvent
created_at: 2022-10-14 01:30:51
repo: kapseliboi/dapp
repo_url: https://api.github.com/repos/kapseliboi/dapp
action: opened
title: CVE-2022-37601 (High) detected in loader-utils-1.2.3.tgz, loader-utils-1.4.0.tgz
labels: security vulnerability
body:
## CVE-2022-37601 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>loader-utils-1.2.3.tgz</b>, <b>loader-utils-1.4.0.tgz</b></p></summary> <p> <details><summary><b>loader-utils-1.2.3.tgz</b></p></summary> <p>utils for webpack loaders</p> <p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/adjust-sourcemap-loader/node_modules/loader-utils/package.json,/node_modules/react-dev-utils/node_modules/loader-utils/package.json,/node_modules/resolve-url-loader/node_modules/loader-utils/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.2.tgz (Root Library) - resolve-url-loader-3.1.1.tgz - :x: **loader-utils-1.2.3.tgz** (Vulnerable Library) </details> <details><summary><b>loader-utils-1.4.0.tgz</b></p></summary> <p>utils for webpack loaders</p> <p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/loader-utils/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.2.tgz (Root Library) - css-loader-3.4.2.tgz - :x: **loader-utils-1.4.0.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js. 
<p>Publish Date: 2022-10-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-10-12</p> <p>Fix Resolution (loader-utils): 2.0.0</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.1</p><p>Fix Resolution (loader-utils): 2.0.0</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: True
text_combine:
CVE-2022-37601 (High) detected in loader-utils-1.2.3.tgz, loader-utils-1.4.0.tgz - ## CVE-2022-37601 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>loader-utils-1.2.3.tgz</b>, <b>loader-utils-1.4.0.tgz</b></p></summary> <p> <details><summary><b>loader-utils-1.2.3.tgz</b></p></summary> <p>utils for webpack loaders</p> <p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.2.3.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/adjust-sourcemap-loader/node_modules/loader-utils/package.json,/node_modules/react-dev-utils/node_modules/loader-utils/package.json,/node_modules/resolve-url-loader/node_modules/loader-utils/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.2.tgz (Root Library) - resolve-url-loader-3.1.1.tgz - :x: **loader-utils-1.2.3.tgz** (Vulnerable Library) </details> <details><summary><b>loader-utils-1.4.0.tgz</b></p></summary> <p>utils for webpack loaders</p> <p>Library home page: <a href="https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz">https://registry.npmjs.org/loader-utils/-/loader-utils-1.4.0.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/loader-utils/package.json</p> <p> Dependency Hierarchy: - react-scripts-3.4.2.tgz (Root Library) - css-loader-3.4.2.tgz - :x: **loader-utils-1.4.0.tgz** (Vulnerable Library) </details> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Prototype pollution vulnerability in function parseQuery in parseQuery.js in webpack loader-utils 2.0.0 via the name variable in parseQuery.js. 
<p>Publish Date: 2022-10-12 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-37601>CVE-2022-37601</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Release Date: 2022-10-12</p> <p>Fix Resolution (loader-utils): 2.0.0</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.1</p><p>Fix Resolution (loader-utils): 2.0.0</p> <p>Direct dependency fix Resolution (react-scripts): 5.0.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: non_process
text:
cve high detected in loader utils tgz loader utils tgz cve high severity vulnerability vulnerable libraries loader utils tgz loader utils tgz loader utils tgz utils for webpack loaders library home page a href path to dependency file package json path to vulnerable library node modules adjust sourcemap loader node modules loader utils package json node modules react dev utils node modules loader utils package json node modules resolve url loader node modules loader utils package json dependency hierarchy react scripts tgz root library resolve url loader tgz x loader utils tgz vulnerable library loader utils tgz utils for webpack loaders library home page a href path to dependency file package json path to vulnerable library node modules loader utils package json dependency hierarchy react scripts tgz root library css loader tgz x loader utils tgz vulnerable library found in base branch master vulnerability details prototype pollution vulnerability in function parsequery in parsequery js in webpack loader utils via the name variable in parsequery js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution loader utils direct dependency fix resolution react scripts fix resolution loader utils direct dependency fix resolution react scripts step up your open source security game with mend
binary_label: 0
Unnamed: 0: 16,697
id: 21,795,783,774
type: IssuesEvent
created_at: 2022-05-15 15:59:53
repo: OWASP/ASVS
repo_url: https://api.github.com/repos/OWASP/ASVS
action: reopened
title: ASVS v5.0 - mappings to other standards
labels: 5.0 5.0 draft process
body:
I opened issue based on wish/requirement/proposal from Rob van der Veer. A lot of different standards or documentation have their own numeration and as an end user it would be nice to have mapping between them. Some mapping related hanging issues: * https://github.com/OWASP/ASVS/issues/810 * https://github.com/OWASP/ASVS/issues/1025 The question is - how much ASVS should reference and map to other standards. Hard work for mapping is done in https://www.opencre.org/ and it does not seem to make sense to duplicate this work or duplicate this storage in ASVS - potentially it just run out of sync really fast and then there are a lot of conflicts and confusion between CRE and ASVS. In my opinion - we need web output. Related issue: https://github.com/OWASP/ASVS/issues/895 For the web output, we can do this mapping with some script - take mapping based on ASVS requirement number and display link(s) to mappings or show mapped requirements to web directly. We don't have any need to duplicate mapping to ASVS markdown content, we just need to create web output and use existing mapping information from CRE. CRE covers: WSTG, CWE, Pro-active controls, cheat sheets, NIST 65, NIST 53B, Top 10 (2017, 2021 is underway). Mapping to ASVS content makes sense only when we need to point to the source of the requirement - like quite many authentication related requirements come from NIST.
label: 1.0
text_combine:
ASVS v5.0 - mappings to other standards - I opened issue based on wish/requirement/proposal from Rob van der Veer. A lot of different standards or documentation have their own numeration and as an end user it would be nice to have mapping between them. Some mapping related hanging issues: * https://github.com/OWASP/ASVS/issues/810 * https://github.com/OWASP/ASVS/issues/1025 The question is - how much ASVS should reference and map to other standards. Hard work for mapping is done in https://www.opencre.org/ and it does not seem to make sense to duplicate this work or duplicate this storage in ASVS - potentially it just run out of sync really fast and then there are a lot of conflicts and confusion between CRE and ASVS. In my opinion - we need web output. Related issue: https://github.com/OWASP/ASVS/issues/895 For the web output, we can do this mapping with some script - take mapping based on ASVS requirement number and display link(s) to mappings or show mapped requirements to web directly. We don't have any need to duplicate mapping to ASVS markdown content, we just need to create web output and use existing mapping information from CRE. CRE covers: WSTG, CWE, Pro-active controls, cheat sheets, NIST 65, NIST 53B, Top 10 (2017, 2021 is underway). Mapping to ASVS content makes sense only when we need to point to the source of the requirement - like quite many authentication related requirements come from NIST.
index: process
text:
asvs mappings to other standards i opened issue based on wish requirement proposal from rob van der veer a lot of different standards or documentation have their own numeration and as an end user it would be nice to have mapping between them some mapping related hanging issues the question is how much asvs should reference and map to other standards hard work for mapping is done in and it does not seem to make sense to duplicate this work or duplicate this storage in asvs potentially it just run out of sync really fast and then there are a lot of conflicts and confusion between cre and asvs in my opinion we need web output related issue for the web output we can do this mapping with some script take mapping based on asvs requirement number and display link s to mappings or show mapped requirements to web directly we don t have any need to duplicate mapping to asvs markdown content we just need to create web output and use existing mapping information from cre cre covers wstg cwe pro active controls cheat sheets nist nist top is underway mapping to asvs content makes sense only when we need to point to the source of the requirement like quite many authentication related requirements come from nist
binary_label: 1
Unnamed: 0: 86,842
id: 17,090,605,239
type: IssuesEvent
created_at: 2021-07-08 16:55:10
repo: ESCOMP/CTSM
repo_url: https://api.github.com/repos/ESCOMP/CTSM
action: closed
title: dbug interface in decompInitMod.F90 should be redone...
labels: closed: wontfix tag: next type: code cleanup type: enhancement
body:
There is a hardcoded debug output interface that's embedded into decompInitMod.F90 that's hard to follow and use. @mvertens did some work on this in #1420 and actually added her own write statements rather than use the existing structure. We do need some flexibility here, because for large processor counts we need the ability to turn off the output about the decomposition. But, the mechanism that's in place isn't very useful as it is.
label: 1.0
text_combine:
dbug interface in decompInitMod.F90 should be redone... - There is a hardcoded debug output interface that's embedded into decompInitMod.F90 that's hard to follow and use. @mvertens did some work on this in #1420 and actually added her own write statements rather than use the existing structure. We do need some flexibility here, because for large processor counts we need the ability to turn off the output about the decomposition. But, the mechanism that's in place isn't very useful as it is.
index: non_process
text:
dbug interface in decompinitmod should be redone there is a hardcoded debug output interface that s embedded into decompinitmod that s hard to follow and use mvertens did some work on this in and actually added her own write statements rather than use the existing structure we do need some flexibility here because for large processor counts we need the ability to turn off the output about the decomposition but the mechanism that s in place isn t very useful as it is
binary_label: 0
Unnamed: 0: 19,336
id: 25,472,637,056
type: IssuesEvent
created_at: 2022-11-25 11:33:27
repo: prisma/prisma
repo_url: https://api.github.com/repos/prisma/prisma
action: opened
title: multiSchema + introspection: add --schemas param to the db pull command
labels: kind/feature process/candidate topic: cli topic: introspection tech/engines/introspection engine team/schema topic: prisma db pull topic: multiSchema
body:
Currently, the only way to specify the array of namespaces to introspect is in the datasource block inside a Prisma schema. This would be required for full support of multiSchema with `db pull --url`.
label: 1.0
text_combine:
multiSchema + introspection: add --schemas param to the db pull command - Currently, the only way to specify the array of namespaces to introspect is in the datasource block inside a Prisma schema. This would be required for full support of multiSchema with `db pull --url`.
index: process
text:
multischema introspection add schemas param to the db pull command currently the only way to specify the array of namespaces to introspect is in the datasource block inside a prisma schema this would be required for full support of multischema with db pull url
binary_label: 1
Unnamed: 0: 90,755
id: 10,696,126,044
type: IssuesEvent
created_at: 2019-10-23 14:13:37
repo: hyperium/tonic
repo_url: https://api.github.com/repos/hyperium/tonic
action: closed
title: Add readme rustdoc tests to CI
labels: documentation good first issue help wanted
body:
# Motivation Ensure the readme code compiles and is up to date. # Solution I think we can call rustdoc directly and test the readme in its own CI stage.
label: 1.0
text_combine:
Add readme rustdoc tests to CI - # Motivation Ensure the readme code compiles and is up to date. # Solution I think we can call rustdoc directly and test the readme in its own CI stage.
index: non_process
text:
add readme rustdoc tests to ci motivation ensure the readme code compiles and is up to date solution i think we can call rustdoc directly and test the readme in its own ci stage
binary_label: 0
Unnamed: 0: 9,048
id: 12,551,490,050
type: IssuesEvent
created_at: 2020-06-06 14:56:42
repo: teddywilson/defund12.org
repo_url: https://api.github.com/repos/teddywilson/defund12.org
action: closed
title: Eureka - Letter to Mayor and Council Members
labels: meets-issue-requirements
body:
To: sseaman@ci.eureka.ca.gov, lcastellano@ci.eureka.ca.gov, hmessner@ci.eureka.ca.gov, kbergel@ci.eureka.ca.gov, aallison@ci.eureka.ca.gov, narroyo@ci.eureka.ca.gov, cityclerk@ci.eureka.ca.gov Subject: Commit to reallocate for social equity Message: To whom it may concern, I am a resident of Eureka's [X] Ward. I am writing to demand that the City Council adopt a budget strategy that prioritizes community well-being and redirects funding away from the police in the next budget evaluation period. We have seen mounting evidence that police departments are ineffective institutions that marginalize minority communities and put citizens at risk of injury and death, yet the police budget accounts for 46% of our general fund (see this article). I ask that you redirect the majority of the $14.2M allotted for crime prevention toward community programs that provide citizens with basic human needs, like affordable healthcare and housing. We don’t need a militarized police force. We need to create a space in which more mental health service providers, social workers, victim/survivor advocates, religious leaders, neighbors, and friends - all of the people who really make up our community - can look out for one another. This is of course a long transition process, but real, actionable change starts with reallocating funding and investing in inclusive and diverse support strategies for our community. As the City Council, the budget proposal is in your hands. It is your duty to represent your constituents. I am urging you to completely revise the budget for the 2020-2021 fiscal year. We can be a beacon for other cities to follow if only we have the courage to change. Thank you for your time, [NAME] [ADDRESS] [PHONE NUMBER] [EMAIL ADDRESS]
label: 1.0
text_combine:
Eureka - Letter to Mayor and Council Members - To: sseaman@ci.eureka.ca.gov, lcastellano@ci.eureka.ca.gov, hmessner@ci.eureka.ca.gov, kbergel@ci.eureka.ca.gov, aallison@ci.eureka.ca.gov, narroyo@ci.eureka.ca.gov, cityclerk@ci.eureka.ca.gov Subject: Commit to reallocate for social equity Message: To whom it may concern, I am a resident of Eureka's [X] Ward. I am writing to demand that the City Council adopt a budget strategy that prioritizes community well-being and redirects funding away from the police in the next budget evaluation period. We have seen mounting evidence that police departments are ineffective institutions that marginalize minority communities and put citizens at risk of injury and death, yet the police budget accounts for 46% of our general fund (see this article). I ask that you redirect the majority of the $14.2M allotted for crime prevention toward community programs that provide citizens with basic human needs, like affordable healthcare and housing. We don’t need a militarized police force. We need to create a space in which more mental health service providers, social workers, victim/survivor advocates, religious leaders, neighbors, and friends - all of the people who really make up our community - can look out for one another. This is of course a long transition process, but real, actionable change starts with reallocating funding and investing in inclusive and diverse support strategies for our community. As the City Council, the budget proposal is in your hands. It is your duty to represent your constituents. I am urging you to completely revise the budget for the 2020-2021 fiscal year. We can be a beacon for other cities to follow if only we have the courage to change. Thank you for your time, [NAME] [ADDRESS] [PHONE NUMBER] [EMAIL ADDRESS]
index: non_process
text:
eureka letter to mayor and council members to sseaman ci eureka ca gov lcastellano ci eureka ca gov hmessner ci eureka ca gov kbergel ci eureka ca gov aallison ci eureka ca gov narroyo ci eureka ca gov cityclerk ci eureka ca gov subject commit to reallocate for social equity message to whom it may concern i am a resident of eureka s ward i am writing to demand that the city council adopt a budget strategy that prioritizes community well being and redirects funding away from the police in the next budget evaluation period we have seen mounting evidence that police departments are ineffective institutions that marginalize minority communities and put citizens at risk of injury and death yet the police budget accounts for of our general fund see this article i ask that you redirect the majority of the allotted for crime prevention toward community programs that provide citizens with basic human needs like affordable healthcare and housing we don’t need a militarized police force we need to create a space in which more mental health service providers social workers victim survivor advocates religious leaders neighbors and friends all of the people who really make up our community can look out for one another this is of course a long transition process but real actionable change starts with reallocating funding and investing in inclusive and diverse support strategies for our community as the city council the budget proposal is in your hands it is your duty to represent your constituents i am urging you to completely revise the budget for the fiscal year we can be a beacon for other cities to follow if only we have the courage to change thank you for your time
binary_label: 0
Unnamed: 0: 7,145
id: 3,934,687,211
type: IssuesEvent
created_at: 2016-04-25 23:58:24
repo: jens-maus/yam
repo_url: https://api.github.com/repos/jens-maus/yam
action: closed
title: Crash when apply a "remote filter"
labels: @normal bug fixed GUI nightly build
body:
**Originally by _s.hawamdeh@teletu.it_ on 2014-01-29 18:59:44 +0100** ___ Seems quite easy to reproduce, in a clean installation of YAM just go in Prefs/Filters and active the "Remote filter" button It crash on latest nighty (2014-01-29)
label: 1.0
text_combine:
Crash when apply a "remote filter" - **Originally by _s.hawamdeh@teletu.it_ on 2014-01-29 18:59:44 +0100** ___ Seems quite easy to reproduce, in a clean installation of YAM just go in Prefs/Filters and active the "Remote filter" button It crash on latest nighty (2014-01-29)
index: non_process
text:
crash when apply a remote filter originally by s hawamdeh teletu it on seems quite easy to reproduce in a clean installation of yam just go in prefs filters and active the remote filter button it crash on latest nighty
binary_label: 0
Unnamed: 0: 76,389
id: 26,404,879,900
type: IssuesEvent
created_at: 2023-01-13 06:51:27
repo: vector-im/element-web
repo_url: https://api.github.com/repos/vector-im/element-web
action: opened
title: macOS: When system-wide web proxy is enabled element can't connect
labels: T-Defect
body:
### Steps to reproduce On macOS enable a system-wide web proxy as shown here: https://support.apple.com/de-de/guide/mac-help/mchlp25912/13.0/mac/13.0 ### Outcome Element app can't connect to server any more: > Verbindung zum Server wurde unterbrochen. > Nachrichten werden gespeichert und gesendet, wenn die Internetverbindung wiederhergestellt ist. until the proxy is disabled. This does **not** happen when using element in a browser! ### Operating system macOS 13.1 (22C65) ### Application version 1.11.17 (1.11.17) ### How did you install the app? Downloaded dmg from element.io ### Homeserver _No response_ ### Will you send logs? No
label: 1.0
text_combine:
macOS: When system-wide web proxy is enabled element can't connect - ### Steps to reproduce On macOS enable a system-wide web proxy as shown here: https://support.apple.com/de-de/guide/mac-help/mchlp25912/13.0/mac/13.0 ### Outcome Element app can't connect to server any more: > Verbindung zum Server wurde unterbrochen. > Nachrichten werden gespeichert und gesendet, wenn die Internetverbindung wiederhergestellt ist. until the proxy is disabled. This does **not** happen when using element in a browser! ### Operating system macOS 13.1 (22C65) ### Application version 1.11.17 (1.11.17) ### How did you install the app? Downloaded dmg from element.io ### Homeserver _No response_ ### Will you send logs? No
index: non_process
text:
macos when system wide web proxy is enabled element can t connect steps to reproduce on macos enable a system wide web proxy as shown here outcome element app can t connect to server any more verbindung zum server wurde unterbrochen nachrichten werden gespeichert und gesendet wenn die internetverbindung wiederhergestellt ist until the proxy is disabled this does not happen when using element in a browser operating system macos application version how did you install the app downloaded dmg from element io homeserver no response will you send logs no
binary_label: 0
Unnamed: 0: 12,178
id: 14,742,000,251
type: IssuesEvent
created_at: 2021-01-07 11:31:32
repo: kdjstudios/SABillingGitlab
repo_url: https://api.github.com/repos/kdjstudios/SABillingGitlab
action: closed
title: Incorrect Invoice Date - Billings, MT
labels: anc-process anp-important ant-support
body:
In GitLab by @kdjstudios on Mar 4, 2019, 14:31 **Submitted by:** "Juan Bernal" <juan.bernal@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-03-04-44437/conversation **Server:** Internal **Client/Site:** Billings **Account:** NA **Issue:** Billing for Billings, MT was finalized on 2/28/19 but the incorrect date are on the invoices. Instead of 3/1/19 it shows 2/1/19 again. Is there a way to correct this?
label: 1.0
text_combine:
Incorrect Invoice Date - Billings, MT - In GitLab by @kdjstudios on Mar 4, 2019, 14:31 **Submitted by:** "Juan Bernal" <juan.bernal@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2019-03-04-44437/conversation **Server:** Internal **Client/Site:** Billings **Account:** NA **Issue:** Billing for Billings, MT was finalized on 2/28/19 but the incorrect date are on the invoices. Instead of 3/1/19 it shows 2/1/19 again. Is there a way to correct this?
process
incorrect invoice date billings mt in gitlab by kdjstudios on mar submitted by juan bernal helpdesk server internal client site billings account na issue billing for billings mt was finalized on but the incorrect date are on the invoices instead of it shows again is there a way to correct this
1
17,642
23,466,191,704
IssuesEvent
2022-08-16 17:00:00
deephaven/deephaven-core
https://api.github.com/repos/deephaven/deephaven-core
closed
EPIC: Support dynamically processing deltas in the C++ API
epic cpp api cpp-dynamic-process
1. Define data structures, etc. 2. Produce a C++ replicated table experience
1.0
EPIC: Support dynamically processing deltas in the C++ API - 1. Define data structures, etc. 2. Produce a C++ replicated table experience
process
epic support dynamically processing deltas in the c api define data structures etc produce a c replicated table experience
1
5,491
8,359,971,025
IssuesEvent
2018-10-03 09:58:23
linnovate/root
https://api.github.com/repos/linnovate/root
closed
delete items is multiply choises
Process bug bug
select item with multiply choise that you allow to delete (you need editors premetions) select another item that you can't delete becausr you aren't editor delete two items with the delete button the two items deleted, but there is no premmetion to delete the second item.
1.0
delete items is multiply choises - select item with multiply choise that you allow to delete (you need editors premetions) select another item that you can't delete becausr you aren't editor delete two items with the delete button the two items deleted, but there is no premmetion to delete the second item.
process
delete items is multiply choises select item with multiply choise that you allow to delete you need editors premetions select another item that you can t delete becausr you aren t editor delete two items with the delete button the two items deleted but there is no premmetion to delete the second item
1
244,600
20,678,777,597
IssuesEvent
2022-03-10 11:53:05
apache/cloudstack
https://api.github.com/repos/apache/cloudstack
closed
KVM problem with volume snapshots kept only on the primary storage (non-managed storages)
component:kvm component:primary-storage status:needs-testing
##### ISSUE TYPE <!-- Pick one below and delete the rest --> * Bug Report ##### COMPONENT NAME <!-- Categorize the issue, e.g. API, VR, VPN, UI, etc. --> ~~~ volume snapshots ~~~ ##### CLOUDSTACK VERSION <!-- New line separated list of affected versions, commit ID for issues on main branch. --> ~~~ from 4.11.3.0 to main ~~~ ##### SUMMARY <!-- Explain the problem/feature briefly --> When `snapshot.backup.to.secondary` is set to `false` the snapshots are kept only on the primary storage. When you take more than one snapshot of a volume and delete this volume the snapshots are still active in the UI/DB but deleted from the storage. The problem with different primary storages is: - for the current NFS implementation is that the snapshots are internal, and they are deleted when you delete the volume - active in the UI/DB - for Ceph - the snapshots are deleted from the pool, but active in DB/UI - probably with the new Linstor driver the problem will be the same - for StorPool the snapshots are not deleted, but a random record is deleted from the `snapshot_store_ref` table and the reference to the snapshot's path is lost - with this PR - [#5297](https://github.com/apache/cloudstack/pull/5297) - the problem will be the same as the StorPool primary storage ##### STEPS TO REPRODUCE <!-- For bugs, show exactly how to reproduce the problem, using a minimal test-case. Use Screenshots if accurate. For new features, show how the feature would be used. 
--> <!-- Paste example playbooks or commands between quotes below --> ~~~ - set `snapshot.backup.to.secondary` to `false` - take a few snapshots of a volume - delete the volume The snapshots are visible in the UI, active in DB table `snapshots` and one of them is deleted in `snapshot_store_ref` DB table, because of this code in VolumeServiceImpl.deleteVolumeCallback: SnapshotDataStoreVO snapStoreVo = _snapshotStoreDao.findByVolume(vo.getId(), DataStoreRole.Primary); if (snapStoreVo != null) { long storagePoolId = snapStoreVo.getDataStoreId(); StoragePoolVO storagePoolVO = storagePoolDao.findById(storagePoolId); if (storagePoolVO.isManaged()) { DataStore primaryDataStore = dataStoreMgr.getPrimaryDataStore(storagePoolId); Map<String, String> mapCapabilities = primaryDataStore.getDriver().getCapabilities(); String value = mapCapabilities.get(DataStoreCapabilities.STORAGE_SYSTEM_SNAPSHOT.toString()); Boolean supportsStorageSystemSnapshots = new Boolean(value); if (!supportsStorageSystemSnapshots) { _snapshotStoreDao.remove(snapStoreVo.getId()); } } else { _snapshotStoreDao.remove(snapStoreVo.getId()); } } ~~~ <!-- You can also paste gist.github.com links for larger files --> ##### EXPECTED RESULTS <!-- What did you expect to happen when running the steps above? --> ~~~ Probably for all non-managed primary storages we should not delete the snapshots. For NFS this PR - #5297 will fix the problem with the internal snapshots. ~~~ ##### ACTUAL RESULTS <!-- What actually happened? --> <!-- Paste verbatim command output between quotes below --> ~~~ Snapshots are deleted from the storage, but active in the UI/DB as a garbage ~~~
1.0
KVM problem with volume snapshots kept only on the primary storage (non-managed storages) - ##### ISSUE TYPE <!-- Pick one below and delete the rest --> * Bug Report ##### COMPONENT NAME <!-- Categorize the issue, e.g. API, VR, VPN, UI, etc. --> ~~~ volume snapshots ~~~ ##### CLOUDSTACK VERSION <!-- New line separated list of affected versions, commit ID for issues on main branch. --> ~~~ from 4.11.3.0 to main ~~~ ##### SUMMARY <!-- Explain the problem/feature briefly --> When `snapshot.backup.to.secondary` is set to `false` the snapshots are kept only on the primary storage. When you take more than one snapshot of a volume and delete this volume the snapshots are still active in the UI/DB but deleted from the storage. The problem with different primary storages is: - for the current NFS implementation is that the snapshots are internal, and they are deleted when you delete the volume - active in the UI/DB - for Ceph - the snapshots are deleted from the pool, but active in DB/UI - probably with the new Linstor driver the problem will be the same - for StorPool the snapshots are not deleted, but a random record is deleted from the `snapshot_store_ref` table and the reference to the snapshot's path is lost - with this PR - [#5297](https://github.com/apache/cloudstack/pull/5297) - the problem will be the same as the StorPool primary storage ##### STEPS TO REPRODUCE <!-- For bugs, show exactly how to reproduce the problem, using a minimal test-case. Use Screenshots if accurate. For new features, show how the feature would be used. 
--> <!-- Paste example playbooks or commands between quotes below --> ~~~ - set `snapshot.backup.to.secondary` to `false` - take a few snapshots of a volume - delete the volume The snapshots are visible in the UI, active in DB table `snapshots` and one of them is deleted in `snapshot_store_ref` DB table, because of this code in VolumeServiceImpl.deleteVolumeCallback: SnapshotDataStoreVO snapStoreVo = _snapshotStoreDao.findByVolume(vo.getId(), DataStoreRole.Primary); if (snapStoreVo != null) { long storagePoolId = snapStoreVo.getDataStoreId(); StoragePoolVO storagePoolVO = storagePoolDao.findById(storagePoolId); if (storagePoolVO.isManaged()) { DataStore primaryDataStore = dataStoreMgr.getPrimaryDataStore(storagePoolId); Map<String, String> mapCapabilities = primaryDataStore.getDriver().getCapabilities(); String value = mapCapabilities.get(DataStoreCapabilities.STORAGE_SYSTEM_SNAPSHOT.toString()); Boolean supportsStorageSystemSnapshots = new Boolean(value); if (!supportsStorageSystemSnapshots) { _snapshotStoreDao.remove(snapStoreVo.getId()); } } else { _snapshotStoreDao.remove(snapStoreVo.getId()); } } ~~~ <!-- You can also paste gist.github.com links for larger files --> ##### EXPECTED RESULTS <!-- What did you expect to happen when running the steps above? --> ~~~ Probably for all non-managed primary storages we should not delete the snapshots. For NFS this PR - #5297 will fix the problem with the internal snapshots. ~~~ ##### ACTUAL RESULTS <!-- What actually happened? --> <!-- Paste verbatim command output between quotes below --> ~~~ Snapshots are deleted from the storage, but active in the UI/DB as a garbage ~~~
non_process
kvm problem with volume snapshots kept only on the primary storage non managed storages issue type bug report component name categorize the issue e g api vr vpn ui etc volume snapshots cloudstack version new line separated list of affected versions commit id for issues on main branch from to main summary when snapshot backup to secondary is set to false the snapshots are kept only on the primary storage when you take more than one snapshot of a volume and delete this volume the snapshots are still active in the ui db but deleted from the storage the problem with different primary storages is for the current nfs implementation is that the snapshots are internal and they are deleted when you delete the volume active in the ui db for ceph the snapshots are deleted from the pool but active in db ui probably with the new linstor driver the problem will be the same for storpool the snapshots are not deleted but a random record is deleted from the snapshot store ref table and the reference to the snapshot s path is lost with this pr the problem will be the same as the storpool primary storage steps to reproduce for bugs show exactly how to reproduce the problem using a minimal test case use screenshots if accurate for new features show how the feature would be used set snapshot backup to secondary to false take a few snapshots of a volume delete the volume the snapshots are visible in the ui active in db table snapshots and one of them is deleted in snapshot store ref db table because of this code in volumeserviceimpl deletevolumecallback snapshotdatastorevo snapstorevo snapshotstoredao findbyvolume vo getid datastorerole primary if snapstorevo null long storagepoolid snapstorevo getdatastoreid storagepoolvo storagepoolvo storagepooldao findbyid storagepoolid if storagepoolvo ismanaged datastore primarydatastore datastoremgr getprimarydatastore storagepoolid map mapcapabilities primarydatastore getdriver getcapabilities string value mapcapabilities get 
datastorecapabilities storage system snapshot tostring boolean supportsstoragesystemsnapshots new boolean value if supportsstoragesystemsnapshots snapshotstoredao remove snapstorevo getid else snapshotstoredao remove snapstorevo getid expected results probably for all non managed primary storages we should not delete the snapshots for nfs this pr will fix the problem with the internal snapshots actual results snapshots are deleted from the storage but active in the ui db as a garbage
0
3,557
6,588,464,123
IssuesEvent
2017-09-14 03:20:02
econtoolkit/continuous_time_methods
https://api.github.com/repos/econtoolkit/continuous_time_methods
opened
Verify notes on discretization of operators
Stochastic Processes theory
Check all of the math in the `docs/operator_discretization_finite_differences.tex` file. This has been separated from the older file, since I realized these notes will apply to all sorts of other examples. A few things to check carefully: * I added in the boundary values for both the reflection and the absorbing barrier. Go through the algebra very carefully to make sure I didn't make a mistake in them. * You will also see that I think the formulas for (7) to (11) simplify in the boundary conditions (especially in the reflection case). * Do a sanity check that the upwind procedure (i.e. the sign of of $\mu_1$ doesn't effect the reflection boundary values in (17) and (20). My suspicion is that it does not since this seems to make the math add up for each row adding to 1 - as it should as a proper markov chain. * I think I modified your case of the $\mu < 0$ specialization correctly in (21) to (28), but I would check it out to be sure.
1.0
Verify notes on discretization of operators - Check all of the math in the `docs/operator_discretization_finite_differences.tex` file. This has been separated from the older file, since I realized these notes will apply to all sorts of other examples. A few things to check carefully: * I added in the boundary values for both the reflection and the absorbing barrier. Go through the algebra very carefully to make sure I didn't make a mistake in them. * You will also see that I think the formulas for (7) to (11) simplify in the boundary conditions (especially in the reflection case). * Do a sanity check that the upwind procedure (i.e. the sign of of $\mu_1$ doesn't effect the reflection boundary values in (17) and (20). My suspicion is that it does not since this seems to make the math add up for each row adding to 1 - as it should as a proper markov chain. * I think I modified your case of the $\mu < 0$ specialization correctly in (21) to (28), but I would check it out to be sure.
process
verify notes on discretization of operators check all of the math in the docs operator discretization finite differences tex file this has been separated from the older file since i realized these notes will apply to all sorts of other examples a few things to check carefully i added in the boundary values for both the reflection and the absorbing barrier go through the algebra very carefully to make sure i didn t make a mistake in them you will also see that i think the formulas for to simplify in the boundary conditions especially in the reflection case do a sanity check that the upwind procedure i e the sign of of mu doesn t effect the reflection boundary values in and my suspicion is that it does not since this seems to make the math add up for each row adding to as it should as a proper markov chain i think i modified your case of the mu specialization correctly in to but i would check it out to be sure
1
10,241
13,099,089,251
IssuesEvent
2020-08-03 20:50:29
googleapis/google-cloud-ruby
https://api.github.com/repos/googleapis/google-cloud-ruby
closed
Migrate google-cloud-firestore to the microgenerator
api: firestore type: process
Migrate google-cloud-firestore to the microgenerator. This involves the following steps: * [x] Write synth file and generate `google-cloud-firestore-v1`. Note: do _not_ generate firestore-v1beta1. The beta client is being deprecated. * [x] Write synth file and generate `google-cloud-firestore-admin-v1`. * [x] Make sure the new libraries are configured in kokoro * [x] Release `google-cloud-firestore-v1` * [x] Release `google-cloud-firestore-admin-v1` * [x] Switch `google-cloud-firestore` backend to the versioned gems. That is: * Rip out synth and all the generated code * Add `google-cloud-firestore-v1` and `google-cloud-firestore-admin-v1` as dependencies * Update the veneer code to the microgenerator usage * [ ] Release `google-cloud-firestore` update I do not believe samples need to be updated, unless they invoke the low-level interface directly.
1.0
Migrate google-cloud-firestore to the microgenerator - Migrate google-cloud-firestore to the microgenerator. This involves the following steps: * [x] Write synth file and generate `google-cloud-firestore-v1`. Note: do _not_ generate firestore-v1beta1. The beta client is being deprecated. * [x] Write synth file and generate `google-cloud-firestore-admin-v1`. * [x] Make sure the new libraries are configured in kokoro * [x] Release `google-cloud-firestore-v1` * [x] Release `google-cloud-firestore-admin-v1` * [x] Switch `google-cloud-firestore` backend to the versioned gems. That is: * Rip out synth and all the generated code * Add `google-cloud-firestore-v1` and `google-cloud-firestore-admin-v1` as dependencies * Update the veneer code to the microgenerator usage * [ ] Release `google-cloud-firestore` update I do not believe samples need to be updated, unless they invoke the low-level interface directly.
process
migrate google cloud firestore to the microgenerator migrate google cloud firestore to the microgenerator this involves the following steps write synth file and generate google cloud firestore note do not generate firestore the beta client is being deprecated write synth file and generate google cloud firestore admin make sure the new libraries are configured in kokoro release google cloud firestore release google cloud firestore admin switch google cloud firestore backend to the versioned gems that is rip out synth and all the generated code add google cloud firestore and google cloud firestore admin as dependencies update the veneer code to the microgenerator usage release google cloud firestore update i do not believe samples need to be updated unless they invoke the low level interface directly
1
228,236
17,439,698,143
IssuesEvent
2021-08-05 01:51:28
GenericMappingTools/pygmt
https://api.github.com/repos/GenericMappingTools/pygmt
closed
Add gallery example for grdclip
documentation scipy-sprint
**Description of the desired feature** It would be great to have some **gallery examples** on using the `grdclip` function. This will be a follow up to #1261. <!-- Please be as detailed as you can in your description. If possible, include an example of how you would like to use this feature (even better if it's a code example). --> ## Examples of what you can do These are just suggestions of things that you can plot. Feel free to come up with other ideas! 1. Plot seafloor bathymetry and clip all land above sea level 2. Plot an island and remove all seafloor data ## How to make the change :monocle_face: 1. First, read up our contributing guidelines at https://www.pygmt.org/v0.4.0/contributing.html#editing-the-documentation 2. Leave a comment below, stating that you will tackle this issue. We will assign you to this issue and you can then start working on it. 3. Go to e.g. https://github.com/GenericMappingTools/pygmt/tree/master/examples/gallery/maps, and click on 'Add File'. Or you may want to do this in your code editor locally. 4. Submit a pull request, and be sure to fill it up with as much detail as possible. **Are you willing to help implement and maintain this feature?** Yes! I'm happy to help someone learn how to contribute to PyGMT! <!-- Every feature we add is code that we will have to maintain and keep updated. This takes a lot of effort. If you are willing to be involved in the project and help maintain your feature, it will make it easier for us to accept it. -->
1.0
Add gallery example for grdclip - **Description of the desired feature** It would be great to have some **gallery examples** on using the `grdclip` function. This will be a follow up to #1261. <!-- Please be as detailed as you can in your description. If possible, include an example of how you would like to use this feature (even better if it's a code example). --> ## Examples of what you can do These are just suggestions of things that you can plot. Feel free to come up with other ideas! 1. Plot seafloor bathymetry and clip all land above sea level 2. Plot an island and remove all seafloor data ## How to make the change :monocle_face: 1. First, read up our contributing guidelines at https://www.pygmt.org/v0.4.0/contributing.html#editing-the-documentation 2. Leave a comment below, stating that you will tackle this issue. We will assign you to this issue and you can then start working on it. 3. Go to e.g. https://github.com/GenericMappingTools/pygmt/tree/master/examples/gallery/maps, and click on 'Add File'. Or you may want to do this in your code editor locally. 4. Submit a pull request, and be sure to fill it up with as much detail as possible. **Are you willing to help implement and maintain this feature?** Yes! I'm happy to help someone learn how to contribute to PyGMT! <!-- Every feature we add is code that we will have to maintain and keep updated. This takes a lot of effort. If you are willing to be involved in the project and help maintain your feature, it will make it easier for us to accept it. -->
non_process
add gallery example for grdclip description of the desired feature it would be great to have some gallery examples on using the grdclip function this will be a follow up to examples of what you can do these are just suggestions of things that you can plot feel free to come up with other ideas plot seafloor bathymetry and clip all land above sea level plot an island and remove all seafloor data how to make the change monocle face first read up our contributing guidelines at leave a comment below stating that you will tackle this issue we will assign you to this issue and you can then start working on it go to e g and click on add file or you may want to do this in your code editor locally submit a pull request and be sure to fill it up with as much detail as possible are you willing to help implement and maintain this feature yes i m happy to help someone learn how to contribute to pygmt
0
186,287
15,053,714,248
IssuesEvent
2021-02-03 16:35:42
kintoproj/kinto-helm
https://api.github.com/repos/kintoproj/kinto-helm
opened
Install manifests for dev and prod
documentation
**Is your feature request related to a problem? Please describe.** As a user, I want to be able to install KintoHub - quickly for testing on my development cluster - configure it for my production cluster **Describe the solution you'd like** We know that there is an issue with cert-manager being a sub dependencies of our main chart. So we should propose 2 ways of installing KintoHub: 1. `helm template .. | kubectl apply -f -` The easiest way for dev, it's configurable using the parameters for `helm template` but it will not be managed by helm and you will not be able to check diff or rollback. 2. `helm install` We need to add requirements before installing kintohub. Cert-manager and argo should be already installed since they are cluster wide resources. Add doc for that with specific versions and configuration.
1.0
Install manifests for dev and prod - **Is your feature request related to a problem? Please describe.** As a user, I want to be able to install KintoHub - quickly for testing on my development cluster - configure it for my production cluster **Describe the solution you'd like** We know that there is an issue with cert-manager being a sub dependencies of our main chart. So we should propose 2 ways of installing KintoHub: 1. `helm template .. | kubectl apply -f -` The easiest way for dev, it's configurable using the parameters for `helm template` but it will not be managed by helm and you will not be able to check diff or rollback. 2. `helm install` We need to add requirements before installing kintohub. Cert-manager and argo should be already installed since they are cluster wide resources. Add doc for that with specific versions and configuration.
non_process
install manifests for dev and prod is your feature request related to a problem please describe as a user i want to be able to install kintohub quickly for testing on my development cluster configure it for my production cluster describe the solution you d like we know that there is an issue with cert manager being a sub dependencies of our main chart so we should propose ways of installing kintohub helm template kubectl apply f the easiest way for dev it s configurable using the parameters for helm template but it will not be managed by helm and you will not be able to check diff or rollback helm install we need to add requirements before installing kintohub cert manager and argo should be already installed since they are cluster wide resources add doc for that with specific versions and configuration
0
8,698
11,841,331,269
IssuesEvent
2020-03-23 20:35:07
john-kurkowski/tldextract
https://api.github.com/repos/john-kurkowski/tldextract
closed
Inconsistent handling of malformed(?) URLs with successive dots
icebox: needs clarification low priority: caller can pre/post-process
I'm not sure if this should be caught by tldextract or if it's fine as is. Feel free to close. It doesn't make much sense to me to return non-empty subdomains and suffix but an empty domain: ``` >>> tldextract.extract('example.com') ExtractResult(subdomain='', domain='example', suffix='com') >>> tldextract.extract('example..com') ExtractResult(subdomain='example', domain='', suffix='com') >>> tldextract.extract('example...com') ExtractResult(subdomain='example.', domain='', suffix='com') >>> tldextract.extract('example....com') ExtractResult(subdomain='example..', domain='', suffix='com') ``` On the other hand this is what the user passed in, so... I don't know. At minimum this breaks the 'rejoin the whole namedtuple' example from the README. One of the dots is swallowed. ``` >>> ext = tldextract.extract('example..com') >>> '.'.join(part for part in ext if part) 'example.com' ```
1.0
Inconsistent handling of malformed(?) URLs with successive dots - I'm not sure if this should be caught by tldextract or if it's fine as is. Feel free to close. It doesn't make much sense to me to return non-empty subdomains and suffix but an empty domain: ``` >>> tldextract.extract('example.com') ExtractResult(subdomain='', domain='example', suffix='com') >>> tldextract.extract('example..com') ExtractResult(subdomain='example', domain='', suffix='com') >>> tldextract.extract('example...com') ExtractResult(subdomain='example.', domain='', suffix='com') >>> tldextract.extract('example....com') ExtractResult(subdomain='example..', domain='', suffix='com') ``` On the other hand this is what the user passed in, so... I don't know. At minimum this breaks the 'rejoin the whole namedtuple' example from the README. One of the dots is swallowed. ``` >>> ext = tldextract.extract('example..com') >>> '.'.join(part for part in ext if part) 'example.com' ```
process
inconsistent handling of malformed urls with successive dots i m not sure if this should be caught by tldextract or if it s fine as is feel free to close it doesn t make much sense to me to return non empty subdomains and suffix but an empty domain tldextract extract example com extractresult subdomain domain example suffix com tldextract extract example com extractresult subdomain example domain suffix com tldextract extract example com extractresult subdomain example domain suffix com tldextract extract example com extractresult subdomain example domain suffix com on the other hand this is what the user passed in so i don t know at minimum this breaks the rejoin the whole namedtuple example from the readme one of the dots is swallowed ext tldextract extract example com join part for part in ext if part example com
1
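As an aside on the tldextract record above: a minimal stdlib-only sketch (not using tldextract itself) of why successive dots in a hostname yield empty labels, and why the README's "rejoin the namedtuple" pattern swallows a dot, exactly as the issue reports. The helper names here are hypothetical, introduced only for illustration.

```python
# Stdlib-only sketch. Splitting a hostname on "." turns each run of
# successive dots into an empty label; filtering out falsy parts when
# rejoining then drops those empty labels, losing a dot.
def split_labels(hostname):
    # 'example..com' -> ['example', '', 'com']
    return hostname.split(".")

def rejoin_nonempty(labels):
    # Mirrors the README pattern: '.'.join(part for part in ext if part)
    return ".".join(part for part in labels if part)

labels = split_labels("example..com")
rejoined = rejoin_nonempty(labels)  # one dot is swallowed
```

This is the same mechanism behind the issue's `'example..com' -> 'example.com'` example.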
1,962
6,688,875,248
IssuesEvent
2017-10-08 19:33:54
cannawen/metric_units_reddit_bot
https://api.github.com/repos/cannawen/metric_units_reddit_bot
closed
"oz" is not being converted to "troy oz" in subreddit /r/Pmsforsale
bug first timers only hacktoberfest in progress maintainer approved
[metric_units](https://www.reddit.com/user/metric_units) is a sassy reddit bot that finds imperial units, and replies with a metric conversion. ## First timers only This issue is reserved for anyone who has **never** made a pull request to Open Source Software. If you are not a first timer, we would still love your help! Please see our other open issues :) [Read our New to OSS guide](https://github.com/cannawen/metric_units_reddit_bot/blob/master/NEW-TO-OSS.md) for an overview of what to expect ### IMPORTANT: Comment below if you would like to volunteer to fix this bug. --- ## Recommended experience - Programming fundamentals (`if` statements, arrays, etc.) - No previous experience working with Regular Expressions - Some familiarity with how Reddit works ## Time estimate 30-60 minutes ## Background Information So, you want to work on a Reddit bot that converts imperial units to metric units? Awesome! It's not an easy problem to solve though :( Imperial units are confusing!! Take ounces, for example. When someone says "ounces" they usually mean [regular ("avoirdupois") ounces](https://en.wikipedia.org/wiki/Avoirdupois) (which is 28.3495 grams). But, they could also be referring to ["troy" ounces](https://en.wikipedia.org/wiki/Troy_weight) (31.1035 grams). Troy ounces are most often used when dealing with precious metals, like gold or silver ## The problem The subreddit [/r/Pmsforsale](https://www.reddit.com/r/Pmsforsale/) is all about precious metals. They may refer to something as "ounce", but what they really mean is "troy ounce" So, when the bot finds itself in /r/Pmsforsale, we want it to find all mentions of "ounces" and replace them with "troy ounces". This should already be happening, but it is not! There is a bug in the code. To replace "ounces" with "troy ounces", we must find them by using a thing called Regular Expressions (also known as a, "regex"). Regexes help us find strings that match a certain pattern. 
For example if we had a regex `a` and applied it to this string: `ababcA` ... it would find all of the lower case a's but ignore the other characters (b, c, and A). OPTIONAL: You can go through [this tutorial](https://regex101.com/) to learn more about regexes OK, we are ready to get started. Lets get our development environment set up! If you run into any problems, try googling for a solution. If that doesn't work, reply to this issue with screenshots of the error and what steps you have already taken to try to solve the problem. We're happy to help :) 1. Open up a terminal window and type `node --version` 2. If it complains you do not have node, download it [here](https://nodejs.org/en/download/). 3. If your Node version is under v6.2.1, you have to update it 4. Fork the main github repository and "clone your fork" (download the bot's code) using git. [See more detailed instructions here](https://guides.github.com/activities/forking/) 5. Run `npm install` to download all of the required dependencies 6. Run `npm test` to run all of the tests. All green? Yay! 7. Open `./src/conversion_helper-test.js` in your favourite text editor 8. Search for the text "should convert oz to troy oz in precious metal sub" 9. Replace `it.skip` with `it.only`. This will tell the program to only run this single test. While you're here, take a look at the test! Read it carefully, can you guess what it does? 10. Run `npm test` again 11. Observe failing test :( Boo, failing tests! Booo! Tests allow us to define inputs and expected outputs to functions, so we can see if a function is doing the correct thing. 12. This test is failing because of a bug in the code. Let's go find that code 13. Go to `./src/conversion_helper.js` and find where we declare `specialSubredditsRegex` 14. This line creates a regex, but all regexes are case-sensitive by default! Notice the difference in capitalization in our test vs. our regular expression? 15. 
Your task is to make the regex match case-insensitively to make the test pass. Please use Google or any other resource. OPTIONAL: Can you think of other ways to make the test pass? Post your ideas in your Pull Request description later on (step 18). 16. Once your single test is passing, change the `it.only` to `it` to run all of the tests again. 17. Is everything green? Woohoo, green! Yay!! 18. Please commit the changes with a descriptive git commit message, and then make a pull request back to the main repository :) OPTIONAL: Don't forget to [give yourself credit](https://github.com/cannawen/metric_units_reddit_bot/blob/master/CONTRIBUTING.md#add-yourself-to-the-contributors-list)! Thank you for contributing to metric_units bot! 19. Wait for your PR to be reviewed by a maintainer, and address any comments they may have ## Step 20: YOUR CHANGE GETS MERGED AND RELEASED!! Party!!!
True
"oz" is not being converted to "troy oz" in subreddit /r/Pmsforsale - [metric_units](https://www.reddit.com/user/metric_units) is a sassy reddit bot that finds imperial units, and replies with a metric conversion. ## First timers only This issue is reserved for anyone who has **never** made a pull request to Open Source Software. If you are not a first timer, we would still love your help! Please see our other open issues :) [Read our New to OSS guide](https://github.com/cannawen/metric_units_reddit_bot/blob/master/NEW-TO-OSS.md) for an overview of what to expect ### IMPORTANT: Comment below if you would like to volunteer to fix this bug. --- ## Recommended experience - Programming fundamentals (`if` statements, arrays, etc.) - No previous experience working with Regular Expressions - Some familiarity with how Reddit works ## Time estimate 30-60 minutes ## Background Information So, you want to work on a Reddit bot that converts imperial units to metric units? Awesome! It's not an easy problem to solve though :( Imperial units are confusing!! Take ounces, for example. When someone says "ounces" they usually mean [regular ("avoirdupois") ounces](https://en.wikipedia.org/wiki/Avoirdupois) (which is 28.3495 grams). But, they could also be referring to ["troy" ounces](https://en.wikipedia.org/wiki/Troy_weight) (31.1035 grams). Troy ounces are most often used when dealing with precious metals, like gold or silver ## The problem The subreddit [/r/Pmsforsale](https://www.reddit.com/r/Pmsforsale/) is all about precious metals. They may refer to something as "ounce", but what they really mean is "troy ounce" So, when the bot finds itself in /r/Pmsforsale, we want it to find all mentions of "ounces" and replace them with "troy ounces". This should already be happening, but it is not! There is a bug in the code. To replace "ounces" with "troy ounces", we must find them by using a thing called Regular Expressions (also known as a, "regex"). 
Regexes help us find strings that match a certain pattern. For example if we had a regex `a` and applied it to this string: `ababcA` ... it would find all of the lower case a's but ignore the other characters (b, c, and A). OPTIONAL: You can go through [this tutorial](https://regex101.com/) to learn more about regexes OK, we are ready to get started. Lets get our development environment set up! If you run into any problems, try googling for a solution. If that doesn't work, reply to this issue with screenshots of the error and what steps you have already taken to try to solve the problem. We're happy to help :) 1. Open up a terminal window and type `node --version` 2. If it complains you do not have node, download it [here](https://nodejs.org/en/download/). 3. If your Node version is under v6.2.1, you have to update it 4. Fork the main github repository and "clone your fork" (download the bot's code) using git. [See more detailed instructions here](https://guides.github.com/activities/forking/) 5. Run `npm install` to download all of the required dependencies 6. Run `npm test` to run all of the tests. All green? Yay! 7. Open `./src/conversion_helper-test.js` in your favourite text editor 8. Search for the text "should convert oz to troy oz in precious metal sub" 9. Replace `it.skip` with `it.only`. This will tell the program to only run this single test. While you're here, take a look at the test! Read it carefully, can you guess what it does? 10. Run `npm test` again 11. Observe failing test :( Boo, failing tests! Booo! Tests allow us to define inputs and expected outputs to functions, so we can see if a function is doing the correct thing. 12. This test is failing because of a bug in the code. Let's go find that code 13. Go to `./src/conversion_helper.js` and find where we declare `specialSubredditsRegex` 14. This line creates a regex, but all regexes are case-sensitive by default! Notice the difference in capitalization in our test vs. our regular expression? 
15. Your task is to make the regex match case-insensitively to make the test pass. Please use Google or any other resource. OPTIONAL: Can you think of other ways to make the test pass? Post your ideas in your Pull Request description later on (step 18). 16. Once your single test is passing, change the `it.only` to `it` to run all of the tests again. 17. Is everything green? Woohoo, green! Yay!! 18. Please commit the changes with a descriptive git commit message, and then make a pull request back to the main repository :) OPTIONAL: Don't forget to [give yourself credit](https://github.com/cannawen/metric_units_reddit_bot/blob/master/CONTRIBUTING.md#add-yourself-to-the-contributors-list)! Thank you for contributing to metric_units bot! 19. Wait for your PR to be reviewed by a maintainer, and address any comments they may have ## Step 20: YOUR CHANGE GETS MERGED AND RELEASED!! Party!!!
non_process
oz is not being converted to troy oz in subreddit r pmsforsale is a sassy reddit bot that finds imperial units and replies with a metric conversion first timers only this issue is reserved for anyone who has never made a pull request to open source software if you are not a first timer we would still love your help please see our other open issues for an overview of what to expect important comment below if you would like to volunteer to fix this bug recommended experience programming fundamentals if statements arrays etc no previous experience working with regular expressions some familiarity with how reddit works time estimate minutes background information so you want to work on a reddit bot that converts imperial units to metric units awesome it s not an easy problem to solve though imperial units are confusing take ounces for example when someone says ounces they usually mean which is grams but they could also be referring to grams troy ounces are most often used when dealing with precious metals like gold or silver the problem the subreddit is all about precious metals they may refer to something as ounce but what they really mean is troy ounce so when the bot finds itself in r pmsforsale we want it to find all mentions of ounces and replace them with troy ounces this should already be happening but it is not there is a bug in the code to replace ounces with troy ounces we must find them by using a thing called regular expressions also known as a regex regexes help us find strings that match a certain pattern for example if we had a regex a and applied it to this string ababca it would find all of the lower case a s but ignore the other characters b c and a optional you can go through to learn more about regexes ok we are ready to get started lets get our development environment set up if you run into any problems try googling for a solution if that doesn t work reply to this issue with screenshots of the error and what steps you have already taken to try to 
solve the problem we re happy to help open up a terminal window and type node version if it complains you do not have node download it if your node version is under you have to update it fork the main github repository and clone your fork download the bot s code using git run npm install to download all of the required dependencies run npm test to run all of the tests all green yay open src conversion helper test js in your favourite text editor search for the text should convert oz to troy oz in precious metal sub replace it skip with it only this will tell the program to only run this single test while you re here take a look at the test read it carefully can you guess what it does run npm test again observe failing test boo failing tests booo tests allow us to define inputs and expected outputs to functions so we can see if a function is doing the correct thing this test is failing because of a bug in the code let s go find that code go to src conversion helper js and find where we declare specialsubredditsregex this line creates a regex but all regexes are case sensitive by default notice the difference in capitalization in our test vs our regular expression your task is to make the regex match case insensitively to make the test pass please use google or any other resource optional can you think of other ways to make the test pass post your ideas in your pull request description later on step once your single test is passing change the it only to it to run all of the tests again is everything green woohoo green yay please commit the changes with a descriptive git commit message and then make a pull request back to the main repository optional don t forget to thank you for contributing to metric units bot wait for your pr to be reviewed by a maintainer and address any comments they may have step your change gets merged and released party
0
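The first-timer issue above walks through making a regex match case-insensitively. As a generic illustration (not code from the bot's repository — the variable names below are made up; the actual variable in the issue is `specialSubredditsRegex`), JavaScript regexes take an `i` flag for case-insensitive matching:

```javascript
// A case-sensitive regex misses differently-capitalized subreddit names:
const caseSensitive = /pmsforsale/;
console.log(caseSensitive.test("Pmsforsale")); // false — the capital "P" doesn't match

// Adding the `i` flag makes the same pattern match case-insensitively:
const caseInsensitive = /pmsforsale/i;
console.log(caseInsensitive.test("Pmsforsale")); // true
console.log(caseInsensitive.test("PMSforsale")); // true
```

An alternative (one of the "other ways to make the test pass" the issue hints at) would be to lowercase the input string before testing it, e.g. `caseSensitive.test(subreddit.toLowerCase())`.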
560,104
16,585,364,501
IssuesEvent
2021-05-31 18:17:24
ckiplab/issue
https://api.github.com/repos/ckiplab/issue
closed
[CKIP CoreNLP] Internal Server Error with WSD
Priority: Medium Status: 1-Assigned Type: Bug
**Describe the bug** The system throws 500 error with WSD **To Reproduce** Submit with WSD only **Expected behavior** No error **Screenshots** <img width="1124" alt="Screen Shot 2021-05-13 at 11 12 36 AM" src="https://user-images.githubusercontent.com/11171959/118072315-20d5ed80-b3dc-11eb-97f9-68d911748608.png">
1.0
[CKIP CoreNLP] Internal Server Error with WSD - **Describe the bug** The system throws 500 error with WSD **To Reproduce** Submit with WSD only **Expected behavior** No error **Screenshots** <img width="1124" alt="Screen Shot 2021-05-13 at 11 12 36 AM" src="https://user-images.githubusercontent.com/11171959/118072315-20d5ed80-b3dc-11eb-97f9-68d911748608.png">
non_process
internal server error with wsd describe the bug the system throws error with wsd to reproduce submit with wsd only expected behavior no error screenshots img width alt screen shot at am src
0
15,078
18,781,765,361
IssuesEvent
2021-11-08 07:47:54
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
Dependency updates for Participant Manager raised by dependabot for milestone 2.0.6
Process: Fixed Process: Tested QA Process: Tested dev
Fix the dependency updates for Participant Manager raised by dependabot
3.0
Dependency updates for Participant Manager raised by dependabot for milestone 2.0.6 - Fix the dependency updates for Participant Manager raised by dependabot
process
dependency updates for participant manager raised by dependabot for milestone fix the dependency updates for participant manager raised by dependabot
1
41,961
10,727,953,552
IssuesEvent
2019-10-28 12:57:10
primefaces/primefaces
https://api.github.com/repos/primefaces/primefaces
closed
Dock: problems when page has scrollbar
defect
Hi, I noticed some erroneous behaviour with the dock component on the latest 6.2 RC1 Problem 1: When reloading the page, as scroll status is not the top (e.g. in the middle of the page) the dock does not animate correctly. ![image](https://user-images.githubusercontent.com/13161711/34867663-4a6d1b84-f781-11e7-8d83-5792f9465ad1.png) Problem 2: When using halign="right" the dock moves into the scrollbar ![image](https://user-images.githubusercontent.com/13161711/34867615-197f4a24-f781-11e7-9701-d3a409c0e04f.png) Code to reproduce: ```xml <html xmlns="http://www.w3.org/1999/xhtml" xmlns:h="http://java.sun.com/jsf/html" xmlns:p="http://primefaces.org/ui" xmlns:f="http://xmlns.jcp.org/jsf/core"> <h:head> </h:head> <h:body> <h:form> <div style="height: 20000px; background: #e0efe5; border: 1px solid teal;"> Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet.</div> <p:dock position="bottom"> <p:menuitem value="123" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> <p:menuitem value="456" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> <p:menuitem value="789" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> <p:menuitem value="abc" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> </p:dock> </h:form> </h:body> </html> ```
1.0
Dock: problems when page has scrollbar - Hi, I noticed some erroneous behaviour with the dock component on the latest 6.2 RC1 Problem 1: When reloading the page, as scroll status is not the top (e.g. in the middle of the page) the dock does not animate correctly. ![image](https://user-images.githubusercontent.com/13161711/34867663-4a6d1b84-f781-11e7-8d83-5792f9465ad1.png) Problem 2: When using halign="right" the dock moves into the scrollbar ![image](https://user-images.githubusercontent.com/13161711/34867615-197f4a24-f781-11e7-9701-d3a409c0e04f.png) Code to reproduce: ```xml <html xmlns="http://www.w3.org/1999/xhtml" xmlns:h="http://java.sun.com/jsf/html" xmlns:p="http://primefaces.org/ui" xmlns:f="http://xmlns.jcp.org/jsf/core"> <h:head> </h:head> <h:body> <h:form> <div style="height: 20000px; background: #e0efe5; border: 1px solid teal;"> Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At vero eos et accusam et justo duo dolores et ea rebum. Stet clita kasd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet. Lorem ipsum dolor sit amet, consetetur sadipscing elitr, sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat, sed diam voluptua. At vero eos et accusam et justo duo dolores et ea rebum. 
Stet clita kasd gubergren, no sea takimata sanctus est Lorem ipsum dolor sit amet.</div> <p:dock position="bottom"> <p:menuitem value="123" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> <p:menuitem value="456" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> <p:menuitem value="789" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> <p:menuitem value="abc" icon="https://www.primefaces.org/showcase/resources/demo/images/dock/home.png" onclick="return false;" /> </p:dock> </h:form> </h:body> </html> ```
non_process
dock problems when page has scrollbar hi i noticed some erroneous behaviour with the dock component on the latest problem when reloading the page as scroll status is not the top e g in the middle of the page the dock does not animate correctly problem when using halign right the dock moves into the scrollbar code to reproduce xml html xmlns xmlns h xmlns p xmlns f div style height background border solid teal lorem ipsum dolor sit amet consetetur sadipscing elitr sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat sed diam voluptua at vero eos et accusam et justo duo dolores et ea rebum stet clita kasd gubergren no sea takimata sanctus est lorem ipsum dolor sit amet lorem ipsum dolor sit amet consetetur sadipscing elitr sed diam nonumy eirmod tempor invidunt ut labore et dolore magna aliquyam erat sed diam voluptua at vero eos et accusam et justo duo dolores et ea rebum stet clita kasd gubergren no sea takimata sanctus est lorem ipsum dolor sit amet p menuitem value icon onclick return false p menuitem value icon onclick return false p menuitem value icon onclick return false p menuitem value abc icon onclick return false
0
130,664
18,169,246,734
IssuesEvent
2021-09-27 17:55:51
samjcs/android
https://api.github.com/repos/samjcs/android
opened
WS-2019-0379 (Medium) detected in commons-codec-1.10.jar
security vulnerability
## WS-2019-0379 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-codec-1.10.jar</b></p></summary> <p>The Apache Commons Codec package contains simple encoder and decoders for various formats such as Base64 and Hexadecimal. In addition to these widely used encoders and decoders, the codec package also maintains a collection of phonetic encoding utilities.</p> <p>Path to dependency file: android/owncloudApp/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.10/4b95f4897fa13f2cd904aee711aeafc0c5295cd8/commons-codec-1.10.jar</p> <p> Dependency Hierarchy: - room-compiler-2.3.0.jar (Root Library) - :x: **commons-codec-1.10.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/samjcs/android/commit/787d550ab037fd249932a79fcb37a055e556301e">787d550ab037fd249932a79fcb37a055e556301e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input validation. 
<p>Publish Date: 2019-05-20 <p>URL: <a href=https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113>WS-2019-0379</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113">https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113</a></p> <p>Release Date: 2019-05-20</p> <p>Fix Resolution: commons-codec:commons-codec:1.13</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"commons-codec","packageName":"commons-codec","packageVersion":"1.10","packageFilePaths":["/owncloudApp/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"androidx.room:room-compiler:2.3.0;commons-codec:commons-codec:1.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-codec:commons-codec:1.13"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2019-0379","vulnerabilityDetails":"Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input 
validation.","vulnerabilityUrl":"https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
True
WS-2019-0379 (Medium) detected in commons-codec-1.10.jar - ## WS-2019-0379 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-codec-1.10.jar</b></p></summary> <p>The Apache Commons Codec package contains simple encoder and decoders for various formats such as Base64 and Hexadecimal. In addition to these widely used encoders and decoders, the codec package also maintains a collection of phonetic encoding utilities.</p> <p>Path to dependency file: android/owncloudApp/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/commons-codec/commons-codec/1.10/4b95f4897fa13f2cd904aee711aeafc0c5295cd8/commons-codec-1.10.jar</p> <p> Dependency Hierarchy: - room-compiler-2.3.0.jar (Root Library) - :x: **commons-codec-1.10.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/samjcs/android/commit/787d550ab037fd249932a79fcb37a055e556301e">787d550ab037fd249932a79fcb37a055e556301e</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input validation. 
<p>Publish Date: 2019-05-20 <p>URL: <a href=https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113>WS-2019-0379</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113">https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113</a></p> <p>Release Date: 2019-05-20</p> <p>Fix Resolution: commons-codec:commons-codec:1.13</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"commons-codec","packageName":"commons-codec","packageVersion":"1.10","packageFilePaths":["/owncloudApp/build.gradle"],"isTransitiveDependency":true,"dependencyTree":"androidx.room:room-compiler:2.3.0;commons-codec:commons-codec:1.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"commons-codec:commons-codec:1.13"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2019-0379","vulnerabilityDetails":"Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input 
validation.","vulnerabilityUrl":"https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
non_process
ws medium detected in commons codec jar ws medium severity vulnerability vulnerable library commons codec jar the apache commons codec package contains simple encoder and decoders for various formats such as and hexadecimal in addition to these widely used encoders and decoders the codec package also maintains a collection of phonetic encoding utilities path to dependency file android owncloudapp build gradle path to vulnerable library home wss scanner gradle caches modules files commons codec commons codec commons codec jar dependency hierarchy room compiler jar root library x commons codec jar vulnerable library found in head commit a href found in base branch master vulnerability details apache commons codec before version “commons codec ” is vulnerable to information disclosure due to improper input validation publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons codec commons codec isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree androidx room room compiler commons codec commons codec isminimumfixversionavailable true minimumfixversion commons codec commons codec basebranches vulnerabilityidentifier ws vulnerabilitydetails apache commons codec before version “commons codec ” is vulnerable to information disclosure due to improper input validation vulnerabilityurl
0
84,265
24,263,371,374
IssuesEvent
2022-09-28 02:25:57
ClickHouse/ClickHouse
https://api.github.com/repos/ClickHouse/ClickHouse
opened
Compile error returning copied variable in 22.9
build
> Make sure that `git diff` result is empty and you've just pulled fresh master. Try cleaning up cmake cache. Just in case, official build instructions are published here: https://clickhouse.com/docs/en/development/build/ **Operating system** > OS kind or distribution, specific version/release, non-standard kernel if any. If you are trying to build inside virtual machine, please mention it too. cmake ../ -DCMAKE_BUILD_TYPE=RelWithDebInfo **Cmake version** cmake version 3.16.3 **Ninja version** ninja --version 1.10.0 **Compiler name and version** echo $CC clang-12 **Full cmake and/or ninja output** root@A03-R05-I46-164-035PF74:/code/ClickHouse/build_22.9# /usr/bin/ccache /usr/bin/clang++-12 --target=x86_64-linux-gnu --sysroot=/code/ClickHouse/cmake/linux/../../contrib/sysroot/linux-x86_64/x86_64-linux-gnu/libc -DAWS_SDK_VERSION_MAJOR=1 -DAWS_SDK_VERSION_MINOR=7 -DAWS_SDK_VERSION_PATCH=231 -DBOOST_ASIO_HAS_STD_INVOKE_RESULT=1 -DBOOST_ASIO_STANDALONE=1 -DENABLE_OPENSSL_ENCRYPTION -DPOCO_ENABLE_CPP11 -DPOCO_HAVE_FD_EPOLL -DPOCO_OS_FAMILY_UNIX -DSTD_EXCEPTION_HAS_STACK_TRACE=1 -DUNALIGNED_OK -DWITH_COVERAGE=0 -DWITH_GZFILEOP -DX86_64 -DZLIB_COMPAT -D_LIBCPP_ENABLE_THREAD_SAFETY_ANNOTATIONS -Iincludes/configs -I../base/glibc-compatibility/memcpy -I../src -Isrc -Isrc/Core/include -I../base/base/.. -Ibase/base/.. -I../contrib/cctz/include -I../base/pcg-random/. 
-I../contrib/miniselect/include -I../contrib/zstd/lib -isystem ../contrib/libcxx/include -isystem ../contrib/libcxxabi/include -isystem ../contrib/libunwind/include -isystem ../contrib/cityhash102/include -isystem ../contrib/boost -isystem ../contrib/poco/Net/include -isystem ../contrib/poco/Foundation/include -isystem ../contrib/poco/NetSSL_OpenSSL/include -isystem ../contrib/poco/Crypto/include -isystem ../contrib/boringssl/include -isystem ../contrib/poco/Util/include -isystem ../contrib/poco/JSON/include -isystem ../contrib/poco/XML/include -isystem ../contrib/replxx/include -isystem ../contrib/fmtlib-cmake/../fmtlib/include -isystem ../contrib/magic_enum/include -isystem ../contrib/double-conversion -isystem ../contrib/dragonbox/include -isystem ../contrib/re2 -isystem contrib/re2-cmake -isystem ../contrib/zlib-ng -isystem contrib/zlib-ng-cmake -isystem ../contrib/pdqsort -isystem ../contrib/xz/src/liblzma/api -isystem ../contrib/aws-c-common/include -isystem ../contrib/aws-c-event-stream/include -isystem ../contrib/aws/aws-cpp-sdk-s3/include -isystem ../contrib/aws/aws-cpp-sdk-core/include -isystem contrib/aws-s3-cmake/include -isystem ../contrib/snappy -isystem contrib/snappy-cmake -isystem ../contrib/msgpack-c/include -isystem ../contrib/fast_float/include --gcc-toolchain=/code/ClickHouse/cmake/linux/../../contrib/sysroot/linux-x86_64 -std=c++20 -fdiagnostics-color=always -Xclang -fuse-ctor-homing -fsized-deallocation -gdwarf-aranges -pipe -mssse3 -msse4.1 -msse4.2 -mpclmul -mpopcnt -fasynchronous-unwind-tables -ffile-prefix-map=/code/ClickHouse=. 
-falign-functions=32 -mbranches-within-32B-boundaries -fdiagnostics-absolute-paths -fstrict-vtable-pointers -fexperimental-new-pass-manager -Wall -Wextra -Wframe-larger-than=65536 -Weverything -Wpedantic -Wno-zero-length-array -Wno-c++98-compat-pedantic -Wno-c++98-compat -Wno-c++20-compat -Wno-sign-conversion -Wno-implicit-int-conversion -Wno-implicit-int-float-conversion -Wno-shorten-64-to-32 -Wno-ctad-maybe-unsupported -Wno-disabled-macro-expansion -Wno-documentation-unknown-command -Wno-double-promotion -Wno-exit-time-destructors -Wno-float-equal -Wno-global-constructors -Wno-missing-prototypes -Wno-missing-variable-declarations -Wno-padded -Wno-switch-enum -Wno-undefined-func-template -Wno-unused-template -Wno-vla -Wno-weak-template-vtables -Wno-weak-vtables -Wno-thread-safety-negative -O2 -g -DNDEBUG -O3 -g -gdwarf-4 -fno-pie -D OS_LINUX -Werror -nostdinc++ -std=gnu++2a -MD -MT utils/keeper-bench/CMakeFiles/keeper-bench.dir/Runner.cpp.o -MF utils/keeper-bench/CMakeFiles/keeper-bench.dir/Runner.cpp.o.d -o utils/keeper-bench/CMakeFiles/keeper-bench.dir/Runner.cpp.o -c ../utils/keeper-bench/Runner.cpp In file included from ../utils/keeper-bench/Runner.cpp:1: In file included from ../utils/keeper-bench/Runner.h:2: In file included from ../src/Common/ZooKeeper/ZooKeeperImpl.h:9: In file included from ../src/Common/ZooKeeper/ZooKeeperCommon.h:5: In file included from ../src/Interpreters/ZooKeeperLog.h:3: In file included from ../src/Core/NamesAndTypes.h:10: In file included from ../src/DataTypes/IDataType.h:8: In file included from ../src/DataTypes/DataTypeCustom.h:6: In file included from ../src/DataTypes/Serializations/ISerialization.h:7: /code/ClickHouse/src/Columns/IColumn.h:161:16: error: prior to the resolution of a defect report against ISO C++11, local variable 'res' would have been copied despite being returned by name, due to its not matching the function return type ('COW<DB::IColumn>::Ptr' (aka 'immutable_ptr<DB::IColumn>') vs 
'COW<DB::IColumn>::MutablePtr' (aka 'mutable_ptr<DB::IColumn>')) [-Werror,-Wreturn-std-move-in-c++11] return res; ^~~ /code/ClickHouse/src/Columns/IColumn.h:161:16: note: call 'std::move' explicitly to avoid copying on older compilers return res; ^~~ std::move(res) In file included from ../utils/keeper-bench/Runner.cpp:1: In file included from ../utils/keeper-bench/Runner.h:2: In file included from ../src/Common/ZooKeeper/ZooKeeperImpl.h:9: In file included from ../src/Common/ZooKeeper/ZooKeeperCommon.h:5: In file included from ../src/Interpreters/ZooKeeperLog.h:5: In file included from ../src/Interpreters/SystemLog.h:5: In file included from ../src/Interpreters/StorageID.h:6: /code/ClickHouse/src/Core/QualifiedTableName.h:88:16: error: prior to the resolution of a defect report against ISO C++11, local variable 'name' would have been copied despite being returned by name, due to its not matching the function return type ('std::optional<QualifiedTableName>' vs 'DB::QualifiedTableName') [-Werror,-Wreturn-std-move-in-c++11] return name; ^~~~ /code/ClickHouse/src/Core/QualifiedTableName.h:88:16: note: call 'std::move' explicitly to avoid copying on older compilers return name; ^~~~ std::move(name) 2 errors generated.
1.0
Compile error returning copied variable in 22.9 - > Make sure that `git diff` result is empty and you've just pulled fresh master. Try cleaning up cmake cache. Just in case, official build instructions are published here: https://clickhouse.com/docs/en/development/build/ **Operating system** > OS kind or distribution, specific version/release, non-standard kernel if any. If you are trying to build inside virtual machine, please mention it too. cmake ../ -DCMAKE_BUILD_TYPE=RelWithDebInfo **Cmake version** cmake version 3.16.3 **Ninja version** ninja --version 1.10.0 **Compiler name and version** echo $CC clang-12 **Full cmake and/or ninja output** root@A03-R05-I46-164-035PF74:/code/ClickHouse/build_22.9# /usr/bin/ccache /usr/bin/clang++-12 --target=x86_64-linux-gnu --sysroot=/code/ClickHouse/cmake/linux/../../contrib/sysroot/linux-x86_64/x86_64-linux-gnu/libc -DAWS_SDK_VERSION_MAJOR=1 -DAWS_SDK_VERSION_MINOR=7 -DAWS_SDK_VERSION_PATCH=231 -DBOOST_ASIO_HAS_STD_INVOKE_RESULT=1 -DBOOST_ASIO_STANDALONE=1 -DENABLE_OPENSSL_ENCRYPTION -DPOCO_ENABLE_CPP11 -DPOCO_HAVE_FD_EPOLL -DPOCO_OS_FAMILY_UNIX -DSTD_EXCEPTION_HAS_STACK_TRACE=1 -DUNALIGNED_OK -DWITH_COVERAGE=0 -DWITH_GZFILEOP -DX86_64 -DZLIB_COMPAT -D_LIBCPP_ENABLE_THREAD_SAFETY_ANNOTATIONS -Iincludes/configs -I../base/glibc-compatibility/memcpy -I../src -Isrc -Isrc/Core/include -I../base/base/.. -Ibase/base/.. -I../contrib/cctz/include -I../base/pcg-random/. 
-I../contrib/miniselect/include -I../contrib/zstd/lib -isystem ../contrib/libcxx/include -isystem ../contrib/libcxxabi/include -isystem ../contrib/libunwind/include -isystem ../contrib/cityhash102/include -isystem ../contrib/boost -isystem ../contrib/poco/Net/include -isystem ../contrib/poco/Foundation/include -isystem ../contrib/poco/NetSSL_OpenSSL/include -isystem ../contrib/poco/Crypto/include -isystem ../contrib/boringssl/include -isystem ../contrib/poco/Util/include -isystem ../contrib/poco/JSON/include -isystem ../contrib/poco/XML/include -isystem ../contrib/replxx/include -isystem ../contrib/fmtlib-cmake/../fmtlib/include -isystem ../contrib/magic_enum/include -isystem ../contrib/double-conversion -isystem ../contrib/dragonbox/include -isystem ../contrib/re2 -isystem contrib/re2-cmake -isystem ../contrib/zlib-ng -isystem contrib/zlib-ng-cmake -isystem ../contrib/pdqsort -isystem ../contrib/xz/src/liblzma/api -isystem ../contrib/aws-c-common/include -isystem ../contrib/aws-c-event-stream/include -isystem ../contrib/aws/aws-cpp-sdk-s3/include -isystem ../contrib/aws/aws-cpp-sdk-core/include -isystem contrib/aws-s3-cmake/include -isystem ../contrib/snappy -isystem contrib/snappy-cmake -isystem ../contrib/msgpack-c/include -isystem ../contrib/fast_float/include --gcc-toolchain=/code/ClickHouse/cmake/linux/../../contrib/sysroot/linux-x86_64 -std=c++20 -fdiagnostics-color=always -Xclang -fuse-ctor-homing -fsized-deallocation -gdwarf-aranges -pipe -mssse3 -msse4.1 -msse4.2 -mpclmul -mpopcnt -fasynchronous-unwind-tables -ffile-prefix-map=/code/ClickHouse=. 
-falign-functions=32 -mbranches-within-32B-boundaries -fdiagnostics-absolute-paths -fstrict-vtable-pointers -fexperimental-new-pass-manager -Wall -Wextra -Wframe-larger-than=65536 -Weverything -Wpedantic -Wno-zero-length-array -Wno-c++98-compat-pedantic -Wno-c++98-compat -Wno-c++20-compat -Wno-sign-conversion -Wno-implicit-int-conversion -Wno-implicit-int-float-conversion -Wno-shorten-64-to-32 -Wno-ctad-maybe-unsupported -Wno-disabled-macro-expansion -Wno-documentation-unknown-command -Wno-double-promotion -Wno-exit-time-destructors -Wno-float-equal -Wno-global-constructors -Wno-missing-prototypes -Wno-missing-variable-declarations -Wno-padded -Wno-switch-enum -Wno-undefined-func-template -Wno-unused-template -Wno-vla -Wno-weak-template-vtables -Wno-weak-vtables -Wno-thread-safety-negative -O2 -g -DNDEBUG -O3 -g -gdwarf-4 -fno-pie -D OS_LINUX -Werror -nostdinc++ -std=gnu++2a -MD -MT utils/keeper-bench/CMakeFiles/keeper-bench.dir/Runner.cpp.o -MF utils/keeper-bench/CMakeFiles/keeper-bench.dir/Runner.cpp.o.d -o utils/keeper-bench/CMakeFiles/keeper-bench.dir/Runner.cpp.o -c ../utils/keeper-bench/Runner.cpp In file included from ../utils/keeper-bench/Runner.cpp:1: In file included from ../utils/keeper-bench/Runner.h:2: In file included from ../src/Common/ZooKeeper/ZooKeeperImpl.h:9: In file included from ../src/Common/ZooKeeper/ZooKeeperCommon.h:5: In file included from ../src/Interpreters/ZooKeeperLog.h:3: In file included from ../src/Core/NamesAndTypes.h:10: In file included from ../src/DataTypes/IDataType.h:8: In file included from ../src/DataTypes/DataTypeCustom.h:6: In file included from ../src/DataTypes/Serializations/ISerialization.h:7: /code/ClickHouse/src/Columns/IColumn.h:161:16: error: prior to the resolution of a defect report against ISO C++11, local variable 'res' would have been copied despite being returned by name, due to its not matching the function return type ('COW<DB::IColumn>::Ptr' (aka 'immutable_ptr<DB::IColumn>') vs 
'COW<DB::IColumn>::MutablePtr' (aka 'mutable_ptr<DB::IColumn>')) [-Werror,-Wreturn-std-move-in-c++11] return res; ^~~ /code/ClickHouse/src/Columns/IColumn.h:161:16: note: call 'std::move' explicitly to avoid copying on older compilers return res; ^~~ std::move(res) In file included from ../utils/keeper-bench/Runner.cpp:1: In file included from ../utils/keeper-bench/Runner.h:2: In file included from ../src/Common/ZooKeeper/ZooKeeperImpl.h:9: In file included from ../src/Common/ZooKeeper/ZooKeeperCommon.h:5: In file included from ../src/Interpreters/ZooKeeperLog.h:5: In file included from ../src/Interpreters/SystemLog.h:5: In file included from ../src/Interpreters/StorageID.h:6: /code/ClickHouse/src/Core/QualifiedTableName.h:88:16: error: prior to the resolution of a defect report against ISO C++11, local variable 'name' would have been copied despite being returned by name, due to its not matching the function return type ('std::optional<QualifiedTableName>' vs 'DB::QualifiedTableName') [-Werror,-Wreturn-std-move-in-c++11] return name; ^~~~ /code/ClickHouse/src/Core/QualifiedTableName.h:88:16: note: call 'std::move' explicitly to avoid copying on older compilers return name; ^~~~ std::move(name) 2 errors generated.
non_process
compile error returning copied variable in make sure that git diff result is empty and you ve just pulled fresh master try cleaning up cmake cache just in case official build instructions are published here operating system os kind or distribution specific version release non standard kernel if any if you are trying to build inside virtual machine please mention it too cmake dcmake build type relwithdebinfo cmake version cmake version ninja version ninja version compiler name and version echo cc clang full cmake and or ninja output root code clickhouse build usr bin ccache usr bin clang target linux gnu sysroot code clickhouse cmake linux contrib sysroot linux linux gnu libc daws sdk version major daws sdk version minor daws sdk version patch dboost asio has std invoke result dboost asio standalone denable openssl encryption dpoco enable dpoco have fd epoll dpoco os family unix dstd exception has stack trace dunaligned ok dwith coverage dwith gzfileop dzlib compat d libcpp enable thread safety annotations iincludes configs i base glibc compatibility memcpy i src isrc isrc core include i base base ibase base i contrib cctz include i base pcg random i contrib miniselect include i contrib zstd lib isystem contrib libcxx include isystem contrib libcxxabi include isystem contrib libunwind include isystem contrib include isystem contrib boost isystem contrib poco net include isystem contrib poco foundation include isystem contrib poco netssl openssl include isystem contrib poco crypto include isystem contrib boringssl include isystem contrib poco util include isystem contrib poco json include isystem contrib poco xml include isystem contrib replxx include isystem contrib fmtlib cmake fmtlib include isystem contrib magic enum include isystem contrib double conversion isystem contrib dragonbox include isystem contrib isystem contrib cmake isystem contrib zlib ng isystem contrib zlib ng cmake isystem contrib pdqsort isystem contrib xz src liblzma api isystem contrib aws c 
common include isystem contrib aws c event stream include isystem contrib aws aws cpp sdk include isystem contrib aws aws cpp sdk core include isystem contrib aws cmake include isystem contrib snappy isystem contrib snappy cmake isystem contrib msgpack c include isystem contrib fast float include gcc toolchain code clickhouse cmake linux contrib sysroot linux std c fdiagnostics color always xclang fuse ctor homing fsized deallocation gdwarf aranges pipe mpclmul mpopcnt fasynchronous unwind tables ffile prefix map code clickhouse falign functions mbranches within boundaries fdiagnostics absolute paths fstrict vtable pointers fexperimental new pass manager wall wextra wframe larger than weverything wpedantic wno zero length array wno c compat pedantic wno c compat wno c compat wno sign conversion wno implicit int conversion wno implicit int float conversion wno shorten to wno ctad maybe unsupported wno disabled macro expansion wno documentation unknown command wno double promotion wno exit time destructors wno float equal wno global constructors wno missing prototypes wno missing variable declarations wno padded wno switch enum wno undefined func template wno unused template wno vla wno weak template vtables wno weak vtables wno thread safety negative g dndebug g gdwarf fno pie d os linux werror nostdinc std gnu md mt utils keeper bench cmakefiles keeper bench dir runner cpp o mf utils keeper bench cmakefiles keeper bench dir runner cpp o d o utils keeper bench cmakefiles keeper bench dir runner cpp o c utils keeper bench runner cpp in file included from utils keeper bench runner cpp in file included from utils keeper bench runner h in file included from src common zookeeper zookeeperimpl h in file included from src common zookeeper zookeepercommon h in file included from src interpreters zookeeperlog h in file included from src core namesandtypes h in file included from src datatypes idatatype h in file included from src datatypes datatypecustom h in file included 
from src datatypes serializations iserialization h code clickhouse src columns icolumn h error prior to the resolution of a defect report against iso c local variable res would have been copied despite being returned by name due to its not matching the function return type cow ptr aka immutable ptr vs cow mutableptr aka mutable ptr return res code clickhouse src columns icolumn h note call std move explicitly to avoid copying on older compilers return res std move res in file included from utils keeper bench runner cpp in file included from utils keeper bench runner h in file included from src common zookeeper zookeeperimpl h in file included from src common zookeeper zookeepercommon h in file included from src interpreters zookeeperlog h in file included from src interpreters systemlog h in file included from src interpreters storageid h code clickhouse src core qualifiedtablename h error prior to the resolution of a defect report against iso c local variable name would have been copied despite being returned by name due to its not matching the function return type std optional vs db qualifiedtablename return name code clickhouse src core qualifiedtablename h note call std move explicitly to avoid copying on older compilers return name std move name errors generated
0
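The diagnostic in the record above is clang's `-Wreturn-std-move-in-c++11`: when a returned local variable's type differs from the function's return type (here `QualifiedTableName` vs `std::optional<QualifiedTableName>`), compilers predating the resolution of CWG defect report 1579 copy the variable instead of moving it, and clang warns that an explicit `std::move` is needed for portability. A minimal sketch of the pattern and the fix — hypothetical types and parsing logic for illustration, not ClickHouse's actual code:

```cpp
#include <optional>
#include <string>
#include <utility>

struct QualifiedTableName {
    std::string database;
    std::string table;
};

// The local variable `name` has type QualifiedTableName, but the function
// returns std::optional<QualifiedTableName>. Because the types differ, the
// implicit move on `return name;` only applies after the CWG 1579 defect
// resolution; older compilers would copy, which is what the warning flags.
// Writing std::move explicitly avoids the copy on all compilers.
std::optional<QualifiedTableName> parseQualifiedName(const std::string& s) {
    auto dot = s.find('.');
    if (dot == std::string::npos)
        return std::nullopt;
    QualifiedTableName name;
    name.database = s.substr(0, dot);
    name.table = s.substr(dot + 1);
    return std::move(name); // the fix suggested in the diagnostic's note
}
```

With the explicit `std::move`, the same source compiles cleanly under `-Werror -Wreturn-std-move-in-c++11` on both pre- and post-defect-resolution compilers, which is why the diagnostic's note proposes exactly this edit.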
4,539
4,425,644,508
IssuesEvent
2016-08-16 15:59:31
phetsims/rutherford-scattering
https://api.github.com/repos/phetsims/rutherford-scattering
opened
Nucleus node doesn't always update in Firefox and iPad2
type:bug type:performance
From phetsims/tasks#675, @phet-steele noticed that the canvas icon node for the rutherford nucleus does not update. Occasionally, if the slider is quickly dragged, the nucleus is not redrawn and the trace is left in the screen view. ![capture](https://cloud.githubusercontent.com/assets/6396244/17706142/18e2dd70-6398-11e6-8133-584ce5e20e62.PNG)
True
Nucleus node doesn't always update in Firefox and iPad2 - From phetsims/tasks#675, @phet-steele noticed that the canvas icon node for the rutherford nucleus does not update. Occasionally, if the slider is quickly dragged, the nucleus is not redrawn and the trace is left in the screen view. ![capture](https://cloud.githubusercontent.com/assets/6396244/17706142/18e2dd70-6398-11e6-8133-584ce5e20e62.PNG)
non_process
nucleus node doesn t always update in firefox and from phetsims tasks phet steele noticed that the canvas icon node for the rutherford nucleus does not update occasionally if the slider is quickly dragged the nucleus is not redrawn and the trace is left in the screen view
0
12,487
14,952,625,093
IssuesEvent
2021-01-26 15:45:04
ORNL-AMO/AMO-Tools-Desktop
https://api.github.com/repos/ORNL-AMO/AMO-Tools-Desktop
closed
Pre-release QA/Testing of Standalones
Process Heating important
Go through all new PH standalones to be included in release and fully QA. Will add checklist later.
1.0
Pre-release QA/Testing of Standalones - Go through all new PH standalones to be included in release and fully QA. Will add checklist later.
process
pre release qa testing of standalones go through all new ph standalones to be included in release and fully qa will add checklist later
1
142,229
11,459,763,316
IssuesEvent
2020-02-07 08:11:27
OpenTechFund/hypha
https://api.github.com/repos/OpenTechFund/hypha
closed
Partners should no longer be able to write reviews
enhancement tested - approved for live ✅
**Describe the bug** Partners/Applicants should not be able to view Staff/AC Reviews, not even listed on the side of the application **Expected behavior** Just need to check that only Staff/AC members can see the little box that says reviews and what's under it in terms of staff/AC reviews/ **Priority** - High (keeping you from completing day-to-day tasks) **Affected roles** - Applicants - Reviewers
1.0
Partners should no longer be able to write reviews - **Describe the bug** Partners/Applicants should not be able to view Staff/AC Reviews, not even listed on the side of the application **Expected behavior** Just need to check that only Staff/AC members can see the little box that says reviews and what's under it in terms of staff/AC reviews/ **Priority** - High (keeping you from completing day-to-day tasks) **Affected roles** - Applicants - Reviewers
non_process
partners should no longer be able to write reviews describe the bug partners applicants should not be able to view staff ac reviews not even listed on the side of the application expected behavior just need to check that only staff ac members can see the little box that says reviews and what s under it in terms of staff ac reviews priority high keeping you from completing day to day tasks affected roles applicants reviewers
0
9,597
12,543,450,097
IssuesEvent
2020-06-05 15:34:12
spring-projects/spring-hateoas
https://api.github.com/repos/spring-projects/spring-hateoas
closed
HAL hypertext cache pattern
in: mediatypes process: waiting for review
Hi, I am using Spring MVC with Spring HATEOAS and would like to use hypertext cache pattern as defined in HAL specification. > 8.3. Hypertext Cache Pattern > > The "hypertext cache pattern" allows servers to use embedded > resources to dynamically reduce the number of requests a client > makes, improving the efficiency and performance of the application. > > Clients MAY be automated for this purpose so that, for any given link > relation, they will read from an embedded resource (if present) in > preference to traversing a link. > > To activate this client behaviour for a given link, servers SHOULD > add an embedded resource into the representation with the same > relation. > > Servers SHOULD NOT entirely "swap out" a link for an embedded > resource (or vice versa) because client support for this technique is > OPTIONAL. > > The following examples shows the hypertext cache pattern applied to > an "author" link: > > Before: > > { > "_links": { > "self": { "href": "/books/the-way-of-zen" }, > "author": { "href": "/people/alan-watts" } > } > } > > > After: > > { > "_links": { > "self": { "href": "/blog-post" }, > "author": { "href": "/people/alan-watts" } > }, > "_embedded": { > "author": { > "_links": { "self": { "href": "/people/alan-watts" } }, > "name": "Alan Watts", > "born": "January 6, 1915", > "died": "November 16, 1973" > } > } > } > Are you planning to ease hypertext cache pattern usage? I envisioned an optional flag per link that could be set during the construction of the list of links. When enabled, during serialization (?), the flag would trigger the Spring MVC controller method described by the link, allowing to complete the embedded content with the object returned by the method.
1.0
HAL hypertext cache pattern - Hi, I am using Spring MVC with Spring HATEOAS and would like to use hypertext cache pattern as defined in HAL specification. > 8.3. Hypertext Cache Pattern > > The "hypertext cache pattern" allows servers to use embedded > resources to dynamically reduce the number of requests a client > makes, improving the efficiency and performance of the application. > > Clients MAY be automated for this purpose so that, for any given link > relation, they will read from an embedded resource (if present) in > preference to traversing a link. > > To activate this client behaviour for a given link, servers SHOULD > add an embedded resource into the representation with the same > relation. > > Servers SHOULD NOT entirely "swap out" a link for an embedded > resource (or vice versa) because client support for this technique is > OPTIONAL. > > The following examples shows the hypertext cache pattern applied to > an "author" link: > > Before: > > { > "_links": { > "self": { "href": "/books/the-way-of-zen" }, > "author": { "href": "/people/alan-watts" } > } > } > > > After: > > { > "_links": { > "self": { "href": "/blog-post" }, > "author": { "href": "/people/alan-watts" } > }, > "_embedded": { > "author": { > "_links": { "self": { "href": "/people/alan-watts" } }, > "name": "Alan Watts", > "born": "January 6, 1915", > "died": "November 16, 1973" > } > } > } > Are you planning to ease hypertext cache pattern usage? I envisioned an optional flag per link that could be set during the construction of the list of links. When enabled, during serialization (?), the flag would trigger the Spring MVC controller method described by the link, allowing to complete the embedded content with the object returned by the method.
process
hal hypertext cache pattern hi i am using spring mvc with spring hateoas and would like to use hypertext cache pattern as defined in hal specification hypertext cache pattern the hypertext cache pattern allows servers to use embedded resources to dynamically reduce the number of requests a client makes improving the efficiency and performance of the application clients may be automated for this purpose so that for any given link relation they will read from an embedded resource if present in preference to traversing a link to activate this client behaviour for a given link servers should add an embedded resource into the representation with the same relation servers should not entirely swap out a link for an embedded resource or vice versa because client support for this technique is optional the following examples shows the hypertext cache pattern applied to an author link before links self href books the way of zen author href people alan watts after links self href blog post author href people alan watts embedded author links self href people alan watts name alan watts born january died november are you planning to ease hypertext cache pattern usage i envisioned an optional flag per link that could be set during the construction of the list of links when enabled during serialization the flag would trigger the spring mvc controller method described by the link allowing to complete the embedded content with the object returned by the method
1
8,652
11,790,755,317
IssuesEvent
2020-03-17 19:36:17
MicrosoftDocs/vsts-docs
https://api.github.com/repos/MicrosoftDocs/vsts-docs
closed
Please update the page for Azure DevOps Server
Pri1 devops-cicd-process/tech devops/prod doc-bug
https://developercommunity.visualstudio.com/content/problem/941173/facing-a-sequence-was-not-expected-when-applying-y.html --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 6724abea-bbdc-bf66-ed5e-3214fa6c3e66 * Version Independent ID: 4f8dab21-3f0e-da32-cc0e-1d85c13c0065 * Content: [Templates - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops-2019) * Content Source: [docs/pipelines/process/templates.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/templates.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
1.0
Please update the page for Azure DevOps Server - https://developercommunity.visualstudio.com/content/problem/941173/facing-a-sequence-was-not-expected-when-applying-y.html --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 6724abea-bbdc-bf66-ed5e-3214fa6c3e66 * Version Independent ID: 4f8dab21-3f0e-da32-cc0e-1d85c13c0065 * Content: [Templates - Azure Pipelines](https://docs.microsoft.com/en-us/azure/devops/pipelines/process/templates?view=azure-devops-2019) * Content Source: [docs/pipelines/process/templates.md](https://github.com/MicrosoftDocs/vsts-docs/blob/master/docs/pipelines/process/templates.md) * Product: **devops** * Technology: **devops-cicd-process** * GitHub Login: @juliakm * Microsoft Alias: **jukullam**
process
please update the page for azure devops server document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id bbdc version independent id content content source product devops technology devops cicd process github login juliakm microsoft alias jukullam
1
4,616
7,461,361,125
IssuesEvent
2018-03-31 01:52:08
rivine/rivine
https://api.github.com/repos/rivine/rivine
closed
get daemon constants in client at startup
process_wontfix state_inprogress type_feature wontfix
this will allow us to share the constants between client and daemon
1.0
get daemon constants in client at startup - this will allow us to share the constants between client and daemon
process
get daemon constants in client at startup this will allow us to share the constants between client and daemon
1
9,116
12,195,313,496
IssuesEvent
2020-04-29 17:09:59
prisma/prisma-client-js
https://api.github.com/repos/prisma/prisma-client-js
opened
Implement shortcut findOne api
kind/feature process/candidate team/typescript
If you have a schema like this, which only has one id field (no compound id): ```prisma model User { id Int @id @default(autoincrement()) name String } ``` We can implement a short-cut for the where syntax. Instead of ```ts await prisma.user.findOne({ where: { id: 123 }}) ``` we can implement the shortcut ```ts await prisma.user.findOne(123) ``` The same for string-based ids: ```prisma model Post { id String @id @default(cuid()) title String } ``` ```ts await prisma.post.findOne('ck9llgg2x000001lbfcv13fjy') ``` This can already be implemented today in TypeScript, as we don't need any change in the Rust Query Engine for this.
1.0
Implement shortcut findOne api - If you have a schema like this, which only has one id field (no compound id): ```prisma model User { id Int @id @default(autoincrement()) name String } ``` We can implement a short-cut for the where syntax. Instead of ```ts await prisma.user.findOne({ where: { id: 123 }}) ``` we can implement the shortcut ```ts await prisma.user.findOne(123) ``` The same for string-based ids: ```prisma model Post { id String @id @default(cuid()) title String } ``` ```ts await prisma.post.findOne('ck9llgg2x000001lbfcv13fjy') ``` This can already be implemented today in TypeScript, as we don't need any change in the Rust Query Engine for this.
process
implement shortcut findone api if you have a schema like this which only has one id field no compound id prisma model user id int id default autoincrement name string we can implement a short cut for the where syntax instead of ts await prisma user findone where id we can implement the shortcut ts await prisma user findone the same for string based ids prisma model post id string id default cuid title string ts await prisma post findone this can already be implemented today in typescript as we don t need any change in the rust query engine for this
1
185,095
14,332,073,908
IssuesEvent
2020-11-27 01:10:05
Tadukooverse/TadukooUtil
https://api.github.com/repos/Tadukooverse/TadukooUtil
closed
[TESTING] Test Orientation Enum
Tadukoo View Testing
**What change would you like to see?** Testing on orientation enum, similar to the testing on the FontFamilies enum. **How does this change help?** Improves test case coverage, ensures if someone modifies the enum we can catch any bad changes they make. **Additional context** N/A
1.0
[TESTING] Test Orientation Enum - **What change would you like to see?** Testing on orientation enum, similar to the testing on the FontFamilies enum. **How does this change help?** Improves test case coverage, ensures if someone modifies the enum we can catch any bad changes they make. **Additional context** N/A
non_process
test orientation enum what change would you like to see testing on orientation enum similar to the testing on the fontfamilies enum how does this change help improves test case coverage ensures if someone modifies the enum we can catch any bad changes they make additional context n a
0
110,595
9,461,973,819
IssuesEvent
2019-04-17 14:32:24
BeamMW/beam
https://api.github.com/repos/BeamMW/beam
closed
UI Wallet: Connection is randomly interrupted when connected to remote node.
TestQuality bug ux
**Bug description** The connection is randomly interrupted when connected to remote node. This is seen mostly with **eu nodes**. The connection is restored in a few seconds after it's lost. **To Reproduce** 1. Install and start setting up UI Wallet. 2. Choose "connect to a random node" and proceed. 3. Follow the Connection Indicator status while the app is running. **Current behaviour** The connection is lost from time to time. **Expected behaviour** The connection is on all the time. **Build** testnet, 4721
1.0
UI Wallet: Connection is randomly interrupted when connected to remote node. - **Bug description** The connection is randomly interrupted when connected to remote node. This is seen mostly with **eu nodes**. The connection is restored in a few seconds after it's lost. **To Reproduce** 1. Install and start setting up UI Wallet. 2. Choose "connect to a random node" and proceed. 3. Follow the Connection Indicator status while the app is running. **Current behaviour** The connection is lost from time to time. **Expected behaviour** The connection is on all the time. **Build** testnet, 4721
non_process
ui wallet connection is randomly interrupted when connected to remote node bug description the connection is randomly interrupted when connected to remote node this is seen mostly with eu nodes the connection is restored in a few seconds after it s lost to reproduce install and start setting up ui wallet choose quot connect to a random node quot and proceed follow the connection indicator status while the app is running current behaviour the connection is lost from time to time expected behaviour the connection is on all the time build testnet
0
265,999
20,118,274,525
IssuesEvent
2022-02-07 22:06:37
pytorch/serve
https://api.github.com/repos/pytorch/serve
closed
`.md` file spellchecking and linting
bug documentation enhancement
# Problem Many of our docs have some avoidable typos and broken links ## Broken links https://github.com/pytorch/pytorch.github.io/issues/881 https://github.com/pytorch/pytorch.github.io/issues/881 https://github.com/pytorch/pytorch.github.io/issues/878 ## Typos There's a bunch in the examples folder # Solution For spelling issues:`sh ts_scripts/spellchecker.sh` For broken links: `python sanity.py`
1.0
`.md` file spellchecking and linting - # Problem Many of our docs have some avoidable typos and broken links ## Broken links https://github.com/pytorch/pytorch.github.io/issues/881 https://github.com/pytorch/pytorch.github.io/issues/881 https://github.com/pytorch/pytorch.github.io/issues/878 ## Typos There's a bunch in the examples folder # Solution For spelling issues:`sh ts_scripts/spellchecker.sh` For broken links: `python sanity.py`
non_process
md file spellchecking and linting problem many of our docs have some avoidable typos and broken links broken links typos there s a bunch in the examples folder solution for spelling issues sh ts scripts spellchecker sh for broken links python sanity py
0
15,231
19,101,358,010
IssuesEvent
2021-11-29 23:05:26
ooi-data/CE09OSSM-MFD37-03-DOSTAD000-telemetered-dosta_abcdjm_ctdbp_dcl_instrument
https://api.github.com/repos/ooi-data/CE09OSSM-MFD37-03-DOSTAD000-telemetered-dosta_abcdjm_ctdbp_dcl_instrument
closed
🛑 Processing failed: ResponseParserError
process
## Overview `ResponseParserError` found in `processing_task` task during run ended on 2021-09-18T05:47:09.023193. ## Details Flow name: `CE09OSSM-MFD37-03-DOSTAD000-telemetered-dosta_abcdjm_ctdbp_dcl_instrument` Task name: `processing_task` Error type: `ResponseParserError` Error message: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed: b'<?xml version="1.0" encoding="UTF-8"?>\n' <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 452, in _parse_xml_string_to_dom root = parser.close() xml.etree.ElementTree.ParseError: no element found: line 2, column 0 During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 101, in processing final_path = finalize_zarr( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 359, in finalize_zarr source_store.fs.delete(source_store.root, recursive=True) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/spec.py", line 1187, in delete return self.rm(path, recursive=recursive, maxdepth=maxdepth) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 88, in wrapper return sync(self.loop, func, *args, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 69, in sync raise result[0] File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 25, in _runner result[0] = await coro File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1677, in _rm await asyncio.gather( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1657, in _bulk_delete await self._call_s3("delete_objects", kwargs, Bucket=bucket, Delete=delete_keys) File 
"/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 268, in _call_s3 raise err File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 248, in _call_s3 out = await method(**additional_kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 141, in _make_api_call http, parsed_response = await self._make_request( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 161, in _make_request return await self._endpoint.make_request(operation_model, request_dict) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 93, in _send_request success_response, exception = await self._get_response( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 112, in _get_response success_response, exception = await self._do_get_response( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 177, in _do_get_response parsed_response = parser.parse( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 245, in parse parsed = self._do_parse(response, shape) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 813, in _do_parse self._add_modeled_parse(response, shape, final_parsed) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 822, in _add_modeled_parse self._parse_payload(response, shape, member_shapes, final_parsed) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 862, in _parse_payload original_parsed = self._initial_body_parse(response['body']) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 948, in _initial_body_parse return self._parse_xml_string_to_dom(xml_string) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 454, in _parse_xml_string_to_dom 
raise ResponseParserError( botocore.parsers.ResponseParserError: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed: b'<?xml version="1.0" encoding="UTF-8"?>\n' ``` </details>
1.0
🛑 Processing failed: ResponseParserError - ## Overview `ResponseParserError` found in `processing_task` task during run ended on 2021-09-18T05:47:09.023193. ## Details Flow name: `CE09OSSM-MFD37-03-DOSTAD000-telemetered-dosta_abcdjm_ctdbp_dcl_instrument` Task name: `processing_task` Error type: `ResponseParserError` Error message: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed: b'<?xml version="1.0" encoding="UTF-8"?>\n' <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 452, in _parse_xml_string_to_dom root = parser.close() xml.etree.ElementTree.ParseError: no element found: line 2, column 0 During handling of the above exception, another exception occurred: Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/pipeline.py", line 101, in processing final_path = finalize_zarr( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/ooi_harvester/processor/__init__.py", line 359, in finalize_zarr source_store.fs.delete(source_store.root, recursive=True) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/spec.py", line 1187, in delete return self.rm(path, recursive=recursive, maxdepth=maxdepth) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 88, in wrapper return sync(self.loop, func, *args, **kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 69, in sync raise result[0] File "/srv/conda/envs/notebook/lib/python3.8/site-packages/fsspec/asyn.py", line 25, in _runner result[0] = await coro File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1677, in _rm await asyncio.gather( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 1657, in _bulk_delete await self._call_s3("delete_objects", 
kwargs, Bucket=bucket, Delete=delete_keys) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 268, in _call_s3 raise err File "/srv/conda/envs/notebook/lib/python3.8/site-packages/s3fs/core.py", line 248, in _call_s3 out = await method(**additional_kwargs) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 141, in _make_api_call http, parsed_response = await self._make_request( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/client.py", line 161, in _make_request return await self._endpoint.make_request(operation_model, request_dict) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 93, in _send_request success_response, exception = await self._get_response( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 112, in _get_response success_response, exception = await self._do_get_response( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/aiobotocore/endpoint.py", line 177, in _do_get_response parsed_response = parser.parse( File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 245, in parse parsed = self._do_parse(response, shape) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 813, in _do_parse self._add_modeled_parse(response, shape, final_parsed) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 822, in _add_modeled_parse self._parse_payload(response, shape, member_shapes, final_parsed) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 862, in _parse_payload original_parsed = self._initial_body_parse(response['body']) File "/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 948, in _initial_body_parse return self._parse_xml_string_to_dom(xml_string) File 
"/srv/conda/envs/notebook/lib/python3.8/site-packages/botocore/parsers.py", line 454, in _parse_xml_string_to_dom raise ResponseParserError( botocore.parsers.ResponseParserError: Unable to parse response (no element found: line 2, column 0), invalid XML received. Further retries may succeed: b'<?xml version="1.0" encoding="UTF-8"?>\n' ``` </details>
process
🛑 processing failed responseparsererror overview responseparsererror found in processing task task during run ended on details flow name telemetered dosta abcdjm ctdbp dcl instrument task name processing task error type responseparsererror error message unable to parse response no element found line column invalid xml received further retries may succeed b n traceback traceback most recent call last file srv conda envs notebook lib site packages botocore parsers py line in parse xml string to dom root parser close xml etree elementtree parseerror no element found line column during handling of the above exception another exception occurred traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize zarr file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize zarr source store fs delete source store root recursive true file srv conda envs notebook lib site packages fsspec spec py line in delete return self rm path recursive recursive maxdepth maxdepth file srv conda envs notebook lib site packages fsspec asyn py line in wrapper return sync self loop func args kwargs file srv conda envs notebook lib site packages fsspec asyn py line in sync raise result file srv conda envs notebook lib site packages fsspec asyn py line in runner result await coro file srv conda envs notebook lib site packages core py line in rm await asyncio gather file srv conda envs notebook lib site packages core py line in bulk delete await self call delete objects kwargs bucket bucket delete delete keys file srv conda envs notebook lib site packages core py line in call raise err file srv conda envs notebook lib site packages core py line in call out await method additional kwargs file srv conda envs notebook lib site packages aiobotocore client py line in make api call http parsed response await self make request file srv conda envs notebook lib site packages 
aiobotocore client py line in make request return await self endpoint make request operation model request dict file srv conda envs notebook lib site packages aiobotocore endpoint py line in send request success response exception await self get response file srv conda envs notebook lib site packages aiobotocore endpoint py line in get response success response exception await self do get response file srv conda envs notebook lib site packages aiobotocore endpoint py line in do get response parsed response parser parse file srv conda envs notebook lib site packages botocore parsers py line in parse parsed self do parse response shape file srv conda envs notebook lib site packages botocore parsers py line in do parse self add modeled parse response shape final parsed file srv conda envs notebook lib site packages botocore parsers py line in add modeled parse self parse payload response shape member shapes final parsed file srv conda envs notebook lib site packages botocore parsers py line in parse payload original parsed self initial body parse response file srv conda envs notebook lib site packages botocore parsers py line in initial body parse return self parse xml string to dom xml string file srv conda envs notebook lib site packages botocore parsers py line in parse xml string to dom raise responseparsererror botocore parsers responseparsererror unable to parse response no element found line column invalid xml received further retries may succeed b n
1
272,967
29,800,316,033
IssuesEvent
2023-06-16 07:35:30
billmcchesney1/foxtrot
https://api.github.com/repos/billmcchesney1/foxtrot
closed
CVE-2017-15713 (Medium) detected in hadoop-common-2.5.1.jar - autoclosed
Mend: dependency security vulnerability
## CVE-2017-15713 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hadoop-common-2.5.1.jar</b></p></summary> <p>Apache Hadoop Common</p> <p>Path to dependency file: /foxtrot-server/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar,/home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar,/home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar,/home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar</p> <p> Dependency Hierarchy: - hbase-server-1.2.1.jar (Root Library) - :x: **hadoop-common-2.5.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/foxtrot/commit/ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc">ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> Vulnerability in Apache Hadoop 0.23.x, 2.x before 2.7.5, 2.8.x before 2.8.3, and 3.0.0-alpha through 3.0.0-beta1 allows a cluster user to expose private files owned by the user running the MapReduce job history server process. The malicious user can construct a configuration file containing XML directives that reference sensitive files on the MapReduce job history server host. 
<p>Publish Date: 2018-01-19 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-15713>CVE-2017-15713</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://lists.apache.org/thread.html/a790a251ace7213bde9f69777dedb453b1a01a6d18289c14a61d4f91@%3Cgeneral.hadoop.apache.org%3E">https://lists.apache.org/thread.html/a790a251ace7213bde9f69777dedb453b1a01a6d18289c14a61d4f91@%3Cgeneral.hadoop.apache.org%3E</a></p> <p>Release Date: 2018-01-19</p> <p>Fix Resolution (org.apache.hadoop:hadoop-common): 2.8.3</p> <p>Direct dependency fix Resolution (org.apache.hbase:hbase-server): 1.2.3</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
True
CVE-2017-15713 (Medium) detected in hadoop-common-2.5.1.jar - autoclosed - ## CVE-2017-15713 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>hadoop-common-2.5.1.jar</b></p></summary> <p>Apache Hadoop Common</p> <p>Path to dependency file: /foxtrot-server/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar,/home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar,/home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar,/home/wss-scanner/.m2/repository/org/apache/hadoop/hadoop-common/2.5.1/hadoop-common-2.5.1.jar</p> <p> Dependency Hierarchy: - hbase-server-1.2.1.jar (Root Library) - :x: **hadoop-common-2.5.1.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/billmcchesney1/foxtrot/commit/ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc">ffb8a6014463ce8aac1bf6e7dc9a23fc4a2a8adc</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary> <p> Vulnerability in Apache Hadoop 0.23.x, 2.x before 2.7.5, 2.8.x before 2.8.3, and 3.0.0-alpha through 3.0.0-beta1 allows a cluster user to expose private files owned by the user running the MapReduce job history server process. The malicious user can construct a configuration file containing XML directives that reference sensitive files on the MapReduce job history server host. 
<p>Publish Date: 2018-01-19 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-15713>CVE-2017-15713</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://lists.apache.org/thread.html/a790a251ace7213bde9f69777dedb453b1a01a6d18289c14a61d4f91@%3Cgeneral.hadoop.apache.org%3E">https://lists.apache.org/thread.html/a790a251ace7213bde9f69777dedb453b1a01a6d18289c14a61d4f91@%3Cgeneral.hadoop.apache.org%3E</a></p> <p>Release Date: 2018-01-19</p> <p>Fix Resolution (org.apache.hadoop:hadoop-common): 2.8.3</p> <p>Direct dependency fix Resolution (org.apache.hbase:hbase-server): 1.2.3</p> </p> </details> <p></p> *** <!-- REMEDIATE-OPEN-PR-START --> - [ ] Check this box to open an automated fix PR <!-- REMEDIATE-OPEN-PR-END -->
non_process
cve medium detected in hadoop common jar autoclosed cve medium severity vulnerability vulnerable library hadoop common jar apache hadoop common path to dependency file foxtrot server pom xml path to vulnerable library home wss scanner repository org apache hadoop hadoop common hadoop common jar home wss scanner repository org apache hadoop hadoop common hadoop common jar home wss scanner repository org apache hadoop hadoop common hadoop common jar home wss scanner repository org apache hadoop hadoop common hadoop common jar dependency hierarchy hbase server jar root library x hadoop common jar vulnerable library found in head commit a href found in base branch master vulnerability details vulnerability in apache hadoop x x before x before and alpha through allows a cluster user to expose private files owned by the user running the mapreduce job history server process the malicious user can construct a configuration file containing xml directives that reference sensitive files on the mapreduce job history server host publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache hadoop hadoop common direct dependency fix resolution org apache hbase hbase server check this box to open an automated fix pr
0
75,772
7,482,401,437
IssuesEvent
2018-04-05 01:08:49
gregswindle/github-resource-converter
https://api.github.com/repos/gregswindle/github-resource-converter
opened
feat(notifications): notify consumers of product updates
good first issue help wanted priority: low status: available type: feature type: test
## 1. User story summary As a consumer, I would like to know whenever there are updates available for `grc` In order to benefit from product enhancements and security fixes. ## 2. Acceptance criteria * [ ] [`update-notifier`](https://github.com/yeoman/update-notifier) provides update notifications via the CLI. <!-- ⛔️ Do not remove anything below this comment. ⛔️ -->
1.0
feat(notifications): notify consumers of product updates - ## 1. User story summary As a consumer, I would like to know whenever there are updates available for `grc` In order to benefit from product enhancements and security fixes. ## 2. Acceptance criteria * [ ] [`update-notifier`](https://github.com/yeoman/update-notifier) provides update notifications via the CLI. <!-- ⛔️ Do not remove anything below this comment. ⛔️ -->
non_process
feat notifications notify consumers of product updates user story summary as a consumer i would like to know whenever there are updates available for grc in order to benefit from product enhancements and security fixes acceptance criteria provides update notifications via the cli
0
20,379
27,031,699,579
IssuesEvent
2023-02-12 09:24:31
allinurl/goaccess
https://api.github.com/repos/allinurl/goaccess
closed
goaccess doesn't work in cron
log-processing cron
Ubuntu 20.04, Goaccess 1.7 I've a script named `stat.sh`.When I run it from shell `/root/bin/stat.sh`, it works as expectd, ```sh #!/bin/sh echo begin... /usr/bin/zcat -f -- /root/logs/access* | /usr/bin/goaccess --log-format='%h [%d:%t %^] "%r" %s %b "%R" "%u" %T' --date-format=%d/%b/%Y --time-format=%T -o /root/report.html echo end... ``` but when I place it in `/etc/crontab`, it just not works, there's no `report.html` found: ``` 33 17 * * * root /root/bin/stat.sh ```
1.0
goaccess doesn't work in cron - Ubuntu 20.04, Goaccess 1.7 I've a script named `stat.sh`.When I run it from shell `/root/bin/stat.sh`, it works as expectd, ```sh #!/bin/sh echo begin... /usr/bin/zcat -f -- /root/logs/access* | /usr/bin/goaccess --log-format='%h [%d:%t %^] "%r" %s %b "%R" "%u" %T' --date-format=%d/%b/%Y --time-format=%T -o /root/report.html echo end... ``` but when I place it in `/etc/crontab`, it just not works, there's no `report.html` found: ``` 33 17 * * * root /root/bin/stat.sh ```
process
goaccess doesn t work in cron ubuntu goaccess i ve a script named stat sh when i run it from shell root bin stat sh it works as expectd sh bin sh echo begin usr bin zcat f root logs access usr bin goaccess log format h r s b r u t date format d b y time format t o root report html echo end but when i place it in etc crontab it just not works there s no report html found root root bin stat sh
1
5,948
8,773,619,937
IssuesEvent
2018-12-18 17:21:14
RIOT-OS/RIOT
https://api.github.com/repos/RIOT-OS/RIOT
closed
Tracker: Implement conditional compilation of GPIO IRQ based on `periph_gpio_irq` feature
Area: drivers Process: API change Type: tracking
#### Description #9845 made it possible to build GPIO drivers without support for external interrupts, very useful indeed. But sadly, except for the `cpu/msp430fxyz`, this feature is not implemented (e.g. missing ifdefs) for any other existing GPIO driver. So in other words, using the `PERIPH_GPIO_IRQ` module (or not) has no effect on most boards... IMHO when touching a core interface like the `periph/gpio` driver, it should be made sure that the changes are actually implemented for existing implementations (as in this case). Or at least there should be some tracking on the conversion status or similar. Please simply close this issue if I overlooked any ongoing PR or already open issue on this topic... GPIO implementations which need updating: - [x] `periph/gpio.h` header fix scope of the `periph_gpio_irq` functions #9978 - [x] Adapt `gpio_init_int` name as suggested by https://github.com/RIOT-OS/RIOT/pull/9845#issuecomment-416875712 with an alias for the old name - [x] atmega_common - #10002 - [x] cc2538 - #9994 - [x] cc26x0 #9999 - [x] efm32 - #10000 - [x] esp8266 - #10001 - [x] ezr32wg - #9993 - [x] fe310 - #10007 - [x] kinetis - #9998 - [x] lm4f120 - #10004 - [x] lpc1768 - #9995 - [x] lpc2387 - #9997 - [x] msp430fxyz - ~~#9845~~ merged - [x] native - #9976 - [x] nrf5x_common - #10005 - [x] sam0_common - #9996 - [x] sam3 - #10006 - [x] stm32_common - #10003 - [x] stm32f1 - #10008
1.0
Tracker: Implement conditional compilation of GPIO IRQ based on `periph_gpio_irq` feature - #### Description #9845 made it possible to build GPIO drivers without support for external interrupts, very useful indeed. But sadly, except for the `cpu/msp430fxyz`, this feature is not implemented (e.g. missing ifdefs) for any other existing GPIO driver. So in other words, using the `PERIPH_GPIO_IRQ` module (or not) has no effect on most boards... IMHO when touching a core interface like the `periph/gpio` driver, it should be made sure that the changes are actually implemented for existing implementations (as in this case). Or at least there should be some tracking on the conversion status or similar. Please simply close this issue if I overlooked any ongoing PR or already open issue on this topic... GPIO implementations which need updating: - [x] `periph/gpio.h` header fix scope of the `periph_gpio_irq` functions #9978 - [x] Adapt `gpio_init_int` name as suggested by https://github.com/RIOT-OS/RIOT/pull/9845#issuecomment-416875712 with an alias for the old name - [x] atmega_common - #10002 - [x] cc2538 - #9994 - [x] cc26x0 #9999 - [x] efm32 - #10000 - [x] esp8266 - #10001 - [x] ezr32wg - #9993 - [x] fe310 - #10007 - [x] kinetis - #9998 - [x] lm4f120 - #10004 - [x] lpc1768 - #9995 - [x] lpc2387 - #9997 - [x] msp430fxyz - ~~#9845~~ merged - [x] native - #9976 - [x] nrf5x_common - #10005 - [x] sam0_common - #9996 - [x] sam3 - #10006 - [x] stm32_common - #10003 - [x] stm32f1 - #10008
process
tracker implement conditional compilation of gpio irq based on periph gpio irq feature description made it possible to build gpio drivers without support for external interrupts very useful indeed but sadly except for the cpu this feature is not implemented e g missing ifdefs for any other existing gpio driver so in other words using the periph gpio irq module or not has no effect on most boards imho when touching a core interface like the periph gpio driver it should be made sure that the changes are actually implemented for existing implementations as in this case or at least there should be some tracking on the conversion status or similar please simply close this issue if i overlooked any ongoing pr or already open issue on this topic gpio implementations which need updating periph gpio h header fix scope of the periph gpio irq functions adapt gpio init int name as suggested by with an alias for the old name atmega common kinetis merged native common common common
1
335,711
10,165,351,159
IssuesEvent
2019-08-07 13:44:46
thespacedoctor/marshall_webapp_wiki
https://api.github.com/repos/thespacedoctor/marshall_webapp_wiki
closed
mark SNe as “favourites” on a user level
priority: 3 status: duplicate type: enhancement
his would be very useful to keep track of peculiar transients inside the Marshall - from Steve Schulze
1.0
mark SNe as “favourites” on a user level - his would be very useful to keep track of peculiar transients inside the Marshall - from Steve Schulze
non_process
mark sne as “favourites” on a user level his would be very useful to keep track of peculiar transients inside the marshall from steve schulze
0
235,809
19,430,298,080
IssuesEvent
2021-12-21 11:07:56
cockroachdb/cockroach
https://api.github.com/repos/cockroachdb/cockroach
closed
roachtest: jepsen/multi-register/majority-ring failed
C-test-failure O-robot O-roachtest branch-master release-blocker
roachtest.jepsen/multi-register/majority-ring [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3948712&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3948712&tab=artifacts#/jepsen/multi-register/majority-ring) on master @ [3f95a4bd83cce2952a12497de82692866b4da659](https://github.com/cockroachdb/cockroach/commits/3f95a4bd83cce2952a12497de82692866b4da659): ``` | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2050 | github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.runJepsen.func1 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:172 | github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.runJepsen.func3 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:210 | runtime.goexit | /usr/local/go/src/runtime/asm_amd64.s:1581 Wraps: (2) output in run_133646.590238777_n6_bash Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3948712-1639818405-72-n6cpu4:6 -- bash -e -c "\ | cd /mnt/data1/jepsen/cockroachdb && set -eo pipefail && \ | ~/lein run test \ | --tarball file://${PWD}/cockroach.tgz \ | --username ${USER} \ | --ssh-private-key ~/.ssh/id_rsa \ | --os ubuntu \ | --time-limit 300 \ | --concurrency 30 \ | --recovery-time 25 \ | --test-count 1 \ | -n 10.142.0.162 -n 10.142.0.169 -n 10.142.1.6 -n 10.142.0.254 -n 10.142.1.3 \ | --test multi-register --nemesis majority-ring \ | > invoke.log 2>&1 \ | " returned | stderr: | Error: SSH_PROBLEM: exit status 255 | (1) SSH_PROBLEM | Wraps: (2) Node 6. 
Command with error: | | `````` | | bash -e -c "\ | | cd /mnt/data1/jepsen/cockroachdb && set -eo pipefail && \ | | ~/lein run test \ | | --tarball file://${PWD}/cockroach.tgz \ | | --username ${USER} \ | | --ssh-private-key ~/.ssh/id_rsa \ | | --os ubuntu \ | | --time-limit 300 \ | | --concurrency 30 \ | | --recovery-time 25 \ | | --test-count 1 \ | | -n 10.142.0.162 -n 10.142.0.169 -n 10.142.1.6 -n 10.142.0.254 -n 10.142.1.3 \ | | --test multi-register --nemesis majority-ring \ | | > invoke.log 2>&1 \ | | " | | `````` | Wraps: (3) exit status 255 | Error types: (1) errors.SSH (2) *hintdetail.withDetail (3) *exec.ExitError | | stdout: Wraps: (4) exit status 10 Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) *exec.ExitError ``` <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/kv-triage <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jepsen/multi-register/majority-ring.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
2.0
roachtest: jepsen/multi-register/majority-ring failed - roachtest.jepsen/multi-register/majority-ring [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3948712&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3948712&tab=artifacts#/jepsen/multi-register/majority-ring) on master @ [3f95a4bd83cce2952a12497de82692866b4da659](https://github.com/cockroachdb/cockroach/commits/3f95a4bd83cce2952a12497de82692866b4da659): ``` | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2050 | github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.runJepsen.func1 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:172 | github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests.runJepsen.func3 | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tests/jepsen.go:210 | runtime.goexit | /usr/local/go/src/runtime/asm_amd64.s:1581 Wraps: (2) output in run_133646.590238777_n6_bash Wraps: (3) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-3948712-1639818405-72-n6cpu4:6 -- bash -e -c "\ | cd /mnt/data1/jepsen/cockroachdb && set -eo pipefail && \ | ~/lein run test \ | --tarball file://${PWD}/cockroach.tgz \ | --username ${USER} \ | --ssh-private-key ~/.ssh/id_rsa \ | --os ubuntu \ | --time-limit 300 \ | --concurrency 30 \ | --recovery-time 25 \ | --test-count 1 \ | -n 10.142.0.162 -n 10.142.0.169 -n 10.142.1.6 -n 10.142.0.254 -n 10.142.1.3 \ | --test multi-register --nemesis majority-ring \ | > invoke.log 2>&1 \ | " returned | stderr: | Error: SSH_PROBLEM: exit status 255 | (1) SSH_PROBLEM | Wraps: (2) Node 6. 
Command with error: | | `````` | | bash -e -c "\ | | cd /mnt/data1/jepsen/cockroachdb && set -eo pipefail && \ | | ~/lein run test \ | | --tarball file://${PWD}/cockroach.tgz \ | | --username ${USER} \ | | --ssh-private-key ~/.ssh/id_rsa \ | | --os ubuntu \ | | --time-limit 300 \ | | --concurrency 30 \ | | --recovery-time 25 \ | | --test-count 1 \ | | -n 10.142.0.162 -n 10.142.0.169 -n 10.142.1.6 -n 10.142.0.254 -n 10.142.1.3 \ | | --test multi-register --nemesis majority-ring \ | | > invoke.log 2>&1 \ | | " | | `````` | Wraps: (3) exit status 255 | Error types: (1) errors.SSH (2) *hintdetail.withDetail (3) *exec.ExitError | | stdout: Wraps: (4) exit status 10 Error types: (1) *withstack.withStack (2) *errutil.withPrefix (3) *cluster.WithCommandDetails (4) *exec.ExitError ``` <details><summary>Help</summary> <p> See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md) See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7) </p> </details> /cc @cockroachdb/kv-triage <sub> [This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*jepsen/multi-register/majority-ring.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues) </sub>
non_process
roachtest jepsen multi register majority ring failed roachtest jepsen multi register majority ring with on master home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go github com cockroachdb cockroach pkg cmd roachtest tests runjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest tests jepsen go github com cockroachdb cockroach pkg cmd roachtest tests runjepsen home agent work go src github com cockroachdb cockroach pkg cmd roachtest tests jepsen go runtime goexit usr local go src runtime asm s wraps output in run bash wraps home agent work go src github com cockroachdb cockroach bin roachprod run teamcity bash e c cd mnt jepsen cockroachdb set eo pipefail lein run test tarball file pwd cockroach tgz username user ssh private key ssh id rsa os ubuntu time limit concurrency recovery time test count n n n n n test multi register nemesis majority ring invoke log returned stderr error ssh problem exit status ssh problem wraps node command with error bash e c cd mnt jepsen cockroachdb set eo pipefail lein run test tarball file pwd cockroach tgz username user ssh private key ssh id rsa os ubuntu time limit concurrency recovery time test count n n n n n test multi register nemesis majority ring invoke log wraps exit status error types errors ssh hintdetail withdetail exec exiterror stdout wraps exit status error types withstack withstack errutil withprefix cluster withcommanddetails exec exiterror help see see cc cockroachdb kv triage
0
213,164
16,503,644,409
IssuesEvent
2021-05-25 16:38:29
EscolaDeSaudePublica/SACS
https://api.github.com/repos/EscolaDeSaudePublica/SACS
closed
Implementado e testado
correção de erros tarefa de rotina testes
### Objetivo **Como** desenvolvedor **Quando** o script do banco estiver pronto **Então** quero produzir a aplicação para inscrições na seleção ### Escopo O edital precisa estar no ar no dia 28/05/2021 conforme especificado. **Critérios de Aceitação:** - [x] sistema implementado - [x] sistema testado :point_left:
1.0
Implementado e testado - ### Objetivo **Como** desenvolvedor **Quando** o script do banco estiver pronto **Então** quero produzir a aplicação para inscrições na seleção ### Escopo O edital precisa estar no ar no dia 28/05/2021 conforme especificado. **Critérios de Aceitação:** - [x] sistema implementado - [x] sistema testado :point_left:
non_process
implementado e testado objetivo como desenvolvedor quando o script do banco estiver pronto então quero produzir a aplicação para inscrições na seleção escopo o edital precisa estar no ar no dia conforme especificado critérios de aceitação sistema implementado sistema testado point left
0
425,001
29,188,016,736
IssuesEvent
2023-05-19 17:09:17
dagster-io/dagster
https://api.github.com/repos/dagster-io/dagster
closed
[Content Gap] example of IOManager that uses resource_config
documentation content-gap
## What content is missing <!-- Help us to understand your request in context --> ## Type of the content <!-- Among the following items, pick a doc type that the content would most likely belong to - Tutorial - Main Concept: "What is X in Dagster" where X is a Dagster concept. - Integration Guide: "How to use X in Dagster" where X is a 3rd-party package/tool, e.g. dbt, Spark, etc - Best Practices Guide: "How to do X in Dagster" where X is usually a use case and can be addressed by incorporating multiple Dagster concepts at a time. - Deployment Guide: "How to deploy Dagster to X", e.g. Docker, K8S, GCP, etc - API Reference: e.g. docstring --> ## (Optional) Anything in particular you want the docs to cover <!-- Do you have exactly anything in mind you are looking for Examples: - Show how to test a partition set - Include a diagram to explain the relations between inputs and outputs --> ## (Optional) Target Date <!-- When do you want to have this by --> ## Writer's Guide - [Docs README](https://github.com/dagster-io/dagster/tree/master/docs) - Templates to follow: <fill by the docs team> - Examples for reference: <fill by the docs team> --- #### Message from the maintainers: Are you looking for the same documentation content? Give it a :thumbsup:. We factor engagement into prioritization.
1.0
[Content Gap] example of IOManager that uses resource_config - ## What content is missing <!-- Help us to understand your request in context --> ## Type of the content <!-- Among the following items, pick a doc type that the content would most likely belong to - Tutorial - Main Concept: "What is X in Dagster" where X is a Dagster concept. - Integration Guide: "How to use X in Dagster" where X is a 3rd-party package/tool, e.g. dbt, Spark, etc - Best Practices Guide: "How to do X in Dagster" where X is usually a use case and can be addressed by incorporating multiple Dagster concepts at a time. - Deployment Guide: "How to deploy Dagster to X", e.g. Docker, K8S, GCP, etc - API Reference: e.g. docstring --> ## (Optional) Anything in particular you want the docs to cover <!-- Do you have exactly anything in mind you are looking for Examples: - Show how to test a partition set - Include a diagram to explain the relations between inputs and outputs --> ## (Optional) Target Date <!-- When do you want to have this by --> ## Writer's Guide - [Docs README](https://github.com/dagster-io/dagster/tree/master/docs) - Templates to follow: <fill by the docs team> - Examples for reference: <fill by the docs team> --- #### Message from the maintainers: Are you looking for the same documentation content? Give it a :thumbsup:. We factor engagement into prioritization.
non_process
example of iomanager that uses resource config what content is missing type of the content among the following items pick a doc type that the content would most likely belong to tutorial main concept what is x in dagster where x is a dagster concept integration guide how to use x in dagster where x is a party package tool e g dbt spark etc best practices guide how to do x in dagster where x is usually a use case and can be addressed by incorporating multiple dagster concepts at a time deployment guide how to deploy dagster to x e g docker gcp etc api reference e g docstring optional anything in particular you want the docs to cover do you have exactly anything in mind you are looking for examples show how to test a partition set include a diagram to explain the relations between inputs and outputs optional target date writer s guide templates to follow examples for reference message from the maintainers are you looking for the same documentation content give it a thumbsup we factor engagement into prioritization
0
175,334
27,831,092,763
IssuesEvent
2023-03-20 05:01:33
hashgraph/hedera-services
https://api.github.com/repos/hashgraph/hedera-services
closed
Design enablement of traceability actions side car in mainnet
Feature Enhancement Design Hedera Smart Contract Service
### Problem Currently the traceability side cars are disabled on mainnet and exposed only in previewnet and testnet ### Solution Enable traceability side cars and migration in mainnet - [x] Design discussions around storage and perf impacts - [x] Design for cross contract and precompile involved contracts - ~[ ] Create work item tickets~ ### Alternatives _No response_
1.0
Design enablement of traceability actions side car in mainnet - ### Problem Currently the traceability side cars are disabled on mainnet and exposed only in previewnet and testnet ### Solution Enable traceability side cars and migration in mainnet - [x] Design discussions around storage and perf impacts - [x] Design for cross contract and precompile involved contracts - ~[ ] Create work item tickets~ ### Alternatives _No response_
non_process
design enablement of traceability actions side car in mainnet problem currently the traceability side cars are disabled on mainnet and exposed only in previewnet and testnet solution enable traceability side cars and migration in mainnet design discussions around storage and perf impacts design for cross contract and precompile involved contracts create work item tickets alternatives no response
0
10,752
13,542,977,946
IssuesEvent
2020-09-16 18:12:58
zcash/zips
https://api.github.com/repos/zcash/zips
opened
NUP: the New NUPenning
ZIP process
Goals: - encode the formulation of new NU's via the ZIP process [1] - making sure that our partners and users have a predictable, reliable schedule of NU's to plan around - proposed consensus changes for a new NU via the ZIP process should be 'ready to go' in software, modulo some integration work with other proposed ZIPs perhaps, before being accepted for the next NU - making sure our users have a well-greased process for software updates and changes, but which is not 1:1 with the NUP/network update changes Up for discussion: - If we don't have anything ready to go into a NU, we don't _have_ to do one. - Requiring audits of implementation / specification as part of the NUP, vs baking in enough time in the NUP for parties that audit to do so [1] "## A suggested replacement --- let's mirror and sync the ZIP/trademark process That's not to say the NUP is all bad. It's not! There's some great guidance in there, particularly after features have been selected for a hard fork upgrade. It's a thoughtfully designed process by the ECC. But to date, the actual ZIP selection process — and choosing when/which ZIPs to be a part of a hard-fork upgrade — has happened mostly opaquely. We'd like to change that, and then set a process that doesn't set an regular-cadence deadline to the upgrade, but instead relies on the ZIP editors to "bundle" suggested consensus-changing ZIPs into a given upgrade. - Create a new ZIP category called "network upgrade", a meta category of sorts, that would include a list of ZIPs approved by editors. While this meta-ZIP is in the "draft" state ZIPs can still be added/modified by Editors. - ZIP Editors decide which ZIPs (which have to be previously vetted/discussed by editors) wind up in these "network upgrade" ZIPs. - Any ZIP Editor can propose that a "network upgrade" moves to "proposed" state, at which point all ZIP Editors must unanimously agree to a implementation/audit timeline and hard-fork schedule, which will be enshrined based on block time. 
Some pieces of the NUP could be an inspiration here. - Once all sub-ZIPs within a "network upgrade" have been implemented, the status changes to network-upgrade ZIP changes to "implemented". - Once activated the "network upgrade" ZIP becomes "final/active." - If a ZIP within a "network upgrade" proposal is judged by _any_ ZIP Editor has having potentially political/economic implications (e.g. like the dev fund), a ZIP Editor can short-circuit to the process to call for community input in the decision, upon which it would follow a similar debate/polling process to the dev fund. If this is viewed as something that may delay a given hard fork, it can/should be tabled to the "next" network upgrade. Modifying ZIP 0 to prescribe this process is a good idea, and would scale as we get more teams building independent implementations (or contributing to zcashd and zebra!). Doing it this way would more closely match the trademark agreement, and since both the trademark agreement and the ZIP Editor selection can be expanded to include other parties (e.g., perhaps an MGRC rep?) in the future, it makes sense for the network-upgrade schedule to be in sync with these processes."
1.0
NUP: the New NUPenning - Goals: - encode the formulation of new NU's via the ZIP process [1] - making sure that our partners and users have a predictable, reliable schedule of NU's to plan around - proposed consensus changes for a new NU via the ZIP process should be 'ready to go' in software, modulo some integration work with other proposed ZIPs perhaps, before being accepted for the next NU - making sure our users have a well-greased process for software updates and changes, but which is not 1:1 with the NUP/network update changes Up for discussion: - If we don't have anything ready to go into a NU, we don't _have_ to do one. - Requiring audits of implementation / specification as part of the NUP, vs baking in enough time in the NUP for parties that audit to do so [1] "## A suggested replacement --- let's mirror and sync the ZIP/trademark process That's not to say the NUP is all bad. It's not! There's some great guidance in there, particularly after features have been selected for a hard fork upgrade. It's a thoughtfully designed process by the ECC. But to date, the actual ZIP selection process — and choosing when/which ZIPs to be a part of a hard-fork upgrade — has happened mostly opaquely. We'd like to change that, and then set a process that doesn't set an regular-cadence deadline to the upgrade, but instead relies on the ZIP editors to "bundle" suggested consensus-changing ZIPs into a given upgrade. - Create a new ZIP category called "network upgrade", a meta category of sorts, that would include a list of ZIPs approved by editors. While this meta-ZIP is in the "draft" state ZIPs can still be added/modified by Editors. - ZIP Editors decide which ZIPs (which have to be previously vetted/discussed by editors) wind up in these "network upgrade" ZIPs. 
- Any ZIP Editor can propose that a "network upgrade" moves to "proposed" state, at which point all ZIP Editors must unanimously agree to a implementation/audit timeline and hard-fork schedule, which will be enshrined based on block time. Some pieces of the NUP could be an inspiration here. - Once all sub-ZIPs within a "network upgrade" have been implemented, the status changes to network-upgrade ZIP changes to "implemented". - Once activated the "network upgrade" ZIP becomes "final/active." - If a ZIP within a "network upgrade" proposal is judged by _any_ ZIP Editor has having potentially political/economic implications (e.g. like the dev fund), a ZIP Editor can short-circuit to the process to call for community input in the decision, upon which it would follow a similar debate/polling process to the dev fund. If this is viewed as something that may delay a given hard fork, it can/should be tabled to the "next" network upgrade. Modifying ZIP 0 to prescribe this process is a good idea, and would scale as we get more teams building independent implementations (or contributing to zcashd and zebra!). Doing it this way would more closely match the trademark agreement, and since both the trademark agreement and the ZIP Editor selection can be expanded to include other parties (e.g., perhaps an MGRC rep?) in the future, it makes sense for the network-upgrade schedule to be in sync with these processes."
process
nup the new nupenning goals encode the formulation of new nu s via the zip process making sure that our partners and users have a predictable reliable schedule of nu s to plan around proposed consensus changes for a new nu via the zip process should be ready to go in software modulo some integration work with other proposed zips perhaps before being accepted for the next nu making sure our users have a well greased process for software updates and changes but which is not with the nup network update changes up for discussion if we don t have anything ready to go into a nu we don t have to do one requiring audits of implementation specification as part of the nup vs baking in enough time in the nup for parties that audit to do so a suggested replacement let s mirror and sync the zip trademark process that s not to say the nup is all bad it s not there s some great guidance in there particularly after features have been selected for a hard fork upgrade it s a thoughtfully designed process by the ecc but to date the actual zip selection process — and choosing when which zips to be a part of a hard fork upgrade — has happened mostly opaquely we d like to change that and then set a process that doesn t set an regular cadence deadline to the upgrade but instead relies on the zip editors to bundle suggested consensus changing zips into a given upgrade create a new zip category called network upgrade a meta category of sorts that would include a list of zips approved by editors while this meta zip is in the draft state zips can still be added modified by editors zip editors decide which zips which have to be previously vetted discussed by editors wind up in these network upgrade zips any zip editor can propose that a network upgrade moves to proposed state at which point all zip editors must unanimously agree to a implementation audit timeline and hard fork schedule which will be enshrined based on block time some pieces of the nup could be an inspiration here once all sub 
zips within a network upgrade have been implemented the status changes to network upgrade zip changes to implemented once activated the network upgrade zip becomes final active if a zip within a network upgrade proposal is judged by any zip editor has having potentially political economic implications e g like the dev fund a zip editor can short circuit to the process to call for community input in the decision upon which it would follow a similar debate polling process to the dev fund if this is viewed as something that may delay a given hard fork it can should be tabled to the next network upgrade modifying zip to prescribe this process is a good idea and would scale as we get more teams building independent implementations or contributing to zcashd and zebra doing it this way would more closely match the trademark agreement and since both the trademark agreement and the zip editor selection can be expanded to include other parties e g perhaps an mgrc rep in the future it makes sense for the network upgrade schedule to be in sync with these processes
1
4,552
7,381,896,771
IssuesEvent
2018-03-15 01:27:30
mlmasters/toxic-comments
https://api.github.com/repos/mlmasters/toxic-comments
closed
Data quality/cleaning: do we need it?
preprocessing
Our comments are dirty. There is an abundance of misspellings, excessive/incorrect punctuation, auto-generated messages, wikipedia markup, and more. This issue is a catch-all for data quality issues. Questions: - Do we need to worry about this? If we vectorize the text then many problems disappear. Misspellings, markup, and auto-messages remain, however. How should we deal with these? - Are there other aspects of data quality not mentioned above?
1.0
Data quality/cleaning: do we need it? - Our comments are dirty. There is an abundance of misspellings, excessive/incorrect punctuation, auto-generated messages, wikipedia markup, and more. This issue is a catch-all for data quality issues. Questions: - Do we need to worry about this? If we vectorize the text then many problems disappear. Misspellings, markup, and auto-messages remain, however. How should we deal with these? - Are there other aspects of data quality not mentioned above?
process
data quality cleaning do we need it our comments are dirty there is an abundance of misspellings excessive incorrect punctuation auto generated messages wikipedia markup and more this issue is a catch all for data quality issues questions do we need to worry about this if we vectorize the text then many problems disappear misspellings markup and auto messages remain however how should we deal with these are there other aspects of data quality not mentioned above
1
95,844
27,634,966,186
IssuesEvent
2023-03-10 13:48:32
chanzuckerberg/cell-census
https://api.github.com/repos/chanzuckerberg/cell-census
closed
Builder unit test should use at least 2 datasets per organism
cell census builder P1
Builder unit test should use at least 2 datasets per organism to ensure the var axis building is correctly creating the union of features for the var axis. An organism's datasets should have partially overlapping features.
1.0
Builder unit test should use at least 2 datasets per organism - Builder unit test should use at least 2 datasets per organism to ensure the var axis building is correctly creating the union of features for the var axis. An organism's datasets should have partially overlapping features.
non_process
builder unit test should use at least datasets per organism builder unit test should use at least datasets per organism to ensure the var axis building is correctly creating the union of features for the var axis an organism s datasets should have partially overlapping features
0
1,672
4,309,512,390
IssuesEvent
2016-07-21 16:14:02
pelias/api
https://api.github.com/repos/pelias/api
closed
[discuss] uncaught errors killing the process
processed question
With our current controller configuration there is potential for dodgy code nested deep inside async callbacks to throw Errors which miss all the synchronous `try/catch` blocks and make their way all the way up the stack to trigger `process.on('uncaughtException')` and promptly `exit(1)` the whole process. This is obviously super-bad as all the connections get dropped, the API goes down on that box and the process has to be restarted, also the load balancers get affected et cetera, it's just bad. At time of writing this has only happened once and it was only triggered in an extreme edge-case which wasn't covered by unit testing; however there is potential for some dodgy code from `us/them` one day causing us some serious problems. This is a classic nodejs issue and there is only one way of dealing with it for good, firstly here are some 'bad' ways of dealing with it: **[bad]** trying to catch async code (ie. the sound of one hand clapping) ```javascript try{ setTimeout( function(){ throw new Error('catchme'); }, 1 ); } catch( e ){} // your process just crashed ``` **[bad]** wrapping all your sync code ```javascript try{ chaosMonkey(); } catch( e ){} try{ chaosMonkey(); } catch( e ){} try{ chaosMonkey(); } catch( e ){} // your code is not pretty ``` **[bad]** using 'uncaughtException' ```javascript process.on('uncaughtException', function(){ console.log('I\'m ok; really'); }); throw new Error('catchme'); // your application state is unstable ``` **[bad]** hoping express will handle this for you ```javascript app.get('/foo', function( req, res, next ){ setTimeout( function(){ throw new Error('catchme'); }, 1 ); }); // your process just crashed ``` ...not to be confused with: ```javascript app.get('/foo', function( req, res, next ){ next( new Error('catchme') ); }); // your process is ok ``` ..so there is basically one option left; called [domains](https://nodejs.org/api/domain.html) which has a long history in the nodejs community and I've been told many times that 
its going to be deprecated and not to use it, however since there is no alternative, it remains in core. (maybe the old issues with it have been solved, I'm not sure) There is a lengthy discussion over at `io.js` [1] which shows a nice way of implementing domains inside an express app and discusses some of the pros/cons of this approach. [1] https://github.com/mikeal/nodeconf2013/issues/6#issuecomment-19092750
1.0
[discuss] uncaught errors killing the process - With our current controller configuration there is potential for dodgy code nested deep inside async callbacks to throw Errors which miss all the synchronous `try/catch` blocks and make their way all the way up the stack to trigger `process.on('uncaughtException')` and promptly `exit(1)` the whole process. This is obviously super-bad as all the connections get dropped, the API goes down on that box and the process has to be restarted, also the load balancers get affected et cetera, it's just bad. At time of writing this has only happened once and it was only triggered in an extreme edge-case which wasn't covered by unit testing; however there is potential for some dodgy code from `us/them` one day causing us some serious problems. This is a classic nodejs issue and there is only one way of dealing with it for good, firstly here are some 'bad' ways of dealing with it: **[bad]** trying to catch async code (ie. the sound of one hand clapping) ```javascript try{ setTimeout( function(){ throw new Error('catchme'); }, 1 ); } catch( e ){} // your process just crashed ``` **[bad]** wrapping all your sync code ```javascript try{ chaosMonkey(); } catch( e ){} try{ chaosMonkey(); } catch( e ){} try{ chaosMonkey(); } catch( e ){} // your code is not pretty ``` **[bad]** using 'uncaughtException' ```javascript process.on('uncaughtException', function(){ console.log('I\'m ok; really'); }); throw new Error('catchme'); // your application state is unstable ``` **[bad]** hoping express will handle this for you ```javascript app.get('/foo', function( req, res, next ){ setTimeout( function(){ throw new Error('catchme'); }, 1 ); }); // your process just crashed ``` ...not to be confused with: ```javascript app.get('/foo', function( req, res, next ){ next( new Error('catchme') ); }); // your process is ok ``` ..so there is basically one option left; called [domains](https://nodejs.org/api/domain.html) which has a long history in the 
nodejs community and I've been told many times that its going to be deprecated and not to use it, however since there is no alternative, it remains in core. (maybe the old issues with it have been solved, I'm not sure) There is a lengthy discussion over at `io.js` [1] which shows a nice way of implementing domains inside an express app and discusses some of the pros/cons of this approach. [1] https://github.com/mikeal/nodeconf2013/issues/6#issuecomment-19092750
process
uncaught errors killing the process with our current controller configuration there is potential for dodgy code nested deep inside async callbacks to throw errors which miss all the synchronous try catch blocks and make their way all the way up the stack to trigger process on uncaughtexception and promptly exit the whole process this is obviously super bad as all the connections get dropped the api goes down on that box and the process has to be restarted also the load balancers get affected et cetera it s just bad at time of writing this has only happened once and it was only triggered in an extreme edge case which wasn t covered by unit testing however there is potential for some dodgy code from us them one day causing us some serious problems this is a classic nodejs issue and there is only one way of dealing with it for good firstly here are some bad ways of dealing with it trying to catch async code ie the sound of one hand clapping javascript try settimeout function throw new error catchme catch e your process just crashed wrapping all your sync code javascript try chaosmonkey catch e try chaosmonkey catch e try chaosmonkey catch e your code is not pretty using uncaughtexception javascript process on uncaughtexception function console log i m ok really throw new error catchme your application state is unstable hoping express will handle this for you javascript app get foo function req res next settimeout function throw new error catchme your process just crashed not to be confused with javascript app get foo function req res next next new error catchme your process is ok so there is basically one option left called which has a long history in the nodejs community and i ve been told many times that its going to be deprecated and not to use it however since there is no alternative it remains in core maybe the old issues with it have been solved i m not sure there is a lengthy discussion over at io js which shows a nice way of implementing domains inside an 
express app and discusses some of the pros cons of this approach
1
119,472
17,616,458,535
IssuesEvent
2021-08-18 10:18:32
panasalap/curl-7.64.1
https://api.github.com/repos/panasalap/curl-7.64.1
opened
CVE-2019-5443 (High) detected in curlcurl-7_65_3
security vulnerability
## CVE-2019-5443 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>curlcurl-7_65_3</b></p></summary> <p> <p>A command line tool and library for transferring data with URL syntax, supporting HTTP, HTTPS, FTP, FTPS, GOPHER, TFTP, SCP, SFTP, SMB, TELNET, DICT, LDAP, LDAPS, MQTT, FILE, IMAP, SMTP, POP3, RTSP and RTMP. libcurl offers a myriad of powerful features</p> <p>Library home page: <a href=https://github.com/curl/curl.git>https://github.com/curl/curl.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/panasalap/curl-7.64.1/commit/ded5d31e5df28f62210d7dc3e6a127a929e970ae">ded5d31e5df28f62210d7dc3e6a127a929e970ae</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/lib/vtls/openssl.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A non-privileged user or program can put code and a config file in a known non-privileged path (under C:/usr/local/) that will make curl <= 7.65.1 automatically run the code (as an openssl "engine") on invocation. If that curl is invoked by a privileged user it can do anything it wants. 
<p>Publish Date: 2019-07-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-5443>CVE-2019-5443</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://curl.haxx.se/docs/CVE-2019-5443.html">https://curl.haxx.se/docs/CVE-2019-5443.html</a></p> <p>Release Date: 2019-06-30</p> <p>Fix Resolution: 7.65.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-5443 (High) detected in curlcurl-7_65_3 - ## CVE-2019-5443 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>curlcurl-7_65_3</b></p></summary> <p> <p>A command line tool and library for transferring data with URL syntax, supporting HTTP, HTTPS, FTP, FTPS, GOPHER, TFTP, SCP, SFTP, SMB, TELNET, DICT, LDAP, LDAPS, MQTT, FILE, IMAP, SMTP, POP3, RTSP and RTMP. libcurl offers a myriad of powerful features</p> <p>Library home page: <a href=https://github.com/curl/curl.git>https://github.com/curl/curl.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/panasalap/curl-7.64.1/commit/ded5d31e5df28f62210d7dc3e6a127a929e970ae">ded5d31e5df28f62210d7dc3e6a127a929e970ae</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/lib/vtls/openssl.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A non-privileged user or program can put code and a config file in a known non-privileged path (under C:/usr/local/) that will make curl <= 7.65.1 automatically run the code (as an openssl "engine") on invocation. If that curl is invoked by a privileged user it can do anything it wants. 
<p>Publish Date: 2019-07-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-5443>CVE-2019-5443</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://curl.haxx.se/docs/CVE-2019-5443.html">https://curl.haxx.se/docs/CVE-2019-5443.html</a></p> <p>Release Date: 2019-06-30</p> <p>Fix Resolution: 7.65.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in curlcurl cve high severity vulnerability vulnerable library curlcurl a command line tool and library for transferring data with url syntax supporting http https ftp ftps gopher tftp scp sftp smb telnet dict ldap ldaps mqtt file imap smtp rtsp and rtmp libcurl offers a myriad of powerful features library home page a href found in head commit a href found in base branch master vulnerable source files lib vtls openssl c vulnerability details a non privileged user or program can put code and a config file in a known non privileged path under c usr local that will make curl automatically run the code as an openssl engine on invocation if that curl is invoked by a privileged user it can do anything it wants publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
21,119
28,088,690,549
IssuesEvent
2023-03-30 11:36:39
Tzahi12345/YoutubeDL-Material
https://api.github.com/repos/Tzahi12345/YoutubeDL-Material
closed
[FEATURE] RSS feed for all downloads
is: enhancement [META]DATA PROCESSING
Would it be possible to add an RSS feed that lists all downloads?
1.0
[FEATURE] RSS feed for all downloads - Would it be possible to add an RSS feed that lists all downloads?
process
rss feed for all downloads would it be possible to add an rss feed that lists all downloads
1
58,402
14,274,429,888
IssuesEvent
2020-11-22 03:54:27
Ghost-chu/QuickShop-Reremake
https://api.github.com/repos/Ghost-chu/QuickShop-Reremake
closed
CVE-2019-16943 (High) detected in jackson-databind-2.3.4.jar - autoclosed
Bug security vulnerability
## CVE-2019-16943 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.3.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: QuickShop-Reremake/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.3.4/jackson-databind-2.3.4.jar</p> <p> Dependency Hierarchy: - jenkins-client-0.3.8.jar (Root Library) - :x: **jackson-databind-2.3.4.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Ghost-chu/QuickShop-Reremake/commit/8ee7d2b71191adf05b366e0787aec78ffbdad102">8ee7d2b71191adf05b366e0787aec78ffbdad102</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.0.0 through 2.9.10. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the p6spy (3.8.6) jar in the classpath, and an attacker can find an RMI service endpoint to access, it is possible to make the service execute a malicious payload. This issue exists because of com.p6spy.engine.spy.P6DataSource mishandling. 
<p>Publish Date: 2019-10-01 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16943>CVE-2019-16943</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16943">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16943</a></p> <p>Release Date: 2019-10-01</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-16943 (High) detected in jackson-databind-2.3.4.jar - autoclosed - ## CVE-2019-16943 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.3.4.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Path to dependency file: QuickShop-Reremake/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.3.4/jackson-databind-2.3.4.jar</p> <p> Dependency Hierarchy: - jenkins-client-0.3.8.jar (Root Library) - :x: **jackson-databind-2.3.4.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/Ghost-chu/QuickShop-Reremake/commit/8ee7d2b71191adf05b366e0787aec78ffbdad102">8ee7d2b71191adf05b366e0787aec78ffbdad102</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.0.0 through 2.9.10. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the p6spy (3.8.6) jar in the classpath, and an attacker can find an RMI service endpoint to access, it is possible to make the service execute a malicious payload. This issue exists because of com.p6spy.engine.spy.P6DataSource mishandling. 
<p>Publish Date: 2019-10-01 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16943>CVE-2019-16943</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16943">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16943</a></p> <p>Release Date: 2019-10-01</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.6.7.3,2.7.9.7,2.8.11.5,2.9.10.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api path to dependency file quickshop reremake pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy jenkins client jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details a polymorphic typing issue was discovered in fasterxml jackson databind through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has the jar in the classpath and an attacker can find an rmi service endpoint to access it is possible to make the service execute a malicious payload this issue exists because of com engine spy mishandling publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
34,293
29,190,878,933
IssuesEvent
2023-05-19 19:52:11
woocommerce/woocommerce
https://api.github.com/repos/woocommerce/woocommerce
closed
Vulnerability Detection: Static Analysis
tool: monorepo infrastructure
<!-- This form is for other issue types specific to the WooCommerce plugin. This is not a support portal. --> **Prerequisites (mark completed items with an [x]):** - [x] I have checked that my issue type is not listed here https://github.com/woocommerce/woocommerce/issues/new/choose - [x] My issue is not a security issue, support request, bug report, enhancement or feature request (Please use the link above if it is). **Issue Description:** Evaluate and determine the cost/benefit to using a Static Analysis tool for discovering vulnerabilities either on the current code base or as part of a PR check. Consider factors such as cost, effectiveness, developer friendliness, etc. If the addition of Static Analysis makes sense, propose its addition with an RFC. - [x] Evaluate [Snyk](deepcode.ai) - [x] Evaluate [Semgrep](https://semgrep.dev/)
1.0
Vulnerability Detection: Static Analysis - <!-- This form is for other issue types specific to the WooCommerce plugin. This is not a support portal. --> **Prerequisites (mark completed items with an [x]):** - [x] I have checked that my issue type is not listed here https://github.com/woocommerce/woocommerce/issues/new/choose - [x] My issue is not a security issue, support request, bug report, enhancement or feature request (Please use the link above if it is). **Issue Description:** Evaluate and determine the cost/benefit to using a Static Analysis tool for discovering vulnerabilities either on the current code base or as part of a PR check. Consider factors such as cost, effectiveness, developer friendliness, etc. If the addition of Static Analysis makes sense, propose its addition with an RFC. - [x] Evaluate [Snyk](deepcode.ai) - [x] Evaluate [Semgrep](https://semgrep.dev/)
non_process
vulnerability detection static analysis prerequisites mark completed items with an i have checked that my issue type is not listed here my issue is not a security issue support request bug report enhancement or feature request please use the link above if it is issue description evaluate and determine the cost benefit to using a static analysis tool for discovering vulnerabilities either on the current code base or as part of a pr check consider factors such as cost effectiveness developer friendliness etc if the addition of static analysis makes sense propose its addition with an rfc evaluate deepcode ai evaluate
0
440,774
12,703,813,367
IssuesEvent
2020-06-22 23:22:38
ansible-community/antsibull
https://api.github.com/repos/ansible-community/antsibull
closed
Fix linking between modules
Priority bug
Fix linking between modules (the anchors aren't conducive to construction for `M()`
1.0
Fix linking between modules - Fix linking between modules (the anchors aren't conducive to construction for `M()`
non_process
fix linking between modules fix linking between modules the anchors aren t conducive to construction for m
0
130,318
10,605,622,529
IssuesEvent
2019-10-10 20:54:31
LN-Zap/zap-desktop
https://api.github.com/repos/LN-Zap/zap-desktop
opened
Add gnome keyring to CI env
scope: tests
<!--- Provide a general summary of the issue in the Title above --> ## Detailed Description Currently e2e tests on linux use keytar mock because Gnome Keyring isn't available in travis CI env. This should be fixed. More details: https://github.com/atom/node-keytar/blob/master/.travis.yml
1.0
Add gnome keyring to CI env - <!--- Provide a general summary of the issue in the Title above --> ## Detailed Description Currently e2e tests on linux use keytar mock because Gnome Keyring isn't available in travis CI env. This should be fixed. More details: https://github.com/atom/node-keytar/blob/master/.travis.yml
non_process
add gnome keyring to ci env detailed description currently tests on linux use keytar mock because gnome keyring isn t available in travis ci env this should be fixed more details
0
10,179
13,044,162,851
IssuesEvent
2020-07-29 03:47:37
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `LeastDecimal` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `LeastDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
2.0
UCP: Migrate scalar function `LeastDecimal` from TiDB - ## Description Port the scalar function `LeastDecimal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function leastdecimal from tidb description port the scalar function leastdecimal from tidb to coprocessor score mentor s lonng recommended skills rust programming learning materials already implemented expressions ported from tidb
1
3,574
6,617,607,578
IssuesEvent
2017-09-21 02:50:25
amaster507/ifbmt
https://api.github.com/repos/amaster507/ifbmt
opened
Printing Address Labels
contacts idea process question
Christopher Yetzer shared with me screenshots from an Access database that he was given by Ed Johnson missionary to Brazil. It is actually the most sophisticated database I have seen so far which has helped me draw some additional idea. It appears as though it could allow you to print address labels for those who are selected as on the prayer letter mailing list. So question is, how are missionaries processing and printing address labels for newsletters? Is this done by the missionary or by the mission board if available? What format(s) are utilized?
1.0
Printing Address Labels - Christopher Yetzer shared with me screenshots from an Access database that he was given by Ed Johnson missionary to Brazil. It is actually the most sophisticated database I have seen so far which has helped me draw some additional idea. It appears as though it could allow you to print address labels for those who are selected as on the prayer letter mailing list. So question is, how are missionaries processing and printing address labels for newsletters? Is this done by the missionary or by the mission board if available? What format(s) are utilized?
process
printing address labels christopher yetzer shared with me screenshots from an access database that he was given by ed johnson missionary to brazil it is actually the most sophisticated database i have seen so far which has helped me draw some additional idea it appears as though it could allow you to print address labels for those who are selected as on the prayer letter mailing list so question is how are missionaries processing and printing address labels for newsletters is this done by the missionary or by the mission board if available what format s are utilized
1
99,289
8,696,068,834
IssuesEvent
2018-12-04 16:34:39
mapbox/mapbox-gl-native
https://api.github.com/repos/mapbox/mapbox-gl-native
closed
Android: Build a real unit test application
Android archived refactor tests
We're currently [faking our way](https://github.com/mapbox/mapbox-gl-native/blob/1d33bf774f7d0248a72529840e155c4716a99851/platform/android/src/test/main.jni.cpp#L45-L76) to an Android application. Instead, we should build a real application that *only* contains `mbgl-test.so` (which includes `mbgl-android`, which includes `mbgl-core`, and starts up as a real application with the usual application launch mechanism. Reasons for this are: - Integration into Android Studio - Run Google Unit Tests on Firebase - Test JNI wrappers outside of the context of a regular Map workflow - We need a `Context` handle for testing https://github.com/mapbox/mapbox-gl-native/pull/8712 /cc @mapbox/android
1.0
Android: Build a real unit test application - We're currently [faking our way](https://github.com/mapbox/mapbox-gl-native/blob/1d33bf774f7d0248a72529840e155c4716a99851/platform/android/src/test/main.jni.cpp#L45-L76) to an Android application. Instead, we should build a real application that *only* contains `mbgl-test.so` (which includes `mbgl-android`, which includes `mbgl-core`, and starts up as a real application with the usual application launch mechanism. Reasons for this are: - Integration into Android Studio - Run Google Unit Tests on Firebase - Test JNI wrappers outside of the context of a regular Map workflow - We need a `Context` handle for testing https://github.com/mapbox/mapbox-gl-native/pull/8712 /cc @mapbox/android
non_process
android build a real unit test application we re currently to an android application instead we should build a real application that only contains mbgl test so which includes mbgl android which includes mbgl core and starts up as a real application with the usual application launch mechanism reasons for this are integration into android studio run google unit tests on firebase test jni wrappers outside of the context of a regular map workflow we need a context handle for testing cc mapbox android
0
38,475
2,847,842,318
IssuesEvent
2015-05-29 19:14:59
calblueprint/revolv
https://api.github.com/repos/calblueprint/revolv
opened
Sign up with Facebook, Twitter, or Linkedin
auth-pages priority-3 social-engagement
Not sure if we'll be able to do this - we shelved it at the beginning of last year. Bringing it back because Andreas has mentioned that it's important to him.
1.0
Sign up with Facebook, Twitter, or Linkedin - Not sure if we'll be able to do this - we shelved it at the beginning of last year. Bringing it back because Andreas has mentioned that it's important to him.
non_process
sign up with facebook twitter or linkedin not sure if we ll be able to do this we shelved it at the beginning of last year bringing it back because andreas has mentioned that it s important to him
0
8,842
11,949,016,176
IssuesEvent
2020-04-03 12:57:47
didi/mpx
https://api.github.com/repos/didi/mpx
closed
After cross-platform compilation, the DingTalk app fails to run
processing
**Problem description** After using the cross-platform compilation output, the code runs normally on the WeChat and Alipay mini-program platforms, but when it is switched to the DingTalk app, an error is reported and it fails to run **Steps to reproduce** 1. Download the official framework directly and run it without any modification **Expected behavior** The DingTalk E-app should also run normally **Screenshot** ![DingTalk error screenshot](https://user-images.githubusercontent.com/25979037/77638957-ef5d5100-6f92-11ea-87d5-fd56d246f737.png)
1.0
After cross-platform compilation, the DingTalk app fails to run - **Problem description** After using the cross-platform compilation output, the code runs normally on the WeChat and Alipay mini-program platforms, but when it is switched to the DingTalk app, an error is reported and it fails to run **Steps to reproduce** 1. Download the official framework directly and run it without any modification **Expected behavior** The DingTalk E-app should also run normally **Screenshot** ![DingTalk error screenshot](https://user-images.githubusercontent.com/25979037/77638957-ef5d5100-6f92-11ea-87d5-fd56d246f737.png)
process
after cross platform compilation the dingtalk app fails to run problem description after using the cross platform compilation output the code runs normally on the wechat and alipay mini program platforms but when it is switched to the dingtalk app an error is reported and it fails to run steps to reproduce download the official framework directly and run it without any modification expected behavior the dingtalk e app should also run normally screenshot
1
1,478
4,054,925,395
IssuesEvent
2016-05-24 14:01:51
nodejs/node
https://api.github.com/repos/nodejs/node
closed
`execSync` not allow `encoding` to set explicitly to `buffer`
child_process
<!-- Thank you for reporting an issue. Please fill in the template below. If unsure about something, just do as best as you're able. Version: usually output of `node -v` Platform: either `uname -a` output, or if Windows, version and 32 or 64-bit Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: Tested on 5, 6 * **Platform**: Tested on OS X, Ubuntu * **Subsystem**: <!-- Enter your issue details below this comment. --> ``` > child_process.execSync("ls",{encoding:"buffer"}) TypeError: Unknown encoding: buffer ```
1.0
`execSync` not allow `encoding` to set explicitly to `buffer` - <!-- Thank you for reporting an issue. Please fill in the template below. If unsure about something, just do as best as you're able. Version: usually output of `node -v` Platform: either `uname -a` output, or if Windows, version and 32 or 64-bit Subsystem: if known, please specify affected core module name If possible, please provide code that demonstrates the problem, keeping it as simple and free of external dependencies as you are able. --> * **Version**: Tested on 5, 6 * **Platform**: Tested on OS X, Ubuntu * **Subsystem**: <!-- Enter your issue details below this comment. --> ``` > child_process.execSync("ls",{encoding:"buffer"}) TypeError: Unknown encoding: buffer ```
process
execsync not allow encoding to set explicitly to buffer thank you for reporting an issue please fill in the template below if unsure about something just do as best as you re able version usually output of node v platform either uname a output or if windows version and or bit subsystem if known please specify affected core module name if possible please provide code that demonstrates the problem keeping it as simple and free of external dependencies as you are able version tested on platform tested on os x ubuntu subsystem child process execsync ls encoding buffer typeerror unknown encoding buffer
1
3,828
6,802,324,344
IssuesEvent
2017-11-02 19:47:42
WikiWatershed/model-my-watershed
https://api.github.com/repos/WikiWatershed/model-my-watershed
closed
Geoprocessing API: Validate shape is in CONUS
Geoprocessing API tested/verified WPF
Validate submitted shapes are in the CONUS. We should already have the boundaries available in the app — this check is already made on the frontend.
1.0
Geoprocessing API: Validate shape is in CONUS - Validate submitted shapes are in the CONUS. We should already have the boundaries available in the app — this check is already made on the frontend.
process
geoprocessing api validate shape is in conus validate submitted shapes are in the conus we should already have the boundaries available in the app — this check is already made on the frontend
1
226,329
17,346,476,867
IssuesEvent
2021-07-29 00:01:58
microsoft/pxt-arcade
https://api.github.com/repos/microsoft/pxt-arcade
closed
(enhancement)There doesn't have a .gif to teach user how to pick or set a tile in tilemap editor in tutorial "Setting the Scene"
documentation tutorial
**Describe the bug** There doesn't have a .gif to teach user how to pick or set a tile in tilemap editor in tutorial "Setting the Scene" **Steps to reproduce the behavior** 1. Navigate to https://arcade.makecode.com/ 2. Scroll down to "Game Design Concepts" 3. Open tutorial "Setting the Scene" and observe hint in each step **Expect behavior** There should have a .gif to teach user how to pick or set a tile in tilemap editor **Actual behavior** There doesn't have a .gif to teach user how to pick or set a tile in tilemap editor in tutorial "Setting the Scene" ![image](https://user-images.githubusercontent.com/23466737/71713730-e81c2080-2e45-11ea-9b3c-258764891f89.png) **Additional context** 1. OS: Windows(rs6) 2. arcade version: 0.15.11 3. Microsoft MakeCode version: 5.30.38
1.0
(enhancement)There doesn't have a .gif to teach user how to pick or set a tile in tilemap editor in tutorial "Setting the Scene" - **Describe the bug** There doesn't have a .gif to teach user how to pick or set a tile in tilemap editor in tutorial "Setting the Scene" **Steps to reproduce the behavior** 1. Navigate to https://arcade.makecode.com/ 2. Scroll down to "Game Design Concepts" 3. Open tutorial "Setting the Scene" and observe hint in each step **Expect behavior** There should have a .gif to teach user how to pick or set a tile in tilemap editor **Actual behavior** There doesn't have a .gif to teach user how to pick or set a tile in tilemap editor in tutorial "Setting the Scene" ![image](https://user-images.githubusercontent.com/23466737/71713730-e81c2080-2e45-11ea-9b3c-258764891f89.png) **Additional context** 1. OS: Windows(rs6) 2. arcade version: 0.15.11 3. Microsoft MakeCode version: 5.30.38
non_process
enhancement there doesn t have a gif to teach user how to pick or set a tile in tilemap editor in tutorial setting the scene describe the bug there doesn t have a gif to teach user how to pick or set a tile in tilemap editor in tutorial setting the scene steps to reproduce the behavior navigate to scroll down to game design concepts open tutorial setting the scene and observe hint in each step expect behavior there should have a gif to teach user how to pick or set a tile in tilemap editor actual behavior there doesn t have a gif to teach user how to pick or set a tile in tilemap editor in tutorial setting the scene additional context os windows arcade version microsoft makecode version
0
66,741
12,821,422,528
IssuesEvent
2020-07-06 08:02:57
joomla/joomla-cms
https://api.github.com/repos/joomla/joomla-cms
closed
[4.0] Recaptcha versions
No Code Attached Yet
### Steps to reproduce the issue 1. Open `reCAPTCHA` in your Plugin Manager 2. Note that only version available is `2.0` 3. Click the link in the description (https://www.google.com/recaptcha) ### Actual result The link directs you to V3 whereas the version available in Joomla is V2
1.0
[4.0] Recaptcha versions - ### Steps to reproduce the issue 1. Open `reCAPTCHA` in your Plugin Manager 2. Note that only version available is `2.0` 3. Click the link in the description (https://www.google.com/recaptcha) ### Actual result The link directs you to V3 whereas the version available in Joomla is V2
non_process
recaptcha versions steps to reproduce the issue open recaptcha in your plugin manager note that only version available is click the link in the description actual result the link directs you to whereas the version available in joomla is
0
467,126
13,441,632,316
IssuesEvent
2020-09-08 04:45:27
dmwm/WMCore
https://api.github.com/repos/dmwm/WMCore
closed
Properly deal with duplicated Rucio rules
BUG High Priority New Feature ReqMgr2MS Rucio Transition
**Impact of the new feature** WMAgent **Is your feature request related to a problem? Please describe.** When making the final output dataset subscription - as we currently do for phedex - we need to have a mechanism in place to figure out when the cms dataset (container) already has such rule created by a different agent (for cases where the same workflow is dealt with by multiple agents) **Describe the solution you'd like** In case we start persisting the rule id in the database, we need to make another rucio call to retrieve the rules for a given DID. Then we need to compare all the rules available against: * rucio account * DID * RSE expression such that we can find - or not - a pre-existent rule. IF we do not persist the rule id, then we could simply catch the duplicate rule exception `DuplicateRule` and return a positive response, such that the component can mark that as done and move on. **Describe alternatives you've considered** alternative described above **Additional context** none
1.0
Properly deal with duplicated Rucio rules - **Impact of the new feature** WMAgent **Is your feature request related to a problem? Please describe.** When making the final output dataset subscription - as we currently do for phedex - we need to have a mechanism in place to figure out when the cms dataset (container) already has such rule created by a different agent (for cases where the same workflow is dealt with by multiple agents) **Describe the solution you'd like** In case we start persisting the rule id in the database, we need to make another rucio call to retrieve the rules for a given DID. Then we need to compare all the rules available against: * rucio account * DID * RSE expression such that we can find - or not - a pre-existent rule. IF we do not persist the rule id, then we could simply catch the duplicate rule exception `DuplicateRule` and return a positive response, such that the component can mark that as done and move on. **Describe alternatives you've considered** alternative described above **Additional context** none
non_process
properly deal with duplicated rucio rules impact of the new feature wmagent is your feature request related to a problem please describe when making the final output dataset subscription as we currently do for phedex we need to have a mechanism in place to figure out when the cms dataset container already has such rule created by a different agent for cases where the same workflow is dealt with by multiple agents describe the solution you d like in case we start persisting the rule id in the database we need to make another rucio call to retrieve the rules for a given did then we need to compare all the rules available against rucio account did rse expression such that we can find or not a pre existent rule if we do not persist the rule id then we could simply catch the duplicate rule exception duplicaterule and return a positive response such that the component can mark that as done and move on describe alternatives you ve considered alternative described above additional context none
0
51,265
6,506,748,571
IssuesEvent
2017-08-24 10:15:29
nextcloud/nextcloud.com
https://api.github.com/repos/nextcloud/nextcloud.com
opened
Use new Google Play button
design
Current: <img width="230" alt="bildschirmfoto 2017-08-24 um 12 13 48" src="https://user-images.githubusercontent.com/19711361/29661899-beafae88-88c5-11e7-8c4d-6605957798a4.png"> How it should be: https://play.google.com/intl/en_us/badges/ @jospoortvliet
1.0
Use new Google Play button - Current: <img width="230" alt="bildschirmfoto 2017-08-24 um 12 13 48" src="https://user-images.githubusercontent.com/19711361/29661899-beafae88-88c5-11e7-8c4d-6605957798a4.png"> How it should be: https://play.google.com/intl/en_us/badges/ @jospoortvliet
non_process
use new google play button current img width alt bildschirmfoto um src how it should be jospoortvliet
0
42,718
12,949,756,453
IssuesEvent
2020-07-19 10:27:12
geea-develop/CsvConvert
https://api.github.com/repos/geea-develop/CsvConvert
opened
CVE-2020-15366 (Medium) detected in ajv-6.12.0.tgz
security vulnerability
## CVE-2020-15366 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ajv-6.12.0.tgz</b></p></summary> <p>Another JSON Schema Validator</p> <p>Library home page: <a href="https://registry.npmjs.org/ajv/-/ajv-6.12.0.tgz">https://registry.npmjs.org/ajv/-/ajv-6.12.0.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/CsvConvert/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/CsvConvert/node_modules/ajv/package.json</p> <p> Dependency Hierarchy: - webpack-4.43.0.tgz (Root Library) - :x: **ajv-6.12.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/geea-develop/CsvConvert/commit/1a4d7cfe3a84669f8b5350577a97a04668ac6aab">1a4d7cfe3a84669f8b5350577a97a04668ac6aab</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in ajv.validate() in Ajv (aka Another JSON Schema Validator) 6.12.2. A carefully crafted JSON schema could be provided that allows execution of other code by prototype pollution. (While untrusted schemas are recommended against, the worst case of an untrusted schema should be a denial of service, not execution of code.) 
<p>Publish Date: 2020-07-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15366>CVE-2020-15366</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/ajv-validator/ajv/releases/tag/v6.12.3">https://github.com/ajv-validator/ajv/releases/tag/v6.12.3</a></p> <p>Release Date: 2020-07-15</p> <p>Fix Resolution: ajv - 6.12.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-15366 (Medium) detected in ajv-6.12.0.tgz - ## CVE-2020-15366 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ajv-6.12.0.tgz</b></p></summary> <p>Another JSON Schema Validator</p> <p>Library home page: <a href="https://registry.npmjs.org/ajv/-/ajv-6.12.0.tgz">https://registry.npmjs.org/ajv/-/ajv-6.12.0.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/CsvConvert/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/CsvConvert/node_modules/ajv/package.json</p> <p> Dependency Hierarchy: - webpack-4.43.0.tgz (Root Library) - :x: **ajv-6.12.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/geea-develop/CsvConvert/commit/1a4d7cfe3a84669f8b5350577a97a04668ac6aab">1a4d7cfe3a84669f8b5350577a97a04668ac6aab</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in ajv.validate() in Ajv (aka Another JSON Schema Validator) 6.12.2. A carefully crafted JSON schema could be provided that allows execution of other code by prototype pollution. (While untrusted schemas are recommended against, the worst case of an untrusted schema should be a denial of service, not execution of code.) 
<p>Publish Date: 2020-07-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15366>CVE-2020-15366</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: N/A - Attack Complexity: N/A - Privileges Required: N/A - User Interaction: N/A - Scope: N/A - Impact Metrics: - Confidentiality Impact: N/A - Integrity Impact: N/A - Availability Impact: N/A </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/ajv-validator/ajv/releases/tag/v6.12.3">https://github.com/ajv-validator/ajv/releases/tag/v6.12.3</a></p> <p>Release Date: 2020-07-15</p> <p>Fix Resolution: ajv - 6.12.3</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in ajv tgz cve medium severity vulnerability vulnerable library ajv tgz another json schema validator library home page a href path to dependency file tmp ws scm csvconvert package json path to vulnerable library tmp ws scm csvconvert node modules ajv package json dependency hierarchy webpack tgz root library x ajv tgz vulnerable library found in head commit a href vulnerability details an issue was discovered in ajv validate in ajv aka another json schema validator a carefully crafted json schema could be provided that allows execution of other code by prototype pollution while untrusted schemas are recommended against the worst case of an untrusted schema should be a denial of service not execution of code publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ajv step up your open source security game with whitesource
0
132,582
18,268,779,847
IssuesEvent
2021-10-04 11:37:31
artsking/linux-3.0.35
https://api.github.com/repos/artsking/linux-3.0.35
opened
CVE-2020-0305 (Medium) detected in linux-stable-rtv3.8.6
security vulnerability
## CVE-2020-0305 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35/commit/5992fa81c6ac1b4e9db13f5408d914525c5b7875">5992fa81c6ac1b4e9db13f5408d914525c5b7875</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/char_dev.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/char_dev.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/char_dev.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In cdev_get of char_dev.c, there is a possible use-after-free due to a race condition. This could lead to local escalation of privilege with System execution privileges needed. 
User interaction is not needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-153467744 <p>Publish Date: 2020-07-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0305>CVE-2020-0305</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/torvalds/linux/commit/68faa679b8be1a74e6663c21c3a9d25d3">https://github.com/torvalds/linux/commit/68faa679b8be1a74e6663c21c3a9d25d3</a></p> <p>Release Date: 2020-07-17</p> <p>Fix Resolution: v5.5-rc6</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-0305 (Medium) detected in linux-stable-rtv3.8.6 - ## CVE-2020-0305 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv3.8.6</b></p></summary> <p> <p>Julia Cartwright's fork of linux-stable-rt.git</p> <p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/artsking/linux-3.0.35/commit/5992fa81c6ac1b4e9db13f5408d914525c5b7875">5992fa81c6ac1b4e9db13f5408d914525c5b7875</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/char_dev.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/char_dev.c</b> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/char_dev.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> In cdev_get of char_dev.c, there is a possible use-after-free due to a race condition. This could lead to local escalation of privilege with System execution privileges needed. 
User interaction is not needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-153467744 <p>Publish Date: 2020-07-17 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0305>CVE-2020-0305</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/torvalds/linux/commit/68faa679b8be1a74e6663c21c3a9d25d3">https://github.com/torvalds/linux/commit/68faa679b8be1a74e6663c21c3a9d25d3</a></p> <p>Release Date: 2020-07-17</p> <p>Fix Resolution: v5.5-rc6</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in linux stable cve medium severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files fs char dev c fs char dev c fs char dev c vulnerability details in cdev get of char dev c there is a possible use after free due to a race condition this could lead to local escalation of privilege with system execution privileges needed user interaction is not needed for exploitation product androidversions android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
15,100
18,836,122,472
IssuesEvent
2021-11-11 01:14:43
darktable-org/darktable
https://api.github.com/repos/darktable-org/darktable
closed
feature: filmic: make slope dependent on contrast only, and not on white and black relative exposure
feature: enhancement scope: image processing
In filmic rgb, changing white relative exposure and black relative exposure changes the slope of the central part of the curve (which is also controlled by contrast). I think it would make the use of filmic easier if contrast was completely independent of black and white relative exposure. Internally, contrast can be adapted to white relative exposure and black relative exposure. For example, changing line 2123 of filmic by: `float contrast = CLAMP(p->contrast * dynamic_range / 8.0f, 1.00001f, 6.0f);` This allows changing the dynamic range scaling slider without dramatically affecting the image. Of course, the formula is a bit simplistic; it does not work when moving only white relative exposure or black relative exposure, but I think it is feasible to find a suitable formula. This example was only to illustrate what I meant and as a proof of concept. In my opinion, it would probably make filmic look more predictable across images that have different dynamic ranges. Plus, the user does not lose any power with this change; the curve can still be completely controlled with parameters. Examples: First image with normal dynamic range setting, then +50%, then -25% original (note the last image has way more contrast than the second one): ![Capture d’écran du 2021-09-10 22-01-39](https://user-images.githubusercontent.com/34063828/132911086-06497971-4a4b-4b83-9096-a64ae2760dcd.png) change from above (contrast is much more similar across images; the main visible difference that remains comes from the highlights desaturation): ![Capture d’écran du 2021-09-10 21-59-37](https://user-images.githubusercontent.com/34063828/132911081-0e01cc71-df08-488f-9f5d-3a341e0dc410.png) @aurelienpierre what do you think?
1.0
feature: filmic: make slope dependent on contrast only, and not on white and black relative exposure - In filmic rgb, changing white relative exposure and black relative exposure changes the slope of the central part of the curve (which is also controlled by contrast). I think it would make the use of filmic easier if contrast was completely independent of black and white relative exposure. Internally, contrast can be adapted to white relative exposure and black relative exposure. For example, changing line 2123 of filmic by: `float contrast = CLAMP(p->contrast * dynamic_range / 8.0f, 1.00001f, 6.0f);` This allows changing the dynamic range scaling slider without dramatically affecting the image. Of course, the formula is a bit simplistic; it does not work when moving only white relative exposure or black relative exposure, but I think it is feasible to find a suitable formula. This example was only to illustrate what I meant and as a proof of concept. In my opinion, it would probably make filmic look more predictable across images that have different dynamic ranges. Plus, the user does not lose any power with this change; the curve can still be completely controlled with parameters. Examples: First image with normal dynamic range setting, then +50%, then -25% original (note the last image has way more contrast than the second one): ![Capture d’écran du 2021-09-10 22-01-39](https://user-images.githubusercontent.com/34063828/132911086-06497971-4a4b-4b83-9096-a64ae2760dcd.png) change from above (contrast is much more similar across images; the main visible difference that remains comes from the highlights desaturation): ![Capture d’écran du 2021-09-10 21-59-37](https://user-images.githubusercontent.com/34063828/132911081-0e01cc71-df08-488f-9f5d-3a341e0dc410.png) @aurelienpierre what do you think?
process
feature filmic make slope dependent on contrast only and not on white and black relative exposure in filmic rgb changing white relative exposure and black relative exposure changes the slope of the central part of the curve which is also controlled by contrast i think it would make the use of filmic easier if contrast was completely independent of black and white relative exposure internally contrast can be adapted to white relative exposure and black relative exposure for example changing line of filmic by float contrast clamp p contrast dynamic range this allows changing the dynamic range scaling slider without dramatically affecting the image of course the formula is a bit simplistic it does not work when moving only white relative exposure or black relative exposure but i think it is feasible to find a suitable formula this example was only to illustrate what i meant and as a proof of concept in my opinion it would probably make filmic look more predictable across images that have different dynamic ranges plus the user does not lose any power with this change the curve can still be completely controlled with parameters examples first image with normal dynamic range setting then then original note the last image has way more contrast than the second one change from above contrast is much more similar across images the main visible difference that remains comes from the highlights desaturation aurelienpierre what do you think
1
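The contrast adaptation quoted in the filmic issue above can be sketched in a few lines of Python. This is an illustrative sketch only, not darktable code: the 8.0 reference dynamic range and the clamp bounds come from the quoted C snippet, while the function names (`clamp`, `adapted_contrast`) are hypothetical.

```python
# Illustrative sketch (not darktable code) of the proposed scaling:
#   contrast = CLAMP(user_contrast * dynamic_range / 8.0, 1.00001, 6.0)

def clamp(x: float, lo: float, hi: float) -> float:
    """Constrain x to the closed interval [lo, hi]."""
    return max(lo, min(hi, x))

def adapted_contrast(user_contrast: float, dynamic_range: float,
                     reference_range: float = 8.0) -> float:
    """Scale the user-facing contrast with the scene dynamic range so the
    slope of the curve's central segment stays similar across images."""
    return clamp(user_contrast * dynamic_range / reference_range,
                 1.00001, 6.0)
```

Under these assumptions, a user contrast of 1.5 on a 12 EV image yields an internal contrast of 2.25, while the same setting on an 8 EV image stays at 1.5 — the "similar perceived contrast across dynamic ranges" behaviour the issue asks for.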
47,854
13,258,783,448
IssuesEvent
2020-08-20 15:50:06
Mohib-hub/terra-ui
https://api.github.com/repos/Mohib-hub/terra-ui
opened
CVE-2018-11694 (High) detected in node-sass-v4.13.1, node-sass-4.14.1.tgz
security vulnerability
## CVE-2018-11694 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.14.1.tgz</b></p></summary> <p> <details><summary><b>node-sass-4.14.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/terra-ui/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/terra-ui/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - terra-toolkit-6.7.0.tgz (Root Library) - :x: **node-sass-4.14.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/terra-ui/commit/5a052334c048ec12ad88af460917e503929c71cf">5a052334c048ec12ad88af460917e503929c71cf</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in LibSass through 3.5.4. A NULL pointer dereference was found in the function Sass::Functions::selector_append which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact. 
<p>Publish Date: 2018-06-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11694>CVE-2018-11694</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694</a></p> <p>Release Date: 2018-06-04</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.14.1","isTransitiveDependency":true,"dependencyTree":"terra-toolkit:6.7.0;node-sass:4.14.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"LibSass - 3.6.0"}],"vulnerabilityIdentifier":"CVE-2018-11694","vulnerabilityDetails":"An issue was discovered in LibSass through 3.5.4. 
A NULL pointer dereference was found in the function Sass::Functions::selector_append which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11694","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
True
CVE-2018-11694 (High) detected in node-sass-v4.13.1, node-sass-4.14.1.tgz - ## CVE-2018-11694 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>node-sass-4.14.1.tgz</b></p></summary> <p> <details><summary><b>node-sass-4.14.1.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.14.1.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/terra-ui/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/terra-ui/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - terra-toolkit-6.7.0.tgz (Root Library) - :x: **node-sass-4.14.1.tgz** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/terra-ui/commit/5a052334c048ec12ad88af460917e503929c71cf">5a052334c048ec12ad88af460917e503929c71cf</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in LibSass through 3.5.4. A NULL pointer dereference was found in the function Sass::Functions::selector_append which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact. 
<p>Publish Date: 2018-06-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11694>CVE-2018-11694</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11694</a></p> <p>Release Date: 2018-06-04</p> <p>Fix Resolution: LibSass - 3.6.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"node-sass","packageVersion":"4.14.1","isTransitiveDependency":true,"dependencyTree":"terra-toolkit:6.7.0;node-sass:4.14.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"LibSass - 3.6.0"}],"vulnerabilityIdentifier":"CVE-2018-11694","vulnerabilityDetails":"An issue was discovered in LibSass through 3.5.4. 
A NULL pointer dereference was found in the function Sass::Functions::selector_append which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11694","cvss3Severity":"high","cvss3Score":"8.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"Required","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
non_process
cve high detected in node sass node sass tgz cve high severity vulnerability vulnerable libraries node sass tgz node sass tgz wrapper around libsass library home page a href path to dependency file tmp ws scm terra ui package json path to vulnerable library tmp ws scm terra ui node modules node sass package json dependency hierarchy terra toolkit tgz root library x node sass tgz vulnerable library found in head commit a href vulnerability details an issue was discovered in libsass through a null pointer dereference was found in the function sass functions selector append which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails an issue was discovered in libsass through a null pointer dereference was found in the function sass functions selector append which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact vulnerabilityurl
0
20,179
26,737,427,664
IssuesEvent
2023-01-30 10:27:38
bitfocus/companion-module-requests
https://api.github.com/repos/bitfocus/companion-module-requests
opened
id-al
NOT YET PROCESSED
Hello, I would like to order an id-al box: Video Player VP330, Video Player VP320 and Event Video Player (EVP380) http://download.id-al.com/manuals/HTML5_JavaScript_Engine/fr/index.html
1.0
id-al - Hello, I would like to order an id-al box: Video Player VP330, Video Player VP320 and Event Video Player (EVP380) http://download.id-al.com/manuals/HTML5_JavaScript_Engine/fr/index.html
process
id al hello i would like to order an id al box video player video player and event video player
1
326,972
28,035,660,790
IssuesEvent
2023-03-28 14:56:09
AlexsLemonade/refinebio
https://api.github.com/repos/AlexsLemonade/refinebio
closed
Generate new test data for salmon tools
testing data science
### Context Currently we fetch test data for salmontools from an external repo. https://github.com/AlexsLemonade/refinebio/blob/c4b4a6ad9392b911eec754c81b50a0b05b5f4d41/workers/run_tests.sh#L82 We essentially just run salmontools on this test data and then compare the output file against an existing output file asserting their checksums match. Below are the tests in their current implementation. #### double read https://github.com/AlexsLemonade/refinebio/blob/dev/workers/data_refinery_workers/processors/test_salmon.py#L572 ```python def test_double_reads(self): """Test outputs when the sample has both left and right reads.""" job_context = { "job_id": 123, "job": ProcessorJob(), "pipeline": Pipeline(name="Salmon"), "input_file_path": self.test_dir + "double_input/reads_1.fastq", "input_file_path_2": self.test_dir + "double_input/reads_2.fastq", "salmontools_directory": self.test_dir + "double_salmontools/", "salmontools_archive": self.test_dir + "salmontools-result.tar.gz", "output_directory": self.test_dir + "double_output/", "computed_files": [], } os.makedirs(job_context["salmontools_directory"], exist_ok=True) homo_sapiens = Organism.get_object_for_name("HOMO_SAPIENS", taxonomy_id=9606) sample = Sample() sample.organism = homo_sapiens sample.save() job_context["sample"] = sample salmon._run_salmontools(job_context) # Confirm job status self.assertTrue(job_context["success"]) # Unpack result for checking os.system("gunzip " + job_context["salmontools_directory"] + "*.gz") # Check two output files output_file1 = job_context["salmontools_directory"] + "unmapped_by_salmon_1.fa" expected_output_file1 = self.test_dir + "expected_double_output/unmapped_by_salmon_1.fa" self.assertTrue(identical_checksum(output_file1, expected_output_file1)) output_file2 = job_context["salmontools_directory"] + "unmapped_by_salmon_2.fa" expected_output_file2 = self.test_dir + "expected_double_output/unmapped_by_salmon_2.fa" self.assertTrue(identical_checksum(output_file2, 
expected_output_file2)) ``` #### single read https://github.com/AlexsLemonade/refinebio/blob/dev/workers/data_refinery_workers/processors/test_salmon.py#L612 ```python def test_single_read(self): """Test outputs when the sample has one read only.""" job_context = { "job_id": 456, "job": ProcessorJob(), "pipeline": Pipeline(name="Salmon"), "input_file_path": self.test_dir + "single_input/single_read.fastq", "output_directory": self.test_dir + "single_output/", "salmontools_directory": self.test_dir + "single_salmontools/", "salmontools_archive": self.test_dir + "salmontools-result.tar.gz", "computed_files": [], } os.makedirs(job_context["salmontools_directory"], exist_ok=True) homo_sapiens = Organism.get_object_for_name("HOMO_SAPIENS", taxonomy_id=9606) sample = Sample() sample.organism = homo_sapiens sample.save() job_context["sample"] = sample salmon._run_salmontools(job_context) # Confirm job status self.assertTrue(job_context["success"]) # Unpack result for checking os.system("gunzip " + job_context["salmontools_directory"] + "*.gz") # Check output file output_file = job_context["salmontools_directory"] + "unmapped_by_salmon.fa" expected_output_file = self.test_dir + "expected_single_output/unmapped_by_salmon.fa" self.assertTrue(identical_checksum(output_file, expected_output_file)) ``` ### Problem or idea Since the data is no longer available we need to recreate this data. I believe we are considering using [Polyester rna-seq](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4635655/) to generate this. ### Solution or next step Tagging @jaclyn-taroni @jashapiro for next steps.
1.0
Generate new test data for salmon tools - ### Context Currently we fetch test data for salmontools from an external repo. https://github.com/AlexsLemonade/refinebio/blob/c4b4a6ad9392b911eec754c81b50a0b05b5f4d41/workers/run_tests.sh#L82 We essentially just run salmontools on this test data and then compare the output file against an existing output file asserting their checksums match. Below are the tests in their current implementation. #### double read https://github.com/AlexsLemonade/refinebio/blob/dev/workers/data_refinery_workers/processors/test_salmon.py#L572 ```python def test_double_reads(self): """Test outputs when the sample has both left and right reads.""" job_context = { "job_id": 123, "job": ProcessorJob(), "pipeline": Pipeline(name="Salmon"), "input_file_path": self.test_dir + "double_input/reads_1.fastq", "input_file_path_2": self.test_dir + "double_input/reads_2.fastq", "salmontools_directory": self.test_dir + "double_salmontools/", "salmontools_archive": self.test_dir + "salmontools-result.tar.gz", "output_directory": self.test_dir + "double_output/", "computed_files": [], } os.makedirs(job_context["salmontools_directory"], exist_ok=True) homo_sapiens = Organism.get_object_for_name("HOMO_SAPIENS", taxonomy_id=9606) sample = Sample() sample.organism = homo_sapiens sample.save() job_context["sample"] = sample salmon._run_salmontools(job_context) # Confirm job status self.assertTrue(job_context["success"]) # Unpack result for checking os.system("gunzip " + job_context["salmontools_directory"] + "*.gz") # Check two output files output_file1 = job_context["salmontools_directory"] + "unmapped_by_salmon_1.fa" expected_output_file1 = self.test_dir + "expected_double_output/unmapped_by_salmon_1.fa" self.assertTrue(identical_checksum(output_file1, expected_output_file1)) output_file2 = job_context["salmontools_directory"] + "unmapped_by_salmon_2.fa" expected_output_file2 = self.test_dir + "expected_double_output/unmapped_by_salmon_2.fa" 
self.assertTrue(identical_checksum(output_file2, expected_output_file2)) ``` #### single read https://github.com/AlexsLemonade/refinebio/blob/dev/workers/data_refinery_workers/processors/test_salmon.py#L612 ```python def test_single_read(self): """Test outputs when the sample has one read only.""" job_context = { "job_id": 456, "job": ProcessorJob(), "pipeline": Pipeline(name="Salmon"), "input_file_path": self.test_dir + "single_input/single_read.fastq", "output_directory": self.test_dir + "single_output/", "salmontools_directory": self.test_dir + "single_salmontools/", "salmontools_archive": self.test_dir + "salmontools-result.tar.gz", "computed_files": [], } os.makedirs(job_context["salmontools_directory"], exist_ok=True) homo_sapiens = Organism.get_object_for_name("HOMO_SAPIENS", taxonomy_id=9606) sample = Sample() sample.organism = homo_sapiens sample.save() job_context["sample"] = sample salmon._run_salmontools(job_context) # Confirm job status self.assertTrue(job_context["success"]) # Unpack result for checking os.system("gunzip " + job_context["salmontools_directory"] + "*.gz") # Check output file output_file = job_context["salmontools_directory"] + "unmapped_by_salmon.fa" expected_output_file = self.test_dir + "expected_single_output/unmapped_by_salmon.fa" self.assertTrue(identical_checksum(output_file, expected_output_file)) ``` ### Problem or idea Since the data is no longer available we need to recreate this data. I believe we are considering using [Polyester rna-seq](https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4635655/) to generate this. ### Solution or next step Tagging @jaclyn-taroni @jashapiro for next steps.
non_process
generate new test data for salmon tools context currently we fetch test data for salmontools from an external repo we essentially just run salmontools on this test data and then compare the output file against an existing output file asserting their checksums match below are the tests in their current implementation double read python def test double reads self test outputs when the sample has both left and right reads job context job id job processorjob pipeline pipeline name salmon input file path self test dir double input reads fastq input file path self test dir double input reads fastq salmontools directory self test dir double salmontools salmontools archive self test dir salmontools result tar gz output directory self test dir double output computed files os makedirs job context exist ok true homo sapiens organism get object for name homo sapiens taxonomy id sample sample sample organism homo sapiens sample save job context sample salmon run salmontools job context confirm job status self asserttrue job context unpack result for checking os system gunzip job context gz check two output files output job context unmapped by salmon fa expected output self test dir expected double output unmapped by salmon fa self asserttrue identical checksum output expected output output job context unmapped by salmon fa expected output self test dir expected double output unmapped by salmon fa self asserttrue identical checksum output expected output single read python def test single read self test outputs when the sample has one read only job context job id job processorjob pipeline pipeline name salmon input file path self test dir single input single read fastq output directory self test dir single output salmontools directory self test dir single salmontools salmontools archive self test dir salmontools result tar gz computed files os makedirs job context exist ok true homo sapiens organism get object for name homo sapiens taxonomy id sample sample sample organism homo 
sapiens sample save job context sample salmon run salmontools job context confirm job status self asserttrue job context unpack result for checking os system gunzip job context gz check output file output file job context unmapped by salmon fa expected output file self test dir expected single output unmapped by salmon fa self asserttrue identical checksum output file expected output file problem or idea since the data is no longer available we need to recreate this data i believe we are considering using to generate this solution or next step tagging jaclyn taroni jashapiro for next steps
0
10,642
13,446,179,941
IssuesEvent
2020-09-08 12:35:36
MHRA/products
https://api.github.com/repos/MHRA/products
closed
PARs - Upload a new PAR
EPIC - PARs process HIGH PRIORITY :arrow_double_up: STORY :book:
## User want As a Medical Writer in the licensing team I would like to upload a new PAR So that I can ensure that new PARs are always available on products.mhra.gov.uk (Linked to #397 and #398.) ### Customer acceptance criteria - [x] Medical writers can upload a PAR PDF file to the website - [x] Medical writers can enter the relevant metadata about the document, such as the product name, active substances and Licence number - [x] The new document is available on the products.mhra.gov.uk site - [x] The PAR can be linked to one or multiple products (which will have their own PL number) - [x] Medical writers must be logged in to the site with their MHRA account to upload and will be redirected to the sign in page if they are not. The can trigger the normal microsoft azure AD sign in from the page Following fields are captured: - [x] Product name per product - [x] Active substances per product - [x] Licence number for each product - [x] Strength for each product - [x] Pharmaceutical dose form for each product ### Technical acceptance criteria - [x] PAR pdf is in blob storage after upload - [x] PAR metadata is attached to blobs - [x] PAR is in the Azure search index after upload - [x] blob/metadata/search index is handled by doc-index-updater - [x] Users accessing the app must be authenticated by being in the MHRA Azure group ### Infrastructure acceptance criteria - [x] **terraform**: new blob storage container for temporary PAR storage - [x] **terraform**: app registration - [x] **terraform**: private blob container for pars-upload website - [x] **terraform**: CDN / IP restriction / URL rewrite with SAS token - [x] **github action**: master - [x] pars-upload push to non-prod private blob container - [ ] purges the non-prod cache of the CDN (unnecessary if we have HTML cache off) - [x] **github action**: branch - [x] doc-index-updater/pars-upload compatibility tests - [x] usual a11y stuff and cypress tests for pars-upload site ### Data acceptance criteria ### Testing 
acceptance criteria **Size** XL **Value** **Effort** ### Exit Criteria met - [x] Backlog - [x] Discovery - [x] DUXD - [x] Development - [ ] Quality Assurance - [ ] Release and Validate #813
1.0
PARs - Upload a new PAR - ## User want As a Medical Writer in the licensing team I would like to upload a new PAR So that I can ensure that new PARs are always available on products.mhra.gov.uk (Linked to #397 and #398.) ### Customer acceptance criteria - [x] Medical writers can upload a PAR PDF file to the website - [x] Medical writers can enter the relevant metadata about the document, such as the product name, active substances and Licence number - [x] The new document is available on the products.mhra.gov.uk site - [x] The PAR can be linked to one or multiple products (which will have their own PL number) - [x] Medical writers must be logged in to the site with their MHRA account to upload and will be redirected to the sign in page if they are not. The can trigger the normal microsoft azure AD sign in from the page Following fields are captured: - [x] Product name per product - [x] Active substances per product - [x] Licence number for each product - [x] Strength for each product - [x] Pharmaceutical dose form for each product ### Technical acceptance criteria - [x] PAR pdf is in blob storage after upload - [x] PAR metadata is attached to blobs - [x] PAR is in the Azure search index after upload - [x] blob/metadata/search index is handled by doc-index-updater - [x] Users accessing the app must be authenticated by being in the MHRA Azure group ### Infrastructure acceptance criteria - [x] **terraform**: new blob storage container for temporary PAR storage - [x] **terraform**: app registration - [x] **terraform**: private blob container for pars-upload website - [x] **terraform**: CDN / IP restriction / URL rewrite with SAS token - [x] **github action**: master - [x] pars-upload push to non-prod private blob container - [ ] purges the non-prod cache of the CDN (unnecessary if we have HTML cache off) - [x] **github action**: branch - [x] doc-index-updater/pars-upload compatibility tests - [x] usual a11y stuff and cypress tests for pars-upload site ### Data 
acceptance criteria ### Testing acceptance criteria **Size** XL **Value** **Effort** ### Exit Criteria met - [x] Backlog - [x] Discovery - [x] DUXD - [x] Development - [ ] Quality Assurance - [ ] Release and Validate #813
process
pars upload a new par user want as a medical writer in the licensing team i would like to upload a new par so that i can ensure that new pars are always available on products mhra gov uk linked to and customer acceptance criteria medical writers can upload a par pdf file to the website medical writers can enter the relevant metadata about the document such as the product name active substances and licence number the new document is available on the products mhra gov uk site the par can be linked to one or multiple products which will have their own pl number medical writers must be logged in to the site with their mhra account to upload and will be redirected to the sign in page if they are not the can trigger the normal microsoft azure ad sign in from the page following fields are captured product name per product active substances per product licence number for each product strength for each product pharmaceutical dose form for each product technical acceptance criteria par pdf is in blob storage after upload par metadata is attached to blobs par is in the azure search index after upload blob metadata search index is handled by doc index updater users accessing the app must be authenticated by being in the mhra azure group infrastructure acceptance criteria terraform new blob storage container for temporary par storage terraform app registration terraform private blob container for pars upload website terraform cdn ip restriction url rewrite with sas token github action master pars upload push to non prod private blob container purges the non prod cache of the cdn unnecessary if we have html cache off github action branch doc index updater pars upload compatibility tests usual stuff and cypress tests for pars upload site data acceptance criteria testing acceptance criteria size xl value effort exit criteria met backlog discovery duxd development quality assurance release and validate
1
258,609
22,332,265,038
IssuesEvent
2022-06-14 15:23:18
Reference-LAPACK/lapack
https://api.github.com/repos/Reference-LAPACK/lapack
opened
stest_rfp: STFSM test is very sensitive to small numeric errors in BLAS STRSM
Type: Question Related: Testing
As originally reported in https://github.com/xianyi/OpenBLAS/issues/3648 , the test for STFSM in sdrvrf3.f compares against the result of STRSM with a fairly small default threshold. While this happens to work with the reference implementation of the BLAS, spurious failure may occur with a user-supplied BLAS (as observed with OpenBLAS) although the deviation is within the expectations for a single precision calculation making use of FMA or similar hardware features. While this situation is more or less described in the FAQ as a "minor testing failure" (http://www.netlib.org/lapack/faq.html#_how_do_i_interpret_lapack_testing_failures) it may be causing unnecessary concern for distribution packagers etc. Suggested options would be to compare against DTRSM, or trivially to increase the threshold. The same (non?)issue very likely exists with ctest_rfp/CTFSM
1.0
stest_rfp: STFSM test is very sensitive to small numeric errors in BLAS STRSM - As originally reported in https://github.com/xianyi/OpenBLAS/issues/3648 , the test for STFSM in sdrvrf3.f compares against the result of STRSM with a fairly small default threshold. While this happens to work with the reference implementation of the BLAS, spurious failure may occur with a user-supplied BLAS (as observed with OpenBLAS) although the deviation is within the expectations for a single precision calculation making use of FMA or similar hardware features. While this situation is more or less described in the FAQ as a "minor testing failure" (http://www.netlib.org/lapack/faq.html#_how_do_i_interpret_lapack_testing_failures) it may be causing unnecessary concern for distribution packagers etc. Suggested options would be to compare against DTRSM, or trivially to increase the threshold. The same (non?)issue very likely exists with ctest_rfp/CTFSM
non_process
stest rfp stfsm test is very sensitive to small numeric errors in blas strsm as originally reported in the test for stfsm in f compares against the result of strsm with a fairly small default threshold while this happens to work with the reference implementation of the blas spurious failure may occur with a user supplied blas as observed with openblas although the deviation is within the expectations for a single precision calculation making use of fma or similar hardware features while this situation is more or less described in the faq as a minor testing failure it may be causing unnecessary concern for distribution packagers etc suggested options would be to compare against dtrsm or trivially to increase the threshold the same non issue very likely exists with ctest rfp ctfsm
0
97,585
16,236,394,624
IssuesEvent
2021-05-07 01:37:51
michaeldotson/babys-first-rails
https://api.github.com/repos/michaeldotson/babys-first-rails
opened
CVE-2020-7595 (High) detected in nokogiri-1.10.3.gem
security vulnerability
## CVE-2020-7595 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nokogiri-1.10.3.gem</b></p></summary> <p>Nokogiri (鋸) is an HTML, XML, SAX, and Reader parser. Among Nokogiri's many features is the ability to search documents via XPath or CSS3 selectors.</p> <p>Library home page: <a href="https://rubygems.org/gems/nokogiri-1.10.3.gem">https://rubygems.org/gems/nokogiri-1.10.3.gem</a></p> <p>Path to dependency file: /babys-first-rails/Gemfile.lock</p> <p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/nokogiri-1.10.3.gem</p> <p> Dependency Hierarchy: - sass-rails-5.0.7.gem (Root Library) - sprockets-rails-3.2.1.gem - actionpack-5.2.2.gem - rails-html-sanitizer-1.0.4.gem - loofah-2.2.3.gem - :x: **nokogiri-1.10.3.gem** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> xmlStringLenDecodeEntities in parser.c in libxml2 2.9.10 has an infinite loop in a certain end-of-file situation. <p>Publish Date: 2020-01-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7595>CVE-2020-7595</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://security.gentoo.org/glsa/202010-04">https://security.gentoo.org/glsa/202010-04</a></p> <p>Fix Resolution: All libxml2 users should upgrade to the latest version # emerge --sync # emerge --ask --oneshot --verbose >=dev-libs/libxml2-2.9.10 >= </p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-7595 (High) detected in nokogiri-1.10.3.gem - ## CVE-2020-7595 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>nokogiri-1.10.3.gem</b></p></summary> <p>Nokogiri (鋸) is an HTML, XML, SAX, and Reader parser. Among Nokogiri's many features is the ability to search documents via XPath or CSS3 selectors.</p> <p>Library home page: <a href="https://rubygems.org/gems/nokogiri-1.10.3.gem">https://rubygems.org/gems/nokogiri-1.10.3.gem</a></p> <p>Path to dependency file: /babys-first-rails/Gemfile.lock</p> <p>Path to vulnerable library: /var/lib/gems/2.3.0/cache/nokogiri-1.10.3.gem</p> <p> Dependency Hierarchy: - sass-rails-5.0.7.gem (Root Library) - sprockets-rails-3.2.1.gem - actionpack-5.2.2.gem - rails-html-sanitizer-1.0.4.gem - loofah-2.2.3.gem - :x: **nokogiri-1.10.3.gem** (Vulnerable Library) </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> xmlStringLenDecodeEntities in parser.c in libxml2 2.9.10 has an infinite loop in a certain end-of-file situation. <p>Publish Date: 2020-01-21 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7595>CVE-2020-7595</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://security.gentoo.org/glsa/202010-04">https://security.gentoo.org/glsa/202010-04</a></p> <p>Fix Resolution: All libxml2 users should upgrade to the latest version # emerge --sync # emerge --ask --oneshot --verbose >=dev-libs/libxml2-2.9.10 >= </p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in nokogiri gem cve high severity vulnerability vulnerable library nokogiri gem nokogiri 鋸 is an html xml sax and reader parser among nokogiri s many features is the ability to search documents via xpath or selectors library home page a href path to dependency file babys first rails gemfile lock path to vulnerable library var lib gems cache nokogiri gem dependency hierarchy sass rails gem root library sprockets rails gem actionpack gem rails html sanitizer gem loofah gem x nokogiri gem vulnerable library vulnerability details xmlstringlendecodeentities in parser c in has an infinite loop in a certain end of file situation publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href fix resolution all users should upgrade to the latest version emerge sync emerge ask oneshot verbose dev libs step up your open source security game with whitesource
0
5,091
7,876,651,779
IssuesEvent
2018-06-26 02:23:25
Great-Hill-Corporation/quickBlocks
https://api.github.com/repos/Great-Hill-Corporation/quickBlocks
closed
ethName should allow for customized list of names.
status-inprocess tools-ethName type-enhancement
The standard list of names for ethName should be a customized ethName.toml file in the ~/.quickBlocks folder instead of a weird tab-delimited text file like the 'when' block program.
1.0
ethName should allow for customized list of names. - The standard list of names for ethName should be a customized ethName.toml file in the ~/.quickBlocks folder instead of a weird tab-delimited text file like the 'when' block program.
process
ethname should allow for customized list of names the standard list of names for ethname should be a customized ethname toml file in the quickblocks folder instead of a weird tab delimited text file like the when block program
1
109,434
16,845,820,244
IssuesEvent
2021-06-19 13:08:49
mukul-seagate11/cortx-s3server
https://api.github.com/repos/mukul-seagate11/cortx-s3server
closed
CVE-2020-9548 (High) detected in jackson-databind-2.6.6.jar
security vulnerability
## CVE-2020-9548 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: cortx-s3server/auth-utils/jclient/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.6/jackson-databind-2.6.6.jar</p> <p> Dependency Hierarchy: - aws-java-sdk-s3-1.11.37.jar (Root Library) - aws-java-sdk-core-1.11.37.jar - :x: **jackson-databind-2.6.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/mukul-seagate11/cortx-s3server/commits/03f1533c44ecd1d636be384cd5d10a8eb1e25f47">03f1533c44ecd1d636be384cd5d10a8eb1e25f47</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPConfig (aka anteros-core). 
<p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9548>CVE-2020-9548</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2020-9548 (High) detected in jackson-databind-2.6.6.jar - ## CVE-2020-9548 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.6.6.jar</b></p></summary> <p>General data-binding functionality for Jackson: works on core streaming API</p> <p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p> <p>Path to dependency file: cortx-s3server/auth-utils/jclient/pom.xml</p> <p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.6.6/jackson-databind-2.6.6.jar</p> <p> Dependency Hierarchy: - aws-java-sdk-s3-1.11.37.jar (Root Library) - aws-java-sdk-core-1.11.37.jar - :x: **jackson-databind-2.6.6.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/mukul-seagate11/cortx-s3server/commits/03f1533c44ecd1d636be384cd5d10a8eb1e25f47">03f1533c44ecd1d636be384cd5d10a8eb1e25f47</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> FasterXML jackson-databind 2.x before 2.9.10.4 mishandles the interaction between serialization gadgets and typing, related to br.com.anteros.dbcp.AnterosDBCPConfig (aka anteros-core). 
<p>Publish Date: 2020-03-02 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-9548>CVE-2020-9548</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-9548</a></p> <p>Release Date: 2020-03-02</p> <p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.7.9.7,2.8.11.6,2.9.10.4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in jackson databind jar cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file cortx auth utils jclient pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy aws java sdk jar root library aws java sdk core jar x jackson databind jar vulnerable library found in head commit a href found in base branch main vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to br com anteros dbcp anterosdbcpconfig aka anteros core publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind step up your open source security game with whitesource
0
388,355
26,762,223,326
IssuesEvent
2023-01-31 08:01:02
luzhiled1333/comp-library
https://api.github.com/repos/luzhiled1333/comp-library
opened
[geometry] Slab Decomposition
documentation enhancement test
## Description ## File Name `src/*` ## TODO - [ ] 実装 - [ ] ドキュメント作成 - [ ] unit-test - [ ] verify ## note https://twitter.com/SSRS_cp/status/1543881120022171649?s=20&t=jUJzXBbGKfFmrPrUfVH9wA - implementation - (url) - 関連 issue / PR - # - verify: - (url)
1.0
[geometry] Slab Decomposition - ## Description ## File Name `src/*` ## TODO - [ ] 実装 - [ ] ドキュメント作成 - [ ] unit-test - [ ] verify ## note https://twitter.com/SSRS_cp/status/1543881120022171649?s=20&t=jUJzXBbGKfFmrPrUfVH9wA - implementation - (url) - 関連 issue / PR - # - verify: - (url)
non_process
slab decomposition description file name src todo 実装 ドキュメント作成 unit test verify note implementation url 関連 issue pr verify url
0
11,160
13,957,693,869
IssuesEvent
2020-10-24 08:11:05
alexanderkotsev/geoportal
https://api.github.com/repos/alexanderkotsev/geoportal
opened
PT: Missing resources in Geoportal
Geoportal Harvesting process PT - Portugal
Collected from the Geoportal Workshop online survey answers: For 2 download records: http://inspire-geoportal.ec.europa.eu/download_details.html?view=downloadDetails&amp;resourceId=%2FINSPIRE-d60bf7f3-ea96-11e4-a2c7-52540004b857_20181206-150503%2Fservices%2F1%2FPullResults%2F751-765%2Fdatasets%2F6&amp;expandedSection=metadata http://inspireservices.azores.gov.pt/services/discovery/GetRecordById?SERVICE=CSW&amp;REQUEST=GetRecordById&amp;ElementSetName=full&amp;OutputFormat=application/xml&amp;OutputSchema=http://www.isotc211.org/2005/gmd&amp;Id=5b0718ed-dda8-4e84-9627-8c12035cb229 We think it should have the download link in INSPIRE Geoportal.
1.0
PT: Missing resources in Geoportal - Collected from the Geoportal Workshop online survey answers: For 2 download records: http://inspire-geoportal.ec.europa.eu/download_details.html?view=downloadDetails&amp;resourceId=%2FINSPIRE-d60bf7f3-ea96-11e4-a2c7-52540004b857_20181206-150503%2Fservices%2F1%2FPullResults%2F751-765%2Fdatasets%2F6&amp;expandedSection=metadata http://inspireservices.azores.gov.pt/services/discovery/GetRecordById?SERVICE=CSW&amp;REQUEST=GetRecordById&amp;ElementSetName=full&amp;OutputFormat=application/xml&amp;OutputSchema=http://www.isotc211.org/2005/gmd&amp;Id=5b0718ed-dda8-4e84-9627-8c12035cb229 We think it should have the download link in INSPIRE Geoportal.
process
pt missing resources in geoportal collected from the geoportal workshop online survey answers for download records we think it should have the download link in inspire geoportal
1
18,917
24,862,828,152
IssuesEvent
2022-10-27 09:34:55
aiidateam/aiida-core
https://api.github.com/repos/aiidateam/aiida-core
closed
Crash of all workflows (DuplicateSubscriberIdentifier)
type/bug priority/important topic/engine topic/processes
Dear Aiida Users, I got the following error message that lead to the crash of all running jobs/workflows/calculations during the weekend. kiwipy.communications.DuplicateSubscriberIdentifier: Broadcast identifier I am usually running with several daemon threads. Any idea how I could avoid such problems in the future? (Running aiida 1.1.1 at the moment) ``` File "/path/to/python/lib/python3.7/site-packages/aiida/manage/external/rmq.py", line 187, in _continue result = yield super()._continue(communicator, pid, nowait, tag) File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 307, in wrapper yielded = next(result) File "/path/to/python/lib/python3.7/site-packages/plumpy/process_comms.py", line 547, in _continue proc = saved_state.unbundle(self._load_context) File "/path/to/python/lib/python3.7/site-packages/plumpy/persistence.py", line 51, in unbundle return Savable.load(self, load_context) File "/path/to/python/lib/python3.7/site-packages/plumpy/persistence.py", line 447, in load return load_cls.recreate_from(saved_state, load_context) File "/path/to/python/lib/python3.7/site-packages/plumpy/processes.py", line 236, in recreate_from base.call_with_super_check(process.init) File "/path/to/python/lib/python3.7/site-packages/plumpy/base/utils.py", line 29, in call_with_super_check fn(*args, **kwargs) File "/path/to/python/lib/python3.7/site-packages/aiida/engine/processes/process.py", line 126, in init super().init() File "/path/to/python/lib/python3.7/site-packages/plumpy/base/utils.py", line 16, in new_fn fn(self, *args, **kwargs) File "/path/to/python/lib/python3.7/site-packages/plumpy/processes.py", line 295, in init self.broadcast_receive, identifier=str(self.pid)) File 
"/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 592, in add_broadcast_subscriber return self._run_task(coro) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 677, in _run_task return self.tornado_to_kiwi_future(self._create_task(coro)).result(timeout=self.TASK_TIMEOUT) File "/path/to/python/lib/python3.7/concurrent/futures/_base.py", line 435, in result return self.__get_result() File "/path/to/python/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result raise self._exception File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 646, in done result = done_future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/kiwipy/futures.py", line 54, in capture_exceptions yield File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/utils.py", line 146, in run_task future.set_result((yield coro())) File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1063, in run yielded = self.gen.throw(*exc_info) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 393, in add_broadcast_subscriber identifier = yield self._message_subscriber.add_broadcast_subscriber(subscriber, identifier) File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File 
"/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 307, in wrapper yielded = next(result) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 146, in add_broadcast_subscriber raise kiwipy.DuplicateSubscriberIdentifier("Broadcast identifier '{}'".format(identifier)) kiwipy.communications.DuplicateSubscriberIdentifier: Broadcast identifier '157299' ```
1.0
Crash of all workflows (DuplicateSubscriberIdentifier) - Dear Aiida Users, I got the following error message that lead to the crash of all running jobs/workflows/calculations during the weekend. kiwipy.communications.DuplicateSubscriberIdentifier: Broadcast identifier I am usually running with several daemon threads. Any idea how I could avoid such problems in the future? (Running aiida 1.1.1 at the moment) ``` File "/path/to/python/lib/python3.7/site-packages/aiida/manage/external/rmq.py", line 187, in _continue result = yield super()._continue(communicator, pid, nowait, tag) File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 307, in wrapper yielded = next(result) File "/path/to/python/lib/python3.7/site-packages/plumpy/process_comms.py", line 547, in _continue proc = saved_state.unbundle(self._load_context) File "/path/to/python/lib/python3.7/site-packages/plumpy/persistence.py", line 51, in unbundle return Savable.load(self, load_context) File "/path/to/python/lib/python3.7/site-packages/plumpy/persistence.py", line 447, in load return load_cls.recreate_from(saved_state, load_context) File "/path/to/python/lib/python3.7/site-packages/plumpy/processes.py", line 236, in recreate_from base.call_with_super_check(process.init) File "/path/to/python/lib/python3.7/site-packages/plumpy/base/utils.py", line 29, in call_with_super_check fn(*args, **kwargs) File "/path/to/python/lib/python3.7/site-packages/aiida/engine/processes/process.py", line 126, in init super().init() File "/path/to/python/lib/python3.7/site-packages/plumpy/base/utils.py", line 16, in new_fn fn(self, *args, **kwargs) File "/path/to/python/lib/python3.7/site-packages/plumpy/processes.py", line 295, in init 
self.broadcast_receive, identifier=str(self.pid)) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 592, in add_broadcast_subscriber return self._run_task(coro) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 677, in _run_task return self.tornado_to_kiwi_future(self._create_task(coro)).result(timeout=self.TASK_TIMEOUT) File "/path/to/python/lib/python3.7/concurrent/futures/_base.py", line 435, in result return self.__get_result() File "/path/to/python/lib/python3.7/concurrent/futures/_base.py", line 384, in __get_result raise self._exception File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 646, in done result = done_future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/kiwipy/futures.py", line 54, in capture_exceptions yield File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/utils.py", line 146, in run_task future.set_result((yield coro())) File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File "<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1063, in run yielded = self.gen.throw(*exc_info) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 393, in add_broadcast_subscriber identifier = yield self._message_subscriber.add_broadcast_subscriber(subscriber, identifier) File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 1055, in run value = future.result() File "/path/to/python/lib/python3.7/site-packages/tornado/concurrent.py", line 238, in result raise_exc_info(self._exc_info) File 
"<string>", line 4, in raise_exc_info File "/path/to/python/lib/python3.7/site-packages/tornado/gen.py", line 307, in wrapper yielded = next(result) File "/path/to/python/lib/python3.7/site-packages/kiwipy/rmq/communicator.py", line 146, in add_broadcast_subscriber raise kiwipy.DuplicateSubscriberIdentifier("Broadcast identifier '{}'".format(identifier)) kiwipy.communications.DuplicateSubscriberIdentifier: Broadcast identifier '157299' ```
process
crash of all workflows duplicatesubscriberidentifier dear aiida users i got the following error message that lead to the crash of all running jobs workflows calculations during the weekend kiwipy communications duplicatesubscriberidentifier broadcast identifier i am usually running with several daemon threads any idea how i could avoid such problems in the future running aiida at the moment file path to python lib site packages aiida manage external rmq py line in continue result yield super continue communicator pid nowait tag file path to python lib site packages tornado gen py line in run value future result file path to python lib site packages tornado concurrent py line in result raise exc info self exc info file line in raise exc info file path to python lib site packages tornado gen py line in wrapper yielded next result file path to python lib site packages plumpy process comms py line in continue proc saved state unbundle self load context file path to python lib site packages plumpy persistence py line in unbundle return savable load self load context file path to python lib site packages plumpy persistence py line in load return load cls recreate from saved state load context file path to python lib site packages plumpy processes py line in recreate from base call with super check process init file path to python lib site packages plumpy base utils py line in call with super check fn args kwargs file path to python lib site packages aiida engine processes process py line in init super init file path to python lib site packages plumpy base utils py line in new fn fn self args kwargs file path to python lib site packages plumpy processes py line in init self broadcast receive identifier str self pid file path to python lib site packages kiwipy rmq communicator py line in add broadcast subscriber return self run task coro file path to python lib site packages kiwipy rmq communicator py line in run task return self tornado to kiwi future self create task 
coro result timeout self task timeout file path to python lib concurrent futures base py line in result return self get result file path to python lib concurrent futures base py line in get result raise self exception file path to python lib site packages kiwipy rmq communicator py line in done result done future result file path to python lib site packages tornado concurrent py line in result raise exc info self exc info file line in raise exc info file path to python lib site packages kiwipy futures py line in capture exceptions yield file path to python lib site packages kiwipy rmq utils py line in run task future set result yield coro file path to python lib site packages tornado gen py line in run value future result file path to python lib site packages tornado concurrent py line in result raise exc info self exc info file line in raise exc info file path to python lib site packages tornado gen py line in run yielded self gen throw exc info file path to python lib site packages kiwipy rmq communicator py line in add broadcast subscriber identifier yield self message subscriber add broadcast subscriber subscriber identifier file path to python lib site packages tornado gen py line in run value future result file path to python lib site packages tornado concurrent py line in result raise exc info self exc info file line in raise exc info file path to python lib site packages tornado gen py line in wrapper yielded next result file path to python lib site packages kiwipy rmq communicator py line in add broadcast subscriber raise kiwipy duplicatesubscriberidentifier broadcast identifier format identifier kiwipy communications duplicatesubscriberidentifier broadcast identifier
1
10,185
13,044,162,862
IssuesEvent
2020-07-29 03:47:37
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `IntervalReal` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description Port the scalar function `IntervalReal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @sticnarf ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
2.0
UCP: Migrate scalar function `IntervalReal` from TiDB - ## Description Port the scalar function `IntervalReal` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @sticnarf ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
process
ucp migrate scalar function intervalreal from tidb description port the scalar function intervalreal from tidb to coprocessor score mentor s sticnarf recommended skills rust programming learning materials already implemented expressions ported from tidb
1
247,452
26,711,589,174
IssuesEvent
2023-01-28 01:10:05
Jacksole/merge-conflict
https://api.github.com/repos/Jacksole/merge-conflict
opened
CVE-2023-22486 (Low) detected in commonmarkerv0.23.4
security vulnerability
## CVE-2023-22486 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commonmarkerv0.23.4</b></p></summary> <p> <p>Ruby wrapper for libcmark (CommonMark parser)</p> <p>Library home page: <a href=https://github.com/gjtorikian/commonmarker.git>https://github.com/gjtorikian/commonmarker.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/Jacksole/merge-conflict/commit/610bafc337c97d3a77a49e5a5883996ed444f04a">610bafc337c97d3a77a49e5a5883996ed444f04a</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/_vendor/bundle/ruby/2.7.0/gems/commonmarker-0.23.4/ext/commonmarker/inlines.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> cmark-gfm is GitHub's fork of cmark, a CommonMark parsing and rendering library and program in C. Versions prior to 0.29.0.gfm.7 contain a polynomial time complexity issue in handle_close_bracket that may lead to unbounded resource exhaustion and subsequent denial of service. This vulnerability has been patched in 0.29.0.gfm.7. 
<p>Publish Date: 2023-01-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-22486>CVE-2023-22486</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Adjacent - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/github/cmark-gfm/security/advisories/GHSA-r572-jvj2-3m8p">https://github.com/github/cmark-gfm/security/advisories/GHSA-r572-jvj2-3m8p</a></p> <p>Release Date: 2022-12-29</p> <p>Fix Resolution: 0.29.0.gfm.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2023-22486 (Low) detected in commonmarkerv0.23.4 - ## CVE-2023-22486 - Low Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commonmarkerv0.23.4</b></p></summary> <p> <p>Ruby wrapper for libcmark (CommonMark parser)</p> <p>Library home page: <a href=https://github.com/gjtorikian/commonmarker.git>https://github.com/gjtorikian/commonmarker.git</a></p> <p>Found in HEAD commit: <a href="https://github.com/Jacksole/merge-conflict/commit/610bafc337c97d3a77a49e5a5883996ed444f04a">610bafc337c97d3a77a49e5a5883996ed444f04a</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> <img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/_vendor/bundle/ruby/2.7.0/gems/commonmarker-0.23.4/ext/commonmarker/inlines.c</b> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary> <p> cmark-gfm is GitHub's fork of cmark, a CommonMark parsing and rendering library and program in C. Versions prior to 0.29.0.gfm.7 contain a polynomial time complexity issue in handle_close_bracket that may lead to unbounded resource exhaustion and subsequent denial of service. This vulnerability has been patched in 0.29.0.gfm.7. 
<p>Publish Date: 2023-01-26 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2023-22486>CVE-2023-22486</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Adjacent - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/github/cmark-gfm/security/advisories/GHSA-r572-jvj2-3m8p">https://github.com/github/cmark-gfm/security/advisories/GHSA-r572-jvj2-3m8p</a></p> <p>Release Date: 2022-12-29</p> <p>Fix Resolution: 0.29.0.gfm.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve low detected in cve low severity vulnerability vulnerable library ruby wrapper for libcmark commonmark parser library home page a href found in head commit a href found in base branch master vulnerable source files vendor bundle ruby gems commonmarker ext commonmarker inlines c vulnerability details cmark gfm is github s fork of cmark a commonmark parsing and rendering library and program in c versions prior to gfm contain a polynomial time complexity issue in handle close bracket that may lead to unbounded resource exhaustion and subsequent denial of service this vulnerability has been patched in gfm publish date url a href cvss score details base score metrics exploitability metrics attack vector adjacent attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution gfm step up your open source security game with mend
0
19,026
25,034,875,829
IssuesEvent
2022-11-04 15:14:11
googleapis/python-ndb
https://api.github.com/repos/googleapis/python-ndb
closed
Your .repo-metadata.json file has a problem 🤒
type: process api: datastore repo-metadata: lint
You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'python-ndb' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
1.0
Your .repo-metadata.json file has a problem 🤒 - You have a problem with your .repo-metadata.json file: Result of scan 📈: * api_shortname 'python-ndb' invalid in .repo-metadata.json ☝️ Once you address these problems, you can close this issue. ### Need help? * [Schema definition](https://github.com/googleapis/repo-automation-bots/blob/main/packages/repo-metadata-lint/src/repo-metadata-schema.json): lists valid options for each field. * [API index](https://github.com/googleapis/googleapis/blob/master/api-index-v1.json): for gRPC libraries **api_shortname** should match the subdomain of an API's **hostName**. * Reach out to **go/github-automation** if you have any questions.
process
your repo metadata json file has a problem 🤒 you have a problem with your repo metadata json file result of scan 📈 api shortname python ndb invalid in repo metadata json ☝️ once you address these problems you can close this issue need help lists valid options for each field for grpc libraries api shortname should match the subdomain of an api s hostname reach out to go github automation if you have any questions
1
16,567
21,578,607,533
IssuesEvent
2022-05-02 16:11:25
cypress-io/cypress-documentation
https://api.github.com/repos/cypress-io/cypress-documentation
opened
Update non-best practice selectors in code snippets (Cypress API)
content: rewrite process: internal docs
### Subject examples ### Description Stemming from [this issue](https://github.com/cypress-io/cypress-documentation/issues/4227), which we all agree should be updated. 1. Using IDs or classes in the examples should be updated for consistency with data-*, per the best practices docs. This can happen in waves. This one being command pages: [Table of Contents | Cypress Documentation](https://docs.cypress.io/api/table-of-contents) 2. There is a larger conversation happening to update the best practices around selectors to consider things like aria roles and RTL style selectors.
1.0
Update non-best practice selectors in code snippets (Cypress API) - ### Subject examples ### Description Stemming from [this issue](https://github.com/cypress-io/cypress-documentation/issues/4227), which we all agree should be updated. 1. Using IDs or classes in the examples should be updated for consistency with data-*, per the best practices docs. This can happen in waves. This one being command pages: [Table of Contents | Cypress Documentation](https://docs.cypress.io/api/table-of-contents) 2. There is a larger conversation happening to update the best practices around selectors to consider things like aria roles and RTL style selectors.
process
update non best practice selectors in code snippets cypress api subject examples description stemming from which we all agree should be updated using ids or classes in the examples should be updated for consistency with data per the best practices docs this can happen in waves this one being command pages there is a larger conversation happening to update the best practices around selectors to consider things like aria roles and rtl style selectors
1
17,935
23,933,350,388
IssuesEvent
2022-09-10 22:00:24
nodejs/node
https://api.github.com/repos/nodejs/node
closed
process.config is not cloneable anymore in v16.x
process
v16.0.0 / process #36902 made `process.config` immutable. That's fine in principle but the use of a `Proxy` breaks the following code: ```js v8.serialize(process.config) // Uncaught Error: #<Object> could not be cloned. ``` The properties themselves are proxies too so `Object.fromEntries(Object.entries(process.config))` is not an easy workaround. I suggest using `Object.freeze()` or `Object.seal()` instead. cc @jasnell
1.0
process.config is not cloneable anymore in v16.x - v16.0.0 / process #36902 made `process.config` immutable. That's fine in principle but the use of a `Proxy` breaks the following code: ```js v8.serialize(process.config) // Uncaught Error: #<Object> could not be cloned. ``` The properties themselves are proxies too so `Object.fromEntries(Object.entries(process.config))` is not an easy workaround. I suggest using `Object.freeze()` or `Object.seal()` instead. cc @jasnell
process
process config is not cloneable anymore in x process made process config immutable that s fine in principle but the use of a proxy breaks the following code js serialize process config uncaught error could not be cloned the properties themselves are proxies too so object fromentries object entries process config is not an easy workaround i suggest using object freeze or object seal instead cc jasnell
1
22,706
32,033,988,612
IssuesEvent
2023-09-22 14:09:50
h4sh5/npm-auto-scanner
https://api.github.com/repos/h4sh5/npm-auto-scanner
opened
typescript-assistant 0.64.2 has 3 guarddog issues
npm-install-script npm-silent-process-execution
```{"npm-install-script":[{"code":" \"prepare\": \"npm run fix:hooks\",","location":"package/package.json:14","message":"The package.json has a script automatically running when the package is installed"}],"npm-silent-process-execution":[{"code":" let install = (0, child_process_1.spawn)(\"node\", [\"./build/npm-install.js\"], {\n stdio: \"ignore\",\n shell: true,\n detached: true,\n cwd: currentDir,\n });","location":"package/dist/helpers.js:109","message":"This package is silently executing another executable"},{"code":" let install = spawn(\"node\", [\"./build/npm-install.js\"], {\n stdio: \"ignore\",\n shell: true,\n detached: true,\n cwd: currentDir,\n });","location":"package/src/helpers.ts:106","message":"This package is silently executing another executable"}]}```
1.0
typescript-assistant 0.64.2 has 3 guarddog issues - ```{"npm-install-script":[{"code":" \"prepare\": \"npm run fix:hooks\",","location":"package/package.json:14","message":"The package.json has a script automatically running when the package is installed"}],"npm-silent-process-execution":[{"code":" let install = (0, child_process_1.spawn)(\"node\", [\"./build/npm-install.js\"], {\n stdio: \"ignore\",\n shell: true,\n detached: true,\n cwd: currentDir,\n });","location":"package/dist/helpers.js:109","message":"This package is silently executing another executable"},{"code":" let install = spawn(\"node\", [\"./build/npm-install.js\"], {\n stdio: \"ignore\",\n shell: true,\n detached: true,\n cwd: currentDir,\n });","location":"package/src/helpers.ts:106","message":"This package is silently executing another executable"}]}```
process
typescript assistant has guarddog issues npm install script npm silent process execution n stdio ignore n shell true n detached true n cwd currentdir n location package dist helpers js message this package is silently executing another executable code let install spawn node n stdio ignore n shell true n detached true n cwd currentdir n location package src helpers ts message this package is silently executing another executable
1
15,737
19,910,391,317
IssuesEvent
2022-01-25 16:36:20
input-output-hk/high-assurance-legacy
https://api.github.com/repos/input-output-hk/high-assurance-legacy
closed
Base the definitions of concrete strong bisimilarity relations on weak residual structures
type: bug language: isabelle topic: process calculus
Currently the strong bisimilarity relations of the basic and the proper transition system are defined in terms of relators that are auto-generated by Isabelle. However, the general proof that strong bisimilarity is contained in weak bisimilarity refers to strong bisimilarity relations that are defined in terms of corresponding weak residual structures. As a result, we cannot apply this general result to our concrete bisimilarity relations. Our goal is to modify the definitions of the basic and the proper strong bisimilarity relation such that they are based on the corresponding weak residual structures. - [ ] Base the definitions of concrete strong bisimilarity relations on weak residual structures - [ ] Remove the lemma skeletons for concrete strong–weak bisimilarity containment
1.0
Base the definitions of concrete strong bisimilarity relations on weak residual structures - Currently the strong bisimilarity relations of the basic and the proper transition system are defined in terms of relators that are auto-generated by Isabelle. However, the general proof that strong bisimilarity is contained in weak bisimilarity refers to strong bisimilarity relations that are defined in terms of corresponding weak residual structures. As a result, we cannot apply this general result to our concrete bisimilarity relations. Our goal is to modify the definitions of the basic and the proper strong bisimilarity relation such that they are based on the corresponding weak residual structures. - [ ] Base the definitions of concrete strong bisimilarity relations on weak residual structures - [ ] Remove the lemma skeletons for concrete strong–weak bisimilarity containment
process
base the definitions of concrete strong bisimilarity relations on weak residual structures currently the strong bisimilarity relations of the basic and the proper transition system are defined in terms of relators that are auto generated by isabelle however the general proof that strong bisimilarity is contained in weak bisimilarity refers to strong bisimilarity relations that are defined in terms of corresponding weak residual structures as a result we cannot apply this general result to our concrete bisimilarity relations our goal is to modify the definitions of the basic and the proper strong bisimilarity relation such that they are based on the corresponding weak residual structures base the definitions of concrete strong bisimilarity relations on weak residual structures remove the lemma skeletons for concrete strong–weak bisimilarity containment
1
305,542
9,370,978,994
IssuesEvent
2019-04-03 14:31:40
FCP-INDI/C-PAC
https://api.github.com/repos/FCP-INDI/C-PAC
closed
CPAC GUI should cancel/rename when loading duplicate config names.
0 - Low Priority bug
At the current state, if you enter pipeline or subject config files with the same name into the main window of the gui, the duplication throws an error and requires the user to exit and restart the program instead of allowing renaming or cancelling.
1.0
CPAC GUI should cancel/rename when loading duplicate config names. - At the current state, if you enter pipeline or subject config files with the same name into the main window of the gui, the duplication throws an error and requires the user to exit and restart the program instead of allowing renaming or cancelling.
non_process
cpac gui should cancel rename when loading duplicate config names at the current state if you enter pipeline or subject config files with the same name into the main window of the gui the duplication throws an error and requires the user to exit and restart the program instead of allowing renaming or cancelling
0
107,686
16,762,156,401
IssuesEvent
2021-06-14 01:03:03
ioana-nicolae/first
https://api.github.com/repos/ioana-nicolae/first
opened
WS-2020-0342 (Medium) detected in is-my-json-valid-2.15.0.tgz
security vulnerability
## WS-2020-0342 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>is-my-json-valid-2.15.0.tgz</b></p></summary> <p>A JSONSchema validator that uses code generation to be extremely fast</p> <p>Library home page: <a href="https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz">https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz</a></p> <p>Path to dependency file: first/angular.js-master/angular.js-master/yarn.lock</p> <p>Path to vulnerable library: first/angular.js-master/angular.js-master/yarn.lock</p> <p> Dependency Hierarchy: - gulp-eslint-3.0.1.tgz (Root Library) - eslint-3.15.0.tgz - :x: **is-my-json-valid-2.15.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Regular Expression Denial of Service (ReDoS) vulnerability was found in is-my-json-valid before 2.20.2 via the style format. <p>Publish Date: 2020-06-27 <p>URL: <a href=https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb>WS-2020-0342</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb">https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb</a></p> <p>Release Date: 2020-06-27</p> <p>Fix Resolution: is-my-json-valid - 2.20.2</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"is-my-json-valid","packageVersion":"2.15.0","packageFilePaths":["/angular.js-master/angular.js-master/yarn.lock"],"isTransitiveDependency":true,"dependencyTree":"gulp-eslint:3.0.1;eslint:3.15.0;is-my-json-valid:2.15.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"is-my-json-valid - 2.20.2"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2020-0342","vulnerabilityDetails":"Regular Expression Denial of Service (ReDoS) vulnerability was found in is-my-json-valid before 2.20.2 via the style format.","vulnerabilityUrl":"https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
WS-2020-0342 (Medium) detected in is-my-json-valid-2.15.0.tgz - ## WS-2020-0342 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>is-my-json-valid-2.15.0.tgz</b></p></summary> <p>A JSONSchema validator that uses code generation to be extremely fast</p> <p>Library home page: <a href="https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz">https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz</a></p> <p>Path to dependency file: first/angular.js-master/angular.js-master/yarn.lock</p> <p>Path to vulnerable library: first/angular.js-master/angular.js-master/yarn.lock</p> <p> Dependency Hierarchy: - gulp-eslint-3.0.1.tgz (Root Library) - eslint-3.15.0.tgz - :x: **is-my-json-valid-2.15.0.tgz** (Vulnerable Library) <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Regular Expression Denial of Service (ReDoS) vulnerability was found in is-my-json-valid before 2.20.2 via the style format. <p>Publish Date: 2020-06-27 <p>URL: <a href=https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb>WS-2020-0342</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb">https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb</a></p> <p>Release Date: 2020-06-27</p> <p>Fix Resolution: is-my-json-valid - 2.20.2</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"is-my-json-valid","packageVersion":"2.15.0","packageFilePaths":["/angular.js-master/angular.js-master/yarn.lock"],"isTransitiveDependency":true,"dependencyTree":"gulp-eslint:3.0.1;eslint:3.15.0;is-my-json-valid:2.15.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"is-my-json-valid - 2.20.2"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2020-0342","vulnerabilityDetails":"Regular Expression Denial of Service (ReDoS) vulnerability was found in is-my-json-valid before 2.20.2 via the style format.","vulnerabilityUrl":"https://github.com/mafintosh/is-my-json-valid/commit/c3fc04fc455d40e9b29537f8e2c73a28ce106edb","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
ws medium detected in is my json valid tgz ws medium severity vulnerability vulnerable library is my json valid tgz a jsonschema validator that uses code generation to be extremely fast library home page a href path to dependency file first angular js master angular js master yarn lock path to vulnerable library first angular js master angular js master yarn lock dependency hierarchy gulp eslint tgz root library eslint tgz x is my json valid tgz vulnerable library found in base branch master vulnerability details regular expression denial of service redos vulnerability was found in is my json valid before via the style format publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution is my json valid isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree gulp eslint eslint is my json valid isminimumfixversionavailable true minimumfixversion is my json valid basebranches vulnerabilityidentifier ws vulnerabilitydetails regular expression denial of service redos vulnerability was found in is my json valid before via the style format vulnerabilityurl
0
283,677
8,721,977,783
IssuesEvent
2018-12-09 06:53:16
Text-Mining/Persian-NER
https://api.github.com/repos/Text-Mining/Persian-NER
closed
Add user profile
Effort: Low Priority: Medium Product: Web App Type: enhancement
for now just these fields: Name & family we use this data in the leaderboard and generating contributors file
1.0
Add user profile - for now just these fields: Name & family we use this data in the leaderboard and generating contributors file
non_process
add user profile for now just these fields name family we use this data in the leaderboard and generating contributors file
0
236,772
7,752,739,376
IssuesEvent
2018-05-30 21:17:08
martchellop/Entretenibit
https://api.github.com/repos/martchellop/Entretenibit
closed
Scrape a single event
enhancement priority: higest
Define one page to do that and get all the information possible from the event, comment your decisions in #64
1.0
Scrape a single event - Define one page to do that and get all the information possible from the event, comment your decisions in #64
non_process
scrape a single event define one page to do that and get all the information possible from the event comment your decisions in
0
201,660
7,034,780,597
IssuesEvent
2017-12-27 18:57:08
angular/angular-cli
https://api.github.com/repos/angular/angular-cli
closed
Custom decorators are stripped when using the ng build --prod
need: repro steps priority: 2 (required)
Even standard decorators are removed. ng build --prod Please remove the deletion Probably related https://github.com/angular/angular-cli/issues/8725
1.0
Custom decorators are stripped when using the ng build --prod - Even standard decorators are removed. ng build --prod Please remove the deletion Probably related https://github.com/angular/angular-cli/issues/8725
non_process
custom decorators are stripped when using the ng build prod even standard decorators are removed ng build prod please remove the deletion probably related
0
22,606
10,762,410,567
IssuesEvent
2019-10-31 23:34:55
RuslanGox/BigRepo
https://api.github.com/repos/RuslanGox/BigRepo
opened
CVE-2018-16492 (High) detected in extend-3.0.0.tgz
security vulnerability
## CVE-2018-16492 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>extend-3.0.0.tgz</b></p></summary> <p>Port of jQuery.extend for node.js and the browser</p> <p>Library home page: <a href="https://registry.npmjs.org/extend/-/extend-3.0.0.tgz">https://registry.npmjs.org/extend/-/extend-3.0.0.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/BigRepo/moodle-3.2.2/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/BigRepo/moodle-3.2.2/node_modules/extend/package.json</p> <p> Dependency Hierarchy: - grunt-contrib-less-1.3.0.tgz (Root Library) - less-2.6.1.tgz - request-2.73.0.tgz - :x: **extend-3.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/RuslanGox/BigRepo/commit/60f1b99636e9b000c240f2aa5f152ff189c55741">60f1b99636e9b000c240f2aa5f152ff189c55741</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A prototype pollution vulnerability was found in module extend <2.0.2, ~<3.0.2 that allows an attacker to inject arbitrary properties onto Object.prototype. <p>Publish Date: 2019-02-01 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-16492>CVE-2018-16492</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://hackerone.com/reports/381185">https://hackerone.com/reports/381185</a></p> <p>Release Date: 2019-02-01</p> <p>Fix Resolution: extend - v3.0.2,v2.0.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2018-16492 (High) detected in extend-3.0.0.tgz - ## CVE-2018-16492 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>extend-3.0.0.tgz</b></p></summary> <p>Port of jQuery.extend for node.js and the browser</p> <p>Library home page: <a href="https://registry.npmjs.org/extend/-/extend-3.0.0.tgz">https://registry.npmjs.org/extend/-/extend-3.0.0.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/BigRepo/moodle-3.2.2/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/BigRepo/moodle-3.2.2/node_modules/extend/package.json</p> <p> Dependency Hierarchy: - grunt-contrib-less-1.3.0.tgz (Root Library) - less-2.6.1.tgz - request-2.73.0.tgz - :x: **extend-3.0.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/RuslanGox/BigRepo/commit/60f1b99636e9b000c240f2aa5f152ff189c55741">60f1b99636e9b000c240f2aa5f152ff189c55741</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A prototype pollution vulnerability was found in module extend <2.0.2, ~<3.0.2 that allows an attacker to inject arbitrary properties onto Object.prototype. 
<p>Publish Date: 2019-02-01 <p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-16492>CVE-2018-16492</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://hackerone.com/reports/381185">https://hackerone.com/reports/381185</a></p> <p>Release Date: 2019-02-01</p> <p>Fix Resolution: extend - v3.0.2,v2.0.2</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in extend tgz cve high severity vulnerability vulnerable library extend tgz port of jquery extend for node js and the browser library home page a href path to dependency file tmp ws scm bigrepo moodle package json path to vulnerable library tmp ws scm bigrepo moodle node modules extend package json dependency hierarchy grunt contrib less tgz root library less tgz request tgz x extend tgz vulnerable library found in head commit a href vulnerability details a prototype pollution vulnerability was found in module extend that allows an attacker to inject arbitrary properties onto object prototype publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution extend step up your open source security game with whitesource
0
8,720
11,857,157,601
IssuesEvent
2020-03-25 09:03:12
tikv/tikv
https://api.github.com/repos/tikv/tikv
opened
Unknown time zone 'posixrules'
component/coprocessor type/bug
## Bug Report Cannot start tidb because of timezone error: unknown timezone 'posixrules'. ### Steps to reproduce Start tidb and tikv on a machine with America/New_York timezone. This was caused by realpath of `America/New_York` will lead to `/usr/share/zoneinfo/posixrules`. But the `chrono_tz` doesn't understand `posixrules` in its implementation of [`from_str`](https://docs.rs/chrono-tz/0.5.1/src/chrono_tz/home/cratesfyi/cratesfyi/debug/build/chrono-tz-81e187c3996d21f6/out/timezones.rs.html#605-1202)
1.0
Unknown time zone 'posixrules' - ## Bug Report Cannot start tidb because of timezone error: unknown timezone 'posixrules'. ### Steps to reproduce Start tidb and tikv on a machine with America/New_York timezone. This was caused by realpath of `America/New_York` will lead to `/usr/share/zoneinfo/posixrules`. But the `chrono_tz` doesn't understand `posixrules` in its implementation of [`from_str`](https://docs.rs/chrono-tz/0.5.1/src/chrono_tz/home/cratesfyi/cratesfyi/debug/build/chrono-tz-81e187c3996d21f6/out/timezones.rs.html#605-1202)
process
unknown time zone posixrules bug report cannot start tidb because of timezone error unknown timezone posixrules steps to reproduce start tidb and tikv on a machine with america new york timezone this was caused by realpath of america new york will lead to usr share zoneinfo posixrules but the chrono tz doesn t understand posixrules in its implementation of
1
6,943
10,112,021,874
IssuesEvent
2019-07-30 13:56:28
material-components/material-components-ios
https://api.github.com/repos/material-components/material-components-ios
closed
[TextFields] Perform accessibility usability study and come up with recommendations
[TextFields] type:Process
## Feature Request/Suggestion TextField accessibility is complicated and there are no clear "best" solutions. Let's work with our internal research and stakeholder groups to find a good solution with backing data. <!-- Auto-generated content below, do not modify --> --- #### Internal data - Associated internal bug: [b/117176625](http://b/117176625)
1.0
[TextFields] Perform accessibility usability study and come up with recommendations - ## Feature Request/Suggestion TextField accessibility is complicated and there are no clear "best" solutions. Let's work with our internal research and stakeholder groups to find a good solution with backing data. <!-- Auto-generated content below, do not modify --> --- #### Internal data - Associated internal bug: [b/117176625](http://b/117176625)
process
perform accessibility usability study and come up with recommendations feature request suggestion textfield accessibility is complicated and there are no clear best solutions let s work with our internal research and stakeholder groups to find a good solution with backing data internal data associated internal bug
1
53,989
13,891,033,562
IssuesEvent
2020-10-19 10:04:54
shaimael/Maven-Java-Example
https://api.github.com/repos/shaimael/Maven-Java-Example
opened
CVE-2017-2582 (Medium) detected in keycloak-saml-core-1.8.1.Final.jar
security vulnerability
## CVE-2017-2582 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>keycloak-saml-core-1.8.1.Final.jar</b></p></summary> <p>Keycloak SSO</p> <p>Library home page: <a href="http://keycloak.org">http://keycloak.org</a></p> <p>Path to dependency file: Maven-Java-Example/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/org/keycloak/keycloak-saml-core/1.8.1.Final/keycloak-saml-core-1.8.1.Final.jar</p> <p> Dependency Hierarchy: - :x: **keycloak-saml-core-1.8.1.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/shaimael/Maven-Java-Example/commits/45d147049c0f73132be85be6b29a4978139e1755">45d147049c0f73132be85be6b29a4978139e1755</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> It was found that while parsing the SAML messages the StaxParserUtil class of keycloak before 2.5.1 replaces special strings for obtaining attribute values with system property. This could allow an attacker to determine values of system properties at the attacked system by formatting the SAML request ID field to be the chosen system property which could be obtained in the "InResponseTo" field in the response. 
<p>Publish Date: 2018-07-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-2582>CVE-2017-2582</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-2582">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-2582</a></p> <p>Release Date: 2018-07-26</p> <p>Fix Resolution: org.keycloak:keycloak-saml-core:2.5.1.Final</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.keycloak","packageName":"keycloak-saml-core","packageVersion":"1.8.1.Final","isTransitiveDependency":false,"dependencyTree":"org.keycloak:keycloak-saml-core:1.8.1.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.keycloak:keycloak-saml-core:2.5.1.Final"}],"vulnerabilityIdentifier":"CVE-2017-2582","vulnerabilityDetails":"It was found that while parsing the SAML messages the StaxParserUtil class of keycloak before 2.5.1 replaces special strings for obtaining attribute values with system property. 
This could allow an attacker to determine values of system properties at the attacked system by formatting the SAML request ID field to be the chosen system property which could be obtained in the \"InResponseTo\" field in the response.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-2582","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
CVE-2017-2582 (Medium) detected in keycloak-saml-core-1.8.1.Final.jar - ## CVE-2017-2582 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>keycloak-saml-core-1.8.1.Final.jar</b></p></summary> <p>Keycloak SSO</p> <p>Library home page: <a href="http://keycloak.org">http://keycloak.org</a></p> <p>Path to dependency file: Maven-Java-Example/pom.xml</p> <p>Path to vulnerable library: canner/.m2/repository/org/keycloak/keycloak-saml-core/1.8.1.Final/keycloak-saml-core-1.8.1.Final.jar</p> <p> Dependency Hierarchy: - :x: **keycloak-saml-core-1.8.1.Final.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/shaimael/Maven-Java-Example/commits/45d147049c0f73132be85be6b29a4978139e1755">45d147049c0f73132be85be6b29a4978139e1755</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> It was found that while parsing the SAML messages the StaxParserUtil class of keycloak before 2.5.1 replaces special strings for obtaining attribute values with system property. This could allow an attacker to determine values of system properties at the attacked system by formatting the SAML request ID field to be the chosen system property which could be obtained in the "InResponseTo" field in the response. 
<p>Publish Date: 2018-07-26 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-2582>CVE-2017-2582</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: None - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-2582">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-2582</a></p> <p>Release Date: 2018-07-26</p> <p>Fix Resolution: org.keycloak:keycloak-saml-core:2.5.1.Final</p> </p> </details> <p></p> *** :rescue_worker_helmet: Automatic Remediation is available for this issue <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.keycloak","packageName":"keycloak-saml-core","packageVersion":"1.8.1.Final","isTransitiveDependency":false,"dependencyTree":"org.keycloak:keycloak-saml-core:1.8.1.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.keycloak:keycloak-saml-core:2.5.1.Final"}],"vulnerabilityIdentifier":"CVE-2017-2582","vulnerabilityDetails":"It was found that while parsing the SAML messages the StaxParserUtil class of keycloak before 2.5.1 replaces special strings for obtaining attribute values with system property. 
This could allow an attacker to determine values of system properties at the attacked system by formatting the SAML request ID field to be the chosen system property which could be obtained in the \"InResponseTo\" field in the response.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-2582","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"Low","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
cve medium detected in keycloak saml core final jar cve medium severity vulnerability vulnerable library keycloak saml core final jar keycloak sso library home page a href path to dependency file maven java example pom xml path to vulnerable library canner repository org keycloak keycloak saml core final keycloak saml core final jar dependency hierarchy x keycloak saml core final jar vulnerable library found in head commit a href found in base branch master vulnerability details it was found that while parsing the saml messages the staxparserutil class of keycloak before replaces special strings for obtaining attribute values with system property this could allow an attacker to determine values of system properties at the attacked system by formatting the saml request id field to be the chosen system property which could be obtained in the inresponseto field in the response publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org keycloak keycloak saml core final rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails it was found that while parsing the saml messages the staxparserutil class of keycloak before replaces special strings for obtaining attribute values with system property this could allow an attacker to determine values of system properties at the attacked system by formatting the saml request id field to be the chosen system property which could be obtained in the inresponseto field in the response vulnerabilityurl
0
19,231
25,384,695,858
IssuesEvent
2022-11-21 20:46:08
Open-Data-Product-Initiative/open-data-product-spec-1.1dev
https://api.github.com/repos/Open-Data-Product-Initiative/open-data-product-spec-1.1dev
closed
Internal data product description
enhancement Unprocessed
Based on feedback and discussion: When data product is internal: Document level attributes: `onlyInternal boolean` Data Holder: `internalBU (business unit) string` Research and discussion are still needed for the solution proposal.
1.0
Internal data product description - Based on feedback and discussion: When data product is internal: Document level attributes: `onlyInternal boolean` Data Holder: `internalBU (business unit) string` Research and discussion are still needed for the solution proposal.
process
internal data product description based on feedback and discussion when data product is internal document level attributes onlyinternal boolean data holder internalbu business unit string research and discussion are still needed for the solution proposal
1
324,070
23,982,296,741
IssuesEvent
2022-09-13 15:57:12
awslabs/smithy
https://api.github.com/repos/awslabs/smithy
closed
How to generate Java classes from the model definition using the Gradle plugin
guidance documentation
[Also asked in: https://github.com/awslabs/smithy-gradle-plugin/issues/61] In an attempt to introduce my company to automated code generation from service definitions, I am looking to get started with defining my APIs with Smithy and use the Gradle plugin to generate the Java code. I followed the complete example as described in https://awslabs.github.io/smithy/2.0/quickstart.html#complete-example After running gradle build, it did generate the Jar but it does not contain any Java classes. ``` jar -tvf build/libs/smithy-example.jar 0 Mon Sep 05 12:46:56 PDT 2022 META-INF/ 40 Mon Sep 05 12:44:26 PDT 2022 META-INF/MANIFEST.MF 0 Mon Sep 05 12:46:56 PDT 2022 META-INF/smithy/ 15 Mon Sep 05 12:46:56 PDT 2022 META-INF/smithy/manifest 2577 Mon Sep 05 12:46:56 PDT 2022 META-INF/smithy/weather.smithy ``` The contents of my project files are exactly as shown in the complete example section. After running gradle build I expected it to generate the model classes and a client SDK for the service but that does not seem to be the case. Let me know what I am missing or if I completely misunderstood this project.
1.0
How to generate Java classes from the model definition using the Gradle plugin - [Also asked in: https://github.com/awslabs/smithy-gradle-plugin/issues/61] In an attempt to introduce my company to automated code generation from service definitions, I am looking to get started with defining my APIs with Smithy and use the Gradle plugin to generate the Java code. I followed the complete example as described in https://awslabs.github.io/smithy/2.0/quickstart.html#complete-example After running gradle build, it did generate the Jar but it does not contain any Java classes. ``` jar -tvf build/libs/smithy-example.jar 0 Mon Sep 05 12:46:56 PDT 2022 META-INF/ 40 Mon Sep 05 12:44:26 PDT 2022 META-INF/MANIFEST.MF 0 Mon Sep 05 12:46:56 PDT 2022 META-INF/smithy/ 15 Mon Sep 05 12:46:56 PDT 2022 META-INF/smithy/manifest 2577 Mon Sep 05 12:46:56 PDT 2022 META-INF/smithy/weather.smithy ``` The contents of my project files are exactly as shown in the complete example section. After running gradle build I expected it to generate the model classes and a client SDK for the service but that does not seem to be the case. Let me know what I am missing or if I completely misunderstood this project.
non_process
how to generate java classes from the model definition using the gradle plugin in an attempt to introduce my company to automated code generation from service definitions i am looking to get started with defining my apis with smithy and use the gradle plugin to generate the java code i followed the complete example as described in after running gradle build it did generate the jar but it does not contain any java classes jar tvf build libs smithy example jar mon sep pdt meta inf mon sep pdt meta inf manifest mf mon sep pdt meta inf smithy mon sep pdt meta inf smithy manifest mon sep pdt meta inf smithy weather smithy the contents of my project files are exactly as shown in the complete example section after running gradle build i expected it to generate the model classes and a client sdk for the service but that does not seem to be the case let me know what i am missing or if i completely misunderstood this project
0
15,073
18,768,206,642
IssuesEvent
2021-11-06 10:15:13
metabase/metabase
https://api.github.com/repos/metabase/metabase
closed
Use connection metadata for SQL Connection using jdbc
.Proposal Querying/Processor
Refering to the issue https://github.com/metabase/metabase/issues/2386. Could this be considered? Today, without setting this, we see for example for Postgresql : `PostgreSQL JDBC Driver` which is not really convenient. This feature is typically **not** done using comments, but using connection metadata, which is a easy to query via separate column(s) and typically preserved in logs and performance reports **ORACLE** uses concept of DBMS_APPLICATION_INFO with CLIENT_INFO / MODULE / ACTION which are for the most part preserved in AWR/ASM reports and take almost no time to setup/change per JDBC call https://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_appinf.htm **POSTGRES** has a concept of APPLICATION_NAME which can be set during or anytime after connection and is preserved in the logs and stored as a column in pg_stat_activity https://www.postgresql.org/docs/9.5/static/runtime-config-logging.html#GUC-APPLICATION-NAME **SQL SERVER** (it changed a lot since I was deep in it) has similar concept https://social.msdn.microsoft.com/Forums/sqlserver/en-US/6dacb63d-7eb4-449e-b747-a23ac67237ef/equallent-application-dbmsapplicationinfoin-sql-server?forum=sqlgetstarted _Originally posted by @CzechJiri in https://github.com/metabase/metabase/issues/2386#issuecomment-225627373_
1.0
Use connection metadata for SQL Connection using jdbc - Refering to the issue https://github.com/metabase/metabase/issues/2386. Could this be considered? Today, without setting this, we see for example for Postgresql : `PostgreSQL JDBC Driver` which is not really convenient. This feature is typically **not** done using comments, but using connection metadata, which is a easy to query via separate column(s) and typically preserved in logs and performance reports **ORACLE** uses concept of DBMS_APPLICATION_INFO with CLIENT_INFO / MODULE / ACTION which are for the most part preserved in AWR/ASM reports and take almost no time to setup/change per JDBC call https://docs.oracle.com/cd/B28359_01/appdev.111/b28419/d_appinf.htm **POSTGRES** has a concept of APPLICATION_NAME which can be set during or anytime after connection and is preserved in the logs and stored as a column in pg_stat_activity https://www.postgresql.org/docs/9.5/static/runtime-config-logging.html#GUC-APPLICATION-NAME **SQL SERVER** (it changed a lot since I was deep in it) has similar concept https://social.msdn.microsoft.com/Forums/sqlserver/en-US/6dacb63d-7eb4-449e-b747-a23ac67237ef/equallent-application-dbmsapplicationinfoin-sql-server?forum=sqlgetstarted _Originally posted by @CzechJiri in https://github.com/metabase/metabase/issues/2386#issuecomment-225627373_
process
use connection metadata for sql connection using jdbc refering to the issue could this be considered today without setting this we see for example for postgresql postgresql jdbc driver which is not really convenient this feature is typically not done using comments but using connection metadata which is a easy to query via separate column s and typically preserved in logs and performance reports oracle uses concept of dbms application info with client info module action which are for the most part preserved in awr asm reports and take almost no time to setup change per jdbc call postgres has a concept of application name which can be set during or anytime after connection and is preserved in the logs and stored as a column in pg stat activity sql server it changed a lot since i was deep in it has similar concept originally posted by czechjiri in
1
757,508
26,515,613,231
IssuesEvent
2023-01-18 20:31:22
franciscoadasme/chem.cr
https://api.github.com/repos/franciscoadasme/chem.cr
opened
Add measurement methods to Vec3 to allow method chaining
kind:feature topic:spatial priority:normal
One must use the functions in the `Spatial` module to compute the distance, angle, dihedral, etc. Even though it provides a generic API accepting several argument combinations, it may hurt discoverability (functionality is spread across multiple modules) and so it may be more convenient to have them as methods on `Vec3`, e.g., ```crystal Chem::Spatial.distance v1, v2 Chem::Spatial.angle v1, v2 # vs v1.distance_to(v2) v1.angle_to(v2) ``` However, function overloads that require more than 2 arguments may prove difficult to translate: ```crystal Chem::Spatial.dihedral v1, v2, v3, v4 # vs v1.dihedral v2, v3, v4 # or some alternative ```
1.0
Add measurement methods to Vec3 to allow method chaining - One must use the functions in the `Spatial` module to compute the distance, angle, dihedral, etc. Even though it provides a generic API accepting several argument combinations, it may hurt discoverability (functionality is spread across multiple modules) and so it may be more convenient to have them as methods on `Vec3`, e.g., ```crystal Chem::Spatial.distance v1, v2 Chem::Spatial.angle v1, v2 # vs v1.distance_to(v2) v1.angle_to(v2) ``` However, function overloads that require more than 2 arguments may prove difficult to translate: ```crystal Chem::Spatial.dihedral v1, v2, v3, v4 # vs v1.dihedral v2, v3, v4 # or some alternative ```
non_process
add measurement methods to to allow method chaining one must use the functions in the spatial module to compute the distance angle dihedral etc even though it provides a generic api accepting several argument combinations it may hurt discoverability functionality is spread across multiple modules and so it may be more convenient to have them as methods on e g crystal chem spatial distance chem spatial angle vs distance to angle to however function overloads that require more than arguments may prove difficult to translate crystal chem spatial dihedral vs dihedral or some alternative
0