| Unnamed: 0 (int64) | id (float64) | type (string, 1 class) | created_at (string, len 19) | repo (string, len 5–112) | repo_url (string, len 34–141) | action (string, 3 classes) | title (string, len 1–1k) | labels (string, len 4–1.38k) | body (string, len 1–262k) | index (string, 16 classes) | text_combine (string, len 96–262k) | label (string, 2 classes) | text (string, len 96–252k) | binary_label (int64, 0–1) |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
73,329 | 3,410,906,103 | IssuesEvent | 2015-12-04 22:29:35 | DoSomething/phoenix | https://api.github.com/repos/DoSomething/phoenix | closed | Campaign Collections are missing content on front end | #campaign-collections @bender bug priority-high | Content is not being output properly on Campaign Collections. The following fields/field collections aren't working as intended:
- `field_intro_title`
- `field_intro`
- `field_faq`
- `field_gallery`
Not sure if other content is being displayed properly or not.
example: https://www.dosomething.org/volunteer/step-game
Potentially related to #5714 | 1.0 | Campaign Collections are missing content on front end - Content is not being output properly on Campaign Collections. The following fields/field collections aren't working as intended:
- `field_intro_title`
- `field_intro`
- `field_faq`
- `field_gallery`
Not sure if other content is being displayed properly or not.
example: https://www.dosomething.org/volunteer/step-game
Potentially related to #5714 | priority | campaign collections are missing content on front end content is not being output properly on campaign collections the following fields field collections aren t working as intended field intro title field intro field faq field gallery not sure if other content is being displayed properly or not example potentially related to | 1 |
683,584 | 23,388,025,313 | IssuesEvent | 2022-08-11 15:14:58 | rangav/thunder-client-support | https://api.github.com/repos/rangav/thunder-client-support | closed | Allow multiple iterations in collections runner | feature request Priority | **Describe the solution you'd like**
As the title says, I would like to be able to run x times a collection (the same function as postman does!).
It could be awesome! 😁

| 1.0 | Allow multiple iterations in collections runner - **Describe the solution you'd like**
As the title says, I would like to be able to run x times a collection (the same function as postman does!).
It could be awesome! 😁

| priority | allow multiple iterations in collections runner describe the solution you d like as the title says i would like to be able to run x times a collection the same function as postman does it could be awesome 😁 | 1 |
99,331 | 16,445,973,554 | IssuesEvent | 2021-05-20 19:38:03 | tuanducteam/service.tuanducdesign.com | https://api.github.com/repos/tuanducteam/service.tuanducdesign.com | closed | CVE-2018-11499 (High) detected in opennmsopennms-source-26.0.0-1 | security vulnerability wontfix | ## CVE-2018-11499 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-26.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/tuanducteam/service.tuanducdesign.com/commits/bcda4c49653f0f100ba550797d8a0f0bf9c62ba3">bcda4c49653f0f100ba550797d8a0f0bf9c62ba3</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>service.tuanducdesign.com/node_modules/node-sass/src/libsass/src/parser.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A use-after-free vulnerability exists in handle_error() in sass_context.cpp in LibSass 3.4.x and 3.5.x through 3.5.4 that could be leveraged to cause a denial of service (application crash) or possibly unspecified other impact.
<p>Publish Date: 2018-05-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11499>CVE-2018-11499</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11499">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11499</a></p>
<p>Release Date: 2018-05-26</p>
<p>Fix Resolution: LibSass - 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-11499 (High) detected in opennmsopennms-source-26.0.0-1 - ## CVE-2018-11499 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>opennmsopennms-source-26.0.0-1</b></p></summary>
<p>
<p>A Java based fault and performance management system</p>
<p>Library home page: <a href=https://sourceforge.net/projects/opennms/>https://sourceforge.net/projects/opennms/</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/tuanducteam/service.tuanducdesign.com/commits/bcda4c49653f0f100ba550797d8a0f0bf9c62ba3">bcda4c49653f0f100ba550797d8a0f0bf9c62ba3</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>service.tuanducdesign.com/node_modules/node-sass/src/libsass/src/parser.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A use-after-free vulnerability exists in handle_error() in sass_context.cpp in LibSass 3.4.x and 3.5.x through 3.5.4 that could be leveraged to cause a denial of service (application crash) or possibly unspecified other impact.
<p>Publish Date: 2018-05-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11499>CVE-2018-11499</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11499">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-11499</a></p>
<p>Release Date: 2018-05-26</p>
<p>Fix Resolution: LibSass - 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in opennmsopennms source cve high severity vulnerability vulnerable library opennmsopennms source a java based fault and performance management system library home page a href found in head commit a href found in base branch master vulnerable source files service tuanducdesign com node modules node sass src libsass src parser cpp vulnerability details a use after free vulnerability exists in handle error in sass context cpp in libsass x and x through that could be leveraged to cause a denial of service application crash or possibly unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass step up your open source security game with whitesource | 0 |
317,071 | 23,663,209,081 | IssuesEvent | 2022-08-26 17:46:02 | FIRST-Tech-Challenge/ftcdocs | https://api.github.com/repos/FIRST-Tech-Challenge/ftcdocs | opened | Veteran page - Java and Android Programming | documentation | Add FREE Industry certification content from these software companies (oracle, AWS, Google). | 1.0 | Veteran page - Java and Android Programming - Add FREE Industry certification content from these software companies (oracle, AWS, Google). | non_priority | veteran page java and android programming add free industry certification content from these software companies oracle aws google | 0 |
129,591 | 18,106,171,121 | IssuesEvent | 2021-09-22 19:23:13 | samjcs/jqv | https://api.github.com/repos/samjcs/jqv | opened | WS-2018-0148 (High) detected in utile-0.2.1.tgz | security vulnerability | ## WS-2018-0148 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>utile-0.2.1.tgz</b></p></summary>
<p>A drop-in replacement for `util` with some additional advantageous functions</p>
<p>Library home page: <a href="https://registry.npmjs.org/utile/-/utile-0.2.1.tgz">https://registry.npmjs.org/utile/-/utile-0.2.1.tgz</a></p>
<p>Path to dependency file: jqv/package.json</p>
<p>Path to vulnerable library: jqv/node_modules/utile/package.json</p>
<p>
Dependency Hierarchy:
- grunt-jscs-2.8.0.tgz (Root Library)
- jscs-2.11.0.tgz
- prompt-0.2.14.tgz
- :x: **utile-0.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samjcs/jqv/commit/061d1885fe4148d27e0d62a985d4e6dbbb57d021">061d1885fe4148d27e0d62a985d4e6dbbb57d021</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The `utile` npm module, version 0.3.0, allows to extract sensitive data from uninitialized memory or to cause a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).
<p>Publish Date: 2018-07-16
<p>URL: <a href=https://hackerone.com/reports/321701>WS-2018-0148</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"utile","packageVersion":"0.2.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-jscs:2.8.0;jscs:2.11.0;prompt:0.2.14;utile:0.2.1","isMinimumFixVersionAvailable":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2018-0148","vulnerabilityDetails":"The `utile` npm module, version 0.3.0, allows to extract sensitive data from uninitialized memory or to cause a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).","vulnerabilityUrl":"https://hackerone.com/reports/321701","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | WS-2018-0148 (High) detected in utile-0.2.1.tgz - ## WS-2018-0148 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>utile-0.2.1.tgz</b></p></summary>
<p>A drop-in replacement for `util` with some additional advantageous functions</p>
<p>Library home page: <a href="https://registry.npmjs.org/utile/-/utile-0.2.1.tgz">https://registry.npmjs.org/utile/-/utile-0.2.1.tgz</a></p>
<p>Path to dependency file: jqv/package.json</p>
<p>Path to vulnerable library: jqv/node_modules/utile/package.json</p>
<p>
Dependency Hierarchy:
- grunt-jscs-2.8.0.tgz (Root Library)
- jscs-2.11.0.tgz
- prompt-0.2.14.tgz
- :x: **utile-0.2.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samjcs/jqv/commit/061d1885fe4148d27e0d62a985d4e6dbbb57d021">061d1885fe4148d27e0d62a985d4e6dbbb57d021</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The `utile` npm module, version 0.3.0, allows to extract sensitive data from uninitialized memory or to cause a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).
<p>Publish Date: 2018-07-16
<p>URL: <a href=https://hackerone.com/reports/321701>WS-2018-0148</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"utile","packageVersion":"0.2.1","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-jscs:2.8.0;jscs:2.11.0;prompt:0.2.14;utile:0.2.1","isMinimumFixVersionAvailable":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2018-0148","vulnerabilityDetails":"The `utile` npm module, version 0.3.0, allows to extract sensitive data from uninitialized memory or to cause a DoS by passing in a large number, in setups where typed user input can be passed (e.g. from JSON).","vulnerabilityUrl":"https://hackerone.com/reports/321701","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | ws high detected in utile tgz ws high severity vulnerability vulnerable library utile tgz a drop in replacement for util with some additional advantageous functions library home page a href path to dependency file jqv package json path to vulnerable library jqv node modules utile package json dependency hierarchy grunt jscs tgz root library jscs tgz prompt tgz x utile tgz vulnerable library found in head commit a href found in base branch main vulnerability details the utile npm module version allows to extract sensitive data from uninitialized memory or to cause a dos by passing in a large number in setups where typed user input can be passed e g from json publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href isopenpronvulnerability false ispackagebased true isdefaultbranch true 
packages istransitivedependency true dependencytree grunt jscs jscs prompt utile isminimumfixversionavailable false basebranches vulnerabilityidentifier ws vulnerabilitydetails the utile npm module version allows to extract sensitive data from uninitialized memory or to cause a dos by passing in a large number in setups where typed user input can be passed e g from json vulnerabilityurl | 0 |
107,691 | 23,467,548,739 | IssuesEvent | 2022-08-16 18:17:12 | vasl-developers/vasl | https://api.github.com/repos/vasl-developers/vasl | closed | Custom colors for map pointer/"this guy" thingy | enhancement 2 - Code | Different "this guy" colors for each player -- so when multiple people are kibitzing on the board you know who is referring to what
Also apply "zoom" factor to the size of the circle so it's smaller when zoomed out.
| 1.0 | Custom colors for map pointer/"this guy" thingy - Different "this guy" colors for each player -- so when multiple people are kibitzing on the board you know who is referring to what
Also apply "zoom" factor to the size of the circle so it's smaller when zoomed out.
| non_priority | custom colors for map pointer this guy thingy different this guy colors for each player so when multiple people are kibitzing on the board you know who is referring to what also apply zoom factor to the size of the circle so it s smaller when zoomed out | 0 |
340,558 | 10,273,234,030 | IssuesEvent | 2019-08-23 18:41:39 | metabase/metabase | https://api.github.com/repos/metabase/metabase | closed | tabs not working for the entreprise version | Priority:P1 Type:Bug | **Describe the bug**
Tabs inside the audit panel are not availlable. Only the overview is.
**Logs**
```
TypeError: a is not a function app-main.bundle.js:5:310347
onClick http://myurl.elasticbeanstalk.com/app/dist/app-main.bundle.js?fbc:5
i http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
s http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
executeDispatchesInOrder http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
h http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
f http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
forEach self-hosted:266
exports http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
processEventQueue http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
handleTopLevel http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
f http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
perform http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
batchedUpdates http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
batchedUpdates http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
dispatchEvent http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
dispatchEvent self-hosted:1053
```
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'http://myurl.elasticbeanstalk.com/admin/audit/members/all'
2. Click on 'all members'
3. Open the console
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**

**Information about your Metabase Installation:**
Metabase entreprise 1.1.6 docker on aws elasticbeanstalk.
**Severity**
Tabs is not working for all the audit pages.
| 1.0 | tabs not working for the entreprise version - **Describe the bug**
Tabs inside the audit panel are not availlable. Only the overview is.
**Logs**
```
TypeError: a is not a function app-main.bundle.js:5:310347
onClick http://myurl.elasticbeanstalk.com/app/dist/app-main.bundle.js?fbc:5
i http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
s http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
executeDispatchesInOrder http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
h http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
f http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
forEach self-hosted:266
exports http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
processEventQueue http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
handleTopLevel http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
f http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
perform http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
batchedUpdates http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
batchedUpdates http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
dispatchEvent http://myurl.elasticbeanstalk.com/app/dist/vendor.bundle.js?fbc:1
dispatchEvent self-hosted:1053
```
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'http://myurl.elasticbeanstalk.com/admin/audit/members/all'
2. Click on 'all members'
3. Open the console
4. See error
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**

**Information about your Metabase Installation:**
Metabase entreprise 1.1.6 docker on aws elasticbeanstalk.
**Severity**
Tabs is not working for all the audit pages.
| priority | tabs not working for the entreprise version describe the bug tabs inside the audit panel are not availlable only the overview is logs typeerror a is not a function app main bundle js onclick i s executedispatchesinorder h f foreach self hosted exports processeventqueue handletoplevel f perform batchedupdates batchedupdates dispatchevent dispatchevent self hosted to reproduce steps to reproduce the behavior go to click on all members open the console see error expected behavior a clear and concise description of what you expected to happen screenshots information about your metabase installation metabase entreprise docker on aws elasticbeanstalk severity tabs is not working for all the audit pages | 1 |
41,522 | 16,771,992,447 | IssuesEvent | 2021-06-14 15:48:24 | cityofaustin/atd-data-tech | https://api.github.com/repos/cityofaustin/atd-data-tech | opened | Cameron Plotter Error | Service: Geo Type: IT Support Workgroup: DTS | Work on plotter to get it to print again. Requiring new print head, but new print head is still causing errors. | 1.0 | Cameron Plotter Error - Work on plotter to get it to print again. Requiring new print head, but new print head is still causing errors. | non_priority | cameron plotter error work on plotter to get it to print again requiring new print head but new print head is still causing errors | 0 |
682,927 | 23,362,624,874 | IssuesEvent | 2022-08-10 13:01:29 | renovatebot/renovate | https://api.github.com/repos/renovatebot/renovate | opened | PR is being updated in dry mode when branch and pr were modified | type:bug priority-3-medium status:in-progress | ### How are you running Renovate?
Self-hosted
### If you're self-hosting Renovate, tell us what version of Renovate you run.
latest
### Please select which platform you are using if self-hosting.
github.com
### If you're self-hosting Renovate, tell us what version of the platform you run.
_No response_
### Was this something which used to work for you, and then stopped?
I never saw this working
### Describe the bug
Renovate will update a existing PR even when in dry run mode.
Steps to reproduce:
- Commit to any renovate feature branch with a non bot git account
- Edit the PR body corresponding to the above branch
- Run renovate in dry mode
Any repro will do, i for once used [this](https://github.com/ladzaretti/alpine-node)
### Relevant debug logs
<details><summary>Logs</summary>
```
DEBUG: File config
"config": {
"platform": "github",
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"token": "***********",
...
"dryRun": "full"
}
...
DEBUG: Found existing branch PR (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
DEBUG: PR has been edited (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
"prNo": 2
DEBUG: Updating existing PR to indicate that rebasing is not possible (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
DEBUG: updatePr(2, Update dependency node to v18, body) (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
DEBUG: PR updated (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
"pr": 2
```
</details>
### Have you created a minimal reproduction repository?
I have linked to a minimal reproduction repository in the bug description | 1.0 | PR is being updated in dry mode when branch and pr were modified - ### How are you running Renovate?
Self-hosted
### If you're self-hosting Renovate, tell us what version of Renovate you run.
latest
### Please select which platform you are using if self-hosting.
github.com
### If you're self-hosting Renovate, tell us what version of the platform you run.
_No response_
### Was this something which used to work for you, and then stopped?
I never saw this working
### Describe the bug
Renovate will update a existing PR even when in dry run mode.
Steps to reproduce:
- Commit to any renovate feature branch with a non bot git account
- Edit the PR body corresponding to the above branch
- Run renovate in dry mode
Any repro will do, i for once used [this](https://github.com/ladzaretti/alpine-node)
### Relevant debug logs
<details><summary>Logs</summary>
```
DEBUG: File config
"config": {
"platform": "github",
"$schema": "https://docs.renovatebot.com/renovate-schema.json",
"token": "***********",
...
"dryRun": "full"
}
...
DEBUG: Found existing branch PR (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
DEBUG: PR has been edited (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
"prNo": 2
DEBUG: Updating existing PR to indicate that rebasing is not possible (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
DEBUG: updatePr(2, Update dependency node to v18, body) (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
DEBUG: PR updated (repository=ladzaretti/alpine-node, branch=renovate/node-18.x)
"pr": 2
```
</details>
### Have you created a minimal reproduction repository?
I have linked to a minimal reproduction repository in the bug description | priority | pr is being updated in dry mode when branch and pr were modified how are you running renovate self hosted if you re self hosting renovate tell us what version of renovate you run latest please select which platform you are using if self hosting github com if you re self hosting renovate tell us what version of the platform you run no response was this something which used to work for you and then stopped i never saw this working describe the bug renovate will update a existing pr even when in dry run mode steps to reproduce commit to any renovate feature branch with a non bot git account edit the pr body corresponding to the above branch run renovate in dry mode any repro will do i for once used relevant debug logs logs debug file config config platform github schema token dryrun full debug found existing branch pr repository ladzaretti alpine node branch renovate node x debug pr has been edited repository ladzaretti alpine node branch renovate node x prno debug updating existing pr to indicate that rebasing is not possible repository ladzaretti alpine node branch renovate node x debug updatepr update dependency node to body repository ladzaretti alpine node branch renovate node x debug pr updated repository ladzaretti alpine node branch renovate node x pr have you created a minimal reproduction repository i have linked to a minimal reproduction repository in the bug description | 1 |
41,190 | 8,940,753,419 | IssuesEvent | 2019-01-24 01:06:17 | LeStahL/endeavor | https://api.github.com/repos/LeStahL/endeavor | closed | Design and pack endeavor font | GFX code | - very round or very hexagonal font, design and pack
- need code that packs it (python?)
- need code that draws it (signed, with circles etc.) | 1.0 | Design and pack endeavor font - - very round or very hexagonal font, design and pack
- need code that packs it (python?)
- need code that draws it (signed, with circles etc.) | non_priority | design and pack endeavor font very round or very hexagonal font design and pack need code that packs it python need code that draws it signed with circles etc | 0 |
350,914 | 10,510,596,353 | IssuesEvent | 2019-09-27 13:46:23 | dzavalishin/phantomuserland | https://api.github.com/repos/dzavalishin/phantomuserland | closed | spinlock reenter panic on first boot on clean QEMU | Component-Kernel Priority-Critical bug help wanted | second boot is ok - some clean memory error?
```
spinlock reenter detected, prev enter was here:
Stack:- 6a54c0: ?
Panic: reenter
tid 18 Stack:- 1dd2c8: _stack_dump
- 1a3e49: _panic
- 13a5d6: _hal_spin_init
- 13a61f: _hal_spin_lock
- 1a0784: _thread_block
- 1a2484: _hal_mutex_lock
- 12342d: _malloc
- 1dd131: _calloc
- 1a2172: _do_ctty_create
- 19baac: _pool_create_el
- 1a2249: _t_make_ctty
- 1a37ad: _t_new_ctty
- 13c6cf: _phantom_debug_window_loop
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 1 pri 07 blk 00000000
Thread 1 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2484: _hal_mutex_lock
- 12342d: _malloc
- 1dd131: _calloc
- 1d83fb: _ev_allocate_event
- 1d8877: _init_main_event_q
- 1682fd: _main
- 124557: _phantom_multiboot_main
- 100085: boot_code
T 2 pri 00 blk 00000000
Thread 2 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 111323: _hal_softirq_dispatcher
- 17fb08: _hal_PIC_interrupt_dispatcher
- 1774b7: call_handler
- 1a3e9d: _haltme
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 3 pri 02 blk 00000004 cond 49824a4
Thread 3 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a1a7d: _hal_cond_wait
- 1a2965: _t_do_some_kills
- 1a3ec2: _kill_thread_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 4 pri 07 blk 00000002
Thread 4 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 12df5d: _net_timer_runner
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 5 pri 28 blk 00000004 cond 4a868f0
Thread 5 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a1a7d: _hal_cond_wait
- 13a1a6: _dpc_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 6 pri 28 blk 00000004 cond 4a868f0
Thread 6 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a1a7d: _hal_cond_wait
- 13a1a6: _dpc_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 7 pri 07 blk 00000002
Thread 7 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 12d6f3: _arp_cleanup_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 8 pri 07 blk 00000010 sema 48c0314
Thread 8 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 12d54e: _arp_retransmit_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 9 pri 07 blk 00000010 sema 6a5da8
Thread 9 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 133a76: _udp_recvfrom
- 130474: _trfs_recv
- 130e1c: _trfs_recv_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T10 pri 07 blk 00000002
Thread 10 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 130f99: _trfs_resend_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T11 pri 07 blk 00000002
Thread 11 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 13573c: _tcp_echo_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T12 pri 07 blk 00000010 sema 6a5e88
Thread 12 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 133a76: _udp_recvfrom
- 1359c2: _udp_echo_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T13 pri 07 blk 00000010 sema 6e71ac
Thread 13 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 14fc79: _ne2000_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T14 pri 07 blk 00000002
Thread 14 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 150d14: _ne_read
- 134713: _if_rx_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T15 pri 07 blk 00000010 sema 73502c
Thread 15 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 134632: _if_tx_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T16 pri 28 blk 00000010 sema 48fab54
Thread 16 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 13df33: _mouse_push_event_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T17 pri 29 blk 00000000
Thread 17 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 111323: _hal_softirq_dispatcher
- 17fb08: _hal_PIC_interrupt_dispatcher
- 1774b7: call_handler
- 1d82c4: _vid_bitblt_part
- 1d5fd6: _vid_bitblt_part_rev
- 1d54ca: _repaint_win_part
- 1d5568: _w_repaint_screen_part
- 1d4d19: _paint_square_updown
- 1d4ded: _repaint_q
- 1d4e33: _painter_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T18 pri 07 blk 00000000
Thread 18 EIP 0x001A4171, Stack:- 6: ?
- 13a61f: _hal_spin_lock
- 1a0784: _thread_block
- 1a2484: _hal_mutex_lock
- 12342d: _malloc
- 1dd131: _calloc
- 1a2172: _do_ctty_create
- 19baac: _pool_create_el
- 1a2249: _t_make_ctty
- 1a37ad: _t_new_ctty
- 13c6cf: _phantom_debug_window_loop
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
```
| 1.0 | spinlock reenter panic on first boot on clean QEMU - second boot is ok - some clean memory error?
```
spinlock reenter detected, prev enter was here:
Stack:- 6a54c0: ?
Panic: reenter
tid 18 Stack:- 1dd2c8: _stack_dump
- 1a3e49: _panic
- 13a5d6: _hal_spin_init
- 13a61f: _hal_spin_lock
- 1a0784: _thread_block
- 1a2484: _hal_mutex_lock
- 12342d: _malloc
- 1dd131: _calloc
- 1a2172: _do_ctty_create
- 19baac: _pool_create_el
- 1a2249: _t_make_ctty
- 1a37ad: _t_new_ctty
- 13c6cf: _phantom_debug_window_loop
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 1 pri 07 blk 00000000
Thread 1 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2484: _hal_mutex_lock
- 12342d: _malloc
- 1dd131: _calloc
- 1d83fb: _ev_allocate_event
- 1d8877: _init_main_event_q
- 1682fd: _main
- 124557: _phantom_multiboot_main
- 100085: boot_code
T 2 pri 00 blk 00000000
Thread 2 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 111323: _hal_softirq_dispatcher
- 17fb08: _hal_PIC_interrupt_dispatcher
- 1774b7: call_handler
- 1a3e9d: _haltme
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 3 pri 02 blk 00000004 cond 49824a4
Thread 3 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a1a7d: _hal_cond_wait
- 1a2965: _t_do_some_kills
- 1a3ec2: _kill_thread_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 4 pri 07 blk 00000002
Thread 4 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 12df5d: _net_timer_runner
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 5 pri 28 blk 00000004 cond 4a868f0
Thread 5 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a1a7d: _hal_cond_wait
- 13a1a6: _dpc_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 6 pri 28 blk 00000004 cond 4a868f0
Thread 6 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a1a7d: _hal_cond_wait
- 13a1a6: _dpc_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 7 pri 07 blk 00000002
Thread 7 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 12d6f3: _arp_cleanup_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 8 pri 07 blk 00000010 sema 48c0314
Thread 8 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 12d54e: _arp_retransmit_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T 9 pri 07 blk 00000010 sema 6a5da8
Thread 9 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 133a76: _udp_recvfrom
- 130474: _trfs_recv
- 130e1c: _trfs_recv_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T10 pri 07 blk 00000002
Thread 10 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 130f99: _trfs_resend_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T11 pri 07 blk 00000002
Thread 11 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 13573c: _tcp_echo_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T12 pri 07 blk 00000010 sema 6a5e88
Thread 12 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 133a76: _udp_recvfrom
- 1359c2: _udp_echo_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T13 pri 07 blk 00000010 sema 6e71ac
Thread 13 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 14fc79: _ne2000_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T14 pri 07 blk 00000002
Thread 14 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a350b: _hal_sleep_msec
- 150d14: _ne_read
- 134713: _if_rx_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T15 pri 07 blk 00000010 sema 73502c
Thread 15 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 134632: _if_tx_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T16 pri 28 blk 00000010 sema 48fab54
Thread 16 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 16e55b: _ignore_handler
- 113305: _phantom_kernel_trap
- 5e7c6e: ?
- 12ca4c: _phantom_scheduler_request_soft_irq
- 19fe27: _phantom_scheduler_yield_locked
- 1a07f2: _thread_block
- 1a2f7e: _do_hal_sem_acquire_etc
- 1a32f6: _hal_sem_acquire
- 13df33: _mouse_push_event_thread
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T17 pri 29 blk 00000000
Thread 17 EIP 0x001A4171, Stack:- 19fe7c: _phantom_scheduler_soft_interrupt
- 111323: _hal_softirq_dispatcher
- 17fb08: _hal_PIC_interrupt_dispatcher
- 1774b7: call_handler
- 1d82c4: _vid_bitblt_part
- 1d5fd6: _vid_bitblt_part_rev
- 1d54ca: _repaint_win_part
- 1d5568: _w_repaint_screen_part
- 1d4d19: _paint_square_updown
- 1d4ded: _repaint_q
- 1d4e33: _painter_thread
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
T18 pri 07 blk 00000000
Thread 18 EIP 0x001A4171, Stack:- 6: ?
- 13a61f: _hal_spin_lock
- 1a0784: _thread_block
- 1a2484: _hal_mutex_lock
- 12342d: _malloc
- 1dd131: _calloc
- 1a2172: _do_ctty_create
- 19baac: _pool_create_el
- 1a2249: _t_make_ctty
- 1a37ad: _t_new_ctty
- 13c6cf: _phantom_debug_window_loop
- 1a0a62: _kernel_thread_starter
- 1a109f: _phantom_thread_c_starter
- 1a4428: _phantom_thread_trampoline
```
| priority | spinlock reenter panic on first boot on clean qemu second boot is ok some clean memory error spinlock reenter detected prev enter was here stack panic reenter tid stack stack dump panic hal spin init hal spin lock thread block hal mutex lock malloc calloc do ctty create pool create el t make ctty t new ctty phantom debug window loop kernel thread starter phantom thread c starter phantom thread trampoline t pri blk thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal mutex lock malloc calloc ev allocate event init main event q main phantom multiboot main boot code t pri blk thread eip stack phantom scheduler soft interrupt hal softirq dispatcher hal pic interrupt dispatcher call handler haltme phantom thread c starter phantom thread trampoline t pri blk cond thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal cond wait t do some kills kill thread thread kernel thread starter phantom thread c starter phantom thread trampoline t pri blk thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal sleep msec net timer runner phantom thread c starter phantom thread trampoline t pri blk cond thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal cond wait dpc thread kernel thread starter phantom thread c starter phantom thread trampoline t pri blk cond thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal cond wait dpc thread kernel thread starter phantom thread c starter phantom thread trampoline t pri blk thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal sleep msec arp cleanup thread kernel thread starter phantom thread c starter phantom thread trampoline t pri blk sema thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block do hal sem acquire etc hal sem acquire arp retransmit thread kernel thread starter phantom thread c starter phantom thread trampoline t pri blk sema thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block do hal sem acquire etc hal sem acquire udp recvfrom trfs recv trfs recv thread phantom thread c starter phantom thread trampoline pri blk thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal sleep msec trfs resend thread phantom thread c starter phantom thread trampoline pri blk thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal sleep msec tcp echo thread phantom thread c starter phantom thread trampoline pri blk sema thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block do hal sem acquire etc hal sem acquire udp recvfrom udp echo thread phantom thread c starter phantom thread trampoline pri blk sema thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block do hal sem acquire etc hal sem acquire thread phantom thread c starter phantom thread trampoline pri blk thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block hal sleep msec ne read if rx thread phantom thread c starter phantom thread trampoline pri blk sema thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block do hal sem acquire etc hal sem acquire if tx thread phantom thread c starter phantom thread trampoline pri blk sema thread eip stack phantom scheduler soft interrupt ignore handler phantom kernel trap phantom scheduler request soft irq phantom scheduler yield locked thread block do hal sem acquire etc hal sem acquire mouse push event thread kernel thread starter phantom thread c starter phantom thread trampoline pri blk thread eip stack phantom scheduler soft interrupt hal softirq dispatcher hal pic interrupt dispatcher call handler vid bitblt part vid bitblt part rev repaint win part w repaint screen part paint square updown repaint q painter thread phantom thread c starter phantom thread trampoline pri blk thread eip stack hal spin lock thread block hal mutex lock malloc calloc do ctty create pool create el t make ctty t new ctty phantom debug window loop kernel thread starter phantom thread c starter phantom thread trampoline | 1 |
258,444 | 19,560,751,826 | IssuesEvent | 2022-01-03 15:54:23 | discord/discord-api-docs | https://api.github.com/repos/discord/discord-api-docs | closed | Audit log change keys are missing `communication_disabled_until` | documentation | [This page](https://discord.com/developers/docs/resources/audit-log#audit-log-change-object-audit-log-change-key) needs the `communication_disabled_until` audit log change key documented (which seems to have been missed from #4075). | 1.0 | Audit log change keys are missing `communication_disabled_until` - [This page](https://discord.com/developers/docs/resources/audit-log#audit-log-change-object-audit-log-change-key) needs the `communication_disabled_until` audit log change key documented (which seems to have been missed from #4075). | non_priority | audit log change keys are missing communication disabled until needs the communication disabled until audit log change key documented which seems to have been missed from | 0 |
75,116 | 20,670,424,383 | IssuesEvent | 2022-03-10 01:08:07 | dotnet/efcore | https://api.github.com/repos/dotnet/efcore | opened | ToTable(b => {}) should map the entity type to the default table | type-bug area-model-building | Currently, if `ToView` or a similar call is performed the entity type will end up not mapped to any table. | 1.0 | ToTable(b => {}) should map the entity type to the default table - Currently, if `ToView` or a similar call is performed the entity type will end up not mapped to any table. | non_priority | totable b should map the entity type to the default table currently if toview or a similar call is performed the entity type will end up not mapped to any table | 0 |
26,506 | 11,307,712,032 | IssuesEvent | 2020-01-18 22:57:59 | NixOS/nixpkgs | https://api.github.com/repos/NixOS/nixpkgs | closed | Vulnerability roundup 79: flac-1.3.2: 1 advisory | 1.severity: security | [search](https://search.nix.gsc.io/?q=flac&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=flac+in%3Apath&type=Code)
* [ ] [CVE-2017-6888](https://nvd.nist.gov/vuln/detail/CVE-2017-6888) CVSSv3=5.5 (nixos-19.03)
Scanned versions: nixos-19.03: d1dff0bcd9f. May contain false positives.
| True | Vulnerability roundup 79: flac-1.3.2: 1 advisory - [search](https://search.nix.gsc.io/?q=flac&i=fosho&repos=NixOS-nixpkgs), [files](https://github.com/NixOS/nixpkgs/search?utf8=%E2%9C%93&q=flac+in%3Apath&type=Code)
* [ ] [CVE-2017-6888](https://nvd.nist.gov/vuln/detail/CVE-2017-6888) CVSSv3=5.5 (nixos-19.03)
Scanned versions: nixos-19.03: d1dff0bcd9f. May contain false positives.
| non_priority | vulnerability roundup flac advisory nixos scanned versions nixos may contain false positives | 0 |
13,920 | 23,977,407,744 | IssuesEvent | 2022-09-13 12:43:29 | renovatebot/renovate | https://api.github.com/repos/renovatebot/renovate | opened | Don't cache HTTP request errors/exceptions | type:feature status:requirements priority-5-triage | ### What would you like Renovate to be able to do?
HTTP requests have an opt-out caching behavior for all GET and HEAD requests.
It appears that errors are also being cached (e.g. when the request ends up failing/throwing), preventing retries to go through down the line.
### If you have any ideas on how this should be implemented, please tell us here.
Full context: https://github.com/renovatebot/renovate/pull/17761
The above PR evicts the cache entry for a given request when the request ends up failing/throwing, allowing a retry to go through down the line (e.g. on another renovate run).
The following is a delineation of the reasoning leading us there. The goal of the verbosity is to make incorrect assumptions as explicit as possible:
1. Error message from Renovate logs:
```
DEBUG: Failed to look up dependency @xxx/yyy (@xxx/yyy) (packageFile="package.json", dependency="@xxx/yyy")
DEBUG: Unknown npm lookup error
```
2. Single occurrence of that error string on the codebase:
https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/modules/datasource/npm/get.ts#L181
```ts
logger.debug({ err }, 'Unknown npm lookup error');
```
3. First line of the containing try-catch block making an HTTP call (as the error message contains a status code)
https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/modules/datasource/npm/get.ts#L67
```ts
const raw = await http.getJson<NpmResponse>(packageUrl);
```
4. `getJson` defers to `requestJson`, which defers to `request`: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L212-L232
```ts
getJson<T = unknown>(
...
return this.requestJson<T>(url, { ...options });
private async requestJson<T = unknown>(
...
const res = await this.request<T>(url, {
```
5. `request` reads all `get` and `head` requests from the cache unless opted out: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L147-L153
```ts
// Cache GET requests unless useCache=false
if (
(options.method === 'get' || options.method === 'head') &&
options.useCache !== false
) {
resPromise = memCache.get(cacheKey);
}
```
6. We didn't find `useCache` being set/overridden on the npm datasource: https://github.com/aisamu/renovate/tree/ef17f8a11103a42ee72b4a819a68204712280c76/lib/modules/datasource/npm/get.ts
7. `request` caches all `get` and `head` **promises**: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L147-L153
```ts
if (options.method === 'get' || options.method === 'head') {
memCache.set(cacheKey, resPromise); // always set if it's a get or a head
}
```
8. `request` uses the cached promise if it's non-falsy: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L156
```ts
if (!resPromise) {
```
9. A rejected promise is non-falsy on node:
```ts
> Promise.reject(new Error()) ? true : false
true
```
10. 500 requests are `thrown` by the http abstraction: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.spec.ts#L58-L59
```ts
httpMock.scope(baseUrl).get('/test').reply(500).get('/test').reply(200);
await expect(http.get('http://renovate.com/test')).rejects.toThrow('500');
```
The combination of the above would reasonably explain the issue observed on the PR (i.e. never-ending 504s caused by caching a failed response).
### Is this a feature you are interested in implementing yourself?
Yes | 1.0 | Don't cache HTTP request errors/exceptions - ### What would you like Renovate to be able to do?
HTTP requests have an opt-out caching behavior for all GET and HEAD requests.
It appears that errors are also being cached (e.g. when the request ends up failing/throwing), preventing retries to go through down the line.
### If you have any ideas on how this should be implemented, please tell us here.
Full context: https://github.com/renovatebot/renovate/pull/17761
The above PR evicts the cache entry for a given request when the request ends up failing/throwing, allowing a retry to go through down the line (e.g. on another renovate run).
The following is a delineation of the reasoning leading us there. The goal of the verbosity is to make incorrect assumptions as explicit as possible:
1. Error message from Renovate logs:
```
DEBUG: Failed to look up dependency @xxx/yyy (@xxx/yyy) (packageFile="package.json", dependency="@xxx/yyy")
DEBUG: Unknown npm lookup error
```
2. Single occurrence of that error string on the codebase:
https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/modules/datasource/npm/get.ts#L181
```ts
logger.debug({ err }, 'Unknown npm lookup error');
```
3. First line of the containing try-catch block making an HTTP call (as the error message contains a status code)
https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/modules/datasource/npm/get.ts#L67
```ts
const raw = await http.getJson<NpmResponse>(packageUrl);
```
4. `getJson` defers to `requestJson`, which defers to `request`: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L212-L232
```ts
getJson<T = unknown>(
...
return this.requestJson<T>(url, { ...options });
private async requestJson<T = unknown>(
...
const res = await this.request<T>(url, {
```
5. `request` reads all `get` and `head` requests from the cache unless opted out: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L147-L153
```ts
// Cache GET requests unless useCache=false
if (
(options.method === 'get' || options.method === 'head') &&
options.useCache !== false
) {
resPromise = memCache.get(cacheKey);
}
```
6. We didn't find `useCache` being set/overridden on the npm datasource: https://github.com/aisamu/renovate/tree/ef17f8a11103a42ee72b4a819a68204712280c76/lib/modules/datasource/npm/get.ts
7. `request` caches all `get` and `head` **promises**: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L147-L153
```ts
if (options.method === 'get' || options.method === 'head') {
memCache.set(cacheKey, resPromise); // always set if it's a get or a head
}
```
8. `request` uses the cached promise if it's non-falsy: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.ts#L156
```ts
if (!resPromise) {
```
9. A rejected promise is non-falsy on node:
```ts
> Promise.reject(new Error()) ? true : false
true
```
10. 500 requests are `thrown` by the http abstraction: https://github.com/aisamu/renovate/blob/ef17f8a11103a42ee72b4a819a68204712280c76/lib/util/http/index.spec.ts#L58-L59
```ts
httpMock.scope(baseUrl).get('/test').reply(500).get('/test').reply(200);
await expect(http.get('http://renovate.com/test')).rejects.toThrow('500');
```
The combination of the above would reasonably explain the issue observed on the PR (i.e. never-ending 504s caused by caching a failed response).
### Is this a feature you are interested in implementing yourself?
Yes | non_priority | don t cache http request errors exceptions what would you like renovate to be able to do http requests have an opt out caching behavior for all get and head requests it appears that errors are also being cached e g when the request ends up failing throwing preventing retries to go through down the line if you have any ideas on how this should be implemented please tell us here full context the above pr evicts the cache entry for a given request when the request ends up failing throwing allowing a retry to go through down the line e g on another renovate run the following is a delineation of the reasoning leading us there the goal of the verbosity is to make incorrect assumptions as explicit as possible error message from renovate logs debug failed to look up dependency xxx yyy xxx yyy packagefile package json dependency xxx yyy debug unknown npm lookup error single occurrence of that error string on the codebase ts logger debug err unknown npm lookup error first line of the containing try catch block making an http call as the error message contains a status code ts const raw await http getjson packageurl getjson defers to requestjson which defers to request ts getjson return this requestjson url options private async requestjson const res await this request url request reads all get and head requests from the cache unless opted out ts cache get requests unless usecache false if options method get options method head options usecache false respromise memcache get cachekey we didn t find usecache being set overridden on the npm datasource request caches all get and head promises ts if options method get options method head memcache set cachekey respromise always set if it s a get or a head request uses the cached promise if it s non falsy ts if respromise a rejected promise is non falsy on node ts promise reject new error true false true requests are thrown by the http abstraction ts httpmock scope baseurl get test reply get test reply await 
expect http get the combination of the above would reasonably explain the issue observed on the pr i e never ending caused by caching a failed response is this a feature you are interested in implementing yourself yes | 0 |
211,000 | 7,197,419,510 | IssuesEvent | 2018-02-05 09:03:53 | YaoGuang-NYP/Digital_Services | https://api.github.com/repos/YaoGuang-NYP/Digital_Services | closed | As Tertiary Educator........... | Medium Priority Normal | I would like to see my students able to find a job after they have graduated. | 1.0 | As Tertiary Educator........... - I would like to see my students able to find a job after they have graduated. | priority | as tertiary educator i would like to see my students able to find a job after they have graduated | 1 |
76,665 | 26,542,581,428 | IssuesEvent | 2023-01-19 20:35:56 | SeleniumHQ/selenium | https://api.github.com/repos/SeleniumHQ/selenium | closed | [🐛 Bug]: chrome protocols not available in headless mode | R-awaiting answer I-defect | ### What happened?
Chrome protocols return empty pages when using selenium in **headless mode**.
I use the `chrome://extensions` to query the DOM and get the ID for my chrome extension. I always get an empty page when I visit `chrome://extensions` in headless mode
The same can also be said for the `chrome-extension://` protocol as I hardcoded my extension ID and got an empty page when I visited my extension at `chrome-extension://clbcmlfmoadnejkohgjannpaodeoalcj/index.html` (Works fine when not in headless)
### How can we reproduce the issue?
```typescript
//Typescript snippet of a class that extends WebDriver
const chromePages = ["history", "extensions", "settings"]
for (const page of chromePages) {
await this.get(`chrome://${page}`)
console.log(`${page} URL: ${await this.getCurrentUrl()}`)
console.log(`${page}: ${await this.getPageSource()}`)
}
```
### Relevant log output
```shell
history URL: chrome://history/
history: <html><head></head><body></body></html>
extensions URL: chrome://extensions/
extensions: <html><head></head><body></body></html>
settings URL: chrome://settings/
settings: <html><head></head><body></body></html>
```
### Operating System
MacOS, Linux
### Selenium version
NPM: selenium-webdriver@4.7.1
### What are the browser(s) and version(s) where you see this issue?
Chrome - 109.0.5414.87 (MacOS), 109.0.5414.74 (Ubuntu)
### What are the browser driver(s) and version(s) where you see this issue?
ChromeDriver - 109.0.5414.74 (MacOS), 109.0.5414.74 (Ubuntu)
### Are you using Selenium Grid?
_No response_ | 1.0 | [🐛 Bug]: chrome protocols not available in headless mode - ### What happened?
Chrome protocols return empty pages when using selenium in **headless mode**.
I use the `chrome://extensions` to query the DOM and get the ID for my chrome extension. I always get an empty page when I visit `chrome://extensions` in headless mode
The same can also be said for the `chrome-extension://` protocol as I hardcoded my extension ID and got an empty page when I visited my extension at `chrome-extension://clbcmlfmoadnejkohgjannpaodeoalcj/index.html` (Works fine when not in headless)
### How can we reproduce the issue?
```typescript
//Typescript snippet of a class that extends WebDriver
const chromePages = ["history", "extensions", "settings"]
for (const page of chromePages) {
await this.get(`chrome://${page}`)
console.log(`${page} URL: ${await this.getCurrentUrl()}`)
console.log(`${page}: ${await this.getPageSource()}`)
}
```
### Relevant log output
```shell
history URL: chrome://history/
history: <html><head></head><body></body></html>
extensions URL: chrome://extensions/
extensions: <html><head></head><body></body></html>
settings URL: chrome://settings/
settings: <html><head></head><body></body></html>
```
### Operating System
MacOS, Linux
### Selenium version
NPM: selenium-webdriver@4.7.1
### What are the browser(s) and version(s) where you see this issue?
Chrome - 109.0.5414.87 (MacOS), 109.0.5414.74 (Ubuntu)
### What are the browser driver(s) and version(s) where you see this issue?
ChromeDriver - 109.0.5414.74 (MacOS), 109.0.5414.74 (Ubuntu)
### Are you using Selenium Grid?
_No response_ | non_priority | chrome protocols not available in headless mode what happened chrome protocols return empty pages when using selenium in headless mode i use the chrome extensions to query the dom and get the id for my chrome extension i always get an empty page when i visit chrome extensions in headless mode the same can also be said for the chrome extension protocol as i hardcoded my extension id and got an empty page when i visited my extension at chrome extension clbcmlfmoadnejkohgjannpaodeoalcj index html works fine when not in headless how can we reproduce the issue shell typescript snippet of a class that extends webdriver const chromepages for const page of chromepages await this get chrome page console log page url await this getcurrenturl console log page await this getpagesource relevant log output shell history url chrome history history extensions url chrome extensions extensions settings url chrome settings settings operating system macos linux selenium version npm selenium webdriver what are the browser s and version s where you see this issue chrome macos ubuntu what are the browser driver s and version s where you see this issue chromedriver macos ubuntu are you using selenium grid no response | 0 |
38,684 | 2,850,173,463 | IssuesEvent | 2015-05-31 10:22:54 | mattwood1/zf1 | https://api.github.com/repos/mattwood1/zf1 | closed | High ranking is interfering with top ranking. | bug Critical Priority | There are currently two top rankings which are not being progressed as they are getting included in the high rankings.
High rankings top should never exceed top bottom. | 1.0 | High ranking is interfering with top ranking. - There are currently two top rankings which are not being progressed as they are getting included in the high rankings.
High rankings top should never exceed top bottom. | priority | high ranking is interfering with top ranking there are currently two top rankings which are not being progressed as they are getting included in the high rankings high rankings top should never exceed top bottom | 1 |
21,311 | 2,637,742,165 | IssuesEvent | 2015-03-10 15:01:22 | alexeyxo/protobuf-swift | https://api.github.com/repos/alexeyxo/protobuf-swift | closed | Circular optional references cause an infinite loop | bug high priority in progress | The following .proto file:
```protobuf
message User {
optional Group group = 1;
}
message Group {
optional User owner = 1;
}
```
...crashes the runtime because protobuf-swift constructs empty objects for all fields, even for optional fields; i.e. the `owner` field of `Group` has `var owner:User = User()` and the `group` field of `User` has `var group:Group = Group()`, resulting in infinite recursion while building the object.
I imagine the only solution to this is switching to optionals.
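To make the failure mode concrete, here is a minimal sketch in Python (stand-in classes, not the actual generated Swift code) contrasting eager default construction with optional/lazy fields:

```python
# Hypothetical stand-ins showing why eagerly constructing default sub-messages
# recurses forever when two message types reference each other.

class EagerUser:
    def __init__(self):
        self.group = EagerGroup()   # builds a Group, which builds a User, ...

class EagerGroup:
    def __init__(self):
        self.owner = EagerUser()

def build_eager():
    try:
        EagerUser()
        return "ok"
    except RecursionError:
        return "infinite recursion"

# The fix suggested above: keep the field optional (None until explicitly set),
# so nothing is constructed eagerly and the cycle is never entered.
class LazyUser:
    def __init__(self):
        self.group = None

class LazyGroup:
    def __init__(self):
        self.owner = None

print(build_eager())        # infinite recursion
u = LazyUser()
u.group = LazyGroup()
print(u.group.owner)        # None
```

The same shape would presumably apply in the generated Swift: typing the field as an optional (`var group: Group?`) breaks the construction cycle.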
PS. Out of curiosity, what *is* the reason for not using optional types? | 1.0 | Circular optional references cause an infinite loop - The following .proto file:
```protobuf
message User {
optional Group group = 1;
}
message Group {
optional User owner = 1;
}
```
...crashes the runtime because protobuf-swift constructs empty objects for all fields, even for optional fields; i.e. the `owner` field of `Group` has `var owner:User = User()` and the `group` field of `User` has `var group:Group = Group()`, resulting in infinite recursion while building the object.
I imagine the only solution to this is switching to optionals.
PS. Out of curiosity, what *is* the reason for not using optional types? | priority | circular optional references cause an infinite loop the following proto file protobuf message user optional group group message group optional user owner crashes the runtime because protobuf swift constructs empty objects for all fields even for optional fields ie the owner field of group has var owner user user and the group field of user has var group group group resulting in infinite recursion while building the object i imagine the only solution to this is switching to optionals ps out of curiosity what is the reason for not using optional types | 1 |
324,493 | 27,809,083,138 | IssuesEvent | 2023-03-18 00:09:21 | unifyai/ivy | https://api.github.com/repos/unifyai/ivy | reopened | Fix elementwise.test_maximum | Sub Task Failing Test | | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_functional/test_core/test_elementwise.py::test_maximum[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-17T21:00:32.1252923Z E TypeError: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1258497Z E ivy.utils.exceptions.IvyBackendException: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1261989Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1265709Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1275467Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1280354Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: nested_map: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1337095Z E ivy.utils.exceptions.IvyBackendException: jax: execute_with_gradients: jax: nested_map: jax: nested_map: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1337690Z E Falsifying example: test_maximum(
2023-03-17T21:00:32.1338096Z E dtype_and_x_and_use_where=((['float16', 'int8'],
2023-03-17T21:00:32.1338479Z E [array(-1., dtype=float16), array(0, dtype=int8)]),
2023-03-17T21:00:32.1338739Z E True),
2023-03-17T21:00:32.1339023Z E ground_truth_backend='tensorflow',
2023-03-17T21:00:32.1339305Z E fn_name='maximum',
2023-03-17T21:00:32.1339551Z E test_flags=FunctionTestFlags(
2023-03-17T21:00:32.1339811Z E num_positional_args=2,
2023-03-17T21:00:32.1340041Z E with_out=False,
2023-03-17T21:00:32.1340275Z E instance_method=False,
2023-03-17T21:00:32.1340753Z E test_gradients=True,
2023-03-17T21:00:32.1340998Z E test_compile=False,
2023-03-17T21:00:32.1341362Z E as_variable=[False],
2023-03-17T21:00:32.1341612Z E native_arrays=[False],
2023-03-17T21:00:32.1341845Z E container=[False],
2023-03-17T21:00:32.1342173Z E ),
2023-03-17T21:00:32.1342630Z E backend_fw=<module 'ivy.functional.backends.jax' from '/ivy/ivy/functional/backends/jax/__init__.py'>,
2023-03-17T21:00:32.1343149Z E on_device='cpu',
2023-03-17T21:00:32.1343360Z E )
2023-03-17T21:00:32.1343531Z E
2023-03-17T21:00:32.1344280Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.0', b'AXicY5RiAAJGBghgRKIAAuMAHw==') as a decorator on your test case
</details>
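The `TypeError` pattern in the log (a wrapper array handed to a backend call that only accepts its own native types) can be illustrated with a generic, self-contained sketch; the class and function names below are stand-ins, not ivy's or JAX's actual internals:

```python
class Array:                       # stand-in for the ivy.Array wrapper
    def __init__(self, data):
        self.data = data           # the backend-native payload

def stop_gradient(x):              # stand-in for a backend call like jax.lax.stop_gradient
    # Backend APIs typically reject foreign wrapper objects outright.
    if isinstance(x, Array):
        raise TypeError(f"Value {x!r} is not a valid native type")
    return x

def to_native(x):
    # Unwrap before crossing into backend code.
    return x.data if isinstance(x, Array) else x

a = Array(-1.0)
try:
    stop_gradient(a)               # reproduces the failure mode from the logs
except TypeError as e:
    print("failed:", e)
print(stop_gradient(to_native(a)))  # -1.0 after unwrapping
```

A fix along these lines would unwrap wrapper arrays to native values before the backend's `stop_gradient` is invoked.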
| 1.0 | Fix elementwise.test_maximum - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4450961173/jobs/7817046026" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_functional/test_core/test_elementwise.py::test_maximum[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-17T21:00:32.1252923Z E TypeError: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1258497Z E ivy.utils.exceptions.IvyBackendException: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1261989Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1265709Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1275467Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1280354Z E ivy.utils.exceptions.IvyBackendException: jax: nested_map: jax: nested_map: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1337095Z E ivy.utils.exceptions.IvyBackendException: jax: execute_with_gradients: jax: nested_map: jax: nested_map: jax: nested_map: jax: nested_map: jax: stop_gradient: Value ivy.array(-1.) with type <class 'ivy.array.array.Array'> is not a valid JAX type
2023-03-17T21:00:32.1337690Z E Falsifying example: test_maximum(
2023-03-17T21:00:32.1338096Z E dtype_and_x_and_use_where=((['float16', 'int8'],
2023-03-17T21:00:32.1338479Z E [array(-1., dtype=float16), array(0, dtype=int8)]),
2023-03-17T21:00:32.1338739Z E True),
2023-03-17T21:00:32.1339023Z E ground_truth_backend='tensorflow',
2023-03-17T21:00:32.1339305Z E fn_name='maximum',
2023-03-17T21:00:32.1339551Z E test_flags=FunctionTestFlags(
2023-03-17T21:00:32.1339811Z E num_positional_args=2,
2023-03-17T21:00:32.1340041Z E with_out=False,
2023-03-17T21:00:32.1340275Z E instance_method=False,
2023-03-17T21:00:32.1340753Z E test_gradients=True,
2023-03-17T21:00:32.1340998Z E test_compile=False,
2023-03-17T21:00:32.1341362Z E as_variable=[False],
2023-03-17T21:00:32.1341612Z E native_arrays=[False],
2023-03-17T21:00:32.1341845Z E container=[False],
2023-03-17T21:00:32.1342173Z E ),
2023-03-17T21:00:32.1342630Z E backend_fw=<module 'ivy.functional.backends.jax' from '/ivy/ivy/functional/backends/jax/__init__.py'>,
2023-03-17T21:00:32.1343149Z E on_device='cpu',
2023-03-17T21:00:32.1343360Z E )
2023-03-17T21:00:32.1343531Z E
2023-03-17T21:00:32.1344280Z E You can reproduce this example by temporarily adding @reproduce_failure('6.70.0', b'AXicY5RiAAJGBghgRKIAAuMAHw==') as a decorator on your test case
</details>
| non_priority | fix elementwise test maximum tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test functional test core test elementwise py test maximum e typeerror value ivy array with type is not a valid jax type e ivy utils exceptions ivybackendexception jax stop gradient value ivy array with type is not a valid jax type e ivy utils exceptions ivybackendexception jax nested map jax stop gradient value ivy array with type is not a valid jax type e ivy utils exceptions ivybackendexception jax nested map jax nested map jax stop gradient value ivy array with type is not a valid jax type e ivy utils exceptions ivybackendexception jax nested map jax nested map jax nested map jax stop gradient value ivy array with type is not a valid jax type e ivy utils exceptions ivybackendexception jax nested map jax nested map jax nested map jax nested map jax stop gradient value ivy array with type is not a valid jax type e ivy utils exceptions ivybackendexception jax execute with gradients jax nested map jax nested map jax nested map jax nested map jax stop gradient value ivy array with type is not a valid jax type e falsifying example test maximum e dtype and x and use where e e true e ground truth backend tensorflow e fn name maximum e test flags functiontestflags e num positional args e with out false e instance method false e test gradients true e test compile false e as variable e native arrays e container e e backend fw e on device cpu e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case | 0 |
302,149 | 9,255,665,199 | IssuesEvent | 2019-03-16 12:29:14 | InfiniteFlightAirportEditing/Airports | https://api.github.com/repos/InfiniteFlightAirportEditing/Airports | closed | OAYW-Yawan Airport-BADAKHSHAN-AFGHANISTAN | Being Redone Low Priority | # Airport Name
< Enter text here >
# Country?
< Enter text here >
# Improvements that need to be made?
< Enter text here >
# Are you working on this airport?
< Enter text here >
# Airport Priority? (A380, 10000ft+ Runway)
< Enter text here >
| 1.0 | OAYW-Yawan Airport-BADAKHSHAN-AFGHANISTAN - # Airport Name
< Enter text here >
# Country?
< Enter text here >
# Improvements that need to be made?
< Enter text here >
# Are you working on this airport?
< Enter text here >
# Airport Priority? (A380, 10000ft+ Runway)
< Enter text here >
| priority | oayw yawan airport badakhshan afghanistan airport name country improvements that need to be made are you working on this airport airport priority runway | 1 |
576,202 | 17,081,635,805 | IssuesEvent | 2021-07-08 06:24:23 | kubeapps/kubeapps | https://api.github.com/repos/kubeapps/kubeapps | closed | [kubeapps-apis] Implement "direct-Helm" plugin based on the existing features in kubeapps | component/apis-server kind/feature priority/high size/M | In the same way that we already have `kapp-controller` and `fluxv2` plugins (more or less) implemented, it is required to also support the legacy/direct/current Helm approach that kubeapps is using.
Otherwise, existing OCI support would get compromised (AFAIK) unless another plugin does support it.
Edit: FWIW, it will leverage the existing AppRepositories/PostgreSQL logic.
Related to https://github.com/kubeapps/kubeapps/issues/2023
### Status
#### Transversal aspects
- [x] Add proto and generated files: https://github.com/kubeapps/kubeapps/pull/3020, https://github.com/kubeapps/kubeapps/pull/3039
- [x] Expose the required logic directly from the assetsvc: https://github.com/kubeapps/kubeapps/pull/3036 and https://github.com/kubeapps/kubeapps/pull/3040
- [x] Add filters options: https://github.com/kubeapps/kubeapps/pull/3038
- [x] Expose a normal clientSet instead of the dynamic interface: https://github.com/kubeapps/kubeapps/pull/3043
- [x] Modify proto messages to implement [gRPC recommended pagination](https://cloud.google.com/apis/design/design_patterns#list_pagination): https://github.com/kubeapps/kubeapps/pull/3074
#### Operations already "agreed"
- [x] GetAvailablePackagesSummaries: https://github.com/kubeapps/kubeapps/pull/3022
- [x] GetAvailablePackageDetail: https://github.com/kubeapps/kubeapps/pull/3034
- [x] Pagination for GetAvailablePackageSummaries #3091
- [ ] GetAvailablePackageVersions: _in progress_
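For the pagination item above, the gRPC-recommended pattern passes an opaque `next_page_token` back to the client; a hedged, self-contained sketch of that shape (the token encoding is illustrative, not what kubeapps actually emits):

```python
import base64
import json

ITEMS = [f"pkg-{i}" for i in range(10)]  # stand-in for available packages

def get_available_package_summaries(page_size, page_token=""):
    # An empty token means "start from the beginning"; otherwise decode it.
    offset = json.loads(base64.b64decode(page_token))["offset"] if page_token else 0
    page = ITEMS[offset:offset + page_size]
    next_offset = offset + len(page)
    # An empty next_page_token signals the last page.
    next_token = ""
    if next_offset < len(ITEMS):
        next_token = base64.b64encode(
            json.dumps({"offset": next_offset}).encode()).decode()
    return {"summaries": page, "next_page_token": next_token}

# Walk all pages.
collected, token = [], ""
while True:
    resp = get_available_package_summaries(page_size=4, page_token=token)
    collected.extend(resp["summaries"])
    token = resp["next_page_token"]
    if not token:
        break
print(len(collected))  # 10
```

Keeping the token opaque lets the server change its cursor representation later without breaking clients.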
#### Operations in active discussion
- [ ] GetInstalledPackageSummaries
- [ ] GetInstalledPackageDetail
- [ ] CreateInstalledPackage
- [ ] UpdateInstalledPackage
- [ ] DeleteInstalledPackage
#### UI
Tracked at https://github.com/kubeapps/kubeapps/issues/3085 | 1.0 | [kubeapps-apis] Implement "direct-Helm" plugin based on the existing features in kubeapps - In the same way that we already have `kapp-controller` and `fluxv2` plugins (more or less) implemented, it is required to also support the legacy/direct/current Helm approach that kubeapps is using.
Otherwise, existing OCI support would get compromised (AFAIK) unless another plugin does support it.
Edit: FWIW, it will leverage the existing AppRepositories/PostgreSQL logic.
Related to https://github.com/kubeapps/kubeapps/issues/2023
### Status
#### Transversal aspects
- [x] Add proto and generated files: https://github.com/kubeapps/kubeapps/pull/3020, https://github.com/kubeapps/kubeapps/pull/3039
- [x] Expose the required logic directly from the assetsvc: https://github.com/kubeapps/kubeapps/pull/3036 and https://github.com/kubeapps/kubeapps/pull/3040
- [x] Add filters options: https://github.com/kubeapps/kubeapps/pull/3038
- [x] Expose a normal clientSet instead of the dynamic interface: https://github.com/kubeapps/kubeapps/pull/3043
- [x] Modify proto messages to implement [gRPC recommended pagination](https://cloud.google.com/apis/design/design_patterns#list_pagination): https://github.com/kubeapps/kubeapps/pull/3074
#### Operations already "agreed"
- [x] GetAvailablePackagesSummaries: https://github.com/kubeapps/kubeapps/pull/3022
- [x] GetAvailablePackageDetail: https://github.com/kubeapps/kubeapps/pull/3034
- [x] Pagination for GetAvailablePackageSummaries #3091
- [ ] GetAvailablePackageVersions: _in progress_
#### Operations in active discussion
- [ ] GetInstalledPackageSummaries
- [ ] GetInstalledPackageDetail
- [ ] CreateInstalledPackage
- [ ] UpdateInstalledPackage
- [ ] DeleteInstalledPackage
#### UI
Tracked at https://github.com/kubeapps/kubeapps/issues/3085 | priority | implement direct helm plugin based on the existing features in kubeapps in the same way that we already have kapp controller and plugins more or less implemented it is required to also support the legacy direct current helm approach that kubeapps is using otherwise existing oci support would get compromised afaik unless another plugin does support it edit fwiw it will leverage the existing apprepositories postgresql logic related to status transversal aspects add proto and generated files expose the required logic directly from the assetsvc and add filters options expose a normal clientset instead of the dynamic interface modify proto messages to implement operations already agreed getavailablepackagessummaries getavailablepackagedetail pagination for getavailablepackagesummaries getavailablepackageversions in progress operations in active discussion getinstalledpackagesummaries getinstalledpackagedetail createinstalledpackage updateinstalledpackage deleteinstalledpackage ui tracked at | 1 |
469,320 | 13,505,632,699 | IssuesEvent | 2020-09-14 00:00:39 | dcl-covid-19/mega-map-dev | https://api.github.com/repos/dcl-covid-19/mega-map-dev | closed | Merge "medical" and "mental health" categories into "health"; create filter by sub-types | help wanted medium-priority question | **Need work on an existing feature?**
(a) We would like to merge the two existing categories of mental health and medical resources, because there are currently duplicates between the two categories. This means that `resource` = "health" for any medical and mental health resources.
(b) At the same time, the data team will be adding new 0/1 columns to filter medical services (similar to the current legal filters), including med_mental_health. The full list of filters is here and in the mega-map metadata:
1. med_primary_care —> Primary Care
2. med_pediatrics —> Pediatric Care
3. med_senior —> Senior Care
4. med_women —> Women’s Care
5. med_urgent_care —> Urgent Care
6. med_dental —> Dental Care
7. med_pharmacy —> Pharmacy
8. med_mental_health —> Mental Health Care
9. med_hotline —> Hotline
10. med_domestic_violence —> Domestic Violence Support
11. med_addiction —> Addiction & Recovery
12. med_benefit --> Health Benefit Enrollment
(c) However, we still want the "Mental Health" category to show up as a main category in the first dropdown menu due to its importance as a first filter. That category would take a user to a list view where `resource` = "health" & `med_mental_health` = 1.
The data team is working on merging the resources and should be ready in the next 2-3 days.
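As a concrete illustration of the rule in (c), a minimal filter over hypothetical rows (the resource names and field values below are invented for the example) might look like:

```python
resources = [
    {"name": "Clinic A", "resource": "health", "med_mental_health": 1},
    {"name": "Pharmacy B", "resource": "health", "med_mental_health": 0},
    {"name": "Legal Aid C", "resource": "legal", "med_mental_health": 0},
]

def mental_health_view(rows):
    # The "Mental Health" top-level category described above:
    # resource == "health" AND med_mental_health == 1.
    return [r for r in rows
            if r["resource"] == "health" and r["med_mental_health"] == 1]

print([r["name"] for r in mental_health_view(resources)])  # ['Clinic A']
```

The other eleven `med_*` columns would compose the same way, as additional 0/1 predicates over the merged "health" category.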
| 1.0 | Merge "medical" and "mental health" categories into "health"; create filter by sub-types - **Need work on an existing feature?**
(a) We would like to merge the two existing categories of mental health and medical resources, because there are currently duplicates between the two categories. This means that `resource` = "health" for any medical and mental health resources.
(b) At the same time, the data team will be adding new 0/1 columns to filter medical services (similar to the current legal filters), including med_mental_health. The full list of filters is here and in the mega-map metadata:
1. med_primary_care —> Primary Care
2. med_pediatrics —> Pediatric Care
3. med_senior —> Senior Care
4. med_women —> Women’s Care
5. med_urgent_care —> Urgent Care
6. med_dental —> Dental Care
7. med_pharmacy —> Pharmacy
8. med_mental_health —> Mental Health Care
9. med_hotline —> Hotline
10. med_domestic_violence —> Domestic Violence Support
11. med_addiction —> Addiction & Recovery
12. med_benefit --> Health Benefit Enrollment
(c) However we still want the "Mental Health" category to show up as a main category in the first dropdown menu due to its importance as a first filter. That category would take a user to a list view where `resource` = "health" & `med_mental_health` = 1.
The data team is working on merging the resources and should be ready in the next 2-3 days.
| priority | merge medical and mental health categories into health create filter by sub types need work on an existing feature a we would like to merge the two existing categories of mental health and medical resources because there are currently duplicates between the two categories this means that resource health for any medical and mental health resources b at the same time the data team will be adding new columns to filter medical services similar to legal filters rn including med mental health the full list of filters is here and in the mega map metadata med primary care — primary care med pediatrics — pediatric care med senior — senior care med women — women’s care med urgent care — urgent care med dental — dental care med pharmacy — pharmacy med mental health — mental health care med hotline — hotline med domestic violence — domestic violence support med addiction — addiction recovery med benefit health benefit enrollment c however we still want the mental health category to show up as a main category in the first dropdown menu due to its importance as a first filter that category would take a user to a list view where resource health med mental health the data team is working on merging the resources and should be ready in the next days | 1 |
324,241 | 9,886,539,777 | IssuesEvent | 2019-06-25 07:01:26 | grafana/grafana | https://api.github.com/repos/grafana/grafana | opened | Alerting: Improve alert rule testing for all core datasources | area/alerting area/alerting/evaluation priority/important-longterm type/feature-request | **What would you like to be added**:
#16286 added support for a debug flag when testing alert rules, to indicate that a datasource can/should return debug-related metadata to be shown in the test alert rule UI. That PR specifically improved testing alert rules with the Elasticsearch datasource, since that one is among the hardest to debug in alerting scenarios.
All the SQL datasources already return query metadata by default.
This issue aims to improve alert rule testing for the rest of the core datasources so that they return query metadata (when the debug flag is set):
- Graphite
- Prometheus
- InfluxDB
- CloudWatch
- Azure Monitor
- Stackdriver
- OpenTSDB
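A sketch of what "return query metadata when the debug flag is set" could look like from a datasource's side; the field names here are illustrative, not Grafana's actual response schema:

```python
# Hedged sketch of a datasource attaching per-query debug metadata when the
# test-rule debug flag is set.
def execute_query(target, debug=False):
    result = {
        "series": [{"target": target, "datapoints": [[1.0, 1561449600000]]}],
    }
    if debug:
        # Extra context shown in the test-alert-rule UI to explain how the
        # evaluated value was produced (e.g. the fully rendered query).
        result["meta"] = {
            "rawQuery": f"avg(rate({target}[5m]))",  # hypothetical rendered query
            "from": "now-5m",
            "to": "now",
        }
    return result

print("meta" in execute_query("cpu_usage"))                          # False
print("rawQuery" in execute_query("cpu_usage", debug=True)["meta"])  # True
```

Normal query paths stay unchanged; the metadata is only materialized in the debug/test path.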
**Why is this needed**:
To make it easier for users to understand why their alert rule doesn't work as expected when testing the alert rule. | 1.0 | Alerting: Improve alert rule testing for all core datasources - **What would you like to be added**:
#16286 added support for a debug flag when testing alert rules, to indicate that a datasource can/should return debug-related metadata to be shown in the test alert rule UI. That PR specifically improved testing alert rules with the Elasticsearch datasource, since that one is among the hardest to debug in alerting scenarios.
All the SQL datasources already return query metadata by default.
This issue aims to improve alert rule testing for the rest of the core datasources so that they return query metadata (when the debug flag is set):
- Graphite
- Prometheus
- InfluxDB
- CloudWatch
- Azure Monitor
- Stackdriver
- OpenTSDB
**Why is this needed**:
To make it easier for users to understand why their alert rule doesn't work as expected when testing the alert rule. | priority | alerting improve alert rule testing for all core datasources what would you like to be added added support for a debug flag when testing alert rules to indicate that datasource can should return debug related meta data to be shown in test alert rule ui that pr specifically improved testing alert rule using the elasticsearch datasource since that one is one of the most hard ones to debug in alerting scenarios all the sql datasources already per default returns query meta data this issue aims to improve alert rule testing for rest of core datasources so that they return query meta data when debug flag is set graphite prometheus influxdb cloudwatch azure monitor stackdriver opentsdb why is this needed to make it easier for users to understand why their alert rule doesn t work as expected when testing the alert rule | 1 |
30,643 | 11,842,011,917 | IssuesEvent | 2020-03-23 22:00:57 | Mohib-hub/karate | https://api.github.com/repos/Mohib-hub/karate | opened | CVE-2017-7675 (High) detected in tomcat-embed-core-8.5.14.jar | security vulnerability | ## CVE-2017-7675 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tomcat-embed-core-8.5.14.jar</b></p></summary>
<p>Core Tomcat implementation</p>
<p>Path to dependency file: /tmp/ws-scm/karate/karate-demo/build.gradle</p>
<p>Path to vulnerable library: /root/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.14/7ce577af04cadd7ab4b36f71503fc688d5d52ccf/tomcat-embed-core-8.5.14.jar,/root/.gradle/caches/modules-2/files-2.1/org.apache.tomcat.embed/tomcat-embed-core/8.5.14/7ce577af04cadd7ab4b36f71503fc688d5d52ccf/tomcat-embed-core-8.5.14.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-1.5.3.RELEASE.jar (Root Library)
- spring-boot-starter-tomcat-1.5.3.RELEASE.jar
- tomcat-embed-websocket-8.5.14.jar
- :x: **tomcat-embed-core-8.5.14.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Mohib-hub/karate/commit/c8766c8277306046ef9c6f01148b98b0d2bafe02">c8766c8277306046ef9c6f01148b98b0d2bafe02</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The HTTP/2 implementation in Apache Tomcat 9.0.0.M1 to 9.0.0.M21 and 8.5.0 to 8.5.15 bypassed a number of security checks that prevented directory traversal attacks. It was therefore possible to bypass security constraints using a specially crafted URL.
<p>Publish Date: 2017-08-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-7675>CVE-2017-7675</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://tomcat.apache.org/security-8.html">https://tomcat.apache.org/security-8.html</a></p>
<p>Release Date: 2017-08-11</p>
<p>Fix Resolution: 9.0.0.M22,8.5.16</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.tomcat.embed","packageName":"tomcat-embed-core","packageVersion":"8.5.14","isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:1.5.3.RELEASE;org.springframework.boot:spring-boot-starter-tomcat:1.5.3.RELEASE;org.apache.tomcat.embed:tomcat-embed-websocket:8.5.14;org.apache.tomcat.embed:tomcat-embed-core:8.5.14","isMinimumFixVersionAvailable":true,"minimumFixVersion":"9.0.0.M22,8.5.16"}],"vulnerabilityIdentifier":"CVE-2017-7675","vulnerabilityDetails":"The HTTP/2 implementation in Apache Tomcat 9.0.0.M1 to 9.0.0.M21 and 8.5.0 to 8.5.15 bypassed a number of security checks that prevented directory traversal attacks. It was therefore possible to bypass security constraints using a specially crafted URL.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-7675","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | non_priority | 0
1,003 | 2,594,431,401 | IssuesEvent | 2015-02-20 03:18:34 | BALL-Project/ball | https://api.github.com/repos/BALL-Project/ball | closed | Automatic forwarding of new bugs to developers mailing list down | C: Webpages P: major R: wontfix T: defect | **Reported by akdehof on 10 Apr 42142913 00:44 UTC**
None | 1.0 | non_priority | 0
100,517 | 21,393,618,616 | IssuesEvent | 2022-04-21 09:29:31 | pulumi/pulumi | https://api.github.com/repos/pulumi/pulumi | closed | Plain type refs not compilable | kind/bug language/go area/codegen | ### What happened?
Plain type refs generate code which is not compilable. These are required for components which require plain inputs in order to plan the sub-components to be created without having to wrap in an apply to make accurate previews.
### Steps to reproduce
Given a plain ref property in the schema such as:
```json
"inputProperties": {
"container": {
"$ref": "#/types/awsx:ecs:TaskDefinitionContainerDefinition",
"plain": true,
},
```
### Expected Behavior
It should generate:
```go
Container TaskDefinitionContainerDefinition
```
### Actual Behavior
The following go code is generated:
```go
Container TaskDefinitionContainerDefinitionArgs
```
but `TaskDefinitionContainerDefinitionArgs` is not defined.
Only the `TaskDefinitionContainerDefinition` struct is available.
### Versions used
v3.25.1
### Additional context
Issue in awsx:
Schema: https://github.com/pulumi/pulumi-awsx/blob/564276303d5acad25bb09fbe225561d90c6c458c/awsx/schema.json#L1252-L1256
Incorrectly generated property: https://github.com/pulumi/pulumi-awsx/blob/564276303d5acad25bb09fbe225561d90c6c458c/sdk/go/awsx/ecs/fargateTaskDefinition.go#L107
Generated type struct: https://github.com/pulumi/pulumi-awsx/blob/564276303d5acad25bb09fbe225561d90c6c458c/sdk/go/awsx/ecs/pulumiTypes.go#L67
### Contributing
Vote on this issue by adding a 👍 reaction.
To contribute a fix for this issue, leave a comment (and link to your pull request, if you've opened one already).
| 1.0 | non_priority | 0
3,088 | 8,850,786,667 | IssuesEvent | 2019-01-08 14:12:52 | jupyterhub/zero-to-jupyterhub-k8s | https://api.github.com/repos/jupyterhub/zero-to-jupyterhub-k8s | closed | [WIP] Autoscaling - a living development documentation | architecture | # Goal
Allow easy setup of node autoscaling suitable for jupyterhub k8s clusters.
## An intro to autoscaling
If we have many users, we will have many singleuser pods, each guaranteed some resources. The more users, the more pods, the more resources required. We might want to add nodes when we are unable to schedule more pods, or perhaps when were are close to being unable to schedule more pods. There are various tools available related to scaling, here is a summary.
### Instance group autoscaler ([google](https://cloud.google.com/compute/docs/autoscaler/)) - nah...
Does not require or understand kubernetes, so it does not account for kubernetes concepts like pods and their requested resources. It will instead look at CPU load of the nodes and add more nodes when the CPU usage is too high. Not relevant to us, as we want to guarantee users to have a certain set of resources on the node they are scheduled.
### [Horizontal pod autoscaler](https://kubernetes.io/docs/tasks/run-application/horizontal-pod-autoscale/) - nah...
Will increase the number of pods doing a certain task if needed based on for example CPU utilization. This is relevant if you need more copies of the same pod. Each user should only have one pod, not multiple pods, so scaling up in this way makes little sense for the singleuser pods.
### Kubernetes cluster autoscaler (aka CA) ([google](https://cloud.google.com/kubernetes-engine/docs/concepts/cluster-autoscaler)) - YES
This is the relevant autoscaler! What do we want to scale? The available nodes. When do we want to scale? When we are about to run out of resources to schedule new users, or even when we predict that will happen. The kubernetes cluster autoscaler will add nodes to a cluster if it has pods that cant fit in the previous nodes. Awesome! | 1.0 | non_priority | 0
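The trigger condition this record describes — add a node when pending pods can no longer fit on the existing ones — can be sketched as a toy first-fit check. This is illustrative only; the real cluster autoscaler simulates full kube-scheduler predicates, not just one resource dimension:

```python
# Toy model of the cluster-autoscaler trigger: given per-node free CPU
# (in millicores) and the CPU requests of pending pods, count how many
# extra nodes a first-fit placement would need.
def extra_nodes_needed(free_cpu_per_node, pending_requests, new_node_cpu):
    free = list(free_cpu_per_node)
    extra = 0
    for req in sorted(pending_requests, reverse=True):
        for i, cap in enumerate(free):
            if cap >= req:
                free[i] -= req
                break
        else:
            # No existing node can host the pod: scale up.
            free.append(new_node_cpu - req)
            extra += 1
    return extra

# Two nodes with 500m free each; three users each requesting 400m.
print(extra_nodes_needed([500, 500], [400, 400, 400], 2000))  # -> 1
```

This is why per-pod resource guarantees matter for the singleuser pods: the scale-up decision is driven by requested resources, not by observed CPU load as in the instance-group autoscaler mentioned earlier.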
80,721 | 23,288,475,726 | IssuesEvent | 2022-08-05 19:20:21 | finos/legend-studio | https://api.github.com/repos/finos/legend-studio | closed | Bug: Check type compability when dragging and dropping parameters into filter/post filter | Type: Bug Studio Core Team Component: Query Builder | ### Similar issues
- [X] I have searched and found no existing similar issues
### How are you using Studio?
Legend Query
### Current and expected behavior
when dragging and dropping a parameter into the filter/post filter condition value, we should do type validation.
### Steps to reproduce
_No response_
### Model data
_No response_
### Environment
```text
chrome
```
### Possible solution and workaround
_No response_
### Contribution
- [X] I would like to work on the fix for this issue | 1.0 | non_priority | 0
724,355 | 24,927,355,356 | IssuesEvent | 2022-10-31 08:41:48 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | github.com - see bug description | browser-firefox priority-critical engine-gecko | <!-- @browser: Firefox 106.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:106.0) Gecko/20100101 Firefox/106.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/113206 -->
**URL**: https://github.com
**Browser / Version**: Firefox 106.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Something else
**Description**: GDI fonts on Windows make Github README's look awful (all the text anti-aliasing produces bad colors on a flat screen display)
**Steps to Reproduce**:
On Windows 11, the setting:
gfx.font_rendering.cleartype_params.force_gdi_classic_for_families
contains defaults that mean that the document fonts on github.com, and many other websites are a mess of awfully colored anti-aliasing. It seems to use a mode that is designed for CRT displays, but we're all using flat screens these days.
Please consider flipping this default - the colors are awful!
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/10/93df207a-4508-45f5-bc44-5b5242656ab9.jpg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | priority | 1
722,212 | 24,854,745,678 | IssuesEvent | 2022-10-27 00:26:50 | enjoythecode/scrum-wizards-cs321 | https://api.github.com/repos/enjoythecode/scrum-wizards-cs321 | closed | SUPER ADMIN: view a list of all users | high priority @Super Admin | As a super admin, I want to be able to view a list of all users so that I can verify the information is correct and change as required. | 1.0 | priority | 1
126,432 | 4,995,585,589 | IssuesEvent | 2016-12-09 10:40:03 | hpi-swt2/workshop-portal | https://api.github.com/repos/hpi-swt2/workshop-portal | closed | US_1.6: Event drafts 🍪2 | Medium Priority team-helene | **As**
organizer
**I want to**
define draft events that are not published immediately
**in order to**
modify them together with my peers or colleagues.
**5 🍪**
- [ ] I am able to click a save ("Speichern") button when I am done with my camp creation so that my camp is saved but not published
- [ ] When I am ready to publish my Draft i can click on a Publish ("Veröffentlichen") button and my Event becomes public. (on edit and creation page) | 1.0 | priority | 1
59,614 | 8,374,108,998 | IssuesEvent | 2018-10-05 12:44:09 | CleverRaven/Cataclysm-DDA | https://api.github.com/repos/CleverRaven/Cataclysm-DDA | closed | 0.D PR title summarization 02 | <Documentation> Good First Issue | This is one of a number of issues to crowd-source categorization of PR titles in order to construct a changelog for the 0.D release.
Please prepend a category title to each line listed at the end of this issue and commit the lines as a new file in the repository with the provided name.
Feel free to summarize or expand on the titles to make them better fit in the changelog.
See https://github.com/CleverRaven/Cataclysm-DDA/blob/master/data/changelog.txt for previous changelog entry examples.
# TITLES
The categories to choose from are: Features, Content, Interface, Mods, Balance, Bugfixes, Performance, Infrastructure, Build, I18N
See the Changelog Guidelines at https://github.com/CleverRaven/Cataclysm-DDA/blob/master/doc/CHANGELOG_GUIDELINES.md for explanations of the categories.
# FILENAME
summary02
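For contributors who prefer to script the mechanical part of this task — prepending a chosen category to each title and writing the result to the file name above — a small sketch follows. The category choices themselves are the judgment calls this issue asks for, so the example mapping is purely illustrative:

```python
# Sketch: prepend a category to each PR title and write the lines to
# the requested summary file. Categories come from the changelog
# guidelines; the sample title/category pairing is illustrative only.
CATEGORIES = ("Features", "Content", "Interface", "Mods", "Balance",
              "Bugfixes", "Performance", "Infrastructure", "Build", "I18N")

def categorize(title, category):
    if category not in CATEGORIES:
        raise ValueError(f"unknown category: {category}")
    return f"{category}: {title}"

def write_summary(pairs, path="summary02"):
    # pairs is an iterable of (category, title) tuples.
    with open(path, "w") as f:
        for category, title in pairs:
            f.write(categorize(title, category) + "\n")

print(categorize("Fix reloading of spare magazine", "Bugfixes"))
# -> Bugfixes: Fix reloading of spare magazine
```

The `ValueError` guard keeps typos out of the committed file, since only the ten category names listed under TITLES are accepted.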
# Lines to summarize
Chocolate covered pretzels Fix
Telescopic Eyes CBM Updated
Arcana Mod: Adds shady zombies to drop list
WBLOCK_2 usage changes
Adds windows no curtains to all my sets, plus lots of stuff to the new iso set
French toast
[CR] Set player position to the updated submap shift before checking for seen overmap tiles.
Eternal season should prevent showing duration in seasons.
Fix reloading of spare magazine
Forbid eating non-solid food except when contained
Add pumpkin and sunflower as farmable plants
reworking item groups
MShock Modded Tileset Series 25
Implement a hack for welding rig long repair
Weed brownie recipe change
Correct nutrition_for thresholds
Make gobag uncraftable
Leather/bone/chitin recipe staggering, armguard split
ChestHole Hickory trees and shady zombies
Get rid of creature::pos3()
add tortillas
Allow dodging during long actions.
Fix failing legacy save loading unit test.
Crossbow hunter
[CR] [WIP] Change handling of recoil penalty
Remove completely unrealistic energy weapon recipes
Zombie child evolution expansion.
combine limbs on info and layering screens
New build_lab, other underground overmap code cleanup
use wchar_t for all potentially-wide-character handling
lard math fix
Update recipes for new comestible tortilla
Alarm System CBM now ignores hallucinations.
Adding eggnog
Tin can sealer.
Craft in the dark
Define duplicate sprites for multiple tile ids at the same time, also document tile_config.json
Bionic "Telescopic Eyes" fix.
Gives Gungir reach attack, removes rapid technique
MShock Modded Tileset: zombie children evolutions
Standardise ammo disassembly
New road mapgen to replace mapgen_road_*() [CR]
Fix bullet pulling format in mod.
Remove 'rarity' parameter entries
Allow mods to override specific properties of monster types.
Jsonized monster attacks
Storage cells and car recharger changes
Eggnog rebalance
Telescopic Eye CBM fix
remove eyes coverage from the rioter mask
Factor out drawing primitive algorithms to a separate module.
Remove whip stun effect.
Restore body part highlighting in armor layering menu.
Set -mmacosx-version-min to 10.7 using clang
Better naming for adjacent items
Implement visitor pattern for items
Fixes radiation protection/treatment pills.
random sprites for player and NPCs
More Locations mod.
Add new source files to CodeBlocks project file
Allow using Enhanced Hearing CBM to crack safes.
Arcana and Tank mods: Use of monster modify
Chesthole tileset: Adds rm13 overlay, fixes survivor mask overlay
Disassembling checks for null recipe - fixes #14780, fixes #14735
Prevent screen artifacts caused by transparent or undrawn terrain tiles
Wishmenu->wish item: remove previous item ID before drawing current one.
Fixes "make slings, dismantle for string" exploit
Removes redundant controls from inflatable boat
Initial work on 3D vision
Escapable menu for yelling at NPCs
New drops for zombie soldiers [CR]
Holsters should consider hand encumbrance and other penalties
Item handling is slower with increasing hand encumbrance
Reduce number of charges for oxyacetylene torch
Remove hardcoded moves cost when unloading from containers
Rework player::weapname()
Forbid mp3 player from playing without batteries
Fixes to sidewalks and yellow dots on roads
Item groups for guns and ammo
Arcana mod: Effect disambiguation/changes
Plural name fix
Remove hardtack from sandwich recipes
Weather Reader reworked
Move book disassembly code into null recipe check so books can be torn into pages again.
Filter out TAB and Shift+TAB from filter's input string
Add minimum move cost when handling items
Implements barrel_length variable in ranged.json.
Update for RL_Classes mod.
[CR]Giving NPCs mutagens, meds, food etc.
Uncountable nouns plural fix
[Recipe] Use 4 hotdogs instead of one when making cooked hot dogs.
[CR]Mass uncraft
Add a default value for "to_hit"
Fix incorrect colors
Fix for Adreneline CBM/Injector, Mycus wrath
Allow mods to change martial art styles / techniques / buffs
[CR]Add "calories" field to it_comest
[CR]Melee combat rebalance
[CR] Unscrewable vehicle parts
[CR] Adds caching to the pixel minimap, enemy indicators flash red, apply low light filters. | 1.0 | non_priority | 0 |
275,818 | 30,309,731,053 | IssuesEvent | 2023-07-10 12:01:38 | devatherock/utilities | https://api.github.com/repos/devatherock/utilities | closed | spring-kafka-test-2.6.13.jar: 4 vulnerabilities (highest severity is: 9.8) - autoclosed | Mend: dependency security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>spring-kafka-test-2.6.13.jar</b></p></summary>
<p></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-handler/4.1.50.Final/dc0110fc4d22fb22f1038cd73a6f8a034928a2d7/netty-handler-4.1.50.Final.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/devatherock/utilities/commit/c644aeaddd1c6c48771f873d69b707a3d75a2681">c644aeaddd1c6c48771f873d69b707a3d75a2681</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (spring-kafka-test version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2022-36944](https://www.mend.io/vulnerability-database/CVE-2022-36944) | <img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> Critical | 9.8 | scala-library-2.13.3.jar | Transitive | N/A* | ❌ |
| [CVE-2021-37136](https://www.mend.io/vulnerability-database/CVE-2021-37136) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | netty-codec-4.1.50.Final.jar | Transitive | 2.7.0 | ❌ |
| [CVE-2021-37137](https://www.mend.io/vulnerability-database/CVE-2021-37137) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.5 | netty-codec-4.1.50.Final.jar | Transitive | 2.7.0 | ❌ |
| [WS-2020-0408](https://github.com/netty/netty/issues/10362) | <img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> High | 7.4 | netty-handler-4.1.50.Final.jar | Transitive | 2.7.0 | ❌ |
<p>*For some transitive vulnerabilities, there is no version of direct dependency with a fix. Check the "Details" section below to see if there is a version of transitive dependency where vulnerability is fixed.</p>
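Until `org.springframework.kafka:spring-kafka-test` itself can be bumped to 2.7.0+, the patched transitive versions listed above can be forced directly in the consuming build. A minimal sketch (Gradle Groovy DSL; the coordinates and versions are the fix versions reported in this issue, the surrounding build layout is assumed):

```groovy
// build.gradle — pin the vulnerable transitives to their patched releases
// until spring-kafka-test can be upgraded to a version that pulls them in.
configurations.all {
    resolutionStrategy {
        force 'io.netty:netty-codec:4.1.68.Final'
        force 'io.netty:netty-handler:4.1.69.Final'
        force 'org.scala-lang:scala-library:2.13.9'
    }
}
```

Running `./gradlew dependencies --configuration testRuntimeClasspath` afterwards shows whether the forced versions are the ones actually resolved.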
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/critical_vul.png?' width=19 height=20> CVE-2022-36944</summary>
### Vulnerable Library - <b>scala-library-2.13.3.jar</b></p>
<p>Standard library for the Scala Programming Language</p>
<p>Library home page: <a href="https://www.scala-lang.org/">https://www.scala-lang.org/</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.scala-lang/scala-library/2.13.3/1431ec9962701faf77bbd5e1449ded674be6fe5b/scala-library-2.13.3.jar</p>
<p>
Dependency Hierarchy:
- spring-kafka-test-2.6.13.jar (Root Library)
- kafka_2.13-2.6.3.jar
- :x: **scala-library-2.13.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/devatherock/utilities/commit/c644aeaddd1c6c48771f873d69b707a3d75a2681">c644aeaddd1c6c48771f873d69b707a3d75a2681</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Scala 2.13.x before 2.13.9 has a Java deserialization chain in its JAR file. On its own, it cannot be exploited. There is only a risk in conjunction with Java object deserialization within an application. In such situations, it allows attackers to erase contents of arbitrary files, make network connections, or possibly run arbitrary code (specifically, Function0 functions) via a gadget chain.
<p>Publish Date: 2022-09-23
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-36944>CVE-2022-36944</a></p>
</p>
<p></p>
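Because exploitation requires a Java object-deserialization entry point in the application, a JEP 290 deserialization filter (Java 9+) can act as defense in depth until the Scala upgrade lands. A minimal illustrative sketch — the class name and the allow-list pattern are examples, not part of this report:

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputFilter;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;

public class FilterDemo {
    public static void main(String[] args) throws Exception {
        // Serialize a harmless object.
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            oos.writeObject(new java.util.ArrayList<String>());
        }
        // Deserialize with an allow-list filter: only java.util/java.lang
        // classes are accepted; anything else (e.g. a gadget-chain entry
        // point such as a scala.Function0 implementation) is rejected
        // with an InvalidClassException.
        ObjectInputFilter filter =
            ObjectInputFilter.Config.createFilter("java.util.*;java.lang.*;!*");
        try (ObjectInputStream ois =
                 new ObjectInputStream(new ByteArrayInputStream(bos.toByteArray()))) {
            ois.setObjectInputFilter(filter);
            Object o = ois.readObject();
            System.out.println(o.getClass().getName());
        }
    }
}
```

The filter must be set before the first `readObject()` call; a rejected class aborts deserialization instead of running its gadget chain.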
### CVSS 3 Score Details (<b>9.8</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-09-23</p>
<p>Fix Resolution: org.scala-lang:scala-library:2.13.9</p>
</p>
<p></p>
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> CVE-2021-37136</summary>
### Vulnerable Library - <b>netty-codec-4.1.50.Final.jar</b></p>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec/4.1.50.Final/cbcb646c9380c6cdc3f56603ae6418a11418ce0f/netty-codec-4.1.50.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-kafka-test-2.6.13.jar (Root Library)
- kafka_2.13-2.6.3.jar
- zookeeper-3.5.9.jar
- netty-handler-4.1.50.Final.jar
- :x: **netty-codec-4.1.50.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/devatherock/utilities/commit/c644aeaddd1c6c48771f873d69b707a3d75a2681">c644aeaddd1c6c48771f873d69b707a3d75a2681</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The Bzip2 decompression decoder does not allow setting size restrictions on the decompressed output data (which affects the allocation size used during decompression). All users of Bzip2Decoder are affected. Malicious input can therefore trigger an OutOfMemoryError and, as a result, a denial-of-service attack.
<p>Publish Date: 2021-10-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37136>CVE-2021-37136</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/netty/netty/security/advisories/GHSA-grg4-wf29-r9vv">https://github.com/netty/netty/security/advisories/GHSA-grg4-wf29-r9vv</a></p>
<p>Release Date: 2021-10-19</p>
<p>Fix Resolution (io.netty:netty-codec): 4.1.68.Final</p>
<p>Direct dependency fix Resolution (org.springframework.kafka:spring-kafka-test): 2.7.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> CVE-2021-37137</summary>
### Vulnerable Library - <b>netty-codec-4.1.50.Final.jar</b></p>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-codec/4.1.50.Final/cbcb646c9380c6cdc3f56603ae6418a11418ce0f/netty-codec-4.1.50.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-kafka-test-2.6.13.jar (Root Library)
- kafka_2.13-2.6.3.jar
- zookeeper-3.5.9.jar
- netty-handler-4.1.50.Final.jar
- :x: **netty-codec-4.1.50.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/devatherock/utilities/commit/c644aeaddd1c6c48771f873d69b707a3d75a2681">c644aeaddd1c6c48771f873d69b707a3d75a2681</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
The Snappy frame decoder does not restrict the chunk length, which may lead to excessive memory usage. Besides this, it may also buffer reserved skippable chunks until the whole chunk has been received, which can likewise lead to excessive memory usage. This vulnerability can be triggered by supplying malicious input that decompresses to a very large size (via a network stream or a file) or by sending a huge skippable chunk.
<p>Publish Date: 2021-10-19
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-37137>CVE-2021-37137</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>7.5</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-9vjp-v76f-g363">https://github.com/advisories/GHSA-9vjp-v76f-g363</a></p>
<p>Release Date: 2021-10-19</p>
<p>Fix Resolution (io.netty:netty-codec): 4.1.68.Final</p>
<p>Direct dependency fix Resolution (org.springframework.kafka:spring-kafka-test): 2.7.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png?' width=19 height=20> WS-2020-0408</summary>
### Vulnerable Library - <b>netty-handler-4.1.50.Final.jar</b></p>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/io.netty/netty-handler/4.1.50.Final/dc0110fc4d22fb22f1038cd73a6f8a034928a2d7/netty-handler-4.1.50.Final.jar</p>
<p>
Dependency Hierarchy:
- spring-kafka-test-2.6.13.jar (Root Library)
- kafka_2.13-2.6.3.jar
- zookeeper-3.5.9.jar
- :x: **netty-handler-4.1.50.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/devatherock/utilities/commit/c644aeaddd1c6c48771f873d69b707a3d75a2681">c644aeaddd1c6c48771f873d69b707a3d75a2681</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
An issue was found in all versions of io.netty:netty-all. Hostname verification in Netty is disabled by default. This can lead to a man-in-the-middle (MITM) attack in which an attacker forges a valid SSL/TLS certificate for a different hostname in order to intercept traffic not intended for them. This is an issue because the certificate is never matched against the host.
<p>Publish Date: 2020-06-22
<p>URL: <a href=https://github.com/netty/netty/issues/10362>WS-2020-0408</a></p>
</p>
<p></p>
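As an interim mitigation, hostname verification can be enabled explicitly on the `SSLEngine` that is handed to Netty, using only standard JSSE APIs. A hedged sketch — the host and port below are placeholders, and no Netty-specific APIs are assumed:

```java
import javax.net.ssl.SSLContext;
import javax.net.ssl.SSLEngine;
import javax.net.ssl.SSLParameters;

public class HostnameVerificationDemo {
    public static void main(String[] args) throws Exception {
        // Peer host/port hints are required for endpoint identification.
        SSLEngine engine = SSLContext.getDefault()
                .createSSLEngine("example.com", 443);
        // Off by default: without this, the engine performs no hostname check.
        SSLParameters params = engine.getSSLParameters();
        params.setEndpointIdentificationAlgorithm("HTTPS");
        engine.setSSLParameters(params);
        // The TLS handshake will now reject certificates whose subject
        // names do not match "example.com".
        System.out.println(engine.getSSLParameters()
                .getEndpointIdentificationAlgorithm());
    }
}
```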
### CVSS 3 Score Details (<b>7.4</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/WS-2020-0408">https://nvd.nist.gov/vuln/detail/WS-2020-0408</a></p>
<p>Release Date: 2020-06-22</p>
<p>Fix Resolution (io.netty:netty-handler): 4.1.69.Final</p>
<p>Direct dependency fix Resolution (org.springframework.kafka:spring-kafka-test): 2.7.0</p>
</p>
<p></p>
</details> | True | non_priority |
fix resolution org scala lang scala library step up your open source security game with mend cve vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec final netty codec final jar dependency hierarchy spring kafka test jar root library kafka jar zookeeper jar netty handler final jar x netty codec final jar vulnerable library found in head commit a href found in base branch master vulnerability details the decompression decoder function doesn t allow setting size restrictions on the decompressed output data which affects the allocation size used during decompression all users of are affected the malicious input can trigger an oome and so a dos attack publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec final direct dependency fix resolution org springframework kafka spring kafka test step up your open source security game with mend cve vulnerable library netty codec final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty codec final netty codec final jar dependency hierarchy spring kafka test jar root library kafka jar zookeeper jar netty handler final jar x netty codec final jar vulnerable library found in head 
commit a href found in base branch master vulnerability details the snappy frame decoder function doesn t restrict the chunk length which may lead to excessive memory usage beside this it also may buffer reserved skippable chunks until the whole chunk was received which may lead to excessive memory usage as well this vulnerability can be triggered by supplying malicious input that decompresses to a very big size via a network stream or a file or by sending a huge skippable chunk publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec final direct dependency fix resolution org springframework kafka spring kafka test step up your open source security game with mend ws vulnerable library netty handler final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients path to dependency file build gradle path to vulnerable library home wss scanner gradle caches modules files io netty netty handler final netty handler final jar dependency hierarchy spring kafka test jar root library kafka jar zookeeper jar x netty handler final jar vulnerable library found in head commit a href found in base branch master vulnerability details an issue was found in all versions of io netty netty all host verification in netty is disabled by default this can lead to mitm attack in which an attacker can forge valid ssl tls certificates for a different hostname in order to intercept traffic that doesn’t intend for him this is an issue because the certificate is not matched with the host publish date url a href cvss score details base 
score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty handler final direct dependency fix resolution org springframework kafka spring kafka test step up your open source security game with mend | 0 |
50,458 | 7,604,635,019 | IssuesEvent | 2018-04-30 03:13:18 | adafruit/circuitpython | https://api.github.com/repos/adafruit/circuitpython | closed | Fix autodoc in library doc builds | documentation good first issue | These repos have docs that may technically pass but the autodoc portion fails. This is usually due to a library not being mocked out. To fix it, a line must be added to conf.py to tell sphinx to mock out specific modules. Here is the line from the latest cookiecutter's conf.py:
```py
# Uncomment the below if you use native CircuitPython modules such as
# digitalio, micropython and busio. List the modules you use. Without it, the
# autodoc module docs will fail to generate with a warning.
# autodoc_mock_imports = ["adafruit_bus_device", "micropython"]
```
Copy these four lines, uncomment the fourth and adjust the list until the sphinx build includes the autodocs.
- [x] https://github.com/adafruit/Adafruit_CircuitPython_INA219
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DS3231
- [x] https://github.com/adafruit/Adafruit_CircuitPython_CharLCD
- [x] https://github.com/adafruit/Adafruit_CircuitPython_MAX31865
- [x] https://github.com/adafruit/Adafruit_CircuitPython_AMG88xx
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VL53L0X
- [x] https://github.com/adafruit/Adafruit_CircuitPython_TCS34725
- [x] https://github.com/adafruit/Adafruit_CircuitPython_seesaw
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LSM9DS1
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LSM9DS0
- [x] https://github.com/adafruit/Adafruit_CircuitPython_Thermistor
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VC0706
- [x] https://github.com/adafruit/Adafruit_CircuitPython_HT16K33
- [x] https://github.com/adafruit/Adafruit_CircuitPython_FXOS8700
- [x] https://github.com/adafruit/Adafruit_CircuitPython_FXAS21002C
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LIS3DH
- [x] https://github.com/adafruit/Adafruit_CircuitPython_RTTTL
- [x] https://github.com/adafruit/Adafruit_CircuitPython_IS31FL3731
- [x] https://github.com/adafruit/Adafruit_CircuitPython_ADS1x15
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LSM303
- [x] https://github.com/adafruit/Adafruit_CircuitPython_SimpleIO
- [x] https://github.com/adafruit/Adafruit_CircuitPython_BNO055
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VCNL4010
- [x] https://github.com/adafruit/Adafruit_CircuitPython_Fingerprint
- [x] https://github.com/adafruit/Adafruit_CircuitPython_SGP30
- [x] https://github.com/adafruit/Adafruit_CircuitPython_SHT31D
- [x] https://github.com/adafruit/Adafruit_CircuitPython_APDS9960
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DRV2605
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DS2413
- [x] https://github.com/adafruit/Adafruit_CircuitPython_OneWire
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VEML6070
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VL6180X
- [x] https://github.com/adafruit/Adafruit_CircuitPython_Thermal_Printer
- [x] https://github.com/adafruit/Adafruit_CircuitPython_FocalTouch
- [x] https://github.com/adafruit/Adafruit_CircuitPython_MAX9744
- [x] https://github.com/adafruit/Adafruit_CircuitPython_MCP4725
- [x] https://github.com/adafruit/Adafruit_CircuitPython_AVRprog
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DS18X20 | 1.0 | Fix autodoc in library doc builds - These repos have docs that may technically pass but the autodoc portion fails. This is usually due to a library not being mocked out. To fix it, a line must be added to conf.py to tell sphinx to mock out specific modules. Here is the line from the latest cookiecutter's conf.py:
```py
# Uncomment the below if you use native CircuitPython modules such as
# digitalio, micropython and busio. List the modules you use. Without it, the
# autodoc module docs will fail to generate with a warning.
# autodoc_mock_imports = ["adafruit_bus_device", "micropython"]
```
Copy these four lines, uncomment the fourth and adjust the list until the sphinx build includes the autodocs.
- [x] https://github.com/adafruit/Adafruit_CircuitPython_INA219
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DS3231
- [x] https://github.com/adafruit/Adafruit_CircuitPython_CharLCD
- [x] https://github.com/adafruit/Adafruit_CircuitPython_MAX31865
- [x] https://github.com/adafruit/Adafruit_CircuitPython_AMG88xx
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VL53L0X
- [x] https://github.com/adafruit/Adafruit_CircuitPython_TCS34725
- [x] https://github.com/adafruit/Adafruit_CircuitPython_seesaw
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LSM9DS1
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LSM9DS0
- [x] https://github.com/adafruit/Adafruit_CircuitPython_Thermistor
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VC0706
- [x] https://github.com/adafruit/Adafruit_CircuitPython_HT16K33
- [x] https://github.com/adafruit/Adafruit_CircuitPython_FXOS8700
- [x] https://github.com/adafruit/Adafruit_CircuitPython_FXAS21002C
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LIS3DH
- [x] https://github.com/adafruit/Adafruit_CircuitPython_RTTTL
- [x] https://github.com/adafruit/Adafruit_CircuitPython_IS31FL3731
- [x] https://github.com/adafruit/Adafruit_CircuitPython_ADS1x15
- [x] https://github.com/adafruit/Adafruit_CircuitPython_LSM303
- [x] https://github.com/adafruit/Adafruit_CircuitPython_SimpleIO
- [x] https://github.com/adafruit/Adafruit_CircuitPython_BNO055
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VCNL4010
- [x] https://github.com/adafruit/Adafruit_CircuitPython_Fingerprint
- [x] https://github.com/adafruit/Adafruit_CircuitPython_SGP30
- [x] https://github.com/adafruit/Adafruit_CircuitPython_SHT31D
- [x] https://github.com/adafruit/Adafruit_CircuitPython_APDS9960
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DRV2605
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DS2413
- [x] https://github.com/adafruit/Adafruit_CircuitPython_OneWire
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VEML6070
- [x] https://github.com/adafruit/Adafruit_CircuitPython_VL6180X
- [x] https://github.com/adafruit/Adafruit_CircuitPython_Thermal_Printer
- [x] https://github.com/adafruit/Adafruit_CircuitPython_FocalTouch
- [x] https://github.com/adafruit/Adafruit_CircuitPython_MAX9744
- [x] https://github.com/adafruit/Adafruit_CircuitPython_MCP4725
- [x] https://github.com/adafruit/Adafruit_CircuitPython_AVRprog
- [x] https://github.com/adafruit/Adafruit_CircuitPython_DS18X20 | non_priority | fix autodoc in library doc builds these repos have docs that may technically pass but the autodoc portion fails this is usually due to a library not being mocked out to fix it a line must be added to conf py to tell sphinx to mock out specific modules here is the line from the latest cookiecutter s conf py py uncomment the below if you use native circuitpython modules such as digitalio micropython and busio list the modules you use without it the autodoc module docs will fail to generate with a warning autodoc mock imports copy these four lines uncomment the fourth and adjust the list until the sphinx build includes the autodocs | 0 |
261,056 | 8,223,336,087 | IssuesEvent | 2018-09-06 10:15:34 | nanoframework/Home | https://api.github.com/repos/nanoframework/Home | opened | Assemblies are missing the TargetFrameworkAttribute | Area: Tools Priority: High Status: In progress Type: Enhancement | ## Details about Problem
The nF assemblies generated are missing the TargetFrameworkAttribute which is causing issues to some tools identifying that it is a nanoFramework assembly. | 1.0 | Assemblies are missing the TargetFrameworkAttribute - ## Details about Problem
The nF assemblies generated are missing the TargetFrameworkAttribute which is causing issues to some tools identifying that it is a nanoFramework assembly. | priority | assemblies are missing the targetframeworkattribute details about problem the nf assemblies generated are missing the targetframeworkattribute which is causing issues to some tools identifying that it is a nanoframework assembly | 1 |
203,821 | 15,889,849,369 | IssuesEvent | 2021-04-10 13:14:04 | oasis-tcs/csaf | https://api.github.com/repos/oasis-tcs/csaf | opened | Add missing enum information | csaf 2.0 documentation editorial | Information which was provided in [CVRF 1.2 Section 2.2](https://docs.oasis-open.org/csaf/csaf-cvrf/v1.2/cs01/csaf-cvrf-v1.2-cs01.html#_Toc493508787) is currently missing. This issue tracks the progress of adding them. | 1.0 | Add missing enum information - Information which was provided in [CVRF 1.2 Section 2.2](https://docs.oasis-open.org/csaf/csaf-cvrf/v1.2/cs01/csaf-cvrf-v1.2-cs01.html#_Toc493508787) is currently missing. This issue tracks the progress of adding them. | non_priority | add missing enum information information which was provided in is currently missing this issue tracks the progress of adding them | 0 |
15,373 | 5,108,711,356 | IssuesEvent | 2017-01-05 18:32:46 | BlackSourceLabs/BlackNectar-Service | https://api.github.com/repos/BlackSourceLabs/BlackNectar-Service | closed | Update SQL to use POSTGIS Functions | code enhancement | Right now distances are calculated in a primitive way doing trigonometric calculations.
Now that we have access to a Geo-Spatial Database courtesy BlackSourceLabs/BlackNectar-Service#13 & BlackSourceLabs/BlackNectar-Service#12, we can use `ST_DWithin` and `ST_Distance` to calculate and order stores by distance. | 1.0 | Update SQL to use POSTGIS Functions - Right now distances are calculated in a primitive way doing trigonometric calculations.
Now that we have access to a Geo-Spatial Database courtesy BlackSourceLabs/BlackNectar-Service#13 & BlackSourceLabs/BlackNectar-Service#12, we can use `ST_DWithin` and `ST_Distance` to calculate and order stores by distance. | non_priority | update sql to use postgis functions right now distances are calculated in a primitive way doing trigonometric calculations now that we have access to a geo spatial database courtesy blacksourcelabs blacknectar service blacksourcelabs blacknectar service we can use st dwithin and st distance to calculate and order stores by distance | 0 |
274,222 | 20,829,015,588 | IssuesEvent | 2022-03-19 05:14:56 | celery/celery | https://api.github.com/repos/celery/celery | closed | docs.celeryproject.org and celeryproject.org are down | Issue Type: Bug Report Category: Documentation | <!--
Please fill this template entirely and do not erase parts of it.
We reserve the right to close without a response
bug reports which are incomplete.
-->
# Checklist
<!--
To check an item on the list replace [ ] with [x].
-->
- [x] I have checked the [issues list](https://github.com/celery/celery/issues?utf8=%E2%9C%93&q=is%3Aissue+label%3A%22Category%3A+Documentation%22+)
for similar or identical bug reports.
- [x] I have checked the [pull requests list](https://github.com/celery/celery/pulls?q=is%3Apr+label%3A%22Category%3A+Documentation%22)
for existing proposed fixes.
- [x] I have checked the [commit log](https://github.com/celery/celery/commits/master)
to find out if the bug was already fixed in the master branch.
- [x] I have included all related issues and possible duplicate issues in this issue
(If there are none, check this box anyway).
## Related Issues and Possible Duplicates
<!--
Please make sure to search and mention any related issues
or possible duplicates to this issue as requested by the checklist above.
This may or may not include issues in other repositories that the Celery project
maintains or other repositories that are dependencies of Celery.
If you don't know how to mention issues, please refer to Github's documentation
on the subject: https://help.github.com/en/articles/autolinked-references-and-urls#issues-and-pull-requests
-->
#### Related Issues
- None
#### Possible Duplicates
- celery/celeryproject#50
- celery/celeryproject#47
# Description
<!--
Please describe what's missing or incorrect about our documentation.
Include links and/or screenshots which will aid us to resolve the issue.
-->
The Celery website (celeryproject.org) and the Celery documentation website (docs.celeryproject.org) no longer seem to have A DNS records. I've checked this by running `dig celeryproject a` and `dig docs.celeryproject.org a` on computers with internet connections in various places in the world, and nowhere seems able to load the website any more.
# Suggestions
<!-- Please provide us suggestions for how to fix the documentation -->
I'm not sure. Has the domain ownership lapsed, or is there just an issue with your DNS provider? | 1.0 | docs.celeryproject.org and celeryproject.org are down - <!--
Please fill this template entirely and do not erase parts of it.
We reserve the right to close without a response
bug reports which are incomplete.
-->
# Checklist
<!--
To check an item on the list replace [ ] with [x].
-->
- [x] I have checked the [issues list](https://github.com/celery/celery/issues?utf8=%E2%9C%93&q=is%3Aissue+label%3A%22Category%3A+Documentation%22+)
for similar or identical bug reports.
- [x] I have checked the [pull requests list](https://github.com/celery/celery/pulls?q=is%3Apr+label%3A%22Category%3A+Documentation%22)
for existing proposed fixes.
- [x] I have checked the [commit log](https://github.com/celery/celery/commits/master)
to find out if the bug was already fixed in the master branch.
- [x] I have included all related issues and possible duplicate issues in this issue
(If there are none, check this box anyway).
## Related Issues and Possible Duplicates
<!--
Please make sure to search and mention any related issues
or possible duplicates to this issue as requested by the checklist above.
This may or may not include issues in other repositories that the Celery project
maintains or other repositories that are dependencies of Celery.
If you don't know how to mention issues, please refer to Github's documentation
on the subject: https://help.github.com/en/articles/autolinked-references-and-urls#issues-and-pull-requests
-->
#### Related Issues
- None
#### Possible Duplicates
- celery/celeryproject#50
- celery/celeryproject#47
# Description
<!--
Please describe what's missing or incorrect about our documentation.
Include links and/or screenshots which will aid us to resolve the issue.
-->
The Celery website (celeryproject.org) and the Celery documentation website (docs.celeryproject.org) no longer seem to have A DNS records. I've checked this by running `dig celeryproject a` and `dig docs.celeryproject.org a` on computers with internet connections in various places in the world, and nowhere seems able to load the website any more.
# Suggestions
<!-- Please provide us suggestions for how to fix the documentation -->
I'm not sure. Has the domain ownership lapsed, or is there just an issue with your DNS provider? | non_priority | docs celeryproject org and celeryproject org are down please fill this template entirely and do not erase parts of it we reserve the right to close without a response bug reports which are incomplete checklist to check an item on the list replace with i have checked the for similar or identical bug reports i have checked the for existing proposed fixes i have checked the to find out if the bug was already fixed in the master branch i have included all related issues and possible duplicate issues in this issue if there are none check this box anyway related issues and possible duplicates please make sure to search and mention any related issues or possible duplicates to this issue as requested by the checklist above this may or may not include issues in other repositories that the celery project maintains or other repositories that are dependencies of celery if you don t know how to mention issues please refer to github s documentation on the subject related issues none possible duplicates celery celeryproject celery celeryproject description please describe what s missing or incorrect about our documentation include links and or screenshots which will aid us to resolve the issue the celery website celeryproject org and the celery documentation website docs celeryproject org no longer seem to have a dns records i ve checked this by running dig celeryproject a and dig docs celeryproject org a on computers with internet connections in various places in the world and nowhere seems able to load the website any more suggestions i m not sure has the domain ownership lapsed or is there just an issue with your dns provider | 0 |
37,525 | 10,024,230,532 | IssuesEvent | 2019-07-16 21:16:46 | dpa99c/cordova-plugin-firebasex | https://api.github.com/repos/dpa99c/cordova-plugin-firebasex | closed | Error: pod: Command failed with exit code 31 | awaiting response build issue support | **Describe the bug**
Hi Dave,
I tried to upgrade to the last version of the plugin and faced a conflict with the iOS platform. I tried then to reproduce the same behavior with a dummy project and I landed on the same problem respectively, when I try to install the last version of the plugin, I face the following error:
> Running command: pod install --verbose
Failed to install 'cordova-plugin-firebasex': Error: pod: Command failed with exit code 31
at ChildProcess.whenDone (/myProject/firebasex-pod/node_modules/cordova-common/src/superspawn.js:135:23)
at ChildProcess.emit (events.js:200:13)
at maybeClose (internal/child_process.js:1021:16)
at Socket.<anonymous> (internal/child_process.js:430:11)
at Socket.emit (events.js:200:13)
at Pipe.<anonymous> (net.js:586:12)
pod: Command failed with exit code 31
**To Reproduce**
Steps to reproduce the behavior:
1. ionic start myProject
2. cd myProject
3. ionic cordova plugin add cordova-plugin-firebasex --save
4. ionic cordova platform add ios@latest --save
**Console Logs**
> $ ionic cordova platform add ios@latest --save
> cordova platform add ios@latest
Using cordova-fetch for cordova-ios@latest
Adding ios project...
Creating Cordova project for the iOS platform:
Path: platforms/ios
Package: io.ionic.starter
Name: MyApp
iOS project created with cordova-ios@5.0.1
Installing "cordova-plugin-firebasex" for ios
Installing "cordova-plugin-androidx" for ios
Running command: pod install --verbose
Failed to install 'cordova-plugin-firebasex': Error: pod: Command failed with exit code 31
at ChildProcess.whenDone (/myProject/firebasex-pod/node_modules/cordova-common/src/superspawn.js:135:23)
at ChildProcess.emit (events.js:200:13)
at maybeClose (internal/child_process.js:1021:16)
at Socket.<anonymous> (internal/child_process.js:430:11)
at Socket.emit (events.js:200:13)
at Pipe.<anonymous> (net.js:586:12)
pod: Command failed with exit code 31
[ERROR] An error occurred while running subprocess cordova.
cordova platform add ios@latest exited with exit code 31.
Re-running this command with the --verbose flag may provide more information.
**Plugin Version**
v4.0.0
Thx in advance and let me know if I should test or do anything.
P.S.: Rollback to plugin v3.0.1 meantime, doesn't face that problem with it
| 1.0 | Error: pod: Command failed with exit code 31 - **Describe the bug**
Hi Dave,
I tried to upgrade to the last version of the plugin and faced a conflict with the iOS platform. I tried then to reproduce the same behavior with a dummy project and I landed on the same problem respectively, when I try to install the last version of the plugin, I face the following error:
> Running command: pod install --verbose
Failed to install 'cordova-plugin-firebasex': Error: pod: Command failed with exit code 31
at ChildProcess.whenDone (/myProject/firebasex-pod/node_modules/cordova-common/src/superspawn.js:135:23)
at ChildProcess.emit (events.js:200:13)
at maybeClose (internal/child_process.js:1021:16)
at Socket.<anonymous> (internal/child_process.js:430:11)
at Socket.emit (events.js:200:13)
at Pipe.<anonymous> (net.js:586:12)
pod: Command failed with exit code 31
**To Reproduce**
Steps to reproduce the behavior:
1. ionic start myProject
2. cd myProject
3. ionic cordova plugin add cordova-plugin-firebasex --save
4. ionic cordova platform add ios@latest --save
**Console Logs**
> $ ionic cordova platform add ios@latest --save
> cordova platform add ios@latest
Using cordova-fetch for cordova-ios@latest
Adding ios project...
Creating Cordova project for the iOS platform:
Path: platforms/ios
Package: io.ionic.starter
Name: MyApp
iOS project created with cordova-ios@5.0.1
Installing "cordova-plugin-firebasex" for ios
Installing "cordova-plugin-androidx" for ios
Running command: pod install --verbose
Failed to install 'cordova-plugin-firebasex': Error: pod: Command failed with exit code 31
at ChildProcess.whenDone (/myProject/firebasex-pod/node_modules/cordova-common/src/superspawn.js:135:23)
at ChildProcess.emit (events.js:200:13)
at maybeClose (internal/child_process.js:1021:16)
at Socket.<anonymous> (internal/child_process.js:430:11)
at Socket.emit (events.js:200:13)
at Pipe.<anonymous> (net.js:586:12)
pod: Command failed with exit code 31
[ERROR] An error occurred while running subprocess cordova.
cordova platform add ios@latest exited with exit code 31.
Re-running this command with the --verbose flag may provide more information.
**Plugin Version**
v4.0.0
Thx in advance and let me know if I should test or do anything.
P.S.: Rollback to plugin v3.0.1 meantime, doesn't face that problem with it
| non_priority | error pod command failed with exit code describe the bug hi dave i tried to upgrade to the last version of the plugin and faced a conflict with the ios platform i tried then to reproduce the same behavior with a dummy project and i landed on the same problem respectively when i try to install the last version of the plugin i face the following error running command pod install verbose failed to install cordova plugin firebasex error pod command failed with exit code at childprocess whendone myproject firebasex pod node modules cordova common src superspawn js at childprocess emit events js at maybeclose internal child process js at socket internal child process js at socket emit events js at pipe net js pod command failed with exit code to reproduce steps to reproduce the behavior ionic start myproject cd myproject ionic cordova plugin add cordova plugin firebasex save ionic cordova platform add ios latest save console logs ionic cordova platform add ios latest save cordova platform add ios latest using cordova fetch for cordova ios latest adding ios project creating cordova project for the ios platform path platforms ios package io ionic starter name myapp ios project created with cordova ios installing cordova plugin firebasex for ios installing cordova plugin androidx for ios running command pod install verbose failed to install cordova plugin firebasex error pod command failed with exit code at childprocess whendone myproject firebasex pod node modules cordova common src superspawn js at childprocess emit events js at maybeclose internal child process js at socket internal child process js at socket emit events js at pipe net js pod command failed with exit code an error occurred while running subprocess cordova cordova platform add ios latest exited with exit code re running this command with the verbose flag may provide more information plugin version thx in advance and let me know if i should test or do anything p s rollback to plugin 
meantime doesn t face that problem with it | 0 |
110,298 | 4,424,775,234 | IssuesEvent | 2016-08-16 13:39:43 | openvstorage/alba | https://api.github.com/repos/openvstorage/alba | closed | RDMA Proxy client's timeout shouldn't be based on gettimeofday | priority_urgent type_bug | Discovered via code inspection while debugging https://github.com/openvstorage/alba/issues/317 :
```
double _stamp_ms() {
struct timeval tp;
gettimeofday(&tp, NULL);
double t0 = 1000 * tp.tv_sec + (double)tp.tv_usec / 1e3;
return t0;
}
```
`gettimeofday` returns the wall clock time - better use a steady clock, e.g. `boost::chrono::steady_clock` - http://www.boost.org/doc/libs/1_61_0/doc/html/chrono/reference.html#chrono.reference.cpp0x.system_clocks_hpp.steady_clock , e.g. (pseudo code!):
```
auto end = boost::chrono::steady_clock::now() + timeout_duration;
do
{
// ... work
}
while (boost::chrono::steady_clock::now() < end);
``` | 1.0 | RDMA Proxy client's timeout shouldn't be based on gettimeofday - Discovered via code inspection while debugging https://github.com/openvstorage/alba/issues/317 :
```
double _stamp_ms() {
struct timeval tp;
gettimeofday(&tp, NULL);
double t0 = 1000 * tp.tv_sec + (double)tp.tv_usec / 1e3;
return t0;
}
```
`gettimeofday` returns the wall clock time - better use a steady clock, e.g. `boost::chrono::steady_clock` - http://www.boost.org/doc/libs/1_61_0/doc/html/chrono/reference.html#chrono.reference.cpp0x.system_clocks_hpp.steady_clock , e.g. (pseudo code!):
```
auto end = boost::chrono::steady_clock::now() + timeout_duration;
do
{
// ... work
}
while (boost::chrono::steady_clock::now() < end)
``` | priority | rdma proxy client s timeout shouldn t be based on gettimeofday discovered via code inspection while debugging double stamp ms struct timeval tp gettimeofday tp null double tp tv sec double tp tv usec return gettimeofday returns the wall clock time better use a steady clock e g boost chrono steady clock e g pseudo code auto end boost chrono steady clock now timeout duration do work while boost chrono steady clock now end | 1 |
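The steady-clock deadline pattern recommended in the record above (replacing wall-clock `gettimeofday` with a monotonic clock) is not C++/Boost-specific; here is a minimal sketch in Python using the standard library's steady clock, `time.monotonic()`. The `do work` placeholder from the pseudo code is passed in as a callable; this is an illustration of the idea, not the Alba proxy client's actual implementation.

```python
import time

def run_with_timeout(work, timeout_s):
    """Repeat `work` until a monotonic deadline passes.

    time.monotonic() is a steady clock: unlike wall-clock time
    (what gettimeofday returns), it never jumps when the system
    clock is adjusted, so the effective timeout cannot be distorted.
    """
    deadline = time.monotonic() + timeout_s
    iterations = 0
    while time.monotonic() < deadline:
        work()  # the "... work" step from the pseudo code
        iterations += 1
    return iterations

# Example: spin on a trivial work item for ~50 ms.
count = run_with_timeout(lambda: None, 0.05)
```

The key point mirrors the Boost suggestion: compute the deadline once from the steady clock, then compare against the same clock on every iteration.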
31,401 | 11,924,964,393 | IssuesEvent | 2020-04-01 10:25:25 | Baneeishaque/side_menu | https://api.github.com/repos/Baneeishaque/side_menu | opened | CVE-2019-10744 (High) detected in lodash-4.17.5.tgz | security vulnerability | ## CVE-2019-10744 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.5.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/side_menu/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/side_menu/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- app-scripts-3.1.9.tgz (Root Library)
- node-sass-4.7.2.tgz
- sass-graph-2.2.4.tgz
- :x: **lodash-4.17.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/side_menu/commit/36e50af1a29f1ee74ab7c6293bd41021ebaf81f5">36e50af1a29f1ee74ab7c6293bd41021ebaf81f5</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of lodash lower than 4.17.12 are vulnerable to Prototype Pollution. The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-07-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10744>CVE-2019-10744</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/pull/4336/commits/a01e4fa727e7294cb7b2845570ba96b206926790">https://github.com/lodash/lodash/pull/4336/commits/a01e4fa727e7294cb7b2845570ba96b206926790</a></p>
<p>Release Date: 2019-07-08</p>
<p>Fix Resolution: 4.17.12</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-10744 (High) detected in lodash-4.17.5.tgz - ## CVE-2019-10744 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.5.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.5.tgz</a></p>
<p>Path to dependency file: /tmp/ws-scm/side_menu/package.json</p>
<p>Path to vulnerable library: /tmp/ws-scm/side_menu/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- app-scripts-3.1.9.tgz (Root Library)
- node-sass-4.7.2.tgz
- sass-graph-2.2.4.tgz
- :x: **lodash-4.17.5.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Baneeishaque/side_menu/commit/36e50af1a29f1ee74ab7c6293bd41021ebaf81f5">36e50af1a29f1ee74ab7c6293bd41021ebaf81f5</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Versions of lodash lower than 4.17.12 are vulnerable to Prototype Pollution. The function defaultsDeep could be tricked into adding or modifying properties of Object.prototype using a constructor payload.
<p>Publish Date: 2019-07-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10744>CVE-2019-10744</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/pull/4336/commits/a01e4fa727e7294cb7b2845570ba96b206926790">https://github.com/lodash/lodash/pull/4336/commits/a01e4fa727e7294cb7b2845570ba96b206926790</a></p>
<p>Release Date: 2019-07-08</p>
<p>Fix Resolution: 4.17.12</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in lodash tgz cve high severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file tmp ws scm side menu package json path to vulnerable library tmp ws scm side menu node modules lodash package json dependency hierarchy app scripts tgz root library node sass tgz sass graph tgz x lodash tgz vulnerable library found in head commit a href vulnerability details versions of lodash lower than are vulnerable to prototype pollution the function defaultsdeep could be tricked into adding or modifying properties of object prototype using a constructor payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
171,048 | 20,915,168,378 | IssuesEvent | 2022-03-24 12:50:33 | ghc-dev/Kristen-Adams | https://api.github.com/repos/ghc-dev/Kristen-Adams | opened | WS-2019-0379 (Medium) detected in commons-codec-1.9.jar | security vulnerability | ## WS-2019-0379 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-codec-1.9.jar</b></p></summary>
<p>The Apache Commons Codec package contains simple encoder and decoders for
various formats such as Base64 and Hexadecimal. In addition to these
widely used encoders and decoders, the codec package also maintains a
collection of phonetic encoding utilities.</p>
<p>Library home page: <a href="http://commons.apache.org/proper/commons-codec/">http://commons.apache.org/proper/commons-codec/</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-codec/commons-codec/1.9/commons-codec-1.9.jar</p>
<p>
Dependency Hierarchy:
- rocketmq-broker-4.6.0.jar (Root Library)
- rocketmq-acl-4.6.0.jar
- :x: **commons-codec-1.9.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Kristen-Adams/commit/1c274ce2e9534954d3b106f366c90bea1e925284">1c274ce2e9534954d3b106f366c90bea1e925284</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input validation.
<p>Publish Date: 2019-05-20
<p>URL: <a href=https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113>WS-2019-0379</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113">https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113</a></p>
<p>Release Date: 2019-05-20</p>
<p>Fix Resolution (commons-codec:commons-codec): 1.13</p>
<p>Direct dependency fix Resolution (org.apache.rocketmq:rocketmq-broker): 4.6.1</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.rocketmq","packageName":"rocketmq-broker","packageVersion":"4.6.0","packageFilePaths":["/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.rocketmq:rocketmq-broker:4.6.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.6.1","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2019-0379","vulnerabilityDetails":"Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input validation.","vulnerabilityUrl":"https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | True | WS-2019-0379 (Medium) detected in commons-codec-1.9.jar - ## WS-2019-0379 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>commons-codec-1.9.jar</b></p></summary>
<p>The Apache Commons Codec package contains simple encoder and decoders for
various formats such as Base64 and Hexadecimal. In addition to these
widely used encoders and decoders, the codec package also maintains a
collection of phonetic encoding utilities.</p>
<p>Library home page: <a href="http://commons.apache.org/proper/commons-codec/">http://commons.apache.org/proper/commons-codec/</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/commons-codec/commons-codec/1.9/commons-codec-1.9.jar</p>
<p>
Dependency Hierarchy:
- rocketmq-broker-4.6.0.jar (Root Library)
- rocketmq-acl-4.6.0.jar
- :x: **commons-codec-1.9.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Kristen-Adams/commit/1c274ce2e9534954d3b106f366c90bea1e925284">1c274ce2e9534954d3b106f366c90bea1e925284</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input validation.
<p>Publish Date: 2019-05-20
<p>URL: <a href=https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113>WS-2019-0379</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113">https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113</a></p>
<p>Release Date: 2019-05-20</p>
<p>Fix Resolution (commons-codec:commons-codec): 1.13</p>
<p>Direct dependency fix Resolution (org.apache.rocketmq:rocketmq-broker): 4.6.1</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.rocketmq","packageName":"rocketmq-broker","packageVersion":"4.6.0","packageFilePaths":["/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"org.apache.rocketmq:rocketmq-broker:4.6.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"4.6.1","isBinary":false}],"baseBranches":["main"],"vulnerabilityIdentifier":"WS-2019-0379","vulnerabilityDetails":"Apache commons-codec before version “commons-codec-1.13-RC1” is vulnerable to information disclosure due to Improper Input validation.","vulnerabilityUrl":"https://github.com/apache/commons-codec/commit/48b615756d1d770091ea3322eefc08011ee8b113","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> --> | non_priority | ws medium detected in commons codec jar ws medium severity vulnerability vulnerable library commons codec jar the apache commons codec package contains simple encoder and decoders for various formats such as and hexadecimal in addition to these widely used encoders and decoders the codec package also maintains a collection of phonetic encoding utilities library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository commons codec commons codec commons codec jar dependency hierarchy rocketmq broker jar root library rocketmq acl jar x commons codec jar vulnerable library found in head commit a href found in base branch main vulnerability details apache commons codec before version “commons codec ” is vulnerable to information disclosure due to improper input validation publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics 
confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution commons codec commons codec direct dependency fix resolution org apache rocketmq rocketmq broker rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree org apache rocketmq rocketmq broker isminimumfixversionavailable true minimumfixversion isbinary false basebranches vulnerabilityidentifier ws vulnerabilitydetails apache commons codec before version “commons codec ” is vulnerable to information disclosure due to improper input validation vulnerabilityurl | 0 |
277,245 | 8,628,043,707 | IssuesEvent | 2018-11-21 16:23:25 | poanetwork/blockscout | https://api.github.com/repos/poanetwork/blockscout | closed | Async task to check token supply and other metadata | enhancement in progress priority: high team: developer tokens | Some `tokens` may have their `total supply` value changed over time. In case that happens, we won't be able to have the updated value because we only fetch that information when we are creating the token for the first time.
In order to keep the `total supply` value updated it will be necessary to fetch the data from the blockchain again because we don't know which token has been changed.
A couple options
1. Check using a cronjob at a set interval
2. Check with every token transfer
### Acceptance Criteria
* Given that we have a token already indexed with a `total supply` value when there are changes to this value in the blockchain, this value must be updated so that we can display it in the application;
* Update all metadata that have changed since the last checkout.
### Tasks
- [x] Get all cataloged tokens
- [x] Check all functions for each token
- [x] Save valid data to database
- [x] Create a GenServer to update tokens metadata
- [x] Create a configuration `updater_interval`;
- [x] Rerun this operation after a given time interval. | 1.0 | Async task to check token supply and other metadata - Some `tokens` may have their `total supply` value changed over time. In case that happens, we won't be able to have the updated value because we only fetch that information when we are creating the token for the first time.
In order to keep the `total supply` value updated it will be necessary to fetch the data from the blockchain again because we don't know which token has been changed.
A couple options
1. Check using a cronjob at a set interval
2. Check with every token transfer
### Acceptance Criteria
* Given that we have a token already indexed with a `total supply` value when there are changes to this value in the blockchain, this value must be updated so that we can display it in the application;
* Update all metadata that have changed since the last checkout.
### Tasks
- [x] Get all cataloged tokens
- [x] Check all functions for each token
- [x] Save valid data to database
- [x] Create a GenServer to update tokens metadata
- [x] Create a configuration `updater_interval`;
- [x] Rerun this operation after a given time interval. | priority | async task to check token supply and other metadata some tokens may have their total supply value changed over time in case that happens we won t be able to have the updated value because we only fetch that information when we are creating the token for the first time in order to keep the total supply value updated it will be necessary to fetch the data from the blockchain again because we don t know which token has been changed a couple options check using a cronjob at a set interval check with every token transfer acceptance criteria given that we have a token already indexed with a total supply value when there are changes to this value in the blockchain this value must be updated so that we can display it in the application update all metadata that have changed since the last checkout tasks get all cataloged tokens check all functions for each token save valid data to database create a genserver to update tokens metadata create a configuration updater interval rerun this operation after a given time interval | 1 |
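The task list above (get cataloged tokens, re-check each one, save changed values, rerun on an interval) can be sketched language-agnostically. The record's real implementation is an Elixir GenServer with a configurable `updater_interval`; the sketch below is a hypothetical Python analogue, where `fetch_total_supply` stands in for the on-chain read.

```python
import threading

class TokenMetadataUpdater:
    """Periodically refresh total_supply for all cataloged tokens.

    `fetch_total_supply(address)` is a stand-in for the blockchain
    call; it returns the current supply or None when the read fails.
    """

    def __init__(self, tokens, fetch_total_supply, updater_interval):
        self.tokens = tokens            # {address: metadata dict}
        self.fetch = fetch_total_supply
        self.interval = updater_interval
        self._timer = None

    def run_once(self):
        """Check every token; persist only valid, changed values."""
        updated = []
        for address, meta in self.tokens.items():
            supply = self.fetch(address)
            if supply is not None and supply != meta.get("total_supply"):
                meta["total_supply"] = supply
                updated.append(address)
        return updated

    def start(self):
        """Run immediately, then rerun after each interval."""
        self.run_once()
        self._timer = threading.Timer(self.interval, self.start)
        self._timer.daemon = True
        self._timer.start()
```

Checking `supply is not None` before writing matches the "save valid data" task: a failed fetch leaves the previously stored value in place rather than clobbering it.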
80,836 | 3,575,122,400 | IssuesEvent | 2016-01-27 14:53:13 | pfirsich/Frontdown | https://api.github.com/repos/pfirsich/Frontdown | closed | Rendering of Large actions.html is too Slow | bug priority:must have | Probably we want to pre-render the HTML table instead of parsing the JSON content.
Maybe we can also hide the "hardlink" entries (in hardlink mode) since they basically are NOP. | 1.0 | Rendering of Large actions.html is too Slow - Probably we want to pre-render the HTML table instead of parsing the JSON content.
Maybe we can also hide the "hardlink" entries (in hardlink mode) since they basically are NOP. | priority | rendering of large actions html is too slow probably we want to pre render the html table instead of parsing the json content maybe we can also hide the hardlink entries in hardlink mode since they basically are nop | 1 |
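Both fixes proposed in the record above — pre-render the HTML table instead of having the page parse a large JSON blob, and hide the no-op "hardlink" entries — can be sketched briefly. The action field names below are illustrative, not Frontdown's actual schema.

```python
import html

def render_actions_table(actions, hide_types=("hardlink",)):
    """Render backup actions as a static HTML table.

    Building the markup once up front avoids parsing a large
    actions JSON in the browser; `hide_types` drops entries that
    are effectively no-ops, such as hardlinks in hardlink mode.
    """
    rows = []
    for action in actions:
        if action["type"] in hide_types:
            continue  # skip NOP entries entirely
        rows.append(
            "<tr><td>{}</td><td>{}</td></tr>".format(
                html.escape(action["type"]), html.escape(action["path"])
            )
        )
    return "<table>{}</table>".format("".join(rows))

table = render_actions_table([
    {"type": "copy", "path": "docs/a.txt"},
    {"type": "hardlink", "path": "docs/b.txt"},
])
```

Escaping each cell with `html.escape` keeps file paths containing `<`, `>`, or `&` from breaking the generated markup.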
385,215 | 11,415,276,421 | IssuesEvent | 2020-02-02 09:54:03 | parzh/retryable | https://api.github.com/repos/parzh/retryable | closed | Setup automation | Change: patch Domain: meta Priority: top Type: improvement | ### [`pull_request`] → [`parzh:develop`]
##### (actions [`"opened"`], [`"labeled"`], [`"unlabeled"`])
- [ ] require exactly one `"Change: (major|minor|patch)"` label
### [`pull_request`] → [`parzh:develop`]
##### (action [`"synchronize"`])
- [ ] `npm run build`
- [ ] `npm test`
- [ ] `npm lint`
### [`pull_request`] → [`parzh:develop`]
##### (action [`"closed"`], if<sup>[1][conditions]</sup> [`github.event.pull_request.merged`])
- [ ] collect release notes
- [ ] collect code coverage
***
### [`push`] → [`parzh:develop`]
- [ ] `npm run build`
- [ ] `npm test`
- [ ] `npm lint`
***
### [`pull_request`] → [`parzh:master`]
##### (actions [`"opened"`], [`"labeled"`], [`"unlabeled"`])
- [ ] require head to be `parzh:develop`
- [ ] require exactly one `"Change: (major|minor|patch)"` label
### [`pull_request`] → [`parzh:master`]
##### (actions [`"labeled"`], [`"unlabeled"`])
- [ ] gather semver change type
```sh
# show merge commits after latest merge to master
git rev-list master..develop --merges
```
### [`pull_request`] → [`parzh:master`]
##### (action [`"closed"`], if [`github.event.pull_request.merged`])
- [ ] require exactly one `"Change: (major|minor|patch)"` label
- [ ] `npm run build`
- [ ] `npm test`
- [ ] `npm run lint`
- [ ] `npm version $VERSION_SPECIFIED_BY_LABEL`
- [ ] `npm publish @next`
- [ ] reset `publish @latest` timer (see below)
***
### one week since the latest "publish @next"
- [ ] re-publish `@next` version as `@latest`
```sh
# get current @next version
NEXT_VERSION="$(npm view @parzh/retryable@next --version)"
# put @latest tag on @next release
npm dist-tag add @parzh/retryable@$NEXT_VERSION latest
```
[`parzh:develop`]: /parzh/retryable/commits/develop
[`parzh:master`]: /parzh/retryable/commits/master
[`push`]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/events-that-trigger-workflows#push-event-push
[`pull_request`]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/events-that-trigger-workflows#pull-request-event-pull_request
[conditions]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions#jobsjob_idif
[`github.event.pull_request.merged`]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/contexts-and-expression-syntax-for-github-actions#github-context
[`"synchronize"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L18
[`"opened"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L10
[`"closed"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L11
[`"labeled"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L7
[`"unlabeled"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L8 | 1.0 | Setup automation - ### [`pull_request`] → [`parzh:develop`]
##### (actions [`"opened"`], [`"labeled"`], [`"unlabeled"`])
- [ ] require exactly one `"Change: (major|minor|patch)"` label
### [`pull_request`] → [`parzh:develop`]
##### (action [`"synchronize"`])
- [ ] `npm run build`
- [ ] `npm test`
- [ ] `npm lint`
### [`pull_request`] → [`parzh:develop`]
##### (action [`"closed"`], if<sup>[1][conditions]</sup> [`github.event.pull_request.merged`])
- [ ] collect release notes
- [ ] collect code coverage
***
### [`push`] → [`parzh:develop`]
- [ ] `npm run build`
- [ ] `npm test`
- [ ] `npm lint`
***
### [`pull_request`] → [`parzh:master`]
##### (actions [`"opened"`], [`"labeled"`], [`"unlabeled"`])
- [ ] require head to be `parzh:develop`
- [ ] require exactly one `"Change: (major|minor|patch)"` label
### [`pull_request`] → [`parzh:master`]
##### (actions [`"labeled"`], [`"unlabeled"`])
- [ ] gather semver change type
```sh
# show merge commits after latest merge to master
git rev-list master..develop --merges
```
### [`pull_request`] → [`parzh:master`]
##### (action [`"closed"`], if [`github.event.pull_request.merged`])
- [ ] require exactly one `"Change: (major|minor|patch)"` label
- [ ] `npm run build`
- [ ] `npm test`
- [ ] `npm run lint`
- [ ] `npm version $VERSION_SPECIFIED_BY_LABEL`
- [ ] `npm publish @next`
- [ ] reset `publish @latest` timer (see below)
***
### one week since the latest "publish @next"
- [ ] re-publish `@next` version as `@latest`
```sh
# get current @next version
NEXT_VERSION="$(npm view @parzh/retryable@next --version)"
# put @latest tag on @next release
npm dist-tag add @parzh/retryable@$NEXT_VERSION latest
```
[`parzh:develop`]: /parzh/retryable/commits/develop
[`parzh:master`]: /parzh/retryable/commits/master
[`push`]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/events-that-trigger-workflows#push-event-push
[`pull_request`]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/events-that-trigger-workflows#pull-request-event-pull_request
[conditions]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/workflow-syntax-for-github-actions#jobsjob_idif
[`github.event.pull_request.merged`]: https://help.github.com/en/actions/automating-your-workflow-with-github-actions/contexts-and-expression-syntax-for-github-actions#github-context
[`"synchronize"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L18
[`"opened"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L10
[`"closed"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L11
[`"labeled"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L7
[`"unlabeled"`]: https://gist.github.com/parzhitsky/bf0aa853802e6b7d510b5d36d920f015#file-main-yml-L8 | priority | setup automation rarr actions require exactly one change major minor patch label rarr action npm run build npm test npm lint rarr action if collect release notes collect code coverage rarr npm run build npm test npm lint rarr actions require head to be parzh develop require exactly one change major minor patch label rarr actions gather semver change type sh show merge commits after latest merge to master git rev list master develop merges rarr action if require exactly one change major minor patch label npm run build npm test npm run lint npm version version specified by label npm publish next reset publish latest timer see below one week since the latest publish next re publish next version as latest sh get current next version next version npm view parzh retryable next version put latest tag on next release npm dist tag add parzh retryable next version latest parzh retryable commits develop parzh retryable commits master | 1 |
40,916 | 2,868,950,776 | IssuesEvent | 2015-06-05 22:09:11 | dart-lang/pub | https://api.github.com/repos/dart-lang/pub | closed | Pub publish file checking warning should respect ignored files | bug Fixed Priority-Low | <a href="https://github.com/simonpai"><img src="https://avatars.githubusercontent.com/u/785058?v=3" align="left" width="96" height="96"hspace="10"></img></a> **Issue by [simonpai](https://github.com/simonpai)**
_Originally opened as dart-lang/sdk#11198_
----
When you have a project with generated doc in a folder (say, out/dartdoc) marked in .gitignore, pub publish will correctly ignore it, but it still gives a warning saying "Avoid putting generated documentation in out/dartdoc.", which is incorrect IMO. | 1.0 | Pub publish file checking warning should respect ignored files - <a href="https://github.com/simonpai"><img src="https://avatars.githubusercontent.com/u/785058?v=3" align="left" width="96" height="96"hspace="10"></img></a> **Issue by [simonpai](https://github.com/simonpai)**
_Originally opened as dart-lang/sdk#11198_
----
When you have a project with generated doc in a folder (say, out/dartdoc) marked in .gitignore, pub publish will correctly ignore it, but it still gives a warning saying "Avoid putting generated documentation in out/dartdoc.", which is incorrect IMO. | priority | pub publish file checking warning should respect ignored files issue by originally opened as dart lang sdk when you have a project with generated doc in a folder say out dartdoc marked in gitignore pub publish will correctly ignore it but it still gives a warning saying quot avoid putting generated documentation in out dartdoc quot which is incorrect imo | 1 |
64,244 | 3,206,477,215 | IssuesEvent | 2015-10-05 01:02:36 | danascheider/tessitura-front-end | https://api.github.com/repos/danascheider/tessitura-front-end | opened | MySQL Error: Illegal mix of collations | note priority 4 | When I load the homepage in my development environment (wired to the test API), I get 500 errors from the server that appear in the logs as:
<pre><code>71.238.42.189 - - [05/Oct/2015:00:52:50 +0000] "OPTIONS /organizations HTTP/1.1" 200 - 0.0006
2015-10-05 00:52:50 - Sequel::DatabaseError - Mysql2::Error: Illegal mix of collations (latin1_swedish_ci,IMPLICIT) and (utf8_general_ci,COERCIBLE) for operation '=':
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/mysql2-0.4.1/lib/mysql2/client.rb:85:in `_query'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/mysql2-0.4.1/lib/mysql2/client.rb:85:in `block in query'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/mysql2-0.4.1/lib/mysql2/client.rb:84:in `handle_interrupt'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/mysql2-0.4.1/lib/mysql2/client.rb:84:in `query'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/adapters/mysql2.rb:78:in `block in _execute'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/database/logging.rb:37:in `log_yield'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/adapters/mysql2.rb:78:in `_execute'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/adapters/shared/mysql_prepared_statements.rb:34:in `block in execute'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/database/connecting.rb:249:in `block in synchronize'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/connection_pool/threaded.rb:103:in `hold'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/database/connecting.rb:249:in `synchronize'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/adapters/shared/mysql_prepared_statements.rb:34:in `execute'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/dataset/actions.rb:921:in `execute'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/adapters/mysql2.rb:195:in `execute'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/adapters/mysql2.rb:154:in `fetch_rows'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/dataset/actions.rb:802:in `with_sql_each'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/dataset/actions.rb:812:in `with_sql_first'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/dataset/placeholder_literalizer.rb:148:in `first'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/model/base.rb:876:in `block in def_finder_method'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sequel-4.26.0/lib/sequel/model/base.rb:313:in `find'
/home/dana/tessitura/lib/helpers/authorization_helper.rb:141:in `valid_credentials?'
/home/dana/tessitura/lib/helpers/authorization_helper.rb:34:in `authorized?'
/home/dana/tessitura/lib/helpers/authorization_helper.rb:115:in `protect_communal'
/home/dana/tessitura/lib/routes/filters.rb:31:in `block in communal_auth_filter'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1610:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1610:in `block in compile!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1014:in `[]'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1014:in `block in process_route'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1012:in `catch'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1012:in `process_route'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:965:in `block in filter!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:965:in `each'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:965:in `filter!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1083:in `block in dispatch!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1066:in `block in invoke'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1066:in `catch'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1066:in `invoke'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1081:in `dispatch!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:906:in `block in call!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1066:in `block in invoke'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1066:in `catch'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1066:in `invoke'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:906:in `call!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:894:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/slogger-0.0.11/lib/slogger/request_logger.rb:25:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-cors-0.4.0/lib/rack/cors.rb:80:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-protection-1.5.3/lib/rack/protection/xss_header.rb:18:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-protection-1.5.3/lib/rack/protection/path_traversal.rb:16:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-protection-1.5.3/lib/rack/protection/json_csrf.rb:18:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-protection-1.5.3/lib/rack/protection/base.rb:49:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-protection-1.5.3/lib/rack/protection/base.rb:49:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-protection-1.5.3/lib/rack/protection/frame_options.rb:31:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-1.6.4/lib/rack/logger.rb:15:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-1.6.4/lib/rack/commonlogger.rb:33:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:218:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:211:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-1.6.4/lib/rack/head.rb:13:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/show_exceptions.rb:21:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:181:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:2021:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1486:in `block in call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1795:in `synchronize'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/sinatra-1.4.6/lib/sinatra/base.rb:1486:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/rack-1.6.4/lib/rack/reloader.rb:44:in `call'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/connection.rb:86:in `block in pre_process'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/connection.rb:84:in `catch'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/connection.rb:84:in `pre_process'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/connection.rb:53:in `process'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/connection.rb:39:in `receive_data'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/eventmachine-1.0.8/lib/eventmachine.rb:193:in `run_machine'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/eventmachine-1.0.8/lib/eventmachine.rb:193:in `run'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/backends/base.rb:73:in `start'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/server.rb:162:in `start'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/controllers/controller.rb:87:in `start'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/runner.rb:200:in `run_command'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/lib/thin/runner.rb:156:in `run!'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/gems/thin-1.6.3/bin/thin:6:in `<top (required)>'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/bin/thin:23:in `load'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/bin/thin:23:in `<main>'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/bin/ruby_executable_hooks:15:in `eval'
/home/dana/.rvm/gems/ruby-2.2.3@tessitura/bin/ruby_executable_hooks:15:in `<main>'</code></pre>Notice this is an `OPTIONS` request coming in and getting this response. I remember having had this problem once before, but don't remember how I solved it.
It's worth noting that this request is not being made from the secure area, so it is probably not sending any credentials. My best guess as to what is causing this error is the following chain of events:
1. Request sent with credentials `btoa(':');`
2. Back end authorization helper parses the string `':'`
3. Authorization helper makes request to the database to identify logged-in user
4. Database barfs
I'll try to remember to update this issue with the ultimate solution to the problem. | 1.0 | MySQL Error: Illegal mix of collations - When I load the homepage in my development environment (wired to the test API), I get 500 errors from the server. Notice this is an `OPTIONS` request coming in and getting this response. I remember having had this problem once before, but don't remember how I solved it.
It's worth noting that this request is not being made from the secure area, so it is probably not sending any credentials. My best guess as to what is causing this error is the following chain of events:
1. Request sent with credentials `btoa(':');`
2. Back end authorization helper parses the string `':'`
3. Authorization helper makes request to the database to identify logged-in user
4. Database barfs
I'll try to remember to update this issue with the ultimate solution to the problem. | priority | mysql error illegal mix of collations when i load the homepage in my development environment wired to the test api i get errors from the server that appear in the logs as options organizations http sequel databaseerror error illegal mix of collations swedish ci implicit and general ci coercible for operation home dana rvm gems ruby tessitura gems lib client rb in query home dana rvm gems ruby tessitura gems lib client rb in block in query home dana rvm gems ruby tessitura gems lib client rb in handle interrupt home dana rvm gems ruby tessitura gems lib client rb in query home dana rvm gems ruby tessitura gems sequel lib sequel adapters rb in block in execute home dana rvm gems ruby tessitura gems sequel lib sequel database logging rb in log yield home dana rvm gems ruby tessitura gems sequel lib sequel adapters rb in execute home dana rvm gems ruby tessitura gems sequel lib sequel adapters shared mysql prepared statements rb in block in execute home dana rvm gems ruby tessitura gems sequel lib sequel database connecting rb in block in synchronize home dana rvm gems ruby tessitura gems sequel lib sequel connection pool threaded rb in hold home dana rvm gems ruby tessitura gems sequel lib sequel database connecting rb in synchronize home dana rvm gems ruby tessitura gems sequel lib sequel adapters shared mysql prepared statements rb in execute home dana rvm gems ruby tessitura gems sequel lib sequel dataset actions rb in execute home dana rvm gems ruby tessitura gems sequel lib sequel adapters rb in execute home dana rvm gems ruby tessitura gems sequel lib sequel adapters rb in fetch rows home dana rvm gems ruby tessitura gems sequel lib sequel dataset actions rb in with sql each home dana rvm gems ruby tessitura gems sequel lib sequel dataset actions rb in with sql first home dana rvm gems ruby tessitura gems sequel lib sequel dataset placeholder literalizer rb in first 
home dana rvm gems ruby tessitura gems sequel lib sequel model base rb in block in def finder method home dana rvm gems ruby tessitura gems sequel lib sequel model base rb in find home dana tessitura lib helpers authorization helper rb in valid credentials home dana tessitura lib helpers authorization helper rb in authorized home dana tessitura lib helpers authorization helper rb in protect communal home dana tessitura lib routes filters rb in block in communal auth filter home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in compile home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in process route home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in catch home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in process route home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in filter home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in each home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in filter home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in dispatch home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in invoke home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in catch home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in invoke home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in dispatch home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in invoke home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in catch home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in invoke home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in 
call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems slogger lib slogger request logger rb in call home dana rvm gems ruby tessitura gems rack cors lib rack cors rb in call home dana rvm gems ruby tessitura gems rack protection lib rack protection xss header rb in call home dana rvm gems ruby tessitura gems rack protection lib rack protection path traversal rb in call home dana rvm gems ruby tessitura gems rack protection lib rack protection json csrf rb in call home dana rvm gems ruby tessitura gems rack protection lib rack protection base rb in call home dana rvm gems ruby tessitura gems rack protection lib rack protection base rb in call home dana rvm gems ruby tessitura gems rack protection lib rack protection frame options rb in call home dana rvm gems ruby tessitura gems rack lib rack logger rb in call home dana rvm gems ruby tessitura gems rack lib rack commonlogger rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems rack lib rack head rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra show exceptions rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in block in call home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in synchronize home dana rvm gems ruby tessitura gems sinatra lib sinatra base rb in call home dana rvm gems ruby tessitura gems rack lib rack reloader rb in call home dana rvm gems ruby tessitura gems thin lib thin connection rb in block in pre process home dana rvm gems ruby tessitura gems thin lib thin connection rb in catch home dana rvm gems ruby tessitura gems thin lib thin connection rb in pre process home dana rvm gems 
ruby tessitura gems thin lib thin connection rb in process home dana rvm gems ruby tessitura gems thin lib thin connection rb in receive data home dana rvm gems ruby tessitura gems eventmachine lib eventmachine rb in run machine home dana rvm gems ruby tessitura gems eventmachine lib eventmachine rb in run home dana rvm gems ruby tessitura gems thin lib thin backends base rb in start home dana rvm gems ruby tessitura gems thin lib thin server rb in start home dana rvm gems ruby tessitura gems thin lib thin controllers controller rb in start home dana rvm gems ruby tessitura gems thin lib thin runner rb in run command home dana rvm gems ruby tessitura gems thin lib thin runner rb in run home dana rvm gems ruby tessitura gems thin bin thin in lt top required gt home dana rvm gems ruby tessitura bin thin in load home dana rvm gems ruby tessitura bin thin in lt main gt home dana rvm gems ruby tessitura bin ruby executable hooks in eval home dana rvm gems ruby tessitura bin ruby executable hooks in lt main gt notice this is an options request coming in and getting this response i remember having had this problem once before but don t remember how i solved it it s worth noting that this request is not being made from the secure area so it is probably not sending any credentials my best guess as to what is causing this error is the following chain of events request sent with credentials btoa back end authorization helper parses the string authorization helper makes request to the database to identify logged in user database barfs i ll try to remember to update this issue with the ultimate solution to the problem | 1 |
817,877 | 30,660,845,765 | IssuesEvent | 2023-07-25 14:54:53 | impactMarket/backend | https://api.github.com/repos/impactMarket/backend | closed | [MicroCredit] select loan manager | priority-1: high type: feature | should be possible to get a list of loan managers by country, with impactMarket always as an option.
This is part of the loan manager form.
<img width="705" alt="image" src="https://github.com/impactMarket/backend/assets/19441097/177d79e8-72a8-45c0-9708-88f5fb4786a0">
| 1.0 | [MicroCredit] select loan manager - should be possible to get a list of loan managers by country, with impactMarket always as an option.
This is part of the loan manager form.
<img width="705" alt="image" src="https://github.com/impactMarket/backend/assets/19441097/177d79e8-72a8-45c0-9708-88f5fb4786a0">
| priority | select loan manager should be possible to get a list of loan managers by country with impactmarket always as an option this is part of the loan manager form img width alt image src | 1 |
32,227 | 12,097,560,691 | IssuesEvent | 2020-04-20 08:50:17 | geea-develop/aurelia-material-sample | https://api.github.com/repos/geea-develop/aurelia-material-sample | opened | WS-2017-0120 (High) detected in angular-1.3.0-beta.7.min.js | security vulnerability | ## WS-2017-0120 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>angular-1.3.0-beta.7.min.js</b></p></summary>
<p>AngularJS is an MVC framework for building web applications. The core features include HTML enhanced with custom component and data-binding capabilities, dependency injection and strong focus on simplicity, testability, maintainability and boiler-plate reduction.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.3.0-beta.7/angular.min.js">https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.3.0-beta.7/angular.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/aurelia-material-sample/node_modules/gulp-protractor/example_2/build/index.html</p>
<p>Path to vulnerable library: /aurelia-material-sample/node_modules/gulp-protractor/example_2/build/index.html</p>
<p>
Dependency Hierarchy:
- :x: **angular-1.3.0-beta.7.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/geea-develop/aurelia-material-sample/commit/aa98ef33dbd86011864618694ff529fb8a5e92f0">aa98ef33dbd86011864618694ff529fb8a5e92f0</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
No proper sanitization of xlink:href attribute interpolation, thus vulnerable to Cross-site Scripting (XSS).
<p>Publish Date: 2017-01-20
<p>URL: <a href=https://github.com/angular/angular.js/commit/f33ce173c90736e349cf594df717ae3ee41e0f7a>WS-2017-0120</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/angular/angular.js/commit/f33ce173c90736e349cf594df717ae3ee41e0f7a">https://github.com/angular/angular.js/commit/f33ce173c90736e349cf594df717ae3ee41e0f7a</a></p>
<p>Release Date: 2015-09-18</p>
<p>Fix Resolution: Replace or update the following files: compileSpec.js, compile.js</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | WS-2017-0120 (High) detected in angular-1.3.0-beta.7.min.js - ## WS-2017-0120 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>angular-1.3.0-beta.7.min.js</b></p></summary>
<p>AngularJS is an MVC framework for building web applications. The core features include HTML enhanced with custom component and data-binding capabilities, dependency injection and strong focus on simplicity, testability, maintainability and boiler-plate reduction.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.3.0-beta.7/angular.min.js">https://cdnjs.cloudflare.com/ajax/libs/angular.js/1.3.0-beta.7/angular.min.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/aurelia-material-sample/node_modules/gulp-protractor/example_2/build/index.html</p>
<p>Path to vulnerable library: /aurelia-material-sample/node_modules/gulp-protractor/example_2/build/index.html</p>
<p>
Dependency Hierarchy:
- :x: **angular-1.3.0-beta.7.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/geea-develop/aurelia-material-sample/commit/aa98ef33dbd86011864618694ff529fb8a5e92f0">aa98ef33dbd86011864618694ff529fb8a5e92f0</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
No proper sanitization of xlink:href attribute interpolation, thus vulnerable to Cross-site Scripting (XSS).
<p>Publish Date: 2017-01-20
<p>URL: <a href=https://github.com/angular/angular.js/commit/f33ce173c90736e349cf594df717ae3ee41e0f7a>WS-2017-0120</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/angular/angular.js/commit/f33ce173c90736e349cf594df717ae3ee41e0f7a">https://github.com/angular/angular.js/commit/f33ce173c90736e349cf594df717ae3ee41e0f7a</a></p>
<p>Release Date: 2015-09-18</p>
<p>Fix Resolution: Replace or update the following files: compileSpec.js, compile.js</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | ws high detected in angular beta min js ws high severity vulnerability vulnerable library angular beta min js angularjs is an mvc framework for building web applications the core features include html enhanced with custom component and data binding capabilities dependency injection and strong focus on simplicity testability maintainability and boiler plate reduction library home page a href path to dependency file tmp ws scm aurelia material sample node modules gulp protractor example build index html path to vulnerable library aurelia material sample node modules gulp protractor example build index html dependency hierarchy x angular beta min js vulnerable library found in head commit a href vulnerability details no proper sanitize of xlink href attribute interoplation thus vulnerable to cross site scripting xss publish date url a href cvss score details base score metrics not available suggested fix type change files origin a href release date fix resolution replace or update the following files compilespec js compile js step up your open source security game with whitesource | 0 |
13,710 | 10,022,538,839 | IssuesEvent | 2019-07-16 16:56:40 | OpenLiberty/open-liberty | https://api.github.com/repos/OpenLiberty/open-liberty | closed | DEVEXP-14: Liberty integration with PostgreSQL | Epic FAT complete ID Required NoWAD Approved beta:19500 focalApproved:accessibility focalApproved:fat focalApproved:globalization focalApproved:id focalApproved:performance focalApproved:serviceability focalApproved:ste focalApproved:svt release:19007 team:Zombie Apocalypse | Built-in integration with PostgreSQL.
This includes work such as
- recognizing the driver JARs and data source classes,
- include a properties.postgreSQL configuration element including known property names/values for data sources,
- ensuring that connections and statements are cleaned up properly by the pool
- other general investigation into ensuring the driver works seamlessly with Liberty | 1.0 | DEVEXP-14: Liberty integration with PostgreSQL - Built-in integration with PostgreSQL.
This includes work such as
- recognizing the driver JARs and data source classes,
- include a properties.postgreSQL configuration element including known property names/values for data sources,
- ensuring that connections and statements are cleaned up properly by the pool
- other general investigation into ensuring the driver works seamlessly with Liberty | non_priority | devexp liberty integration with postgresql built in integration with postgresql this includes work such as recognizing the driver jars and data source classes include a properties postgresql configuration element including known property names values for data sources ensuring that connections and statements are cleaned up properly by the pool other general investigation into ensuring the driver works seamlessly with liberty | 0 |
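The driver recognition and `properties.postgreSQL` element described in the epic above would typically surface to users as a data source stanza in `server.xml`. A hypothetical sketch follows; the library id, JAR file name, and property values are illustrative only, and the vendor properties element is shown with the lowercase spelling Liberty uses for such elements:

```xml
<!-- Hypothetical server.xml fragment; ids, paths, and property values are examples -->
<library id="postgresqlLib">
    <file name="${server.config.dir}/postgresql-42.2.x.jar"/>
</library>

<dataSource id="DefaultDataSource" jndiName="jdbc/postgres">
    <jdbcDriver libraryRef="postgresqlLib"/>
    <properties.postgresql serverName="localhost" portNumber="5432"
                           databaseName="testdb" user="dbuser" password="dbpass"/>
</dataSource>
```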
531,008 | 15,439,162,723 | IssuesEvent | 2021-03-07 23:10:04 | danbooru/danbooru | https://api.github.com/repos/danbooru/danbooru | closed | Incorrect sample sizes for old images | Bug Low Priority | The sample image for [post #60830](https://danbooru.donmai.us/posts/60830) is 404x850 when it should be 850x1785. This causes it to be upscaled in the browser and look really bad.
Presumably at some point samples were generated with a max height of 850. Any such samples need to be regenerated. | 1.0 | Incorrect sample sizes for old images - The sample image for [post #60830](https://danbooru.donmai.us/posts/60830) is 404x850 when it should be 850x1785. This causes it to be upscaled in the browser and look really bad.
Presumably at some point samples were generated with a max height of 850. Any such samples need to be regenerated. | priority | incorrect sample sizes for old images the sample image for is when it should be this causes it to be upscaled in the browser and look really bad presumably at some point samples were generated with a max height of any such samples need to be regenerated | 1 |
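The reported numbers are consistent with the sample having been generated under a max-height-850 rule instead of a max-width-850 rule. A minimal sketch of the two policies, using integer arithmetic and a hypothetical 1000x2100 original chosen to match the post's aspect ratio (this is not danbooru's actual implementation):

```python
def sample_size(width, height, max_w=None, max_h=None):
    """Scale dimensions down to fit the given bounds, preserving aspect ratio."""
    w, h = width, height
    if max_w is not None and w > max_w:
        h = h * max_w // w  # integer math avoids float rounding surprises
        w = max_w
    if max_h is not None and h > max_h:
        w = w * max_h // h
        h = max_h
    return w, h

# Current width-bound rule vs. the apparent old height-bound rule:
print(sample_size(1000, 2100, max_w=850))  # (850, 1785): the correct sample size
print(sample_size(1000, 2100, max_h=850))  # (404, 850): the bad sample on the post
```

Regenerating the affected samples under the width-bound rule would turn every 404x850 sample back into 850x1785.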
504,160 | 14,613,960,783 | IssuesEvent | 2020-12-22 09:08:05 | magento/magento2 | https://api.github.com/repos/magento/magento2 | closed | [Issue] Update price index if product is added to website | Component: Catalog Issue: needs update Issue: ready for confirmation Priority: P2 Progress: PR in progress Severity: S2 | This issue is automatically created based on existing pull request: magento/magento2#31067: Update price index if product is added to website
---------
<!---
Thank you for contributing to Magento.
To help us process this pull request we recommend that you add the following information:
- Summary of the pull request,
- Issue(s) related to the changes made,
- Manual testing scenarios
Fields marked with (*) are required. Please don't remove the template.
-->
<!--- Please provide a general summary of the Pull Request in the Title above -->
### Description (*)
In a multi-website scenario, a product created and added to one website works properly and shows on the frontend. However, when the same product is added to another website via an API call, it won't be visible there due to a missing price index. Making it visible then requires either a full price reindex plus a cache clean, or some additional change (e.g. to the price) so that the product is picked up by the price indexer. This PR adds an mview subscription to the catalog_product_website table to inject the product id into catalog_product_price_cl, enabling it to be processed by the incremental indexer.
From the admin interface, adding a product to another website works, as there is a choice of which website to copy the data from (or to use default values), but in the case of an API call, the product <> website relation is just inserted (it does actually update fulltext and catalog_category at the moment).
### Related Pull Requests
<!-- related pull request placeholder -->
### Fixed Issues (if relevant)
<!---
If relevant, please provide a list of fixed issues in the format magento/magento2#<issue_number>.
There could be 1 or more issues linked here and it will help us find some more information about the reasoning behind this change.
-->
1. Fixes magento/magento2#<issue_number>
### Manual testing scenarios (*)
1. Set indexers in scheduled mode
2. Create at least 2 websites with store views
3. Create a product in one of the websites, with sufficient data that it will be shown on frontend.
4. Fire an API call to **rest/V1/products/{sku}/websites**, adding the product created in step 3 to another website
5. Wait for next incremental indexer to process changes
6. Check that the product is not visible in the website it was assigned to, due to missing price in catalog_product_index_price for the corresponding website
### Questions or comments
<!---
If relevant, here you can ask questions or provide comments on your pull request for the reviewer
For example if you need assistance with writing tests or would like some feedback on one of your development ideas
-->
### Contribution checklist (*)
- [x ] Pull request has a meaningful description of its purpose
- [x ] All commits are accompanied by meaningful commit messages
- [ ] All new or changed code is covered with unit/integration tests (if applicable)
- [ ] All automated tests passed successfully (all builds are green)
| 1.0 | [Issue] Update price index if product is added to website - This issue is automatically created based on existing pull request: magento/magento2#31067: Update price index if product is added to website
---------
<!---
Thank you for contributing to Magento.
To help us process this pull request we recommend that you add the following information:
- Summary of the pull request,
- Issue(s) related to the changes made,
- Manual testing scenarios
Fields marked with (*) are required. Please don't remove the template.
-->
<!--- Please provide a general summary of the Pull Request in the Title above -->
### Description (*)
In a multi-website scenario, a product created and added to one website works properly and shows on the frontend. However, when the same product is added to another website via an API call, it won't be visible there due to a missing price index. Making it visible then requires either a full price reindex plus a cache clean, or some additional change (e.g. to the price) so that the product is picked up by the price indexer. This PR adds an mview subscription to the catalog_product_website table to inject the product id into catalog_product_price_cl, enabling it to be processed by the incremental indexer.
From the admin interface, adding a product to another website works, as there is a choice of which website to copy the data from (or to use default values), but in the case of an API call, the product <> website relation is just inserted (it does actually update fulltext and catalog_category at the moment).
### Related Pull Requests
<!-- related pull request placeholder -->
### Fixed Issues (if relevant)
<!---
If relevant, please provide a list of fixed issues in the format magento/magento2#<issue_number>.
There could be 1 or more issues linked here and it will help us find some more information about the reasoning behind this change.
-->
1. Fixes magento/magento2#<issue_number>
### Manual testing scenarios (*)
1. Set indexers in scheduled mode
2. Create at least 2 websites with store views
3. Create a product in one of the websites, with sufficient data that it will be shown on frontend.
4. Fire an API call to **rest/V1/products/{sku}/websites**, adding the product created in step 3 to another website
5. Wait for next incremental indexer to process changes
6. Check that the product is not visible in the website it was assigned to, due to missing price in catalog_product_index_price for the corresponding website
### Questions or comments
<!---
If relevant, here you can ask questions or provide comments on your pull request for the reviewer
For example if you need assistance with writing tests or would like some feedback on one of your development ideas
-->
### Contribution checklist (*)
- [x ] Pull request has a meaningful description of its purpose
- [x ] All commits are accompanied by meaningful commit messages
- [ ] All new or changed code is covered with unit/integration tests (if applicable)
- [ ] All automated tests passed successfully (all builds are green)
| priority | update price index if product is added to website this issue is automatically created based on existing pull request magento update price index if product is added to website thank you for contributing to magento to help us process this pull request we recommend that you add the following information summary of the pull request issue s related to the changes made manual testing scenarios fields marked with are required please don t remove the template description in the scenario of multiple websites if a product is created and added to one website it works properly shows on frontend etc in case the same product is added to another website via an api call then the product won t be visible due to missing price index it will either require to do full price index cache clean to make the product visible or some additional change e g price in order to be processed by price indexer this pr adds an mview subscription to catalog product website table to inject the product id to catalog product price cl and enable it to be processed by incremental indexer from admin interface adding product to another website works as there is a choice of from which website to copy the data from or use default values but in case of api call the product website relation is just inserted it does actually update fulltext and catalog category at the moment related pull requests fixed issues if relevant if relevant please provide a list of fixed issues in the format magento there could be or more issues linked here and it will help us find some more information about the reasoning behind this change fixes magento manual testing scenarios set indexers in scheduled mode create at least websites with store views create a product in one of the websites with sufficient data that it will be shown on frontend fire an api call to rest products sku websites adding the product created in step to another website wait for next incremental indexer to process changes check that the product is not 
visible in the website it was assigned to due to missing price in catalog product index price for the corresponding website questions or comments if relevant here you can ask questions or provide comments on your pull request for the reviewer for example if you need assistance with writing tests or would like some feedback on one of your development ideas contribution checklist pull request has a meaningful description of its purpose all commits are accompanied by meaningful commit messages all new or changed code is covered with unit integration tests if applicable all automated tests passed successfully all builds are green | 1 |
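In Magento's convention, the subscription this PR describes would be declared in an `mview.xml` file so that inserts into `catalog_product_website` land in the price indexer's changelog. An illustrative fragment follows; the view id and indexer class are core Magento's, but the exact file location and attribute set used by the PR may differ:

```xml
<!-- Illustrative mview.xml fragment; the added <table> subscription mirrors the
     change described above -->
<view id="catalog_product_price" class="Magento\Catalog\Model\Indexer\Product\Price" group="indexer">
    <subscriptions>
        <table name="catalog_product_website" entity_column="product_id"/>
    </subscriptions>
</view>
```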
629,792 | 20,052,963,605 | IssuesEvent | 2022-02-03 09:02:20 | TheButterbrotMan/Awesome-Plates | https://api.github.com/repos/TheButterbrotMan/Awesome-Plates | closed | [Feature]: Output when "if" mod is detected | enhancement Low priority | ### Is your feature request related to a problem? Please describe.
_No response_
### Describe the solution you'd like
An output in the log when an "if" mod is detected could be really useful for addressing bugs in the future.
Example:
A compatibility patch for Dirtmond is getting loaded.
### Additional context
_No response_ | 1.0 | [Feature]: Output when "if" mod is detected - ### Is your feature request related to a problem? Please describe.
_No response_
### Describe the solution you'd like
An output in the log when an "if" mod is detected could be really useful for addressing bugs in the future.
Example:
A compatibility patch for Dirtmond is getting loaded.
### Additional context
_No response_ | priority | output when if mod is detected is your feature request related to a problem please describe no response describe the solution you d like an output in the log when an if mod is detected could be really useful for addressing bugs in the future example a compatibility patch for dirtmond is getting loaded additional context no response | 1 |
7,912 | 4,103,016,563 | IssuesEvent | 2016-06-04 11:23:15 | mbunkus/mkvtoolnix | https://api.github.com/repos/mbunkus/mkvtoolnix | closed | Test 'ParseDurationNumberWithUnitSecondUnitsFloats' fails on certain 32bit architectures | app:build system & source type:bug v9.2.0 | For Cygwin 32-bit, the latter three tests in 'ParseDurationNumberWithUnitSecondUnitsFloats' fail. These tests rely on the accuracy of double precision arithmetic, which cannot be guaranteed, especially on 32-bit platforms.
One solution is to increase the accuracy of the function being tested. In parse_duration_number_with_unit(), change the declaration of 'd_value' to a 'long double'. (src/common/strings/parsing.cpp line 228) as in this patch [9.2.0-floating_point_parsing.patch.txt](https://github.com/mbunkus/mkvtoolnix/files/288020/9.2.0-floating_point_parsing.patch.txt). This works for Cygwin 32-bit. However, since there is no guarantee that 'long double' is any more accurate than 'double', you may prefer to alter the tests to allow some tolerance in the values returned.
| 1.0 | Test 'ParseDurationNumberWithUnitSecondUnitsFloats' fails on certain 32bit architectures - For Cygwin 32-bit, the latter three tests in 'ParseDurationNumberWithUnitSecondUnitsFloats' fail. These tests rely on the accuracy of double precision arithmetic, which cannot be guaranteed, especially on 32-bit platforms.
One solution is to increase the accuracy of the function being tested. In parse_duration_number_with_unit(), change the declaration of 'd_value' to a 'long double'. (src/common/strings/parsing.cpp line 228) as in this patch [9.2.0-floating_point_parsing.patch.txt](https://github.com/mbunkus/mkvtoolnix/files/288020/9.2.0-floating_point_parsing.patch.txt). This works for Cygwin 32-bit. However, since there is no guarantee that 'long double' is any more accurate than 'double', you may prefer to alter the tests to allow some tolerance in the values returned.
| non_priority | test parsedurationnumberwithunitsecondunitsfloats fails on certain architectures for cygwin bit the latter three tests in parsedurationnumberwithunitsecondunitsfloats fail these tests rely on the accuracy of double precision arithmetic which cannot be guaranteed especially on bit platforms one solution is to increase the accuracy of the function being tested in parse duration number with unit change the declaration of d value to a long double src common strings parsing cpp line as in this patch this works for cygwin bit however since there is no guarantee that long double is any more accurate than double you may prefer to alter the tests to allow some tolerance in the values returned | 0 |
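Both remedies mentioned in the report above, wider intermediate precision and tolerance-based test comparisons, can be illustrated with a small sketch. This is Python standing in for the C++ code, so the parsing functions and their `<number>s` input format are simplifications, not mkvtoolnix's actual API:

```python
from decimal import Decimal

def parse_seconds_ns_float(text):
    """Parse '<number>s' to nanoseconds via binary floating point (like a C++ double):
    the result can be off by a nanosecond for some decimal inputs."""
    return int(float(text.rstrip("s")) * 1_000_000_000)

def parse_seconds_ns_decimal(text):
    """Parse '<number>s' to nanoseconds exactly; decimal arithmetic has no
    representability error for decimal inputs, so no precision gamble."""
    return int(Decimal(text.rstrip("s")) * 1_000_000_000)

def close_enough(a, b, tol_ns=1):
    """Tolerance comparison, the alternative fix suggested for the unit tests."""
    return abs(a - b) <= tol_ns
```

The `long double` patch widens the margin but, as noted above, guarantees nothing; exact decimal arithmetic or a one-nanosecond tolerance removes the platform dependence entirely.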
408,547 | 11,948,733,233 | IssuesEvent | 2020-04-03 12:26:25 | googleapis/elixir-google-api | https://api.github.com/repos/googleapis/elixir-google-api | opened | Synthesis failed for Mirror | autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate Mirror. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to a new branch 'autosynth-mirror'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/mirror/synth.metadata', 'synth.py', '--']
2020-04-03 05:19:13,540 synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
2020-04-03 05:19:13,548 synthtool > Cloning https://github.com/googleapis/elixir-google-api.git.
2020-04-03 05:19:15,019 synthtool > Running: docker run --rm -v/home/kbuilder/.cache/synthtool/elixir-google-api:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh Mirror
2020-04-03 05:19:18,977 synthtool > Failed executing docker run --rm -v/home/kbuilder/.cache/synthtool/elixir-google-api:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh Mirror:
/workspace /workspace
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
Resolving Hex dependencies...
Dependency resolution completed:
Unchanged:
certifi 2.5.1
google_api_discovery 0.7.0
google_gax 0.3.2
hackney 1.15.2
idna 6.0.0
jason 1.1.2
metrics 1.0.1
mime 1.3.1
mimerl 1.2.0
oauth2 0.9.4
parse_trans 3.3.0
poison 3.1.0
ssl_verify_fun 1.1.5
temp 0.4.7
tesla 1.3.2
unicode_util_compat 0.4.1
[33mA new Hex version is available (0.20.1 < 0.20.5), please update with `mix local.hex`[0m
All dependencies are up to date
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
12:19:18.401 [info] FETCHING: https://www.googleapis.com/discovery/v1/apis/mirror/v1/rest
{:error, "Error received status: 404 from discovery endpoint"}
** (File.Error) could not read file "/workspace/specifications/gdd/Mirror-v1.json": no such file or directory
(elixir) lib/file.ex:353: File.read!/1
lib/google_apis/generator/elixir_generator/token.ex:79: GoogleApis.Generator.ElixirGenerator.Token.build/1
lib/google_apis/generator/elixir_generator.ex:40: GoogleApis.Generator.ElixirGenerator.generate_client/1
lib/mix/tasks/google_apis.generate.ex:33: anonymous fn/1 in Mix.Tasks.GoogleApis.Generate.builder/1
(elixir) lib/enum.ex:783: Enum."-each/2-lists^foreach/1-0-"/2
(elixir) lib/enum.ex:783: Enum.each/2
(mix) lib/mix/task.ex:331: Mix.Task.run_task/3
(mix) lib/mix/cli.ex:79: Mix.CLI.run_task/2
fixing file permissions
2020-04-03 05:19:18,987 synthtool > Wrote metadata to clients/mirror/synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 54, in <module>
shell.run(command, cwd=repository)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker', 'run', '--rm', '-v/home/kbuilder/.cache/synthtool/elixir-google-api:/workspace', '-v/var/run/docker.sock:/var/run/docker.sock', '-e', 'USER_GROUP=1000:1000', '-w', '/workspace', 'gcr.io/cloud-devrel-public-resources/elixir19', 'scripts/generate_client.sh', 'Mirror']' returned non-zero exit status 1.
Synthesis failed
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 484, in <module>
main()
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 334, in main
return _inner_main(temp_dir)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 399, in _inner_main
deprecated_execution=args.deprecated_execution,
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 278, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/mirror/synth.metadata', 'synth.py', '--', 'Mirror']' returned non-zero exit status 1.
```
Google internal developers can see the full log [here](https://sponge/a5db0fe7-f41e-4dc1-8b3c-af009c297bb5).
| 1.0 | Synthesis failed for Mirror - Hello! Autosynth couldn't regenerate Mirror. :broken_heart:
Here's the output from running `synth.py`:
```
Cloning into 'working_repo'...
Switched to a new branch 'autosynth-mirror'
Running synthtool
['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/mirror/synth.metadata', 'synth.py', '--']
2020-04-03 05:19:13,540 synthtool > Executing /tmpfs/src/git/autosynth/working_repo/synth.py.
2020-04-03 05:19:13,548 synthtool > Cloning https://github.com/googleapis/elixir-google-api.git.
2020-04-03 05:19:15,019 synthtool > Running: docker run --rm -v/home/kbuilder/.cache/synthtool/elixir-google-api:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh Mirror
2020-04-03 05:19:18,977 synthtool > Failed executing docker run --rm -v/home/kbuilder/.cache/synthtool/elixir-google-api:/workspace -v/var/run/docker.sock:/var/run/docker.sock -e USER_GROUP=1000:1000 -w /workspace gcr.io/cloud-devrel-public-resources/elixir19 scripts/generate_client.sh Mirror:
/workspace /workspace
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
Resolving Hex dependencies...
Dependency resolution completed:
Unchanged:
certifi 2.5.1
google_api_discovery 0.7.0
google_gax 0.3.2
hackney 1.15.2
idna 6.0.0
jason 1.1.2
metrics 1.0.1
mime 1.3.1
mimerl 1.2.0
oauth2 0.9.4
parse_trans 3.3.0
poison 3.1.0
ssl_verify_fun 1.1.5
temp 0.4.7
tesla 1.3.2
unicode_util_compat 0.4.1
[33mA new Hex version is available (0.20.1 < 0.20.5), please update with `mix local.hex`[0m
All dependencies are up to date
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
12:19:18.401 [info] FETCHING: https://www.googleapis.com/discovery/v1/apis/mirror/v1/rest
{:error, "Error received status: 404 from discovery endpoint"}
** (File.Error) could not read file "/workspace/specifications/gdd/Mirror-v1.json": no such file or directory
(elixir) lib/file.ex:353: File.read!/1
lib/google_apis/generator/elixir_generator/token.ex:79: GoogleApis.Generator.ElixirGenerator.Token.build/1
lib/google_apis/generator/elixir_generator.ex:40: GoogleApis.Generator.ElixirGenerator.generate_client/1
lib/mix/tasks/google_apis.generate.ex:33: anonymous fn/1 in Mix.Tasks.GoogleApis.Generate.builder/1
(elixir) lib/enum.ex:783: Enum."-each/2-lists^foreach/1-0-"/2
(elixir) lib/enum.ex:783: Enum.each/2
(mix) lib/mix/task.ex:331: Mix.Task.run_task/3
(mix) lib/mix/cli.ex:79: Mix.CLI.run_task/2
fixing file permissions
2020-04-03 05:19:18,987 synthtool > Wrote metadata to clients/mirror/synth.metadata.
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 102, in <module>
main()
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 829, in __call__
return self.main(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 782, in main
rv = self.invoke(ctx)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 1066, in invoke
return ctx.invoke(self.callback, **ctx.params)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/click/core.py", line 610, in invoke
return callback(*args, **kwargs)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/__main__.py", line 94, in main
spec.loader.exec_module(synth_module) # type: ignore
File "<frozen importlib._bootstrap_external>", line 678, in exec_module
File "<frozen importlib._bootstrap>", line 219, in _call_with_frames_removed
File "/tmpfs/src/git/autosynth/working_repo/synth.py", line 54, in <module>
shell.run(command, cwd=repository)
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 39, in run
raise exc
File "/tmpfs/src/git/autosynth/env/lib/python3.6/site-packages/synthtool/shell.py", line 33, in run
encoding="utf-8",
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 438, in run
output=stdout, stderr=stderr)
subprocess.CalledProcessError: Command '['docker', 'run', '--rm', '-v/home/kbuilder/.cache/synthtool/elixir-google-api:/workspace', '-v/var/run/docker.sock:/var/run/docker.sock', '-e', 'USER_GROUP=1000:1000', '-w', '/workspace', 'gcr.io/cloud-devrel-public-resources/elixir19', 'scripts/generate_client.sh', 'Mirror']' returned non-zero exit status 1.
Synthesis failed
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 484, in <module>
main()
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 334, in main
return _inner_main(temp_dir)
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 399, in _inner_main
deprecated_execution=args.deprecated_execution,
File "/tmpfs/src/git/autosynth/autosynth/synth.py", line 278, in synthesize
synth_proc.check_returncode() # Raise an exception.
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 389, in check_returncode
self.stderr)
subprocess.CalledProcessError: Command '['/tmpfs/src/git/autosynth/env/bin/python3', '-m', 'synthtool', '--metadata', 'clients/mirror/synth.metadata', 'synth.py', '--', 'Mirror']' returned non-zero exit status 1.
```
Google internal developers can see the full log [here](https://sponge/a5db0fe7-f41e-4dc1-8b3c-af009c297bb5).
| priority | synthesis failed for mirror hello autosynth couldn t regenerate mirror broken heart here s the output from running synth py cloning into working repo switched to a new branch autosynth mirror running synthtool synthtool executing tmpfs src git autosynth working repo synth py synthtool cloning synthtool running docker run rm v home kbuilder cache synthtool elixir google api workspace v var run docker sock var run docker sock e user group w workspace gcr io cloud devrel public resources scripts generate client sh mirror synthtool failed executing docker run rm v home kbuilder cache synthtool elixir google api workspace v var run docker sock var run docker sock e user group w workspace gcr io cloud devrel public resources scripts generate client sh mirror workspace workspace mix lock file was generated with a newer version of hex update your client by running mix local hex to avoid losing data resolving hex dependencies dependency resolution completed unchanged certifi google api discovery google gax hackney idna jason metrics mime mimerl parse trans poison ssl verify fun temp tesla unicode util compat new hex version is available please update with mix local hex all dependencies are up to date mix lock file was generated with a newer version of hex update your client by running mix local hex to avoid losing data fetching error error received status from discovery endpoint file error could not read file workspace specifications gdd mirror json no such file or directory elixir lib file ex file read lib google apis generator elixir generator token ex googleapis generator elixirgenerator token build lib google apis generator elixir generator ex googleapis generator elixirgenerator generate client lib mix tasks google apis generate ex anonymous fn in mix tasks googleapis generate builder elixir lib enum ex enum each lists foreach elixir lib enum ex enum each mix lib mix task ex mix task run task mix lib mix cli ex mix cli run task fixing file permissions 
synthtool wrote metadata to clients mirror synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth env lib site packages synthtool main py line in main file tmpfs src git autosynth env lib site packages click core py line in call return self main args kwargs file tmpfs src git autosynth env lib site packages click core py line in main rv self invoke ctx file tmpfs src git autosynth env lib site packages click core py line in invoke return ctx invoke self callback ctx params file tmpfs src git autosynth env lib site packages click core py line in invoke return callback args kwargs file tmpfs src git autosynth env lib site packages synthtool main py line in main spec loader exec module synth module type ignore file line in exec module file line in call with frames removed file tmpfs src git autosynth working repo synth py line in shell run command cwd repository file tmpfs src git autosynth env lib site packages synthtool shell py line in run raise exc file tmpfs src git autosynth env lib site packages synthtool shell py line in run encoding utf file home kbuilder pyenv versions lib subprocess py line in run output stdout stderr stderr subprocess calledprocesserror command returned non zero exit status synthesis failed traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src git autosynth autosynth synth py line in main file tmpfs src git autosynth autosynth synth py line in main return inner main temp dir file tmpfs src git autosynth autosynth synth py line in inner main deprecated execution args deprecated execution file tmpfs src git autosynth autosynth synth py line in synthesize synth proc check returncode 
raise an exception file home kbuilder pyenv versions lib subprocess py line in check returncode self stderr subprocess calledprocesserror command returned non zero exit status google internal developers can see the full log | 1 |
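The log above shows the root-cause chain: the Mirror v1 discovery document now returns 404 (the API was shut down), and the generator then crashes in `File.read!` on a spec file that was never written. A defensive pattern, checking the fetch status and falling back to a cached spec before failing with a clear message, could look like this sketch; the function and argument names are illustrative, not the Elixir generator's real API:

```python
import os

def load_spec(cache_path, fetch_status, fetched_body=None):
    """Prefer a freshly fetched discovery doc; fall back to a cached copy;
    otherwise raise a descriptive error instead of crashing on a missing file."""
    if fetch_status == 200 and fetched_body is not None:
        return fetched_body
    if os.path.exists(cache_path):
        with open(cache_path) as f:
            return f.read()
    raise FileNotFoundError(
        f"discovery endpoint returned {fetch_status} and no cached spec at {cache_path}"
    )
```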
33,944 | 14,238,536,107 | IssuesEvent | 2020-11-18 18:46:34 | cityofaustin/atd-data-tech | https://api.github.com/repos/cityofaustin/atd-data-tech | closed | Interim Projects Database Viewer on AGOL support | Service: Geo Type: Map Request Workgroup: ATSD | Nathan and Dylan have asked for assistance setting up a model that would create line geometry that has been buffered around a point in order to represent all projects as line geometry and keep them in the same table. | 1.0 | Interim Projects Database Viewer on AGOL support - Nathan and Dylan have asked for assistance setting up a model that would create line geometry that has been buffered around a point in order to represent all projects as line geometry and keep them in the same table. | non_priority | interim projects database viewer on agol support nathan and dylan have asked for assistance setting up a model that would create line geometry that has been buffered around a point in order to represent all projects as line geometry and keep them in the same table | 0 |
641,276 | 20,823,089,851 | IssuesEvent | 2022-03-18 17:23:35 | apcountryman/picolibrary-microchip-megaavr0 | https://api.github.com/repos/apcountryman/picolibrary-microchip-megaavr0 | closed | Add Microchip megaAVR 0-series SPI SPI clock rate | priority-normal status-awaiting_review type-feature | Add Microchip megaAVR 0-series SPI SPI clock rate (`::picolibrary::Microchip::megaAVR::SPI::SPI_Clock_Rate`).
- [x] The `SPI_Clock_Rate` enum class should be defined in the `include/picolibrary/microchip/megaavr0/spi.h`/`source/picolibrary/microchip/megaavr0/spi.cc` header/source file pair
- [x] The `SPI_Clock_Rate` enum class should have an underlying type of `std::uint8_t`
- [x] The `SPI_Clock_Rate` enum class should have the following enumerators:
- [x] `CLK_PER_2 = ( 0b1 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV4,`: Peripheral clock frequency / 2.
- [x] `CLK_PER_4 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV4,`: Peripheral clock frequency / 4.
- [x] `CLK_PER_8 = ( 0b1 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV16,`: Peripheral clock frequency / 8.
- [x] `CLK_PER_16 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV16,`: Peripheral clock frequency / 16.
- [x] `CLK_PER_32 = ( 0b1 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV64,`: Peripheral clock frequency / 32.
- [x] `CLK_PER_64 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV64,`: Peripheral clock frequency / 64.
- [x] `CLK_PER_128 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV128,`: Peripheral clock frequency / 128. | 1.0 | Add Microchip megaAVR 0-series SPI SPI clock rate - Add Microchip megaAVR 0-series SPI SPI clock rate (`::picolibrary::Microchip::megaAVR::SPI::SPI_Clock_Rate`).
- [x] The `SPI_Clock_Rate` enum class should be defined in the `include/picolibrary/microchip/megaavr0/spi.h`/`source/picolibrary/microchip/megaavr0/spi.cc` header/source file pair
- [x] The `SPI_Clock_Rate` enum class should have an underlying type of `std::uint8_t`
- [x] The `SPI_Clock_Rate` enum class should have the following enumerators:
- [x] `CLK_PER_2 = ( 0b1 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV4,`: Peripheral clock frequency / 2.
- [x] `CLK_PER_4 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV4,`: Peripheral clock frequency / 4.
- [x] `CLK_PER_8 = ( 0b1 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV16,`: Peripheral clock frequency / 8.
- [x] `CLK_PER_16 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV16,`: Peripheral clock frequency / 16.
- [x] `CLK_PER_32 = ( 0b1 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV64,`: Peripheral clock frequency / 32.
- [x] `CLK_PER_64 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV64,`: Peripheral clock frequency / 64.
- [x] `CLK_PER_128 = ( 0b0 << Peripheral::SPI::CTRLA::Bit::CLK2X ) | Peripheral::SPI::CTRLA::PRESC_DIV128,`: Peripheral clock frequency / 128. | priority | add microchip megaavr series spi spi clock rate add microchip megaavr series spi spi clock rate picolibrary microchip megaavr spi spi clock rate the spi clock rate enum class should be defined in the include picolibrary microchip spi h source picolibrary microchip spi cc header source file pair the spi clock rate enum class should have an underlying type of std t the spi clock rate enum class should have the following enumberators clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency clk per peripheral spi ctrla bit peripheral spi ctrla presc peripheral clock frequency | 1 |
492,222 | 14,194,394,868 | IssuesEvent | 2020-11-15 03:28:34 | Cuuhomientrung/cuuhomientrung | https://api.github.com/repos/Cuuhomientrung/cuuhomientrung | closed | [Production] [Trang thêm mới hộ dân] mục bản đồ, đề xuất ẩn viewbox hiển thị tọa độ và thêm button [Tìm vịtrí trên bản đồ] theo thông tin tỉnh, huyện, xã đã được điền trước đó. | enhancement low priority | Trang thêm mới hộdân, link: https://cuuhomientrung.info/admin/app/hodan/add/
Cái view box như trong khung đỏ theo em được biết là ô hiển thịtọa độcủa điểm được đánh trên bản đồ. Em đang không biết về ý nghĩ của nó lắm đối với người dùng.
Đềxuất:
1) Trong trường hợp người dùng không cần làm gì với nó thì mình ẩn nó đi
2) Thêm một button[Tìm vịtrí trên bản đồ]cho phép dùng thông tin xã, huyện, tỉnh đã được chọn ởcác mụcphía trên để tìm vịtrí trên bản đồ. Cụ thể là sau khi người dùng chọn thông tin ở các mục xã, tỉnh, huyện và click button[Tìm vị trí trên bản đồ] thì location icon sẽ được đặt ở vị trí cần tìm kiếm trên bản đồ.
Ý nghĩa: giúp tình nguyện viên dễ dàng xác định được vị trí của khu vực trên bản đồ.

| 1.0 | [Production] [Trang thêm mới hộ dân] mục bản đồ, đề xuất ẩn viewbox hiển thị tọa độ và thêm button [Tìm vịtrí trên bản đồ] theo thông tin tỉnh, huyện, xã đã được điền trước đó. - Trang thêm mới hộdân, link: https://cuuhomientrung.info/admin/app/hodan/add/
Cái view box như trong khung đỏ theo em được biết là ô hiển thịtọa độcủa điểm được đánh trên bản đồ. Em đang không biết về ý nghĩ của nó lắm đối với người dùng.
Đềxuất:
1) Trong trường hợp người dùng không cần làm gì với nó thì mình ẩn nó đi
2) Thêm một button[Tìm vịtrí trên bản đồ]cho phép dùng thông tin xã, huyện, tỉnh đã được chọn ởcác mụcphía trên để tìm vịtrí trên bản đồ. Cụ thể là sau khi người dùng chọn thông tin ở các mục xã, tỉnh, huyện và click button[Tìm vị trí trên bản đồ] thì location icon sẽ được đặt ở vị trí cần tìm kiếm trên bản đồ.
Ý nghĩa: giúp tình nguyện viên dễ dàng xác định được vị trí của khu vực trên bản đồ.

| priority | mục bản đồ đề xuất ẩn viewbox hiển thị tọa độ và thêm button theo thông tin tỉnh huyện xã đã được điền trước đó trang thêm mới hộdân link cái view box như trong khung đỏ theo em được biết là ô hiển thịtọa độcủa điểm được đánh trên bản đồ em đang không biết về ý nghĩ của nó lắm đối với người dùng đềxuất trong trường hợp người dùng không cần làm gì với nó thì mình ẩn nó đi thêm một button cho phép dùng thông tin xã huyện tỉnh đã được chọn ởcác mụcphía trên để tìm vịtrí trên bản đồ cụ thể là sau khi người dùng chọn thông tin ở các mục xã tỉnh huyện và click button thì location icon sẽ được đặt ở vị trí cần tìm kiếm trên bản đồ ý nghĩa giúp tình nguyện viên dễ dàng xác định được vị trí của khu vực trên bản đồ | 1 |
250,762 | 18,906,406,574 | IssuesEvent | 2021-11-16 09:33:17 | ita-social-projects/dokazovi-requirements | https://api.github.com/repos/ita-social-projects/dokazovi-requirements | opened | [Test for Story #282 ] | documentation test case | **https://github.com/ita-social-projects/dokazovi-be/issues/282**
### Status:
Not executed
### Title:
Verify that Admin can preview Carousel with cards of 'Existing Materials' section on the Settings Important Section page
### Description:
Verify that Admin can preview Carousel with cards of 'Existing Materials' section on the Settings Important Section page
### Pre-conditions:
User with administrator role is authorised in the system.
User is on the 'Налаштування - Головна' page.
Existing Material Section has at least two cards.
Step № | Test Steps | Test data | Expected result | Status (Pass/Fail/Not executed) | Notes
------------ | ------------ | ------------ | ------------ | ------------ | ------------
1 | Click on the [Важливе] button on the 'Головна' menu| text | Existing materials are displayed with number of cards | Not executed | text
2 | Click [Переглянути] button | text | Module window is opened with Carousel of cards from Existing Materials Section in order according to number of cards and [Закрити] button is active | Not executed | text
3 | Click [Закрити] button | text | Module window is closed | Not executed | text
### [Gantt Chart](https://docs.google.com/spreadsheets/d/1bgaEJDOf3OhfNRfP-WWPKmmZFW5C3blOUxamE3wSCbM/edit#gid=775577959)
| 1.0 | [Test for Story #282 ] - **https://github.com/ita-social-projects/dokazovi-be/issues/282**
### Status:
Not executed
### Title:
Verify that Admin can preview Carousel with cards of 'Existing Materials' section on the Settings Important Section page
### Description:
Verify that Admin can preview Carousel with cards of 'Existing Materials' section on the Settings Important Section page
### Pre-conditions:
User with administrator role is authorised in the system.
User is on the 'Налаштування - Головна' page.
Existing Material Section has at least two cards.
Step № | Test Steps | Test data | Expected result | Status (Pass/Fail/Not executed) | Notes
------------ | ------------ | ------------ | ------------ | ------------ | ------------
1 | Click on the [Важливе] button on the 'Головна' menu| text | Existing materials are displayed with number of cards | Not executed | text
2 | Click [Переглянути] button | text | Module window is opened with Carousel of cards from Existing Materials Section in order according to number of cards and [Закрити] button is active | Not executed | text
3 | Click [Закрити] button | text | Module window is closed | Not executed | text
### [Gantt Chart](https://docs.google.com/spreadsheets/d/1bgaEJDOf3OhfNRfP-WWPKmmZFW5C3blOUxamE3wSCbM/edit#gid=775577959)
| non_priority | status not executed title verify that admin can preview carousel with cards of existing materials section on the settings important section page description verify that admin can preview carousel with cards of existing materials section on the settings important section page pre conditions user with administrator role is authorised in the system user is on the налаштування головна page existing material section has at least two cards step № test steps test data expected result status pass fail not executed notes click on the button on the головна menu text existing materials are displayed with number of cards not executed text click button text module window is opened with carousel of cards from existing materials section in order according to number of cards and button is active not executed text click button text module window is closed not executed text | 0 |
296,976 | 9,159,044,586 | IssuesEvent | 2019-03-01 00:44:17 | kubeflow/kubeflow | https://api.github.com/repos/kubeflow/kubeflow | opened | Ambassador routes for new jupyter notebooks don't work | area/jupyter priority/p1 | Here's the service definition
```
apiVersion: v1
kind: Service
metadata:
annotations:
getambassador.io/config: |-
---
apiVersion: ambassador/v0
kind: Mapping
name: notebook_kf-jlewi_jlewi-1_mapping
prefix: /notebook/kf-jlewi/jlewi-1
rewrite: /kf-jlewi/jlewi-1
timeout_ms: 300000
service: jlewi-1.kf-jlewi:8888
use_websocket: true
```
When I navigate to:
```
https://jlewi-0228.endpoints.cloud-ml-dev.cloud.goog/notebook/kf-jlewi/jlewi-1/
```
I get upstream connect error.
Is it supposed to be rewriting it to `/kf-jlewi/jlewi-1`
/cc @kimwnasptd @lluunn
| 1.0 | Ambassador routes for new jupyter notebooks don't work - Here's the service definition
```
apiVersion: v1
kind: Service
metadata:
annotations:
getambassador.io/config: |-
---
apiVersion: ambassador/v0
kind: Mapping
name: notebook_kf-jlewi_jlewi-1_mapping
prefix: /notebook/kf-jlewi/jlewi-1
rewrite: /kf-jlewi/jlewi-1
timeout_ms: 300000
service: jlewi-1.kf-jlewi:8888
use_websocket: true
```
When I navigate to:
```
https://jlewi-0228.endpoints.cloud-ml-dev.cloud.goog/notebook/kf-jlewi/jlewi-1/
```
I get upstream connect error.
Is it supposed to be rewriting it to `/kf-jlewi/jlewi-1`
/cc @kimwnasptd @lluunn
| priority | ambassador routes for new jupyter notebooks don t work here s the service definition apiversion kind service metadata annotations getambassador io config apiversion ambassador kind mapping name notebook kf jlewi jlewi mapping prefix notebook kf jlewi jlewi rewrite kf jlewi jlewi timeout ms service jlewi kf jlewi use websocket true when i navigate to i get upstream connect error is it supposed to be rewriting it to kf jlewi jlewi cc kimwnasptd lluunn | 1 |
567,727 | 16,890,841,996 | IssuesEvent | 2021-06-23 09:03:38 | canonical-web-and-design/ubuntu.com | https://api.github.com/repos/canonical-web-and-design/ubuntu.com | closed | [UA-Shop] 3Ds payment failures | Commercial 🛒 Priority: High |
## Summary
A problem may occur when making payment for a subscription on the shop if the 3Ds challenge is not completed on time.
## Process
1. navigate to the /advantage staging test environment

2. select items to purchase i.e Desktop UA
3. on the payment details use a card that requires 3Ds payment i.e `4000000000003220`. fill in all the other details on the page as required

4. Click on the page to make payment. When the 3Ds prompt comes up, do not click on any of the options to pass or fail the authentication.

5. after about 1 min, the page will redirect to /advantage/subscribe page.

## Current and expected result
the page redirects to the /advantage/subscribe page, denoting a successful payment. however the customer has not authorized the payment and the purchase will not be processed.
| 1.0 | [UA-Shop] 3Ds payment failures -
## Summary
A problem may occur when making payment for a subscription on the shop if the 3Ds challenge is not completed on time.
## Process
1. navigate to the /advantage staging test environment

2. select items to purchase i.e Desktop UA
3. on the payment details use a card that requires 3Ds payment i.e `4000000000003220`. fill in all the other details on the page as required

4. Click on the page to make payment. When the 3Ds prompt comes up, do not click on any of the options to pass or fail the authentication.

5. after about 1 min, the page will redirect to /advantage/subscribe page.

## Current and expected result
the page redirects to the /advantage/subscribe page, denoting a successful payment. however the customer has not authorized the payment and the purchase will not be processed.
| priority | payment failures summary a problem may occur when making payment for a subscription on the shop if the challenge is not completed on time process navigate to the advantage staging test environment select items to purchase i e desktop ua on the payment details use a card that requires payment i e fill in all the other details on the page as required click on the page to make payment when the prompt comes up do not click on any of the options to pass or fail the authentication after about min the page will redirect to advantage subscribe page current and expected result the page redirects to the advantage subscribe page denoting a successful payment however the customer has not authorized the payment and the purchase will not be processed | 1 |
206,491 | 7,112,809,954 | IssuesEvent | 2018-01-17 18:15:35 | branjos/Clan_Bot-Wiki | https://api.github.com/repos/branjos/Clan_Bot-Wiki | closed | Create rank role additions/removals based on player rank | Low Priority enhancement | Clans use roles as a way to administer their permissions but also as a way to show status. Validated users should have their roles automatically updated based on their in-game rank.
This should be an optional feature that can be turned on or off based on the clan configuration. | 1.0 | Create rank role additions/removals based on player rank - Clans use roles as a way to administer their permissions but also as a way to show status. Validated users should have their roles automatically updated based on their in-game rank.
This should be an optional feature that can be turned on or off based on the clan configuration. | priority | create rank role additions removals based on player rank clans use roles as a way to administer their permissions but also as a way to show status validated users should have their roles automatically updated based on their in game rank this should be an optional feature that can be turned on or off based on the clan configuration | 1 |
329,726 | 10,023,788,814 | IssuesEvent | 2019-07-16 20:07:35 | googleapis/google-cloud-java | https://api.github.com/repos/googleapis/google-cloud-java | opened | Logging: Cannot specify resources when fetching log entries | api: logging priority: p2 type: bug | In LoggingImpl, the ListLogEntriesRequest is built and only specifies the projectId as the resource names: https://github.com/googleapis/google-cloud-java/blob/64189021c41c0f96b8a96beefb737cbf76be22c8/google-cloud-clients/google-cloud-logging/src/main/java/com/google/cloud/logging/LoggingImpl.java#L632-L653
The end-user should be able to somehow set the resource names fields.
[RPC Specification](https://cloud.google.com/logging/docs/reference/v2/rpc/google.logging.v2#listlogentriesrequest)
Note: this will require some substantial public interface changes. | 1.0 | Logging: Cannot specify resources when fetching log entries - In LoggingImpl, the ListLogEntriesRequest is built and only specifies the projectId as the resource names: https://github.com/googleapis/google-cloud-java/blob/64189021c41c0f96b8a96beefb737cbf76be22c8/google-cloud-clients/google-cloud-logging/src/main/java/com/google/cloud/logging/LoggingImpl.java#L632-L653
The end-user should be able to somehow set the resource names fields.
[RPC Specification](https://cloud.google.com/logging/docs/reference/v2/rpc/google.logging.v2#listlogentriesrequest)
Note: this will require some substantial public interface changes. | priority | logging cannot specify resources when fetching log entries in loggingimpl the listlogentriesrequest is built and only specifies the projectid as the resource names the end user should be able to somehow set the resource names fields note this will require some substantial public interface changes | 1 |
349,713 | 31,823,795,631 | IssuesEvent | 2023-09-14 05:50:51 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | sql/tests: TestRandomSyntaxSQLSmith failed | C-test-failure O-robot branch-master release-blocker T-sql-foundations | sql/tests.TestRandomSyntaxSQLSmith [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11770371?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11770371?buildTab=artifacts#/) on master @ [0cde11b885a5e156b970c003505acd081fb4f326](https://github.com/cockroachdb/cockroach/commits/0cde11b885a5e156b970c003505acd081fb4f326):
```
=== RUN TestRandomSyntaxSQLSmith
test logs left over in: /artifacts/tmp/_tmp/d437d2c847dfedbc4972f231c3331c8e/logTestRandomSyntaxSchemaChangeColumn3824830113
test_log_scope.go:167: test logs captured to: /artifacts/tmp/_tmp/d437d2c847dfedbc4972f231c3331c8e/logTestRandomSyntaxSQLSmith3855995623
test_log_scope.go:81: use -show-logs to present logs inline
test_server_shim.go:124: automatically injected virtual cluster under test; see comment at top of test_server_shim.go for details.
rsg_test.go:662: SET sql_safe_updates = false;;
rsg_test.go:662: SET CLUSTER SETTING sql.stats.automatic_collection.enabled = false;;
rsg_test.go:662: SET CLUSTER SETTING sql.stats.histogram_collection.enabled = false;;
rsg_test.go:662: CREATE TABLE table1 (col1_0 TSQUERY NULL, col1_1 BIT(49) NOT NULL, col😫1_2 BOX2D, """c%pol1_3" STRING COLLATE en_US NOT NULL, " col1_4" INT4, col1_5 STRING COLLATE en_US, col1_6 BIT(26) NOT NULL, col1_7 INT8 AS (" col1_4" + 1127741478:::INT8) STORED, "\\U00010575col1_8" STRING NULL AS (lower(CAST(col1_0 AS STRING))) STORED, "!col1_9" STRING AS (lower(CAST(col1_5 AS STRING))) STORED, co😙l1_10 INT8 AS (" col1_4" + NULL) VIRTUAL, "col1_\\x7311" STRING NOT NULL AS (lower(CAST(col1_1 AS STRING))) STORED, col1_12 STRING AS (lower(CAST(col1_5 AS STRING))) VIRTUAL, UNIQUE ("""c%pol1_3" ASC, " col1_4", col1_1 ASC, col1_7) STORING (col1_0, col1_6, "!col1_9"), INDEX (col1_1 ASC, col1_5 DESC, col1_7, " col1_4" DESC) STORING (col1_0, col😫1_2, """c%pol1_3", col1_6, "col1_\\x7311") VISIBILITY 0.81, INDEX (lower(CAST(col1_5 AS STRING)) DESC, "col1_\\x7311" DESC, """c%pol1_3" DESC, lower(CAST(col1_0 AS STRING)) DESC, col1_7) STORING (col😫1_2, col1_5, "\\U00010575col1_8") WHERE (table1.co😙l1_10 != (-128):::INT8) OR (table1."!col1_9" < e'\U00002603':::STRING) NOT VISIBLE, FAMILY (col1_0), FAMILY ("col1_\\x7311"), FAMILY (" col1_4", "!col1_9"), FAMILY ("\\U00010575col1_8"), FAMILY ("""c%pol1_3"), FAMILY (col1_6), FAMILY (col1_7, col1_1), FAMILY (col1_5, col😫1_2));
rsg_test.go:662: CREATE TABLE "tab le2" (col2😘_0 INT8 NOT NULL, col2_1 REGROLE NOT NULL, "col2'_2" PG_LSN NOT NULL, col2_3 FLOAT8 NULL, col2_4 BIT(2) NOT NULL, col2_5 STRING NOT NULL AS (lower(CAST(col2_4 AS STRING))) STORED, PRIMARY KEY (col2_4 DESC, col2_1 ASC), UNIQUE (col2_3, col2_4, "col2'_2" ASC, col2_1 DESC) WHERE (("tab le2".col2😘_0 = 0:::INT8) OR ("tab le2".col2_5 >= '"':::STRING)) AND ("tab le2".col2_3 > 1.401298464324817e-45:::FLOAT8), UNIQUE (col2_3, col2_4, col2_5 ASC, (col2_3 + (-0.6883075479578443):::FLOAT8) ASC, col2😘_0, lower(CAST(col2_4 AS STRING)) ASC) STORING ("col2'_2"), INDEX (col2_3 DESC, col2_4 DESC));
rsg_test.go:834: pq: use of partitions requires an enterprise license. see https://cockroachlabs.com/pricing?cluster=3bdb4d86-009d-451e-3692-ca83f20c3c33 for details on how to enable enterprise features
panic.go:522: -- test log scope end --
test logs left over in: /artifacts/tmp/_tmp/d437d2c847dfedbc4972f231c3331c8e/logTestRandomSyntaxSQLSmith3855995623
--- FAIL: TestRandomSyntaxSQLSmith (4.11s)
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/sql-foundations
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxSQLSmith.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 1.0 | sql/tests: TestRandomSyntaxSQLSmith failed - sql/tests.TestRandomSyntaxSQLSmith [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11770371?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/11770371?buildTab=artifacts#/) on master @ [0cde11b885a5e156b970c003505acd081fb4f326](https://github.com/cockroachdb/cockroach/commits/0cde11b885a5e156b970c003505acd081fb4f326):
```
=== RUN TestRandomSyntaxSQLSmith
test logs left over in: /artifacts/tmp/_tmp/d437d2c847dfedbc4972f231c3331c8e/logTestRandomSyntaxSchemaChangeColumn3824830113
test_log_scope.go:167: test logs captured to: /artifacts/tmp/_tmp/d437d2c847dfedbc4972f231c3331c8e/logTestRandomSyntaxSQLSmith3855995623
test_log_scope.go:81: use -show-logs to present logs inline
test_server_shim.go:124: automatically injected virtual cluster under test; see comment at top of test_server_shim.go for details.
rsg_test.go:662: SET sql_safe_updates = false;;
rsg_test.go:662: SET CLUSTER SETTING sql.stats.automatic_collection.enabled = false;;
rsg_test.go:662: SET CLUSTER SETTING sql.stats.histogram_collection.enabled = false;;
rsg_test.go:662: CREATE TABLE table1 (col1_0 TSQUERY NULL, col1_1 BIT(49) NOT NULL, col😫1_2 BOX2D, """c%pol1_3" STRING COLLATE en_US NOT NULL, " col1_4" INT4, col1_5 STRING COLLATE en_US, col1_6 BIT(26) NOT NULL, col1_7 INT8 AS (" col1_4" + 1127741478:::INT8) STORED, "\\U00010575col1_8" STRING NULL AS (lower(CAST(col1_0 AS STRING))) STORED, "!col1_9" STRING AS (lower(CAST(col1_5 AS STRING))) STORED, co😙l1_10 INT8 AS (" col1_4" + NULL) VIRTUAL, "col1_\\x7311" STRING NOT NULL AS (lower(CAST(col1_1 AS STRING))) STORED, col1_12 STRING AS (lower(CAST(col1_5 AS STRING))) VIRTUAL, UNIQUE ("""c%pol1_3" ASC, " col1_4", col1_1 ASC, col1_7) STORING (col1_0, col1_6, "!col1_9"), INDEX (col1_1 ASC, col1_5 DESC, col1_7, " col1_4" DESC) STORING (col1_0, col😫1_2, """c%pol1_3", col1_6, "col1_\\x7311") VISIBILITY 0.81, INDEX (lower(CAST(col1_5 AS STRING)) DESC, "col1_\\x7311" DESC, """c%pol1_3" DESC, lower(CAST(col1_0 AS STRING)) DESC, col1_7) STORING (col😫1_2, col1_5, "\\U00010575col1_8") WHERE (table1.co😙l1_10 != (-128):::INT8) OR (table1."!col1_9" < e'\U00002603':::STRING) NOT VISIBLE, FAMILY (col1_0), FAMILY ("col1_\\x7311"), FAMILY (" col1_4", "!col1_9"), FAMILY ("\\U00010575col1_8"), FAMILY ("""c%pol1_3"), FAMILY (col1_6), FAMILY (col1_7, col1_1), FAMILY (col1_5, col😫1_2));
rsg_test.go:662: CREATE TABLE "tab le2" (col2😘_0 INT8 NOT NULL, col2_1 REGROLE NOT NULL, "col2'_2" PG_LSN NOT NULL, col2_3 FLOAT8 NULL, col2_4 BIT(2) NOT NULL, col2_5 STRING NOT NULL AS (lower(CAST(col2_4 AS STRING))) STORED, PRIMARY KEY (col2_4 DESC, col2_1 ASC), UNIQUE (col2_3, col2_4, "col2'_2" ASC, col2_1 DESC) WHERE (("tab le2".col2😘_0 = 0:::INT8) OR ("tab le2".col2_5 >= '"':::STRING)) AND ("tab le2".col2_3 > 1.401298464324817e-45:::FLOAT8), UNIQUE (col2_3, col2_4, col2_5 ASC, (col2_3 + (-0.6883075479578443):::FLOAT8) ASC, col2😘_0, lower(CAST(col2_4 AS STRING)) ASC) STORING ("col2'_2"), INDEX (col2_3 DESC, col2_4 DESC));
rsg_test.go:834: pq: use of partitions requires an enterprise license. see https://cockroachlabs.com/pricing?cluster=3bdb4d86-009d-451e-3692-ca83f20c3c33 for details on how to enable enterprise features
panic.go:522: -- test log scope end --
test logs left over in: /artifacts/tmp/_tmp/d437d2c847dfedbc4972f231c3331c8e/logTestRandomSyntaxSQLSmith3855995623
--- FAIL: TestRandomSyntaxSQLSmith (4.11s)
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
/cc @cockroachdb/sql-foundations
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxSQLSmith.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| non_priority | sql tests testrandomsyntaxsqlsmith failed sql tests testrandomsyntaxsqlsmith with on master run testrandomsyntaxsqlsmith test logs left over in artifacts tmp tmp test log scope go test logs captured to artifacts tmp tmp test log scope go use show logs to present logs inline test server shim go automatically injected virtual cluster under test see comment at top of test server shim go for details rsg test go set sql safe updates false rsg test go set cluster setting sql stats automatic collection enabled false rsg test go set cluster setting sql stats histogram collection enabled false rsg test go create table tsquery null bit not null col😫 c string collate en us not null string collate en us bit not null as stored string null as lower cast as string stored string as lower cast as string stored co😙 as null virtual string not null as lower cast as string stored string as lower cast as string virtual unique c asc asc storing index asc desc desc storing col😫 c visibility index lower cast as string desc desc c desc lower cast as string desc storing col😫 where co😙 or e string not visible family family family family family c family family family col😫 rsg test go create table tab 😘 not null regrole not null pg lsn not null null bit not null string not null as lower cast as string stored primary key desc asc unique asc desc where tab 😘 or tab string and tab unique asc asc 😘 lower cast as string asc storing index desc desc rsg test go pq use of partitions requires an enterprise license see for details on how to enable enterprise features panic go test log scope end test logs left over in artifacts tmp tmp fail testrandomsyntaxsqlsmith help see also cc cockroachdb sql foundations | 0 |
261,656 | 22,763,140,229 | IssuesEvent | 2022-07-07 23:41:39 | microsoft/msquic | https://api.github.com/repos/microsoft/msquic | closed | Expand Version Negotiation Extension testing | Area: Testing | Test scenarios that are not currently tested:
- [ ] Third party tries to force version downgrade via injected VN packet
- [ ] Third party changes the version field during handshake
- [ ] Server initiates incompatible version negotiation, and also does compatible version negotiation on the second attempt. | 1.0 | Expand Version Negotiation Extension testing - Test scenarios that are not currently tested:
- [ ] Third party tries to force version downgrade via injected VN packet
- [ ] Third party changes the version field during handshake
- [ ] Server initiates incompatible version negotiation, and also does compatible version negotiation on the second attempt. | non_priority | expand version negotiation extension testing test scenarios that are not currently tested third party tries to force version downgrade via injected vn packet third party changes the version field during handshake server initiates incompatible version negotiation and also does compatible version negotiation on the second attempt | 0 |
452,765 | 13,059,059,079 | IssuesEvent | 2020-07-30 10:00:46 | incognitochain/incognito-wallet | https://api.github.com/repos/incognitochain/incognito-wallet | closed | Pdex - pdex account with followed token | Priority: Medium Type: Enhancement | From Annie
Description: On pdex when a trading transaction happened, PRV - pTOMO, when user check pdex account, pTOMO balance doesn't show up until user follows the token.
Can we make it automatically when a trade happens, the token will be automatically followed. | 1.0 | Pdex - pdex account with followed token - From Annie
Description: On pdex when a trading transaction happened, PRV - pTOMO, when user check pdex account, pTOMO balance doesn't show up until user follows the token.
Can we make it automatically when a trade happens, the token will be automatically followed. | priority | pdex pdex account with followed token from annie description on pdex when a trading transaction happened prv ptomo when user check pdex account ptomo balance doesn t show up until user follows the token can we make it automatically when a trade happens the token will be automatically followed | 1 |
629,488 | 20,034,475,610 | IssuesEvent | 2022-02-02 10:23:34 | xwikisas/application-filemanager | https://api.github.com/repos/xwikisas/application-filemanager | closed | Unregistered users cannot download multiple files (package) on Tomcat | Priority: Major Type: Bug | The server returns a 400 Bad Request for the download URL. The download URLs look like this:
```
// Registered user
http://localhost:8080/xwiki/bin/get/FileManager/Download-Admin-NAhd (302 Redirect)
http://localhost:8080/xwiki/tmp/filemanager/document%3Axwiki%3AFileManager.Download-Admin-NAhd/NAhd.zip
// Unregistered user
http://localhost:8080/xwiki/bin/get/FileManager/Download-%24%7Bxcontext.userReference.name%7D-gkub (302 Redirect)
http://localhost:8080/xwiki/tmp/filemanager/document%3Axwiki%3AFileManager.Download-%24%7Bxcontext%21.userReference%21.name%7D-gkub/gkub.zip
```
Notice the ``$xcontext.userReference.name`` in the download URL for unregistered users. The reason is because the guest user reference is null. The Tomcat doesn't like the encoded ``$`` character. It works fine with Jetty though. Note that the URL is properly URL-encoded but Tomcat is protected itself from some security issues and doesn't allow such an URL.
We can use Velocity's silent notation to prevent this problem. | 1.0 | Unregistered users cannot download multiple files (package) on Tomcat - The server returns a 400 Bad Request for the download URL. The download URLs look like this:
```
// Registered user
http://localhost:8080/xwiki/bin/get/FileManager/Download-Admin-NAhd (302 Redirect)
http://localhost:8080/xwiki/tmp/filemanager/document%3Axwiki%3AFileManager.Download-Admin-NAhd/NAhd.zip
// Unregistered user
http://localhost:8080/xwiki/bin/get/FileManager/Download-%24%7Bxcontext.userReference.name%7D-gkub (302 Redirect)
http://localhost:8080/xwiki/tmp/filemanager/document%3Axwiki%3AFileManager.Download-%24%7Bxcontext%21.userReference%21.name%7D-gkub/gkub.zip
```
Notice the ``$xcontext.userReference.name`` in the download URL for unregistered users. The reason is because the guest user reference is null. The Tomcat doesn't like the encoded ``$`` character. It works fine with Jetty though. Note that the URL is properly URL-encoded but Tomcat is protected itself from some security issues and doesn't allow such an URL.
We can use Velocity's silent notation to prevent this problem. | priority | unregistered users cannot download multiple files package on tomcat the server returns a bad request for the download url the download urls look like this registered user redirect unregistered user redirect notice the xcontext userreference name in the download url for unregistered users the reason is because the guest user reference is null the tomcat doesn t like the encoded character it works fine with jetty though note that the url is properly url encoded but tomcat is protected itself from some security issues and doesn t allow such an url we can use velocity s silent notation to prevent this problem | 1 |
23,245 | 2,657,706,622 | IssuesEvent | 2015-03-18 11:03:02 | centre-for-educational-technology/learnmixer | https://api.github.com/repos/centre-for-educational-technology/learnmixer | opened | Hiding and removing structural elements and learning resources | high priority | _From @ilyashmorgun on March 18, 2015 10:21_
It should be possible to hide or remove structural elements (e.g. chapter or sub-chapter) and learning resources from a collection. Hiding items make its it possible to show them at a later point and removing deletes them permanently. If an item is removed a confirmation is shown to the user.
_Copied from original issue: ilyashmorgun/LearnMix-Prototypes#23_ | 1.0 | Hiding and removing structural elements and learning resources - _From @ilyashmorgun on March 18, 2015 10:21_
It should be possible to hide or remove structural elements (e.g. chapter or sub-chapter) and learning resources from a collection. Hiding items make its it possible to show them at a later point and removing deletes them permanently. If an item is removed a confirmation is shown to the user.
_Copied from original issue: ilyashmorgun/LearnMix-Prototypes#23_ | priority | hiding and removing structural elements and learning resources from ilyashmorgun on march it should be possible to hide or remove structural elements e g chapter or sub chapter and learning resources from a collection hiding items make its it possible to show them at a later point and removing deletes them permanently if an item is removed a confirmation is shown to the user copied from original issue ilyashmorgun learnmix prototypes | 1 |
353,607 | 10,554,832,588 | IssuesEvent | 2019-10-03 20:23:36 | cu-mkp/m-k-manuscript-data | https://api.github.com/repos/cu-mkp/m-k-manuscript-data | opened | check that all editorial deletions are in <corr><del></del></corr> | consistency low-priority markup problematic qc | #120 listed the following concern:
More generally, there is inconsistency in the ways that the deletion tag has been used--the tags are only used as editorial tags!, so correction tag must always be around the del tag in tl. TL should follow tcn in this place.
check consistency of this!! | 1.0 | check that all editorial deletions are in <corr><del></del></corr> - #120 listed the following concern:
More generally, there is inconsistency in the ways that the deletion tag has been used--the tags are only used as editorial tags!, so correction tag must always be around the del tag in tl. TL should follow tcn in this place.
check consistency of this!! | priority | check that all editorial deletions are in listed the following concern more generally there is inconsistency in the ways that the deletion tag has been used the tags are only used as editorial tags so correction tag must always be around the del tag in tl tl should follow tcn in this place check consistency of this | 1 |
27,912 | 30,672,860,252 | IssuesEvent | 2023-07-26 00:59:40 | VocaDB/vocadb | https://api.github.com/repos/VocaDB/vocadb | opened | Ability to follow venues | UX/usability events users messages | It might be useful to be able to "follow" venues so people can track their favorite locations. Users would get a notification when a new event in said venue was added. | True | Ability to follow venues - It might be useful to be able to "follow" venues so people can track their favorite locations. Users would get a notification when a new event in said venue was added. | non_priority | ability to follow venues it might be useful to be able to follow venues so people can track their favorite locations users would get a notification when a new event in said venue was added | 0 |
198,762 | 22,674,059,310 | IssuesEvent | 2022-07-04 01:11:40 | dhlinh98/WebGoat | https://api.github.com/repos/dhlinh98/WebGoat | closed | CVE-2020-13692 (High) detected in postgresql-42.2.2.jar, postgresql-42.2.8.jar - autoclosed | security vulnerability | ## CVE-2020-13692 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>postgresql-42.2.2.jar</b>, <b>postgresql-42.2.8.jar</b></p></summary>
<p>
<details><summary><b>postgresql-42.2.2.jar</b></p></summary>
<p>Java JDBC 4.2 (JRE 8+) driver for PostgreSQL database</p>
<p>Library home page: <a href="https://github.com/pgjdbc/pgjdbc">https://github.com/pgjdbc/pgjdbc</a></p>
<p>Path to dependency file: /webgoat-server/pom.xml</p>
<p>Path to vulnerable library: /m2/repository/org/postgresql/postgresql/42.2.2/postgresql-42.2.2.jar</p>
<p>
Dependency Hierarchy:
- :x: **postgresql-42.2.2.jar** (Vulnerable Library)
</details>
<details><summary><b>postgresql-42.2.8.jar</b></p></summary>
<p>Java JDBC 4.2 (JRE 8+) driver for PostgreSQL database</p>
<p>Library home page: <a href="https://github.com/pgjdbc/pgjdbc">https://github.com/pgjdbc/pgjdbc</a></p>
<p>Path to dependency file: /webgoat-integration-tests/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/postgresql/postgresql/42.2.8/postgresql-42.2.8.jar</p>
<p>
Dependency Hierarchy:
- webgoat-server-v8.0.0-SNAPSHOT.jar (Root Library)
- :x: **postgresql-42.2.8.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/dhlinh98/WebGoat/commit/722f91856f2698620d2230889c74d38419005474">722f91856f2698620d2230889c74d38419005474</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
PostgreSQL JDBC Driver (aka PgJDBC) before 42.2.13 allows XXE.
<p>Publish Date: 2020-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13692>CVE-2020-13692</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.13">https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.13</a></p>
<p>Release Date: 2020-06-04</p>
<p>Fix Resolution: org.postgresql:postgresql:42.2.13</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-13692 (High) detected in postgresql-42.2.2.jar, postgresql-42.2.8.jar - autoclosed - ## CVE-2020-13692 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>postgresql-42.2.2.jar</b>, <b>postgresql-42.2.8.jar</b></p></summary>
<p>
<details><summary><b>postgresql-42.2.2.jar</b></p></summary>
<p>Java JDBC 4.2 (JRE 8+) driver for PostgreSQL database</p>
<p>Library home page: <a href="https://github.com/pgjdbc/pgjdbc">https://github.com/pgjdbc/pgjdbc</a></p>
<p>Path to dependency file: /webgoat-server/pom.xml</p>
<p>Path to vulnerable library: /m2/repository/org/postgresql/postgresql/42.2.2/postgresql-42.2.2.jar</p>
<p>
Dependency Hierarchy:
- :x: **postgresql-42.2.2.jar** (Vulnerable Library)
</details>
<details><summary><b>postgresql-42.2.8.jar</b></p></summary>
<p>Java JDBC 4.2 (JRE 8+) driver for PostgreSQL database</p>
<p>Library home page: <a href="https://github.com/pgjdbc/pgjdbc">https://github.com/pgjdbc/pgjdbc</a></p>
<p>Path to dependency file: /webgoat-integration-tests/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/postgresql/postgresql/42.2.8/postgresql-42.2.8.jar</p>
<p>
Dependency Hierarchy:
- webgoat-server-v8.0.0-SNAPSHOT.jar (Root Library)
- :x: **postgresql-42.2.8.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/dhlinh98/WebGoat/commit/722f91856f2698620d2230889c74d38419005474">722f91856f2698620d2230889c74d38419005474</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
PostgreSQL JDBC Driver (aka PgJDBC) before 42.2.13 allows XXE.
<p>Publish Date: 2020-06-04
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-13692>CVE-2020-13692</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.13">https://jdbc.postgresql.org/documentation/changelog.html#version_42.2.13</a></p>
<p>Release Date: 2020-06-04</p>
<p>Fix Resolution: org.postgresql:postgresql:42.2.13</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in postgresql jar postgresql jar autoclosed cve high severity vulnerability vulnerable libraries postgresql jar postgresql jar postgresql jar java jdbc jre driver for postgresql database library home page a href path to dependency file webgoat server pom xml path to vulnerable library repository org postgresql postgresql postgresql jar dependency hierarchy x postgresql jar vulnerable library postgresql jar java jdbc jre driver for postgresql database library home page a href path to dependency file webgoat integration tests pom xml path to vulnerable library home wss scanner repository org postgresql postgresql postgresql jar dependency hierarchy webgoat server snapshot jar root library x postgresql jar vulnerable library found in head commit a href vulnerability details postgresql jdbc driver aka pgjdbc before allows xxe publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org postgresql postgresql step up your open source security game with whitesource | 0 |
208,524 | 7,155,418,889 | IssuesEvent | 2018-01-26 12:42:39 | bartongroup/RATS | https://api.github.com/repos/bartongroup/RATS | closed | (dev. ver.) Why did a crash condition pass the tests? | medium priority | This issue applies to dev versions between 0.6.0-3 and 0.6.0-7.
This bug has now been resolved, but formal testing should be put in place for such scenario.
With the introduction of scaling vectors in place of scaling factors, some code omissions caused RATs to crash. The tests failed to pick up the crash at all. Probably because the breaking code is only executed when scaling is requested, and none of the old tests have a scaling scenario.
- [x] Create scaling tests.
- [x] Test more of the preparatory steps in `rats.R`. Direct testing of the functions only does not cover all sources of error. | 1.0 | (dev. ver.) Why did a crash condition pass the tests? - This issue applies to dev versions between 0.6.0-3 and 0.6.0-7.
This bug has now been resolved, but formal testing should be put in place for such scenario.
With the introduction of scaling vectors in place of scaling factors, some code omissions caused RATs to crash. The tests failed to pick up the crash at all. Probably because the breaking code is only executed when scaling is requested, and none of the old tests have a scaling scenario.
- [x] Create scaling tests.
- [x] Test more of the preparatory steps in `rats.R`. Direct testing of the functions only does not cover all sources of error. | priority | dev ver why did a crash condition pass the tests this issue applies to dev versions between and this bug has now been resolved but formal testing should be put in place for such scenario with the introduction of scaling vectors in place of scaling factors some code omissions caused rats to crash the tests failed to pick up the crash at all probably because the breaking code is only executed when scaling is requested and none of the old tests have a scaling scenario create scaling tests test more of the preparatory steps in rats r direct testing of the functions only does not cover all sources of error | 1 |
78,654 | 10,076,958,690 | IssuesEvent | 2019-07-24 17:32:14 | celo-org/celo-monorepo | https://api.github.com/repos/celo-org/celo-monorepo | closed | Devs SBAT reference installation docs for different operating systems | documentation | ### Expected Behavior
Documentation should have installation instructions for other operating systems (eg Ubuntu, Arch).
### Current Behavior
Engineering setup only has macOS installation steps. | 1.0 | Devs SBAT reference installation docs for different operating systems - ### Expected Behavior
Documentation should have installation instructions for other operating systems (eg Ubuntu, Arch).
### Current Behavior
Engineering setup only has macOS installation steps. | non_priority | devs sbat reference installation docs for different operating systems expected behavior documentation should have installation instructions for other operating systems eg ubuntu arch current behavior engineering setup only has macos installation steps | 0 |
1,696 | 24,641,448,950 | IssuesEvent | 2022-10-17 11:59:52 | openwall/john | https://api.github.com/repos/openwall/john | opened | Shouldn't skip #include <altivec.h> in DES_bs_b.c at least on FreeBSD | portability | As reported by @danfe in https://github.com/openwall/john/commit/48382b86d1d1cbc121ed206392e35c8bf76a59c0#r86752702:
> Would it be possible to adjust, if not completely remove this guard? This breaks at least FreeBSD build, and perhaps others as well. Note that similar `#include <arm_neon.h>` earlier in this file is not guarded.
My reply there was:
> IIRC, at the time the AltiVec support was for just two platforms - macOS (first) and Linux (added later). If this guard of `#include <altivec.h>` was needed, then I guess we can replace the `#ifdef __linux__` with `#ifndef __APPLE__` for the same effect on old macOS. Ideally someone would test that, but please feel free to open a pull request with this change anyhow. Thank you!
>
> It is also possible that the guard was never required, and I just didn't want to break some macOS (when adding Linux support), which I knew didn't need the `#include`. Maybe I never investigated this and put the guard in just in case. | True | Shouldn't skip #include <altivec.h> in DES_bs_b.c at least on FreeBSD - As reported by @danfe in https://github.com/openwall/john/commit/48382b86d1d1cbc121ed206392e35c8bf76a59c0#r86752702:
> Would it be possible to adjust, if not completely remove this guard? This breaks at least FreeBSD build, and perhaps others as well. Note that similar `#include <arm_neon.h>` earlier in this file is not guarded.
My reply there was:
> IIRC, at the time the AltiVec support was for just two platforms - macOS (first) and Linux (added later). If this guard of `#include <altivec.h>` was needed, then I guess we can replace the `#ifdef __linux__` with `#ifndef __APPLE__` for the same effect on old macOS. Ideally someone would test that, but please feel free to open a pull request with this change anyhow. Thank you!
>
> It is also possible that the guard was never required, and I just didn't want to break some macOS (when adding Linux support), which I knew didn't need the `#include`. Maybe I never investigated this and put the guard in just in case. | non_priority | shouldn t skip include in des bs b c at least on freebsd as reported by danfe in would it be possible to adjust if not completely remove this guard this breaks at least freebsd build and perhaps others as well note that similar include earlier in this file is not guarded my reply there was iirc at the time the altivec support was for just two platforms macos first and linux added later if this guard of include was needed then i guess we can replace the ifdef linux with ifndef apple for the same effect on old macos ideally someone would test that but please feel free to open a pull request with this change anyhow thank you it is also possible that the guard was never required and i just didn t want to break some macos when adding linux support which i knew didn t need the include maybe i never investigated this and put the guard in just in case | 0 |
129,281 | 18,075,375,204 | IssuesEvent | 2021-09-21 09:17:08 | AlexRogalskiy/github-action-branch-mapper | https://api.github.com/repos/AlexRogalskiy/github-action-branch-mapper | opened | CVE-2021-3807 (Medium) detected in multiple libraries | security vulnerability | ## CVE-2021-3807 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-2.1.1.tgz</b>, <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/inquirer/node_modules/strip-ansi/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/npm/node_modules/wrap-ansi/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/npm/node_modules/yargs/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/npm/node_modules/cliui/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- cz-conventional-changelog-3.3.0.tgz (Root Library)
- commitizen-4.2.3.tgz
- inquirer-6.5.2.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-2.1.1.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/npm/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- npm-7.0.10.tgz (Root Library)
- npm-6.14.11.tgz
- libnpmhook-5.0.3.tgz
- strip-ansi-3.0.1.tgz
- :x: **ansi-regex-2.1.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/npm/node_modules/string-width/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/inquirer/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- cz-conventional-changelog-3.3.0.tgz (Root Library)
- commitizen-4.2.3.tgz
- inquirer-6.5.2.tgz
- string-width-2.1.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- eslint-7.20.0.tgz (Root Library)
- strip-ansi-6.0.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-branch-mapper/commit/5a01b9e9b84c2e4a3fc1ef431c4f84d74dcd7534">5a01b9e9b84c2e4a3fc1ef431c4f84d74dcd7534</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-3807 (Medium) detected in multiple libraries - ## CVE-2021-3807 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-2.1.1.tgz</b>, <b>ansi-regex-3.0.0.tgz</b>, <b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/inquirer/node_modules/strip-ansi/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/npm/node_modules/wrap-ansi/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/npm/node_modules/yargs/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/npm/node_modules/cliui/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- cz-conventional-changelog-3.3.0.tgz (Root Library)
- commitizen-4.2.3.tgz
- inquirer-6.5.2.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-2.1.1.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-2.1.1.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/npm/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- npm-7.0.10.tgz (Root Library)
- npm-6.14.11.tgz
- libnpmhook-5.0.3.tgz
- strip-ansi-3.0.1.tgz
- :x: **ansi-regex-2.1.1.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/npm/node_modules/string-width/node_modules/ansi-regex/package.json,github-action-branch-mapper/node_modules/inquirer/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- cz-conventional-changelog-3.3.0.tgz (Root Library)
- commitizen-4.2.3.tgz
- inquirer-6.5.2.tgz
- string-width-2.1.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: github-action-branch-mapper/package.json</p>
<p>Path to vulnerable library: github-action-branch-mapper/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- eslint-7.20.0.tgz (Root Library)
- strip-ansi-6.0.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/AlexRogalskiy/github-action-branch-mapper/commit/5a01b9e9b84c2e4a3fc1ef431c4f84d74dcd7534">5a01b9e9b84c2e4a3fc1ef431c4f84d74dcd7534</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: N/A
- Attack Complexity: N/A
- Privileges Required: N/A
- User Interaction: N/A
- Scope: N/A
- Impact Metrics:
- Confidentiality Impact: N/A
- Integrity Impact: N/A
- Availability Impact: N/A
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file github action branch mapper package json path to vulnerable library github action branch mapper node modules inquirer node modules strip ansi node modules ansi regex package json github action branch mapper node modules npm node modules wrap ansi node modules ansi regex package json github action branch mapper node modules npm node modules yargs node modules ansi regex package json github action branch mapper node modules npm node modules cliui node modules ansi regex package json dependency hierarchy cz conventional changelog tgz root library commitizen tgz inquirer tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file github action branch mapper package json path to vulnerable library github action branch mapper node modules npm node modules ansi regex package json dependency hierarchy npm tgz root library npm tgz libnpmhook tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file github action branch mapper package json path to vulnerable library github action branch mapper node modules npm node modules string width node modules ansi regex package json github action branch mapper node modules inquirer node modules ansi regex package json dependency hierarchy cz conventional changelog tgz root library commitizen tgz inquirer tgz string width tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file github action branch mapper package json path to vulnerable library github action branch mapper node modules ansi regex package json dependency hierarchy eslint tgz root library strip ansi tgz x ansi regex tgz vulnerable library found in head commit a href vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector n a attack complexity n a privileges required n a user interaction n a scope n a impact metrics confidentiality impact n a integrity impact n a availability impact n a for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex step up your open source security game with whitesource | 0 |
708,026 | 24,327,602,019 | IssuesEvent | 2022-09-30 16:07:45 | yugabyte/yugabyte-db | https://api.github.com/repos/yugabyte/yugabyte-db | closed | [YSQL] ALTER Materialized View | kind/enhancement area/ysql priority/medium pgcm | Jira Link: [DB-899](https://yugabyte.atlassian.net/browse/DB-899)
### Description
Issue for tracking `ALTER MATERIALIZED VIEW`. | 1.0 | [YSQL] ALTER Materialized View - Jira Link: [DB-899](https://yugabyte.atlassian.net/browse/DB-899)
### Description
Issue for tracking `ALTER MATERIALIZED VIEW`. | priority | alter materialized view jira link description issue for tracking alter materialized view | 1 |
240,430 | 20,030,235,096 | IssuesEvent | 2022-02-02 04:18:06 | mozilla-mobile/focus-android | https://api.github.com/repos/mozilla-mobile/focus-android | closed | Disabled UI test - all tests in FirstRunDialogueTest.kt class | eng:ui-test eg:disabled-test | Tests disabled in https://github.com/mozilla-mobile/focus-android/pull/5997/files
skipFirstRunOnboardingTest,
verifyWhatsNewLinkFromTips,
verifyAllowListLinkFromTips
verifySwitchToDesktopSiteLinkFromTips
firstTipIsAlwaysDisplayedTest
Needs a fix and re-enabling.
| 2.0 | Disabled UI test - all tests in FirstRunDialogueTest.kt class - Tests disabled in https://github.com/mozilla-mobile/focus-android/pull/5997/files
skipFirstRunOnboardingTest,
verifyWhatsNewLinkFromTips,
verifyAllowListLinkFromTips
verifySwitchToDesktopSiteLinkFromTips
firstTipIsAlwaysDisplayedTest
Needs a fix and re-enabling.
| non_priority | disabled ui test all tests in firstrundialoguetest kt class tests disabled in skipfirstrunonboardingtest verifywhatsnewlinkfromtips verifyallowlistlinkfromtips verifyswitchtodesktopsitelinkfromtips firsttipisalwaysdisplayedtest needs a fix and re enabling | 0 |
774,015 | 27,180,283,746 | IssuesEvent | 2023-02-18 14:42:06 | ankitsmt211/To-Do-Bot | https://api.github.com/repos/ankitsmt211/To-Do-Bot | opened | dynamically updating the index | low priority | Currently, the `displayList(List<String> todoList)` method hardcodes the indexes with the help of counter `i`. The problem with that is that when `removing` a certain item from the list using `!remove index`, the remaining items still show their old indexes, making it look like one of the indexes was skipped.

| 1.0 | dynamically updating the index - Currently, `displayList(List<String> todoList)` method hardcodes the indexes with the help of counter `i`, the problem with that is when `removing` a certain item from list using `!remove index` , the remaining items would show the same indexes making it look like one of the index is skipped.

| priority | dynamically updating the index currently displaylist list todolist method hardcodes the indexes with the help of counter i the problem with that is when removing a certain item from list using remove index the remaining items would show the same indexes making it look like one of the index is skipped | 1 |
342,400 | 10,316,434,889 | IssuesEvent | 2019-08-30 09:59:33 | kubernetes/kubernetes | https://api.github.com/repos/kubernetes/kubernetes | closed | Some CRDs that duplicate built-in group/version/kinds reap built-in etcd data on deletion | area/custom-resources kind/bug lifecycle/frozen priority/important-soon sig/api-machinery | **What happened**:
Created a CRD:
```
apiVersion: apiextensions.k8s.io/v1beta1
kind: CustomResourceDefinition
metadata:
name: foos.example.com
spec:
group: example.com
scope: Namespaced
version: v1
names:
singular: foo
plural: foos
kind: Foo
listKind: FooList
```
Created another CRD defining an overlapping type with CustomResourceDefinition:
```yaml
apiVersion: apiextensions.k8s.io/v1beta1
kind: CustomResourceDefinition
metadata:
name: customresourcedefinitions.apiextensions.k8s.io
spec:
group: apiextensions.k8s.io
version: v1beta1
scope: Cluster
names:
plural: customresourcedefinitions
singular: customresourcedefinition
kind: CustomResourceDefinition
listKind: CustomResourceDefinitionList
validation:
openAPIV3Schema:
type: object
properties:
readonly:
type: boolean
```
Verify both exist:
```
$ kubectl get crds
NAME CREATED AT
customresourcedefinitions.apiextensions.k8s.io 2019-06-04T19:37:46Z
foos.example.com 2019-06-04T19:48:35Z
```
Deleted the one overlapping with the in-tree CRD type:
```
kubectl delete crd/customresourcedefinitions.apiextensions.k8s.io --wait=false
```
Verify only the overlapping CRD remains:
```
$ kubectl get crd
NAME CREATED AT
customresourcedefinitions.apiextensions.k8s.io 2019-06-04T19:37:46Z
```
and is stuck deleting:
```
$ kubectl get crd -o yaml
...
status:
conditions:
...
- lastTransitionTime: "2019-06-04T19:49:40Z"
message: CustomResource deletion is in progress
reason: InstanceDeletionInProgress
status: "True"
type: Terminating
```
**What you expected to happen**:
Creation and deletion of the overlapping type CRD would have no effect on the in-tree data.
**Anything else we need to know?**:
The only types in the kube-apiserver (currently) that store data under a group prefix are:
* APIService (`<prefix>/apiregistration.k8s.io/apiservices/...`)
* CustomResourceDefinition (`<prefix>/apiextensions.k8s.io/customresourcedefinitions/...`)
* Custom resources (`<prefix>/<group>/<pluralResource>/...`)
When a CRD is deleted, all instances of the type it defines are first removed. That means that defining and deleting a CustomResourceDefinition that defines an overlapping APIService or CustomResourceDefinition type will remove existing APIService or CustomResourceDefinition data.
We can special-case these two resources in the CRD deletion code, but we need a more robust solution before CRD v1.
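Until that special-casing exists, an admission hook or CI lint can reject CRDs whose group/plural pair collides with the two storage-prefixed built-ins named above. A minimal illustrative check in Python (the blocklist hard-codes only the pairs from this issue; this is not an official Kubernetes API):

```python
# Storage-prefixed built-in resources named in this issue: a CRD whose
# (group, plural) pair matches one of these would overlap its etcd keyspace.
RESERVED_STORAGE_PATHS = {
    ("apiregistration.k8s.io", "apiservices"),
    ("apiextensions.k8s.io", "customresourcedefinitions"),
}

def crd_overlaps_builtin_storage(crd: dict) -> bool:
    """Return True if a CRD manifest (parsed YAML/JSON) collides with a
    built-in resource stored under a group-prefixed etcd path."""
    spec = crd.get("spec", {})
    group = spec.get("group", "")
    plural = spec.get("names", {}).get("plural", "")
    return (group, plural) in RESERVED_STORAGE_PATHS
```

Run against the two manifests above, the `foos.example.com` CRD passes while the overlapping `customresourcedefinitions.apiextensions.k8s.io` one is flagged.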
/cc @sttts @deads2k @jpbetz
/sig api-machinery
/priority important-soon
/area custom-resources | 1.0 | Some CRDs that duplicate built-in group/version/kinds reap built-in etcd data on deletion - **What happened**:
Created a CRD:
```
apiVersion: apiextensions.k8s.io/v1beta1
kind: CustomResourceDefinition
metadata:
name: foos.example.com
spec:
group: example.com
scope: Namespaced
version: v1
names:
singular: foo
plural: foos
kind: Foo
listKind: FooList
```
Created another CRD defining an overlapping type with CustomResourceDefinition:
```yaml
apiVersion: apiextensions.k8s.io/v1beta1
kind: CustomResourceDefinition
metadata:
name: customresourcedefinitions.apiextensions.k8s.io
spec:
group: apiextensions.k8s.io
version: v1beta1
scope: Cluster
names:
plural: customresourcedefinitions
singular: customresourcedefinition
kind: CustomResourceDefinition
listKind: CustomResourceDefinitionList
validation:
openAPIV3Schema:
type: object
properties:
readonly:
type: boolean
```
Verify both exist:
```
$ kubectl get crds
NAME CREATED AT
customresourcedefinitions.apiextensions.k8s.io 2019-06-04T19:37:46Z
foos.example.com 2019-06-04T19:48:35Z
```
Deleted the one overlapping with the in-tree CRD type:
```
kubectl delete crd/customresourcedefinitions.apiextensions.k8s.io --wait=false
```
Verify only the overlapping CRD remains:
```
$ kubectl get crd
NAME CREATED AT
customresourcedefinitions.apiextensions.k8s.io 2019-06-04T19:37:46Z
```
and is stuck deleting:
```
$ kubectl get crd -o yaml
...
status:
conditions:
...
- lastTransitionTime: "2019-06-04T19:49:40Z"
message: CustomResource deletion is in progress
reason: InstanceDeletionInProgress
status: "True"
type: Terminating
```
**What you expected to happen**:
Creation and deletion of the overlapping type CRD would have no effect on the in-tree data.
**Anything else we need to know?**:
The only types in the kube-apiserver (currently) that store data under a group prefix are:
* APIService (`<prefix>/apiregistration.k8s.io/apiservices/...`)
* CustomResourceDefinition (`<prefix>/apiextensions.k8s.io/customresourcedefinitions/...`)
* Custom resources (`<prefix>/<group>/<pluralResource>/...`)
When a CRD is deleted, all instances of the type it defines are first removed. That means that defining and deleting a CustomResourceDefinition that defines an overlapping APIService or CustomResourceDefinition type will remove existing APIService or CustomResourceDefinition data.
We can special-case these two resources in the CRD deletion code, but we need a more robust solution before CRD v1.
/cc @sttts @deads2k @jpbetz
/sig api-machinery
/priority important-soon
/area custom-resources | priority | some crds that duplicate built in group version kinds reap built in etcd data on deletion what happened created a crd apiversion apiextensions io kind customresourcedefinition metadata name foos example com spec group example com scope namespaced version names singular foo plural foos kind foo listkind foolist created another crd defining an overlapping type with customresourcedefinition yaml apiversion apiextensions io kind customresourcedefinition metadata name customresourcedefinitions apiextensions io spec group apiextensions io version scope cluster names plural customresourcedefinitions singular customresourcedefinition kind customresourcedefinition listkind customresourcedefinitionlist validation type object properties readonly type boolean verify both exist kubectl get crds name created at customresourcedefinitions apiextensions io foos example com deleted the one overlapping with the in tree crd type kubectl delete crd customresourcedefinitions apiextensions io wait false verify only the overlapping crd remains kubectl get crd name created at customresourcedefinitions apiextensions io and is stuck deleting kubectl get crd o yaml status conditions lasttransitiontime message customresource deletion is in progress reason instancedeletioninprogress status true type terminating what you expected to happen creation and deletion of the overlapping type crd would have no effect on the in tree data anything else we need to know the only types in the kube apiserver currently that store data under a group prefix are apiservice apiregistration io apiservices customresourcedefinition apiextensions io customresourcedefinitions custom resources when a crd is deleted all instances of the type it defines are first removed that means that defining and deleting a customresourcedefinition that defines an overlapping apiservice or customresourcedefinition type will remove existing apiservice or customresourcedefinition data we can special 
case these two resources in the crd deletion code but we need a more robust solution before crd cc sttts jpbetz sig api machinery priority important soon area custom resources | 1 |
344,928 | 24,835,311,243 | IssuesEvent | 2022-10-26 08:23:12 | corona-warn-app/cwa-website | https://api.github.com/repos/corona-warn-app/cwa-website | closed | Release 2.28 Updates | documentation mirrored-to-jira screenshots faq blog Fix 2.28 | - [x] Update Gulpfile
iOS 2.28.0
Android 2.28.2
- [x] Update Screenshots https://github.com/corona-warn-app/cwa-website/pull/3133
- [x] Update release date in intro text.
- [x] Update AppToWebLinks.json https://github.com/corona-warn-app/cwa-website/pull/3154
- [x] Update sitemap.json and sitemap_de.json
- [x] Add screenshots page R 2.27 - https://github.com/corona-warn-app/cwa-website/pull/3133
- [x] Add blog R 2.28 - https://github.com/corona-warn-app/cwa-website/pull/3130.
- [x] Update Privacy notice https://github.com/corona-warn-app/cwa-website/pull/3145
- [x] add new docs (three PDF).
- [x] update release date.
- [x] Update FAQ/Glossary articles
https://github.com/corona-warn-app/cwa-website/pull/3152
- [x] Set "aktualisiert am/updated" dates.
- [x] Set blog post date in URL.
- [x] Blog.
---
**Other new or updated content (some updates are unrelated to release 2.28)**
- https://github.com/corona-warn-app/cwa-website/pull/3113
- https://github.com/corona-warn-app/cwa-website/pull/3114
- https://github.com/corona-warn-app/cwa-website/pull/3116
- https://github.com/corona-warn-app/cwa-website/pull/3117
- https://github.com/corona-warn-app/cwa-website/pull/3118
- https://github.com/corona-warn-app/cwa-website/pull/3119
- https://github.com/corona-warn-app/cwa-website/pull/3122
- https://github.com/corona-warn-app/cwa-website/pull/3123
- https://github.com/corona-warn-app/cwa-website/pull/3124
- https://github.com/corona-warn-app/cwa-website/pull/3125
- https://github.com/corona-warn-app/cwa-website/pull/3127
- https://github.com/corona-warn-app/cwa-website/pull/3128
- https://github.com/corona-warn-app/cwa-website/pull/3129
- https://github.com/corona-warn-app/cwa-website/pull/3131
- https://github.com/corona-warn-app/cwa-website/pull/3132
- https://github.com/corona-warn-app/cwa-website/pull/3134
- https://github.com/corona-warn-app/cwa-website/pull/3136
- https://github.com/corona-warn-app/cwa-website/pull/3138
- https://github.com/corona-warn-app/cwa-website/pull/3139
- https://github.com/corona-warn-app/cwa-website/pull/3140
- https://github.com/corona-warn-app/cwa-website/pull/3141
- https://github.com/corona-warn-app/cwa-website/pull/3142
- https://github.com/corona-warn-app/cwa-website/pull/3143
---
Internal Tracking ID: [EXPOSUREAPP-14052](https://jira-ibs.wbs.net.sap/browse/EXPOSUREAPP-14052) | 1.0 | Release 2.28 Updates - - [x] Update Gulpfile
iOS 2.28.0
Android 2.28.2
- [x] Update Screenshots https://github.com/corona-warn-app/cwa-website/pull/3133
- [x] Update release date in intro text.
- [x] Update AppToWebLinks.json https://github.com/corona-warn-app/cwa-website/pull/3154
- [x] Update sitemap.json and sitemap_de.json
- [x] Add screenshots page R 2.27 - https://github.com/corona-warn-app/cwa-website/pull/3133
- [x] Add blog R 2.28 - https://github.com/corona-warn-app/cwa-website/pull/3130.
- [x] Update Privacy notice https://github.com/corona-warn-app/cwa-website/pull/3145
- [x] add new docs (three PDF).
- [x] update release date.
- [x] Update FAQ/Glossary articles
https://github.com/corona-warn-app/cwa-website/pull/3152
- [x] Set "aktualisiert am/updated" dates.
- [x] Set blog post date in URL.
- [x] Blog.
---
**Other new or updated content (some updates are unrelated to release 2.28)**
- https://github.com/corona-warn-app/cwa-website/pull/3113
- https://github.com/corona-warn-app/cwa-website/pull/3114
- https://github.com/corona-warn-app/cwa-website/pull/3116
- https://github.com/corona-warn-app/cwa-website/pull/3117
- https://github.com/corona-warn-app/cwa-website/pull/3118
- https://github.com/corona-warn-app/cwa-website/pull/3119
- https://github.com/corona-warn-app/cwa-website/pull/3122
- https://github.com/corona-warn-app/cwa-website/pull/3123
- https://github.com/corona-warn-app/cwa-website/pull/3124
- https://github.com/corona-warn-app/cwa-website/pull/3125
- https://github.com/corona-warn-app/cwa-website/pull/3127
- https://github.com/corona-warn-app/cwa-website/pull/3128
- https://github.com/corona-warn-app/cwa-website/pull/3129
- https://github.com/corona-warn-app/cwa-website/pull/3131
- https://github.com/corona-warn-app/cwa-website/pull/3132
- https://github.com/corona-warn-app/cwa-website/pull/3134
- https://github.com/corona-warn-app/cwa-website/pull/3136
- https://github.com/corona-warn-app/cwa-website/pull/3138
- https://github.com/corona-warn-app/cwa-website/pull/3139
- https://github.com/corona-warn-app/cwa-website/pull/3140
- https://github.com/corona-warn-app/cwa-website/pull/3141
- https://github.com/corona-warn-app/cwa-website/pull/3142
- https://github.com/corona-warn-app/cwa-website/pull/3143
---
Internal Tracking ID: [EXPOSUREAPP-14052](https://jira-ibs.wbs.net.sap/browse/EXPOSUREAPP-14052) | non_priority | release updates update gulpfile ios android update screenshots update release date in intro text update apptoweblinks json update sitemap json and sitemap de json add screenshots page r add blog r update privacy notice add new docs three pdf update release date update faq glossary articles set aktualisiert am updated dates set blog post date in url blog other new or updated content some updates are unrelated to release internal tracking id | 0 |
50,470 | 13,187,511,428 | IssuesEvent | 2020-08-13 03:39:02 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | closed | test ticket for templates (Trac #755) | Migrated from Trac booking defect | this is the ticket body
```text
some
formatted
text
```
<details>
<summary><em>Migrated from https://code.icecube.wisc.edu/ticket/755, reported by nega and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2014-09-05T22:04:25",
"description": "this is the ticket body\n\n{{{\nsome\n formatted\n text\n}}}",
"reporter": "nega",
"cc": "negapluck@gmail.com",
"resolution": "fixed",
"_ts": "1409954665676165",
"component": "booking",
"summary": "test ticket for templates",
"priority": "normal",
"keywords": "",
"time": "2014-09-05T21:57:55",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
| 1.0 | test ticket for templates (Trac #755) - this is the ticket body
```text
some
formatted
text
```
<details>
<summary><em>Migrated from https://code.icecube.wisc.edu/ticket/755, reported by nega and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2014-09-05T22:04:25",
"description": "this is the ticket body\n\n{{{\nsome\n formatted\n text\n}}}",
"reporter": "nega",
"cc": "negapluck@gmail.com",
"resolution": "fixed",
"_ts": "1409954665676165",
"component": "booking",
"summary": "test ticket for templates",
"priority": "normal",
"keywords": "",
"time": "2014-09-05T21:57:55",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
| non_priority | test ticket for templates trac this is the ticket body text some formatted text migrated from reported by nega and owned by nega json status closed changetime description this is the ticket body n n nsome n formatted n text n reporter nega cc negapluck gmail com resolution fixed ts component booking summary test ticket for templates priority normal keywords time milestone owner nega type defect | 0 |
141,310 | 21,479,353,816 | IssuesEvent | 2022-04-26 16:12:10 | flutter/flutter | https://api.github.com/repos/flutter/flutter | closed | Introduce new Android 12 style ink ripple | severe: new feature framework a: animation f: material design a: fidelity proposal passed first triage | Android 12 introduces a [new patterned touch effect](https://www.androidpolice.com/2021/03/24/android-12-dp-2-has-a-sweet-new-textured-ripple-animation-for-taps/) which looks different to the current `InkWell` implementation.
<img src="https://user-images.githubusercontent.com/1096485/118717266-22f9db00-b826-11eb-94bf-b2b96a0bd005.png" width="240"></img>
Flutter will match this with the upcoming Material You, but will we still be able to use the old InkWell once updated? Will we be able to use the old InkWell for older platforms and the new one starting with Android 12?
| 1.0 | Introduce new Android 12 style ink ripple - Android 12 introduces a [new patterned touch effect](https://www.androidpolice.com/2021/03/24/android-12-dp-2-has-a-sweet-new-textured-ripple-animation-for-taps/) which looks different to the current `InkWell` implementation.
<img src="https://user-images.githubusercontent.com/1096485/118717266-22f9db00-b826-11eb-94bf-b2b96a0bd005.png" width="240"></img>
Flutter will match this with the upcoming Material You, but will we still be able to use the old InkWell once updated? Will we be able to use the old InkWell for older platforms and the new one starting with Android 12?
| non_priority | introduce new android style ink ripple android introduces a which looks different to the current inkwell implementation flutter will match this with the upcoming material you but will we still be able to use the old inkwell once updated will we be able to use the old inkwell for older platforms and the new one starting with android | 0 |
41,455 | 8,973,356,363 | IssuesEvent | 2019-01-29 20:48:18 | sbrl/Pepperminty-Wiki | https://api.github.com/repos/sbrl/Pepperminty-Wiki | opened | Mega Enhancement: Syntax Highlighting | Area: Code enhancement | It sounds simple on the surface, but syntax highlighting is proving to be a really thorny issue. The trick is going to be to do it whilst maintaining the following principles of _Pepperminty Wiki_:
1. Everything is in a single file
2. Be compatible with a reasonable number of different web servers & environment setups (though this does _not_ include old PHP versions!)
3. No additional installation steps required (or are done transparently on first load - e.g. creation of `peppermint.json`, `pageindex.json`, etc.)
Initially, I thought that utilising a [phar](https://secure.php.net/manual/en/intro.phar.php) would be a great idea - because we can have our own internal file structure - but present a single file!
Unfortunately, this comes with a bit of a caveat: It requires the `.phar` extension - which isn't usually configured on web servers - breaking point #2.
After about 15 minutes of head-scratching, I've managed to remember the name of a very special PHP function that halts the PHP processor, allowing arbitrary data to be embedded at the end of the file: [`__halt_compiler();`](https://devdocs.io/php/function.halt-compiler). We could pack up our extra dependencies into a compressed archive (zip? .tar.gz? .tar.bz2? we'll have to see what's available).
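The embed-after-a-stop-marker idea described above can be sketched in a few lines. Python is used here purely for illustration (a real PHP implementation would read the payload starting at the `__COMPILER_HALT_OFFSET__` constant instead of scanning for a marker):

```python
import zlib

MARKER = b"\n#__HALT__\n"  # stands in for the __halt_compiler(); boundary

def pack(script: bytes, assets: bytes) -> bytes:
    """Append a compressed asset blob after the executable part of the file."""
    return script + MARKER + zlib.compress(assets)

def unpack(blob: bytes) -> bytes:
    """Recover the embedded assets from a packed single file."""
    _, _, payload = blob.partition(MARKER)
    return zlib.decompress(payload)
```

The executable portion still runs normally, and the highlighter assets travel inside the same file, which is the whole point of the single-file constraint.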
There are other questions too:
- Do we want to do the highlighting server-side or client-side?
- How do we do this in a manner that makes it accessible to _any_ module?
- We can use the current build system to expose a property in the array that's passed to `register_module` that lets modules specify files to embed & unpack on first run - the build system does actually `require()` each module during the build process to build a JSON index file that's used by the packing script. I should probably document this process.
- Perhaps we can download & pack Parsedown etc. here to avoid a first-run download too?
Definitely something to mull over. | 1.0 | Mega Enhancement: Syntax Highlighting - It sounds simple on the surface, but syntax highlighting is proving to be a really thorny issue. The trick is going to be to do it whilst maintaining the following principles of _Pepperminty Wiki_:
1. Everything is in a single file
2. Be compatible with a reasonable number of different web servers & environment setups (though this does _not_ include old PHP versions!)
3. No additional installation steps required (or are done transparently on first load - e.g. creation of `peppermint.json`, `pageindex.json`, etc.)
Initially, I thought that utilising a [phar](https://secure.php.net/manual/en/intro.phar.php) would be a great idea - because we can have our own internal file structure - but present a single file!
Unfortunately, this comes with a bit of a caveat: It requires the `.phar` extension - which isn't usually configured on web servers - breaking point #2.
After about 15 minutes of head-scratching, I've managed to remember the name of a very special PHP function that halts the PHP processor, allowing arbitrary data to be embedded at the end of the file: [`__halt_compiler();`](https://devdocs.io/php/function.halt-compiler). We could pack up our extra dependencies into a compressed archive (zip? .tar.gz? .tar.bz2? we'll have to see what's available).
There are other questions too:
- Do we want to do the highlighting server-side or client-side?
- How do we do this in a manner that makes it accessible to _any_ module?
- We can use the current build system to expose a property in the array that's passed to `register_module` that lets modules specify files to embed & unpack on first run - the build system does actually `require()` each module during the build process to build a JSON index file that's used by the packing script. I should probably document this process.
- Perhaps we can download & pack Parsedown etc. here to avoid a first-run download too?
Definitely something to mull over. | non_priority | mega enhancement syntax highlighting it sounds simple on the surface but syntax highlighting is proving to be a really thorny issue the trick is going to be to do it whilst maintaining the following principles of pepperminty wiki everything is in a single file be compatible with a reasonable number of different web servers environment setups though this does not include old php versions no additional installation steps required or a done transparently on first load e g creation of peppermint json pageindex json etc initially i thought that utilising a woudl be a great idea because we can have our own internal file structure but present a single file unfortunately this comes with a bit of a caveat it requires the phar extension which isn t usually configured on web servers breaking point after about minutes of head scratching i ve managed to remember the name of a very special php function that halts the php processor allowing arbitrary data to be embedded at the end of the file we could pack up our extra dependencies into a compressed archive zip tar gz tar we ll have to see what s available there are other questions too do we want to do the highlighting server side or client side how do we do this in a manner that makes it accessible to any module we can use the current build system to expose a property in the array that s passed to register module that lets modules specify files to embed unpack on first run the build system does actually require each module during the build process to build a json index file that s used by the packing script i should probably document this process perhaps we can download pack parsedown etc here to avoid a first run download too definitely something to mull over | 0 |
14,893 | 3,293,389,385 | IssuesEvent | 2015-10-30 18:42:23 | 18F/web-design-standards | https://api.github.com/repos/18F/web-design-standards | opened | First Column of Illustrator file out of alignment | bug To be triaged visual design | Copied from an issue from @rtwell on the `asets` repo: https://github.com/18F/web-design-standards-assets/issues/16
@mollieru I noticed that the 1. 2. 3. etc listing below the Merriweather headings and body are out of alignment in both the Typography_v.8.ai and the Typography_v.8.eps files. Screen shot attached for your reference.

| 1.0 | First Column of Illustrator file out of alignment - Copied from an issues from @rtwell on the `asets` repo: https://github.com/18F/web-design-standards-assets/issues/16
@mollieru I noticed that the 1. 2. 3. etc listing below the Merriweather headings and body are out of alignment in both the Typography_v.8.ai and the Typography_v.8.eps files. Screen shot attached for your reference.

| non_priority | first column of illustrator file out of alignment copied from an issues from rtwell on the asets repo mollieru i noticed that the etc listing below the merriweather headings and body are out of alignment in both the typography v ai and the typography v eps files screen shot attached for your reference | 0 |
83,916 | 16,389,176,044 | IssuesEvent | 2021-05-17 14:11:50 | FranciscoPark/isco | https://api.github.com/repos/FranciscoPark/isco | opened | Min Stack | Leetcode enhancement | ```Python3
class MinStack:
def __init__(self):
"""
initialize your data structure here.
"""
self.result = []
def push(self, val: int) -> None:
self.result.append(val)
def pop(self) -> None:
self.result.pop()
def top(self) -> int:
if self.result:
return self.result[-1]
def getMin(self) -> int:
if self.result:
answer = sorted(self.result)
return answer[0]
``` | 1.0 | Min Stack - ```Python3
class MinStack:
def __init__(self):
"""
initialize your data structure here.
"""
self.result = []
def push(self, val: int) -> None:
self.result.append(val)
def pop(self) -> None:
self.result.pop()
def top(self) -> int:
if self.result:
return self.result[-1]
def getMin(self) -> int:
if self.result:
answer = sorted(self.result)
return answer[0]
``` | non_priority | min stack class minstack def init self initialize your data structure here self result def push self val int none self result append val def pop self none self result pop def top self int if self result return self result def getmin self int if self result answer sorted self result return answer | 0 |
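The `getMin` in the snippet above re-sorts the whole stack on every call, which costs O(n log n); the standard approach for this problem keeps a running minimum next to each pushed value so all four operations are O(1). A sketch of that variant (the class name is mine, not from the issue):

```python
class MinStackO1:
    """Min stack where push/pop/top/getMin are all O(1).

    Each entry stores a (value, minimum-so-far) pair, so the current
    minimum is always available at the top without re-scanning."""

    def __init__(self):
        self._stack = []  # list of (value, min_so_far) pairs

    def push(self, val: int) -> None:
        current_min = min(val, self._stack[-1][1]) if self._stack else val
        self._stack.append((val, current_min))

    def pop(self) -> None:
        self._stack.pop()

    def top(self) -> int:
        return self._stack[-1][0]

    def getMin(self) -> int:
        return self._stack[-1][1]
```

Pushing the pair trades a little memory for constant-time `getMin`, which is what the problem statement ultimately asks for.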
247,072 | 20,955,989,438 | IssuesEvent | 2022-03-27 05:07:11 | cricarba/isolucionStatus | https://api.github.com/repos/cricarba/isolucionStatus | closed | 🛑 saitugsTest.isolucion.co is down | status saitugs-test-isolucion-co | In [`79aaef1`](https://github.com/cricarba/isolucionStatus/commit/79aaef15090c8860ee17eca397d3b103ca628635
), saitugsTest.isolucion.co (https://saitugsTest.isolucion.co) was **down**:
- HTTP code: 0
- Response time: 0 ms
| 1.0 | 🛑 saitugsTest.isolucion.co is down - In [`79aaef1`](https://github.com/cricarba/isolucionStatus/commit/79aaef15090c8860ee17eca397d3b103ca628635
), saitugsTest.isolucion.co (https://saitugsTest.isolucion.co) was **down**:
- HTTP code: 0
- Response time: 0 ms
| non_priority | 🛑 saitugstest isolucion co is down in saitugstest isolucion co was down http code response time ms | 0 |
16,399 | 2,614,908,530 | IssuesEvent | 2015-03-01 00:12:14 | chrsmith/google-api-java-client | https://api.github.com/repos/chrsmith/google-api-java-client | opened | Report all shared documents in a domain | auto-migrated Priority-Medium Type-Sample | ```
Using OAuth to authorise the connection for a service account, I would like to
be able to get a report of all shared documents for all documents in the domain.
Going forward, the report needs to show
all documents shared outside of the domain
all documents shared to all members of the domain
all documents shared with individual users or groups
Hope you can help!
```
Original issue reported on code.google.com by `thegreen...@gmail.com` on 27 Mar 2013 at 9:18 | 1.0 | Report all shared documents in a domain - ```
Using OAuth to authorise the connection for a service account, I would like to
be able to get a report of all shared documents for all documents in the domain.
Going forward, the report needs to show
all documents shared outside of the domain
all documents shared to all members of the domain
all documents shared with individual users or groups
Hope you can help!
```
Original issue reported on code.google.com by `thegreen...@gmail.com` on 27 Mar 2013 at 9:18 | priority | report all shared documents in a domain using oauth to authorise the connection for a service account i would like to be able to get a report of all shared documents for all documents in the domain going forward the report needs to show all documents shared outside of the domain all documents shared to all members of the domain all documents shared with individual users or groups hope you can help original issue reported on code google com by thegreen gmail com on mar at | 1 |
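The three report buckets the request describes map directly onto the `type`/`domain` fields of a Drive permission resource. A hedged Python sketch of just the classification step (the permission dicts are assumed to be shaped like the Drive v3 API's `type`/`domain`/`emailAddress` fields; fetching them with an OAuth service account is out of scope here):

```python
def classify_permission(perm: dict, home_domain: str) -> str:
    """Bucket one Drive-style permission dict for the sharing report.

    Returns one of: 'outside-domain', 'whole-domain', 'user-or-group'.
    Field names assume the Drive v3 permission resource shape."""
    ptype = perm.get("type")
    if ptype == "anyone":
        return "outside-domain"
    if ptype == "domain":
        return "whole-domain" if perm.get("domain") == home_domain else "outside-domain"
    # user / group permissions: in-domain addresses count as individual shares
    email = perm.get("emailAddress", "")
    return "user-or-group" if email.endswith("@" + home_domain) else "outside-domain"
```

A report would then iterate every document's permission list and group documents by the worst bucket found.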
34,293 | 9,329,495,888 | IssuesEvent | 2019-03-28 02:38:03 | JordanMartinez/purescript-jordans-reference | https://api.github.com/repos/JordanMartinez/purescript-jordans-reference | closed | Split the 'Dependency Managers' file into separate files | Build-Tools major-breaking-change | This file should be broken up into 3 files:
- basic overview of dependency managers
- Bower file
- psc-package / spago file
The files specific to each dependency manager should explain the workflow according to their intended audience:
- library developer (bower)
- application developer (psc-package / spago) | 1.0 | Split the 'Dependency Managers' file into separate files - This file should be broken up into 3 files:
- basic overview of dependency managers
- Bower file
- psc-package / spago file
The files specific to each dependency manager should explain the workflow according to their intended audience:
- library developer (bower)
- application developer (psc-package / spago) | non_priority | split the dependency managers file into separate files this file should be broken up into files basic overview of dependency managers bower file psc package spago file the files specific to each dependency manager should explain the workflow according to their intended audience library developer bower application developer psc package spago | 0 |
59,561 | 3,114,316,291 | IssuesEvent | 2015-09-03 07:59:06 | ceylon/ceylon.language | https://api.github.com/repos/ceylon/ceylon.language | closed | (0.0/0.0).string is "NaN.0" in JS | BUG high priority | Given:
print((0.0/0.0).string);
Ceylon JS prints:
NaN.0 | 1.0 | (0.0/0.0).string is "NaN.0" in JS - Given:
print((0.0/0.0).string);
Ceylon JS prints:
NaN.0 | priority | string is nan in js given print string ceylon js prints nan | 1 |
299,830 | 25,929,868,408 | IssuesEvent | 2022-12-16 09:04:28 | proarc/proarc-client | https://api.github.com/repos/proarc/proarc-client | closed | Po uložení změny v popisu stran se ukáže prázdná obrazovka (import i vazby) | 1 chyba 6 k testování 7 návrh na zavření 6c otestováno: KNAV 6e otestováno: SVKHK | Po uložení změny v popisu stran (číslo strany, typ strany apod.) se objeví prázdná obrazovka. Ta změna ale proběhne, po refreshi už je to OK. Týká se importu i vazeb.
Narazili jsme na to začátkem týdne, mysleli jsme, že to je jen chvilkové, jak se opravovalo hodně věcí najednou, tak jsme čekali, ale dělá to pořád. Takhle hezké to je:

| 3.0 | Po uložení změny v popisu stran se ukáže prázdná obrazovka (import i vazby) - Po uložení změny v popisu stran (číslo strany, typ strany apod.) se objeví prázdná obrazovka. Ta změna ale proběhne, po refreshi už je to OK. Týká se importu i vazeb.
Narazili jsme na to začátkem týdne, mysleli jsme, že to je jen chvilkové, jak se opravovalo hodně věcí najednou, tak jsme čekali, ale dělá to pořád. Takhle hezké to je:

| non_priority | po uložení změny v popisu stran se ukáže prázdná obrazovka import i vazby po uložení změny v popisu stran číslo strany typ strany apod se objeví prázdná obrazovka ta změna ale proběhne po refreshi už je to ok týká se importu i vazeb narazili jsme na to začátkem týdne mysleli jsme že to je jen chvilkové jak se opravovalo hodně věcí najednou tak jsme čekali ale dělá to pořád takhle hezké to je | 0 |
247,069 | 18,857,259,150 | IssuesEvent | 2021-11-12 08:22:40 | yongxiangng/pe | https://api.github.com/repos/yongxiangng/pe | opened | UserGuide does not mention that I'm not allowed to add person of similar name | type.DocumentationBug severity.VeryLow |

I see that the team treats Alex Yeoh the same as alex yeoh. Which I'm fine (as the module website does talk about this as well). However, I think this behavior should be documented somewhere, maybe in the UserGuide, or have a different kind of error message instead (a warning).

The module recommends that the decision be left to the user, but have them know the risks with a warning message.
<!--session: 1636701956891-9f95e17f-a619-4f1a-9b23-67606bd825c9-->
<!--Version: Web v3.4.1--> | 1.0 | UserGuide does not mention that I'm not allowed to add person of similar name -

I see that the team treats Alex Yeoh the same as alex yeoh. Which I'm fine (as the module website does talk about this as well). However, I think this behavior should be documented somewhere, maybe in the UserGuide, or have a different kind of error message instead (a warning).

The module recommends that the decision be left to the user, but have them know the risks with a warning message.
<!--session: 1636701956891-9f95e17f-a619-4f1a-9b23-67606bd825c9-->
<!--Version: Web v3.4.1--> | non_priority | userguide does not mention that i m not allowed to add person of similar name i see that the team treats alex yeoh the same as alex yeoh which i m fine as the module website does talk about this as well however i think this behavior should be documented somewhere maybe in the userguide or have a different kind of error message instead a warning the module recommends that the decision be left to the user but have them know the risks with a warning message | 0 |
818,112 | 30,671,473,831 | IssuesEvent | 2023-07-25 23:04:17 | AZMAG/map-Wickenburg | https://api.github.com/repos/AZMAG/map-Wickenburg | opened | Update Floodways layer | Status: In-Progress Priority: High Issue: Maintenance | 1. For the Flood Zone layer, it looks like only the “FW” zone is showing now. The map should also include the other zones in a single group (A, AE, etc).
2. The layer called Flood Zone Pending should be removed from the map. These appear to have been adopted by FEMA.
3. In the legend for Flood Zone, we should have two classes, as we do now, but shown thusly:
a. Floodway
b. 100 Year Floodplain
4. The Flood Zone Definitions link can probably stay since the definitions show up if you identify on the flood zones. However, in this dialog the info on the Pending Flood Zone should be commented out.
| 1.0 | Update Floodways layer - 1. For the Flood Zone layer, it looks like only the “FW” zone is showing now. The map should also include the other zones in a single group (A, AE, etc).
2. The layer called Flood Zone Pending should be removed from the map. These appear to have been adopted by FEMA.
3. In the legend for Flood Zone, we should have two classes, as we do now, but shown thusly:
a. Floodway
b. 100 Year Floodplain
4. The Flood Zone Definitions link can probably stay since the definitions show up if you identify on the flood zones. However, in this dialog the info on the Pending Flood Zone should be commented out.
| priority | update floodways layer for the flood zone layer it looks like only the “fw” zone is showing now the map should also include the other zones in a single group a ae etc the layer called flood zone pending should be removed from the map these appear to have been adopted by fema in the legend for flood zone we should have two classes as we do now but shown thusly a floodway b year floodplain the flood zone definitions link can probably stay since the definitions show up if you identify on the flood zones however in this dialog the info on the pending flood zone should be commented out | 1 |
4,693 | 3,876,264,435 | IssuesEvent | 2016-04-12 07:02:48 | lionheart/openradar-mirror | https://api.github.com/repos/lionheart/openradar-mirror | opened | 21794005: Text jumps in UITextField when using custom font | classification:ui/usability reproducible:always status:open | #### Description
Summary:
When a textfield, with a custom font, becomes first responder the text jumps up slightly. When it resign firsts responder the text again jumps down slightly.
Steps to Reproduce:
1. Create a textfield
2. Set a custom font (e.g. Papyrus, 13.0)
Expected Results:
The text not to jump when I start/stop editing
Actual Results:
The text jumps
Notes:
Some suggestions on StackOverflow said that the problem doesn’t occur when the UITextField is initialized in code. The second tab in the sample project shows that this isn’t true. The order in which the text and custom font are set also doesn’t matter.
-
Product Version: 8.0
Created: 2015-07-13T16:08:43.737170
Originated: 2015-07-13T12:08:00
Open Radar Link: http://www.openradar.me/21794005 | True | 21794005: Text jumps in UITextField when using custom font - #### Description
Summary:
When a textfield, with a custom font, becomes first responder the text jumps up slightly. When it resign firsts responder the text again jumps down slightly.
Steps to Reproduce:
1. Create a textfield
2. Set a custom font (e.g. Papyrus, 13.0)
Expected Results:
The text not to jump when I start/stop editing
Actual Results:
The text jumps
Notes:
Some suggestions on StackOverflow said that the problem doesn’t occur when the UITextField is initialized in code. The second tab in the sample project shows that this isn’t true. The order in which the text and custom font are set also doesn’t matter.
-
Product Version: 8.0
Created: 2015-07-13T16:08:43.737170
Originated: 2015-07-13T12:08:00
Open Radar Link: http://www.openradar.me/21794005 | non_priority | text jumps in uitextfield when using custom font description summary when a textfield with a custom font becomes first responder the text jumps up slightly when it resign firsts responder the text again jumps down slightly steps to reproduce create a textfield set a custom font e g papyrus expected results the text not to jump when i start stop editing actual results the text jumps notes some suggestions on stackoverflow said that the problem doesn’t occur when the uitextfield is initialized in code the second tab in the sample project shows that this isn’t true the order in which the text and custom font are set also doesn’t matter product version created originated open radar link | 0 |
5,593 | 7,241,705,752 | IssuesEvent | 2018-02-14 02:49:34 | dotnet/project-system | https://api.github.com/repos/dotnet/project-system | closed | Bogus "missing a using directive or an assembly reference" error in .Net Standard project | Area-New-Project-System Bug Feature - Language Service | Ported from a customer reported bug (550403).
On CPS side, we have done a chance to allow rules to specify the building order of targets. To fix this issue after taking the CPS change, you just need change the 'CompileDesignTime' rule, and set an Order number (property on the rule), to a value between 1 - 4999. (suggest to set it to 100). A target specified in a rule with a smaller order number will be built first. When the number is not specified or it is 0, it will be treated as 5000 (to prevent using negative numbers in other rules.)
Steps to reproduce:
Create two sample .Net standard class libraries targeting to .NET standard 2.0 in VS 15.5.2. For our reference name one project as ProjectA and other one as ProjectB which will have ProjectAClass and ProjectBClass classes respectively.
Now add reference to ProjectB from ProjectA project and create an instance of ProjectBClass under ProjectAClass as below…
public class ProjectA
{
ProjectB.ProjectBClass obj = new ProjectB.ProjectBClass ();
}
Everything is fine now and project builds successfully, intellisense and go-to definition feature works as expected.
Now add any COM reference from COM tab in ProjectA. For example first item under the COM tab Accessibility. Now you will see errors in the error window.
| 1.0 | Bogus "missing a using directive or an assembly reference" error in .Net Standard project - Ported from a customer reported bug (550403).
On CPS side, we have done a chance to allow rules to specify the building order of targets. To fix this issue after taking the CPS change, you just need change the 'CompileDesignTime' rule, and set an Order number (property on the rule), to a value between 1 - 4999. (suggest to set it to 100). A target specified in a rule with a smaller order number will be built first. When the number is not specified or it is 0, it will be treated as 5000 (to prevent using negative numbers in other rules.)
Steps to reproduce:
Create two sample .Net standard class libraries targeting to .NET standard 2.0 in VS 15.5.2. For our reference name one project as ProjectA and other one as ProjectB which will have ProjectAClass and ProjectBClass classes respectively.
Now add reference to ProjectB from ProjectA project and create an instance of ProjectBClass under ProjectAClass as below…
public class ProjectA
{
ProjectB.ProjectBClass obj = new ProjectB.ProjectBClass ();
}
Everything is fine now and project builds successfully, intellisense and go-to definition feature works as expected.
Now add any COM reference from COM tab in ProjectA. For example first item under the COM tab Accessibility. Now you will see errors in the error window.
| non_priority | bogus missing a using directive or an assembly reference error in net standard project ported from a customer reported bug on cps side we have done a chance to allow rules to specify the building order of targets to fix this issue after taking the cps change you just need change the compiledesigntime rule and set an order number property on the rule to a value between suggest to set it to a target specified in a rule with a smaller order number will be built first when the number is not specified or it is it will be treated as to prevent using negative numbers in other rules steps to reproduce create two sample net standard class libraries targeting to net standard in vs for our reference name one project as projecta and other one as projectb which will have projectaclass and projectbclass classes respectively now add reference to projectb from projecta project and create an instance of projectbclass under projectaclass as below… public class projecta projectb projectbclass obj new projectb projectbclass everything is fine now and project builds successfully intellisense and go to definition feature works as expected now add any com reference from com tab in projecta for example first item under the com tab accessibility now you will see errors in the error window | 0 |
743,618 | 25,907,402,891 | IssuesEvent | 2022-12-15 11:18:50 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | studentaid.gov - site is not usable | priority-normal browser-focus-geckoview engine-gecko | <!-- @browser: Firefox Mobile 107.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 11; Mobile; rv:107.0) Gecko/107.0 Firefox/107.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/115484 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://studentaid.gov/fsa-id/create-account/communication-prefs
**Browser / Version**: Firefox Mobile 107.0
**Operating System**: Android 11
**Tested Another Browser**: Yes Chrome
**Problem type**: Site is not usable
**Description**: Unable to type
**Steps to Reproduce**:
Stalled half way through account set up
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/12/4fe908e7-f64d-4849-9602-6e481c228d50.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20221110173214</li><li>channel: release</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/12/f749df1a-c515-415a-9001-7c14a8cc5aac)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | studentaid.gov - site is not usable - <!-- @browser: Firefox Mobile 107.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 11; Mobile; rv:107.0) Gecko/107.0 Firefox/107.0 -->
<!-- @reported_with: android-components-reporter -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/115484 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://studentaid.gov/fsa-id/create-account/communication-prefs
**Browser / Version**: Firefox Mobile 107.0
**Operating System**: Android 11
**Tested Another Browser**: Yes Chrome
**Problem type**: Site is not usable
**Description**: Unable to type
**Steps to Reproduce**:
Stalled half way through account set up
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/12/4fe908e7-f64d-4849-9602-6e481c228d50.jpeg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20221110173214</li><li>channel: release</li><li>hasTouchScreen: true</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li>
</ul>
</details>
[View console log messages](https://webcompat.com/console_logs/2022/12/f749df1a-c515-415a-9001-7c14a8cc5aac)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | studentaid gov site is not usable url browser version firefox mobile operating system android tested another browser yes chrome problem type site is not usable description unable to type steps to reproduce stalled half way through account set up view the screenshot img alt screenshot src browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel release hastouchscreen true mixed active content blocked false mixed passive content blocked false tracking content blocked false from with ❤️ | 1 |
151,393 | 23,810,726,233 | IssuesEvent | 2022-09-04 18:18:04 | Zakrok09/RoTUer | https://api.github.com/repos/Zakrok09/RoTUer | closed | [DESIGN] Remove landing site from Dashboard | design improvement | **Where do you see the problem?**
I have to scroll a lot in the dashboard
Suggested by: Marti | 1.0 | [DESIGN] Remove landing site from Dashboard - **Where do you see the problem?**
I have to scroll a lot in the dashboard
Suggested by: Marti | non_priority | remove landing site from dashboard where do you see the problem i have to scroll a lot in the dashboard suggested by marti | 0 |
507,185 | 14,679,933,388 | IssuesEvent | 2020-12-31 08:31:55 | k8smeetup/website-tasks | https://api.github.com/repos/k8smeetup/website-tasks | opened | /docs/concepts/storage/storage-limits.md | lang/zh priority/P0 sync/update version/master welcome | Source File: [/docs/concepts/storage/storage-limits.md](https://github.com/kubernetes/website/blob/master/content/en/docs/concepts/storage/storage-limits.md)
Diff 命令参考:
```bash
# 查看原始文档与翻译文档更新差异
git diff --no-index -- content/en/docs/concepts/storage/storage-limits.md content/zh/docs/concepts/storage/storage-limits.md
# 跨分支持查看原始文档更新差异
git diff release-1.19 master -- content/en/docs/concepts/storage/storage-limits.md
``` | 1.0 | /docs/concepts/storage/storage-limits.md - Source File: [/docs/concepts/storage/storage-limits.md](https://github.com/kubernetes/website/blob/master/content/en/docs/concepts/storage/storage-limits.md)
Diff 命令参考:
```bash
# 查看原始文档与翻译文档更新差异
git diff --no-index -- content/en/docs/concepts/storage/storage-limits.md content/zh/docs/concepts/storage/storage-limits.md
# 跨分支持查看原始文档更新差异
git diff release-1.19 master -- content/en/docs/concepts/storage/storage-limits.md
``` | priority | docs concepts storage storage limits md source file diff 命令参考 bash 查看原始文档与翻译文档更新差异 git diff no index content en docs concepts storage storage limits md content zh docs concepts storage storage limits md 跨分支持查看原始文档更新差异 git diff release master content en docs concepts storage storage limits md | 1 |
505,741 | 14,644,480,949 | IssuesEvent | 2020-12-25 23:56:49 | pterodactyl/panel | https://api.github.com/repos/pterodactyl/panel | reopened | Transfer Databases that are Created on a Server across Database Hosts | enhancement sponsor priority 🥇 | **Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
Sort of. When you transfer a server to a new node, for example, there is no way to transfer the mysql databases on that server to a new database host.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
Either when transferring a server to a new node (or even just in general without needing to transfer a server) being able to click on a button to transfer databases across hosts from the panel would be extremely useful.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
Currently, the alternative is manually transferring databases across hosts (through phpmyadmin is probably easiest, especially for people unfamiliar with mysql, which is obviously very time consuming if you need to upgrade all servers on an entire node)
**Additional context**
Add any other context or screenshots about the feature request here.
Nothing else. | 1.0 | Transfer Databases that are Created on a Server across Database Hosts - **Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
Sort of. When you transfer a server to a new node, for example, there is no way to transfer the mysql databases on that server to a new database host.
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
Either when transferring a server to a new node (or even just in general without needing to transfer a server) being able to click on a button to transfer databases across hosts from the panel would be extremely useful.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
Currently, the alternative is manually transferring databases across hosts (through phpmyadmin is probably easiest, especially for people unfamiliar with mysql, which is obviously very time consuming if you need to upgrade all servers on an entire node)
**Additional context**
Add any other context or screenshots about the feature request here.
Nothing else. | priority | transfer databases that are created on a server across database hosts is your feature request related to a problem please describe a clear and concise description of what the problem is ex i m always frustrated when sort of when you transfer a server to a new node for example there is no way to transfer the mysql databases on that server to a new database host describe the solution you d like a clear and concise description of what you want to happen either when transferring a server to a new node or even just in general without needing to transfer a server being able to click on a button to transfer databases across hosts from the panel would be extremely useful describe alternatives you ve considered a clear and concise description of any alternative solutions or features you ve considered currently the alternative is manually transferring databases across hosts through phpmyadmin is probably easiest especially for people unfamiliar with mysql which is obviously very time consuming if you need to upgrade all servers on an entire node additional context add any other context or screenshots about the feature request here nothing else | 1 |
265,139 | 28,244,690,540 | IssuesEvent | 2023-04-06 09:49:15 | hshivhare67/platform_packages_apps_settings_AOSP10_r33 | https://api.github.com/repos/hshivhare67/platform_packages_apps_settings_AOSP10_r33 | opened | CVE-2021-0600 (High) detected in Settingsandroid-10.0.0_r33 | Mend: dependency security vulnerability | ## CVE-2021-0600 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Settingsandroid-10.0.0_r33</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/packages/apps/Settings>https://android.googlesource.com/platform/packages/apps/Settings</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/platform_packages_apps_settings_AOSP10_r33/commit/cdc44b74ac73d9c7eed82d7e753aba9efedac279">cdc44b74ac73d9c7eed82d7e753aba9efedac279</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/src/com/android/settings/applications/specialaccess/deviceadmin/DeviceAdminAdd.java</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In onCreate of DeviceAdminAdd.java, there is a possible way to mislead a user to activate a device admin app due to improper input validation. This could lead to local escalation of privilege with no additional execution privileges needed. User interaction is needed for exploitation.Product: AndroidVersions: Android-8.1 Android-9 Android-10 Android-11Android ID: A-179042963
<p>Publish Date: 2021-07-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-0600>CVE-2021-0600</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://source.android.com/security/bulletin/2021-07-01">https://source.android.com/security/bulletin/2021-07-01</a></p>
<p>Release Date: 2020-11-07</p>
<p>Fix Resolution: android-11.0.0_r39</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-0600 (High) detected in Settingsandroid-10.0.0_r33 - ## CVE-2021-0600 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Settingsandroid-10.0.0_r33</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/packages/apps/Settings>https://android.googlesource.com/platform/packages/apps/Settings</a></p>
<p>Found in HEAD commit: <a href="https://github.com/hshivhare67/platform_packages_apps_settings_AOSP10_r33/commit/cdc44b74ac73d9c7eed82d7e753aba9efedac279">cdc44b74ac73d9c7eed82d7e753aba9efedac279</a></p>
<p>Found in base branch: <b>main</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/src/com/android/settings/applications/specialaccess/deviceadmin/DeviceAdminAdd.java</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In onCreate of DeviceAdminAdd.java, there is a possible way to mislead a user to activate a device admin app due to improper input validation. This could lead to local escalation of privilege with no additional execution privileges needed. User interaction is needed for exploitation.Product: AndroidVersions: Android-8.1 Android-9 Android-10 Android-11Android ID: A-179042963
<p>Publish Date: 2021-07-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-0600>CVE-2021-0600</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://source.android.com/security/bulletin/2021-07-01">https://source.android.com/security/bulletin/2021-07-01</a></p>
<p>Release Date: 2020-11-07</p>
<p>Fix Resolution: android-11.0.0_r39</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in settingsandroid cve high severity vulnerability vulnerable library settingsandroid library home page a href found in head commit a href found in base branch main vulnerable source files src com android settings applications specialaccess deviceadmin deviceadminadd java vulnerability details in oncreate of deviceadminadd java there is a possible way to mislead a user to activate a device admin app due to improper input validation this could lead to local escalation of privilege with no additional execution privileges needed user interaction is needed for exploitation product androidversions android android android android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with mend | 0 |
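The rows above all carry a `label` column (`priority` / `non_priority`) and a matching numeric `binary_label` (1 / 0). As a hypothetical sketch (not part of the dataset itself), assuming the records are exported to a CSV with those column names, they could be loaded and filtered like this; the inline CSV below is a made-up miniature sample echoing a few of the titles shown above:

```python
# Sketch: load rows with the schema shown in this preview and keep only
# issues flagged as priority (binary_label == 1). The CSV text is a
# hypothetical three-row sample, not the real dataset file.
import csv
import io

csv_text = """label,binary_label,title
priority,1,Campaign Collections are missing content on front end
non_priority,0,Split the 'Dependency Managers' file into separate files
priority,1,studentaid.gov - site is not usable
"""

# DictReader yields one dict per data line, keyed by the header row
rows = list(csv.DictReader(io.StringIO(csv_text)))

# csv returns every field as a string, so compare against "1"
priority = [r for r in rows if r["binary_label"] == "1"]
print(len(priority))  # prints 2
```

For the real file, `io.StringIO(csv_text)` would be replaced by an open file handle; the filtering logic stays the same.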