Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 1k | labels stringlengths 4 1.38k | body stringlengths 1 262k | index stringclasses 16 values | text_combine stringlengths 96 262k | label stringclasses 2 values | text stringlengths 96 252k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
337,848 | 10,220,226,065 | IssuesEvent | 2019-08-15 20:44:31 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | www.nytimes.com - desktop site instead of mobile site | browser-fenix engine-gecko priority-important | <!-- @browser: Firefox Mobile 69.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:69.0) Gecko/69.0 Firefox/69.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.nytimes.com/2003/10/26/books/who-killed-mary-phagan.html
**Browser / Version**: Firefox Mobile 69.0
**Operating System**: Android
**Tested Another Browser**: No
**Problem type**: Desktop site instead of mobile site
**Description**: claims that I am in private mode, thus can't read the article
**Steps to Reproduce**:
I copied the link into Firefox Preview and went to the website. Shortly after loading, a pop-up window appeared and told me I could not read this article in private mode.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | www.nytimes.com - desktop site instead of mobile site - <!-- @browser: Firefox Mobile 69.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:69.0) Gecko/69.0 Firefox/69.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.nytimes.com/2003/10/26/books/who-killed-mary-phagan.html
**Browser / Version**: Firefox Mobile 69.0
**Operating System**: Android
**Tested Another Browser**: No
**Problem type**: Desktop site instead of mobile site
**Description**: claims that I am in private mode, thus can't read the article
**Steps to Reproduce**:
I copied the link into Firefox Preview and went to the website. Shortly after loading, a pop-up window appeared and told me I could not read this article in private mode.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | desktop site instead of mobile site url browser version firefox mobile operating system android tested another browser no problem type desktop site instead of mobile site description claims that i am in private mode thus can t read the article steps to reproduce i copied the link into firefox preview and went on the website shortly after loading a popup window appeared and told me i could not read this article in private mode browser configuration none from with ❤️ | 1 |
24,605 | 2,669,425,474 | IssuesEvent | 2015-03-23 15:26:48 | aseprite/aseprite | https://api.github.com/repos/aseprite/aseprite | closed | Move colours in palette editor | colorbar enhancement imported medium priority | _From [Corporal...@gmail.com](https://code.google.com/u/103014587505849798163/) on July 31, 2011 18:08:24_
What do you need to do? Move palette colours in the palette editor to different positions, making it easier to manually sort palettes. How would you like to do it? Using the arrow keys, which are presently not used by the palette editor.
_Original issue: http://code.google.com/p/aseprite/issues/detail?id=37_ | 1.0 | Move colours in palette editor - _From [Corporal...@gmail.com](https://code.google.com/u/103014587505849798163/) on July 31, 2011 18:08:24_
What do you need to do? Move palette colours in the palette editor to different positions, making it easier to manually sort palettes. How would you like to do it? Using the arrow keys, which are presently not used by the palette editor.
_Original issue: http://code.google.com/p/aseprite/issues/detail?id=37_ | priority | move colours in palette editor from on july what do you need to do move palette colours in the palette editor to different positions making it easier to manually sort palettes how would you like to do it using the arrow keys which are presently not used by the palette editor original issue | 1 |
152,286 | 13,450,411,240 | IssuesEvent | 2020-09-08 18:29:08 | ossia/score | https://api.github.com/repos/ossia/score | closed | JS in v3 | documentation ergonomy | Values from a vector fed into a JS inlet are tricky to retrieve.
Calling value on an inlet receiving an array of 2 values returns
```
Debug: QVariant(QVector2D, QVector2D(0.5, 0.283333))
```
But neither .x nor [0] returns anything | 1.0 | JS in v3 - Values from a vector fed into a JS inlet are tricky to retrieve.
Calling value on an inlet receiving an array of 2 values returns
```
Debug: QVariant(QVector2D, QVector2D(0.5, 0.283333))
```
But neither .x nor [0] returns anything | non_priority | js in values from a vector fed into an js inlet are tricky to retrieve calling value on an inlet receiving an array of value returns debug qvariant but neither x or returns anything | 0 |
161,233 | 25,308,789,735 | IssuesEvent | 2022-11-17 15:55:40 | kodadot/nft-gallery | https://api.github.com/repos/kodadot/nft-gallery | closed | Redesign toast notification | $ p3 redesign | It can be triggered whenever you copy address for example
for example here Share -> Copy Link
https://deploy-preview-4330--koda-nuxt.netlify.app/bsx/gallery/2548799063-43?redesign=true
<img width="291" alt="image" src="https://user-images.githubusercontent.com/5887929/202200464-cea24880-df12-4872-9b34-757176fa98dd.png">
As always @exezbcz will drop some designs 😎✨ | 1.0 | Redesign toast notification - It can be triggered whenever you copy address for example
for example here Share -> Copy Link
https://deploy-preview-4330--koda-nuxt.netlify.app/bsx/gallery/2548799063-43?redesign=true
<img width="291" alt="image" src="https://user-images.githubusercontent.com/5887929/202200464-cea24880-df12-4872-9b34-757176fa98dd.png">
As always @exezbcz will drop some designs 😎✨ | non_priority | redesign toast notification it can be triggered whenever you copy address for example for example here share copy link img width alt image src as always exezbcz will drop some designs 😎✨ | 0 |
52,095 | 27,370,444,384 | IssuesEvent | 2023-02-27 22:58:09 | mxmlnkn/pragzip | https://api.github.com/repos/mxmlnkn/pragzip | opened | Add optimized file access that also works for non-seekable files | enhancement performance | There are some use cases that want to create the index while downloading, so something like `wget | tee downloaded-file | pragzip --export-index downlaoded-file.gzindex`.
I think this should be solvable with a new non-seekable `FileReader` derived class with these two main ideas:
- A working implementation could simply cache the whole file on-demand in memory. This way seeking will always work.
- This obviously would use up too much memory for very large files. To remedy that an interface would be required to mark everything before an offset as not needed anymore.
- This way seeking will always work as long as the caller does not try to access parts that he marked himself as not needed anymore!
When processing a block everything before the end of its compressed offset can then be marked as to be dropped!
Assuming that https://github.com/mxmlnkn/ratarmount/issues/106 is caused by the non-sequential file access, then this addition might also fix that.. Assuming I could use the non-seekable file reader during index creation. From the outside I know that I only have to go over the file sequentially but I would also require a Python interface reflecting that. It might be easier to do the gzip index creation as a separate step / pass. This way I could also implement a version that does not actually decompress anything but only gathers index seek points. That way memory usage would be limited even without implementing #2! This idea would be implementable with the refactoring done for #11. It just needs another specialized `ChunkData` subclass. | True | Add optimized file access that also works for non-seekable files - There are some use cases that want to create the index while downloading, so something like `wget | tee downloaded-file | pragzip --export-index downlaoded-file.gzindex`.
I think this should be solvable with a new non-seekable `FileReader` derived class with these two main ideas:
- A working implementation could simply cache the whole file on-demand in memory. This way seeking will always work.
- This obviously would use up too much memory for very large files. To remedy that an interface would be required to mark everything before an offset as not needed anymore.
- This way seeking will always work as long as the caller does not try to access parts that he marked himself as not needed anymore!
When processing a block everything before the end of its compressed offset can then be marked as to be dropped!
Assuming that https://github.com/mxmlnkn/ratarmount/issues/106 is caused by the non-sequential file access, then this addition might also fix that.. Assuming I could use the non-seekable file reader during index creation. From the outside I know that I only have to go over the file sequentially but I would also require a Python interface reflecting that. It might be easier to do the gzip index creation as a separate step / pass. This way I could also implement a version that does not actually decompress anything but only gathers index seek points. That way memory usage would be limited even without implementing #2! This idea would be implementable with the refactoring done for #11. It just needs another specialized `ChunkData` subclass. | non_priority | add optimized file access that also works for non seekable files there are some use cases that want to create the index while downloading so something like wget tee downloaded file pragzip export index downlaoded file gzindex i think this should be solvable with a new non seekable filereader derived class with these two main ideas a working implementation could simply cache the whole file on demand in memory this way seeking will always work this obviously would use up too much memory for very large files to remedy that an interface would be required to mark everything before an offset as not needed anymore this way seeking will always work as long as the caller does not try to access parts that he marked himself as not needed anymore when processing a block everything before the end of its compressed offset can then be marked as to be dropped assuming that is caused by the non sequential file access then this addition might also fix that assuming i could use the non seekable file reader during index creation from the outside i know that i only have to go over the file sequentially but i would also require a python interface reflecting that it might be easier to do the gzip index creation as a separate step pass this way i could also implement a version that does not actually decompress anything but only gathers index seek points that way memory usage would be limited even without implementing this idea would be implementable with the refactoring done for it just needs another specialized chunkdata subclass | 0 |
16,108 | 6,105,540,365 | IssuesEvent | 2017-06-21 00:10:24 | Linuxbrew/homebrew-core | https://api.github.com/repos/Linuxbrew/homebrew-core | closed | Error: brew install docker-compose | build-error | Tried `brew update` twice and `brew doctor` to no avail.
Gist log: https://gist.github.com/anonymous/cf2c93921517039a6fe56a78af7d555d
Screenshot of error:

| 1.0 | Error: brew install docker-compose - Tried `brew update` twice and `brew doctor` to no avail.
Gist log: https://gist.github.com/anonymous/cf2c93921517039a6fe56a78af7d555d
Screenshot of error:

| non_priority | error brew install docker compose tried brew update twice and brew doctor to no avail gist log screenshot of error | 0 |
54,901 | 13,942,787,367 | IssuesEvent | 2020-10-22 21:39:27 | Whizkevina/uchi-sidebar-clone | https://api.github.com/repos/Whizkevina/uchi-sidebar-clone | opened | CVE-2019-10742 (High) detected in axios-0.17.1.tgz | security vulnerability | ## CVE-2019-10742 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>axios-0.17.1.tgz</b></p></summary>
<p>Promise based HTTP client for the browser and node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/axios/-/axios-0.17.1.tgz">https://registry.npmjs.org/axios/-/axios-0.17.1.tgz</a></p>
<p>Path to dependency file: uchi-sidebar-clone/package.json</p>
<p>Path to vulnerable library: uchi-sidebar-clone/node_modules/analytics-node/node_modules/axios/package.json</p>
<p>
Dependency Hierarchy:
- analytics-node-3.3.0.tgz (Root Library)
- :x: **axios-0.17.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Whizkevina/uchi-sidebar-clone/commit/5405eeecb088ab7acf45ef51e052988d72c3fe7f">5405eeecb088ab7acf45ef51e052988d72c3fe7f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Axios up to and including 0.18.0 allows attackers to cause a denial of service (application crash) by continuing to accepting content after maxContentLength is exceeded.
<p>Publish Date: 2019-05-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10742>CVE-2019-10742</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/axios/axios/issues/1098">https://github.com/axios/axios/issues/1098</a></p>
<p>Release Date: 2019-05-31</p>
<p>Fix Resolution: 0.19.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-10742 (High) detected in axios-0.17.1.tgz - ## CVE-2019-10742 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>axios-0.17.1.tgz</b></p></summary>
<p>Promise based HTTP client for the browser and node.js</p>
<p>Library home page: <a href="https://registry.npmjs.org/axios/-/axios-0.17.1.tgz">https://registry.npmjs.org/axios/-/axios-0.17.1.tgz</a></p>
<p>Path to dependency file: uchi-sidebar-clone/package.json</p>
<p>Path to vulnerable library: uchi-sidebar-clone/node_modules/analytics-node/node_modules/axios/package.json</p>
<p>
Dependency Hierarchy:
- analytics-node-3.3.0.tgz (Root Library)
- :x: **axios-0.17.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Whizkevina/uchi-sidebar-clone/commit/5405eeecb088ab7acf45ef51e052988d72c3fe7f">5405eeecb088ab7acf45ef51e052988d72c3fe7f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Axios up to and including 0.18.0 allows attackers to cause a denial of service (application crash) by continuing to accepting content after maxContentLength is exceeded.
<p>Publish Date: 2019-05-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-10742>CVE-2019-10742</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/axios/axios/issues/1098">https://github.com/axios/axios/issues/1098</a></p>
<p>Release Date: 2019-05-31</p>
<p>Fix Resolution: 0.19.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in axios tgz cve high severity vulnerability vulnerable library axios tgz promise based http client for the browser and node js library home page a href path to dependency file uchi sidebar clone package json path to vulnerable library uchi sidebar clone node modules analytics node node modules axios package json dependency hierarchy analytics node tgz root library x axios tgz vulnerable library found in head commit a href found in base branch main vulnerability details axios up to and including allows attackers to cause a denial of service application crash by continuing to accepting content after maxcontentlength is exceeded publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
645,092 | 20,994,596,828 | IssuesEvent | 2022-03-29 12:29:31 | GoogleCloudPlatform/python-docs-samples | https://api.github.com/repos/GoogleCloudPlatform/python-docs-samples | opened | run.logging-manual.main_test: test_with_cloud_headers failed | priority: p1 type: bug flakybot: issue | This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: 029e84c9e54cce4995acfa167e39d8b565f664d8
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e18c524a-26b4-457c-9657-fa448bec6aa2), [Sponge](http://sponge2/e18c524a-26b4-457c-9657-fa448bec6aa2)
status: failed
<details><summary>Test output</summary><br><pre>Traceback (most recent call last):
File "/workspace/run/logging-manual/main_test.py", line 39, in test_with_cloud_headers
r = client.get("/", headers={"X-Cloud-Trace-Context": "foo/bar"})
File "/workspace/run/logging-manual/.nox/py-3-9/lib/python3.9/site-packages/werkzeug/test.py", line 1134, in get
return self.open(*args, **kw)
File "/workspace/run/logging-manual/.nox/py-3-9/lib/python3.9/site-packages/flask/testing.py", line 216, in open
return super().open( # type: ignore
File "/workspace/run/logging-manual/.nox/py-3-9/lib/python3.9/site-packages/werkzeug/test.py", line 1081, in open
builder = EnvironBuilder(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'as_tuple'</pre></details> | 1.0 | run.logging-manual.main_test: test_with_cloud_headers failed - This test failed!
To configure my behavior, see [the Flaky Bot documentation](https://github.com/googleapis/repo-automation-bots/tree/main/packages/flakybot).
If I'm commenting on this issue too often, add the `flakybot: quiet` label and
I will stop commenting.
---
commit: 029e84c9e54cce4995acfa167e39d8b565f664d8
buildURL: [Build Status](https://source.cloud.google.com/results/invocations/e18c524a-26b4-457c-9657-fa448bec6aa2), [Sponge](http://sponge2/e18c524a-26b4-457c-9657-fa448bec6aa2)
status: failed
<details><summary>Test output</summary><br><pre>Traceback (most recent call last):
File "/workspace/run/logging-manual/main_test.py", line 39, in test_with_cloud_headers
r = client.get("/", headers={"X-Cloud-Trace-Context": "foo/bar"})
File "/workspace/run/logging-manual/.nox/py-3-9/lib/python3.9/site-packages/werkzeug/test.py", line 1134, in get
return self.open(*args, **kw)
File "/workspace/run/logging-manual/.nox/py-3-9/lib/python3.9/site-packages/flask/testing.py", line 216, in open
return super().open( # type: ignore
File "/workspace/run/logging-manual/.nox/py-3-9/lib/python3.9/site-packages/werkzeug/test.py", line 1081, in open
builder = EnvironBuilder(*args, **kwargs)
TypeError: __init__() got an unexpected keyword argument 'as_tuple'</pre></details> | priority | run logging manual main test test with cloud headers failed this test failed to configure my behavior see if i m commenting on this issue too often add the flakybot quiet label and i will stop commenting commit buildurl status failed test output traceback most recent call last file workspace run logging manual main test py line in test with cloud headers r client get headers x cloud trace context foo bar file workspace run logging manual nox py lib site packages werkzeug test py line in get return self open args kw file workspace run logging manual nox py lib site packages flask testing py line in open return super open type ignore file workspace run logging manual nox py lib site packages werkzeug test py line in open builder environbuilder args kwargs typeerror init got an unexpected keyword argument as tuple | 1 |
476,171 | 13,734,759,361 | IssuesEvent | 2020-10-05 09:09:44 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | m.rediff.com - desktop site instead of mobile site | browser-focus-geckoview engine-gecko priority-important | <!-- @browser: Firefox Mobile 81.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 11; Mobile; rv:81.0) Gecko/81.0 Firefox/81.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/59285 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://m.rediff.com/
**Browser / Version**: Firefox Mobile 81.0
**Operating System**: Android
**Tested Another Browser**: Yes Chrome
**Problem type**: Desktop site instead of mobile site
**Description**: Desktop site instead of mobile site
**Steps to Reproduce**:
Seems like Focus is loading more and more pages as desktop than mobile version.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | m.rediff.com - desktop site instead of mobile site - <!-- @browser: Firefox Mobile 81.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 11; Mobile; rv:81.0) Gecko/81.0 Firefox/81.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/59285 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://m.rediff.com/
**Browser / Version**: Firefox Mobile 81.0
**Operating System**: Android
**Tested Another Browser**: Yes Chrome
**Problem type**: Desktop site instead of mobile site
**Description**: Desktop site instead of mobile site
**Steps to Reproduce**:
Seems like Focus is loading more and more pages as desktop than mobile version.
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | m rediff com desktop site instead of mobile site url browser version firefox mobile operating system android tested another browser yes chrome problem type desktop site instead of mobile site description desktop site instead of mobile site steps to reproduce seems like focus is loading more and more pages as desktop than mobile version browser configuration none from with ❤️ | 1 |
133,241 | 10,799,949,063 | IssuesEvent | 2019-11-06 13:18:34 | gitfun-party/this-repo-has-issues | https://api.github.com/repos/gitfun-party/this-repo-has-issues | opened | [test-issue-47] Copying cross-platform system | test | Gibbous stench cat shunned swarthy non-euclidean unmentionable. Antediluvian gambrel manuscript fungus. Furtive fungus squamous antediluvian blasphemous accursed dank. Madness daemoniac singular cat. Furtive nameless amorphous unmentionable.\n\nUlulate tenebrous hideous comprehension fungus noisome unnamable cyclopean. Dank noisome comprehension amorphous shunned manuscript blasphemous unmentionable. Non-euclidean dank stygian iridescence singular lurk. Antediluvian eldritch unmentionable. | 1.0 | [test-issue-47] Copying cross-platform system - Gibbous stench cat shunned swarthy non-euclidean unmentionable. Antediluvian gambrel manuscript fungus. Furtive fungus squamous antediluvian blasphemous accursed dank. Madness daemoniac singular cat. Furtive nameless amorphous unmentionable.\n\nUlulate tenebrous hideous comprehension fungus noisome unnamable cyclopean. Dank noisome comprehension amorphous shunned manuscript blasphemous unmentionable. Non-euclidean dank stygian iridescence singular lurk. Antediluvian eldritch unmentionable. | non_priority | copying cross platform system gibbous stench cat shunned swarthy non euclidean unmentionable antediluvian gambrel manuscript fungus furtive fungus squamous antediluvian blasphemous accursed dank madness daemoniac singular cat furtive nameless amorphous unmentionable n nululate tenebrous hideous comprehension fungus noisome unnamable cyclopean dank noisome comprehension amorphous shunned manuscript blasphemous unmentionable non euclidean dank stygian iridescence singular lurk antediluvian eldritch unmentionable | 0 |
657,135 | 21,786,573,448 | IssuesEvent | 2022-05-14 08:12:23 | bossbuwi/reality | https://api.github.com/repos/bossbuwi/reality | closed | Add rules | enhancement ui logic high priority | Rules for system use must be added on the dashboard. The implementation must be aesthetically pleasing.
_The server needs to implement this as well._ | 1.0 | Add rules - Rules for system use must be added on the dashboard. The implementation must be aesthetically pleasing.
_The server needs to implement this as well._ | priority | add rules rules for system use must be added on the dashboard the implementation must be aesthetically pleasing the server needs to implement this as well | 1 |
338,561 | 10,231,634,779 | IssuesEvent | 2019-08-18 11:20:01 | codetapacademy/codetap.academy | https://api.github.com/repos/codetapacademy/codetap.academy | opened | feat: create a route that will load the play-video component | Priority: High Status: Available Type: Enhancement | create a route **/video/:[youtubeVideoId]** that will load the **play-video** component
This is part of ** feat: create play video component route** #143 | 1.0 | feat: create a route that will load the play-video component - create a route **/video/:[youtubeVideoId]** that will load the **play-video** component
This is part of ** feat: create play video component route** #143 | priority | feat create a route that will load the play video component create a route video that will load the play video component this is part of feat create play video component route | 1 |
95,932 | 10,906,347,653 | IssuesEvent | 2019-11-20 12:45:38 | kyma-project/kyma | https://api.github.com/repos/kyma-project/kyma | closed | Improve Service Programming Model document | area/documentation area/eventing | **Description**
Affected document: https://kyma-project.io/docs/components/event-bus/#details-service-programming-model-event-delivery
To do:
* reconsider changing the structure to the following
* Event delivery flow
* Successful delivery
* Event structure
* Metadata (+ parameters)
* Payload (+ payload example included in this section)
* Subscription Service Example
Make sure all info is up to date. | 1.0 | Improve Service Programming Model document - **Description**
Affected document: https://kyma-project.io/docs/components/event-bus/#details-service-programming-model-event-delivery
To do:
* reconsider changing the structure to the following
* Event delivery flow
* Successful delivery
* Event structure
* Metadata (+ parameters)
* Payload (+ payload example included in this section)
* Subscription Service Example
Make sure all info is up to date. | non_priority | improve service programming model document description affected document to do reconsider changing the structure to the following event delivery flow successful delivery event structure metadata parameters payload payload example included in this section subscription service example make sure all info is up to date | 0 |
250,400 | 7,976,422,688 | IssuesEvent | 2018-07-17 12:39:35 | pyrocms/pyrocms | https://api.github.com/repos/pyrocms/pyrocms | closed | [upload-field_type/files-module] String replace does not support G. | Priority: High Type: Bug Report | **Describe the bug**
Both upload field type and files module use str_replace('M', '' ...) to get the current upload max. This does not support gigabyte shorthand as set in nginx, for example.
**To Reproduce**
Steps to reproduce the behavior:
1. Set nginx / PHP max upload size to 2G or similar
2. Attempt to upload
**Expected behavior**
Gigabyte values are correctly converted. | 1.0 | [upload-field_type/files-module] String replace does not support G. - **Describe the bug**
Both upload field type and files module use str_replace('M', '' ...) to get the current upload max. This does not support gigabyte shorthand as set in nginx, for example.
**To Reproduce**
Steps to reproduce the behavior:
1. Set nginx / PHP max upload size to 2G or similar
2. Attempt to upload
**Expected behavior**
Gigabyte values are correctly converted. | priority | string replace does not support g describe the bug both upload field type and files module use str replace m to get the current upload max this does not support gigabyte shorthand as set in nginx for example to reproduce steps to reproduce the behavior set nginx php max upload size to or similar attempt to upload expected behavior gigabyte values are correctly converted | 1 |
46,345 | 24,486,080,538 | IssuesEvent | 2022-10-09 13:02:04 | ItJustWorksTM/EiffelVis | https://api.github.com/repos/ItJustWorksTM/EiffelVis | opened | Keep a event and link type index | Performance Backend | Currently we do not send the link type to the frontend by default, this may limit some use cases.
The reason it is not being send is due to performance requirements, as the types are all strings they would clog the internet tubes needlessly.
To optimize this and to unlock the ability to send over more information in the lean event variant we need an index to map the string types to integer.
2 ways to do this:
1. global RwLock<IndexSet<String>> that is populated when events are inserted into the graph, index is requested through GET
- pro: complete query-able index
- con: adds more locking and overhead in hot loop
2. per user IndexSet<String>, populated when events are send over, index is send via socket
- pro: multi threaded, only indexes events that are requested
- con: incomplete index, duplicate work with multiple users
once this index is available we can replace the lean event's String usage with u64's
useful for:
https://github.com/ItJustWorksBetterTM/EiffelVis/issues/4 | True | Keep an event and link type index - Currently we do not send the link type to the frontend by default, which may limit some use cases.
The reason it is not being sent is due to performance requirements: as the types are all strings, they would clog the internet tubes needlessly.
To optimize this, and to unlock the ability to send over more information in the lean event variant, we need an index that maps the string types to integers.
2 ways to do this:
1. global RwLock<IndexSet<String>> that is populated when events are inserted into the graph, index is requested through GET
- pro: complete query-able index
- con: adds more locking and overhead in hot loop
2. per-user IndexSet<String>, populated when events are sent over; the index is sent via the socket
- pro: multi threaded, only indexes events that are requested
- con: incomplete index, duplicate work with multiple users
Once this index is available we can replace the lean event's String usage with u64s.
useful for:
https://github.com/ItJustWorksBetterTM/EiffelVis/issues/4 | non_priority | keep an event and link type index currently we do not send the link type to the frontend by default this may limit some use cases the reason it is not being sent is due to performance requirements as the types are all strings they would clog the internet tubes needlessly to optimize this and to unlock the ability to send over more information in the lean event variant we need an index to map the string types to integer ways to do this global rwlock that is populated when events are inserted into the graph index is requested through get pro complete query able index con adds more locking and overhead in hot loop per user indexset populated when events are sent over index is sent via socket pro multi threaded only indexes events that are requested con incomplete index duplicate work with multiple users once this index is available we can replace the lean event s string usage with s useful for | 0
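Either option boils down to an insertion-ordered string interner, which is what `IndexSet<String>` gives the Rust backend. A minimal sketch in Python for illustration — the class and method names here are hypothetical, not EiffelVis code:

```python
class StringIndex:
    """Maps each distinct string to a stable integer id, in insertion order."""

    def __init__(self):
        self._ids = {}       # string -> id
        self._strings = []   # id -> string

    def intern(self, s: str) -> int:
        """Return the existing id for s, assigning the next free id on first sight."""
        if s not in self._ids:
            self._ids[s] = len(self._strings)
            self._strings.append(s)
        return self._ids[s]

    def lookup(self, idx: int) -> str:
        """Recover the original string from its integer id."""
        return self._strings[idx]
```

Events can then carry the small integer id instead of the full type string, and the index itself is transferred once (via GET in option 1, or over the socket in option 2).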
60,128 | 3,120,781,477 | IssuesEvent | 2015-09-05 01:58:04 | framingeinstein/issues-test | https://api.github.com/repos/framingeinstein/issues-test | closed | SRP-4: S-2251 finishes missing | priority:normal resolution:will-not-fix type:bug | We have 5 finishes live for S-2251 but only 3 are showing on the product page | 1.0 | SRP-4: S-2251 finishes missing - We have 5 finishes live for S-2251 but only 3 are showing on the product page | priority | srp s finishes missing we have finishes live for s but only are showing on the product page | 1 |
249,085 | 7,953,759,174 | IssuesEvent | 2018-07-12 03:38:18 | StrangeLoopGames/EcoIssues | https://api.github.com/repos/StrangeLoopGames/EcoIssues | closed | USER ISSUE: 64 bit server in 32 bit game version | Medium Priority | **Version:** 0.7.1.2 beta
**Steps to Reproduce:**
The 32-bit game version ships with a 64-bit server, so you can't play the game in singleplayer.
**Expected behavior:**
**Actual behavior:**
| 1.0 | USER ISSUE: 64 bit server in 32 bit game version - **Version:** 0.7.1.2 beta
**Steps to Reproduce:**
The 32-bit game version ships with a 64-bit server, so you can't play the game in singleplayer.
**Expected behavior:**
**Actual behavior:**
| priority | user issue bit server in bit game version version beta steps to reproduce in the bit game version is a bit server so you cant play the game in singleplayer expected behavior actual behavior | 1
114,862 | 4,647,370,071 | IssuesEvent | 2016-10-01 13:03:50 | ericcastoldi/safie | https://api.github.com/repos/ericcastoldi/safie | closed | Login | Customer registration | frontend priority store | after completing the purchase, if the Customer is not authenticated, the page for login or new registration appears. Login must offer authentication options external to the site (Google, Facebook, etc). The site header will display the name of the authenticated customer. | 1.0 | Login | Customer registration - after completing the purchase, if the Customer is not authenticated, the page for login or new registration appears. Login must offer authentication options external to the site (Google, Facebook, etc). The site header will display the name of the authenticated customer. | priority | login customer registration after completing the purchase if the customer is not authenticated the page for login or new registration appears login must offer authentication options external to the site google facebook etc the site header will display the name of the authenticated customer | 1
463,852 | 13,302,412,475 | IssuesEvent | 2020-08-25 14:14:19 | fossasia/open-event-frontend | https://api.github.com/repos/fossasia/open-event-frontend | opened | Schedule Calendar View: Changing Timezone can Result in Disappearing Sessions from Schedule | Priority: High bug | If a user changes the timezone to a time that is not covered in the original calendar time the sessions disappear from the calendar view.
Compare https://eventyay.com/e/16fa59c7/schedule

| 1.0 | Schedule Calendar View: Changing Timezone can Result in Disappearing Sessions from Schedule - If a user changes the timezone to a time that is not covered in the original calendar time the sessions disappear from the calendar view.
Compare https://eventyay.com/e/16fa59c7/schedule

| priority | schedule calendar view changing timezone can result in disappearing sessions from schedule if a user changes the timezone to a time that is not covered in the original calendar time the sessions disappear from the calendar view compare | 1 |
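The disappearance is consistent with sessions whose converted local time falls outside the rendered date range. A fixed-offset conversion makes the mechanics easy to see; the times below are hypothetical, chosen only to show a session sliding past midnight (Python for illustration — the eventyay frontend itself is Ember/JavaScript):

```python
from datetime import datetime, timedelta, timezone

IST = timezone(timedelta(hours=5, minutes=30))  # stand-in for the newly selected timezone

# A session stored at 23:00 UTC on Aug 25...
start_utc = datetime(2020, 8, 25, 23, 0, tzinfo=timezone.utc)
start_local = start_utc.astimezone(IST)

# ...lands at 04:30 on Aug 26 local time. A calendar view that only renders
# the original Aug 25 column will silently drop the session from view.
print(start_local.isoformat())
```

The fix direction implied by the report is to recompute the visible date range after a timezone change, rather than keeping the range derived from the original zone.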
336,528 | 30,199,949,638 | IssuesEvent | 2023-07-05 04:06:29 | marcpage/libernet | https://api.github.com/repos/marcpage/libernet | closed | Add code to tests/FileDescriptor_test.cpp to test FileDescriptor(const int) and void FileDescriptor::sync() const | good first issue test | These two functions are not called in the test code. Make sure they work as expected | 1.0 | Add code to tests/FileDescriptor_test.cpp to test FileDescriptor(const int) and void FileDescriptor::sync() const - These two functions are not called in the test code. Make sure they work as expected | non_priority | add code to tests filedescriptor test cpp to test filedescriptor const int and void filedescriptor sync const these two functions are not called in the test code make sure they work as expected | 0 |
98,132 | 11,046,010,265 | IssuesEvent | 2019-12-09 16:06:21 | shtillorg/ML0719-Issues | https://api.github.com/repos/shtillorg/ML0719-Issues | closed | Document evaluation results. | documentation | Should contain an overview of used tools, software, platforms, etc.; reason and purpose. | 1.0 | Document evaluation results. - Should contain an overview of used tools, software, platforms, etc.; reason and purpose. | non_priority | document evaluation results should contain an overview of used tools software platforms etc reason and purpose | 0 |
75,711 | 3,471,331,702 | IssuesEvent | 2015-12-23 14:47:45 | aic-collections/aicdams-lakeshore | https://api.github.com/repos/aic-collections/aicdams-lakeshore | opened | Move Document Type input field to the top in asset edit view | MEDIUM priority presentation | Move Document Type to the top, before Title. | 1.0 | Move Document Type input field to the top in asset edit view - Move Document Type to the top, before Title. | priority | move document type input field to the top in asset edit view move document type to the top before title | 1 |
127,589 | 17,315,193,428 | IssuesEvent | 2021-07-27 04:30:27 | the-deep/deeper | https://api.github.com/repos/the-deep/deeper | closed | Add Lead: Visual cue for drag and drop | design-ui related-client | When a user drags a folder/file over DEEP, it should indicate that it accepts drop behavior in add leads page. | 1.0 | Add Lead: Visual cue for drag and drop - When a user drags a folder/file over DEEP, it should indicate that it accepts drop behavior in add leads page. | non_priority | add lead visual cue for drag and drop when a user drags a folder file over deep it should indicate that it accepts drop behavior in add leads page | 0 |
142,043 | 11,453,403,405 | IssuesEvent | 2020-02-06 15:20:53 | wet-boew/cdts-sgdc | https://api.github.com/repos/wet-boew/cdts-sgdc | opened | Check in appfooter_transactional-en.shtml | testing | Issue moved from GCCode: https://gccode.ssc-spc.gc.ca/iitb-dgiit/nw-ws/sgdc-cdts/issues/129
@StdGit created the issue
https://ssl-templates.services.gc.ca/app/cls/WET/gcweb/v4_0_32/cdts/appTop/appfooter_transactional-en.shtml
Check whether the links are editable; if so, change the example

| 1.0 | Check in appfooter_transactional-en.shtml - Issue moved from GCCode: https://gccode.ssc-spc.gc.ca/iitb-dgiit/nw-ws/sgdc-cdts/issues/129
@StdGit created the issue
https://ssl-templates.services.gc.ca/app/cls/WET/gcweb/v4_0_32/cdts/appTop/appfooter_transactional-en.shtml
Check whether the links are editable; if so, change the example

| non_priority | check in appfooter transactional en shtml issue moved from gccode stdgit created the issue check whether the links are editable if so change the example | 0
6,834 | 3,475,013,507 | IssuesEvent | 2015-12-25 08:16:47 | joomla/joomla-cms | https://api.github.com/repos/joomla/joomla-cms | closed | Language filter not working for categories with subcategories in "List All Categories" menu item | No Code Attached Yet | #### Steps to reproduce the issue
BACKEND:
Setup the site as multilingual.
Create a category CAT_A_EN with language English
Create a category CAT_A_IT with another language (ex. Italian) and associate to CAT_A_EN
Create an English menu item "List All Categories"
Create an Italian menu item "List All Categories"
FRONTEND:
everything works as expected with just one category shown in each language
BACKEND
Create a new category "CAT_B_EN" with language English as a subcategory of CAT_A_EN
Same for italian with "CAT_B_IT" as sucat of CAT_A_IT
#### Expected result
FRONTEND
Just CAT_A_EN OR CAT_A_IT is shown depending on current language
#### Actual result
FRONTEND
Both CAT_A_EN and CAT_A_IT are shown independent of current language
#### System information (as much as possible)
Windows 7 XAMPP
#### Additional comments
| 1.0 | Language filter not working for categories with subcategories in "List All Categories" menu item - #### Steps to reproduce the issue
BACKEND:
Setup the site as multilingual.
Create a category CAT_A_EN with language English
Create a category CAT_A_IT with another language (ex. Italian) and associate to CAT_A_EN
Create an English menu item "List All Categories"
Create an Italian menu item "List All Categories"
FRONTEND:
everything works as expected with just one category shown in each language
BACKEND
Create a new category "CAT_B_EN" with language English as a subcategory of CAT_A_EN
Same for italian with "CAT_B_IT" as sucat of CAT_A_IT
#### Expected result
FRONTEND
Just CAT_A_EN OR CAT_A_IT is shown depending on current language
#### Actual result
FRONTEND
Both CAT_A_EN and CAT_A_IT are shown independent of current language
#### System information (as much as possible)
Windows 7 XAMPP
#### Additional comments
| non_priority | language filter not working for categories with subcategories in list all categories menu item steps to reproduce the issue backend setup the site as multilingual create a category cat a en with language english create a category cat a it with another language ex italian and associate to cat a en create an english menu item list all categories create an italian menu item list all categories frontend everything works as expected with just one category shown in each language backend create a new category cat b en with language english as a subcategory of cat a en same for italian with cat b it as sucat of cat a it expected result frontend just cat a en or cat a it is shown depending on current language actual result frontend both cat a en and cat a it are shown independent of current language system information as much as possible windows xampp additional comments | 0 |
565,025 | 16,747,376,668 | IssuesEvent | 2021-06-11 17:21:03 | airshipit/vino | https://api.github.com/repos/airshipit/vino | closed | Provide Secure VNC support for VMs created by ViNO | enhancement priority/low size l | **Proposed change**
Provide the capability to configure VNC settings for VMs created when deploying a ViNO CR
- Provide capability to enable VNC via ViNO CR
- Provide capability to secure VNC endpoints with password authentication
| 1.0 | Provide Secure VNC support for VMs created by ViNO - **Proposed change**
Provide the capability to configure VNC settings for VMs created when deploying a ViNO CR
- Provide capability to enable VNC via ViNO CR
- Provide capability to secure VNC endpoints with password authentication
| priority | provide secure vnc support for vms created by vino proposed change provide the capability to configure vnc configuration for vms created when deploying a vino cr provide capability to enable vnc via vino cr provide capability to secure vnc endpoints with password authentication | 1 |
58,993 | 14,520,326,962 | IssuesEvent | 2020-12-14 05:13:02 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | opened | Can't Install TF on Apple M1 | type:build/install | **Issue Creation**
On trying to install TF 2.3 using `pip install tensorflow` from the terminal, I got a success message.
I could see tensorflow 2.3.1 in the list of installed packages by using either `conda list` or `pip list`
When I run python from the terminal and trying to import tensorflow, it returns this error message:
```
(tfenv) mgd@MGD-m1 ~ % conda list tensorflow
# packages in environment at /Users/mgd/opt/anaconda3/envs/tfenv:
#
# Name Version Build Channel
tensorflow 2.3.0 pypi_0 pypi
tensorflow-datasets 4.1.0 pypi_0 pypi
tensorflow-estimator 2.3.0 pypi_0 pypi
tensorflow-metadata 0.26.0 pypi_0 pypi
(tfenv) mgd@MGD-m1 ~ % python
Python 3.8.5 (default, Sep 4 2020, 02:22:02)
[Clang 10.0.0 ] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
zsh: illegal hardware instruction python
(tfenv) mgd@MGD-m1 ~ %
```
On uninstalling it and trying to install it again using:
`pip install https://storage.googleapis.com/tensorflow/mac/cpu/tensorflow-2.3.0-cp38-cp38-macosx_10_14_x86_64.whl`
same issue
On trying to import it from Jupyter Lab, no error message is shown and no interaction happens; it's as if the cell were empty.
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Big Sur 11.0.1
- TensorFlow installed from (source or binary):
- TensorFlow version: 2.3.1
- Python version: 3.8.5
- Installed using virtualenv? pip? conda?: pip
| 1.0 | Can't Install TF on Apple M1 - **Issue Creation**
On trying to install TF 2.3 using `pip install tensorflow` from the terminal, I got a success message.
I could see tensorflow 2.3.1 in the list of installed packages by using either `conda list` or `pip list`
When I run python from the terminal and trying to import tensorflow, it returns this error message:
```
(tfenv) mgd@MGD-m1 ~ % conda list tensorflow
# packages in environment at /Users/mgd/opt/anaconda3/envs/tfenv:
#
# Name Version Build Channel
tensorflow 2.3.0 pypi_0 pypi
tensorflow-datasets 4.1.0 pypi_0 pypi
tensorflow-estimator 2.3.0 pypi_0 pypi
tensorflow-metadata 0.26.0 pypi_0 pypi
(tfenv) mgd@MGD-m1 ~ % python
Python 3.8.5 (default, Sep 4 2020, 02:22:02)
[Clang 10.0.0 ] :: Anaconda, Inc. on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
zsh: illegal hardware instruction python
(tfenv) mgd@MGD-m1 ~ %
```
On uninstalling it and trying to install it again using:
`pip install https://storage.googleapis.com/tensorflow/mac/cpu/tensorflow-2.3.0-cp38-cp38-macosx_10_14_x86_64.whl`
same issue
On trying to import it from Jupyter Lab, no error message is shown and no interaction happens; it's as if the cell were empty.
**System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): macOS Big Sur 11.0.1
- TensorFlow installed from (source or binary):
- TensorFlow version: 2.3.1
- Python version: 3.8.5
- Installed using virtualenv? pip? conda?: pip
| non_priority | can t install tf on apple issue creation on trying to install tf using pip install tensorflow from the terminal i got a success message i could see tensorflow in the list of installed packages by using either conda list or pip list when i run python from the terminal and trying to import tensorflow it returns this error message tfenv mgd mgd conda list tensorflow packages in environment at users mgd opt envs tfenv name version build channel tensorflow pypi pypi tensorflow datasets pypi pypi tensorflow estimator pypi pypi tensorflow metadata pypi pypi tfenv mgd mgd python python default sep anaconda inc on darwin type help copyright credits or license for more information import tensorflow as tf zsh illegal hardware instruction python tfenv mgd mgd on uninstalling it and trying to installing again using pip install same issue on trying to import it from jupyter lab no error message is shown and also no interaction is happening it s as if it s empty cell system information os platform and distribution e g linux ubuntu macos big sur tensorflow installed from source or binary tensorflow version python version installed using virtualenv pip conda pip | 0 |
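The `zsh: illegal hardware instruction` pattern is what you get when an x86_64 wheel (such as the `macosx_10_14_x86_64` build linked above) is loaded by an arm64 interpreter on Apple silicon. A quick diagnostic, illustrative only:

```python
import platform
import struct

arch = platform.machine()        # 'arm64' on a native Apple-silicon Python, 'x86_64' under Rosetta
bits = struct.calcsize("P") * 8  # pointer width of the running interpreter
print(f"interpreter architecture: {arch}, {bits}-bit")
```

If this prints `arm64`, an x86_64-only TensorFlow wheel cannot load natively; at the time of the report (late 2020), the usual workarounds were running an x86_64 Python under Rosetta 2 or using Apple's `tensorflow_macos` fork.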
98,187 | 20,622,276,079 | IssuesEvent | 2022-03-07 18:37:02 | joomla/joomla-cms | https://api.github.com/repos/joomla/joomla-cms | closed | [4.1] TODO list: Duplicate Queries | No Code Attached Yet | To fix in 4.1
In Joomla Admin:
- [ ] Edit user - 42 statements were executed, 11 of which were duplicates, 31 unique
- [ ] Edit Article - 45 statements were executed, 6 of which were duplicates, 39 unique
- [ ] Site Modules - 36 statements were executed, 2 of which were duplicates, 34 unique
- [ ] Main Menu - 35 statements were executed, 2 of which were duplicates, 33 unique
- [ ] Edit Category - 45 statements were executed, 6 of which were duplicates, 39 unique
- [ ] Edit User Group - 26 statements were executed, 2 of which were duplicates, 24 unique
- [ ] Edit User Access Level - 28 statements were executed, 2 of which were duplicates, 26 unique
- [ ] Create new private message - 29 statements were executed, 2 of which were duplicates, 27 unique
- [ ] Post-installation Messages for Joomla CMS - 30 statements were executed, 6 of which were duplicates, 24 unique | 1.0 | [4.1] TODO list: Duplicate Queries - To fix in 4.1
In Joomla Admin:
- [ ] Edit user - 42 statements were executed, 11 of which were duplicates, 31 unique
- [ ] Edit Article - 45 statements were executed, 6 of which were duplicates, 39 unique
- [ ] Site Modules - 36 statements were executed, 2 of which were duplicates, 34 unique
- [ ] Main Menu - 35 statements were executed, 2 of which were duplicates, 33 unique
- [ ] Edit Category - 45 statements were executed, 6 of which were duplicates, 39 unique
- [ ] Edit User Group - 26 statements were executed, 2 of which were duplicates, 24 unique
- [ ] Edit User Access Level - 28 statements were executed, 2 of which were duplicates, 26 unique
- [ ] Create new private message - 29 statements were executed, 2 of which were duplicates, 27 unique
- [ ] Post-installation Messages for Joomla CMS - 30 statements were executed, 6 of which were duplicates, 24 unique | non_priority | todo list duplicate queries to fix in in joomla admin edit user statements were executed of which were duplicates unique edit article statements were executed of which were duplicates unique site modules statements were executed of which were duplicates unique main menu statements were executed of which were duplicates unique edit category statements were executed of which were duplicates unique edit user group statements were executed of which were duplicates unique edit user access level statements were executed of which were duplicates unique create new private message statements were executed of which were duplicates unique post installation messages for joomla cms statements were executed of which were duplicates unique | 0 |
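The total/duplicate/unique triples above all satisfy one identity: duplicates = total − unique. A small sketch of how such a report can be computed from a captured statement list (illustrative only, not Joomla's debug plugin):

```python
from collections import Counter

def duplicate_report(statements):
    """Return (total, duplicates, unique) for a list of executed SQL statements."""
    counts = Counter(statements)
    total = len(statements)
    unique = len(counts)
    return total, total - unique, unique
```

For the "Edit user" page above: 42 executed with 31 unique gives 42 − 31 = 11 duplicates, matching the checklist.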
14,599 | 3,411,364,948 | IssuesEvent | 2015-12-05 01:59:34 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | Test failure in CI build 9855 | test-failure | The following test appears to have failed:
[#9855](https://circleci.com/gh/cockroachdb/cockroach/9855):
```
I1204 02:18:56.957873 681 stopper.go:241 draining; tasks left:
2 storage/replica.go:1437
1 storage/replica_command.go:1440
I1204 02:18:56.958892 681 stopper.go:241 draining; tasks left:
2 storage/replica.go:1437
panic: RangeLookup dispatched to correct range, but no matching RangeDescriptor was found: /System/Meta2/"a" [recovered]
panic: RangeLookup dispatched to correct range, but no matching RangeDescriptor was found: /System/Meta2/"a"
goroutine 6444 [running]:
github.com/cockroachdb/cockroach/util/tracer.(*Trace).Finalize(0xc820365b90)
/go/src/github.com/cockroachdb/cockroach/util/tracer/tracer.go:149 +0x1c3
github.com/cockroachdb/cockroach/storage.(*Replica).RangeLookup(0xc82009a8c0, 0x7fb1c756bda0, 0xc820779d10, 0x141c93a9a684a04f, 0x0, 0x100000001, 0x1, 0x1, 0x0, 0x0, ...)
/go/src/github.com/cockroachdb/cockroach/storage/replica_command.go:770 +0xe25
github.com/cockroachdb/cockroach/storage.(*Replica).executeCmd(0xc82009a8c0, 0x7fb1c756bda0, 0xc820779d10, 0x0, 0x141c93a9a684a04f, 0x0, 0x100000001, 0x1, 0x1, 0x0, ...)
/go/src/github.com/cockroachdb/cockroach/storage/replica_command.go:110 +0x41a
github.com/cockroachdb/cockroach/storage.(*Replica).executeBatch(0xc82009a8c0, 0x7fb1c756bda0, 0xc820779d10, 0x0, 0x141c93a9a684a04f, 0x0, 0x100000001, 0x1, 0x1, 0x0, ...)
--
/go/src/github.com/cockroachdb/cockroach/gossip/gossip.go:628 +0x494
github.com/cockroachdb/cockroach/util/stop.(*Stopper).RunWorker.func1(0xc82023a120, 0xc8207ab460)
/go/src/github.com/cockroachdb/cockroach/util/stop/stopper.go:88 +0x52
created by github.com/cockroachdb/cockroach/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/util/stop/stopper.go:89 +0x62
FAIL github.com/cockroachdb/cockroach/kv 14.702s
=== RUN TestHeartbeatSingleGroup
I1204 02:18:37.016987 651 multiraft/multiraft.go:579 node 2 starting
I1204 02:18:37.017189 651 multiraft/multiraft.go:579 node 1 starting
I1204 02:18:37.017492 651 raft/raft.go:441 [group 1] 1 became follower at term 5
I1204 02:18:37.017563 651 raft/raft.go:234 [group 1] newRaft 1 [peers: [1,2], term: 5, commit: 10, applied: 10, lastindex: 10, lastterm: 5]
I1204 02:18:37.017725 651 raft/raft.go:441 [group 1] 2 became follower at term 5
I1204 02:18:37.017800 651 raft/raft.go:234 [group 1] newRaft 2 [peers: [1,2], term: 5, commit: 10, applied: 10, lastindex: 10, lastterm: 5]
I1204 02:18:37.017884 651 raft/raft.go:521 [group 1] 1 is starting a new election at term 5
I1204 02:18:37.017957 651 raft/raft.go:454 [group 1] 1 became candidate at term 6
I1204 02:18:37.017994 651 raft/raft.go:503 [group 1] 1 received vote from 1 at term 6
```
Please assign, take a look and update the issue accordingly. | 1.0 | Test failure in CI build 9855 - The following test appears to have failed:
[#9855](https://circleci.com/gh/cockroachdb/cockroach/9855):
```
I1204 02:18:56.957873 681 stopper.go:241 draining; tasks left:
2 storage/replica.go:1437
1 storage/replica_command.go:1440
I1204 02:18:56.958892 681 stopper.go:241 draining; tasks left:
2 storage/replica.go:1437
panic: RangeLookup dispatched to correct range, but no matching RangeDescriptor was found: /System/Meta2/"a" [recovered]
panic: RangeLookup dispatched to correct range, but no matching RangeDescriptor was found: /System/Meta2/"a"
goroutine 6444 [running]:
github.com/cockroachdb/cockroach/util/tracer.(*Trace).Finalize(0xc820365b90)
/go/src/github.com/cockroachdb/cockroach/util/tracer/tracer.go:149 +0x1c3
github.com/cockroachdb/cockroach/storage.(*Replica).RangeLookup(0xc82009a8c0, 0x7fb1c756bda0, 0xc820779d10, 0x141c93a9a684a04f, 0x0, 0x100000001, 0x1, 0x1, 0x0, 0x0, ...)
/go/src/github.com/cockroachdb/cockroach/storage/replica_command.go:770 +0xe25
github.com/cockroachdb/cockroach/storage.(*Replica).executeCmd(0xc82009a8c0, 0x7fb1c756bda0, 0xc820779d10, 0x0, 0x141c93a9a684a04f, 0x0, 0x100000001, 0x1, 0x1, 0x0, ...)
/go/src/github.com/cockroachdb/cockroach/storage/replica_command.go:110 +0x41a
github.com/cockroachdb/cockroach/storage.(*Replica).executeBatch(0xc82009a8c0, 0x7fb1c756bda0, 0xc820779d10, 0x0, 0x141c93a9a684a04f, 0x0, 0x100000001, 0x1, 0x1, 0x0, ...)
--
/go/src/github.com/cockroachdb/cockroach/gossip/gossip.go:628 +0x494
github.com/cockroachdb/cockroach/util/stop.(*Stopper).RunWorker.func1(0xc82023a120, 0xc8207ab460)
/go/src/github.com/cockroachdb/cockroach/util/stop/stopper.go:88 +0x52
created by github.com/cockroachdb/cockroach/util/stop.(*Stopper).RunWorker
/go/src/github.com/cockroachdb/cockroach/util/stop/stopper.go:89 +0x62
FAIL github.com/cockroachdb/cockroach/kv 14.702s
=== RUN TestHeartbeatSingleGroup
I1204 02:18:37.016987 651 multiraft/multiraft.go:579 node 2 starting
I1204 02:18:37.017189 651 multiraft/multiraft.go:579 node 1 starting
I1204 02:18:37.017492 651 raft/raft.go:441 [group 1] 1 became follower at term 5
I1204 02:18:37.017563 651 raft/raft.go:234 [group 1] newRaft 1 [peers: [1,2], term: 5, commit: 10, applied: 10, lastindex: 10, lastterm: 5]
I1204 02:18:37.017725 651 raft/raft.go:441 [group 1] 2 became follower at term 5
I1204 02:18:37.017800 651 raft/raft.go:234 [group 1] newRaft 2 [peers: [1,2], term: 5, commit: 10, applied: 10, lastindex: 10, lastterm: 5]
I1204 02:18:37.017884 651 raft/raft.go:521 [group 1] 1 is starting a new election at term 5
I1204 02:18:37.017957 651 raft/raft.go:454 [group 1] 1 became candidate at term 6
I1204 02:18:37.017994 651 raft/raft.go:503 [group 1] 1 received vote from 1 at term 6
```
Please assign, take a look and update the issue accordingly. | non_priority | test failure in ci build the following test appears to have failed stopper go draining tasks left storage replica go storage replica command go stopper go draining tasks left storage replica go panic rangelookup dispatched to correct range but no matching rangedescriptor was found system a panic rangelookup dispatched to correct range but no matching rangedescriptor was found system a goroutine github com cockroachdb cockroach util tracer trace finalize go src github com cockroachdb cockroach util tracer tracer go github com cockroachdb cockroach storage replica rangelookup go src github com cockroachdb cockroach storage replica command go github com cockroachdb cockroach storage replica executecmd go src github com cockroachdb cockroach storage replica command go github com cockroachdb cockroach storage replica executebatch go src github com cockroachdb cockroach gossip gossip go github com cockroachdb cockroach util stop stopper runworker go src github com cockroachdb cockroach util stop stopper go created by github com cockroachdb cockroach util stop stopper runworker go src github com cockroachdb cockroach util stop stopper go fail github com cockroachdb cockroach kv run testheartbeatsinglegroup multiraft multiraft go node starting multiraft multiraft go node starting raft raft go became follower at term raft raft go newraft term commit applied lastindex lastterm raft raft go became follower at term raft raft go newraft term commit applied lastindex lastterm raft raft go is starting a new election at term raft raft go became candidate at term raft raft go received vote from at term please assign take a look and update the issue accordingly | 0 |
534,411 | 15,618,336,417 | IssuesEvent | 2021-03-20 00:42:39 | way-of-elendil/3.3.5 | https://api.github.com/repos/way-of-elendil/3.3.5 | closed | NPC: Géant de pierre ensorcelé | bug priority-low type-quest | **Description**
11352 - [La rune de commandement]
11348 - [La rune de commandement]
23725 - [Géant de pierre]
24346 - [Géant de pierre ensorcelé]
33796 - [Rune de commandement]
When [Rune de commandement] is used on a Géant de pierre, the Géant de pierre ensorcelé spawns and attacks all NPCs that are hostile/neutral to it
**Expected behavior**
The Géant de pierre ensorcelé should attack only the targets you are fighting.
https://youtu.be/alGzSRfozW8
At 35 seconds you can see the Géant de pierre ensorcelé reach the player; it was within aggro range of the Géant de pierre. | 1.0 | NPC: Géant de pierre ensorcelé - **Description**
11352 - [La rune de commandement]
11348 - [La rune de commandement]
23725 - [Géant de pierre]
24346 - [Géant de pierre ensorcelé]
33796 - [Rune de commandement]
When [Rune de commandement] is used on a Géant de pierre, the Géant de pierre ensorcelé spawns and attacks all NPCs that are hostile/neutral to it
**Expected behavior**
The Géant de pierre ensorcelé should attack only the targets you are fighting.
https://youtu.be/alGzSRfozW8
At 35 seconds you can see the Géant de pierre ensorcelé reach the player; it was within aggro range of the Géant de pierre. | priority | npc géant de pierre ensorcelé description when rune de commandement is used on a géant de pierre the géant de pierre ensorcelé spawns and attacks all npcs that are hostile neutral to it expected behavior the géant de pierre ensorcelé should attack only the targets you are fighting at seconds you can see the géant de pierre ensorcelé reach the player it was within aggro range of the géant de pierre | 1
824,391 | 31,153,727,367 | IssuesEvent | 2023-08-16 11:47:44 | consta-design-system/uikit | https://api.github.com/repos/consta-design-system/uikit | closed | TextField: rework the component | feature 🔥🔥 priority major | Currently the main props are taken from HTMLDivElement; they should instead be based on HTMLInputElement, to avoid problems with props | 1.0 | TextField: rework the component - Currently the main props are taken from HTMLDivElement; they should instead be based on HTMLInputElement, to avoid problems with props | priority | textfield rework the component currently the main props are taken from htmldivelement they should instead be based on htmlinputelement to avoid problems with props | 1
707,808 | 24,319,995,204 | IssuesEvent | 2022-09-30 09:51:29 | JPGallegos1/crypto-monorepo | https://api.github.com/repos/JPGallegos1/crypto-monorepo | closed | Final documentation project | enhancement priority:HIGH | As a developer, I want to see some documentation about how to start the project on another machine
- [x] Make the final documentation before other people can see it | 1.0 | Final documentation project - As a developer, I want to see some documentation about how to start the project on another machine
- [x] Make the final documentation before other people can see it | priority | final documentation project as a developer i want to see some documentation about how to start the project on another machine make the final documentation before other people can see it | 1 |
499,658 | 14,475,471,336 | IssuesEvent | 2020-12-10 01:42:36 | googleapis/python-logging | https://api.github.com/repos/googleapis/python-logging | opened | Add an HTTPRequest log sample | good first issue priority: p2 | Add a new HTTPRequest log sample to [logging samples page](https://cloud.google.com/logging/docs/samples) with region tag `logging_write_request_entry`
**Criteria**
1. Entry sample should contain `requestMethod`, `Url`, and `status` which are minimally required fields for rendering summary fields in the Log Viewer.
2. Note any fields in the final logentry that may be auto-populated by the client library (if any).
e.g.
```python
...
logger.log_text(
'Python request: hello world',
http_request=dict(requestMethod='GET', requestUrl='www.example.com', status=200))
...
``` | 1.0 | Add an HTTPRequest log sample - Add a new HTTPRequest log sample to [logging samples page](https://cloud.google.com/logging/docs/samples) with region tag `logging_write_request_entry`
**Criteria**
1. Entry sample should contain `requestMethod`, `Url`, and `status` which are minimally required fields for rendering summary fields in the Log Viewer.
2. Note any fields in the final logentry that may be auto-populated by the client library (if any).
e.g.
```python
...
logger.log_text(
'Python request: hello world',
http_request=dict( requestMethod='GET', requestUrl='www.example.com', status=200))
...
``` | priority | add an httprequest log sample add a new httprequest log sample to with region tag logging write request entry criteria entry sample should contain requestmethod url and status which are minimally required fields for rendering summary fields in the log viewer note any fields in the final logentry that may be auto populated by the client library if any e g python logger log text python request hello world http request dict requestmethod get requesturl status | 1 |
191,034 | 6,824,986,627 | IssuesEvent | 2017-11-08 08:54:57 | sahana/SAMBRO | https://api.github.com/repos/sahana/SAMBRO | opened | Tagging Alerts of Interest | enhancement Low Priority | The use case from the WA-COP project has presented a case where authorized users could bookmark or tag alerts of interest. The UI (desktop or mobile app) should prioritize such information in the presentation layers.
An additional value is that SAMBRO could adopt machine learning techniques to make use of the bookmark and tagging to learn which alerts are of importance and their consumption patters to improve impact-based alerting as well as the display and delivery of such location specific alerts
| 1.0 | Tagging Alerts of Interest - The use case from the WA-COP project has presented a case where authorized users could bookmark or tag alerts of interest. The UI (desktop or mobile app) should prioritize such information in the presentation layers.
An additional value is that SAMBRO could adopt machine learning techniques to make use of the bookmark and tagging to learn which alerts are of importance and their consumption patters to improve impact-based alerting as well as the display and delivery of such location specific alerts
| priority | tagging alerts of interest the use case from the wa cop project has presented a case where authorized users could bookmark or tag alerts of interest the ui desktop or mobile app should prioritize such information in the presentation layers an additional value is that sambro could adopt machine learning techniques to make use of the bookmark and tagging to learn which alerts are of importance and their consumption patters to improve impact based alerting as well as the display and delivery of such location specific alerts | 1 |
378,370 | 11,201,071,866 | IssuesEvent | 2020-01-04 00:30:09 | lowRISC/opentitan | https://api.github.com/repos/lowRISC/opentitan | closed | [doc] top_earlgrey doc doesn't represent pinout | Component:Doc Priority:P1 Type:Cleanup | Current version of top_earlgrey doc doesn't represent the pinout of the chip,
need to fix. Plan is to map it to the generated top level, with discussion of the
actual pins of the FPGA target. | 1.0 | [doc] top_earlgrey doc doesn't represent pinout - Current version of top_earlgrey doc doesn't represent the pinout of the chip,
need to fix. Plan is to map it to the generated top level, with discussion of the
actual pins of the FPGA target. | priority | top earlgrey doc doesn t represent pinout current version of top earlgrey doc doesn t represent the pinout of the chip need to fix plan is to map it to the generated top level with discussion of the actual pins of the fpga target | 1 |
94,216 | 19,515,825,781 | IssuesEvent | 2021-12-29 10:02:59 | Onelinerhub/onelinerhub | https://api.github.com/repos/Onelinerhub/onelinerhub | closed | Short solution needed: "Select database with PHP PDO" (php-pdo) | help wanted good first issue code php-pdo | Please help us write most modern and shortest code solution for this issue:
**Select database with PHP PDO** (technology: [php-pdo](https://onelinerhub.com/php-pdo))
### Fast way
Just write the code solution in the comments.
### Prefered way
1. Create pull request with a new code file inside [inbox folder](https://github.com/Onelinerhub/onelinerhub/tree/main/inbox).
2. Don't forget to use comments to make solution explained.
3. Link to this issue in comments of pull request. | 1.0 | Short solution needed: "Select database with PHP PDO" (php-pdo) - Please help us write most modern and shortest code solution for this issue:
**Select database with PHP PDO** (technology: [php-pdo](https://onelinerhub.com/php-pdo))
### Fast way
Just write the code solution in the comments.
### Prefered way
1. Create pull request with a new code file inside [inbox folder](https://github.com/Onelinerhub/onelinerhub/tree/main/inbox).
2. Don't forget to use comments to make solution explained.
3. Link to this issue in comments of pull request. | non_priority | short solution needed select database with php pdo php pdo please help us write most modern and shortest code solution for this issue select database with php pdo technology fast way just write the code solution in the comments prefered way create pull request with a new code file inside don t forget to use comments to make solution explained link to this issue in comments of pull request | 0 |
742,045 | 25,834,169,662 | IssuesEvent | 2022-12-12 18:15:01 | thoth-station/python | https://api.github.com/repos/thoth-station/python | opened | `packaging` deprecates `LegacyVersion` causing failure if `packaging` is upgraded | kind/bug priority/backlog | ## Bug description
<!-- A clear and concise description of what the bug is. -->
Version 0.22 of `packaging` deprecates `LegacyVersion` and `LegacySpecifier` which are used in this module.
### Steps to Reproduce
1. upgrade to latest version of packaging
2. run anything
### Actual behavior
<!-- What happens? If applicable, add screenshots to illustrate your problem. -->
module crashes
### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
module does not crash
### Additional context
<!-- Add any additional context information about the problem here. -->
usage of deprecated functions are found here: https://github.com/thoth-station/python/search?q=LegacyVersion
| 1.0 | `packaging` deprecates `LegacyVersion` causing failure if `packaging` is upgraded - ## Bug description
<!-- A clear and concise description of what the bug is. -->
Version 0.22 of `packaging` deprecates `LegacyVersion` and `LegacySpecifier` which are used in this module.
### Steps to Reproduce
1. upgrade to latest version of packaging
2. run anything
### Actual behavior
<!-- What happens? If applicable, add screenshots to illustrate your problem. -->
module crashes
### Expected behavior
<!-- A clear and concise description of what you expected to happen. -->
module does not crash
### Additional context
<!-- Add any additional context information about the problem here. -->
usage of deprecated functions are found here: https://github.com/thoth-station/python/search?q=LegacyVersion
| priority | packaging deprecates legacyversion causing failure if packaging is upgraded bug description version of packaging deprecates legacyversion and legacyspecifier which are used in this module steps to reproduce upgrade to latest version of packaging run anything actual behavior module crashes expected behavior module does not crash additional context usage of deprecated functions are found here | 1 |
444,792 | 12,821,167,366 | IssuesEvent | 2020-07-06 07:33:06 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | www.netflix.com - design is broken | browser-focus-geckoview engine-gecko priority-critical | <!-- @browser: Firefox Mobile 78.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.0.0; Mobile; rv:78.0) Gecko/78.0 Firefox/78.0 -->
<!-- @reported_with: -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/55100 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://www.netflix.com/lb/login
**Browser / Version**: Firefox Mobile 78.0
**Operating System**: Android 8.0.0
**Tested Another Browser**: Yes Safari
**Problem type**: Design is broken
**Description**: Items not fully visible
**Steps to Reproduce**:
^^ 💔😣😣😣😣😣😣😣😣😣😣😣
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | www.netflix.com - design is broken - <!-- @browser: Firefox Mobile 78.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 8.0.0; Mobile; rv:78.0) Gecko/78.0 Firefox/78.0 -->
<!-- @reported_with: -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/55100 -->
<!-- @extra_labels: browser-focus-geckoview -->
**URL**: https://www.netflix.com/lb/login
**Browser / Version**: Firefox Mobile 78.0
**Operating System**: Android 8.0.0
**Tested Another Browser**: Yes Safari
**Problem type**: Design is broken
**Description**: Items not fully visible
**Steps to Reproduce**:
^^ 💔😣😣😣😣😣😣😣😣😣😣😣
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | priority | design is broken url browser version firefox mobile operating system android tested another browser yes safari problem type design is broken description items not fully visible steps to reproduce 💔😣😣😣😣😣😣😣😣😣😣😣 browser configuration none from with ❤️ | 1 |
458,471 | 13,175,590,858 | IssuesEvent | 2020-08-12 02:07:53 | trustwallet/wallet-core | https://api.github.com/repos/trustwallet/wallet-core | closed | [Keystore] Support side chain | enhancement priority: high | To better support binance smart chain or other side chains, we could probably extend current keystore json like
```json
{
"sidechain": [
{
"name": "Smart Chain",
"type": "Ethereum",
"address": "0x"
}
]
}
``` | 1.0 | [Keystore] Support side chain - To better support binance smart chain or other side chains, we could probably extend current keystore json like
```json
{
"sidechain": [
{
"name": "Smart Chain",
"type": "Ethereum",
"address": "0x"
}
]
}
``` | priority | support side chain to better support binance smart chain or other side chains we could probably extend current keystore json like json sidechain name smart chain type ethereum address | 1 |
58,701 | 14,344,300,295 | IssuesEvent | 2020-11-28 13:51:11 | uniquelyparticular/sync-moltin-to-algolia | https://api.github.com/repos/uniquelyparticular/sync-moltin-to-algolia | opened | CVE-2020-26226 (High) detected in semantic-release-15.13.14.tgz | security vulnerability | ## CVE-2020-26226 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semantic-release-15.13.14.tgz</b></p></summary>
<p>Automated semver compliant package publishing</p>
<p>Library home page: <a href="https://registry.npmjs.org/semantic-release/-/semantic-release-15.13.14.tgz">https://registry.npmjs.org/semantic-release/-/semantic-release-15.13.14.tgz</a></p>
<p>Path to dependency file: sync-moltin-to-algolia/package.json</p>
<p>Path to vulnerable library: sync-moltin-to-algolia/node_modules/semantic-release/package.json</p>
<p>
Dependency Hierarchy:
- :x: **semantic-release-15.13.14.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/sync-moltin-to-algolia/commit/1ba963c3668473f24bc53e805a17e1f2acdfc422">1ba963c3668473f24bc53e805a17e1f2acdfc422</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the npm package semantic-release before version 17.2.3, secrets that would normally be masked by `semantic-release` can be accidentally disclosed if they contain characters that become encoded when included in a URL. Secrets that do not contain characters that become encoded when included in a URL are already masked properly. The issue is fixed in version 17.2.3.
<p>Publish Date: 2020-11-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-26226>CVE-2020-26226</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/semantic-release/semantic-release/security/advisories/GHSA-r2j6-p67h-q639">https://github.com/semantic-release/semantic-release/security/advisories/GHSA-r2j6-p67h-q639</a></p>
<p>Release Date: 2020-11-18</p>
<p>Fix Resolution: 17.2.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-26226 (High) detected in semantic-release-15.13.14.tgz - ## CVE-2020-26226 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>semantic-release-15.13.14.tgz</b></p></summary>
<p>Automated semver compliant package publishing</p>
<p>Library home page: <a href="https://registry.npmjs.org/semantic-release/-/semantic-release-15.13.14.tgz">https://registry.npmjs.org/semantic-release/-/semantic-release-15.13.14.tgz</a></p>
<p>Path to dependency file: sync-moltin-to-algolia/package.json</p>
<p>Path to vulnerable library: sync-moltin-to-algolia/node_modules/semantic-release/package.json</p>
<p>
Dependency Hierarchy:
- :x: **semantic-release-15.13.14.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/uniquelyparticular/sync-moltin-to-algolia/commit/1ba963c3668473f24bc53e805a17e1f2acdfc422">1ba963c3668473f24bc53e805a17e1f2acdfc422</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the npm package semantic-release before version 17.2.3, secrets that would normally be masked by `semantic-release` can be accidentally disclosed if they contain characters that become encoded when included in a URL. Secrets that do not contain characters that become encoded when included in a URL are already masked properly. The issue is fixed in version 17.2.3.
<p>Publish Date: 2020-11-18
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-26226>CVE-2020-26226</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/semantic-release/semantic-release/security/advisories/GHSA-r2j6-p67h-q639">https://github.com/semantic-release/semantic-release/security/advisories/GHSA-r2j6-p67h-q639</a></p>
<p>Release Date: 2020-11-18</p>
<p>Fix Resolution: 17.2.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve high detected in semantic release tgz cve high severity vulnerability vulnerable library semantic release tgz automated semver compliant package publishing library home page a href path to dependency file sync moltin to algolia package json path to vulnerable library sync moltin to algolia node modules semantic release package json dependency hierarchy x semantic release tgz vulnerable library found in head commit a href vulnerability details in the npm package semantic release before version secrets that would normally be masked by semantic release can be accidentally disclosed if they contain characters that become encoded when included in a url secrets that do not contain characters that become encoded when included in a url are already masked properly the issue is fixed in version publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
407,278 | 11,911,500,550 | IssuesEvent | 2020-03-31 08:44:07 | lowRISC/opentitan | https://api.github.com/repos/lowRISC/opentitan | closed | [DIF] tests | Component:SW Hotlist:SW Opinion Priority:P3 | Just want to start the discussion on DIF testing. From previous conversations it seems that the consensus was to have several types of testing such as "mock" and DV.
I think DIF are ideally suited for "mock" testing, as all/most of the API calls will result in a single or multiple register read/write. A such it would should be easy to verify the API, by mocking out register write/read functions, and comparing the actual result with the expected. Expectations are that all the MMIO access should be done through the library introduced in #1187 (when it lands). This would make thing easier, as all of the DIFs will be using the API introduced in the mentioned PR.
This is just thinking out loud really, and invitation for comments. | 1.0 | [DIF] tests - Just want to start the discussion on DIF testing. From previous conversations it seems that the consensus was to have several types of testing such as "mock" and DV.
I think DIF are ideally suited for "mock" testing, as all/most of the API calls will result in a single or multiple register read/write. A such it would should be easy to verify the API, by mocking out register write/read functions, and comparing the actual result with the expected. Expectations are that all the MMIO access should be done through the library introduced in #1187 (when it lands). This would make thing easier, as all of the DIFs will be using the API introduced in the mentioned PR.
This is just thinking out loud really, and invitation for comments. | priority | tests just want to start the discussion on dif testing from previous conversations it seems that the consensus was to have several types of testing such as mock and dv i think dif are ideally suited for mock testing as all most of the api calls will result in a single or multiple register read write a such it would should be easy to verify the api by mocking out register write read functions and comparing the actual result with the expected expectations are that all the mmio access should be done through the library introduced in when it lands this would make thing easier as all of the difs will be using the api introduced in the mentioned pr this is just thinking out loud really and invitation for comments | 1 |
165,664 | 20,613,314,585 | IssuesEvent | 2022-03-07 10:45:07 | serhii73/place2live.com | https://api.github.com/repos/serhii73/place2live.com | opened | CVE-2021-28658 (Medium) detected in Django-2.2.9-py3-none-any.whl | security vulnerability | ## CVE-2021-28658 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Django-2.2.9-py3-none-any.whl</b></p></summary>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/cb/c9/ef1e25bdd092749dae74c95c2707dff892fde36e4053c4a2354b2303be10/Django-2.2.9-py3-none-any.whl">https://files.pythonhosted.org/packages/cb/c9/ef1e25bdd092749dae74c95c2707dff892fde36e4053c4a2354b2303be10/Django-2.2.9-py3-none-any.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.2.9-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/serhii73/place2live.com/commit/c1b6d6ca605af371f794c3c5d739712c7cdb206f">c1b6d6ca605af371f794c3c5d739712c7cdb206f</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Django 2.2 before 2.2.20, 3.0 before 3.0.14, and 3.1 before 3.1.8, MultiPartParser allowed directory traversal via uploaded files with suitably crafted file names. Built-in upload handlers were not affected by this vulnerability.
<p>Publish Date: 2021-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28658>CVE-2021-28658</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658</a></p>
<p>Release Date: 2021-04-06</p>
<p>Fix Resolution: django-2.2.20, 3.0.14, 3.1.8, 3.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-28658 (Medium) detected in Django-2.2.9-py3-none-any.whl - ## CVE-2021-28658 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>Django-2.2.9-py3-none-any.whl</b></p></summary>
<p>A high-level Python Web framework that encourages rapid development and clean, pragmatic design.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/cb/c9/ef1e25bdd092749dae74c95c2707dff892fde36e4053c4a2354b2303be10/Django-2.2.9-py3-none-any.whl">https://files.pythonhosted.org/packages/cb/c9/ef1e25bdd092749dae74c95c2707dff892fde36e4053c4a2354b2303be10/Django-2.2.9-py3-none-any.whl</a></p>
<p>Path to dependency file: /requirements.txt</p>
<p>Path to vulnerable library: /requirements.txt</p>
<p>
Dependency Hierarchy:
- :x: **Django-2.2.9-py3-none-any.whl** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/serhii73/place2live.com/commit/c1b6d6ca605af371f794c3c5d739712c7cdb206f">c1b6d6ca605af371f794c3c5d739712c7cdb206f</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Django 2.2 before 2.2.20, 3.0 before 3.0.14, and 3.1 before 3.1.8, MultiPartParser allowed directory traversal via uploaded files with suitably crafted file names. Built-in upload handlers were not affected by this vulnerability.
<p>Publish Date: 2021-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-28658>CVE-2021-28658</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-28658</a></p>
<p>Release Date: 2021-04-06</p>
<p>Fix Resolution: django-2.2.20, 3.0.14, 3.1.8, 3.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in django none any whl cve medium severity vulnerability vulnerable library django none any whl a high level python web framework that encourages rapid development and clean pragmatic design library home page a href path to dependency file requirements txt path to vulnerable library requirements txt dependency hierarchy x django none any whl vulnerable library found in head commit a href vulnerability details in django before before and before multipartparser allowed directory traversal via uploaded files with suitably crafted file names built in upload handlers were not affected by this vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution django step up your open source security game with whitesource | 0 |
44,182 | 12,025,551,698 | IssuesEvent | 2020-04-12 09:50:23 | tiangolo/jbrout | https://api.github.com/repos/tiangolo/jbrout | closed | Window version exits on startup | OpSys-Windows Priority-Low Type-Defect auto-migrated | ```
What steps will reproduce the problem?
1. Install windows version.
2. Attempt to launch brout...
3.
What is the expected output? What do you see instead?
Error log:
You should install pyexiv2 (>=0.1.2)
What version of the product are you using? On what operating system?
winxp brout 0.3.828
Please provide any additional information below.
```
Original issue reported on code.google.com by `pbol...@gmail.com` on 17 Feb 2010 at 11:27
| 1.0 | Window version exits on startup - ```
What steps will reproduce the problem?
1. Install windows version.
2. Attempt to launch brout...
3.
What is the expected output? What do you see instead?
Error log:
You should install pyexiv2 (>=0.1.2)
What version of the product are you using? On what operating system?
winxp brout 0.3.828
Please provide any additional information below.
```
Original issue reported on code.google.com by `pbol...@gmail.com` on 17 Feb 2010 at 11:27
| non_priority | window version exits on startup what steps will reproduce the problem install windows version attempt to launch brout what is the expected output what do you see instead error log you should install what version of the product are you using on what operating system winxp brout please provide any additional information below original issue reported on code google com by pbol gmail com on feb at | 0 |
98,117 | 29,485,271,608 | IssuesEvent | 2023-06-02 09:15:11 | apache/incubator-pegasus | https://api.github.com/repos/apache/incubator-pegasus | closed | Github actions run out of disk space while building ASAN | github scripts build | Previously `Build Release` and `Build with jemalloc` failed due to running out of disk space (see https://github.com/apache/incubator-pegasus/issues/1485). Recently, `Build ASAN` also failed due to the same reason (Unhandled exception. System.IO.IOException: No space left on device):

This means we have to spare more space. And actually `CMakeFiles` occupied much disk space. By running `find ./build/latest/src/ -name '*CMakeFiles*' -type d -exec du -csh "{}" +` we found that typically it could consume 3.4GB:
```
8.0K ./build/latest/src/CMakeFiles
25M ./build/latest/src/aio/CMakeFiles
9.4M ./build/latest/src/aio/test/CMakeFiles
19M ./build/latest/src/base/CMakeFiles
3.6M ./build/latest/src/base/test/CMakeFiles
9.4M ./build/latest/src/block_service/CMakeFiles
9.1M ./build/latest/src/block_service/fds/CMakeFiles
11M ./build/latest/src/block_service/hdfs/CMakeFiles
11M ./build/latest/src/block_service/local/CMakeFiles
35M ./build/latest/src/block_service/test/CMakeFiles
45M ./build/latest/src/client/CMakeFiles
6.9M ./build/latest/src/client/test/CMakeFiles
29M ./build/latest/src/client_lib/CMakeFiles
80M ./build/latest/src/common/CMakeFiles
68M ./build/latest/src/common/test/CMakeFiles
16M ./build/latest/src/failure_detector/CMakeFiles
16M ./build/latest/src/failure_detector/test/CMakeFiles
8.0K ./build/latest/src/geo/CMakeFiles
7.6M ./build/latest/src/geo/lib/CMakeFiles
12M ./build/latest/src/geo/test/CMakeFiles
2.4M ./build/latest/src/geo/bench/CMakeFiles
26M ./build/latest/src/http/CMakeFiles
11M ./build/latest/src/http/test/CMakeFiles
306M ./build/latest/src/meta/CMakeFiles
294M ./build/latest/src/meta/test/CMakeFiles
19M ./build/latest/src/meta/test/balancer_simulator/CMakeFiles
8.9M ./build/latest/src/meta/test/meta_state/CMakeFiles
45M ./build/latest/src/nfs/CMakeFiles
6.9M ./build/latest/src/nfs/test/CMakeFiles
17M ./build/latest/src/perf_counter/CMakeFiles
12M ./build/latest/src/perf_counter/test/CMakeFiles
8.0K ./build/latest/src/redis_protocol/CMakeFiles
7.4M ./build/latest/src/redis_protocol/proxy/CMakeFiles
17M ./build/latest/src/redis_protocol/proxy_lib/CMakeFiles
8.0M ./build/latest/src/redis_protocol/proxy_ut/CMakeFiles
6.1M ./build/latest/src/remote_cmd/CMakeFiles
411M ./build/latest/src/replica/CMakeFiles
90M ./build/latest/src/replica/duplication/test/CMakeFiles
13M ./build/latest/src/replica/backup/test/CMakeFiles
14M ./build/latest/src/replica/bulk_load/test/CMakeFiles
15M ./build/latest/src/replica/split/test/CMakeFiles
8.0K ./build/latest/src/replica/storage/CMakeFiles
22M ./build/latest/src/replica/storage/simple_kv/CMakeFiles
74M ./build/latest/src/replica/storage/simple_kv/test/CMakeFiles
123M ./build/latest/src/replica/test/CMakeFiles
11M ./build/latest/src/reporter/CMakeFiles
113M ./build/latest/src/runtime/CMakeFiles
131M ./build/latest/src/runtime/test/CMakeFiles
78M ./build/latest/src/runtime/rpc/CMakeFiles
73M ./build/latest/src/runtime/task/CMakeFiles
57M ./build/latest/src/runtime/security/CMakeFiles
19M ./build/latest/src/runtime/ranger/CMakeFiles
344K ./build/latest/src/sample/CMakeFiles
213M ./build/latest/src/server/CMakeFiles
345M ./build/latest/src/server/test/CMakeFiles
105M ./build/latest/src/shell/CMakeFiles
2.9M ./build/latest/src/test_util/CMakeFiles
4.6M ./build/latest/src/test/bench_test/CMakeFiles
8.0K ./build/latest/src/test/function_test/CMakeFiles
13M ./build/latest/src/test/function_test/utils/CMakeFiles
7.5M ./build/latest/src/test/function_test/backup_restore_test/CMakeFiles
64M ./build/latest/src/test/function_test/base_api_test/CMakeFiles
5.6M ./build/latest/src/test/function_test/bulk_load_test/CMakeFiles
3.5M ./build/latest/src/test/function_test/detect_hotspot_test/CMakeFiles
6.0M ./build/latest/src/test/function_test/partition_split_test/CMakeFiles
5.3M ./build/latest/src/test/function_test/recovery_test/CMakeFiles
8.1M ./build/latest/src/test/function_test/restore_test/CMakeFiles
6.4M ./build/latest/src/test/function_test/throttle_test/CMakeFiles
24M ./build/latest/src/test/kill_test/CMakeFiles
4.7M ./build/latest/src/test/pressure_test/CMakeFiles
6.3M ./build/latest/src/tools/CMakeFiles
87M ./build/latest/src/utils/CMakeFiles
1.3M ./build/latest/src/utils/long_adder_bench/CMakeFiles
94M ./build/latest/src/utils/test/CMakeFiles
5.4M ./build/latest/src/utils/test/nth_element_bench/CMakeFiles
16M ./build/latest/src/zookeeper/CMakeFiles
7.9M ./build/latest/src/zookeeper/test/CMakeFiles
3.4G total
```
Therefore we could drop `CMakeFiles` directories to spare more disk space.
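For reference, the cleanup step proposed here can be sketched in a few lines. This is a toy stand-in: the directory layout below is made up, and in CI the root would be `./build/latest/src`, cleaned only after the needed artifacts are packaged.

```python
import os
import shutil
import tempfile

# Toy layout standing in for the real build tree.
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "meta", "CMakeFiles"))
open(os.path.join(root, "meta", "libmeta.a"), "w").close()
open(os.path.join(root, "meta", "CMakeFiles", "obj.o"), "w").close()

# Walk bottom-up and drop every intermediate CMakeFiles directory,
# leaving the linked artifacts (libraries, binaries) in place.
for dirpath, dirnames, _ in os.walk(root, topdown=False):
    for name in dirnames:
        if name == "CMakeFiles":
            shutil.rmtree(os.path.join(dirpath, name))

print(os.path.exists(os.path.join(root, "meta", "libmeta.a")))   # True
print(os.path.exists(os.path.join(root, "meta", "CMakeFiles")))  # False
```

In a workflow step the same effect is achievable with roughly `find build -type d -name CMakeFiles -prune -exec rm -rf {} +`.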
789,168 | 27,781,437,148 | IssuesEvent | 2023-03-16 21:26:59 | PrefectHQ/prefect | https://api.github.com/repos/PrefectHQ/prefect | closed | Flow with many mapped tasks fails | bug priority:high status:in-progress from:sales | ### First check
- [X] I added a descriptive title to this issue.
- [X] I used the GitHub search to find a similar issue and didn't find it.
- [X] I searched the Prefect documentation for this issue.
- [X] I checked that this issue is related to Prefect and not one of its dependencies.
### Bug summary
When I run a flow after deployment with a Prefect Agent, I run into crashes when running more than 100 mapped tasks. A full run of my current flow is about 3100 mapped tasks, but I have flows I want to migrate to 2.0 that have tens of thousands.
### Reproduction
```python3
import time
from datetime import datetime

from prefect import flow, task, get_run_logger


@task(name="Test Task")
def test_task(my_range):
    logger = get_run_logger()
    logger.info(my_range)
    time.sleep(5)
    logger.info(datetime.today())


@flow(name="Test Flow")
def test_flow():
    iterate_list = [x for x in range(3000)]
    blah = test_task.map(iterate_list)
    print(blah)


if __name__ == "__main__":
    test_flow()
```
### Error
```python3
Traceback (most recent call last):
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/engine.py", line 1334, in report_task_run_crashes
yield
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/engine.py", line 1070, in begin_task_run
connect_error = await client.api_healthcheck()
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/client/orion.py", line 204, in api_healthcheck
await self._client.get("/health")
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1751, in get
return await self.request(
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1527, in request
return await self.send(request, auth=auth, follow_redirects=follow_redirects)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/prefect/client/base.py", line 159, in send
await super().send(*args, **kwargs)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1614, in send
response = await self._send_handling_auth(
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1642, in _send_handling_auth
response = await self._send_handling_redirects(
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1679, in _send_handling_redirects
response = await self._send_single_request(request)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_client.py", line 1716, in _send_single_request
response = await transport.handle_async_request(request)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpx/_transports/default.py", line 353, in handle_async_request
resp = await self._pool.handle_async_request(req)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 252, in handle_async_request
await self.response_closed(status)
asyncio.exceptions.CancelledError
10:56:53.688 | ERROR | Task run 'Get-Items-d8ed86f1-2473' - Crash detected! Execution was cancelled by the runtime environment.
10:56:53.688 | DEBUG | Task run 'Get-Items-d8ed86f1-2473' - Crash details:
Traceback (most recent call last):
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/anyio/_core/_synchronization.py", line 314, in acquire
self.acquire_nowait()
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/anyio/_core/_synchronization.py", line 342, in acquire_nowait
raise WouldBlock
anyio.WouldBlock
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 237, in handle_async_request
response = await connection.handle_async_request(request)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/connection.py", line 90, in handle_async_request
return await self._connection.handle_async_request(request)
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_async/http2.py", line 96, in handle_async_request
await self._max_streams_semaphore.acquire()
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/httpcore/_synchronization.py", line 46, in acquire
await self._semaphore.acquire()
File "/home/tenders/.cache/pypoetry/virtualenvs/prefect-orion-HonJDUqB-py3.10/lib/python3.10/site-packages/anyio/_core/_synchronization.py", line 319, in acquire
await event.wait()
File "/usr/lib/python3.10/asyncio/locks.py", line 214, in wait
await fut
asyncio.exceptions.CancelledError
```
### Versions
```Text
Version: 2.6.5
API version: 0.8.3
Python version: 3.10.6
Git commit: 9fc2658f
Built: Thu, Oct 27, 2022 2:24 PM
OS/Arch: linux/x86_64
Profile: sandbox
Server type: cloud
```
### Additional context
_No response_
782,545 | 27,499,750,739 | IssuesEvent | 2023-03-05 15:05:14 | brandondombrowsky/BastCastle | https://api.github.com/repos/brandondombrowsky/BastCastle | closed | Write a script to demo for Google Integration | priority-medium | As a developer I want to showcase my work so my client knows I am on track.
Can be super duper simple, just any step above "Hey Google turn on my light"; Something more like "Hey Google run 'demo light script'"
366,286 | 10,819,292,779 | IssuesEvent | 2019-11-08 14:06:01 | coreywebber/bdw | https://api.github.com/repos/coreywebber/bdw | closed | Redsky Maintenance | priority source-database | - [x] Rerun all processes
- [x] Full Pull Component
- [ ] Full Pull Component Properties
4,899 | 2,610,160,256 | IssuesEvent | 2015-02-26 18:50:51 | chrsmith/republic-at-war | https://api.github.com/repos/chrsmith/republic-at-war | closed | Map Issue | auto-migrated Priority-Medium Type-Defect | ```
Geonosis
Weird light issue on top corner of map
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 31 Jan 2011 at 2:24
186,218 | 6,734,460,856 | IssuesEvent | 2017-10-18 18:07:43 | octobercms/october | https://api.github.com/repos/octobercms/october | closed | Attachments inside a Relation "One to One" don't work | Priority: Medium Status: Abandoned Status: Review Needed Type: Unconfirmed Bug | Hello guys,
I have created two entities and connected them with a one-to-one relation. When I attach a file in the second entity (connected to the first through a one-to-one partial), it doesn't store the file.
To be precise, the system adds the value to the "system_files" table but doesn't store the connection.
I thought that was my fault, so I tested the same thing using the October test plugin (oc-test-plugin). I used the Person and Phone models.
The result is the same.
I hope I was helpful.
56,481 | 32,028,191,478 | IssuesEvent | 2023-09-22 10:19:42 | keras-team/keras | https://api.github.com/repos/keras-team/keras | closed | High memory consumption with model.fit in TF 2.x | type:bug/performance | Moved from Tensorflow repository
https://github.com/tensorflow/tensorflow/issues/40942
@gdudziuk opened this issue in TF repo.
System information
Have I written custom code: Yes
OS Platform and Distribution: CentOS Linux 7
Mobile device: Not verified on mobile devices
TensorFlow installed from: binary, via pip install tf-nightly
TensorFlow version: 2.5.0-dev20200626
Python version: 3.6.8
CUDA/cuDNN version: 10.1 / 7
GPU model and memory: Tesla V100 32 GB
Describe the current behavior
Model training with the Keras API consumes a high amount of system memory. It looks like the memory used by model.fit is proportional to the size of the training data provided as numpy arrays, with the proportionality constant being approximately 1. In other words, if the numpy arrays x and y are, say, 8 GB in total, then model.fit(x,y,...) will use another 8 GB (plus some overhead). So the memory model.fit uses is twice the data size (plus some overhead).
The same applies to the validation data. If validation data are passed as numpy arrays to model.fit via the argument validation_data, then the memory use of model.fit seems to duplicate the size of the validation data arrays.
The described effect is also present if I wrap the numpy arrays containing the data in TF Datasets.
In the code attached below, one may change the variable K to vary the size of the data and test the above described behavior. It is straightforward to estimate the data size (e.g. with K=5000 the data arrays in the below code should be ca. 7.32 GB in total). The whole Python process associated with this code uses approximately twice this much RAM plus some overhead independent of the data size. One may comment out the line containing model.fit to check that it is the point at which the high memory consumption starts.
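That size estimate is plain `nbytes` arithmetic. A sketch follows; the `(N, N)` float32 image shape is an assumption for illustration — the attached script defines the actual arrays, and its `y` data accounts for the rest of the ~7.32 GB total.

```python
import numpy as np

K, N = 5000, 512  # same constants as in the test code below

# Assume one training image is an (N, N) float32 array (illustrative).
one_image = np.zeros((N, N), dtype=np.float32)
print(one_image.nbytes)              # 1048576 bytes = 1 MiB per image
print(K * one_image.nbytes / 2**30)  # 4.8828125 -> ~4.88 GiB for x alone
```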
Describe the expected behavior
It would be reasonable to expect that the memory usage by the test code was approximately the data size plus some overhead independent of the data size (not twice the data size plus overhead).
A bit of history
This is a continuation of the issue #35030, concerning TF 2.0 and 2.1. I opened the latter issue in December 2019 and now @karmel has stated that that issue is very long and asked me to test whether the issue persists in TF-nightly and open a new issue if necessary. So yes, the problem persists, and here I open a new issue.
The problem first appeared in the release 2.0.0-rc0. In the earlier releases, up to 2.0.0-b1 inclusive, the memory usage of the below test code was ca. the size of the data arrays plus an overhead independent of the data size. Starting from 2.0.0-rc0 it became twice the data size plus overhead, and this remained true at least until 2.1.0.
Next, in 2.2.0, the situation changed a bit:
When using numpy arrays to pass data to model.fit, there was a memory leak about 0.5 x data size in each epoch. In other words, if the size of the data arrays was ca. 8 GB, then the memory usage was increasing ca. 4 GB each epoch.
When wrapping the data arrays in TF datasets and then passing to model.fit, then the behavior was the same in TF 2.2 as in 2.1 and 2.0, namely the memory usage was twice the data size plus overhead.
Now, in the nightly release 2.5.0-dev20200626 we are back to the previous situation, namely the memory usage is twice the data size plus overhead, regardless of whether numpy arrays or datasets are used to pass the data to model.fit.
An important note on reproducibility
The issue has turned out not to be reproducible in Colab! In #35030, I reported the issue for my local machine, and some other participants also managed to reproduce it locally, but not in Colab. Some tried to reproduce it in Colab without success. Similarly, the results I report now are not from Colab.
Also, for some reason the issue cannot be captured when using `libmemusage.so` to measure the memory usage. To capture the issue, I use `ps au` in a Linux terminal or the Python module `psutil`.
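For reference, a similar resident-set-size reading can be taken with only the Python standard library. This is a sketch assuming Linux, where `getrusage` reports `ru_maxrss` in kilobytes (macOS reports bytes); it is not the exact measurement script used for the numbers in this report:

```python
import resource

def peak_rss_kb():
    """Peak resident set size of the current process, in kB on Linux."""
    return resource.getrusage(resource.RUSAGE_SELF).ru_maxrss

before = peak_rss_kb()
blob = b"x" * (50 * 1024 * 1024)   # touch ~50 MB so the counter moves
after = peak_rss_kb()
del blob

print(f"peak RSS grew by ~{(after - before) / 1024:.0f} MB")
```

Calling `peak_rss_kb()` before and after `model.fit` brackets the memory attributable to training.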
Standalone code to reproduce the issue
Since this issue is in fact a continuation of #35030, I use the same test code here.
```
import tensorflow as tf
import numpy as np
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Input, Lambda, Conv2D
print("Tensorflow version: {}".format(tf.__version__),flush=True)
K = 5000 # Number of images
N = 512 # Image size
MAX_SIGNAL = 5000 # The values of the training data range from 0 to this
def build_model():
    '''Create a simple test model.'''
    inputs = Input((N,N,1))
    s = Lambda(lambda x: x / MAX_SIGNAL)(inputs)
    s = Conv2D(1, (3, 3), activation='sigmoid', padding='same')(s)
    outputs = s
    return Model(inputs=[inputs], outputs=[outputs])
# Generate some random data
x_train = np.random.randint(MAX_SIGNAL+1, size=(K,N,N,1), dtype=np.uint16)  # Should be 2 560 000 kB
y_train = np.random.randint(1+1, size=(K,N,N,1), dtype=np.bool_)  # np.bool was removed in NumPy 1.24; np.bool_ keeps this runnable. Should be 1 280 000 kB
x_val = np.random.randint(MAX_SIGNAL+1, size=(K,N,N,1), dtype=np.uint16)  # Should be 2 560 000 kB
y_val = np.random.randint(1+1, size=(K,N,N,1), dtype=np.bool_)  # Should be 1 280 000 kB
# In total, the above arrays should be 7 680 000 kB
model = build_model()
optimizer = tf.keras.optimizers.Adam()
loss = tf.keras.losses.BinaryCrossentropy()
model.compile(optimizer=optimizer, loss=loss)
model.fit(x=x_train, y=y_train, validation_data=(x_val,y_val), batch_size=8, epochs=10)
```

The above is meant to reproduce the issue with data passed to model.fit as numpy arrays. To test the behavior with TF datasets, replace the last line with the following:

```
ds_train = tf.data.Dataset.from_tensor_slices((x_train,y_train)).batch(8)
ds_val = tf.data.Dataset.from_tensor_slices((x_val,y_val)).batch(8)
model.fit(ds_train, validation_data=ds_val, epochs=10)
```
220,389 | 17,192,203,107 | IssuesEvent | 2021-07-16 12:40:02 | FundacionParaguaya/stoplight-web | https://api.github.com/repos/FundacionParaguaya/stoplight-web | closed | Create new input text component | tested | The new input component should resemble this design

Connect to formik for validation
61,050 | 3,137,535,471 | IssuesEvent | 2015-09-11 03:38:59 | DivineRPG/DivineRPG | https://api.github.com/repos/DivineRPG/DivineRPG | closed | Raw and cooked empowered meat have the same textures | bug low-priority | Shouldn't they have different textures? The only way to tell the difference right now is to mouse over it.
187,611 | 22,045,817,136 | IssuesEvent | 2022-05-30 01:29:06 | utopikkad/my-Todo-List | https://api.github.com/repos/utopikkad/my-Todo-List | closed | WS-2021-0039 (Low) detected in core-7.2.16.tgz, core-9.0.0.tgz - autoclosed | security vulnerability | ## WS-2021-0039 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>core-7.2.16.tgz</b>, <b>core-9.0.0.tgz</b></p></summary>
<p>
<details><summary><b>core-7.2.16.tgz</b></p></summary>
<p>Angular - the core framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/@angular/core/-/core-7.2.16.tgz">https://registry.npmjs.org/@angular/core/-/core-7.2.16.tgz</a></p>
<p>
Dependency Hierarchy:
- :x: **core-7.2.16.tgz** (Vulnerable Library)
</details>
<details><summary><b>core-9.0.0.tgz</b></p></summary>
<p>Angular - the core framework</p>
<p>Library home page: <a href="https://registry.npmjs.org/@angular/core/-/core-9.0.0.tgz">https://registry.npmjs.org/@angular/core/-/core-9.0.0.tgz</a></p>
<p>
Dependency Hierarchy:
- codelyzer-6.0.0.tgz (Root Library)
- :x: **core-9.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/utopikkad/my-Todo-List/commit/a575471c4d32902c4fe3a01ed7cb42670c976994">a575471c4d32902c4fe3a01ed7cb42670c976994</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Cross-Site Scripting (XSS) vulnerability was found in @angular/core before 11.1.1. HTML doesn't specify any way to escape comment end text inside the comment.
<p>Publish Date: 2021-01-26
<p>URL: <a href=https://github.com/angular/angular/commit/97ec6e48493bf9418971436d885470a66e71f045>WS-2021-0039</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
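As a cross-check of the 3.9 above, the CVSS v3 base score can be recomputed from those metrics. The sketch below hard-codes the published FIRST specification weights; the vector (AV:N/AC:H/PR:H/UI:R/S:U/C:L/I:L/A:L) is inferred from the metrics listed, not quoted from the advisory:

```python
import math

# Metric weights from the FIRST CVSS v3.0/v3.1 specification.
W = {
    "AV": {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.20},
    "AC": {"L": 0.77, "H": 0.44},
    "PR_unchanged": {"N": 0.85, "L": 0.62, "H": 0.27},
    "PR_changed":   {"N": 0.85, "L": 0.68, "H": 0.50},
    "UI": {"N": 0.85, "R": 0.62},
    "CIA": {"H": 0.56, "L": 0.22, "N": 0.0},
}

def roundup(x):
    """Round up to one decimal place, as the CVSS specification defines it."""
    return math.ceil(x * 10) / 10

def base_score(av, ac, pr, ui, scope, c, i, a):
    iss = 1 - (1 - W["CIA"][c]) * (1 - W["CIA"][i]) * (1 - W["CIA"][a])
    if scope == "U":
        impact = 6.42 * iss
        pr_w = W["PR_unchanged"][pr]
    else:
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
        pr_w = W["PR_changed"][pr]
    expl = 8.22 * W["AV"][av] * W["AC"][ac] * pr_w * W["UI"][ui]
    if impact <= 0:
        return 0.0
    if scope == "U":
        return roundup(min(impact + expl, 10))
    return roundup(min(1.08 * (impact + expl), 10))

# Metrics listed above: Network/High/High/Required, scope Unchanged, C/I/A Low.
print(base_score("N", "H", "H", "R", "U", "L", "L", "L"))  # -> 3.9
```

With the same function, the 6.1 reported further down this file for CVE-2020-11023 (Network/Low/None/Required, scope Changed, C:L/I:L/A:N) also checks out.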
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/angular/angular/releases/tag/11.1.1">https://github.com/angular/angular/releases/tag/11.1.1</a></p>
<p>Release Date: 2021-01-26</p>
<p>Fix Resolution: @angular/core - 11.1.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
181,262 | 6,657,796,948 | IssuesEvent | 2017-09-30 10:43:32 | status-im/status-react | https://api.github.com/repos/status-im/status-react | closed | No password validation on Unsigned transactions screen [0.9.11] | bug high-priority | ### Description
*Type*: Bug
*Summary*: I can sign a transaction with any password I like, and it is then sent and shown on Ropsten Testnet
#### Expected behavior
Password validation should be present when signing transactions
#### Actual behavior
No password validation; I can sign and send a transaction with any password I like and see it on Ropsten
https://ropsten.etherscan.io/address/0xc26e4919248f752d2ae999b3b91f43ee54714b70
### Reproduction
- Open Status
- Go to 1x1 chat
- initiate /send command and go to Unsigned screen
- sign the transaction with a wrong password (one that does not match your account password)
### Additional Information
* Status version: 0.9.10-109-gf5e394b9+ (1666)
* Operating System: Android, iOS
* TF https://app.testfairy.com/projects/4803622-status/builds/6636790/sessions/12/?accessToken=jZLnIu60Tm--fm6Rdu7h694BhCw
202,874 | 23,120,022,626 | IssuesEvent | 2022-07-27 20:25:13 | fecgov/openFEC | https://api.github.com/repos/fecgov/openFEC | closed | [SNYK: Medium] org.bouncycastle:bcprov-jdk15on Cryptographic Issues & io.netty:netty-handler Improper Certificate Validation (Due 08/28/2022) | Security: moderate Security: general | 1. org.bouncycastle:bcprov-jdk15on Cryptographic Issues
[org.bouncycastle:bcprov-jdk15on](https://mvnrepository.com/artifact/org.bouncycastle/bcprov-jdk15on) is a Java implementation of cryptographic algorithms.
Affected versions of this package are vulnerable to Cryptographic Issues via a weak keyed-hash message authentication code (HMAC) that is only 16 bits long, which can result in hash collisions. The problem stems from an error in the handling of BKS version 1 keystore (BKS-V1) files and could allow an attacker to affect the integrity of these files. This vulnerability was introduced by an incomplete fix for CVE-2018-5382.
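To make the "weak 16-bit HMAC" concrete: a 16-bit tag has only 65,536 possible values, so brute-forcing or colliding it is trivial. A back-of-the-envelope sketch (not code from the advisory):

```python
mac_bits = 16
tag_space = 2 ** mac_bits          # 65,536 possible authentication tags

# An offline attacker who alters a BKS-V1 file needs at most tag_space
# guesses to hit the right tag, and on average only half that many.
worst_case_guesses = tag_space
average_guesses = tag_space // 2   # 32,768

# Birthday bound: roughly 2**(bits/2) random inputs before two share a tag.
birthday_inputs = 2 ** (mac_bits // 2)  # 256

print(tag_space, average_guesses, birthday_inputs)  # 65536 32768 256
```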
2) io.netty:netty-handler Improper Certificate Validation
[io.netty:netty-handler](https://github.com/netty/netty.git/netty-handler) is a library that provides an asynchronous event-driven network application framework and tools for rapid development of maintainable, high-performance, high-scalability protocol servers and clients. In other words, Netty is a NIO client-server framework which enables quick and easy development of network applications such as protocol servers and clients. It greatly simplifies and streamlines network programming such as TCP and UDP socket servers.
Affected versions of this package are vulnerable to Improper Certificate Validation. Certificate hostname validation is disabled by default in Netty 4.1.x which makes it potentially susceptible to Man-in-the-Middle attacks.
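Netty itself is Java, but the failure mode — TLS with hostname verification switched off — is easy to illustrate with Python's standard `ssl` module, where a properly configured client context enables both checks by default. This is illustrative only, not the Netty API:

```python
import ssl

# A correctly configured client-side TLS context verifies both the
# certificate chain and that the certificate matches the hostname.
ctx = ssl.create_default_context()
print(ctx.check_hostname)                    # True
print(ctx.verify_mode == ssl.CERT_REQUIRED)  # True

# The vulnerable configuration corresponds to switching hostname
# checking off: the handshake then succeeds against a valid certificate
# for the *wrong* host, which is what enables man-in-the-middle attacks.
ctx.check_hostname = False
```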
### Action Items
- [x] Confirm that both of these are vulnerabilities for us
- [x] Upgrade flyway to v9.0.2 or higher
- [x] Fix the vulnerability or alert the security team if there is not a remediation path available by the due date
107,846 | 16,762,431,545 | IssuesEvent | 2021-06-14 02:00:58 | atlslscsrv-app/https-net-atlsecsrv-org.github.io-dev.azure.portal.dashboard- | https://api.github.com/repos/atlslscsrv-app/https-net-atlsecsrv-org.github.io-dev.azure.portal.dashboard- | opened | CVE-2020-11023 (Medium) detected in jquery-1.11.1.js | security vulnerability | ## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/security-alerts-atlslscsrv.app/node_modules/unix-crypt-td-js/test/test.html</p>
<p>Path to vulnerable library: /security-alerts-atlslscsrv.app/node_modules/unix-crypt-td-js/test/test.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.1.js** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-11023 (Medium) detected in jquery-1.11.1.js - ## CVE-2020-11023 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.1.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.js</a></p>
<p>Path to dependency file: /tmp/ws-scm/security-alerts-atlslscsrv.app/node_modules/unix-crypt-td-js/test/test.html</p>
<p>Path to vulnerable library: /security-alerts-atlslscsrv.app/node_modules/unix-crypt-td-js/test/test.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.1.js** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In jQuery versions greater than or equal to 1.0.3 and before 3.5.0, passing HTML containing <option> elements from untrusted sources - even after sanitizing it - to one of jQuery's DOM manipulation methods (i.e. .html(), .append(), and others) may execute untrusted code. This problem is patched in jQuery 3.5.0.
<p>Publish Date: 2020-04-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11023>CVE-2020-11023</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-11023</a></p>
<p>Release Date: 2020-04-29</p>
<p>Fix Resolution: jquery - 3.5.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in jquery js cve medium severity vulnerability vulnerable library jquery js javascript library for dom operations library home page a href path to dependency file tmp ws scm security alerts atlslscsrv app node modules unix crypt td js test test html path to vulnerable library security alerts atlslscsrv app node modules unix crypt td js test test html dependency hierarchy x jquery js vulnerable library vulnerability details in jquery versions greater than or equal to and before passing html containing elements from untrusted sources even after sanitizing it to one of jquery s dom manipulation methods i e html append and others may execute untrusted code this problem is patched in jquery publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery step up your open source security game with whitesource | 0 |
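The jQuery flaw in the record above stems from jQuery's pre-3.5.0 `htmlPrefilter` step, which rewrote XHTML-style self-closing tags into open/close pairs before the browser parsed the markup, so strings a sanitizer had judged safe could parse differently after the rewrite. A minimal sketch of that rewrite, ported to Python purely for illustration (the regex is paraphrased from jQuery's source; it is not code from the record):

```python
import re

# Paraphrase of jQuery's pre-3.5.0 htmlPrefilter: any self-closing tag that
# is not a void element gets expanded into an open/close pair.
RX_HTML_TAG = re.compile(
    r"<(?!area|br|col|embed|hr|img|input|link|meta|param)"
    r"(([a-z][^/\x00>\x20\t\r\n\f]*)[^>]*)/>",
    re.IGNORECASE,
)

def html_prefilter(html: str) -> str:
    # Non-void self-closing tags are expanded; void elements are left alone.
    return RX_HTML_TAG.sub(r"<\1></\2>", html)

print(html_prefilter("<option/>"))  # <option></option>
print(html_prefilter("<style/>"))   # <style></style>
print(html_prefilter("<br/>"))      # <br/>
```

Because this rewrite ran after any sanitization, "sanitized" markup containing elements such as `<option>` could become active once expanded; jQuery 3.5.0 removed the rewrite, which is why upgrading is the suggested fix.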
256,973 | 27,561,749,354 | IssuesEvent | 2023-03-07 22:43:57 | samqws-marketing/box_mojito | https://api.github.com/repos/samqws-marketing/box_mojito | closed | CVE-2022-1471 (High) detected in snakeyaml-1.26.jar - autoclosed | Mend: dependency security vulnerability | ## CVE-2022-1471 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.26.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /restclient/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-2.3.4.RELEASE.jar (Root Library)
- :x: **snakeyaml-1.26.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_mojito/commit/65290aeb818102fa2443a637efdccebebfed1eb9">65290aeb818102fa2443a637efdccebebfed1eb9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
SnakeYaml's Constructor() class does not restrict types which can be instantiated during deserialization. Deserializing yaml content provided by an attacker can lead to remote code execution. We recommend using SnakeYaml's SafeConstructor when parsing untrusted content to restrict deserialization.
<p>Publish Date: 2022-12-01
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1471>CVE-2022-1471</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
| True | CVE-2022-1471 (High) detected in snakeyaml-1.26.jar - autoclosed - ## CVE-2022-1471 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>snakeyaml-1.26.jar</b></p></summary>
<p>YAML 1.1 parser and emitter for Java</p>
<p>Library home page: <a href="http://www.snakeyaml.org">http://www.snakeyaml.org</a></p>
<p>Path to dependency file: /restclient/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar,/home/wss-scanner/.m2/repository/org/yaml/snakeyaml/1.26/snakeyaml-1.26.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-2.3.4.RELEASE.jar (Root Library)
- :x: **snakeyaml-1.26.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/box_mojito/commit/65290aeb818102fa2443a637efdccebebfed1eb9">65290aeb818102fa2443a637efdccebebfed1eb9</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
SnakeYaml's Constructor() class does not restrict types which can be instantiated during deserialization. Deserializing yaml content provided by an attacker can lead to remote code execution. We recommend using SnakeYaml's SafeConstructor when parsing untrusted content to restrict deserialization.
<p>Publish Date: 2022-12-01
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-1471>CVE-2022-1471</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
| non_priority | cve high detected in snakeyaml jar autoclosed cve high severity vulnerability vulnerable library snakeyaml jar yaml parser and emitter for java library home page a href path to dependency file restclient pom xml path to vulnerable library home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar home wss scanner repository org yaml snakeyaml snakeyaml jar dependency hierarchy spring boot starter release jar root library x snakeyaml jar vulnerable library found in head commit a href found in base branch master vulnerability details snakeyaml s constructor class does not restrict types which can be instantiated during deserialization deserializing yaml content provided by an attacker can lead to remote code execution we recommend using snakeyaml s safeconsturctor when parsing untrusted content to restrict deserialization publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href | 0 |
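CVE-2022-1471 in the record above is an instance of a broader pattern: a deserializer that instantiates arbitrary types named by attacker-controlled input. The same class of bug exists in Python's `pickle`, which makes for a compact stdlib illustration (this is an analogy, not SnakeYaml code; the payload below only appends to a list, but it could call anything):

```python
import json
import pickle

side_effects = []

def attacker_controlled(msg):
    side_effects.append(msg)

class Payload:
    # pickle lets the serialized data dictate what gets CALLED at load time.
    def __reduce__(self):
        return (attacker_controlled, ("code ran during deserialization",))

blob = pickle.dumps(Payload())
pickle.loads(blob)  # merely loading the data invokes attacker_controlled()

# A "safe constructor" style API only ever yields plain data types:
data = json.loads('{"user": "alice"}')  # dicts/lists/strings, no calls
```

SnakeYaml's `SafeConstructor` plays the role `json.loads` plays here: it restricts deserialization to plain data instead of letting the document name classes to instantiate, which is the remediation the CVE recommends.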
78,885 | 9,808,442,682 | IssuesEvent | 2019-06-12 15:37:50 | fecgov/FEC | https://api.github.com/repos/fecgov/FEC | closed | Request: Make styling of glossary words more prominent in text | theme: general design | ## What were you trying to do and how can we improve it?
Read glossary terms
## General feedback?
Glossary terms: can they be highlighted or underlined? It's tough to distinguish them from the other wording with the same size and font lettering. The book icon blends in
## Details
- URL: https://beta.fec.gov/
- User Agent: Mozilla/5.0 (Windows NT 6.1; rv:43.0) Gecko/20100101 Firefox/43.0
| 1.0 | Request: Make styling of glossary words more prominent in text - ## What were you trying to do and how can we improve it?
Read glossary terms
## General feedback?
Glossary terms: can they be highlighted or underlined? It's tough to distinguish them from the other wording with the same size and font lettering. The book icon blends in
## Details
- URL: https://beta.fec.gov/
- User Agent: Mozilla/5.0 (Windows NT 6.1; rv:43.0) Gecko/20100101 Firefox/43.0
| non_priority | request make styling of glossary words more prominent in text what were you trying to do and how can we improve it read glossary terms general feedback glossary terms can they be highlighted or underlined it s tough to distinguish them from the other wording with the same size and font lettering the book icon blends in details url user agent mozilla windows nt rv gecko firefox | 0 |
1,851 | 2,576,742,266 | IssuesEvent | 2015-02-12 12:35:41 | victor--/viss-bugs | https://api.github.com/repos/victor--/viss-bugs | closed | No nicely styled popup for action area | 54713 bug Ready for test | On the station page I click "välj åtgärdsområde" (select action area)

The popup window that opens does not match the new design

| 1.0 | Ingen snygg popup gällande åtgärdsområde - På stationen klickar jag på välj åtgärdsområde

The popup window that opens does not match the new design

| non_priority | ingen snygg popup gällande åtgärdsområde på stationen klickar jag på välj åtgärdsområde pop uprutan som öppnas ser inte ut som nya designen | 0 |
376,492 | 11,147,691,212 | IssuesEvent | 2019-12-23 13:25:35 | nemtech/nem2-cli | https://api.github.com/repos/nemtech/nem2-cli | closed | nem2-cli does not encrypt the profiles | enhancement priority | nem2-cli stores the accounts in plain text. It should handle private keys securely, display warning messages before overwriting an account, and allow to create backups. | 1.0 | nem2-cli does not encrypt the profiles - nem2-cli stores the accounts in plain text. It should handle private keys securely, display warning messages before overwriting an account, and allow to create backups. | priority | cli does not encrypt the profiles cli stores the accounts in plain text it should handle private keys securely display warning messages before overwriting an account and allow to create backups | 1 |
251,891 | 21,527,174,616 | IssuesEvent | 2022-04-28 19:42:51 | damccorm/test-migration-target | https://api.github.com/repos/damccorm/test-migration-target | opened | Stuck inventory jobs should be cancelled and rescheduled for next run | improvement P2 testing | When we take a jenkins node offline and then back online, sometimes an inventory job can get stuck, and we need to unblock it manually.
Ideally, Jenkins should cancel the stuck jobs. There is some mechanism where we can cancel Jenkins Jobs for PRs that had new commits. Perhaps it can be reused here.
Context: https://lists.apache.org/thread/5pzj6pycw1lo15v66p7c2gzy4xh44bjx
Imported from Jira [BEAM-13666](https://issues.apache.org/jira/browse/BEAM-13666). Original Jira may contain additional context.
Reported by: tvalentyn. | 1.0 | Stuck inventory jobs should be cancelled and rescheduled for next run - When we take a jenkins node offline and then back online, sometimes an inventory job can get stuck, and we need to unblock it manually.
Ideally, Jenkins should cancel the stuck jobs. There is some mechanism where we can cancel Jenkins Jobs for PRs that had new commits. Perhaps it can be reused here.
Context: https://lists.apache.org/thread/5pzj6pycw1lo15v66p7c2gzy4xh44bjx
Imported from Jira [BEAM-13666](https://issues.apache.org/jira/browse/BEAM-13666). Original Jira may contain additional context.
Reported by: tvalentyn. | non_priority | stuck inventory jobs should be cancelled and rescheduled for next run when we take a jenkins node offline and then back online sometimes an inventory job can get stuck and we need to unblock it manually ideally jenkins should cancel the stuck jobs there is some mechanism where we can cancel jenkins jobs for prs that had new commits perhaps it can be reused here context imported from jira original jira may contain additional context reported by tvalentyn | 0 |
159,703 | 25,034,777,397 | IssuesEvent | 2022-11-04 15:09:55 | asulibraries/islandora-repo | https://api.github.com/repos/asulibraries/islandora-repo | closed | Investigate options for facet by peer-reviewed or open access | question theming/design design group | Some ideas from design
- move the peer-reviewed and open access facets up to the top of the facet list
- make the peer-reviewed and open access facets into a different style (like a check box)
- have a pre-filter checkbox on the main home page search box | 2.0 | Investigate options for facet by peer-reviewed or open access - Some ideas from design
- move the peer-reviewed and open access facets up to the top of the facet list
- make the peer-reviewed and open access facets into a different style (like a check box)
- have a pre-filter checkbox on the main home page search box | non_priority | investigate options for facet by peer reviewed or open access some ideas from design move the peer reviewed and open access facets up to the top of the facet list make the peer reviewed and open access facets into a different style like a check box have a pre filter checkbox on the main home page search box | 0 |
303,780 | 26,228,126,065 | IssuesEvent | 2023-01-04 20:48:40 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | sql: TestDropTableInterleavedDeleteData failed | C-test-failure O-robot branch-master | [(sql).TestDropTableInterleavedDeleteData failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2390920&tab=buildLog) on [master@75234294d22cc9a6e12724b043f49e78f0ac92d2](https://github.com/cockroachdb/cockroach/commits/75234294d22cc9a6e12724b043f49e78f0ac92d2):
Fatal error:
```
panic: pebble: closed
```
Stack:
```
goroutine 258086 [running]:
github.com/cockroachdb/pebble.(*DB).Apply(0xc0062cd400, 0xc0052f68c0, 0x7b1a974, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/github.com/cockroachdb/pebble/db.go:533 +0x3f8
github.com/cockroachdb/pebble.(*Batch).Commit(...)
/go/src/github.com/cockroachdb/cockroach/vendor/github.com/cockroachdb/pebble/batch.go:727
github.com/cockroachdb/cockroach/pkg/storage.(*pebbleBatch).Commit(0xc008ba9600, 0x1, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/pebble_batch.go:339 +0x5e
github.com/cockroachdb/cockroach/pkg/storage.WriteSyncNoop(0x5520760, 0xc004bfa870, 0x56c56a0, 0xc004093080, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/engine.go:648 +0xdb
github.com/cockroachdb/cockroach/pkg/kv/kvserver.Server.WaitForApplication.func1(0xc001d62300, 0x1, 0xc001d62300)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/stores_server.go:114 +0x1cb
github.com/cockroachdb/cockroach/pkg/kv/kvserver.Server.execStoreCommand(0xc0062fd780, 0x100000001, 0xc0024cf570, 0xc0024cf590, 0x424963)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/stores_server.go:42 +0x69
github.com/cockroachdb/cockroach/pkg/kv/kvserver.Server.WaitForApplication(0xc0062fd780, 0x5520760, 0xc004bfa870, 0xc001416840, 0xc0062fd780, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/stores_server.go:88 +0x7d
github.com/cockroachdb/cockroach/pkg/kv/kvserver._PerReplica_WaitForApplication_Handler.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0x42d4940, 0xc001416840, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/storage_services.pb.go:281 +0x86
github.com/grpc-ecosystem/grpc-opentracing/go/otgrpc.OpenTracingServerInterceptor.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc004d3b300, 0x0, 0x0, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/github.com/grpc-ecosystem/grpc-opentracing/go/otgrpc/server.go:44 +0x9ef
google.golang.org/grpc.getChainUnaryHandler.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc0024cf948, 0xb6da08, 0x41c7800, 0xc003ac78c0)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:921 +0xe7
github.com/cockroachdb/cockroach/pkg/rpc.NewServer.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc003ac78c0, 0xc003ac78c0, 0x4, 0xc00525fcc0, 0x2)
/go/src/github.com/cockroachdb/cockroach/pkg/rpc/context.go:201 +0xa8
google.golang.org/grpc.getChainUnaryHandler.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0x0, 0x15331f1671f0, 0x0, 0xc0013c3680)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:921 +0xe7
github.com/cockroachdb/cockroach/pkg/rpc.kvAuth.unaryInterceptor(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc003ac7880, 0xc0024cfa80, 0xb6da08, 0x41c7800, 0xc003ac7880)
/go/src/github.com/cockroachdb/cockroach/pkg/rpc/auth.go:60 +0x8a
google.golang.org/grpc.chainUnaryServerInterceptors.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc004d3b300, 0xc0024cfb58, 0x5247f8, 0x41e5ea0, 0xc004bfa870)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:907 +0xca
github.com/cockroachdb/cockroach/pkg/kv/kvserver._PerReplica_WaitForApplication_Handler(0x41308e0, 0xc0062fd780, 0x5520760, 0xc004bfa870, 0xc0013c3620, 0xc004abdf00, 0x5520760, 0xc004bfa870, 0xc008ae0000, 0xa)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/storage_services.pb.go:283 +0x14b
google.golang.org/grpc.(*Server).processUnaryRPC(0xc00575fba0, 0x559fd60, 0xc0026c1b00, 0xc003525600, 0xc0066f0390, 0x7ae3558, 0x0, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:1082 +0x4fd
google.golang.org/grpc.(*Server).handleStream(0xc00575fba0, 0x559fd60, 0xc0026c1b00, 0xc003525600, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:1405 +0xd25
google.golang.org/grpc.(*Server).serveStreams.func1.1(0xc005a121d0, 0xc00575fba0, 0x559fd60, 0xc0026c1b00, 0xc003525600)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:746 +0xbb
created by google.golang.org/grpc.(*Server).serveStreams.func1
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:744 +0xa1
```
<details><summary>Log preceding fatal error</summary><p>
```
=== RUN TestDropTableInterleavedDeleteData
```
</p></details>
<details><summary>More</summary><p>
Parameters:
- TAGS=
- GOFLAGS=-parallel=4
```
make stressrace TESTS=TestDropTableInterleavedDeleteData PKG=./pkg/sql TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestDropTableInterleavedDeleteData.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
Jira issue: CRDB-3611 | 1.0 | sql: TestDropTableInterleavedDeleteData failed - [(sql).TestDropTableInterleavedDeleteData failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2390920&tab=buildLog) on [master@75234294d22cc9a6e12724b043f49e78f0ac92d2](https://github.com/cockroachdb/cockroach/commits/75234294d22cc9a6e12724b043f49e78f0ac92d2):
Fatal error:
```
panic: pebble: closed
```
Stack:
```
goroutine 258086 [running]:
github.com/cockroachdb/pebble.(*DB).Apply(0xc0062cd400, 0xc0052f68c0, 0x7b1a974, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/github.com/cockroachdb/pebble/db.go:533 +0x3f8
github.com/cockroachdb/pebble.(*Batch).Commit(...)
/go/src/github.com/cockroachdb/cockroach/vendor/github.com/cockroachdb/pebble/batch.go:727
github.com/cockroachdb/cockroach/pkg/storage.(*pebbleBatch).Commit(0xc008ba9600, 0x1, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/pebble_batch.go:339 +0x5e
github.com/cockroachdb/cockroach/pkg/storage.WriteSyncNoop(0x5520760, 0xc004bfa870, 0x56c56a0, 0xc004093080, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/engine.go:648 +0xdb
github.com/cockroachdb/cockroach/pkg/kv/kvserver.Server.WaitForApplication.func1(0xc001d62300, 0x1, 0xc001d62300)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/stores_server.go:114 +0x1cb
github.com/cockroachdb/cockroach/pkg/kv/kvserver.Server.execStoreCommand(0xc0062fd780, 0x100000001, 0xc0024cf570, 0xc0024cf590, 0x424963)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/stores_server.go:42 +0x69
github.com/cockroachdb/cockroach/pkg/kv/kvserver.Server.WaitForApplication(0xc0062fd780, 0x5520760, 0xc004bfa870, 0xc001416840, 0xc0062fd780, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/stores_server.go:88 +0x7d
github.com/cockroachdb/cockroach/pkg/kv/kvserver._PerReplica_WaitForApplication_Handler.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0x42d4940, 0xc001416840, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/storage_services.pb.go:281 +0x86
github.com/grpc-ecosystem/grpc-opentracing/go/otgrpc.OpenTracingServerInterceptor.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc004d3b300, 0x0, 0x0, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/github.com/grpc-ecosystem/grpc-opentracing/go/otgrpc/server.go:44 +0x9ef
google.golang.org/grpc.getChainUnaryHandler.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc0024cf948, 0xb6da08, 0x41c7800, 0xc003ac78c0)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:921 +0xe7
github.com/cockroachdb/cockroach/pkg/rpc.NewServer.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc003ac78c0, 0xc003ac78c0, 0x4, 0xc00525fcc0, 0x2)
/go/src/github.com/cockroachdb/cockroach/pkg/rpc/context.go:201 +0xa8
google.golang.org/grpc.getChainUnaryHandler.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0x0, 0x15331f1671f0, 0x0, 0xc0013c3680)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:921 +0xe7
github.com/cockroachdb/cockroach/pkg/rpc.kvAuth.unaryInterceptor(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc003ac7880, 0xc0024cfa80, 0xb6da08, 0x41c7800, 0xc003ac7880)
/go/src/github.com/cockroachdb/cockroach/pkg/rpc/auth.go:60 +0x8a
google.golang.org/grpc.chainUnaryServerInterceptors.func1(0x5520760, 0xc004bfa870, 0x42d4940, 0xc001416840, 0xc004d3b2e0, 0xc004d3b300, 0xc0024cfb58, 0x5247f8, 0x41e5ea0, 0xc004bfa870)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:907 +0xca
github.com/cockroachdb/cockroach/pkg/kv/kvserver._PerReplica_WaitForApplication_Handler(0x41308e0, 0xc0062fd780, 0x5520760, 0xc004bfa870, 0xc0013c3620, 0xc004abdf00, 0x5520760, 0xc004bfa870, 0xc008ae0000, 0xa)
/go/src/github.com/cockroachdb/cockroach/pkg/kv/kvserver/storage_services.pb.go:283 +0x14b
google.golang.org/grpc.(*Server).processUnaryRPC(0xc00575fba0, 0x559fd60, 0xc0026c1b00, 0xc003525600, 0xc0066f0390, 0x7ae3558, 0x0, 0x0, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:1082 +0x4fd
google.golang.org/grpc.(*Server).handleStream(0xc00575fba0, 0x559fd60, 0xc0026c1b00, 0xc003525600, 0x0)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:1405 +0xd25
google.golang.org/grpc.(*Server).serveStreams.func1.1(0xc005a121d0, 0xc00575fba0, 0x559fd60, 0xc0026c1b00, 0xc003525600)
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:746 +0xbb
created by google.golang.org/grpc.(*Server).serveStreams.func1
/go/src/github.com/cockroachdb/cockroach/vendor/google.golang.org/grpc/server.go:744 +0xa1
```
<details><summary>Log preceding fatal error</summary><p>
```
=== RUN TestDropTableInterleavedDeleteData
```
</p></details>
<details><summary>More</summary><p>
Parameters:
- TAGS=
- GOFLAGS=-parallel=4
```
make stressrace TESTS=TestDropTableInterleavedDeleteData PKG=./pkg/sql TESTTIMEOUT=5m STRESSFLAGS='-timeout 5m' 2>&1
```
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2ATestDropTableInterleavedDeleteData.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
Jira issue: CRDB-3611 | non_priority | sql testdroptableinterleaveddeletedata failed on fatal error panic pebble closed stack goroutine github com cockroachdb pebble db apply go src github com cockroachdb cockroach vendor github com cockroachdb pebble db go github com cockroachdb pebble batch commit go src github com cockroachdb cockroach vendor github com cockroachdb pebble batch go github com cockroachdb cockroach pkg storage pebblebatch commit go src github com cockroachdb cockroach pkg storage pebble batch go github com cockroachdb cockroach pkg storage writesyncnoop go src github com cockroachdb cockroach pkg storage engine go github com cockroachdb cockroach pkg kv kvserver server waitforapplication go src github com cockroachdb cockroach pkg kv kvserver stores server go github com cockroachdb cockroach pkg kv kvserver server execstorecommand go src github com cockroachdb cockroach pkg kv kvserver stores server go github com cockroachdb cockroach pkg kv kvserver server waitforapplication go src github com cockroachdb cockroach pkg kv kvserver stores server go github com cockroachdb cockroach pkg kv kvserver perreplica waitforapplication handler go src github com cockroachdb cockroach pkg kv kvserver storage services pb go github com grpc ecosystem grpc opentracing go otgrpc opentracingserverinterceptor go src github com cockroachdb cockroach vendor github com grpc ecosystem grpc opentracing go otgrpc server go google golang org grpc getchainunaryhandler go src github com cockroachdb cockroach vendor google golang org grpc server go github com cockroachdb cockroach pkg rpc newserver go src github com cockroachdb cockroach pkg rpc context go google golang org grpc getchainunaryhandler go src github com cockroachdb cockroach vendor google golang org grpc server go github com cockroachdb cockroach pkg rpc kvauth unaryinterceptor go src github com cockroachdb cockroach pkg rpc auth go google golang org grpc chainunaryserverinterceptors go src github com 
cockroachdb cockroach vendor google golang org grpc server go github com cockroachdb cockroach pkg kv kvserver perreplica waitforapplication handler go src github com cockroachdb cockroach pkg kv kvserver storage services pb go google golang org grpc server processunaryrpc go src github com cockroachdb cockroach vendor google golang org grpc server go google golang org grpc server handlestream go src github com cockroachdb cockroach vendor google golang org grpc server go google golang org grpc server servestreams go src github com cockroachdb cockroach vendor google golang org grpc server go created by google golang org grpc server servestreams go src github com cockroachdb cockroach vendor google golang org grpc server go log preceding fatal error run testdroptableinterleaveddeletedata more parameters tags goflags parallel make stressrace tests testdroptableinterleaveddeletedata pkg pkg sql testtimeout stressflags timeout powered by jira issue crdb | 0 |
645,286 | 21,000,612,426 | IssuesEvent | 2022-03-29 17:04:12 | bcgov/digital-journeys | https://api.github.com/repos/bcgov/digital-journeys | closed | File uploads should support .msg, .docx file types | Development high priority | The client wants to add the following file formats to the upload file functionality:
.msg, .docx | 1.0 | File uploads should support .msg, .docx file types - The client wants to add the following file formats to the upload file functionality:
.msg, .docx | priority | file uploads should support msg docx file types the client wants to add the following file formats to the upload file functionality msg docx | 1 |
538,663 | 15,775,083,877 | IssuesEvent | 2021-04-01 02:14:13 | bacuarabrasil/krenak-api | https://api.github.com/repos/bacuarabrasil/krenak-api | closed | US03 - Update user | F01 - Cadastro priority:medium | As a user, I want to edit my data to provide correct and up-to-date information to the Krenak app team.
Acceptance criteria:
API:
- [x] Allows editing of first name, last name, and date of birth
- [x] Must allow editing both via the REST API and through the admin dashboard
App:
- [x] Must include a data-editing screen
- [x] Must include the fields allowed by the API, all optional and available for editing | 1.0 | US03 - Update user - As a user, I want to edit my data to provide correct and up-to-date information to the Krenak app team.
Critérios de aceite:
API:
- [x] Permite edição de nome, sobrenome e data de nascimento
- [x] Deverá permitir edição tanto via API REST quanto pela dashboard de administrador
App:
- [x] Deve conter uma tela de edição de dados
- [x] Deve conter os campos permitidos pela API, todos como opcionais, disponiveis para edição | priority | atualizar usuário como usuário eu quero editar meus dados para disponibilizar as informações corretas e mais recentes para a equipe do app krenak critérios de aceite api permite edição de nome sobrenome e data de nascimento deverá permitir edição tanto via api rest quanto pela dashboard de administrador app deve conter uma tela de edição de dados deve conter os campos permitidos pela api todos como opcionais disponiveis para edição | 1 |
390,329 | 11,542,108,768 | IssuesEvent | 2020-02-18 06:28:08 | wso2/product-apim | https://api.github.com/repos/wso2/product-apim | closed | Can create API categories with same name in different case | Priority/Normal Type/Bug | ### Description:
category name case sensitivity at the moment depends on DB.
### Steps to reproduce:
Create an api category with name 'test'.
Create an api category with name 'Test'
This is successful.
### Affected Product Version:
3.1.0 beta | 1.0 | Can create API categories with same name in different case - ### Description:
category name case sensitivity at the moment depends on DB.
### Steps to reproduce:
Create an api category with name 'test'.
Create an api category with name 'Test'
This is successful.
### Affected Product Version:
3.1.0 beta | priority | can create api categories with same name in different case description category name case sensitivity at the moment depends on db steps to reproduce create an api category with name test create an api category with name test this is successful affected product version beta | 1 |
497,246 | 14,366,606,871 | IssuesEvent | 2020-12-01 04:51:49 | marbl/MetagenomeScope | https://api.github.com/repos/marbl/MetagenomeScope | closed | Accept any type of ID strings | highpriorityfeature | _From @fedarko on July 26, 2017 22:50_
Or at least allow the user to specify a prefix to be removed (e.g. `NODE_` or `tig`).
Right now I've just been handling these cases manually (e.g. detecting a `tig` prefix on nodes and removing it) in the GFA parsing code in `collate.py`, but it'd be ideal to extend this functionality.
An issue with just accepting the ID strings as they are is that creating negative nodes (with the `-` prefix) doesn't work in Graphviz, and using a `c` prefix or something as input to Graphviz can cause problems when all/some input nodes have IDs starting with `c` (e.g. `contig_1` will be interpreted as negative, even when we'd only want `ccontig_1` to be interpreted as negative).
Need to think about ways around this.
_Copied from original issue: fedarko/MetagenomeScope#243_ | 1.0 | Accept any type of ID strings - _From @fedarko on July 26, 2017 22:50_
Or at least allow the user to specify a prefix to be removed (e.g. `NODE_` or `tig`).
Right now I've just been handling these cases manually (e.g. detecting a `tig` prefix on nodes and removing it) in the GFA parsing code in `collate.py`, but it'd be ideal to extend this functionality.
An issue with just accepting the ID strings as they are is that creating negative nodes (with the `-` prefix) doesn't work in Graphviz, and using a `c` prefix or something as input to Graphviz can cause problems when all/some input nodes have IDs starting with `c` (e.g. `contig_1` will be interpreted as negative, even when we'd only want `ccontig_1` to be interpreted as negative).
Need to think about ways around this.
_Copied from original issue: fedarko/MetagenomeScope#243_ | priority | accept any type of id strings from fedarko on july or at least allow the user to specify a prefix to be removed e g node or tig right now i ve just been handling these cases manually e g detecting a tig prefix on nodes and removing it in the gfa parsing code in collate py but it d be ideal to extend this functionality an issue with just accepting the id strings as they are is that creating negative nodes with the prefix doesn t work in graphviz and using a c prefix or something as input to graphviz can cause problems when all some input nodes have ids starting with c e g contig will be interpreted as negative even when we d only want ccontig to be interpreted as negative need to think about ways around this copied from original issue fedarko metagenomescope | 1 |
276,753 | 20,999,113,260 | IssuesEvent | 2022-03-29 15:46:57 | valeriupredoi/valeriupredoi.github.io | https://api.github.com/repos/valeriupredoi/valeriupredoi.github.io | closed | Meeting about site 10 December, 2021 and discussion on how to add content | documentation enhancement | @fadloff Sophie and myself we'll have a meeting on 10 Dec discuss the site, a few points before the meeting:
meeting location: https://ncas.zoom.us/j/8081337740?pwd=eVVkNGg1ZUp4bFJDUWVTb00vYkVuUT09
**So far**
- I put together a [README](https://github.com/valeriupredoi/valeriupredoi.github.io#readme) that explains a lot of stuff that needs to be done to get Jekyll installed, build the site, `git` operations etc; it may need more stuff, depending on what you guys need
- the site is already well fleshed out in terms of structure: https://valeriupredoi.github.io/
- it has working sections (almost all), working widgets, social media tabs etc
- it has a [Github Action](https://github.com/valeriupredoi/valeriupredoi.github.io/actions) that checks the build
- it needs **content**; a few pointers here:
- it'd be nice if IS-ENES3 had a GitHub repository (I can create and maintain it!)
- it'd be nice if it had a Facebook page?
- what other social media platforms is IS-ENES/ENES on?
- it's reasonably mobile-friendly, but the main sections bar is missing from the mobile version (for some reason)
**Strategy to add content**
- to be discussed: content will be only in form of Markdown and JPEGs (no HTML, at least none for Fanny and Sophie)
- need for local building, inspecting on `port: 4000` then committing
Still not sure what's the best strategy to add content w/o me doing it :grin:
~In the meantime, @fadloff could you maybe ask Sophie to get herself a GitHub account and post me her username pls?~ | 1.0 | Meeting about site 10 December, 2021 and discussion on how to add content - @fadloff Sophie and myself we'll have a meeting on 10 Dec discuss the site, a few points before the meeting:
meeting location: https://ncas.zoom.us/j/8081337740?pwd=eVVkNGg1ZUp4bFJDUWVTb00vYkVuUT09
**So far**
- I put together a [README](https://github.com/valeriupredoi/valeriupredoi.github.io#readme) that explains a lot of stuff that needs to be done to get Jekyll installed, build the site, `git` operations etc; it may need more stuff, depending on what you guys need
- the site is already well fleshed out in terms of structure: https://valeriupredoi.github.io/
- it has working sections (almost all), working widgets, social media tabs etc
- it has a [Github Action](https://github.com/valeriupredoi/valeriupredoi.github.io/actions) that checks the build
- it needs **content**; a few pointers here:
- it'd be nice if IS-ENES3 had a GitHub repository (I can create and maintain it!)
- it'd be nice if it had a Facebook page?
- what other social media platforms is IS-ENES/ENES on?
- it's reasonably mobile-friendly, but the main sections bar is missing from the mobile version (for some reason)
**Strategy to add content**
- to be discussed: content will be only in form of Markdown and JPEGs (no HTML, at least none for Fanny and Sophie)
- need for local building, inspecting on `port: 4000` then committing
Still not sure what's the best strategy to add content w/o me doing it :grin:
~In the meantime, @fadloff could you maybe ask Sophie to get herself a GitHub account and post me her username pls?~ | non_priority | meeting about site december and discussion on how to add content fadloff sophie and myself we ll have a meeting on dec discuss the site a few points before the meeting meeting location so far i put together a that explains alot of stuff that need to be done to get jekyll installed build the site git operations etc it may need more stuff depending what you guys need the site is already well fleshed out in terms of structure it has working sections almost all working widgets social media tabs etc it has a that checks the build it needs content a few pointers here it d be nice if is had a github repository i can create and maintain it it d be nice if it had a facebook page what other social media platforms is is enes enes on it s ok mobile friendly but the main sections bar is missing from the mobile version for some reason strategy to add content to be discussed content will be only in form of markdown and jpegs no html at least none for fanny and sophie need for local building inspecting on port then committing still not sure what s the best strategy to add content w o me doing it grin in the meantime fadloff could you maybe ask sophie to get herself a github account and post me her username pls | 0 |
44,022 | 13,047,867,053 | IssuesEvent | 2020-07-29 11:30:20 | rsoreq/zaproxy | https://api.github.com/repos/rsoreq/zaproxy | opened | CVE-2019-17531 (High) detected in jackson-databind-2.9.10.jar, jackson-databind-2.9.2.jar | security vulnerability | ## CVE-2019-17531 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.10.jar</b>, <b>jackson-databind-2.9.2.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.10.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/zaproxy</p>
<p>Path to vulnerable library: /tmp/ws-ua_20200729112444_WCAEYA/downloadResource_JMENZF/20200729112922/jackson-databind-2.9.10.jar</p>
<p>
Dependency Hierarchy:
- wiremock-jre8-2.25.1.jar (Root Library)
- zjsonpatch-0.4.4.jar
- :x: **jackson-databind-2.9.10.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.2.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/zaproxy/buildSrc/build.gradle.kts</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.2/1d8d8cb7cf26920ba57fb61fa56da88cc123b21f/jackson-databind-2.9.2.jar</p>
<p>
Dependency Hierarchy:
- kotlin-reflect-1.3.72.jar (Root Library)
- :x: **jackson-databind-2.9.2.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/rsoreq/zaproxy/commit/faf0234fff2dbd2142cc463fc90d7e58bcf20cd0">faf0234fff2dbd2142cc463fc90d7e58bcf20cd0</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.0.0 through 2.9.10. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the apache-log4j-extra (version 1.2.x) jar in the classpath, and an attacker can provide a JNDI service to access, it is possible to make the service execute a malicious payload.
<p>Publish Date: 2019-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-17531>CVE-2019-17531</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-17531">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-17531</a></p>
<p>Release Date: 2019-10-12</p>
<p>Fix Resolution: 2.10</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.10","isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock-jre8:2.25.1;com.flipkart.zjsonpatch:zjsonpatch:0.4.4;com.fasterxml.jackson.core:jackson-databind:2.9.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.10"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.2","isTransitiveDependency":true,"dependencyTree":"org.jetbrains.kotlin:kotlin-reflect:1.3.72;com.fasterxml.jackson.core:jackson-databind:2.9.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.10"}],"vulnerabilityIdentifier":"CVE-2019-17531","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.0.0 through 2.9.10. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the apache-log4j-extra (version 1.2.x) jar in the classpath, and an attacker can provide a JNDI service to access, it is possible to make the service execute a malicious payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-17531","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | True | CVE-2019-17531 (High) detected in jackson-databind-2.9.10.jar, jackson-databind-2.9.2.jar - ## CVE-2019-17531 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.10.jar</b>, <b>jackson-databind-2.9.2.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.10.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/zaproxy</p>
<p>Path to vulnerable library: /tmp/ws-ua_20200729112444_WCAEYA/downloadResource_JMENZF/20200729112922/jackson-databind-2.9.10.jar</p>
<p>
Dependency Hierarchy:
- wiremock-jre8-2.25.1.jar (Root Library)
- zjsonpatch-0.4.4.jar
- :x: **jackson-databind-2.9.10.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.2.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /tmp/ws-scm/zaproxy/buildSrc/build.gradle.kts</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.fasterxml.jackson.core/jackson-databind/2.9.2/1d8d8cb7cf26920ba57fb61fa56da88cc123b21f/jackson-databind-2.9.2.jar</p>
<p>
Dependency Hierarchy:
- kotlin-reflect-1.3.72.jar (Root Library)
- :x: **jackson-databind-2.9.2.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/rsoreq/zaproxy/commit/faf0234fff2dbd2142cc463fc90d7e58bcf20cd0">faf0234fff2dbd2142cc463fc90d7e58bcf20cd0</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.0.0 through 2.9.10. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the apache-log4j-extra (version 1.2.x) jar in the classpath, and an attacker can provide a JNDI service to access, it is possible to make the service execute a malicious payload.
<p>Publish Date: 2019-10-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-17531>CVE-2019-17531</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-17531">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-17531</a></p>
<p>Release Date: 2019-10-12</p>
<p>Fix Resolution: 2.10</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.10","isTransitiveDependency":true,"dependencyTree":"com.github.tomakehurst:wiremock-jre8:2.25.1;com.flipkart.zjsonpatch:zjsonpatch:0.4.4;com.fasterxml.jackson.core:jackson-databind:2.9.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.10"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.2","isTransitiveDependency":true,"dependencyTree":"org.jetbrains.kotlin:kotlin-reflect:1.3.72;com.fasterxml.jackson.core:jackson-databind:2.9.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.10"}],"vulnerabilityIdentifier":"CVE-2019-17531","vulnerabilityDetails":"A Polymorphic Typing issue was discovered in FasterXML jackson-databind 2.0.0 through 2.9.10. When Default Typing is enabled (either globally or for a specific property) for an externally exposed JSON endpoint and the service has the apache-log4j-extra (version 1.2.x) jar in the classpath, and an attacker can provide a JNDI service to access, it is possible to make the service execute a malicious payload.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-17531","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in jackson databind jar jackson databind jar cve high severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws scm zaproxy path to vulnerable library tmp ws ua wcaeya downloadresource jmenzf jackson databind jar dependency 
hierarchy wiremock jar root library zjsonpatch jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file tmp ws scm zaproxy buildsrc build gradle kts path to vulnerable library home wss scanner gradle caches modules files com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy kotlin reflect jar root library x jackson databind jar vulnerable library found in head commit a href vulnerability details a polymorphic typing issue was discovered in fasterxml jackson databind through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has the apache extra version x jar in the classpath and an attacker can provide a jndi service to access it is possible to make the service execute a malicious payload publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails a polymorphic typing issue was discovered in fasterxml jackson databind through when default typing is enabled either globally or for a specific property for an externally exposed json endpoint and the service has the apache extra version x jar in the classpath and an attacker can provide a jndi service to access it is possible to make the service execute a malicious payload vulnerabilityurl | 0 |
718,504 | 24,719,724,132 | IssuesEvent | 2022-10-20 09:44:03 | DSpace/dspace-angular | https://api.github.com/repos/DSpace/dspace-angular | closed | Probably a caching issue for impersonate feature on an admin user | bug help wanted high priority authorization performance / caching e/14 | **Describe the bug**
After a user was created, as an admin I've assigned him the admin profile and went to see the user profile.

Then I assign him the admin group

Back on the user profile page I was able to see actions that I think they don't apply for users in that profile

**To Reproduce**
Steps to reproduce the behavior (I've used Firefox on Windows 10):
1. I sign in using shibboleth authentication. Then an account was created
2. Then I sign in as an administrator and visit the user profile firstly
3. Went to groups management and assigned the new user to the administrator group
4. On the users list, I went to see the eperson profile
5. On the eperson profile (the recently added group data wasn't yet available) the impersonate option was available and I clicked it. No visible error occurred, but the **reload** action got stuck on the loading content (the dots animation).
**Expected behavior**
I think you shouldn't be able to impersonate another admin. If that's the case, the user profile should be updated with the required features and permissions/authorizations; it should also display the recently added groups.
| 1.0 | Probably a caching issue for impersonate feature on an admin user - **Describe the bug**
After a user was created, as an admin I've assigned him the admin profile and went to see the user profile.

Then I assign him the admin group

Back on the user profile page I was able to see actions that I think they don't apply for users in that profile

**To Reproduce**
Steps to reproduce the behavior (I've used Firefox on Windows 10):
1. I sign in using shibboleth authentication. Then an account was created
2. Then I sign in as an administrator and visit the user profile firstly
3. Went to groups management and assigned the new user to the administrator group
4. On the users list, I went to see the eperson profile
5. On the eperson profile (the recently added group data wasn't yet available) the impersonate option was available and I clicked it. No visible error occurred, but the **reload** action got stuck on the loading content (the dots animation).
**Expected behavior**
I think you shouldn't be able to impersonate another admin. If that's the case, the user profile should be updated with the required features and permissions/authorizations; it should also display the recently added groups.
| priority | probably a caching issue for impersonate feature on an admin user describe the bug after a user was created as an admin i ve assign him the admin profile and went to see the user profile then i assign him the admin group back on the user profile page i was able to see actions that i think they don t apply for users in that profile to reproduce steps to reproduce the behavior i ve used firefox on windows i sign in using shibboleth authentication then an account was created then i sign in as an administrator and visit the user profile firstly went to groups management and assigned the new user to the administrator group on the users list i went to see the eperson profile on the eperson profile the recently group data added wasn t yet available the impersonate option was available and i click it no visible error occurred reload action got stuck on the loading content the dots animation expected behavior i think you couldn t impersonate another admin if that s the case then the user profile should be updated with the required features and permissions authorizations also it should display the recently groups added | 1 |
410,287 | 11,985,925,735 | IssuesEvent | 2020-04-07 18:22:56 | OpenLiberty/open-liberty | https://api.github.com/repos/OpenLiberty/open-liberty | opened | acmeCA:2-0: Address what triggers an a cert refresh on an update | priority/medium team:Core Security team:Wendigo East | `AcmeProviderImpl.updateAcmeConfigService`
```
/*
* TODO We need to determine which configuration changes will result
* in requiring a certificate to be refreshed. Some that might
* trigger a refresh: validFor, directoryURI, country, locality,
* state, organization, organizationUnit
*
* We can't necessarily just check the certificate, b/c they don't
* always honor them.
*/
```
For #9017 | 1.0 | acmeCA:2-0: Address what triggers an a cert refresh on an update - `AcmeProviderImpl.updateAcmeConfigService`
```
/*
* TODO We need to determine which configuration changes will result
* in requiring a certificate to be refreshed. Some that might
* trigger a refresh: validFor, directoryURI, country, locality,
* state, organization, organizationUnit
*
* We can't necessarily just check the certificate, b/c they don't
* always honor them.
*/
```
For #9017 | priority | acmeca address what triggers an a cert refresh on an update acmeproviderimpl updateacmeconfigservice todo we need to determine which configuration changes will result in requiring a certificate to be refreshed some that might trigger a refresh validfor directoryuri country locality state organization organizationunit we can t necessarily just check the certificate b c they don t always honor them for | 1 |
192,674 | 22,215,987,286 | IssuesEvent | 2022-06-08 01:44:09 | panasalap/linux-4.1.15 | https://api.github.com/repos/panasalap/linux-4.1.15 | reopened | CVE-2017-14340 (Medium) detected in linux-yocto-4.1v4.1.17 | security vulnerability | ## CVE-2017-14340 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-4.1v4.1.17</b></p></summary>
<p>
<p>[no description]</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-4.1>https://git.yoctoproject.org/git/linux-yocto-4.1</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.1.15/commit/aae4c2fa46027fd4c477372871df090c6b94f3f1">aae4c2fa46027fd4c477372871df090c6b94f3f1</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/xfs/xfs_linux.h</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The XFS_IS_REALTIME_INODE macro in fs/xfs/xfs_linux.h in the Linux kernel before 4.13.2 does not verify that a filesystem has a realtime device, which allows local users to cause a denial of service (NULL pointer dereference and OOPS) via vectors related to setting an RHINHERIT flag on a directory.
<p>Publish Date: 2017-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-14340>CVE-2017-14340</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-14340">https://nvd.nist.gov/vuln/detail/CVE-2017-14340</a></p>
<p>Release Date: 2017-09-15</p>
<p>Fix Resolution: 4.13.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2017-14340 (Medium) detected in linux-yocto-4.1v4.1.17 - ## CVE-2017-14340 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-4.1v4.1.17</b></p></summary>
<p>
<p>[no description]</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-4.1>https://git.yoctoproject.org/git/linux-yocto-4.1</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.1.15/commit/aae4c2fa46027fd4c477372871df090c6b94f3f1">aae4c2fa46027fd4c477372871df090c6b94f3f1</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/fs/xfs/xfs_linux.h</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The XFS_IS_REALTIME_INODE macro in fs/xfs/xfs_linux.h in the Linux kernel before 4.13.2 does not verify that a filesystem has a realtime device, which allows local users to cause a denial of service (NULL pointer dereference and OOPS) via vectors related to setting an RHINHERIT flag on a directory.
<p>Publish Date: 2017-09-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-14340>CVE-2017-14340</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-14340">https://nvd.nist.gov/vuln/detail/CVE-2017-14340</a></p>
<p>Release Date: 2017-09-15</p>
<p>Fix Resolution: 4.13.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_priority | cve medium detected in linux yocto cve medium severity vulnerability vulnerable library linux yocto library home page a href found in head commit a href found in base branch master vulnerable source files fs xfs xfs linux h vulnerability details the xfs is realtime inode macro in fs xfs xfs linux h in the linux kernel before does not verify that a filesystem has a realtime device which allows local users to cause a denial of service null pointer dereference and oops via vectors related to setting an rhinherit flag on a directory publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
258,163 | 19,546,499,678 | IssuesEvent | 2022-01-02 00:31:00 | timoast/signac | https://api.github.com/repos/timoast/signac | closed | metadata missing | documentation | Hi Signac,
The file of "per_barcode_matrices.csv" is missing from the output of cellranger-arc aggr on multiple samples analyzed by cellranger-arc. Do you have any idea for how to generate a metadata in this case? I know it should be the issue with cellranger but just check to see if there are any other methods to generate this part of information.
thanks in advance | 1.0 | metadata missing - Hi Signac,
The file of "per_barcode_matrices.csv" is missing from the output of cellranger-arc aggr on multiple samples analyzed by cellranger-arc. Do you have any idea for how to generate a metadata in this case? I know it should be the issue with cellranger but just check to see if there are any other methods to generate this part of information.
thanks in advance | non_priority | metadata missing hi signac the file of per barcode matrices csv is missing from the output of cellranger arc aggr on multiple samples analyzed by cellranger arc do you have any idea for how to generate a metadata in this case i know it should be the issue with cellranger but just check to see if there are any other methods to generate this part of information thanks in advance | 0 |
395,043 | 11,670,724,305 | IssuesEvent | 2020-03-04 00:50:04 | autodo-app/autodo | https://api.github.com/repos/autodo-app/autodo | opened | Deploy 0.3.1 as com.autodo.autodo on official google account | Logistics priority: high | Need to move the app over to the autodoapp@gmail.com account and the app id will need to be updated in the firebase setup. | 1.0 | Deploy 0.3.1 as com.autodo.autodo on official google account - Need to move the app over to the autodoapp@gmail.com account and the app id will need to be updated in the firebase setup. | priority | deploy as com autodo autodo on official google account need to move the app over to the autodoapp gmail com account and the app id will need to be updated in the firebase setup | 1 |
185,778 | 6,730,347,464 | IssuesEvent | 2017-10-18 00:18:27 | NuGet/Home | https://api.github.com/repos/NuGet/Home | opened | Address localization for new strings in nuget.exe | Area:PackageSigning Needs loc Priority:0 | we are currently adding new commands to nuget.exe. Currently these strings are not being localized. we should fix this before shipping. | 1.0 | Address localization for new strings in nuget.exe - we are currently adding new commands to nuget.exe. Currently these strings are not being localized. we should fix this before shipping. | priority | address localization for new strings in nuget exe we are currently adding new commands to nuget exe currently these strings are not being localized we should fix this before shipping | 1 |
152,023 | 12,068,796,754 | IssuesEvent | 2020-04-16 15:14:31 | kubernetes-sigs/metrics-server | https://api.github.com/repos/kubernetes-sigs/metrics-server | closed | Travis CI doesn't report results | priority/critical-urgent sig/testing | On a couple of recent PRs the report from Travis CI is missing. For example, https://github.com/kubernetes-sigs/metrics-server/pull/481 has failing tests: https://travis-ci.org/github/kubernetes-sigs/metrics-server/jobs/670686705.
This means that failing tests are no longer a blocker for merging PRs.
So far I have not been able to pinpoint the root cause. There are no logs or warnings that would point to an error. I suspected a problem with OAuth, so I asked @spiffxp, who is an owner of kubernetes-sigs; it looks like there were no recent changes.
For now we should watch test results manually when giving /approve and /lgtm on PRs
/cc @s-urbaniak
/cc @kawych
This problem is a good reason to start discussing a migration to prow. It should now be reasonably easy to configure presubmits and avoid depending on non-standard solutions within the k8s org.
<!-- DO NOT EDIT BELOW THIS LINE -->
/kind bug | 1.0 | Travis CI doesn't report results - On a couple of recent PRs the report from Travis CI is missing. For example, https://github.com/kubernetes-sigs/metrics-server/pull/481 has failing tests: https://travis-ci.org/github/kubernetes-sigs/metrics-server/jobs/670686705.
This means that failing tests are no longer a blocker for merging PRs.
So far I have not been able to pinpoint the root cause. There are no logs or warnings that would point to an error. I suspected a problem with OAuth, so I asked @spiffxp, who is an owner of kubernetes-sigs; it looks like there were no recent changes.
For now we should watch test results manually when giving /approve and /lgtm on PRs
/cc @s-urbaniak
/cc @kawych
This problem is a good reason to start discussing a migration to prow. It should now be reasonably easy to configure presubmits and avoid depending on non-standard solutions within the k8s org.
<!-- DO NOT EDIT BELOW THIS LINE -->
/king bug | non_priority | travis ci doesn t report results on couple of recent prs there is a missing raport from travis ci for example has failing tests this means that failing tests are no longer bloker for merging prs currently i was not able to pinpoint root cause there is no logs or warning that would inform about error i suspected that there is problem with oauth so i asked one spiffxp who is owner of kubernetes sigs looks like there was no recent changes for now we should watch test results manually when giving approve and lgtm on prs cc s urbaniak cc kawych this problem is a good reason to start discussing migration to prow now it should be reasonably easy to configure presubmits and not be in situation that we depend on non standard solutions within org king bug | 0 |
7,529 | 25,046,122,936 | IssuesEvent | 2022-11-05 09:08:56 | ThinkingEngine-net/PickleTestSuite | https://api.github.com/repos/ThinkingEngine-net/PickleTestSuite | closed | Selenium Drive URL Moved - Update env check Link | bug Browser Automation | The URL for downloading selenium chrome drivers has moved. Update the failed check URL to reflect https://sites.google.com/chromium.org/driver/downloads | 1.0 | Selenium Drive URL Moved - Update env check Link - The URL for downloading selenium chrome drivers has moved. Update the failed check URL to reflect https://sites.google.com/chromium.org/driver/downloads | non_priority | selenium drive url moved update env check link the url for downloading selenium chrome drivers has moved update the failed check url to reflect | 0 |
167,398 | 26,494,855,896 | IssuesEvent | 2023-01-18 04:04:20 | phetsims/number-suite-common | https://api.github.com/repos/phetsims/number-suite-common | closed | Problems with Ten Frame "Return" button in Lab screen | type:bug design:general status:ready-for-review | The Lab screen has an Undo button for each Ten Frame. Pressing it returns one object in the Ten Frame to its toolbox.
<img width="434" alt="screenshot_2116" src="https://user-images.githubusercontent.com/3046552/210843821-48392bc7-09ef-4a20-af3c-70facec4d731.png">
Problems:
(1) BUG: The Undo button does not appear when you add an object to the Ten Frame. It doesn't appear until you move the Ten Frame after adding an object.
(2) BUG: The Undo button does not seem to work on iPadOS. It's visible but unresponsive. (iPadOS 16.2 on iPad 6.)
(3) DESIGN: Why do we need/want an Undo button in such a free-form screen? Why don't we just add a drag handle to the bottom of the Ten Frame (for moving the Ten Frame if all cells are filled) and allow students to directly drag objects in/out of the Ten Frame? I find it frustrating that I can drag an object into a Ten Frame, but I can't drag an object out of a Ten Frame -- I need to use an entirely different interaction. Related to #11, there are too many ways to add/remove objects in this screen. One way that works the same for all objects would be a much better UX. | 1.0 | Problems with Ten Frame "Return" button in Lab screen - The Lab screen has an Undo button for each Ten Frame. Pressing it returns one object in the Ten Frame to its toolbox.
<img width="434" alt="screenshot_2116" src="https://user-images.githubusercontent.com/3046552/210843821-48392bc7-09ef-4a20-af3c-70facec4d731.png">
Problems:
(1) BUG: The Undo button does not appear when you add an object to the Ten Frame. It doesn't appear until you move the Ten Frame after adding an object.
(2) BUG: The Undo button does not seem to work on iPadOS. It's visible but unresponsive. (iPadOS 16.2 on iPad 6.)
(3) DESIGN: Why do we need/want an Undo button in such a free-form screen? Why don't we just add a drag handle to the bottom of the Ten Frame (for moving the Ten Frame if all cells are filled) and allow students to directly drag objects in/out of the Ten Frame? I find it frustrating that I can drag an object into a Ten Frame, but I can't drag an object out of a Ten Frame -- I need to use an entirely different interaction. Related to #11, there are too many ways to add/remove objects in this screen. One way that works the same for all objects would be a much better UX. | non_priority | problems with ten frame return button in lab screen the lab screen has an undo button for each ten frame pressing it returns one object in the ten frame to its toolbox img width alt screenshot src problems bug the undo button does not appear when you add an object to the ten frame it doesn t appear until you move the ten frame after adding an object bug the undo button does not seem to work on ipados it s visible but unresponsive ipados on ipad design why do we need want an undo button in such a free form screen why don t we just add a drag handle to the bottom of the ten frame for moving the ten frame if all cells are filled and allow students to directly drag objects in out of the ten frame i find it frustrating that i can drag an object into a ten frame but i can t drag an object out of a ten frame i need to use an entirely different interaction related to there are too many ways to add remove objects in this screen one way that works the same for all objects would be a much better ux | 0 |
777,784 | 27,294,064,988 | IssuesEvent | 2023-02-23 18:46:54 | NeurodataWithoutBorders/pynwb | https://api.github.com/repos/NeurodataWithoutBorders/pynwb | closed | [Bug]: Sphinx TypeError in check external links action | category: bug priority: high | ### What happened?
See https://github.com/NeurodataWithoutBorders/pynwb/actions/runs/4120974015/jobs/7116231197
```
/home/runner/work/pynwb/pynwb/docs/source/export.rst:31: ERROR: Unknown interpreted text role "py:code".
Exception occurred:
File "/opt/hostedtoolcache/Python/3.8.16/x64/lib/python3.8/site-packages/sphinx/ext/extlinks.py", line 103, in role
title = caption % part
TypeError: not all arguments converted during string formatting
```
cc @bendichter since I think you added this. "py:code" is not a standard prefix but we might be able to make it valid using something like https://stackoverflow.com/a/12365251
The "py:code" is probably NOT the cause of the failing GitHub action since the error was present on the GitHub action yesterday and that action succeeded. We should fix it anyways.
I'm not sure what the cause of the TypeError is. We'll have to dig deeper.
@mavaylon1 can you look into this?
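For context on the `TypeError`: Sphinx's `extlinks` role builds the link title with `title = caption % part`, and since Sphinx 4 every extlinks caption must contain exactly one `%s` placeholder — a caption without one raises exactly this error. A minimal standalone sketch of the mechanism (the captions below are made-up examples, not pynwb's actual `conf.py` entries):

```python
def expand_extlink(caption: str, part: str) -> str:
    # Sphinx's extlinks role effectively does: title = caption % part
    return caption % part

# A caption with a "%s" placeholder formats cleanly:
print(expand_extlink("GH-%s", "1234"))  # GH-1234

# A caption *without* "%s" reproduces the reported TypeError:
try:
    expand_extlink("GitHub issue", "1234")
except TypeError as err:
    print(err)  # not all arguments converted during string formatting
```

If that is the cause here, the fix would be adding the missing `%s` to the offending caption in `conf.py`.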
### Steps to Reproduce
```bash
sphinx-build -b linkcheck ./docs/source ./test_build
```
### Traceback
_No response_
### Operating System
macOS
### Python Executable
Conda
### Python Version
3.10
### Package Versions
_No response_
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/.github/CODE_OF_CONDUCT.rst)
- [X] Have you checked the [Contributing](https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/docs/CONTRIBUTING.rst) document?
- [X] Have you ensured this bug was not already [reported](https://github.com/NeurodataWithoutBorders/pynwb/issues)? | 1.0 | [Bug]: Sphinx TypeError in check external links action - ### What happened?
See https://github.com/NeurodataWithoutBorders/pynwb/actions/runs/4120974015/jobs/7116231197
```
/home/runner/work/pynwb/pynwb/docs/source/export.rst:31: ERROR: Unknown interpreted text role "py:code".
Exception occurred:
File "/opt/hostedtoolcache/Python/3.8.16/x64/lib/python3.8/site-packages/sphinx/ext/extlinks.py", line 103, in role
title = caption % part
TypeError: not all arguments converted during string formatting
```
cc @bendichter since I think you added this. "py:code" is not a standard prefix but we might be able to make it valid using something like https://stackoverflow.com/a/12365251
The "py:code" is probably NOT the cause of the failing GitHub action since the error was present on the GitHub action yesterday and that action succeeded. We should fix it anyways.
I'm not sure what the cause of the TypeError is. We'll have to dig deeper.
@mavaylon1 can you look into this?
### Steps to Reproduce
```bash
sphinx-build -b linkcheck ./docs/source ./test_build
```
### Traceback
_No response_
### Operating System
macOS
### Python Executable
Conda
### Python Version
3.10
### Package Versions
_No response_
### Code of Conduct
- [X] I agree to follow this project's [Code of Conduct](https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/.github/CODE_OF_CONDUCT.rst)
- [X] Have you checked the [Contributing](https://github.com/NeurodataWithoutBorders/pynwb/blob/dev/docs/CONTRIBUTING.rst) document?
- [X] Have you ensured this bug was not already [reported](https://github.com/NeurodataWithoutBorders/pynwb/issues)? | priority | sphinx typeerror in check external links action what happened see home runner work pynwb pynwb docs source export rst error unknown interpreted text role py code exception occurred file opt hostedtoolcache python lib site packages sphinx ext extlinks py line in role title caption part typeerror not all arguments converted during string formatting cc bendichter since i think you added this py code is not a standard prefix but we might be able to make it valid using something like the py code is probably not the cause of the failing github action since the error was present on the github action yesterday and that action succeeded we should fix it anyways i m not sure what the cause of the typeerror is we ll have to dig deeper can you look into this steps to reproduce python sphinx build b linkcheck docs source test build traceback no response operating system macos python executable conda python version package versions no response code of conduct i agree to follow this project s have you checked the document have you ensured this bug was not already | 1 |
702,539 | 24,124,971,813 | IssuesEvent | 2022-09-20 22:50:21 | FuelLabs/swayswap | https://api.github.com/repos/FuelLabs/swayswap | closed | Refactoring pool logics | page:pools priority:high page:rm_liquidity | - Improve swap create pool / add liquidity to be more clear, we are going to add xstate | 1.0 | Refactoring pool logics - - Improve swap create pool / add liquidity to be more clear, we are going to add xstate | priority | refactoring pool logics improve swap create pool add liquidity to be more clear we are going to add xstate | 1 |
732 | 2,511,241,912 | IssuesEvent | 2015-01-14 04:36:56 | AmpersandJS/amp | https://api.github.com/repos/AmpersandJS/amp | closed | iteratee needs tests? | bug test | the test file right now is a stub it seems, thanks to @jvduf for noticing | 1.0 | iteratee needs tests? - the test file right now is a stub it seems, thanks to @jvduf for noticing | non_priority | iteratee needs tests the test file right now is a stub it seems thanks to jvduf for noticing | 0 |
22,109 | 4,774,010,848 | IssuesEvent | 2016-10-27 04:01:47 | baidu/Paddle | https://api.github.com/repos/baidu/Paddle | opened | Documentation about how to write documentation. | documentation | Write a documentation to show:
* How paddle docs are organized.
* How to write `paddle docs`
* How to build `paddle docs` and `preview documentation locally`
* How does `www.paddlepaddle.org` update documentation. | 1.0 | Documentation about how to write documentation. - Write a documentation to show:
* How paddle docs are organized.
* How to write `paddle docs`
* How to build `paddle docs` and `preview documentation locally`
* How does `www.paddlepaddle.org` update documentation. | non_priority | documentation about how to write documentation write a documentation to show how paddle docs are organized how to write paddle docs how to build paddle docs and preview documentation locally how does update documentation | 0 |
172,837 | 13,348,984,568 | IssuesEvent | 2020-08-29 21:37:45 | digitallinguistics/app | https://api.github.com/repos/digitallinguistics/app | closed | update to Cypress 5.0 | dev tests | Cypress now supports test retries! This could be really helpful for dealing with some of the race conditions we run into.
https://cypress.io/blog/2020/08/19/introducing-test-retries-in-cypress-5-0/ | 1.0 | update to Cypress 5.0 - Cypress now supports test retries! This could be really helpful for dealing with some of the race conditions we run into.
https://cypress.io/blog/2020/08/19/introducing-test-retries-in-cypress-5-0/ | non_priority | update to cypress cypress now supports test retries this could be really helpful for dealing with some of the race conditions we run into | 0 |
16,532 | 9,438,197,229 | IssuesEvent | 2019-04-13 21:17:24 | bocadilloproject/bocadillo | https://api.github.com/repos/bocadilloproject/bocadillo | closed | Go all-async | breaking community docs enhancement performance | **Is your feature request related to a problem? Please describe.**
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Supporting both sync and async syntax introduced a lot of complexity to the framework code.
The main benefit we get out of that is lower barrier to entry — people not familiar with async can just jump right in and write `def my_view()`.
But Bocadillo is an async framework after all — if they're looking for a synchronous framework, they can just use a WSGI framework like Flask. Plus I think it'd be nice to spread the async love by *teaching* stuff to people.
**Describe the solution you'd like**
<!-- A clear and concise description of what you want to happen. -->
a. Remove support for synchronous syntax in:
- Views
- Error handlers
- Hooks
b. Keep synchronous syntax where it makes sense (e.g. `templates.render_string()`).
c. Be *very* explicit that using Bocadillo requires using the `async` syntax, but:
- No prerequisite necessary — take users by the hand by showing them the basic syntax.
- Show async equivalent of some common synchronous code (e.g. function definition, function calls, loops, context managers).
- Let them know they do *not* need to know anything about `asyncio` or how async works under the hood — it's just about *syntax*.
- Explain what async enables and how the benefits can be potentially ruined (e.g. running long blocking operations, e.g. querying a database with a synchronous library).
- Explain common patterns, e.g.:
- Dealing with CPU-bound operations (threadpool execution)
- Wrapping a sync function in an async one
- Give useful resources to learn more about async if they want to.
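The "common patterns" above — wrapping a sync function and running CPU-bound work in a thread pool — can be sketched in a few lines of plain `asyncio` (a generic illustration for the docs, not Bocadillo API):

```python
import asyncio

def blocking_work(n: int) -> int:
    # stand-in for CPU-bound or blocking synchronous code
    return sum(range(n))

async def main() -> int:
    # Run the sync function in the default thread pool so the
    # event loop stays responsive while it executes.
    loop = asyncio.get_running_loop()
    return await loop.run_in_executor(None, blocking_work, 10)

result = asyncio.run(main())
print(result)  # 45
```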
**Describe alternatives you've considered**
<!-- A clear and concise description of any alternative solutions or features you've considered. -->
**Implementation ideas**
<!-- If you already have ideas on how to implement this feature, please list them here. -->
- Remove async support from listed components.
- Update the docs with the above modifications.
**Additional context**
<!-- Add any other context or screenshots about the feature request here. -->
Part of the spring cleanup: #236 | True | Go all-async - **Is your feature request related to a problem? Please describe.**
<!-- A clear and concise description of what the problem is. Ex. I'm always frustrated when [...] -->
Supporting both sync and async syntax introduced a lot of complexity to the framework code.
The main benefit we get out of that is lower barrier to entry — people not familiar with async can just jump right in and write `def my_view()`.
But Bocadillo is an async framework after all — if they're looking for a synchronous framework, they can just use a WSGI framework like Flask. Plus I think it'd be nice to spread the async love by *teaching* stuff to people.
**Describe the solution you'd like**
<!-- A clear and concise description of what you want to happen. -->
a. Remove support for synchronous syntax in:
- Views
- Error handlers
- Hooks
b. Keep synchronous syntax where it makes sense (e.g. `templates.render_string()`).
c. Be *very* explicit that using Bocadillo requires using the `async` syntax, but:
- No prerequisite necessary — take users by the hand by showing them the basic syntax.
- Show async equivalent of some common synchronous code (e.g. function definition, function calls, loops, context managers).
- Let them know they do *not* need to know anything about `asyncio` or how async works under the hood — it's just about *syntax*.
- Explain what async enables and how the benefits can be potentially ruined (e.g. running long blocking operations, e.g. querying a database with a synchronous library).
- Explain common patterns, e.g.:
- Dealing with CPU-bound operations (threadpool execution)
- Wrapping a sync function in an async one
- Give useful resources to learn more about async if they want to.
**Describe alternatives you've considered**
<!-- A clear and concise description of any alternative solutions or features you've considered. -->
**Implementation ideas**
<!-- If you already have ideas on how to implement this feature, please list them here. -->
- Remove async support from listed components.
- Update the docs with the above modifications.
**Additional context**
<!-- Add any other context or screenshots about the feature request here. -->
Part of the spring cleanup: #236 | non_priority | go all async is your feature request related to a problem please describe supporting both sync and async syntax introduced a lot of complexity to the framework code the main benefit we get out of that is lower barrier to entry — people not familiar with async can just jump right in and write def my view but bocadillo is an async framework after all — if they re looking for a synchronous framework they can just use a wsgi framework like flask plus i think it d be nice to spread the async love by teaching stuff to people describe the solution you d like a remove support for synchronous syntax in views error handlers hooks b keep synchronous syntax where it makes sense e g templates render string c be very explicit that using bocadillo requires to use the async syntax but no prerequisite necessary — take users by the hand by showing them the basic syntax show async equivalent of some common synchronous code e g function definition function calls loops context managers let them know they do not need to know anything about asyncio or how async works under the hood — it s just about syntax explain what async enables and how the benefits can be potentially ruined e g running long blocking operations e g querying a database with a synchronous library explain common patterns e g dealing with cpu bound operations threadpool execution wrapping a sync function in an async one give useful resources to learn more about async if they want to describe alternatives you ve considered implementation ideas remove async support from listed components update the docs with the above modifications additional context part of the spring cleanup | 0 |
11,290 | 9,082,125,854 | IssuesEvent | 2019-02-17 09:18:07 | comit-network/comit-rs | https://api.github.com/repos/comit-network/comit-rs | closed | Travis doesn't run Rust Doc-tests | infrastructure testing | When running `cargo test` locally `Doc-tests` are run, but on Travis that is not the case.
Hint: check the Makefile | 1.0 | Travis doesn't run Rust Doc-tests - When running `cargo test` locally `Doc-tests` are run, but on Travis that is not the case.
Hint: check the Makefile | non_priority | travis doesn t run rust doc tests when running cargo test locally doc tests are run but on travis that is not the case hint check the makefile | 0 |
113,787 | 11,816,384,000 | IssuesEvent | 2020-03-20 08:54:16 | bark-simulator/carla-interface | https://api.github.com/repos/bark-simulator/carla-interface | opened | Running example of city highway straight from BARK & Documentation | documentation | running example with our map:
- spawn some agents
- in BARK: plan for ego agent, using IDM model for prediction in BARK
- create a video from it
- Update documentation for this example
| 1.0 | Running example of city highway straight from BARK & Documentation - running example with our map:
- spawn some agents
- in BARK: plan for ego agent, using IDM model for prediction in BARK
- create a video from it
- Update documentation for this example
| non_priority | running example of city highway straight from bark documentation running example with our map spawn some agents in bark plan for ego agent using idm model for prediction in bark create a video from it update documentation for this example | 0 |
3,293 | 2,666,767,874 | IssuesEvent | 2015-03-21 22:10:03 | oscar-broman/samp-weapon-config | https://api.github.com/repos/oscar-broman/samp-weapon-config | closed | Rocket Launcher and Heatseeker (others untested) | bug needs-testing | When hitting somebody with a missile, they can take a missile straight to the face and only lose ~5 to 7 health. Didn't realize until some random guy came in my server and hit me it the face with a missile. | 1.0 | Rocket Launcher and Heatseeker (others untested) - When hitting somebody with a missile, they can take a missile straight to the face and only lose ~5 to 7 health. Didn't realize until some random guy came in my server and hit me it the face with a missile. | non_priority | rocket launcher and heatseeker others untested when hitting somebody with a missile they can take a missile straight to the face and only lose to health didn t realize until some random guy came in my server and hit me it the face with a missile | 0 |
24,855 | 24,390,438,132 | IssuesEvent | 2022-10-04 14:51:26 | ClickHouse/ClickHouse | https://api.github.com/repos/ClickHouse/ClickHouse | opened | Incorrect exception message if drop column has repeated columns | usability | ```
CREATE TABLE t ( a Int64, b Int64, c Int64, d Int64) ENGINE = MergeTree ORDER BY (a);
alter table t
drop column if exists b,
drop column if exists c,
drop column if exists d,
drop column if exists b;
DB::Exception: There is no column b in table. Maybe you meant: ['a']. (NO_SUCH_COLUMN_IN_TABLE)
``` | True | Incorrect exception message if drop column has repeated columns - ```
CREATE TABLE t ( a Int64, b Int64, c Int64, d Int64) ENGINE = MergeTree ORDER BY (a);
alter table t
drop column if exists b,
drop column if exists c,
drop column if exists d,
drop column if exists b;
DB::Exception: There is no column b in table. Maybe you meant: ['a']. (NO_SUCH_COLUMN_IN_TABLE)
``` | non_priority | incorrect exception message if drop column has repeated columns create table t a b c d engine mergetree order by a alter table t drop column if exists b drop column if exists c drop column if exists d drop column if exists b db exception there is no column b in table maybe you meant no such column in table | 0 |
803,744 | 29,188,058,391 | IssuesEvent | 2023-05-19 17:11:30 | dankelley/oce | https://api.github.com/repos/dankelley/oce | opened | plotTS() not obeying rho1000 parameter | ctd graphics high priority | ``` r
library(oce)
#> Loading required package: gsw
par(mfrow=c(2,1))
data(ctd)
plotTS(ctd)
plotTS(ctd, rho1000=TRUE)
```
<sup>Created on 2023-05-19 with [reprex v2.0.2](https://reprex.tidyverse.org)</sup> | 1.0 | plotTS() not obeying rho1000 parameter - ``` r
library(oce)
#> Loading required package: gsw
par(mfrow=c(2,1))
data(ctd)
plotTS(ctd)
plotTS(ctd, rho1000=TRUE)
```
<sup>Created on 2023-05-19 with [reprex v2.0.2](https://reprex.tidyverse.org)</sup> | priority | plotts not obeying parameter r library oce loading required package gsw par mfrow c data ctd plotts ctd plotts ctd true created on with | 1 |
55,287 | 13,577,774,180 | IssuesEvent | 2020-09-20 03:34:23 | carla-simulator/carla | https://api.github.com/repos/carla-simulator/carla | closed | errors when make PythonAPI | build system question stale |
When I run `make PythonAPI`, I get the following errors (my Python version is 3.8):
Setup.sh: llvm-8.0 already installed.
Setup.sh: boost-1.72.0-c8 already installed.
Setup.sh: rpclib-v2.2.1_c2-c8 already installed.
Setup.sh: gtest-1.8.1-c8 already installed.
Setup.sh: recast-cdce4e-c8 already installed.
Setup.sh: CARLA version 0.9.9.4-1-gb695dbf2-dirty.
Setup.sh: Generating CMake configuration files.
Setup.sh: Success!
BuildLibCarla.sh: Building LibCarla "Client.Release" configuration.
ninja: no work to do.
[0/1] Install the project...
-- Install configuration: "Client"
BuildLibCarla.sh: Success!
BuildPythonAPI.sh: Building Python API for Python 2.
compiling:
- source/libcarla/libcarla.cpp
running bdist_egg
running egg_info
writing source/carla.egg-info/PKG-INFO
writing top-level names to source/carla.egg-info/top_level.txt
writing dependency_links to source/carla.egg-info/dependency_links.txt
reading manifest file 'source/carla.egg-info/SOURCES.txt'
writing manifest file 'source/carla.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
running build_ext
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/carla
copying build/lib.linux-x86_64-2.7/carla/libcarla.so -> build/bdist.linux-x86_64/egg/carla
copying build/lib.linux-x86_64-2.7/carla/__init__.py -> build/bdist.linux-x86_64/egg/carla
copying build/lib.linux-x86_64-2.7/carla/command.py -> build/bdist.linux-x86_64/egg/carla
byte-compiling build/bdist.linux-x86_64/egg/carla/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/carla/command.py to command.pyc
creating stub loader for carla/libcarla.so
byte-compiling build/bdist.linux-x86_64/egg/carla/libcarla.py to libcarla.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
writing build/bdist.linux-x86_64/egg/EGG-INFO/native_libs.txt
zip_safe flag not set; analyzing archive contents...
creating 'dist/carla-0.9.9-py2.7-linux-x86_64.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
BuildPythonAPI.sh: Building Python API for Python 3.
Traceback (most recent call last):
File "setup.py", line 160, in <module>
ext_modules=get_libcarla_extensions(),
File "setup.py", line 37, in get_libcarla_extensions
linux_distro = platform.dist()[0] # pylint: disable=W1505
AttributeError: module 'platform' has no attribute 'dist'
make: *** [Util/BuildTools/Linux.mk:89: PythonAPI] Error 1
| 1.0 | errors when make PythonAPI -
when i run: make PythonAPI, i got the following errors:
my python version is 3.8
Setup.sh: llvm-8.0 already installed.
Setup.sh: boost-1.72.0-c8 already installed.
Setup.sh: rpclib-v2.2.1_c2-c8 already installed.
Setup.sh: gtest-1.8.1-c8 already installed.
Setup.sh: recast-cdce4e-c8 already installed.
Setup.sh: CARLA version 0.9.9.4-1-gb695dbf2-dirty.
Setup.sh: Generating CMake configuration files.
Setup.sh: Success!
BuildLibCarla.sh: Building LibCarla "Client.Release" configuration.
ninja: no work to do.
[0/1] Install the project...
-- Install configuration: "Client"
BuildLibCarla.sh: Success!
BuildPythonAPI.sh: Building Python API for Python 2.
compiling:
- source/libcarla/libcarla.cpp
running bdist_egg
running egg_info
writing source/carla.egg-info/PKG-INFO
writing top-level names to source/carla.egg-info/top_level.txt
writing dependency_links to source/carla.egg-info/dependency_links.txt
reading manifest file 'source/carla.egg-info/SOURCES.txt'
writing manifest file 'source/carla.egg-info/SOURCES.txt'
installing library code to build/bdist.linux-x86_64/egg
running install_lib
running build_py
running build_ext
creating build/bdist.linux-x86_64/egg
creating build/bdist.linux-x86_64/egg/carla
copying build/lib.linux-x86_64-2.7/carla/libcarla.so -> build/bdist.linux-x86_64/egg/carla
copying build/lib.linux-x86_64-2.7/carla/__init__.py -> build/bdist.linux-x86_64/egg/carla
copying build/lib.linux-x86_64-2.7/carla/command.py -> build/bdist.linux-x86_64/egg/carla
byte-compiling build/bdist.linux-x86_64/egg/carla/__init__.py to __init__.pyc
byte-compiling build/bdist.linux-x86_64/egg/carla/command.py to command.pyc
creating stub loader for carla/libcarla.so
byte-compiling build/bdist.linux-x86_64/egg/carla/libcarla.py to libcarla.pyc
creating build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/PKG-INFO -> build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/SOURCES.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/dependency_links.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
copying source/carla.egg-info/top_level.txt -> build/bdist.linux-x86_64/egg/EGG-INFO
writing build/bdist.linux-x86_64/egg/EGG-INFO/native_libs.txt
zip_safe flag not set; analyzing archive contents...
creating 'dist/carla-0.9.9-py2.7-linux-x86_64.egg' and adding 'build/bdist.linux-x86_64/egg' to it
removing 'build/bdist.linux-x86_64/egg' (and everything under it)
BuildPythonAPI.sh: Building Python API for Python 3.
Traceback (most recent call last):
File "setup.py", line 160, in <module>
ext_modules=get_libcarla_extensions(),
File "setup.py", line 37, in get_libcarla_extensions
linux_distro = platform.dist()[0] # pylint: disable=W1505
AttributeError: module 'platform' has no attribute 'dist'
make: *** [Util/BuildTools/Linux.mk:89: PythonAPI] Error 1
| non_priority | errors when make pythonapi when i run make pythonapi i got the following errors my python version is setup sh llvm already installed setup sh boost already installed setup sh rpclib already installed setup sh gtest already installed setup sh recast already installed setup sh carla version dirty setup sh generating cmake configuration files setup sh success buildlibcarla sh building libcarla client release configuration ninja no work to do install the project install configuration client buildlibcarla sh success buildpythonapi sh building python api for python compiling source libcarla libcarla cpp running bdist egg running egg info writing source carla egg info pkg info writing top level names to source carla egg info top level txt writing dependency links to source carla egg info dependency links txt reading manifest file source carla egg info sources txt writing manifest file source carla egg info sources txt installing library code to build bdist linux egg running install lib running build py running build ext creating build bdist linux egg creating build bdist linux egg carla copying build lib linux carla libcarla so build bdist linux egg carla copying build lib linux carla init py build bdist linux egg carla copying build lib linux carla command py build bdist linux egg carla byte compiling build bdist linux egg carla init py to init pyc byte compiling build bdist linux egg carla command py to command pyc creating stub loader for carla libcarla so byte compiling build bdist linux egg carla libcarla py to libcarla pyc creating build bdist linux egg egg info copying source carla egg info pkg info build bdist linux egg egg info copying source carla egg info sources txt build bdist linux egg egg info copying source carla egg info dependency links txt build bdist linux egg egg info copying source carla egg info top level txt build bdist linux egg egg info writing build bdist linux egg egg info native libs txt zip safe flag not set analyzing archive 
contents creating dist carla linux egg and adding build bdist linux egg to it removing build bdist linux egg and everything under it buildpythonapi sh building python api for python traceback most recent call last file setup py line in ext modules get libcarla extensions file setup py line in get libcarla extensions linux distro platform dist pylint disable attributeerror module platform has no attribute dist make error | 0 |
773,352 | 27,155,400,430 | IssuesEvent | 2023-02-17 07:14:55 | owncloud/web | https://api.github.com/repos/owncloud/web | closed | Space: Show context menu: changing position when scrolling | Type:Bug Topic:good-first-issue Priority:p4-low | ### Steps to reproduce
See:
https://user-images.githubusercontent.com/26610733/196742558-d5dc5888-93df-4b9d-87c3-fb7683e96c5e.mp4
### Expected behaviour
The context menu should not relocate when scrolling
### Actual behaviour
The context menu relocates when scrolling
### Environment general
**Operating system**: macOS and W10
### Notes
Would be great to fix this before GA
@kulmann @tbsbdr fyi | 1.0 | Space: Show context menu: changing position when scrolling - ### Steps to reproduce
See:
https://user-images.githubusercontent.com/26610733/196742558-d5dc5888-93df-4b9d-87c3-fb7683e96c5e.mp4
### Expected behaviour
The context menu should not relocate when scrolling
### Actual behaviour
The context menu relocates when scrolling
### Environment general
**Operating system**: macOS and W10
### Notes
Would be great to fix this before GA
@kulmann @tbsbdr fyi | priority | space show context menu changing position when scrolling steps to reproduce see expected behaviour the context menu should not relocate when scrolling actual behaviour the context menu relocates when scrolling environment general operating system macos and notes would be great to fix this before ga kulmann tbsbdr fyi | 1 |
7,121 | 3,510,871,383 | IssuesEvent | 2016-01-09 20:46:38 | YupItsZac/FreeGeoAPI | https://api.github.com/repos/YupItsZac/FreeGeoAPI | opened | Insecure Authentication | code-enhancement needs-discussion needs-investigation URGENT | The authentication request for the API currently includes the public/private keyset in the request.
This isn't very secure and should be modified. It's too late to include it in the v2 release, so I'll scope it for the v3 release.
***My Current Thought***
Auth should use the private key to encrypt the public key, and submit the encrypted value with the auth request. The API would then decrypt the provided value using the private key and compare it to the public. The private key is _NEVER_ sent in the request.
I think this would secure the token generation process and prevent any potential for "listeners" to get an app's private key, since it's never included in the request.
| 1.0 | Insecure Authentication - The authentication request for the API currently includes the public/private keyset in the request.
This isn't very secure and should be modified. It's too late to include it in the v2 release, so I'll scope it for the v3 release.
***My Current Thought***
Auth should use the private key to encrypt the public key, and submit the encrypted value with the auth request. The API would then decrypt the provided value using the private key and compare it to the public. The private key is _NEVER_ sent in the request.
I think this would secure the token generation process and prevent any potential for "listeners" to get an app's private key, since it's never included in the request.
| non_priority | insecure authentication the authentication request for the api currently includes the public private keyset in the request this isn t very secure and should be modified it s too late to include it in the release so i ll scope it for the release my current thought auth should use the private key to encrypt the public key and submit the encrypted value with the auth request the api would then decrypt the provided value using the private key and compare it to the public the private key is never sent in the request i think this would secure the token generation process and prevent any potential for listeners to get an app s private key since it s never included in the request | 0 |
187,262 | 14,427,276,306 | IssuesEvent | 2020-12-06 03:00:41 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | go-vela/pkg-runtime: runtime/kubernetes/container_test.go; 20 LoC | fresh small test |
Found a possible issue in [go-vela/pkg-runtime](https://www.github.com/go-vela/pkg-runtime) at [runtime/kubernetes/container_test.go](https://github.com/go-vela/pkg-runtime/blob/83cd0f9fc2b1627460dd15596a47c96e3fb0f9ce/runtime/kubernetes/container_test.go#L334-L353)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable test used in defer or goroutine at line 337
[Click here to see the code in its original context.](https://github.com/go-vela/pkg-runtime/blob/83cd0f9fc2b1627460dd15596a47c96e3fb0f9ce/runtime/kubernetes/container_test.go#L334-L353)
<details>
<summary>Click here to show the 20 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range tests {
go func() {
// simulate adding a pod to the watcher
_watch.Add(test.object)
}()
err := _engine.WaitContainer(context.Background(), test.container)
if test.failure {
if err == nil {
t.Errorf("WaitContainer should have returned err")
}
continue
}
if err != nil {
t.Errorf("WaitContainer returned err: %v", err)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 83cd0f9fc2b1627460dd15596a47c96e3fb0f9ce
| 1.0 | go-vela/pkg-runtime: runtime/kubernetes/container_test.go; 20 LoC -
Found a possible issue in [go-vela/pkg-runtime](https://www.github.com/go-vela/pkg-runtime) at [runtime/kubernetes/container_test.go](https://github.com/go-vela/pkg-runtime/blob/83cd0f9fc2b1627460dd15596a47c96e3fb0f9ce/runtime/kubernetes/container_test.go#L334-L353)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable test used in defer or goroutine at line 337
[Click here to see the code in its original context.](https://github.com/go-vela/pkg-runtime/blob/83cd0f9fc2b1627460dd15596a47c96e3fb0f9ce/runtime/kubernetes/container_test.go#L334-L353)
<details>
<summary>Click here to show the 20 line(s) of Go which triggered the analyzer.</summary>
```go
for _, test := range tests {
go func() {
// simulate adding a pod to the watcher
_watch.Add(test.object)
}()
err := _engine.WaitContainer(context.Background(), test.container)
if test.failure {
if err == nil {
t.Errorf("WaitContainer should have returned err")
}
continue
}
if err != nil {
t.Errorf("WaitContainer returned err: %v", err)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 83cd0f9fc2b1627460dd15596a47c96e3fb0f9ce
| non_priority | go vela pkg runtime runtime kubernetes container test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable test used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for test range tests go func simulate adding a pod to the watcher watch add test object err engine waitcontainer context background test container if test failure if err nil t errorf waitcontainer should have returned err continue if err nil t errorf waitcontainer returned err v err leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 0 |
94,946 | 19,619,456,670 | IssuesEvent | 2022-01-07 03:11:14 | sourcegraph/sourcegraph | https://api.github.com/repos/sourcegraph/sourcegraph | closed | Insights: regex capture groups failing if regex includes an `@` symbol | bug team/code-insights insights-capture-groups-v1 capture-groups-insight | I think that @ is breaking on the backend capture groups endpoint but not in Sourcegraph. This is a valid regexp, the frontend UI tells me so (doesn’t throw a “regex formatting” error), but I get a backend error. I [can search this regexp in Sourcegraph and it’s valid](https://k8s.sgdev.org/search?q=TODO+%40(%5B%5Cw%5D%2B)+patterntype%3Aregexp).
So it seems like a problem with either our backend parsing or the compute endpoint?

| 1.0 | Insights: regex capture groups failing if regex includes an `@` symbol - I think that @ is breaking on the backend capture groups endpoint but not in Sourcegraph. This is a valid regexp, the frontend UI tells me so (doesn’t throw a “regex formatting” error), but I get a backend error. I [can search this regexp in Sourcegraph and it’s valid](https://k8s.sgdev.org/search?q=TODO+%40(%5B%5Cw%5D%2B)+patterntype%3Aregexp).
So it seems like a problem with either our backend parsing or the compute endpoint?

| non_priority | insights regex capture groups failing if regex includes an symbol i think that is breaking on the backend capture groups endpoint but not in sourcegraph this is a valid regexp the frontend ui tells me so doesn’t throw a “regex formatting” error but i get a backend error i so it seems like a problem with either our backend parsing or the compute endpoint | 0 |
191,620 | 15,299,396,030 | IssuesEvent | 2021-02-24 10:54:14 | threefoldtech/js-sdk | https://api.github.com/repos/threefoldtech/js-sdk | closed | Documentation: Billing sals needs documentation | type_documentation | ### Description
Billing sals has no documentation | 1.0 | Documentation: Billing sals needs documentation - ### Description
Billing sals has no documentation | non_priority | documentation billing sals needs documentation description billing sals has no documentation | 0 |
54,700 | 30,318,196,377 | IssuesEvent | 2023-07-10 17:06:21 | open-contracting/lib-cove-ocds | https://api.github.com/repos/open-contracting/lib-cove-ocds | reopened | Performance option: Validate one release at a time | performance schema validation | Presently, the entire package needs to be loaded into memory to be validated. This of course consumes a lot of memory for larger files. https://github.com/open-contracting/lib-cove-oc4ids/issues/23
An alternative is to read the entire input twice: once to re-build the package metadata without releases/records/etc., and then to iteratively yield each release/record for validation.
To avoid rewriting a lot of code, we could perhaps stitch the results for individual releases/records back together, so that errors are still reported as being about releases/0, releases/1, etc. even though each was validated separately.
In any case, this is the only way for memory usage to not scale with input size.
| True | Performance option: Validate one release at a time - Presently, the entire package needs to be loaded into memory to be validated. This of course consumes a lot of memory for larger files. https://github.com/open-contracting/lib-cove-oc4ids/issues/23
An alternative is to read the entire input twice: once to re-build the package metadata without releases/records/etc., and then to iteratively yield each release/record for validation.
To avoid rewriting a lot of code, we could perhaps stitch the results for individual releases/records back together, so that errors are still reported as being about releases/0, releases/1, etc. even though each was validated separately.
In any case, this is the only way for memory usage to not scale with input size.
| non_priority | performance option validate one release at a time presently the entire package needs to be loaded into memory to be validated this of course consumes a lot of memory for larger files an alternative is to read the entire input twice once to re build the package metadata without releases records etc and then to iteratively yield each release record for validation to avoid rewriting a lot of code we could perhaps stitch the results for individual releases records back together so that errors are still reported as being about releases releases etc even though each was validated separately in any case this is the only way for memory usage to not scale with input size | 0 |
94,207 | 15,962,352,399 | IssuesEvent | 2021-04-16 01:07:23 | dmyers87/amundsenfrontendlibrary | https://api.github.com/repos/dmyers87/amundsenfrontendlibrary | opened | CVE-2020-7733 (High) detected in ua-parser-js-0.7.17.tgz | security vulnerability | ## CVE-2020-7733 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: amundsenfrontendlibrary/amundsen_application/static/package.json</p>
<p>Path to vulnerable library: amundsenfrontendlibrary/amundsen_application/static/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- eslint-plugin-react-7.7.0.tgz (Root Library)
- prop-types-15.6.1.tgz
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package ua-parser-js before 0.7.22 are vulnerable to Regular Expression Denial of Service (ReDoS) via the regex for Redmi Phones and Mi Pad Tablets UA.
<p>Publish Date: 2020-09-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7733>CVE-2020-7733</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: 0.7.22</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ua-parser-js","packageVersion":"0.7.17","packageFilePaths":["/amundsen_application/static/package.json"],"isTransitiveDependency":true,"dependencyTree":"eslint-plugin-react:7.7.0;prop-types:15.6.1;fbjs:0.8.16;ua-parser-js:0.7.17","isMinimumFixVersionAvailable":true,"minimumFixVersion":"0.7.22"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-7733","vulnerabilityDetails":"The package ua-parser-js before 0.7.22 are vulnerable to Regular Expression Denial of Service (ReDoS) via the regex for Redmi Phones and Mi Pad Tablets UA.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7733","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-7733 (High) detected in ua-parser-js-0.7.17.tgz - ## CVE-2020-7733 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ua-parser-js-0.7.17.tgz</b></p></summary>
<p>Lightweight JavaScript-based user-agent string parser</p>
<p>Library home page: <a href="https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz">https://registry.npmjs.org/ua-parser-js/-/ua-parser-js-0.7.17.tgz</a></p>
<p>Path to dependency file: amundsenfrontendlibrary/amundsen_application/static/package.json</p>
<p>Path to vulnerable library: amundsenfrontendlibrary/amundsen_application/static/node_modules/ua-parser-js/package.json</p>
<p>
Dependency Hierarchy:
- eslint-plugin-react-7.7.0.tgz (Root Library)
- prop-types-15.6.1.tgz
- fbjs-0.8.16.tgz
- :x: **ua-parser-js-0.7.17.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The package ua-parser-js before 0.7.22 are vulnerable to Regular Expression Denial of Service (ReDoS) via the regex for Redmi Phones and Mi Pad Tablets UA.
<p>Publish Date: 2020-09-16
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7733>CVE-2020-7733</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7733</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: 0.7.22</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"ua-parser-js","packageVersion":"0.7.17","packageFilePaths":["/amundsen_application/static/package.json"],"isTransitiveDependency":true,"dependencyTree":"eslint-plugin-react:7.7.0;prop-types:15.6.1;fbjs:0.8.16;ua-parser-js:0.7.17","isMinimumFixVersionAvailable":true,"minimumFixVersion":"0.7.22"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-7733","vulnerabilityDetails":"The package ua-parser-js before 0.7.22 are vulnerable to Regular Expression Denial of Service (ReDoS) via the regex for Redmi Phones and Mi Pad Tablets UA.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7733","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_priority | cve high detected in ua parser js tgz cve high severity vulnerability vulnerable library ua parser js tgz lightweight javascript based user agent string parser library home page a href path to dependency file amundsenfrontendlibrary amundsen application static package json path to vulnerable library amundsenfrontendlibrary amundsen application static node modules ua parser js package json dependency hierarchy eslint plugin react tgz root library prop types tgz fbjs tgz x ua parser js tgz vulnerable library found in base branch master vulnerability details the package ua parser js before are vulnerable to regular expression denial of service redos via the regex for redmi phones and mi pad tablets ua publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more 
information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree eslint plugin react prop types fbjs ua parser js isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails the package ua parser js before are vulnerable to regular expression denial of service redos via the regex for redmi phones and mi pad tablets ua vulnerabilityurl | 0 |
190,649 | 22,143,339,304 | IssuesEvent | 2022-06-03 09:15:48 | epiphany-platform/epiphany | https://api.github.com/repos/epiphany-platform/epiphany | closed | [FEATURE REQUEST] Upgrade Python dependencies | type/bug area/security python type/upgrade | **Is your feature request related to a problem? Please describe.**
Currently we have a number of Dependabot security alerts: https://github.com/epiphany-platform/epiphany/security/dependabot
**Describe the solution you'd like**
We would like to bump the minor versions of all Python packages to resolve these issues. Besides this we need to investigate bumping AzureCLI from 2.32.0 to 2.37.0 as some used functionality is being deprecated:
https://docs.microsoft.com/en-us/cli/azure/microsoft-graph-migration?tabs=powershell
Source:
https://github.com/epiphany-platform/epiphany/blob/37d51eea43f7f625e3a856ade51579c07484dc53/cli/src/providers/azure/APIProxy.py#L53
**Describe alternatives you've considered**
*
**Additional context**
*
---
**DoD checklist**
- Changelog
- [x] updated
- [ ] not needed
- COMPONENTS.md
- [x] updated
- [ ] not needed
- Schema
- [ ] updated
- [x] not needed
- Backport tasks
- [ ] created
- [ ] not needed
- Documentation
- [ ] added
- [x] updated
- [ ] not needed
- [ ] Feature has automated tests
- [ ] Automated tests passed (QA pipelines)
- [x] apply
- [x] upgrade
- [ ] backup/restore
- [x] Idempotency tested
- [x] All conversations in PR resolved
- [ ] Solution meets requirements and is done according to design doc
- [x] Usage compliant with license
| True | [FEATURE REQUEST] Upgrade Python dependencies - **Is your feature request related to a problem? Please describe.**
Currently we have a number of Dependabot security alerts: https://github.com/epiphany-platform/epiphany/security/dependabot
**Describe the solution you'd like**
We would like to bump the minor versions of all Python packages to resolve these issues. Besides this we need to investigate bumping AzureCLI from 2.32.0 to 2.37.0 as some used functionality is being deprecated:
https://docs.microsoft.com/en-us/cli/azure/microsoft-graph-migration?tabs=powershell
Source:
https://github.com/epiphany-platform/epiphany/blob/37d51eea43f7f625e3a856ade51579c07484dc53/cli/src/providers/azure/APIProxy.py#L53
**Describe alternatives you've considered**
*
**Additional context**
*
---
**DoD checklist**
- Changelog
- [x] updated
- [ ] not needed
- COMPONENTS.md
- [x] updated
- [ ] not needed
- Schema
- [ ] updated
- [x] not needed
- Backport tasks
- [ ] created
- [ ] not needed
- Documentation
- [ ] added
- [x] updated
- [ ] not needed
- [ ] Feature has automated tests
- [ ] Automated tests passed (QA pipelines)
- [x] apply
- [x] upgrade
- [ ] backup/restore
- [x] Idempotency tested
- [x] All conversations in PR resolved
- [ ] Solution meets requirements and is done according to design doc
- [x] Usage compliant with license
| non_priority | upgrade python dependencies is your feature request related to a problem please describe currently we have a number of dependabot security alerts describe the solution you d like we would like to bump the minor versions of all python packages to resolve these issues besides this we need to investigate bumping azurecli from to as some used functionality is being deprecated source describe alternatives you ve considered additional context dod checklist changelog updated not needed components md updated not needed schema updated not needed backport tasks created not needed documentation added updated not needed feature has automated tests automated tests passed qa pipelines apply upgrade backup restore idempotency tested all conversations in pr resolved solution meets requirements and is done according to design doc usage compliant with license | 0 |
2,532 | 2,694,782,953 | IssuesEvent | 2015-04-01 22:21:51 | facebook/react | https://api.github.com/repos/facebook/react | closed | Update lifecycle methods docs to specify execution environments for methods | Component: Documentation & Website | From the lifecycle method documentation at http://facebook.github.io/react/docs/component-specs.html#lifecycle-methods, it's not clear what methods will run (or not run) on the server.
| 1.0 | Update lifecycle methods docs to specify execution environments for methods - From the lifecycle method documentation at http://facebook.github.io/react/docs/component-specs.html#lifecycle-methods, it's not clear what methods will run (or not run) on the server.
| non_priority | update lifecycle methods docs to specify execution environments for methods from the lifecycle method documentation at it s not clear what methods will run or not run on the server | 0 |
43,262 | 2,887,050,418 | IssuesEvent | 2015-06-12 12:49:01 | thSoft/elysium | https://api.github.com/repos/thSoft/elysium | closed | Implement full grammar | auto-migrated Priority-Medium Type-Enhancement | ```
Define the semantic model for all features of LilyPond. Among others, this
would enable generating scores in a model-based way.
```
Original issue reported on code.google.com by `harmathdenes` on 3 Aug 2012 at 9:39 | 1.0 | Implement full grammar - ```
Define the semantic model for all features of LilyPond. Among others, this
would enable generating scores in a model-based way.
```
Original issue reported on code.google.com by `harmathdenes` on 3 Aug 2012 at 9:39 | priority | implement full grammar define the semantic model for all features of lilypond among others this would enable generating scores in a model based way original issue reported on code google com by harmathdenes on aug at | 1 |