Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 757 | labels stringlengths 4 664 | body stringlengths 3 261k | index stringclasses 10 values | text_combine stringlengths 96 261k | label stringclasses 2 values | text stringlengths 96 232k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
219,791 | 17,111,781,333 | IssuesEvent | 2021-07-10 13:15:17 | IntellectualSites/FastAsyncWorldEdit | https://api.github.com/repos/IntellectualSites/FastAsyncWorldEdit | closed | Not working from 1.17-1.17.1 version | Requires Testing | ### Server Implementation
Paper
### Server Version
1.17.+
### Describe the bug
java.lang.ArrayIndexOutOfBoundsException: Index 19 out of bounds for length 16
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.blocks.CharBlocks.hasSection(CharBlocks.java:125)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.HeightmapProcessor.processSet(HeightmapProcessor.java:44)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:113)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:93)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:932)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:920)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:33)
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
[15:55:17 WARN]: at java.base/java.lang.Thread.run(Thread.java:831)
[15:55:17 WARN]: java.lang.ArrayIndexOutOfBoundsException: Index 19 out of bounds for length 16
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.blocks.CharBlocks.hasSection(CharBlocks.java:125)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.HeightmapProcessor.processSet(HeightmapProcessor.java:44)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:113)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:93)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:932)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:920)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:33)
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
[15:55:17 WARN]: at java.base/java.lang.Thread.run(Thread.java:831)
[15:55:17 WARN]: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException: Index 19 out of bounds for length 16
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.queue.SingleThreadQueueExtent.pollSubmissions(SingleThreadQueueExtent.java:
### To Reproduce
1. I enter the command //set block
2. The blocks do not change in the selected area, although the game sends a message about the successful completion of the operation
### Expected behaviour
I can't use the copy, set, and replace commands; they just don't work
### Screenshots / Videos
_No response_
### Error log (if applicable)
_No response_
### Fawe Debugpaste
https://athion.net/ISPaster/paste/view/25135b5706d746c0a7edec026b27ed16
### Fawe Version
1.17-47;0434b86
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit-1.17/ and the issue still persists.
### Anything else?
_No response_ | 1.0 | Not working from 1.17-1.17.1 version - ### Server Implementation
Paper
### Server Version
1.17.+
### Describe the bug
java.lang.ArrayIndexOutOfBoundsException: Index 19 out of bounds for length 16
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.blocks.CharBlocks.hasSection(CharBlocks.java:125)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.HeightmapProcessor.processSet(HeightmapProcessor.java:44)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:113)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:93)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:932)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:920)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:33)
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
[15:55:17 WARN]: at java.base/java.lang.Thread.run(Thread.java:831)
[15:55:17 WARN]: java.lang.ArrayIndexOutOfBoundsException: Index 19 out of bounds for length 16
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.blocks.CharBlocks.hasSection(CharBlocks.java:125)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.HeightmapProcessor.processSet(HeightmapProcessor.java:44)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:113)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.MultiBatchProcessor.processSet(MultiBatchProcessor.java:93)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.processors.BatchProcessorHolder.processSet(BatchProcessorHolder.java:26)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:932)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:920)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.chunk.ChunkHolder.call(ChunkHolder.java:33)
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.run(FutureTask.java:264)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
[15:55:17 WARN]: at java.base/java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:630)
[15:55:17 WARN]: at java.base/java.lang.Thread.run(Thread.java:831)
[15:55:17 WARN]: java.util.concurrent.ExecutionException: java.lang.ArrayIndexOutOfBoundsException: Index 19 out of bounds for length 16
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.report(FutureTask.java:122)
[15:55:17 WARN]: at java.base/java.util.concurrent.FutureTask.get(FutureTask.java:191)
[15:55:17 WARN]: at com.fastasyncworldedit.core.beta.implementation.queue.SingleThreadQueueExtent.pollSubmissions(SingleThreadQueueExtent.java:
### To Reproduce
1. I enter the command //set block
2. The blocks do not change in the selected area, although the game sends a message about the successful completion of the operation
### Expected behaviour
I can't use the copy, set, and replace commands; they just don't work
### Screenshots / Videos
_No response_
### Error log (if applicable)
_No response_
### Fawe Debugpaste
https://athion.net/ISPaster/paste/view/25135b5706d746c0a7edec026b27ed16
### Fawe Version
1.17-47;0434b86
### Checklist
- [X] I have included a Fawe debugpaste.
- [X] I am using the newest build from https://ci.athion.net/job/FastAsyncWorldEdit-1.17/ and the issue still persists.
### Anything else?
_No response_ | non_defect | not working from version server implementation paper server version describe the bug java lang arrayindexoutofboundsexception index out of bounds for length at com fastasyncworldedit core beta implementation blocks charblocks hassection charblocks java at com fastasyncworldedit core beta implementation processors heightmapprocessor processset heightmapprocessor java at com fastasyncworldedit core beta implementation processors multibatchprocessor processset multibatchprocessor java at com fastasyncworldedit core beta implementation processors multibatchprocessor processset multibatchprocessor java at com fastasyncworldedit core beta implementation processors batchprocessorholder processset batchprocessorholder java at com fastasyncworldedit core beta implementation processors batchprocessorholder processset batchprocessorholder java at com fastasyncworldedit core beta implementation chunk chunkholder call chunkholder java at com fastasyncworldedit core beta implementation chunk chunkholder call chunkholder java at com fastasyncworldedit core beta implementation chunk chunkholder call chunkholder java at java base java util concurrent futuretask run futuretask java at java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java base java lang thread run thread java java lang arrayindexoutofboundsexception index out of bounds for length at com fastasyncworldedit core beta implementation blocks charblocks hassection charblocks java at com fastasyncworldedit core beta implementation processors heightmapprocessor processset heightmapprocessor java at com fastasyncworldedit core beta implementation processors multibatchprocessor processset multibatchprocessor java at com fastasyncworldedit core beta implementation processors multibatchprocessor processset multibatchprocessor java at com fastasyncworldedit core beta 
implementation processors batchprocessorholder processset batchprocessorholder java at com fastasyncworldedit core beta implementation processors batchprocessorholder processset batchprocessorholder java at com fastasyncworldedit core beta implementation chunk chunkholder call chunkholder java at com fastasyncworldedit core beta implementation chunk chunkholder call chunkholder java at com fastasyncworldedit core beta implementation chunk chunkholder call chunkholder java at java base java util concurrent futuretask run futuretask java at java base java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java base java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java base java lang thread run thread java java util concurrent executionexception java lang arrayindexoutofboundsexception index out of bounds for length at java base java util concurrent futuretask report futuretask java at java base java util concurrent futuretask get futuretask java at com fastasyncworldedit core beta implementation queue singlethreadqueueextent pollsubmissions singlethreadqueueextent java to reproduce i prescribe the command set block the blocks do not change in the selected area although a message is sent in the game about the successful completion of the operation expected behaviour i can t use the copy set and replace commands they just don t work screenshots videos no response error log if applicable no response fawe debugpaste fawe version checklist i have included a fawe debugpaste i am using the newest build from and the issue still persists anything else no response | 0 |
122,200 | 16,092,169,354 | IssuesEvent | 2021-04-26 18:06:47 | carbon-design-system/carbon-for-ibm-dotcom | https://api.github.com/repos/carbon-design-system/carbon-for-ibm-dotcom | closed | [Filter panel] Design investigation / exploration | Airtable Done design design: research | ### Key design considerations to sort out:
- Grid usage. This bar likely sits outside the grid
- Type size. Ideally 14px to reduce text line wrapping, which complicates expressive theme
- States. This bar needs several states of expansion, toggle select, and a way to reset to the default state
- Search. Many use cases have this bar in conjunction with search, citing users needing both to find what they are looking for.
#### Additional Information
- **See the epic for the overarching descriptions of the work.**
- [box folder]
- [reference materials]
#### Acceptance Criteria
- [ ] Design direction(s) has been identified, reviewed, and approved for the final design phase
- [ ] The next story has been updated with the design direction | 2.0 | [Filter panel] Design investigation / exploration - ### Key design considerations to sort out:
- Grid usage. This bar likely sits outside the grid
- Type size. Ideally 14px to reduce text line wrapping, which complicates expressive theme
- States. This bar needs several states of expansion, toggle select, and a way to reset to the default state
- Search. Many use cases have this bar in conjunction with search, citing users needing both to find what they are looking for.
#### Additional Information
- **See the epic for the overarching descriptions of the work.**
- [box folder]
- [reference materials]
#### Acceptance Criteria
- [ ] Design direction(s) has been identified, reviewed, and approved for the final design phase
- [ ] The next story has been updated with the design direction | non_defect | design investigation exploration key design considerations to sort out grid usage this bar likely sits outside the grid type size ideally to reduce text line wrapping which complicates expressive theme states this bar need several states of expansion toggle select and a way to reset to default state search many use cases have this bar in conjunction to search citing users needing both to find what they are looking far additional information see the epic for the overarching descriptions of the work acceptance criteria design direction s has been identified reviewed and approved for the final design phase the next story has been updated with the design direction | 0 |
391,431 | 26,892,805,794 | IssuesEvent | 2023-02-06 10:10:39 | scaleway/docs-content | https://api.github.com/repos/scaleway/docs-content | closed | [👩💻 Documentation Request]: Tutorial to configure ingress-nginx proxy-protocol-v2 support | Documentation Request | ### Summary
This documentation is about configuring `ingress-nginx` to accept `proxy-protocol-v2` communication; here are the Slack threads about it:
- https://scaleway-community.slack.com/archives/CD9JPK4KF/p1674483432890869
- https://scaleway-community.slack.com/archives/CD9JPK4KF/p1665128165284669
### Why is it needed?
Tutorial to set up `ingress-nginx` to accept `proxy-protocol-v2` communication
### Want to write this documentation yourself?
Yes
### Related PR(s)
https://github.com/scaleway/docs-content/pull/1232
### (Optional) Scaleway Organization ID
You can send me a message on Slack; I'll send you my org-id if needed
### Email address
**@Grraahaam** (Slack - Scaleway Community) | 1.0 | [👩💻 Documentation Request]: Tutorial to configure ingress-nginx proxy-protocol-v2 support - ### Summary
This documentation is about configuring `ingress-nginx` to accept `proxy-protocol-v2` communication; here are the Slack threads about it:
- https://scaleway-community.slack.com/archives/CD9JPK4KF/p1674483432890869
- https://scaleway-community.slack.com/archives/CD9JPK4KF/p1665128165284669
### Why is it needed?
Tutorial to set up `ingress-nginx` to accept `proxy-protocol-v2` communication
### Want to write this documentation yourself?
Yes
### Related PR(s)
https://github.com/scaleway/docs-content/pull/1232
### (Optional) Scaleway Organization ID
You can send me a message on Slack; I'll send you my org-id if needed
### Email address
**@Grraahaam** (Slack - Scaleway Community) | non_defect | tutorial to configure ingress nginx proxy protocol support summary this documentation is about configuring the ingress nginx to accept proxy protocol communication here s the slack threads about it why is it needed tutorial to setup the ingress nginx to accept proxy protocol communication want to write this documentation yourself yes related pr s optional scaleway organization id you can send me a message on slack i ll send you my org id if needed email address grraahaam slack scaleway community | 0 |
144,538 | 19,287,772,859 | IssuesEvent | 2021-12-11 08:31:53 | ghc-dev/Brian-Harper | https://api.github.com/repos/ghc-dev/Brian-Harper | closed | CVE-2019-20445 (High) detected in netty-codec-http-4.1.39.Final.jar - autoclosed | security vulnerability | ## CVE-2019-20445 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.39.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /e/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.39.Final/732d06961162e27fa3ae5989541c4460853745d3/netty-codec-http-4.1.39.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.1.39.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Brian-Harper/commit/94400822c5a706cad2c1de92206944c18ecb921b">94400822c5a706cad2c1de92206944c18ecb921b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header.
<p>Publish Date: 2020-01-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445>CVE-2019-20445</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445</a></p>
<p>Release Date: 2020-01-29</p>
<p>Fix Resolution: io.netty:netty-codec-http:4.1.44</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.39.Final","packageFilePaths":["/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.1.39.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.44","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2019-20445","vulnerabilityDetails":"HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | True | CVE-2019-20445 (High) detected in netty-codec-http-4.1.39.Final.jar - autoclosed - ## CVE-2019-20445 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>netty-codec-http-4.1.39.Final.jar</b></p></summary>
<p>Netty is an asynchronous event-driven network application framework for
rapid development of maintainable high performance protocol servers and
clients.</p>
<p>Library home page: <a href="https://netty.io/">https://netty.io/</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /e/caches/modules-2/files-2.1/io.netty/netty-codec-http/4.1.39.Final/732d06961162e27fa3ae5989541c4460853745d3/netty-codec-http-4.1.39.Final.jar</p>
<p>
Dependency Hierarchy:
- :x: **netty-codec-http-4.1.39.Final.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/ghc-dev/Brian-Harper/commit/94400822c5a706cad2c1de92206944c18ecb921b">94400822c5a706cad2c1de92206944c18ecb921b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header.
<p>Publish Date: 2020-01-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445>CVE-2019-20445</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-20445</a></p>
<p>Release Date: 2020-01-29</p>
<p>Fix Resolution: io.netty:netty-codec-http:4.1.44</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"io.netty","packageName":"netty-codec-http","packageVersion":"4.1.39.Final","packageFilePaths":["/build.gradle"],"isTransitiveDependency":false,"dependencyTree":"io.netty:netty-codec-http:4.1.39.Final","isMinimumFixVersionAvailable":true,"minimumFixVersion":"io.netty:netty-codec-http:4.1.44","isBinary":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2019-20445","vulnerabilityDetails":"HttpObjectDecoder.java in Netty before 4.1.44 allows a Content-Length header to be accompanied by a second Content-Length header, or by a Transfer-Encoding header.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20445","cvss3Severity":"high","cvss3Score":"9.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | non_defect | cve high detected in netty codec http final jar autoclosed cve high severity vulnerability vulnerable library netty codec http final jar netty is an asynchronous event driven network application framework for rapid development of maintainable high performance protocol servers and clients library home page a href path to dependency file build gradle path to vulnerable library e caches modules files io netty netty codec http final netty codec http final jar dependency hierarchy x netty codec http final jar vulnerable library found in head commit a href found in base branch master vulnerability details httpobjectdecoder java in netty before allows a content length header to be accompanied by a second content length header or by a transfer encoding header publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high 
integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution io netty netty codec http rescue worker helmet automatic remediation is available for this issue isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency false dependencytree io netty netty codec http final isminimumfixversionavailable true minimumfixversion io netty netty codec http isbinary false basebranches vulnerabilityidentifier cve vulnerabilitydetails httpobjectdecoder java in netty before allows a content length header to be accompanied by a second content length header or by a transfer encoding header vulnerabilityurl | 0 |
73,231 | 24,515,240,553 | IssuesEvent | 2022-10-11 04:00:47 | unascribed/Fabrication | https://api.github.com/repos/unascribed/Fabrication | closed | Multiline Sign Paste - Game crashes when moving text cursor after pasting | k: Defect n: Fabric s: New | [Log](https://mclo.gs/49fzvSm)
Replication steps:
1. Enable multiline sign paste
2. Copy the text from a multiline sign
3. Paste it into a new sign
4. Press the left arrow key to move the text cursor
5. Crash | 1.0 | Multiline Sign Paste - Game crashes when moving text cursor after pasting - [Log](https://mclo.gs/49fzvSm)
Replication steps:
1. Enable multiline sign paste
2. Copy the text from a multiline sign
3. Paste it into a new sign
4. Press the left arrow key to move the text cursor
5. Crash | defect | multiline sign paste game crashes when moving text cursor after pasting replication steps enable multiline sign paste copy the text from a muliline sign paste it into a new sign press the left arrow key to move the text cursor crash | 1 |
57,155 | 14,112,933,694 | IssuesEvent | 2020-11-07 08:23:06 | pravega/pravega | https://api.github.com/repos/pravega/pravega | opened | Replace AuthHandler.Permissions with AccessOperation | area/security kind/enhancement status/accepted | **Problem description**
Prior to PR #5231, Pravega used `AuthHandler.Permissions` to represent permissions everywhere. This enum suffers from a few problems:
* It's contained in the `AuthHandler`, which is a server-side construct, so using it in the client module would be awkward.
* The `AuthHandler.Permissions` enum contains items like `READ`, `READ_UPDATE`, etc., which really represent the operations that the user is trying to perform. Permissions usually reflect actions like `allow`, `deny`, `allow_all`, etc.
Since PR #5231 needed to include code for specifying the access operation, a new class was added for it. As of this PR, the client exclusively uses `AccessOperation`, while the Controller maps it to the corresponding `AuthHandler.Permissions` and generally uses `AuthHandler.Permissions` everywhere. Changing the server, too, to use the new class would have been too big a change for that PR.
**Problem location**
**Suggestions for an improvement**
| True | Replace AuthHandler.Permissions with AccessOperation - **Problem description**
Prior to PR #5231, Pravega used `AuthHandler.Permissions` to represent permissions everywhere. This enum suffers from a few problems:
* It's contained in the `AuthHandler`, which is a server-side construct, so using it in the client module would be awkward.
* The `AuthHandler.Permissions` enum contains items like `READ`, `READ_UPDATE`, etc., which really represent the operations that the user is trying to perform. Permissions usually reflect actions like `allow`, `deny`, `allow_all`, etc.
Since PR #5231 needed to include code for specifying the access operation, a new class was added for it. As of this PR, the client exclusively uses `AccessOperation`, while the Controller maps it to the corresponding `AuthHandler.Permissions` and generally uses `AuthHandler.Permissions` everywhere. Changing the server, too, to use the new class would have been too big a change for that PR.
**Problem location**
**Suggestions for an improvement**
| non_defect | replace authhandler permissions with accessoperation problem description prior to pr pravega used authhandler permissions to represent permissions everywhere this enum suffers from a few problems it s contained in the authhandler which is a server side construct so using it in the client module d be awkward the authhandler permissions enum contain items like read read update etc which really represent the operations that the user is trying to perform permissions usually reflect actions like allow deny allow all etc since pr needed to include code for specifying access operation so a new class was added for the same as of this pr the client exclusively uses accessoperation while the controller maps it to the corresponding authhandler permissions and generally uses authhandler permissions everywhere changing the server d too to use the new class d have been too big a change for that pr problem location suggestions for an improvement | 0 |
21,371 | 3,491,702,905 | IssuesEvent | 2016-01-04 16:51:05 | buildo/github-workflow-pal | https://api.github.com/repos/buildo/github-workflow-pal | opened | UI is wonky on issue page | defect | ## Bug report
The 'New buildo issue' button is larger than the 'New issue' button in the single issue page
### Steps to reproduce
- navigate to any issue | 1.0 | UI is wonky on issue page - ## Bug report
The 'New buildo issue' button is larger than the 'New issue' button in the single issue page
### Steps to reproduce
- navigate to any issue | defect | ui is wonky on issue page bug report the new buildo issue button is larger than the new issue button in the single issue page steps to reproduce navigate to any issue | 1 |
671,882 | 22,779,734,266 | IssuesEvent | 2022-07-08 18:14:29 | MiSTer-devel/PSX_MiSTer | https://api.github.com/repos/MiSTer-devel/PSX_MiSTer | closed | Eggs of Steel (NTSC-U) | Priority-2 | Issue:
Intermittent 2D glitching that affects all moving sprites, in addition to a small flashing banner at the top right
Reproduce:
Load or start a new game/idle to let it run the demo mode
Workaround:
N/A
BIOS:
SCPH-101
CD Images:
Redump CHD & BIN/CUE
Core Version:
PSX_20220511


 | 1.0 | Eggs of Steel (NTSC-U) - Issue:
Intermittent 2D glitching that affects all moving sprites, in addition to a small flashing banner at the top right
Reproduce:
Load or start a new game/idle to let it run the demo mode
Workaround:
N/A
BIOS:
SCPH-101
CD Images:
Redump CHD & BIN/CUE
Core Version:
PSX_20220511


 | non_defect | eggs of steel ntsc u issue intermittent glitching that affects all moving sprites in addition to a small flashing banner at the top right reproduce load or start a new game idle to let it run the demo mode workaround n a bios scph cd images redump chd bin cue core version psx | 0 |
33,526 | 7,147,370,688 | IssuesEvent | 2018-01-25 00:22:34 | aguaviva/micro-jpeg-visualizer | https://api.github.com/repos/aguaviva/micro-jpeg-visualizer | closed | struct.error: unpack requires a string argument of length 64 | Priority-Medium Type-Defect auto-migrated | ```
Running the script gives the following output:
Traceback (most recent call last):
File "micro-jpeg-visualizer.py", line 282, in <module>
j.decode(open('images/porsche.jpg', 'r').read())
File "micro-jpeg-visualizer.py", line 262, in decode
self.DefineQuantizationTables(chunk)
File "micro-jpeg-visualizer.py", line 214, in DefineQuantizationTables
self.quant[hdr & 0xf] = GetArray("B", data[1:1+64],64)
File "micro-jpeg-visualizer.py", line 36, in GetArray
return list(unpack(s,l[:length]))
struct.error: unpack requires a string argument of length 64
Python version using: Python 2.7.2 (default, Jun 12 2011, 15:08:59) [MSC v.1500
32 bit (Intel)] on win
32
```
Original issue reported on code.google.com by `alexei....@gmail.com` on 5 Jan 2012 at 11:47
| 1.0 | struct.error: unpack requires a string argument of length 64 - ```
Running the script gives the following output:
Traceback (most recent call last):
File "micro-jpeg-visualizer.py", line 282, in <module>
j.decode(open('images/porsche.jpg', 'r').read())
File "micro-jpeg-visualizer.py", line 262, in decode
self.DefineQuantizationTables(chunk)
File "micro-jpeg-visualizer.py", line 214, in DefineQuantizationTables
self.quant[hdr & 0xf] = GetArray("B", data[1:1+64],64)
File "micro-jpeg-visualizer.py", line 36, in GetArray
return list(unpack(s,l[:length]))
struct.error: unpack requires a string argument of length 64
Python version using: Python 2.7.2 (default, Jun 12 2011, 15:08:59) [MSC v.1500
32 bit (Intel)] on win
32
```
Original issue reported on code.google.com by `alexei....@gmail.com` on 5 Jan 2012 at 11:47
| defect | struct error unpack requires a string argument of length running the script gives the following output traceback most recent call last file micro jpeg visualizer py line in j decode open images porsche jpg r read file micro jpeg visualizer py line in decode self definequantizationtables chunk file micro jpeg visualizer py line in definequantizationtables self quant getarray b data file micro jpeg visualizer py line in getarray return list unpack s l struct error unpack requires a string argument of length python version using python default jun msc v bit intel on win original issue reported on code google com by alexei gmail com on jan at | 1 |
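The `struct.error` in the record above arises because the JPEG was opened in text mode (`'r'`); on Windows, newline/EOF translation corrupts the binary data, so fewer than 64 bytes reach `unpack`. A minimal sketch of the likely fix, with a `get_array` helper shaped like the one in the report (the helper name and the stand-in data below are illustrative, not from the original script):

```python
import struct

def get_array(fmt, data, length):
    # Unpack `length` items of format `fmt` from a bytes buffer.
    # struct.error is raised if fewer than `length` bytes are available,
    # which is exactly what happens when a binary file is read in text
    # mode on Windows and gets truncated by EOF translation.
    return list(struct.unpack(fmt * length, data[:length]))

# Opening the file in binary mode ("rb") avoids the truncation:
# with open("images/porsche.jpg", "rb") as f:
#     data = f.read()
data = bytes(range(64))  # stand-in for a 64-byte quantization table
table = get_array("B", data, 64)
```

Opening binary files with `'rb'` is the standard remedy; the original script only appears to work on POSIX systems because text mode performs no translation there.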
275,361 | 20,919,317,524 | IssuesEvent | 2022-03-24 15:57:19 | vnkhoa02/harvard-tay-son | https://api.github.com/repos/vnkhoa02/harvard-tay-son | closed | Write URD for the class creation feature [BA] | documentation task | - Requirements:
 - Only teachers and administrators can create a class
 - A form must be filled in when creating a new class
 - Class name, maximum number of students
 - Class description
 - Class schedule
 - Teacher in charge | 1.0 | Write URD for the class creation feature [BA] - - Requirements:
 - Only teachers and administrators can create a class
 - A form must be filled in when creating a new class
 - Class name, maximum number of students
 - Class description
 - Class schedule
 - Teacher in charge | non_defect | write urd for the class creation feature ba requirements only teachers and administrators can create a class a form must be filled in when creating a new class class name maximum number of students class description class schedule teacher in charge | 0
59,385 | 17,023,112,823 | IssuesEvent | 2021-07-03 00:25:58 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Full local(!) path of a public file is shown on both user page and public traces page | Component: website Priority: major Resolution: duplicate Type: defect | **[Submitted to the original trac issue database at 1.09pm, Tuesday, 25th April 2006]**
cut from my public user page after adding a file:
```
<local full path deleted>\apotek2.gpx ... (0 points) ... 0 hours ago PENDING
second part of apotek, torvegade, kongensgade, havnegade, englandsgade, borgergade, jyllandsgade, havnegade, strandbygade, skolegade, stormgade, nrregade, jernbanegade, nrrebrogade, strandbykirkevej, langelandsvej, stergade, jagtvej, storegade, wessel
by miki in: esbjerg denmark danmark
```
After adding a new public file, its full local path on my machine is shown on my public gps trace page (http://www.openstreetmap.org/traces/user/miki). This is probably only during the pending period, as my other completed traces only show their file name.
I consider it a security issue as it could, as in my case, reveal local servername and network drive shares.
Mikkel | 1.0 | Full local(!) path of a public file is shown on both user page and public traces page - **[Submitted to the original trac issue database at 1.09pm, Tuesday, 25th April 2006]**
cut from my public user page after adding a file:
```
<local full path deleted>\apotek2.gpx ... (0 points) ... 0 hours ago PENDING
second part of apotek, torvegade, kongensgade, havnegade, englandsgade, borgergade, jyllandsgade, havnegade, strandbygade, skolegade, stormgade, nrregade, jernbanegade, nrrebrogade, strandbykirkevej, langelandsvej, stergade, jagtvej, storegade, wessel
by miki in: esbjerg denmark danmark
```
After adding a new public file, its full local path on my machine is shown on my public gps trace page (http://www.openstreetmap.org/traces/user/miki). This is probably only during the pending period, as my other completed traces only show their file name.
I consider it a security issue as it could, as in my case, reveal local servername and network drive shares.
Mikkel | defect | full local path of a public file is shown on both user page and public traces page cut from my public user page after adding a file gpx points hours ago pending second part of apotek torvegade kongensgade havnegade englandsgade borgergade jyllandsgade havnegade strandbygade skolegade stormgade nrregade jernbanegade nrrebrogade strandbykirkevej langelandsvej stergade jagtvej storegade wessel by miki in esbjerg denmark danmark after adding a new public file it s full local path on my machine is shown on my public gps trace page this is probably only during the pending period as my other completed traces only show their file name i consider it a security issue as it could as in my case reveal local servername and network drive shares mikkel | 1 |
193,088 | 6,877,844,299 | IssuesEvent | 2017-11-20 09:42:08 | OpenNebula/one | https://api.github.com/repos/OpenNebula/one | opened | Add a cancel button to the interface when adding a parameter to a template | Category: Sunstone Priority: Normal Status: Pending Tracker: Backlog | ---
Author Name: **Jaime Melis** (@jmelis)
Original Redmine Issue: 4440, https://dev.opennebula.org/issues/4440
Original Date: 2016-04-29
---
None
| 1.0 | Add a cancel button to the interface when adding a parameter to a template - ---
Author Name: **Jaime Melis** (@jmelis)
Original Redmine Issue: 4440, https://dev.opennebula.org/issues/4440
Original Date: 2016-04-29
---
None
| non_defect | add a cancel button to the interface when adding a parameter to a template author name jaime melis jmelis original redmine issue original date none | 0 |
268,232 | 23,352,640,624 | IssuesEvent | 2022-08-10 02:46:45 | pachadotdev/analogsea | https://api.github.com/repos/pachadotdev/analogsea | closed | Test suite against live droplet | testing | Would be painfully slow to run tests against a real droplet, but I'll play with this for separate testing via `test_dir()` - to allow anyone with a DO acct to run them
| 1.0 | Test suite against live droplet - Would be painfully slow to run tests against a real droplet, but I'll play with this for separate testing via `test_dir()` - to allow anyone with a DO acct to run them
| non_defect | test suite against live droplet would be painfully slow to run tests against a real droplet but i ll play with this for separate testing via test dir to allow anyone with a do acct to run them | 0 |
82,078 | 31,931,029,075 | IssuesEvent | 2023-09-19 07:26:18 | zed-industries/community | https://api.github.com/repos/zed-industries/community | closed | Can't navigate through code hints by arrows | defect popovers | ### Check for existing issues
- [X] Completed
### Describe the bug / provide steps to reproduce it
Just look at the short video below - I just can't navigate hints by arrow keys.
https://github.com/zed-industries/community/assets/45122321/df0d6be5-6176-41d1-ae25-587cea138af7
### Environment
Zed: v0.104.0 (preview)
OS: macOS 14.0.0
Memory: 16 GiB
Architecture: aarch64
### If applicable, add mockups / screenshots to help explain present your vision of the feature
_No response_
### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue.
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
_No response_ | 1.0 | Can't navigate through code hints by arrows - ### Check for existing issues
- [X] Completed
### Describe the bug / provide steps to reproduce it
Just look at the short video below - I just can't navigate hints by arrow keys.
https://github.com/zed-industries/community/assets/45122321/df0d6be5-6176-41d1-ae25-587cea138af7
### Environment
Zed: v0.104.0 (preview)
OS: macOS 14.0.0
Memory: 16 GiB
Architecture: aarch64
### If applicable, add mockups / screenshots to help explain present your vision of the feature
_No response_
### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue.
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
_No response_ | defect | can t navigate through code hints by arrows check for existing issues completed describe the bug provide steps to reproduce it just look at the short video below i just can t navigate hints by arrow keys environment zed preview os macos memory gib architecture if applicable add mockups screenshots to help explain present your vision of the feature no response if applicable attach your library logs zed zed log file to this issue if you only need the most recent lines you can run the zed open log command palette action to see the last no response | 1 |
35,416 | 7,736,755,933 | IssuesEvent | 2018-05-28 04:45:09 | martinrotter/rssguard | https://api.github.com/repos/martinrotter/rssguard | closed | FR: Sort article list by multiple columns (e.g. first sort by read status, then sort by title) | Component-Core Component-DB Component-Message-List Type-Defect | #### Brief description of the issue.
It would be nice if one could sort by multiple columns, e.g. by holding CTRL when clicking on the column title. The sort order could be indicated by a number in brackets, e.g. "^(1)", where ^ is the arrow.
I removed the other headlines, because none seemed to be applicable. | 1.0 | FR: Sort article list by multiple columns (e.g. first sort by read status, then sort by title) - #### Brief description of the issue.
It would be nice if one could sort by multiple columns, e.g. by holding CTRL when clicking on the column title. The sort order could be indicated by a number in brackets, e.g. "^(1)", where ^ is the arrow.
I removed the other headlines, because none seemed to be applicable. | defect | fr sort article list by multiple columns e g first sort by read status then sort by title brief description of the issue it would be nice if one could sort by multiple columns e g by holding ctrl when clicking on the column title the sort order could be indicated by a number in brackets e g where is the arrow i removed the other headlines because none seemed to be applicable | 1 |
37,483 | 8,406,271,869 | IssuesEvent | 2018-10-11 17:28:09 | NREL/EnergyPlus | https://api.github.com/repos/NREL/EnergyPlus | closed | Questions about defaults supply temp and humidity ratio setpoints in HVACTemplate:System:DOAS (CR #8594) | ExpandObjects PriorityLow SeverityMedium WontFix unconfirmed defect | ###### Added on 2011-10-16 13:47 by @mjwitte
##
#### Description
From Don Shirey 06 Jul 2011
> Another question for you.
>
> The default cooling coil design setpoint (cooling supply air setpoint temperature) is 12.8C.
>
> The default dehumidification setpoint is 0.00924 kg/kg... which is equivalent to 12.8C dewpoint temperature (55F).
>
> If the cooling coil has any type of bypass factor (> 0) and the target supply air temperature is
12.8C, then the dewpoint temperature of the supply air will be less than 12.8C. So, as long as the cooling coil has sufficient capacity, the discharge air dewpoint should always be less than the setpoint of 0.00924 kg/kg (12.8C). When I run a simulation with these defaults, I don't see any overcooling... which I guess is expected.
Well, if the coil sizing works as expected, it will have an apparatus
dewpoint lower than 55, so that the resulting mix of bypassed with
cooled air results in saturated at 55 - not sure if that's possible.
I'm easy on these - what would you choose? For a humid climate chilled
water system (which was my basis in this case for defaults), what would
a typical coil design condition be? As I read your comments, seems the
default should be more like a 50F dewpoint perhaps? The other problem
is that the coil sizing is going to break if the target dry bulb supply
is combined with the target dehum humrat as the coil sizing - doesn't
make sense at that point. Need another input for coil design dry bulb?
Mike
> When I reduce the dehumidification setpoint to say 0.008 kg/kg, I start seeing "overcooling" (as expected), and then reheat since the default Heating Coil Design Setpoint (heating supply air temperature) is 12.2C.
>
> I guess I'm just wondering if you wanted the default dehumidification setpoint to be 12.8C, same as the cooling coil design temperature of 12.8C? I guess it's really up to the user to select the necessary dehumidification setpoint --- you can't really guess a "correct" value .... so maybe you set the two values equal to avoid extra dehumidification initially and then the user will see that they need to carefully consider the appropriate dehumidification setpoint that is needed for their system design and modify the input value accordingly?
>
> Thanks,
>
> Don
Not sure I agree with your assessment. When in 1st stage cooling with constant air flow rate but coil bypass, I have never seen a "blended" supply air temp (mixed plus bypass) down at 55F. For a 50/50 coil split, under 1st stage you might get 53F off lower coil plus 75F (if no OA) bypass air.... which will blend to about 64F. With both stages on, you get 55F (roughly).
MJW 16 Apr 2012
Don:
For this one, you are correct that the defaults of 12.8C fixed dry bulb
cooling setpoint and 12.8C dewpoint for RH control will result in
little or no overcooling, because meeting the 12.8C dry bulb setpoint
would almost always also meet the humidity setpoint.
What would likely change in this case, if dehumidification control is
set to yes, is that the dry bulb control would likely be higher than
12.8C, either because it's set to outdoor air reset, or on a schedule.
Do you have any further thoughts on this? Is the default dry-bulb
setpoint higher than typical applications? Or should the default
control type be changed to OA reset? Or should this just stand as-is?
##
External Ref: e-mail from Don Shirey
Last build tested: `11.10.12 V7.0.0.025`
| 1.0 | Questions about defaults supply temp and humidity ratio setpoints in HVACTemplate:System:DOAS (CR #8594) - ###### Added on 2011-10-16 13:47 by @mjwitte
##
#### Description
From Don Shirey 06 Jul 2011
> Another question for you.
>
> The default cooling coil design setpoint (cooling supply air setpoint temperature) is 12.8C.
>
> The default dehumidification setpoint is 0.00924 kg/kg... which is equivalent to 12.8C dewpoint temperature (55F).
>
> If the cooling coil has any type of bypass factor (> 0) and the target supply air temperature is
12.8C, then the dewpoint temperature of the supply air will be less than 12.8C. So, as long as the cooling coil has sufficient capacity, the discharge air dewpoint should always be less than the setpoint of 0.00924 kg/kg (12.8C). When I run a simulation with these defaults, I don't see any overcooling... which I guess is expected.
Well, if the coil sizing works as expected, it will have an apparatus
dewpoint lower than 55, so that the resulting mix of bypassed with
cooled air results in saturated at 55 - not sure if that's possible.
I'm easy on these - what would you choose? For a humid climate chilled
water system (which was my basis in this case for defaults), what would
a typical coil design condition be? As I read your comments, seems the
default should be more like a 50F dewpoint perhaps? The other problem
is that the coil sizing is going to break if the target dry bulb supply
is combined with the target dehum humrat as the coil sizing - doesn't
make sense at that point. Need another input for coil design dry bulb?
Mike
> When I reduce the dehumidification setpoint to say 0.008 kg/kg, I start seeing "overcooling" (as expected), and then reheat since the default Heating Coil Design Setpoint (heating supply air temperature) is 12.2C.
>
> I guess I'm just wondering if you wanted the default dehumidification setpoint to be 12.8C, same as the cooling coil design temperature of 12.8C? I guess it's really up to the user to select the necessary dehumidification setpoint --- you can't really guess a "correct" value .... so maybe you set the two values equal to avoid extra dehumidification initially and then the user will see that they need to carefully consider the appropriate dehumidification setpoint that is needed for their system design and modify the input value accordingly?
>
> Thanks,
>
> Don
Not sure I agree with your assessment. When in 1st stage cooling with constant air flow rate but coil bypass, I have never seen a "blended" supply air temp (mixed plus bypass) down at 55F. For a 50/50 coil split, under 1st stage you might get 53F off lower coil plus 75F (if no OA) bypass air.... which will blend to about 64F. With both stages on, you get 55F (roughly).
MJW 16 Apr 2012
Don:
For this one, you are correct that the defaults of 12.8C fixed dry bulb
cooling setpoint and 12.8C dewpoint for RH control will result in
little or no overcooling, because meeting the 12.8C dry bulb setpoint
would almost always also meet the humidity setpoint.
What would likely change in this case, if dehumidification control is
set to yes, is that the dry bulb control would likely be higher than
12.8C, either because it's set to outdoor air reset, or on a schedule.
Do you have any further thoughts on this? Is the default dry-bulb
setpoint higher than typical applications? Or should the default
control type be changed to OA reset? Or should this just stand as-is?
##
External Ref: e-mail from Don Shirey
Last build tested: `11.10.12 V7.0.0.025`
| defect | questions about defaults supply temp and humidity ratio setpoints in hvactemplate system doas cr added on by mjwitte description from don shirey jul another question for you the default cooling coil design setpoint cooling supply air setpoint temperature is the default dehumidification setpoint is kg kg which is equivalent to dewpoint temperature if the cooling coil has any type of bypass factor and the target supply air temperature is then the dewpoint temperature of the supply air will be less than so as long as the cooling coil has sufficient capacity the discharge air dewpoint should always be less than the setpoint of kg kg when i run a simulation with these defaults i don t see any overcooling which i guess is expected well if the coil sizing works as expected it will have an apparatus dewpoint lower than so that the resulting mix of bypassed with cooled air results in saturated at not sure if that s possible i m easy on these what would you choose for a humid climate chilled water system which was my basis in this case for defaults what would a typical coil design condition be as i read your comments seems the default should be more like a dewpoint perhaps the other problem is that the coil sizing is going to break if the target dry bulb supply is combined with the target dehum humrat as the coil sizing doesn t make sense at that point need another input for coil design dry bulb mike when i reduce the dehumidification setpoint to say kg kg i start seeing overcooling as expected and then reheat since the default heating coil design setpoint heating supply air temperature is i guess i m just wondering if you wanted the default dehumidification setpoint to be same as the cooling coil design temperature of i guess it s really up to the user to select the necessary dehumidification setpoint you can t really guess a correct value so maybe you set the two values equal to avoid extra dehumidification initially and then the user will see that they need to 
carefully consider the appropriate dehumidification setpoint that is needed for their system design and modify the input value accordingly thanks don not sure i agree with your assessment when in stage cooling with constant air flow rate but coil bypass i have never seen a blended supply air temp mixed plus bypass down at for a coil split under stage you might get off lower coil plus if no oa bypass air which will blend to about with both stages on you get roughly mjw apr don for this one you are correct that the defaults of fixed dry bulb cooling setpoint and dewpoint for rh control will results in little or no overcooling because meeting the dry bulb setpoint would almost always also meet the humidity setpoint what would likely change in this case if dehumidification control is set to yes is that the dry bulb control would likely be higher than either because it s set to outdoor air reset or on a schedule do you have any further thoughts on this is the default dry bulb setpoint higher than typical applications or should the default control type be changed to oa reset or should this just stand as is external ref e mail from don shirey last build tested | 1 |
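The 53F-plus-75F blend discussed in the record above is a simple mass-weighted average of coil-leaving and bypass air; a small sketch of that arithmetic (the function name and the constant-density assumption are mine, not from the issue):

```python
def blended_supply_temp_f(coil_fraction, coil_leaving_f, bypass_f):
    # Mass-weighted mix of air leaving the active coil section with the
    # air bypassing it (constant flow rate and equal densities assumed).
    return coil_fraction * coil_leaving_f + (1.0 - coil_fraction) * bypass_f

# 50/50 coil split in 1st-stage cooling: 53F coil air + 75F bypass air
stage1 = blended_supply_temp_f(0.5, 53.0, 75.0)  # blends to 64.0F
```

This matches the reporter's "blend to about 64F" figure and illustrates why a bypassed coil cannot deliver 55F supply air in first stage.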
54,689 | 13,883,227,509 | IssuesEvent | 2020-10-18 10:59:14 | AeroScripts/QuestieDev | https://api.github.com/repos/AeroScripts/QuestieDev | closed | Some Logistic Task quest are not marked as Repeatable | Type - Defect | ## Bug description
Four Logistic Tasks (see screenshot) should (probably) be in the repeatable group like the rest of them.
## Screenshots

## Questie version
v6.1.0
| 1.0 | Some Logistic Task quest are not marked as Repeatable - ## Bug description
Four Logistic Tasks (see screenshot) should (probably) be in the repeatable group like the rest of them.
## Screenshots

## Questie version
v6.1.0
| defect | some logistic task quest are not marked as repeatable bug description four logistic tasks see screenshot should probably be in repeatable group as the rest of them screenshots questie version | 1 |
517,638 | 15,017,329,561 | IssuesEvent | 2021-02-01 10:42:31 | netdata/netdata | https://api.github.com/repos/netdata/netdata | closed | linux proc collectors - these should be migrated to sys | area/collectors feature request group/d-c priority/low | For various reasons (primarily inconsistency of presentation in /proc), it's generally regarded as a better idea to use /sys as the first port of call when reading data than /proc
This is particularly relevant when reading anything related to hardware or system parameters - there's virtually nothing in /proc that isn't available in /sys, (usually with greater detail)
Various efforts have been underway to "deprecate" or cleanup /proc for over a decade and as a result this area has mostly stagnated for at least 15 years (one example: If you want scsi_tape stats, you can _ONLY_ get them in /sys)
| 1.0 | linux proc collectors - these should be migrated to sys - For various reasons (primarily inconsistency of presentation in /proc), it's generally regarded as a better idea to use /sys as the first port of call when reading data than /proc
This is particularly relevant when reading anything related to hardware or system parameters - there's virtually nothing in /proc that isn't available in /sys, (usually with greater detail)
Various efforts have been underway to "deprecate" or cleanup /proc for over a decade and as a result this area has mostly stagnated for at least 15 years (one example: If you want scsi_tape stats, you can _ONLY_ get them in /sys)
| non_defect | linux proc collectors these should be migrated to sys for various reasons primarily inconsistency of presentation in proc it s generally regarded as a better idea to use sys as the first port of call when reading data than proc this is particularly relevant when reading anything related to hardware or system parameters there s virtually nothing in proc that isn t available in sys usually with greater detail various efforts have been underway to deprecate or cleanup proc for over a decade and as a result this area has mostly stagnated for at least years one example if you want scsi tape stats you can only get them in sys | 0 |
41,723 | 10,576,836,598 | IssuesEvent | 2019-10-07 18:45:32 | mozilla-lockwise/lockwise-android | https://api.github.com/repos/mozilla-lockwise/lockwise-android | opened | Link hover/press copy for Learn More links not working | needs-design type: defect | Related to #951
> Settings Support - ... noticed that our link hover/press copy is no longer present on this type of link style (same as on the welcome screen link) nor does the tap target seem to be consistent (I tend to have to press it multiple times before the page will load) | 1.0 | Link hover/press copy for Learn More links not working - Related to #951
> Settings Support - ... noticed that our link hover/press copy is no longer present on this type of link style (same as on the welcome screen link) nor does the tap target seem to be consistent (I tend to have to press it multiple times before the page will load) | defect | link hover press copy for learn more links not working related to settings support noticed that our link hover press copy is no longer present on this type of link style same as on the welcome screen link nor does the tap target seem to be consistent i tend to have to press it multiple times before the page will load | 1 |
211,014 | 23,778,996,441 | IssuesEvent | 2022-09-02 01:11:29 | ignatandrei/RecordVisitors | https://api.github.com/repos/ignatandrei/RecordVisitors | closed | CVE-2017-0256 (Medium) detected in system.net.http.4.3.0.nupkg - autoclosed | security vulnerability | ## CVE-2017-0256 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>system.net.http.4.3.0.nupkg</b></p></summary>
<p>Provides a programming interface for modern HTTP applications, including HTTP client components that...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/system.net.http.4.3.0.nupkg">https://api.nuget.org/packages/system.net.http.4.3.0.nupkg</a></p>
<p>Path to dependency file: /AutomatedTestRecord/AutomatedTestRecord.csproj</p>
<p>Path to vulnerable library: /usr/share/dotnet/sdk/NuGetFallbackFolder/system.net.http/4.3.0/system.net.http.4.3.0.nupkg</p>
<p>
Dependency Hierarchy:
- lightbdd.xunit2.3.3.0.nupkg (Root Library)
- xunit.2.4.1.nupkg
- xunit.assert.2.4.1.nupkg
- netstandard.library.1.6.1.nupkg
- :x: **system.net.http.4.3.0.nupkg** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A spoofing vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0256>CVE-2017-0256</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-0256">https://nvd.nist.gov/vuln/detail/CVE-2017-0256</a></p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: Microsoft.AspNetCore.Mvc.ApiExplorer - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Abstractions - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.1.3,1.0.4;System.Net.Http - 4.1.2,4.3.2;Microsoft.AspNetCore.Mvc.Razor - 1.1.3,1.0.4;System.Net.Http.WinHttpHandler - 4.0.2,4.3.0-preview1-24530-04;System.Net.Security - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;System.Text.Encodings.Web - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Razor.Host - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3;System.Net.WebSockets.Client - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2017-0256 (Medium) detected in system.net.http.4.3.0.nupkg - autoclosed - ## CVE-2017-0256 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>system.net.http.4.3.0.nupkg</b></p></summary>
<p>Provides a programming interface for modern HTTP applications, including HTTP client components that...</p>
<p>Library home page: <a href="https://api.nuget.org/packages/system.net.http.4.3.0.nupkg">https://api.nuget.org/packages/system.net.http.4.3.0.nupkg</a></p>
<p>Path to dependency file: /AutomatedTestRecord/AutomatedTestRecord.csproj</p>
<p>Path to vulnerable library: /usr/share/dotnet/sdk/NuGetFallbackFolder/system.net.http/4.3.0/system.net.http.4.3.0.nupkg</p>
<p>
Dependency Hierarchy:
- lightbdd.xunit2.3.3.0.nupkg (Root Library)
- xunit.2.4.1.nupkg
- xunit.assert.2.4.1.nupkg
- netstandard.library.1.6.1.nupkg
- :x: **system.net.http.4.3.0.nupkg** (Vulnerable Library)
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A spoofing vulnerability exists when the ASP.NET Core fails to properly sanitize web requests.
<p>Publish Date: 2017-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-0256>CVE-2017-0256</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-0256">https://nvd.nist.gov/vuln/detail/CVE-2017-0256</a></p>
<p>Release Date: 2017-05-12</p>
<p>Fix Resolution: Microsoft.AspNetCore.Mvc.ApiExplorer - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Abstractions - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Core - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Cors - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.Localization - 1.1.3,1.0.4;System.Net.Http - 4.1.2,4.3.2;Microsoft.AspNetCore.Mvc.Razor - 1.1.3,1.0.4;System.Net.Http.WinHttpHandler - 4.0.2,4.3.0-preview1-24530-04;System.Net.Security - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.ViewFeatures - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.TagHelpers - 1.0.4,1.1.3;System.Text.Encodings.Web - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Razor.Host - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.Formatters.Json - 1.0.4,1.1.3;Microsoft.AspNetCore.Mvc.WebApiCompatShim - 1.0.4,1.1.3;System.Net.WebSockets.Client - 4.3.0-preview1-24530-04,4.0.1;Microsoft.AspNetCore.Mvc.Formatters.Xml - 1.1.3,1.0.4;Microsoft.AspNetCore.Mvc.DataAnnotations - 1.0.4,1.1.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve medium detected in system net http nupkg autoclosed cve medium severity vulnerability vulnerable library system net http nupkg provides a programming interface for modern http applications including http client components that library home page a href path to dependency file automatedtestrecord automatedtestrecord csproj path to vulnerable library usr share dotnet sdk nugetfallbackfolder system net http system net http nupkg dependency hierarchy lightbdd nupkg root library xunit nupkg xunit assert nupkg netstandard library nupkg x system net http nupkg vulnerable library found in base branch main vulnerability details a spoofing vulnerability exists when the asp net core fails to properly sanitize web requests publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution microsoft aspnetcore mvc apiexplorer microsoft aspnetcore mvc abstractions microsoft aspnetcore mvc core microsoft aspnetcore mvc cors microsoft aspnetcore mvc localization system net http microsoft aspnetcore mvc razor system net http winhttphandler system net security microsoft aspnetcore mvc viewfeatures microsoft aspnetcore mvc taghelpers system text encodings web microsoft aspnetcore mvc razor host microsoft aspnetcore mvc formatters json microsoft aspnetcore mvc webapicompatshim system net websockets client microsoft aspnetcore mvc formatters xml microsoft aspnetcore mvc dataannotations step up your open source security game with mend | 0 |
272,035 | 23,648,191,523 | IssuesEvent | 2022-08-26 02:06:18 | Programming-Simplified-Community/Social-Coder | https://api.github.com/repos/Programming-Simplified-Community/Social-Coder | opened | Add Tests for Badge Endpoints | enhancement good first issue Api Tests | Implement [test-containers](https://github.com/testcontainers/testcontainers-dotnet) to help get our test suite started.
Will require a little bit of research/trial and error on how to properly integrate this.
[Nick Chapsas](https://youtu.be/8IRNC7qZBmk) has a demo using this... can use this to help figure things out.
Food for thought:
- must have tests for both happy/bad paths
- should not have 1 long test-method. Each "path" or "test" should be its own method. This helps us isolate whether or not a method is **actually** broken versus failed because a previous test failed.
- each endpoint should be its own file, i.e. the badge endpoint will be its own file.
- Test suites are separate projects named after the project it is tailored towards... so in this case it would be `SocialCoder.Web.Tests` | 1.0 | Add Tests for Badge Endpoints - Implement [test-containers](https://github.com/testcontainers/testcontainers-dotnet) to help get our test suite started.
Will require a little bit of research/trial and error on how to properly integrate this.
[Nick Chapsas](https://youtu.be/8IRNC7qZBmk) has a demo using this... can use this to help figure things out.
Food for thought:
- must have tests for both happy/bad paths
- should not have 1 long test-method. Each "path" or "test" should be its own method. This helps us isolate whether or not a method is **actually** broken versus failed because a previous test failed.
- each endpoint should be its own file, i.e. the badge endpoint will be its own file.
- Test suites are separate projects named after the project it is tailored towards... so in this case it would be `SocialCoder.Web.Tests` | non_defect | add tests for badge endpoints implement to help get our test suite started will require a little bit of research trial and error on how to properly integrate this has a demo using this can use this to help figure things out food for thought must have tests for both happy bad paths should not have long test method each path or test should be its own method this helps us isolate whether or not a method is actually broken versus failed because a previous test failed each endpoint should be its own file i e the badge endpoint will be it s own file test suites are separate projects named after the project it is tailored towards so in this case it would be socialcoder web tests | 0 |
146,114 | 13,171,490,127 | IssuesEvent | 2020-08-11 16:45:38 | corrados/jamulus | https://api.github.com/repos/corrados/jamulus | closed | How many Jamulus servers can be run on a single computer? | documentation | I am currently using Jamulus to run a server at a university, and the university is interested in running multiple Jamulus servers for ensemble rehearsals. I tried to do this as a quick test by trying to run 5 servers in a bash script, and it only seemed to allow one to run. Is it possible to run multiple Jamulus servers from a single computer? If it is not, would virtual machines be the way to go for this? I have found running a Jamulus server in a vm had worse latency (makes sense), but I am a little concerned with how I would go about doing this without having have as many computers as our university wants servers.
Does anyone have any suggestions for these things, and is it possible to run multiple servers on a single machine?
Thank you for your time. I understand that this project is very busy, so I appreciate any help that I get.
EDIT: Just discovered you can change the local port the server uses (with option -p), which is great. Is this the best way to run many servers on one computer? | 1.0 | How many Jamulus servers can be run on a single computer? - I am currently using Jamulus to run a server at a university, and the university is interested in running multiple Jamulus servers for ensemble rehearsals. I tried to do this as a quick test by trying to run 5 servers in a bash script, and it only seemed to allow one to run. Is it possible to run multiple Jamulus servers from a single computer? If it is not, would virtual machines be the way to go for this? I have found running a Jamulus server in a vm had worse latency (makes sense), but I am a little concerned with how I would go about doing this without having to have as many computers as our university wants servers.
Does anyone have any suggestions for these things, and is it possible to run multiple servers on a single machine?
Thank you for your time. I understand that this project is very busy, so I appreciate any help that I get.
EDIT: Just discovered you can change the local port the server uses (with option -p), which is great. Is this the best way to run many servers on one computer? | non_defect | how many jamulus servers can be run on a single computer i am currently using jamulus to run a server at a university and the university is interested in running multiple jamulus servers for ensemble rehearsals i tried to do this as a quick test by trying to run servers in a bash script and it only seemed to allow one to run is it possible to run multiple jamulus servers from a single computer if it is not would virtual machines be the way to go for this i have found running a jamulus server in a vm had worse latency makes sense but i am a little concerned with how i would go about doing this without having have as many computers as our university wants servers does anyone have any suggestions for these things and is it possible to run multiple servers on a single machine thank you for your time i understand that this project is very busy so i appreciate any help that i get edit just discovered you can change the local port the server uses with option p which is great is this the best way to run many servers on one computer | 0 |
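The `-p` flag the reporter discovered is indeed the usual way to run several Jamulus server processes on one machine, each bound to its own UDP port. A minimal shell sketch of such a launcher follows — the flag names `-s` (server mode), `-n` (no GUI) and `-p` (port) follow common Jamulus usage, and the binary name, base port and instance count are illustrative assumptions, not taken from the issue:

```shell
#!/bin/sh
# Print one Jamulus server invocation per instance, each on its own port.
# Generating the command lines separately from launching them keeps the
# logic easy to inspect and test.
generate_jamulus_commands() {
    base_port=$1
    count=$2
    i=0
    while [ "$i" -lt "$count" ]; do
        echo "Jamulus -s -n -p $((base_port + i))"
        i=$((i + 1))
    done
}

# To actually start the servers, each generated line could be launched in
# the background, e.g.:
#   generate_jamulus_commands 22124 5 | while read -r cmd; do $cmd & done
generate_jamulus_commands 22124 5
```

Each instance then listens on its own port, so clients connect with `host:port`; whether five instances fit on one machine becomes a CPU and bandwidth question rather than a Jamulus limitation, and no virtual machines are needed.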
28,805 | 8,211,916,184 | IssuesEvent | 2018-09-04 14:58:36 | quicklisp/quicklisp-projects | https://api.github.com/repos/quicklisp/quicklisp-projects | closed | Please add cmake-parser. | canbuild | This is a project to parse cmake scripts, it will parse script into a list of command invocation.
Repository:
https://github.com/zbq/cmake-parser
license: MIT | 1.0 | Please add cmake-parser. - This is a project to parse cmake scripts, it will parse script into a list of command invocation.
Repository:
https://github.com/zbq/cmake-parser
license: MIT | non_defect | please add cmake parser this is a project to parse cmake scripts it will parse script into a list of command invocation repository license mit | 0 |
63,534 | 17,754,513,915 | IssuesEvent | 2021-08-28 13:34:43 | primefaces-extensions/primefaces-extensions | https://api.github.com/repos/primefaces-extensions/primefaces-extensions | closed | CombinedResourceHandler: Myfaces ParseError in Dev Mode | defect | **Describe the bug**
@tandraschko I have attached this reproducer:
[pfe-combinedresourcehandler.zip](https://github.com/primefaces-extensions/primefaces-extensions/files/7059882/pfe-combinedresourcehandler.zip)
To reproduce the issue, run the reproducer `mvn clean jetty:run -Pmyfaces23` with JSF mode **Development** and press the "Update" button twice. On the second press you will see this in the console for the AJAX response.

Now run the reproducer again in JSF mode **Production** `mvn clean jetty:run -Pmyfaces23` and there are no errors or problems. I think it stems from MyFaces having an issue with duplicate ID's for this piece of code in our handler:
https://github.com/primefaces-extensions/primefaces-extensions/blob/fd939e77a2e26c765bd06efa5692fde8e3fba4b2/core/src/main/java/org/primefaces/extensions/application/PrimeFacesScriptProcessor.java#L169-L175
I used to have ID's on there but they had to be removed because of this ticket: https://github.com/primefaces-extensions/primefaces-extensions/issues/486
Any thoughts you have on this would be appreciated. It works fine in Mojarra in Development mode but I am wondering which library is doing the right thing?
| 1.0 | CombinedResourceHandler: Myfaces ParseError in Dev Mode - **Describe the bug**
@tandraschko I have attached this reproducer:
[pfe-combinedresourcehandler.zip](https://github.com/primefaces-extensions/primefaces-extensions/files/7059882/pfe-combinedresourcehandler.zip)
To reproduce the issue, run the reproducer `mvn clean jetty:run -Pmyfaces23` with JSF mode **Development** and press the "Update" button twice. On the second press you will see this in the console for the AJAX response.

Now run the reproducer again in JSF mode **Production** `mvn clean jetty:run -Pmyfaces23` and there are no errors or problems. I think it stems from MyFaces having an issue with duplicate ID's for this piece of code in our handler:
https://github.com/primefaces-extensions/primefaces-extensions/blob/fd939e77a2e26c765bd06efa5692fde8e3fba4b2/core/src/main/java/org/primefaces/extensions/application/PrimeFacesScriptProcessor.java#L169-L175
I used to have ID's on there but they had to be removed because of this ticket: https://github.com/primefaces-extensions/primefaces-extensions/issues/486
Any thoughts you have on this would be appreciated. It works fine in Mojarra in Development mode but I am wondering which library is doing the right thing?
| defect | combinedresourcehandler myfaces parseerror in dev mode describe the bug tandraschko i have attached this reproducer to get the issue run the reproducer mvn clean jetty run with jsf mode development and press the update button twice on the second press you will see this in the console for the ajax response now run the reproducer again in jsf mode production mvn clean jetty run and there is no errors or problems i think it stems from myfaces having an issue with duplicate id s for this piece of code in our handler i used to have id s on there but had to be removed because of this ticket any thoughts you have on this would be appreciated it works fine in mojarra in development mode but i am wondering which library is doing the right thing | 1 |
51,848 | 13,211,323,151 | IssuesEvent | 2020-08-15 22:18:29 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | opened | [cscd-llh] needs more tests (Trac #1168) | Incomplete Migration Migrated from Trac combo reconstruction defect | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1168">https://code.icecube.wisc.edu/projects/icecube/ticket/1168</a>, reported by hdembinski and owned by tpalczewski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-08-21T13:27:59",
"_ts": "1440163679305068",
"description": "the project has some cpp-tests, but the pybindings are not tested. the script CscdLlhTest.py is practically a test should be converted into a test, by moving it to resources/test/test_cscd_llh.py\n\ncurrently, the test coverage is rather low, see\nhttp://software.icecube.wisc.edu/coverage/00_LATEST/\n\nespecially for\nprivate/converter\ncscd-llh/private/pdf\ncscd-llh/public/cscd-llh/pdf\n",
"reporter": "hdembinski",
"cc": "",
"resolution": "fixed",
"time": "2015-08-18T20:31:03",
"component": "combo reconstruction",
"summary": "[cscd-llh] needs more tests",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "tpalczewski",
"type": "defect"
}
```
</p>
</details>
| 1.0 | [cscd-llh] needs more tests (Trac #1168) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1168">https://code.icecube.wisc.edu/projects/icecube/ticket/1168</a>, reported by hdembinski and owned by tpalczewski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-08-21T13:27:59",
"_ts": "1440163679305068",
"description": "the project has some cpp-tests, but the pybindings are not tested. the script CscdLlhTest.py is practically a test should be converted into a test, by moving it to resources/test/test_cscd_llh.py\n\ncurrently, the test coverage is rather low, see\nhttp://software.icecube.wisc.edu/coverage/00_LATEST/\n\nespecially for\nprivate/converter\ncscd-llh/private/pdf\ncscd-llh/public/cscd-llh/pdf\n",
"reporter": "hdembinski",
"cc": "",
"resolution": "fixed",
"time": "2015-08-18T20:31:03",
"component": "combo reconstruction",
"summary": "[cscd-llh] needs more tests",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "tpalczewski",
"type": "defect"
}
```
</p>
</details>
| defect | needs more tests trac migrated from json status closed changetime ts description the project has some cpp tests but the pybindings are not tested the script cscdllhtest py is practically a test should be converted into a test by moving it to resources test test cscd llh py n ncurrently the test coverage is rather low see n for nprivate converter ncscd llh private pdf ncscd llh public cscd llh pdf n reporter hdembinski cc resolution fixed time component combo reconstruction summary needs more tests priority normal keywords milestone owner tpalczewski type defect | 1 |
46,378 | 24,506,797,111 | IssuesEvent | 2022-10-10 17:07:57 | jonisavo/uicomponents | https://api.github.com/repos/jonisavo/uicomponents | opened | Replace all UIComponent reflection with source generation, drop Unity 2019 support | enhancement performance | `com.unity.roslyn` allows source generation to be used in Unity 2020. This makes the jump from reflection to full source generation a viable option, since support for Unity 2020 can be retained for a while.
All reflection related to UIComponent (not counting the dependency injection system) should be converted to use source generation instead. This should provide a significant performance boost and make the framework more viable for in-game UI.
`com.unity.roslyn` does not support Unity 2019. I think support for 2019 can be dropped, since it reached EOL months ago. | True | Replace all UIComponent reflection with source generation, drop Unity 2019 support - `com.unity.roslyn` allows source generation to be used in Unity 2020. This makes the jump from reflection to full source generation a viable option, since support for Unity 2020 can be retained for a while.
All reflection related to UIComponent (not counting the dependency injection system) should be converted to use source generation instead. This should provide a significant performance boost and make the framework more viable for in-game UI.
`com.unity.roslyn` does not support Unity 2019. I think support for 2019 can be dropped, since it reached EOL months ago. | non_defect | replace all uicomponent reflection with source generation drop unity support com unity roslyn allows source generation to be used in unity this makes the jump from reflection to full source generation a viable option since support for unity can be retained for a while all reflection related to uicomponent not counting the dependency injection system should be converted to use source generation instead this should provide a significant performance boost and make the framework more viable for in game ui com unity roslyn does not support unity i think support for can be dropped since it reached eol months ago | 0 |
8,538 | 2,611,516,925 | IssuesEvent | 2015-02-27 05:51:48 | chrsmith/hedgewars | https://api.github.com/repos/chrsmith/hedgewars | closed | Error while compiling and linking | auto-migrated Priority-Medium Type-Defect | ```
Hello!
Trying to compile the 0.9.19 branch fails with the following error:
(9015) Linking /home/eugeno/Projects/Hedgewars/trunk/bin/hwengine
/usr/bin/ld.bfd.real: warning:
/home/eugeno/Projects/Hedgewars/trunk/bin/link.res contains output sections;
did you forget -T?
/usr/bin/ld.bfd.real: cannot find -lstdc++
hwengine.pas(558,1) Error: (9013) Error while linking
hwengine.pas(558,1) Fatal: (10026) There were 1 errors compiling module,
stopping
Fatal: (1018) Compilation aborted
Error: /usr/bin/ppcx64 returned an error exitcode (normal if you did not
specify a source file to be compiled)
0.9.18 is compiled well. I'm using Debian Squeeze, libstdc++6 and
libstdc++6-4.4-dev are installed.
Regards,
Eugene
```
Original issue reported on code.google.com by `euxgeno` on 24 Mar 2013 at 8:24 | 1.0 | Error while compiling and linking - ```
Hello!
Trying to compile the 0.9.19 branch fails with the following error:
(9015) Linking /home/eugeno/Projects/Hedgewars/trunk/bin/hwengine
/usr/bin/ld.bfd.real: warning:
/home/eugeno/Projects/Hedgewars/trunk/bin/link.res contains output sections;
did you forget -T?
/usr/bin/ld.bfd.real: cannot find -lstdc++
hwengine.pas(558,1) Error: (9013) Error while linking
hwengine.pas(558,1) Fatal: (10026) There were 1 errors compiling module,
stopping
Fatal: (1018) Compilation aborted
Error: /usr/bin/ppcx64 returned an error exitcode (normal if you did not
specify a source file to be compiled)
0.9.18 is compiled well. I'm using Debian Squeeze, libstdc++6 and
libstdc++6-4.4-dev are installed.
Regards,
Eugene
```
Original issue reported on code.google.com by `euxgeno` on 24 Mar 2013 at 8:24 | defect | error while compiling and linking hello trying to compile the branch fails with the following error linking home eugeno projects hedgewars trunk bin hwengine usr bin ld bfd real warning home eugeno projects hedgewars trunk bin link res contains output sections did you forget t usr bin ld bfd real cannot find lstdc hwengine pas error error while linking hwengine pas fatal there were errors compiling module stopping fatal compilation aborted error usr bin returned an error exitcode normal if you did not specify a source file to be compiled is compiled well i m using debian squeeze libstdc and libstdc dev are installed regards eugene original issue reported on code google com by euxgeno on mar at | 1 |
100,200 | 30,641,090,669 | IssuesEvent | 2023-07-24 22:05:29 | riversoforion/clio-auth | https://api.github.com/repos/riversoforion/clio-auth | opened | Fix Codecov integration | bug build | Coverage reports are successfully uploaded, but flagged as "unusable report" by Codecov.
e.g. https://app.codecov.io/github/riversoforion/clio-auth/commit/7f0b2d5f45716d95cc58592505ed302b4f61105e | 1.0 | Fix Codecov integration - Coverage reports are successfully uploaded, but flagged as "unusable report" by Codecov.
e.g. https://app.codecov.io/github/riversoforion/clio-auth/commit/7f0b2d5f45716d95cc58592505ed302b4f61105e | non_defect | fix codecov integration coverage reports are successfully uploaded but flagged as unusable report by codecov e g | 0 |
30,121 | 6,032,799,175 | IssuesEvent | 2017-06-09 05:57:52 | moosetechnology/Moose | https://api.github.com/repos/moosetechnology/Moose | closed | VerveineJ does not export non-javadoc comments | Component-VerveineJ Priority-Medium Type-Defect | Originally reported on Google Code with ID 838
```
VerveineJ exports javadoc comments for both types and methods.
However, it does not export line comments like:
//This is also a relevant comment
```
Reported by `tudor.girba` on 2012-09-17 06:38:08
| 1.0 | VerveineJ does not export non-javadoc comments - Originally reported on Google Code with ID 838
```
VerveineJ exports javadoc comments for both types and methods.
However, it does not export line comments like:
//This is also a relevant comment
```
Reported by `tudor.girba` on 2012-09-17 06:38:08
| defect | verveinej does not export non javadoc comments originally reported on google code with id verveinej exports javadoc comments for both types and methods however it does not export line comments like this is also a relevant comment reported by tudor girba on | 1 |
263,485 | 8,290,227,198 | IssuesEvent | 2018-09-19 16:43:26 | CyberReboot/poseidon | https://api.github.com/repos/CyberReboot/poseidon | closed | poseidon.log created as a directory | bug high-priority | Under certain circumstances the file /var/log/poseidon.log is being created as a directory. This leads to a warning message:
```
WARNING:Logger_Base:68 - Unable to setup Poseidon logger because: [Errno 21] Is a directory: '/var/log/poseidon.log'
```
When this happens the user will be unable to view logging output from poseidon. | 1.0 | poseidon.log created as a directory - Under certain circumstances the file /var/log/poseidon.log is being created as a directory. This leads to a warning message:
```
WARNING:Logger_Base:68 - Unable to setup Poseidon logger because: [Errno 21] Is a directory: '/var/log/poseidon.log'
```
When this happens the user will be unable to view logging output from poseidon. | non_defect | poseidon log created as a directory under certain circumstances the file var log poseidon log is being created as a directory this leads to a warning message warning logger base unable to setup poseidon logger because is a directory var log poseidon log when this happens the user will be unable to view logging output from poseidon | 0 |
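This symptom is typical when something (for example a Docker bind mount to a not-yet-existing path) creates the log path as a directory before the application first opens it. A hedged shell sketch of a startup guard follows — the path handling and the recovery strategy (only an *empty* directory is replaced) are illustrative assumptions, not Poseidon's actual fix:

```shell
#!/bin/sh
# Ensure the given log path is a regular file, not a directory.
# If an empty directory occupies the path (e.g. left behind by a volume
# mount), remove it and create the file; a non-empty directory is left
# alone and reported instead of being deleted.
ensure_log_file() {
    path=$1
    if [ -d "$path" ]; then
        if ! rmdir "$path" 2>/dev/null; then
            echo "refusing to replace non-empty directory: $path" >&2
            return 1
        fi
    fi
    touch "$path"
}

# Demonstration: simulate the bad state from the issue, then repair it.
demo=$(mktemp -d)
mkdir "$demo/poseidon.log"
ensure_log_file "$demo/poseidon.log"
ls -ld "$demo/poseidon.log"   # now a regular file
rm -rf "$demo"
```

With a guard like this running before the logger is configured, the `Errno 21 Is a directory` failure path would not be reached for the empty-directory case, and logging output would be viewable again.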
16,973 | 9,956,368,569 | IssuesEvent | 2019-07-05 13:45:07 | AtlasOfLivingAustralia/image-service | https://api.github.com/repos/AtlasOfLivingAustralia/image-service | closed | Require authentication or apikey for scheduleArtifactGeneration | IAD bug fixed-grails3 security | Some admin commands are being called by crawler bots, which implies that they don't have authentication or apikeys restricting access. In this case, the target is ``/ws/scheduleArtifactGeneration``
```
==> /var/log/tomcat7/catalina.out <==
Index Image 229250751: 2 ms
Index Image 229250751: 1 ms
2018-11-02 10:59:39,223 [http-bio-8080-exec-17549] INFO images.LogService - Username: N/A IP: 150.229.66.12 Session: F0ACE69B5C8C22296F007E5E6F0D3C75 UA: Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) URI: /grails/webService/scheduleArtifactGeneration.dispatch
==> /var/log/apache2/images.ala.org.au.ssl_access.log <==
66.249.79.252 - - [02/Nov/2018:10:59:39 +1100] "GET /ws/scheduleArtifactGeneration/5b6de1d6-7f86-4080-9f2f-b924cc2adc87 HTTP/1.1" 200 6419 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
``` | True | Require authentication or apikey for scheduleArtifactGeneration - Some admin commands are being called by crawler bots, which implies that they don't have authentication or apikeys restricting access. In this case, the target is ``/ws/scheduleArtifactGeneration``
```
==> /var/log/tomcat7/catalina.out <==
Index Image 229250751: 2 ms
Index Image 229250751: 1 ms
2018-11-02 10:59:39,223 [http-bio-8080-exec-17549] INFO images.LogService - Username: N/A IP: 150.229.66.12 Session: F0ACE69B5C8C22296F007E5E6F0D3C75 UA: Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html) URI: /grails/webService/scheduleArtifactGeneration.dispatch
==> /var/log/apache2/images.ala.org.au.ssl_access.log <==
66.249.79.252 - - [02/Nov/2018:10:59:39 +1100] "GET /ws/scheduleArtifactGeneration/5b6de1d6-7f86-4080-9f2f-b924cc2adc87 HTTP/1.1" 200 6419 "-" "Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/41.0.2272.96 Mobile Safari/537.36 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"
``` | non_defect | require authentication or apikey for scheduleartifactgeneration some admin commands are being called by crawler bots which implies that they don t have authentication or apikeys restricting access in this case the target is ws scheduleartifactgeneration var log catalina out index image ms index image ms info images logservice username n a ip session ua mozilla linux android nexus build applewebkit khtml like gecko chrome mobile safari compatible googlebot uri grails webservice scheduleartifactgeneration dispatch var log images ala org au ssl access log get ws scheduleartifactgeneration http mozilla linux android nexus build applewebkit khtml like gecko chrome mobile safari compatible googlebot | 0 |
71,030 | 23,419,362,426 | IssuesEvent | 2022-08-13 12:59:19 | primefaces/primereact | https://api.github.com/repos/primefaces/primereact | closed | Dropdown component flicker bug | defect :bangbang: needs-triage | ### Describe the bug
When a page loads initially, the dropdown component has a flicker when it's first clicked on. Then the dropdown opens up for a millisecond and closes immediately after.
This issue is only resolved when the page is refreshed and then for some reason the dropdown works and displays the listed items.
If it means anything, in my case, the dropdown component is followed by a MultiSelect component, and they are both displayed in a modal dialog, with the dropdown appearing at the very top.
### Reproducer
_No response_
### PrimeReact version
8.1.0
### React version
18.x
### Language
ALL
### Build / Runtime
Create React App (CRA)
### Browser(s)
_No response_
### Steps to reproduce the behavior
Use the dropdown inside of a modal dialog perhaps followed by a MultiSelect component.
### Expected behavior
I expected the dropdown to display the listed items normally and immediately after it's clicked on. | 1.0 | Dropdown component flicker bug - ### Describe the bug
When a page loads initially, the dropdown component has a flicker when it's first clicked on. Then the dropdown opens up for a millisecond and closes immediately after.
This issue is only resolved when the page is refreshed and then for some reason the dropdown works and displays the listed items.
If it means anything, in my case, the dropdown component is followed by a MultiSelect component, and they are both displayed in a modal dialog, with the dropdown appearing at the very top.
### Reproducer
_No response_
### PrimeReact version
8.1.0
### React version
18.x
### Language
ALL
### Build / Runtime
Create React App (CRA)
### Browser(s)
_No response_
### Steps to reproduce the behavior
Use the dropdown inside of a modal dialog perhaps followed by a MultiSelect component.
### Expected behavior
I expected the dropdown to display the listed items normally and immediately after it's clicked on. | defect | dropdown component flicker bug describe the bug when a page loads initially the dropdown component has a flicker when it s first clicked on then the dropdown opens up for a millisecond and closes immediately after this issue is only resolved when the page is refreshed and then for some reason the dropdown works and displays the listed items if it means anything in my case the dropdown component is followed by a multiselect component and they are both displayed in a modal dialog with the dropdown appearing at the very top reproducer no response primereact version react version x language all build runtime create react app cra browser s no response steps to reproduce the behavior use the dropdown inside of a modal dialog perhaps followed by a multiselect component expected behavior i expected the dropdown to display the listed items normally and immediately after it s clicked on | 1 |
59,189 | 17,016,323,134 | IssuesEvent | 2021-07-02 12:35:09 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | opened | In the search for street names in Spanish, stop words should be eliminated. | Component: nominatim Priority: major Type: defect | **[Submitted to the original trac issue database at 7.08pm, Wednesday, 10th July 2013]**
The Royal Spanish Academy indicates that the preposition '''de''' (of, in English) should never be omitted in the names of streets, avenues and promenades, unless the name is an adjective: "Calle '''de''' Espronceda", "Plaza '''de''' Colón", "Avenida '''de''' América", "Paseo '''de''' Gracia", in the first case, and "Calle Mayor" or "Plaza Nueva" for the second case.
Although the right way to name a street is to put the preposition '''de''' (of) after the type of road (examples: Calle '''de''' Alcalá, Avenida '''de''' Pérez Galdós, Plaza '''de''' España, etc.), it's very common to skip it when contracting the name. This peculiarity must be taken into account by search engines.
Nominatim does not currently take this into account, which means the search engine fails to find a lot of streets that actually exist in OpenStreetMap. Examples:
* "[http://nominatim.openstreetmap.org/search.php?q=Calle+Hern%C3%A1n+Cort%C3%A9s%2C+Santander%2C+Spain&viewbox=-3.81%2C43.47%2C-3.79%2C43.46 Calle de Hernán Cortés, Santander, Spain]" -> No search results found.
* "[http://nominatim.openstreetmap.org/search.php?q=Calle+de+Hern%C3%A1n+Cort%C3%A9s%2C+Santander%2C+Spain&viewbox=-151.18%2C61.15%2C151.18%2C-61.15 Calle Hernán Cortés, Santander, Spain]" -> Finds the correct street.
This also makes Nominatim unhelpful for reverse geocoding of addresses in this language.
Note that in Spanish, besides the preposition '''de''', another usual construct is to use '''el''', '''la''', '''los''' or '''las''' (all translate to '''the''') just after de: Carretera '''del''' faro, Calle '''de los''' Caídos '''de la''' División Azul, Calle de las Descalzas. Note that "'''de el'''" contracts itself to "'''del'''" in most cases.
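As a purely illustrative sketch (not Nominatim's actual implementation), a search engine could ignore these stop words by normalizing both the query and the indexed street name before comparing them; the stop-word set below is only a small subset of the full Snowball list:

```python
# Illustrative subset of Spanish stop words relevant to street names.
SPANISH_STOP_WORDS = {"de", "del", "el", "la", "los", "las"}

def normalize_street_query(query: str) -> str:
    """Lower-case the text and drop Spanish stop words, so that
    'Calle de Hernán Cortés' and 'Calle Hernán Cortés' compare equal."""
    return " ".join(
        word for word in query.lower().split()
        if word not in SPANISH_STOP_WORDS
    )

# Both spellings normalize to the same search key:
assert normalize_street_query("Calle de Hernán Cortés") == \
       normalize_street_query("Calle Hernán Cortés")
assert normalize_street_query("Carretera del faro") == "carretera faro"
```

Applying the same normalization to both the user's query and the OSM name would make the two example searches above return the same street.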
In [http://snowball.tartarus.org/algorithms/spanish/stop.txt this link] there is a list of stop words that should be ignored in searches, which include the prepositions above.
17,466 | 3,006,987,893 | IssuesEvent | 2015-07-27 13:59:08 | bridgedotnet/Bridge | https://api.github.com/repos/bridgedotnet/Bridge | closed | NullReferenceException calling CultureInfo constructor | defect | While trying to parse some dates (which led to #329), we noticed that we get errors when trying to use the `CultureInfo` constructor. [Live example here](http://live.bridge.net/#b2a88e44d05088157ac7).
```
public class App
{
[Ready]
public static void Main()
{
new CultureInfo("en-US");
}
}
```
Reports `Finished with error(s)`:
```
// Line 0, Col 0 : Object reference not set to an instance of an object.
```
Seems like an error during compilation?
57,824 | 16,089,776,917 | IssuesEvent | 2021-04-26 15:21:10 | bardsoftware/ganttproject | https://api.github.com/repos/bardsoftware/ganttproject | closed | Milestone display issue on timeline | Type-Defect auto-migrated | ```
What steps will reproduce the problem?
1.create a task
2.convert it in milestone
3.or create directly a milestone task
What is the expected output? What do you see instead?
It appears as a milestone on the Gantt chart, but only the task id is displayed on the timeline. Random issue.
What version of the product are you using? On what operating system?
2.7 RC2 build 1864 - W7x64 SP1
Please provide any additional information below.
```
Original issue reported on code.google.com by `romain.g...@gmail.com` on 27 Feb 2015 at 8:38
Attachments:
- [Sans titre.png](https://storage.googleapis.com/google-code-attachments/ganttproject/issue-1091/comment-0/Sans titre.png)
703 | 2,583,244,921 | IssuesEvent | 2015-02-16 02:24:01 | cakephp/cakephp | https://api.github.com/repos/cakephp/cakephp | closed | Cache groups are not working | cache Defect On hold | I suppose cache groups are not working as they should. I have this configuration:
```php
$defaultCache = [
'className' => 'File',
'path' => CACHE . 'plugin' . DS,
'probability' => 100,
'duration' => '+1 year'
];
// cache definitions
Cache::config('plugin_language', array_merge($defaultCache, [
'groups' => ['Languages']
]));
Cache::config('plugin_configuration', array_merge($defaultCache, [
'groups' => ['Configurations']
]));
```
When I try to remove Languages group by calling:
```php
foreach (Cache::groupConfigs('Languages') as $config) {
Cache::clearGroup('Languages', $config);
}
```
I get the exception *Invalid cache group Languages*
It is thrown in *\Cake\Cache\Cache groupConfigs* method and when I *var_dump(static::$_groups);* there, it is an empty array.
5,903 | 2,610,217,696 | IssuesEvent | 2015-02-26 19:09:17 | chrsmith/somefinders | https://api.github.com/repos/chrsmith/somefinders | opened | инструкция dvr s1000.txt | auto-migrated Priority-Medium Type-Defect | ```
'''Генрих Александров'''
Good day, I just can't find инструкция dvr s1000.txt anywhere. It was posted here at some point already.
'''Аврор Гурьев'''
Here is a good site where you can download it:
http://bit.ly/1aCqM96
'''Бронислав Архипов'''
Thanks, that seems to be it, but it asks me to enter a phone number
'''Аким Казаков'''
Nah, it's all fine, nothing was charged to me
'''Горимир Евдокимов'''
No, it doesn't affect your balance
File information: инструкция dvr s1000.txt
Uploaded: this month
Times downloaded: 1050
Rating: 1221
Average download speed: 501
Similar files: 19
```
-----
Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 12:01
22,250 | 3,619,269,997 | IssuesEvent | 2016-02-08 15:26:05 | pavva94/snake-os | https://api.github.com/repos/pavva94/snake-os | closed | IP Cam Feature Extension | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
I read that there is somewhere a package which gives IP CAM feature.
Unfortunately I cannot download this as the link is broken - Can anyone Help ?
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Win XP
Please provide any additional information below.
```
Original issue reported on code.google.com by `glynndr...@gmail.com` on 29 Jan 2015 at 9:22
334,731 | 29,953,652,259 | IssuesEvent | 2023-06-23 05:08:37 | crossplane/crossplane | https://api.github.com/repos/crossplane/crossplane | opened | Add descriptions to E2E features and assessments | enhancement test e2e | <!--
Thank you for helping to improve Crossplane!
Please be sure to search for open issues before raising a new one. We use issues
for bug reports and feature requests. Please find us at https://slack.crossplane.io
for questions, support, and discussion.
-->
### What problem are you facing?
<!--
Please tell us a little about your use case - it's okay if it's hypothetical!
Leading with this context helps frame the feature request so we can ensure we
implement it sensibly.
--->
Our E2E tests are categorized, from least to most specific:
1. By feature area label.
2. By test function name.
3. By feature name.
4. By assessment (step) name.
Labels are `key=value`, but all other names are `CamelCase`. At the moment we're missing a human-readable description of each feature and assessment. A lot of folks use the names to provide this, but that ends up looking pretty ugly when printed per https://github.com/kubernetes-sigs/e2e-framework/issues/277.
### How could Crossplane help solve your problem?
<!--
Let us know how you think Crossplane could help with your use case.
-->
I've asked in https://github.com/kubernetes-sigs/e2e-framework/issues/277 that e2e-framework add support for a prose description of each feature and assessment. They seem open to adding the functionality - when they do we should start using it.
146,794 | 5,627,950,302 | IssuesEvent | 2017-04-05 03:58:50 | dnGrep/dnGrep | https://api.github.com/repos/dnGrep/dnGrep | closed | no supports for gb2312/utf8 encoding? | bug imported Priority-Medium | _From [mfm...@sina.com](https://code.google.com/u/107484051280825188802/) on February 25, 2013 23:15:51_
It turns out there is no support for grepping gb2312/utf8-encoded text.
1. dnGrep did find the occurrences, but the line numbers and positions in the results are very wrong
2. Chinese characters displayed in the result pane are unreadable, while they are good in the preview pane.
3. Converting a gb2312/utf8-encoded text file to Unicode encoding will solve this problem, but it is a tough workaround and not feasible
4. env: dnGrep version 2.7.1, win7 x64 (simplified Chinese)
_Original issue: http://code.google.com/p/dngrep/issues/detail?id=177_
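The wrong positions are consistent with match offsets being computed in the bytes of one encoding but applied to text decoded under another. A quick illustration (Python here purely for demonstration; dnGrep itself is a .NET application) of how byte lengths diverge between gb2312 and utf-8:

```python
text = "中文 test"  # two Chinese characters plus a space and four ASCII letters

# The same characters occupy a different number of bytes in each encoding:
assert len(text.encode("gb2312")) == 9   # 2 bytes per Chinese character + 5 ASCII bytes
assert len(text.encode("utf-8")) == 11   # 3 bytes per Chinese character + 5 ASCII bytes

# So a match position computed in bytes under the wrong encoding
# points at the wrong column (and eventually the wrong line) once
# the file is decoded for display.
```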
5,727 | 2,610,214,203 | IssuesEvent | 2015-02-26 19:08:22 | chrsmith/somefinders | https://api.github.com/repos/chrsmith/somefinders | opened | инструкция для cisco 7906 русский.doc | auto-migrated Priority-Medium Type-Defect | ```
'''Викентий Аксёнов'''
Good day, I just can't find инструкция для cisco 7906 русский.doc anywhere. I saw it somewhere before.
'''Альвин Лазарев'''
Download it here: http://bit.ly/177cjBm
'''Арнольд Кононов'''
It asks me to enter my mobile number! Isn't that dangerous?
'''Виталий Якушев'''
No, it doesn't affect your balance
'''Владлен Евдокимов'''
Nah, it's all fine, nothing was charged to me
File information: инструкция для cisco 7906 русский.doc
Uploaded: this month
Times downloaded: 568
Rating: 325
Average download speed: 1145
Similar files: 32
```
-----
Original issue reported on code.google.com by `kondense...@gmail.com` on 16 Dec 2013 at 12:29
72,328 | 24,054,057,254 | IssuesEvent | 2022-09-16 15:10:33 | dkfans/keeperfx | https://api.github.com/repos/dkfans/keeperfx | closed | Possible to look through reinforced corners | Priority-Medium Type-Defect | In KeeperFX it is possible to look through corners that are fully reinforced. This allows the player to see things and even drop imps in places he should not be able to.
The behavior is different from the original game.
Steps to reproduce:
1) Load the testmap: [testmap1076.zip](https://github.com/dkfans/keeperfx/files/6664934/testmap1076.zip)
2) Claim everything you can, including the corners
-> notice you can see inside
-> notice you can drop your imp inside
46,757 | 13,055,971,194 | IssuesEvent | 2020-07-30 03:16:19 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | L1 filter for 2014 and 2015 (Trac #1828) | Incomplete Migration Migrated from Trac cmake defect | Migrated from https://code.icecube.wisc.edu/ticket/1828
```json
{
"status": "closed",
"changetime": "2016-08-18T21:39:38",
"description": "Running offline filter for 2014 and 2015:\n/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/metaprojects/icerec/IC2014-L2_V14-02-00/lib/icecube/filterscripts/offlineL2/level1_SimulationFiltering.py\n\nI receive the following error:\n`RuntimeError: dlopen() dynamic loading error: /data/user/saxani/environments/buildfwd/lib/libpfauxiliary.so: cannot open shared object file: No such file or directory`\n",
"reporter": "saxani",
"cc": "",
"resolution": "fixed",
"_ts": "1471556378950850",
"component": "cmake",
"summary": "L1 filter for 2014 and 2015",
"priority": "normal",
"keywords": "",
"time": "2016-08-18T21:29:33",
"milestone": "",
"owner": "",
"type": "defect"
}
```
42,190 | 2,869,256,950 | IssuesEvent | 2015-06-06 01:34:16 | Automattic/mongoose | https://api.github.com/repos/Automattic/mongoose | closed | cast to buffer failed | priority | i'm having logs that say that a cast to buffer failed:
```
May 29 14:05:40 studio app/web.2: ValidationError: Video validation failed
May 29 14:05:40 studio app/web.2: at model.Document.invalidate (/app/node_modules/mongoose/lib/document.js:1156:32)
May 29 14:05:40 studio app/web.2: at init (/app/node_modules/mongoose/lib/document.js:317:18)
May 29 14:05:40 studio app/web.2: at model.Document.init (/app/node_modules/mongoose/lib/document.js:273:3)
May 29 14:05:40 studio app/web.2: at completeOne (/app/node_modules/mongoose/lib/query.js:1469:10)
May 29 14:05:40 studio app/web.2: at Immediate.<anonymous> (/app/node_modules/mongoose/lib/query.js:1204:13)
May 29 14:05:40 studio app/web.2: at Immediate._onImmediate (/app/node_modules/mongoose/node_modules/mquery/lib/utils.js:137:16)
May 29 14:05:40 studio app/web.2: at processImmediate [as _immediateCallback] (timers.js:368:17)
May 29 14:05:40 studio app/web.2: { sha256:
May 29 14:05:40 studio app/web.2: { [ValidatorError: Cast to buffer failed for value "zÔ,�e@ڤ��
May 29 14:05:40 studio app/web.2: ����{=�T��u��W�/�" at path "sha256"]
May 29 14:05:40 studio app/web.2: properties:
May 29 14:05:40 studio app/web.2: { path: 'sha256',
May 29 14:05:40 studio app/web.2: message: 'Cast to buffer failed for value "zÔ,�e@ڤ�\u0010�\n\t��\u0002��{=�T��u��W�/�" at path "sha256"',
May 29 14:05:40 studio app/web.2: type: 'cast',
May 29 14:05:40 studio app/web.2: value: [Object] },
May 29 14:05:40 studio app/web.2: message: 'Cast to buffer failed for value "zÔ,�e@ڤ�\u0010�\n\t��\u0002��{=�T��u��W�/�" at path "sha256"',
May 29 14:05:40 studio app/web.2: name: 'ValidatorError',
May 29 14:05:40 studio app/web.2: kind: 'cast',
May 29 14:05:40 studio app/web.2: path: 'sha256',
May 29 14:05:40 studio app/web.2: value:
May 29 14:05:40 studio app/web.2: { _bsontype: 'Binary',
May 29 14:05:40 studio app/web.2: sub_type: 0,
May 29 14:05:40 studio app/web.2: position: 32,
May 29 14:05:40 studio app/web.2: buffer: <Buffer 7a c3 94 2c bd 65 40 da a4 c5 10 93 0a 09 8c e9 02 db db 7b 3d f7 54 ad e7 75 b4 b1 57 f3 2f 93> } } }
```
i'm not sure what's causing this. i have `sha256: Buffer` in my schema. seems like `sha256` is being set as a BSON Binary, which isn't exactly a buffer, causing this error. i also can't reproduce locally - it only happens on my server.
know what's going on?
75,558 | 25,915,318,481 | IssuesEvent | 2022-12-15 16:53:45 | SeleniumHQ/selenium | https://api.github.com/repos/SeleniumHQ/selenium | opened | [🐛 Bug]: `url_matches` expected condition is using `re.search` possibly accidentally | I-defect needs-triaging | ### What happened?
According to the docstring of [`url_matches`](https://github.com/SeleniumHQ/selenium/blob/trunk/py/selenium/webdriver/support/expected_conditions.py#L96) the pattern should be matched against the url _exactly_ - Is a `re.search` sufficient here over `re.fullmatch` ?
```python
"""An expectation for checking the current url.
pattern is the expected pattern, which must be an exact match
returns True if the url matches, false otherwise.
"""
```
This may be as expected; either way we should either update the docstring if the current search behaviour is adequate, or move to `re.fullmatch` if the implementation is slightly off.
There's also a `url_to_be` that relies on equality, which is slightly less usable for polling against dynamic aspects of a url, so the docstring might just have been a copy-paste mistake there.
### How can we reproduce the issue?
```shell
import re
from selenium import webdriver
from selenium.webdriver.support import expected_conditions
from selenium.webdriver.support.wait import WebDriverWait
with webdriver.Chrome() as driver:
driver.get("https://www.selenium.dev")
_ = WebDriverWait(driver, timeout=30).until(expected_conditions.url_matches(pattern="dev"))
assert re.search("dev", driver.current_url), "url did not match"
will match the predicate immediately
```
### Relevant log output
```shell
Not Applicable.
```
### Operating System
N/A
### Selenium version
4.7.2
### What are the browser(s) and version(s) where you see this issue?
Not Applicable.
### What are the browser driver(s) and version(s) where you see this issue?
Not Applicable.
### Are you using Selenium Grid?
Not Applicable. | 1.0 | [🐛 Bug]: `url_matches` expected condition is using `re.search` possibly accidentally - ### What happened?
According to the docstring of [`url_matches`](https://github.com/SeleniumHQ/selenium/blob/trunk/py/selenium/webdriver/support/expected_conditions.py#L96) the pattern should be matched against the url _exactly_ - Is a `re.search` sufficient here over `re.fullmatch` ?
```python
"""An expectation for checking the current url.
pattern is the expected pattern, which must be an exact match
returns True if the url matches, false otherwise.
"""
```
This may be as expected; either way we should either update the docstring if the current search behaviour is adequate, or move to `re.fullmatch` if the implementation is slightly off.
There's also a `url_to_be` that relies on equality, which is slightly less usable for polling against dynamic aspects of a url, so the docstring might just have been a copy-paste mistake there.
### How can we reproduce the issue?
```shell
import re
from selenium import webdriver
from selenium.webdriver.support import expected_conditions
from selenium.webdriver.support.wait import WebDriverWait
with webdriver.Chrome() as driver:
driver.get("https://www.selenium.dev")
_ = WebDriverWait(driver, timeout=30).until(expected_conditions.url_matches(pattern="dev"))
assert re.search("dev", driver.current_url), "url did not match"
will match the predicate immediately
```
### Relevant log output
```shell
Not Applicable.
```
### Operating System
N/A
### Selenium version
4.7.2
### What are the browser(s) and version(s) where you see this issue?
Not Applicable.
### What are the browser driver(s) and version(s) where you see this issue?
Not Applicable.
### Are you using Selenium Grid?
Not Applicable. | defect | url matches expected condition is using re search possibly accidentally what happened according to the docstring of the pattern should be matched against the url exactly is a re search sufficient here over re fullmatch python an expectation for checking the current url pattern is the expected pattern which must be an exact match returns true if the url matches false otherwise this may be as expected eitherway we should either update the docstring if the current searching is adequate or move to re fullmatch maybe if the implementation is slightly off theres also a url to be that relies on equality which is slightly less usable for polling against dynamic aspects of a url so the docstring might just of been a copy paste mistake there how can we reproduce the issue shell import re from selenium import webdriver from selenium webdriver support import expected conditions from selenium webdriver support wait import webdriverwait with webdriver chrome as driver driver get webdriverwait driver timeout until expected conditions url matches pattern dev assert re search dev driver current url url did not match will match the predicate immediately relevant log output shell not applicable operating system n a selenium version what are the browser s and version s where you see this issue not applicable what are the browser driver s and version s where you see this issue not applicable are you using selenium grid not applicable | 1 |
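The `re.search` versus `re.fullmatch` distinction described in the Selenium issue above can be demonstrated with the standard library alone; this is an illustrative sketch and does not touch Selenium itself:

```python
import re

url = "https://www.selenium.dev/documentation/"

# re.search succeeds if the pattern occurs anywhere in the string,
# so a loose pattern like "dev" matches this URL immediately.
print(re.search("dev", url) is not None)        # True

# re.fullmatch succeeds only if the pattern consumes the whole string,
# which is what "must be an exact match" in the docstring implies.
print(re.fullmatch("dev", url) is not None)     # False
print(re.fullmatch(r"https://www\.selenium\.dev/.*", url) is not None)  # True
```

Whichever behaviour is intended, the docstring and the implementation should agree; the sketch just makes the gap observable.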
228,632 | 17,467,851,183 | IssuesEvent | 2021-08-06 19:46:51 | jbold569/profile | https://api.github.com/repos/jbold569/profile | closed | Agent Based Logging | documentation.blog | title: Agent Based Logging
type: tech
description: Whether you're working in site reliability, business intelligence, or security, one thing rings true. Logs are king. They are a window into your operations, providing insights into access, change, performance, who, what, when, where, and why. However, just as logs can be invaluable they can just as easily be burdensome and costly. Poor log hygiene plagues many organizations. While many business functions benefit from logs, they don't benefit from all logs and the excess translates directly to cost. In this post we'll explore some options for logging instrumentation that aid in filtering, routing, and maintaining the flow of logs at the host level.
| 1.0 | Agent Based Logging - title: Agent Based Logging
type: tech
description: Whether you're working in site reliability, business intelligence, or security, one thing rings true. Logs are king. They are a window into your operations, providing insights into access, change, performance, who, what, when, where, and why. However, just as logs can be invaluable they can just as easily be burdensome and costly. Poor log hygiene plagues many organizations. While many business functions benefit from logs, they don't benefit from all logs and the excess translates directly to cost. In this post we'll explore some options for logging instrumentation that aid in filtering, routing, and maintaining the flow of logs at the host level.
| non_defect | agent based logging title agent based logging type tech description whether you re working in site reliability business intelligence or security one thing rings true logs are king they are a window into your operations providing insights into access change performance who what when where and why however just as logs can be invaluable they can just as easily be burdensome and costly poor log hygiene plagues many organizations while many business functions benefit from logs they don t benefit from all logs and the excess translates directly to cost in this post we ll explore some options for logging instrumentation that aid in filtering routing and maintaining the flow of logs at the host level | 0 |
48,304 | 13,068,430,598 | IssuesEvent | 2020-07-31 03:33:17 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | closed | wimpsim-reader - default options are invalid (Trac #2155) | Migrated from Trac combo simulation defect | In [http://software.icecube.wisc.edu/documentation/inspect/wimpsim_reader.html?highlight=i3wimpsim#I3WimpSimReader I3WimpSimReader] we can read:
Param EndMJD: Default = nan, MJD to end simulation; if unspecified: read everything
But if I try not to set it (and take the NAN default) I receive this error
```text
ERROR (dataclasses): Calling with NAN not possible; will do nothing (I3Time.cxx:142 in void I3Time::SetModJulianTimeDouble(double))
```
The same is for `StartMJD`.
Migrated from https://code.icecube.wisc.edu/ticket/2155
```json
{
"status": "closed",
"changetime": "2019-02-13T14:15:23",
"description": "In [http://software.icecube.wisc.edu/documentation/inspect/wimpsim_reader.html?highlight=i3wimpsim#I3WimpSimReader I3WimpSimReader] we can read:\n\n Param EndMJD:\tDefault = nan, MJD to end simulation; if unspecified: read everything\n\nBut if I try not to set it (and take the NAN default) I receive this error\n\n\n{{{\nERROR (dataclasses): Calling with NAN not possible; will do nothing (I3Time.cxx:142 in void I3Time::SetModJulianTimeDouble(double))\n}}}\n\nThe same is for `StartMJD`.\n\n",
"reporter": "grenzi",
"cc": "",
"resolution": "fixed",
"_ts": "1550067323910946",
"component": "combo simulation",
"summary": "wimpsim-reader - default options are invalid",
"priority": "normal",
"keywords": "",
"time": "2018-05-17T15:48:52",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
| 1.0 | wimpsim-reader - default options are invalid (Trac #2155) - In [http://software.icecube.wisc.edu/documentation/inspect/wimpsim_reader.html?highlight=i3wimpsim#I3WimpSimReader I3WimpSimReader] we can read:
Param EndMJD: Default = nan, MJD to end simulation; if unspecified: read everything
But if I try not to set it (and take the NAN default) I receive this error
```text
ERROR (dataclasses): Calling with NAN not possible; will do nothing (I3Time.cxx:142 in void I3Time::SetModJulianTimeDouble(double))
```
The same is for `StartMJD`.
Migrated from https://code.icecube.wisc.edu/ticket/2155
```json
{
"status": "closed",
"changetime": "2019-02-13T14:15:23",
"description": "In [http://software.icecube.wisc.edu/documentation/inspect/wimpsim_reader.html?highlight=i3wimpsim#I3WimpSimReader I3WimpSimReader] we can read:\n\n Param EndMJD:\tDefault = nan, MJD to end simulation; if unspecified: read everything\n\nBut if I try not to set it (and take the NAN default) I receive this error\n\n\n{{{\nERROR (dataclasses): Calling with NAN not possible; will do nothing (I3Time.cxx:142 in void I3Time::SetModJulianTimeDouble(double))\n}}}\n\nThe same is for `StartMJD`.\n\n",
"reporter": "grenzi",
"cc": "",
"resolution": "fixed",
"_ts": "1550067323910946",
"component": "combo simulation",
"summary": "wimpsim-reader - default options are invalid",
"priority": "normal",
"keywords": "",
"time": "2018-05-17T15:48:52",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
| defect | wimpsim reader default options are invalid trac in we can read param endmjd default nan mjd to end simulation if unspecified read everything but if i try not to set it and take the nan default i receive this error text error dataclasses calling with nan not possible will do nothing cxx in void setmodjuliantimedouble double the same is for startmjd migrated from json status closed changetime description in we can read n n param endmjd tdefault nan mjd to end simulation if unspecified read everything n nbut if i try not to set it and take the nan default i receive this error n n n nerror dataclasses calling with nan not possible will do nothing cxx in void setmodjuliantimedouble double n n nthe same is for startmjd n n reporter grenzi cc resolution fixed ts component combo simulation summary wimpsim reader default options are invalid priority normal keywords time milestone owner nega type defect | 1 |
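The failure in the wimpsim-reader ticket above is a recurring pattern: a NaN default used as a "read everything" sentinel gets handed to a setter that rejects NaN. Below is a minimal sketch of the guard such code needs, using hypothetical names rather than the actual IceCube API:

```python
import math

def resolve_mjd_window(start_mjd=float("nan"), end_mjd=float("nan")):
    """Treat NaN bounds as 'unbounded' instead of passing them downstream."""
    # math.isnan() is the only reliable test: NaN != NaN, so equality
    # comparisons against float("nan") never succeed.
    start = 0.0 if math.isnan(start_mjd) else start_mjd
    end = float("inf") if math.isnan(end_mjd) else end_mjd
    return (start, end)

print(resolve_mjd_window())                  # (0.0, inf): read everything
print(resolve_mjd_window(55000.0, 56000.0))  # explicit window passes through
```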
40,227 | 9,916,851,799 | IssuesEvent | 2019-06-28 21:22:08 | vector-im/riot-web | https://api.github.com/repos/vector-im/riot-web | closed | If you're trying to register but can't contact the default HS, the UX doesn't make it very clear how to proceed. | bug defect p1 type:registration ui/ux | <img width="719" alt="image" src="https://user-images.githubusercontent.com/1922197/59196000-669d3200-8b85-11e9-92cf-b48842f9ce1c.png">
The spinner just spins; if you want to register _somewhere_ your best bet is to click 'advanced', but that isn't obvious.
| 1.0 | If you're trying to register but can't contact the default HS, the UX doesn't make it very clear how to proceed. - <img width="719" alt="image" src="https://user-images.githubusercontent.com/1922197/59196000-669d3200-8b85-11e9-92cf-b48842f9ce1c.png">
The spinner just spins; if you want to register _somewhere_ your best bet is to click 'advanced', but that isn't obvious.
| defect | if you re trying to register but can t contact the default hs the ux doesn t make it very clear how to proceed img width alt image src the spinner just spins if you want to register somewhere your best bet is to click advanced but that isn t obvious | 1 |
13,763 | 2,782,632,797 | IssuesEvent | 2015-05-06 19:06:15 | canadainc/quran10 | https://api.github.com/repos/canadainc/quran10 | reopened | If you try to start a download session while another one is already happening it might corrupt the download | auto-migrated Component-Logic Milestone-Release4.0 Priority-High Security Type-Defect | ```
Title: Step back
Created Date: May 11, 2014 10:00:13 PM
File Bundle: Filebundle[id=1692472, name=quran_3520]
Release: 3.5.2
Review Body:
I think this update fixed somethings but there is still work to be done. It
shouldn't allow you to download the same surah twice while it downloads. This
causes problems when it finishes and you can't listen to the surah.
```
Original issue reported on code.google.com by `canadai...@gmail.com` on 13 May 2014 at 11:58 | 1.0 | If you try to start a download session while another one is already happening it might corrupt the download - ```
Title: Step back
Created Date: May 11, 2014 10:00:13 PM
File Bundle: Filebundle[id=1692472, name=quran_3520]
Release: 3.5.2
Review Body:
I think this update fixed somethings but there is still work to be done. It
shouldn't allow you to download the same surah twice while it downloads. This
causes problems when it finishes and you can't listen to the surah.
```
Original issue reported on code.google.com by `canadai...@gmail.com` on 13 May 2014 at 11:58 | defect | if you try to start a download session while another one is already happening it might corrupt the download title step back created date may pm file bundle filebundle release review body i think this update fixed somethings but there is still work to be done it shouldn t allow you to download the same surah twice while it downloads this causes problems when it finishes and you can t listen to the surah original issue reported on code google com by canadai gmail com on may at | 1 |
204,291 | 23,238,939,539 | IssuesEvent | 2022-08-03 14:05:48 | Gal-Doron/renovate-test | https://api.github.com/repos/Gal-Doron/renovate-test | opened | WS-2021-0616 (Medium) detected in jackson-databind-2.12.3.jar | security vulnerability | ## WS-2021-0616 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.12.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar</p>
<p>
Dependency Hierarchy:
- pulsar-common-2.8.0.9.jar (Root Library)
- :x: **jackson-databind-2.12.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/renovate-test/commit/18ace59282ac46d518b94191200613dd4490868e">18ace59282ac46d518b94191200613dd4490868e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind before 2.12.6 and 2.13.1 there is DoS when using JDK serialization to serialize JsonNode.
<p>Publish Date: 2021-11-20
<p>URL: <a href=https://github.com/FasterXML/jackson-databind/commit/3ccde7d938fea547e598fdefe9a82cff37fed5cb>WS-2021-0616</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-11-20</p>
<p>Fix Resolution (com.fasterxml.jackson.core:jackson-databind): 2.12.4</p>
<p>Direct dependency fix Resolution (io.streamnative:pulsar-common): 2.8.2.4</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue | True | WS-2021-0616 (Medium) detected in jackson-databind-2.12.3.jar - ## WS-2021-0616 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.12.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar</p>
<p>
Dependency Hierarchy:
- pulsar-common-2.8.0.9.jar (Root Library)
- :x: **jackson-databind-2.12.3.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Gal-Doron/renovate-test/commit/18ace59282ac46d518b94191200613dd4490868e">18ace59282ac46d518b94191200613dd4490868e</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind before 2.12.6 and 2.13.1 there is DoS when using JDK serialization to serialize JsonNode.
<p>Publish Date: 2021-11-20
<p>URL: <a href=https://github.com/FasterXML/jackson-databind/commit/3ccde7d938fea547e598fdefe9a82cff37fed5cb>WS-2021-0616</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2021-11-20</p>
<p>Fix Resolution (com.fasterxml.jackson.core:jackson-databind): 2.12.4</p>
<p>Direct dependency fix Resolution (io.streamnative:pulsar-common): 2.8.2.4</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue | non_defect | ws medium detected in jackson databind jar ws medium severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy pulsar common jar root library x jackson databind jar vulnerable library found in head commit a href found in base branch main vulnerability details fasterxml jackson databind before and there is dos when using jdk serialization to serialize jsonnode publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind direct dependency fix resolution io streamnative pulsar common rescue worker helmet automatic remediation is available for this issue | 0 |
11,346 | 2,649,303,795 | IssuesEvent | 2015-03-14 19:44:29 | AsyncHttpClient/async-http-client | https://api.github.com/repos/AsyncHttpClient/async-http-client | closed | NullPointerException with GrizzlyAsyncHttpProvider(v1.8) when WebSocket is closed | Defect Grizzly | The Grizzly ProtocolHandler (grizzly-websockets-2.3.18) org.glassfish.grizzly.websockets.ProtocolHandler has a bug when it closes the webSocket.
```
public GrizzlyFuture<DataFrame> close(int code, String reason) {
return send(new ClosingFrame(code, reason),
new EmptyCompletionHandler<DataFrame>() {
@Override
public void failed(final Throwable throwable) {
webSocket.onClose(null);
}
@Override
public void completed(DataFrame result) {
if (!maskData) {
webSocket.onClose(null);
}
}
});
}
```
The DataFrame is not set, and when it reaches the GrizzlyAsyncHttpProvider it throws a NullPointerException.
```
public void onClose(org.glassfish.grizzly.websockets.WebSocket gWebSocket, DataFrame dataFrame) {
try {
if (ahcListener instanceof WebSocketCloseCodeReasonListener) {
ClosingFrame cf = ClosingFrame.class.cast(dataFrame);
WebSocketCloseCodeReasonListener.class.cast(ahcListener).onClose(webSocket, cf.getCode(), cf.getReason());
} else {
ahcListener.onClose(webSocket);
}
} catch (Throwable e) {
ahcListener.onError(e);
}
}
NullPointerException with GrizzlyAsyncHttpProvider(v1.8) when WebSocket is closed - The Grizzly ProtocolHandler (grizzly-websockets-2.3.18) org.glassfish.grizzly.websockets.ProtocolHandler has a bug when it closes the webSocket.
```
public GrizzlyFuture<DataFrame> close(int code, String reason) {
return send(new ClosingFrame(code, reason),
new EmptyCompletionHandler<DataFrame>() {
@Override
public void failed(final Throwable throwable) {
webSocket.onClose(null);
}
@Override
public void completed(DataFrame result) {
if (!maskData) {
webSocket.onClose(null);
}
}
});
}
```
The DataFrame is not set, and when it reaches the GrizzlyAsyncHttpProvider it throws a NullPointerException.
```
public void onClose(org.glassfish.grizzly.websockets.WebSocket gWebSocket, DataFrame dataFrame) {
try {
if (ahcListener instanceof WebSocketCloseCodeReasonListener) {
ClosingFrame cf = ClosingFrame.class.cast(dataFrame);
WebSocketCloseCodeReasonListener.class.cast(ahcListener).onClose(webSocket, cf.getCode(), cf.getReason());
} else {
ahcListener.onClose(webSocket);
}
} catch (Throwable e) {
ahcListener.onError(e);
}
}
``` | defect | nullpointerexception with grizzlyasynchttpprovider when websocket is closed the grizzly protocolhandler grizzly websockets org glassfish grizzly websockets protocolhandler has a bug when it close the websocket public grizzlyfuture close int code string reason return send new closingframe code reason new emptycompletionhandler override public void failed final throwable throwable websocket onclose null override public void completed dataframe result if maskdata websocket onclose null the dataframe is not set and when it comes to the grizzlyasynchttpprovider it throws nullpointerexception public void onclose org glassfish grizzly websockets websocket gwebsocket dataframe dataframe try if ahclistener instanceof websocketclosecodereasonlistener closingframe cf closingframe class cast dataframe websocketclosecodereasonlistener class cast ahclistener onclose websocket cf getcode cf getreason else ahclistener onclose websocket catch throwable e ahclistener onerror e | 1 |
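The Grizzly report above is an instance of a general callback-contract hazard: a completion handler invokes a listener with a null frame, and the listener casts it unconditionally. Here is a language-neutral sketch of the defensive fix in Python, with hypothetical names that only mirror the Java shapes involved:

```python
class ClosingFrame:
    """Stand-in for the Java ClosingFrame type in the report above."""
    def __init__(self, code=1000, reason=""):
        self.code = code
        self.reason = reason

def on_close(frame):
    # Unconditionally reading frame.code here would reproduce the
    # NullPointerException when the handler passes None on failure paths;
    # guard with a default close status (1000 = normal closure) instead.
    if frame is None:
        return (1000, "")
    return (frame.code, frame.reason)

print(on_close(None))                              # (1000, '')
print(on_close(ClosingFrame(1001, "going away")))  # (1001, 'going away')
```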
50,566 | 13,187,582,304 | IssuesEvent | 2020-08-13 03:53:24 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | closed | coverage - dst - gcov is getting stuck on I3DST.cxx (Trac #951) | Migrated from Trac cmake defect | 37 hours is way too long to process a file
[https://groups.google.com/forum/#!topic/gnu.gcc.help/aS3mQGzGE_4 This thread] has a pointer.
<details>
<summary><em>Migrated from https://code.icecube.wisc.edu/ticket/951
, reported by nega and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-05-18T03:48:00",
"description": "37 hours is way too long to process a file\n\n[https://groups.google.com/forum/#!topic/gnu.gcc.help/aS3mQGzGE_4 This thread] has a pointer.",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1431920880362887",
"component": "cmake",
"summary": "coverage - dst - gcov is getting stuck on I3DST.cxx",
"priority": "major",
"keywords": "",
"time": "2015-04-29T21:27:09",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
| 1.0 | coverage - dst - gcov is getting stuck on I3DST.cxx (Trac #951) - 37 hours is way too long to process a file
[https://groups.google.com/forum/#!topic/gnu.gcc.help/aS3mQGzGE_4 This thread] has a pointer.
<details>
<summary><em>Migrated from https://code.icecube.wisc.edu/ticket/951
, reported by nega and owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-05-18T03:48:00",
"description": "37 hours is way too long to process a file\n\n[https://groups.google.com/forum/#!topic/gnu.gcc.help/aS3mQGzGE_4 This thread] has a pointer.",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1431920880362887",
"component": "cmake",
"summary": "coverage - dst - gcov is getting stuck on I3DST.cxx",
"priority": "major",
"keywords": "",
"time": "2015-04-29T21:27:09",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
| defect | coverage dst gcov is getting stuck on cxx trac hours is way too long to process a file has a pointer migrated from reported by nega and owned by nega json status closed changetime description hours is way too long to process a file n n has a pointer reporter nega cc resolution fixed ts component cmake summary coverage dst gcov is getting stuck on cxx priority major keywords time milestone owner nega type defect | 1 |
95,296 | 8,555,445,391 | IssuesEvent | 2018-11-08 10:03:39 | humera987/FXLabs-Test-Automation | https://api.github.com/repos/humera987/FXLabs-Test-Automation | closed | testing8 : ApiV1OrgsByUserGetPathParamPageMysqlSqlInjectionTimebound | testing8 testing8 | Project : testing8
Job : UAT
Env : UAT
Region : US_WEST_3
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 08 Nov 2018 09:57:36 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/orgs/by-user?page=' or benchmark(7000000000,charset('abc')) = 0 ; --
Request :
Response :
{
"timestamp" : "2018-11-08T09:57:36.650+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/orgs/by-user"
}
Logs :
Assertion [@ResponseTime < 7000 OR @ResponseTime > 10000] resolved-to [1309 < 7000 OR 1309 > 10000] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed]
--- FX Bot --- | 2.0 | testing8 : ApiV1OrgsByUserGetPathParamPageMysqlSqlInjectionTimebound - Project : testing8
Job : UAT
Env : UAT
Region : US_WEST_3
Result : fail
Status Code : 404
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 08 Nov 2018 09:57:36 GMT]}
Endpoint : http://13.56.210.25/api/v1/api/v1/orgs/by-user?page=' or benchmark(7000000000,charset('abc')) = 0 ; --
Request :
Response :
{
"timestamp" : "2018-11-08T09:57:36.650+0000",
"status" : 404,
"error" : "Not Found",
"message" : "No message available",
"path" : "/api/v1/api/v1/orgs/by-user"
}
Logs :
Assertion [@ResponseTime < 7000 OR @ResponseTime > 10000] resolved-to [1309 < 7000 OR 1309 > 10000] result [Passed]Assertion [@StatusCode != 404] resolved-to [404 != 404] result [Failed]
--- FX Bot --- | non_defect | project job uat env uat region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options content type transfer encoding date endpoint or benchmark charset abc request response timestamp status error not found message no message available path api api orgs by user logs assertion resolved to result assertion resolved to result fx bot | 0 |
4,854 | 2,610,158,445 | IssuesEvent | 2015-02-26 18:50:16 | chrsmith/republic-at-war | https://api.github.com/repos/chrsmith/republic-at-war | closed | Graphics Glitch | auto-migrated Priority-Medium Type-Defect | ```
Destroyed CIS research has Rebel model
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 3:01 | 1.0 | Graphics Glitch - ```
Destroyed CIS research has Rebel model
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 30 Jan 2011 at 3:01 | defect | graphics glitch destoyed cis reserch has rebel model original issue reported on code google com by gmail com on jan at | 1 |
48,706 | 5,967,610,500 | IssuesEvent | 2017-05-30 16:17:46 | lucapette/deloominator | https://api.github.com/repos/lucapette/deloominator | opened | Fix travis-ci build | bug testing | Currently the build is [broken](https://travis-ci.org/lucapette/deloominator/builds/237580736) because `yarn` isn't installed on the travis image. I suspect there may be other roadblocks too. | 1.0 | Fix travis-ci build - Currently the build is [broken](https://travis-ci.org/lucapette/deloominator/builds/237580736) because `yarn` isn't installed on the travis image. I suspect there may be other roadblocks too. | non_defect | fix travis ci build currently the build is because yarn isn t installed on the travis image i suspect there may be other roadblocks too | 0 |
447,849 | 31,724,594,172 | IssuesEvent | 2023-09-10 20:02:22 | armadaproject/armada | https://api.github.com/repos/armadaproject/armada | closed | Add note about signed commits to Contributor documentation | type/documentation good first issue | The Armada repo now uses the DCO plug-in in its Pull Request checks, and requires that all future commits have a "Signed-Off" attribute in every commit in PRs - see https://github.com/apps/dco .
A prominent note should be added to the `CONTRIBUTING.md` doc (and perhaps elsewhere in `docs/` and `developer/`) that informs potential contributors that they will need to use the `--signoff` option to `git commit` (maybe add a link to the Git docs for commit) for all commits in their Pull Request branch. It will be much easier for them to do this up-front, rather than have to fix/amend their commits later, due to PR check failures. | 1.0 | Add note about signed commits to Contributor documentation - The Armada repo now uses the DCO plug-in in its Pull Request checks, and requires that all future commits have a "Signed-Off" attribute in every commit in PRs - see https://github.com/apps/dco .
A prominent note should be added to the `CONTRIBUTING.md` doc (and perhaps elsewhere in `docs/` and `developer/`) that informs potential contributors that they will need to use the `--signoff` option to `git commit` (maybe add a link to the Git docs for commit) for all commits in their Pull Request branch. It will be much easier for them to do this up-front, rather than have to fix/amend their commits later, due to PR check failures. | non_defect | add note about signed commits to contributor documentation the armada repo now uses the dco plug in in its pull request checks and requires that all future commits have a signed off attribute in every commit in prs see a prominent note should be added to the contributing md doc and perhaps elsewhere in docs and developer that informs potential contributors that they will need to use the signoff option to git commit maybe add a link to the git docs for commit for all commits in their pull request branch it will be much easier for them to do this up front rather than have to fix amend their commits later due to pr check failures | 0
3,417 | 2,610,062,287 | IssuesEvent | 2015-02-26 18:18:19 | chrsmith/jsjsj122 | https://api.github.com/repos/chrsmith/jsjsj122 | opened | 黄岩治疗男性不孕不育 | auto-migrated Priority-Medium Type-Defect | ```
Huangyan treatment for male infertility [Taizhou Wuzhou Reproductive Hospital] 24-hour health consultation
hotline: 0576-88066933 (QQ 800080609) (WeChat ID tzwzszyy). Hospital address: No. 229 Fengnan Road,
Jiaojiang District, Taizhou (next to the Fengnan roundabout). Bus routes: take bus 104, 108, 118, or 198,
or the Jiaojiang-Jinqing bus, directly to the Fengnan residential area; or take bus 107, 105, 109, 112,
901, or 902 to Xingxing Square and walk to the hospital.
Treatment items: impotence, premature ejaculation, prostatitis, prostatic hyperplasia, balanitis,
a sperm disorder (characters garbled in the source), azoospermia, phimosis, varicocele, gonorrhea, etc.
Taizhou Wuzhou Reproductive Hospital is the largest men's health hospital in Taizhou, with authoritative
experts available for free online consultation, complete professional equipment for male examinations and
treatment, and fees charged strictly according to national standards. Cutting-edge medical equipment, in
step with the world. Authoritative experts, a model of professionalism. Humanized service, with everything
centered on the patient.
For men's health, choose Taizhou Wuzhou Reproductive Hospital: professional men's care for men.
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 7:35 | 1.0 | 黄岩治疗男性不孕不育 - ```
Huangyan treatment for male infertility [Taizhou Wuzhou Reproductive Hospital] 24-hour health consultation
hotline: 0576-88066933 (QQ 800080609) (WeChat ID tzwzszyy). Hospital address: No. 229 Fengnan Road,
Jiaojiang District, Taizhou (next to the Fengnan roundabout). Bus routes: take bus 104, 108, 118, or 198,
or the Jiaojiang-Jinqing bus, directly to the Fengnan residential area; or take bus 107, 105, 109, 112,
901, or 902 to Xingxing Square and walk to the hospital.
Treatment items: impotence, premature ejaculation, prostatitis, prostatic hyperplasia, balanitis,
a sperm disorder (characters garbled in the source), azoospermia, phimosis, varicocele, gonorrhea, etc.
Taizhou Wuzhou Reproductive Hospital is the largest men's health hospital in Taizhou, with authoritative
experts available for free online consultation, complete professional equipment for male examinations and
treatment, and fees charged strictly according to national standards. Cutting-edge medical equipment, in
step with the world. Authoritative experts, a model of professionalism. Humanized service, with everything
centered on the patient.
For men's health, choose Taizhou Wuzhou Reproductive Hospital: professional men's care for men.
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 7:35 | defect | 黄岩治疗男性不孕不育 黄岩治疗男性不孕不育【台州五洲生殖医院】 热线 微信号tzwzszyy 医院地址 台州市 (枫南大转盘旁)乘车线路 、 、 、 , 、 、 、 、 、 ,步行即可到院。 诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,�� �精,无精。包皮包茎,精索静脉曲张,淋病等。 台州五洲生殖医院是台州最大的男科医院,权威专家在线免�� �咨询,拥有专业完善的男科检查治疗设备,严格按照国家标� ��收费。尖端医疗设备,与世界同步。权威专家,成就专业典 范。人性化服务,一切以患者为中心。 看男科就选台州五洲生殖医院,专业男科为男人。 original issue reported on code google com by poweragr gmail com on may at | 1 |
45,968 | 13,146,458,694 | IssuesEvent | 2020-08-08 09:59:53 | shaundmorris/ddf | https://api.github.com/repos/shaundmorris/ddf | closed | CVE-2018-19837 Medium Severity Vulnerability detected by WhiteSource | security vulnerability wontfix | ## CVE-2018-19837 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.11.0</b></p></summary>
<p>
<p>:rainbow: Node.js bindings to libsass</p>
<p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Library Source Files (121)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /ddf/ui/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/output.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/output.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_values.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/bind.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_node.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/constants.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/plugins.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/eval.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/base.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/subset_map.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/extend.cpp
- /ddf/ui/node_modules/node-sass/src/custom_importer_bridge.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/contrib/plugin.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/lexer.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_util.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_superselector.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/cencode.c
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/utf8_string.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/file.cpp
- /ddf/ui/node_modules/node-sass/src/callback_bridge.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/subset_map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass2scss.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_functions.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/emitter.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/environment.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/paths.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/expand.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/listize.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_unification.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/source_map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/inspect.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/source_map.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/list.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/json.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_functions.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_util.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/position.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/listize.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/eval.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/string.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/environment.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/boolean.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/emitter.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/extend.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/lexer.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/functions.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/debugger.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/factory.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/color.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/operation.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/constants.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/boolean.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/context.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/value.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/units.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/cssize.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/parser.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/functions.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/position.hpp
- /ddf/ui/node_modules/node-sass/src/sass_context_wrapper.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_context.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/node.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/cssize.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp
- /ddf/ui/node_modules/node-sass/src/binding.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/list.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/functions.h
- /ddf/ui/node_modules/node-sass/src/custom_function_bridge.cpp
- /ddf/ui/node_modules/node-sass/src/custom_importer_bridge.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/color_maps.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/utf8_string.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/values.cpp
- /ddf/ui/node_modules/node-sass/src/sass_context_wrapper.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/b64/encode.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/base64vlq.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/number.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/c99func.c
- /ddf/ui/node_modules/node-sass/src/libsass/src/node.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/util.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/backtrace.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/values.h
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_subset_map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass2scss.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/null.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/expand.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_value.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/context.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/context.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_c.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/color_maps.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/inspect.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/script/test-leaks.pl
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_c.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/util.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/file.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/units.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/plugins.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/color.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_value.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/debug.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/parser.hpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In LibSass prior to 3.5.5, Sass::Eval::operator()(Sass::Binary_Expression*) inside eval.cpp allows attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, because of certain incorrect parsing of '%' as a modulo operator in parser.cpp.
<p>Publish Date: 2018-12-04
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19837>CVE-2018-19837</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: Required
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19837">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19837</a></p>
<p>Fix Resolution: 3.5.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-19837 Medium Severity Vulnerability detected by WhiteSource - ## CVE-2018-19837 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.11.0</b></p></summary>
<p>
<p>:rainbow: Node.js bindings to libsass</p>
<p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Library Source Files (121)</summary>
<p></p>
<p> * The source files were matched to this source library based on a best effort match. Source libraries are selected from a list of probable public libraries.</p>
<p>
- /ddf/ui/node_modules/node-sass/src/libsass/src/error_handling.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/output.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/output.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_values.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/check_nesting.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/check_nesting.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/bind.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_node.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/constants.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/plugins.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/eval.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/base.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/prelexer.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/subset_map.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/extend.cpp
- /ddf/ui/node_modules/node-sass/src/custom_importer_bridge.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/contrib/plugin.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/lexer.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_util.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_superselector.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/cencode.c
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast_fwd_decl.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/utf8_string.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/file.cpp
- /ddf/ui/node_modules/node-sass/src/callback_bridge.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/subset_map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass2scss.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_functions.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/emitter.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/environment.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast_def_macros.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/paths.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/expand.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/listize.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_unification.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/source_map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/inspect.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/source_map.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/list.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/json.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_functions.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_util.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/position.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/listize.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/error_handling.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/eval.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/string.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/environment.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/boolean.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/emitter.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/extend.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/lexer.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/functions.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/debugger.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/prelexer.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/factory.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/color.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/operation.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/constants.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/boolean.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/context.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/value.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/units.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/cssize.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/parser.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/functions.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/position.hpp
- /ddf/ui/node_modules/node-sass/src/sass_context_wrapper.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_context.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/node.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast_fwd_decl.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/cssize.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/remove_placeholders.cpp
- /ddf/ui/node_modules/node-sass/src/binding.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/list.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/functions.h
- /ddf/ui/node_modules/node-sass/src/custom_function_bridge.cpp
- /ddf/ui/node_modules/node-sass/src/custom_importer_bridge.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/color_maps.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/utf8_string.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/sass_value_wrapper.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/ast.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/values.cpp
- /ddf/ui/node_modules/node-sass/src/sass_context_wrapper.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/b64/encode.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/base64vlq.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/number.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/c99func.c
- /ddf/ui/node_modules/node-sass/src/libsass/src/node.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/remove_placeholders.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/util.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/memory/SharedPtr.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/backtrace.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/values.h
- /ddf/ui/node_modules/node-sass/src/libsass/test/test_subset_map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass2scss.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_context.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/memory/SharedPtr.cpp
- /ddf/ui/node_modules/node-sass/src/sass_types/null.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/expand.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_value.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/context.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/include/sass/context.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_c.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/sass_values.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/color_maps.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/inspect.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/script/test-leaks.pl
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_c.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/util.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/file.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/map.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/units.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/plugins.hpp
- /ddf/ui/node_modules/node-sass/src/sass_types/color.h
- /ddf/ui/node_modules/node-sass/src/libsass/src/to_value.cpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/debug.hpp
- /ddf/ui/node_modules/node-sass/src/libsass/src/parser.hpp
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In LibSass prior to 3.5.5, Sass::Eval::operator()(Sass::Binary_Expression*) inside eval.cpp allows attackers to cause a denial-of-service resulting from stack consumption via a crafted sass file, because of certain incorrect parsing of '%' as a modulo operator in parser.cpp.
<p>Publish Date: 2018-12-04
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19837>CVE-2018-19837</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
  - Attack Vector: Network
  - Attack Complexity: Low
  - Privileges Required: None
  - User Interaction: Required
  - Scope: Unchanged
- Impact Metrics:
  - Confidentiality Impact: None
  - Integrity Impact: None
  - Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19837">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-19837</a></p>
<p>Fix Resolution: 3.5.5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve medium severity vulnerability detected by whitesource cve medium severity vulnerability vulnerable library node rainbow node js bindings to libsass library home page a href library source files the source files were matched to this source library based on a best effort match source libraries are selected from a list of probable public libraries ddf ui node modules node sass src libsass src error handling cpp ddf ui node modules node sass src libsass src output cpp ddf ui node modules node sass src libsass src output hpp ddf ui node modules node sass src libsass src sass values hpp ddf ui node modules node sass src libsass src check nesting cpp ddf ui node modules node sass src libsass src check nesting hpp ddf ui node modules node sass src libsass src bind cpp ddf ui node modules node sass src libsass test test node cpp ddf ui node modules node sass src libsass src constants cpp ddf ui node modules node sass src libsass src plugins cpp ddf ui node modules node sass src libsass src eval hpp ddf ui node modules node sass src libsass include sass base h ddf ui node modules node sass src libsass src prelexer hpp ddf ui node modules node sass src libsass src subset map hpp ddf ui node modules node sass src libsass src extend cpp ddf ui node modules node sass src custom importer bridge cpp ddf ui node modules node sass src libsass contrib plugin cpp ddf ui node modules node sass src libsass src lexer hpp ddf ui node modules node sass src libsass src sass util cpp ddf ui node modules node sass src libsass test test superselector cpp ddf ui node modules node sass src libsass src cencode c ddf ui node modules node sass src libsass src ast fwd decl hpp ddf ui node modules node sass src libsass src string hpp ddf ui node modules node sass src libsass src file cpp ddf ui node modules node sass src callback bridge h ddf ui node modules 
node sass src libsass src subset map cpp ddf ui node modules node sass src libsass include h ddf ui node modules node sass src libsass src sass functions hpp ddf ui node modules node sass src libsass src emitter cpp ddf ui node modules node sass src libsass src environment cpp ddf ui node modules node sass src libsass src ast def macros hpp ddf ui node modules node sass src libsass src paths hpp ddf ui node modules node sass src libsass src expand cpp ddf ui node modules node sass src libsass src listize cpp ddf ui node modules node sass src libsass test test unification cpp ddf ui node modules node sass src libsass src source map cpp ddf ui node modules node sass src libsass src inspect cpp ddf ui node modules node sass src libsass src source map hpp ddf ui node modules node sass src sass types list h ddf ui node modules node sass src libsass src json cpp ddf ui node modules node sass src libsass src sass functions cpp ddf ui node modules node sass src libsass src sass util hpp ddf ui node modules node sass src libsass src position cpp ddf ui node modules node sass src libsass src listize hpp ddf ui node modules node sass src libsass src error handling hpp ddf ui node modules node sass src libsass src eval cpp ddf ui node modules node sass src sass types string cpp ddf ui node modules node sass src libsass src environment hpp ddf ui node modules node sass src sass types boolean h ddf ui node modules node sass src libsass src emitter hpp ddf ui node modules node sass src libsass src extend hpp ddf ui node modules node sass src libsass src lexer cpp ddf ui node modules node sass src libsass src functions cpp ddf ui node modules node sass src libsass src debugger hpp ddf ui node modules node sass src libsass src prelexer cpp ddf ui node modules node sass src sass types factory cpp ddf ui node modules node sass src sass types color cpp ddf ui node modules node sass src libsass src operation hpp ddf ui node modules node sass src libsass src ast cpp ddf ui node modules 
node sass src libsass src sass hpp ddf ui node modules node sass src libsass src constants hpp ddf ui node modules node sass src sass types boolean cpp ddf ui node modules node sass src libsass src context hpp ddf ui node modules node sass src sass types value h ddf ui node modules node sass src libsass src units cpp ddf ui node modules node sass src libsass src cssize hpp ddf ui node modules node sass src libsass src parser cpp ddf ui node modules node sass src libsass src functions hpp ddf ui node modules node sass src libsass src position hpp ddf ui node modules node sass src sass context wrapper cpp ddf ui node modules node sass src libsass src sass context hpp ddf ui node modules node sass src libsass src node cpp ddf ui node modules node sass src libsass src ast fwd decl cpp ddf ui node modules node sass src libsass src cssize cpp ddf ui node modules node sass src libsass src remove placeholders cpp ddf ui node modules node sass src binding cpp ddf ui node modules node sass src sass types list cpp ddf ui node modules node sass src libsass include sass functions h ddf ui node modules node sass src custom function bridge cpp ddf ui node modules node sass src custom importer bridge h ddf ui node modules node sass src libsass src color maps cpp ddf ui node modules node sass src libsass src string cpp ddf ui node modules node sass src sass types sass value wrapper h ddf ui node modules node sass src libsass src ast hpp ddf ui node modules node sass src libsass src values cpp ddf ui node modules node sass src sass context wrapper h ddf ui node modules node sass src libsass src encode h ddf ui node modules node sass src libsass src sass cpp ddf ui node modules node sass src libsass src cpp ddf ui node modules node sass src sass types number cpp ddf ui node modules node sass src libsass src c ddf ui node modules node sass src libsass src node hpp ddf ui node modules node sass src libsass src remove placeholders hpp ddf ui node modules node sass src libsass src util 
cpp ddf ui node modules node sass src libsass src memory sharedptr hpp ddf ui node modules node sass src libsass src backtrace hpp ddf ui node modules node sass src libsass include sass values h ddf ui node modules node sass src libsass test test subset map cpp ddf ui node modules node sass src libsass src cpp ddf ui node modules node sass src libsass src sass context cpp ddf ui node modules node sass src libsass src memory sharedptr cpp ddf ui node modules node sass src sass types null cpp ddf ui node modules node sass src libsass src expand hpp ddf ui node modules node sass src libsass src to value hpp ddf ui node modules node sass src libsass src context cpp ddf ui node modules node sass src libsass include sass context h ddf ui node modules node sass src libsass src to c cpp ddf ui node modules node sass src libsass src sass values cpp ddf ui node modules node sass src libsass src color maps hpp ddf ui node modules node sass src libsass src inspect hpp ddf ui node modules node sass src libsass script test leaks pl ddf ui node modules node sass src libsass src to c hpp ddf ui node modules node sass src libsass src util hpp ddf ui node modules node sass src libsass src file hpp ddf ui node modules node sass src sass types map cpp ddf ui node modules node sass src libsass src units hpp ddf ui node modules node sass src libsass src plugins hpp ddf ui node modules node sass src sass types color h ddf ui node modules node sass src libsass src to value cpp ddf ui node modules node sass src libsass src debug hpp ddf ui node modules node sass src libsass src parser hpp vulnerability details in libsass prior to sass eval operator sass binary expression inside eval cpp allows attackers to cause a denial of service resulting from stack consumption via a crafted sass file because of certain incorrect parsing of as a modulo operator in parser cpp publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low 
privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href fix resolution step up your open source security game with whitesource | 0 |
50,779 | 13,187,739,868 | IssuesEvent | 2020-08-13 04:25:15 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | closed | weighting 'from_simprod' function tries to load missing values (Trac #1356) | Migrated from Trac combo reconstruction defect | icecube.weighting.weighting.from_simprod (weighting.py L578)
after connecting to the simprod database, the function tries to load 'NUGEN::elogmin', 'NUGEN::elogmax', and 'NUGEN::injectionradius', which do not exist in new neutrino generator tables.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1356">https://code.icecube.wisc.edu/ticket/1356</a>, reported by ddouglas and owned by jvansanten</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-09-18T18:29:44",
"description": "icecube.weighting.weighting.from_simprod (weighting.py L578)\nafter connecting to the simprod database, the function tries to load 'NUGEN::elogmin', 'NUGEN::elogmax', and 'NUGEN::injectionradius', which do not exist in new neutrino generator tables.",
"reporter": "ddouglas",
"cc": "",
"resolution": "fixed",
"_ts": "1442600984726412",
"component": "combo reconstruction",
"summary": "weighting 'from_simprod' function tries to load missing values",
"priority": "normal",
"keywords": "weighting, simprod",
"time": "2015-09-18T18:15:17",
"milestone": "",
"owner": "jvansanten",
"type": "defect"
}
```
</p>
</details>
1.0 | defect | 1
63,721 | 17,869,891,094 | IssuesEvent | 2021-09-06 14:09:11 | vector-im/element-web | https://api.github.com/repos/vector-im/element-web | closed | Element loses connection to the server when exiting | T-Defect X-Cannot-Reproduce | ### Steps to reproduce
How to reproduce:
1. Install Element (or open Element)
2. Log in
(I'm using Matrix.org, I'm not sure about other servers)
3. Exit Element somehow
(Everything works fine until this point)
1. You can do this on desktop by either quitting the app from the system tray or letting the computer go to sleep
2. You can do this on web by just restarting the browser
4. Re-open Element
5. Issue will occur ("Connectivity to the server has been lost. Sent messages will be stored until your connection has returned.")
How to fix:
1. Sign out and back in
1. Sign out
2. Select "I don't want my encrypted messages" (I don't know if this is needed, but this is what I did every time)
3. Edit homeserver
4. Select "Other homeserver"
5. Manually type in "matrix.org"
6. Sign in
2. Reinstall Element (Desktop only)
1. I used BCUninstaller every time
### What did you expect?
I would be able to use Element as normal. Connection to the homeserver would not break.
### What happened?
Connection to the homeserver broke, and will not fix on restart.
### Operating system
Windows 10 Build 19043.1165
Microsoft Edge v93.0.961.38 (Official build) (64-bit)
### Application version
Element v1.8.2, Olm v3.2.3
### How did you install the app?
https://element.io/get-started
### Homeserver
matrix.org
### Have you submitted a rageshake?
No, I am unable to download logs with /rageshake due to this error: "Failed to send logs: request failed: CORS request rejected: https://matrix-client.matrix.org/_matrix/client/r0/user/%40tlariba%3Amatrix.org/account_data/m.cross_signing.master ..." | 1.0 | defect | 1
62,542 | 17,035,600,054 | IssuesEvent | 2021-07-05 06:35:44 | jOOQ/jOOQ | https://api.github.com/repos/jOOQ/jOOQ | closed | Manual section about limit-clause contains broken link about Oracle ROWNUM filtering performance | C: Documentation E: All Editions P: Medium R: Fixed T: Defect | Hi,
just read the manual at https://www.jooq.org/doc/3.14/manual/sql-building/sql-statements/select-statement/limit-clause/
The link inside this paragraph
> Side-note: If you're interested in understanding why we chose ROWNUM for Oracle, please refer to this very interesting benchmark, comparing the different approaches of doing pagination in Oracle: http://www.inf.unideb.hu/~gabora/pagination/results.html.
leads to a 404 page. | 1.0 | defect | 1
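For context on the benchmark the (now dead) link pointed to: the classic Oracle approach filters on the ROWNUM pseudocolumn by wrapping the query twice. A minimal sketch of the generated SQL as plain string building (illustrative only, not jOOQ's actual renderer):

```python
def rownum_paginate(inner_sql: str, offset: int, limit: int) -> str:
    """Wrap a query in the classic double-nested Oracle ROWNUM pagination."""
    return (
        "SELECT * FROM ("
        f"SELECT w.*, ROWNUM rnum FROM ({inner_sql}) w "
        f"WHERE ROWNUM <= {offset + limit}"
        f") WHERE rnum > {offset}"
    )
```

The inner `ROWNUM <=` bound lets Oracle stop fetching rows early, which is why this form tends to do well in pagination benchmarks.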
26,711 | 4,777,625,431 | IssuesEvent | 2016-10-27 16:48:53 | wheeler-microfluidics/microdrop | https://api.github.com/repos/wheeler-microfluidics/microdrop | closed | Common AppData dir warnings (Trac #47) | core defect Migrated from Trac | The "devices/plugins does not exist in common AppData dir" warnings pop up on program launch if the user does not have Microdrop installed on their system. These warnings should only be displayed if the application is being run from an exe file (otherwise, the devices and plugins directories should exist in the src tree).
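The proposed behavior (warn only for packaged builds) needs a reliable exe-vs-source check; a common way to do that in Python is the `sys.frozen` attribute that freezers such as py2exe and PyInstaller set. The helper below is a sketch of the idea, not Microdrop's actual code:

```python
import os
import sys

def appdata_warning(path):
    """Return a warning only when running from a frozen exe and `path` is missing."""
    running_from_exe = getattr(sys, "frozen", False)  # set by py2exe / PyInstaller
    if running_from_exe and not os.path.isdir(path):
        return f"{path} does not exist in common AppData dir"
    return None  # running from the source tree: stay quiet
```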
Migrated from http://microfluidics.utoronto.ca/ticket/47
```json
{
"status": "closed",
"changetime": "2015-01-07T21:59:40",
"description": "The \"devices/plugins does not exist in common AppData dir\" warnings pop up on program launch if the user does not have Microdrop installed on their system. These warnings should only be displayed if the application is being run from an exe file (otherwise, the devices and plugins directories should exist in the src tree).",
"reporter": "ryan",
"cc": "",
"resolution": "fixed",
"_ts": "1420667980075591",
"component": "core",
"summary": "Common AppData dir warnings",
"priority": "minor",
"keywords": "",
"version": "0.1",
"time": "2012-02-20T03:35:38",
"milestone": "Microdrop 1.0",
"owner": "ryan",
"type": "defect"
}
```
| 1.0 | defect | 1
383,958 | 11,371,981,354 | IssuesEvent | 2020-01-28 00:12:58 | googleapis/nodejs-pubsub | https://api.github.com/repos/googleapis/nodejs-pubsub | closed | UnhandledPromiseRejectionWarning: Error: Deadline exceeded | :rotating_light: help wanted needs more info priority: p2 type: bug | #### Environment details
- OS: macOS 10.14.5
- Node.js version: v10.16.0
- npm version: 6.9.0
- `@google-cloud/pubsub` version: "^0.30.1",
#### Steps to reproduce
1. download this repo as zip
2. cd samples, npm i
3. node subscriptions.js list
```
(node:88354) UnhandledPromiseRejectionWarning: Error: Deadline exceeded
at Http2CallStream.call.on (/Users/mj/Downloads/nodejs-pubsub-master/samples/node_modules/@grpc/grpc-js/build/src/client.js:101:45)
at Http2CallStream.emit (events.js:203:15)
at process.nextTick (/Users/mj/Downloads/nodejs-pubsub-master/samples/node_modules/@grpc/grpc-js/build/src/call-stream.js:71:22)
at process._tickCallback (internal/process/next_tick.js:61:11)
(node:88354) UnhandledPromiseRejectionWarning: Unhandled promise rejection. This error originated either by throwing inside of an async function without a catch block, or by rejecting a promise which was not handled with .catch(). (rejection id: 1)
(node:88354) [DEP0018] DeprecationWarning: Unhandled promise rejections are deprecated. In the future, promise rejections that are not handled will terminate the Node.js process with a non-zero exit code.
```
followed this [guide](https://github.com/googleapis/nodejs-pubsub#before-you-begin) to config auth, tried at least 3 times.
I'm sure GOOGLE_APPLICATION_CREDENTIALS and service_account are properly configured.
because cloud-pubsub go examples works on my local laptop.
none of examples work at all. always got "Deadline exceeded"
btw, copied some code, deploy to cloud functions, can publish message to cloud pubsub
| 1.0 | non_defect | 0
52,137 | 13,211,392,273 | IssuesEvent | 2020-08-15 22:48:44 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | opened | [Steamshovel] artists python files don't like being called on their own which confuses the documentation (Trac #1736) | Incomplete Migration Migrated from Trac combo core defect | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1736">https://code.icecube.wisc.edu/projects/icecube/ticket/1736</a>, reported by kjmeagher and owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:38",
"_ts": "1550067158057333",
"description": "It is not clear to me exactly how steamshovel loads these artist files but it causes problems with sphinx documentation. These can be tested by running the file directly for example calling `python ${I3_SRC}/CommonVariables/python/artists/direct_hits.py` instead of calling sphinx and it gets the same result.\n\n* common_variables/artists/direct_hits.py\n* common_variables/artists/hit_multiplicity.py\n* common_variables/artists/hit_statistics.py\n* common_variables/artists/track_characteristics.py\n* millipede/artists.py\n* steamshovel/artists/LEDPowerHouse.py\n* steamshovel/artists/ParticleUncertainty.py\n* steamshovel/sessions/IT73.py\n* steamshovel/sessions/Minimum.py\n\nFull error messages below\n{{{\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.common_variables.artists.rst:15: WARNING: autodoc: failed to import module u'icecube.common_variables.artists.direct_hits'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/direct_hits.py\", line 5, in <module>\n class I3DirectHitsValues(PyArtist):\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/direct_hits.py\", line 10, in I3DirectHitsValues\n requiredTypes = [ direct_hits.I3DirectHitsValues ]\nAttributeError: 'module' object has no attribute 'I3DirectHitsValues'\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.common_variables.artists.rst:23: WARNING: autodoc: failed to import module u'icecube.common_variables.artists.hit_multiplicity'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File 
\"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/hit_multiplicity.py\", line 4, in <module>\n class I3HitMultiplicityValues(PyArtist):\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/hit_multiplicity.py\", line 9, in I3HitMultiplicityValues\n requiredTypes = [ hit_multiplicity.I3HitMultiplicityValues ]\nAttributeError: 'module' object has no attribute 'I3HitMultiplicityValues'\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.common_variables.artists.rst:31: WARNING: autodoc: failed to import module u'icecube.common_variables.artists.hit_statistics'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/hit_statistics.py\", line 5, in <module>\n class I3HitStatisticsValues(PyArtist):\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/hit_statistics.py\", line 10, in I3HitStatisticsValues\n requiredTypes = [ hit_statistics.I3HitStatisticsValues ]\nAttributeError: 'module' object has no attribute 'I3HitStatisticsValues'\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.common_variables.artists.rst:39: WARNING: autodoc: failed to import module u'icecube.common_variables.artists.track_characteristics'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/track_characteristics.py\", line 5, in <module>\n class I3TrackCharacteristicsValues(PyArtist):\n File 
\"/Users/kmeagher/icecube/combo/release/lib/icecube/common_variables/artists/track_characteristics.py\", line 10, in I3TrackCharacteristicsValues\n requiredTypes = [ track_characteristics.I3TrackCharacteristicsValues ]\nAttributeError: 'module' object has no attribute 'I3TrackCharacteristicsValues'\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.ipdf.rst:23: WARNING: autodoc: failed to import module u'icecube.ipdf.test_bug'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/ipdf/test_bug.py\", line 3, in <module>\n scenario = window.gl.scenario\nNameError: name 'window' is not defined\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.millipede.rst:15: WARNING: autodoc: failed to import module u'icecube.millipede.artists'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/millipede/artists.py\", line 7, in <module>\n from icecube.steamshovel.artists.MPLArtist import MPLArtist\nImportError: No module named MPLArtist\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.steamshovel.artists.rst:70: WARNING: autodoc: failed to import module u'icecube.steamshovel.artists.LEDPowerHouse'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/steamshovel/artists/LEDPowerHouse.py\", line 9, in <module>\n import serial\nImportError: No module named 
serial\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.steamshovel.artists.rst:78: WARNING: autodoc: failed to import module u'icecube.steamshovel.artists.ParticleUncertainty'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/steamshovel/artists/ParticleUncertainty.py\", line 6, in <module>\n from .AnimatedParticle import PosAtTime\nImportError: No module named AnimatedParticle\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.steamshovel.sessions.rst:15: WARNING: autodoc: failed to import module u'icecube.steamshovel.sessions.IT73'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/steamshovel/sessions/IT73.py\", line 124, in <module>\n _dumpScenario()\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/steamshovel/sessions/IT73.py\", line 6, in _dumpScenario\n scenario = window.gl.scenario\nNameError: global name 'window' is not defined\n/Users/kmeagher/icecube/combo/release/sphinx_build/source/python/icecube.steamshovel.sessions.rst:23: WARNING: autodoc: failed to import module u'icecube.steamshovel.sessions.Minimum'; the following exception was raised:\nTraceback (most recent call last):\n File \"/private/var/folders/rc/g_4_lyp9039cj1586zzg88f40000gn/T/pip-build-A327aa/sphinx/sphinx/ext/autodoc.py\", line 385, in import_object\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/steamshovel/sessions/Minimum.py\", line 47, in <module>\n _dumpScenario()\n File \"/Users/kmeagher/icecube/combo/release/lib/icecube/steamshovel/sessions/Minimum.py\", line 6, in 
_dumpScenario\n scenario = window.gl.scenario\nNameError: global name 'window' is not defined\n}}}",
"reporter": "kjmeagher",
"cc": "",
"resolution": "fixed",
"time": "2016-06-10T07:42:38",
"component": "combo core",
"summary": "[Steamshovel] artists python files don't like being called on their own which confuses the documentation",
"priority": "normal",
"keywords": "documentation",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
| 1.0 |
"reporter": "kjmeagher",
"cc": "",
"resolution": "fixed",
"time": "2016-06-10T07:42:38",
"component": "combo core",
"summary": "[Steamshovel] artists python files don't like being called on their own which confuses the documentation",
"priority": "normal",
"keywords": "documentation",
"milestone": "",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
| defect | artists python files don t like being called on their own which confuses the documentation trac migrated from json status closed changetime ts description it is not clear to me exactly how steamshovel loads these artist files but it causes problems with sphinx documentation these can be tested by running the file directly for example calling python src commonvariables python artists direct hits py instead of calling sphinx and it gets the same result n n common variables artists direct hits py n common variables artists hit multiplicity py n common variables artists hit statistics py n common variables artists track characteristics py n millipede artists py n steamshovel artists ledpowerhouse py n steamshovel artists particleuncertainty py n steamshovel sessions py n steamshovel sessions minimum py n nfull error messages below n n users kmeagher icecube combo release sphinx build source python icecube common variables artists rst warning autodoc failed to import module u icecube common variables artists direct hits the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube common variables artists direct hits py line in n class pyartist n file users kmeagher icecube combo release lib icecube common variables artists direct hits py line in n requiredtypes nattributeerror module object has no attribute n users kmeagher icecube combo release sphinx build source python icecube common variables artists rst warning autodoc failed to import module u icecube common variables artists hit multiplicity the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube common variables artists hit multiplicity py line in n class pyartist n file users kmeagher 
icecube combo release lib icecube common variables artists hit multiplicity py line in n requiredtypes nattributeerror module object has no attribute n users kmeagher icecube combo release sphinx build source python icecube common variables artists rst warning autodoc failed to import module u icecube common variables artists hit statistics the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube common variables artists hit statistics py line in n class pyartist n file users kmeagher icecube combo release lib icecube common variables artists hit statistics py line in n requiredtypes nattributeerror module object has no attribute n users kmeagher icecube combo release sphinx build source python icecube common variables artists rst warning autodoc failed to import module u icecube common variables artists track characteristics the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube common variables artists track characteristics py line in n class pyartist n file users kmeagher icecube combo release lib icecube common variables artists track characteristics py line in n requiredtypes nattributeerror module object has no attribute n users kmeagher icecube combo release sphinx build source python icecube ipdf rst warning autodoc failed to import module u icecube ipdf test bug the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube ipdf test bug py line in n scenario window gl scenario nnameerror name window is not defined n users kmeagher icecube combo release sphinx build source python 
icecube millipede rst warning autodoc failed to import module u icecube millipede artists the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube millipede artists py line in n from icecube steamshovel artists mplartist import mplartist nimporterror no module named mplartist n users kmeagher icecube combo release sphinx build source python icecube steamshovel artists rst warning autodoc failed to import module u icecube steamshovel artists ledpowerhouse the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube steamshovel artists ledpowerhouse py line in n import serial nimporterror no module named serial n users kmeagher icecube combo release sphinx build source python icecube steamshovel artists rst warning autodoc failed to import module u icecube steamshovel artists particleuncertainty the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube steamshovel artists particleuncertainty py line in n from animatedparticle import posattime nimporterror no module named animatedparticle n users kmeagher icecube combo release sphinx build source python icecube steamshovel sessions rst warning autodoc failed to import module u icecube steamshovel sessions the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube steamshovel sessions py line in n dumpscenario n file users kmeagher icecube combo release lib icecube steamshovel sessions py 
line in dumpscenario n scenario window gl scenario nnameerror global name window is not defined n users kmeagher icecube combo release sphinx build source python icecube steamshovel sessions rst warning autodoc failed to import module u icecube steamshovel sessions minimum the following exception was raised ntraceback most recent call last n file private var folders rc g t pip build sphinx sphinx ext autodoc py line in import object n file users kmeagher icecube combo release lib icecube steamshovel sessions minimum py line in n dumpscenario n file users kmeagher icecube combo release lib icecube steamshovel sessions minimum py line in dumpscenario n scenario window gl scenario nnameerror global name window is not defined n reporter kjmeagher cc resolution fixed time component combo core summary artists python files don t like being called on their own which confuses the documentation priority normal keywords documentation milestone owner hdembinski type defect | 1 |
184,101 | 21,784,794,076 | IssuesEvent | 2022-05-14 01:20:32 | rgordon95/simple-react-redux-demo | https://api.github.com/repos/rgordon95/simple-react-redux-demo | opened | CVE-2022-1650 (High) detected in eventsource-1.0.7.tgz | security vulnerability | ## CVE-2022-1650 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>eventsource-1.0.7.tgz</b></p></summary>
<p>W3C compliant EventSource client for Node.js and browser (polyfill)</p>
<p>Library home page: <a href="https://registry.npmjs.org/eventsource/-/eventsource-1.0.7.tgz">https://registry.npmjs.org/eventsource/-/eventsource-1.0.7.tgz</a></p>
<p>Path to dependency file: /simple-react-redux-demo/package.json</p>
<p>Path to vulnerable library: /node_modules/eventsource/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- react-dev-utils-8.0.0.tgz
- sockjs-client-1.3.0.tgz
- :x: **eventsource-1.0.7.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Exposure of Sensitive Information to an Unauthorized Actor in GitHub repository eventsource/eventsource prior to v2.0.2.
<p>Publish Date: 2022-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-1650>CVE-2022-1650</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/dc9e467f-be5d-4945-867d-1044d27e9b8e/">https://huntr.dev/bounties/dc9e467f-be5d-4945-867d-1044d27e9b8e/</a></p>
<p>Release Date: 2022-05-12</p>
<p>Fix Resolution: eventsource - 2.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-1650 (High) detected in eventsource-1.0.7.tgz - ## CVE-2022-1650 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>eventsource-1.0.7.tgz</b></p></summary>
<p>W3C compliant EventSource client for Node.js and browser (polyfill)</p>
<p>Library home page: <a href="https://registry.npmjs.org/eventsource/-/eventsource-1.0.7.tgz">https://registry.npmjs.org/eventsource/-/eventsource-1.0.7.tgz</a></p>
<p>Path to dependency file: /simple-react-redux-demo/package.json</p>
<p>Path to vulnerable library: /node_modules/eventsource/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- react-dev-utils-8.0.0.tgz
- sockjs-client-1.3.0.tgz
- :x: **eventsource-1.0.7.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Exposure of Sensitive Information to an Unauthorized Actor in GitHub repository eventsource/eventsource prior to v2.0.2.
<p>Publish Date: 2022-05-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-1650>CVE-2022-1650</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/dc9e467f-be5d-4945-867d-1044d27e9b8e/">https://huntr.dev/bounties/dc9e467f-be5d-4945-867d-1044d27e9b8e/</a></p>
<p>Release Date: 2022-05-12</p>
<p>Fix Resolution: eventsource - 2.0.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve high detected in eventsource tgz cve high severity vulnerability vulnerable library eventsource tgz compliant eventsource client for node js and browser polyfill library home page a href path to dependency file simple react redux demo package json path to vulnerable library node modules eventsource package json dependency hierarchy react scripts tgz root library react dev utils tgz sockjs client tgz x eventsource tgz vulnerable library vulnerability details exposure of sensitive information to an unauthorized actor in github repository eventsource eventsource prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution eventsource step up your open source security game with whitesource | 0 |
3,239 | 2,664,652,202 | IssuesEvent | 2015-03-20 15:45:25 | backbee/backbee-standard | https://api.github.com/repos/backbee/backbee-standard | closed | name of the blocks | Fixed To test | After dropping a new block, I can't see the breadcrumb trail telling me how blocks are organised
contentset > articleblock...etc | 1.0 | name of the blocks - After dropping a new block, I can't see the breadcrumb trail telling me how blocks are organised
contentset > articleblock...etc | non_defect | name of the blocks after droping a new block i can t see the breadcrumb trail telling me how block ae organised contentset articleblock etc | 0 |
580,022 | 17,203,287,949 | IssuesEvent | 2021-07-17 18:02:27 | soedevteam/soe-tracker | https://api.github.com/repos/soedevteam/soe-tracker | reopened | [BUG] Tow Job not registering as on duty | Bug Low Priority | Details
--
**Report Link:** https://evolpcgaming.com/forums/topic/19683-bug-tow-job-issues/
**Feature(s) Affected:** /towonduty
**Exploitable?:** No
**CitizenFX Log(s):**
[CitizenFX_log_2021-06-19T110813.log](https://github.com/soedevteam/soe-tracker/files/6681193/CitizenFX_log_2021-06-19T110813.log)
<br>Description
--
Noticed that /towonduty didn't show any tow on duty even though I was and have been for a while. An LEO tried to mark a vehicle for tow and also noticed it. Decided to end the job, wait and take it again later to try and get back shown as on duty, then noticed the LEO mark the car for tow AFTER I had quit the job.
<br>Status
--
- [ ] Bug confirmed
- [ ] Triage work started
- [ ] Triage work finished
- [ ] Merged into master
<br>Related Issues
--
[ADD LINKS TO ANY RELATED ISSUES]
| 1.0 | [BUG] Tow Job not registering as on duty - Details
--
**Report Link:** https://evolpcgaming.com/forums/topic/19683-bug-tow-job-issues/
**Feature(s) Affected:** /towonduty
**Exploitable?:** No
**CitizenFX Log(s):**
[CitizenFX_log_2021-06-19T110813.log](https://github.com/soedevteam/soe-tracker/files/6681193/CitizenFX_log_2021-06-19T110813.log)
<br>Description
--
Noticed that /towonduty didn't show any tow on duty even though I was and have been for a while. An LEO tried to mark a vehicle for tow and also noticed it. Decided to end the job, wait and take it again later to try and get back shown as on duty, then noticed the LEO mark the car for tow AFTER I had quit the job.
<br>Status
--
- [ ] Bug confirmed
- [ ] Triage work started
- [ ] Triage work finished
- [ ] Merged into master
<br>Related Issues
--
[ADD LINKS TO ANY RELATED ISSUES]
| non_defect | tow job not registering as on duty details report link feature s affected towonduty exploitable no citizenfx log s description noticed that towonduty didn t show any tow on duty even though i was and have been for a while an leo tried to mark a vehicle for tow and also noticed it decided to end the job wait and and take it again later to try and get back shown as on duty then noticed the leo mark the car for tow after i had quit the job status bug confirmed triage work started triage work finished merged into master related issues | 0 |
49,878 | 13,187,284,320 | IssuesEvent | 2020-08-13 02:55:38 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | opened | [clsim] GPU detection broken for RTX-series cards (Trac #2212) | Incomplete Migration Migrated from Trac combo simulation defect | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2212">https://code.icecube.wisc.edu/ticket/2212</a>, reported by jvansanten and owned by jvansanten</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-03-06T20:58:01",
"description": "Nvidia decided that ray-tracing was a thing, and renamed their GTX series \"RTX\", and e.g. the 2080 Ti advertises itself as \"GeForce RTX 2080 Ti\". clsim.traysegments.common.configureOpenCLDevices, however, special-cases cards named \"GTX\" to set the number of work items to something sane and enable native math. Is there any reason not to do away with the special cases, and simply scale the number of work items to be proportional to the global memory size, as well as enabling native math by default?",
"reporter": "jvansanten",
"cc": "",
"resolution": "fixed",
"_ts": "1551905881354849",
"component": "combo simulation",
"summary": "[clsim] GPU detection broken for RTX-series cards",
"priority": "normal",
"keywords": "",
"time": "2018-11-29T14:14:01",
"milestone": "",
"owner": "jvansanten",
"type": "defect"
}
```
</p>
</details>
| 1.0 | [clsim] GPU detection broken for RTX-series cards (Trac #2212) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/2212">https://code.icecube.wisc.edu/ticket/2212</a>, reported by jvansanten and owned by jvansanten</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-03-06T20:58:01",
"description": "Nvidia decided that ray-tracing was a thing, and renamed their GTX series \"RTX\", and e.g. the 2080 Ti advertises itself as \"GeForce RTX 2080 Ti\". clsim.traysegments.common.configureOpenCLDevices, however, special-cases cards named \"GTX\" to set the number of work items to something sane and enable native math. Is there any reason not to do away with the special cases, and simply scale the number of work items to be proportional to the global memory size, as well as enabling native math by default?",
"reporter": "jvansanten",
"cc": "",
"resolution": "fixed",
"_ts": "1551905881354849",
"component": "combo simulation",
"summary": "[clsim] GPU detection broken for RTX-series cards",
"priority": "normal",
"keywords": "",
"time": "2018-11-29T14:14:01",
"milestone": "",
"owner": "jvansanten",
"type": "defect"
}
```
</p>
</details>
| defect | gpu detection broken for rtx series cards trac migrated from json status closed changetime description nvidia decided that ray tracing was a thing and renamed their gtx series rtx and e g the ti advertises itself as geforce rtx ti clsim traysegments common configureopencldevices however special cases cards named gtx to set the number of work items to something sane and enable native math is there any reason not to do away with the special cases and simply scale the number of work items to be proportional to the global memory size as well as enabling native math by default reporter jvansanten cc resolution fixed ts component combo simulation summary gpu detection broken for rtx series cards priority normal keywords time milestone owner jvansanten type defect | 1 |
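The ticket above proposes replacing name-based special cases ("GTX", now "RTX") with a memory-proportional heuristic. A rough sketch of that idea follows — the function name, divisor, and floor are illustrative assumptions, not clsim's actual logic:

```python
# Illustrative only: derive an OpenCL work-item count from the device's
# global memory size instead of matching card names like "GTX"/"RTX".
# The divisor and floor below are made-up tuning constants.

def work_items_for(global_mem_bytes, bytes_per_item=4096, floor=1024):
    # Scale linearly with memory, never dropping below a sane minimum.
    return max(floor, global_mem_bytes // bytes_per_item)

# An 11 GiB card (e.g. an RTX 2080 Ti) simply gets proportionally more
# work items than a 2 GiB card; no name matching is involved.
print(work_items_for(11 * 2**30))
print(work_items_for(2 * 2**30))
```

The point of the design is robustness: new product names never need new special cases, because the heuristic only reads a device property that every OpenCL implementation reports.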
34,898 | 7,468,082,101 | IssuesEvent | 2018-04-02 17:42:23 | mlpack/mlpack | https://api.github.com/repos/mlpack/mlpack | closed | LARSTest/NoCholeskySingularityTest often fails for i386 | D: difficult P: minor T: defect | **Reported by rcurtin on 10 Apr 44902838 02:43 UTC**
For some reason I haven't nailed down, on i386, NoCholeskySingularityTest often fails despite the fact that the code should work fine. In my investigations, I have discovered that the reason for this is that the test will try to solve a singular matrix of the form
```
[a a
 a a]
```
which will generally cause solve() to fail (at least on x86_64). This is done in lars.cpp at line 198. But you can still solve a singular linear system, and MATLAB will do this in the least-squares sense:
http://www.mathworks.com/help/matlab/math/systems-of-linear-equations.html
A bit of backtracing and digging reveals that what is happening is this:
- LARS passes a singular matrix of the form above to arma::solve()
- arma::solve() detects that the matrix is very small and tries to invert it by hand, but gives up when it notices the determinant is 0
- arma::solve() then calls out to arma::lapack::gesv() (that is, LAPACK dgesv()) which provides a solution to the system (alternately, atlas::clapack_gesv() may be called, depending on the system configuration)
But this is confusing to me because LAPACK dgesv() should fail:
http://www.netlib.org/lapack/explore-html/d8/d72/dgesv_8f.html (search 'singular')
dgesv() will perform an LU factorization on the input matrix and then fail if U is singular... but U should be singular.
So, there are questions I think are worth asking:
- Is solve() a good test for matrix singularity?
- Why is LAPACK dgesv() solving the matrix despite its singularity, but only on i386? (I've never seen this failure on x86_64.)
Answering these questions, or, at least the second, should be possible with a little bit of code modification in LAPACK to figure out what is going on. If the answer to the first question is no, then digging deeper into the dgesv() issue may be completely unnecessary.
This should be reproducible by calling arma::solve() with a matrix of the form above, and may end up in a bug report bubbling up to Armadillo or LAPACK.
For now, I have commented out the test.
Michael, I CC'ed you because you wrote this code, and may have a more correct perspective on what's going on.
Migrated-From: http://trac.research.cc.gatech.edu/fastlab/ticket/373
| 1.0 | LARSTest/NoCholeskySingularityTest often fails for i386 - **Reported by rcurtin on 10 Apr 44902838 02:43 UTC**
For some reason I haven't nailed down, on i386, NoCholeskySingularityTest often fails despite the fact that the code should work fine. In my investigations, I have discovered that the reason for this is that the test will try to solve a singular matrix of the form
```
[a a
 a a]
```
which will generally cause solve() to fail (at least on x86_64). This is done in lars.cpp at line 198. But you can still solve a singular linear system, and MATLAB will do this in the least-squares sense:
http://www.mathworks.com/help/matlab/math/systems-of-linear-equations.html
A bit of backtracing and digging reveals that what is happening is this:
- LARS passes a singular matrix of the form above to arma::solve()
- arma::solve() detects that the matrix is very small and tries to invert it by hand, but gives up when it notices the determinant is 0
- arma::solve() then calls out to arma::lapack::gesv() (that is, LAPACK dgesv()) which provides a solution to the system (alternately, atlas::clapack_gesv() may be called, depending on the system configuration)
But this is confusing to me because LAPACK dgesv() should fail:
http://www.netlib.org/lapack/explore-html/d8/d72/dgesv_8f.html (search 'singular')
dgesv() will perform an LU factorization on the input matrix and then fail if U is singular... but U should be singular.
So, there are questions I think are worth asking:
- Is solve() a good test for matrix singularity?
- Why is LAPACK dgesv() solving the matrix despite its singularity, but only on i386? (I've never seen this failure on x86_64.)
Answering these questions, or, at least the second, should be possible with a little bit of code modification in LAPACK to figure out what is going on. If the answer to the first question is no, then digging deeper into the dgesv() issue may be completely unnecessary.
This should be reproducible by calling arma::solve() with a matrix of the form above, and may end up in a bug report bubbling up to Armadillo or LAPACK.
For now, I have commented out the test.
Michael, I CC'ed you because you wrote this code, and may have a more correct perspective on what's going on.
Migrated-From: http://trac.research.cc.gatech.edu/fastlab/ticket/373
| defect | larstest nocholeskysingularitytest often fails for reported by rcurtin on apr utc for some reason i haven t nailed down on nocholeskysingularitytest often fails despite the fact that the code should work fine in my investigations i have discovered that the reason for this is that the test will try to solve a singular matrix of the form a a a a which will generally cause solve to fail at least on this is done in lars cpp at line but you can still solve a singular linear system and matlab will do this in the least squares sense a bit of backtracing and digging reveals that what is happening is this lars passes a singular matrix of the form above to arma solve arma solve detects that the matrix is very small and tries to invert it by hand but gives up when it notices the determinant is arma solve then calls out to arma lapack gesv that is lapack dgesv which provides a solution to the system alternately atlas clapack gesv may be called depending on the system configuration but this is confusing to me because lapack dgesv should fail search singular dgesv will perform an lu factorization on the input matrix and then fail if u is singular but u should be singular so there are questions i think are worth asking is solve a good test for matrix singularity why is lapack dgesv solving the matrix despite its singularity but only on i ve never seen this failure on answering these questions or at least the second should be possible with a little bit of code modification in lapack to figure out what is going on if the answer to the first question is no then digging deeper into the dgesv issue may be completely unnecessary this should be reproducible by calling arma solve with a matrix of the form above and may end up in a bug report bubbling up to armadillo or lapack for now i have commented out the test michael i cc ed you because you wrote this code and may have a more correct perspective on what s going on migrated from | 1 |
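The two questions in the report above can be demonstrated directly. The sketch below uses NumPy in place of Armadillo (both ultimately route through LAPACK), with concrete values filled in for the all-`a` matrix; it is an illustration of the behaviour, not mlpack code:

```python
# Illustration of the behaviour discussed above, using NumPy instead
# of Armadillo (both call LAPACK underneath): an exact solve of a
# singular system fails, while a least-squares solve still returns an
# answer -- which is why solve() is a shaky test for singularity.
import numpy as np

a = np.array([[2.0, 2.0],
              [2.0, 2.0]])          # rank 1, determinant 0
b = np.array([4.0, 4.0])

try:
    np.linalg.solve(a, b)           # LAPACK dgesv on a singular matrix
    exact_solve_succeeded = True
except np.linalg.LinAlgError:
    exact_solve_succeeded = False

# MATLAB-style least-squares solution of the same singular system.
x, _residuals, rank, _sv = np.linalg.lstsq(a, b, rcond=None)
print(exact_solve_succeeded, rank, x)
```

If dgesv on some platform reports success here instead of `INFO > 0`, you get exactly the i386-only discrepancy described in the ticket, which is why a rank or condition-number check is a sturdier singularity test than solve-and-see.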
555,937 | 16,472,619,672 | IssuesEvent | 2021-05-23 18:14:42 | SkriptLang/Skript | https://api.github.com/repos/SkriptLang/Skript | closed | Issue with custom items from itemsadder | bug completed priority: medium | ### Description
It seems like Skript changes the lore of an itemsadder item
### Steps to Reproduce
|> Install Skript, TuSKe and Itemsadder
|> Create a gui
|> Add a custom item with a lore to that gui
|> Give yourself the itemsadder item that has been used in the GUI
### Expected Behavior
Keeping the lore and not changing it
### Errors / Screenshots
https://github.com/PluginBugs/Issues-ItemsAdder/issues/821
### Server Information
* **Server version/platform:** Paper 1.16.5
* **Skript version:** 2.5.3
### Additional Context
I'm not sure if this belongs into the Skript bug reports or into the TuSKe bug reports - sorry if I got it wrong
| 1.0 | Issue with custom items from itemsadder - ### Description
It seems like Skript changes the lore of an itemsadder item
### Steps to Reproduce
|> Install Skript, TuSKe and Itemsadder
|> Create a gui
|> Add a custom item with a lore to that gui
|> Give yourself the itemsadder item that has been used in the GUI
### Expected Behavior
Keeping the lore and not changing it
### Errors / Screenshots
https://github.com/PluginBugs/Issues-ItemsAdder/issues/821
### Server Information
* **Server version/platform:** Paper 1.16.5
* **Skript version:** 2.5.3
### Additional Context
I'm not sure if this belongs into the Skript bug reports or into the TuSKe bug reports - sorry if I got it wrong
| non_defect | issue with custom items from itemsadder description it seems like skript changes the lore of an itemsadder item steps to reproduce install skript tuske and itemsadder create a gui add a custom item with a lore to that gui give yourself the itemsadder item that has been used in the gui expected behavior keeping the lore and not changing it errors screenshots server information server version platform paper skript version additional context i m not sure if this belongs into the skript bug reports or into the tuske bug reports sry if i got it wrong | 0 |
111,243 | 24,095,560,803 | IssuesEvent | 2022-09-19 18:24:51 | appsmithorg/appsmith | https://api.github.com/repos/appsmithorg/appsmith | reopened | [Bug]-[1120]:Where condition on GSheets fails to query if there's mixed data type present in column data | Bug Backend High Release Google Sheets BE Coders Pod Datatype issue | ## Description
If a GSheet column has mixed data types (number and string), then the where condition fails to return results that match the filter condition.
This is an extension of #7688
### Steps to reproduce the behaviour:
1. Go to GSheets datasource and add a new query
2. Add where condition for a table where mixed data types are present and observe the error
### Important Details
- Version: Cloud
- OS: Win10
- Browser: Chrome
- Environment: Release | 1.0 | [Bug]-[1120]:Where condition on GSheets fails to query if there's mixed data type present in column data - ## Description
If a GSheet column has mixed data types (number and string), then the where condition fails to return results that match the filter condition.
This is an extension of #7688
### Steps to reproduce the behaviour:
1. Go to GSheets datasource and add a new query
2. Add where condition for a table where mixed data types are present and observe the error
### Important Details
- Version: Cloud
- OS: Win10
- Browser: Chrome
- Environment: Release | non_defect | where condition on gsheets fails to query if there s mixed data type present in column data description if a gsheet column has mixed data type number and string then where condition fails to query results that favour the filter condition this is an extension of steps to reproduce the behaviour go to gsheets datasource and add a new query add where condition for a table where mixed data types are present and observe the error important details version cloud os browser chrome environment release | 0 |
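The failure mode this row describes, a "where" filter over a column that mixes numbers and strings, is easy to reproduce outside GSheets. A minimal Python sketch (hypothetical data, not Appsmith's actual query code) shows why a naive comparison breaks and how coercion with a skip rule recovers:

```python
def where_greater_than(cells, threshold):
    """Filter mixed-type column cells by a numeric condition.
    In Python 3, a bare `cell > threshold` on a string cell raises
    TypeError, mirroring the mixed-type failure in the row above;
    coercing each cell and skipping non-numeric values avoids it."""
    matches = []
    for cell in cells:
        try:
            value = float(cell)          # coerce '12' -> 12.0
        except (TypeError, ValueError):
            continue                     # skip cells like 'n/a'
        if value > threshold:
            matches.append(cell)
    return matches

print(where_greater_than([3, "12", "n/a", 40], 10))  # -> ['12', 40]
```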
49,308 | 13,186,603,541 | IssuesEvent | 2020-08-13 00:42:51 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | opened | CoincSuite split_recombine.py example doesn't run (Trac #1162) | Incomplete Migration Migrated from Trac combo reconstruction defect | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1162">https://code.icecube.wisc.edu/ticket/1162</a>, reported by jtatar and owned by mzoll</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "Here is the error message:\n\n\n{{{\nINFO (Python): Using CoincSuite Recombinations (coincsuite.py:82 in Complete)\nTraceback (most recent call last):\n File \"split_recombine.py\", line 77, in <module>\n Split_Recombine( tray, \"Split_Recombine\", params)\n File \"split_recombine.py\", line 58, in Split_Recombine\n SplitPulses = \"MaskedOfflinePulses\")\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/I3Tray.py\", line 204, in AddSegment\n return _segment(self, _name, **kwargs)\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/icecube/CoincSuite/coincsuite.py\", line 169, in Complete\n mininame = lilliput.add_minuit_simplex_minimizer_service(tray)\nNameError: global name 'lilliput' is not defined\n\n}}}\n\n* Default input file does not exist.\n* Please add a better summary of what script does at top of script.",
"reporter": "jtatar",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "CoincSuite split_recombine.py example doesn't run",
"priority": "blocker",
"keywords": "",
"time": "2015-08-18T18:26:13",
"milestone": "",
"owner": "mzoll",
"type": "defect"
}
```
</p>
</details>
| 1.0 | CoincSuite split_recombine.py example doesn't run (Trac #1162) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1162">https://code.icecube.wisc.edu/ticket/1162</a>, reported by jtatar and owned by mzoll</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "Here is the error message:\n\n\n{{{\nINFO (Python): Using CoincSuite Recombinations (coincsuite.py:82 in Complete)\nTraceback (most recent call last):\n File \"split_recombine.py\", line 77, in <module>\n Split_Recombine( tray, \"Split_Recombine\", params)\n File \"split_recombine.py\", line 58, in Split_Recombine\n SplitPulses = \"MaskedOfflinePulses\")\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/I3Tray.py\", line 204, in AddSegment\n return _segment(self, _name, **kwargs)\n File \"/home/jtatar/StrikeTeam/IceRec/build/lib/icecube/CoincSuite/coincsuite.py\", line 169, in Complete\n mininame = lilliput.add_minuit_simplex_minimizer_service(tray)\nNameError: global name 'lilliput' is not defined\n\n}}}\n\n* Default input file does not exist.\n* Please add a better summary of what script does at top of script.",
"reporter": "jtatar",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "CoincSuite split_recombine.py example doesn't run",
"priority": "blocker",
"keywords": "",
"time": "2015-08-18T18:26:13",
"milestone": "",
"owner": "mzoll",
"type": "defect"
}
```
</p>
</details>
| defect | coincsuite split recombine py example doesn t run trac migrated from json status closed changetime description here is the error message n n n ninfo python using coincsuite recombinations coincsuite py in complete ntraceback most recent call last n file split recombine py line in n split recombine tray split recombine params n file split recombine py line in split recombine n splitpulses maskedofflinepulses n file home jtatar striketeam icerec build lib py line in addsegment n return segment self name kwargs n file home jtatar striketeam icerec build lib icecube coincsuite coincsuite py line in complete n mininame lilliput add minuit simplex minimizer service tray nnameerror global name lilliput is not defined n n n n default input file does not exist n please add a better summary of what script does at top of script reporter jtatar cc resolution fixed ts component combo reconstruction summary coincsuite split recombine py example doesn t run priority blocker keywords time milestone owner mzoll type defect | 1 |
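Rows migrated from Trac, like the one above, embed the original ticket as a JSON blob inside the issue body. Pulling the classification-relevant fields back out is straightforward; the field names below are taken from the blob above, and the helper name is hypothetical:

```python
import json

ticket_json = '''{"summary": "CoincSuite split_recombine.py example doesn't run",
                  "priority": "blocker", "type": "defect", "owner": "mzoll"}'''

def ticket_fields(blob: str) -> dict:
    """Extract the fields a defect/non-defect labeler would care about,
    defaulting to '' when a key is absent."""
    ticket = json.loads(blob)
    return {key: ticket.get(key, "") for key in ("summary", "priority", "type")}

print(ticket_fields(ticket_json)["type"])  # -> defect
```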
376,382 | 26,197,870,128 | IssuesEvent | 2023-01-03 14:59:43 | Gamify-IT/issues | https://api.github.com/repos/Gamify-IT/issues | closed | Overworld: Achievement System Data Model | documentation storypoint/1 | ## Requirement
## Current Behavior
Currently, we don't know which data is needed for the achievements and how we want to model them.
## Expected Behavior
We have a data modeling system.
## DoD
- [ ] The data model is documented
| 1.0 | Overworld: Achievement System Data Model - ## Requirement
## Current Behavior
Currently, we don't know which data is needed for the achievements and how we want to model them.
## Expected Behavior
We have a data modeling system.
## DoD
- [ ] The data model is documented
| non_defect | overworld achievement system data model requirement current behavior currently we don t know which data is needed for the achievements and how we want to model them expected behavior we have a data modeling system dod the data model is documented | 0 |
109,050 | 4,366,931,171 | IssuesEvent | 2016-08-03 15:40:32 | radiasoft/sirepo | https://api.github.com/repos/radiasoft/sirepo | closed | Zoom on heatmap line-outs | 1st Priority | Add zoom/pan and point navigator to heatmap line-outs similar to line plots.
| 1.0 | Zoom on heatmap line-outs - Add zoom/pan and point navigator to heatmap line-outs similar to line plots.
| non_defect | zoom on heatmap line outs add zoom pan and point navigator to heatmap line outs similar to line plots | 0 |
21,909 | 3,587,215,015 | IssuesEvent | 2016-01-30 05:06:04 | mash99/crypto-js | https://api.github.com/repos/mash99/crypto-js | closed | MD4 | auto-migrated Priority-Medium Type-Defect | ```
Can you add an MD4 hashing algorithm?
Implementation for crypto-js library attached
```
Original issue reported on code.google.com by `nf404.n...@gmail.com` on 2 Apr 2014 at 4:07
Attachments:
* [md4.js](https://storage.googleapis.com/google-code-attachments/crypto-js/issue-123/comment-0/md4.js)
| 1.0 | MD4 - ```
Can you add an MD4 hashing algorithm?
Implementation for crypto-js library attached
```
Original issue reported on code.google.com by `nf404.n...@gmail.com` on 2 Apr 2014 at 4:07
Attachments:
* [md4.js](https://storage.googleapis.com/google-code-attachments/crypto-js/issue-123/comment-0/md4.js)
| defect | can you add hashing algoritm implementation for crypto js library attached original issue reported on code google com by n gmail com on apr at attachments | 1 |
371,135 | 25,938,882,754 | IssuesEvent | 2022-12-16 16:25:44 | NIAEFEUP/tts-revamp-fe | https://api.github.com/repos/NIAEFEUP/tts-revamp-fe | opened | Refactor, comment, and polish of code | documentation low priority medium effort | The code in this project has become difficult to read and maintain due to a lack of comments. We need to refactor the code to improve its structure and readability and add comments to help other developers understand how the code works. | 1.0 | Refactor, comment, and polish of code - The code in this project has become difficult to read and maintain due to a lack of comments. We need to refactor the code to improve its structure and readability and add comments to help other developers understand how the code works. | non_defect | refactor comment and polish of code the code in this project has become difficult to read and maintain due to a lack of comments we need to refactor the code to improve its structure and readability and add comments to help other developers understand how the code works | 0 |
197,858 | 15,696,047,387 | IssuesEvent | 2021-03-26 01:03:22 | MuneebNasir/Amazin-online-bookstore-OA | https://api.github.com/repos/MuneebNasir/Amazin-online-bookstore-OA | closed | Research for Cucumber Tests | documentation new | **Is your feature request related to a problem? Please describe.**
Research into Cucumber test cases for the developed code
**Describe the solution you'd like**
Research & understand the testing technique and create test cases with Cucumber
**Describe alternatives you've considered**
N/A
**Additional context**
Cucumber Testing relates to final oral presentation for the group
| 1.0 | Research for Cucumber Tests - **Is your feature request related to a problem? Please describe.**
Research into Cucumber test cases for the developed code
**Describe the solution you'd like**
Research & understand the testing technique and create test cases with Cucumber
**Describe alternatives you've considered**
N/A
**Additional context**
Cucumber Testing relates to final oral presentation for the group
| non_defect | research for cucumber tests is your feature request related to a problem please describe research into cucumber test cases for the developed code describe the solution you d like research understand the testing technique and create test case with cucumber describe alternatives you ve considered n a additional context cucumber testing relates to final oral presentation for the group | 0 |
98,405 | 8,676,291,214 | IssuesEvent | 2018-11-30 13:41:54 | HippieStation/HippieStation | https://api.github.com/repos/HippieStation/HippieStation | closed | admin heal wont work properly on decapitated people | Tested/Reproduced | In-Game report from pyko:
Server info: 24508 (Hippie Station)
It fixes corpse, but brains and shit have to be edited in. | 1.0 | admin heal wont work properly on decapitated people - In-Game report from pyko:
Server info: 24508 (Hippie Station)
It fixes corpse, but brains and shit have to be edited in. | non_defect | admin heal wont work properly on decapitated people in game report from pyko server info hippie station it fixes corpse but brains and shit have to be edited in | 0 |
318,172 | 23,706,191,465 | IssuesEvent | 2022-08-30 01:32:35 | RCommon-Team/RCommon | https://api.github.com/repos/RCommon-Team/RCommon | closed | Clean Architecture CQRS Sample | documentation enhancement | Need a simple CA/CQRS sample to assist with demonstrating how to use RCommon. Scenarios that should be covered:
- Dependency Injection via Adapters
- Mediator
- Repositories w/ EF
- Business Entities base classes
- Model/DTO base classes
- Unit of Work pattern & behavior
- DTO validation | 1.0 | Clean Architecture CQRS Sample - Need a simple CA/CQRS sample to assist with demonstrating how to use RCommon. Scenarios that should be covered:
- Dependency Injection via Adapters
- Mediator
- Repositories w/ EF
- Business Entities base classes
- Model/DTO base classes
- Unit of Work pattern & behavior
- DTO validation | non_defect | clean architecture cqrs sample need a simple ca cqrs sample to assist with demonstrating how to use rcommon scenarios that should be covered dependency injection via adapters mediator repositories w ef business entities base classes model dto base classes unit of work pattern behavior dto validation | 0 |
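The scenario list in this row centers on a mediator dispatching commands to handlers. RCommon itself is a .NET library, but the pattern the requested sample would demonstrate can be sketched language-agnostically (Python here for brevity; all class and method names are hypothetical, not RCommon's API):

```python
class Mediator:
    """Routes each command type to exactly one registered handler (CQRS-style)."""

    def __init__(self):
        self._handlers = {}

    def register(self, command_type, handler):
        self._handlers[command_type] = handler

    def send(self, command):
        handler = self._handlers.get(type(command))
        if handler is None:
            raise LookupError(f"no handler for {type(command).__name__}")
        return handler(command)

class CreateOrder:
    def __init__(self, sku):
        self.sku = sku

mediator = Mediator()
mediator.register(CreateOrder, lambda cmd: f"order created for {cmd.sku}")
print(mediator.send(CreateOrder("ABC-123")))  # -> order created for ABC-123
```

Keeping one handler per command type is the design choice that separates commands (one owner, side effects) from queries in CQRS.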
294,392 | 9,023,093,491 | IssuesEvent | 2019-02-07 05:24:43 | utra-robosoccer/soccer-embedded | https://api.github.com/repos/utra-robosoccer/soccer-embedded | closed | green bar and "pure virtual method called" in console when function in EXPECT_CALL(*).Times(1) gets called 0 times | Priority: Low Type: Bug | **Describe the bug**
When an `EXPECT_CALL` is given with `.Times(1)`, sometimes the test results show as green and the console has the following printed:
```
pure virtual method called
terminate called without an active exception
```
An example of this occurring is when the message shows up in one of the UDP driver tests (`UdpDriverShould, FailReceiveRxPbufNull`), given the following line (it will be visible in the master branch soon):
```
EXPECT_CALL(udp_if, pbufCopyPartial(_, _, _, _)).Times(1).WillOnce(Return((u16_t) 1));
```
**To Reproduce**
Insert the following line at the start of `TEST(UdpDriverShould, FailReceiveRxPbufNull){`
```
EXPECT_CALL(udp_if, pbufCopyPartial(_, _, _, _)).Times(1).WillOnce(Return((u16_t) 1));
```
then build and run `Test_F7` in `RobotTest`.
This test is not currently in master, will update once it is.
**Expected behavior**
The test run should have a "red" status, show a red bar, and indicate that the method given in `EXPECT_CALL` wasn't called enough times. Right now, the `pure virtual method called` message in the console is the only indicator that something is not right.
**Additional context**
Have hit this a few times in the `UdpDriver` tests only and am not sure if it is a problem of ours or of gtest yet. It doesn't seem to happen consistently either (i.e., the red bar and the correct report of "test expected to be called 1 time but called 0 times" does show). Posting this issue so it won't be forgotten, but will update with better details to reproduce once the updated UDP unit tests are in master.
An example of the inconsistency where the same situation did give correct behavior is adding the same line to the beginning of `TEST(UdpDriverShould, FailReceiveRxArrayNULL)`. | 1.0 | green bar and "pure virtual method called" in console when function in EXPECT_CALL(*).Times(1) gets called 0 times - **Describe the bug**
When an `EXPECT_CALL` is given with `.Times(1)`, sometimes the test results show as green and the console has the following printed:
```
pure virtual method called
terminate called without an active exception
```
An example of this occurring is when the message shows up in one of the UDP driver tests (`UdpDriverShould, FailReceiveRxPbufNull`), given the following line (it will be visible in the master branch soon):
```
EXPECT_CALL(udp_if, pbufCopyPartial(_, _, _, _)).Times(1).WillOnce(Return((u16_t) 1));
```
**To Reproduce**
Insert the following line at the start of `TEST(UdpDriverShould, FailReceiveRxPbufNull){`
```
EXPECT_CALL(udp_if, pbufCopyPartial(_, _, _, _)).Times(1).WillOnce(Return((u16_t) 1));
```
then build and run `Test_F7` in `RobotTest`.
This test is not currently in master, will update once it is.
**Expected behavior**
The test run should have a "red" status, show a red bar, and indicate that the method given in `EXPECT_CALL` wasn't called enough times. Right now, the `pure virtual method called` message in the console is the only indicator that something is not right.
**Additional context**
Have hit this a few times in the `UdpDriver` tests only and am not sure if it is a problem of ours or of gtest yet. It doesn't seem to happen consistently either (i.e., the red bar and the correct report of "test expected to be called 1 time but called 0 times" does show). Posting this issue so it won't be forgotten, but will update with better details to reproduce once the updated UDP unit tests are in master.
An example of the inconsistency where the same situation did give correct behavior is adding the same line to the beginning of `TEST(UdpDriverShould, FailReceiveRxArrayNULL)`. | non_defect | green bar and pure virtual method called in console when function in expect call times gets called times describe the bug when an expect call is given with times sometimes the test results show as green and the console has the following printed pure virtual method called terminate called without an active exception an example of this occurring is the message shows in one of the udp driver tests udpdrivershould failreceiverxpbufnull when the following line is given will be visible in master branch soon expect call udp if pbufcopypartial times willonce return t to reproduce insert the following line at the start of test udpdrivershould failreceiverxpbufnull expect call udp if pbufcopypartial times willonce return t then build and run test in robottest this test is not currently in master will update once it is expected behavior the test run should have a red status and show a red bar and indicate that the method given in expect call wasn t called enough times right now the pure virtual method called message in the console is the only indicator of something not right additional context have hit this a few times in the udpdriver tests only and am not sure if it is a problem of ours or of gtest yet it doesn t seem to happen consistently either i e the red bar and the correct report of test expected to be called time but called times does show posting this issue so it won t be forgotten but will update with better details to reproduce once the updated udp unit tests are in master an example of the inconsistency where the same situation did give correct behavior is adding the same line to the beginning of test udpdrivershould failreceiverxarraynull | 0 |
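The bug in this row is about a mocking framework silently passing an unmet `.Times(1)` expectation. In Python's `unittest.mock` the equivalent check fails loudly, which is the behavior the reporter expected from gtest/gMock (this is an analogue only, not a reproduction of the gMock internals):

```python
from unittest.mock import Mock

recv = Mock(return_value=1)   # stands in for the mocked pbufCopyPartial

# Expectation: recv is called exactly once. It never is, so the
# verification must fail visibly rather than pass with a green bar.
try:
    recv.assert_called_once()
    outcome = "green (wrong)"
except AssertionError:
    outcome = "red (expected)"

print(outcome)  # -> red (expected)
```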
15,828 | 2,869,079,991 | IssuesEvent | 2015-06-05 23:10:18 | dart-lang/sdk | https://api.github.com/repos/dart-lang/sdk | closed | Pub server crashes during compilation | NeedsInfo Pkg-Polymer Priority-Unassigned Type-Defect | *This issue was originally filed by tejaine...@gmail.com*
_____
**What steps will reproduce the problem?**
1. Launch Pub server
2. Use it for a while
3. It crashes while compiling
**What version of the product are you using?**
Dart Editor version 1.7.0.dev_02_00 (DEV)
Dart SDK version 1.7.0-dev.2.0
**On what operating system?**
Ubuntu 14.04
**Please provide any additional information below.**
Build error:
Transform Dart2JS on eChannel|web/designer.html_bootstrap.dart threw error: Out of Memory
/mnt/data/b/build/slave/dart-editor-linux-dev/build/dart/sdk/lib/_internal/compiler/implementation/compiler.dart 1131 Compiler.run.<fn>
dart:async \_Future._propagateToListeners.handleError
/mnt/data/b/build/slave/dart-editor-linux-dev/build/dart/sdk/lib/_internal/compiler/implementation/compiler.dart 2014 CompilerTask.measure
. ...
. ...
dart:isolate \_RawReceivePortImpl._handleMessage
Build completed with 2 errors.
| 1.0 | Pub server crashes during compilation - *This issue was originally filed by tejaine...@gmail.com*
_____
**What steps will reproduce the problem?**
1. Launch Pub server
2. Use it for a while
3. It crashes while compiling
**What version of the product are you using?**
Dart Editor version 1.7.0.dev_02_00 (DEV)
Dart SDK version 1.7.0-dev.2.0
**On what operating system?**
Ubuntu 14.04
**Please provide any additional information below.**
Build error:
Transform Dart2JS on eChannel|web/designer.html_bootstrap.dart threw error: Out of Memory
/mnt/data/b/build/slave/dart-editor-linux-dev/build/dart/sdk/lib/_internal/compiler/implementation/compiler.dart 1131 Compiler.run.<fn>
dart:async \_Future._propagateToListeners.handleError
/mnt/data/b/build/slave/dart-editor-linux-dev/build/dart/sdk/lib/_internal/compiler/implementation/compiler.dart 2014 CompilerTask.measure
. ...
. ...
dart:isolate \_RawReceivePortImpl._handleMessage
Build completed with 2 errors.
| defect | pub server crashes during compilation this issue was originally filed by tejaine gmail com what steps will reproduce the problem launch pub server use it for a while it crashes while compiling what version of the product are you using dart editor version dev dev dart sdk version dev on what operating system ubuntu please provide any additional information below build error transform on echannel web designer html bootstrap dart threw error out of memory mnt data b build slave dart editor linux dev build dart sdk lib internal compiler implementation compiler dart compiler run lt fn gt dart async future propagatetolisteners handleerror mnt data b build slave dart editor linux dev build dart sdk lib internal compiler implementation compiler dart compilertask measure dart isolate rawreceiveportimpl handlemessage build completed with errors | 1 |
757,431 | 26,512,143,278 | IssuesEvent | 2023-01-18 17:56:34 | LLK/scratch-www | https://api.github.com/repos/LLK/scratch-www | closed | CE-310 Title Text on EV3 Page Overlaps | priority 3 Medium Severity Low Impact | ### Expected Behavior
The text "EV3" in the title should be next to "LEGO MINDSTORMS"
### Actual Behavior
Text "EV3" overlaps with body text in fullscreen mode.
### Steps to Reproduce
Steps to reproduce the behavior:
1. Go to https://scratch.mit.edu/ev3/
2. Notice that the text EV3 overlaps with the body text
### System Details
macOS 11.5.2, Firefox 95.0.2, MacBook Air
**Screenshots**
<img width="1440" alt="screenshot" src="https://user-images.githubusercontent.com/68165163/148123574-a9a08387-aee0-46cb-ba9c-29b4435d6558.png">
| 1.0 | CE-310 Title Text on EV3 Page Overlaps - ### Expected Behavior
The text "EV3" in the title should be next to "LEGO MINDSTORMS"
### Actual Behavior
Text "EV3" overlaps with body text in fullscreen mode.
### Steps to Reproduce
Steps to reproduce the behavior:
1. Go to https://scratch.mit.edu/ev3/
2. Notice that the text EV3 overlaps with the body text
### System Details
macOS 11.5.2, Firefox 95.0.2, MacBook Air
**Screenshots**
<img width="1440" alt="screenshot" src="https://user-images.githubusercontent.com/68165163/148123574-a9a08387-aee0-46cb-ba9c-29b4435d6558.png">
| non_defect | ce title text on page overlaps expected behavior the text in the title should be next to lego mindstorms actual behavior text overlaps with body text in fullscreen mode steps to reproduce steps to reproduce the behavior go to notice that the text overlaps with the body text system details macos firefox macbook air screenshots img width alt screenshot src | 0 |
14,389 | 3,833,686,161 | IssuesEvent | 2016-04-01 05:40:51 | okTurtles/dnschain | https://api.github.com/repos/okTurtles/dnschain | closed | Domain used in documentation examples ( okturtles.bit ) returns NXDOMAIN | documentation namecoin | Greetings dear dnschain developers & users.
I just noticed that okturtles.bit now returns NXDOMAIN, this could be confusing for new dnschain users since that domain is used in the documentation examples:
https://github.com/okTurtles/dnschain/blob/dev/docs/How-do-I-run-my-own.md
https://github.com/okTurtles/dnschain/blob/master/docs/setting-up-dnschain-namecoin-powerdns-server.md
Querying my own install:
dig -p 5333 @127.0.0.1 okturtles.bit
; <<>> DiG 9.9.5-9+deb8u6-Debian <<>> -p 5333 @127.0.0.1 okturtles.bit
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: **NXDOMAIN**, id: 52040
;; flags: qr rd; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 0
;; WARNING: recursion requested but not available
;; QUESTION SECTION:
;okturtles.bit. IN A
;; Query time: 15 msec
;; SERVER: 127.0.0.1#5333(127.0.0.1)
;; WHEN: Wed Mar 30 11:15:08 CEST 2016
;; MSG SIZE rcvd: 31
Querying a public server
dig @192.184.93.146 okturtles.bit
; <<>> DiG 9.9.5-9+deb8u6-Debian <<>> @192.184.93.146 okturtles.bit
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: **NXDOMAIN**, id: 35826
;; flags: qr rd ra; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 0
;; QUESTION SECTION:
;okturtles.bit. IN A
;; Query time: 189 msec
;; SERVER: 192.184.93.146#53(192.184.93.146)
;; WHEN: Wed Mar 30 11:19:49 CEST 2016
;; MSG SIZE rcvd: 31
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/32395858-domain-used-in-documentation-examples-okturtles-bit-returns-nxdomain?utm_campaign=plugin&utm_content=tracker%2F528702&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F528702&utm_medium=issues&utm_source=github).
</bountysource-plugin> | 1.0 | Domain used in documentation examples ( okturtles.bit ) returns NXDOMAIN - Greetings dear dnschain developers & users.
I just noticed that okturtles.bit now returns NXDOMAIN, this could be confusing for new dnschain users since that domain is used in the documentation examples:
https://github.com/okTurtles/dnschain/blob/dev/docs/How-do-I-run-my-own.md
https://github.com/okTurtles/dnschain/blob/master/docs/setting-up-dnschain-namecoin-powerdns-server.md
Querying my own install:
dig -p 5333 @127.0.0.1 okturtles.bit
; <<>> DiG 9.9.5-9+deb8u6-Debian <<>> -p 5333 @127.0.0.1 okturtles.bit
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: **NXDOMAIN**, id: 52040
;; flags: qr rd; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 0
;; WARNING: recursion requested but not available
;; QUESTION SECTION:
;okturtles.bit. IN A
;; Query time: 15 msec
;; SERVER: 127.0.0.1#5333(127.0.0.1)
;; WHEN: Wed Mar 30 11:15:08 CEST 2016
;; MSG SIZE rcvd: 31
Querying a public server
dig @192.184.93.146 okturtles.bit
; <<>> DiG 9.9.5-9+deb8u6-Debian <<>> @192.184.93.146 okturtles.bit
; (1 server found)
;; global options: +cmd
;; Got answer:
;; ->>HEADER<<- opcode: QUERY, status: **NXDOMAIN**, id: 35826
;; flags: qr rd ra; QUERY: 1, ANSWER: 0, AUTHORITY: 0, ADDITIONAL: 0
;; QUESTION SECTION:
;okturtles.bit. IN A
;; Query time: 189 msec
;; SERVER: 192.184.93.146#53(192.184.93.146)
;; WHEN: Wed Mar 30 11:19:49 CEST 2016
;; MSG SIZE rcvd: 31
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/32395858-domain-used-in-documentation-examples-okturtles-bit-returns-nxdomain?utm_campaign=plugin&utm_content=tracker%2F528702&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F528702&utm_medium=issues&utm_source=github).
</bountysource-plugin> | non_defect | domain used in documentation examples okturtles bit returns nxdomain greetings dear dnschain developers users i just noticed that okturtles bit now returns nxdomain this could be confusing for new dnschain users since that domain is used in the documentation examples querying my own install dig p okturtles bit dig debian p okturtles bit server found global options cmd got answer header opcode query status nxdomain id flags qr rd query answer authority additional warning recursion requested but not available question section okturtles bit in a query time msec server when wed mar cest msg size rcvd querying a public server dig okturtles bit dig debian okturtles bit server found global options cmd got answer header opcode query status nxdomain id flags qr rd ra query answer authority additional question section okturtles bit in a query time msec server when wed mar cest msg size rcvd want to back this issue we accept bounties via | 0 |
25,653 | 4,417,711,847 | IssuesEvent | 2016-08-15 07:24:23 | snowie2000/mactype | https://api.github.com/repos/snowie2000/mactype | closed | Download link broken | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. Go to mactype homepage
2. Click download
3. Server error
What is the expected output? What do you see instead?
There is a server error when trying to download the .exe file
What version of the product are you using? On what operating system?
Newest version (2014) Windows 8.1, Google Chrome
Please provide any additional information below.
```
Original issue reported on code.google.com by `nuu...@gmail.com` on 12 Jan 2014 at 5:00 | 1.0 | Download link broken - ```
What steps will reproduce the problem?
1. Go to mactype homepage
2. Click download
3. Server error
What is the expected output? What do you see instead?
There is a server error when trying to download the .exe file
What version of the product are you using? On what operating system?
Newest version (2014) Windows 8.1, Google Chrome
Please provide any additional information below.
```
Original issue reported on code.google.com by `nuu...@gmail.com` on 12 Jan 2014 at 5:00 | defect | download link broken what steps will reproduce the problem go to mactype homepage click download server error what is the expected output what do you see instead there is a server error when trying to download the exe file what version of the product are you using on what operating system newest version windows google chrome please provide any additional information below original issue reported on code google com by nuu gmail com on jan at | 1 |
32,959 | 6,979,144,167 | IssuesEvent | 2017-12-12 19:57:26 | idaholab/raven | https://api.github.com/repos/idaholab/raven | opened | Regression test script runs plugin tests multiple times | defect priority_normal | --------
Issue Description
--------
##### What did you expect to see happen?
The regression test system should run each test only once.
##### What did you see instead?
When ./run_tests is executed, the plugin tests are run multiple times.
I did not check if other tests are run multiple times as well...
##### Please attach the input file(s) that generate this error. The simpler the input, the faster we can find the issue.
Here is the (shortened) output from ./run_tests
```text
plugins/CashFlow.CashFlow_NPVsearch.................................................... OK
plugins/CashFlow.CashFlow_IRR................................................................ OK
plugins/CashFlow.CashFlow_PI.................................................................. OK
plugins/CashFlow.CashFlow_NPV.............................................................. OK
...
plugins/CashFlow/tests.CashFlow_IRR........................................................ OK
plugins/CashFlow/tests.CashFlow_NPV....................................................... OK
plugins/CashFlow/tests.CashFlow_NPVsearch............................................ OK
plugins/CashFlow/tests.CashFlow_PI.......................................................... OK
...
```
----------------
For Change Control Board: Issue Review
----------------
This review should occur before any development is performed as a response to this issue.
- [ ] 1. Is it tagged with a type: defect or improvement?
- [ ] 2. Is it tagged with a priority: critical, normal or minor?
- [ ] 3. If it will impact requirements or requirements tests, is it tagged with requirements?
- [ ] 4. If it is a defect, can it cause wrong results for users? If so an email needs to be sent to the users.
- [ ] 5. Is a rationale provided? (Such as explaining why the improvement is needed or why current code is wrong.)
-------
For Change Control Board: Issue Closure
-------
This review should occur when the issue is imminently going to be closed.
- [ ] 1. If the issue is a defect, is the defect fixed?
- [ ] 2. If the issue is a defect, is the defect tested for in the regression test system? (If not explain why not.)
- [ ] 3. If the issue can impact users, has an email to the users group been written (the email should specify if the defect impacts stable or master)?
- [ ] 4. If the issue is a defect, does it impact the latest stable branch? If yes, is there any issue tagged with stable (create if needed)?
- [ ] 5. If the issue is being closed without a merge request, has an explanation of why it is being closed been provided?
| 1.0 | Regression test script runs plugin tests multiple times - --------
Issue Description
--------
##### What did you expect to see happen?
The regression test system should run each test only once.
##### What did you see instead?
When ./run_tests is executed, the plugin tests are run multiple times.
I did not check if other tests are run multiple times as well...
##### Please attach the input file(s) that generate this error. The simpler the input, the faster we can find the issue.
Here is the (shortened) output from ./run_tests
```AutoHotkey
plugins/CashFlow.CashFlow_NPVsearch.................................................... OK
plugins/CashFlow.CashFlow_IRR................................................................ OK
plugins/CashFlow.CashFlow_PI.................................................................. OK
plugins/CashFlow.CashFlow_NPV....................... ....................................... OK
...
plugins/CashFlow/tests.CashFlow_IRR........................................................ OK
plugins/CashFlow/tests.CashFlow_NPV....................................................... OK
plugins/CashFlow/tests.CashFlow_NPVsearch............................................ OK
plugins/CashFlow/tests.CashFlow_PI.......................................................... OK
...
```
----------------
For Change Control Board: Issue Review
----------------
This review should occur before any development is performed as a response to this issue.
- [ ] 1. Is it tagged with a type: defect or improvement?
- [ ] 2. Is it tagged with a priority: critical, normal or minor?
- [ ] 3. If it will impact requirements or requirements tests, is it tagged with requirements?
- [ ] 4. If it is a defect, can it cause wrong results for users? If so an email needs to be sent to the users.
- [ ] 5. Is a rationale provided? (Such as explaining why the improvement is needed or why current code is wrong.)
-------
For Change Control Board: Issue Closure
-------
This review should occur when the issue is imminently going to be closed.
- [ ] 1. If the issue is a defect, is the defect fixed?
- [ ] 2. If the issue is a defect, is the defect tested for in the regression test system? (If not explain why not.)
- [ ] 3. If the issue can impact users, has an email to the users group been written (the email should specify if the defect impacts stable or master)?
- [ ] 4. If the issue is a defect, does it impact the latest stable branch? If yes, is there any issue tagged with stable (create if needed)?
- [ ] 5. If the issue is being closed without a merge request, has an explanation of why it is being closed been provided?
| defect | regression test script runs plugin tests multiple times issue description what did you expect to see happen the regression test system should run each test only once what did you see instead when run tests is executed the plugin tests are run multiple times i did not check if other test are run multiple times as well please attach the input file s that generate this error the simpler the input the faster we can find the issue here is the shortened output from run tests autohotkey plugins cashflow cashflow npvsearch ok plugins cashflow cashflow irr ok plugins cashflow cashflow pi ok plugins cashflow cashflow npv ok plugins cashflow tests cashflow irr ok plugins cashflow tests cashflow npv ok plugins cashflow tests cashflow npvsearch ok plugins cashflow tests cashflow pi ok for change control board issue review this review should occur before any development is performed as a response to this issue is it tagged with a type defect or improvement is it tagged with a priority critical normal or minor if it will impact requirements or requirements tests is it tagged with requirements if it is a defect can it cause wrong results for users if so an email needs to be sent to the users is a rationale provided such as explaining why the improvement is needed or why current code is wrong for change control board issue closure this review should occur when the issue is imminently going to be closed if the issue is a defect is the defect fixed if the issue is a defect is the defect tested for in the regression test system if not explain why not if the issue can impact users has an email to the users group been written the email should specify if the defect impacts stable or master if the issue is a defect does it impact the latest stable branch if yes is there any issue tagged with stable create if needed if the issue is being closed without a merge request has an explanation of why it is being closed been provided | 1 |
51,597 | 10,698,936,845 | IssuesEvent | 2019-10-23 19:48:53 | joomla/joomla-cms | https://api.github.com/repos/joomla/joomla-cms | closed | Pop Up Box does not close | No Code Attached Yet | ### Steps to reproduce the issue
Go to system> mail templates
Open an existing mail template and save it.
Green popup box appears after save
### Expected result
Click x to close popup box and box should close
### Actual result
Click x to close popup box, green box closes but gray box remains place
### System information (as much as possible)
Windows 10 FireFox 70 | Chrome Version 78.0.3904.70 (both incognito and non-incognito mode)
### Additional comments
Was unsure of what to put in the build info so I used the latest commit info. Sorry if that was wrong - this is my first ever bug report! | 1.0 | Pop Up Box does not close - ### Steps to reproduce the issue
Go to system> mail templates
Open an existing mail template and save it.
Green popup box appears after save
### Expected result
Click x to close popup box and box should close
### Actual result
Click x to close popup box, green box closes but gray box remains place
### System information (as much as possible)
Windows 10 FireFox 70 | Chrome Version 78.0.3904.70 (both incognito and non-incognito mode)
### Additional comments
Was unsure of what to put in the build info so I used the latest commit info. Sorry if that was wrong - this is my first ever bug report! | non_defect | pop up box does not close steps to reproduce the issue go to system mail templates open an existing mail template and save it green popup box appears after save expected result click x to close popup box and box should close actual result click x to close popup box green box closes but gray box remains place system information as much as possible windows firefox chrome version both incognito and non incognito mode additional comments was unsure of what to put in the build info so i used the latest commit info sorry if that was wrong this is my first ever bug report | 0 |
288,575 | 31,861,476,035 | IssuesEvent | 2023-09-15 11:14:59 | nidhi7598/linux-v4.19.72_CVE-2022-3564 | https://api.github.com/repos/nidhi7598/linux-v4.19.72_CVE-2022-3564 | opened | CVE-2019-12380 (Medium) detected in linuxlinux-4.19.294, linuxlinux-4.19.294 | Mend: dependency security vulnerability | ## CVE-2019-12380 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxlinux-4.19.294</b>, <b>linuxlinux-4.19.294</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
**DISPUTED** An issue was discovered in the efi subsystem in the Linux kernel through 5.1.5. phys_efi_set_virtual_address_map in arch/x86/platform/efi/efi.c and efi_call_phys_prolog in arch/x86/platform/efi/efi_64.c mishandle memory allocation failures. NOTE: This id is disputed as not being an issue because “All the code touched by the referenced commit runs only at boot, before any user processes are started. Therefore, there is no possibility for an unprivileged user to control it.”.
<p>Publish Date: 2019-05-28
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-12380>CVE-2019-12380</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-12380">https://www.linuxkernelcves.com/cves/CVE-2019-12380</a></p>
<p>Release Date: 2020-08-03</p>
<p>Fix Resolution: v5.2-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2019-12380 (Medium) detected in linuxlinux-4.19.294, linuxlinux-4.19.294 - ## CVE-2019-12380 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxlinux-4.19.294</b>, <b>linuxlinux-4.19.294</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
**DISPUTED** An issue was discovered in the efi subsystem in the Linux kernel through 5.1.5. phys_efi_set_virtual_address_map in arch/x86/platform/efi/efi.c and efi_call_phys_prolog in arch/x86/platform/efi/efi_64.c mishandle memory allocation failures. NOTE: This id is disputed as not being an issue because “All the code touched by the referenced commit runs only at boot, before any user processes are started. Therefore, there is no possibility for an unprivileged user to control it.”.
<p>Publish Date: 2019-05-28
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-12380>CVE-2019-12380</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2019-12380">https://www.linuxkernelcves.com/cves/CVE-2019-12380</a></p>
<p>Release Date: 2020-08-03</p>
<p>Fix Resolution: v5.2-rc3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve medium detected in linuxlinux linuxlinux cve medium severity vulnerability vulnerable libraries linuxlinux linuxlinux vulnerability details disputed an issue was discovered in the efi subsystem in the linux kernel through phys efi set virtual address map in arch platform efi efi c and efi call phys prolog in arch platform efi efi c mishandle memory allocation failures note this id is disputed as not being an issue because “all the code touched by the referenced commit runs only at boot before any user processes are started therefore there is no possibility for an unprivileged user to control it ” publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend | 0 |
195,157 | 6,904,799,986 | IssuesEvent | 2017-11-27 02:26:26 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | www.facebook.com - video or audio doesn't play | browser-firefox priority-critical | <!-- @browser: Firefox 58.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; rv:58.0) Gecko/20100101 Firefox/58.0 -->
<!-- @reported_with: desktop-reporter -->
**URL**: https://www.facebook.com/SChassis/videos/vb.131028676974110/1553725754704388/?type=2&theater
**Browser / Version**: Firefox 58.0
**Operating System**: Windows 7
**Tested Another Browser**: Unknown
**Problem type**: Video or audio doesn't play
**Description**: wont play videos says I need flash player but have adobe flash player
**Steps to Reproduce**:
layout.css.servo.enabled: true
[](https://webcompat.com/uploads/2017/11/6cc03916-3134-4b83-9861-2bfca2b7849a.jpg)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | www.facebook.com - video or audio doesn't play - <!-- @browser: Firefox 58.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 6.1; rv:58.0) Gecko/20100101 Firefox/58.0 -->
<!-- @reported_with: desktop-reporter -->
**URL**: https://www.facebook.com/SChassis/videos/vb.131028676974110/1553725754704388/?type=2&theater
**Browser / Version**: Firefox 58.0
**Operating System**: Windows 7
**Tested Another Browser**: Unknown
**Problem type**: Video or audio doesn't play
**Description**: wont play videos says I need flash player but have adobe flash player
**Steps to Reproduce**:
layout.css.servo.enabled: true
[](https://webcompat.com/uploads/2017/11/6cc03916-3134-4b83-9861-2bfca2b7849a.jpg)
_From [webcompat.com](https://webcompat.com/) with ❤️_ | non_defect | video or audio doesn t play url browser version firefox operating system windows tested another browser unknown problem type video or audio doesn t play description wont play videos says i need flash player but have adobe flash player steps to reproduce layout css servo enabled true from with ❤️ | 0 |
24,192 | 3,923,186,588 | IssuesEvent | 2016-04-22 10:06:49 | googlei18n/libphonenumber | https://api.github.com/repos/googlei18n/libphonenumber | closed | Chile mobile number not accepted as valid number | priority-medium type-defect | Imported from [Google Code issue #562](https://code.google.com/p/libphonenumber/issues/detail?id=562) created by [RalphLenssen](https://code.google.com/u/115514840686149311424/) on 2014-11-25T13:08:11.000Z:
----
<b>What steps will reproduce the problem?</b>
1.Use the phone number "+56 8 998xxxx" which is a valid format for a chile mobile number according to http://www.wtng.info/wtng-56-cl.html
2. The phonenumber is rejected as not a valid number
<b>What is the expected output? What do you see instead?</b>
I expect it to be a valid number
<b>What version of the product are you using? On what operating system?</b>
Using version 7.0
| 1.0 | Chile mobile number not accepted as valid number - Imported from [Google Code issue #562](https://code.google.com/p/libphonenumber/issues/detail?id=562) created by [RalphLenssen](https://code.google.com/u/115514840686149311424/) on 2014-11-25T13:08:11.000Z:
----
<b>What steps will reproduce the problem?</b>
1.Use the phone number "+56 8 998xxxx" which is a valid format for a chile mobile number according to http://www.wtng.info/wtng-56-cl.html
2. The phonenumber is rejected as not a valid number
<b>What is the expected output? What do you see instead?</b>
I expect it to be a valid number
<b>What version of the product are you using? On what operating system?</b>
Using version 7.0
| defect | chile mobile number not accepted as valid number imported from created by on what steps will reproduce the problem use the phone number quot quot which is a valid format for a chile mobile number according to the phonenumber is rejected as not a valid number what is the expected output what do you see instead i expect it to be a valid number what version of the product are you using on what operating system using version | 1 |
48,734 | 12,236,761,758 | IssuesEvent | 2020-05-04 16:52:43 | tensorflow/tensorflow | https://api.github.com/repos/tensorflow/tensorflow | closed | ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found. | TF 2.0 stat:awaiting response subtype:windows type:build/install | **System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10 64-bit
- TensorFlow installed from (source or binary):
- TensorFlow version: 20.1
- Python version: Python 3.8 (64-bit)
- Installed using virtualenv? pip? conda?: pip
- GCC/Compiler version (if compiling from source): Jupyter
- CUDA/cuDNN version:
- GPU model and memory: Intel HD graphics 620 8239 MB
- CPU Intel(R) Core(TM) i5-7300U CPU @ 2.60GHz 2.71 GHz
Python 3.8.2 (tags/v3.8.2:7b3ab59, Feb 25 2020, 23:03:10) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
Traceback (most recent call last):
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 242, in load_module
return load_dynamic(name, filename, file)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 342, in load_dynamic
return _load(spec)
ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\__init__.py", line 41, in <module>
from tensorflow.python.tools import module_util as _module_util
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\__init__.py", line 50, in <module>
from tensorflow.python import pywrap_tensorflow
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 69, in <module>
raise ImportError(msg)
ImportError: Traceback (most recent call last):
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 242, in load_module
return load_dynamic(name, filename, file)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 342, in load_dynamic
return _load(spec)
ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found.
Failed to load the native TensorFlow runtime.
See https://www.tensorflow.org/install/errors
for some common reasons and solutions. Include the entire stack trace
above this error message when asking for help.
>>>
| 1.0 | ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found. - **System information**
- OS Platform and Distribution (e.g., Linux Ubuntu 16.04): Windows 10 64-bit
- TensorFlow installed from (source or binary):
- TensorFlow version: 20.1
- Python version: Python 3.8 (64-bit)
- Installed using virtualenv? pip? conda?: pip
- GCC/Compiler version (if compiling from source): Jupyter
- CUDA/cuDNN version:
- GPU model and memory: Intel HD graphics 620 8239 MB
- CPU Intel(R) Core(TM) i5-7300U CPU @ 2.60GHz 2.71 GHz
Python 3.8.2 (tags/v3.8.2:7b3ab59, Feb 25 2020, 23:03:10) [MSC v.1916 64 bit (AMD64)] on win32
Type "help", "copyright", "credits" or "license" for more information.
>>> import tensorflow as tf
Traceback (most recent call last):
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 242, in load_module
return load_dynamic(name, filename, file)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 342, in load_dynamic
return _load(spec)
ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found.
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\__init__.py", line 41, in <module>
from tensorflow.python.tools import module_util as _module_util
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\__init__.py", line 50, in <module>
from tensorflow.python import pywrap_tensorflow
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 69, in <module>
raise ImportError(msg)
ImportError: Traceback (most recent call last):
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow.py", line 58, in <module>
from tensorflow.python.pywrap_tensorflow_internal import *
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 28, in <module>
_pywrap_tensorflow_internal = swig_import_helper()
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\site-packages\tensorflow\python\pywrap_tensorflow_internal.py", line 24, in swig_import_helper
_mod = imp.load_module('_pywrap_tensorflow_internal', fp, pathname, description)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 242, in load_module
return load_dynamic(name, filename, file)
File "C:\Users\linj\AppData\Local\Programs\Python\Python38\lib\imp.py", line 342, in load_dynamic
return _load(spec)
ImportError: DLL load failed while importing _pywrap_tensorflow_internal: The specified module could not be found.
Failed to load the native TensorFlow runtime.
See https://www.tensorflow.org/install/errors
for some common reasons and solutions. Include the entire stack trace
above this error message when asking for help.
>>>
 | non_defect | importerror dll load failed while importing pywrap tensorflow internal the specified module could not be found system information os platform and distribution e g linux ubuntu windows bit tensorflow installed from source or binary tensorflow version python version python bit installed using virtualenv pip conda pip gcc compiler version if compiling from source jupyter cuda cudnn version gpu model and memory intel hd graphics mb cpu intel r core tm cpu ghz python tags feb on type help copyright credits or license for more information import tensorflow as tf traceback most recent call last file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow py line in from tensorflow python pywrap tensorflow internal import file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow internal py line in pywrap tensorflow internal swig import helper file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow internal py line in swig import helper mod imp load module pywrap tensorflow internal fp pathname description file c users linj appdata local programs python lib imp py line in load module return load dynamic name filename file file c users linj appdata local programs python lib imp py line in load dynamic return load spec importerror dll load failed while importing pywrap tensorflow internal the specified module could not be found during handling of the above exception another exception occurred traceback most recent call last file line in file c users linj appdata local programs python lib site packages tensorflow init py line in from tensorflow python tools import module util as module util file c users linj appdata local programs python lib site packages tensorflow python init py line in from tensorflow python import pywrap tensorflow file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow py line in raise importerror msg importerror traceback most recent call last file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow py line in from tensorflow python pywrap tensorflow internal import file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow internal py line in pywrap tensorflow internal swig import helper file c users linj appdata local programs python lib site packages tensorflow python pywrap tensorflow internal py line in swig import helper mod imp load module pywrap tensorflow internal fp pathname description file c users linj appdata local programs python lib imp py line in load module return load dynamic name filename file file c users linj appdata local programs python lib imp py line in load dynamic return load spec importerror dll load failed while importing pywrap tensorflow internal the specified module could not be found failed to load the native tensorflow runtime see for some common reasons and solutions include the entire stack trace above this error message when asking for help | 0
61,299 | 17,023,660,829 | IssuesEvent | 2021-07-03 03:09:38 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | search/... URLs to nominatim do not work correctly | Component: nominatim Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 8.16pm, Friday, 10th December 2010]**
Maybe URL rewriting is not configured right; from the examples at
http://wiki.openstreetmap.org/wiki/Nominatim#Examples
only the search?q=... one is working, the other ones do not return valid data. This breaks other tools that rely on the search/... format | 1.0 | search/... URLs to nominatim do not work correctly - **[Submitted to the original trac issue database at 8.16pm, Friday, 10th December 2010]**
Maybe URL rewriting is not configured right; from the examples at
http://wiki.openstreetmap.org/wiki/Nominatim#Examples
only the search?q=... one is working, the other ones do not return valid data. This breaks other tools that rely on the search/... format | defect | search urls to nominatim do not work correctly maybe url rewriting is not configured right from the examples at only the search q one is working the other ones do not return valid data this breaks other tools that rely on the search format | 1 |
24,426 | 3,975,697,162 | IssuesEvent | 2016-05-05 07:22:43 | primefaces/primeng | https://api.github.com/repos/primefaces/primeng | closed | Responsive Datatable seems to be broken | defect | It seems that the responsive Datatable don't work correct.
--> See attached screenshot

Tested on Chrome & Firefox.
Using primeng1.0.0-beta.4 and angular2.0.0-beta.17
| 1.0 | Responsive Datatable seems to be broken - It seems that the responsive Datatable don't work correct.
--> See attached screenshot

Tested on Chrome & Firefox.
Using primeng1.0.0-beta.4 and angular2.0.0-beta.17
| defect | responsive datatable seems to be broken it seems that the responsive datatable don t work correct see attached screenshot tested on chrome firefox using beta and beta | 1 |
54,446 | 13,689,096,853 | IssuesEvent | 2020-09-30 12:42:49 | numpy/numpy.org | https://api.github.com/repos/numpy/numpy.org | opened | Newer Hugo version introduced rendering issue | defect deployment | See gh-362: Hugo 0.74.3 worked fine, 0.75.1 broke the navbar. There's two things to do here:
- Figure out what is broken with the newer Hugo version and fix it.
- Better control the Hugo versions used for Netlify preview and for deployment, so we will not have unexpected breakage when the Hugo version gets bumped through some Linux package manager system update. | 1.0 | Newer Hugo version introduced rendering issue - See gh-362: Hugo 0.74.3 worked fine, 0.75.1 broke the navbar. There's two things to do here:
- Figure out what is broken with the newer Hugo version and fix it.
- Better control the Hugo versions used for Netlify preview and for deployment, so we will not have unexpected breakage when the Hugo version gets bumped through some Linux package manager system update. | defect | newer hugo version introduced rendering issue see gh hugo worked fine broke the navbar there s two things to do here figure out what is broken with the newer hugo version and fix it better control the hugo versions used for netlify preview and for deployment so we will not have unexpected breakage when the hugo version gets bumped through some linux package manager system update | 1 |
444,286 | 31,030,741,344 | IssuesEvent | 2023-08-10 12:18:58 | appsmithorg/appsmith-docs | https://api.github.com/repos/appsmithorg/appsmith-docs | closed | [Docs]: Rehaul - clearStore () | Documentation Doc Rehaul User Education Pod | ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Documentation Link
https://docs.appsmith.com/reference/appsmith-framework/widget-actions/store-value#clear-store
### Discord/slack/intercom Link
_No response_
### Describe the problem and improvement.
Create a new page for this function and update the content | 1.0 | [Docs]: Rehaul - clearStore () - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Documentation Link
https://docs.appsmith.com/reference/appsmith-framework/widget-actions/store-value#clear-store
### Discord/slack/intercom Link
_No response_
### Describe the problem and improvement.
Create a new page for this function and update the content | non_defect | rehaul clearstore is there an existing issue for this i have searched the existing issues documentation link discord slack intercom link no response describe the problem and improvement create a new page for this function and update the content | 0 |
78,254 | 15,569,949,786 | IssuesEvent | 2021-03-17 01:22:15 | jrrk/riscv-linux | https://api.github.com/repos/jrrk/riscv-linux | opened | CVE-2020-25212 (High) detected in linux-amlogicv4.18, aspeedaspeed-4.19-devicetree-no-fsi | security vulnerability | ## CVE-2020-25212 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linux-amlogicv4.18</b>, <b>aspeedaspeed-4.19-devicetree-no-fsi</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A TOCTOU mismatch in the NFS client code in the Linux kernel before 5.8.3 could be used by local attackers to corrupt memory or possibly have unspecified other impact because a size check is in fs/nfs/nfs4proc.c instead of fs/nfs/nfs4xdr.c, aka CID-b4487b935452.
<p>Publish Date: 2020-09-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-25212>CVE-2020-25212</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-25212">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-25212</a></p>
<p>Release Date: 2020-09-09</p>
<p>Fix Resolution: 5.8.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2020-25212 (High) detected in linux-amlogicv4.18, aspeedaspeed-4.19-devicetree-no-fsi - ## CVE-2020-25212 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linux-amlogicv4.18</b>, <b>aspeedaspeed-4.19-devicetree-no-fsi</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A TOCTOU mismatch in the NFS client code in the Linux kernel before 5.8.3 could be used by local attackers to corrupt memory or possibly have unspecified other impact because a size check is in fs/nfs/nfs4proc.c instead of fs/nfs/nfs4xdr.c, aka CID-b4487b935452.
<p>Publish Date: 2020-09-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-25212>CVE-2020-25212</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-25212">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-25212</a></p>
<p>Release Date: 2020-09-09</p>
<p>Fix Resolution: 5.8.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve high detected in linux aspeedaspeed devicetree no fsi cve high severity vulnerability vulnerable libraries linux aspeedaspeed devicetree no fsi vulnerability details a toctou mismatch in the nfs client code in the linux kernel before could be used by local attackers to corrupt memory or possibly have unspecified other impact because a size check is in fs nfs c instead of fs nfs c aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
11,110 | 8,293,299,564 | IssuesEvent | 2018-09-20 05:59:41 | oasp/oasp4j | https://api.github.com/repos/oasp/oasp4j | closed | Security, sample application: Data validation issues | sample security | There are some security issues with the provided sample application of OASP:
1. The data received from the client is always fully trusted.
2. The client gets always whole objects containing many attributes which are neither needed or displayed (which in combination with point 1 leads to an unnecessary big attack surface).
Some examples of what is possible:




Waiter and cook can do with the order actually everything. One can argue, that without a specification this kind of functionality is allowed, but still proper examples for developers could be helpful.
| True | Security, sample application: Data validation issues - There are some security issues with the provided sample application of OASP:
1. The data received from the client is always fully trusted.
2. The client gets always whole objects containing many attributes which are neither needed or displayed (which in combination with point 1 leads to an unnecessary big attack surface).
Some examples of what is possible:




Waiter and cook can do with the order actually everything. One can argue, that without a specification this kind of functionality is allowed, but still proper examples for developers could be helpful.
| non_defect | security sample application data validation issues there are some security issues with the provided sample application of oasp the data received from the client is always fully trusted the client gets always whole objects containing many attributes which are neither needed or displayed which in combination with point leads to an unnecessary big attack surface some examples of what is possible waiter and cook can do with the order actually everything one can argue that without a specification this kind of functionality is allowed but still proper examples for developers could be helpful | 0 |
173,282 | 13,395,216,823 | IssuesEvent | 2020-09-03 08:05:21 | elastic/beats | https://api.github.com/repos/elastic/beats | opened | [x-pack/metricbeat] ec2_integration_test | Metricbeat flaky-test | ## Flaky Test
* **Test Name:** TestFetch – ec2
* **Link:** https://github.com/elastic/beats/blob/8fce110040dd445b2cd3f5a3b95e67dddab31b0a/x-pack/metricbeat/module/aws/ec2/ec2_integration_test.go#L28
* **Branch:** master
* **Artifact Link:** [build logs](https://beats-ci.elastic.co/job/Beats/job/beats/job/master/312/artifact/x-pack/metricbeat/build/TEST-go-integration-aws.out/*view*/) [build logs backup](https://gist.github.com/v1v/fc6a0f11aed60b504fb338fcab0bece2) (if the first link got broke)
* **Notes:** @kaiyan-sheng dig deeper and found that there is no metric collected by ec2 metricset, which means there is no CloudWatch metrics for EC2 yet.
### Stack Trace
```
ec2_integration_test.go:28:
Error Trace: ec2_integration_test.go:28
Error: Should NOT be empty, but was []
Test: TestFetch
```
## How to reproduce it
- Go to [build](https://beats-ci.elastic.co/job/Beats/job/beats/job/master/build?delay=0sec)
- Select `awsCloudTests`
- Unselect `windowsTest`
- Click on `Build` | 1.0 | [x-pack/metricbeat] ec2_integration_test - ## Flaky Test
* **Test Name:** TestFetch – ec2
* **Link:** https://github.com/elastic/beats/blob/8fce110040dd445b2cd3f5a3b95e67dddab31b0a/x-pack/metricbeat/module/aws/ec2/ec2_integration_test.go#L28
* **Branch:** master
* **Artifact Link:** [build logs](https://beats-ci.elastic.co/job/Beats/job/beats/job/master/312/artifact/x-pack/metricbeat/build/TEST-go-integration-aws.out/*view*/) [build logs backup](https://gist.github.com/v1v/fc6a0f11aed60b504fb338fcab0bece2) (if the first link got broke)
* **Notes:** @kaiyan-sheng dig deeper and found that there is no metric collected by ec2 metricset, which means there is no CloudWatch metrics for EC2 yet.
### Stack Trace
```
ec2_integration_test.go:28:
Error Trace: ec2_integration_test.go:28
Error: Should NOT be empty, but was []
Test: TestFetch
```
## How to reproduce it
- Go to [build](https://beats-ci.elastic.co/job/Beats/job/beats/job/master/build?delay=0sec)
- Select `awsCloudTests`
- Unselect `windowsTest`
- Click on `Build` | non_defect | integration test flaky test test name testfetch – link branch master artifact link if the first link got broke notes kaiyan sheng dig deeper and found that there is no metric collected by metricset which means there is no cloudwatch metrics for yet stack trace integration test go error trace integration test go error should not be empty but was test testfetch how to reproduce it go to select awscloudtests unselect windowstest click on build | 0 |
94,218 | 3,922,991,087 | IssuesEvent | 2016-04-22 09:18:45 | sahana/SAMBRO | https://api.github.com/repos/sahana/SAMBRO | opened | Parameters in Info block | bug High Priority | Parameters are not saving in message templates.
The labels are not key and value ... laymen need to understand .... is should be name and value | 1.0 | Parameters in Info block - Parameters are not saving in message templates.
The labels are not key and value ... laymen need to understand .... is should be name and value | non_defect | parameters in info block parameters are not saving in message templates the labels are not key and value laymen need to understand is should be name and value | 0 |
44,191 | 12,034,270,904 | IssuesEvent | 2020-04-13 15:44:40 | BOINC/boinc | https://api.github.com/repos/BOINC/boinc | closed | When adding an account manager, using a project URL does not halt gracefully. | C: Manager P: Minor R: fixed T: Defect | **Reported by JacobKlein on 16 May 41247778 08:26 UTC**
Tools -> Add project or account manager... -> Use account manager -> Paste a project URL like http://isaac.ssl.berkeley.edu/alpha/ -> Next -> Agree to terms if applicable -> Identify account for the project -> Receive the following error:
Failed to add project[error has occured;[[BR]([BR]]An)]check the Event Log for details.[Finish to close.
Event log says:[[BR]([BR]]Click)]3/31/2011 !12:13:48 PM | | Fetching configuration file from !http://isaac.ssl.berkeley.edu/alpha/get_project_config.php [!12:13:56 PM | | Contacting account manager at !http://isaac.ssl.berkeley.edu/alpha/ [[BR]([BR]]3/31/2011)]3/31/2011 !12:13:57 PM | | error: file not found
Notices (on 6.12.19) also throws the error as a notice:[from BOINC[[BR]([BR]]Notice)]error: file not found
I propose this:
When the user is in "add account manager" mode, and first contact with the URL indicates that it is not capable of serving as an "account manager", the wizard should stop. It should not try to add it as a project. I should not have even gotten to terms of use page, or the Email address and password prompts.
Migrated-From: http://boinc.berkeley.edu/trac/ticket/1077
| 1.0 | When adding an account manager, using a project URL does not halt gracefully. - **Reported by JacobKlein on 16 May 41247778 08:26 UTC**
Tools -> Add project or account manager... -> Use account manager -> Paste a project URL like http://isaac.ssl.berkeley.edu/alpha/ -> Next -> Agree to terms if applicable -> Identify account for the project -> Receive the following error:
Failed to add project[error has occured;[[BR]([BR]]An)]check the Event Log for details.[Finish to close.
Event log says:[[BR]([BR]]Click)]3/31/2011 !12:13:48 PM | | Fetching configuration file from !http://isaac.ssl.berkeley.edu/alpha/get_project_config.php [!12:13:56 PM | | Contacting account manager at !http://isaac.ssl.berkeley.edu/alpha/ [[BR]([BR]]3/31/2011)]3/31/2011 !12:13:57 PM | | error: file not found
Notices (on 6.12.19) also throws the error as a notice:[from BOINC[[BR]([BR]]Notice)]error: file not found
I propose this:
When the user is in "add account manager" mode, and first contact with the URL indicates that it is not capable of serving as an "account manager", the wizard should stop. It should not try to add it as a project. I should not have even gotten to terms of use page, or the Email address and password prompts.
Migrated-From: http://boinc.berkeley.edu/trac/ticket/1077
| defect | when adding an account manager using a project url does not halt gracefully reported by jacobklein on may utc tools add project or account manager use account manager paste a project url like next agree to terms if applicable identify account for the project receive the following error failed to add project an check the event log for details finish to close event log says click pm fetching configuration file from pm error file not found notices on also throws the error as a notice notice error file not found i propose this when the user is in add account manager mode and first contact with the url indicates that it is not capable of serving as an account manager the wizard should stop it should not try to add it as a project i should not have even gotten to terms of use page or the email address and password prompts migrated from | 1 |
135,563 | 5,254,592,876 | IssuesEvent | 2017-02-02 13:24:46 | Alexey-Yakovenko/deadbeef | https://api.github.com/repos/Alexey-Yakovenko/deadbeef | closed | keyboard control for track properties (navigation between fields, e. g. TAB) | enhancement Priority-Medium | Original [issue 1117](https://code.google.com/p/ddb/issues/detail?id=1117) created by Alexey-Yakovenko on 2014-05-25T18:33:49.000Z:
(This is an ENHANCEMENT)
It would be a very nice thing to have if we could navigate between fields (Artist/Title/Comment...) using the TAB key. (Though a long-time wish of mine, I wanted to bring up more important things first like the previous vfs_zip extract bug etc.).
The reason?
Well, when we correct Artist and Title because it's misspelled, we would do this with the keyboard. Hence it is a little awkward if we always have to use the mouse to advance to the field below _and no alternative_ to use.
Thanks for considering (if so).
The following would be sufficient:
TAB - one field ahead
SHIFT-TAB - previous field (or first field, if old field (not Mike) was last)
| 1.0 | keyboard control for track properties (navigation between fields, e. g. TAB) - Original [issue 1117](https://code.google.com/p/ddb/issues/detail?id=1117) created by Alexey-Yakovenko on 2014-05-25T18:33:49.000Z:
(This is an ENHANCEMENT)
It would be a very nice thing to have if we could navigate between fields (Artist/Title/Comment...) using the TAB key. (Though a long-time wish of mine, I wanted to bring up more important things first like the previous vfs_zip extract bug etc.).
The reason?
Well, when we correct Artist and Title because it's misspelled, we would do this with the keyboard. Hence it is a little awkward if we always have to use the mouse to advance to the field below _and no alternative_ to use.
Thanks for considering (if so).
The following would be sufficient:
TAB - one field ahead
SHIFT-TAB - previous field (or first field, if old field (not Mike) was last)
| non_defect | keyboard control for track properties navigation between fields e g tab original created by alexey yakovenko on this is an enhancement it would be a very nice thing to have if we could navigate between fields artist title comment using the tab key though a long time wish of mine i wanted to bring up more important things first like the previous vfs zip extract bug etc the reason well when we correct artist and title because it s misspelled we would do this with the keyboard hence it is a little awkward if we always have to use the mouse to advance to the field below and no alternative to use thanks for considering if so the following would be sufficient tab one field ahead shift tab previous field or first field if old field not mike was last | 0 |