Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 5 112 | repo_url stringlengths 34 141 | action stringclasses 3 values | title stringlengths 1 757 | labels stringlengths 4 664 | body stringlengths 3 261k | index stringclasses 10 values | text_combine stringlengths 96 261k | label stringclasses 2 values | text stringlengths 96 232k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
8,477 | 22,621,865,834 | IssuesEvent | 2022-06-30 07:14:17 | kubernetes/enhancements | https://api.github.com/repos/kubernetes/enhancements | closed | Clarify or remove the "status is only observations" rule | sig/architecture lifecycle/rotten | # Enhancement Description
- One-line enhancement description: Clarify if/how controllers can use status to track non-observable state
- Kubernetes Enhancement Proposal: https://github.com/kubernetes/enhancements/tree/master/keps/sig-architecture/2527-clarify-status-observations-vs-rbac
- Discussion Link: https://groups.google.com/g/kubernetes-sig-architecture/c/d96tt2Mw69s
- Primary contact (assignee): thockin
- Responsible SIGs: sig-architecture
- Enhancement target (which target equals to which milestone): N/A
<!-- Uncomment these as you prepare the enhancement for the next stage
- Alpha release target (x.y):
- Beta release target (x.y):
- Stable release target (x.y):
- [ ] Alpha
- [ ] KEP (`k/enhancements`) update PR(s):
- [ ] Code (`k/k`) update PR(s):
- [ ] Docs (`k/website`) update PR(s):
- [ ] Beta
- [ ] KEP (`k/enhancements`) update PR(s):
- [ ] Code (`k/k`) update PR(s):
- [ ] Docs (`k/website`) update(s):
- [ ] Stable
- [ ] KEP (`k/enhancements`) update PR(s):
- [ ] Code (`k/k`) update PR(s):
- [ ] Docs (`k/website`) update(s):
-->
| 1.0 | Clarify or remove the "status is only observations" rule - # Enhancement Description
- One-line enhancement description: Clarify if/how controllers can use status to track non-observable state
- Kubernetes Enhancement Proposal: https://github.com/kubernetes/enhancements/tree/master/keps/sig-architecture/2527-clarify-status-observations-vs-rbac
- Discussion Link: https://groups.google.com/g/kubernetes-sig-architecture/c/d96tt2Mw69s
- Primary contact (assignee): thockin
- Responsible SIGs: sig-architecture
- Enhancement target (which target equals to which milestone): N/A
<!-- Uncomment these as you prepare the enhancement for the next stage
- Alpha release target (x.y):
- Beta release target (x.y):
- Stable release target (x.y):
- [ ] Alpha
- [ ] KEP (`k/enhancements`) update PR(s):
- [ ] Code (`k/k`) update PR(s):
- [ ] Docs (`k/website`) update PR(s):
- [ ] Beta
- [ ] KEP (`k/enhancements`) update PR(s):
- [ ] Code (`k/k`) update PR(s):
- [ ] Docs (`k/website`) update(s):
- [ ] Stable
- [ ] KEP (`k/enhancements`) update PR(s):
- [ ] Code (`k/k`) update PR(s):
- [ ] Docs (`k/website`) update(s):
-->
| non_defect | clarify or remove the status is only observations rule enhancement description one line enhancement description clarify if how controllers can use status to track non observable state kubernetes enhancement proposal discussion link primary contact assignee thockin responsible sigs sig architecture enhancement target which target equals to which milestone n a uncomment these as you prepare the enhancement for the next stage alpha release target x y beta release target x y stable release target x y alpha kep k enhancements update pr s code k k update pr s docs k website update pr s beta kep k enhancements update pr s code k k update pr s docs k website update s stable kep k enhancements update pr s code k k update pr s docs k website update s | 0 |
656,761 | 21,774,726,100 | IssuesEvent | 2022-05-13 12:45:35 | SeldonIO/alibi | https://api.github.com/repos/SeldonIO/alibi | opened | Similarity explanations on the introduction page | Type: Docs Priority: Medium | After 0.7.0 we should add a section in this page (and the corresponding notebook) to discuss similarity explanations. | 1.0 | Similarity explanations on the introduction page - After 0.7.0 we should add a section in this page (and the corresponding notebook) to discuss similarity explanations. | non_defect | similarity explanations on the introduction page after we should add a section in this page and the corresponding notebook to discuss similarity explanations | 0 |
408,985 | 27,717,556,524 | IssuesEvent | 2023-03-14 17:58:25 | GillianPlatform/Gillian | https://api.github.com/repos/GillianPlatform/Gillian | closed | Write quick style guidelines | documentation | I'm very happy with some of the changes on the last PR, and in particular, seeing `f_opt -> f` and `f -> f_exn`,
We should write short code guildelines (referencing other OCaml guidelines as much as possible).
And while we're in, throw in some contribution guidelines, which should be short "we welcome PRs, look at code guidelines, contact us if you need any help" | 1.0 | Write quick style guidelines - I'm very happy with some of the changes on the last PR, and in particular, seeing `f_opt -> f` and `f -> f_exn`,
We should write short code guildelines (referencing other OCaml guidelines as much as possible).
And while we're in, throw in some contribution guidelines, which should be short "we welcome PRs, look at code guidelines, contact us if you need any help" | non_defect | write quick style guidelines i m very happy with some of the changes on the last pr and in particular seeing f opt f and f f exn we should write short code guildelines referencing other ocaml guidelines as much as possible and while we re in throw in some contribution guidelines which should be short we welcome prs look at code guidelines contact us if you need any help | 0 |
287,358 | 8,809,503,487 | IssuesEvent | 2018-12-27 20:03:30 | AbigFUZZYbunny/SCU-8D | https://api.github.com/repos/AbigFUZZYbunny/SCU-8D | opened | Setup GPIO pins (STM32CUBEMX) | Priority - Low needed on hold | Need to define pins for Analog inputs (2) and Digital Interrupts (4: LTDC, I2C, etc...) | 1.0 | Setup GPIO pins (STM32CUBEMX) - Need to define pins for Analog inputs (2) and Digital Interrupts (4: LTDC, I2C, etc...) | non_defect | setup gpio pins need to define pins for analog inputs and digital interrupts ltdc etc | 0 |
3,414 | 2,610,062,237 | IssuesEvent | 2015-02-26 18:18:18 | chrsmith/jsjsj122 | https://api.github.com/repos/chrsmith/jsjsj122 | opened | 路桥不孕不育检查大概要多少钱 | auto-migrated Priority-Medium Type-Defect | ```
路桥不孕不育检查大概要多少钱【台州五洲生殖医院】24小时
健康咨询热线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地�
��:台州市椒江区枫南路229号(枫南大转盘旁)乘车线路:乘坐1
04、108、118、198及椒江一金清公交车直达枫南小区,乘坐107、
105、109、112、901、
902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 7:34 | 1.0 | 路桥不孕不育检查大概要多少钱 - ```
路桥不孕不育检查大概要多少钱【台州五洲生殖医院】24小时
健康咨询热线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地�
��:台州市椒江区枫南路229号(枫南大转盘旁)乘车线路:乘坐1
04、108、118、198及椒江一金清公交车直达枫南小区,乘坐107、
105、109、112、901、
902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 7:34 | defect | 路桥不孕不育检查大概要多少钱 路桥不孕不育检查大概要多少钱【台州五洲生殖医院】 健康咨询热线 微信号tzwzszyy 医院地� �� (枫南大转盘旁)乘车线路 、 、 、 , 、 、 、 、 、 ,步行即可到院。 诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,�� �精,无精。包皮包茎,精索静脉曲张,淋病等。 台州五洲生殖医院是台州最大的男科医院,权威专家在线免�� �咨询,拥有专业完善的男科检查治疗设备,严格按照国家标� ��收费。尖端医疗设备,与世界同步。权威专家,成就专业典 范。人性化服务,一切以患者为中心。 看男科就选台州五洲生殖医院,专业男科为男人。 original issue reported on code google com by poweragr gmail com on may at | 1 |
57,214 | 14,135,191,736 | IssuesEvent | 2020-11-10 01:05:33 | GooseWSS/BotBuilder-Samples | https://api.github.com/repos/GooseWSS/BotBuilder-Samples | opened | CVE-2020-7764 (Medium) detected in find-my-way-2.2.3.tgz, find-my-way-1.18.1.tgz | security vulnerability | ## CVE-2020-7764 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>find-my-way-2.2.3.tgz</b>, <b>find-my-way-1.18.1.tgz</b></p></summary>
<p>
<details><summary><b>find-my-way-2.2.3.tgz</b></p></summary>
<p>Crazy fast http radix based router</p>
<p>Library home page: <a href="https://registry.npmjs.org/find-my-way/-/find-my-way-2.2.3.tgz">https://registry.npmjs.org/find-my-way/-/find-my-way-2.2.3.tgz</a></p>
<p>Path to dependency file: BotBuilder-Samples/samples/javascript_nodejs/08.suggested-actions/package.json</p>
<p>Path to vulnerable library: BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json</p>
<p>
Dependency Hierarchy:
- restify-8.3.3.tgz (Root Library)
- :x: **find-my-way-2.2.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>find-my-way-1.18.1.tgz</b></p></summary>
<p>Crazy fast http radix based router</p>
<p>Library home page: <a href="https://registry.npmjs.org/find-my-way/-/find-my-way-1.18.1.tgz">https://registry.npmjs.org/find-my-way/-/find-my-way-1.18.1.tgz</a></p>
<p>Path to dependency file: BotBuilder-Samples/MigrationV3V4/Node/core-MultiDialogs-v4/package.json</p>
<p>Path to vulnerable library: BotBuilder-Samples/MigrationV3V4/Node/core-MultiDialogs-v4/node_modules/find-my-way/package.json</p>
<p>
Dependency Hierarchy:
- restify-7.7.0.tgz (Root Library)
- :x: **find-my-way-1.18.1.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package find-my-way before 2.2.5, from 3.0.0 and before 3.0.5. It accepts the Accept-Version' header by default, and if versioned routes are not being used, this could lead to a denial of service. Accept-Version can be used as an unkeyed header in a cache poisoning attack.
<p>Publish Date: 2020-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7764>CVE-2020-7764</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7764">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7764</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: v2.2.5,v3.0.5</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"find-my-way","packageVersion":"2.2.3","isTransitiveDependency":true,"dependencyTree":"restify:8.3.3;find-my-way:2.2.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v2.2.5,v3.0.5"},{"packageType":"javascript/Node.js","packageName":"find-my-way","packageVersion":"1.18.1","isTransitiveDependency":true,"dependencyTree":"restify:7.7.0;find-my-way:1.18.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v2.2.5,v3.0.5"}],"vulnerabilityIdentifier":"CVE-2020-7764","vulnerabilityDetails":"This affects the package find-my-way before 2.2.5, from 3.0.0 and before 3.0.5. It accepts the Accept-Version\u0027 header by default, and if versioned routes are not being used, this could lead to a denial of service. Accept-Version can be used as an unkeyed header in a cache poisoning attack.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7764","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-7764 (Medium) detected in find-my-way-2.2.3.tgz, find-my-way-1.18.1.tgz - ## CVE-2020-7764 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>find-my-way-2.2.3.tgz</b>, <b>find-my-way-1.18.1.tgz</b></p></summary>
<p>
<details><summary><b>find-my-way-2.2.3.tgz</b></p></summary>
<p>Crazy fast http radix based router</p>
<p>Library home page: <a href="https://registry.npmjs.org/find-my-way/-/find-my-way-2.2.3.tgz">https://registry.npmjs.org/find-my-way/-/find-my-way-2.2.3.tgz</a></p>
<p>Path to dependency file: BotBuilder-Samples/samples/javascript_nodejs/08.suggested-actions/package.json</p>
<p>Path to vulnerable library: BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json,BotBuilder-Samples/samples/typescript_nodejs/05.multi-turn-prompt/node_modules/find-my-way/package.json</p>
<p>
Dependency Hierarchy:
- restify-8.3.3.tgz (Root Library)
- :x: **find-my-way-2.2.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>find-my-way-1.18.1.tgz</b></p></summary>
<p>Crazy fast http radix based router</p>
<p>Library home page: <a href="https://registry.npmjs.org/find-my-way/-/find-my-way-1.18.1.tgz">https://registry.npmjs.org/find-my-way/-/find-my-way-1.18.1.tgz</a></p>
<p>Path to dependency file: BotBuilder-Samples/MigrationV3V4/Node/core-MultiDialogs-v4/package.json</p>
<p>Path to vulnerable library: BotBuilder-Samples/MigrationV3V4/Node/core-MultiDialogs-v4/node_modules/find-my-way/package.json</p>
<p>
Dependency Hierarchy:
- restify-7.7.0.tgz (Root Library)
- :x: **find-my-way-1.18.1.tgz** (Vulnerable Library)
</details>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package find-my-way before 2.2.5, from 3.0.0 and before 3.0.5. It accepts the Accept-Version' header by default, and if versioned routes are not being used, this could lead to a denial of service. Accept-Version can be used as an unkeyed header in a cache poisoning attack.
<p>Publish Date: 2020-07-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7764>CVE-2020-7764</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.9</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7764">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-7764</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: v2.2.5,v3.0.5</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"find-my-way","packageVersion":"2.2.3","isTransitiveDependency":true,"dependencyTree":"restify:8.3.3;find-my-way:2.2.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v2.2.5,v3.0.5"},{"packageType":"javascript/Node.js","packageName":"find-my-way","packageVersion":"1.18.1","isTransitiveDependency":true,"dependencyTree":"restify:7.7.0;find-my-way:1.18.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"v2.2.5,v3.0.5"}],"vulnerabilityIdentifier":"CVE-2020-7764","vulnerabilityDetails":"This affects the package find-my-way before 2.2.5, from 3.0.0 and before 3.0.5. It accepts the Accept-Version\u0027 header by default, and if versioned routes are not being used, this could lead to a denial of service. Accept-Version can be used as an unkeyed header in a cache poisoning attack.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7764","cvss3Severity":"medium","cvss3Score":"5.9","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_defect | cve medium detected in find my way tgz find my way tgz cve medium severity vulnerability vulnerable libraries find my way tgz find my way tgz find my way tgz crazy fast http radix based router library home page a href path to dependency file botbuilder samples samples javascript nodejs suggested actions package json path to vulnerable library botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json botbuilder samples samples typescript nodejs multi turn prompt node modules find my way package json dependency hierarchy restify tgz root library x find my way tgz vulnerable library find my way tgz crazy fast http radix based router library home page a href path to dependency file botbuilder samples node core multidialogs package json path to vulnerable library botbuilder samples node core multidialogs node modules find my way package json dependency hierarchy restify tgz root library x find my way tgz vulnerable library vulnerability details this affects the package find my way before from and before it accepts the accept version header by default and if versioned routes are not being used this could lead to a denial of service accept version can be used as an unkeyed header in a cache poisoning attack publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails this affects the package find my way before from and before it accepts the accept version header by default and if versioned routes are not being used this could lead to a denial of service accept version can be used as an unkeyed header in a cache poisoning attack vulnerabilityurl | 0 |
90,393 | 3,815,203,460 | IssuesEvent | 2016-03-28 16:56:08 | Lukadoss/History-web | https://api.github.com/repos/Lukadoss/History-web | closed | Validace správnosti prvků při registraci | bug priority: medium | Validace unikátního emailu - #27 - :heavy_check_mark:
Validace emailu - :x:
Validace kontroly hesla - :x: | 1.0 | Validace správnosti prvků při registraci - Validace unikátního emailu - #27 - :heavy_check_mark:
Validace emailu - :x:
Validace kontroly hesla - :x: | non_defect | validace správnosti prvků při registraci validace unikátního emailu heavy check mark validace emailu x validace kontroly hesla x | 0 |
471,938 | 13,613,382,363 | IssuesEvent | 2020-09-23 11:48:16 | ballerina-platform/ballerina-lang | https://api.github.com/repos/ballerina-platform/ballerina-lang | closed | Signature help not working for top level variable declaration | Area/LanguageServer Priority/High SwanLakeDump Team/Tooling Type/Bug | **Description:**
Consider the following source,
```
import ballerina/module1;
public listener module1:Listener lst = new module1:Listener(90);
module1:Client cl = new(<cursor>)
```
Signature help is not provided at the cursor. When we have the very same statement with in a function block, we can get the signature help.
| 1.0 | Signature help not working for top level variable declaration - **Description:**
Consider the following source,
```
import ballerina/module1;
public listener module1:Listener lst = new module1:Listener(90);
module1:Client cl = new(<cursor>)
```
Signature help is not provided at the cursor. When we have the very same statement with in a function block, we can get the signature help.
| non_defect | signature help not working for top level variable declaration description consider the following source import ballerina public listener listener lst new listener client cl new signature help is not provided at the cursor when we have the very same statement with in a function block we can get the signature help | 0 |
76,376 | 26,397,213,845 | IssuesEvent | 2023-01-12 20:37:31 | SeleniumHQ/selenium | https://api.github.com/repos/SeleniumHQ/selenium | opened | [🐛 Bug]: VSCode Code is unreachable | I-defect needs-triaging | ### What happened?
Using the webdriver.Chrome() makes the code below unreachable using VSCode and Pylance. There's [this ](11398) issue which was closed and the solution was to upgrade Selenium and VSCode but this problem seems to persist on selenium==4.7.2 and VSCode==1.74.3
Pylance == `v2023.1.10`
VSCode == `1.74.3`
Selenium == `4.7.2`
Python == `3.10.5`
### How can we reproduce the issue?
```shell
`from selenium import webdriver
webdriver.Chrome()
# Code is "unreachable" because of the previous line
a = 1
print(a)`
```
### Relevant log output
```shell
Code is unreachable (Pylance)
```
### Operating System
Windows 11
### Selenium version
4.7.2
### What are the browser(s) and version(s) where you see this issue?
Chrome 108.0.5359.125
### What are the browser driver(s) and version(s) where you see this issue?
Chromedriver 108
### Are you using Selenium Grid?
No | 1.0 | [🐛 Bug]: VSCode Code is unreachable - ### What happened?
Using the webdriver.Chrome() makes the code below unreachable using VSCode and Pylance. There's [this ](11398) issue which was closed and the solution was to upgrade Selenium and VSCode but this problem seems to persist on selenium==4.7.2 and VSCode==1.74.3
Pylance == `v2023.1.10`
VSCode == `1.74.3`
Selenium == `4.7.2`
Python == `3.10.5`
### How can we reproduce the issue?
```shell
`from selenium import webdriver
webdriver.Chrome()
# Code is "unreachable" because of the previous line
a = 1
print(a)`
```
### Relevant log output
```shell
Code is unreachable (Pylance)
```
### Operating System
Windows 11
### Selenium version
4.7.2
### What are the browser(s) and version(s) where you see this issue?
Chrome 108.0.5359.125
### What are the browser driver(s) and version(s) where you see this issue?
Chromedriver 108
### Are you using Selenium Grid?
No | defect | vscode code is unreachable what happened using the webdriver chrome makes the code below unreachable using vscode and pylance there s issue which was closed and the solution was to upgrade selenium and vscode but this problem seems to persist on selenium and vscode pylance vscode selenium python how can we reproduce the issue shell from selenium import webdriver webdriver chrome code is unreachable because of the previous line a print a relevant log output shell code is unreachable pylance operating system windows selenium version what are the browser s and version s where you see this issue chrome what are the browser driver s and version s where you see this issue chromedriver are you using selenium grid no | 1 |
136,853 | 18,751,491,454 | IssuesEvent | 2021-11-05 02:58:10 | Dima2022/Resiliency-Studio | https://api.github.com/repos/Dima2022/Resiliency-Studio | closed | CVE-2020-36189 (High) detected in jackson-databind-2.8.6.jar - autoclosed | security vulnerability | ## CVE-2020-36189 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: Resiliency-Studio/resiliency-studio-agent/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar</p>
<p>
Dependency Hierarchy:
- sdk-java-rest-6.2.0.4-oss.jar (Root Library)
- :x: **jackson-databind-2.8.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/Resiliency-Studio/commit/9809d9b7bfdc114eafb0a14d86667f3a76a014e8">9809d9b7bfdc114eafb0a14d86667f3a76a014e8</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189>CVE-2020-36189</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2996">https://github.com/FasterXML/jackson-databind/issues/2996</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.6","packageFilePaths":["/resiliency-studio-agent/pom.xml","/resiliency-studio-security/pom.xml","/resiliency-studio-service/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.att.ajsc:sdk-java-rest:6.2.0.4-oss;com.fasterxml.jackson.core:jackson-databind:2.8.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36189","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | True | CVE-2020-36189 (High) detected in jackson-databind-2.8.6.jar - autoclosed - ## CVE-2020-36189 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.8.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: Resiliency-Studio/resiliency-studio-agent/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.6/jackson-databind-2.8.6.jar</p>
<p>
Dependency Hierarchy:
- sdk-java-rest-6.2.0.4-oss.jar (Root Library)
- :x: **jackson-databind-2.8.6.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/Dima2022/Resiliency-Studio/commit/9809d9b7bfdc114eafb0a14d86667f3a76a014e8">9809d9b7bfdc114eafb0a14d86667f3a76a014e8</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource.
<p>Publish Date: 2021-01-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189>CVE-2020-36189</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/2996">https://github.com/FasterXML/jackson-databind/issues/2996</a></p>
<p>Release Date: 2021-01-06</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.6","packageFilePaths":["/resiliency-studio-agent/pom.xml","/resiliency-studio-security/pom.xml","/resiliency-studio-service/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"com.att.ajsc:sdk-java-rest:6.2.0.4-oss;com.fasterxml.jackson.core:jackson-databind:2.8.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"baseBranches":[],"vulnerabilityIdentifier":"CVE-2020-36189","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to com.newrelic.agent.deps.ch.qos.logback.core.db.DriverManagerConnectionSource.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36189","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> --> | non_defect | cve high detected in jackson databind jar autoclosed cve high severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file resiliency studio resiliency studio agent pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy sdk java rest oss jar root library x jackson databind jar vulnerable library found in head commit a href vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com newrelic agent deps ch qos logback core db drivermanagerconnectionsource publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree com att ajsc sdk java rest oss com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to com newrelic agent deps ch qos logback core db drivermanagerconnectionsource vulnerabilityurl | 0 |
17,554 | 3,012,746,989 | IssuesEvent | 2015-07-29 02:09:09 | yawlfoundation/yawl | https://api.github.com/repos/yawlfoundation/yawl | closed | [CLOSED] XML data type using a pattern throws a tomcat exception | auto-migrated Category-Engine Priority-Medium Type-Defect | <a href="https://github.com/GoogleCodeExporter"><img src="https://avatars.githubusercontent.com/u/9614759?v=3" align="left" width="96" height="96" hspace="10"></img></a> **Issue by [GoogleCodeExporter](https://github.com/GoogleCodeExporter)**
_Monday Jul 27, 2015 at 03:21 GMT_
_Originally opened as https://github.com/adamsmj/yawl/issues/101_
----
```
In the attached example, the data type better-us-zipcode (taken from
http://xml.coverpages.org/REC-xmlschema-2-20010502.html) leads to a
problem reported by tomcat (attached as well).
```
Original issue reported on code.google.com by `arthurte...@gmail.com` on 18 Aug 2008 at 4:19
Attachments:
* [new50.xml](https://storage.googleapis.com/google-code-attachments/yawl/issue-101/comment-0/new50.xml)
* [Pattern-XML](https://storage.googleapis.com/google-code-attachments/yawl/issue-101/comment-0/Pattern-XML)
| 1.0 | [CLOSED] XML data type using a pattern throws a tomcat exception - <a href="https://github.com/GoogleCodeExporter"><img src="https://avatars.githubusercontent.com/u/9614759?v=3" align="left" width="96" height="96" hspace="10"></img></a> **Issue by [GoogleCodeExporter](https://github.com/GoogleCodeExporter)**
_Monday Jul 27, 2015 at 03:21 GMT_
_Originally opened as https://github.com/adamsmj/yawl/issues/101_
----
```
In the attached example, the data type better-us-zipcode (taken from
http://xml.coverpages.org/REC-xmlschema-2-20010502.html) leads to a
problem reported by tomcat (attached as well).
```
Original issue reported on code.google.com by `arthurte...@gmail.com` on 18 Aug 2008 at 4:19
Attachments:
* [new50.xml](https://storage.googleapis.com/google-code-attachments/yawl/issue-101/comment-0/new50.xml)
* [Pattern-XML](https://storage.googleapis.com/google-code-attachments/yawl/issue-101/comment-0/Pattern-XML)
| defect | xml data type using a pattern throws a tomcat exception issue by monday jul at gmt originally opened as in the attached example the data type better us zipcode taken from leads to a problem reported by tomcat attached as well original issue reported on code google com by arthurte gmail com on aug at attachments | 1 |
53,299 | 13,261,372,819 | IssuesEvent | 2020-08-20 19:46:46 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | closed | truncated_energy needs Sphinx docs and maintainer (Trac #1147) | Migrated from Trac combo reconstruction defect | The truncated_project doesn't have any Sphinx (.rst) docs. The existing Doxygen docs are detailed and simply need to be converted to Sphinx docs so the nightly doc build will create docs online at http://software.icecube.wisc.edu/icerec_trunk/
Also, please add a line in the documents with maintainer name and email.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1147">https://code.icecube.wisc.edu/projects/icecube/ticket/1147</a>, reported by jtatarand owned by jtatar</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"_ts": "1550067117911749",
"description": "The truncated_project doesn't have any Sphinx (.rst) docs. The existing Doxygen docs are detailed and simply need to be converted to Sphinx docs so the nightly doc build will create docs online at http://software.icecube.wisc.edu/icerec_trunk/\n\nAlso, please add a line in the documents with maintainer name and email.",
"reporter": "jtatar",
"cc": "",
"resolution": "fixed",
"time": "2015-08-18T00:19:16",
"component": "combo reconstruction",
"summary": "truncated_energy needs Sphinx docs and maintainer",
"priority": "blocker",
"keywords": "",
"milestone": "",
"owner": "jtatar",
"type": "defect"
}
```
</p>
</details>
| 1.0 | truncated_energy needs Sphinx docs and maintainer (Trac #1147) - The truncated_project doesn't have any Sphinx (.rst) docs. The existing Doxygen docs are detailed and simply need to be converted to Sphinx docs so the nightly doc build will create docs online at http://software.icecube.wisc.edu/icerec_trunk/
Also, please add a line in the documents with maintainer name and email.
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1147">https://code.icecube.wisc.edu/projects/icecube/ticket/1147</a>, reported by jtatarand owned by jtatar</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"_ts": "1550067117911749",
"description": "The truncated_project doesn't have any Sphinx (.rst) docs. The existing Doxygen docs are detailed and simply need to be converted to Sphinx docs so the nightly doc build will create docs online at http://software.icecube.wisc.edu/icerec_trunk/\n\nAlso, please add a line in the documents with maintainer name and email.",
"reporter": "jtatar",
"cc": "",
"resolution": "fixed",
"time": "2015-08-18T00:19:16",
"component": "combo reconstruction",
"summary": "truncated_energy needs Sphinx docs and maintainer",
"priority": "blocker",
"keywords": "",
"milestone": "",
"owner": "jtatar",
"type": "defect"
}
```
</p>
</details>
| defect | truncated energy needs sphinx docs and maintainer trac the truncated project doesn t have any sphinx rst docs the existing doxygen docs are detailed and simply need to be converted to sphinx docs so the nightly doc build will create docs online at also please add a line in the documents with maintainer name and email migrated from json status closed changetime ts description the truncated project doesn t have any sphinx rst docs the existing doxygen docs are detailed and simply need to be converted to sphinx docs so the nightly doc build will create docs online at please add a line in the documents with maintainer name and email reporter jtatar cc resolution fixed time component combo reconstruction summary truncated energy needs sphinx docs and maintainer priority blocker keywords milestone owner jtatar type defect | 1 |
5,940 | 2,610,218,314 | IssuesEvent | 2015-02-26 19:09:26 | chrsmith/somefinders | https://api.github.com/repos/chrsmith/somefinders | opened | код для получения real статуса всексмире | auto-migrated Priority-Medium Type-Defect | ```
'''Арлен Попов'''
День добрый никак не могу найти .код для
получения real статуса всексмире. где то
видел уже
'''Винцент Петров'''
Вот хороший сайт где можно скачать
http://bit.ly/1aVfACL
'''Аристарх Пестов'''
Просит ввести номер мобилы!Не опасно ли это?
'''Гаральд Некрасов'''
Неа все ок у меня ничего не списало
'''Варфоломей Красильников'''
Не это не влияет на баланс
Информация о файле: код для получения real
статуса всексмире
Загружен: В этом месяце
Скачан раз: 689
Рейтинг: 531
Средняя скорость скачивания: 1306
Похожих файлов: 21
```
-----
Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 1:40 | 1.0 | код для получения real статуса всексмире - ```
'''Арлен Попов'''
День добрый никак не могу найти .код для
получения real статуса всексмире. где то
видел уже
'''Винцент Петров'''
Вот хороший сайт где можно скачать
http://bit.ly/1aVfACL
'''Аристарх Пестов'''
Просит ввести номер мобилы!Не опасно ли это?
'''Гаральд Некрасов'''
Неа все ок у меня ничего не списало
'''Варфоломей Красильников'''
Не это не влияет на баланс
Информация о файле: код для получения real
статуса всексмире
Загружен: В этом месяце
Скачан раз: 689
Рейтинг: 531
Средняя скорость скачивания: 1306
Похожих файлов: 21
```
-----
Original issue reported on code.google.com by `kondense...@gmail.com` on 17 Dec 2013 at 1:40 | defect | код для получения real статуса всексмире арлен попов день добрый никак не могу найти код для получения real статуса всексмире где то видел уже винцент петров вот хороший сайт где можно скачать аристарх пестов просит ввести номер мобилы не опасно ли это гаральд некрасов неа все ок у меня ничего не списало варфоломей красильников не это не влияет на баланс информация о файле код для получения real статуса всексмире загружен в этом месяце скачан раз рейтинг средняя скорость скачивания похожих файлов original issue reported on code google com by kondense gmail com on dec at | 1 |
571 | 2,572,142,978 | IssuesEvent | 2015-02-10 20:41:35 | NeuralEnsemble/PyNN | https://api.github.com/repos/NeuralEnsemble/PyNN | closed | PopulationView behavior with mpirun in PyNN-0.7.5 | defect NEST | In PyNN-0.7.5 the behaviour of PopulationView is not as intended (from my point of view) when used with mpirun, i.e. when used with mpirun the printed data (spikes or voltages) is the same for all PopulationViews - or the correct way to use it would be with print_v( .. gather=False) and then merging the different files in order to get the data from the different views.
In PyNN-0.8beta1 there seems to be a problem with write_data
Here is a script to reproduce the behavior:
https://gist.github.com/bernhardkaplan/7984438 | 1.0 | PopulationView behavior with mpirun in PyNN-0.7.5 - In PyNN-0.7.5 the behaviour of PopulationView is not as intended (from my point of view) when used with mpirun, i.e. when used with mpirun the printed data (spikes or voltages) is the same for all PopulationViews - or the correct way to use it would be with print_v( .. gather=False) and then merging the different files in order to get the data from the different views.
In PyNN-0.8beta1 there seems to be a problem with write_data
Here is a script to reproduce the behavior:
https://gist.github.com/bernhardkaplan/7984438 | defect | populationview behavior with mpirun in pynn in pynn the behaviour of populationview is not as intended from my point of view when used with mpirun i e when used with mpirun the printed data spikes or voltages is the same for all populationviews or the correct way to use it would be with print v gather false and then merging the different files in order to get the data from the different views in pynn there seems to be a problem with write data here is a script to reproduce the behavior | 1 |
35,496 | 9,605,241,918 | IssuesEvent | 2019-05-10 22:58:42 | orbeon/orbeon-forms | https://api.github.com/repos/orbeon/orbeon-forms | closed | Use Docker for MySQL on Travis | Area: Build | - Motivation
- Orbeon Forms requires MySQL 5.7
- But the Trusty distribution we are using comes with MySQL 5.6.
- Possible alternatives
- Install MySQL 5.7 locally on Trusty
- Switch to the Xenial environment which comes with 5.7
- Reasons for going with Docker for MySQL
- Consistency with what we are doing with other databases
- Reduce dependency on Travis-CI environment | 1.0 | Use Docker for MySQL on Travis - - Motivation
- Orbeon Forms requires MySQL 5.7
- But the Trusty distribution we are using comes with MySQL 5.6.
- Possible alternatives
- Install MySQL 5.7 locally on Trusty
- Switch to the Xenial environment which comes with 5.7
- Reasons for going with Docker for MySQL
- Consistency with what we are doing with other databases
- Reduce dependency on Travis-CI environment | non_defect | use docker for mysql on travis motivation orbeon forms requires mysql but the trusty distribution we are using comes with mysql possible alternatives install mysql locally on trusty switch to the xenial environment which comes with reasons for going with docker for mysql consistency with what we are doing with other databases reduce dependency on travis ci environment | 0 |
17,927 | 3,013,788,453 | IssuesEvent | 2015-07-29 11:14:12 | yawlfoundation/yawl | https://api.github.com/repos/yawlfoundation/yawl | closed | Labels of composite tasks detached | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. Choose an existing subnet for a new composite task
2. Enter a label
3. Pull the composite task somewhere else
What is the expected output? What do you see instead?
I expect the label to follow the task. Instead it doesn't - see screenshot.
Furthermore the label is not synchronised with the subnet name.
What version of the product are you using? On what operating system?
Editor 3.0 388
Please provide any additional information below.
```
Original issue reported on code.google.com by `andreas....@gmail.com` on 20 Dec 2013 at 6:17 | 1.0 | Labels of composite tasks detached - ```
What steps will reproduce the problem?
1. Choose an existing subnet for a new composite task
2. Enter a label
3. Pull the composite task somewhere else
What is the expected output? What do you see instead?
I expect the label to follow the task. Instead it doesn't - see screenshot.
Furthermore the label is not synchronised with the subnet name.
What version of the product are you using? On what operating system?
Editor 3.0 388
Please provide any additional information below.
```
Original issue reported on code.google.com by `andreas....@gmail.com` on 20 Dec 2013 at 6:17 | defect | labels of composite tasks detached what steps will reproduce the problem choose an existing subnet for a new composite task enter a label pull the composite task somewhere else what is the expected output what do you see instead i expect the label to follow the task instead it doesn t see screenshot furthermore the label is not synchronised with the subnet name what version of the product are you using on what operating system editor please provide any additional information below original issue reported on code google com by andreas gmail com on dec at | 1 |
12,389 | 2,694,264,570 | IssuesEvent | 2015-04-01 19:16:28 | google/google-api-go-client | https://api.github.com/repos/google/google-api-go-client | closed | Allow users to set append a user-agent string | fixed priority-medium type-defect |
**jbd@google.com** on 19 Sep 2014 at 9:16:
```
User agents are hardcoded as google-api-go-client/<version>, user should be
able to modify the user agent.
As a long term improvement, you should find a way to expose the underlying
Request object for each call.
```
| 1.0 | Allow users to set append a user-agent string -
**jbd@google.com** on 19 Sep 2014 at 9:16:
```
User agents are hardcoded as google-api-go-client/<version>, user should be
able to modify the user agent.
As a long term improvement, you should find a way to expose the underlying
Request object for each call.
```
| defect | allow users to set append a user agent string jbd google com on sep at user agents are hardcoded as google api go client user should be able to modify the user agent as a long term improvement you should find a way to expose the underlying request object for each call | 1 |
83,056 | 7,860,411,795 | IssuesEvent | 2018-06-21 19:53:25 | OpenLiberty/open-liberty | https://api.github.com/repos/OpenLiberty/open-liberty | opened | Test Failure (LAOS Continuous build - 20180620-2008): PolicyExecutorTest.testInvokeAnyTimedShutdownNowWhileEnqueued | team:Zombie Apocalypse test bug | Test Failure (LAOS Continuous build - 20180620-2008): com.ibm.ws.threading.policy.PolicyExecutorTest.testInvokeAnyTimedShutdownNowWhileEnqueued
testInvokeAnyTimedShutdownNowWhileEnqueued
junit.framework.AssertionFailedError: 2018-06-21-01:11:32:553 The response did not contain [SUCCESS]. Full output is:
ERROR: Caught exception attempting to call test method testInvokeAnyTimedShutdownNowWhileEnqueued on servlet web.PolicyExecutorServlet
java.util.concurrent.RejectedExecutionException: CWWKE1202E: A task cannot be submitted because the executor PolicyExecutorProvider-testInvokeAnyTimedShutdownNowWhileEnqueued has been shut down.
at com.ibm.ws.threading.internal.PolicyExecutorImpl.enqueue(PolicyExecutorImpl.java:489)
at com.ibm.ws.threading.internal.PolicyExecutorImpl.invokeAny(PolicyExecutorImpl.java:898)
at com.ibm.ws.threading.internal.PolicyExecutorImpl.invokeAny(PolicyExecutorImpl.java:786)
at web.PolicyExecutorServlet.testInvokeAnyTimedShutdownNowWhileEnqueued(PolicyExecutorServlet.java:3767)
at componenttest.app.FATServlet.doGet(FATServlet.java:71)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1255)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:743)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:440)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1208)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1005)
at com.ibm.ws.webcontainer.servlet.CacheServletWrapper.handleRequest(CacheServletWrapper.java:75)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:927)
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost$2.run(DynamicVirtualHost.java:279)
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink$TaskWrapper.run(HttpDispatcherLink.java:1011)
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink.wrapHandlerAndExecute(HttpDispatcherLink.java:414)
at com.ibm.ws.http
Occurred on the following Platforms:
LAOS Continuous build - 20180620-2008
Test case is not allowing for the possibility that the executor shuts down before the invokeAny method finishes enqueuing the second task. In this case, the task is rejected due to the shutdown rather than canceled from the queue due to the shutdown. | 1.0 | Test Failure (LAOS Continuous build - 20180620-2008): PolicyExecutorTest.testInvokeAnyTimedShutdownNowWhileEnqueued - Test Failure (LAOS Continuous build - 20180620-2008): com.ibm.ws.threading.policy.PolicyExecutorTest.testInvokeAnyTimedShutdownNowWhileEnqueued
testInvokeAnyTimedShutdownNowWhileEnqueued
junit.framework.AssertionFailedError: 2018-06-21-01:11:32:553 The response did not contain [SUCCESS]. Full output is:
ERROR: Caught exception attempting to call test method testInvokeAnyTimedShutdownNowWhileEnqueued on servlet web.PolicyExecutorServlet
java.util.concurrent.RejectedExecutionException: CWWKE1202E: A task cannot be submitted because the executor PolicyExecutorProvider-testInvokeAnyTimedShutdownNowWhileEnqueued has been shut down.
at com.ibm.ws.threading.internal.PolicyExecutorImpl.enqueue(PolicyExecutorImpl.java:489)
at com.ibm.ws.threading.internal.PolicyExecutorImpl.invokeAny(PolicyExecutorImpl.java:898)
at com.ibm.ws.threading.internal.PolicyExecutorImpl.invokeAny(PolicyExecutorImpl.java:786)
at web.PolicyExecutorServlet.testInvokeAnyTimedShutdownNowWhileEnqueued(PolicyExecutorServlet.java:3767)
at componenttest.app.FATServlet.doGet(FATServlet.java:71)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:687)
at javax.servlet.http.HttpServlet.service(HttpServlet.java:790)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.service(ServletWrapper.java:1255)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:743)
at com.ibm.ws.webcontainer.servlet.ServletWrapper.handleRequest(ServletWrapper.java:440)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1208)
at com.ibm.ws.webcontainer.filter.WebAppFilterManager.invokeFilters(WebAppFilterManager.java:1005)
at com.ibm.ws.webcontainer.servlet.CacheServletWrapper.handleRequest(CacheServletWrapper.java:75)
at com.ibm.ws.webcontainer.WebContainer.handleRequest(WebContainer.java:927)
at com.ibm.ws.webcontainer.osgi.DynamicVirtualHost$2.run(DynamicVirtualHost.java:279)
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink$TaskWrapper.run(HttpDispatcherLink.java:1011)
at com.ibm.ws.http.dispatcher.internal.channel.HttpDispatcherLink.wrapHandlerAndExecute(HttpDispatcherLink.java:414)
at com.ibm.ws.http
Occurred on the following Platforms:
LAOS Continuous build - 20180620-2008
Test case is not allowing for the possibility that the executor shuts down before the invokeAny method finishes enqueuing the second task. In this case, the task is rejected due to the shutdown rather than canceled from the queue due to the shutdown. | non_defect | test failure laos continuous build policyexecutortest testinvokeanytimedshutdownnowwhileenqueued test failure laos continuous build com ibm ws threading policy policyexecutortest testinvokeanytimedshutdownnowwhileenqueued testinvokeanytimedshutdownnowwhileenqueued junit framework assertionfailederror the response did not contain full output is error caught exception attempting to call test method testinvokeanytimedshutdownnowwhileenqueued on servlet web policyexecutorservlet java util concurrent rejectedexecutionexception a task cannot be submitted because the executor policyexecutorprovider testinvokeanytimedshutdownnowwhileenqueued has been shut down at com ibm ws threading internal policyexecutorimpl enqueue policyexecutorimpl java at com ibm ws threading internal policyexecutorimpl invokeany policyexecutorimpl java at com ibm ws threading internal policyexecutorimpl invokeany policyexecutorimpl java at web policyexecutorservlet testinvokeanytimedshutdownnowwhileenqueued policyexecutorservlet java at componenttest app fatservlet doget fatservlet java at javax servlet http httpservlet service httpservlet java at javax servlet http httpservlet service httpservlet java at com ibm ws webcontainer servlet servletwrapper service servletwrapper java at com ibm ws webcontainer servlet servletwrapper handlerequest servletwrapper java at com ibm ws webcontainer servlet servletwrapper handlerequest servletwrapper java at com ibm ws webcontainer filter webappfiltermanager invokefilters webappfiltermanager java at com ibm ws webcontainer filter webappfiltermanager invokefilters webappfiltermanager java at com ibm ws webcontainer servlet cacheservletwrapper handlerequest cacheservletwrapper java at com ibm ws webcontainer webcontainer handlerequest webcontainer java at com ibm ws webcontainer osgi dynamicvirtualhost run dynamicvirtualhost java at com ibm ws http dispatcher internal channel httpdispatcherlink taskwrapper run httpdispatcherlink java at com ibm ws http dispatcher internal channel httpdispatcherlink wraphandlerandexecute httpdispatcherlink java at com ibm ws http occurred on the following platforms laos continuous build test case is not allowing for the possibility that the executor shuts down before the invokeany method finishes enqueuing the second task in this case the task is rejected due to the shutdown rather than canceled from the queue due to the shutdown | 0 |
73,868 | 24,843,740,136 | IssuesEvent | 2022-10-26 14:28:21 | vishal-testgh20221021/testgh | https://api.github.com/repos/vishal-testgh20221021/testgh | opened | Test Failed - Check first name and last name - Step 1 | tp_with_cf defect |
h3. Test Case Details
*Test case title*: Check first name and last name
*Test plan title*: Test Plan 26 October 2022
*Custom fields*
comp: ,
cf-check1: Yes,
cf-date-1: 2022-10-27,
cf-dd-1: opt3,
cf-ms-1: ms-opt3,
cf-user-1: Elizabeth Martinez,
m_url: https://gitlab.com/gigapromoters/tc-api-specs/-/commit/202c744801362db6310760eb71bd75698f1cf8f7
*Steps*:
|S.No.|Step|Expected Result|Status|Comment|
|*1*|Enter special characters / numerals in first name field and submit form with all other details filled up correctlyhttps://www.google.com|An error message pertaining to incorrect first name should be shown and focus should be on first name fieldhttps://www.google.com|Fail|test|
|2|Enter special characters / numerals in last name field and submit form with all other details filled up correctly|An error message pertaining to incorrect last name should be shown and focus should be on last name field|Not executed| |
|3|Enter valid first name , last name and submit form with all other details filled up correctly s|The form should be submitted with no errors and user should be redirected to next step of signup|Not executed| |
|4|Velocity-over-quality mindset leads to software testing gapsInsufficient software testing happens because of a lack of talent, time and cash. But inattentive CEOs and development methodologies such as Agile also contribute to the problem.ByStephanie Glen, News WriterPublished: 05 Aug 2022Software testing addresses bugs and vulnerabilities before they affect users, but the harried race to the finish line -- cluttered with obstacles such as low budgets, incremental sprints and poor management decisions -- can stymie the deployment of quality code.|Get a property's value on the previously yielded subject.If you want to call a function on the previously yielded subject, use .invoke().Syntax.its(propertyName).its(propertyName, options)Usage Correct Usagecy.wrap({ width: '50' }).its('width') // Get the 'width' propertycy.window().its('sessionStorage') // Get the 'sessionStorage' property Incorrect Usage |Not executed| | | 1.0 | Test Failed - Check first name and last name - Step 1 -
h3. Test Case Details
*Test case title*: Check first name and last name
*Test plan title*: Test Plan 26 October 2022
*Custom fields*
comp: ,
cf-check1: Yes,
cf-date-1: 2022-10-27,
cf-dd-1: opt3,
cf-ms-1: ms-opt3,
cf-user-1: Elizabeth Martinez,
m_url: https://gitlab.com/gigapromoters/tc-api-specs/-/commit/202c744801362db6310760eb71bd75698f1cf8f7
*Steps*:
|S.No.|Step|Expected Result|Status|Comment|
|*1*|Enter special characters / numerals in first name field and submit form with all other details filled up correctlyhttps://www.google.com|An error message pertaining to incorrect first name should be shown and focus should be on first name fieldhttps://www.google.com|Fail|test|
|2|Enter special characters / numerals in last name field and submit form with all other details filled up correctly|An error message pertaining to incorrect last name should be shown and focus should be on last name field|Not executed| |
|3|Enter valid first name , last name and submit form with all other details filled up correctly s|The form should be submitted with no errors and user should be redirected to next step of signup|Not executed| |
|4|Velocity-over-quality mindset leads to software testing gapsInsufficient software testing happens because of a lack of talent, time and cash. But inattentive CEOs and development methodologies such as Agile also contribute to the problem.ByStephanie Glen, News WriterPublished: 05 Aug 2022Software testing addresses bugs and vulnerabilities before they affect users, but the harried race to the finish line -- cluttered with obstacles such as low budgets, incremental sprints and poor management decisions -- can stymie the deployment of quality code.|Get a property's value on the previously yielded subject.If you want to call a function on the previously yielded subject, use .invoke().Syntax.its(propertyName).its(propertyName, options)Usage Correct Usagecy.wrap({ width: '50' }).its('width') // Get the 'width' propertycy.window().its('sessionStorage') // Get the 'sessionStorage' property Incorrect Usage |Not executed| | | defect | test failed check first name and last name step test case details test case title check first name and last name test plan title test plan october custom fields comp cf yes cf date cf dd cf ms ms cf user elizabeth martinez m url steps s no step expected result status comment enter special characters numerals in first name field and submit form with all other details filled up correctly error message pertaining to incorrect first name should be shown and focus should be on first name field enter special characters numerals in last name field and submit form with all other details filled up correctly an error message pertaining to incorrect last name should be shown and focus should be on last name field not executed enter valid first name last name and submit form with all other details filled up correctly s the form should be submitted with no errors and user should be redirected to next step of signup not executed velocity over quality mindset leads to software testing gapsinsufficient software testing happens because of a lack of talent time and cash but inattentive ceos and development methodologies such as agile also contribute to the problem bystephanie glen nbsp news writerpublished nbsp aug testing addresses bugs and vulnerabilities before they affect users but the harried race to the finish line cluttered with obstacles such as low budgets incremental sprints and poor management decisions can stymie the deployment of quality code get a property s value on the previously yielded subject if you want to call a nbsp function nbsp on the previously yielded subject use nbsp invoke syntax its propertyname its propertyname options usage nbsp correct usagecy wrap width its width get the width propertycy window its sessionstorage get the sessionstorage property nbsp incorrect usage not executed | 1 |
11,534 | 30,833,227,849 | IssuesEvent | 2023-08-02 04:43:15 | Koniverse/SubWallet-Extension | https://api.github.com/repos/Koniverse/SubWallet-Extension | closed | Not showing staking record on account using different stash and controller account | enhancement extension architecture | Current staking feature does not show staking data in case the controller account is different from the stash account. More update later. Expected to be resolved after architecture update | 1.0 | Not showing staking record on account using different stash and controller account - Current staking feature does not show staking data in case the controller account is different from the stash account. More update later. Expected to be resolved after architecture update | non_defect | not showing staking record on account using different stash and controller account current staking feature does not show staking data in case the controller account is different from the stash account more update later expected to be resolved after architecture update | 0 |
831,884 | 32,064,252,649 | IssuesEvent | 2023-09-25 00:36:52 | RbAvci/My-Coursework-Planner | https://api.github.com/repos/RbAvci/My-Coursework-Planner | opened | [PD] Feeling, behaving and acting like a professional in the software industry | 🔑 Priority Key 🐂 Size Medium 📅 HTML-CSS 📅 Week 2 | From Module-HTML-CSS created by [kfklein15](https://github.com/kfklein15): CodeYourFuture/Module-HTML-CSS#44
### Coursework content
You are back to your Plan your Life as a Developer.
This plan is not something that you can finalise in a short period. You'll need to go back to it a few more times if you'd like to find an **honest description** of your current week and identify the necessary changes to it.
As a week will have passed since you did it, you can **compare** what you wrote with the reality of the week that passed.
**Reflections on your current plan.**
- How much energy did you have when you sat down to study and work on CYF projects?
- How tired or distracted were you?
- How many interruptions did you get?
**Other areas** to reflect:
- On your work (or other studies), did you work longer hours than what you planned? What happened?
- Were there any activities that you dedicated more time to it than what you expected?
- How is your sleep?
- Do you manage to feel rested in the morning?
- How do you start your day?
Reflecting on this, think about these **two topics**:
1. What changes you might need to bring to your life.
2. Define their short/medium/long-term goals.
Then:
- Add these two items to your existing Google Doc. _(Reminder: minimum 50 words each and reviewed with an automated grammar tool)_
- Share them with your pair.
- Discuss with them, so you can identify anything that is missing, if what you are planning is realistic, or if it is just right.
### Estimated time in hours
1.5
### What is the purpose of this assignment?
You are getting a deeper understanding of what blockers and distractions that hold you up. But now, you also have to start thinking about what can you do to change this situation and what goals can you start putting in place.
### How to submit
- Create a document with the following titles and add your reflections to it:
- Summary of my current situation
- My current plan
- What distractions do I have / My energy levels during the study
- Original plans I had after I finished the training
- Share your document with 1-2 people with similar situations or experiences
- Discuss your document with them to get some input
- Add the link to this document as a comment on this issue. Make sure it can be commented on by anyone.
| 1.0 | [PD] Feeling, behaving and acting like a professional in the software industry - From Module-HTML-CSS created by [kfklein15](https://github.com/kfklein15): CodeYourFuture/Module-HTML-CSS#44
### Coursework content
You are back to your Plan your Life as a Developer.
This plan is not something that you can finalise in a short period. You'll need to go back to it a few more times if you'd like to find an **honest description** of your current week and identify the necessary changes to it.
As a week will have passed since you did it, you can **compare** what you wrote with the reality of the week that passed.
**Reflections on your current plan.**
- How much energy did you have when you sat down to study and work on CYF projects?
- How tired or distracted were you?
- How many interruptions did you get?
**Other areas** to reflect:
- On your work (or other studies), did you work longer hours than what you planned? What happened?
- Were there any activities that you dedicated more time to it than what you expected?
- How is your sleep?
- Do you manage to feel rested in the morning?
- How do you start your day?
Reflecting on this, think about these **two topics**:
1. What changes you might need to bring to your life.
2. Define their short/medium/long-term goals.
Then:
- Add these two items to your existing Google Doc. _(Reminder: minimum 50 words each and reviewed with an automated grammar tool)_
- Share them with your pair.
- Discuss with them, so you can identify anything that is missing, if what you are planning is realistic, or if it is just right.
### Estimated time in hours
1.5
### What is the purpose of this assignment?
You are getting a deeper understanding of what blockers and distractions that hold you up. But now, you also have to start thinking about what can you do to change this situation and what goals can you start putting in place.
### How to submit
- Create a document with the following titles and add your reflections to it:
- Summary of my current situation
- My current plan
- What distractions do I have / My energy levels during the study
- Original plans I had after I finished the training
- Share your document with 1-2 people with similar situations or experiences
- Discuss your document with them to get some input
- Add the link to this document as a comment on this issue. Make sure it can be commented on by anyone.
| non_defect | feeling behaving and acting like a professional in the software industry from module html css created by codeyourfuture module html css coursework content you are back to your plan your life as a developer this plan is not something that you can finalise in a short period you ll need to go back to it a few more times if you d like to find an honest description of your current week and identify the necessary changes to it as a week will have passed since you did it you can compare what you wrote with the reality of the week that passed reflections on your current plan how much energy did you have when you sat down to study and work on cyf projects how tired or distracted were you how many interruptions did you get other areas to reflect on your work or other studies did you work longer hours than what you planned what happened were there any activities that you dedicated more time to it than what you expected how is your sleep do you manage to feel rested in the morning how do you start your day reflecting on this think about these two topics what changes you might need to bring to your life define their short medium long term goals then add these two items to your existing google doc reminder minimum words each and reviewed with an automated grammar tool share them with your pair discuss with them so you can identify anything that is missing if what you are planning is realistic or if it is just right estimated time in hours what is the purpose of this assignment you are getting a deeper understanding of what blockers and distractions that hold you up but now you also have to start thinking about what can you do to change this situation and what goals can you start putting in place how to submit create a document with the following titles and add your reflections to it summary of my current situation my current plan what distractions do i have my energy levels during the study original plans i had after i finished the training share your document with people with similar situations or experiences discuss your document with them to get some input add the link to this document as a comment on this issue make sure it can be commented on by anyone | 0 |
407,845 | 27,633,600,300 | IssuesEvent | 2023-03-10 12:50:24 | KinsonDigital/GotNuget | https://api.github.com/repos/KinsonDigital/GotNuget | opened | 🚧Update Readme | preview 📝documentation/product | ### Complete The Item Below
- [X] I have updated the title without removing the 🚧 emoji.
### Description
We are in the process of rationalizing ReadMe across all GitHub Actions (GHA). This includes:
Making sure all sections are in the same order (some repo's might not have the same sections due to different needs). The order should be:
- Logo
- Repo Name
- Badges
- What is it
- Quick Note
- Quick Example
- Action Input/Output
- Examples
- Contrib/Funding (after combined like in lib section)
- Maintainers
- License
### **Details of the changes to the sections are below:**
**Logo (CMW to create)**
> **Note** Calvin needs to create a logo from scratch
**Quick Note**
- Change to just Note and
- Remove warning emoji's and replace with pencil
**Quick Example**
- Add Quick Example for each GHA
- Make sure the "more examples below" link is on the right side.
**Action Input/Output**
- Add "Action Outputs" section, but do not add a table with the outputs in it.
**Examples**
- Change to say "Example"
- Change the wording in the Example section to say "Fails the job if the package is found"
**Maintainers**
- Remove Project Maintainers and other ancillary information
**License**
- Add section
**Other**
- Double checking spelling/grammar
- Making wording the same in similar sections
- Make sure "Back to the Top" link is on the right
### Acceptance Criteria
asdf
### ToDo Items
- [X] Change type labels added to this issue. Refer to the _**Change Type Labels**_ section below.
- [X] Priority label added to this issue. Refer to the _**Priority Type Labels**_ section below.
- [X] Issue linked to the correct project _(if applicable)_.
- [X] Issue linked to the correct milestone _(if applicable)_.
- [ ] Draft pull request created and linked to this issue _(only required with code changes)_.
### Issue Dependencies
_No response_
### Related Work
_No response_
### Additional Information:
**_<details closed><summary>Change Type Labels</summary>_**
| Change Type | Label |
|---------------------|----------------------|
| Bug Fixes | `🐛bug` |
| Breaking Changes | `🧨breaking changes` |
| New Feature | `✨new feature` |
| Workflow Changes | `workflow` |
| Code Doc Changes | `🗒️documentation/code` |
| Product Doc Changes | `📝documentation/product` |
</details>
**_<details closed><summary>Priority Type Labels</summary>_**
| Priority Type | Label |
|---------------------|-------------------|
| Low Priority | `low priority` |
| Medium Priority | `medium priority` |
| High Priority | `high priority` |
</details>
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct. | 1.0 | 🚧Update Readme - ### Complete The Item Below
- [X] I have updated the title without removing the 🚧 emoji.
### Description
We are in the process of rationalizing ReadMe across all GitHub Actions (GHA). This includes:
Making sure all sections are in the same order (some repo's might not have the same sections due to different needs). The order should be:
- Logo
- Repo Name
- Badges
- What is it
- Quick Note
- Quick Example
- Action Input/Output
- Examples
- Contrib/Funding (after combined like in lib section)
- Maintainers
- License
### **Details of the changes to the sections are below:**
**Logo (CMW to create)**
> **Note** Calvin needs to create a logo from scratch
**Quick Note**
- Change to just Note and
- Remove warning emoji's and replace with pencil
**Quick Example**
- Add Quick Example for each GHA
- Make sure the "more examples below" link is on the right side.
**Action Input/Output**
- Add "Action Outputs" section, but do not add a table with the outputs in it.
**Examples**
- Change to say "Example"
- Change the wording in the Example section to say "Fails the job if the package is found"
**Maintainers**
- Remove Project Maintainers and other ancillary information
**License**
- Add section
**Other**
- Double checking spelling/grammar
- Making wording the same in similar sections
- Make sure "Back to the Top" link is on the right
### Acceptance Criteria
asdf
### ToDo Items
- [X] Change type labels added to this issue. Refer to the _**Change Type Labels**_ section below.
- [X] Priority label added to this issue. Refer to the _**Priority Type Labels**_ section below.
- [X] Issue linked to the correct project _(if applicable)_.
- [X] Issue linked to the correct milestone _(if applicable)_.
- [ ] Draft pull request created and linked to this issue _(only required with code changes)_.
### Issue Dependencies
_No response_
### Related Work
_No response_
### Additional Information:
**_<details closed><summary>Change Type Labels</summary>_**
| Change Type | Label |
|---------------------|----------------------|
| Bug Fixes | `🐛bug` |
| Breaking Changes | `🧨breaking changes` |
| New Feature | `✨new feature` |
| Workflow Changes | `workflow` |
| Code Doc Changes | `🗒️documentation/code` |
| Product Doc Changes | `📝documentation/product` |
</details>
**_<details closed><summary>Priority Type Labels</summary>_**
| Priority Type | Label |
|---------------------|-------------------|
| Low Priority | `low priority` |
| Medium Priority | `medium priority` |
| High Priority | `high priority` |
</details>
### Code of Conduct
- [X] I agree to follow this project's Code of Conduct. | non_defect | 🚧update readme complete the item below i have updated the title without removing the 🚧 emoji description we are in the process of rationalizing readme across all github actions gha this includes making sure all sections are in the same order some repo s might not have the same sections due to different needs the order should be logo repo name badges what is it quick note quick example action input output examples contrib funding after combined like in lib section maintainers license details of the changes to the sections are below logo cmw to create note calvin needs to create a logo from scratch quick note change to just note and remove warning emoji s and replace with pencil quick example add quick example for each gha make sure the more examples below link is on the right side action input output add action outputs section but do not add a table with the outputs in it examples change to say example change the wording in the example section to say fails the job if the package is found maintainers remove project maintainers and other ancillary information license add section other double checking spelling grammar making wording the same in similar sections make sure back to the top link is on the right acceptance criteria asdf todo items change type labels added to this issue refer to the change type labels section below priority label added to this issue refer to the priority type labels section below issue linked to the correct project if applicable issue linked to the correct milestone if applicable draft pull request created and linked to this issue only required with code changes issue dependencies no response related work no response additional information change type labels change type label bug fixes 🐛bug breaking changes 🧨breaking changes new feature ✨new feature workflow changes workflow code doc changes 🗒️documentation code product doc changes 📝documentation product priority type labels priority type label low priority low priority medium priority medium priority high priority high priority code of conduct i agree to follow this project s code of conduct | 0 |
199,824 | 6,994,714,726 | IssuesEvent | 2017-12-15 16:20:46 | xcodeswift/sake | https://api.github.com/repos/xcodeswift/sake | opened | Make formula official | difficulty:easy good first issue priority:medium status:ready-development type:enhancement | ## Context 🕵️♀️
Add the formula to the official Homebrew tap.
## What 🌱
Propose the Sake formula to be part of the official Homebrew tap.
| 1.0 | Make formula official - ## Context 🕵️♀️
Add the formula to the official Homebrew tap.
## What 🌱
Propose the Sake formula to be part of the official Homebrew tap.
| non_defect | make formula official context 🕵️♀️ add the formula to the official homebrew tap what 🌱 propose the sake formula to be part of the official homebrew tap | 0 |
21,590 | 29,983,918,107 | IssuesEvent | 2023-06-25 02:00:07 | lizhihao6/get-daily-arxiv-noti | https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti | opened | New submissions for Fri, 23 Jun 23 | event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB | ## Keyword: events
### Exploring the Role of Audio in Video Captioning
- **Authors:** Yuhan Shen, Linjie Yang, Longyin Wen, Haichao Yu, Ehsan Elhamifar, Heng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Sound (cs.SD); Audio and Speech Processing (eess.AS)
- **Arxiv link:** https://arxiv.org/abs/2306.12559
- **Pdf link:** https://arxiv.org/pdf/2306.12559
- **Abstract**
Recent focus in video captioning has been on designing architectures that can consume both video and text modalities, and using large-scale video datasets with text transcripts for pre-training, such as HowTo100M. Though these approaches have achieved significant improvement, the audio modality is often ignored in video captioning. In this work, we present an audio-visual framework, which aims to fully exploit the potential of the audio modality for captioning. Instead of relying on text transcripts extracted via automatic speech recognition (ASR), we argue that learning with raw audio signals can be more beneficial, as audio has additional information including acoustic events, speaker identity, etc. Our contributions are twofold. First, we observed that the model overspecializes to the audio modality when pre-training with both video and audio modality, since the ground truth (i.e., text transcripts) can be solely predicted using audio. We proposed a Modality Balanced Pre-training (MBP) loss to mitigate this issue and significantly improve the performance on downstream tasks. Second, we slice and dice different design choices of the cross-modal module, which may become an information bottleneck and generate inferior results. We proposed new local-global fusion mechanisms to improve information exchange across audio and video. We demonstrate significant improvements by leveraging the audio modality on four datasets, and even outperform the state of the art on some metrics without relying on the text modality as the input.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### Data-Free Backbone Fine-Tuning for Pruned Neural Networks
- **Authors:** Adrian Holzbock, Achyut Hegde, Klaus Dietmayer, Vasileios Belagiannis
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.12881
- **Pdf link:** https://arxiv.org/pdf/2306.12881
- **Abstract**
Model compression techniques reduce the computational load and memory consumption of deep neural networks. After the compression operation, e.g. parameter pruning, the model is normally fine-tuned on the original training dataset to recover from the performance drop caused by compression. However, the training data is not always available due to privacy issues or other factors. In this work, we present a data-free fine-tuning approach for pruning the backbone of deep neural networks. In particular, the pruned network backbone is trained with synthetically generated images, and our proposed intermediate supervision to mimic the unpruned backbone's output feature map. Afterwards, the pruned backbone can be combined with the original network head to make predictions. We generate synthetic images by back-propagating gradients to noise images while relying on L1-pruning for the backbone pruning. In our experiments, we show that our approach is task-independent due to pruning only the backbone. By evaluating our approach on 2D human pose estimation, object detection, and image classification, we demonstrate promising performance compared to the unpruned model. Our code is available at https://github.com/holzbock/dfbf.
## Keyword: RAW
### Exploring the Role of Audio in Video Captioning
- **Authors:** Yuhan Shen, Linjie Yang, Longyin Wen, Haichao Yu, Ehsan Elhamifar, Heng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Sound (cs.SD); Audio and Speech Processing (eess.AS)
- **Arxiv link:** https://arxiv.org/abs/2306.12559
- **Pdf link:** https://arxiv.org/pdf/2306.12559
- **Abstract**
Recent focus in video captioning has been on designing architectures that can consume both video and text modalities, and using large-scale video datasets with text transcripts for pre-training, such as HowTo100M. Though these approaches have achieved significant improvement, the audio modality is often ignored in video captioning. In this work, we present an audio-visual framework, which aims to fully exploit the potential of the audio modality for captioning. Instead of relying on text transcripts extracted via automatic speech recognition (ASR), we argue that learning with raw audio signals can be more beneficial, as audio has additional information including acoustic events, speaker identity, etc. Our contributions are twofold. First, we observed that the model overspecializes to the audio modality when pre-training with both video and audio modality, since the ground truth (i.e., text transcripts) can be solely predicted using audio. We proposed a Modality Balanced Pre-training (MBP) loss to mitigate this issue and significantly improve the performance on downstream tasks. Second, we slice and dice different design choices of the cross-modal module, which may become an information bottleneck and generate inferior results. We proposed new local-global fusion mechanisms to improve information exchange across audio and video. We demonstrate significant improvements by leveraging the audio modality on four datasets, and even outperform the state of the art on some metrics without relying on the text modality as the input.
### Neural Spectro-polarimetric Fields
- **Authors:** Youngchan Kim, Wonjoon Jin, Sunghyun Cho, Seung-Hwan Baek
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2306.12562
- **Pdf link:** https://arxiv.org/pdf/2306.12562
- **Abstract**
Modeling the spatial radiance distribution of light rays in a scene has been extensively explored for applications, including view synthesis. Spectrum and polarization, the wave properties of light, are often neglected due to their integration into three RGB spectral bands and their non-perceptibility to human vision. Despite this, these properties encompass substantial material and geometric information about a scene. In this work, we propose to model spectro-polarimetric fields, the spatial Stokes-vector distribution of any light ray at an arbitrary wavelength. We present Neural Spectro-polarimetric Fields (NeSpoF), a neural representation that models the physically-valid Stokes vector at given continuous variables of position, direction, and wavelength. NeSpoF manages inherently noisy raw measurements, showcases memory efficiency, and preserves physically vital signals, factors that are crucial for representing the high-dimensional signal of a spectro-polarimetric field. To validate NeSpoF, we introduce the first multi-view hyperspectral-polarimetric image dataset, comprised of both synthetic and real-world scenes. These were captured using our compact hyperspectral-polarimetric imaging system, which has been calibrated for robustness against system imperfections. We demonstrate the capabilities of NeSpoF on diverse scenes.
### DreamEdit: Subject-driven Image Editing
- **Authors:** Tianle Li, Max Ku, Cong Wei, Wenhu Chen
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.12624
- **Pdf link:** https://arxiv.org/pdf/2306.12624
- **Abstract**
Subject-driven image generation aims at generating images containing customized subjects, which has recently drawn enormous attention from the research community. However, the previous works cannot precisely control the background and position of the target subject. In this work, we aspire to fill the void and propose two novel subject-driven sub-tasks, i.e., Subject Replacement and Subject Addition. The new tasks are challenging in multiple aspects: replacing a subject with a customized one can change its shape, texture, and color, while adding a target subject to a designated position in a provided scene necessitates a context-aware posture. To conquer these two novel tasks, we first manually curate a new dataset DreamEditBench containing 22 different types of subjects, and 440 source images with different difficulty levels. We plan to host DreamEditBench as a platform and hire trained evaluators for standard human evaluation. We also devise an innovative method DreamEditor to resolve these tasks by performing iterative generation, which enables a smooth adaptation to the customized subject. In this project, we conduct automatic and human evaluations to understand the performance of DreamEditor and baselines on DreamEditBench. For Subject Replacement, we found that the existing models are sensitive to the shape and color of the original subject. The model failure rate will dramatically increase when the source and target subjects are highly different. For Subject Addition, we found that the existing models cannot easily blend the customized subjects into the background smoothly, leading to noticeable artifacts in the generated image. We hope DreamEditBench can become a standard platform to enable future investigations toward building more controllable subject-driven image editing. Our project homepage is https://dreameditbenchteam.github.io/.
## Keyword: raw image
There is no result
| 2.0 | New submissions for Fri, 23 Jun 23 - ## Keyword: events
### Exploring the Role of Audio in Video Captioning
- **Authors:** Yuhan Shen, Linjie Yang, Longyin Wen, Haichao Yu, Ehsan Elhamifar, Heng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Sound (cs.SD); Audio and Speech Processing (eess.AS)
- **Arxiv link:** https://arxiv.org/abs/2306.12559
- **Pdf link:** https://arxiv.org/pdf/2306.12559
- **Abstract**
Recent focus in video captioning has been on designing architectures that can consume both video and text modalities, and using large-scale video datasets with text transcripts for pre-training, such as HowTo100M. Though these approaches have achieved significant improvement, the audio modality is often ignored in video captioning. In this work, we present an audio-visual framework, which aims to fully exploit the potential of the audio modality for captioning. Instead of relying on text transcripts extracted via automatic speech recognition (ASR), we argue that learning with raw audio signals can be more beneficial, as audio has additional information including acoustic events, speaker identity, etc. Our contributions are twofold. First, we observed that the model overspecializes to the audio modality when pre-training with both video and audio modality, since the ground truth (i.e., text transcripts) can be solely predicted using audio. We proposed a Modality Balanced Pre-training (MBP) loss to mitigate this issue and significantly improve the performance on downstream tasks. Second, we slice and dice different design choices of the cross-modal module, which may become an information bottleneck and generate inferior results. We proposed new local-global fusion mechanisms to improve information exchange across audio and video. We demonstrate significant improvements by leveraging the audio modality on four datasets, and even outperform the state of the art on some metrics without relying on the text modality as the input.
## Keyword: event camera
There is no result
## Keyword: events camera
There is no result
## Keyword: white balance
There is no result
## Keyword: color contrast
There is no result
## Keyword: AWB
There is no result
## Keyword: ISP
There is no result
## Keyword: image signal processing
There is no result
## Keyword: image signal process
There is no result
## Keyword: compression
### Data-Free Backbone Fine-Tuning for Pruned Neural Networks
- **Authors:** Adrian Holzbock, Achyut Hegde, Klaus Dietmayer, Vasileios Belagiannis
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.12881
- **Pdf link:** https://arxiv.org/pdf/2306.12881
- **Abstract**
Model compression techniques reduce the computational load and memory consumption of deep neural networks. After the compression operation, e.g. parameter pruning, the model is normally fine-tuned on the original training dataset to recover from the performance drop caused by compression. However, the training data is not always available due to privacy issues or other factors. In this work, we present a data-free fine-tuning approach for pruning the backbone of deep neural networks. In particular, the pruned network backbone is trained with synthetically generated images, and our proposed intermediate supervision to mimic the unpruned backbone's output feature map. Afterwards, the pruned backbone can be combined with the original network head to make predictions. We generate synthetic images by back-propagating gradients to noise images while relying on L1-pruning for the backbone pruning. In our experiments, we show that our approach is task-independent due to pruning only the backbone. By evaluating our approach on 2D human pose estimation, object detection, and image classification, we demonstrate promising performance compared to the unpruned model. Our code is available at https://github.com/holzbock/dfbf.
## Keyword: RAW
### Exploring the Role of Audio in Video Captioning
- **Authors:** Yuhan Shen, Linjie Yang, Longyin Wen, Haichao Yu, Ehsan Elhamifar, Heng Wang
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Sound (cs.SD); Audio and Speech Processing (eess.AS)
- **Arxiv link:** https://arxiv.org/abs/2306.12559
- **Pdf link:** https://arxiv.org/pdf/2306.12559
- **Abstract**
Recent focus in video captioning has been on designing architectures that can consume both video and text modalities, and using large-scale video datasets with text transcripts for pre-training, such as HowTo100M. Though these approaches have achieved significant improvement, the audio modality is often ignored in video captioning. In this work, we present an audio-visual framework, which aims to fully exploit the potential of the audio modality for captioning. Instead of relying on text transcripts extracted via automatic speech recognition (ASR), we argue that learning with raw audio signals can be more beneficial, as audio has additional information including acoustic events, speaker identity, etc. Our contributions are twofold. First, we observed that the model overspecializes to the audio modality when pre-training with both video and audio modality, since the ground truth (i.e., text transcripts) can be solely predicted using audio. We proposed a Modality Balanced Pre-training (MBP) loss to mitigate this issue and significantly improve the performance on downstream tasks. Second, we slice and dice different design choices of the cross-modal module, which may become an information bottleneck and generate inferior results. We proposed new local-global fusion mechanisms to improve information exchange across audio and video. We demonstrate significant improvements by leveraging the audio modality on four datasets, and even outperform the state of the art on some metrics without relying on the text modality as the input.
### Neural Spectro-polarimetric Fields
- **Authors:** Youngchan Kim, Wonjoon Jin, Sunghyun Cho, Seung-Hwan Baek
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV); Image and Video Processing (eess.IV)
- **Arxiv link:** https://arxiv.org/abs/2306.12562
- **Pdf link:** https://arxiv.org/pdf/2306.12562
- **Abstract**
Modeling the spatial radiance distribution of light rays in a scene has been extensively explored for applications, including view synthesis. Spectrum and polarization, the wave properties of light, are often neglected due to their integration into three RGB spectral bands and their non-perceptibility to human vision. Despite this, these properties encompass substantial material and geometric information about a scene. In this work, we propose to model spectro-polarimetric fields, the spatial Stokes-vector distribution of any light ray at an arbitrary wavelength. We present Neural Spectro-polarimetric Fields (NeSpoF), a neural representation that models the physically-valid Stokes vector at given continuous variables of position, direction, and wavelength. NeSpoF manages inherently noisy raw measurements, showcases memory efficiency, and preserves physically vital signals, factors that are crucial for representing the high-dimensional signal of a spectro-polarimetric field. To validate NeSpoF, we introduce the first multi-view hyperspectral-polarimetric image dataset, comprised of both synthetic and real-world scenes. These were captured using our compact hyperspectral-polarimetric imaging system, which has been calibrated for robustness against system imperfections. We demonstrate the capabilities of NeSpoF on diverse scenes.
### DreamEdit: Subject-driven Image Editing
- **Authors:** Tianle Li, Max Ku, Cong Wei, Wenhu Chen
- **Subjects:** Computer Vision and Pattern Recognition (cs.CV)
- **Arxiv link:** https://arxiv.org/abs/2306.12624
- **Pdf link:** https://arxiv.org/pdf/2306.12624
- **Abstract**
Subject-driven image generation aims at generating images containing customized subjects, which has recently drawn enormous attention from the research community. However, the previous works cannot precisely control the background and position of the target subject. In this work, we aspire to fill the void and propose two novel subject-driven sub-tasks, i.e., Subject Replacement and Subject Addition. The new tasks are challenging in multiple aspects: replacing a subject with a customized one can change its shape, texture, and color, while adding a target subject to a designated position in a provided scene necessitates a context-aware posture. To conquer these two novel tasks, we first manually curate a new dataset DreamEditBench containing 22 different types of subjects, and 440 source images with different difficulty levels. We plan to host DreamEditBench as a platform and hire trained evaluators for standard human evaluation. We also devise an innovative method DreamEditor to resolve these tasks by performing iterative generation, which enables a smooth adaptation to the customized subject. In this project, we conduct automatic and human evaluations to understand the performance of DreamEditor and baselines on DreamEditBench. For Subject Replacement, we found that the existing models are sensitive to the shape and color of the original subject. The model failure rate will dramatically increase when the source and target subjects are highly different. For Subject Addition, we found that the existing models cannot easily blend the customized subjects into the background smoothly, leading to noticeable artifacts in the generated image. We hope DreamEditBench can become a standard platform to enable future investigations toward building more controllable subject-driven image editing. Our project homepage is https://dreameditbenchteam.github.io/.
## Keyword: raw image
There is no result
| non_defect | new submissions for fri jun keyword events exploring the role of audio in video captioning authors yuhan shen linjie yang longyin wen haichao yu ehsan elhamifar heng wang subjects computer vision and pattern recognition cs cv sound cs sd audio and speech processing eess as arxiv link pdf link abstract recent focus in video captioning has been on designing architectures that can consume both video and text modalities and using large scale video datasets with text transcripts for pre training such as though these approaches have achieved significant improvement the audio modality is often ignored in video captioning in this work we present an audio visual framework which aims to fully exploit the potential of the audio modality for captioning instead of relying on text transcripts extracted via automatic speech recognition asr we argue that learning with raw audio signals can be more beneficial as audio has additional information including acoustic events speaker identity etc our contributions are twofold first we observed that the model overspecializes to the audio modality when pre training with both video and audio modality since the ground truth i e text transcripts can be solely predicted using audio we proposed a modality balanced pre training mbp loss to mitigate this issue and significantly improve the performance on downstream tasks second we slice and dice different design choices of the cross modal module which may become an information bottleneck and generate inferior results we proposed new local global fusion mechanisms to improve information exchange across audio and video we demonstrate significant improvements by leveraging the audio modality on four datasets and even outperform the state of the art on some metrics without relying on the text modality as the input keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb there is no result keyword isp there is no result keyword image signal processing there is no result keyword image signal process there is no result keyword compression data free backbone fine tuning for pruned neural networks authors adrian holzbock achyut hegde klaus dietmayer vasileios belagiannis subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract model compression techniques reduce the computational load and memory consumption of deep neural networks after the compression operation e g parameter pruning the model is normally fine tuned on the original training dataset to recover from the performance drop caused by compression however the training data is not always available due to privacy issues or other factors in this work we present a data free fine tuning approach for pruning the backbone of deep neural networks in particular the pruned network backbone is trained with synthetically generated images and our proposed intermediate supervision to mimic the unpruned backbone s output feature map afterwards the pruned backbone can be combined with the original network head to make predictions we generate synthetic images by back propagating gradients to noise images while relying on pruning for the backbone pruning in our experiments we show that our approach is task independent due to pruning only the backbone by evaluating our approach on human pose estimation object detection and image classification we demonstrate promising performance compared to the unpruned model our code is available at keyword raw exploring the role of audio in video captioning authors yuhan shen linjie yang longyin wen haichao yu ehsan elhamifar heng wang subjects computer vision and pattern recognition cs cv sound cs sd audio and speech processing eess as arxiv link pdf link abstract recent focus in video captioning has been on designing architectures that can consume both video and text modalities and using large scale video datasets with text transcripts for pre training such as though these approaches have achieved significant improvement the audio modality is often ignored in video captioning in this work we present an audio visual framework which aims to fully exploit the potential of the audio modality for captioning instead of relying on text transcripts extracted via automatic speech recognition asr we argue that learning with raw audio signals can be more beneficial as audio has additional information including acoustic events speaker identity etc our contributions are twofold first we observed that the model overspecializes to the audio modality when pre training with both video and audio modality since the ground truth i e text transcripts can be solely predicted using audio we proposed a modality balanced pre training mbp loss to mitigate this issue and significantly improve the performance on downstream tasks second we slice and dice different design choices of the cross modal module which may become an information bottleneck and generate inferior results we proposed new local global fusion mechanisms to improve information exchange across audio and video we demonstrate significant improvements by leveraging the audio modality on four datasets and even outperform the state of the art on some metrics without relying on the text modality as the input neural spectro polarimetric fields authors youngchan kim wonjoon jin sunghyun cho seung hwan baek subjects computer vision and pattern recognition cs cv image and video processing eess iv arxiv link pdf link abstract modeling the spatial radiance distribution of light rays in a scene has been extensively explored for applications including view synthesis spectrum and polarization the wave properties of light are often neglected due to their integration into three rgb spectral bands and their non perceptibility to human vision despite this these properties encompass substantial material and geometric information about a scene in this work we propose to model spectro polarimetric fields the spatial stokes vector distribution of any light ray at an arbitrary wavelength we present neural spectro polarimetric fields nespof a neural representation that models the physically valid stokes vector at given continuous variables of position direction and wavelength nespof manages inherently noisy raw measurements showcases memory efficiency and preserves physically vital signals factors that are crucial for representing the high dimensional signal of a spectro polarimetric field to validate nespof we introduce the first multi view hyperspectral polarimetric image dataset comprised of both synthetic and real world scenes these were captured using our compact hyperspectral polarimetric imaging system which has been calibrated for robustness against system imperfections we demonstrate the capabilities of nespof on diverse scenes dreamedit subject driven image editing authors tianle li max ku cong wei wenhu chen subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract subject driven image generation aims at generating images containing customized subjects which has recently drawn enormous attention from the research community however the previous works cannot precisely control the background and position of the target subject in this work we aspire to fill the void and propose two novel subject driven sub tasks i e subject replacement and subject addition the new tasks are challenging in multiple aspects replacing a subject with a customized one can change its shape texture and color while adding a target subject to a designated position in a provided scene necessitates a context aware posture to conquer these two novel tasks we first manually curate a new dataset dreameditbench containing different types of subjects and source images with different difficulty levels we plan to host dreameditbench as a platform and hire trained evaluators for standard human evaluation we also devise an innovative method dreameditor to resolve these tasks by performing iterative generation which enables a smooth adaptation to the customized subject in this project we conduct automatic and human evaluations to understand the performance of dreameditor and baselines on dreameditbench for subject replacement we found that the existing models are sensitive to the shape and color of the original subject the model failure rate will dramatically increase when the source and target subjects are highly different for subject addition we found that the existing models cannot easily blend the customized subjects into the background smoothly leading to noticeable artifacts in the generated image we hope dreameditbench can become a standard platform to enable future investigations toward building more controllable subject driven image editing our project homepage is keyword raw image there is no result | 0 |
496,120 | 14,332,999,427 | IssuesEvent | 2020-11-27 04:24:26 | buddyboss/buddyboss-platform | https://api.github.com/repos/buddyboss/buddyboss-platform | opened | Member/Profile types draft status | bug priority: medium | Describe the bug
If profile or group type/s are added in 'draft' status they are still visible and able for user to select from.
To Reproduce
Steps to reproduce the behavior:
Add group type
Change status to draft
Add/Edit group and try to add/change group type
See that draft type/s are still available as option
Expected behavior
Draft types should be not selectable
Screenshots
https://drive.google.com/file/d/1PfS7JMlktvkntLftYj7u_Bboa5s1KhQE/view
Support ticket links
na #https://github.com/buddyboss/buddyboss-theme/issues/1175 | 1.0 | Member/Profile types draft status - Describe the bug
If profile or group type/s are added in 'draft' status they are still visible and able for user to select from.
To Reproduce
Steps to reproduce the behavior:
Add group type
Change status to draft
Add/Edit group and try to add/change group type
See that draft type/s are still available as option
Expected behavior
Draft types should be not selectable
Screenshots
https://drive.google.com/file/d/1PfS7JMlktvkntLftYj7u_Bboa5s1KhQE/view
Support ticket links
na #https://github.com/buddyboss/buddyboss-theme/issues/1175 | non_defect | member profile types draft status describe the bug if profile or group type s are added in draft status they are still visible and able for user to select from to reproduce steps to reproduce the behavior add group type change status to draft add edit group and try to add change group type see that draft type s are still available as option expected behavior draft types should be not selectable screenshots support ticket links na | 0 |
664,777 | 22,287,955,559 | IssuesEvent | 2022-06-11 23:39:34 | ctm/mb2-doc | https://api.github.com/repos/ctm/mb2-doc | closed | pop-up chat box requires scrolling | bug high priority easy request | Fix pop-up chat to not require so much scrolling.
jrx:
> when i open the chat box i have to scroll a lot, you mentioned you fixed this one?
I may need to ask jrx for more info, but I should futz around first. | 1.0 | pop-up chat box requires scrolling - Fix pop-up chat to not require so much scrolling.
jrx:
> when i open the chat box i have to scroll a lot, you mentioned you fixed this one?
I may need to ask jrx for more info, but I should futz around first. | non_defect | pop up chat box requires scrolling fix pop up chat to not require so much scrolling jrx when i open the chat box i have to scroll a lot you mentioned you fixed this one i may need to ask jrx for more info but i should futz around first | 0 |
65,810 | 19,701,496,811 | IssuesEvent | 2022-01-12 17:03:20 | vector-im/element-android | https://api.github.com/repos/vector-im/element-android | opened | Failed to join a room from Explore Room | T-Defect Z-Community-Testing | ### Steps to reproduce
<img width="395" alt="image" src="https://user-images.githubusercontent.com/9841565/149186835-7f8ab7bb-1994-4f6c-b2aa-de1f0f165d3b.png">
### Outcome
#### What did you expect?
To join the room
#### What happened instead?
got an error
RS https://github.com/matrix-org/element-android-rageshakes/issues/31564
### Your phone model
_No response_
### Operating system version
_No response_
### Application version and app store
_No response_
### Homeserver
_No response_
### Will you send logs?
Yes | 1.0 | Failed to join a room from Explore Room - ### Steps to reproduce
<img width="395" alt="image" src="https://user-images.githubusercontent.com/9841565/149186835-7f8ab7bb-1994-4f6c-b2aa-de1f0f165d3b.png">
### Outcome
#### What did you expect?
To join the room
#### What happened instead?
got an error
RS https://github.com/matrix-org/element-android-rageshakes/issues/31564
### Your phone model
_No response_
### Operating system version
_No response_
### Application version and app store
_No response_
### Homeserver
_No response_
### Will you send logs?
Yes | defect | failed to join a room from explore room steps to reproduce img width alt image src outcome what did you expect to join the room what happened instead got an error rs your phone model no response operating system version no response application version and app store no response homeserver no response will you send logs yes | 1 |
65,936 | 19,808,134,310 | IssuesEvent | 2022-01-19 09:19:54 | vector-im/element-ios | https://api.github.com/repos/vector-im/element-ios | closed | Fix BuildSetting to show/hide the "Invite Friends" | T-Defect | "Invite Friends" used to be in settings and was moved to the side menu.
The old `BuildSetting.settingsScreenShowInviteFriends` no longer does anything.
Deleting the old build setting and adding `BuildSetting.sideMenuShowInviteFriends` to show/hide "Invite Friends" again. | 1.0 | Fix BuildSetting to show/hide the "Invite Friends" - "Invite Friends" used to be in settings and was moved to the side menu.
The old `BuildSetting.settingsScreenShowInviteFriends` no longer does anything.
Deleting the old build setting and adding `BuildSetting.sideMenuShowInviteFriends` to show/hide "Invite Friends" again. | defect | fix buildsetting to show hide the invite friends invite friends used to be in settings and was moved to the side menu the old buildsetting settingsscreenshowinvitefriends no longer does anything deleting the old build setting and adding buildsetting sidemenushowinvitefriends to show hide invite friends again | 1 |
47,973 | 7,369,320,090 | IssuesEvent | 2018-03-13 02:03:01 | PaddlePaddle/Paddle | https://api.github.com/repos/PaddlePaddle/Paddle | closed | English translation of Recurrent group guide | documentation | It seems like the book chapter on [Machine translation](http://paddlepaddle.org/docs/develop/book/08.machine_translation/index.html) depends on an article on a StaticInput document. The English version of this chapter links to this: https://github.com/PaddlePaddle/Paddle/blob/develop/doc/howto/deep_model/rnn/recurrent_group_cn.md. But this is in Chinese.
Is there an English version of this article? And if so, it would be nice to replace the link in the chapter. | 1.0 | English translation of Recurrent group guide - It seems like the book chapter on [Machine translation](http://paddlepaddle.org/docs/develop/book/08.machine_translation/index.html) depends on an article on a StaticInput document. The English version of this chapter links to this: https://github.com/PaddlePaddle/Paddle/blob/develop/doc/howto/deep_model/rnn/recurrent_group_cn.md. But this is in Chinese.
Is there an English version of this article? And if so, it would be nice to replace the link in the chapter. | non_defect | english translation of recurrent group guide it seems like the book chapter on depends on an article on a staticinput document the english version of this chapter links to this but this is in chinese is there an english version of this article and if so it would be nice to replace the link in the chapter | 0 |
762,529 | 26,721,903,776 | IssuesEvent | 2023-01-29 08:24:31 | jedmund/hensei-web | https://api.github.com/repos/jedmund/hensei-web | closed | Segmented control is having trouble switching segments | bug priority: high regression | Environment: Staging
Bug: Switching tabs really quickly, or even moderately quickly, makes the tab return to "Weapons" | 1.0 | Segmented control is having trouble switching segments - Environment: Staging
Bug: Switching tabs really quickly, or even moderately quickly, makes the tab return to "Weapons" | non_defect | segmented control is having trouble switching segments environment staging bug switching tabs really quickly or even moderately quickly makes the tab return to weapons | 0 |
412,036 | 12,034,520,359 | IssuesEvent | 2020-04-13 16:11:42 | godaddy-wordpress/coblocks | https://api.github.com/repos/godaddy-wordpress/coblocks | opened | ISBAT insert the Pricing Table block using the Block Inserter | [Priority] Low [Type] Bug | ### Describe the bug:
The block inserter is unexpectedly closed when user hovers the Pricing Table block.
This **does not happen** with core block editor but only with the Gutenberg plugin active. I have been reproducing this error consistently using latest CoBlocks and Gutenberg plugins straight from the plugin repo (CoBlocks 1.23.0, Gutenberg 7.8.1) on a fresh WP install.
No other CoBlocks block shows this behavior to my knowledge.
### To reproduce:
- open the block inserter (the very left top of the screen, "plus" icon)
- find Pricing Table in the list
- hover it
### Expected behavior:
See the example preview and be able to click the button to insert the block.
### Screenshots:

### Isolating the problem:
- [x] This bug happens with no other plugins activated
- [x] This bug happens with a default WordPress theme active
- [ ] This bug happens **without** the Gutenberg plugin active
- [x] I can reproduce this bug consistently using the steps above
### WordPress version:
5.4
### Gutenberg version:
7.8.1
### More investigation
There is no console error which is super strange and hard to debug without knowing this codebase.
There is sometimes a console warning which has been already documented in #799:
```
wp.blockEditor.RichText formattingControls prop is deprecated. Please use allowedFormats instead.
```
I have tried to check whether this is to root cause by removing all `formattingControls` usages. No effect so I assume it does not cause the error with the inserter.
After some more debugging I have found that this error is likely connected to the way Pricing Table generates its inner blocks during the block initialization. I haven't dug much deeper but I have an understanding that when this block sees no inner blocks, it generates and inserts `pricing-table-item` blocks into itself, based on the count attribute.
I don't know what is the root cause responsible but I've been able to make the inserter work again by providing an `example` of the pricing table block with inner blocks included. This probably skips the block-generating logic in the Pricing Block and just shows it.
```
example: {
attributes: {
count: 1,
},
innerBlocks: [
{
name: 'coblocks/pricing-table-item',
attributes: {
placeholder: sprintf( __( 'Plan %d', 'coblocks' ), 1 ),
},
},
],
},
```
Since this is not a proper fix of the root cause, I haven't done a PR but unless you can pinpoint the error, I think it might be good idea to get this in. The preview it generates matches the one that will get generated anyway and follows how core blocks define their examples (including inner blocks, for [`example in core/columns`](https://github.com/WordPress/gutenberg/blob/c3798f141028240312dc7e7d530ad82b38900151/packages/block-library/src/columns/index.js#L33-L89)). | 1.0 | ISBAT insert the Pricing Table block using the Block Inserter - ### Describe the bug:
The block inserter is unexpectedly closed when user hovers the Pricing Table block.
This **does not happen** with core block editor but only with the Gutenberg plugin active. I have been reproducing this error consistently using latest CoBlocks and Gutenberg plugins straight from the plugin repo (CoBlocks 1.23.0, Gutenberg 7.8.1) on a fresh WP install.
No other CoBlocks block shows this behavior to my knowledge.
### To reproduce:
- open the block inserter (the very left top of the screen, "plus" icon)
- find Pricing Table in the list
- hover it
### Expected behavior:
See the example preview and be able to click the button to insert the block.
### Screenshots:

### Isolating the problem:
- [x] This bug happens with no other plugins activated
- [x] This bug happens with a default WordPress theme active
- [ ] This bug happens **without** the Gutenberg plugin active
- [x] I can reproduce this bug consistently using the steps above
### WordPress version:
5.4
### Gutenberg version:
7.8.1
### More investigation
There is no console error which is super strange and hard to debug without knowing this codebase.
There is sometimes a console warning which has been already documented in #799:
```
wp.blockEditor.RichText formattingControls prop is deprecated. Please use allowedFormats instead.
```
I have tried to check whether this is to root cause by removing all `formattingControls` usages. No effect so I assume it does not cause the error with the inserter.
After some more debugging I have found that this error is likely connected to the way Pricing Table generates its inner blocks during the block initialization. I haven't dug much deeper but I have an understanding that when this block sees no inner blocks, it generates and inserts `pricing-table-item` blocks into itself, based on the count attribute.
I don't know what is the root cause responsible but I've been able to make the inserter work again by providing an `example` of the pricing table block with inner blocks included. This probably skips the block-generating logic in the Pricing Block and just shows it.
```
example: {
attributes: {
count: 1,
},
innerBlocks: [
{
name: 'coblocks/pricing-table-item',
attributes: {
placeholder: sprintf( __( 'Plan %d', 'coblocks' ), 1 ),
},
},
],
},
```
Since this is not a proper fix of the root cause, I haven't done a PR but unless you can pinpoint the error, I think it might be good idea to get this in. The preview it generates matches the one that will get generated anyway and follows how core blocks define their examples (including inner blocks, for [`example in core/columns`](https://github.com/WordPress/gutenberg/blob/c3798f141028240312dc7e7d530ad82b38900151/packages/block-library/src/columns/index.js#L33-L89)). | non_defect | isbat insert the pricing table block using the block inserter describe the bug the block inserter is unexpectedly closed when user hovers the pricing table block this does not happen with core block editor but only with the gutenberg plugin active i have been reproducing this error consistently using latest coblocks and gutenberg plugins straight from the plugin repo coblocks gutenberg on a fresh wp install no other coblocks block shows this behavior to my knowledge to reproduce open the block inserter the very left top of the screen plus icon find pricing table in the list hover it expected behavior see the example preview and be able to click the button to insert the block screenshots isolating the problem this bug happens with no other plugins activated this bug happens with a default wordpress theme active this bug happens without the gutenberg plugin active i can reproduce this bug consistently using the steps above wordpress version gutenberg version more investigation there is no console error which is super strange and hard to debug without knowing this codebase there is sometimes a console warning which has been already documented in wp blockeditor richtext formattingcontrols prop is deprecated please use allowedformats instead i have tried to check whether this is to root cause by removing all formattingcontrols usages no effect so i assume it does not cause the error with the inserter after some more debugging i have found that this error is likely connected to the way pricing table generates its inner blocks during the block initialization i haven t dug much deeper but i have an understanding that when this block sees no inner blocks it generates and inserts pricing table item blocks into itself based on the count attribute i don t know what is the root cause responsible but i ve been able to make the inserter work again by providing an example of the pricing table block with inner blocks included this probably skips the block generating logic in the pricing block and just shows it example attributes count innerblocks name coblocks pricing table item attributes placeholder sprintf plan d coblocks since this is not a proper fix of the root cause i haven t done a pr but unless you can pinpoint the error i think it might be good idea to get this in the preview it generates matches the one that will get generated anyway and follows how core blocks define their examples including inner blocks for | 0 |
3,851 | 2,610,070,121 | IssuesEvent | 2015-02-26 18:20:34 | chrsmith/jsjsj122 | https://api.github.com/repos/chrsmith/jsjsj122 | opened | 台州割包茎去哪里效果最好 | auto-migrated Priority-Medium Type-Defect | ```
台州割包茎去哪里效果最好【台州五洲生殖医院】24小时健康
咨询热线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地址:台
州市椒江区枫南路229号(枫南大转盘旁)乘车线路:乘坐104、1
08、118、198及椒江一金清公交车直达枫南小区,乘坐107、105、
109、112、901、 902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 11:54 | 1.0 | 台州割包茎去哪里效果最好 - ```
台州割包茎去哪里效果最好【台州五洲生殖医院】24小时健康
咨询热线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地址:台
州市椒江区枫南路229号(枫南大转盘旁)乘车线路:乘坐104、1
08、118、198及椒江一金清公交车直达枫南小区,乘坐107、105、
109、112、901、 902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 11:54 | defect | 台州割包茎去哪里效果最好 台州割包茎去哪里效果最好【台州五洲生殖医院】 咨询热线 微信号tzwzszyy 医院地址 台 (枫南大转盘旁)乘车线路 、 、 、 , 、 、 、 、 、 ,步行即可到院。 诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,�� �精,无精。包皮包茎,精索静脉曲张,淋病等。 台州五洲生殖医院是台州最大的男科医院,权威专家在线免�� �咨询,拥有专业完善的男科检查治疗设备,严格按照国家标� ��收费。尖端医疗设备,与世界同步。权威专家,成就专业典 范。人性化服务,一切以患者为中心。 看男科就选台州五洲生殖医院,专业男科为男人。 original issue reported on code google com by poweragr gmail com on may at | 1 |
70,215 | 23,052,874,639 | IssuesEvent | 2022-07-24 22:12:47 | SeleniumHQ/selenium | https://api.github.com/repos/SeleniumHQ/selenium | closed | [🐛 Bug]: Edge Version 103.0.1264.71 - IE Mode - Selenium IE Driver 4.3.0.0 - Not working | I-defect needs-triaging | ### What happened?
Edge IE - Mode failing with version - 103.0.1264.71 for both 32-bit and 64-bit
Started InternetExplorerDriver server (64-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
Started InternetExplorerDriver server (32-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
### How can we reproduce the issue?
```shell
System.setProperty("webdriver.ie.driver", System.getProperty("user.dir") + "/drivers/IEDriverServer.exe");
InternetExplorerOptions ieOptions = new InternetExplorerOptions();
ieOptions.ignoreZoomSettings();
ieOptions.setCapability("ignoreProtectedModeSettings", true);
ieOptions.enablePersistentHovering();
ieOptions.requireWindowFocus();
ieOptions.attachToEdgeChrome();
ieOptions.withEdgeExecutablePath("C:\\Program Files (x86)\\Microsoft\\Edge\\Application\\msedge.exe");
```
### Relevant log output
```shell
Started InternetExplorerDriver server (32-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
Started InternetExplorerDriver server (64-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
```
### Operating System
Windows 10
### Selenium version
4.0.3
### What are the browser(s) and version(s) where you see this issue?
Edge 103.0.1264.71
### What are the browser driver(s) and version(s) where you see this issue?
IE Driver 4.3.0.0
### Are you using Selenium Grid?
No | 1.0 | [🐛 Bug]: Edge Version 103.0.1264.71 - IE Mode - Selenium IE Driver 4.3.0.0 - Not working - ### What happened?
Edge IE - Mode failing with version - 103.0.1264.71 for both 32-bit and 64-bit
Started InternetExplorerDriver server (64-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
Started InternetExplorerDriver server (32-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
### How can we reproduce the issue?
```shell
System.setProperty("webdriver.ie.driver", System.getProperty("user.dir") + "/drivers/IEDriverServer.exe");
InternetExplorerOptions ieOptions = new InternetExplorerOptions();
ieOptions.ignoreZoomSettings();
ieOptions.setCapability("ignoreProtectedModeSettings", true);
ieOptions.enablePersistentHovering();
ieOptions.requireWindowFocus();
ieOptions.attachToEdgeChrome();
ieOptions.withEdgeExecutablePath("C:\\Program Files (x86)\\Microsoft\\Edge\\Application\\msedge.exe");
```
### Relevant log output
```shell
Started InternetExplorerDriver server (32-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
Started InternetExplorerDriver server (64-bit)
4.3.0.0
Listening on port 56120
Only local connections are allowed
org.openqa.selenium.SessionNotCreatedException: Could not start a new session. Possible causes are invalid address of the remote server or browser start-up failure.
Build info: version: '4.3.0', revision: 'a4995e2c09*'
```
### Operating System
Windows 10
### Selenium version
4.0.3
### What are the browser(s) and version(s) where you see this issue?
Edge 103.0.1264.71
### What are the browser driver(s) and version(s) where you see this issue?
IE Driver 4.3.0.0
### Are you using Selenium Grid?
No | defect | edge version ie mode selenium ie driver not working what happened edge ie mode failing with version for both bit and bit started internetexplorerdriver server bit listening on port only local connections are allowed org openqa selenium sessionnotcreatedexception could not start a new session possible causes are invalid address of the remote server or browser start up failure build info version revision started internetexplorerdriver server bit listening on port only local connections are allowed org openqa selenium sessionnotcreatedexception could not start a new session possible causes are invalid address of the remote server or browser start up failure build info version revision how can we reproduce the issue shell system setproperty webdriver ie driver system getproperty user dir drivers iedriverserver exe internetexploreroptions ieoptions new internetexploreroptions ieoptions ignorezoomsettings ieoptions setcapability ignoreprotectedmodesettings true ieoptions enablepersistenthovering ieoptions requirewindowfocus ieoptions attachtoedgechrome ieoptions withedgeexecutablepath c program files microsoft edge application msedge exe relevant log output shell started internetexplorerdriver server bit listening on port only local connections are allowed org openqa selenium sessionnotcreatedexception could not start a new session possible causes are invalid address of the remote server or browser start up failure build info version revision started internetexplorerdriver server bit listening on port only local connections are allowed org openqa selenium sessionnotcreatedexception could not start a new session possible causes are invalid address of the remote server or browser start up failure build info version revision operating system windows selenium version what are the browser s and version s where you see this issue edge what are the browser driver s and version s where you see this issue ie driver are you using selenium grid no | 1 |
80,083 | 29,999,172,726 | IssuesEvent | 2023-06-26 08:08:39 | hyperledger/iroha | https://api.github.com/repos/hyperledger/iroha | closed | [BUG] Iroha doesn't panic and shut down if the key pairs for `IROHA_GENESIS_ACCOUNT` are completely different for each peer. | Bug iroha2 Dev defect QA-confirmed | ### OS and Environment
MacOS, Docker Hub
### GIT commit hash
44ec1a11
### Minimum working example / Steps to reproduce
1. Run a docker-compose with different `IROHA_GENESIS_ACCOUNT` key pairs for each peer
<details>
<summary>docker-compose.yml</summary>
```yaml
version: "3.8"
services:
iroha:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha:1337
TORII_API_URL: iroha:8080
TORII_TELEMETRY_URL: iroha:8180
IROHA_PUBLIC_KEY: "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "282ed9f3cf92811c3818dbc4ae594ed59dc1a2f78e4241e31924e101d6b1fb831c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b'
IROHA_GENESIS_ACCOUNT_PRIVATE_KEY: '{ "digest_function": "ed25519", "payload": "282ed9f3cf92811c3818dbc4ae594ed59dc1a2f78e4241e31924e101d6b1fb831c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b" }'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8080:8080"
- "8180:8180"
volumes:
- './configs/peer:/config'
command: iroha --submit-genesis
iroha1:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha1:1338
TORII_API_URL: iroha1:8081
TORII_TELEMETRY_URL: iroha1:8181
IROHA_PUBLIC_KEY: "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "3bac34cda9e3763fa069c1198312d1ec73b53023b8180c822ac355435edc4a24cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8081:8080"
- "8181:8180"
volumes:
- './configs/peer:/config'
iroha2:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha2:1339
TORII_API_URL: iroha2:8082
TORII_TELEMETRY_URL: iroha2:8182
IROHA_PUBLIC_KEY: "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "1261a436d36779223d7d6cf20e8b644510e488e6a50bafd77a7485264d27197dfaca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8082:8080"
- "8182:8180"
volumes:
- './configs/peer:/config'
iroha3:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha3:1340
TORII_API_URL: iroha3:8083
TORII_TELEMETRY_URL: iroha3:8183
IROHA_PUBLIC_KEY: "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "a70dab95c7482eb9f159111b65947e482108cfe67df877bd8d3b9441a781c7c98e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8083:8080"
- "8183:8180"
volumes:
- './configs/peer:/config'
```
</details>
### Actual result
Error and continue working.
```
iroha-iroha1-1 | 2023-02-20T12:29:44.555645Z ERROR run: iroha_core::sumeragi::main_loop: error=Error during transaction revalidation
iroha-iroha1-1 |
iroha-iroha1-1 | Caused by:
iroha-iroha1-1 | 0: Failed to validate transaction
iroha-iroha1-1 | 1: Failed to verify signature condition specified in the account: Signature condition not satisfied.
iroha-iroha1-1 | 2: Failed to verify signature condition specified in the account: Signature condition not satisfied.
iroha-iroha1-1 |
iroha-iroha1-1 | Location:
iroha-iroha1-1 | core/src/block.rs:633:26
```
### Expected result
Panic and shut down
### Who can help to reproduce?
@astrokov7
### Notes
If your original leader gives you the incorrect genesis, you can't continue. The network needs to restart with a different topology. If you're bootstrapping it's a different situation, though. | 1.0 | [BUG] Iroha doesn't panic and shut down if the key pairs for `IROHA_GENESIS_ACCOUNT` are completely different for each peer. - ### OS and Environment
MacOS, Docker Hub
### GIT commit hash
44ec1a11
### Minimum working example / Steps to reproduce
1. Run a docker-compose with different `IROHA_GENESIS_ACCOUNT` key pairs for each peer
<details>
<summary>docker-compose.yml</summary>
```yaml
version: "3.8"
services:
iroha:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha:1337
TORII_API_URL: iroha:8080
TORII_TELEMETRY_URL: iroha:8180
IROHA_PUBLIC_KEY: "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "282ed9f3cf92811c3818dbc4ae594ed59dc1a2f78e4241e31924e101d6b1fb831c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b'
IROHA_GENESIS_ACCOUNT_PRIVATE_KEY: '{ "digest_function": "ed25519", "payload": "282ed9f3cf92811c3818dbc4ae594ed59dc1a2f78e4241e31924e101d6b1fb831c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b" }'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8080:8080"
- "8180:8180"
volumes:
- './configs/peer:/config'
command: iroha --submit-genesis
iroha1:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha1:1338
TORII_API_URL: iroha1:8081
TORII_TELEMETRY_URL: iroha1:8181
IROHA_PUBLIC_KEY: "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "3bac34cda9e3763fa069c1198312d1ec73b53023b8180c822ac355435edc4a24cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8081:8080"
- "8181:8180"
volumes:
- './configs/peer:/config'
iroha2:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha2:1339
TORII_API_URL: iroha2:8082
TORII_TELEMETRY_URL: iroha2:8182
IROHA_PUBLIC_KEY: "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "1261a436d36779223d7d6cf20e8b644510e488e6a50bafd77a7485264d27197dfaca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8082:8080"
- "8182:8180"
volumes:
- './configs/peer:/config'
iroha3:
image: hyperledger/iroha2:dev-nightly-fe826be88d785f83496c781b104abb871fd7a13f
environment:
TORII_P2P_ADDR: iroha3:1340
TORII_API_URL: iroha3:8083
TORII_TELEMETRY_URL: iroha3:8183
IROHA_PUBLIC_KEY: "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "a70dab95c7482eb9f159111b65947e482108cfe67df877bd8d3b9441a781c7c98e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
WSV_WASM_RUNTIME_CONFIG: '{"FUEL_LIMIT":900000000000, "MAX_MEMORY": 524288000}'
ports:
- "8083:8080"
- "8183:8180"
volumes:
- './configs/peer:/config'
```
</details>
### Actual result
Error and continue working.
```
iroha-iroha1-1 | 2023-02-20T12:29:44.555645Z ERROR run: iroha_core::sumeragi::main_loop: error=Error during transaction revalidation
iroha-iroha1-1 |
iroha-iroha1-1 | Caused by:
iroha-iroha1-1 | 0: Failed to validate transaction
iroha-iroha1-1 | 1: Failed to verify signature condition specified in the account: Signature condition not satisfied.
iroha-iroha1-1 | 2: Failed to verify signature condition specified in the account: Signature condition not satisfied.
iroha-iroha1-1 |
iroha-iroha1-1 | Location:
iroha-iroha1-1 | core/src/block.rs:633:26
```
### Expected result
Panic and shut down
### Who can help to reproduce?
@astrokov7
### Notes
If your original leader gives you the incorrect genesis, you can't continue. The network needs to restart with a different topology. If you're bootstrapping it's a different situation, though. | defect | iroha doesn t panic and shut down if the key pairs for iroha genesis account are completely different for each peer os and environment macos docker hub git commit hash minimum working example steps to reproduce run a docker compose with different iroha genesis account key pairs for each peer docker compose yml yaml version services iroha image hyperledger dev nightly environment torii addr iroha torii api url iroha torii telemetry url iroha iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key iroha genesis account private key digest function payload wsv wasm runtime config fuel limit max memory ports volumes configs peer config command iroha submit genesis image hyperledger dev nightly environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key wsv wasm runtime config fuel limit max memory ports volumes configs peer config image hyperledger dev nightly environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key wsv wasm runtime config fuel limit max memory ports volumes configs peer config image hyperledger dev nightly environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key wsv wasm runtime config fuel limit max memory ports volumes configs peer config actual result error and continue working iroha error run iroha core sumeragi main loop error error during transaction revalidation iroha iroha caused by iroha failed to validate transaction iroha failed to verify signature condition specified in the account signature condition not satisfied iroha failed to verify signature condition specified in the account signature condition not satisfied iroha iroha location iroha core src block rs expected result panic and shut down who can help to reproduce notes if your original leader gives you the incorrect genesis you can t continue the network needs to restart with a different topology if you re bootstrapping it s a different situation though | 1 |
326,675 | 24,097,189,764 | IssuesEvent | 2022-09-19 19:53:44 | boto/botocore | https://api.github.com/repos/boto/botocore | closed | Include waiter section in documentation even if there are none | feature-request documentation closed-for-staleness | **Is your feature request related to a problem? Please describe.**
All clients have a method called `get_waiter`. The documentation for this indicates to look in the Waiter section of the docs. If a service doesn't have any waiters defined, there is no Waiter section in the docs.
**Describe the solution you'd like**
Even if there are no waiters, include a waiter section in the docs stating that there are no waiters defined.
This could also be extended to other properties, like paginators.
This was proposed in #2483. | 1.0 | Include waiter section in documentation even if there are none - **Is your feature request related to a problem? Please describe.**
All clients have a method called `get_waiter`. The documentation for this indicates to look in the Waiter section of the docs. If a service doesn't have any waiters defined, there is no Waiter section in the docs.
**Describe the solution you'd like**
Even if there are no waiters, include a waiter section in the docs stating that there are no waiters defined.
This could also be extended to other properties, like paginators.
This was proposed in #2483. | non_defect | include waiter section in documentation even if there are none is your feature request related to a problem please describe all clients have a method called get waiter the documentation for this indicates to look in the waiter section of the docs if a service doesn t have any waiters defined there is no waiter section in the docs describe the solution you d like even if there are no waiters include a waiter section in the docs stating that there are no waiters defined this could also be extended to other properties like paginators this was proposed in | 0 |
100,021 | 21,103,852,065 | IssuesEvent | 2022-04-04 16:45:32 | pwa-builder/PWABuilder | https://api.github.com/repos/pwa-builder/PWABuilder | closed | PWA Studio: v0.3.0 plan | vscode release-plan | 1. Analytics work (P0): https://github.com/pwa-builder/PWABuilder/issues/2597
2. Help Command (P0): https://github.com/pwa-builder/PWABuilder/issues/2514
3. Screenshots generator (P1): https://github.com/pwa-builder/PWABuilder/issues/2570
4. vscode.dev support: (P2) https://github.com/pwa-builder/PWABuilder/issues/2533
4. Multi-folder workspace (P3): support: https://github.com/pwa-builder/PWABuilder/issues/2589 | 1.0 | PWA Studio: v0.3.0 plan - 1. Analytics work (P0): https://github.com/pwa-builder/PWABuilder/issues/2597
2. Help Command (P0): https://github.com/pwa-builder/PWABuilder/issues/2514
3. Screenshots generator (P1): https://github.com/pwa-builder/PWABuilder/issues/2570
4. vscode.dev support: (P2) https://github.com/pwa-builder/PWABuilder/issues/2533
4. Multi-folder workspace (P3): support: https://github.com/pwa-builder/PWABuilder/issues/2589 | non_defect | pwa studio plan analytics work help command screenshots generator vscode dev support multi folder workspace support | 0 |
65,431 | 19,503,233,509 | IssuesEvent | 2021-12-28 08:24:08 | PowerDNS/pdns | https://api.github.com/repos/PowerDNS/pdns | closed | dnsdist: setRCodeRatio Warning mechanism doesn't seem to work | defect dnsdist | - Program: dnsdist <!-- delete the ones that do not apply -->
- Issue type: Bug report
### Short description
The Warning parameter in the `setRCodeRatio()` function, isn't triggered
### Environment
- Operating system: Centos 8
- Software version: 1.6.1
- Software source: PowerDNS repository
-
### Steps to reproduce
1. Configure dnsdist with:
```
local dbr = dynBlockRulesGroup()
dbr:setRCodeRatio(DNSRCode.NXDOMAIN, 0.9, 10, "Exceeded NXD ratio", 60, 1, DNSAction.Drop, 0.1)
function maintenance()
dbr:apply()
end
```
2. Try to trigger the Warning by querying random names returning an NXDomain, alternated with entries that do return a value. Don't forget to disable your packetCache for the test.
### Expected behaviour
As long as more than 10% but less than 90% of the queries return an NXDomain, I would expect a Warning to be issued. This should be visible in the system logs, as in the `showDynBlocks()` output. The test above should generate 50% NXDomains so the Warning should trigger.
### Actual behaviour
No warning is triggered. By skipping the "correct" lookup and thus pushing the ratio above 90%, we do see the block happening.
### Other information
See info to be provided by @omoerbeek and @dwfreed
While looking at the issue, might clarify the docs as well (double vs int, example code?) | 1.0 | dnsdist: setRCodeRatio Warning mechanism doesn't seem to work - - Program: dnsdist <!-- delete the ones that do not apply -->
- Issue type: Bug report
### Short description
The Warning parameter in the `setRCodeRatio()` function, isn't triggered
### Environment
- Operating system: Centos 8
- Software version: 1.6.1
- Software source: PowerDNS repository
-
### Steps to reproduce
1. Configure dnsdist with:
```
local dbr = dynBlockRulesGroup()
dbr:setRCodeRatio(DNSRCode.NXDOMAIN, 0.9, 10, "Exceeded NXD ratio", 60, 1, DNSAction.Drop, 0.1)
function maintenance()
dbr:apply()
end
```
2. Try to trigger the Warning by querying random names returning an NXDomain, alternated with entries that do return a value. Don't forget to disable your packetCache for the test.
### Expected behaviour
As long as more than 10% but less than 90% of the queries return an NXDomain, I would expect a Warning to be issued. This should be visible in the system logs, as in the `showDynBlocks()` output. The test above should generate 50% NXDomains so the Warning should trigger.
### Actual behaviour
No warning is triggered. By skipping the "correct" lookup and thus pushing the ratio above 90%, we do see the block happening.
### Other information
See info to be provided by @omoerbeek and @dwfreed
While looking at the issue, might clarify the docs as well (double vs int, example code?) | defect | dnsdist setrcoderatio warning mechanism doesn t seem to work program dnsdist issue type bug report short description the warning parameter in the setrcoderatio function isn t triggered environment operating system centos software version software source powerdns repository steps to reproduce configure dnsdist with local dbr dynblockrulesgroup dbr setrcoderatio dnsrcode nxdomain exceeded nxd ratio dnsaction drop function maintenance dbr apply end try to trigger the warning by querying random names returning an nxdomain alternated with entries that do return a value don t forget to disable your packetcache for the test expected behaviour as long as more than but less than of the queries return an nxdomain i would expect a warning to be issued this should be visible in the system logs as in the showdynblocks output the test above should generate nxdomains so the warning should trigger actual behaviour no warning is triggered by skipping the correct lookup and thus pushing the ratio above we do see the block happening other information see info to be provided by omoerbeek and dwfreed while looking at the issue might clarify the docs as well double vs int example code | 1 |
60,637 | 17,023,479,161 | IssuesEvent | 2021-07-03 02:14:32 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Escape deletes POI nodes in Potlatch | Component: potlatch (flash editor) Priority: major Resolution: wontfix Type: defect | **[Submitted to the original trac issue database at 6.23pm, Tuesday, 15th September 2009]**
In Potlatch, when a POI node is unintentionally incorporated in a way and the user stops creating the way by hitting the Escape key, the POI node is deleted. | 1.0 | Escape deletes POI nodes in Potlatch - **[Submitted to the original trac issue database at 6.23pm, Tuesday, 15th September 2009]**
In Potlatch, when a POI node is unintentionally incorporated in a way and the user stops creating the way by hitting the Escape key, the POI node is deleted. | defect | escape deletes poi nodes in potlatch in potlatch when a poi node is unintentionally incorporated in a way and the user stops creating the way by hitting the escape key the poi node is deleted | 1 |
302,736 | 22,840,404,697 | IssuesEvent | 2022-07-12 21:09:02 | microsoftgraph/microsoft-graph-toolkit | https://api.github.com/repos/microsoftgraph/microsoft-graph-toolkit | opened | [Docs] ASP.NET Core Proxy Sample - App registration instructions unclear | Area: Documentation | ## Description
The current instructions for setting up the app registration in `samples.proxy-provider-asp-net-core` allow for setting up single tenant application. This results in an error when attempting to sign into the sample application

`samples/proxy-provider-asp-net-core/readme.md` line 56 needs to be updated to specify multi-tenant applications
| 1.0 | [Docs] ASP.NET Core Proxy Sample - App registration instructions unclear - ## Description
The current instructions for setting up the app registration in `samples.proxy-provider-asp-net-core` allow for setting up single tenant application. This results in an error when attempting to sign into the sample application

`samples/proxy-provider-asp-net-core/readme.md` line 56 needs to be updated to specify multi-tenant applications
| non_defect | asp net core proxy sample app registration instructions unclear description the current instructions for setting up the app registration in samples proxy provider asp net core allow for setting up single tenant application this results in an error when attempting to sign into the sample application samples proxy provider asp net core readme md line needs to be updated to specify multi tenant applications | 0 |
23,599 | 3,851,864,216 | IssuesEvent | 2016-04-06 05:27:15 | GPF/imame4all | https://api.github.com/repos/GPF/imame4all | closed | FireTV - 2 Controller | auto-migrated Priority-Medium Type-Defect | ```
I just installed the MAME4droid(0139)-1.6.1-MULTI.apk in a FireTV and I try to
play multiplayer games, but when I try to remap the keys both controller send
the same keycodes, so when I try to map player 2 with the second controller it
overwrites the keys for the player 1 because the keycode are the same (dpad_up,
dpad_down, etc).
```
Original issue reported on code.google.com by `rosty.ma...@gmail.com` on 9 Sep 2014 at 3:45 | 1.0 | FireTV - 2 Controller - ```
I just installed the MAME4droid(0139)-1.6.1-MULTI.apk in a FireTV and I try to
play multiplayer games, but when I try to remap the keys both controller send
the same keycodes, so when I try to map player 2 with the second controller it
overwrites the keys for the player 1 because the keycode are the same (dpad_up,
dpad_down, etc).
```
Original issue reported on code.google.com by `rosty.ma...@gmail.com` on 9 Sep 2014 at 3:45 | defect | firetv controller i just installed the multi apk in a firetv and i try to play multiplayer games but when i try to remap the keys both controller send the same keycodes so when i try to map player with the second controller it overwrites the keys for the player because the keycode are the same dpad up dpad down etc original issue reported on code google com by rosty ma gmail com on sep at | 1 |
48,225 | 7,394,029,726 | IssuesEvent | 2018-03-17 05:11:34 | chef/chef | https://api.github.com/repos/chef/chef | closed | user resource never creates homedir | Type: Documentation |
## Description
````
user testo do
action [:create]
end
# grep HOME /etc/login.defs
CREATE_HOME yes
# userdel -rf testo ; chef-client ; ls -ld /home/testo
... Starting Chef Client, version 13.7.16
:
:
ls: cannot access /home/testo: No such file or directory
````
the docs at https://docs.chef.io/resource_user.html strongly suggest this is a bug:
> manage_home
> Ruby Types: TrueClass, FalseClass
>
> Manage a user’s home directory.
>
> With the :create action, a user’s home directory is created based on HOME_DIR. If the home directory is missing, it is created unless CREATE_HOME in /etc/login.defs is set to no. When created, a skeleton set of files and sub-directories is also created in the home directory.
The homedir is definitely NOT being created.
````
user testo do
action [:create, :manage]
end
````
Still not created; even when the user is explicitly deleted as shown above. Documentation clearly shows this isn't expected behaviour. Is manage_home no longer defaulting as documented?
## Chef Version
13.7.16
## Platform Version
centos/rhel7
| 1.0 | user resource never creates homedir -
## Description
````
user testo do
action [:create]
end
# grep HOME /etc/login.defs
CREATE_HOME yes
# userdel -rf testo ; chef-client ; ls -ld /home/testo
... Starting Chef Client, version 13.7.16
:
:
ls: cannot access /home/testo: No such file or directory
````
the docs at https://docs.chef.io/resource_user.html strongly suggest this is a bug:
> manage_home
> Ruby Types: TrueClass, FalseClass
>
> Manage a user’s home directory.
>
> With the :create action, a user’s home directory is created based on HOME_DIR. If the home directory is missing, it is created unless CREATE_HOME in /etc/login.defs is set to no. When created, a skeleton set of files and sub-directories is also created in the home directory.
The homedir is definitely NOT being created.
````
user testo do
action [:create, :manage]
end
````
Still not created; even when the user is explicitly deleted as shown above. Documentation clearly shows this isn't expected behaviour. Is manage_home no longer defaulting as documented?
## Chef Version
13.7.16
## Platform Version
centos/rhel7
| non_defect | user resource never creates homedir description user testo do action end grep home etc login defs create home yes userdel rf testo chef client ls ld home testo starting chef client version ls cannot access home testo no such file or directory the docs at strongly suggest this is a bug manage home ruby types trueclass falseclass manage a user’s home directory with the create action a user’s home directory is created based on home dir if the home directory is missing it is created unless create home in etc login defs is set to no when created a skeleton set of files and sub directories is also created in the home directory the homedir is definitely not being created user testo do action end still not created even when the user is explicitly deleted as shown above documentation clearly shows this isn t expected behaviour is manage home no longer defaulting as documented chef version platform version centos | 0 |
94,384 | 8,488,330,158 | IssuesEvent | 2018-10-26 16:18:05 | SocialGouv/code-du-travail-numerique | https://api.github.com/repos/SocialGouv/code-du-travail-numerique | opened | bug remontées de réponse déceptive par un user (DIRECCTE) | Test unitaire bug | // bug
// test unitaire
Un salarié detache intérimaire doit il avoir un relevé horaires ?:
https://codedutravail.num.social.gouv.fr/?q=Un%20salari%C3%A9%20detache%20int%C3%A9rimaire%20doit%20il%20avoir%20un%20relev%C3%A9%20horaires%20%3F
Erreur sans doute due à une absence de contenu.
=> nécessite vérification métier
=> nécessite une fois le test unitaire fait, une vue ES
stars 1 email fabienne.rosset@Direccte.gouv.fr message Réponse hors sujets status sending userAgent Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 subject Un salarié detache intérimaire doit il avoir un relevé horaires ? | stars 1 email fabienne.rosset@Direccte.gouv.fr message Réponse hors sujets status sending userAgent Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 subject Un salarié detache intérimaire doit il avoir un relevé horaires ? | stars | 1 | email | fabienne.rosset@Direccte.gouv.fr | message | Réponse hors sujets | status | sending | userAgent | Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 | subject | Un salarié detache intérimaire doit il avoir un relevé horaires ?
-- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | --
stars 1 email fabienne.rosset@Direccte.gouv.fr message Réponse hors sujets status sending userAgent Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 subject Un salarié detache intérimaire doit il avoir un relevé horaires ? | stars | 1 | email | fabienne.rosset@Direccte.gouv.fr | message | Réponse hors sujets | status | sending | userAgent | Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 | subject | Un salarié detache intérimaire doit il avoir un relevé horaires ?
stars | 1
email | fabienne.rosset@Direccte.gouv.fr
message | Réponse hors sujets
status | sending
userAgent | Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0
subject | Un salarié detache intérimaire doit il avoir un relevé horaires ?
This form was submitted at 04:01 PM UTC - 26 October 2018.
| 1.0 | bug remontées de réponse déceptive par un user (DIRECCTE) - // bug
// test unitaire
Un salarié detache intérimaire doit il avoir un relevé horaires ?:
https://codedutravail.num.social.gouv.fr/?q=Un%20salari%C3%A9%20detache%20int%C3%A9rimaire%20doit%20il%20avoir%20un%20relev%C3%A9%20horaires%20%3F
Erreur sans doute due à une absence de contenu.
=> nécessite vérification métier
=> nécessite une fois le test unitaire fait, une vue ES
stars 1 email fabienne.rosset@Direccte.gouv.fr message Réponse hors sujets status sending userAgent Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 subject Un salarié detache intérimaire doit il avoir un relevé horaires ? | stars 1 email fabienne.rosset@Direccte.gouv.fr message Réponse hors sujets status sending userAgent Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 subject Un salarié detache intérimaire doit il avoir un relevé horaires ? | stars | 1 | email | fabienne.rosset@Direccte.gouv.fr | message | Réponse hors sujets | status | sending | userAgent | Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 | subject | Un salarié detache intérimaire doit il avoir un relevé horaires ?
-- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | -- | --
stars 1 email fabienne.rosset@Direccte.gouv.fr message Réponse hors sujets status sending userAgent Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 subject Un salarié detache intérimaire doit il avoir un relevé horaires ? | stars | 1 | email | fabienne.rosset@Direccte.gouv.fr | message | Réponse hors sujets | status | sending | userAgent | Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0 | subject | Un salarié detache intérimaire doit il avoir un relevé horaires ?
stars | 1
email | fabienne.rosset@Direccte.gouv.fr
message | Réponse hors sujets
status | sending
userAgent | Mozilla/5.0 (Android 8.1.0; Mobile; rv:62.0) Gecko/62.0 Firefox/62.0
subject | Un salarié detache intérimaire doit il avoir un relevé horaires ?
This form was submitted at 04:01 PM UTC - 26 October 2018.
| non_defect | bug remontées de réponse déceptive par un user direccte bug test unitaire un salarié detache intérimaire doit il avoir un relevé horaires erreur sans doute due à une absence de contenu nécessite vérification métier nécessite une fois le test unitaire fait une vue es stars email fabienne rosset direccte gouv fr message réponse hors sujets status sending useragent mozilla android mobile rv gecko firefox subject un salarié detache intérimaire doit il avoir un relevé horaires stars email fabienne rosset direccte gouv fr message réponse hors sujets status sending useragent mozilla android mobile rv gecko firefox subject un salarié detache intérimaire doit il avoir un relevé horaires stars email fabienne rosset direccte gouv fr message réponse hors sujets status sending useragent mozilla android mobile rv gecko firefox subject un salarié detache intérimaire doit il avoir un relevé horaires stars email fabienne rosset direccte gouv fr message réponse hors sujets status sending useragent mozilla android mobile rv gecko firefox subject un salarié detache intérimaire doit il avoir un relevé horaires stars email fabienne rosset direccte gouv fr message réponse hors sujets status sending useragent mozilla android mobile rv gecko firefox subject un salarié detache intérimaire doit il avoir un relevé horaires stars email fabienne rosset direccte gouv fr message réponse hors sujets status sending useragent mozilla android mobile rv gecko firefox subject un salarié detache intérimaire doit il avoir un relevé horaires this form was submitted at pm utc october | 0 |
67,460 | 20,961,613,684 | IssuesEvent | 2022-03-27 21:49:43 | abedmaatalla/sipdroid | https://api.github.com/repos/abedmaatalla/sipdroid | closed | Possible incorrect Speex WB handling in SDP offer | Priority-Medium Type-Defect auto-migrated | ```
What steps will reproduce the problem?
1. Install jitsi client
2. make a call to a sipdriod
What is the expected output? What do you see instead?
call should establish. Instead I receive incompatible codecs error
What version of the product are you using? On what device/operating system?
2.4 on cyanogenmod 7.1, ZTE Blade
Which SIP server are you using? What happens with PBXes?
CommuniGate Pro which is most likely irrelevant because it is acting as a SIP
proxy not changing codecs in SDP. I don't have account with pbxes, but if it
allows direct SIP-to-SIP calls, the problem should be reproduceable there as
well.
Which type of network are you using?
wifi
Please provide any additional information below.
Most likely this is because sipdriod only supports speex NB and gets confused
when speex WB is offered. Disabling speex altogether solves the problem.
Request to sipdroid:
20:36:58.048 5 SIPDATA-780708 out: INVITE
sip:dop@xx.xx.xx.xx:47948;transport=udp SIP/2.0
20:36:58.048 5 SIPDATA-780708 out: Via: SIP/2.0/UDP
xx.xx.xx.xx:5060;branch=z9hG4bK126794-aoixtxv;cgp=itoolabs.net;rport
20:36:58.048 5 SIPDATA-780708 out: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.048 5 SIPDATA-780708 out: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.048 5 SIPDATA-780708 out: Record-Route:
<sip:rev.789-10.40.74.239.dialog.cgatepro;lr>
20:36:58.048 5 SIPDATA-780708 out: Max-Forwards: 67
20:36:58.048 5 SIPDATA-780708 out: From: "Dmitry Panov"
<sip:dop@xxx.xx>;tag=3AA7E33E-812-8F35DAF6_aoixtxv-1C9F
20:36:58.048 5 SIPDATA-780708 out: To: <sip:dop@xxx.yy>
20:36:58.048 5 SIPDATA-780708 out: Call-ID:
706130bb21879d3d1b4ffb9b72ce1269@0:0:0:0:0:0:0:0.egress
20:36:58.048 5 SIPDATA-780708 out: Contact:
<sip:signode-812-8F35DAF6_aoixtxv-1C9F@xx.xx.xx.xx>
20:36:58.048 5 SIPDATA-780708 out: CSeq: 1 INVITE
20:36:58.048 5 SIPDATA-780708 out: Supported: 100rel,timer,replaces,histinfo
20:36:58.048 5 SIPDATA-780708 out: Session-Expires: 7200
20:36:58.048 5 SIPDATA-780708 out: Min-SE: 3600
20:36:58.048 5 SIPDATA-780708 out: User-Agent: CommuniGatePro-callLeg/5.3.10
20:36:58.048 5 SIPDATA-780708 out: Allow:
INVITE,ACK,BYE,CANCEL,OPTIONS,INFO,MESSAGE,SUBSCRIBE,NOTIFY,PRACK,REFER
20:36:58.048 5 SIPDATA-780708 out: Content-Type: application/sdp
20:36:58.048 5 SIPDATA-780708 out: Content-Length: 581
20:36:58.048 5 SIPDATA-780708 out:
20:36:58.048 5 SIPDATA-780708 out: v=0
20:36:58.048 5 SIPDATA-780708 out: o=CGPLeg000812 563173700 281586851 IN IP4
xx.xx.xx.xx
20:36:58.048 5 SIPDATA-780708 out: s=-
20:36:58.048 5 SIPDATA-780708 out: c=IN IP4 xx.xx.xx.xx
20:36:58.048 5 SIPDATA-780708 out: t=0 0
20:36:58.048 5 SIPDATA-780708 out: a=mediagateway:itoolabs.com:761
20:36:58.048 5 SIPDATA-780708 out: m=audio 60000 RTP/AVP 96 97 98 9 100 3 0 8
101
20:36:58.048 5 SIPDATA-780708 out: c=IN IP4 xx.xx.xx.xx
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:96 SILK/24000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:97 SILK/16000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:98 speex/16000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:9 G722/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:100 iLBC/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:3 GSM/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:0 PCMU/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:8 PCMA/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:101 telephone-event/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtcpping:T:32:3278
20:36:58.048 5 SIPDATA-780708 out: a=extmap:1
urn:ietf:params:rtp-hdrext:csrc-audio-level
20:36:58.048 5 SIPDATA-780708 out: a=zrtp-hash:1.10
8cee0d3e8bc3203384c520389c2264ee4c76a4306ebffec1f62374bdb7630174
Response from sipdroid:
20:36:58.928 5 SIPDATA-780716 inp: SIP/2.0 180 Ringing
20:36:58.928 5 SIPDATA-780716 inp: Via: SIP/2.0/UDP
xx.xx.xx.xx:5060;branch=z9hG4bK126794-aoixtxv;cgp=itoolabs.net;rport=5060
20:36:58.928 5 SIPDATA-780716 inp: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.928 5 SIPDATA-780716 inp: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.928 5 SIPDATA-780716 inp: Record-Route:
<sip:rev.789-10.40.74.239.dialog.cgatepro;lr>
20:36:58.928 5 SIPDATA-780716 inp: To: <sip:dop@xxx.xx>;tag=cde7f7e51f6f2498
20:36:58.928 5 SIPDATA-780716 inp: From: "Dmitry Panov"
<sip:dop@xx.yy>;tag=3AA7E33E-812-8F35DAF6_aoixtxv-1C9F
20:36:58.928 5 SIPDATA-780716 inp: Call-ID:
706130bb21879d3d1b4ffb9b72ce1269@0:0:0:0:0:0:0:0.egress
20:36:58.928 5 SIPDATA-780716 inp: CSeq: 1 INVITE
20:36:58.928 5 SIPDATA-780716 inp: Server: Sipdroid/2.4 beta/Blade
20:36:58.928 5 SIPDATA-780716 inp: Content-Length: 172
20:36:58.928 5 SIPDATA-780716 inp: Content-Type: application/sdp
20:36:58.928 5 SIPDATA-780716 inp:
20:36:58.928 5 SIPDATA-780716 inp: v=0
20:36:58.928 5 SIPDATA-780716 inp: o=dop@mail.Itoolabs.Co.UK 0 0 IN IP4
192.168.16.10
20:36:58.928 5 SIPDATA-780716 inp: s=Session SIP/SDP
20:36:58.928 5 SIPDATA-780716 inp: c=IN IP4 xx.xx.xx.xx
20:36:58.928 5 SIPDATA-780716 inp: t=0 0
20:36:58.928 5 SIPDATA-780716 inp: m=audio 50016 RTP/AVP 98 101
20:36:58.928 5 SIPDATA-780716 inp: a=rtpmap:101 telephone-event/8000
Already here you can see format announcement (98) without the corresponding
rtpmap attribute which is afair not allowed for dynamic payload types.
This response almost immediately followed by forbidden:
20:36:59.172 5 SIPDATA-780718 inp: SIP/2.0 403 Forbidden
20:36:59.172 5 SIPDATA-780718 inp: Via: SIP/2.0/UDP
xx.xx.xx.xx:5060;branch=z9hG4bK126794-aoixtxv;cgp=itoolabs.net;rport=5060
20:36:59.172 5 SIPDATA-780718 inp: To: <sip:dop@xxx.xx>
20:36:59.172 5 SIPDATA-780718 inp: From: "Dmitry Panov"
<sip:dop@xxx.yy>;tag=3AA7E33E-812-8F35DAF6_aoixtxv-1C9F
20:36:59.172 5 SIPDATA-780718 inp: Call-ID:
706130bb21879d3d1b4ffb9b72ce1269@0:0:0:0:0:0:0:0.egress
20:36:59.172 5 SIPDATA-780718 inp: CSeq: 1 INVITE
20:36:59.172 5 SIPDATA-780718 inp: Server: Sipdroid/2.4 beta/Blade
20:36:59.172 5 SIPDATA-780718 inp: Content-Length: 0
```
Original issue reported on code.google.com by `dop251` on 2 Jan 2012 at 9:34
| 1.0 | Possible incorrect Speex WB handling in SDP offer - ```
What steps will reproduce the problem?
1. Install jitsi client
2. make a call to a sipdriod
What is the expected output? What do you see instead?
call should establish. Instead I receive incompatible codecs error
What version of the product are you using? On what device/operating system?
2.4 on cyanogenmod 7.1, ZTE Blade
Which SIP server are you using? What happens with PBXes?
CommuniGate Pro which is most likely irrelevant because it is acting as a SIP
proxy not changing codecs in SDP. I don't have account with pbxes, but if it
allows direct SIP-to-SIP calls, the problem should be reproduceable there as
well.
Which type of network are you using?
wifi
Please provide any additional information below.
Most likely this is because sipdriod only supports speex NB and gets confused
when speex WB is offered. Disabling speex altogether solves the problem.
Request to sipdroid:
20:36:58.048 5 SIPDATA-780708 out: INVITE
sip:dop@xx.xx.xx.xx:47948;transport=udp SIP/2.0
20:36:58.048 5 SIPDATA-780708 out: Via: SIP/2.0/UDP
xx.xx.xx.xx:5060;branch=z9hG4bK126794-aoixtxv;cgp=itoolabs.net;rport
20:36:58.048 5 SIPDATA-780708 out: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.048 5 SIPDATA-780708 out: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.048 5 SIPDATA-780708 out: Record-Route:
<sip:rev.789-10.40.74.239.dialog.cgatepro;lr>
20:36:58.048 5 SIPDATA-780708 out: Max-Forwards: 67
20:36:58.048 5 SIPDATA-780708 out: From: "Dmitry Panov"
<sip:dop@xxx.xx>;tag=3AA7E33E-812-8F35DAF6_aoixtxv-1C9F
20:36:58.048 5 SIPDATA-780708 out: To: <sip:dop@xxx.yy>
20:36:58.048 5 SIPDATA-780708 out: Call-ID:
706130bb21879d3d1b4ffb9b72ce1269@0:0:0:0:0:0:0:0.egress
20:36:58.048 5 SIPDATA-780708 out: Contact:
<sip:signode-812-8F35DAF6_aoixtxv-1C9F@xx.xx.xx.xx>
20:36:58.048 5 SIPDATA-780708 out: CSeq: 1 INVITE
20:36:58.048 5 SIPDATA-780708 out: Supported: 100rel,timer,replaces,histinfo
20:36:58.048 5 SIPDATA-780708 out: Session-Expires: 7200
20:36:58.048 5 SIPDATA-780708 out: Min-SE: 3600
20:36:58.048 5 SIPDATA-780708 out: User-Agent: CommuniGatePro-callLeg/5.3.10
20:36:58.048 5 SIPDATA-780708 out: Allow:
INVITE,ACK,BYE,CANCEL,OPTIONS,INFO,MESSAGE,SUBSCRIBE,NOTIFY,PRACK,REFER
20:36:58.048 5 SIPDATA-780708 out: Content-Type: application/sdp
20:36:58.048 5 SIPDATA-780708 out: Content-Length: 581
20:36:58.048 5 SIPDATA-780708 out:
20:36:58.048 5 SIPDATA-780708 out: v=0
20:36:58.048 5 SIPDATA-780708 out: o=CGPLeg000812 563173700 281586851 IN IP4
xx.xx.xx.xx
20:36:58.048 5 SIPDATA-780708 out: s=-
20:36:58.048 5 SIPDATA-780708 out: c=IN IP4 xx.xx.xx.xx
20:36:58.048 5 SIPDATA-780708 out: t=0 0
20:36:58.048 5 SIPDATA-780708 out: a=mediagateway:itoolabs.com:761
20:36:58.048 5 SIPDATA-780708 out: m=audio 60000 RTP/AVP 96 97 98 9 100 3 0 8
101
20:36:58.048 5 SIPDATA-780708 out: c=IN IP4 xx.xx.xx.xx
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:96 SILK/24000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:97 SILK/16000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:98 speex/16000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:9 G722/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:100 iLBC/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:3 GSM/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:0 PCMU/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:8 PCMA/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtpmap:101 telephone-event/8000
20:36:58.048 5 SIPDATA-780708 out: a=rtcpping:T:32:3278
20:36:58.048 5 SIPDATA-780708 out: a=extmap:1
urn:ietf:params:rtp-hdrext:csrc-audio-level
20:36:58.048 5 SIPDATA-780708 out: a=zrtp-hash:1.10
8cee0d3e8bc3203384c520389c2264ee4c76a4306ebffec1f62374bdb7630174
Response from sipdroid:
20:36:58.928 5 SIPDATA-780716 inp: SIP/2.0 180 Ringing
20:36:58.928 5 SIPDATA-780716 inp: Via: SIP/2.0/UDP
xx.xx.xx.xx:5060;branch=z9hG4bK126794-aoixtxv;cgp=itoolabs.net;rport=5060
20:36:58.928 5 SIPDATA-780716 inp: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.928 5 SIPDATA-780716 inp: Record-Route: <sip:xx.xx.xx.xx:5060;lr>
20:36:58.928 5 SIPDATA-780716 inp: Record-Route:
<sip:rev.789-10.40.74.239.dialog.cgatepro;lr>
20:36:58.928 5 SIPDATA-780716 inp: To: <sip:dop@xxx.xx>;tag=cde7f7e51f6f2498
20:36:58.928 5 SIPDATA-780716 inp: From: "Dmitry Panov"
<sip:dop@xx.yy>;tag=3AA7E33E-812-8F35DAF6_aoixtxv-1C9F
20:36:58.928 5 SIPDATA-780716 inp: Call-ID:
706130bb21879d3d1b4ffb9b72ce1269@0:0:0:0:0:0:0:0.egress
20:36:58.928 5 SIPDATA-780716 inp: CSeq: 1 INVITE
20:36:58.928 5 SIPDATA-780716 inp: Server: Sipdroid/2.4 beta/Blade
20:36:58.928 5 SIPDATA-780716 inp: Content-Length: 172
20:36:58.928 5 SIPDATA-780716 inp: Content-Type: application/sdp
20:36:58.928 5 SIPDATA-780716 inp:
20:36:58.928 5 SIPDATA-780716 inp: v=0
20:36:58.928 5 SIPDATA-780716 inp: o=dop@mail.Itoolabs.Co.UK 0 0 IN IP4
192.168.16.10
20:36:58.928 5 SIPDATA-780716 inp: s=Session SIP/SDP
20:36:58.928 5 SIPDATA-780716 inp: c=IN IP4 xx.xx.xx.xx
20:36:58.928 5 SIPDATA-780716 inp: t=0 0
20:36:58.928 5 SIPDATA-780716 inp: m=audio 50016 RTP/AVP 98 101
20:36:58.928 5 SIPDATA-780716 inp: a=rtpmap:101 telephone-event/8000
Already here you can see format announcement (98) without the corresponding
rtpmap attribute which is afair not allowed for dynamic payload types.
This response almost immediately followed by forbidden:
20:36:59.172 5 SIPDATA-780718 inp: SIP/2.0 403 Forbidden
20:36:59.172 5 SIPDATA-780718 inp: Via: SIP/2.0/UDP
xx.xx.xx.xx:5060;branch=z9hG4bK126794-aoixtxv;cgp=itoolabs.net;rport=5060
20:36:59.172 5 SIPDATA-780718 inp: To: <sip:dop@xxx.xx>
20:36:59.172 5 SIPDATA-780718 inp: From: "Dmitry Panov"
<sip:dop@xxx.yy>;tag=3AA7E33E-812-8F35DAF6_aoixtxv-1C9F
20:36:59.172 5 SIPDATA-780718 inp: Call-ID:
706130bb21879d3d1b4ffb9b72ce1269@0:0:0:0:0:0:0:0.egress
20:36:59.172 5 SIPDATA-780718 inp: CSeq: 1 INVITE
20:36:59.172 5 SIPDATA-780718 inp: Server: Sipdroid/2.4 beta/Blade
20:36:59.172 5 SIPDATA-780718 inp: Content-Length: 0
```
Original issue reported on code.google.com by `dop251` on 2 Jan 2012 at 9:34
| defect | possible incorrect speex wb handling in sdp offer what steps will reproduce the problem install jitsi client make a call to a sipdriod what is the expected output what do you see instead call should establish instead i receive incompatible codecs error what version of the product are you using on what device operating system on cyanogenmod zte blade which sip server are you using what happens with pbxes communigate pro which is most likely irrelevant because it is acting as a sip proxy not changing codecs in sdp i don t have account with pbxes but if it allows direct sip to sip calls the problem should be reproduceable there as well which type of network are you using wifi please provide any additional information below most likely this is because sipdriod only supports speex nb and gets confused when speex wb is offered disabling speex altogether solves the problem request to sipdroid sipdata out invite sip dop xx xx xx xx transport udp sip sipdata out via sip udp xx xx xx xx branch aoixtxv cgp itoolabs net rport sipdata out record route sipdata out record route sipdata out record route sipdata out max forwards sipdata out from dmitry panov tag aoixtxv sipdata out to sipdata out call id egress sipdata out contact sipdata out cseq invite sipdata out supported timer replaces histinfo sipdata out session expires sipdata out min se sipdata out user agent communigatepro callleg sipdata out allow invite ack bye cancel options info message subscribe notify prack refer sipdata out content type application sdp sipdata out content length sipdata out sipdata out v sipdata out o in xx xx xx xx sipdata out s sipdata out c in xx xx xx xx sipdata out t sipdata out a mediagateway itoolabs com sipdata out m audio rtp avp sipdata out c in xx xx xx xx sipdata out a rtpmap silk sipdata out a rtpmap silk sipdata out a rtpmap speex sipdata out a rtpmap sipdata out a rtpmap ilbc sipdata out a rtpmap gsm sipdata out a rtpmap pcmu sipdata out a rtpmap pcma sipdata out a rtpmap telephone event sipdata out a rtcpping t sipdata out a extmap urn ietf params rtp hdrext csrc audio level sipdata out a zrtp hash response from sipdroid sipdata inp sip ringing sipdata inp via sip udp xx xx xx xx branch aoixtxv cgp itoolabs net rport sipdata inp record route sipdata inp record route sipdata inp record route sipdata inp to tag sipdata inp from dmitry panov tag aoixtxv sipdata inp call id egress sipdata inp cseq invite sipdata inp server sipdroid beta blade sipdata inp content length sipdata inp content type application sdp sipdata inp sipdata inp v sipdata inp o dop mail itoolabs co uk in sipdata inp s session sip sdp sipdata inp c in xx xx xx xx sipdata inp t sipdata inp m audio rtp avp sipdata inp a rtpmap telephone event already here you can see format announcement without the corresponding rtpmap attribute which is afair not allowed for dynamic payload types this response almost immediately followed by forbidden sipdata inp sip forbidden sipdata inp via sip udp xx xx xx xx branch aoixtxv cgp itoolabs net rport sipdata inp to sipdata inp from dmitry panov tag aoixtxv sipdata inp call id egress sipdata inp cseq invite sipdata inp server sipdroid beta blade sipdata inp content length original issue reported on code google com by on jan at | 1 |
32,270 | 6,756,696,827 | IssuesEvent | 2017-10-24 08:10:12 | primefaces/primeng | https://api.github.com/repos/primefaces/primeng | closed | MegaMenu doesn't compile with TypeScript 2.4 | confirmed defect | **I'm submitting a ...**
```
[X] bug report
```
**Test case**
You can use the following demo app as test case:
https://github.com/ova2/angular-development-with-primeng/tree/master/chapter7/megamenu
**Current behavior**
If you run the showcase for the MegaMenu with TypeScript 2.4 or run the demo app linked above, you will get a compilation error like
```
Type '{ label: string; items: { label: string; }[]; }[]' has no properties in common with type 'MenuItem'.
```
**Expected behavior**
There should be no compilation error, as when compiling with TypeScript 2.3.
**Minimal reproduction of the problem with instructions**
* Install the above app
* or install the current master of PrimeNG and change the requirements in package.json to Angular 4.3, Angular-Cli 1.3, TypeScript 2.4 (lower Angular/Cli versions require TypeScript 2.3, so you need to test with Angular 4.3 and Cli 1.3)
* Run `npm` install and `npm start` and check the MegaMenu
* **Angular version:** 4.3.3
* **PrimeNG version:** 4.1.2
* **Browser:** all
* **Language:** TypeScript 2.4
* **Node (for AoT issues):** 8.1.4
* **Analysis of the problem:**
In the `MenuItem` interface, the `items` property is defined as of type `MenuItem[]`. But in the MegaMenu, you can have arrays of arrays of MenuItems as items, not just arrays of MenuItems.
In TypeScript 2.4, it’s now an error to assign anything to a weak type when there’s no overlap in properties (see [here](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-4.html#weak-type-detection)).
Note that in the `MenuItem` interface all properties are marked as optional. Therefore, this is considered a "weak type".
* **Proposed solution:**
The `items` property in the `MenutItem` interface should be defined as follows:
```
items?: MenuItem[]|MenuItem[][];
```
| 1.0 | MegaMenu doesn't compile with TypeScript 2.4 - **I'm submitting a ...**
```
[X] bug report
```
**Test case**
You can use the following demo app as test case:
https://github.com/ova2/angular-development-with-primeng/tree/master/chapter7/megamenu
**Current behavior**
If you run the showcase for the MegaMenu with TypeScript 2.4 or run the demo app linked above, you will get a compilation error like
```
Type '{ label: string; items: { label: string; }[]; }[]' has no properties in common with type 'MenuItem'.
```
**Expected behavior**
There should be no compilation error, as when compiling with TypeScript 2.3.
**Minimal reproduction of the problem with instructions**
* Install the above app
* or install the current master of PrimeNG and change the requirements in package.json to Angular 4.3, Angular-Cli 1.3, TypeScript 2.4 (lower Angular/Cli versions require TypeScript 2.3, so you need to test with Angular 4.3 and Cli 1.3)
* Run `npm` install and `npm start` and check the MegaMenu
* **Angular version:** 4.3.3
* **PrimeNG version:** 4.1.2
* **Browser:** all
* **Language:** TypeScript 2.4
* **Node (for AoT issues):** 8.1.4
* **Analysis of the problem:**
In the `MenuItem` interface, the `items` property is defined as of type `MenuItem[]`. But in the MegaMenu, you can have arrays of arrays of MenuItems as items, not just arrays of MenuItems.
In TypeScript 2.4, it’s now an error to assign anything to a weak type when there’s no overlap in properties (see [here](https://www.typescriptlang.org/docs/handbook/release-notes/typescript-2-4.html#weak-type-detection)).
Note that in the `MenuItem` interface all properties are marked as optional. Therefore, this is considered a "weak type".
* **Proposed solution:**
The `items` property in the `MenutItem` interface should be defined as follows:
```
items?: MenuItem[]|MenuItem[][];
```
| defect | megamenu doesn t compile with typescript i m submitting a bug report test case you can use the following demo app as test case current behavior if you run the showcase for the megamenu with typescript or run the demo app linked above you will get a compilation error like type label string items label string has no properties in common with type menuitem expected behavior there should be no compilation error as when compiling with typescript minimal reproduction of the problem with instructions install the above app or install the current master of primeng and change the requirements in package json to angular angular cli typescript lower angular cli versions require typescript so you need to test with angular and cli run npm install and npm start and check the megamenu angular version primeng version browser all language typescript node for aot issues analysis of the problem in the menuitem interface the items property is defined as of type menuitem but in the megamenu you can have arrays of arrays of menuitems as items not just arrays of menuitems in typescript it’s now an error to assign anything to a weak type when there’s no overlap in properties see note that in the menuitem interface all properties are marked as optional therefore this is considered a weak type proposed solution the items property in the menutitem interface should be defined as follows items menuitem menuitem | 1 |
22,319 | 3,633,583,297 | IssuesEvent | 2016-02-11 15:07:55 | toniblyx/alfresco-nagios-and-icinga-plugin | https://api.github.com/repos/toniblyx/alfresco-nagios-and-icinga-plugin | closed | Output issue in Nagios | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. using "nagios-plugin-for-alfresco-1.2" i can't see outpout in nagios,
working fine in cli !! Always return warning without value..
2. I put all variable hard coded in nagios command but no more..
What is the expected output? What do you see instead?
Outpout is "NumberOfUsers=:::" instead of "NumberOfUsers=977;;;"
What version of the product are you using? On what operating system?
nagios 3.2 on redhat entreprise 5.8
Please provide any additional information below.
```
Original issue reported on code.google.com by `fito...@gmail.com` on 11 Dec 2012 at 5:05 | 1.0 | Output issue in Nagios - ```
What steps will reproduce the problem?
1. using "nagios-plugin-for-alfresco-1.2" i can't see outpout in nagios,
working fine in cli !! Always return warning without value..
2. I put all variable hard coded in nagios command but no more..
What is the expected output? What do you see instead?
Outpout is "NumberOfUsers=:::" instead of "NumberOfUsers=977;;;"
What version of the product are you using? On what operating system?
nagios 3.2 on redhat entreprise 5.8
Please provide any additional information below.
```
Original issue reported on code.google.com by `fito...@gmail.com` on 11 Dec 2012 at 5:05 | defect | output issue in nagios what steps will reproduce the problem using nagios plugin for alfresco i can t see outpout in nagios working fine in cli always return warning without value i put all variable hard coded in nagios command but no more what is the expected output what do you see instead outpout is numberofusers instead of numberofusers what version of the product are you using on what operating system nagios on redhat entreprise please provide any additional information below original issue reported on code google com by fito gmail com on dec at | 1 |
24,341 | 3,967,294,592 | IssuesEvent | 2016-05-03 15:46:08 | buildo/nemobot | https://api.github.com/repos/buildo/nemobot | closed | [labels] should always remove WIP/InReview in issues too | defect | ## description
@nemobot sometimes is "forgetting" to remove `WIP`/`InReview` on closed issues. It should **always** remove them in closed issues too.
## how to reproduce
 | 1.0 | [labels] should always remove WIP/InReview in issues too - ## description
@nemobot sometimes is "forgetting" to remove `WIP`/`InReview` on closed issues. It should **always** remove them in closed issues too.
## how to reproduce
 | defect | should always remove wip inreview in issues too description nemobot sometimes is forgetting to remove wip inreview on closed issues it should always remove them in closed issues too how to reproduce | 1 |
6,937 | 2,610,318,420 | IssuesEvent | 2015-02-26 19:42:46 | chrsmith/republic-at-war | https://api.github.com/repos/chrsmith/republic-at-war | closed | Gameplay Error | auto-migrated Priority-Medium Type-Defect | ```
Bacta heals B1 and B2s
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 5 May 2011 at 9:28 | 1.0 | Gameplay Error - ```
Bacta heals B1 and B2s
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 5 May 2011 at 9:28 | defect | gameplay error bacta heals and original issue reported on code google com by gmail com on may at | 1 |
61,567 | 17,023,727,764 | IssuesEvent | 2021-07-03 03:31:17 | tomhughes/trac-tickets | https://api.github.com/repos/tomhughes/trac-tickets | closed | Nominatim does not update the relations | Component: nominatim Priority: major Resolution: fixed Type: defect | **[Submitted to the original trac issue database at 3.27pm, Wednesday, 29th June 2011]**
For several months most villages in the region of Cantabria, in Spain, are associated with the Basque village of La Arena. For example, if we look "Reinosa" [1] we see the result:
Reinosa, La Arena, Cantabria, 39200, Spain, Europe
When right is:
Reinosa, Cantabria, 39200, Spain, Europe
The origin of the problem was an incorrect labeling of the locality "La Arena" as place=county which I corrected long ago. However as we see in Nominatim this relations has not been eliminated from the index and this error keeps appearing in searches (La Arena is still parent of most of Cantabria [2]).
I believe that this mistake can be the causer of whom cities as Bilbao, in the Basque Country, appear in Cantabria [3].
[1] http://open.mapquestapi.com/nominatim/v1/search.php?q=Reinosa&viewbox=-3.16%2C43.37%2C-3.06%2C43.33
[2] http://open.mapquestapi.com/nominatim/v1/details.php?place_id=1477873
[3] http://open.mapquestapi.com/nominatim/v1/details.php?place_id=151062 | 1.0 | Nominatim does not update the relations - **[Submitted to the original trac issue database at 3.27pm, Wednesday, 29th June 2011]**
For several months most villages in the region of Cantabria, in Spain, are associated with the Basque village of La Arena. For example, if we look "Reinosa" [1] we see the result:
Reinosa, La Arena, Cantabria, 39200, Spain, Europe
When right is:
Reinosa, Cantabria, 39200, Spain, Europe
The origin of the problem was an incorrect labeling of the locality "La Arena" as place=county which I corrected long ago. However as we see in Nominatim this relations has not been eliminated from the index and this error keeps appearing in searches (La Arena is still parent of most of Cantabria [2]).
I believe that this mistake can be the causer of whom cities as Bilbao, in the Basque Country, appear in Cantabria [3].
[1] http://open.mapquestapi.com/nominatim/v1/search.php?q=Reinosa&viewbox=-3.16%2C43.37%2C-3.06%2C43.33
[2] http://open.mapquestapi.com/nominatim/v1/details.php?place_id=1477873
[3] http://open.mapquestapi.com/nominatim/v1/details.php?place_id=151062 | defect | nominatim does not update the relations for several months most villages in the region of cantabria in spain are associated with the basque village of la arena for example if we look reinosa we see the result reinosa la arena cantabria spain europe when right is reinosa cantabria spain europe the origin of the problem was an incorrect labeling of the locality la arena as place county which i corrected long ago however as we see in nominatim this relations has not been eliminated from the index and this error keeps appearing in searches la arena is still parent of most of cantabria i believe that this mistake can be the causer of whom cities as bilbao in the basque country appear in cantabria | 1 |
68,707 | 21,790,251,159 | IssuesEvent | 2022-05-14 19:39:09 | primefaces/primereact | https://api.github.com/repos/primefaces/primereact | opened | useImperativeHandle is preventing the default behaviour of useRef hook | defect | ### Describe the bug
Since the transition from Class components to Hook based components, the access to the methods of a component has been handled with the `useImperativeHandle` hook. [Example 1](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/datatable/DataTable.js#L1307), [Example 2](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/autocomplete/AutoComplete.js#L453), etc.
This hook, instead of adding methods to the ref of components, has been replacing the ref completly, blocking the default behaivour of the `useRef` hook, accesing the dom element. In V7 we would also get access to the component props which gives the developer a lot of flexibility and useful state data.
This issue needs two codesandboxes (SEE CONSOLE LOG):
V7: https://codesandbox.io/s/exciting-sound-6xncl2?file=/src/demo/AutoCompleteDemo.js
V8: https://codesandbox.io/s/nervous-shtern-fyhs9h?file=/src/demo/AutoCompleteDemo.js
### Reproducer
https://codesandbox.io/s/nervous-shtern-fyhs9h?file=/src/demo/AutoCompleteDemo.js
### PrimeReact version
8.1.0
### React version
18.x
### Language
ALL
### Build / Runtime
Create React App (CRA)
### Browser(s)
_No response_
### Steps to reproduce the behavior
1. Create an Autocomplete component.
2. Create a const with `useRef` hook.
3. Assign the const to the ref value of the Autocomplete component.
4. Console log the `ref.current` with a `useEffect` hook.
5. See no DOM element nor props.
### Expected behavior
When using a `useRef` hook and assigning the ref to a component, I expect to get access to the dom element (and its props, having used primereact V7). | 1.0 | useImperativeHandle is preventing the default behaviour of useRef hook - ### Describe the bug
Since the transition from Class components to Hook based components, the access to the methods of a component has been handled with the `useImperativeHandle` hook. [Example 1](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/datatable/DataTable.js#L1307), [Example 2](https://github.com/primefaces/primereact/blob/63cf7185d2fe2922a30f24c6f7afe8559b1ef5bc/components/lib/autocomplete/AutoComplete.js#L453), etc.
This hook, instead of adding methods to the ref of components, has been replacing the ref completly, blocking the default behaivour of the `useRef` hook, accesing the dom element. In V7 we would also get access to the component props which gives the developer a lot of flexibility and useful state data.
This issue needs two codesandboxes (SEE CONSOLE LOG):
V7: https://codesandbox.io/s/exciting-sound-6xncl2?file=/src/demo/AutoCompleteDemo.js
V8: https://codesandbox.io/s/nervous-shtern-fyhs9h?file=/src/demo/AutoCompleteDemo.js
### Reproducer
https://codesandbox.io/s/nervous-shtern-fyhs9h?file=/src/demo/AutoCompleteDemo.js
### PrimeReact version
8.1.0
### React version
18.x
### Language
ALL
### Build / Runtime
Create React App (CRA)
### Browser(s)
_No response_
### Steps to reproduce the behavior
1. Create an Autocomplete component.
2. Create a const with `useRef` hook.
3. Assign the const to the ref value of the Autocomplete component.
4. Console log the `ref.current` with a `useEffect` hook.
5. See no DOM element nor props.
### Expected behavior
When using a `useRef` hook and assigning the ref to a component, I expect to get access to the dom element (and its props, having used primereact V7). | defect | useimperativehandle is preventing the default behaviour of useref hook describe the bug since the transition from class components to hook based components the access to the methods of a component has been handled with the useimperativehandle hook etc this hook instead of adding methods to the ref of components has been replacing the ref completly blocking the default behaivour of the useref hook accesing the dom element in we would also get access to the component props which gives the developer a lot of flexibility and useful state data this issue needs two codesandboxes see console log reproducer primereact version react version x language all build runtime create react app cra browser s no response steps to reproduce the behavior create an autocomplete component create a const with useref hook assign the const to the ref value of the autocomplete component console log the ref current with a useeffect hook see no dom element nor props expected behavior when using a useref hook and assigning the ref to a component i expect to get access to the dom element and its props having used primereact | 1 |
201,651 | 15,216,492,197 | IssuesEvent | 2021-02-17 15:34:05 | spring-projects/spring-framework | https://api.github.com/repos/spring-projects/spring-framework | closed | Support cookies with Expires attribute but no Max-Age attribute in MockHttpServletResponse | in: test in: web type: backport type: enhancement | Backport of gh-26558 | 1.0 | Support cookies with Expires attribute but no Max-Age attribute in MockHttpServletResponse - Backport of gh-26558 | non_defect | support cookies with expires attribute but no max age attribute in mockhttpservletresponse backport of gh | 0 |
132,977 | 10,775,787,504 | IssuesEvent | 2019-11-03 16:29:46 | aragakerubo/teamwork-backend | https://api.github.com/repos/aragakerubo/teamwork-backend | closed | Setup unit testing. Ensure all implemented functionality is tested henceforth. | backend chore unit test | #### Why is this important?
To ensure the backend has all functionality working as intended. You may use Mocha as the test runner and `chai` and `chai-http` for the unit tests. | 1.0 | Setup unit testing. Ensure all implemented functionality is tested henceforth. - #### Why is this important?
To ensure the backend has all functionality working as intended. You may use Mocha as the test runner and `chai` and `chai-http` for the unit tests. | non_defect | setup unit testing ensure all implemented functionality is tested henceforth why is this important to ensure the backend has all functionality working as intended you may use mocha as the test runner and chai and chai http for the unit tests | 0 |
24,201 | 3,924,241,942 | IssuesEvent | 2016-04-22 14:35:10 | googlei18n/libphonenumber | https://api.github.com/repos/googlei18n/libphonenumber | closed | German Voicemail Access Numbers | priority-medium type-defect | Imported from [Google Code issue #439](https://code.google.com/p/libphonenumber/issues/detail?id=439) created by [cfkoch](https://code.google.com/u/108438119205599918600/) on 2014-03-21T21:53:37.000Z:
----
See also Issue # 359
Infix operators for 015X are clarified: inthe Bundesnetzagentur documentation that lararen... linked to.
( http://www.bundesnetzagentur.de/SharedDocs/Downloads/DE/Sachgebiete/Telekommunikation/Unternehmen_Institutionen/Nummerierung/Rufnummern/Mobile%20Dienste/NummernplanMobileDienste.pdf?__blob=publicationFile&v=8 )
They explictly state that for mobile phone numbers, between the Prefix and the Assigned Number there can be an infix. (section 2.5)
So far in the metadata, only 0177 99 is accounted for.
0150 00 ZZXXXXXX *****
0151 13 ZXXXXXXX
0151 13 ZZXXXXXX
0152Z 55 XXXXXXX VOICEMAIL
0152Z 50 XXXXXXX FAX
0153 00 ZZXXXXXX *****
0154 00 ZZXXXXXX *****
0155 00 ZZXXXXXX *****
0156 00 ZZXXXXXX *****
0157Z 99 XXXXXXX
0158 00 ZZXXXXXX *****
0159Z 33 XXXXXXX
0160 13 XXXXXXX(X)
0170 13 XXXXXXX(X)
0171 13 XXXXXXX(X)
0172 55 XXXXXXX(X) VOICEMAIL
0172 50 XXXXXXX(X) FAX
0173 55 XXXXXXX(X) VOICEMAIL
0173 50 XXXXXXX(X) FAX
0174 55 XXXXXXX(X) VOICEMAIL
0174 50 XXXXXXX(X) FAX
0175 13 XXXXXXX(X)
0176 33 XXXXXXX(X)
0177 99 XXXXXXX(X)
0178 99 XXXXXXX(X)
0179 33 XXXXXXX(X)
*** The document lists 00 as the Infix, but does not qualify whether that's a holding number for the document, or the planned service infix. These were only available for assignment for a few months now; none are yet in service that I know of.
| 1.0 | German Voicemail Access Numbers - Imported from [Google Code issue #439](https://code.google.com/p/libphonenumber/issues/detail?id=439) created by [cfkoch](https://code.google.com/u/108438119205599918600/) on 2014-03-21T21:53:37.000Z:
----
See also Issue # 359
Infix operators for 015X are clarified: inthe Bundesnetzagentur documentation that lararen... linked to.
( http://www.bundesnetzagentur.de/SharedDocs/Downloads/DE/Sachgebiete/Telekommunikation/Unternehmen_Institutionen/Nummerierung/Rufnummern/Mobile%20Dienste/NummernplanMobileDienste.pdf?__blob=publicationFile&v=8 )
They explictly state that for mobile phone numbers, between the Prefix and the Assigned Number there can be an infix. (section 2.5)
So far in the metadata, only 0177 99 is accounted for.
0150 00 ZZXXXXXX *****
0151 13 ZXXXXXXX
0151 13 ZZXXXXXX
0152Z 55 XXXXXXX VOICEMAIL
0152Z 50 XXXXXXX FAX
0153 00 ZZXXXXXX *****
0154 00 ZZXXXXXX *****
0155 00 ZZXXXXXX *****
0156 00 ZZXXXXXX *****
0157Z 99 XXXXXXX
0158 00 ZZXXXXXX *****
0159Z 33 XXXXXXX
0160 13 XXXXXXX(X)
0170 13 XXXXXXX(X)
0171 13 XXXXXXX(X)
0172 55 XXXXXXX(X) VOICEMAIL
0172 50 XXXXXXX(X) FAX
0173 55 XXXXXXX(X) VOICEMAIL
0173 50 XXXXXXX(X) FAX
0174 55 XXXXXXX(X) VOICEMAIL
0174 50 XXXXXXX(X) FAX
0175 13 XXXXXXX(X)
0176 33 XXXXXXX(X)
0177 99 XXXXXXX(X)
0178 99 XXXXXXX(X)
0179 33 XXXXXXX(X)
*** The document lists 00 as the Infix, but does not qualify whether that's a holding number for the document, or the planned service infix. These were only available for assignment for a few months now; none are yet in service that I know of.
| defect | german voicemail access numbers imported from created by on see also issue nbsp infix operators for are clarified inthe bundesnetzagentur documentation that lararen linked to they explictly state that for mobile phone numbers between the prefix and the assigned number there can be an infix section so far in the metadata only is accounted for zzxxxxxx zxxxxxxx zzxxxxxx xxxxxxx voicemail xxxxxxx fax zzxxxxxx zzxxxxxx zzxxxxxx zzxxxxxx xxxxxxx zzxxxxxx xxxxxxx xxxxxxx x xxxxxxx x xxxxxxx x xxxxxxx x voicemail xxxxxxx x fax xxxxxxx x voicemail xxxxxxx x fax xxxxxxx x voicemail xxxxxxx x fax xxxxxxx x xxxxxxx x xxxxxxx x xxxxxxx x xxxxxxx x the document lists as the infix but does not qualify whether that s a holding number for the document or the planned service infix these were only available for assignment for a few months now none are yet in service that i know of | 1 |
51,226 | 13,207,397,775 | IssuesEvent | 2020-08-14 22:57:07 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | opened | cmake goodies from trunk for V01-11-02 (Trac #66) | Incomplete Migration Migrated from Trac defect offline-software | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/66">https://code.icecube.wisc.edu/projects/icecube/ticket/66</a>, reported by blaufuss</summary>
<p>
```json
{
"status": "closed",
"changetime": "2007-11-11T03:51:18",
"_ts": "1194753078000000",
"description": "esp. stuff from rev #33235 (svn info in ENGLISH, por favor)",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"time": "2007-06-15T03:01:29",
"component": "offline-software",
"summary": "cmake goodies from trunk for V01-11-02",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
| 1.0 | cmake goodies from trunk for V01-11-02 (Trac #66) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/66">https://code.icecube.wisc.edu/projects/icecube/ticket/66</a>, reported by blaufuss</summary>
<p>
```json
{
"status": "closed",
"changetime": "2007-11-11T03:51:18",
"_ts": "1194753078000000",
"description": "esp. stuff from rev #33235 (svn info in ENGLISH, por favor)",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"time": "2007-06-15T03:01:29",
"component": "offline-software",
"summary": "cmake goodies from trunk for V01-11-02",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
| defect | cmake goodies from trunk for trac migrated from json status closed changetime ts description esp stuff from rev svn info in english por favor reporter blaufuss cc resolution fixed time component offline software summary cmake goodies from trunk for priority normal keywords milestone owner type defect | 1 |
79,907 | 15,300,817,454 | IssuesEvent | 2021-02-24 12:49:57 | ClickHouse/ClickHouse | https://api.github.com/repos/ClickHouse/ClickHouse | closed | about protobuf3 [ repeated ] message with all fields by default. | comp-formats unfinished code | **about protobuf3 [ repeated ] message with all fields by default.**
When clickhouse serializes fields of type in protobuf3, if repeated is an nested data structure with multiple fields in it. When only one field has a value, clickhouse reports an error.
The error message is as follows:
```
Elements ‘a.aaa’ and ‘a.bbb’ of Nested data structure ‘a’ (Array columns) have different array sizes.
```
* ClickHouse server version is:19.16.2.2
* `CREATE TABLE` statements for all tables involved
```
CREATE TABLE test_table
(
`a.aaa` Array(String),
`a.bbb` Array(String),
)
ENGINE = Kafka()
SETTINGS kafka_broker_list = '192.168.10.100:39092', kafka_topic_list = 'aaa', kafka_group_name = 'group_aaa', kafka_format = 'Protobuf', kafka_schema = 'aaaa.proto:testMessage', kafka_num_consumers = 2, kafka_skip_broken_messages = 1
```
* protocbuf3
```
syntax = "proto3";
message testMessage {
repeated testNested a = 1;
}
message testNested {
string aaa = 1;
string bbb = 2;
}
```
**Expected behavior**
The default value in protobuf is expected to be set to nested
| 1.0 | about protobuf3 [ repeated ] message with all fields by default. - **about protobuf3 [ repeated ] message with all fields by default.**
When clickhouse serializes fields of type in protobuf3, if repeated is an nested data structure with multiple fields in it. When only one field has a value, clickhouse reports an error.
The error message is as follows:
```
Elements ‘a.aaa’ and ‘a.bbb’ of Nested data structure ‘a’ (Array columns) have different array sizes.
```
* ClickHouse server version is:19.16.2.2
* `CREATE TABLE` statements for all tables involved
```
CREATE TABLE test_table
(
`a.aaa` Array(String),
`a.bbb` Array(String),
)
ENGINE = Kafka()
SETTINGS kafka_broker_list = '192.168.10.100:39092', kafka_topic_list = 'aaa', kafka_group_name = 'group_aaa', kafka_format = 'Protobuf', kafka_schema = 'aaaa.proto:testMessage', kafka_num_consumers = 2, kafka_skip_broken_messages = 1
```
* protocbuf3
```
syntax = "proto3";
message testMessage {
repeated testNested a = 1;
}
message testNested {
string aaa = 1;
string bbb = 2;
}
```
**Expected behavior**
The default value in protobuf is expected to be set to nested
| non_defect | about message with all fields by default about message with all fields by default when clickhouse serializes fields of type in if repeated is an nested data structure with multiple fields in it when only one field has a value clickhouse reports an error the error message is as follows elements ‘a aaa’ and ‘a bbb’ of nested data structure ‘a’ array columns have different array sizes clickhouse server version is create table statements for all tables involved create table test table a aaa array string a bbb array string engine kafka settings kafka broker list kafka topic list aaa kafka group name group aaa kafka format protobuf kafka schema aaaa proto testmessage kafka num consumers kafka skip broken messages syntax message testmessage repeated testnested a message testnested string aaa string bbb expected behavior the default value in protobuf is expected to be set to nested | 0 |
46,965 | 13,056,007,822 | IssuesEvent | 2020-07-30 03:22:31 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | [filterscripts] L2 Monopole Reconstruction throws (Trac #2198) | Incomplete Migration Migrated from Trac combo reconstruction defect | Migrated from https://code.icecube.wisc.edu/ticket/2198
```json
{
"status": "closed",
"changetime": "2019-02-13T14:15:18",
"description": "L2 monopole reconstruction is throwing...\n\n\n File \"/home/olivas/icecube/combo/cisystem/build/lib/icecube/filterscripts/offlineL2/level2_Reconstruction_Monopole.py\", line 328, in <lambda>\n If = lambda f: \"FilterMask\" in f and f[\"FilterMask\"][filter_globals.MonopoleFilter].condition_passed and If(f)\nRuntimeError: maximum recursion depth exceeded while calling a Python object\n\n\nI'm guessing this is related to recent activity by flambda.",
"reporter": "olivas",
"cc": "blaufuss",
"resolution": "fixed",
"_ts": "1550067318169976",
"component": "combo reconstruction",
"summary": "[filterscripts] L2 Monopole Reconstruction throws",
"priority": "normal",
"keywords": "",
"time": "2018-10-11T16:15:21",
"milestone": "",
"owner": "flauber",
"type": "defect"
}
```
| 1.0 | [filterscripts] L2 Monopole Reconstruction throws (Trac #2198) - Migrated from https://code.icecube.wisc.edu/ticket/2198
```json
{
"status": "closed",
"changetime": "2019-02-13T14:15:18",
"description": "L2 monopole reconstruction is throwing...\n\n\n File \"/home/olivas/icecube/combo/cisystem/build/lib/icecube/filterscripts/offlineL2/level2_Reconstruction_Monopole.py\", line 328, in <lambda>\n If = lambda f: \"FilterMask\" in f and f[\"FilterMask\"][filter_globals.MonopoleFilter].condition_passed and If(f)\nRuntimeError: maximum recursion depth exceeded while calling a Python object\n\n\nI'm guessing this is related to recent activity by flambda.",
"reporter": "olivas",
"cc": "blaufuss",
"resolution": "fixed",
"_ts": "1550067318169976",
"component": "combo reconstruction",
"summary": "[filterscripts] L2 Monopole Reconstruction throws",
"priority": "normal",
"keywords": "",
"time": "2018-10-11T16:15:21",
"milestone": "",
"owner": "flauber",
"type": "defect"
}
```
| defect | monopole reconstruction throws trac migrated from json status closed changetime description monopole reconstruction is throwing n n n file home olivas icecube combo cisystem build lib icecube filterscripts reconstruction monopole py line in n if lambda f filtermask in f and f condition passed and if f nruntimeerror maximum recursion depth exceeded while calling a python object n n ni m guessing this is related to recent activity by flambda reporter olivas cc blaufuss resolution fixed ts component combo reconstruction summary monopole reconstruction throws priority normal keywords time milestone owner flauber type defect | 1 |
37,103 | 8,234,242,285 | IssuesEvent | 2018-09-08 12:04:50 | SeasideSt/Seaside | https://api.github.com/repos/SeasideSt/Seaside | reopened | WAUrl doesn't remember trailing slash | Priority-Medium Type-Defect auto-migrated | ```
I thought there was an issue open for this but I can't find it.
There can be a big difference between:
http://foo/bar
and
http://foo/bar/
but WAUrl doesn't keep track of that.
```
Original issue reported on code.google.com by `jfitz...@gmail.com` on 3 Aug 2009 at 11:09
| 1.0 | WAUrl doesn't remember trailing slash - ```
I thought there was an issue open for this but I can't find it.
There can be a big difference between:
http://foo/bar
and
http://foo/bar/
but WAUrl doesn't keep track of that.
```
Original issue reported on code.google.com by `jfitz...@gmail.com` on 3 Aug 2009 at 11:09
| defect | waurl doesn t remember trailing slash i thought there was an issue open for this but i can t find it there can be a big difference between and but waurl doesn t keep track of that original issue reported on code google com by jfitz gmail com on aug at | 1 |
65,777 | 19,689,764,451 | IssuesEvent | 2022-01-12 04:49:59 | scipy/scipy | https://api.github.com/repos/scipy/scipy | closed | BUG: Result of scipy filtfilt() and MATLAB filtfilt() are not the same | defect | ### Describe your issue.
I use scipy filtfilt() and MATLAB filtfilt() to check result consistency. But I found result of scipy filtfilt() and MATLAB filtfilt() are not the same. Please refer to the attached file for more detail.
[td_30.txt](https://github.com/scipy/scipy/files/7833017/td_30.txt)
[python3_filtfilt_30_result.txt](https://github.com/scipy/scipy/files/7833023/python3_filtfilt_30_result.txt)
[matlab_filtfilt_test_30.zip](https://github.com/scipy/scipy/files/7833032/matlab_filtfilt_test_30.zip)
[scipy_filtfilt_test_30.zip](https://github.com/scipy/scipy/files/7833033/scipy_filtfilt_test_30.zip)
[MATLAB_filtfilt_30_result.txt](https://github.com/scipy/scipy/files/7833012/MATLAB_filtfilt_30_result.txt)
### Reproducing Code Example
```python
import sys, scipy, numpy
from scipy import signal
import matplotlib.pyplot as plt
#print(scipy.__version__, numpy.__version__, sys.version_info)
b = [0.979972841549963,-5.87983704929978, 14.6995926232494, -19.5994568309993, 14.6995926232494, -5.87983704929978, 0.979972841549963]
a = [1, -5.95953942986552, 14.798514866178, -19.5986547008315, 14.6002692932459, -5.80093679890116, 0.960346770175509]
x = [15533041, 15664561, 15604100, 15392527, 15430503, 15653269, 15625920, 15419887, 15394668, 15634167, 15356747,
15554813, 15667928, 15530125, 15357011, 15512619, 15669586, 15569659, 15370252, 15624583, 15418304, 15396501,
15635446, 15643134, 15448808, 15369416, 15601116, 15658661, 15514278, 15670239]
y = signal.filtfilt(b, a, x)
print("x=", x)
print("b=", b)
print("a=", a)
print("y=", y)
```
### Error message
```shell
There are no error messages.
```
### SciPy/NumPy/Python version information
1.7.3 1.22.0 sys.version_info(major=3, minor=8, micro=10, releaselevel='final', serial=0) | 1.0 | BUG: Result of scipy filtfilt() and MATLAB filtfilt() are not the same - ### Describe your issue.
I use scipy filtfilt() and MATLAB filtfilt() to check result consistency. But I found result of scipy filtfilt() and MATLAB filtfilt() are not the same. Please refer to the attached file for more detail.
[td_30.txt](https://github.com/scipy/scipy/files/7833017/td_30.txt)
[python3_filtfilt_30_result.txt](https://github.com/scipy/scipy/files/7833023/python3_filtfilt_30_result.txt)
[matlab_filtfilt_test_30.zip](https://github.com/scipy/scipy/files/7833032/matlab_filtfilt_test_30.zip)
[scipy_filtfilt_test_30.zip](https://github.com/scipy/scipy/files/7833033/scipy_filtfilt_test_30.zip)
[MATLAB_filtfilt_30_result.txt](https://github.com/scipy/scipy/files/7833012/MATLAB_filtfilt_30_result.txt)
### Reproducing Code Example
```python
import sys, scipy, numpy
from scipy import signal
import matplotlib.pyplot as plt
#print(scipy.__version__, numpy.__version__, sys.version_info)
b = [0.979972841549963,-5.87983704929978, 14.6995926232494, -19.5994568309993, 14.6995926232494, -5.87983704929978, 0.979972841549963]
a = [1, -5.95953942986552, 14.798514866178, -19.5986547008315, 14.6002692932459, -5.80093679890116, 0.960346770175509]
x = [15533041, 15664561, 15604100, 15392527, 15430503, 15653269, 15625920, 15419887, 15394668, 15634167, 15356747,
15554813, 15667928, 15530125, 15357011, 15512619, 15669586, 15569659, 15370252, 15624583, 15418304, 15396501,
15635446, 15643134, 15448808, 15369416, 15601116, 15658661, 15514278, 15670239]
y = signal.filtfilt(b, a, x)
print("x=", x)
print("b=", b)
print("a=", a)
print("y=", y)
```
### Error message
```shell
There are no error messages.
```
### SciPy/NumPy/Python version information
1.7.3 1.22.0 sys.version_info(major=3, minor=8, micro=10, releaselevel='final', serial=0) | defect | bug result of scipy filtfilt and matlab filtfilt are not the same describe your issue i use scipy filtfilt and matlab filtfilt to check result consistency but i found result of scipy filtfilt and matlab filtfilt are not the same please refer to the attached file for more detail reproducing code example python import sys scipy numpy from scipy import signal import matplotlib pyplot as plt print scipy version numpy version sys version info b a x y signal filtfilt b a x print x x print b b print a a print y y error message shell there are no error messages scipy numpy python version information sys version info major minor micro releaselevel final serial | 1 |
45,997 | 13,055,834,768 | IssuesEvent | 2020-07-30 02:52:33 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | fix simulation so all bots report "green" (Trac #394) | Incomplete Migration Migrated from Trac combo simulation defect | Migrated from https://code.icecube.wisc.edu/ticket/394
```json
{
"status": "closed",
"changetime": "2012-05-30T16:43:18",
"description": "",
"reporter": "nega",
"cc": "blaufuss olivas",
"resolution": "fixed",
"_ts": "1338396198000000",
"component": "combo simulation",
"summary": "fix simulation so all bots report \"green\"",
"priority": "normal",
"keywords": "",
"time": "2012-05-17T14:20:40",
"milestone": "",
"owner": "kjmeagher",
"type": "defect"
}
```
| 1.0 | fix simulation so all bots report "green" (Trac #394) - Migrated from https://code.icecube.wisc.edu/ticket/394
```json
{
"status": "closed",
"changetime": "2012-05-30T16:43:18",
"description": "",
"reporter": "nega",
"cc": "blaufuss olivas",
"resolution": "fixed",
"_ts": "1338396198000000",
"component": "combo simulation",
"summary": "fix simulation so all bots report \"green\"",
"priority": "normal",
"keywords": "",
"time": "2012-05-17T14:20:40",
"milestone": "",
"owner": "kjmeagher",
"type": "defect"
}
```
| defect | fix simulation so all bots report green trac migrated from json status closed changetime description reporter nega cc blaufuss olivas resolution fixed ts component combo simulation summary fix simulation so all bots report green priority normal keywords time milestone owner kjmeagher type defect | 1 |
554,370 | 16,419,048,597 | IssuesEvent | 2021-05-19 10:16:31 | ita-social-projects/TeachUA | https://api.github.com/repos/ita-social-projects/TeachUA | closed | [Додати локацію] The size of pop-up window is incorrect | Priority: Medium bug | **Environment:** Windows, Google Chrome 88.0.4324.190 (64-bit).
**Reproducible:** always.
**Build found:** last commit from https://speak-ukrainian.org.ua/dev/clubs
**Steps to reproduce**
1. Go to https://speak-ukrainian.org.ua/dev/
2. Click on 'Додати гурток' button on the main page
3. Fill/choose all mandatory parameters on 'Основна інформація' step and click on 'Наступний крок' button
4. Click on 'Додати локацію' on 'Контакти' step
5. Pay attention to the size of a 'Додати локацію' pop-up
**Actual result**

**Expected result**

**User story and test case links**
User story #291
**Labels to be added**
"Bug", Priority ("pri: medium"), Severity ("severity: minor"), Type ("UI").
| 1.0 | [Додати локацію] The size of pop-up window is incorrect - **Environment:** Windows, Google Chrome 88.0.4324.190 (64-bit).
**Reproducible:** always.
**Build found:** last commit from https://speak-ukrainian.org.ua/dev/clubs
**Steps to reproduce**
1. Go to https://speak-ukrainian.org.ua/dev/
2. Click on 'Додати гурток' button on the main page
3. Fill/choose all mandatory parameters on 'Основна інформація' step and click on 'Наступний крок' button
4. Click on 'Додати локацію' on 'Контакти' step
5. Pay attention to the size of a 'Додати локацію' pop-up
**Actual result**

**Expected result**

**User story and test case links**
User story #291
**Labels to be added**
"Bug", Priority ("pri: medium"), Severity ("severity: minor"), Type ("UI").
| non_defect | the size of pop up window is incorrect environment windows google chrome bit reproducible always build found last commit from steps to reproduce go to click on додати гурток button on the main page fill choose all mandatory parameters on основна інформація step and click on наступний крок button click on додати локацію on контакти step pay attention to the size of a додати локацію pop up actual result expected result user story and test case links user story labels to be added bug priority pri medium severity severity minor type ui | 0 |
34,321 | 12,266,032,528 | IssuesEvent | 2020-05-07 08:15:53 | Shuunen/td-express | https://api.github.com/repos/Shuunen/td-express | closed | CVE-2017-16137 Medium Severity Vulnerability detected by WhiteSource | security vulnerability | ## CVE-2017-16137 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-2.2.0.tgz</b></p></summary>
<p>small debugging utility</p>
<p>path: null</p>
<p>
<p>Library home page: <a href=http://registry.npmjs.org/debug/-/debug-2.2.0.tgz>http://registry.npmjs.org/debug/-/debug-2.2.0.tgz</a></p>
Dependency Hierarchy:
- :x: **debug-2.2.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16137>CVE-2017-16137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/534">https://nodesecurity.io/advisories/534</a></p>
<p>Release Date: 2017-09-27</p>
<p>Fix Resolution: Version 2.x.x: Update to version 2.6.9 or later.
Version 3.x.x: Update to version 3.1.0 or later.</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2017-16137 Medium Severity Vulnerability detected by WhiteSource - ## CVE-2017-16137 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>debug-2.2.0.tgz</b></p></summary>
<p>small debugging utility</p>
<p>path: null</p>
<p>
<p>Library home page: <a href=http://registry.npmjs.org/debug/-/debug-2.2.0.tgz>http://registry.npmjs.org/debug/-/debug-2.2.0.tgz</a></p>
Dependency Hierarchy:
- :x: **debug-2.2.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter. It takes around 50k characters to block for 2 seconds making this a low severity issue.
<p>Publish Date: 2018-06-07
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-16137>CVE-2017-16137</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nodesecurity.io/advisories/534">https://nodesecurity.io/advisories/534</a></p>
<p>Release Date: 2017-09-27</p>
<p>Fix Resolution: Version 2.x.x: Update to version 2.6.9 or later.
Version 3.x.x: Update to version 3.1.0 or later.</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve medium severity vulnerability detected by whitesource cve medium severity vulnerability vulnerable library debug tgz small debugging utility path null library home page a href dependency hierarchy x debug tgz vulnerable library vulnerability details the debug module is vulnerable to regular expression denial of service when untrusted user input is passed into the o formatter it takes around characters to block for seconds making this a low severity issue publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution version x x update to version or later version x x update to version or later step up your open source security game with whitesource | 0 |
226,223 | 7,511,106,711 | IssuesEvent | 2018-04-11 04:43:19 | Codaone/DEXBot | https://api.github.com/repos/Codaone/DEXBot | closed | Verify and document fool-proof apt+git+pip installation guide for Ubuntu | Priority: High Status: Completed Type: Enhancement | Until the packages work as intended or there are native packages, we need at least one installation method on Linux that is guaranteed to work, in order to get users up and running. | 1.0 | Verify and document fool-proof apt+git+pip installation guide for Ubuntu - Until the packages work as intended or there are native packages, we need at least one installation method on Linux that is guaranteed to work, in order to get users up and running. | non_defect | verify and document fool proof apt git pip installation guide for ubuntu until the packages work as intended or there are native packages we need at least one installation method on linux that is guaranteed to work in order to get users up and running | 0 |
104,358 | 16,613,638,351 | IssuesEvent | 2021-06-02 14:19:45 | Thanraj/linux-4.1.15 | https://api.github.com/repos/Thanraj/linux-4.1.15 | opened | CVE-2017-12188 (High) detected in linux-stable-rtv4.1.33 | security vulnerability | ## CVE-2017-12188 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Thanraj/linux-4.1.15/commits/5e3fb3e332499e1ad10a0969e55582af1027b085">5e3fb3e332499e1ad10a0969e55582af1027b085</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux-4.1.15/arch/x86/kvm/paging_tmpl.h</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
arch/x86/kvm/mmu.c in the Linux kernel through 4.13.5, when nested virtualisation is used, does not properly traverse guest pagetable entries to resolve a guest virtual address, which allows L1 guest OS users to execute arbitrary code on the host OS or cause a denial of service (incorrect index during page walking, and host OS crash), aka an "MMU potential stack buffer overrun."
<p>Publish Date: 2017-10-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-12188>CVE-2017-12188</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-12188">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-12188</a></p>
<p>Release Date: 2017-10-11</p>
<p>Fix Resolution: v4.14-rc5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2017-12188 (High) detected in linux-stable-rtv4.1.33 - ## CVE-2017-12188 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/Thanraj/linux-4.1.15/commits/5e3fb3e332499e1ad10a0969e55582af1027b085">5e3fb3e332499e1ad10a0969e55582af1027b085</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>linux-4.1.15/arch/x86/kvm/paging_tmpl.h</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
arch/x86/kvm/mmu.c in the Linux kernel through 4.13.5, when nested virtualisation is used, does not properly traverse guest pagetable entries to resolve a guest virtual address, which allows L1 guest OS users to execute arbitrary code on the host OS or cause a denial of service (incorrect index during page walking, and host OS crash), aka an "MMU potential stack buffer overrun."
<p>Publish Date: 2017-10-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-12188>CVE-2017-12188</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-12188">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-12188</a></p>
<p>Release Date: 2017-10-11</p>
<p>Fix Resolution: v4.14-rc5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve high detected in linux stable cve high severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href found in base branch master vulnerable source files linux arch kvm paging tmpl h vulnerability details arch kvm mmu c in the linux kernel through when nested virtualisation is used does not properly traverse guest pagetable entries to resolve a guest virtual address which allows guest os users to execute arbitrary code on the host os or cause a denial of service incorrect index during page walking and host os crash aka an mmu potential stack buffer overrun publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope changed impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource | 0 |
327,892 | 24,159,897,271 | IssuesEvent | 2022-09-22 10:44:19 | NixOS/nix | https://api.github.com/repos/NixOS/nix | closed | How do I set s3 credentials? | documentation | I noticed I can upload to s3 with `nix copy`
```
To populate the current folder build output to a S3 binary cache:
$ nix copy --to s3://my-bucket?region=eu-west-1
```
But in the codebase, I never see where credentials are set in the AWSClientConfiguration. How am I supposed to use `nix copy` if I can't authenticate with aws? | 1.0 | How do I set s3 credentials? - I noticed I can upload to s3 with `nix copy`
```
To populate the current folder build output to a S3 binary cache:
$ nix copy --to s3://my-bucket?region=eu-west-1
```
But in the codebase, I never see where credentials are set in the AWSClientConfiguration. How am I supposed to use `nix copy` if I can't authenticate with aws? | non_defect | how do i set credentials i noticed i can upload to with nix copy to populate the current folder build output to a binary cache nix copy to my bucket region eu west but in the codebase i never see where credentials are set in the awsclientconfiguration how am i supposed to use nix copy if i can t authenticate with aws | 0 |
44,481 | 12,200,828,004 | IssuesEvent | 2020-04-30 05:51:31 | LiskHQ/lisk-desktop | https://api.github.com/repos/LiskHQ/lisk-desktop | closed | HW: Disconnecting Ledger S while logged in with Trezor T logs you out | priority: medium type: defect | ### Steps to reproduce
<!-- Provide an unambiguous set of steps to reproduce the bug -->
1. Connect both types of HW
1. Login with Trezor T
1. Disconnect Ledger S
### Actual result
<!--- Tell us what happens -->
You are logged out
### Expected result
<!--- Tell us what should happen -->
You see the popup saying Ledger S was disconnected and you stay logged in
### Version
<!--- Version and client OS / Branch version -->
1.19.0, 1.19.1
### Screenshot (if appropriate)
<!--- Please include screenshot capturing UI and open developer tools console -->
| 1.0 | HW: Disconnecting Ledger S while logged in with Trezor T logs you out - ### Steps to reproduce
<!-- Provide an unambiguous set of steps to reproduce the bug -->
1. Connect both types of HW
1. Login with Trezor T
1. Disconnect Ledger S
### Actual result
<!--- Tell us what happens -->
You are logged out
### Expected result
<!--- Tell us what should happen -->
You see the popup saying Ledger S was disconnected and you stay logged in
### Version
<!--- Version and client OS / Branch version -->
1.19.0, 1.19.1
### Screenshot (if appropriate)
<!--- Please include screenshot capturing UI and open developer tools console -->
| defect | hw disconnecting ledger s while logged in with trezor t logs you out steps to reproduce connect both types of hw login with trezor t disconnect ledger s actual result you are logged out expected result you see the popup saying ledger s was disconnected and you stay logged in version screenshot if appropriate | 1 |
8,171 | 6,446,818,094 | IssuesEvent | 2017-08-14 01:46:08 | CuBoulder/express | https://api.github.com/repos/CuBoulder/express | closed | Create list of sites to be Locked | deploy:Other improvement:Performance type:Task | ## Context
As we develop functionality to reduce the impact of sites that aren't updated for https://github.com/CuBoulder/express/issues/817, we need to use the reports available at https://www.colorado.edu/webcentral/admin/tools/reports/basic_report to come up with the list of sites to lock.
| True | Create list of sites to be Locked - ## Context
As we develop functionality to reduce the impact of sites that aren't updated for https://github.com/CuBoulder/express/issues/817, we need to use the reports available at https://www.colorado.edu/webcentral/admin/tools/reports/basic_report to come up with the list of sites to lock.
| non_defect | create list of sites to be locked context as we develop functionality to reduce the impact of sites that aren t updated for we need to use the reports available at to come up with the list of sites to lock | 0 |
28,206 | 5,221,372,461 | IssuesEvent | 2017-01-27 01:13:50 | elTiempoVuela/https-finder | https://api.github.com/repos/elTiempoVuela/https-finder | closed | Truncates longer domains | auto-migrated Priority-Medium Type-Defect | ```
What steps will reproduce the problem?
1. Go to www.qagoma.qld.gov.au
2. Add rule
3. Rule suggests '<rule from="^http://(www\.)?qagoma\.qld\.gov/"
to="https://qagoma.qld.gov.au/"/>' (leaves out country code)
What is the expected output? What do you see instead?
Should use "<rule from="^http://(www\.)?qagoma\.qld\.gov\.au/" instead.
What version of the product are you using? On what operating system?
0.91
```
Original issue reported on code.google.com by `reu...@yahoo.com` on 6 Jul 2014 at 2:42
| 1.0 | Truncates longer domains - ```
What steps will reproduce the problem?
1. Go to www.qagoma.qld.gov.au
2. Add rule
3. Rule suggests '<rule from="^http://(www\.)?qagoma\.qld\.gov/"
to="https://qagoma.qld.gov.au/"/>' (leaves out country code)
What is the expected output? What do you see instead?
Should use "<rule from="^http://(www\.)?qagoma\.qld\.gov\.au/" instead.
What version of the product are you using? On what operating system?
0.91
```
Original issue reported on code.google.com by `reu...@yahoo.com` on 6 Jul 2014 at 2:42
| defect | truncates longer domains what steps will reproduce the problem go to add rule rule suggests rule from to leaves out country code what is the expected output what do you see instead should use rule from instead what version of the product are you using on what operating system original issue reported on code google com by reu yahoo com on jul at | 1 |
283,103 | 24,523,136,949 | IssuesEvent | 2022-10-11 11:06:03 | sparcityeu/sparsebase | https://api.github.com/repos/sparcityeu/sparsebase | opened | Add GitHub action for testing installation in compiled and header only mode | priority: soon state: inactive type: testing | We test our code using GTest and example codes, but these codes are all compiled and ran with a certain CMake configuration where the `sparsebase` library in cmake is included. We need to make sure users can install SparseBase, include it, and link to it correctly. This is sort of a test for our CMake system. | 1.0 | Add GitHub action for testing installation in compiled and header only mode - We test our code using GTest and example codes, but these codes are all compiled and ran with a certain CMake configuration where the `sparsebase` library in cmake is included. We need to make sure users can install SparseBase, include it, and link to it correctly. This is sort of a test for our CMake system. | non_defect | add github action for testing installation in compiled and header only mode we test our code using gtest and example codes but these codes are all compiled and ran with a certain cmake configuration where the sparsebase library in cmake is included we need to make sure users can install sparsebase include it and link to it correctly this is sort of a test for our cmake system | 0 |
5,457 | 2,775,072,159 | IssuesEvent | 2015-05-04 14:04:37 | timvideos/gst-switch | https://api.github.com/repos/timvideos/gst-switch | closed | Errors when running `make test` | server tests | Running `make test` produces the following output.
```
make
make[1]: Entering directory '/home/lee/src/gst-switch-int-fixes'
make all-recursive
make[2]: Entering directory '/home/lee/src/gst-switch-int-fixes'
Making all in plugins
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/plugins'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/plugins'
Making all in tools
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/tools'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tools'
Making all in tests/unit
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
make all-am
make[4]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
make[4]: Nothing to be done for 'all-am'.
make[4]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
Making all in tests
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests'
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes'
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes'
make[2]: Leaving directory '/home/lee/src/gst-switch-int-fixes'
make[1]: Leaving directory '/home/lee/src/gst-switch-int-fixes'
rm -f "tests/test-recording "*" "*.data
rm -f "tests/test-server-"*.log
make[1]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests'
pkill gst-switch-srv || true
pkill gst-switch-ui || true
./test-switch-server --enable-test-video
/gst-switch/video:
cmdline: ../tools/gst-switch-srv \
-v \
--gst-debug-no-color \
--record=test-recording.data
./tests/test_switch_server.c:535:info: server 16548
Running test-video-source1 (30 seconds)
Running test-video-source3 (30 seconds)
Running test-video-source2 (30 seconds)
Running test_video_preview_sink3 (30 seconds)
Running test_video_preview_sink1 (30 seconds)
Running test_video_compose_sink (30 seconds)
Running test_video_preview_sink2 (30 seconds)
(test-switch-server:16547): GLib-CRITICAL **: Source ID 4 was not found when attempting to remove it
Makefile:1096: recipe for target 'test-video' failed
make[1]: *** [test-video] Trace/breakpoint trap (core dumped)
make[1]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests'
Makefile:1240: recipe for target 'test-video' failed
make: *** [test-video] Error 2
``` | 1.0 | Errors when running `make test` - Running `make test` produces the following output.
```
make
make[1]: Entering directory '/home/lee/src/gst-switch-int-fixes'
make all-recursive
make[2]: Entering directory '/home/lee/src/gst-switch-int-fixes'
Making all in plugins
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/plugins'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/plugins'
Making all in tools
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/tools'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tools'
Making all in tests/unit
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
make all-am
make[4]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
make[4]: Nothing to be done for 'all-am'.
make[4]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests/unit'
Making all in tests
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests'
make[3]: Nothing to be done for 'all'.
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests'
make[3]: Entering directory '/home/lee/src/gst-switch-int-fixes'
make[3]: Leaving directory '/home/lee/src/gst-switch-int-fixes'
make[2]: Leaving directory '/home/lee/src/gst-switch-int-fixes'
make[1]: Leaving directory '/home/lee/src/gst-switch-int-fixes'
rm -f "tests/test-recording "*" "*.data
rm -f "tests/test-server-"*.log
make[1]: Entering directory '/home/lee/src/gst-switch-int-fixes/tests'
pkill gst-switch-srv || true
pkill gst-switch-ui || true
./test-switch-server --enable-test-video
/gst-switch/video:
cmdline: ../tools/gst-switch-srv \
-v \
--gst-debug-no-color \
--record=test-recording.data
./tests/test_switch_server.c:535:info: server 16548
Running test-video-source1 (30 seconds)
Running test-video-source3 (30 seconds)
Running test-video-source2 (30 seconds)
Running test_video_preview_sink3 (30 seconds)
Running test_video_preview_sink1 (30 seconds)
Running test_video_compose_sink (30 seconds)
Running test_video_preview_sink2 (30 seconds)
(test-switch-server:16547): GLib-CRITICAL **: Source ID 4 was not found when attempting to remove it
Makefile:1096: recipe for target 'test-video' failed
make[1]: *** [test-video] Trace/breakpoint trap (core dumped)
make[1]: Leaving directory '/home/lee/src/gst-switch-int-fixes/tests'
Makefile:1240: recipe for target 'test-video' failed
make: *** [test-video] Error 2
``` | non_defect | errors when running make test running make test produces the following output make make entering directory home lee src gst switch int fixes make all recursive make entering directory home lee src gst switch int fixes making all in plugins make entering directory home lee src gst switch int fixes plugins make nothing to be done for all make leaving directory home lee src gst switch int fixes plugins making all in tools make entering directory home lee src gst switch int fixes tools make nothing to be done for all make leaving directory home lee src gst switch int fixes tools making all in tests unit make entering directory home lee src gst switch int fixes tests unit make all am make entering directory home lee src gst switch int fixes tests unit make nothing to be done for all am make leaving directory home lee src gst switch int fixes tests unit make leaving directory home lee src gst switch int fixes tests unit making all in tests make entering directory home lee src gst switch int fixes tests make nothing to be done for all make leaving directory home lee src gst switch int fixes tests make entering directory home lee src gst switch int fixes make leaving directory home lee src gst switch int fixes make leaving directory home lee src gst switch int fixes make leaving directory home lee src gst switch int fixes rm f tests test recording data rm f tests test server log make entering directory home lee src gst switch int fixes tests pkill gst switch srv true pkill gst switch ui true test switch server enable test video gst switch video cmdline tools gst switch srv v gst debug no color record test recording data tests test switch server c info server running test video seconds running test video seconds running test video seconds running test video preview seconds running test video preview seconds running test video compose sink seconds running test video preview seconds test switch server glib critical source id was not found when attempting to remove it makefile recipe for target test video failed make trace breakpoint trap core dumped make leaving directory home lee src gst switch int fixes tests makefile recipe for target test video failed make error | 0 |
78,306 | 27,421,014,486 | IssuesEvent | 2023-03-01 16:43:20 | vector-im/element-web | https://api.github.com/repos/vector-im/element-web | closed | GetPackageFamilyName could not be located in KERNEL32.dll | T-Defect | ### Steps to reproduce
When I started up the Element desktop app this morning, it fails to run, displaying this popup error:
> The procedure entry point GetPackageFamilyName could not be located in the dynamic link library KERNEL32.dll.
`GetPackageFamilyName` was introduced in Windows 8, but I am running on Windows 7.
The app was working perfectly fine yesterday. I had originally installed **v1.11.22.0**, but apparently somehow **v1.11.24.0** got installed. I DID NOT install v1.11.24.0 myself, did it auto-update itself?
I have now reinstalled v1.11.22.0 and it runs fine.
### Outcome
#### What did you expect?
I expected it to run.
#### What happened instead?
It failed to run.
### Operating system
Windows 7 Home Premium
### Application version
Element version: 1.11.24
### How did you install the app?
https://element.io/download
### Homeserver
matrix.org, gitter.im
### Will you send logs?
No | 1.0 | GetPackageFamilyName could not be located in KERNEL32.dll - ### Steps to reproduce
When I started up the Element desktop app this morning, it fails to run, displaying this popup error:
> The procedure entry point GetPackageFamilyName could not be located in the dynamic link library KERNEL32.dll.
`GetPackageFamilyName` was introduced in Windows 8, but I am running on Windows 7.
The app was working perfectly fine yesterday. I had originally installed **v1.11.22.0**, but apparently somehow **v1.11.24.0** got installed. I DID NOT install v1.11.24.0 myself, did it auto-update itself?
I have now reinstalled v1.11.22.0 and it runs fine.
### Outcome
#### What did you expect?
I expected it to run.
#### What happened instead?
It failed to run.
### Operating system
Windows 7 Home Premium
### Application version
Element version: 1.11.24
### How did you install the app?
https://element.io/download
### Homeserver
matrix.org, gitter.im
### Will you send logs?
No | defect | getpackagefamilyname could not be located in dll steps to reproduce when i started up the element desktop app this morning it fails to run displaying this popup error the procedure entry point getpackagefamilyname could not be located in the dynamic link library dll getpackagefamilyname was introduced in windows but i am running on windows the app was working perfectly fine yesterday i had originally installed but apparently somehow got installed i did not install myself did it auto update itself i have now reinstalled and it runs fine outcome what did you expect i expected it to run what happened instead it failed to run operating system windows home premium application version element version how did you install the app homeserver matrix org gitter im will you send logs no | 1 |
74,029 | 24,909,434,482 | IssuesEvent | 2022-10-29 17:22:49 | openzfs/zfs | https://api.github.com/repos/openzfs/zfs | reopened | PANIC at dsl_crypt.c:2441:dsl_crypto_populate_key_nvlist() | Type: Defect Status: Stale | ### System information
<!-- add version after "|" character -->
Type | Version/Name
--- | ---
Distribution Name | Debian
Distribution Version | 10.10
Kernel Version | 4.19.0-17-amd64
Architecture | x86_64
OpenZFS Version | 2.0.3-8~bpo10+1
### Describe the problem you're observing
Someone said to me "hey, I tried `zfs send -w pool/encrypted/filesystem | zfs receive some/where` and it works, but `zfs send -v -w pool/encrypted/filesystem | zfs receive some/where` errors out with Invalid argument, what gives", I went "I don't think I expect that to work", tried `zfs send -w mypool/encroot > /dev/null` and the magic smoke came out.
(Since I couldn't find any report of this before, I figured it was worth reporting for search results even if I try reproducing it on git master and it's been corrected.)
### Describe how to reproduce the problem
Above.
### Include any warning/errors/backtraces from the system logs
zfs get all phantasm/encroot:
```
NAME PROPERTY VALUE SOURCE
phantasm/encroot type filesystem -
phantasm/encroot creation Fri May 28 1:15 2021 -
phantasm/encroot used 707G -
phantasm/encroot available 2.82T -
phantasm/encroot referenced 760K -
phantasm/encroot compressratio 1.08x -
phantasm/encroot mounted no -
phantasm/encroot quota none default
phantasm/encroot reservation none default
phantasm/encroot recordsize 128K default
phantasm/encroot mountpoint /phantasm/encroot default
phantasm/encroot sharenfs off default
phantasm/encroot checksum on default
phantasm/encroot compression gzip-9 local
phantasm/encroot atime off inherited from phantasm
phantasm/encroot devices on default
phantasm/encroot exec on default
phantasm/encroot setuid on default
phantasm/encroot readonly off default
phantasm/encroot zoned off default
phantasm/encroot snapdir hidden default
phantasm/encroot aclmode discard default
phantasm/encroot aclinherit restricted default
phantasm/encroot createtxg 15965940 -
phantasm/encroot canmount on default
phantasm/encroot xattr sa inherited from phantasm
phantasm/encroot copies 1 default
phantasm/encroot version 5 -
phantasm/encroot utf8only off -
phantasm/encroot normalization none -
phantasm/encroot casesensitivity sensitive -
phantasm/encroot vscan off default
phantasm/encroot nbmand off default
phantasm/encroot sharesmb off default
phantasm/encroot refquota none default
phantasm/encroot refreservation none default
phantasm/encroot guid 8760648724915161698 -
phantasm/encroot primarycache all default
phantasm/encroot secondarycache all default
phantasm/encroot usedbysnapshots 0B -
phantasm/encroot usedbydataset 760K -
phantasm/encroot usedbychildren 707G -
phantasm/encroot usedbyrefreservation 0B -
phantasm/encroot logbias latency default
phantasm/encroot objsetid 914 -
phantasm/encroot dedup off default
phantasm/encroot mlslabel none default
phantasm/encroot sync standard default
phantasm/encroot dnodesize auto local
phantasm/encroot refcompressratio 1.00x -
phantasm/encroot written 760K -
phantasm/encroot logicalused 760G -
phantasm/encroot logicalreferenced 302K -
phantasm/encroot volmode default default
phantasm/encroot filesystem_limit none default
phantasm/encroot snapshot_limit none default
phantasm/encroot filesystem_count none default
phantasm/encroot snapshot_count none default
phantasm/encroot snapdev hidden default
phantasm/encroot acltype off default
phantasm/encroot context none default
phantasm/encroot fscontext none default
phantasm/encroot defcontext none default
phantasm/encroot rootcontext none default
phantasm/encroot relatime off default
phantasm/encroot redundant_metadata all default
phantasm/encroot overlay on default
phantasm/encroot encryption aes-256-gcm -
phantasm/encroot keylocation file:///workspace/highchurn/enc.key local
phantasm/encroot keyformat raw -
phantasm/encroot pbkdf2iters 0 default
phantasm/encroot encryptionroot phantasm/encroot -
phantasm/encroot keystatus available -
phantasm/encroot special_small_blocks 0 default
phantasm/encroot com.sun:auto-snapshot false inherited from phantasm
```
dmesg | tail -n mumble
```
[1284794.092468] VERIFY3(dp->dp_spa->spa_errata != 0) failed (0 != 0)
[1284794.092493] PANIC at dsl_crypt.c:2441:dsl_crypto_populate_key_nvlist()
[1284794.092513] Showing stack for process 35452
[1284794.092515] CPU: 15 PID: 35452 Comm: zfs Kdump: loaded Tainted: P OE 4.19.0-17-amd64 #1 Debian 4.19.194-2
[1284794.092516] Hardware name: Supermicro Super Server/X10SDV-TLN4F, BIOS 2.1 11/22/2019
[1284794.092516] Call Trace:
[1284794.092525] dump_stack+0x66/0x81
[1284794.092536] spl_panic+0xd3/0xfb [spl]
[1284794.092623] ? __raw_spin_unlock+0x5/0x10 [zfs]
[1284794.092692] ? zap_hashbits+0xa/0x20 [zfs]
[1284794.092759] ? zap_hash+0x36/0x210 [zfs]
[1284794.092826] ? zap_lookup_norm+0x9a/0xd0 [zfs]
[1284794.092878] dsl_crypto_populate_key_nvlist+0x5df/0x730 [zfs]
[1284794.092930] dmu_send_impl+0x5e3/0xbe0 [zfs]
[1284794.092938] ? tsd_hash_search+0x75/0xa0 [spl]
[1284794.092988] dmu_send+0x4ba/0x7e0 [zfs]
[1284794.092991] ? __switch_to+0x115/0x440
[1284794.092995] ? __switch_to_asm+0x35/0x70
[1284794.092998] ? _cond_resched+0x15/0x30
[1284794.093002] ? __kmalloc_node+0x1ea/0x2c0
[1284794.093006] ? spl_kmem_alloc_impl+0xd6/0xe0 [spl]
[1284794.093008] ? _cond_resched+0x15/0x30
[1284794.093015] ? nvt_nvpair_match+0x53/0x90 [znvpair]
[1284794.093019] ? nvt_remove_nvpair+0x88/0x120 [znvpair]
[1284794.093022] ? nvt_add_nvpair+0x54/0x110 [znvpair]
[1284794.093026] ? nvs_native_nvp_op+0xd0/0xd0 [znvpair]
[1284794.093029] ? nvs_decode_pairs+0x9d/0x120 [znvpair]
[1284794.093097] zfs_ioc_send_new+0x17d/0x1c0 [zfs]
[1284794.093168] ? dump_bytes_cb+0x20/0x20 [zfs]
[1284794.093237] zfsdev_ioctl_common+0x1f1/0x620 [zfs]
[1284794.093309] zfsdev_ioctl+0x4f/0xe0 [zfs]
[1284794.093312] do_vfs_ioctl+0xa4/0x630
[1284794.093316] ? do_munmap+0x33c/0x430
[1284794.093317] ksys_ioctl+0x60/0x90
[1284794.093319] __x64_sys_ioctl+0x16/0x20
[1284794.093323] do_syscall_64+0x53/0x110
[1284794.093326] entry_SYSCALL_64_after_hwframe+0x44/0xa9
[1284794.093327] RIP: 0033:0x7f9cc7861427
[1284794.093329] Code: 00 00 90 48 8b 05 69 aa 0c 00 64 c7 00 26 00 00 00 48 c7 c0 ff ff ff ff c3 66 2e 0f 1f 84 00 00 00 00 00 b8 10 00 00 00 0f 05 <48> 3d 01 f0 ff ff 73 01 c3 48 8b 0d 39 aa 0c 00 f7 d8 64 89 01 48
[1284794.093330] RSP: 002b:00007ffe117e1298 EFLAGS: 00000246 ORIG_RAX: 0000000000000010
[1284794.093332] RAX: ffffffffffffffda RBX: 00007ffe117e12c0 RCX: 00007f9cc7861427
[1284794.093333] RDX: 00007ffe117e12c0 RSI: 0000000000005a40 RDI: 0000000000000005
[1284794.093333] RBP: 00007ffe117e48b0 R08: 0000000000000002 R09: 00005613abc60340
[1284794.093334] R10: 00005613abc56010 R11: 0000000000000246 R12: 0000000000000000
[1284794.093335] R13: 0000000000005a40 R14: 0000000000005a40 R15: 00005613abc60340
[1284926.573319] INFO: task zfs:35452 blocked for more than 120 seconds.
[1284926.573345] Tainted: P OE 4.19.0-17-amd64 #1 Debian 4.19.194-2
[1284926.573368] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[1284926.573391] zfs D 0 35452 35451 0x80000004
[1284926.573394] Call Trace:
[1284926.573400] __schedule+0x29f/0x840
[1284926.573404] schedule+0x28/0x80
[1284926.573414] spl_panic+0xf9/0xfb [spl]
[1284926.573504] ? zap_hashbits+0xa/0x20 [zfs]
[1284926.573569] ? zap_hash+0x36/0x210 [zfs]
[1284926.573635] ? zap_lookup_norm+0x9a/0xd0 [zfs]
[1284926.573686] dsl_crypto_populate_key_nvlist+0x5df/0x730 [zfs]
[1284926.573737] dmu_send_impl+0x5e3/0xbe0 [zfs]
[1284926.573745] ? tsd_hash_search+0x75/0xa0 [spl]
[1284926.573794] dmu_send+0x4ba/0x7e0 [zfs]
[1284926.573799] ? __switch_to+0x115/0x440
[1284926.573802] ? __switch_to_asm+0x35/0x70
[1284926.573805] ? _cond_resched+0x15/0x30
[1284926.573808] ? __kmalloc_node+0x1ea/0x2c0
[1284926.573813] ? spl_kmem_alloc_impl+0xd6/0xe0 [spl]
[1284926.573814] ? _cond_resched+0x15/0x30
[1284926.573821] ? nvt_nvpair_match+0x53/0x90 [znvpair]
[1284926.573825] ? nvt_remove_nvpair+0x88/0x120 [znvpair]
[1284926.573828] ? nvt_add_nvpair+0x54/0x110 [znvpair]
[1284926.573831] ? nvs_native_nvp_op+0xd0/0xd0 [znvpair]
[1284926.573834] ? nvs_decode_pairs+0x9d/0x120 [znvpair]
[1284926.573899] zfs_ioc_send_new+0x17d/0x1c0 [zfs]
[1284926.573967] ? dump_bytes_cb+0x20/0x20 [zfs]
[1284926.574033] zfsdev_ioctl_common+0x1f1/0x620 [zfs]
[1284926.574101] zfsdev_ioctl+0x4f/0xe0 [zfs]
[1284926.574105] do_vfs_ioctl+0xa4/0x630
[1284926.574108] ? do_munmap+0x33c/0x430
[1284926.574110] ksys_ioctl+0x60/0x90
[1284926.574112] __x64_sys_ioctl+0x16/0x20
[1284926.574115] do_syscall_64+0x53/0x110
[1284926.574118] entry_SYSCALL_64_after_hwframe+0x44/0xa9
[1284926.574120] RIP: 0033:0x7f9cc7861427
[1284926.574125] Code: Bad RIP value.
[1284926.574126] RSP: 002b:00007ffe117e1298 EFLAGS: 00000246 ORIG_RAX: 0000000000000010
[1284926.574127] RAX: ffffffffffffffda RBX: 00007ffe117e12c0 RCX: 00007f9cc7861427
[1284926.574128] RDX: 00007ffe117e12c0 RSI: 0000000000005a40 RDI: 0000000000000005
[1284926.574129] RBP: 00007ffe117e48b0 R08: 0000000000000002 R09: 00005613abc60340
[1284926.574129] R10: 00005613abc56010 R11: 0000000000000246 R12: 0000000000000000
[1284926.574130] R13: 0000000000005a40 R14: 0000000000005a40 R15: 00005613abc60340
``` | 1.0 | PANIC at dsl_crypt.c:2441:dsl_crypto_populate_key_nvlist() - ### System information
<!-- add version after "|" character -->
Type | Version/Name
--- | ---
Distribution Name | Debian
Distribution Version | 10.10
Kernel Version | 4.19.0-17-amd64
Architecture | x86_64
OpenZFS Version | 2.0.3-8~bpo10+1
### Describe the problem you're observing
Someone said to me "hey, I tried `zfs send -w pool/encrypted/filesystem | zfs receive some/where` and it works, but `zfs send -v -w pool/encrypted/filesystem | zfs receive some/where` errors out with Invalid argument, what gives", I went "I don't think I expect that to work", tried `zfs send -w mypool/encroot > /dev/null` and the magic smoke came out.
(Since I couldn't find any report of this before, I figured it was worth reporting for search results even if I try reproducing it on git master and it's been corrected.)
### Describe how to reproduce the problem
Above.
### Include any warning/errors/backtraces from the system logs
zfs get all phantasm/encroot:
```
NAME PROPERTY VALUE SOURCE
phantasm/encroot type filesystem -
phantasm/encroot creation Fri May 28 1:15 2021 -
phantasm/encroot used 707G -
phantasm/encroot available 2.82T -
phantasm/encroot referenced 760K -
phantasm/encroot compressratio 1.08x -
phantasm/encroot mounted no -
phantasm/encroot quota none default
phantasm/encroot reservation none default
phantasm/encroot recordsize 128K default
phantasm/encroot mountpoint /phantasm/encroot default
phantasm/encroot sharenfs off default
phantasm/encroot checksum on default
phantasm/encroot compression gzip-9 local
phantasm/encroot atime off inherited from phantasm
phantasm/encroot devices on default
phantasm/encroot exec on default
phantasm/encroot setuid on default
phantasm/encroot readonly off default
phantasm/encroot zoned off default
phantasm/encroot snapdir hidden default
phantasm/encroot aclmode discard default
phantasm/encroot aclinherit restricted default
phantasm/encroot createtxg 15965940 -
phantasm/encroot canmount on default
phantasm/encroot xattr sa inherited from phantasm
phantasm/encroot copies 1 default
phantasm/encroot version 5 -
phantasm/encroot utf8only off -
phantasm/encroot normalization none -
phantasm/encroot casesensitivity sensitive -
phantasm/encroot vscan off default
phantasm/encroot nbmand off default
phantasm/encroot sharesmb off default
phantasm/encroot refquota none default
phantasm/encroot refreservation none default
phantasm/encroot guid 8760648724915161698 -
phantasm/encroot primarycache all default
phantasm/encroot secondarycache all default
phantasm/encroot usedbysnapshots 0B -
phantasm/encroot usedbydataset 760K -
phantasm/encroot usedbychildren 707G -
phantasm/encroot usedbyrefreservation 0B -
phantasm/encroot logbias latency default
phantasm/encroot objsetid 914 -
phantasm/encroot dedup off default
phantasm/encroot mlslabel none default
phantasm/encroot sync standard default
phantasm/encroot dnodesize auto local
phantasm/encroot refcompressratio 1.00x -
phantasm/encroot written 760K -
phantasm/encroot logicalused 760G -
phantasm/encroot logicalreferenced 302K -
phantasm/encroot volmode default default
phantasm/encroot filesystem_limit none default
phantasm/encroot snapshot_limit none default
phantasm/encroot filesystem_count none default
phantasm/encroot snapshot_count none default
phantasm/encroot snapdev hidden default
phantasm/encroot acltype off default
phantasm/encroot context none default
phantasm/encroot fscontext none default
phantasm/encroot defcontext none default
phantasm/encroot rootcontext none default
phantasm/encroot relatime off default
phantasm/encroot redundant_metadata all default
phantasm/encroot overlay on default
phantasm/encroot encryption aes-256-gcm -
phantasm/encroot keylocation file:///workspace/highchurn/enc.key local
phantasm/encroot keyformat raw -
phantasm/encroot pbkdf2iters 0 default
phantasm/encroot encryptionroot phantasm/encroot -
phantasm/encroot keystatus available -
phantasm/encroot special_small_blocks 0 default
phantasm/encroot com.sun:auto-snapshot false inherited from phantasm
```
dmesg | tail -n mumble
```
[1284794.092468] VERIFY3(dp->dp_spa->spa_errata != 0) failed (0 != 0)
[1284794.092493] PANIC at dsl_crypt.c:2441:dsl_crypto_populate_key_nvlist()
[1284794.092513] Showing stack for process 35452
[1284794.092515] CPU: 15 PID: 35452 Comm: zfs Kdump: loaded Tainted: P OE 4.19.0-17-amd64 #1 Debian 4.19.194-2
[1284794.092516] Hardware name: Supermicro Super Server/X10SDV-TLN4F, BIOS 2.1 11/22/2019
[1284794.092516] Call Trace:
[1284794.092525] dump_stack+0x66/0x81
[1284794.092536] spl_panic+0xd3/0xfb [spl]
[1284794.092623] ? __raw_spin_unlock+0x5/0x10 [zfs]
[1284794.092692] ? zap_hashbits+0xa/0x20 [zfs]
[1284794.092759] ? zap_hash+0x36/0x210 [zfs]
[1284794.092826] ? zap_lookup_norm+0x9a/0xd0 [zfs]
[1284794.092878] dsl_crypto_populate_key_nvlist+0x5df/0x730 [zfs]
[1284794.092930] dmu_send_impl+0x5e3/0xbe0 [zfs]
[1284794.092938] ? tsd_hash_search+0x75/0xa0 [spl]
[1284794.092988] dmu_send+0x4ba/0x7e0 [zfs]
[1284794.092991] ? __switch_to+0x115/0x440
[1284794.092995] ? __switch_to_asm+0x35/0x70
[1284794.092998] ? _cond_resched+0x15/0x30
[1284794.093002] ? __kmalloc_node+0x1ea/0x2c0
[1284794.093006] ? spl_kmem_alloc_impl+0xd6/0xe0 [spl]
[1284794.093008] ? _cond_resched+0x15/0x30
[1284794.093015] ? nvt_nvpair_match+0x53/0x90 [znvpair]
[1284794.093019] ? nvt_remove_nvpair+0x88/0x120 [znvpair]
[1284794.093022] ? nvt_add_nvpair+0x54/0x110 [znvpair]
[1284794.093026] ? nvs_native_nvp_op+0xd0/0xd0 [znvpair]
[1284794.093029] ? nvs_decode_pairs+0x9d/0x120 [znvpair]
[1284794.093097] zfs_ioc_send_new+0x17d/0x1c0 [zfs]
[1284794.093168] ? dump_bytes_cb+0x20/0x20 [zfs]
[1284794.093237] zfsdev_ioctl_common+0x1f1/0x620 [zfs]
[1284794.093309] zfsdev_ioctl+0x4f/0xe0 [zfs]
[1284794.093312] do_vfs_ioctl+0xa4/0x630
[1284794.093316] ? do_munmap+0x33c/0x430
[1284794.093317] ksys_ioctl+0x60/0x90
[1284794.093319] __x64_sys_ioctl+0x16/0x20
[1284794.093323] do_syscall_64+0x53/0x110
[1284794.093326] entry_SYSCALL_64_after_hwframe+0x44/0xa9
[1284794.093327] RIP: 0033:0x7f9cc7861427
[1284794.093329] Code: 00 00 90 48 8b 05 69 aa 0c 00 64 c7 00 26 00 00 00 48 c7 c0 ff ff ff ff c3 66 2e 0f 1f 84 00 00 00 00 00 b8 10 00 00 00 0f 05 <48> 3d 01 f0 ff ff 73 01 c3 48 8b 0d 39 aa 0c 00 f7 d8 64 89 01 48
[1284794.093330] RSP: 002b:00007ffe117e1298 EFLAGS: 00000246 ORIG_RAX: 0000000000000010
[1284794.093332] RAX: ffffffffffffffda RBX: 00007ffe117e12c0 RCX: 00007f9cc7861427
[1284794.093333] RDX: 00007ffe117e12c0 RSI: 0000000000005a40 RDI: 0000000000000005
[1284794.093333] RBP: 00007ffe117e48b0 R08: 0000000000000002 R09: 00005613abc60340
[1284794.093334] R10: 00005613abc56010 R11: 0000000000000246 R12: 0000000000000000
[1284794.093335] R13: 0000000000005a40 R14: 0000000000005a40 R15: 00005613abc60340
[1284926.573319] INFO: task zfs:35452 blocked for more than 120 seconds.
[1284926.573345] Tainted: P OE 4.19.0-17-amd64 #1 Debian 4.19.194-2
[1284926.573368] "echo 0 > /proc/sys/kernel/hung_task_timeout_secs" disables this message.
[1284926.573391] zfs D 0 35452 35451 0x80000004
[1284926.573394] Call Trace:
[1284926.573400] __schedule+0x29f/0x840
[1284926.573404] schedule+0x28/0x80
[1284926.573414] spl_panic+0xf9/0xfb [spl]
[1284926.573504] ? zap_hashbits+0xa/0x20 [zfs]
[1284926.573569] ? zap_hash+0x36/0x210 [zfs]
[1284926.573635] ? zap_lookup_norm+0x9a/0xd0 [zfs]
[1284926.573686] dsl_crypto_populate_key_nvlist+0x5df/0x730 [zfs]
[1284926.573737] dmu_send_impl+0x5e3/0xbe0 [zfs]
[1284926.573745] ? tsd_hash_search+0x75/0xa0 [spl]
[1284926.573794] dmu_send+0x4ba/0x7e0 [zfs]
[1284926.573799] ? __switch_to+0x115/0x440
[1284926.573802] ? __switch_to_asm+0x35/0x70
[1284926.573805] ? _cond_resched+0x15/0x30
[1284926.573808] ? __kmalloc_node+0x1ea/0x2c0
[1284926.573813] ? spl_kmem_alloc_impl+0xd6/0xe0 [spl]
[1284926.573814] ? _cond_resched+0x15/0x30
[1284926.573821] ? nvt_nvpair_match+0x53/0x90 [znvpair]
[1284926.573825] ? nvt_remove_nvpair+0x88/0x120 [znvpair]
[1284926.573828] ? nvt_add_nvpair+0x54/0x110 [znvpair]
[1284926.573831] ? nvs_native_nvp_op+0xd0/0xd0 [znvpair]
[1284926.573834] ? nvs_decode_pairs+0x9d/0x120 [znvpair]
[1284926.573899] zfs_ioc_send_new+0x17d/0x1c0 [zfs]
[1284926.573967] ? dump_bytes_cb+0x20/0x20 [zfs]
[1284926.574033] zfsdev_ioctl_common+0x1f1/0x620 [zfs]
[1284926.574101] zfsdev_ioctl+0x4f/0xe0 [zfs]
[1284926.574105] do_vfs_ioctl+0xa4/0x630
[1284926.574108] ? do_munmap+0x33c/0x430
[1284926.574110] ksys_ioctl+0x60/0x90
[1284926.574112] __x64_sys_ioctl+0x16/0x20
[1284926.574115] do_syscall_64+0x53/0x110
[1284926.574118] entry_SYSCALL_64_after_hwframe+0x44/0xa9
[1284926.574120] RIP: 0033:0x7f9cc7861427
[1284926.574125] Code: Bad RIP value.
[1284926.574126] RSP: 002b:00007ffe117e1298 EFLAGS: 00000246 ORIG_RAX: 0000000000000010
[1284926.574127] RAX: ffffffffffffffda RBX: 00007ffe117e12c0 RCX: 00007f9cc7861427
[1284926.574128] RDX: 00007ffe117e12c0 RSI: 0000000000005a40 RDI: 0000000000000005
[1284926.574129] RBP: 00007ffe117e48b0 R08: 0000000000000002 R09: 00005613abc60340
[1284926.574129] R10: 00005613abc56010 R11: 0000000000000246 R12: 0000000000000000
[1284926.574130] R13: 0000000000005a40 R14: 0000000000005a40 R15: 00005613abc60340
``` | defect | panic at dsl crypt c dsl crypto populate key nvlist system information type version name distribution name debian distribution version kernel version architecture openzfs version describe the problem you re observing someone said to me hey i tried zfs send w pool encrypted filesystem zfs receive some where and it works but zfs send v w pool encrypted filesystem zfs receive some where errors out with invalid argument what gives i went i don t think i expect that to work tried zfs send w mypool encroot dev null and the magic smoke came out since i couldn t find any report of this before i figured it was worth reporting for search results even if i try reproducing it on git master and it s been corrected describe how to reproduce the problem above include any warning errors backtraces from the system logs zfs get all phantasm encroot name property value source phantasm encroot type filesystem phantasm encroot creation fri may phantasm encroot used phantasm encroot available phantasm encroot referenced phantasm encroot compressratio phantasm encroot mounted no phantasm encroot quota none default phantasm encroot reservation none default phantasm encroot recordsize default phantasm encroot mountpoint phantasm encroot default phantasm encroot sharenfs off default phantasm encroot checksum on default phantasm encroot compression gzip local phantasm encroot atime off inherited from phantasm phantasm encroot devices on default phantasm encroot exec on default phantasm encroot setuid on default phantasm encroot readonly off default phantasm encroot zoned off default phantasm encroot snapdir hidden default phantasm encroot aclmode discard default phantasm encroot aclinherit restricted default phantasm encroot createtxg phantasm encroot canmount on default phantasm encroot xattr sa inherited from phantasm phantasm encroot copies default phantasm encroot version phantasm encroot off phantasm encroot normalization none phantasm encroot casesensitivity sensitive phantasm encroot vscan off default phantasm encroot nbmand off default phantasm encroot sharesmb off default phantasm encroot refquota none default phantasm encroot refreservation none default phantasm encroot guid phantasm encroot primarycache all default phantasm encroot secondarycache all default phantasm encroot usedbysnapshots phantasm encroot usedbydataset phantasm encroot usedbychildren phantasm encroot usedbyrefreservation phantasm encroot logbias latency default phantasm encroot objsetid phantasm encroot dedup off default phantasm encroot mlslabel none default phantasm encroot sync standard default phantasm encroot dnodesize auto local phantasm encroot refcompressratio phantasm encroot written phantasm encroot logicalused phantasm encroot logicalreferenced phantasm encroot volmode default default phantasm encroot filesystem limit none default phantasm encroot snapshot limit none default phantasm encroot filesystem count none default phantasm encroot snapshot count none default phantasm encroot snapdev hidden default phantasm encroot acltype off default phantasm encroot context none default phantasm encroot fscontext none default phantasm encroot defcontext none default phantasm encroot rootcontext none default phantasm encroot relatime off default phantasm encroot redundant metadata all default phantasm encroot overlay on default phantasm encroot encryption aes gcm phantasm encroot keylocation file workspace highchurn enc key local phantasm encroot keyformat raw phantasm encroot default phantasm encroot encryptionroot phantasm encroot phantasm encroot keystatus available phantasm encroot special small blocks default phantasm encroot com sun auto snapshot false inherited from phantasm dmesg tail n mumble dp dp spa spa errata failed panic at dsl crypt c dsl crypto populate key nvlist showing stack for process cpu pid comm zfs kdump loaded tainted p oe debian hardware name supermicro super server bios call trace dump stack spl panic raw spin unlock zap hashbits zap hash zap lookup norm dsl crypto populate key nvlist dmu send impl tsd hash search dmu send switch to switch to asm cond resched kmalloc node spl kmem alloc impl cond resched nvt nvpair match nvt remove nvpair nvt add nvpair nvs native nvp op nvs decode pairs zfs ioc send new dump bytes cb zfsdev ioctl common zfsdev ioctl do vfs ioctl do munmap ksys ioctl sys ioctl do syscall entry syscall after hwframe rip code aa ff ff ff ff ff ff aa rsp eflags orig rax rax ffffffffffffffda rbx rcx rdx rsi rdi rbp info task zfs blocked for more than seconds tainted p oe debian echo proc sys kernel hung task timeout secs disables this message zfs d call trace schedule schedule spl panic zap hashbits zap hash zap lookup norm dsl crypto populate key nvlist dmu send impl tsd hash search dmu send switch to switch to asm cond resched kmalloc node spl kmem alloc impl cond resched nvt nvpair match nvt remove nvpair nvt add nvpair nvs native nvp op nvs decode pairs zfs ioc send new dump bytes cb zfsdev ioctl common zfsdev ioctl do vfs ioctl do munmap ksys ioctl sys ioctl do syscall entry syscall after hwframe rip code bad rip value rsp eflags orig rax rax ffffffffffffffda rbx rcx rdx rsi rdi rbp | 1 |
18,310 | 3,041,566,811 | IssuesEvent | 2015-08-07 22:15:48 | francoisferland/casiousbmididriver | https://api.github.com/repos/francoisferland/casiousbmididriver | closed | Px-5s doesn't work on 10.8 | auto-migrated Priority-Medium Type-Defect | ```
Midi window does see the device and it is added to the list of devices, but no
midi signals seem to be "seen" during the midi test.
```
Original issue reported on code.google.com by `cerem...@gmail.com` on 20 Oct 2014 at 10:07 | 1.0 | Px-5s doesn't work on 10.8 - ```
Midi window does see the device and it is added to the list of devices, but no
midi signals seem to be "seen" during the midi test.
```
Original issue reported on code.google.com by `cerem...@gmail.com` on 20 Oct 2014 at 10:07 | defect | px doesn t work on midi window does see the device and it is added to the list of devices but no midi signals seem to be seen during the midi test original issue reported on code google com by cerem gmail com on oct at | 1 |
107,337 | 23,391,780,009 | IssuesEvent | 2022-08-11 18:34:49 | gravityview/GravityView | https://api.github.com/repos/gravityview/GravityView | closed | Enhance [gvlogic] with date comparison | Enhancement Difficulty: Medium Priority: High Status: Needs Testing Core: Shortcodes | At this moment, the `[gvlogic]` shortcode does not support date comparison: https://docs.gravityview.co/article/252-gvlogic-shortcode
In order to support Date comparison, we'll need to use the:
`greater_than`
`greater_than_or_is` or `greater_than_or_equals`
`less_than`
`less_than_or_is` or `less_than_or_equals`
I believe we'll also need a `format` parameter to support the PHP Date Formats to prevent further issues when comparing dates in the format DD/MM/YYYY to MM/DD/YYYY like:
`[gvlogic if="{Date Field:1}" greater_than="{Date Field 2:2}" format="d/m/y"]`
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/995529792029955/995535662518810)
| 1.0 | Enhance [gvlogic] with date comparison - At this moment, the `[gvlogic]` shortcode does not support date comparison: https://docs.gravityview.co/article/252-gvlogic-shortcode
In order to support Date comparison, we'll need to use the:
`greater_than`
`greater_than_or_is` or `greater_than_or_equals`
`less_than`
`less_than_or_is` or `less_than_or_equals`
I believe we'll also need a `format` parameter to support the PHP Date Formats to prevent further issues when comparing dates in the format DD/MM/YYYY to MM/DD/YYYY like:
`[gvlogic if="{Date Field:1}" greater_than="{Date Field 2:2}" format="d/m/y"]`
┆Issue is synchronized with this [Asana task](https://app.asana.com/0/995529792029955/995535662518810)
| non_defect | enhance with date comparison at this moment the shortcode does not support date comparison in order to support date comparison we ll need to use the greater than greater than or is or greater than or equals less than less than or is or less than or equals i believe we ll also need a format parameter to support the php date formats to prevent further issues when comparing dates in the format dd mm yyyy to mm dd yyyy like ┆issue is synchronized with this | 0 |
373,624 | 26,077,890,941 | IssuesEvent | 2022-12-24 21:11:51 | souos/souos | https://api.github.com/repos/souos/souos | opened | Старт проекта | documentation | Создать репозиторий для сайта (web приложения)
Создать репозиторий для профиля аккаунта проекта.
Создать структуру документирования (API отдельно панель управления и web выдача отдельно)
Настроить автоматический деплой вебпроекта. | 1.0 | Старт проекта - Создать репозиторий для сайта (web приложения)
Создать репозиторий для профиля аккаунта проекта.
Создать структуру документирования (API отдельно панель управления и web выдача отдельно)
Настроить автоматический деплой вебпроекта. | non_defect | старт проекта создать репозиторий для сайта web приложения создать репозиторий для профиля аккаунта проекта создать структуру документирования api отдельно панель управления и web выдача отдельно настроить автоматический деплой вебпроекта | 0 |
44,918 | 13,093,812,439 | IssuesEvent | 2020-08-03 11:08:21 | istio/istio | https://api.github.com/repos/istio/istio | closed | Serve Destination rule/gateway file mounted certificates over SDS | area/networking area/security lifecycle/stale | Currently if certs are used as mounted files for destination rule or Gateway, they are loaded as files. This means if they are changed Envoy will not notice.
We have two options:
* Do the same rotation we do with the workload certs. This is complicated as the files are dynamic, and we are moving away from file mounted certs
* Translate the files to SDS config which are served locally
The 2nd seems preferable. There is some complexity here, as we need some way to map the file name referenced in the config to the secret name in envoy (maybe its as simple as literally using the file path). We also need to consider any security implications here, if any | True | Serve Destination rule/gateway file mounted certificates over SDS - Currently if certs are used as mounted files for destination rule or Gateway, they are loaded as files. This means if they are changed Envoy will not notice.
We have two options:
* Do the same rotation we do with the workload certs. This is complicated as the files are dynamic, and we are moving away from file mounted certs
* Translate the files to SDS config which are served locally
The 2nd seems preferable. There is some complexity here, as we need some way to map the file name referenced in the config to the secret name in envoy (maybe its as simple as literally using the file path). We also need to consider any security implications here, if any | non_defect | serve destination rule gateway file mounted certificates over sds currently if certs are used as mounted files for destination rule or gateway they are loaded as files this means if they are changed envoy will not notice we have two options do the same rotation we do with the workload certs this is complicated as the files are dynamic and we are moving away from file mounted certs translate the files to sds config which are served locally the seems preferable there is some complexity here as we need some way to map the file name referenced in the config to the secret name in envoy maybe its as simple as literally using the file path we also need to consider any security implications here if any | 0 |
35,057 | 7,541,091,375 | IssuesEvent | 2018-04-17 08:45:57 | contao/core-bundle | https://api.github.com/repos/contao/core-bundle | closed | Check if BackendUser::getInstance() should throw an exception | defect | Please see this related issue where a runtime exception happens when calling `BackendUser::getInstance()` during the execution of Contao's poor man cron job from the backend: https://github.com/terminal42/contao-easy_themes/issues/40
I was actually wondering if `BackendUser::getInstance()` (respective `User::loadUserByUsername()`) should throw an exception if no request is available. It does return an empty instance (resp. null) in some cases anyway. To safely use it like in the beforementioned example, there would be the need to check if the request stack contains a request beforehand which is somewhat hidden.
Sidenotes: The example is a from a Contao 3.5 + 4 environment. In general not using `BackendUser::getInstance()` might be the right thing to do. | 1.0 | Check if BackendUser::getInstance() should throw an exception - Please see this related issue where a runtime exception happens when calling `BackendUser::getInstance()` during the execution of Contao's poor man cron job from the backend: https://github.com/terminal42/contao-easy_themes/issues/40
I was actually wondering if `BackendUser::getInstance()` (respective `User::loadUserByUsername()`) should throw an exception if no request is available. It does return an empty instance (resp. null) in some cases anyway. To safely use it like in the beforementioned example, there would be the need to check if the request stack contains a request beforehand which is somewhat hidden.
Sidenotes: The example is a from a Contao 3.5 + 4 environment. In general not using `BackendUser::getInstance()` might be the right thing to do. | defect | check if backenduser getinstance should throw an exception please see this related issue where a runtime exception happens when calling backenduser getinstance during the execution of contao s poor man cron job from the backend i was actually wondering if backenduser getinstance respective user loaduserbyusername should throw an exception if no request is available it does return an empty instance resp null in some cases anyway to safely use it like in the beforementioned example there would be the need to check if the request stack contains a request beforehand which is somewhat hidden sidenotes the example is a from a contao environment in general not using backenduser getinstance might be the right thing to do | 1 |
50,265 | 13,187,408,437 | IssuesEvent | 2020-08-13 03:19:18 | icecube-trac/tix3 | https://api.github.com/repos/icecube-trac/tix3 | closed | FIXME's should be fixed. Not warned. (Trac #409) | Migrated from Trac combo reconstruction defect | FIXME's that are warned get ignored and not fixed.
```text
[maru:~/i3/icerec/src] svn blame ./tpx/private/tpx/I3IceTopSanityChecks.cxx |grep FIXME 龟 trunk-88227
78401 kislat #warning FIXME: This is kind of fishy. Subsequent modules should not have to do this check again.
```
<details>
<summary>_Migrated from https://code.icecube.wisc.edu/ticket/409
, reported by nega and owned by karg_</summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "FIXME's that are warned get ignored and not fixed.\n\n{{{\n[maru:~/i3/icerec/src] svn blame ./tpx/private/tpx/I3IceTopSanityChecks.cxx |grep FIXME \u9f9f trunk-88227\n 78401 kislat #warning FIXME: This is kind of fishy. Subsequent modules should not have to do this check again.\n}}}",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "FIXME's should be fixed. Not warned.",
"priority": "blocker",
"keywords": "tpx",
"time": "2012-05-30T19:28:34",
"milestone": "",
"owner": "karg",
"type": "defect"
}
```
</p>
</details>
| 1.0 | FIXME's should be fixed. Not warned. (Trac #409) - FIXME's that are warned get ignored and not fixed.
```text
[maru:~/i3/icerec/src] svn blame ./tpx/private/tpx/I3IceTopSanityChecks.cxx |grep FIXME 龟 trunk-88227
78401 kislat #warning FIXME: This is kind of fishy. Subsequent modules should not have to do this check again.
```
<details>
<summary>_Migrated from https://code.icecube.wisc.edu/ticket/409
, reported by nega and owned by karg_</summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:11:57",
"description": "FIXME's that are warned get ignored and not fixed.\n\n{{{\n[maru:~/i3/icerec/src] svn blame ./tpx/private/tpx/I3IceTopSanityChecks.cxx |grep FIXME \u9f9f trunk-88227\n 78401 kislat #warning FIXME: This is kind of fishy. Subsequent modules should not have to do this check again.\n}}}",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1550067117911749",
"component": "combo reconstruction",
"summary": "FIXME's should be fixed. Not warned.",
"priority": "blocker",
"keywords": "tpx",
"time": "2012-05-30T19:28:34",
"milestone": "",
"owner": "karg",
"type": "defect"
}
```
</p>
</details>
| defect | fixme s should be fixed not warned trac fixme s that are warned get ignored and not fixed text svn blame tpx private tpx cxx grep fixme 龟 trunk kislat warning fixme this is kind of fishy subsequent modules should not have to do this check again migrated from reported by nega and owned by karg json status closed changetime description fixme s that are warned get ignored and not fixed n n n svn blame tpx private tpx cxx grep fixme trunk n kislat warning fixme this is kind of fishy subsequent modules should not have to do this check again n reporter nega cc resolution fixed ts component combo reconstruction summary fixme s should be fixed not warned priority blocker keywords tpx time milestone owner karg type defect | 1 |
516,496 | 14,983,013,171 | IssuesEvent | 2021-01-28 16:40:41 | googleapis/google-cloud-php | https://api.github.com/repos/googleapis/google-cloud-php | closed | Synthesis failed for websecurityscanner | :rotating_light: api: websecurityscanner autosynth failure priority: p1 type: bug | Hello! Autosynth couldn't regenerate websecurityscanner. :broken_heart:
Here's the output from running `synth.py`:
```
protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListFindingsRequest.php.
2021-01-21 07:13:56,615 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/OutdatedLibrary.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/OutdatedLibrary.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/CrawledUrl.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/CrawledUrl.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/UpdateScanConfigRequest.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/UpdateScanConfigRequest.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsResponse.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsResponse.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRunErrorTrace.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRunErrorTrace.php.
2021-01-21 07:13:56,617 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRun.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRun.php.
2021-01-21 07:13:56,617 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListFindingsResponse.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListFindingsResponse.php.
2021-01-21 07:13:56,617 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/Form.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/Form.php.
2021-01-21 07:13:56,618 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/FindingTypeStats.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/FindingTypeStats.php.
2021-01-21 07:13:56,618 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanRunsResponse.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanRunsResponse.php.
2021-01-21 07:13:56,618 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsRequest.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsRequest.php.
2021-01-21 07:13:56,619 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/DeleteScanConfigRequest.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/DeleteScanConfigRequest.php.
2021-01-21 07:13:56,620 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Schedule.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Schedule.php.
2021-01-21 07:13:56,621 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/GoogleAccount.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/GoogleAccount.php.
2021-01-21 07:13:56,621 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/CustomAccount.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/CustomAccount.php.
2021-01-21 07:13:56,621 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/VulnerableHeaders/Header.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/VulnerableHeaders/Header.php.
2021-01-21 07:13:56,629 synthtool [WARNING] > No replacements made in src/**/V*/**/*.php for pattern final class, maybe replacement is no longer needed?
WARNING:synthtool:No replacements made in src/**/V*/**/*.php for pattern final class, maybe replacement is no longer needed?
2021-01-21 07:13:56,637 synthtool [WARNING] > No replacements made in src/**/V*/**/*.php for pattern public function ([s|g]\w{3,})Unwrapped, maybe replacement is no longer needed?
WARNING:synthtool:No replacements made in src/**/V*/**/*.php for pattern public function ([s|g]\w{3,})Unwrapped, maybe replacement is no longer needed?
2021-01-21 07:13:56,777 synthtool [WARNING] > No replacements made in src/**/V*/**/*.php for pattern (.{0,})\]\((/.{0,})\), maybe replacement is no longer needed?
WARNING:synthtool:No replacements made in src/**/V*/**/*.php for pattern (.{0,})\]\((/.{0,})\), maybe replacement is no longer needed?
2021-01-21 07:13:56,778 synthtool [DEBUG] > Wrote metadata to synth.metadata.
DEBUG:synthtool:Wrote metadata to synth.metadata.
2021-01-21 07:13:56,877 autosynth [INFO] > Changed files:
2021-01-21 07:13:56,878 autosynth [INFO] > M WebSecurityScanner/synth.metadata
2021-01-21 07:13:56,878 autosynth [DEBUG] > Running: git log 93cf8bda3a07bed78a0232abf0a652fa07b3a449 -1 --no-decorate --pretty=%s
2021-01-21 07:13:56,882 autosynth [DEBUG] > Running: git log 93cf8bda3a07bed78a0232abf0a652fa07b3a449 -1 --no-decorate --pretty=%b%n%nSource-Author: %an <%ae>%nSource-Date: %ad
2021-01-21 07:13:56,886 autosynth [DEBUG] > Running: git add -A
2021-01-21 07:13:56,932 autosynth [DEBUG] > Running: git status --porcelain
2021-01-21 07:13:56,980 autosynth [DEBUG] > Running: git commit -m ci: add project id for Datastore tests
Source-Author: Alexey Andreev <ava1280@yandex.ru>
Source-Date: Tue Jan 19 20:46:24 2021 +0300
Source-Repo: googleapis/google-cloud-php
Source-Sha: 93cf8bda3a07bed78a0232abf0a652fa07b3a449
Source-Link: https://github.com/googleapis/google-cloud-php/commit/93cf8bda3a07bed78a0232abf0a652fa07b3a449
[autosynth-websecurityscanner-0 562f8ed6] ci: add project id for Datastore tests
1 file changed, 1 insertion(+), 1 deletion(-)
2021-01-21 07:13:57,009 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 562f8ed6 ci: add project id for Datastore tests
2021-01-21 07:13:57,051 autosynth [DEBUG] > Running: git checkout autosynth-websecurityscanner
Switched to branch 'autosynth-websecurityscanner'
2021-01-21 07:13:57,080 autosynth [DEBUG] > Running: git diff autosynth-websecurityscanner-0..autosynth-websecurityscanner-280 -- . :(exclude)synth.metadata
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 104, in synthesize_loop
return synthesize_loop_single_pr(toolbox, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 126, in synthesize_loop_single_pr
synthesize_inner_loop(toolbox, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 152, in synthesize_inner_loop
synthesize_range(toolbox, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 173, in synthesize_range
toolbox.sub_branch(old), toolbox.sub_branch(young)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 193, in git_branches_differ
return git_branches_differ(branch_a, branch_b, self._metadata_path)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 342, in git_branches_differ
proc = executor.run(diff_cmd, stdout=subprocess.PIPE, universal_newlines=True)
File "/tmpfs/src/github/synthtool/autosynth/executor.py", line 23, in run
return subprocess.run(command, **args)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 425, in run
stdout, stderr = process.communicate(input, timeout=timeout)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 850, in communicate
stdout = self.stdout.read()
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/codecs.py", line 321, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc2 in position 1422: invalid continuation byte
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/94a0e52d-cba7-4b23-addb-fe37a1587557/targets/github%2Fsynthtool;config=default/tests;query=google-cloud-php;failed=false).
| 1.0 | Synthesis failed for websecurityscanner - Hello! Autosynth couldn't regenerate websecurityscanner. :broken_heart:
Here's the output from running `synth.py`:
```
protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListFindingsRequest.php.
2021-01-21 07:13:56,615 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/OutdatedLibrary.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/OutdatedLibrary.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/CrawledUrl.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/CrawledUrl.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/UpdateScanConfigRequest.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/UpdateScanConfigRequest.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsResponse.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsResponse.php.
2021-01-21 07:13:56,616 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRunErrorTrace.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRunErrorTrace.php.
2021-01-21 07:13:56,617 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRun.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanRun.php.
2021-01-21 07:13:56,617 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListFindingsResponse.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListFindingsResponse.php.
2021-01-21 07:13:56,617 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/Form.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/Form.php.
2021-01-21 07:13:56,618 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/FindingTypeStats.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/FindingTypeStats.php.
2021-01-21 07:13:56,618 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanRunsResponse.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanRunsResponse.php.
2021-01-21 07:13:56,618 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsRequest.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ListScanConfigsRequest.php.
2021-01-21 07:13:56,619 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/DeleteScanConfigRequest.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/DeleteScanConfigRequest.php.
2021-01-21 07:13:56,620 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Schedule.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Schedule.php.
2021-01-21 07:13:56,621 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/GoogleAccount.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/GoogleAccount.php.
2021-01-21 07:13:56,621 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/CustomAccount.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/ScanConfig/Authentication/CustomAccount.php.
2021-01-21 07:13:56,621 synthtool [INFO] > Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/VulnerableHeaders/Header.php.
INFO:synthtool:Replaced 'Generated from protobuf field ([^\\n]{0,})\\n\\s{5}\\*/\\n\\s{4}protected \\$' in src/V1beta/VulnerableHeaders/Header.php.
2021-01-21 07:13:56,629 synthtool [WARNING] > No replacements made in src/**/V*/**/*.php for pattern final class, maybe replacement is no longer needed?
WARNING:synthtool:No replacements made in src/**/V*/**/*.php for pattern final class, maybe replacement is no longer needed?
2021-01-21 07:13:56,637 synthtool [WARNING] > No replacements made in src/**/V*/**/*.php for pattern public function ([s|g]\w{3,})Unwrapped, maybe replacement is no longer needed?
WARNING:synthtool:No replacements made in src/**/V*/**/*.php for pattern public function ([s|g]\w{3,})Unwrapped, maybe replacement is no longer needed?
2021-01-21 07:13:56,777 synthtool [WARNING] > No replacements made in src/**/V*/**/*.php for pattern (.{0,})\]\((/.{0,})\), maybe replacement is no longer needed?
WARNING:synthtool:No replacements made in src/**/V*/**/*.php for pattern (.{0,})\]\((/.{0,})\), maybe replacement is no longer needed?
2021-01-21 07:13:56,778 synthtool [DEBUG] > Wrote metadata to synth.metadata.
DEBUG:synthtool:Wrote metadata to synth.metadata.
2021-01-21 07:13:56,877 autosynth [INFO] > Changed files:
2021-01-21 07:13:56,878 autosynth [INFO] > M WebSecurityScanner/synth.metadata
2021-01-21 07:13:56,878 autosynth [DEBUG] > Running: git log 93cf8bda3a07bed78a0232abf0a652fa07b3a449 -1 --no-decorate --pretty=%s
2021-01-21 07:13:56,882 autosynth [DEBUG] > Running: git log 93cf8bda3a07bed78a0232abf0a652fa07b3a449 -1 --no-decorate --pretty=%b%n%nSource-Author: %an <%ae>%nSource-Date: %ad
2021-01-21 07:13:56,886 autosynth [DEBUG] > Running: git add -A
2021-01-21 07:13:56,932 autosynth [DEBUG] > Running: git status --porcelain
2021-01-21 07:13:56,980 autosynth [DEBUG] > Running: git commit -m ci: add project id for Datastore tests
Source-Author: Alexey Andreev <ava1280@yandex.ru>
Source-Date: Tue Jan 19 20:46:24 2021 +0300
Source-Repo: googleapis/google-cloud-php
Source-Sha: 93cf8bda3a07bed78a0232abf0a652fa07b3a449
Source-Link: https://github.com/googleapis/google-cloud-php/commit/93cf8bda3a07bed78a0232abf0a652fa07b3a449
[autosynth-websecurityscanner-0 562f8ed6] ci: add project id for Datastore tests
1 file changed, 1 insertion(+), 1 deletion(-)
2021-01-21 07:13:57,009 autosynth [DEBUG] > Running: git reset --hard HEAD
HEAD is now at 562f8ed6 ci: add project id for Datastore tests
2021-01-21 07:13:57,051 autosynth [DEBUG] > Running: git checkout autosynth-websecurityscanner
Switched to branch 'autosynth-websecurityscanner'
2021-01-21 07:13:57,080 autosynth [DEBUG] > Running: git diff autosynth-websecurityscanner-0..autosynth-websecurityscanner-280 -- . :(exclude)synth.metadata
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 354, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 189, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 334, in _inner_main
commit_count = synthesize_loop(x, multiple_prs, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 104, in synthesize_loop
return synthesize_loop_single_pr(toolbox, change_pusher, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 126, in synthesize_loop_single_pr
synthesize_inner_loop(toolbox, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 152, in synthesize_inner_loop
synthesize_range(toolbox, synthesizer)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 173, in synthesize_range
toolbox.sub_branch(old), toolbox.sub_branch(young)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 193, in git_branches_differ
return git_branches_differ(branch_a, branch_b, self._metadata_path)
File "/tmpfs/src/github/synthtool/autosynth/synth_toolbox.py", line 342, in git_branches_differ
proc = executor.run(diff_cmd, stdout=subprocess.PIPE, universal_newlines=True)
File "/tmpfs/src/github/synthtool/autosynth/executor.py", line 23, in run
return subprocess.run(command, **args)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 425, in run
stdout, stderr = process.communicate(input, timeout=timeout)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/subprocess.py", line 850, in communicate
stdout = self.stdout.read()
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/codecs.py", line 321, in decode
(result, consumed) = self._buffer_decode(data, self.errors, final)
UnicodeDecodeError: 'utf-8' codec can't decode byte 0xc2 in position 1422: invalid continuation byte
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/94a0e52d-cba7-4b23-addb-fe37a1587557/targets/github%2Fsynthtool;config=default/tests;query=google-cloud-php;failed=false).
| non_defect | synthesis failed for websecurityscanner hello autosynth couldn t regenerate websecurityscanner broken heart here s the output from running synth py protobuf field n s n s protected in src listfindingsrequest php synthtool replaced generated from protobuf field n s n s protected in src outdatedlibrary php info synthtool replaced generated from protobuf field n s n s protected in src outdatedlibrary php synthtool replaced generated from protobuf field n s n s protected in src crawledurl php info synthtool replaced generated from protobuf field n s n s protected in src crawledurl php synthtool replaced generated from protobuf field n s n s protected in src updatescanconfigrequest php info synthtool replaced generated from protobuf field n s n s protected in src updatescanconfigrequest php synthtool replaced generated from protobuf field n s n s protected in src listscanconfigsresponse php info synthtool replaced generated from protobuf field n s n s protected in src listscanconfigsresponse php synthtool replaced generated from protobuf field n s n s protected in src scanrunerrortrace php info synthtool replaced generated from protobuf field n s n s protected in src scanrunerrortrace php synthtool replaced generated from protobuf field n s n s protected in src scanrun php info synthtool replaced generated from protobuf field n s n s protected in src scanrun php synthtool replaced generated from protobuf field n s n s protected in src listfindingsresponse php info synthtool replaced generated from protobuf field n s n s protected in src listfindingsresponse php synthtool replaced generated from protobuf field n s n s protected in src form php info synthtool replaced generated from protobuf field n s n s protected in src form php synthtool replaced generated from protobuf field n s n s protected in src findingtypestats php info synthtool replaced generated from protobuf field n s n s protected in src findingtypestats php synthtool replaced generated from protobuf field n s n s protected in src listscanrunsresponse php info synthtool replaced generated from protobuf field n s n s protected in src listscanrunsresponse php synthtool replaced generated from protobuf field n s n s protected in src listscanconfigsrequest php info synthtool replaced generated from protobuf field n s n s protected in src listscanconfigsrequest php synthtool replaced generated from protobuf field n s n s protected in src deletescanconfigrequest php info synthtool replaced generated from protobuf field n s n s protected in src deletescanconfigrequest php synthtool replaced generated from protobuf field n s n s protected in src scanconfig schedule php info synthtool replaced generated from protobuf field n s n s protected in src scanconfig schedule php synthtool replaced generated from protobuf field n s n s protected in src scanconfig authentication googleaccount php info synthtool replaced generated from protobuf field n s n s protected in src scanconfig authentication googleaccount php synthtool replaced generated from protobuf field n s n s protected in src scanconfig authentication customaccount php info synthtool replaced generated from protobuf field n s n s protected in src scanconfig authentication customaccount php synthtool replaced generated from protobuf field n s n s protected in src vulnerableheaders header php info synthtool replaced generated from protobuf field n s n s protected in src vulnerableheaders header php synthtool no replacements made in src v php for pattern final class maybe replacement is no longer needed warning synthtool no replacements made in src v php for pattern final class maybe replacement is no longer needed synthtool no replacements made in src v php for pattern public function w unwrapped maybe replacement is no longer needed warning synthtool no replacements made in src v php for pattern public function w unwrapped maybe replacement is no longer needed synthtool no replacements made in src v php for pattern maybe replacement is no longer needed warning synthtool no replacements made in src v php for pattern maybe replacement is no longer needed synthtool wrote metadata to synth metadata debug synthtool wrote metadata to synth metadata autosynth changed files autosynth m websecurityscanner synth metadata autosynth running git log no decorate pretty s autosynth running git log no decorate pretty b n nsource author an nsource date ad autosynth running git add a autosynth running git status porcelain autosynth running git commit m ci add project id for datastore tests source author alexey andreev source date tue jan source repo googleapis google cloud php source sha source link ci add project id for datastore tests file changed insertion deletion autosynth running git reset hard head head is now at ci add project id for datastore tests autosynth running git checkout autosynth websecurityscanner switched to branch autosynth websecurityscanner autosynth running git diff autosynth websecurityscanner autosynth websecurityscanner exclude synth metadata traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool autosynth synth py line in main file tmpfs src github synthtool autosynth synth py line in main return inner main temp dir file tmpfs src github synthtool autosynth synth py line in inner main commit count synthesize loop x multiple prs change pusher synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize loop return synthesize loop single pr toolbox change pusher synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize loop single pr synthesize inner loop toolbox synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize inner loop synthesize range toolbox synthesizer file tmpfs src github synthtool autosynth synth py line in synthesize range toolbox sub branch old toolbox sub branch young file tmpfs src github synthtool autosynth synth toolbox py line in git branches differ return git branches differ branch a branch b self metadata path file tmpfs src github synthtool autosynth synth toolbox py line in git branches differ proc executor run diff cmd stdout subprocess pipe universal newlines true file tmpfs src github synthtool autosynth executor py line in run return subprocess run command args file home kbuilder pyenv versions lib subprocess py line in run stdout stderr process communicate input timeout timeout file home kbuilder pyenv versions lib subprocess py line in communicate stdout self stdout read file home kbuilder pyenv versions lib codecs py line in decode result consumed self buffer decode data self errors final unicodedecodeerror utf codec can t decode byte in position invalid continuation byte google internal developers can see the full log | 0 |
5,362 | 3,205,751,636 | IssuesEvent | 2015-10-04 11:37:36 | Krijger/docker-gradle | https://api.github.com/repos/Krijger/docker-gradle | closed | Build image should be a task, not a script | code quality | Just as the other tasks.
Also, the script loading code can then be removed. | 1.0 | Build image should be a task, not a script - Just as the other tasks.
Also, the script loading code can then be removed. | non_defect | build image should be a task not a script just as the other tasks also the script loading code can then be removed | 0 |
345,756 | 30,839,089,885 | IssuesEvent | 2023-08-02 09:25:43 | sourcegraph/sourcegraph | https://api.github.com/repos/sourcegraph/sourcegraph | opened | testing: Run E2E tests locally | team/dev-infra dev-infra/testing-revamp | To enable teammates to write more E2E tests it needs to be easier to run the tests locally. At present, the E2E tests can only really run in CI since that is where all the credentials are needed to communicate with all the external dependencies for the test.
#### Open questions
- Retrieval of credentails
- Should the credentials be fetched from the devx-service
- Licensing issues due to limited number of seats
- For local testing, should be use a request replayer/mock server?
Will something like #55509 answer these open questions | 1.0 | testing: Run E2E tests locally - To enable teammates to write more E2E tests it needs to be easier to run the tests locally. At present, the E2E tests can only really run in CI since that is where all the credentials are needed to communicate with all the external dependencies for the test.
#### Open questions
- Retrieval of credentails
- Should the credentials be fetched from the devx-service
- Licensing issues due to limited number of seats
- For local testing, should be use a request replayer/mock server?
Will something like #55509 answer these open questions | non_defect | testing run tests locally to enable teammates to write more tests it needs to be easier to run the tests locally at present the tests can only really run in ci since that is where all the credentials are needed to communicate with all the external dependencies for the test open questions retrieval of credentails should the credentials be fetched from the devx service licensing issues due to limited number of seats for local testing should be use a request replayer mock server will something like answer these open questions | 0 |
238,141 | 19,701,212,319 | IssuesEvent | 2022-01-12 16:48:23 | tarantool/cartridge | https://api.github.com/repos/tarantool/cartridge | closed | f-lucky test state_machine_test.lua:373 | flaky test teamX | https://github.com/tarantool/cartridge/runs/4628355400?check_suite_focus=true
```
1) integration.state_machine.test_blinking_fast
407
...tridge/cartridge/test/integration/state_machine_test.lua:373: assertion failed!
408
stack traceback:
409
builtin/box/net_box.lua:1207: in function '_request'
410
builtin/box/net_box.lua:1256: in function 'eval'
411
...tridge/cartridge/test/integration/state_machine_test.lua:338: in function 'integration.state_machine.test_blinking_fast'
412
...
413
[C]: in function 'xpcall'
414
415
Captured stdout:
416
master | 2021-12-24 18:13:46.499 [7186] main I> tx_binary: stopped
417
slave | 2021-12-24 18:13:46.500 [7194] main/139/cartridge.eventual-failover I> Replicaset aaaaaaaa-0000-0000-0000-000000000000 (me): new leader aaaaaaaa-aaaa-0000-0000-000000000002 (me), was aaaaaaaa-aaaa-0000-0000-000000000001 ("localhost:13301")
418
slave | 2021-12-24 18:13:46.500 [7194] main/139/cartridge.eventual-failover I> Failover triggered, reapply scheduled (fiber 141)
419
slave | 2021-12-24 18:13:46.501 [7194] main/141/cartridge.failover.task I> Instance state changed: RolesConfigured -> ConfiguringRoles
420
slave | 2021-12-24 18:13:46.501 [7194] main/141/cartridge.failover.task I> set 'read_only' configuration option to false
421
slave | 2021-12-24 18:13:46.501 [7194] main/141/cartridge.failover.task I> --- apply_config({is_master = true})
422
slave | 2021-12-24 18:13:46.502 [7194] main/140/lua eval:6 E> DANGER! Instance is rw!
423
slave | 2021-12-24 18:13:46.503 [7194] main I> tx_binary: stopped
```
| 1.0 | f-lucky test state_machine_test.lua:373 - https://github.com/tarantool/cartridge/runs/4628355400?check_suite_focus=true
```
1) integration.state_machine.test_blinking_fast
407
...tridge/cartridge/test/integration/state_machine_test.lua:373: assertion failed!
408
stack traceback:
409
builtin/box/net_box.lua:1207: in function '_request'
410
builtin/box/net_box.lua:1256: in function 'eval'
411
...tridge/cartridge/test/integration/state_machine_test.lua:338: in function 'integration.state_machine.test_blinking_fast'
412
...
413
[C]: in function 'xpcall'
414
415
Captured stdout:
416
master | 2021-12-24 18:13:46.499 [7186] main I> tx_binary: stopped
417
slave | 2021-12-24 18:13:46.500 [7194] main/139/cartridge.eventual-failover I> Replicaset aaaaaaaa-0000-0000-0000-000000000000 (me): new leader aaaaaaaa-aaaa-0000-0000-000000000002 (me), was aaaaaaaa-aaaa-0000-0000-000000000001 ("localhost:13301")
418
slave | 2021-12-24 18:13:46.500 [7194] main/139/cartridge.eventual-failover I> Failover triggered, reapply scheduled (fiber 141)
419
slave | 2021-12-24 18:13:46.501 [7194] main/141/cartridge.failover.task I> Instance state changed: RolesConfigured -> ConfiguringRoles
420
slave | 2021-12-24 18:13:46.501 [7194] main/141/cartridge.failover.task I> set 'read_only' configuration option to false
421
slave | 2021-12-24 18:13:46.501 [7194] main/141/cartridge.failover.task I> --- apply_config({is_master = true})
422
slave | 2021-12-24 18:13:46.502 [7194] main/140/lua eval:6 E> DANGER! Instance is rw!
423
slave | 2021-12-24 18:13:46.503 [7194] main I> tx_binary: stopped
```
| non_defect | f lucky test state machine test lua integration state machine test blinking fast tridge cartridge test integration state machine test lua assertion failed stack traceback builtin box net box lua in function request builtin box net box lua in function eval tridge cartridge test integration state machine test lua in function integration state machine test blinking fast in function xpcall captured stdout master main i tx binary stopped slave main cartridge eventual failover i replicaset aaaaaaaa me new leader aaaaaaaa aaaa me was aaaaaaaa aaaa localhost slave main cartridge eventual failover i failover triggered reapply scheduled fiber slave main cartridge failover task i instance state changed rolesconfigured configuringroles slave main cartridge failover task i set read only configuration option to false slave main cartridge failover task i apply config is master true slave main lua eval e danger instance is rw slave main i tx binary stopped | 0 |
75,914 | 26,147,744,421 | IssuesEvent | 2022-12-30 08:34:40 | vector-im/element-web | https://api.github.com/repos/vector-im/element-web | closed | Content-Type header is bogus on encrypted uploads, should be application/octet-stream or similar | T-Defect S-Minor A-E2EE A-Media A-File-Upload O-Occasional | ### Steps to reproduce
1. Try to upload a file in an encrypted chat on a server proxied by lighttpd using mod_proxy
2. 
### Outcome
#### What did you expect?
1. File uploads and gets sent without any issue
#### What happened instead?
1. Request errors with HTTP 400
### Operating system
Debian testing
### Browser information
LibreWolf 107.0.1
### URL for webapp
https://chat.cat.casa
### Application version
Element version: 1.11.17, Olm version: 3.2.12
### Homeserver
https://matrix.cat.casa
### Will you send logs?
Yes | 1.0 | Content-Type header is bogus on encrypted uploads, should be application/octet-stream or similar - ### Steps to reproduce
1. Try to upload a file in an encrypted chat on a server proxied by lighttpd using mod_proxy
2. 
### Outcome
#### What did you expect?
1. File uploads and gets sent without any issue
#### What happened instead?
1. Request errors with HTTP 400
### Operating system
Debian testing
### Browser information
LibreWolf 107.0.1
### URL for webapp
https://chat.cat.casa
### Application version
Element version: 1.11.17, Olm version: 3.2.12
### Homeserver
https://matrix.cat.casa
### Will you send logs?
Yes | defect | content type header is bogus on encrypted uploads should be application octet stream or similar steps to reproduce try to upload a file in an encrypted chat on a server proxied by lighttpd using mod proxy outcome what did you expect file uploads and gets sent without any issue what happened instead request errors with http operating system debian testing browser information librewolf url for webapp application version element version olm version homeserver will you send logs yes | 1 |
74,871 | 20,386,542,055 | IssuesEvent | 2022-02-22 07:39:26 | fraunhoferhhi/vvdec | https://api.github.com/repos/fraunhoferhhi/vvdec | closed | CMake error at Path | build | Severity Code Description Project File Line Suppression State
Error CMake Error at C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.20/Modules/CMakeTestCCompiler.cmake:66 (message):
The C compiler
"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe"
is not able to compile a simple test program.
It fails with the following output:
Change Dir: C:/Users/mmard/source/repos/vvdec/out/build/x64-Debug/CMakeFiles/CMakeTmp
Run Build Command(s):C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/Ninja/ninja.exe cmTC_d3696 && [1/2] Building C object CMakeFiles\cmTC_d3696.dir\testCCompiler.c.obj
[2/2] Linking C executable cmTC_d3696.exe
FAILED: cmTC_d3696.exe
cmd.exe /C "cd . && "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\CMake\bin\cmake.exe" -E vs_link_exe --intdir=CMakeFiles\cmTC_d3696.dir --rc=rc --mt=CMAKE_MT-NOTFOUND --manifests -- C:\PROGRA~2\MICROS~2\2019\COMMUN~1\VC\Tools\MSVC\1429~1.301\bin\Hostx64\x64\link.exe /nologo CMakeFiles\cmTC_d3696.dir\testCCompiler.c.obj /out:cmTC_d3696.exe /implib:cmTC_d3696.lib /pdb:cmTC_d3696.pdb /version:0.0 /machine:x64 /debug /INCREMENTAL /subsystem:console kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib && cd ."
RC Pass 1: command "rc /fo CMakeFiles\cmTC_d3696.dir/manifest.res CMakeFiles\cmTC_d3696.dir/manifest.rc" failed (exit code 0) with the following output:
Das System kann die angegebene Datei nicht finden
ninja: build stopped: subcommand failed.
CMake will not be able to correctly generate this project. C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.20/Modules/CMakeTestCCompiler.cmake 66
Hello How to solve this ? | 1.0 | CMake error at Path - Severity Code Description Project File Line Suppression State
Error CMake Error at C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.20/Modules/CMakeTestCCompiler.cmake:66 (message):
The C compiler
"C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/VC/Tools/MSVC/14.29.30133/bin/Hostx64/x64/cl.exe"
is not able to compile a simple test program.
It fails with the following output:
Change Dir: C:/Users/mmard/source/repos/vvdec/out/build/x64-Debug/CMakeFiles/CMakeTmp
Run Build Command(s):C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/Ninja/ninja.exe cmTC_d3696 && [1/2] Building C object CMakeFiles\cmTC_d3696.dir\testCCompiler.c.obj
[2/2] Linking C executable cmTC_d3696.exe
FAILED: cmTC_d3696.exe
cmd.exe /C "cd . && "C:\Program Files (x86)\Microsoft Visual Studio\2019\Community\Common7\IDE\CommonExtensions\Microsoft\CMake\CMake\bin\cmake.exe" -E vs_link_exe --intdir=CMakeFiles\cmTC_d3696.dir --rc=rc --mt=CMAKE_MT-NOTFOUND --manifests -- C:\PROGRA~2\MICROS~2\2019\COMMUN~1\VC\Tools\MSVC\1429~1.301\bin\Hostx64\x64\link.exe /nologo CMakeFiles\cmTC_d3696.dir\testCCompiler.c.obj /out:cmTC_d3696.exe /implib:cmTC_d3696.lib /pdb:cmTC_d3696.pdb /version:0.0 /machine:x64 /debug /INCREMENTAL /subsystem:console kernel32.lib user32.lib gdi32.lib winspool.lib shell32.lib ole32.lib oleaut32.lib uuid.lib comdlg32.lib advapi32.lib && cd ."
RC Pass 1: command "rc /fo CMakeFiles\cmTC_d3696.dir/manifest.res CMakeFiles\cmTC_d3696.dir/manifest.rc" failed (exit code 0) with the following output:
Das System kann die angegebene Datei nicht finden
ninja: build stopped: subcommand failed.
CMake will not be able to correctly generate this project. C:/Program Files (x86)/Microsoft Visual Studio/2019/Community/Common7/IDE/CommonExtensions/Microsoft/CMake/CMake/share/cmake-3.20/Modules/CMakeTestCCompiler.cmake 66
Hello How to solve this ? | non_defect | cmake error at path severity code description project file line suppression state error cmake error at c program files microsoft visual studio community ide commonextensions microsoft cmake cmake share cmake modules cmaketestccompiler cmake message the c compiler c program files microsoft visual studio community vc tools msvc bin cl exe is not able to compile a simple test program it fails with the following output change dir c users mmard source repos vvdec out build debug cmakefiles cmaketmp run build command s c program files microsoft visual studio community ide commonextensions microsoft cmake ninja ninja exe cmtc building c object cmakefiles cmtc dir testccompiler c obj linking c executable cmtc exe failed cmtc exe cmd exe c cd c program files microsoft visual studio community ide commonextensions microsoft cmake cmake bin cmake exe e vs link exe intdir cmakefiles cmtc dir rc rc mt cmake mt notfound manifests c progra micros commun vc tools msvc bin link exe nologo cmakefiles cmtc dir testccompiler c obj out cmtc exe implib cmtc lib pdb cmtc pdb version machine debug incremental subsystem console lib lib lib winspool lib lib lib lib uuid lib lib lib cd rc pass command rc fo cmakefiles cmtc dir manifest res cmakefiles cmtc dir manifest rc failed exit code with the following output das system kann die angegebene datei nicht finden ninja build stopped subcommand failed cmake will not be able to correctly generate this project c program files microsoft visual studio community ide commonextensions microsoft cmake cmake share cmake modules cmaketestccompiler cmake hello how to solve this | 0 |
233,986 | 17,928,160,416 | IssuesEvent | 2021-09-10 04:34:27 | h2oai/datatable | https://api.github.com/repos/h2oai/datatable | opened | `max_column_width` doesn't work for fixed-width columns | bug documentation | In [documentation](https://datatable.readthedocs.io/en/latest/api/options/display/max_column_width.html) for `max_column_width` option we say
> This option controls the threshold for the column’s width to be truncated. If a column’s name or its values exceed the max_column_width, the content of the column is truncated to max_column_width characters when printed.
However, in reality truncation only happens for string / array columns and if, for instance, a fixed-width column exceeds `max_column_width` it never gets truncated:
```python
import datatable as dt
dt.options.display.max_column_width=5
DT = dt.Frame([['1234567890'], [1234567890]])
print(DT)
```
produces
```
| C0 C1
| str32 int32
-- + ----- ----------
0 | 1234… 1234567890
[1 row x 2 columns]
```
So we should either explicitly say that `max_column_width` only works for string / array columns or make it work for fixed-width columns too.
| 1.0 | `max_column_width` doesn't work for fixed-width columns - In [documentation](https://datatable.readthedocs.io/en/latest/api/options/display/max_column_width.html) for `max_column_width` option we say
> This option controls the threshold for the column’s width to be truncated. If a column’s name or its values exceed the max_column_width, the content of the column is truncated to max_column_width characters when printed.
However, in reality truncation only happens for string / array columns and if, for instance, a fixed-width column exceeds `max_column_width` it never gets truncated:
```python
import datatable as dt
dt.options.display.max_column_width=5
DT = dt.Frame([['1234567890'], [1234567890]])
print(DT)
```
produces
```
| C0 C1
| str32 int32
-- + ----- ----------
0 | 1234… 1234567890
[1 row x 2 columns]
```
So we should either explicitly say that `max_column_width` only works for string / array columns or make it work for fixed-width columns too.
| non_defect | max column width doesn t work for fixed width columns in for max column width option we say this option controls the threshold for the column’s width to be truncated if a column’s name or its values exceed the max column width the content of the column is truncated to max column width characters when printed however in reality truncation only happens for string array columns and if for instance a fixed width column exceeds max column width it never gets truncated python import datatable as dt dt options display max column width dt dt frame print dt produces … so we should either explicitly say that max column width only works for string array columns or make it work for fixed width columns too | 0 |
164,060 | 20,364,325,121 | IssuesEvent | 2022-02-21 02:33:50 | Dashbrd/ngx-image-dimension | https://api.github.com/repos/Dashbrd/ngx-image-dimension | closed | CVE-2018-11499 (High) detected in node-sassv4.12.0 - autoclosed | security vulnerability | ## CVE-2018-11499 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.12.0</b></p></summary>
<p>
<p>:rainbow: Node.js bindings to libsass</p>
<p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>ngx-image-dimension/node_modules/node-sass/src/libsass/src/parser.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A use-after-free vulnerability exists in handle_error() in sass_context.cpp in LibSass 3.4.x and 3.5.x through 3.5.4 that could be leveraged to cause a denial of service (application crash) or possibly unspecified other impact.
<p>Publish Date: 2018-05-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11499>CVE-2018-11499</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sass/libsass/releases/tag/3.6.0">https://github.com/sass/libsass/releases/tag/3.6.0</a></p>
<p>Release Date: 2018-05-26</p>
<p>Fix Resolution: libsass - 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2018-11499 (High) detected in node-sassv4.12.0 - autoclosed - ## CVE-2018-11499 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sassv4.12.0</b></p></summary>
<p>
<p>:rainbow: Node.js bindings to libsass</p>
<p>Library home page: <a href=https://github.com/sass/node-sass.git>https://github.com/sass/node-sass.git</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>ngx-image-dimension/node_modules/node-sass/src/libsass/src/parser.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A use-after-free vulnerability exists in handle_error() in sass_context.cpp in LibSass 3.4.x and 3.5.x through 3.5.4 that could be leveraged to cause a denial of service (application crash) or possibly unspecified other impact.
<p>Publish Date: 2018-05-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11499>CVE-2018-11499</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/sass/libsass/releases/tag/3.6.0">https://github.com/sass/libsass/releases/tag/3.6.0</a></p>
<p>Release Date: 2018-05-26</p>
<p>Fix Resolution: libsass - 3.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_defect | cve high detected in node autoclosed cve high severity vulnerability vulnerable library node rainbow node js bindings to libsass library home page a href vulnerable source files ngx image dimension node modules node sass src libsass src parser cpp vulnerability details a use after free vulnerability exists in handle error in sass context cpp in libsass x and x through that could be leveraged to cause a denial of service application crash or possibly unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass step up your open source security game with whitesource | 0 |
79,412 | 28,182,535,580 | IssuesEvent | 2023-04-04 04:34:37 | apache/jmeter | https://api.github.com/repos/apache/jmeter | opened | want to run multiple user sequentially | defect to-triage | ### Expected behavior
i have 800 user and i want to run that the next request is sent only after the prior request is completed?
### Actual behavior
they run with same timing
### Steps to reproduce the problem
create a multiple thread
add http request
csv file
add listner
### JMeter Version
5.5
### Java Version
_No response_
### OS Version
_No response_ | 1.0 | want to run multiple user sequentially - ### Expected behavior
i have 800 user and i want to run that the next request is sent only after the prior request is completed?
### Actual behavior
they run with same timing
### Steps to reproduce the problem
create a multiple thread
add http request
csv file
add listner
### JMeter Version
5.5
### Java Version
_No response_
### OS Version
_No response_ | defect | want to run multiple user sequentially expected behavior i have user and i want to run that the next request is sent only after the prior request is completed actual behavior they run with same timing steps to reproduce the problem create a multiple thread add http request csv file add listner jmeter version java version no response os version no response | 1 |
64,761 | 18,882,928,993 | IssuesEvent | 2021-11-15 02:09:54 | SeleniumHQ/selenium | https://api.github.com/repos/SeleniumHQ/selenium | closed | [🐛 Bug]: Unable to start Firefox 94 with custome profile | I-defect needs-triaging | ### What happened?
Selenium not able to pass the profile variable to geckodriver, it was still using the temp profile when started.
### How can we reproduce the issue?
```shell
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.firefox.firefox_profile import FirefoxProfile
profile_path = r"C:\Users\scruel\AppData\Roaming\Mozilla\Firefox\Profiles\test.default-release"
options=Options()
options.log.level = "trace"
options.set_preference('profile', profile_path)
browser = webdriver.Firefox(options=options)
```
### Relevant log output
```shell
1636940642906 geckodriver INFO Listening on 127.0.0.1:5149
1636940645941 mozrunner::runner INFO Running command: "C:\\Program Files\\Mozilla Firefox\\firefox.exe" "--marionette" "--remote-debugging-port" "5150" "-no-remote" "-profile" "C:\\Users\\scruel\\AppData\\Local\\Temp\\rust_mozprofileayfj0h"
1636940645947 geckodriver::marionette DEBUG Waiting 60s to connect to browser on 127.0.0.1:5155
1636940646209 RemoteAgent DEBUG CDP enabled
1636940646209 Marionette INFO Marionette enabled
1636940646262 Marionette TRACE Received observer notification toplevel-window-ready
```
### Operating System
Windows 10 Pro
### Selenium version
Selenium 4.0.0
### What are the browser(s) and version(s) where you see this issue?
Firefox 94
### What are the browser driver(s) and version(s) where you see this issue?
geckodriver 30.0.0
### Are you using Selenium Grid?
_No response_ | 1.0 | [🐛 Bug]: Unable to start Firefox 94 with custome profile - ### What happened?
Selenium not able to pass the profile variable to geckodriver, it was still using the temp profile when started.
### How can we reproduce the issue?
```shell
from selenium import webdriver
from selenium.webdriver.firefox.options import Options
from selenium.webdriver.firefox.firefox_profile import FirefoxProfile
profile_path = r"C:\Users\scruel\AppData\Roaming\Mozilla\Firefox\Profiles\test.default-release"
options=Options()
options.log.level = "trace"
options.set_preference('profile', profile_path)
browser = webdriver.Firefox(options=options)
```
### Relevant log output
```shell
1636940642906 geckodriver INFO Listening on 127.0.0.1:5149
1636940645941 mozrunner::runner INFO Running command: "C:\\Program Files\\Mozilla Firefox\\firefox.exe" "--marionette" "--remote-debugging-port" "5150" "-no-remote" "-profile" "C:\\Users\\scruel\\AppData\\Local\\Temp\\rust_mozprofileayfj0h"
1636940645947 geckodriver::marionette DEBUG Waiting 60s to connect to browser on 127.0.0.1:5155
1636940646209 RemoteAgent DEBUG CDP enabled
1636940646209 Marionette INFO Marionette enabled
1636940646262 Marionette TRACE Received observer notification toplevel-window-ready
```
### Operating System
Windows 10 Pro
### Selenium version
Selenium 4.0.0
### What are the browser(s) and version(s) where you see this issue?
Firefox 94
### What are the browser driver(s) and version(s) where you see this issue?
geckodriver 30.0.0
### Are you using Selenium Grid?
_No response_ | defect | unable to start firefox with custome profile what happened selenium not able to pass the profile variable to geckodriver it was still using the temp profile when started how can we reproduce the issue shell from selenium import webdriver from selenium webdriver firefox options import options from selenium webdriver firefox firefox profile import firefoxprofile profile path r c users scruel appdata roaming mozilla firefox profiles test default release options options options log level trace options set preference profile profile path browser webdriver firefox options options relevant log output shell geckodriver info listening on mozrunner runner info running command c program files mozilla firefox firefox exe marionette remote debugging port no remote profile c users scruel appdata local temp rust geckodriver marionette debug waiting to connect to browser on remoteagent debug cdp enabled marionette info marionette enabled marionette trace received observer notification toplevel window ready operating system windows pro selenium version selenium what are the browser s and version s where you see this issue firefox what are the browser driver s and version s where you see this issue geckodriver are you using selenium grid no response | 1 |
46,134 | 13,055,857,353 | IssuesEvent | 2020-07-30 02:56:35 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | datio's libarchive test is broken (Trac #633) | Incomplete Migration Migrated from Trac dataio defect | Migrated from https://code.icecube.wisc.edu/ticket/633
```json
{
"status": "closed",
"changetime": "2014-11-23T03:37:57",
"description": "testing for xz is wrong. just because xz is installed doesnt mean that libarchive is installed.",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1416713877111216",
"component": "dataio",
"summary": "datio's libarchive test is broken",
"priority": "normal",
"keywords": "libarchive dataio tests",
"time": "2011-05-12T03:08:25",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
| 1.0 | datio's libarchive test is broken (Trac #633) - Migrated from https://code.icecube.wisc.edu/ticket/633
```json
{
"status": "closed",
"changetime": "2014-11-23T03:37:57",
"description": "testing for xz is wrong. just because xz is installed doesnt mean that libarchive is installed.",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1416713877111216",
"component": "dataio",
"summary": "datio's libarchive test is broken",
"priority": "normal",
"keywords": "libarchive dataio tests",
"time": "2011-05-12T03:08:25",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
| defect | datio s libarchive test is broken trac migrated from json status closed changetime description testing for xz is wrong just because xz is installed doesnt mean that libarchive is installed reporter nega cc resolution fixed ts component dataio summary datio s libarchive test is broken priority normal keywords libarchive dataio tests time milestone owner olivas type defect | 1 |
56,223 | 14,983,561,888 | IssuesEvent | 2021-01-28 17:22:29 | matrix-org/synapse | https://api.github.com/repos/matrix-org/synapse | opened | KeyError in synapse.push.mailer in get_message_vars | S-Minor T-Defect | See https://sentry.matrix.org/sentry/synapse-matrixorg/issues/190798/
It looks like this is happening when a user is no longer in the room when prepping to send an email notification (this is likely due to the user leaving during the delay between an event being received and sending the email).
For some situations we should likely just ignore the notification and move on, but in some situations we probably want to notify the user that they've been kicked / banned / whatever. | 1.0 | KeyError in synapse.push.mailer in get_message_vars - See https://sentry.matrix.org/sentry/synapse-matrixorg/issues/190798/
It looks like this is happening when a user is no longer in the room when prepping to send an email notification (this is likely due to the user leaving during the delay between an event being received and sending the email).
For some situations we should likely just ignore the notification and move on, but in some situations we probably want to notify the user that they've been kicked / banned / whatever. | defect | keyerror in synapse push mailer in get message vars see it looks like this is happening when a user is no longer in the room when prepping to send an email notification this is likely due to the user leaving during the delay between an event being received and sending the email for some situations we should likely just ignore the notification and move on but in some situations we probably want to notify the user that they ve been kicked banned whatever | 1 |
56,462 | 15,103,137,693 | IssuesEvent | 2021-02-08 09:55:46 | openzfs/zfs | https://api.github.com/repos/openzfs/zfs | closed | Unable to determine path or stats for object 2203854 in rpool/home/..... Stale file handle | Status: Triage Needed Type: Defect | Good morning to everyone!
5.4.0-58-generic #64~18.04.1-Ubuntu SMP Wed Dec 9 17:11:11 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
$modinfo zfs | grep -iw version
version: 0.8.3-1ubuntu12.5
$modinfo spl | grep -iw version
version: 0.8.3-1ubuntu12.5
same OS and version in both receiving and sending side.. (zfs is atop of luks encrypted drive.. but encryption is transparent to zfs so should not matter btw.. )
the problem is:
I have multiple pc and send/receive snapshots between them, I do that for multiple partitions and everything is fine.. except for one : the home partition for a specific user
the issue is this
: 0 MB... cannot receive incremental stream: destination rpool/home/francesco has been modified
since most recent snapshot
The solution for me was take snapshot, check diff then rollback.. but
1 : diff doesn't work (I'll explain here below )
2: rollback works but merge then still return the same issue (cannot receive incremental stream)
So since more than one year for me has been impossible to use snapshot patching for this partition.. I've tried multiple solutions ,
s1 : re export-import whole partition from the original pc, same error on next first merge
s2 : scrub : detected 0 errors but not fixed,
s3 : copy all files with rsync to another location, destroy partition, re-create , restore files export full snapshot of new volume and then try again with patches..... SAME PROBLEM AGAIN! ( see here below )
I've encountered this issue since more than 1 year.. and it's a blocking issue for me.. At the moment I'm using borgbackup and I'm considering moving to btrfs only for this single issue..
Some investigation :
DIFF:
the first attempt to investigate is taking a snapshot and trying to diff.. ( the volume hasn't changed.. snapshot is 0 byte size)
zfs snapshot rpool/home/francesco@premerge ; sudo zfs diff rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 rpool/home/francesco@premerge
Unable to determine path or stats for object 2203854 in rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57: Stale file handle
- the exactly same error come even if I do zfs rollback to rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 before snapshotting to @premerge
ZDB:
zpool set cachefile=/etc/zfs/zpool.cache rpool
zpool get all
zdb -U /data/zfs/zpool.cache -b rpool/home/francesco
on the premerge ( which is a snapshot of a filesystem change which should not exist (it's the same even after a rollback from the 'original' ) )
```sh
zdb -dddd rpool/home/francesco@premerge 2203854
Dataset rpool/home/francesco@premerge [ZPL], ID 3370, cr_txg 3563164, 160G, 1635400 objects, rootbp DVA[0]=<0:902ec2e000:1000> DVA[1]=<0:967759b000:1000> [L0 DMU objset] fletcher4 uncompres
sed LE contiguous unique double size=1000L/1000P birth=3563160L/3563160P fill=1635400 cksum=f774539d1:2d56f9978702:44e869b2864f6c:487332f4016dba4d
Object lvl iblk dblk dsize dnsize lsize %full type
zdb: dmu_bonus_hold(2203854) failed, errno 2
```
then zdb on the 'original' partition which should be patched
```sh
zdb -dddd rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 2203854
Dataset rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 [ZPL], ID 3041, cr_txg 3533520, 160G, 1635406 objects, rootbp DVA[0]=<0:ffb2fc000:1000> DVA[1]=<0:923bac9000:1000
> [L0 DMU objset] fletcher4 uncompressed LE contiguous unique double size=1000L/1000P birth=3533520L/3533520P fill=1635406 cksum=fa64e13c8:2e1376df1849:4661f482c73f04:4a6b350a3b958f74
Object lvl iblk dblk dsize dnsize lsize %full type
2203854 2 128K 128K 76.0K 1K 256K 100.00 ZFS plain file
168 bonus System attributes
dnode flags: USED_BYTES USERUSED_ACCOUNTED USEROBJUSED_ACCOUNTED
dnode maxblkid: 1
uid 1000
gid 1000
atime Sun Dec 27 15:14:53 2020
mtime Sun Dec 27 19:52:39 2020
ctime Sun Dec 27 19:52:39 2020
crtime Sun Dec 27 18:34:27 2020
gen 10231546
mode 100600
size 168156
parent 2203766
links 0
pflags 40800000004
```
now, with a zdb on the parent, then using the filesize 168156 I've found the file that give me problems..
but the file is identical in both mounted filesystems ....
```bash
root@nbfat:/home/francesco/.zfs/snapshot/zapp_br-main_epoc-1609099026_pdate-12_27_19_57/.local/share/gvfs-metadata# stat root
File: root
Size: 168156 Blocks: 154 IO Block: 131072 regular file
Device: 75h/117d Inode: 3282856 Links: 1
Access: (0600/-rw-------) Uid: ( 1000/francesco) Gid: ( 1000/francesco)
Access: 2020-12-27 19:52:39.216743651 +0000
Modify: 2020-12-27 19:52:39.216743651 +0000
Change: 2020-12-27 19:52:39.216743651 +0000
Birth: -
root@nbfat:/home/francesco/.zfs/snapshot/zapp_br-main_epoc-1609099026_pdate-12_27_19_57/.local/share/gvfs-metadata# md5sum root
70fe067819822f74ec8928e33a5d5788 root
root@nbfat:/home/francesco/.zfs/snapshot/premerge/.local/share/gvfs-metadata# stat root
File: root
Size: 168156 Blocks: 154 IO Block: 131072 regular file
Device: 74h/116d Inode: 3282856 Links: 1
Access: (0600/-rw-------) Uid: ( 1000/francesco) Gid: ( 1000/francesco)
Access: 2020-12-27 19:52:39.216743651 +0000
Modify: 2020-12-27 19:52:39.216743651 +0000
Change: 2020-12-27 19:52:39.216743651 +0000
Birth: -
root@nbfat:/home/francesco/.zfs/snapshot/premerge/.local/share/gvfs-metadata# md5sum root
70fe067819822f74ec8928e33a5d5788 root
```
see same md5, same stat.. but zfs receive still not working there.. (all other partitions always worked fine..)
the file is a binary file with some log inside.. 'less' can open it without issue on both partitions
I have the pc turned and I'll wait to try to work around the problem for a bit.. so if you want some fast-feedback on the issue I can try with anything you ask me to..
I have also the strace of zdb of both the commands
the strace of the non-working partition ends with
```
futex(0x560a73085a90, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\242K;\207\370\254\234\327", 8) = 8
read(4, "\242\1\7q\223m\302>", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7309aa08, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7309a9b0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\341h\302#\330\355:\310", 8) = 8
read(4, "\236\370\242U\1-\360\336", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7309aa08, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7309a9b0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, ".\325\233\365(\266\32\265", 8) = 8
read(4, "\244\262\370J?\202\214~", 8) = 8
read(4, "\212\270\221\16\222\307\370\376", 8) = 8
read(4, "\345\341\35>\327y\336Q", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308b7f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308b7a0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\2068m\323\332\353'\213", 8) = 8
read(4, "\304\205\5\255\325A\312\335", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308b7f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308b7a0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\23\234dw\6\3n[", 8) = 8
read(4, "\251\1#\256\3127D\\", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7302d9f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7302d9a0, FUTEX_WAKE_PRIVATE, 1) = 0
fstat(1, {st_mode=S_IFREG|0644, st_size=499725, ...}) = 0
read(4, "\330IA\335Vw\210\340", 8) = 8
read(4, "`\214\360\331\20\342n\366", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308bf88, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308bf30, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "o_\32\301h1\361\277", 8) = 8
read(4, "\21\350\237\37y\312\373\327", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308b7f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308b7a0, FUTEX_WAKE_PRIVATE, 1) = 0
write(2, "zdb: ", 5zdb: ) = 5
write(2, "dmu_bonus_hold(2203854) failed, "..., 39dmu_bonus_hold(2203854) failed, errno 2) = 39
write(2, "\n", 1
) = 1
write(1, "Dataset rpool/home/francesco@pre"..., 403Dataset rpool/home/francesco@premerge [ZPL], ID 2162, cr_txg 3563583, 160G, 1635400 objects, rootbp DVA[0]=<0:902ec2e000:1000> DVA[1]=<0:967759b000:1000> [L0 DMU objset] fletcher4 uncompressed LE contiguous unique double size=1000L/1000P birth=3563160L/3563160P fill=1635400 cksum=f774539d1:2d56f9978702:44e869b2864f6c:487332f4016dba4d
Object lvl iblk dblk dsize dnsize lsize %full type
) = 403
exit_group(1) = ?
+++ exited with 1 +++
```
if you want I can provide the full strace in private (I dont attach the files here because I'm not sure that doesn't contain something private.. (ie full content of the problematic file ))
any suggestion is wellcome!
Thank you!
Francesco
| 1.0 | Unable to determine path or stats for object 2203854 in rpool/home/..... Stale file handle - Good morning to everyone!
5.4.0-58-generic #64~18.04.1-Ubuntu SMP Wed Dec 9 17:11:11 UTC 2020 x86_64 x86_64 x86_64 GNU/Linux
$modinfo zfs | grep -iw version
version: 0.8.3-1ubuntu12.5
$modinfo spl | grep -iw version
version: 0.8.3-1ubuntu12.5
same OS and version in both receiving and sending side.. (zfs is atop of luks encrypted drive.. but encryption is transparent to zfs so should not matter btw.. )
the problem is:
I have multiple pc and send/receive snapshots between them, I do that for multiple partitions and everything is fine.. except for one : the home partition for a specific user
the issue is this
: 0 MB... cannot receive incremental stream: destination rpool/home/francesco has been modified
since most recent snapshot
The solution for me was take snapshot, check diff then rollback.. but
1 : diff doesn't work (I'll explain here below )
2: rollback works but merge then still return the same issue (cannot receive incremental stream)
So since more than one year for me has been impossible to use snapshot patching for this partition.. I've tried multiple solutions ,
s1 : re export-import whole partition from the original pc, same error on next first merge
s2 : scrub : detected 0 errors but not fixed,
s3 : copy all files with rsync to another location, destroy partition, re-create , restore files export full snapshot of new volume and then try again with patches..... SAME PROBLEM AGAIN! ( see here below )
I've encountered this issue since more than 1 year.. and it's a blocking issue for me.. At the moment I'm using borgbackup and I'm considering moving to btrfs only for this single issue..
Some investigation :
DIFF:
the first attempt to investigate is taking a snapshot and trying to diff.. ( the volume hasn't changed.. snapshot is 0 byte size)
zfs snapshot rpool/home/francesco@premerge ; sudo zfs diff rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 rpool/home/francesco@premerge
Unable to determine path or stats for object 2203854 in rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57: Stale file handle
- the exactly same error come even if I do zfs rollback to rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 before snapshotting to @premerge
ZDB:
zpool set cachefile=/etc/zfs/zpool.cache rpool
zpool get all
zdb -U /data/zfs/zpool.cache -b rpool/home/francesco
on the premerge ( which is a snapshot of a filesystem change which should not exist (it's the same even after a rollback from the 'original' ) )
```sh
zdb -dddd rpool/home/francesco@premerge 2203854
Dataset rpool/home/francesco@premerge [ZPL], ID 3370, cr_txg 3563164, 160G, 1635400 objects, rootbp DVA[0]=<0:902ec2e000:1000> DVA[1]=<0:967759b000:1000> [L0 DMU objset] fletcher4 uncompres
sed LE contiguous unique double size=1000L/1000P birth=3563160L/3563160P fill=1635400 cksum=f774539d1:2d56f9978702:44e869b2864f6c:487332f4016dba4d
Object lvl iblk dblk dsize dnsize lsize %full type
zdb: dmu_bonus_hold(2203854) failed, errno 2
```
then zdb on the 'original' partition which should be patched
```sh
zdb -dddd rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 2203854
Dataset rpool/home/francesco@zapp_br-main_epoc-1609099026_pdate-12_27_19_57 [ZPL], ID 3041, cr_txg 3533520, 160G, 1635406 objects, rootbp DVA[0]=<0:ffb2fc000:1000> DVA[1]=<0:923bac9000:1000
> [L0 DMU objset] fletcher4 uncompressed LE contiguous unique double size=1000L/1000P birth=3533520L/3533520P fill=1635406 cksum=fa64e13c8:2e1376df1849:4661f482c73f04:4a6b350a3b958f74
Object lvl iblk dblk dsize dnsize lsize %full type
2203854 2 128K 128K 76.0K 1K 256K 100.00 ZFS plain file
168 bonus System attributes
dnode flags: USED_BYTES USERUSED_ACCOUNTED USEROBJUSED_ACCOUNTED
dnode maxblkid: 1
uid 1000
gid 1000
atime Sun Dec 27 15:14:53 2020
mtime Sun Dec 27 19:52:39 2020
ctime Sun Dec 27 19:52:39 2020
crtime Sun Dec 27 18:34:27 2020
gen 10231546
mode 100600
size 168156
parent 2203766
links 0
pflags 40800000004
```
now, with a zdb on the parent, then using the filesize 168156 I've found the file that give me problems..
but the file is identical in both mounted filesystems ....
```bash
root@nbfat:/home/francesco/.zfs/snapshot/zapp_br-main_epoc-1609099026_pdate-12_27_19_57/.local/share/gvfs-metadata# stat root
File: root
Size: 168156 Blocks: 154 IO Block: 131072 regular file
Device: 75h/117d Inode: 3282856 Links: 1
Access: (0600/-rw-------) Uid: ( 1000/francesco) Gid: ( 1000/francesco)
Access: 2020-12-27 19:52:39.216743651 +0000
Modify: 2020-12-27 19:52:39.216743651 +0000
Change: 2020-12-27 19:52:39.216743651 +0000
Birth: -
root@nbfat:/home/francesco/.zfs/snapshot/zapp_br-main_epoc-1609099026_pdate-12_27_19_57/.local/share/gvfs-metadata# md5sum root
70fe067819822f74ec8928e33a5d5788 root
root@nbfat:/home/francesco/.zfs/snapshot/premerge/.local/share/gvfs-metadata# stat root
File: root
Size: 168156 Blocks: 154 IO Block: 131072 regular file
Device: 74h/116d Inode: 3282856 Links: 1
Access: (0600/-rw-------) Uid: ( 1000/francesco) Gid: ( 1000/francesco)
Access: 2020-12-27 19:52:39.216743651 +0000
Modify: 2020-12-27 19:52:39.216743651 +0000
Change: 2020-12-27 19:52:39.216743651 +0000
Birth: -
root@nbfat:/home/francesco/.zfs/snapshot/premerge/.local/share/gvfs-metadata# md5sum root
70fe067819822f74ec8928e33a5d5788 root
```
see same md5, same stat.. but zfs receive still not working there.. (all other partitions always worked fine..)
the file is a binary file with some log inside.. 'less' can open it without issue on both partitions
I have the pc turned and I'll wait to try to work around the problem for a bit.. so if you want some fast-feedback on the issue I can try with anything you ask me to..
I have also the strace of zdb of both the commands
the strace of the non-working partition ends with
```
futex(0x560a73085a90, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\242K;\207\370\254\234\327", 8) = 8
read(4, "\242\1\7q\223m\302>", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7309aa08, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7309a9b0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\341h\302#\330\355:\310", 8) = 8
read(4, "\236\370\242U\1-\360\336", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7309aa08, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7309a9b0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, ".\325\233\365(\266\32\265", 8) = 8
read(4, "\244\262\370J?\202\214~", 8) = 8
read(4, "\212\270\221\16\222\307\370\376", 8) = 8
read(4, "\345\341\35>\327y\336Q", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308b7f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308b7a0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\2068m\323\332\353'\213", 8) = 8
read(4, "\304\205\5\255\325A\312\335", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308b7f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308b7a0, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "\23\234dw\6\3n[", 8) = 8
read(4, "\251\1#\256\3127D\\", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7302d9f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7302d9a0, FUTEX_WAKE_PRIVATE, 1) = 0
fstat(1, {st_mode=S_IFREG|0644, st_size=499725, ...}) = 0
read(4, "\330IA\335Vw\210\340", 8) = 8
read(4, "`\214\360\331\20\342n\366", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308bf88, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308bf30, FUTEX_WAKE_PRIVATE, 1) = 0
read(4, "o_\32\301h1\361\277", 8) = 8
read(4, "\21\350\237\37y\312\373\327", 8) = 8
futex(0x560a72d3e9f0, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a72d3e940, FUTEX_WAKE_PRIVATE, 1) = 1
futex(0x560a7308b7f8, FUTEX_WAIT_PRIVATE, 0, NULL) = 0
futex(0x560a7308b7a0, FUTEX_WAKE_PRIVATE, 1) = 0
write(2, "zdb: ", 5zdb: ) = 5
write(2, "dmu_bonus_hold(2203854) failed, "..., 39dmu_bonus_hold(2203854) failed, errno 2) = 39
write(2, "\n", 1
) = 1
write(1, "Dataset rpool/home/francesco@pre"..., 403Dataset rpool/home/francesco@premerge [ZPL], ID 2162, cr_txg 3563583, 160G, 1635400 objects, rootbp DVA[0]=<0:902ec2e000:1000> DVA[1]=<0:967759b000:1000> [L0 DMU objset] fletcher4 uncompressed LE contiguous unique double size=1000L/1000P birth=3563160L/3563160P fill=1635400 cksum=f774539d1:2d56f9978702:44e869b2864f6c:487332f4016dba4d
Object lvl iblk dblk dsize dnsize lsize %full type
) = 403
exit_group(1) = ?
+++ exited with 1 +++
```
if you want I can provide the full strace in private (I dont attach the files here because I'm not sure that doesn't contain something private.. (ie full content of the problematic file ))
any suggestion is wellcome!
Thank you!
Francesco
| defect | unable to determine path or stats for object in rpool home stale file handle good morning to everyone generic ubuntu smp wed dec utc gnu linux modinfo zfs grep iw version version modinfo spl grep iw version version same os and version in both receiving and sending side zfs is atop of luks encrypted drive but encryption is transparent to zfs so should not matter btw the problem is i have multiple pc and send receive snapshots between them i do that for multiple partitions and everything is fine except for one the home partition for a specific user the issue is this mb cannot receive incremental stream destination rpool home francesco has been modified since most recent snapshot the solution for me was take snapshot check diff then rollback but diff doesn t work i ll explain here below rollback works but merge then still return the same issue cannot receive incremental stream so since more than one year for me has been impossible to use snapshot patching for this partition i ve tried multiple solutions re export import whole partition from the original pc same error on next first merge scrub detected errors but not fixed copy all files with rsync to another location destroy partition re create restore files export full snapshot of new volume and then try again with patches same problem again see here below i ve encountered this issue since more than year and it s a blocking issue for me at the moment i m using borgbackup and i m considering moving to btrfs only for this single issue some investigation diff the first attempt to investigate is taking a snapshot and trying to diff the volume hasn t changed snapshot is byte size zfs snapshot rpool home francesco premerge sudo zfs diff rpool home francesco zapp br main epoc pdate rpool home francesco premerge unable to determine path or stats for object in rpool home francesco zapp br main epoc pdate stale file handle the exactly same error come even if i do zfs rollback to rpool home francesco zapp br main epoc pdate before snapshotting to premerge zdb zpool set cachefile etc zfs zpool cache rpool zpool get all zdb u data zfs zpool cache b rpool home francesco on the premerge which is a snapshot of a filesystem change which should not exist it s the same even after a rollback from the original sh zdb dddd rpool home francesco premerge dataset rpool home francesco premerge id cr txg objects rootbp dva dva uncompres sed le contiguous unique double size birth fill cksum object lvl iblk dblk dsize dnsize lsize full type zdb dmu bonus hold failed errno then zdb on the original partition which should be patched sh zdb dddd rpool home francesco zapp br main epoc pdate dataset rpool home francesco zapp br main epoc pdate id cr txg objects rootbp dva dva uncompressed le contiguous unique double size birth fill cksum object lvl iblk dblk dsize dnsize lsize full type zfs plain file bonus system attributes dnode flags used bytes userused accounted userobjused accounted dnode maxblkid uid gid atime sun dec mtime sun dec ctime sun dec crtime sun dec gen mode size parent links pflags now with a zdb on the parent then using the filesize i ve found the file that give me problems but the file is identical in both mounted filesystems bash root nbfat home francesco zfs snapshot zapp br main epoc pdate local share gvfs metadata stat root file root size blocks io block regular file device inode links access rw uid francesco gid francesco access modify change birth root nbfat home francesco zfs snapshot zapp br main epoc pdate local share gvfs metadata root root root nbfat home francesco zfs snapshot premerge local share gvfs metadata stat root file root size blocks io block regular file device inode links access rw uid francesco gid francesco access modify change birth root nbfat home francesco zfs snapshot premerge local share gvfs metadata root root see same same stat but zfs receive still not working there all other partitions always worked fine the file is a binary file with some log inside less can open it without issue on both partitions i have the pc turned and i ll wait to try to work around the problem for a bit so if you want some fast feedback on the issue i can try with anything you ask me to i have also the strace of zdb of both the commands the strace of the non working partition ends with futex futex wake private read read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private read read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private read read read read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private read read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private read read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private fstat st mode s ifreg st size read read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private read o read futex futex wake private futex futex wake private futex futex wait private null futex futex wake private write zdb write dmu bonus hold failed bonus hold failed errno write n write dataset rpool home francesco pre rpool home francesco premerge id cr txg objects rootbp dva dva uncompressed le contiguous unique double size birth fill cksum object lvl iblk dblk dsize dnsize lsize full type exit group exited with if you want i can provide the full strace in private i dont attach the files here because i m not sure that doesn t contain something private ie full content of the problematic file any suggestion is wellcome thank you francesco | 1 |
199,651 | 22,705,811,772 | IssuesEvent | 2022-07-05 14:33:02 | nexmo-community/mms-csharp | https://api.github.com/repos/nexmo-community/mms-csharp | opened | bouncycastle.1.8.5.nupkg: 1 vulnerabilities (highest severity is: 5.9) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bouncycastle.1.8.5.nupkg</b></p></summary>
<p>Bouncy Castle is a collection of APIs used in cryptography.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg">https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg</a></p>
<p>Path to dependency file: /MessagesSample/MessagesSample.csproj</p>
<p>Path to vulnerable library: /et/packages/bouncycastle/1.8.5/bouncycastle.1.8.5.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/mms-csharp/commit/35f0ef03720d3e2c1d0be8d4f09b9999286e3837">35f0ef03720d3e2c1d0be8d4f09b9999286e3837</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2020-15522](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15522) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.9 | bouncycastle.1.8.5.nupkg | Direct | C#- release-1.8.7, Java- 1.66 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-15522</summary>
### Vulnerable Library - <b>bouncycastle.1.8.5.nupkg</b></p>
<p>Bouncy Castle is a collection of APIs used in cryptography.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg">https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg</a></p>
<p>Path to dependency file: /MessagesSample/MessagesSample.csproj</p>
<p>Path to vulnerable library: /et/packages/bouncycastle/1.8.5/bouncycastle.1.8.5.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **bouncycastle.1.8.5.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/mms-csharp/commit/35f0ef03720d3e2c1d0be8d4f09b9999286e3837">35f0ef03720d3e2c1d0be8d4f09b9999286e3837</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Bouncy Castle BC Java before 1.66, BC C# .NET before 1.8.7, BC-FJA before 1.0.1.2, 1.0.2.1, and BC-FNA before 1.0.1.1 have a timing issue within the EC math library that can expose information about the private key when an attacker is able to observe timing information for the generation of multiple deterministic ECDSA signatures.
<p>Publish Date: 2021-05-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15522>CVE-2020-15522</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-15522">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-15522</a></p>
<p>Release Date: 2021-05-20</p>
<p>Fix Resolution: C#- release-1.8.7, Java- 1.66</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | True | bouncycastle.1.8.5.nupkg: 1 vulnerabilities (highest severity is: 5.9) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bouncycastle.1.8.5.nupkg</b></p></summary>
<p>Bouncy Castle is a collection of APIs used in cryptography.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg">https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg</a></p>
<p>Path to dependency file: /MessagesSample/MessagesSample.csproj</p>
<p>Path to vulnerable library: /et/packages/bouncycastle/1.8.5/bouncycastle.1.8.5.nupkg</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/mms-csharp/commit/35f0ef03720d3e2c1d0be8d4f09b9999286e3837">35f0ef03720d3e2c1d0be8d4f09b9999286e3837</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2020-15522](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15522) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.9 | bouncycastle.1.8.5.nupkg | Direct | C#- release-1.8.7, Java- 1.66 | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2020-15522</summary>
### Vulnerable Library - <b>bouncycastle.1.8.5.nupkg</b></p>
<p>Bouncy Castle is a collection of APIs used in cryptography.</p>
<p>Library home page: <a href="https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg">https://api.nuget.org/packages/bouncycastle.1.8.5.nupkg</a></p>
<p>Path to dependency file: /MessagesSample/MessagesSample.csproj</p>
<p>Path to vulnerable library: /et/packages/bouncycastle/1.8.5/bouncycastle.1.8.5.nupkg</p>
<p>
Dependency Hierarchy:
- :x: **bouncycastle.1.8.5.nupkg** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/nexmo-community/mms-csharp/commit/35f0ef03720d3e2c1d0be8d4f09b9999286e3837">35f0ef03720d3e2c1d0be8d4f09b9999286e3837</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Bouncy Castle BC Java before 1.66, BC C# .NET before 1.8.7, BC-FJA before 1.0.1.2, 1.0.2.1, and BC-FNA before 1.0.1.1 have a timing issue within the EC math library that can expose information about the private key when an attacker is able to observe timing information for the generation of multiple deterministic ECDSA signatures.
<p>Publish Date: 2021-05-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-15522>CVE-2020-15522</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-15522">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-15522</a></p>
<p>Release Date: 2021-05-20</p>
<p>Fix Resolution: C#- release-1.8.7, Java- 1.66</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | non_defect | bouncycastle nupkg vulnerabilities highest severity is vulnerable library bouncycastle nupkg bouncy castle is a collection of apis used in cryptography library home page a href path to dependency file messagessample messagessample csproj path to vulnerable library et packages bouncycastle bouncycastle nupkg found in head commit a href vulnerabilities cve severity cvss dependency type fixed in remediation available medium bouncycastle nupkg direct c release java details cve vulnerable library bouncycastle nupkg bouncy castle is a collection of apis used in cryptography library home page a href path to dependency file messagessample messagessample csproj path to vulnerable library et packages bouncycastle bouncycastle nupkg dependency hierarchy x bouncycastle nupkg vulnerable library found in head commit a href found in base branch main vulnerability details bouncy castle bc java before bc c net before bc fja before and bc fna before have a timing issue within the ec math library that can expose information about the private key when an attacker is able to observe timing information for the generation of multiple deterministic ecdsa signatures publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution c release java rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue | 0 |
43,913 | 11,880,593,099 | IssuesEvent | 2020-03-27 10:58:49 | mestrade/jx-go-hello | https://api.github.com/repos/mestrade/jx-go-hello | opened | CVE-2019-1003033 - Groovy-1.26(java) | defectdojo security / High | *CVE-2019-1003033 - Groovy-1.26(java)*
*Severity:* High
*Cve:* CVE-2019-1003033
*Product/Engagement:* fake2 product / AdHoc Import - Wed, 25 Mar 2020 14:42:14
*Systems*:
*Description*:
Image hash: sha256:8a3e381ece363cb5f0187e5f24988a8febd98e76cd5bc0562d443845066d6e58
Package: groovy-1.26
Package path: /usr/share/jenkins/jenkins.war:WEB-INF/plugins/script-security.hpi:WEB-INF/lib/groovy-sandbox-1.26.jar
Package type: java
Feed: nvdv2/nvdv2:cves
CVE: CVE-2019-1003033
CPE: cpe:/a:-:groovy:1.26:-:-
*Mitigation*:
Upgrade to groovy None
URL: https://nvd.nist.gov/vuln/detail/CVE-2019-1003033
*Impact*:
*References*:https://nvd.nist.gov/vuln/detail/CVE-2019-1003033 | 1.0 | CVE-2019-1003033 - Groovy-1.26(java) - *CVE-2019-1003033 - Groovy-1.26(java)*
*Severity:* High
*Cve:* CVE-2019-1003033
*Product/Engagement:* fake2 product / AdHoc Import - Wed, 25 Mar 2020 14:42:14
*Systems*:
*Description*:
Image hash: sha256:8a3e381ece363cb5f0187e5f24988a8febd98e76cd5bc0562d443845066d6e58
Package: groovy-1.26
Package path: /usr/share/jenkins/jenkins.war:WEB-INF/plugins/script-security.hpi:WEB-INF/lib/groovy-sandbox-1.26.jar
Package type: java
Feed: nvdv2/nvdv2:cves
CVE: CVE-2019-1003033
CPE: cpe:/a:-:groovy:1.26:-:-
*Mitigation*:
Upgrade to groovy None
URL: https://nvd.nist.gov/vuln/detail/CVE-2019-1003033
*Impact*:
*References*:https://nvd.nist.gov/vuln/detail/CVE-2019-1003033 | defect | cve groovy java cve groovy java severity high cve cve product engagement product adhoc import wed mar systems description image hash package groovy package path usr share jenkins jenkins war web inf plugins script security hpi web inf lib groovy sandbox jar package type java feed cves cve cve cpe cpe a groovy mitigation upgrade to groovy none url impact references | 1 |
404,870 | 27,498,573,570 | IssuesEvent | 2023-03-05 12:40:04 | godotengine/godot | https://api.github.com/repos/godotengine/godot | closed | Lack of documentation on impossibility to reuse SceneTreeTween | enhancement documentation topic:animation | ### Godot version
3.5.1
### System information
Linux
### Issue description
See #71245
The behavior described here is not well documented, it doesn't warn against reusing SceneTreeTween or what will happen
### Steps to reproduce
_
### Minimal reproduction project
_ | 1.0 | Lack of documentation on impossibility to reuse SceneTreeTween - ### Godot version
3.5.1
### System information
Linux
### Issue description
See #71245
The behavior described here is not well documented, it doesn't warn against reusing SceneTreeTween or what will happen
### Steps to reproduce
_
### Minimal reproduction project
_ | non_defect | lack of documentation on impossibility to reuse scenetreetween godot version system information linux issue description see the behavior described here is not well documented it doesn t warn against reusing scenetreetween or what will happen steps to reproduce minimal reproduction project | 0 |
44,301 | 12,101,442,052 | IssuesEvent | 2020-04-20 15:13:12 | codesmithtools/Templates | https://api.github.com/repos/codesmithtools/Templates | closed | ForeignKey multiple columns are in wrong order! | Framework-PLINQO Type-Defect auto-migrated | ```
What steps will reproduce the problem?
1.Link two tables on a FK using 3 columns
2.Generate entities
ThisKey and OtherKey should list the columns in the same order, they didn't
version: PLINQO 5.0.1 (last release at this date)
```
Original issue reported on code.google.com by `guillaum...@gmail.com` on 4 Nov 2010 at 11:22
| 1.0 | ForeignKey multiple columns are in wrong order! - ```
What steps will reproduce the problem?
1.Link two tables on a FK using 3 columns
2.Generate entities
ThisKey and OtherKey should list the columns in the same order, they didn't
version: PLINQO 5.0.1 (last release at this date)
```
Original issue reported on code.google.com by `guillaum...@gmail.com` on 4 Nov 2010 at 11:22
| defect | foreignkey multiple columns are in wrong order what steps will reproduce the problem link two tables on a fk using columns generate entities thiskey and otherkey should list the columns in the same order they didn t version plinqo last release at this date original issue reported on code google com by guillaum gmail com on nov at | 1 |
15,745 | 10,343,669,063 | IssuesEvent | 2019-09-04 09:26:55 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | Typo on SQL Server process name | Pri2 cxp doc-bug service/subsvc sql-database/svc triaged | It is "sqlservr.exe" and not "sqlserver.exe".
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: c58c2dff-dcca-2a91-8f0b-9dc0853c5fb5
* Version Independent ID: f59e13bd-bcd1-c7e6-dd45-4a7834a6c8db
* Content: [General-purpose service tier - Azure SQL Database](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-general-purpose#feedback)
* Content Source: [articles/sql-database/sql-database-service-tier-general-purpose.md](https://github.com/Microsoft/azure-docs/blob/master/articles/sql-database/sql-database-service-tier-general-purpose.md)
* Service: **sql-database**
* Sub-service: **service**
* GitHub Login: @jovanpop-msft
* Microsoft Alias: **jovanpop** | 1.0 | Typo on SQL Server process name - It is "sqlservr.exe" and not "sqlserver.exe".
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: c58c2dff-dcca-2a91-8f0b-9dc0853c5fb5
* Version Independent ID: f59e13bd-bcd1-c7e6-dd45-4a7834a6c8db
* Content: [General-purpose service tier - Azure SQL Database](https://docs.microsoft.com/en-us/azure/sql-database/sql-database-service-tier-general-purpose#feedback)
* Content Source: [articles/sql-database/sql-database-service-tier-general-purpose.md](https://github.com/Microsoft/azure-docs/blob/master/articles/sql-database/sql-database-service-tier-general-purpose.md)
* Service: **sql-database**
* Sub-service: **service**
* GitHub Login: @jovanpop-msft
* Microsoft Alias: **jovanpop** | non_defect | typo on sql server process name it is sqlservr exe and not sqlserver exe document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id dcca version independent id content content source service sql database sub service service github login jovanpop msft microsoft alias jovanpop | 0 |
297,803 | 9,181,870,076 | IssuesEvent | 2019-03-05 11:17:53 | zephyrproject-rtos/zephyr | https://api.github.com/repos/zephyrproject-rtos/zephyr | closed | No callback - USB suspend | area: Power Management area: USB enhancement priority: medium | **Describe the bug**
USB does not provide proper callback when suspended (USB_DC_SUSPEND). Using nRF52840 USB Device Controller Driver.
**To Reproduce**
Device with mentioned driver is connected and systems hibernates, USB devices should be suspended.
**Environment (please complete the following information):**
- OS: Linux, Ubuntu 16.04 LTS
- Toolchain: Zephyr SDK 0.9.5 | 1.0 | No callback - USB suspend - **Describe the bug**
USB does not provide proper callback when suspended (USB_DC_SUSPEND). Using nRF52840 USB Device Controller Driver.
**To Reproduce**
Device with mentioned driver is connected and systems hibernates, USB devices should be suspended.
**Environment (please complete the following information):**
- OS: Linux, Ubuntu 16.04 LTS
- Toolchain: Zephyr SDK 0.9.5 | non_defect | no callback usb suspend describe the bug usb does not provide proper callback when suspended usb dc suspend using usb device controller driver to reproduce device with mentioned driver is connected and systems hibernates usb devices should be suspended environment please complete the following information os linux ubuntu lts toolchain zephyr sdk | 0 |
1,157 | 2,598,004,938 | IssuesEvent | 2015-02-22 01:42:00 | chrsmith/bwapi | https://api.github.com/repos/chrsmith/bwapi | opened | Defect with automating replay restarts | auto-migrated Component-Logic Priority-Critical Type-Defect Usability | ```
What steps will reproduce the problem?
1. When trying to automate restarts on a series of replays using the suggested
commands it misses out certain replays. E.g. given a set of replays 1.rep,
2.rep, 3.rep, 4.rep it will always run 2.rep and 4.rep missing out 1 replay
each time.
Main commands:
[auto_menu]
; auto_menu = OFF | SINGLE_PLAYER | LAN | BATTLE_NET
auto_menu = SINGLE_PLAYER
; auto_restart = ON | OFF
auto_restart = ON
; map = path to map relative to Starcraft folder
map = Maps\REPLAYS\*.rep
; mapiteration = RANDOM | SEQUENCE
; type of iteration that will be done on a map name with a wildcard
mapiteration = SEQUENCE
I have attached the full file for reference.
What is the expected output? What do you see instead?
1. To have all replays execute (i.e. not to miss out one each time) in sequence.
What version of the product are you using? On what operating system?
BWAPI 3.7.4 on Windows 7 (x64), compiled using VC++ 2008 Express
```
-----
Original issue reported on code.google.com by `mdsumne...@gmail.com` on 5 Nov 2013 at 6:59
Attachments:
* [bwapi.ini](https://storage.googleapis.com/google-code-attachments/bwapi/issue-497/comment-0/bwapi.ini)
| 1.0 | Defect with automating replay restarts - ```
What steps will reproduce the problem?
1. When trying to automate restarts on a series of replays using the suggested
commands it misses out certain replays. E.g. given a set of replays 1.rep,
2.rep, 3.rep, 4.rep it will always run 2.rep and 4.rep missing out 1 replay
each time.
Main commands:
[auto_menu]
; auto_menu = OFF | SINGLE_PLAYER | LAN | BATTLE_NET
auto_menu = SINGLE_PLAYER
; auto_restart = ON | OFF
auto_restart = ON
; map = path to map relative to Starcraft folder
map = Maps\REPLAYS\*.rep
; mapiteration = RANDOM | SEQUENCE
; type of iteration that will be done on a map name with a wildcard
mapiteration = SEQUENCE
I have attached the full file for reference.
What is the expected output? What do you see instead?
1. To have all replays execute (i.e. not to miss out one each time) in sequence.
What version of the product are you using? On what operating system?
BWAPI 3.7.4 on Windows 7 (x64), compiled using VC++ 2008 Express
```
-----
Original issue reported on code.google.com by `mdsumne...@gmail.com` on 5 Nov 2013 at 6:59
Attachments:
* [bwapi.ini](https://storage.googleapis.com/google-code-attachments/bwapi/issue-497/comment-0/bwapi.ini)
| defect | defect with automating replay restarts what steps will reproduce the problem when trying to automate restarts on a series of replays using the suggested commands it misses out certain replays e g given a set of replays rep rep rep rep it will always run rep and rep missing out replay each time main commands auto menu off single player lan battle net auto menu single player auto restart on off auto restart on map path to map relative to starcraft folder map maps replays rep mapiteration random sequence type of iteration that will be done on a map name with a wildcard mapiteration sequence i have attached the full file for reference what is the expected output what do you see instead to have all replays execute i e not to miss out one each time in sequence what version of the product are you using on what operating system bwapi on windows compiled using vc express original issue reported on code google com by mdsumne gmail com on nov at attachments | 1 |
13,702 | 10,021,908,192 | IssuesEvent | 2019-07-16 15:32:01 | MicrosoftDocs/azure-docs | https://api.github.com/repos/MicrosoftDocs/azure-docs | closed | How to generate and understand azure-vote.yaml? | container-service/svc cxp product-question triaged | Can the file azure-vote.yaml be generated by VS or manually entered?
Is there resource that helps me understand azure-vote.yaml?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6115d555-45a1-6539-4af6-fbc20a99f008
* Version Independent ID: 21f80c83-ba10-9b7e-c54f-51a9c140517c
* Content: [Quickstart - Create an Azure Kubernetes Service (AKS) cluster](https://docs.microsoft.com/en-us/azure/aks/kubernetes-walkthrough)
* Content Source: [articles/aks/kubernetes-walkthrough.md](https://github.com/Microsoft/azure-docs/blob/master/articles/aks/kubernetes-walkthrough.md)
* Service: **container-service**
* GitHub Login: @mlearned
* Microsoft Alias: **mlearned** | 1.0 | How to generate and understand azure-vote.yaml? - Can the file azure-vote.yaml be generated by VS or manually entered?
Is there resource that helps me understand azure-vote.yaml?
---
#### Document Details
⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.*
* ID: 6115d555-45a1-6539-4af6-fbc20a99f008
* Version Independent ID: 21f80c83-ba10-9b7e-c54f-51a9c140517c
* Content: [Quickstart - Create an Azure Kubernetes Service (AKS) cluster](https://docs.microsoft.com/en-us/azure/aks/kubernetes-walkthrough)
* Content Source: [articles/aks/kubernetes-walkthrough.md](https://github.com/Microsoft/azure-docs/blob/master/articles/aks/kubernetes-walkthrough.md)
* Service: **container-service**
* GitHub Login: @mlearned
* Microsoft Alias: **mlearned** | non_defect | how to generate and understand azure vote yaml can the file azure vote yaml be generated by vs or manually entered is there resource that helps me understand azure vote yaml document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service container service github login mlearned microsoft alias mlearned | 0 |
24,378 | 3,969,153,627 | IssuesEvent | 2016-05-03 22:16:14 | zaproxy/zaproxy | https://api.github.com/repos/zaproxy/zaproxy | closed | Session Properties Dialog getting smaller | InsufficientEvidence Priority-High Type-Defect | ```
What steps will reproduce the problem?
1. Open the Session Properties dialog
2. Close the Session Properties dialog via OK or Cancel buttons
3. Repeat steps 1-2 a couple of times
What is the expected output? What do you see instead?
The Session properties dialog shrinks and gets smaller with every re-opening.
```
Original issue reported on code.google.com by `cosminstefanxp` on 2014-01-21 20:36:25 | 1.0 | Session Properties Dialog getting smaller - ```
What steps will reproduce the problem?
1. Open the Session Properties dialog
2. Close the Session Properties dialog via OK or Cancel buttons
3. Repeat steps 1-2 a couple of times
What is the expected output? What do you see instead?
The Session properties dialog shrinks and gets smaller with every re-opening.
```
Original issue reported on code.google.com by `cosminstefanxp` on 2014-01-21 20:36:25 | defect | session properties dialog getting smaller what steps will reproduce the problem open the session properties dialog close the session properties dialog via ok or cancel buttons repeat steps a couple of times what is the expected output what do you see instead the session properties dialog shrinks and gets smaller with every re opening original issue reported on code google com by cosminstefanxp on | 1 |
51,983 | 13,211,355,645 | IssuesEvent | 2020-08-15 22:32:19 | icecube-trac/tix4 | https://api.github.com/repos/icecube-trac/tix4 | opened | hdf5 headers and libs don't match (Trac #1413) | Incomplete Migration Migrated from Trac defect other | <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1413">https://code.icecube.wisc.edu/projects/icecube/ticket/1413</a>, reported by mkauer</summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-10-31T18:48:46",
"_ts": "1446317326342003",
"description": "using py2-v1/setup.sh\nand\nexport LD_LIBRARY_PATH=/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/lib:$LD_LIBRARY_PATH\nexport PATH=/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/bin:$PATH\nexport PYTHONPATH=/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/lib/python2.7/site-packages:$PYTHONPATH\n\nipython; import h5py\nImportError: No module named h5py\n\nso do a: \npip install --upgrade --force-reinstall --user h5py\nand set:\nexport LD_LIBRARY_PATH=/home/mkauer/.local/lib:$LD_LIBRARY_PATH\nexport LD_LIBRARY_PATH=/home/mkauer/.local/lib/python2.7/site-packages:$LD_LIBRARY_PATH\nexport PATH=/home/mkauer/.local/bin:$PATH\nexport PYTHONPATH=/home/mkauer/.local/lib/python2.7/site-packages:$PYTHONPATH\n\nipython; import h5py\nWarning! ***HDF5 library version mismatched error***\nThe HDF5 header files used to compile this application do not match\nthe version used by the HDF5 library to which this application is linked.\nData corruption or segmentation faults may occur if the application continues.\nThis can happen when an application was compiled by one version of HDF5 but\nlinked with a different version of static or shared HDF5 library.\nYou should recompile the application or check your shared library related\nsettings such as 'LD_LIBRARY_PATH'.\nYou can, at your own risk, disable this warning by setting the environment\nvariable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.\nSetting it to 2 or higher will suppress the warning messages totally.\nHeaders are 1.8.5, library is 1.8.11\n\nI've tried installing different versions of h5py and tried using py2-v1/setup.sh and py2-v2/setup.sh etc. but always get this error about the hdf5 headers not matching the libs. So at this point, I'm pretty convinced there's something not right with hdf5 under the py2-v1 and py2-v2 setups.\n\n",
"reporter": "mkauer",
"cc": "",
"resolution": "wontfix",
"time": "2015-10-31T02:36:19",
"component": "other",
"summary": "hdf5 headers and libs don't match",
"priority": "normal",
"keywords": "hdf5,h5py",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
| 1.0 | hdf5 headers and libs don't match (Trac #1413) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1413">https://code.icecube.wisc.edu/projects/icecube/ticket/1413</a>, reported by mkauer</summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-10-31T18:48:46",
"_ts": "1446317326342003",
"description": "using py2-v1/setup.sh\nand\nexport LD_LIBRARY_PATH=/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/lib:$LD_LIBRARY_PATH\nexport PATH=/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/bin:$PATH\nexport PYTHONPATH=/cvmfs/icecube.opensciencegrid.org/py2-v1/RHEL_6_x86_64/lib/python2.7/site-packages:$PYTHONPATH\n\nipython; import h5py\nImportError: No module named h5py\n\nso do a: \npip install --upgrade --force-reinstall --user h5py\nand set:\nexport LD_LIBRARY_PATH=/home/mkauer/.local/lib:$LD_LIBRARY_PATH\nexport LD_LIBRARY_PATH=/home/mkauer/.local/lib/python2.7/site-packages:$LD_LIBRARY_PATH\nexport PATH=/home/mkauer/.local/bin:$PATH\nexport PYTHONPATH=/home/mkauer/.local/lib/python2.7/site-packages:$PYTHONPATH\n\nipython; import h5py\nWarning! ***HDF5 library version mismatched error***\nThe HDF5 header files used to compile this application do not match\nthe version used by the HDF5 library to which this application is linked.\nData corruption or segmentation faults may occur if the application continues.\nThis can happen when an application was compiled by one version of HDF5 but\nlinked with a different version of static or shared HDF5 library.\nYou should recompile the application or check your shared library related\nsettings such as 'LD_LIBRARY_PATH'.\nYou can, at your own risk, disable this warning by setting the environment\nvariable 'HDF5_DISABLE_VERSION_CHECK' to a value of '1'.\nSetting it to 2 or higher will suppress the warning messages totally.\nHeaders are 1.8.5, library is 1.8.11\n\nI've tried installing different versions of h5py and tried using py2-v1/setup.sh and py2-v2/setup.sh etc. but always get this error about the hdf5 headers not matching the libs. So at this point, I'm pretty convinced there's something not right with hdf5 under the py2-v1 and py2-v2 setups.\n\n",
"reporter": "mkauer",
"cc": "",
"resolution": "wontfix",
"time": "2015-10-31T02:36:19",
"component": "other",
"summary": "hdf5 headers and libs don't match",
"priority": "normal",
"keywords": "hdf5,h5py",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
| defect | headers and libs don t match trac migrated from json status closed changetime ts description using setup sh nand nexport ld library path cvmfs icecube opensciencegrid org rhel lib ld library path nexport path cvmfs icecube opensciencegrid org rhel bin path nexport pythonpath cvmfs icecube opensciencegrid org rhel lib site packages pythonpath n nipython import nimporterror no module named n nso do a npip install upgrade force reinstall user nand set nexport ld library path home mkauer local lib ld library path nexport ld library path home mkauer local lib site packages ld library path nexport path home mkauer local bin path nexport pythonpath home mkauer local lib site packages pythonpath n nipython import nwarning library version mismatched error nthe header files used to compile this application do not match nthe version used by the library to which this application is linked ndata corruption or segmentation faults may occur if the application continues nthis can happen when an application was compiled by one version of but nlinked with a different version of static or shared library nyou should recompile the application or check your shared library related nsettings such as ld library path nyou can at your own risk disable this warning by setting the environment nvariable disable version check to a value of nsetting it to or higher will suppress the warning messages totally nheaders are library is n ni ve tried installing different versions of and tried using setup sh and setup sh etc but always get this error about the headers not matching the libs so at this point i m pretty convinced there s something not right with under the and setups n n reporter mkauer cc resolution wontfix time component other summary headers and libs don t match priority normal keywords milestone owner type defect | 1 |
1,202 | 3,075,653,144 | IssuesEvent | 2015-08-20 14:41:12 | dart-lang/dartdoc | https://api.github.com/repos/dart-lang/dartdoc | closed | https://dartdoc.firebaseapp.com/ isn't updated | Infrastructure | Any clues as to why?
We link to this from the README... if we don't guarantee that https://dartdoc.firebaseapp.com/ will be up to date, let's unlink from the README.
Thoughts? | 1.0 | https://dartdoc.firebaseapp.com/ isn't updated - Any clues as to why?
We link to this from the README... if we don't guarantee that https://dartdoc.firebaseapp.com/ will be up to date, let's unlink from the README.
Thoughts? | non_defect | isn t updated any clues as to why we link to this from the readme if we don t guarantee that will be up to date let s unlink from the readme thoughts | 0 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.