Unnamed: 0
int64 0
832k
| id
float64 2.49B
32.1B
| type
stringclasses 1
value | created_at
stringlengths 19
19
| repo
stringlengths 5
112
| repo_url
stringlengths 34
141
| action
stringclasses 3
values | title
stringlengths 1
757
| labels
stringlengths 4
664
| body
stringlengths 3
261k
| index
stringclasses 10
values | text_combine
stringlengths 96
261k
| label
stringclasses 2
values | text
stringlengths 96
232k
| binary_label
int64 0
1
|
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
80,178
| 30,086,422,902
|
IssuesEvent
|
2023-06-29 08:58:47
|
jOOQ/jOOQ
|
https://api.github.com/repos/jOOQ/jOOQ
|
closed
|
Meta.ddl() generates broken DDL for tsvector columns
|
T: Defect C: Functionality P: Medium E: All Editions
|
### Expected behavior
We expect that Meta.ddl() exports a valid DDL for a table with tsvector columns:
```sql
CREATE TABLE public.film (
fulltext tsvector
);
```
### Actual behavior
Meta.ddl() exports broken DDL for that tables:
```sql
create table "public"."film" (
"fulltext" any
)
```
### Steps to reproduce the problem
Create a table in DB:
```sql
CREATE TABLE public.film (
fulltext tsvector
);
```
Run this code:
```java
Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost:6000/postgres", "postgres", "postgres");
Configuration configuration = new DefaultConfiguration().set(conn).set(SQLDialect.POSTGRES);
Meta meta = using(configuration).meta();
Arrays.stream(meta.filterSchemas(v -> v.getName().equalsIgnoreCase("public"))
.ddl()
.queries())
.map(Object::toString)
.forEach(System.out::println);
```
Executing the resulting query against a different database or schema:
```sql
create table "public"."film" (
"fulltext" any
)
```
leads to the error:
```sql
SQL Error [42601]: ERROR: syntax error at or near "any"
```
### jOOQ Version
jOOQ Professional Edition 3.18.4
### Database product and version
PostgreSQL 15.2 (Debian 15.2-1.pgdg110+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 10.2.1-6) 10.2.1 20210110, 64-bit
### Java Version
openjdk 17.0.2 2022-01-18
### OS Version
Microsoft Windows [Version 10.0.19044.2846]
### JDBC driver name and version (include name if unofficial driver)
org.postgresql:postgresql:42.6.0
|
1.0
|
Meta.ddl() generates broken DDL for tsvector columns - ### Expected behavior
We expect that Meta.ddl() exports a valid DDL for a table with tsvector columns:
```sql
CREATE TABLE public.film (
fulltext tsvector
);
```
### Actual behavior
Meta.ddl() exports broken DDL for that tables:
```sql
create table "public"."film" (
"fulltext" any
)
```
### Steps to reproduce the problem
Create a table in DB:
```sql
CREATE TABLE public.film (
fulltext tsvector
);
```
Run this code:
```java
Connection conn = DriverManager.getConnection("jdbc:postgresql://localhost:6000/postgres", "postgres", "postgres");
Configuration configuration = new DefaultConfiguration().set(conn).set(SQLDialect.POSTGRES);
Meta meta = using(configuration).meta();
Arrays.stream(meta.filterSchemas(v -> v.getName().equalsIgnoreCase("public"))
.ddl()
.queries())
.map(Object::toString)
.forEach(System.out::println);
```
Executing the resulting query against a different database or schema:
```sql
create table "public"."film" (
"fulltext" any
)
```
leads to the error:
```sql
SQL Error [42601]: ERROR: syntax error at or near "any"
```
### jOOQ Version
jOOQ Professional Edition 3.18.4
### Database product and version
PostgreSQL 15.2 (Debian 15.2-1.pgdg110+1) on x86_64-pc-linux-gnu, compiled by gcc (Debian 10.2.1-6) 10.2.1 20210110, 64-bit
### Java Version
openjdk 17.0.2 2022-01-18
### OS Version
Microsoft Windows [Version 10.0.19044.2846]
### JDBC driver name and version (include name if unofficial driver)
org.postgresql:postgresql:42.6.0
|
defect
|
meta ddl generates broken ddl for tsvector columns expected behavior we expect that meta ddl exports a valid ddl for a table with tsvector columns sql create table public film fulltext tsvector actual behavior meta ddl exports broken ddl for that tables sql create table public film fulltext any steps to reproduce the problem create a table in db sql create table public film fulltext tsvector run this code java connection conn drivermanager getconnection jdbc postgresql localhost postgres postgres postgres configuration configuration new defaultconfiguration set conn set sqldialect postgres meta meta using configuration meta arrays stream meta filterschemas v v getname equalsignorecase public ddl queries map object tostring foreach system out println executing the resulting query against a different database or schema sql create table public film fulltext any leads to the error sql sql error error syntax error at or near any jooq version jooq professional edition database product and version postgresql debian on pc linux gnu compiled by gcc debian bit java version openjdk os version microsoft windows jdbc driver name and version include name if unofficial driver org postgresql postgresql
| 1
|
147,189
| 5,634,792,623
|
IssuesEvent
|
2017-04-05 22:17:32
|
kvhnuke/etherwallet
|
https://api.github.com/repos/kvhnuke/etherwallet
|
closed
|
Accept private keys that start with `0x`
|
feature lower priority wallet decrypt related
|
Wallet decrypt directive:
Some places export a private key that starts with an 0x. If a user inputs them currently, it doesn't validate. It should unlock regardless (basically by pretending the 0x isnt there or whatever)
|
1.0
|
Accept private keys that start with `0x` - Wallet decrypt directive:
Some places export a private key that starts with an 0x. If a user inputs them currently, it doesn't validate. It should unlock regardless (basically by pretending the 0x isnt there or whatever)
|
non_defect
|
accept private keys that start with wallet decrypt directive some places export a private key that starts with an if a user inputs them currently it doesn t validate it should unlock regardless basically by pretending the isnt there or whatever
| 0
|
51,613
| 13,207,536,548
|
IssuesEvent
|
2020-08-14 23:29:46
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
boost 1.38 does not play well with gcc 4.7 (on ubuntu 12.10) - link to patch included (Trac #704)
|
Incomplete Migration Migrated from Trac defect tools/ports
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/704">https://code.icecube.wisc.edu/projects/icecube/ticket/704</a>, reported by claudio.kopperand owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2014-04-18T03:16:19",
"_ts": "1397790979000000",
"description": "Apparently the following patch is needed to make it detector pthread support correctly on this version of gcc: http://cgit.freedesktop.org/libreoffice/core/commit/?id=8ecf7e2259681b7a0d26766ea4500a7a6313f56d\n",
"reporter": "claudio.kopper",
"cc": "",
"resolution": "fixed",
"time": "2012-11-26T21:08:57",
"component": "tools/ports",
"summary": "boost 1.38 does not play well with gcc 4.7 (on ubuntu 12.10) - link to patch included",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
boost 1.38 does not play well with gcc 4.7 (on ubuntu 12.10) - link to patch included (Trac #704) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/704">https://code.icecube.wisc.edu/projects/icecube/ticket/704</a>, reported by claudio.kopperand owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2014-04-18T03:16:19",
"_ts": "1397790979000000",
"description": "Apparently the following patch is needed to make it detector pthread support correctly on this version of gcc: http://cgit.freedesktop.org/libreoffice/core/commit/?id=8ecf7e2259681b7a0d26766ea4500a7a6313f56d\n",
"reporter": "claudio.kopper",
"cc": "",
"resolution": "fixed",
"time": "2012-11-26T21:08:57",
"component": "tools/ports",
"summary": "boost 1.38 does not play well with gcc 4.7 (on ubuntu 12.10) - link to patch included",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
defect
|
boost does not play well with gcc on ubuntu link to patch included trac migrated from json status closed changetime ts description apparently the following patch is needed to make it detector pthread support correctly on this version of gcc reporter claudio kopper cc resolution fixed time component tools ports summary boost does not play well with gcc on ubuntu link to patch included priority normal keywords milestone owner nega type defect
| 1
|
58,726
| 16,737,185,388
|
IssuesEvent
|
2021-06-11 04:20:23
|
owncloud/ocis
|
https://api.github.com/repos/owncloud/ocis
|
closed
|
Favourite a received share or any item in a share is impossible
|
Category:Defect Type:Bug
|
1. as `einstein` create a folder with a file as content
2. as `einstein` share that folder with `richard`
3. as `richard` accept the share
4. as `richard` try to favourite the file inside the received folder `curl -v -k -u richard:superfluidity -X PROPPATCH https://localhost:9200/remote.php/webdav/Shares/f1/english.png -d '<?xml version="1.0"?><d:propertyupdate xmlns:d="DAV:" xmlns:oc="http://owncloud.org/ns"> <d:set><d:prop><oc:favorite xmlns:oc="http://owncloud.org/ns">1</oc:favorite></d:prop></d:set></d:propertyupdate>'`
Result: `HTTP/1.1 404 Not Found `
ocis logs:
```
2020-10-21T15:53:09+05:45 WRN resource not found path=/home/Shares/f1/english.png pkg=rhttp service=storage traceid=ff5dde45d2697a8b5c5dc82c7380983c
2020-10-21T15:53:09+05:45 WRN http end="21/Oct/2020:15:53:09 +0545" host=127.0.0.1 method=PROPPATCH pkg=rhttp proto=HTTP/1.1 service=storage size=0 start="21/Oct/2020:15:53:09 +0545" status=404 time_ns=365185929 traceid=ff5dde45d2697a8b5c5dc82c7380983c uri=/remote.php/webdav/Shares/f1/english.png url=/remote.php/webdav/Shares/f1/english.png
|
1.0
|
Favourite a received share or any item in a share is impossible - 1. as `einstein` create a folder with a file as content
2. as `einstein` share that folder with `richard`
3. as `richard` accept the share
4. as `richard` try to favourite the file inside the received folder `curl -v -k -u richard:superfluidity -X PROPPATCH https://localhost:9200/remote.php/webdav/Shares/f1/english.png -d '<?xml version="1.0"?><d:propertyupdate xmlns:d="DAV:" xmlns:oc="http://owncloud.org/ns"> <d:set><d:prop><oc:favorite xmlns:oc="http://owncloud.org/ns">1</oc:favorite></d:prop></d:set></d:propertyupdate>'`
Result: `HTTP/1.1 404 Not Found `
ocis logs:
```
2020-10-21T15:53:09+05:45 WRN resource not found path=/home/Shares/f1/english.png pkg=rhttp service=storage traceid=ff5dde45d2697a8b5c5dc82c7380983c
2020-10-21T15:53:09+05:45 WRN http end="21/Oct/2020:15:53:09 +0545" host=127.0.0.1 method=PROPPATCH pkg=rhttp proto=HTTP/1.1 service=storage size=0 start="21/Oct/2020:15:53:09 +0545" status=404 time_ns=365185929 traceid=ff5dde45d2697a8b5c5dc82c7380983c uri=/remote.php/webdav/Shares/f1/english.png url=/remote.php/webdav/Shares/f1/english.png
|
defect
|
favourite a received share or any item in a share is impossible as einstein create a folder with a file as content as einstein share that folder with richard as richard accept the share as richard try to favourite the file inside the received folder curl v k u richard superfluidity x proppatch d oc favorite xmlns oc result http not found ocis logs wrn resource not found path home shares english png pkg rhttp service storage traceid wrn http end oct host method proppatch pkg rhttp proto http service storage size start oct status time ns traceid uri remote php webdav shares english png url remote php webdav shares english png
| 1
|
83,441
| 16,174,483,483
|
IssuesEvent
|
2021-05-03 02:41:40
|
jhona-tam/Farmacia
|
https://api.github.com/repos/jhona-tam/Farmacia
|
opened
|
Editar laboratorio 23
|
bug code documentation
|
**Edita y actualizar laboratorio**
- actualizar y editar datos del laboratorio en un mismo modal.
En back y fornd
|
1.0
|
Editar laboratorio 23 - **Edita y actualizar laboratorio**
- actualizar y editar datos del laboratorio en un mismo modal.
En back y fornd
|
non_defect
|
editar laboratorio edita y actualizar laboratorio actualizar y editar datos del laboratorio en un mismo modal en back y fornd
| 0
|
137,242
| 18,752,666,457
|
IssuesEvent
|
2021-11-05 05:46:43
|
madhans23/linux-4.15
|
https://api.github.com/repos/madhans23/linux-4.15
|
opened
|
CVE-2020-10690 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2020-10690 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxv4.15</b>, <b>linux-stablev4.17.12</b>, <b>linux-yocto-devv5.4</b>, <b>linux-stablev4.17.12</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
There is a use-after-free in kernel versions before 5.5 due to a race condition between the release of ptp_clock and cdev while resource deallocation. When a (high privileged) process allocates a ptp device file (like /dev/ptpX) and voluntarily goes to sleep. During this time if the underlying device is removed, it can cause an exploitable condition as the process wakes up to terminate and clean all attached files. The system crashes due to the cdev structure being invalid (as already freed) which is pointed to by the inode.
<p>Publish Date: 2020-05-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10690>CVE-2020-10690</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10690">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10690</a></p>
<p>Release Date: 2020-05-08</p>
<p>Fix Resolution: v5.5-rc5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-10690 (Medium) detected in multiple libraries - ## CVE-2020-10690 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>linuxv4.15</b>, <b>linux-stablev4.17.12</b>, <b>linux-yocto-devv5.4</b>, <b>linux-stablev4.17.12</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
There is a use-after-free in kernel versions before 5.5 due to a race condition between the release of ptp_clock and cdev while resource deallocation. When a (high privileged) process allocates a ptp device file (like /dev/ptpX) and voluntarily goes to sleep. During this time if the underlying device is removed, it can cause an exploitable condition as the process wakes up to terminate and clean all attached files. The system crashes due to the cdev structure being invalid (as already freed) which is pointed to by the inode.
<p>Publish Date: 2020-05-08
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-10690>CVE-2020-10690</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10690">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-10690</a></p>
<p>Release Date: 2020-05-08</p>
<p>Fix Resolution: v5.5-rc5</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries linux linux yocto linux vulnerability details there is a use after free in kernel versions before due to a race condition between the release of ptp clock and cdev while resource deallocation when a high privileged process allocates a ptp device file like dev ptpx and voluntarily goes to sleep during this time if the underlying device is removed it can cause an exploitable condition as the process wakes up to terminate and clean all attached files the system crashes due to the cdev structure being invalid as already freed which is pointed to by the inode publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
79,581
| 28,433,791,304
|
IssuesEvent
|
2023-04-15 04:05:04
|
zealdocs/zeal
|
https://api.github.com/repos/zealdocs/zeal
|
closed
|
Go docset: Zeal becomes unresponsive when trying to open any article
|
type/defect resolution/awaiting-response resolution/done scope/ui/webview
|
Using Zeal 0.6.1 under Manjaro Linux, when I try to open any article from the Go docset (installed from Tools menu → Docsets… → Available tab), the UI becomes unresponsive and the application needs to be killed externally.
Other docsets like Python 3 work fine.
|
1.0
|
Go docset: Zeal becomes unresponsive when trying to open any article - Using Zeal 0.6.1 under Manjaro Linux, when I try to open any article from the Go docset (installed from Tools menu → Docsets… → Available tab), the UI becomes unresponsive and the application needs to be killed externally.
Other docsets like Python 3 work fine.
|
defect
|
go docset zeal becomes unresponsive when trying to open any article using zeal under manjaro linux when i try to open any article from the go docset installed from tools menu → docsets… → available tab the ui becomes unresponsive and the application needs to be killed externally other docsets like python work fine
| 1
|
15,635
| 2,867,249,450
|
IssuesEvent
|
2015-06-05 12:05:12
|
jOOQ/jOOQ
|
https://api.github.com/repos/jOOQ/jOOQ
|
opened
|
Wrong implementation of MergeImpl.andNot(Field<Boolean>)
|
C: Functionality P: Low T: Defect
|
The following implementation is wrong:
```java
@Override
public final MergeImpl andNot(Field<Boolean> condition) {
return and(condition(condition));
}
```
|
1.0
|
Wrong implementation of MergeImpl.andNot(Field<Boolean>) - The following implementation is wrong:
```java
@Override
public final MergeImpl andNot(Field<Boolean> condition) {
return and(condition(condition));
}
```
|
defect
|
wrong implementation of mergeimpl andnot field the following implementation is wrong java override public final mergeimpl andnot field condition return and condition condition
| 1
|
414,265
| 27,983,090,433
|
IssuesEvent
|
2023-03-26 11:46:47
|
sile-typesetter/sile
|
https://api.github.com/repos/sile-typesetter/sile
|
reopened
|
Google mailing list isn't publicly visible
|
documentation
|
The readme says to contact the google mailing list:
https://groups.google.com/forum/#!forum/sile-users
But Google tells me there is no such group, or that the group is closed.
Suggestion: either open the group, or remove the link from the readme.
---
And just in case it's closed, I'll grab this opportunity to firstly say *thanks* - Latex has had me pulling out hair, and I've so far stuck three random SILE functions together and had the results come out perfectly.
Secondly, is there any likelihood of floating images in the future? Specifically, I need to insert a load of svg images about half the width of the page, and have them automatically determine their own (proportional) height, and make their own decision about raising above their current position or skipping to the next page, without leaving a trail of destruction and whitespace across the rest of the chapter.
|
1.0
|
Google mailing list isn't publicly visible - The readme says to contact the google mailing list:
https://groups.google.com/forum/#!forum/sile-users
But Google tells me there is no such group, or that the group is closed.
Suggestion: either open the group, or remove the link from the readme.
---
And just in case it's closed, I'll grab this opportunity to firstly say *thanks* - Latex has had me pulling out hair, and I've so far stuck three random SILE functions together and had the results come out perfectly.
Secondly, is there any likelihood of floating images in the future? Specifically, I need to insert a load of svg images about half the width of the page, and have them automatically determine their own (proportional) height, and make their own decision about raising above their current position or skipping to the next page, without leaving a trail of destruction and whitespace across the rest of the chapter.
|
non_defect
|
google mailing list isn t publicly visible the readme says to contact the google mailing list but google tells me there is no such group or that the group is closed suggestion either open the group or remove the link from the readme and just in case it s closed i ll grab this opportunity to firstly say thanks latex has had me pulling out hair and i ve so far stuck three random sile functions together and had the results come out perfectly secondly is there any likelihood of floating images in the future specifically i need to insert a load of svg images about half the width of the page and have them automatically determine their own proportional height and make their own decision about raising above their current position or skipping to the next page without leaving a trail of destruction and whitespace across the rest of the chapter
| 0
|
36,076
| 7,858,007,615
|
IssuesEvent
|
2018-06-21 12:43:56
|
octavian-paraschiv/protone-suite
|
https://api.github.com/repos/octavian-paraschiv/protone-suite
|
reopened
|
Double clicking a media file from Explorer will not launch the file in ProTONE Player
|
Category-Player OS-All Priority-P2 Regression_Test_Selected ReportSource-DevQA Resolution-Resolved-ReadyToTest Type-Defect
|
```
Double clicking a media file from Explorer will not launch the file in ProTONE
Player
```
Original issue reported on code.google.com by `octavian...@gmail.com` on 1 Aug 2014 at 8:25
|
1.0
|
Double clicking a media file from Explorer will not launch the file in ProTONE Player - ```
Double clicking a media file from Explorer will not launch the file in ProTONE
Player
```
Original issue reported on code.google.com by `octavian...@gmail.com` on 1 Aug 2014 at 8:25
|
defect
|
double clicking a media file from explorer will not launch the file in protone player double clicking a media file from explorer will not launch the file in protone player original issue reported on code google com by octavian gmail com on aug at
| 1
|
87,339
| 8,071,912,330
|
IssuesEvent
|
2018-08-06 14:31:42
|
GTNewHorizons/NewHorizons
|
https://api.github.com/repos/GTNewHorizons/NewHorizons
|
closed
|
Recipes for TecTech Dynamo Hatches
|
FixedInDev need to be tested
|
Currently the IV, LuV and ZPM 16 and 64A Dynamo Hatches have no recipe. Would be nice if we could get some, since the addition of the Power Station made those hatches (more) useful.
|
1.0
|
Recipes for TecTech Dynamo Hatches - Currently the IV, LuV and ZPM 16 and 64A Dynamo Hatches have no recipe. Would be nice if we could get some, since the addition of the Power Station made those hatches (more) useful.
|
non_defect
|
recipes for tectech dynamo hatches currently the iv luv and zpm and dynamo hatches have no recipe would be nice if we could get some since the addition of the power station made those hatches more useful
| 0
|
30,748
| 6,264,940,769
|
IssuesEvent
|
2017-07-16 13:12:16
|
NewSpring/Holtzman
|
https://api.github.com/repos/NewSpring/Holtzman
|
closed
|
Series Page: 'X' in Player Does Not Close Player
|
App Defect iOS Operations
|
## Steps to Reproduce
1. Go to any series page that has a trailer.
2. Play the trailer.
3. Tap `Done`.
4. If the fullscreen icon is visible instead of the 'X', tap the play icon to resume trailer.
5. Tap `Done`.
6. Tap 'X' to try to close the player.
### Buggy Behavior
Player does not close when 'X' is tapped.
### Expected Behavior
Tapping the 'X' should close the player, or it should not be visible if we're relying on the `Close the Trailer` cta below.
### Screenshots

|
1.0
|
Series Page: 'X' in Player Does Not Close Player - ## Steps to Reproduce
1. Go to any series page that has a trailer.
2. Play the trailer.
3. Tap `Done`.
4. If the fullscreen icon is visible instead of the 'X', tap the play icon to resume trailer.
5. Tap `Done`.
6. Tap 'X' to try to close the player.
### Buggy Behavior
Player does not close when 'X' is tapped.
### Expected Behavior
Tapping the 'X' should close the player, or it should not be visible if we're relying on the `Close the Trailer` cta below.
### Screenshots

|
defect
|
series page x in player does not close player steps to reproduce go to any series page that has a trailer play the trailer tap done if the fullscreen icon is visible instead of the x tap the play icon to resume trailer tap done tap x to try to close the player buggy behavior player does not close when x is tapped expected behavior tapping the x should close the player or it should not be visible if we re relying on the close the trailer cta below screenshots
| 1
|
308,870
| 23,271,152,508
|
IssuesEvent
|
2022-08-04 23:21:29
|
deepmodeling/dpgen
|
https://api.github.com/repos/deepmodeling/dpgen
|
closed
|
Modify the keywords in param.json
|
documentation
|
**Details**
The keywords "model_devi_e_trust_lo" and "model_devi_e_trust_hi" in dpgen/examples seem to be inconsistent with the source code, where they are "model_devi_v_trust_lo" and "model_devi_v_trust_hi".
|
1.0
|
Modify the keywords in param.json - **Details**
The keywords "model_devi_e_trust_lo" and "model_devi_e_trust_hi" in dpgen/examples seem to be inconsistent with the source code, where they are "model_devi_v_trust_lo" and "model_devi_v_trust_hi".
|
non_defect
|
modify the keywords in param json details the keywords model devi e trust lo and model devi e trust hi in dpgen examples seem to be inconsistent with the source code where they are model devi v trust lo and model devi v trust hi
| 0
|
18,249
| 10,226,083,888
|
IssuesEvent
|
2019-08-16 16:48:57
|
pcrane70/zucchini
|
https://api.github.com/repos/pcrane70/zucchini
|
opened
|
CVE-2017-5645 (High) detected in log4j-core-2.7.jar
|
security vulnerability
|
## CVE-2017-5645 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.7.jar</b></p></summary>
<p>The Apache Log4j Implementation</p>
<p>Library home page: <a href="http://logging.apache.org/log4j/2.x/log4j-core/">http://logging.apache.org/log4j/2.x/log4j-core/</a></p>
<p>Path to dependency file: /zucchini/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/org/apache/logging/log4j/log4j-core/2.7/log4j-core-2.7.jar</p>
<p>
Dependency Hierarchy:
- cucumber-reporting-3.2.0.jar (Root Library)
- :x: **log4j-core-2.7.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/pcrane70/zucchini/commit/63ff0c7109858c6e5c6795d043bb87d715d71c45">63ff0c7109858c6e5c6795d043bb87d715d71c45</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Log4j 2.x before 2.8.2, when using the TCP socket server or UDP socket server to receive serialized log events from another application, a specially crafted binary payload can be sent that, when deserialized, can execute arbitrary code.
<p>Publish Date: 2017-04-17
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-5645>CVE-2017-5645</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-5645">https://nvd.nist.gov/vuln/detail/CVE-2017-5645</a></p>
<p>Release Date: 2017-04-17</p>
<p>Fix Resolution: 2.8.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.logging.log4j","packageName":"log4j-core","packageVersion":"2.7","isTransitiveDependency":true,"dependencyTree":"net.masterthought:cucumber-reporting:3.2.0;org.apache.logging.log4j:log4j-core:2.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.2"}],"vulnerabilityIdentifier":"CVE-2017-5645","vulnerabilityDetails":"In Apache Log4j 2.x before 2.8.2, when using the TCP socket server or UDP socket server to receive serialized log events from another application, a specially crafted binary payload can be sent that, when deserialized, can execute arbitrary code.","vulnerabilityUrl":"https://cve.mitre.org/cgi-bin/cvename.cgi?name\u003dCVE-2017-5645","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2017-5645 (High) detected in log4j-core-2.7.jar - ## CVE-2017-5645 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>log4j-core-2.7.jar</b></p></summary>
<p>The Apache Log4j Implementation</p>
<p>Library home page: <a href="http://logging.apache.org/log4j/2.x/log4j-core/">http://logging.apache.org/log4j/2.x/log4j-core/</a></p>
<p>Path to dependency file: /zucchini/pom.xml</p>
<p>Path to vulnerable library: /root/.m2/repository/org/apache/logging/log4j/log4j-core/2.7/log4j-core-2.7.jar</p>
<p>
Dependency Hierarchy:
- cucumber-reporting-3.2.0.jar (Root Library)
- :x: **log4j-core-2.7.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/pcrane70/zucchini/commit/63ff0c7109858c6e5c6795d043bb87d715d71c45">63ff0c7109858c6e5c6795d043bb87d715d71c45</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Log4j 2.x before 2.8.2, when using the TCP socket server or UDP socket server to receive serialized log events from another application, a specially crafted binary payload can be sent that, when deserialized, can execute arbitrary code.
<p>Publish Date: 2017-04-17
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2017-5645>CVE-2017-5645</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-5645">https://nvd.nist.gov/vuln/detail/CVE-2017-5645</a></p>
<p>Release Date: 2017-04-17</p>
<p>Fix Resolution: 2.8.2</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.logging.log4j","packageName":"log4j-core","packageVersion":"2.7","isTransitiveDependency":true,"dependencyTree":"net.masterthought:cucumber-reporting:3.2.0;org.apache.logging.log4j:log4j-core:2.7","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.8.2"}],"vulnerabilityIdentifier":"CVE-2017-5645","vulnerabilityDetails":"In Apache Log4j 2.x before 2.8.2, when using the TCP socket server or UDP socket server to receive serialized log events from another application, a specially crafted binary payload can be sent that, when deserialized, can execute arbitrary code.","vulnerabilityUrl":"https://cve.mitre.org/cgi-bin/cvename.cgi?name\u003dCVE-2017-5645","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_defect
|
cve high detected in core jar cve high severity vulnerability vulnerable library core jar the apache implementation library home page a href path to dependency file zucchini pom xml path to vulnerable library root repository org apache logging core core jar dependency hierarchy cucumber reporting jar root library x core jar vulnerable library found in head commit a href vulnerability details in apache x before when using the tcp socket server or udp socket server to receive serialized log events from another application a specially crafted binary payload can be sent that when deserialized can execute arbitrary code publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails in apache x before when using the tcp socket server or udp socket server to receive serialized log events from another application a specially crafted binary payload can be sent that when deserialized can execute arbitrary code vulnerabilityurl
| 0
|
66,987
| 20,796,367,075
|
IssuesEvent
|
2022-03-17 09:42:21
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
opened
|
Edited message is incorrectly displayed in thread summary
|
T-Defect
|
### Steps to reproduce
1. Where are you starting? What can you see?
-I'm viewing an existing thread
2. What do you click?
-I edit the last message of that thread
3. More steps…
-The original message is displayed in thread summary instead of the edited version
### Outcome
#### What did you expect?
The edited version of the message to be displayed in thread summary
#### What happened instead?
The original version of the message is displayed in thread summary
### Operating system
Windows 10 Home 21H1
### Browser information
98.0.1 (64-bit)
### URL for webapp
https://develop.element.io
### Application version
Element version: 071a5410b88e-react-176e49e31259-js-905a884f72e8 Olm version: 3.2.8
### Homeserver
matrix.org
### Will you send logs?
No
|
1.0
|
Edited message is incorrectly displayed in thread summary - ### Steps to reproduce
1. Where are you starting? What can you see?
-I'm viewing an existing thread
2. What do you click?
-I edit the last message of that thread
3. More steps…
-The original message is displayed in thread summary instead of the edited version
### Outcome
#### What did you expect?
The edited version of the message to be displayed in thread summary
#### What happened instead?
The original version of the message is displayed in thread summary
### Operating system
Windows 10 Home 21H1
### Browser information
98.0.1 (64-bit)
### URL for webapp
https://develop.element.io
### Application version
Element version: 071a5410b88e-react-176e49e31259-js-905a884f72e8 Olm version: 3.2.8
### Homeserver
matrix.org
### Will you send logs?
No
|
defect
|
edited message is incorrectly displayed in thread summary steps to reproduce where are you starting what can you see i m viewing an existing thread what do you click i edit the last message of that thread more steps… the original message is displayed in thread summary instead of the edited version outcome what did you expect the edited version of the message to be displayed in thread summary what happened instead the original version of the message is displayed in thread summary operating system windows home browser information bit url for webapp application version element version react js olm version homeserver matrix org will you send logs no
| 1
|
13,961
| 2,789,801,836
|
IssuesEvent
|
2015-05-08 21:34:59
|
google/google-visualization-api-issues
|
https://api.github.com/repos/google/google-visualization-api-issues
|
opened
|
Annotaded Time Line - minus zero on y axis
|
Priority-Medium Type-Defect
|
Original [issue 154](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=154) created by orwant on 2009-12-27T17:31:28.000Z:
<b>What steps will reproduce the problem? Please provide a link to a</b>
<b>demonstration page if at all possible, or attach code.</b>
1. Make any chart with value smaller and bigger then zero
On X-axis we can see "-0". Zero is zero. Should not be with + or -, just
zero.
Please take a look an attached screenshot
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
Annotaded Time Line
<b>Are you using the test environment (version 1.1)?</b>
<b>(If you are not sure, answer NO)</b>
NO
<b>What operating system and browser are you using?</b>
Opera 10 on Windows 7
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
|
1.0
|
Annotaded Time Line - minus zero on y axis - Original [issue 154](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=154) created by orwant on 2009-12-27T17:31:28.000Z:
<b>What steps will reproduce the problem? Please provide a link to a</b>
<b>demonstration page if at all possible, or attach code.</b>
1. Make any chart with value smaller and bigger then zero
On X-axis we can see "-0". Zero is zero. Should not be with + or -, just
zero.
Please take a look an attached screenshot
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
Annotaded Time Line
<b>Are you using the test environment (version 1.1)?</b>
<b>(If you are not sure, answer NO)</b>
NO
<b>What operating system and browser are you using?</b>
Opera 10 on Windows 7
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
|
defect
|
annotaded time line minus zero on y axis original created by orwant on what steps will reproduce the problem please provide a link to a demonstration page if at all possible or attach code make any chart with value smaller and bigger then zero on x axis we can see quot quot zero is zero should not be with or just zero please take a look an attached screenshot what component is this issue related to piechart linechart datatable query etc annotaded time line are you using the test environment version if you are not sure answer no no what operating system and browser are you using opera on windows for developers viewing this issue please click the star icon to be notified of future changes and to let us know how many of you are interested in seeing it resolved
| 1
|
831,243
| 32,042,328,944
|
IssuesEvent
|
2023-09-22 20:32:52
|
TheDeanLab/ASLM
|
https://api.github.com/repos/TheDeanLab/ASLM
|
closed
|
Galvo entry in waveform popup flummoxed by `.`
|
priority task
|
```
2023-08-22 14:40:41,166 - model - ERROR - galvo_base: could not convert string to float: '.' waveform constants.yml doesn't have parameter amplitude/offset/frequency for Galvo 0
2023-08-22 14:40:41,186 - model - DEBUG - feature_container: running signal node: PrepareNextChannel
2023-08-22 14:40:41,202 - model - DEBUG - filter_wheel_sutter: SutterFilterWheel - Moving to Position 0
2023-08-22 14:40:41,337 - model - DEBUG - feature_container: SignalContainer - Traceback (most recent call last):
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\features\feature_container.py", line 208, in run
result, is_end = self.curr_node.run(*args, wait_response=wait_response)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\features\feature_container.py", line 96, in run
result = self.node_funcs["main"](*args)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\features\common_features.py", line 167, in signal_func
self.model.active_microscope.prepare_next_channel()
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\microscope.py", line 531, in prepare_next_channel
self.daq.prepare_acquisition(channel_key, self.current_exposure_time)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\devices\daq\daq_ni.py", line 397, in prepare_acquisition
self.create_analog_output_tasks(channel_key)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\devices\daq\daq_ni.py", line 365, in create_analog_output_tasks
[
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\devices\daq\daq_ni.py", line 366, in <listcomp>
v["waveform"][channel_key][:max_sample]
TypeError: 'NoneType' object is not subscriptable
```
|
1.0
|
Galvo entry in waveform popup flummoxed by `.` - ```
2023-08-22 14:40:41,166 - model - ERROR - galvo_base: could not convert string to float: '.' waveform constants.yml doesn't have parameter amplitude/offset/frequency for Galvo 0
2023-08-22 14:40:41,186 - model - DEBUG - feature_container: running signal node: PrepareNextChannel
2023-08-22 14:40:41,202 - model - DEBUG - filter_wheel_sutter: SutterFilterWheel - Moving to Position 0
2023-08-22 14:40:41,337 - model - DEBUG - feature_container: SignalContainer - Traceback (most recent call last):
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\features\feature_container.py", line 208, in run
result, is_end = self.curr_node.run(*args, wait_response=wait_response)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\features\feature_container.py", line 96, in run
result = self.node_funcs["main"](*args)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\features\common_features.py", line 167, in signal_func
self.model.active_microscope.prepare_next_channel()
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\microscope.py", line 531, in prepare_next_channel
self.daq.prepare_acquisition(channel_key, self.current_exposure_time)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\devices\daq\daq_ni.py", line 397, in prepare_acquisition
self.create_analog_output_tasks(channel_key)
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\devices\daq\daq_ni.py", line 365, in create_analog_output_tasks
[
File "c:\users\deanlab\desktop\github\aslm\src\aslm\model\devices\daq\daq_ni.py", line 366, in <listcomp>
v["waveform"][channel_key][:max_sample]
TypeError: 'NoneType' object is not subscriptable
```
|
non_defect
|
galvo entry in waveform popup flummoxed by model error galvo base could not convert string to float waveform constants yml doesn t have parameter amplitude offset frequency for galvo model debug feature container running signal node preparenextchannel model debug filter wheel sutter sutterfilterwheel moving to position model debug feature container signalcontainer traceback most recent call last file c users deanlab desktop github aslm src aslm model features feature container py line in run result is end self curr node run args wait response wait response file c users deanlab desktop github aslm src aslm model features feature container py line in run result self node funcs args file c users deanlab desktop github aslm src aslm model features common features py line in signal func self model active microscope prepare next channel file c users deanlab desktop github aslm src aslm model microscope py line in prepare next channel self daq prepare acquisition channel key self current exposure time file c users deanlab desktop github aslm src aslm model devices daq daq ni py line in prepare acquisition self create analog output tasks channel key file c users deanlab desktop github aslm src aslm model devices daq daq ni py line in create analog output tasks file c users deanlab desktop github aslm src aslm model devices daq daq ni py line in v typeerror nonetype object is not subscriptable
| 0
|
13,447
| 10,264,320,619
|
IssuesEvent
|
2019-08-22 16:05:32
|
zooniverse/theia
|
https://api.github.com/repos/zooniverse/theia
|
opened
|
more logging
|
enhancement infrastructure
|
service should just do more logging in general so we can look at the kube logs to see what it's doing
|
1.0
|
more logging - service should just do more logging in general so we can look at the kube logs to see what it's doing
|
non_defect
|
more logging service should just do more logging in general so we can look at the kube logs to see what it s doing
| 0
|
321,642
| 23,865,232,832
|
IssuesEvent
|
2022-09-07 10:25:45
|
zzap/WordPress-Advanced-administration-handbook
|
https://api.github.com/repos/zzap/WordPress-Advanced-administration-handbook
|
opened
|
Page: Debugging a WordPress Network
|
documentation
|
**debug/debug-network.md**
https://github.com/zzap/WordPress-Advanced-administration-handbook/blob/main/debug/debug-network.md
- [x] Add into a Category
- [x] Page creation
- [ ] Copy the original content
- [ ] Format the content
- [ ] Create a PR
|
1.0
|
Page: Debugging a WordPress Network - **debug/debug-network.md**
https://github.com/zzap/WordPress-Advanced-administration-handbook/blob/main/debug/debug-network.md
- [x] Add into a Category
- [x] Page creation
- [ ] Copy the original content
- [ ] Format the content
- [ ] Create a PR
|
non_defect
|
page debugging a wordpress network debug debug network md add into a category page creation copy the original content format the content create a pr
| 0
|
165,143
| 20,574,223,461
|
IssuesEvent
|
2022-03-04 01:33:26
|
prafullkotecha/fabmedical
|
https://api.github.com/repos/prafullkotecha/fabmedical
|
opened
|
CVE-2022-0691 (High) detected in url-parse-1.4.7.tgz
|
security vulnerability
|
## CVE-2022-0691 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /content-web/package.json</p>
<p>Path to vulnerable library: /content-web/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.1002.0.tgz (Root Library)
- webpack-dev-server-3.11.0.tgz
- sockjs-client-1.4.0.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.9.
<p>Publish Date: 2022-02-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0691>CVE-2022-0691</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691</a></p>
<p>Release Date: 2022-02-21</p>
<p>Fix Resolution (url-parse): 1.5.9</p>
<p>Direct dependency fix Resolution (@angular-devkit/build-angular): 0.1002.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2022-0691 (High) detected in url-parse-1.4.7.tgz - ## CVE-2022-0691 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /content-web/package.json</p>
<p>Path to vulnerable library: /content-web/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.1002.0.tgz (Root Library)
- webpack-dev-server-3.11.0.tgz
- sockjs-client-1.4.0.tgz
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.9.
<p>Publish Date: 2022-02-21
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0691>CVE-2022-0691</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0691</a></p>
<p>Release Date: 2022-02-21</p>
<p>Fix Resolution (url-parse): 1.5.9</p>
<p>Direct dependency fix Resolution (@angular-devkit/build-angular): 0.1002.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in url parse tgz cve high severity vulnerability vulnerable library url parse tgz small footprint url parser that works seamlessly across node js and browser environments library home page a href path to dependency file content web package json path to vulnerable library content web node modules url parse package json dependency hierarchy build angular tgz root library webpack dev server tgz sockjs client tgz x url parse tgz vulnerable library found in base branch master vulnerability details authorization bypass through user controlled key in npm url parse prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution url parse direct dependency fix resolution angular devkit build angular step up your open source security game with whitesource
| 0
|
97,682
| 16,241,421,197
|
IssuesEvent
|
2021-05-07 09:58:38
|
camcrosbie/Angular-GettingStarted
|
https://api.github.com/repos/camcrosbie/Angular-GettingStarted
|
opened
|
CVE-2021-27290 (High) detected in ssri-6.0.1.tgz
|
security vulnerability
|
## CVE-2021-27290 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ssri-6.0.1.tgz</b></p></summary>
<p>Standard Subresource Integrity library -- parses, serializes, generates, and verifies integrity metadata according to the SRI spec.</p>
<p>Library home page: <a href="https://registry.npmjs.org/ssri/-/ssri-6.0.1.tgz">https://registry.npmjs.org/ssri/-/ssri-6.0.1.tgz</a></p>
<p>Path to dependency file: Angular-GettingStarted/APM-Final/package.json</p>
<p>Path to vulnerable library: Angular-GettingStarted/APM-Final/node_modules/webpack/node_modules/ssri/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.1102.3.tgz (Root Library)
- webpack-4.44.2.tgz
- terser-webpack-plugin-1.4.5.tgz
- cacache-12.0.4.tgz
- :x: **ssri-6.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/camcrosbie/Angular-GettingStarted/commit/659cfae310e18e044b5b6b73adef97a9374d8493">659cfae310e18e044b5b6b73adef97a9374d8493</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ssri 5.2.2-8.0.0, fixed in 8.0.1, processes SRIs using a regular expression which is vulnerable to a denial of service. Malicious SRIs could take an extremely long time to process, leading to denial of service. This issue only affects consumers using the strict option.
<p>Publish Date: 2021-03-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27290>CVE-2021-27290</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27290">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27290</a></p>
<p>Release Date: 2021-03-12</p>
<p>Fix Resolution: ssri - 6.0.2,8.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-27290 (High) detected in ssri-6.0.1.tgz - ## CVE-2021-27290 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ssri-6.0.1.tgz</b></p></summary>
<p>Standard Subresource Integrity library -- parses, serializes, generates, and verifies integrity metadata according to the SRI spec.</p>
<p>Library home page: <a href="https://registry.npmjs.org/ssri/-/ssri-6.0.1.tgz">https://registry.npmjs.org/ssri/-/ssri-6.0.1.tgz</a></p>
<p>Path to dependency file: Angular-GettingStarted/APM-Final/package.json</p>
<p>Path to vulnerable library: Angular-GettingStarted/APM-Final/node_modules/webpack/node_modules/ssri/package.json</p>
<p>
Dependency Hierarchy:
- build-angular-0.1102.3.tgz (Root Library)
- webpack-4.44.2.tgz
- terser-webpack-plugin-1.4.5.tgz
- cacache-12.0.4.tgz
- :x: **ssri-6.0.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/camcrosbie/Angular-GettingStarted/commit/659cfae310e18e044b5b6b73adef97a9374d8493">659cfae310e18e044b5b6b73adef97a9374d8493</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ssri 5.2.2-8.0.0, fixed in 8.0.1, processes SRIs using a regular expression which is vulnerable to a denial of service. Malicious SRIs could take an extremely long time to process, leading to denial of service. This issue only affects consumers using the strict option.
<p>Publish Date: 2021-03-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-27290>CVE-2021-27290</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27290">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-27290</a></p>
<p>Release Date: 2021-03-12</p>
<p>Fix Resolution: ssri - 6.0.2,8.0.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in ssri tgz cve high severity vulnerability vulnerable library ssri tgz standard subresource integrity library parses serializes generates and verifies integrity metadata according to the sri spec library home page a href path to dependency file angular gettingstarted apm final package json path to vulnerable library angular gettingstarted apm final node modules webpack node modules ssri package json dependency hierarchy build angular tgz root library webpack tgz terser webpack plugin tgz cacache tgz x ssri tgz vulnerable library found in head commit a href found in base branch main vulnerability details ssri fixed in processes sris using a regular expression which is vulnerable to a denial of service malicious sris could take an extremely long time to process leading to denial of service this issue only affects consumers using the strict option publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ssri step up your open source security game with whitesource
| 0
|
4,706
| 2,610,142,830
|
IssuesEvent
|
2015-02-26 18:45:02
|
chrsmith/hedgewars
|
https://api.github.com/repos/chrsmith/hedgewars
|
opened
|
Schemes can be saved with same name, different casing
|
auto-migrated Priority-High Type-Defect
|
```
If you save a scheme with the same name as an existing scheme but with
different casing then both appear in the frontend. Deleting one of them will
delete both and only one of them is actually written to disk since restarting
will remove one of them.
```
-----
Original issue reported on code.google.com by `gtamulti` on 28 Dec 2010 at 2:37
|
1.0
|
Schemes can be saved with same name, different casing - ```
If you save a scheme with the same name as an existing scheme but with
different casing then both appear in the frontend. Deleting one of them will
delete both and only one of them is actually written to disk since restarting
will remove one of them.
```
-----
Original issue reported on code.google.com by `gtamulti` on 28 Dec 2010 at 2:37
|
defect
|
schemes can be saved with same name different casing if you save a scheme with the same name as an existing scheme but with different casing then both appear in the frontend deleting one of them will delete both and only one of them is actually written to disk since restarting will remove one of them original issue reported on code google com by gtamulti on dec at
| 1
|
100,752
| 21,510,202,327
|
IssuesEvent
|
2022-04-28 03:02:45
|
RobertsLab/resources
|
https://api.github.com/repos/RobertsLab/resources
|
opened
|
Problem with stringtie2 not matching gene names from gff file (TagSeq)
|
code
|
Hi! I am having a problem with generating a gene count matrix from some TagSeq data. It seems that I am having issues properly formatting the .gff3 file from the reference genome that I am using (Pocillopora acuta). Most of the genes in my count matrix are named as "STRG###", which indicates that stringtie2 did not find a gene "match" and is therefore calling that transcript as a splice variant. This is happening for ~80% of my genes, so I know this is an issue with the analysis and formatting.
Has anyone has this issue before? Is tehre a particular format of the gene information column in a .gff3 file that I should be using to make sure it can match the transcript to the proper gene id?
|
1.0
|
Problem with stringtie2 not matching gene names from gff file (TagSeq) - Hi! I am having a problem with generating a gene count matrix from some TagSeq data. It seems that I am having issues properly formatting the .gff3 file from the reference genome that I am using (Pocillopora acuta). Most of the genes in my count matrix are named as "STRG###", which indicates that stringtie2 did not find a gene "match" and is therefore calling that transcript as a splice variant. This is happening for ~80% of my genes, so I know this is an issue with the analysis and formatting.
Has anyone has this issue before? Is tehre a particular format of the gene information column in a .gff3 file that I should be using to make sure it can match the transcript to the proper gene id?
|
non_defect
|
problem with not matching gene names from gff file tagseq hi i am having a problem with generating a gene count matrix from some tagseq data it seems that i am having issues properly formatting the file from the reference genome that i am using pocillopora acuta most of the genes in my count matrix are named as strg which indicates that did not find a gene match and is therefore calling that transcript as a splice variant this is happening for of my genes so i know this is an issue with the analysis and formatting has anyone has this issue before is tehre a particular format of the gene information column in a file that i should be using to make sure it can match the transcript to the proper gene id
| 0
|
5,421
| 2,610,187,317
|
IssuesEvent
|
2015-02-26 18:59:23
|
chrsmith/quchuseban
|
https://api.github.com/repos/chrsmith/quchuseban
|
opened
|
讲解天生色斑怎么治疗
|
auto-migrated Priority-Medium Type-Defect
|
```
《摘要》
雀斑是我们皮肤较容易出现的问题之一,所有皮下色素异常��
�称黄褐斑,也叫黑斑,最常见的雀斑雀斑又称蝴蝶斑、妊娠�
��,中医称为“肝斑”、“黧黑斑”,呈片状,对称性分布,
黄褐色或黑色,多以鼻梁为中心,生于面颊、颧骨、额头部��
�,也有发生于上唇和眉骨部位的,发生人群以中青年女性为�
��,是很多女性朋友最担心的“面子”问题之一。天生色斑怎
么治疗,
《客户案例》
再漂亮的女人过了三十,一定会有危机感。年轻时我皮��
�特别好,结婚生孩子后,皮肤越来越暗黄,我今年三十岁了�
��本来皮肤就有点黄,长斑后更丑了,给人一种不干净的感觉
,并且穿什么都很难看,老公也给我买了很多化妆品,但是��
�的化妆品用着竟然还会过敏,让脸上红红的一片,烦恼死了�
��怎么去除,脸上的黄褐斑</br>
我为了去斑真是发了不少的功夫,直到去年9月份,一个�
��事的妹妹向我推荐一款名为黛芙薇尔去斑的产品后,我的情
况也有所好转。开始接触黛芙薇尔去斑产品的时候,我并没��
�抱有什么期望。从官网了解到是纯天然精华制剂,不含任何�
��学激素成分,没有副作用。不用担心有激素刺激作用给身体
带来的影响。在产品制作过程中要经过多个环节的检测程序��
�确保进入市面的产品卫生安全,配料搭配合理,确保每一位�
��者的切身利益。所以我就在官网订购了两个周期,决定先试
试再说。 怎么去除,脸上的黄褐斑。</br>
没想到第二天货就到啦,刚开始的时候效果不是很明显��
�想放弃但是妹妹一直鼓励我,终于我坚持了下来,三个月后�
��脸上那些难看的斑点真的不见了,就像换了皮肤般,又白又
嫩,变得和以前一样漂亮了,老公的应酬也渐渐没有了,晚��
�的河堤旁又可以见到我们拉手散步的身影,非常感谢黛芙薇�
��去斑,去除了脸上的色斑。
阅读了天生色斑怎么治疗,再看脸上容易长斑的原因:
《色斑形成原因》
内部因素
一、压力
当人受到压力时,就会分泌肾上腺素,为对付压力而做��
�备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏�
��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃
。
二、荷尔蒙分泌失调
避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞��
�分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在�
��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕
中因女性荷尔蒙雌激素的增加,从怀孕4—5个月开始会容易出
现斑,这时候出现的斑点在产后大部分会消失。可是,新陈��
�谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等�
��因,都会使斑加深。有时新长出的斑,产后也不会消失,所
以需要更加注意。
三、新陈代谢缓慢
肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑��
�因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态�
��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是
内分泌失调导致过敏体质而形成的。另外,身体状态不正常��
�时候,紫外线的照射也会加速斑的形成。
四、错误的使用化妆品
使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在��
�疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵�
��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的
问题。
外部因素
一、紫外线
照射紫外线的时候,人体为了保护皮肤,会在基底层产��
�很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更�
��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化,
还会引起黑斑、雀斑等色素沉着的皮肤疾患。
二、不良的清洁习惯
因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。��
�皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦�
��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的
问题。
三、遗传基因
父母中有长斑的,则本人长斑的概率就很高,这种情况��
�一定程度上就可判定是遗传基因的作用。所以家里特别是长�
��有长斑的人,要注意避免引发长斑的重要因素之一——紫外
线照射,这是预防斑必须注意的。
《有疑问帮你解决》
1,黛芙薇尔精华液真的有效果吗?真的可以把脸上的黄褐��
�去掉吗?
答:黛芙薇尔精华液DNA精华能够有效的修复周围难以触��
�的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必�
��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑
,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时��
�,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的�
��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显
而易见。自产品上市以来,老顾客纷纷介绍新顾客,71%的新��
�客都是通过老顾客介绍而来,口碑由此而来!
2,服用黛芙薇尔美白,会伤身体吗?有副作用吗?
答:黛芙薇尔精华液应用了精纯复合配方和领先的分类��
�斑科技,并将“DNA美肤系统”疗法应用到了该产品中,能彻�
��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有
效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾��
�地的专家通力协作,超过10年的研究以全新的DNA肌肤修复技��
�,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽�
��迹,令每一位爱美的女性都能享受到科技创新所带来的自然
之美。
专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数��
�百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖!
3,去除黄褐斑之后,会反弹吗?
答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔��
�白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家�
��据斑的形成原因精心研制而成用事实说话,让消费者打分。
树立权威品牌!我们的很多新客户都是老客户介绍而来,请问�
��如果效果不好,会有客户转介绍吗?
4,你们的价格有点贵,能不能便宜一点?
答:如果您使用西药最少需要2000元,煎服的药最少需要3
000元,做手术最少是5000元,而这些毫无疑问,不会对彻底去�
��你的斑点有任何帮助!一分价钱,一份价值,我们现在做的��
�是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的�
��褐斑彻底去除,你还会觉得贵吗?你还会再去花那么多冤枉��
�,不但斑没去掉,还把自己的皮肤弄的越来越糟吗
5,我适合用黛芙薇尔精华液吗?
答:黛芙薇尔适用人群:
1、生理紊乱引起的黄褐斑人群
2、生育引起的妊娠斑人群
3、年纪增长引起的老年斑人群
4、化妆品色素沉积、辐射斑人群
5、长期日照引起的日晒斑人群
6、肌肤暗淡急需美白的人群
《祛斑小方法》
天生色斑怎么治疗,同时为您分享祛斑小方法
1.洗脸时,在水中加1-2汤匙的食醋,有减轻色素沉着的作用。
2.将鲜明萝卜辟碎挤汁,取10-30毫升,每日上晚洗完脸后涂抹�
��待干后,洗净。此外,每日喝一杯胡萝卜,可美白肌肤。
```
-----
Original issue reported on code.google.com by `additive...@gmail.com` on 1 Jul 2014 at 4:09
|
1.0
|
讲解天生色斑怎么治疗 - ```
《摘要》
雀斑是我们皮肤较容易出现的问题之一,所有皮下色素异常��
�称黄褐斑,也叫黑斑,最常见的雀斑雀斑又称蝴蝶斑、妊娠�
��,中医称为“肝斑”、“黧黑斑”,呈片状,对称性分布,
黄褐色或黑色,多以鼻梁为中心,生于面颊、颧骨、额头部��
�,也有发生于上唇和眉骨部位的,发生人群以中青年女性为�
��,是很多女性朋友最担心的“面子”问题之一。天生色斑怎
么治疗,
《客户案例》
再漂亮的女人过了三十,一定会有危机感。年轻时我皮��
�特别好,结婚生孩子后,皮肤越来越暗黄,我今年三十岁了�
��本来皮肤就有点黄,长斑后更丑了,给人一种不干净的感觉
,并且穿什么都很难看,老公也给我买了很多化妆品,但是��
�的化妆品用着竟然还会过敏,让脸上红红的一片,烦恼死了�
��怎么去除,脸上的黄褐斑</br>
我为了去斑真是发了不少的功夫,直到去年9月份,一个�
��事的妹妹向我推荐一款名为黛芙薇尔去斑的产品后,我的情
况也有所好转。开始接触黛芙薇尔去斑产品的时候,我并没��
�抱有什么期望。从官网了解到是纯天然精华制剂,不含任何�
��学激素成分,没有副作用。不用担心有激素刺激作用给身体
带来的影响。在产品制作过程中要经过多个环节的检测程序��
�确保进入市面的产品卫生安全,配料搭配合理,确保每一位�
��者的切身利益。所以我就在官网订购了两个周期,决定先试
试再说。 怎么去除,脸上的黄褐斑。</br>
没想到第二天货就到啦,刚开始的时候效果不是很明显��
�想放弃但是妹妹一直鼓励我,终于我坚持了下来,三个月后�
��脸上那些难看的斑点真的不见了,就像换了皮肤般,又白又
嫩,变得和以前一样漂亮了,老公的应酬也渐渐没有了,晚��
�的河堤旁又可以见到我们拉手散步的身影,非常感谢黛芙薇�
��去斑,去除了脸上的色斑。
阅读了天生色斑怎么治疗,再看脸上容易长斑的原因:
《色斑形成原因》
内部因素
一、压力
当人受到压力时,就会分泌肾上腺素,为对付压力而做��
�备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏�
��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃
。
二、荷尔蒙分泌失调
避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞��
�分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在�
��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕
中因女性荷尔蒙雌激素的增加,从怀孕4—5个月开始会容易出
现斑,这时候出现的斑点在产后大部分会消失。可是,新陈��
�谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等�
��因,都会使斑加深。有时新长出的斑,产后也不会消失,所
以需要更加注意。
三、新陈代谢缓慢
肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑��
�因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态�
��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是
内分泌失调导致过敏体质而形成的。另外,身体状态不正常��
�时候,紫外线的照射也会加速斑的形成。
四、错误的使用化妆品
使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在��
�疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵�
��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的
问题。
外部因素
一、紫外线
照射紫外线的时候,人体为了保护皮肤,会在基底层产��
�很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更�
��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化,
还会引起黑斑、雀斑等色素沉着的皮肤疾患。
二、不良的清洁习惯
因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。��
�皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦�
��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的
问题。
三、遗传基因
父母中有长斑的,则本人长斑的概率就很高,这种情况��
�一定程度上就可判定是遗传基因的作用。所以家里特别是长�
��有长斑的人,要注意避免引发长斑的重要因素之一——紫外
线照射,这是预防斑必须注意的。
《有疑问帮你解决》
1,黛芙薇尔精华液真的有效果吗?真的可以把脸上的黄褐��
�去掉吗?
答:黛芙薇尔精华液DNA精华能够有效的修复周围难以触��
�的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必�
��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑
,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时��
�,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的�
��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显
而易见。自产品上市以来,老顾客纷纷介绍新顾客,71%的新��
�客都是通过老顾客介绍而来,口碑由此而来!
2,服用黛芙薇尔美白,会伤身体吗?有副作用吗?
答:黛芙薇尔精华液应用了精纯复合配方和领先的分类��
�斑科技,并将“DNA美肤系统”疗法应用到了该产品中,能彻�
��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有
效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾��
�地的专家通力协作,超过10年的研究以全新的DNA肌肤修复技��
�,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽�
��迹,令每一位爱美的女性都能享受到科技创新所带来的自然
之美。
专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数��
�百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖!
3,去除黄褐斑之后,会反弹吗?
答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔��
�白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家�
��据斑的形成原因精心研制而成用事实说话,让消费者打分。
树立权威品牌!我们的很多新客户都是老客户介绍而来,请问�
��如果效果不好,会有客户转介绍吗?
4,你们的价格有点贵,能不能便宜一点?
答:如果您使用西药最少需要2000元,煎服的药最少需要3
000元,做手术最少是5000元,而这些毫无疑问,不会对彻底去�
��你的斑点有任何帮助!一分价钱,一份价值,我们现在做的��
�是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的�
��褐斑彻底去除,你还会觉得贵吗?你还会再去花那么多冤枉��
�,不但斑没去掉,还把自己的皮肤弄的越来越糟吗
5,我适合用黛芙薇尔精华液吗?
答:黛芙薇尔适用人群:
1、生理紊乱引起的黄褐斑人群
2、生育引起的妊娠斑人群
3、年纪增长引起的老年斑人群
4、化妆品色素沉积、辐射斑人群
5、长期日照引起的日晒斑人群
6、肌肤暗淡急需美白的人群
《祛斑小方法》
天生色斑怎么治疗,同时为您分享祛斑小方法
1.洗脸时,在水中加1-2汤匙的食醋,有减轻色素沉着的作用。
2.将鲜明萝卜辟碎挤汁,取10-30毫升,每日上晚洗完脸后涂抹�
��待干后,洗净。此外,每日喝一杯胡萝卜,可美白肌肤。
```
-----
Original issue reported on code.google.com by `additive...@gmail.com` on 1 Jul 2014 at 4:09
|
defect
|
讲解天生色斑怎么治疗 《摘要》 雀斑是我们皮肤较容易出现的问题之一,所有皮下色素异常�� �称黄褐斑,也叫黑斑,最常见的雀斑雀斑又称蝴蝶斑、妊娠� ��,中医称为“肝斑”、“黧黑斑”,呈片状,对称性分布, 黄褐色或黑色,多以鼻梁为中心,生于面颊、颧骨、额头部�� �,也有发生于上唇和眉骨部位的,发生人群以中青年女性为� ��,是很多女性朋友最担心的“面子”问题之一。天生色斑怎 么治疗, 《客户案例》 再漂亮的女人过了三十,一定会有危机感。年轻时我皮�� �特别好,结婚生孩子后,皮肤越来越暗黄,我今年三十岁了� ��本来皮肤就有点黄,长斑后更丑了,给人一种不干净的感觉 ,并且穿什么都很难看,老公也给我买了很多化妆品,但是�� �的化妆品用着竟然还会过敏,让脸上红红的一片,烦恼死了� ��怎么去除 脸上的黄褐斑 我为了去斑真是发了不少的功夫, ,一个� ��事的妹妹向我推荐一款名为黛芙薇尔去斑的产品后,我的情 况也有所好转。开始接触黛芙薇尔去斑产品的时候,我并没�� �抱有什么期望。从官网了解到是纯天然精华制剂,不含任何� ��学激素成分,没有副作用。不用担心有激素刺激作用给身体 带来的影响。在产品制作过程中要经过多个环节的检测程序�� �确保进入市面的产品卫生安全,配料搭配合理,确保每一位� ��者的切身利益。所以我就在官网订购了两个周期,决定先试 试再说。 怎么去除 脸上的黄褐斑。 没想到第二天货就到啦,刚开始的时候效果不是很明显�� �想放弃但是妹妹一直鼓励我,终于我坚持了下来,三个月后� ��脸上那些难看的斑点真的不见了,就像换了皮肤般,又白又 嫩,变得和以前一样漂亮了,老公的应酬也渐渐没有了,晚�� �的河堤旁又可以见到我们拉手散步的身影,非常感谢黛芙薇� ��去斑,去除了脸上的色斑。 阅读了天生色斑怎么治疗,再看脸上容易长斑的原因: 《色斑形成原因》 内部因素 一、压力 当人受到压力时,就会分泌肾上腺素,为对付压力而做�� �备。如果长期受到压力,人体新陈代谢的平衡就会遭到破坏� ��皮肤所需的营养供应趋于缓慢,色素母细胞就会变得很活跃 。 二、荷尔蒙分泌失调 避孕药里所含的女性荷尔蒙雌激素,会刺激麦拉宁细胞�� �分泌而形成不均匀的斑点,因避孕药而形成的斑点,虽然在� ��药中断后会停止,但仍会在皮肤上停留很长一段时间。怀孕 中因女性荷尔蒙雌激素的增加, — 现斑,这时候出现的斑点在产后大部分会消失。可是,新陈�� �谢不正常、肌肤裸露在强烈的紫外线下、精神上受到压力等� ��因,都会使斑加深。有时新长出的斑,产后也不会消失,所 以需要更加注意。 三、新陈代谢缓慢 肝的新陈代谢功能不正常或卵巢功能减退时也会出现斑�� �因为新陈代谢不顺畅、或内分泌失调,使身体处于敏感状态� ��,从而加剧色素问题。我们常说的便秘会形成斑,其实就是 内分泌失调导致过敏体质而形成的。另外,身体状态不正常�� �时候,紫外线的照射也会加速斑的形成。 四、错误的使用化妆品 使用了不适合自己皮肤的化妆品,会导致皮肤过敏。在�� �疗的过程中如过量照射到紫外线,皮肤会为了抵御外界的侵� ��,在有炎症的部位聚集麦拉宁色素,这样会出现色素沉着的 问题。 外部因素 一、紫外线 照射紫外线的时候,人体为了保护皮肤,会在基底层产�� �很多麦拉宁色素。所以为了保护皮肤,会在敏感部位聚集更� ��的色素。经常裸露在强烈的阳光底下不仅促进皮肤的老化, 还会引起黑斑、雀斑等色素沉着的皮肤疾患。 二、不良的清洁习惯 因强烈的清洁习惯使皮肤变得敏感,这样会刺激皮肤。�� �皮肤敏感时,人体为了保护皮肤,黑色素细胞会分泌很多麦� ��宁色素,当色素过剩时就出现了斑、瑕疵等皮肤色素沉着的 问题。 三、遗传基因 父母中有长斑的,则本人长斑的概率就很高,这种情况�� �一定程度上就可判定是遗传基因的作用。所以家里特别是长� ��有长斑的人,要注意避免引发长斑的重要因素之一——紫外 线照射,这是预防斑必须注意的。 《有疑问帮你解决》 黛芙薇尔精华液真的有效果吗 真的可以把脸上的黄褐�� �去掉吗 答:黛芙薇尔精华液dna精华能够有效的修复周围难以触�� �的色斑,其独有的纳豆成分为皮肤的美白与靓丽,提供了必� ��可少的营养物质,可以有效的去除黄褐斑,黄褐斑,黄褐斑 ,蝴蝶斑,晒斑、妊娠斑等。它它完全突破了传统的美肤时�� �,宛如在皮肤中注入了一杯兼具活化、再生、滋养等功效的� ��尾酒,同时为脸部提供大量有机维生素精华,脸部的改变显 而易见。自产品上市以来,老顾客纷纷介绍新顾客, 的新�� �客都是通过老顾客介绍而来,口碑由此而来 ,服用黛芙薇尔美白,会伤身体吗 有副作用吗 答:黛芙薇尔精华液应用了精纯复合配方和领先的分类�� �斑科技,并将“dna美肤系统”疗法应用到了该产品中,能彻� ��祛除黄褐斑,蝴蝶斑,妊娠斑,晒斑,黄褐斑,老年斑,有 效淡化黄褐斑至接近肤色。黛芙薇尔通过法国、美国、台湾�� �地的专家通力协作, �� �,挑战传统化学护肤理念,不懈追寻发现破译大自然的美丽� ��迹,令每一位爱美的女性都能享受到科技创新所带来的自然 之美。 专为亚洲女性肤质研制,精心呵护女性美丽,多年来,为数�� �百万计的女性解除了黄褐斑困扰。深得广大女性朋友的信赖 ,去除黄褐斑之后,会反弹吗 答:很多曾经长了黄褐斑的人士,自从选择了黛芙薇尔�� �白,就一劳永逸。这款祛斑产品是经过数十位权威祛斑专家� ��据斑的形成原因精心研制而成用事实说话,让消费者打分。 树立权威品牌 我们的很多新客户都是老客户介绍而来,请问� ��如果效果不好,会有客户转介绍吗 ,你们的价格有点贵,能不能便宜一点 答: , , ,而这些毫无疑问,不会对彻底去� ��你的斑点有任何帮助 一分价钱,一份价值,我们现在做的�� �是一个口碑,一个品牌,价钱并不高。如果花这点钱把你的� ��褐斑彻底去除,你还会觉得贵吗 你还会再去花那么多冤枉�� �,不但斑没去掉,还把自己的皮肤弄的越来越糟吗 ,我适合用黛芙薇尔精华液吗 答:黛芙薇尔适用人群: 、生理紊乱引起的黄褐斑人群 、生育引起的妊娠斑人群 、年纪增长引起的老年斑人群 、化妆品色素沉积、辐射斑人群 、长期日照引起的日晒斑人群 、肌肤暗淡急需美白的人群 《祛斑小方法》 天生色斑怎么治疗,同时为您分享祛斑小方法 洗脸时, ,有减轻色素沉着的作用。 将鲜明萝卜辟碎挤汁, ,每日上晚洗完脸后涂抹� ��待干后,洗净。此外,每日喝一杯胡萝卜,可美白肌肤。 original issue reported on code google com by additive gmail com on jul at
| 1
|
360,504
| 10,693,549,639
|
IssuesEvent
|
2019-10-23 09:03:53
|
yugabyte/yugabyte-db
|
https://api.github.com/repos/yugabyte/yugabyte-db
|
opened
|
Increase defaults to rocksdb_compact_flush_rate_limit_bytes_per_sec and remote_bootstrap_rate_limit_bytes_per_sec
|
area/docdb kind/enhancement priority/high
|
I think we can increase rocksdb_compact_flush_rate_limit_bytes_per_sec to 1GB and
remote_bootstrap_rate_limit_bytes_per_sec to 384 or 512MB
|
1.0
|
Increase defaults to rocksdb_compact_flush_rate_limit_bytes_per_sec and remote_bootstrap_rate_limit_bytes_per_sec -
I think we can increase rocksdb_compact_flush_rate_limit_bytes_per_sec to 1GB and
remote_bootstrap_rate_limit_bytes_per_sec to 384 or 512MB
|
non_defect
|
increase defaults to rocksdb compact flush rate limit bytes per sec and remote bootstrap rate limit bytes per sec i think we can increase rocksdb compact flush rate limit bytes per sec to and remote bootstrap rate limit bytes per sec to or
| 0
|
4,946
| 2,610,162,011
|
IssuesEvent
|
2015-02-26 18:51:23
|
chrsmith/republic-at-war
|
https://api.github.com/repos/chrsmith/republic-at-war
|
closed
|
Gameplay Error
|
auto-migrated Priority-Medium Type-Defect
|
```
Put Shaak-Ti and Luminara Unduli at correct tech level
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 18 Feb 2011 at 4:21
|
1.0
|
Gameplay Error - ```
Put Shaak-Ti and Luminara Unduli at correct tech level
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 18 Feb 2011 at 4:21
|
defect
|
gameplay error put shaak ti and luminara unduli at correct tech level original issue reported on code google com by gmail com on feb at
| 1
|
45,731
| 13,046,186,442
|
IssuesEvent
|
2020-07-29 08:34:58
|
ryanhsu828/hijack-main
|
https://api.github.com/repos/ryanhsu828/hijack-main
|
closed
|
Android test results
|
Priority-Medium Type-Defect auto-migrated
|
```
I have my hijack board working on iOS with the example apps and it works well.
I've run some initial tests on Android (just sampling mic input) and the shape
of the manchester-encoded signal varies widely across the different phone
models I've tried. Any ideas why and what could be done to improve this?
At the moment, I don't see how data can easily be decoded on some Android
devices. It seems a long way from the clean square wave on iOS.
Attached is a ZIP with some examples. In each image, the sampled mic input is
shown in the top channel.
```
Original issue reported on code.google.com by `siwat...@gmail.com` on 19 Jul 2013 at 11:41
Attachments:
- [Android Tests.zip](https://storage.googleapis.com/google-code-attachments/hijack-main/issue-13/comment-0/Android Tests.zip)
|
1.0
|
Android test results - ```
I have my hijack board working on iOS with the example apps and it works well.
I've run some initial tests on Android (just sampling mic input) and the shape
of the manchester-encoded signal varies widely across the different phone
models I've tried. Any ideas why and what could be done to improve this?
At the moment, I don't see how data can easily be decoded on some Android
devices. It seems a long way from the clean square wave on iOS.
Attached is a ZIP with some examples. In each image, the sampled mic input is
shown in the top channel.
```
Original issue reported on code.google.com by `siwat...@gmail.com` on 19 Jul 2013 at 11:41
Attachments:
- [Android Tests.zip](https://storage.googleapis.com/google-code-attachments/hijack-main/issue-13/comment-0/Android Tests.zip)
|
defect
|
android test results i have my hijack board working on ios with the example apps and it works well i ve run some initial tests on android just sampling mic input and the shape of the manchester encoded signal varies widely across the different phone models i ve tried any ideas why and what could be done to improve this at the moment i don t see how data can easily be decoded on some android devices it seems a long way from the clean square wave on ios attached is a zip with some examples in each image the sampled mic input is shown in the top channel original issue reported on code google com by siwat gmail com on jul at attachments tests zip
| 1
|
171,014
| 20,905,449,273
|
IssuesEvent
|
2022-03-24 01:18:53
|
tt9133github/zheng
|
https://api.github.com/repos/tt9133github/zheng
|
opened
|
CVE-2021-25640 (Medium) detected in dubbo-2.5.6.jar
|
security vulnerability
|
## CVE-2021-25640 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dubbo-2.5.6.jar</b></p></summary>
<p>Dubbo is a distributed service framework enpowers applications with service import/export capability
with high performance RPC.</p>
<p>Library home page: <a href="http://dubbo.io">http://dubbo.io</a></p>
<p>Path to dependency file: /zheng-ucenter/zheng-ucenter-web/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/zheng-wechat/zheng-wechat-mp/zheng-wechat-mp-admin/target/zheng-wechat-mp-admin/WEB-INF/lib/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/canner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar</p>
<p>
Dependency Hierarchy:
- :x: **dubbo-2.5.6.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Dubbo prior to 2.6.9 and 2.7.9, the usage of parseURL method will lead to the bypass of white host check which can cause open redirect or SSRF vulnerability.
<p>Publish Date: 2021-06-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-25640>CVE-2021-25640</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-gw4j-4229-q4px">https://github.com/advisories/GHSA-gw4j-4229-q4px</a></p>
<p>Release Date: 2021-06-01</p>
<p>Fix Resolution: com.alibaba:dubbo:2.6.9;org.apache.dubbo:dubbo:2.7.10</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-25640 (Medium) detected in dubbo-2.5.6.jar - ## CVE-2021-25640 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dubbo-2.5.6.jar</b></p></summary>
<p>Dubbo is a distributed service framework enpowers applications with service import/export capability
with high performance RPC.</p>
<p>Library home page: <a href="http://dubbo.io">http://dubbo.io</a></p>
<p>Path to dependency file: /zheng-ucenter/zheng-ucenter-web/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/zheng-wechat/zheng-wechat-mp/zheng-wechat-mp-admin/target/zheng-wechat-mp-admin/WEB-INF/lib/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/canner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar,/home/wss-scanner/.m2/repository/com/alibaba/dubbo/2.5.6/dubbo-2.5.6.jar</p>
<p>
Dependency Hierarchy:
- :x: **dubbo-2.5.6.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In Apache Dubbo prior to 2.6.9 and 2.7.9, the usage of parseURL method will lead to the bypass of white host check which can cause open redirect or SSRF vulnerability.
<p>Publish Date: 2021-06-01
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-25640>CVE-2021-25640</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/advisories/GHSA-gw4j-4229-q4px">https://github.com/advisories/GHSA-gw4j-4229-q4px</a></p>
<p>Release Date: 2021-06-01</p>
<p>Fix Resolution: com.alibaba:dubbo:2.6.9;org.apache.dubbo:dubbo:2.7.10</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve medium detected in dubbo jar cve medium severity vulnerability vulnerable library dubbo jar dubbo is a distributed service framework enpowers applications with service import export capability with high performance rpc library home page a href path to dependency file zheng ucenter zheng ucenter web pom xml path to vulnerable library home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar zheng wechat zheng wechat mp zheng wechat mp admin target zheng wechat mp admin web inf lib dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar canner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar home wss scanner repository com alibaba dubbo dubbo jar dependency hierarchy x dubbo jar vulnerable library found in base branch master vulnerability details in apache dubbo prior to and the usage of parseurl method will lead to the bypass of white host check which can cause open redirect or ssrf vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com alibaba dubbo org apache dubbo dubbo step up your open source security game with whitesource
| 0
|
306,908
| 26,505,511,920
|
IssuesEvent
|
2023-01-18 13:34:28
|
airbytehq/airbyte
|
https://api.github.com/repos/airbytehq/airbyte
|
closed
|
Clean up naming in webapp e2e testing
|
type/enhancement area/frontend ui/tests team/frontend
|
## Tell us about the problem you're trying to solve
Function names are describing their expected outcome rather than what they are doing
to resolve these (moved from https://github.com/airbytehq/airbyte/issues/20722)
|
1.0
|
Clean up naming in webapp e2e testing - ## Tell us about the problem you're trying to solve
Function names are describing their expected outcome rather than what they are doing
to resolve these (moved from https://github.com/airbytehq/airbyte/issues/20722)
|
non_defect
|
clean up naming in webapp testing tell us about the problem you re trying to solve function names are describing their expected outcome rather than what they are doing to resolve these moved from
| 0
|
42,740
| 11,235,110,652
|
IssuesEvent
|
2020-01-09 07:26:46
|
opencaching/opencaching-pl
|
https://api.github.com/repos/opencaching/opencaching-pl
|
closed
|
SimpleRouter should check if controller is not abstract class
|
Component Core Priority Low Type Defect
|
If somebody tries to reach https://OC/base/something SimpleRouter tries to call abstract BaseController and generates error for user and mail for admins:
`Error: Cannot instantiate abstract class src\Controllers\BaseController
`
SimpleRouter should check if controller is NOT abstract. If is - should return 404 page, not error as above.
|
1.0
|
SimpleRouter should check if controller is not abstract class - If somebody tries to reach https://OC/base/something SimpleRouter tries to call abstract BaseController and generates error for user and mail for admins:
`Error: Cannot instantiate abstract class src\Controllers\BaseController
`
SimpleRouter should check if controller is NOT abstract. If is - should return 404 page, not error as above.
|
defect
|
simplerouter should check if controller is not abstract class if somebody tries to reach simplerouter tries to call abstract basecontroller and generates error for user and mail for admins error cannot instantiate abstract class src controllers basecontroller simplerouter should check if controller is not abstract if is should return page not error as above
| 1
|
38,685
| 8,952,153,109
|
IssuesEvent
|
2019-01-25 15:49:10
|
svigerske/ipopt-donotuse
|
https://api.github.com/repos/svigerske/ipopt-donotuse
|
closed
|
After compiling Ipopt with Cygwin "make test" fails
|
C Ipopt defect make test
|
Issue created by migration from Trac.
Original creator: Johannes
Original creation time: 2009-06-23 15:42:03
Assignee: ipopt-team
Version: 3.6
I am trying to install Ipopt with Cygwin on Windows to use it with Matlab. configure and make test works but then the C-Test fails when typing make test:
Johannes at Johannes-PC ~/CoinIpopt
$ make test
Making all in ThirdParty/Blas
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Blas'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Blas'
Making all in ThirdParty/Lapack
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Lapack'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Lapack'
Making all in ThirdParty/Metis
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Metis'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Metis'
Making all in ThirdParty/HSL
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/HSL'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/HSL'
Making all in ThirdParty/ASL
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/ASL'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/ASL'
Making all in ThirdParty/Mumps
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Mumps'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Mumps'
Making all in Ipopt
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
Making all in src/Common
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
Making all in src/LinAlg
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in TMatrices
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices
'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in src/Algorithm
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in LinearSolvers
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Linear
Solvers'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/LinearS
olvers'
Making all in Inexact
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexac
t'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexact
'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in src/contrib/CGPenalty
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalt
y'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalty
'
Making all in src/contrib/LinearSolverLoader
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSo
lverLoader'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSol
verLoader'
Making all in src/Interfaces
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
Making all in src/Apps
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
Making all in CUTErInterface
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterf
ace'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterfa
ce'
Making all in AmplSolver
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
make29817e9645: Nothing to be done for `all-am'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt'
make8eb352ceb5: Nothing to be done for `all-am'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt'
cd Ipopt; make test
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
Making all in src/Common
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
Making all in src/LinAlg
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in TMatrices
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices
'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in src/Algorithm
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in LinearSolvers
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Linear
Solvers'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/LinearS
olvers'
Making all in Inexact
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexac
t'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexact
'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in src/contrib/CGPenalty
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalt
y'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalty
'
Making all in src/contrib/LinearSolverLoader
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSo
lverLoader'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSol
verLoader'
Making all in src/Interfaces
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
Making all in src/Apps
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
Making all in CUTErInterface
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterf
ace'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterfa
ce'
Making all in AmplSolver
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
make29817e9645: Nothing to be done for `all-am'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
cd test; make test
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/test'
if g++ -DHAVE_CONFIG_H -I. -I`cygpath -w .` -I../inc -I`cygpath -w
./../src/Com
mon` -I`cygpath -w ./../src/LinAlg` -I`cygpath -w ./../src/LinAlg/TMatrices`
-I`
cygpath -w ./../src/Algorithm` -I`cygpath -w ./../src/Interfaces` -O3
-fomit-f
rame-pointer -pipe -DNDEBUG -pedantic-errors -Wimplicit -Wparentheses
-Wreturn-t
ype -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings -Wconversion
-Wno-unknown-
pragmas -MT hs071_main.o -MD -MP -MF ".deps/hs071_main.Tpo" -c -o
hs071_main.o
hs071_main.cpp; \
then mv -f ".deps/hs071_main.Tpo" ".deps/hs071_main.Po"; else rm -f
".de
ps/hs071_main.Tpo"; exit 1; fi
if g++ -DHAVE_CONFIG_H -I. -I`cygpath -w .` -I../inc -I`cygpath -w
./../src/Com
mon` -I`cygpath -w ./../src/LinAlg` -I`cygpath -w ./../src/LinAlg/TMatrices`
-I`
cygpath -w ./../src/Algorithm` -I`cygpath -w ./../src/Interfaces` -O3
-fomit-f
rame-pointer -pipe -DNDEBUG -pedantic-errors -Wimplicit -Wparentheses
-Wreturn-t
ype -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings -Wconversion
-Wno-unknown-
pragmas -MT hs071_nlp.o -MD -MP -MF ".deps/hs071_nlp.Tpo" -c -o
hs071_nlp.o hs
071_nlp.cpp; \
then mv -f ".deps/hs071_nlp.Tpo" ".deps/hs071_nlp.Po"; else rm -f
".deps
/hs071_nlp.Tpo"; exit 1; fi
/bin/sh ../../libtool --tag=CXX --mode=link g++ -O3 -fomit-frame-pointer
-pipe
-DNDEBUG -pedantic-errors -Wimplicit -Wparentheses -Wreturn-type -Wcast-qual
-Wa
ll -Wpointer-arith -Wwrite-strings -Wconversion -Wno-unknown-pragmas -o
hs07
1_cpp.exe hs071_main.o hs071_nlp.o ../src/Interfaces/libipopt.la -lm -ldl
-L/
usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/usr/lib/gcc/i686-pc-cygwin/3.4.4/../../.. -l
frtbegin -lg2c -lcygwin -luser32 -lkernel32 -ladvapi32 -lshell32 -ldl
mkdir .libs
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
g++ -O3 -fomit-frame-pointer -pipe -DNDEBUG -pedantic-errors -Wimplicit
-Wparent
heses -Wreturn-type -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings
-Wconversi
on -Wno-unknown-pragmas -o hs071_cpp.exe hs071_main.o hs071_nlp.o
../src/Interf
aces/.libs/libipopt.a -L/usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/usr/lib/gcc/i686-pc
-cygwin/3.4.4/../../.. -lfrtbegin /usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.a
-lc
ygwin -luser32 -lkernel32 -ladvapi32 -lshell32 -ldl
if gcc -DHAVE_CONFIG_H -I. -I`cygpath -w .` -I../inc -I`cygpath -w
./../src/Com
mon` -I`cygpath -w ./../src/LinAlg` -I`cygpath -w ./../src/LinAlg/TMatrices`
-I`
cygpath -w ./../src/Algorithm` -I`cygpath -w ./../src/Interfaces` -O3
-fomit-f
rame-pointer -pipe -DNDEBUG -Wimplicit -Wparentheses -Wsequence-point
-Wreturn-
type -Wcast-qual -Wall -Wno-unknown-pragmas -MT hs071_c.o -MD -MP -MF
".deps/hs0
71_c.Tpo" -c -o hs071_c.o hs071_c.c; \
then mv -f ".deps/hs071_c.Tpo" ".deps/hs071_c.Po"; else rm -f
".deps/hs0
71_c.Tpo"; exit 1; fi
/bin/sh ../../libtool --tag=CC --mode=link gcc -O3 -fomit-frame-pointer
-pipe -
DNDEBUG -Wimplicit -Wparentheses -Wsequence-point -Wreturn-type -Wcast-qual
-Wa
ll -Wno-unknown-pragmas -o hs071_c.exe hs071_c.o
../src/Interfaces/libipopt.l
a -lm -ldl -L/usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/usr/lib/gcc/i686-pc-cygwin/3
.4.4/../../.. -lfrtbegin -lg2c -lcygwin -luser32 -lkernel32 -ladvapi32
-lshell32
-lstdc++ -lm -ldl
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
gcc -O3 -fomit-frame-pointer -pipe -DNDEBUG -Wimplicit -Wparentheses
-Wsequence-
point -Wreturn-type -Wcast-qual -Wall -Wno-unknown-pragmas -o hs071_c.exe
hs071_
c.o ../src/Interfaces/.libs/libipopt.a -L/usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/u
sr/lib/gcc/i686-pc-cygwin/3.4.4/../../.. -lfrtbegin
/usr/lib/gcc/i686-pc-cygwin/
3.4.4/libg2c.a /usr/lib/gcc/i686-pc-cygwin/3.4.4/libstdc++.a
-L/managed/gcc-buil
d/final-v3-bootstrap/gcc-3.4.4-999/.build/i686-pc-cygwin/libstdc++-v3/src
-L/man
aged/gcc-build/final-v3-bootstrap/gcc-3.4.4-999/.build/i686-pc-cygwin/libstdc++-
v3/src/.libs
-L/managed/gcc-build/final-v3-bootstrap/gcc-3.4.4-999/.build/gcc -L
/usr/i686-pc-cygwin/bin -lcygwin -luser32 -lkernel32 -ladvapi32 -lshell32
-lgcc
-ldl
ln -s ../examples/hs071_f/hs071_f.f hs071_f.f
g77 -I`cygpath -w ./../src/Interfaces` -O3 -fomit-frame-pointer -pipe -c
-o hs
071_f.o hs071_f.f
/bin/sh ../../libtool --tag=F77 --mode=link g77 -I`cygpath -w
./../src/Interface
s` -O3 -fomit-frame-pointer -pipe -o hs071_f.exe hs071_f.o
../src/Interface
s/libipopt.la -lm -ldl -lstdc++ -lm -ldl
g77 -IC:\\cygwin\\home\\Johannes\\CoinIpopt\\Ipopt\\src\\Interfaces -O3
-fomit-f
rame-pointer -pipe -o hs071_f.exe hs071_f.o
../src/Interfaces/.libs/libipopt.a
-lstdc++ -ldl
chmod u+x ./run_unitTests
./run_unitTests
Running unitTests...
Testing AMPL Solver Executable...
Test passed!
Testing C++ Example...
Test passed!
Testing C Example...
./run_unitTests: line 69: 3676 Segmentation fault (core dumped)
./hs071_c
> tmpfile 2>&1
---- 8< ---- Start of test program output ---- 8< ----
---- 8< ---- End of test program output ---- 8< ----
******** Test FAILED! ********
Output of the test program is above.
Testing Fortran Example...
Test passed!
make29817e9645: *** [test] Error 255
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/test'
make8eb352ceb5: *** [unitTest] Error 2
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
make: *** [test] Error 2
|
1.0
|
After compiling Ipopt with Cygwin "make test" fails - Issue created by migration from Trac.
Original creator: Johannes
Original creation time: 2009-06-23 15:42:03
Assignee: ipopt-team
Version: 3.6
I am trying to install Ipopt with Cygwin on Windows to use it with Matlab. configure and make test works but then the C-Test fails when typing make test:
Johannes at Johannes-PC ~/CoinIpopt
$ make test
Making all in ThirdParty/Blas
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Blas'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Blas'
Making all in ThirdParty/Lapack
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Lapack'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Lapack'
Making all in ThirdParty/Metis
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Metis'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Metis'
Making all in ThirdParty/HSL
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/HSL'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/HSL'
Making all in ThirdParty/ASL
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/ASL'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/ASL'
Making all in ThirdParty/Mumps
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/ThirdParty/Mumps'
make8eb352ceb5: Nothing to be done for `all'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/ThirdParty/Mumps'
Making all in Ipopt
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
Making all in src/Common
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
Making all in src/LinAlg
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in TMatrices
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices
'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in src/Algorithm
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in LinearSolvers
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Linear
Solvers'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/LinearS
olvers'
Making all in Inexact
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexac
t'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexact
'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in src/contrib/CGPenalty
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalt
y'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalty
'
Making all in src/contrib/LinearSolverLoader
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSo
lverLoader'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSol
verLoader'
Making all in src/Interfaces
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
Making all in src/Apps
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
Making all in CUTErInterface
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterf
ace'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterfa
ce'
Making all in AmplSolver
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
make29817e9645: Nothing to be done for `all-am'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt'
make8eb352ceb5: Nothing to be done for `all-am'.
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt'
cd Ipopt; make test
make8eb352ceb5: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
Making all in src/Common
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Common'
Making all in src/LinAlg
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in TMatrices
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices
'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/LinAlg/TMatrices'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/LinAlg'
Making all in src/Algorithm
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in LinearSolvers
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Linear
Solvers'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/LinearS
olvers'
Making all in Inexact
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexac
t'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Algorithm/Inexact
'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Algorithm'
Making all in src/contrib/CGPenalty
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalt
y'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/CGPenalty
'
Making all in src/contrib/LinearSolverLoader
make29817e9645: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSo
lverLoader'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/contrib/LinearSol
verLoader'
Making all in src/Interfaces
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
make29817e9645: Nothing to be done for `all'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Interfaces'
Making all in src/Apps
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
Making all in CUTErInterface
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterf
ace'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/CUTErInterfa
ce'
Making all in AmplSolver
make[3]: Entering directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Nothing to be done for `all'.
make[3]: Leaving directory
`/home/Johannes/CoinIpopt/Ipopt/src/Apps/AmplSolver'
make[3]: Entering directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make[3]: Nothing to be done for `all-am'.
make[3]: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/src/Apps'
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt'
make29817e9645: Nothing to be done for `all-am'.
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
cd test; make test
make29817e9645: Entering directory `/home/Johannes/CoinIpopt/Ipopt/test'
if g++ -DHAVE_CONFIG_H -I. -I`cygpath -w .` -I../inc -I`cygpath -w
./../src/Com
mon` -I`cygpath -w ./../src/LinAlg` -I`cygpath -w ./../src/LinAlg/TMatrices`
-I`
cygpath -w ./../src/Algorithm` -I`cygpath -w ./../src/Interfaces` -O3
-fomit-f
rame-pointer -pipe -DNDEBUG -pedantic-errors -Wimplicit -Wparentheses
-Wreturn-t
ype -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings -Wconversion
-Wno-unknown-
pragmas -MT hs071_main.o -MD -MP -MF ".deps/hs071_main.Tpo" -c -o
hs071_main.o
hs071_main.cpp; \
then mv -f ".deps/hs071_main.Tpo" ".deps/hs071_main.Po"; else rm -f
".de
ps/hs071_main.Tpo"; exit 1; fi
if g++ -DHAVE_CONFIG_H -I. -I`cygpath -w .` -I../inc -I`cygpath -w
./../src/Com
mon` -I`cygpath -w ./../src/LinAlg` -I`cygpath -w ./../src/LinAlg/TMatrices`
-I`
cygpath -w ./../src/Algorithm` -I`cygpath -w ./../src/Interfaces` -O3
-fomit-f
rame-pointer -pipe -DNDEBUG -pedantic-errors -Wimplicit -Wparentheses
-Wreturn-t
ype -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings -Wconversion
-Wno-unknown-
pragmas -MT hs071_nlp.o -MD -MP -MF ".deps/hs071_nlp.Tpo" -c -o
hs071_nlp.o hs
071_nlp.cpp; \
then mv -f ".deps/hs071_nlp.Tpo" ".deps/hs071_nlp.Po"; else rm -f
".deps
/hs071_nlp.Tpo"; exit 1; fi
/bin/sh ../../libtool --tag=CXX --mode=link g++ -O3 -fomit-frame-pointer
-pipe
-DNDEBUG -pedantic-errors -Wimplicit -Wparentheses -Wreturn-type -Wcast-qual
-Wa
ll -Wpointer-arith -Wwrite-strings -Wconversion -Wno-unknown-pragmas -o
hs07
1_cpp.exe hs071_main.o hs071_nlp.o ../src/Interfaces/libipopt.la -lm -ldl
-L/
usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/usr/lib/gcc/i686-pc-cygwin/3.4.4/../../.. -l
frtbegin -lg2c -lcygwin -luser32 -lkernel32 -ladvapi32 -lshell32 -ldl
mkdir .libs
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
g++ -O3 -fomit-frame-pointer -pipe -DNDEBUG -pedantic-errors -Wimplicit
-Wparent
heses -Wreturn-type -Wcast-qual -Wall -Wpointer-arith -Wwrite-strings
-Wconversi
on -Wno-unknown-pragmas -o hs071_cpp.exe hs071_main.o hs071_nlp.o
../src/Interf
aces/.libs/libipopt.a -L/usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/usr/lib/gcc/i686-pc
-cygwin/3.4.4/../../.. -lfrtbegin /usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.a
-lc
ygwin -luser32 -lkernel32 -ladvapi32 -lshell32 -ldl
if gcc -DHAVE_CONFIG_H -I. -I`cygpath -w .` -I../inc -I`cygpath -w
./../src/Com
mon` -I`cygpath -w ./../src/LinAlg` -I`cygpath -w ./../src/LinAlg/TMatrices`
-I`
cygpath -w ./../src/Algorithm` -I`cygpath -w ./../src/Interfaces` -O3
-fomit-f
rame-pointer -pipe -DNDEBUG -Wimplicit -Wparentheses -Wsequence-point
-Wreturn-
type -Wcast-qual -Wall -Wno-unknown-pragmas -MT hs071_c.o -MD -MP -MF
".deps/hs0
71_c.Tpo" -c -o hs071_c.o hs071_c.c; \
then mv -f ".deps/hs071_c.Tpo" ".deps/hs071_c.Po"; else rm -f
".deps/hs0
71_c.Tpo"; exit 1; fi
/bin/sh ../../libtool --tag=CC --mode=link gcc -O3 -fomit-frame-pointer
-pipe -
DNDEBUG -Wimplicit -Wparentheses -Wsequence-point -Wreturn-type -Wcast-qual
-Wa
ll -Wno-unknown-pragmas -o hs071_c.exe hs071_c.o
../src/Interfaces/libipopt.l
a -lm -ldl -L/usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/usr/lib/gcc/i686-pc-cygwin/3
.4.4/../../.. -lfrtbegin -lg2c -lcygwin -luser32 -lkernel32 -ladvapi32
-lshell32
-lstdc++ -lm -ldl
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
libtool: link: warning: library `/usr/lib/gcc/i686-pc-cygwin/3.4.4/libg2c.la'
wa
s moved.
gcc -O3 -fomit-frame-pointer -pipe -DNDEBUG -Wimplicit -Wparentheses
-Wsequence-
point -Wreturn-type -Wcast-qual -Wall -Wno-unknown-pragmas -o hs071_c.exe
hs071_
c.o ../src/Interfaces/.libs/libipopt.a -L/usr/lib/gcc/i686-pc-cygwin/3.4.4
-L/u
sr/lib/gcc/i686-pc-cygwin/3.4.4/../../.. -lfrtbegin
/usr/lib/gcc/i686-pc-cygwin/
3.4.4/libg2c.a /usr/lib/gcc/i686-pc-cygwin/3.4.4/libstdc++.a
-L/managed/gcc-buil
d/final-v3-bootstrap/gcc-3.4.4-999/.build/i686-pc-cygwin/libstdc++-v3/src
-L/man
aged/gcc-build/final-v3-bootstrap/gcc-3.4.4-999/.build/i686-pc-cygwin/libstdc++-
v3/src/.libs
-L/managed/gcc-build/final-v3-bootstrap/gcc-3.4.4-999/.build/gcc -L
/usr/i686-pc-cygwin/bin -lcygwin -luser32 -lkernel32 -ladvapi32 -lshell32
-lgcc
-ldl
ln -s ../examples/hs071_f/hs071_f.f hs071_f.f
g77 -I`cygpath -w ./../src/Interfaces` -O3 -fomit-frame-pointer -pipe -c
-o hs
071_f.o hs071_f.f
/bin/sh ../../libtool --tag=F77 --mode=link g77 -I`cygpath -w
./../src/Interface
s` -O3 -fomit-frame-pointer -pipe -o hs071_f.exe hs071_f.o
../src/Interface
s/libipopt.la -lm -ldl -lstdc++ -lm -ldl
g77 -IC:\\cygwin\\home\\Johannes\\CoinIpopt\\Ipopt\\src\\Interfaces -O3
-fomit-f
rame-pointer -pipe -o hs071_f.exe hs071_f.o
../src/Interfaces/.libs/libipopt.a
-lstdc++ -ldl
chmod u+x ./run_unitTests
./run_unitTests
Running unitTests...
Testing AMPL Solver Executable...
Test passed!
Testing C++ Example...
Test passed!
Testing C Example...
./run_unitTests: line 69: 3676 Segmentation fault (core dumped)
./hs071_c
> tmpfile 2>&1
---- 8< ---- Start of test program output ---- 8< ----
---- 8< ---- End of test program output ---- 8< ----
******** Test FAILED! ********
Output of the test program is above.
Testing Fortran Example...
Test passed!
make29817e9645: *** [test] Error 255
make29817e9645: Leaving directory `/home/Johannes/CoinIpopt/Ipopt/test'
make8eb352ceb5: *** [unitTest] Error 2
make8eb352ceb5: Leaving directory `/home/Johannes/CoinIpopt/Ipopt'
make: *** [test] Error 2
|
defect
|
after compiling ipopt with cygwin make test fails issue created by migration from trac original creator johannes original creation time assignee ipopt team version i am trying to install ipopt with cygwin on windows to use it with matlab configure and make test works but then the c test fails when typing make test johannes at johannes pc coinipopt make test making all in thirdparty blas entering directory home johannes coinipopt thirdparty blas nothing to be done for all leaving directory home johannes coinipopt thirdparty blas making all in thirdparty lapack entering directory home johannes coinipopt thirdparty lapack nothing to be done for all leaving directory home johannes coinipopt thirdparty lapack making all in thirdparty metis entering directory home johannes coinipopt thirdparty metis nothing to be done for all leaving directory home johannes coinipopt thirdparty metis making all in thirdparty hsl entering directory home johannes coinipopt thirdparty hsl nothing to be done for all leaving directory home johannes coinipopt thirdparty hsl making all in thirdparty asl entering directory home johannes coinipopt thirdparty asl nothing to be done for all leaving directory home johannes coinipopt thirdparty asl making all in thirdparty mumps entering directory home johannes coinipopt thirdparty mumps nothing to be done for all leaving directory home johannes coinipopt thirdparty mumps making all in ipopt entering directory home johannes coinipopt ipopt making all in src common entering directory home johannes coinipopt ipopt src common nothing to be done for all leaving directory home johannes coinipopt ipopt src common making all in src linalg entering directory home johannes coinipopt ipopt src linalg making all in tmatrices make entering directory home johannes coinipopt ipopt src linalg tmatrices make nothing to be done for all make leaving directory home johannes coinipopt ipopt src linalg tmatrices make entering directory home johannes coinipopt ipopt src linalg make nothing to be done for all am make leaving directory home johannes coinipopt ipopt src linalg leaving directory home johannes coinipopt ipopt src linalg making all in src algorithm entering directory home johannes coinipopt ipopt src algorithm making all in linearsolvers make entering directory home johannes coinipopt ipopt src algorithm linear solvers make nothing to be done for all make leaving directory home johannes coinipopt ipopt src algorithm linears olvers making all in inexact make entering directory home johannes coinipopt ipopt src algorithm inexac t make nothing to be done for all make leaving directory home johannes coinipopt ipopt src algorithm inexact make entering directory home johannes coinipopt ipopt src algorithm make nothing to be done for all am make leaving directory home johannes coinipopt ipopt src algorithm leaving directory home johannes coinipopt ipopt src algorithm making all in src contrib cgpenalty entering directory home johannes coinipopt ipopt src contrib cgpenalt y nothing to be done for all leaving directory home johannes coinipopt ipopt src contrib cgpenalty making all in src contrib linearsolverloader entering directory home johannes coinipopt ipopt src contrib linearso lverloader nothing to be done for all leaving directory home johannes coinipopt ipopt src contrib linearsol verloader making all in src interfaces entering directory home johannes coinipopt ipopt src interfaces nothing to be done for all leaving directory home johannes coinipopt ipopt src interfaces making all in src apps entering directory home johannes coinipopt ipopt src apps making all in cuterinterface make entering directory home johannes coinipopt ipopt src apps cuterinterf ace make nothing to be done for all make leaving directory home johannes coinipopt ipopt src apps cuterinterfa ce making all in amplsolver make entering directory home johannes coinipopt ipopt src apps amplsolver make nothing to be done for all make leaving directory home johannes coinipopt ipopt src apps amplsolver make entering directory home johannes coinipopt ipopt src apps make nothing to be done for all am make leaving directory home johannes coinipopt ipopt src apps leaving directory home johannes coinipopt ipopt src apps entering directory home johannes coinipopt ipopt nothing to be done for all am leaving directory home johannes coinipopt ipopt leaving directory home johannes coinipopt ipopt entering directory home johannes coinipopt nothing to be done for all am leaving directory home johannes coinipopt cd ipopt make test entering directory home johannes coinipopt ipopt making all in src common entering directory home johannes coinipopt ipopt src common nothing to be done for all leaving directory home johannes coinipopt ipopt src common making all in src linalg entering directory home johannes coinipopt ipopt src linalg making all in tmatrices make entering directory home johannes coinipopt ipopt src linalg tmatrices make nothing to be done for all make leaving directory home johannes coinipopt ipopt src linalg tmatrices make entering directory home johannes coinipopt ipopt src linalg make nothing to be done for all am make leaving directory home johannes coinipopt ipopt src linalg leaving directory home johannes coinipopt ipopt src linalg making all in src algorithm entering directory home johannes coinipopt ipopt src algorithm making all in linearsolvers make entering directory home johannes coinipopt ipopt src algorithm linear solvers make nothing to be done for all make leaving directory home johannes coinipopt ipopt src algorithm linears olvers making all in inexact make entering directory home johannes coinipopt ipopt src algorithm inexac t make nothing to be done for all make leaving directory home johannes coinipopt ipopt src algorithm inexact make entering directory home johannes coinipopt ipopt src algorithm make nothing to be done for all am make leaving directory home johannes coinipopt ipopt src algorithm leaving directory home johannes coinipopt ipopt src algorithm making all in src contrib cgpenalty entering directory home johannes coinipopt ipopt src contrib cgpenalt y nothing to be done for all leaving directory home johannes coinipopt ipopt src contrib cgpenalty making all in src contrib linearsolverloader entering directory home johannes coinipopt ipopt src contrib linearso lverloader nothing to be done for all leaving directory home johannes coinipopt ipopt src contrib linearsol verloader making all in src interfaces entering directory home johannes coinipopt ipopt src interfaces nothing to be done for all leaving directory home johannes coinipopt ipopt src interfaces making all in src apps entering directory home johannes coinipopt ipopt src apps making all in cuterinterface make entering directory home johannes coinipopt ipopt src apps cuterinterf ace make nothing to be done for all make leaving directory home johannes coinipopt ipopt src apps cuterinterfa ce making all in amplsolver make entering directory home johannes coinipopt ipopt src apps amplsolver make nothing to be done for all make leaving directory home johannes coinipopt ipopt src apps amplsolver make entering directory home johannes coinipopt ipopt src apps make nothing to be done for all am make leaving directory home johannes coinipopt ipopt src apps leaving directory home johannes coinipopt ipopt src apps entering directory home johannes coinipopt ipopt nothing to be done for all am leaving directory home johannes coinipopt ipopt cd test make test entering directory home johannes coinipopt ipopt test if g dhave config h i i cygpath w i inc i cygpath w src com mon i cygpath w src linalg i cygpath w src linalg tmatrices i cygpath w src algorithm i cygpath w src interfaces fomit f rame pointer pipe dndebug pedantic errors wimplicit wparentheses wreturn t ype wcast qual wall wpointer arith wwrite strings wconversion wno unknown pragmas mt main o md mp mf deps main tpo c o main o main cpp then mv f deps main tpo deps main po else rm f de ps main tpo exit fi if g dhave config h i i cygpath w i inc i cygpath w src com mon i cygpath w src linalg i cygpath w src linalg tmatrices i cygpath w src algorithm i cygpath w src interfaces fomit f rame pointer pipe dndebug pedantic errors wimplicit wparentheses wreturn t ype wcast qual wall wpointer arith wwrite strings wconversion wno unknown pragmas mt nlp o md mp mf deps nlp tpo c o nlp o hs nlp cpp then mv f deps nlp tpo deps nlp po else rm f deps nlp tpo exit fi bin sh libtool tag cxx mode link g fomit frame pointer pipe dndebug pedantic errors wimplicit wparentheses wreturn type wcast qual wa ll wpointer arith wwrite strings wconversion wno unknown pragmas o cpp exe main o nlp o src interfaces libipopt la lm ldl l usr lib gcc pc cygwin l usr lib gcc pc cygwin l frtbegin lcygwin ldl mkdir libs libtool link warning library usr lib gcc pc cygwin la wa s moved libtool link warning library usr lib gcc pc cygwin la wa s moved g fomit frame pointer pipe dndebug pedantic errors wimplicit wparent heses wreturn type wcast qual wall wpointer arith wwrite strings wconversi on wno unknown pragmas o cpp exe main o nlp o src interf aces libs libipopt a l usr lib gcc pc cygwin l usr lib gcc pc cygwin lfrtbegin usr lib gcc pc cygwin a lc ygwin ldl if gcc dhave config h i i cygpath w i inc i cygpath w src com mon i cygpath w src linalg i cygpath w src linalg tmatrices i cygpath w src algorithm i cygpath w src interfaces fomit f rame pointer pipe dndebug wimplicit wparentheses wsequence point wreturn type wcast qual wall wno unknown pragmas mt c o md mp mf deps c tpo c o c o c c then mv f deps c tpo deps c po else rm f deps c tpo exit fi bin sh libtool tag cc mode link gcc fomit frame pointer pipe dndebug wimplicit wparentheses wsequence point wreturn type wcast qual wa ll wno unknown pragmas o c exe c o src interfaces libipopt l a lm ldl l usr lib gcc pc cygwin l usr lib gcc pc cygwin lfrtbegin lcygwin lstdc lm ldl libtool link warning library usr lib gcc pc cygwin la wa s moved libtool link warning library usr lib gcc pc cygwin la wa s moved gcc fomit frame pointer pipe dndebug wimplicit wparentheses wsequence point wreturn type wcast qual wall wno unknown pragmas o c exe c o src interfaces libs libipopt a l usr lib gcc pc cygwin l u sr lib gcc pc cygwin lfrtbegin usr lib gcc pc cygwin a usr lib gcc pc cygwin libstdc a l managed gcc buil d final bootstrap gcc build pc cygwin libstdc src l man aged gcc build final bootstrap gcc build pc cygwin libstdc src libs l managed gcc build final bootstrap gcc build gcc l usr pc cygwin bin lcygwin lgcc ldl ln s examples f f f f f i cygpath w src interfaces fomit frame pointer pipe c o hs f o f f bin sh libtool tag mode link i cygpath w src interface s fomit frame pointer pipe o f exe f o src interface s libipopt la lm ldl lstdc lm ldl ic cygwin home johannes coinipopt ipopt src interfaces fomit f rame pointer pipe o f exe f o src interfaces libs libipopt a lstdc ldl chmod u x run unittests run unittests running unittests testing ampl solver executable test passed testing c example test passed testing c example run unittests line segmentation fault core dumped c tmpfile start of test program output end of test program output test failed output of the test program is above testing fortran example test passed error leaving directory home johannes coinipopt ipopt test error leaving directory home johannes coinipopt ipopt make error
| 1
|
405,045
| 27,501,954,767
|
IssuesEvent
|
2023-03-05 19:50:09
|
gravitational/teleport
|
https://api.github.com/repos/gravitational/teleport
|
opened
|
ERROR: auth_server is supported from config version v3 onwards
|
documentation
|
## Applies To
Version 12.x of https://goteleport.com/docs/deploy-a-cluster/helm-deployments/custom/
## Details
Following the steps of the above documentation does not lead to a successful Helm installation for me. I get the error put in the title of this issue thrown by the `teleport-proxy` pod.
I have been trying both some custom Helm values and the Helm values of the documentation, and both lead to the same error.
```
apohlmann @ loki teleport ./install.sh
Release "teleport" has been upgraded. Happy Helming!
NAME: teleport
LAST DEPLOYED: Sun Mar 5 11:32:38 2023
NAMESPACE: teleport
STATUS: deployed
REVISION: 2
TEST SUITE: None
NOTES:
apohlmann @ loki teleport kgp
NAME READY STATUS RESTARTS AGE
teleport-auth-6f9987466f-wkgkd 0/1 Running 0 11s
teleport-proxy-d8ddf5454-l5bff 0/1 CrashLoopBackOff 1 (7s ago) 11s
apohlmann @ loki teleport k logs teleport-proxy-d8ddf5454-l5bff
ERROR: auth_server is supported from config version v3 onwards
```
where the content of my `install.sh` script is:
```
#!/usr/bin/env bash
# kubectl create namespace teleport
# kubectl label namespace teleport 'pod-security.kubernetes.io/enforce=baseline'
helm upgrade --install teleport teleport/teleport-cluster \
--create-namespace \
--namespace=teleport \
--values values.yaml
```
I'm not sure if I did something wrong or if there's something missing in the documentation (also not sure how it's possible to specify the config version via Helm values).
|
1.0
|
ERROR: auth_server is supported from config version v3 onwards - ## Applies To
Version 12.x of https://goteleport.com/docs/deploy-a-cluster/helm-deployments/custom/
## Details
Following the steps of the above documentation does not lead to a successful Helm installation for me. I get the error put in the title of this issue thrown by the `teleport-proxy` pod.
I have been trying both some custom Helm values and the Helm values of the documentation, and both lead to the same error.
```
apohlmann @ loki teleport ./install.sh
Release "teleport" has been upgraded. Happy Helming!
NAME: teleport
LAST DEPLOYED: Sun Mar 5 11:32:38 2023
NAMESPACE: teleport
STATUS: deployed
REVISION: 2
TEST SUITE: None
NOTES:
apohlmann @ loki teleport kgp
NAME READY STATUS RESTARTS AGE
teleport-auth-6f9987466f-wkgkd 0/1 Running 0 11s
teleport-proxy-d8ddf5454-l5bff 0/1 CrashLoopBackOff 1 (7s ago) 11s
apohlmann @ loki teleport k logs teleport-proxy-d8ddf5454-l5bff
ERROR: auth_server is supported from config version v3 onwards
```
where the content of my `install.sh` script is:
```
#!/usr/bin/env bash
# kubectl create namespace teleport
# kubectl label namespace teleport 'pod-security.kubernetes.io/enforce=baseline'
helm upgrade --install teleport teleport/teleport-cluster \
--create-namespace \
--namespace=teleport \
--values values.yaml
```
I'm not sure if I did something wrong or if there's something missing in the documentation (also not sure how it's possible to specify the config version via Helm values).
|
non_defect
|
error auth server is supported from config version onwards applies to version x of details following the steps of the above documentation does not lead to a successful helm installation for me i get the error put in the title of this issue thrown by the teleport proxy pod i have been trying both some custom helm values and the helm values of the documentation and both lead to the same error apohlmann loki teleport install sh release teleport has been upgraded happy helming name teleport last deployed sun mar namespace teleport status deployed revision test suite none notes apohlmann loki teleport kgp name ready status restarts age teleport auth wkgkd running teleport proxy crashloopbackoff ago apohlmann loki teleport k logs teleport proxy error auth server is supported from config version onwards where the content of my install sh script is usr bin env bash kubectl create namespace teleport kubectl label namespace teleport pod security kubernetes io enforce baseline helm upgrade install teleport teleport teleport cluster create namespace namespace teleport values values yaml i m not sure if i did something wrong or if there s something missing in the documentation also not sure how it s possible to specify the config version via helm values
| 0
|
19,860
| 5,951,444,396
|
IssuesEvent
|
2017-05-26 19:32:44
|
DumplingLTD/ExceptionNull
|
https://api.github.com/repos/DumplingLTD/ExceptionNull
|
closed
|
Review Implementation
|
Code
|
+ Code comments (disabled for free users)
+ Review/Score application (free vs premium)
|
1.0
|
Review Implementation - + Code comments (disabled for free users)
+ Review/Score application (free vs premium)
|
non_defect
|
review implementation code comments disabled for free users review score application free vs premium
| 0
|
70,528
| 23,219,598,642
|
IssuesEvent
|
2022-08-02 16:52:43
|
primefaces/primeng
|
https://api.github.com/repos/primefaces/primeng
|
opened
|
Component: Title
|
defect
|
### Describe the bug
Fieldset and accordion do not work and do not appear as they appear on the Official page

### Environment
Angular CLI: 12.1.0
Node: 14.20.0
Package Manager: npm 6.14.17
OS: win32 x64
Angular: 12.1.0
... animations, cdk, cli, common, compiler, compiler-cli, core
... forms, google-maps, language-service, platform-browser
... platform-browser-dynamic, router
Package Version
---------------------------------------------------------
@angular-devkit/architect 0.1201.0
@angular-devkit/build-angular 12.1.0
@angular-devkit/core 12.1.0
@angular-devkit/schematics 12.1.0
@schematics/angular 12.1.0
rxjs 6.6.2
typescript 4.2.3
### Reproducer
_No response_
### Angular version
12.1.0
### PrimeNG version
13.4.1
### Build / Runtime
Angular CLI App
### Language
TypeScript
### Node version (for AoT issues node --version)
14.20.0
### Browser(s)
Chrome
### Steps to reproduce the behavior
Just install with the above dependencies and add the minimum code.
```
<p-fieldset legend="Header">
Content
</p-fieldset>
```
```
<p-accordion>
<p-accordionTab header="Header 1">
Content 1
</p-accordionTab>
<p-accordionTab header="Header 2">
Content 2
</p-accordionTab>
<p-accordionTab header="Header 3">
Content 3
</p-accordionTab>
</p-accordion>
```
### Expected behavior
The expected behavior is to show the components as they are in the official website


|
1.0
|
Component: Title - ### Describe the bug
Fieldset and accordion do not work and do not appear as they appear on the Official page

### Environment
Angular CLI: 12.1.0
Node: 14.20.0
Package Manager: npm 6.14.17
OS: win32 x64
Angular: 12.1.0
... animations, cdk, cli, common, compiler, compiler-cli, core
... forms, google-maps, language-service, platform-browser
... platform-browser-dynamic, router
Package Version
---------------------------------------------------------
@angular-devkit/architect 0.1201.0
@angular-devkit/build-angular 12.1.0
@angular-devkit/core 12.1.0
@angular-devkit/schematics 12.1.0
@schematics/angular 12.1.0
rxjs 6.6.2
typescript 4.2.3
### Reproducer
_No response_
### Angular version
12.1.0
### PrimeNG version
13.4.1
### Build / Runtime
Angular CLI App
### Language
TypeScript
### Node version (for AoT issues node --version)
14.20.0
### Browser(s)
Chrome
### Steps to reproduce the behavior
Just install with the above dependencies and add the minimum code.
```
<p-fieldset legend="Header">
Content
</p-fieldset>
```
```
<p-accordion>
<p-accordionTab header="Header 1">
Content 1
</p-accordionTab>
<p-accordionTab header="Header 2">
Content 2
</p-accordionTab>
<p-accordionTab header="Header 3">
Content 3
</p-accordionTab>
</p-accordion>
```
### Expected behavior
The expected behavior is to show the components as they are in the official website


|
defect
|
component title describe the bug fieldset and accordion do not work and do not appear as they appear on the official page environment angular cli node package manager npm os angular animations cdk cli common compiler compiler cli core forms google maps language service platform browser platform browser dynamic router package version angular devkit architect angular devkit build angular angular devkit core angular devkit schematics schematics angular rxjs typescript reproducer no response angular version primeng version build runtime angular cli app language typescript node version for aot issues node version browser s chrome steps to reproduce the behavior just install with the above dependencies and add the minimum code content content content content expected behavior the expected behavior is to show the components as they are in the official website
| 1
|
58,068
| 16,342,382,614
|
IssuesEvent
|
2021-05-13 00:10:31
|
darshan-hpc/darshan
|
https://api.github.com/repos/darshan-hpc/darshan
|
closed
|
Darshan does not catch files opened with mkostemp()
|
defect wrapper libraries
|
In GitLab by @shanedsnyder on Sep 24, 2015, 16:26
The glibc mkostemp() function does not call the libc open() or create() calls, it instead issues an open64 system call directly at least in glibc 2.15.
This causes it to bypass the existing Darshan wrappers for open, and Darshan will not record any activity to that file.
This probably means that we need to add new wrappers for mkostemp (and its related functions, see man page). Need to check glibc implementation to confirm.
|
1.0
|
Darshan does not catch files opened with mkostemp() - In GitLab by @shanedsnyder on Sep 24, 2015, 16:26
The glibc mkostemp() function does not call the libc open() or create() calls, it instead issues an open64 system call directly at least in glibc 2.15.
This causes it to bypass the existing Darshan wrappers for open, and Darshan will not record any activity to that file.
This probably means that we need to add new wrappers for mkostemp (and its related functions, see man page). Need to check glibc implementation to confirm.
|
defect
|
darshan does not catch files opened with mkostemp in gitlab by shanedsnyder on sep the glibc mkostemp function does not call the libc open or create calls it instead issues an system call directly at least in glibc this causes it to bypass the existing darshan wrappers for open and darshan will not record any activity to that file this probably means that we need to add new wrappers for mkostemp and its related functions see man page need to check glibc implementation to confirm
| 1
|
24,154
| 3,917,814,582
|
IssuesEvent
|
2016-04-21 09:48:57
|
OlivierLD/weatherwizard
|
https://api.github.com/repos/OlivierLD/weatherwizard
|
closed
|
Dialog box remains after the composite is loaded.
|
auto-migrated Priority-Medium Type-Defect
|
```
When opening a Composite, the Dialog Box shows up at the end of the reload, and
stays there.
*Work around*: Hit the "Hide" button.
```
Original issue reported on code.google.com by `olivier.lediouris@gmail.com` on 19 Sep 2012 at 1:57
|
1.0
|
Dialog box remains after the composite is loaded. - ```
When opening a Composite, the Dialog Box shows up at the end of the reload, and
stays there.
*Work around*: Hit the "Hide" button.
```
Original issue reported on code.google.com by `olivier.lediouris@gmail.com` on 19 Sep 2012 at 1:57
|
defect
|
dialog box remains after the composite is loaded when opening a composite the dialog box shows up at the end of the reload and stays there work around hit the hide button original issue reported on code google com by olivier lediouris gmail com on sep at
| 1
|
406,111
| 11,886,994,300
|
IssuesEvent
|
2020-03-27 23:44:28
|
zulip/zulip
|
https://api.github.com/repos/zulip/zulip
|
closed
|
emails: Make `ScheduledEmail` code path convert sender into email address in send_email
|
area: emails bug priority: high
|
This is a somewhat important bug fix for how Zulip's "schedule an email to send in 2 days" code path handles changes in the outgoing email configuration in the meantime, which can be quite confusing/difficult for a sysadmin trying out Zulip for the first time. See #10879 for details.
The root problem in #10879 was basically that ScheduledEmail objects end up setting the FromAddress value on creation, not at the time they're sent, which means that if you change settings (to fix your outgoing email), we don't actually have a way to update that state for the email system.
Possibly what we want to do is refactor the send_future_email / send_email interface to have e.g. "NOREPLY" or "TOKENIZED_NOREPLY" be what is included in the email saved in the ScheduledEmail table, and have the substitution for actually sending it happen inside manage.py deliver_email. That would be in line with our general strategy of e.g. rendering the email templates inside the final send_email function.
This requires a bit of refactoring:
* First, change how tokenized noreply is encoded to have the randomization/substitution happen inside `send_email`, while preserving backwards compatibility. Probably the right model is to have a new `sender_type` argument to `send_email`, and require either a sender email or a `sender_type`. If `sender_type` was provided, then we compute the email for that sender type inside `send_email`.
* Then, make `send_future_email` use `sender_type`.
|
1.0
|
emails: Make `ScheduledEmail` code path convert sender into email address in send_email - This is a somewhat important bug fix for how Zulip's "schedule an email to send in 2 days" code path handles changes in the outgoing email configuration in the meantime, which can be quite confusing/difficult for a sysadmin trying out Zulip for the first time. See #10879 for details.
The root problem in #10879 was basically that ScheduledEmail objects end up setting the FromAddress value on creation, not at the time they're sent, which means that if you change settings (to fix your outgoing email), we don't actually have a way to update that state for the email system.
Possibly what we want to do is refactor the send_future_email / send_email interface to have e.g. "NOREPLY" or "TOKENIZED_NOREPLY" be what is included in the email saved in the ScheduledEmail table, and have the substitution for actually sending it happen inside manage.py deliver_email. That would be in line with our general strategy of e.g. rendering the email templates inside the final send_email function.
This requires a bit of refactoring:
* First, change how tokenized noreply is encoded to have the randomization/substitution happen inside `send_email`, while preserving backwards compatibility. Probably the right model is to have a new `sender_type` argument to `send_email`, and require either a sender email or a `sender_type`. If `sender_type` was provided, then we compute the email for that sender type inside `send_email`.
* Then, make `send_future_email` use `sender_type`.
|
non_defect
|
emails make scheduledemail code path convert sender into email address in send email this is a somewhat important bug fix for how zulip s schedule an email to send in days code path handles changes in the outgoing email configuration in the meantime which can be quite confusing difficult for a sysadmin trying out zulip for the first time see for details the root problem in was basically that scheduledemail objects end up setting the fromaddress value on creation not at the time they re sent which means that if you change settings to fix your outgoing email we don t actually have a way to update that state for the email system possibly what we want to do is refactor the send future email send email interface to have e g noreply or tokenized noreply be what is included in the email saved in the scheduledemail table and have the substitution for actually sending it happen inside manage py deliver email that would be in line with our general strategy of e g rendering the email templates inside the final send email function this requires a bit of refactoring first change how tokenized noreply is encoded to have the randomization substitution happen inside send email while preserving backwards compatibility probably the right model is to have a new sender type argument to send email and require either a sender email or a sender type if sender type was provided then we compute the email for that sender type inside send email then make send future email use sender type
| 0
|
322,616
| 23,915,915,081
|
IssuesEvent
|
2022-09-09 12:41:41
|
Automattic/woocommerce-payments
|
https://api.github.com/repos/Automattic/woocommerce-payments
|
opened
|
[Critical flows] Review Onboarding area tests
|
type: documentation
|
### Description
One of our goals is to reduce the number of incidents in production, and part of doing that is to ensure our [critical flows](https://github.com/Automattic/woocommerce-payments/wiki/Critical-flows) are up to date and still relevant, as these are checked every week by GlobalStep during the release cycles.
This issue is focused on the `Onboarding` area:
| User type | Area | Flow Name
|-----------|----------------- |-----------------------------------------
| Merchant | Onboarding | Onboard via WooCommerce setup wizard
| Merchant | Onboarding | Onboard via WooCommerce tasks list
| Merchant | Onboarding | Manual plugin installation and setup
| Merchant | Onboarding | [Multi site] Manual plugin installation and setup
| Merchant | Onboarding | ~Plugin update (via plugins page)~
| Merchant | Onboarding | ~Switch from dev to live account~
### Acceptance criteria
- Any test that is not relevant or critical anymore is removed from the list
- Any test that is still relevant or critical is up to date with the latest version of WCPay
- Screenshots
- Wording
- Potential new steps since it was last written
Bonus point if you see an obvious test missing for this area and add it to the list.
|
1.0
|
[Critical flows] Review Onboarding area tests - ### Description
One of our goals is to reduce the number of incidents in production, and part of doing that is to ensure our [critical flows](https://github.com/Automattic/woocommerce-payments/wiki/Critical-flows) are up to date and still relevant, as these are checked every week by GlobalStep during the release cycles.
This issue is focused on the `Onboarding` area:
| User type | Area | Flow Name
|-----------|----------------- |-----------------------------------------
| Merchant | Onboarding | Onboard via WooCommerce setup wizard
| Merchant | Onboarding | Onboard via WooCommerce tasks list
| Merchant | Onboarding | Manual plugin installation and setup
| Merchant | Onboarding | [Multi site] Manual plugin installation and setup
| Merchant | Onboarding | ~Plugin update (via plugins page)~
| Merchant | Onboarding | ~Switch from dev to live account~
### Acceptance criteria
- Any test that is not relevant or critical anymore is removed from the list
- Any test that is still relevant or critical is up to date with the latest version of WCPay
- Screenshots
- Wording
- Potential new steps since it was last written
Bonus point if you see an obvious test missing for this area and add it to the list.
|
non_defect
|
review onboarding area tests description one of our goals is to reduce the number of incidents in production and part of doing that is to ensure our are up to date and still relevant as these are checked every week by globalstep during the release cycles this issue is focused on the onboarding area user type area flow name merchant onboarding onboard via woocommerce setup wizard merchant onboarding onboard via woocommerce tasks list merchant onboarding manual plugin installation and setup merchant onboarding manual plugin installation and setup merchant onboarding plugin update via plugins page merchant onboarding switch from dev to live account acceptance criteria any test that is not relevant or critical anymore is removed from the list any test that is still relevant or critical is up to date with the latest version of wcpay screenshots wording potential new steps since it was last written bonus point if you see an obvious test missing for this area and add it to the list
| 0
|
238,573
| 18,245,035,413
|
IssuesEvent
|
2021-10-01 17:11:58
|
Agrover112/fliscopt
|
https://api.github.com/repos/Agrover112/fliscopt
|
opened
|
Add docstrings for each algorithm file.
|
documentation good first issue Hacktoberfest
|
There are few files which contain implementation of the algorithms:
[**sa,ga,hc,chaining,rs].py** you need to write the docstring for the classes present in each file using Google Docstring Convention.
Refer: algorithms.py on how to write docstrings for functions and similarily refer [THIS](https://gist.github.com/redlotus/3bc387c2591e3e908c9b63b97b11d24e) and [THIS link](https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html) for on getting started on how to write docstrings for the classes.
|
1.0
|
Add docstrings for each algorithm file. - There are few files which contain implementation of the algorithms:
[**sa,ga,hc,chaining,rs].py** you need to write the docstring for the classes present in each file using Google Docstring Convention.
Refer: algorithms.py on how to write docstrings for functions and similarily refer [THIS](https://gist.github.com/redlotus/3bc387c2591e3e908c9b63b97b11d24e) and [THIS link](https://www.sphinx-doc.org/en/master/usage/extensions/example_google.html) for on getting started on how to write docstrings for the classes.
|
non_defect
|
add docstrings for each algorithm file there are few files which contain implementation of the algorithms py you need to write the docstring for the classes present in each file using google docstring convention refer algorithms py on how to write docstrings for functions and similarily refer and for on getting started on how to write docstrings for the classes
| 0
|
274,005
| 23,802,340,159
|
IssuesEvent
|
2022-09-03 13:49:39
|
red/red
|
https://api.github.com/repos/red/red
|
closed
|
/part argument to mold and form overshoots on system object and dates
|
status.built status.tested type.bug test.written
|
**Describe the bug**
```
;) correct:
>> mold/part [a b c] 3
== "[a "
>> mold/part object [a: "b c"] 3
== "mak"
;) buggy:
>> mold/part system 3
== "make object! [^/ ver"
>> length? mold/part system 3
== 22
>> mold/part system/words 3
== "make object! [^/ dat"
>> length? mold/part system/words 3
== 22
>> mold/part system/lexer 3
== "make"
>> length? mold/part system/lexer 3
== 4
>> mold/part system/build 3
== "make object!"
>> length? mold/part system/build 3
== 12
>> form/part system 3
== "version: 0.6.4^/build: "
>> length? form/part system 3
== 22
>> form/part system/words 3
== "datatype!: datatype!^/u"
>> length? form/part system/words 3
== 22
>> form/part system/lexer 3
== "pre-"
>> length? form/part system/lexer 3
== 4
>> form/part system/build 3
== "date: 19-Feb"
>> length? form/part system/build 3
== 12
```
**To reproduce**
```
mold/part system 3
length? mold/part system 3
mold/part system/words 3
length? mold/part system/words 3
mold/part system/lexer 3
length? mold/part system/lexer 3
mold/part system/build 3
length? mold/part system/build 3
form/part system 3
length? form/part system 3
form/part system/words 3
length? form/part system/words 3
form/part system/lexer 3
length? form/part system/lexer 3
form/part system/build 3
length? form/part system/build 3
```
**Expected behavior**
All results should be 3 chars long.
**Platform version (please complete the following information)**
```
Red 0.6.4 for Windows built 19-Feb-2020/21:37:44+03:00
```
|
2.0
|
/part argument to mold and form overshoots on system object and dates - **Describe the bug**
```
;) correct:
>> mold/part [a b c] 3
== "[a "
>> mold/part object [a: "b c"] 3
== "mak"
;) buggy:
>> mold/part system 3
== "make object! [^/ ver"
>> length? mold/part system 3
== 22
>> mold/part system/words 3
== "make object! [^/ dat"
>> length? mold/part system/words 3
== 22
>> mold/part system/lexer 3
== "make"
>> length? mold/part system/lexer 3
== 4
>> mold/part system/build 3
== "make object!"
>> length? mold/part system/build 3
== 12
>> form/part system 3
== "version: 0.6.4^/build: "
>> length? form/part system 3
== 22
>> form/part system/words 3
== "datatype!: datatype!^/u"
>> length? form/part system/words 3
== 22
>> form/part system/lexer 3
== "pre-"
>> length? form/part system/lexer 3
== 4
>> form/part system/build 3
== "date: 19-Feb"
>> length? form/part system/build 3
== 12
```
**To reproduce**
```
mold/part system 3
length? mold/part system 3
mold/part system/words 3
length? mold/part system/words 3
mold/part system/lexer 3
length? mold/part system/lexer 3
mold/part system/build 3
length? mold/part system/build 3
form/part system 3
length? form/part system 3
form/part system/words 3
length? form/part system/words 3
form/part system/lexer 3
length? form/part system/lexer 3
form/part system/build 3
length? form/part system/build 3
```
**Expected behavior**
All results should be 3 chars long.
**Platform version (please complete the following information)**
```
Red 0.6.4 for Windows built 19-Feb-2020/21:37:44+03:00
```
|
non_defect
|
part argument to mold and form overshoots on system object and dates describe the bug correct mold part a mold part object mak buggy mold part system make object ver length mold part system mold part system words make object dat length mold part system words mold part system lexer make length mold part system lexer mold part system build make object length mold part system build form part system version build length form part system form part system words datatype datatype u length form part system words form part system lexer pre length form part system lexer form part system build date feb length form part system build to reproduce mold part system length mold part system mold part system words length mold part system words mold part system lexer length mold part system lexer mold part system build length mold part system build form part system length form part system form part system words length form part system words form part system lexer length form part system lexer form part system build length form part system build expected behavior all results should be chars long platform version please complete the following information red for windows built feb
| 0
|
253,445
| 19,101,411,191
|
IssuesEvent
|
2021-11-29 23:09:59
|
DenisKozarezov/OkKi_Project
|
https://api.github.com/repos/DenisKozarezov/OkKi_Project
|
opened
|
:sparkles: **Приорититазия целей и задач (Крупенко, Петруша)**
|
documentation
|
- [ ] Проработать канбан-доску
- [ ] Отслеживать выполнение тех или иных задач, обновлять статусы задач
- [ ] Заполнить таблицы в отчетах к лабораторным работам
- [ ] Докладывать об изменении состояний задач Владельцу продукта
|
1.0
|
:sparkles: **Приорититазия целей и задач (Крупенко, Петруша)** - - [ ] Проработать канбан-доску
- [ ] Отслеживать выполнение тех или иных задач, обновлять статусы задач
- [ ] Заполнить таблицы в отчетах к лабораторным работам
- [ ] Докладывать об изменении состояний задач Владельцу продукта
|
non_defect
|
sparkles приорититазия целей и задач крупенко петруша проработать канбан доску отслеживать выполнение тех или иных задач обновлять статусы задач заполнить таблицы в отчетах к лабораторным работам докладывать об изменении состояний задач владельцу продукта
| 0
|
56,631
| 15,239,181,537
|
IssuesEvent
|
2021-02-19 03:48:18
|
m01ly/m01ly.github.io
|
https://api.github.com/repos/m01ly/m01ly.github.io
|
opened
|
DefectDojo安装与使用 | Hexo
|
/2021/01/21/esc-DefectDojo/ Gitalk
|
https://m01ly.github.io/2021/01/21/esc-DefectDojo/
1 前期准备1.1 官方文档github地址:https://github.com/DefectDojo/django-DefectDojo 官方文档:https://defectdojo.readthedocs.io/en/latest/about.html 1.2 环境版本1.2.1 docker-compose使用docker-compose进行安装至少需要docker 18.09.4和do
|
1.0
|
DefectDojo安装与使用 | Hexo - https://m01ly.github.io/2021/01/21/esc-DefectDojo/
1 前期准备1.1 官方文档github地址:https://github.com/DefectDojo/django-DefectDojo 官方文档:https://defectdojo.readthedocs.io/en/latest/about.html 1.2 环境版本1.2.1 docker-compose使用docker-compose进行安装至少需要docker 18.09.4和do
|
defect
|
defectdojo安装与使用 hexo 官方文档github地址 官方文档 docker compose使用docker compose进行安装至少需要docker
| 1
|
78,245
| 27,389,988,873
|
IssuesEvent
|
2023-02-28 15:41:23
|
openzfs/zfs
|
https://api.github.com/repos/openzfs/zfs
|
opened
|
ARC is at max size even when user application is thrashing swap
|
Type: Defect
|
<!-- Please fill out the following template, which will help other contributors address your issue. -->
<!--
Thank you for reporting an issue.
*IMPORTANT* - Please check our issue tracker before opening a new issue.
Additional valuable information can be found in the OpenZFS documentation
and mailing list archives.
Please fill in as much of the template as possible.
-->
### System information
<!-- add version after "|" character -->
Type | Version/Name
--- | ---
Distribution Name| Fedora
Distribution Version | 36
Kernel Version | 6.1.11-100.fc36.x86_64
Architecture | x86_64
OpenZFS Version |2.1.9-1
<!--
Command to find OpenZFS version:
zfs version
Commands to find kernel version:
uname -r # Linux
freebsd-version -r # FreeBSD
-->
### Describe the problem you're observing
ARC is at max size even when swap is being used. User application is spending significant time in kernel space.
### Describe how to reproduce the problem
Run my python program (image processing) which uses heavy memory.
### Include any warning/errors/backtraces from the system logs
<!--
*IMPORTANT* - Please mark logs and text output from terminal commands
or else Github will not display them correctly.
An example is provided below.
Example:
```
this is an example how log text should be marked (wrap it with ```)
```
-->
arc_summary:
<details>
```
------------------------------------------------------------------------
ZFS Subsystem Report Tue Feb 28 10:33:50 2023
Linux 6.1.11-100.fc36.x86_64 2.1.9-1
Machine: hidden (x86_64) 2.1.9-1
ARC status: HEALTHY
Memory throttle count: 0
ARC size (current): 100.0 % 62.8 GiB
Target size (adaptive): 100.0 % 62.8 GiB
Min size (hard limit): 1.5 % 953.7 MiB
Max size (high water): 67:1 62.8 GiB
Most Frequently Used (MFU) cache size: 56.4 % 33.6 GiB
Most Recently Used (MRU) cache size: 43.6 % 26.0 GiB
Metadata cache size (hard limit): 75.0 % 47.1 GiB
Metadata cache size (current): 19.8 % 9.3 GiB
Dnode cache size (hard limit): 10.0 % 4.7 GiB
Dnode cache size (current): 1.6 % 79.4 MiB
ARC hash breakdown:
Elements max: 32.9M
Elements current: 91.6 % 30.2M
Collisions: 65.5M
Chain max: 13
Chains: 9.0M
ARC misc:
Deleted: 13.0M
Mutex misses: 15.0k
Eviction skips: 3.9k
Eviction skips due to L2 writes: 0
L2 cached evictions: 1.1 TiB
L2 eligible evictions: 4.3 GiB
L2 eligible MFU evictions: 2.5 % 107.7 MiB
L2 eligible MRU evictions: 97.5 % 4.1 GiB
L2 ineligible evictions: 543.5 GiB
ARC total accesses (hits + misses): 1.5G
Cache hit ratio: 99.4 % 1.5G
Cache miss ratio: 0.6 % 9.2M
Actual hit ratio (MFU + MRU hits): 99.3 % 1.5G
Data demand efficiency: 99.2 % 171.9M
Data prefetch efficiency: 1.5 % 6.0M
Cache hits by cache type:
Most frequently used (MFU): 97.6 % 1.5G
Most recently used (MRU): 2.4 % 35.6M
Most frequently used (MFU) ghost: 0.1 % 840.8k
Most recently used (MRU) ghost: < 0.1 % 653.9k
Cache hits by data type:
Demand data: 11.4 % 170.5M
Prefetch data: < 0.1 % 89.3k
Demand metadata: 88.5 % 1.3G
Prefetch metadata: 0.1 % 1.7M
Cache misses by data type:
Demand data: 14.7 % 1.4M
Prefetch data: 64.7 % 5.9M
Demand metadata: 17.7 % 1.6M
Prefetch metadata: 3.0 % 274.1k
DMU prefetch efficiency: 93.5M
Hit ratio: 34.2 % 31.9M
Miss ratio: 65.8 % 61.5M
L2ARC status: HEALTHY
Low memory aborts: 528
Free on write: 230
R/W clashes: 0
Bad checksums: 0
I/O errors: 0
L2ARC size (adaptive): 3.0 TiB
Compressed: 78.5 % 2.3 TiB
Header size: 0.1 % 2.6 GiB
MFU allocated size: 31.6 % 754.6 GiB
MRU allocated size: 68.4 % 1.6 TiB
Prefetch allocated size: 0.1 % 2.1 GiB
Data (buffer content) allocated size: 98.3 % 2.3 TiB
Metadata (buffer content) allocated size: 1.8 % 43.6 GiB
L2ARC breakdown: 9.2M
Hit ratio: 78.8 % 7.2M
Miss ratio: 21.2 % 1.9M
Feeds: 656.6k
L2ARC writes:
Writes sent: 100 % 191.6k
L2ARC evicts:
Lock retries: 44
Upon reading: 0
Solaris Porting Layer (SPL):
spl_hostid 0
spl_hostid_path /etc/hostid
spl_kmem_alloc_max 1048576
spl_kmem_alloc_warn 65536
spl_kmem_cache_kmem_threads 4
spl_kmem_cache_magazine_size 0
spl_kmem_cache_max_size 32
spl_kmem_cache_obj_per_slab 8
spl_kmem_cache_reclaim 0
spl_kmem_cache_slab_limit 16384
spl_max_show_tasks 512
spl_panic_halt 0
spl_schedule_hrtimeout_slack_us 0
spl_taskq_kick 0
spl_taskq_thread_bind 0
spl_taskq_thread_dynamic 1
spl_taskq_thread_priority 1
spl_taskq_thread_sequential 4
Tunables:
dbuf_cache_hiwater_pct 10
dbuf_cache_lowater_pct 10
dbuf_cache_max_bytes 18446744073709551615
dbuf_cache_shift 5
dbuf_metadata_cache_max_bytes 18446744073709551615
dbuf_metadata_cache_shift 6
dmu_object_alloc_chunk_shift 7
dmu_prefetch_max 134217728
ignore_hole_birth 1
l2arc_exclude_special 0
l2arc_feed_again 1
l2arc_feed_min_ms 200
l2arc_feed_secs 1
l2arc_headroom 0
l2arc_headroom_boost 200
l2arc_meta_percent 33
l2arc_mfuonly 0
l2arc_noprefetch 0
l2arc_norw 0
l2arc_rebuild_blocks_min_l2size 1073741824
l2arc_rebuild_enabled 1
l2arc_trim_ahead 0
l2arc_write_boost 134217728
l2arc_write_max 134217728
metaslab_aliquot 1048576
metaslab_bias_enabled 1
metaslab_debug_load 0
metaslab_debug_unload 0
metaslab_df_max_search 16777216
metaslab_df_use_largest_segment 0
metaslab_force_ganging 16777217
metaslab_fragmentation_factor_enabled 1
metaslab_lba_weighting_enabled 1
metaslab_preload_enabled 1
metaslab_unload_delay 32
metaslab_unload_delay_ms 600000
send_holes_without_birth_time 1
spa_asize_inflation 24
spa_config_path /etc/zfs/zpool.cache
spa_load_print_vdev_tree 0
spa_load_verify_data 1
spa_load_verify_metadata 1
spa_load_verify_shift 4
spa_slop_shift 5
vdev_file_logical_ashift 9
vdev_file_physical_ashift 9
vdev_removal_max_span 32768
vdev_validate_skip 0
zap_iterate_prefetch 1
zfetch_array_rd_sz 1048576
zfetch_max_distance 67108864
zfetch_max_idistance 67108864
zfetch_max_sec_reap 2
zfetch_max_streams 8
zfetch_min_distance 4194304
zfetch_min_sec_reap 1
zfs_abd_scatter_enabled 1
zfs_abd_scatter_max_order 10
zfs_abd_scatter_min_size 1536
zfs_admin_snapshot 0
zfs_allow_redacted_dataset_mount 0
zfs_arc_average_blocksize 8192
zfs_arc_dnode_limit 0
zfs_arc_dnode_limit_percent 10
zfs_arc_dnode_reduce_percent 10
zfs_arc_evict_batch_limit 10
zfs_arc_eviction_pct 200
zfs_arc_grow_retry 0
zfs_arc_lotsfree_percent 10
zfs_arc_max 0
zfs_arc_meta_adjust_restarts 4096
zfs_arc_meta_limit 0
zfs_arc_meta_limit_percent 75
zfs_arc_meta_min 0
zfs_arc_meta_prune 10000
zfs_arc_meta_strategy 1
zfs_arc_min 1000000000
zfs_arc_min_prefetch_ms 0
zfs_arc_min_prescient_prefetch_ms 0
zfs_arc_p_dampener_disable 1
zfs_arc_p_min_shift 0
zfs_arc_pc_percent 0
zfs_arc_prune_task_threads 1
zfs_arc_shrink_shift 0
zfs_arc_shrinker_limit 10000
zfs_arc_sys_free 0
zfs_async_block_max_blocks 18446744073709551615
zfs_autoimport_disable 1
zfs_btree_verify_intensity 0
zfs_checksum_events_per_second 20
zfs_commit_timeout_pct 5
zfs_compressed_arc_enabled 1
zfs_condense_indirect_commit_entry_delay_ms 0
zfs_condense_indirect_obsolete_pct 25
zfs_condense_indirect_vdevs_enable 1
zfs_condense_max_obsolete_bytes 1073741824
zfs_condense_min_mapping_bytes 131072
zfs_dbgmsg_enable 1
zfs_dbgmsg_maxsize 4194304
zfs_dbuf_state_index 0
zfs_ddt_data_is_special 1
zfs_deadman_checktime_ms 60000
zfs_deadman_enabled 1
zfs_deadman_failmode wait
zfs_deadman_synctime_ms 600000
zfs_deadman_ziotime_ms 300000
zfs_dedup_prefetch 0
zfs_delay_min_dirty_percent 60
zfs_delay_scale 500000
zfs_delete_blocks 20480
zfs_dirty_data_max 4294967296
zfs_dirty_data_max_max 4294967296
zfs_dirty_data_max_max_percent 25
zfs_dirty_data_max_percent 10
zfs_dirty_data_sync_percent 20
zfs_disable_ivset_guid_check 0
zfs_dmu_offset_next_sync 1
zfs_embedded_slog_min_ms 64
zfs_expire_snapshot 300
zfs_fallocate_reserve_percent 110
zfs_flags 0
zfs_free_bpobj_enabled 1
zfs_free_leak_on_eio 0
zfs_free_min_time_ms 1000
zfs_history_output_max 1048576
zfs_immediate_write_sz 32768
zfs_initialize_chunk_size 1048576
zfs_initialize_value 16045690984833335022
zfs_keep_log_spacemaps_at_export 0
zfs_key_max_salt_uses 400000000
zfs_livelist_condense_new_alloc 0
zfs_livelist_condense_sync_cancel 0
zfs_livelist_condense_sync_pause 0
zfs_livelist_condense_zthr_cancel 0
zfs_livelist_condense_zthr_pause 0
zfs_livelist_max_entries 500000
zfs_livelist_min_percent_shared 75
zfs_lua_max_instrlimit 100000000
zfs_lua_max_memlimit 104857600
zfs_max_async_dedup_frees 100000
zfs_max_log_walking 5
zfs_max_logsm_summary_length 10
zfs_max_missing_tvds 0
zfs_max_nvlist_src_size 0
zfs_max_recordsize 1048576
zfs_metaslab_find_max_tries 100
zfs_metaslab_fragmentation_threshold 70
zfs_metaslab_max_size_cache_sec 3600
zfs_metaslab_mem_limit 25
zfs_metaslab_segment_weight_enabled 1
zfs_metaslab_switch_threshold 2
zfs_metaslab_try_hard_before_gang 0
zfs_mg_fragmentation_threshold 95
zfs_mg_noalloc_threshold 0
zfs_min_metaslabs_to_flush 1
zfs_multihost_fail_intervals 10
zfs_multihost_history 0
zfs_multihost_import_intervals 20
zfs_multihost_interval 1000
zfs_multilist_num_sublists 0
zfs_no_scrub_io 0
zfs_no_scrub_prefetch 0
zfs_nocacheflush 0
zfs_nopwrite_enabled 1
zfs_object_mutex_size 64
zfs_obsolete_min_time_ms 500
zfs_override_estimate_recordsize 0
zfs_pd_bytes_max 52428800
zfs_per_txg_dirty_frees_percent 30
zfs_prefetch_disable 0
zfs_read_history 0
zfs_read_history_hits 0
zfs_rebuild_max_segment 1048576
zfs_rebuild_scrub_enabled 1
zfs_rebuild_vdev_limit 33554432
zfs_reconstruct_indirect_combinations_max 4096
zfs_recover 0
zfs_recv_queue_ff 20
zfs_recv_queue_length 16777216
zfs_recv_write_batch_size 1048576
zfs_removal_ignore_errors 0
zfs_removal_suspend_progress 0
zfs_remove_max_segment 16777216
zfs_resilver_disable_defer 0
zfs_resilver_min_time_ms 3000
zfs_scan_blkstats 0
zfs_scan_checkpoint_intval 7200
zfs_scan_fill_weight 3
zfs_scan_ignore_errors 0
zfs_scan_issue_strategy 0
zfs_scan_legacy 0
zfs_scan_max_ext_gap 2097152
zfs_scan_mem_lim_fact 20
zfs_scan_mem_lim_soft_fact 20
zfs_scan_strict_mem_lim 0
zfs_scan_suspend_progress 0
zfs_scan_vdev_limit 4194304
zfs_scrub_min_time_ms 1000
zfs_send_corrupt_data 0
zfs_send_no_prefetch_queue_ff 20
zfs_send_no_prefetch_queue_length 1048576
zfs_send_queue_ff 20
zfs_send_queue_length 16777216
zfs_send_unmodified_spill_blocks 1
zfs_slow_io_events_per_second 20
zfs_spa_discard_memory_limit 16777216
zfs_special_class_metadata_reserve_pct 25
zfs_sync_pass_deferred_free 2
zfs_sync_pass_dont_compress 8
zfs_sync_pass_rewrite 2
zfs_sync_taskq_batch_pct 75
zfs_traverse_indirect_prefetch_limit 32
zfs_trim_extent_bytes_max 134217728
zfs_trim_extent_bytes_min 32768
zfs_trim_metaslab_skip 0
zfs_trim_queue_limit 10
zfs_trim_txg_batch 32
zfs_txg_history 100
zfs_txg_timeout 5
zfs_unflushed_log_block_max 131072
zfs_unflushed_log_block_min 1000
zfs_unflushed_log_block_pct 400
zfs_unflushed_log_txg_max 1000
zfs_unflushed_max_mem_amt 1073741824
zfs_unflushed_max_mem_ppm 1000
zfs_unlink_suspend_progress 0
zfs_user_indirect_is_special 1
zfs_vdev_aggregate_trim 0
zfs_vdev_aggregation_limit 1048576
zfs_vdev_aggregation_limit_non_rotating 131072
zfs_vdev_async_read_max_active 3
zfs_vdev_async_read_min_active 1
zfs_vdev_async_write_active_max_dirty_percent 60
zfs_vdev_async_write_active_min_dirty_percent 30
zfs_vdev_async_write_max_active 10
zfs_vdev_async_write_min_active 2
zfs_vdev_cache_bshift 16
zfs_vdev_cache_max 16384
zfs_vdev_cache_size 0
zfs_vdev_default_ms_count 200
zfs_vdev_default_ms_shift 29
zfs_vdev_initializing_max_active 1
zfs_vdev_initializing_min_active 1
zfs_vdev_max_active 1000
zfs_vdev_max_auto_ashift 14
zfs_vdev_min_auto_ashift 9
zfs_vdev_min_ms_count 16
zfs_vdev_mirror_non_rotating_inc 0
zfs_vdev_mirror_non_rotating_seek_inc 1
zfs_vdev_mirror_rotating_inc 0
zfs_vdev_mirror_rotating_seek_inc 5
zfs_vdev_mirror_rotating_seek_offset 1048576
zfs_vdev_ms_count_limit 131072
zfs_vdev_nia_credit 5
zfs_vdev_nia_delay 5
zfs_vdev_open_timeout_ms 1000
zfs_vdev_queue_depth_pct 1000
zfs_vdev_raidz_impl cycle [fastest] original scalar sse2 ssse3 avx2
zfs_vdev_read_gap_limit 32768
zfs_vdev_rebuild_max_active 3
zfs_vdev_rebuild_min_active 1
zfs_vdev_removal_max_active 2
zfs_vdev_removal_min_active 1
zfs_vdev_scheduler unused
zfs_vdev_scrub_max_active 3
zfs_vdev_scrub_min_active 1
zfs_vdev_sync_read_max_active 10
zfs_vdev_sync_read_min_active 10
zfs_vdev_sync_write_max_active 10
zfs_vdev_sync_write_min_active 10
zfs_vdev_trim_max_active 2
zfs_vdev_trim_min_active 1
zfs_vdev_write_gap_limit 4096
zfs_vnops_read_chunk_size 1048576
zfs_wrlog_data_max 8589934592
zfs_zevent_len_max 512
zfs_zevent_retain_expire_secs 900
zfs_zevent_retain_max 2000
zfs_zil_clean_taskq_maxalloc 1048576
zfs_zil_clean_taskq_minalloc 1024
zfs_zil_clean_taskq_nthr_pct 100
zil_maxblocksize 131072
zil_nocacheflush 0
zil_replay_disable 0
zil_slog_bulk 786432
zio_deadman_log_all 0
zio_dva_throttle_enabled 1
zio_requeue_io_start_cut_in_line 1
zio_slow_io_ms 30000
zio_taskq_batch_pct 80
zio_taskq_batch_tpq 0
zvol_inhibit_dev 0
zvol_major 230
zvol_max_discard_blocks 16384
zvol_prefetch_bytes 131072
zvol_request_sync 0
zvol_threads 32
zvol_volmode 1
VDEV cache disabled, skipping section
ZIL committed transactions: 21.3M
Commit requests: 6.3M
Flushes to stable storage: 6.0M
Transactions to SLOG storage pool: 81.2 GiB 5.6M
Transactions to non-SLOG storage pool: 0 Bytes 0
```
</details>
free -h
<details>
```
total used free shared buff/cache available
Mem: 125Gi 110Gi 14Gi 43Mi 778Mi 12Gi
Swap: 127Gi 26Gi 101Gi
```
</details>
zpool status
<details>
```
pool: zdata
state: ONLINE
scan: scrub repaired 0B in 07:35:43 with 0 errors on Sat Feb 11 21:51:22 2023
config:
NAME STATE READ WRITE CKSUM
zdata ONLINE 0 0 0
raidz2-0 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2b68b9f-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2ffd4b3-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2fe779f-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2904678-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2fe4568-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c28dab08-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2919583-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c30115db-part1 ONLINE 0 0 0
logs
/dev/disk/by-id/nvme-SAMSUNG_MZ1L2960HCJR-00A07_S665NE0T214870-part2 ONLINE 0 0 0
cache
/dev/nvme2n1p3 ONLINE 0 0 0
/dev/nvme1n1p7 ONLINE 0 0 0
/dev/nvme0n1p7 ONLINE 0 0 0
errors: No known data errors
```
</details>
cat /etc/sysctl.conf
<details>
```
# sysctl settings are defined through files in
# /usr/lib/sysctl.d/, /run/sysctl.d/, and /etc/sysctl.d/.
#
# Vendors settings live in /usr/lib/sysctl.d/.
# To override a whole file, create a new file with the same in
# /etc/sysctl.d/ and put new settings there. To override
# only specific settings, add a file with a lexically later
# name in /etc/sysctl.d/ and put new settings there.
#
# For more information, see sysctl.conf(5) and sysctl.d(5).
vm.min_free_kbytes = 1048576
kernel.task_delayacct = 1
```
</details>
cat /etc/modprobe.d/zfs.conf
<details>
```
options zfs zfs_arc_min=1000000000
options zfs l2arc_noprefetch=0
options zfs l2arc_write_max=134217728
options zfs l2arc_write_boost=134217728
options zfs l2arc_headroom=0
```
</details>
|
1.0
|
ARC is at max size even when user application is thrashing swap - <!-- Please fill out the following template, which will help other contributors address your issue. -->
<!--
Thank you for reporting an issue.
*IMPORTANT* - Please check our issue tracker before opening a new issue.
Additional valuable information can be found in the OpenZFS documentation
and mailing list archives.
Please fill in as much of the template as possible.
-->
### System information
<!-- add version after "|" character -->
Type | Version/Name
--- | ---
Distribution Name| Fedora
Distribution Version | 36
Kernel Version | 6.1.11-100.fc36.x86_64
Architecture | x86_64
OpenZFS Version |2.1.9-1
<!--
Command to find OpenZFS version:
zfs version
Commands to find kernel version:
uname -r # Linux
freebsd-version -r # FreeBSD
-->
### Describe the problem you're observing
ARC is at max size even when swap is being used. User application is spending significant time in kernel space.
### Describe how to reproduce the problem
Run my python program (image processing) which uses heavy memory.
### Include any warning/errors/backtraces from the system logs
<!--
*IMPORTANT* - Please mark logs and text output from terminal commands
or else Github will not display them correctly.
An example is provided below.
Example:
```
this is an example how log text should be marked (wrap it with ```)
```
-->
arc_summary:
<details>
```
------------------------------------------------------------------------
ZFS Subsystem Report Tue Feb 28 10:33:50 2023
Linux 6.1.11-100.fc36.x86_64 2.1.9-1
Machine: hidden (x86_64) 2.1.9-1
ARC status: HEALTHY
Memory throttle count: 0
ARC size (current): 100.0 % 62.8 GiB
Target size (adaptive): 100.0 % 62.8 GiB
Min size (hard limit): 1.5 % 953.7 MiB
Max size (high water): 67:1 62.8 GiB
Most Frequently Used (MFU) cache size: 56.4 % 33.6 GiB
Most Recently Used (MRU) cache size: 43.6 % 26.0 GiB
Metadata cache size (hard limit): 75.0 % 47.1 GiB
Metadata cache size (current): 19.8 % 9.3 GiB
Dnode cache size (hard limit): 10.0 % 4.7 GiB
Dnode cache size (current): 1.6 % 79.4 MiB
ARC hash breakdown:
Elements max: 32.9M
Elements current: 91.6 % 30.2M
Collisions: 65.5M
Chain max: 13
Chains: 9.0M
ARC misc:
Deleted: 13.0M
Mutex misses: 15.0k
Eviction skips: 3.9k
Eviction skips due to L2 writes: 0
L2 cached evictions: 1.1 TiB
L2 eligible evictions: 4.3 GiB
L2 eligible MFU evictions: 2.5 % 107.7 MiB
L2 eligible MRU evictions: 97.5 % 4.1 GiB
L2 ineligible evictions: 543.5 GiB
ARC total accesses (hits + misses): 1.5G
Cache hit ratio: 99.4 % 1.5G
Cache miss ratio: 0.6 % 9.2M
Actual hit ratio (MFU + MRU hits): 99.3 % 1.5G
Data demand efficiency: 99.2 % 171.9M
Data prefetch efficiency: 1.5 % 6.0M
Cache hits by cache type:
Most frequently used (MFU): 97.6 % 1.5G
Most recently used (MRU): 2.4 % 35.6M
Most frequently used (MFU) ghost: 0.1 % 840.8k
Most recently used (MRU) ghost: < 0.1 % 653.9k
Cache hits by data type:
Demand data: 11.4 % 170.5M
Prefetch data: < 0.1 % 89.3k
Demand metadata: 88.5 % 1.3G
Prefetch metadata: 0.1 % 1.7M
Cache misses by data type:
Demand data: 14.7 % 1.4M
Prefetch data: 64.7 % 5.9M
Demand metadata: 17.7 % 1.6M
Prefetch metadata: 3.0 % 274.1k
DMU prefetch efficiency: 93.5M
Hit ratio: 34.2 % 31.9M
Miss ratio: 65.8 % 61.5M
L2ARC status: HEALTHY
Low memory aborts: 528
Free on write: 230
R/W clashes: 0
Bad checksums: 0
I/O errors: 0
L2ARC size (adaptive): 3.0 TiB
Compressed: 78.5 % 2.3 TiB
Header size: 0.1 % 2.6 GiB
MFU allocated size: 31.6 % 754.6 GiB
MRU allocated size: 68.4 % 1.6 TiB
Prefetch allocated size: 0.1 % 2.1 GiB
Data (buffer content) allocated size: 98.3 % 2.3 TiB
Metadata (buffer content) allocated size: 1.8 % 43.6 GiB
L2ARC breakdown: 9.2M
Hit ratio: 78.8 % 7.2M
Miss ratio: 21.2 % 1.9M
Feeds: 656.6k
L2ARC writes:
Writes sent: 100 % 191.6k
L2ARC evicts:
Lock retries: 44
Upon reading: 0
Solaris Porting Layer (SPL):
spl_hostid 0
spl_hostid_path /etc/hostid
spl_kmem_alloc_max 1048576
spl_kmem_alloc_warn 65536
spl_kmem_cache_kmem_threads 4
spl_kmem_cache_magazine_size 0
spl_kmem_cache_max_size 32
spl_kmem_cache_obj_per_slab 8
spl_kmem_cache_reclaim 0
spl_kmem_cache_slab_limit 16384
spl_max_show_tasks 512
spl_panic_halt 0
spl_schedule_hrtimeout_slack_us 0
spl_taskq_kick 0
spl_taskq_thread_bind 0
spl_taskq_thread_dynamic 1
spl_taskq_thread_priority 1
spl_taskq_thread_sequential 4
Tunables:
dbuf_cache_hiwater_pct 10
dbuf_cache_lowater_pct 10
dbuf_cache_max_bytes 18446744073709551615
dbuf_cache_shift 5
dbuf_metadata_cache_max_bytes 18446744073709551615
dbuf_metadata_cache_shift 6
dmu_object_alloc_chunk_shift 7
dmu_prefetch_max 134217728
ignore_hole_birth 1
l2arc_exclude_special 0
l2arc_feed_again 1
l2arc_feed_min_ms 200
l2arc_feed_secs 1
l2arc_headroom 0
l2arc_headroom_boost 200
l2arc_meta_percent 33
l2arc_mfuonly 0
l2arc_noprefetch 0
l2arc_norw 0
l2arc_rebuild_blocks_min_l2size 1073741824
l2arc_rebuild_enabled 1
l2arc_trim_ahead 0
l2arc_write_boost 134217728
l2arc_write_max 134217728
metaslab_aliquot 1048576
metaslab_bias_enabled 1
metaslab_debug_load 0
metaslab_debug_unload 0
metaslab_df_max_search 16777216
metaslab_df_use_largest_segment 0
metaslab_force_ganging 16777217
metaslab_fragmentation_factor_enabled 1
metaslab_lba_weighting_enabled 1
metaslab_preload_enabled 1
metaslab_unload_delay 32
metaslab_unload_delay_ms 600000
send_holes_without_birth_time 1
spa_asize_inflation 24
spa_config_path /etc/zfs/zpool.cache
spa_load_print_vdev_tree 0
spa_load_verify_data 1
spa_load_verify_metadata 1
spa_load_verify_shift 4
spa_slop_shift 5
vdev_file_logical_ashift 9
vdev_file_physical_ashift 9
vdev_removal_max_span 32768
vdev_validate_skip 0
zap_iterate_prefetch 1
zfetch_array_rd_sz 1048576
zfetch_max_distance 67108864
zfetch_max_idistance 67108864
zfetch_max_sec_reap 2
zfetch_max_streams 8
zfetch_min_distance 4194304
zfetch_min_sec_reap 1
zfs_abd_scatter_enabled 1
zfs_abd_scatter_max_order 10
zfs_abd_scatter_min_size 1536
zfs_admin_snapshot 0
zfs_allow_redacted_dataset_mount 0
zfs_arc_average_blocksize 8192
zfs_arc_dnode_limit 0
zfs_arc_dnode_limit_percent 10
zfs_arc_dnode_reduce_percent 10
zfs_arc_evict_batch_limit 10
zfs_arc_eviction_pct 200
zfs_arc_grow_retry 0
zfs_arc_lotsfree_percent 10
zfs_arc_max 0
zfs_arc_meta_adjust_restarts 4096
zfs_arc_meta_limit 0
zfs_arc_meta_limit_percent 75
zfs_arc_meta_min 0
zfs_arc_meta_prune 10000
zfs_arc_meta_strategy 1
zfs_arc_min 1000000000
zfs_arc_min_prefetch_ms 0
zfs_arc_min_prescient_prefetch_ms 0
zfs_arc_p_dampener_disable 1
zfs_arc_p_min_shift 0
zfs_arc_pc_percent 0
zfs_arc_prune_task_threads 1
zfs_arc_shrink_shift 0
zfs_arc_shrinker_limit 10000
zfs_arc_sys_free 0
zfs_async_block_max_blocks 18446744073709551615
zfs_autoimport_disable 1
zfs_btree_verify_intensity 0
zfs_checksum_events_per_second 20
zfs_commit_timeout_pct 5
zfs_compressed_arc_enabled 1
zfs_condense_indirect_commit_entry_delay_ms 0
zfs_condense_indirect_obsolete_pct 25
zfs_condense_indirect_vdevs_enable 1
zfs_condense_max_obsolete_bytes 1073741824
zfs_condense_min_mapping_bytes 131072
zfs_dbgmsg_enable 1
zfs_dbgmsg_maxsize 4194304
zfs_dbuf_state_index 0
zfs_ddt_data_is_special 1
zfs_deadman_checktime_ms 60000
zfs_deadman_enabled 1
zfs_deadman_failmode wait
zfs_deadman_synctime_ms 600000
zfs_deadman_ziotime_ms 300000
zfs_dedup_prefetch 0
zfs_delay_min_dirty_percent 60
zfs_delay_scale 500000
zfs_delete_blocks 20480
zfs_dirty_data_max 4294967296
zfs_dirty_data_max_max 4294967296
zfs_dirty_data_max_max_percent 25
zfs_dirty_data_max_percent 10
zfs_dirty_data_sync_percent 20
zfs_disable_ivset_guid_check 0
zfs_dmu_offset_next_sync 1
zfs_embedded_slog_min_ms 64
zfs_expire_snapshot 300
zfs_fallocate_reserve_percent 110
zfs_flags 0
zfs_free_bpobj_enabled 1
zfs_free_leak_on_eio 0
zfs_free_min_time_ms 1000
zfs_history_output_max 1048576
zfs_immediate_write_sz 32768
zfs_initialize_chunk_size 1048576
zfs_initialize_value 16045690984833335022
zfs_keep_log_spacemaps_at_export 0
zfs_key_max_salt_uses 400000000
zfs_livelist_condense_new_alloc 0
zfs_livelist_condense_sync_cancel 0
zfs_livelist_condense_sync_pause 0
zfs_livelist_condense_zthr_cancel 0
zfs_livelist_condense_zthr_pause 0
zfs_livelist_max_entries 500000
zfs_livelist_min_percent_shared 75
zfs_lua_max_instrlimit 100000000
zfs_lua_max_memlimit 104857600
zfs_max_async_dedup_frees 100000
zfs_max_log_walking 5
zfs_max_logsm_summary_length 10
zfs_max_missing_tvds 0
zfs_max_nvlist_src_size 0
zfs_max_recordsize 1048576
zfs_metaslab_find_max_tries 100
zfs_metaslab_fragmentation_threshold 70
zfs_metaslab_max_size_cache_sec 3600
zfs_metaslab_mem_limit 25
zfs_metaslab_segment_weight_enabled 1
zfs_metaslab_switch_threshold 2
zfs_metaslab_try_hard_before_gang 0
zfs_mg_fragmentation_threshold 95
zfs_mg_noalloc_threshold 0
zfs_min_metaslabs_to_flush 1
zfs_multihost_fail_intervals 10
zfs_multihost_history 0
zfs_multihost_import_intervals 20
zfs_multihost_interval 1000
zfs_multilist_num_sublists 0
zfs_no_scrub_io 0
zfs_no_scrub_prefetch 0
zfs_nocacheflush 0
zfs_nopwrite_enabled 1
zfs_object_mutex_size 64
zfs_obsolete_min_time_ms 500
zfs_override_estimate_recordsize 0
zfs_pd_bytes_max 52428800
zfs_per_txg_dirty_frees_percent 30
zfs_prefetch_disable 0
zfs_read_history 0
zfs_read_history_hits 0
zfs_rebuild_max_segment 1048576
zfs_rebuild_scrub_enabled 1
zfs_rebuild_vdev_limit 33554432
zfs_reconstruct_indirect_combinations_max 4096
zfs_recover 0
zfs_recv_queue_ff 20
zfs_recv_queue_length 16777216
zfs_recv_write_batch_size 1048576
zfs_removal_ignore_errors 0
zfs_removal_suspend_progress 0
zfs_remove_max_segment 16777216
zfs_resilver_disable_defer 0
zfs_resilver_min_time_ms 3000
zfs_scan_blkstats 0
zfs_scan_checkpoint_intval 7200
zfs_scan_fill_weight 3
zfs_scan_ignore_errors 0
zfs_scan_issue_strategy 0
zfs_scan_legacy 0
zfs_scan_max_ext_gap 2097152
zfs_scan_mem_lim_fact 20
zfs_scan_mem_lim_soft_fact 20
zfs_scan_strict_mem_lim 0
zfs_scan_suspend_progress 0
zfs_scan_vdev_limit 4194304
zfs_scrub_min_time_ms 1000
zfs_send_corrupt_data 0
zfs_send_no_prefetch_queue_ff 20
zfs_send_no_prefetch_queue_length 1048576
zfs_send_queue_ff 20
zfs_send_queue_length 16777216
zfs_send_unmodified_spill_blocks 1
zfs_slow_io_events_per_second 20
zfs_spa_discard_memory_limit 16777216
zfs_special_class_metadata_reserve_pct 25
zfs_sync_pass_deferred_free 2
zfs_sync_pass_dont_compress 8
zfs_sync_pass_rewrite 2
zfs_sync_taskq_batch_pct 75
zfs_traverse_indirect_prefetch_limit 32
zfs_trim_extent_bytes_max 134217728
zfs_trim_extent_bytes_min 32768
zfs_trim_metaslab_skip 0
zfs_trim_queue_limit 10
zfs_trim_txg_batch 32
zfs_txg_history 100
zfs_txg_timeout 5
zfs_unflushed_log_block_max 131072
zfs_unflushed_log_block_min 1000
zfs_unflushed_log_block_pct 400
zfs_unflushed_log_txg_max 1000
zfs_unflushed_max_mem_amt 1073741824
zfs_unflushed_max_mem_ppm 1000
zfs_unlink_suspend_progress 0
zfs_user_indirect_is_special 1
zfs_vdev_aggregate_trim 0
zfs_vdev_aggregation_limit 1048576
zfs_vdev_aggregation_limit_non_rotating 131072
zfs_vdev_async_read_max_active 3
zfs_vdev_async_read_min_active 1
zfs_vdev_async_write_active_max_dirty_percent 60
zfs_vdev_async_write_active_min_dirty_percent 30
zfs_vdev_async_write_max_active 10
zfs_vdev_async_write_min_active 2
zfs_vdev_cache_bshift 16
zfs_vdev_cache_max 16384
zfs_vdev_cache_size 0
zfs_vdev_default_ms_count 200
zfs_vdev_default_ms_shift 29
zfs_vdev_initializing_max_active 1
zfs_vdev_initializing_min_active 1
zfs_vdev_max_active 1000
zfs_vdev_max_auto_ashift 14
zfs_vdev_min_auto_ashift 9
zfs_vdev_min_ms_count 16
zfs_vdev_mirror_non_rotating_inc 0
zfs_vdev_mirror_non_rotating_seek_inc 1
zfs_vdev_mirror_rotating_inc 0
zfs_vdev_mirror_rotating_seek_inc 5
zfs_vdev_mirror_rotating_seek_offset 1048576
zfs_vdev_ms_count_limit 131072
zfs_vdev_nia_credit 5
zfs_vdev_nia_delay 5
zfs_vdev_open_timeout_ms 1000
zfs_vdev_queue_depth_pct 1000
zfs_vdev_raidz_impl cycle [fastest] original scalar sse2 ssse3 avx2
zfs_vdev_read_gap_limit 32768
zfs_vdev_rebuild_max_active 3
zfs_vdev_rebuild_min_active 1
zfs_vdev_removal_max_active 2
zfs_vdev_removal_min_active 1
zfs_vdev_scheduler unused
zfs_vdev_scrub_max_active 3
zfs_vdev_scrub_min_active 1
zfs_vdev_sync_read_max_active 10
zfs_vdev_sync_read_min_active 10
zfs_vdev_sync_write_max_active 10
zfs_vdev_sync_write_min_active 10
zfs_vdev_trim_max_active 2
zfs_vdev_trim_min_active 1
zfs_vdev_write_gap_limit 4096
zfs_vnops_read_chunk_size 1048576
zfs_wrlog_data_max 8589934592
zfs_zevent_len_max 512
zfs_zevent_retain_expire_secs 900
zfs_zevent_retain_max 2000
zfs_zil_clean_taskq_maxalloc 1048576
zfs_zil_clean_taskq_minalloc 1024
zfs_zil_clean_taskq_nthr_pct 100
zil_maxblocksize 131072
zil_nocacheflush 0
zil_replay_disable 0
zil_slog_bulk 786432
zio_deadman_log_all 0
zio_dva_throttle_enabled 1
zio_requeue_io_start_cut_in_line 1
zio_slow_io_ms 30000
zio_taskq_batch_pct 80
zio_taskq_batch_tpq 0
zvol_inhibit_dev 0
zvol_major 230
zvol_max_discard_blocks 16384
zvol_prefetch_bytes 131072
zvol_request_sync 0
zvol_threads 32
zvol_volmode 1
VDEV cache disabled, skipping section
ZIL committed transactions: 21.3M
Commit requests: 6.3M
Flushes to stable storage: 6.0M
Transactions to SLOG storage pool: 81.2 GiB 5.6M
Transactions to non-SLOG storage pool: 0 Bytes 0
```
</details>
free -h
<details>
```
total used free shared buff/cache available
Mem: 125Gi 110Gi 14Gi 43Mi 778Mi 12Gi
Swap: 127Gi 26Gi 101Gi
```
</details>
zpool status
<details>
```
pool: zdata
state: ONLINE
scan: scrub repaired 0B in 07:35:43 with 0 errors on Sat Feb 11 21:51:22 2023
config:
NAME STATE READ WRITE CKSUM
zdata ONLINE 0 0 0
raidz2-0 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2b68b9f-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2ffd4b3-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2fe779f-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2904678-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2fe4568-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c28dab08-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c2919583-part1 ONLINE 0 0 0
/dev/disk/by-id/wwn-0x5000c500c30115db-part1 ONLINE 0 0 0
logs
/dev/disk/by-id/nvme-SAMSUNG_MZ1L2960HCJR-00A07_S665NE0T214870-part2 ONLINE 0 0 0
cache
/dev/nvme2n1p3 ONLINE 0 0 0
/dev/nvme1n1p7 ONLINE 0 0 0
/dev/nvme0n1p7 ONLINE 0 0 0
errors: No known data errors
```
</details>
cat /etc/sysctl.conf
<details>
```
# sysctl settings are defined through files in
# /usr/lib/sysctl.d/, /run/sysctl.d/, and /etc/sysctl.d/.
#
# Vendors settings live in /usr/lib/sysctl.d/.
# To override a whole file, create a new file with the same in
# /etc/sysctl.d/ and put new settings there. To override
# only specific settings, add a file with a lexically later
# name in /etc/sysctl.d/ and put new settings there.
#
# For more information, see sysctl.conf(5) and sysctl.d(5).
vm.min_free_kbytes = 1048576
kernel.task_delayacct = 1
```
</details>
cat /etc/modprobe.d/zfs.conf
<details>
```
options zfs zfs_arc_min=1000000000
options zfs l2arc_noprefetch=0
options zfs l2arc_write_max=134217728
options zfs l2arc_write_boost=134217728
options zfs l2arc_headroom=0
```
</details>
|
defect
|
arc is at max size even when user application is thrashing swap thank you for reporting an issue important please check our issue tracker before opening a new issue additional valuable information can be found in the openzfs documentation and mailing list archives please fill in as much of the template as possible system information type version name distribution name fedora distribution version kernel version architecture openzfs version command to find openzfs version zfs version commands to find kernel version uname r linux freebsd version r freebsd describe the problem you re observing arc is at max size even when swap is being used user application is spending significant time in kernel space describe how to reproduce the problem run my python program image processing which uses heavy memory include any warning errors backtraces from the system logs important please mark logs and text output from terminal commands or else github will not display them correctly an example is provided below example this is an example how log text should be marked wrap it with arc summary zfs subsystem report tue feb linux machine hidden arc status healthy memory throttle count arc size current gib target size adaptive gib min size hard limit mib max size high water gib most frequently used mfu cache size gib most recently used mru cache size gib metadata cache size hard limit gib metadata cache size current gib dnode cache size hard limit gib dnode cache size current mib arc hash breakdown elements max elements current collisions chain max chains arc misc deleted mutex misses eviction skips eviction skips due to writes cached evictions tib eligible evictions gib eligible mfu evictions mib eligible mru evictions gib ineligible evictions gib arc total accesses hits misses cache hit ratio cache miss ratio actual hit ratio mfu mru hits data demand efficiency data prefetch efficiency cache hits by cache type most frequently used mfu most recently used mru most frequently used mfu ghost most recently used mru ghost cache hits by data type demand data prefetch data demand metadata prefetch metadata cache misses by data type demand data prefetch data demand metadata prefetch metadata dmu prefetch efficiency hit ratio miss ratio status healthy low memory aborts free on write r w clashes bad checksums i o errors size adaptive tib compressed tib header size gib mfu allocated size gib mru allocated size tib prefetch allocated size gib data buffer content allocated size tib metadata buffer content allocated size gib breakdown hit ratio miss ratio feeds writes writes sent evicts lock retries upon reading solaris porting layer spl spl hostid spl hostid path etc hostid spl kmem alloc max spl kmem alloc warn spl kmem cache kmem threads spl kmem cache magazine size spl kmem cache max size spl kmem cache obj per slab spl kmem cache reclaim spl kmem cache slab limit spl max show tasks spl panic halt spl schedule hrtimeout slack us spl taskq kick spl taskq thread bind spl taskq thread dynamic spl taskq thread priority spl taskq thread sequential tunables dbuf cache hiwater pct dbuf cache lowater pct dbuf cache max bytes dbuf cache shift dbuf metadata cache max bytes dbuf metadata cache shift dmu object alloc chunk shift dmu prefetch max ignore hole birth exclude special feed again feed min ms feed secs headroom headroom boost meta percent mfuonly noprefetch norw rebuild blocks min rebuild enabled trim ahead write boost write max metaslab aliquot metaslab bias enabled metaslab debug load metaslab debug unload metaslab df max search metaslab df use largest segment metaslab force ganging metaslab fragmentation factor enabled metaslab lba weighting enabled metaslab preload enabled metaslab unload delay metaslab unload delay ms send holes without birth time spa asize inflation spa config path etc zfs zpool cache spa load print vdev tree spa load verify data spa load verify metadata spa load verify shift spa slop shift vdev file logical ashift vdev file physical ashift vdev removal max span vdev validate skip zap iterate prefetch zfetch array rd sz zfetch max distance zfetch max idistance zfetch max sec reap zfetch max streams zfetch min distance zfetch min sec reap zfs abd scatter enabled zfs abd scatter max order zfs abd scatter min size zfs admin snapshot zfs allow redacted dataset mount zfs arc average blocksize zfs arc dnode limit zfs arc dnode limit percent zfs arc dnode reduce percent zfs arc evict batch limit zfs arc eviction pct zfs arc grow retry zfs arc lotsfree percent zfs arc max zfs arc meta adjust restarts zfs arc meta limit zfs arc meta limit percent zfs arc meta min zfs arc meta prune zfs arc meta strategy zfs arc min zfs arc min prefetch ms zfs arc min prescient prefetch ms zfs arc p dampener disable zfs arc p min shift zfs arc pc percent zfs arc prune task threads zfs arc shrink shift zfs arc shrinker limit zfs arc sys free zfs async block max blocks zfs autoimport disable zfs btree verify intensity zfs checksum events per second zfs commit timeout pct zfs compressed arc enabled zfs condense indirect commit entry delay ms zfs condense indirect obsolete pct zfs condense indirect vdevs enable zfs condense max obsolete bytes zfs condense min mapping bytes zfs dbgmsg enable zfs dbgmsg maxsize zfs dbuf state index zfs ddt data is special zfs deadman checktime ms zfs deadman enabled zfs deadman failmode wait zfs deadman synctime ms zfs deadman ziotime ms zfs dedup prefetch zfs delay min dirty percent zfs delay scale zfs delete blocks zfs dirty data max zfs dirty data max max zfs dirty data max max percent zfs dirty data max percent zfs dirty data sync percent zfs disable ivset guid check zfs dmu offset next sync zfs embedded slog min ms zfs expire snapshot zfs fallocate reserve percent zfs flags zfs free bpobj enabled zfs free leak on eio zfs free min time ms zfs history output max zfs immediate write sz zfs initialize chunk size zfs initialize value zfs keep log spacemaps at export zfs key max salt uses zfs livelist condense new alloc zfs livelist condense sync cancel zfs livelist condense sync pause zfs livelist condense zthr cancel zfs livelist condense zthr pause zfs livelist max entries zfs livelist min percent shared zfs lua max instrlimit zfs lua max memlimit zfs max async dedup frees zfs max log walking zfs max logsm summary length zfs max missing tvds zfs max nvlist src size zfs max recordsize zfs metaslab find max tries zfs metaslab fragmentation threshold zfs metaslab max size cache sec zfs metaslab mem limit zfs metaslab segment weight enabled zfs metaslab switch threshold zfs metaslab try hard before gang zfs mg fragmentation threshold zfs mg noalloc threshold zfs min metaslabs to flush zfs multihost fail intervals zfs multihost history zfs multihost import intervals zfs multihost interval zfs multilist num sublists zfs no scrub io zfs no scrub prefetch zfs nocacheflush zfs nopwrite enabled zfs object mutex size zfs obsolete min time ms zfs override estimate recordsize zfs pd bytes max zfs per txg dirty frees percent zfs prefetch disable zfs read history zfs read history hits zfs rebuild max segment zfs rebuild scrub enabled zfs rebuild vdev limit zfs reconstruct indirect combinations max zfs recover zfs recv queue ff zfs recv queue length zfs recv write batch size zfs removal ignore errors zfs removal suspend progress zfs remove max segment zfs resilver disable defer zfs resilver min time ms zfs scan blkstats zfs scan checkpoint intval zfs scan fill weight zfs scan ignore errors zfs scan issue strategy zfs scan legacy zfs scan max ext gap zfs scan mem lim fact zfs scan mem lim soft fact zfs scan strict mem lim zfs scan suspend progress zfs scan vdev limit zfs scrub min time ms zfs send corrupt data zfs send no prefetch queue ff zfs send no prefetch queue length zfs send queue ff zfs send queue length zfs send unmodified spill blocks zfs slow io events per second zfs spa discard memory limit zfs special class metadata reserve pct zfs sync pass deferred free zfs sync pass dont compress zfs sync pass rewrite zfs sync taskq batch pct zfs traverse indirect prefetch limit zfs trim extent bytes max zfs trim extent bytes min zfs trim metaslab skip zfs trim queue limit zfs trim txg batch zfs txg history zfs txg timeout zfs unflushed log block max zfs unflushed log block min zfs unflushed log block pct zfs unflushed log txg max zfs unflushed max mem amt zfs unflushed max mem ppm zfs unlink suspend progress zfs user indirect is special zfs vdev aggregate trim zfs vdev aggregation limit zfs vdev aggregation limit non rotating zfs vdev async read max active zfs vdev async read min active zfs vdev async write active max dirty percent zfs vdev async write active min dirty percent zfs vdev async write max active zfs vdev async write min active zfs vdev cache bshift zfs vdev cache max zfs vdev cache size zfs vdev default ms count zfs vdev default ms shift zfs vdev initializing max active zfs vdev initializing min active zfs vdev max active zfs vdev max auto ashift zfs vdev min auto ashift zfs vdev min ms count zfs vdev mirror non rotating inc zfs vdev mirror non rotating seek inc zfs vdev mirror rotating inc zfs vdev mirror rotating seek inc zfs vdev mirror rotating seek offset zfs vdev ms count limit zfs vdev nia credit zfs vdev nia delay zfs vdev open timeout ms zfs vdev queue depth pct zfs vdev raidz impl cycle original scalar zfs vdev read gap limit zfs vdev rebuild max active zfs vdev rebuild min active zfs vdev removal max active zfs vdev removal min active zfs vdev scheduler unused zfs vdev scrub max active zfs vdev scrub min active zfs vdev sync read max active zfs vdev sync read min active zfs vdev sync write max active zfs vdev sync write min active zfs vdev trim max active zfs vdev trim min active zfs vdev write gap limit zfs vnops read chunk size zfs wrlog data max zfs zevent len max zfs zevent retain expire secs zfs zevent retain max zfs zil clean taskq maxalloc zfs zil clean taskq minalloc zfs zil clean taskq nthr pct zil maxblocksize zil nocacheflush zil replay disable zil slog bulk zio deadman log all zio dva throttle enabled zio requeue io start cut in line zio slow io ms zio taskq batch pct zio taskq batch tpq zvol inhibit dev zvol major zvol max discard blocks zvol prefetch bytes zvol request sync zvol threads zvol volmode vdev cache disabled skipping section zil committed transactions commit requests flushes to stable storage transactions to slog storage pool gib transactions to non slog storage pool bytes free h total used free shared buff cache available mem swap zpool status pool zdata state online scan scrub repaired in with errors on sat feb config name state read write cksum zdata online online dev disk by id wwn online dev disk by id wwn online dev disk by id wwn online dev disk by id wwn online dev disk by id wwn online dev disk by id wwn online dev disk by id wwn online dev disk by id wwn online logs dev disk by id nvme samsung online cache dev online dev online dev online errors no known data errors cat etc sysctl conf sysctl settings are defined through files in usr lib sysctl d run sysctl d and etc sysctl d vendors settings live in usr lib sysctl d to override a whole file create a new file with the same in etc sysctl d and put new settings there to override only specific settings add a file with a lexically later name in etc sysctl d and put new settings there for more information see sysctl conf and sysctl d vm min free kbytes kernel task delayacct cat etc modprobe d zfs conf options zfs zfs arc min options zfs noprefetch options zfs write max options zfs write boost options zfs headroom
| 1
|
440,331
| 30,742,064,826
|
IssuesEvent
|
2023-07-28 12:21:18
|
sktime/sktime
|
https://api.github.com/repos/sktime/sktime
|
opened
|
[BUG] `mean_absolute_scaled_error` has `y_train` as a mandatory keyword argument which is not part of function signature
|
bug documentation module:metrics&benchmarking
|
This is the function definition of MASE:
```python3
def mean_absolute_scaled_error(
y_true, y_pred, sp=1, horizon_weight=None, multioutput="uniform_average", **kwargs
):
```
Based on this, passing `y_true` and `y_pred` should be sufficient. However, that fails complaining this:
```pycon
>>> from sktime.performance_metrics.forecasting import mean_absolute_scaled_error
>>> mean_absolute_scaled_error([1, 2, 3], [6, 5, 4])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/anirban/sktime-fork/sktime/performance_metrics/forecasting/_functions.py", line 415, in mean_absolute_scaled_error
y_train = _get_kwarg("y_train", metric_name="mean_absolute_scaled_error", **kwargs)
File "/home/anirban/sktime-fork/sktime/performance_metrics/forecasting/_functions.py", line 60, in _get_kwarg
raise ValueError(msg)
ValueError: mean_absolute_scaled_error requires `y_train`. Pass `y_train` as a keyword argument when calling the metric.
```
It is also not mentioned in the [documentation](https://www.sktime.net/en/stable/api_reference/auto_generated/sktime.performance_metrics.forecasting.mean_absolute_scaled_error.html) that `y_train` must be passed.
Suggestions:
1. document that `y_train` is mandatory
2. make `y_train` part of function signature
3. make mandatory keyword arguments positional
4. similar changes for other metrics
|
1.0
|
[BUG] `mean_absolute_scaled_error` has `y_train` as a mandatory keyword argument which is not part of function signature - This is the function definition of MASE:
```python3
def mean_absolute_scaled_error(
y_true, y_pred, sp=1, horizon_weight=None, multioutput="uniform_average", **kwargs
):
```
Based on this, passing `y_true` and `y_pred` should be sufficient. However, that fails complaining this:
```pycon
>>> from sktime.performance_metrics.forecasting import mean_absolute_scaled_error
>>> mean_absolute_scaled_error([1, 2, 3], [6, 5, 4])
Traceback (most recent call last):
File "<stdin>", line 1, in <module>
File "/home/anirban/sktime-fork/sktime/performance_metrics/forecasting/_functions.py", line 415, in mean_absolute_scaled_error
y_train = _get_kwarg("y_train", metric_name="mean_absolute_scaled_error", **kwargs)
File "/home/anirban/sktime-fork/sktime/performance_metrics/forecasting/_functions.py", line 60, in _get_kwarg
raise ValueError(msg)
ValueError: mean_absolute_scaled_error requires `y_train`. Pass `y_train` as a keyword argument when calling the metric.
```
It is also not mentioned in the [documentation](https://www.sktime.net/en/stable/api_reference/auto_generated/sktime.performance_metrics.forecasting.mean_absolute_scaled_error.html) that `y_train` must be passed.
Suggestions:
1. document that `y_train` is mandatory
2. make `y_train` part of function signature
3. make mandatory keyword arguments positional
4. similar changes for other metrics
|
non_defect
|
mean absolute scaled error has y train as a mandatory keyword argument which is not part of function signature this is the function definition of mase def mean absolute scaled error y true y pred sp horizon weight none multioutput uniform average kwargs based on this passing y true and y pred should be sufficient however that fails complaining this pycon from sktime performance metrics forecasting import mean absolute scaled error mean absolute scaled error traceback most recent call last file line in file home anirban sktime fork sktime performance metrics forecasting functions py line in mean absolute scaled error y train get kwarg y train metric name mean absolute scaled error kwargs file home anirban sktime fork sktime performance metrics forecasting functions py line in get kwarg raise valueerror msg valueerror mean absolute scaled error requires y train pass y train as a keyword argument when calling the metric it is also not mentioned in the that y train must be passed suggestions document that y train is mandatory make y train part of function signature make mandatory keyword arguments positional similar changes for other metrics
| 0
|
22,609
| 3,670,894,731
|
IssuesEvent
|
2016-02-22 02:23:48
|
gperftools/gperftools
|
https://api.github.com/repos/gperftools/gperftools
|
closed
|
pprof -disasm / -list fail on OS X due to renamed and incompatible objdump
|
Priority-Medium Status-New Type-Defect
|
Originally reported on Google Code with ID 668
```
What steps will reproduce the problem?
1. Try to use the -disasm or -list commands on a function.
What is the expected output?
Disassembly or annotated source.
What do you see instead?
"no filename found" or "0 samples".
What version of the product are you using? On what operating system?
Latest snapshot (as of 2014-01-21).
Please provide any additional information below.
Macports and Homebrew both install "gobjdump" as part of the binutils package, but
pprof references a hardcoded "objdump".
When I changed the executable name (pprof line 1366), it produced a new error:
Can't exec "-C": No such file or directory at /usr/local/bin/pprof line 1370.
-C -d -l --no-show-raw-insn --start-address=0x0000000100001e0e --stop-address=0x0000000100002370
[my executable path]: No such file or directory
```
Reported by `denpashogai` on 2015-01-21 18:00:29
|
1.0
|
pprof -disasm / -list fail on OS X due to renamed and incompatible objdump - Originally reported on Google Code with ID 668
```
What steps will reproduce the problem?
1. Try to use the -disasm or -list commands on a function.
What is the expected output?
Disassembly or annotated source.
What do you see instead?
"no filename found" or "0 samples".
What version of the product are you using? On what operating system?
Latest snapshot (as of 2014-01-21).
Please provide any additional information below.
Macports and Homebrew both install "gobjdump" as part of the binutils package, but
pprof references a hardcoded "objdump".
When I changed the executable name (pprof line 1366), it produced a new error:
Can't exec "-C": No such file or directory at /usr/local/bin/pprof line 1370.
-C -d -l --no-show-raw-insn --start-address=0x0000000100001e0e --stop-address=0x0000000100002370
[my executable path]: No such file or directory
```
Reported by `denpashogai` on 2015-01-21 18:00:29
|
defect
|
pprof disasm list fail on os x due to renamed and incompatible objdump originally reported on google code with id what steps will reproduce the problem try to use the disasm or list commands on a function what is the expected output disassembly or annotated source what do you see instead no filename found or samples what version of the product are you using on what operating system latest snapshot as of please provide any additional information below macports and homebrew both install gobjdump as part of the binutils package but pprof references a hardcoded objdump when i changed the executable name pprof line it produced a new error can t exec c no such file or directory at usr local bin pprof line c d l no show raw insn start address stop address no such file or directory reported by denpashogai on
| 1
|
29,668
| 5,798,528,704
|
IssuesEvent
|
2017-05-03 02:15:44
|
opctl/opctl
|
https://api.github.com/repos/opctl/opctl
|
closed
|
Unable to run ops w/ containers if using docker 4 windows
|
defect ready
|
expected:
ops run the same as any other OS & docker install
actual:
ops start but hang on container call
|
1.0
|
Unable to run ops w/ containers if using docker 4 windows - expected:
ops run the same as any other OS & docker install
actual:
ops start but hang on container call
|
defect
|
unable to run ops w containers if using docker windows expected ops run the same as any other os docker install actual ops start but hang on container call
| 1
|
31,029
| 6,403,527,713
|
IssuesEvent
|
2017-08-06 19:35:38
|
opctl/opctl
|
https://api.github.com/repos/opctl/opctl
|
closed
|
Race condition for non-cached pkgs
|
defect
|
scenario:
running multiple pkgs simultaneously when pkg hasn't ever been cached on node
expected:
pkg is pulled & cached and both runs complete successfully
actual:
the second run tries to run from the in progress pkg pull (which errors due to being incomplete)
|
1.0
|
Race condition for non-cached pkgs - scenario:
running multiple pkgs simultaneously when pkg hasn't ever been cached on node
expected:
pkg is pulled & cached and both runs complete successfully
actual:
the second run tries to run from the in progress pkg pull (which errors due to being incomplete)
|
defect
|
race condition for non cached pkgs scenario running multiple pkgs simultaneously when pkg hasn t ever been cached on node expected pkg is pulled cached and both runs complete successfully actual the second run tries to run from the in progress pkg pull which errors due to being incomplete
| 1
|
53,047
| 13,260,846,135
|
IssuesEvent
|
2020-08-20 18:51:36
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
closed
|
remove dependency on bash in the build (Trac #628)
|
Migrated from Trac defect infrastructure
|
should be /bin/sh happy
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/628">https://code.icecube.wisc.edu/projects/icecube/ticket/628</a>, reported by negaand owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2011-04-30T19:49:07",
"_ts": "1304192947000000",
"description": "should be /bin/sh happy",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"time": "2011-04-30T18:12:58",
"component": "infrastructure",
"summary": "remove dependency on bash in the build",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
remove dependency on bash in the build (Trac #628) - should be /bin/sh happy
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/628">https://code.icecube.wisc.edu/projects/icecube/ticket/628</a>, reported by negaand owned by nega</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2011-04-30T19:49:07",
"_ts": "1304192947000000",
"description": "should be /bin/sh happy",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"time": "2011-04-30T18:12:58",
"component": "infrastructure",
"summary": "remove dependency on bash in the build",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "nega",
"type": "defect"
}
```
</p>
</details>
|
defect
|
remove dependency on bash in the build trac should be bin sh happy migrated from json status closed changetime ts description should be bin sh happy reporter nega cc resolution fixed time component infrastructure summary remove dependency on bash in the build priority normal keywords milestone owner nega type defect
| 1
|
36,842
| 8,160,537,802
|
IssuesEvent
|
2018-08-24 02:04:40
|
hazelcast/hazelcast
|
https://api.github.com/repos/hazelcast/hazelcast
|
opened
|
Q: transactions now throw ConcurrentModificationException and TransactionNotActiveException ?
|
Estimation: S Priority: Low Team: Core Type: Defect
|
Question.
i have seen that transaction test are now throwing 'ConcurrentModificationException and TransactionNotActiveException '
is this exception expected ?
as maybe the point of using transaction's is not to have `ConcurrentModificationException` ?
|
1.0
|
Q: transactions now throw ConcurrentModificationException and TransactionNotActiveException ? - Question.
i have seen that transaction test are now throwing 'ConcurrentModificationException and TransactionNotActiveException '
is this exception expected ?
as maybe the point of using transaction's is not to have `ConcurrentModificationException` ?
|
defect
|
q transactions now throw concurrentmodificationexception and transactionnotactiveexception question i have seen that transaction test are now throwing concurrentmodificationexception and transactionnotactiveexception is this exception expected as maybe the point of using transaction s is not to have concurrentmodificationexception
| 1
|
72,856
| 15,251,713,391
|
IssuesEvent
|
2021-02-20 00:13:21
|
wss-demo/ImportedNodeGoat
|
https://api.github.com/repos/wss-demo/ImportedNodeGoat
|
opened
|
CVE-2020-8244 (Medium) detected in bl-1.1.2.tgz, bl-1.0.3.tgz
|
security vulnerability
|
## CVE-2020-8244 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bl-1.1.2.tgz</b>, <b>bl-1.0.3.tgz</b></p></summary>
<p>
<details><summary><b>bl-1.1.2.tgz</b></p></summary>
<p>Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!</p>
<p>Library home page: <a href="https://registry.npmjs.org/bl/-/bl-1.1.2.tgz">https://registry.npmjs.org/bl/-/bl-1.1.2.tgz</a></p>
<p>Path to dependency file: ImportedNodeGoat/package.json</p>
<p>Path to vulnerable library: ImportedNodeGoat/node_modules/npm/node_modules/request/node_modules/bl/package.json</p>
<p>
Dependency Hierarchy:
- grunt-npm-install-0.3.1.tgz (Root Library)
- npm-3.10.10.tgz
- request-2.75.0.tgz
- :x: **bl-1.1.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>bl-1.0.3.tgz</b></p></summary>
<p>Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!</p>
<p>Library home page: <a href="https://registry.npmjs.org/bl/-/bl-1.0.3.tgz">https://registry.npmjs.org/bl/-/bl-1.0.3.tgz</a></p>
<p>Path to dependency file: ImportedNodeGoat/package.json</p>
<p>Path to vulnerable library: ImportedNodeGoat/node_modules/bl/package.json</p>
<p>
Dependency Hierarchy:
- grunt-retire-0.3.12.tgz (Root Library)
- request-2.67.0.tgz
- :x: **bl-1.0.3.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/wss-demo/ImportedNodeGoat/commits/8a9930d26492d27fecc32157284fbe70021fc725">8a9930d26492d27fecc32157284fbe70021fc725</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A buffer over-read vulnerability exists in bl <4.0.3, <3.0.1, <2.2.1, and <1.2.3 which could allow an attacker to supply user input (even typed) that if it ends up in consume() argument and can become negative, the BufferList state can be corrupted, tricking it into exposing uninitialized memory via regular .slice() calls.
<p>Publish Date: 2020-08-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8244>CVE-2020-8244</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8244">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8244</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: 2.2.1,3.0.1,4.0.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"bl","packageVersion":"1.1.2","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-npm-install:0.3.1;npm:3.10.10;request:2.75.0;bl:1.1.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.2.1,3.0.1,4.0.3"},{"packageType":"javascript/Node.js","packageName":"bl","packageVersion":"1.0.3","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-retire:0.3.12;request:2.67.0;bl:1.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.2.1,3.0.1,4.0.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-8244","vulnerabilityDetails":"A buffer over-read vulnerability exists in bl \u003c4.0.3, \u003c3.0.1, \u003c2.2.1, and \u003c1.2.3 which could allow an attacker to supply user input (even typed) that if it ends up in consume() argument and can become negative, the BufferList state can be corrupted, tricking it into exposing uninitialized memory via regular .slice() calls.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8244","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-8244 (Medium) detected in bl-1.1.2.tgz, bl-1.0.3.tgz - ## CVE-2020-8244 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>bl-1.1.2.tgz</b>, <b>bl-1.0.3.tgz</b></p></summary>
<p>
<details><summary><b>bl-1.1.2.tgz</b></p></summary>
<p>Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!</p>
<p>Library home page: <a href="https://registry.npmjs.org/bl/-/bl-1.1.2.tgz">https://registry.npmjs.org/bl/-/bl-1.1.2.tgz</a></p>
<p>Path to dependency file: ImportedNodeGoat/package.json</p>
<p>Path to vulnerable library: ImportedNodeGoat/node_modules/npm/node_modules/request/node_modules/bl/package.json</p>
<p>
Dependency Hierarchy:
- grunt-npm-install-0.3.1.tgz (Root Library)
- npm-3.10.10.tgz
- request-2.75.0.tgz
- :x: **bl-1.1.2.tgz** (Vulnerable Library)
</details>
<details><summary><b>bl-1.0.3.tgz</b></p></summary>
<p>Buffer List: collect buffers and access with a standard readable Buffer interface, streamable too!</p>
<p>Library home page: <a href="https://registry.npmjs.org/bl/-/bl-1.0.3.tgz">https://registry.npmjs.org/bl/-/bl-1.0.3.tgz</a></p>
<p>Path to dependency file: ImportedNodeGoat/package.json</p>
<p>Path to vulnerable library: ImportedNodeGoat/node_modules/bl/package.json</p>
<p>
Dependency Hierarchy:
- grunt-retire-0.3.12.tgz (Root Library)
- request-2.67.0.tgz
- :x: **bl-1.0.3.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://api.github.com/repos/wss-demo/ImportedNodeGoat/commits/8a9930d26492d27fecc32157284fbe70021fc725">8a9930d26492d27fecc32157284fbe70021fc725</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A buffer over-read vulnerability exists in bl <4.0.3, <3.0.1, <2.2.1, and <1.2.3 which could allow an attacker to supply user input (even typed) that if it ends up in consume() argument and can become negative, the BufferList state can be corrupted, tricking it into exposing uninitialized memory via regular .slice() calls.
<p>Publish Date: 2020-08-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8244>CVE-2020-8244</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8244">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8244</a></p>
<p>Release Date: 2020-07-21</p>
<p>Fix Resolution: 2.2.1,3.0.1,4.0.3</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"bl","packageVersion":"1.1.2","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-npm-install:0.3.1;npm:3.10.10;request:2.75.0;bl:1.1.2","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.2.1,3.0.1,4.0.3"},{"packageType":"javascript/Node.js","packageName":"bl","packageVersion":"1.0.3","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt-retire:0.3.12;request:2.67.0;bl:1.0.3","isMinimumFixVersionAvailable":true,"minimumFixVersion":"2.2.1,3.0.1,4.0.3"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-8244","vulnerabilityDetails":"A buffer over-read vulnerability exists in bl \u003c4.0.3, \u003c3.0.1, \u003c2.2.1, and \u003c1.2.3 which could allow an attacker to supply user input (even typed) that if it ends up in consume() argument and can become negative, the BufferList state can be corrupted, tricking it into exposing uninitialized memory via regular .slice() calls.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-8244","cvss3Severity":"medium","cvss3Score":"6.5","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"Low","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_defect
|
cve medium detected in bl tgz bl tgz cve medium severity vulnerability vulnerable libraries bl tgz bl tgz bl tgz buffer list collect buffers and access with a standard readable buffer interface streamable too library home page a href path to dependency file importednodegoat package json path to vulnerable library importednodegoat node modules npm node modules request node modules bl package json dependency hierarchy grunt npm install tgz root library npm tgz request tgz x bl tgz vulnerable library bl tgz buffer list collect buffers and access with a standard readable buffer interface streamable too library home page a href path to dependency file importednodegoat package json path to vulnerable library importednodegoat node modules bl package json dependency hierarchy grunt retire tgz root library request tgz x bl tgz vulnerable library found in head commit a href found in base branch master vulnerability details a buffer over read vulnerability exists in bl and which could allow an attacker to supply user input even typed that if it ends up in consume argument and can become negative the bufferlist state can be corrupted tricking it into exposing uninitialized memory via regular slice calls publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt npm install npm request bl isminimumfixversionavailable true minimumfixversion packagetype javascript node js packagename bl packageversion packagefilepaths istransitivedependency true dependencytree grunt retire request bl isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier cve vulnerabilitydetails a buffer over read vulnerability exists in bl and which could allow an attacker to supply user input even typed that if it ends up in consume argument and can become negative the bufferlist state can be corrupted tricking it into exposing uninitialized memory via regular slice calls vulnerabilityurl
| 0
|
8,475
| 2,611,513,839
|
IssuesEvent
|
2015-02-27 05:49:45
|
chrsmith/hedgewars
|
https://api.github.com/repos/chrsmith/hedgewars
|
closed
|
pt_PT update
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Hedgewars 0.9.19-dev
Please provide any additional information below.
Update for the pt_PT translation.
Loads of new lines translated, especially mission related, and quite a bit of
tweaking.
Still incomplete when it comes to missions, but rather submit them now that end
up losing them like it happened in the past.
```
Original issue reported on code.google.com by `inufa...@gmail.com` on 15 Dec 2012 at 3:53
Attachments:
* [inu-diff.zip](https://storage.googleapis.com/google-code-attachments/hedgewars/issue-504/comment-0/inu-diff.zip)
|
1.0
|
pt_PT update - ```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Hedgewars 0.9.19-dev
Please provide any additional information below.
Update for the pt_PT translation.
Loads of new lines translated, especially mission related, and quite a bit of
tweaking.
Still incomplete when it comes to missions, but rather submit them now that end
up losing them like it happened in the past.
```
Original issue reported on code.google.com by `inufa...@gmail.com` on 15 Dec 2012 at 3:53
Attachments:
* [inu-diff.zip](https://storage.googleapis.com/google-code-attachments/hedgewars/issue-504/comment-0/inu-diff.zip)
|
defect
|
pt pt update what steps will reproduce the problem what is the expected output what do you see instead what version of the product are you using on what operating system hedgewars dev please provide any additional information below update for the pt pt translation loads of new lines translated especially mission related and quite a bit of tweaking still incomplete when it comes to missions but rather submit them now that end up losing them like it happened in the past original issue reported on code google com by inufa gmail com on dec at attachments
| 1
|
10,599
| 4,793,531,893
|
IssuesEvent
|
2016-10-31 18:25:27
|
Linuxbrew/homebrew-core
|
https://api.github.com/repos/Linuxbrew/homebrew-core
|
closed
|
pulseaudio does not compile on Ubuntu 16.04 LTS
|
build-error
|
- [x] Ran `brew update` and retried your prior step?
- [x] Ran `brew doctor`, fixed as many issues as possible and retried your prior step?
### Bug reports:
`brew install pulseaudio` gives:
```
==> Downloading https://www.freedesktop.org/software/pulseaudio/releases/pulseau
Already downloaded: /home/why/.cache/Homebrew/pulseaudio-9.0.tar.xz
==> Downloading https://raw.githubusercontent.com/Homebrew/formula-patches/15fa4
Already downloaded: /home/why/.cache/Homebrew/pulseaudio--patch-d3a2180600a4fbea538949b6c4e9e70fe7997495663334e50db96d18bfb1da5f.patch
==> Patching
==> Applying i386.patch
patching file src/pulsecore/svolume_mmx.c
patching file src/pulsecore/svolume_sse.c
==> ./configure --disable-silent-rules --prefix=/home/why/.linuxbrew/Cellar/puls
==> make install
Last 15 lines from /home/why/.cache/Homebrew/Logs/pulseaudio/02.make:
/usr/bin/gcc-5 -DHAVE_CONFIG_H -I. -I.. -I../src -I../src/modules -I../src/modules -DPA_ALSA_PATHS_DIR=\"\" -DPA_ALSA_PROFILE_SETS_DIR=\"\" -DPA_SRCDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPA_BUILDDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPULSE_LOCALEDIR=\"/home/why/.linuxbrew/Cellar/pulseaudio/9.0/share/locale\" -isystem/home/why/.linuxbrew/include -DFASTPATH -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -std=gnu11 -pthread -Os -w -pipe -march=native -Wall -W -Wextra -Wno-long-long -Wno-overlength-strings -Wunsafe-loop-optimizations -Wundef -Wformat=2 -Wlogical-op -Wsign-compare -Wformat-security -Wmissing-include-dirs -Wformat-nonliteral -Wold-style-definition -Wpointer-arith -Winit-self -Wdeclaration-after-statement -Wfloat-equal -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wmissing-declarations -Wmissing-noreturn -Wshadow -Wendif-labels -Wcast-align -Wstrict-aliasing -Wwrite-strings -Wno-unused-parameter -ffast-math -fno-common -fdiagnostics-show-option -fdiagnostics-color=auto -c -o utils/pasuspender-pasuspender.o `test -f 'utils/pasuspender.c' || echo './'`utils/pasuspender.c
/usr/bin/gcc-5 -DHAVE_CONFIG_H -I. -I.. -I../src -I../src/modules -I../src/modules -DPA_ALSA_PATHS_DIR=\"\" -DPA_ALSA_PROFILE_SETS_DIR=\"\" -DPA_SRCDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPA_BUILDDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPULSE_LOCALEDIR=\"/home/why/.linuxbrew/Cellar/pulseaudio/9.0/share/locale\" -isystem/home/why/.linuxbrew/include -DFASTPATH -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -std=gnu11 -pthread -Os -w -pipe -march=native -Wall -W -Wextra -Wno-long-long -Wno-overlength-strings -Wunsafe-loop-optimizations -Wundef -Wformat=2 -Wlogical-op -Wsign-compare -Wformat-security -Wmissing-include-dirs -Wformat-nonliteral -Wold-style-definition -Wpointer-arith -Winit-self -Wdeclaration-after-statement -Wfloat-equal -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wmissing-declarations -Wmissing-noreturn -Wshadow -Wendif-labels -Wcast-align -Wstrict-aliasing -Wwrite-strings -Wno-unused-parameter -ffast-math -fno-common -fdiagnostics-show-option -fdiagnostics-color=auto -c -o utils/pacmd-pacmd.o `test -f 'utils/pacmd.c' || echo './'`utils/pacmd.c
/usr/bin/gcc-5 -DHAVE_CONFIG_H -I. -I.. -I../src -I../src/modules -I../src/modules -DPA_ALSA_PATHS_DIR=\"\" -DPA_ALSA_PROFILE_SETS_DIR=\"\" -DPA_SRCDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPA_BUILDDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPULSE_LOCALEDIR=\"/home/why/.linuxbrew/Cellar/pulseaudio/9.0/share/locale\" -isystem/home/why/.linuxbrew/include -DFASTPATH -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -std=gnu11 -pthread -I/home/why/.linuxbrew/Cellar/libx11/1.6.3/include -I/home/why/.linuxbrew/Cellar/libsm/1.2.2/include -I/home/why/.linuxbrew/Cellar/libice/1.0.9/include -I/home/why/.linuxbrew/Cellar/libxtst/1.2.2/include -I/home/why/.linuxbrew/Cellar/libxi/1.7.6/include -I/home/why/.linuxbrew/Cellar/libxext/1.3.3/include -I/home/why/.linuxbrew/Cellar/libxfixes/5.0.1/include -I/home/why/.linuxbrew/Cellar/libx11/1.6.3/include -I/home/why/.linuxbrew/Cellar/libxcb/1.11.1/include -I/home/why/.linuxbrew/Cellar/libxau/1.0.8/include -I/home/why/.linuxbrew/Cellar/libxdmcp/1.1.2/include -I/home/why/.linuxbrew/Cellar/kbproto/1.0.7/include -I/home/why/.linuxbrew/Cellar/xproto/7.0.28/include -I/home/why/.linuxbrew/Cellar/fixesproto/5.0/include -I/home/why/.linuxbrew/Cellar/xextproto/7.3.0/include -I/home/why/.linuxbrew/Cellar/inputproto/2.3.1/include -I/home/why/.linuxbrew/Cellar/recordproto/1.14.2/include -Os -w -pipe -march=native -Wall -W -Wextra -Wno-long-long -Wno-overlength-strings -Wunsafe-loop-optimizations -Wundef -Wformat=2 -Wlogical-op -Wsign-compare -Wformat-security -Wmissing-include-dirs -Wformat-nonliteral -Wold-style-definition -Wpointer-arith -Winit-self -Wdeclaration-after-statement -Wfloat-equal -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wmissing-declarations -Wmissing-noreturn -Wshadow -Wendif-labels -Wcast-align -Wstrict-aliasing -Wwrite-strings -Wno-unused-parameter -ffast-math -fno-common -fdiagnostics-show-option -fdiagnostics-color=auto -c -o utils/pax11publish-pax11publish.o `test -f 'utils/pax11publish.c' || echo './'`utils/pax11publish.c
/bin/sed -e "s|@pkglibdir[@]|/home/why/.linuxbrew/Cellar/pulseaudio/9.0/lib/pulseaudio|g" utils/padsp.in > padsp
make[3]: *** No rule to make target 'daemon/pulseaudio.desktop', needed by 'all-am'. Stop.
make[3]: *** Waiting for unfinished jobs....
make[3]: Leaving directory '/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src'
Makefile:10988: recipe for target 'install' failed
make[2]: *** [install] Error 2
make[2]: Leaving directory '/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src'
Makefile:807: recipe for target 'install-recursive' failed
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory '/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0'
Makefile:1106: recipe for target 'install' failed
make: *** [install] Error 2
READ THIS: https://github.com/Linuxbrew/brew/blob/master/docs/Troubleshooting.md#troubleshooting
If reporting this issue please do so at (not Homebrew/brew):
https://github.com/Linuxbrew/homebrew-core/issues
```
And `brew gist-logs pulseaudio` gives:
https://gist.github.com/48113901bcb95dd12c4019c2fdf20097
|
1.0
|
pulseaudio does not compile on Ubuntu 16.04 LTS - - [x] Ran `brew update` and retried your prior step?
- [x] Ran `brew doctor`, fixed as many issues as possible and retried your prior step?
### Bug reports:
`brew install pulseaudio` gives:
```
==> Downloading https://www.freedesktop.org/software/pulseaudio/releases/pulseau
Already downloaded: /home/why/.cache/Homebrew/pulseaudio-9.0.tar.xz
==> Downloading https://raw.githubusercontent.com/Homebrew/formula-patches/15fa4
Already downloaded: /home/why/.cache/Homebrew/pulseaudio--patch-d3a2180600a4fbea538949b6c4e9e70fe7997495663334e50db96d18bfb1da5f.patch
==> Patching
==> Applying i386.patch
patching file src/pulsecore/svolume_mmx.c
patching file src/pulsecore/svolume_sse.c
==> ./configure --disable-silent-rules --prefix=/home/why/.linuxbrew/Cellar/puls
==> make install
Last 15 lines from /home/why/.cache/Homebrew/Logs/pulseaudio/02.make:
/usr/bin/gcc-5 -DHAVE_CONFIG_H -I. -I.. -I../src -I../src/modules -I../src/modules -DPA_ALSA_PATHS_DIR=\"\" -DPA_ALSA_PROFILE_SETS_DIR=\"\" -DPA_SRCDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPA_BUILDDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPULSE_LOCALEDIR=\"/home/why/.linuxbrew/Cellar/pulseaudio/9.0/share/locale\" -isystem/home/why/.linuxbrew/include -DFASTPATH -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -std=gnu11 -pthread -Os -w -pipe -march=native -Wall -W -Wextra -Wno-long-long -Wno-overlength-strings -Wunsafe-loop-optimizations -Wundef -Wformat=2 -Wlogical-op -Wsign-compare -Wformat-security -Wmissing-include-dirs -Wformat-nonliteral -Wold-style-definition -Wpointer-arith -Winit-self -Wdeclaration-after-statement -Wfloat-equal -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wmissing-declarations -Wmissing-noreturn -Wshadow -Wendif-labels -Wcast-align -Wstrict-aliasing -Wwrite-strings -Wno-unused-parameter -ffast-math -fno-common -fdiagnostics-show-option -fdiagnostics-color=auto -c -o utils/pasuspender-pasuspender.o `test -f 'utils/pasuspender.c' || echo './'`utils/pasuspender.c
/usr/bin/gcc-5 -DHAVE_CONFIG_H -I. -I.. -I../src -I../src/modules -I../src/modules -DPA_ALSA_PATHS_DIR=\"\" -DPA_ALSA_PROFILE_SETS_DIR=\"\" -DPA_SRCDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPA_BUILDDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPULSE_LOCALEDIR=\"/home/why/.linuxbrew/Cellar/pulseaudio/9.0/share/locale\" -isystem/home/why/.linuxbrew/include -DFASTPATH -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -std=gnu11 -pthread -Os -w -pipe -march=native -Wall -W -Wextra -Wno-long-long -Wno-overlength-strings -Wunsafe-loop-optimizations -Wundef -Wformat=2 -Wlogical-op -Wsign-compare -Wformat-security -Wmissing-include-dirs -Wformat-nonliteral -Wold-style-definition -Wpointer-arith -Winit-self -Wdeclaration-after-statement -Wfloat-equal -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wmissing-declarations -Wmissing-noreturn -Wshadow -Wendif-labels -Wcast-align -Wstrict-aliasing -Wwrite-strings -Wno-unused-parameter -ffast-math -fno-common -fdiagnostics-show-option -fdiagnostics-color=auto -c -o utils/pacmd-pacmd.o `test -f 'utils/pacmd.c' || echo './'`utils/pacmd.c
/usr/bin/gcc-5 -DHAVE_CONFIG_H -I. -I.. -I../src -I../src/modules -I../src/modules -DPA_ALSA_PATHS_DIR=\"\" -DPA_ALSA_PROFILE_SETS_DIR=\"\" -DPA_SRCDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPA_BUILDDIR=\"/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src\" -DPULSE_LOCALEDIR=\"/home/why/.linuxbrew/Cellar/pulseaudio/9.0/share/locale\" -isystem/home/why/.linuxbrew/include -DFASTPATH -U_FORTIFY_SOURCE -D_FORTIFY_SOURCE=2 -std=gnu11 -pthread -I/home/why/.linuxbrew/Cellar/libx11/1.6.3/include -I/home/why/.linuxbrew/Cellar/libsm/1.2.2/include -I/home/why/.linuxbrew/Cellar/libice/1.0.9/include -I/home/why/.linuxbrew/Cellar/libxtst/1.2.2/include -I/home/why/.linuxbrew/Cellar/libxi/1.7.6/include -I/home/why/.linuxbrew/Cellar/libxext/1.3.3/include -I/home/why/.linuxbrew/Cellar/libxfixes/5.0.1/include -I/home/why/.linuxbrew/Cellar/libx11/1.6.3/include -I/home/why/.linuxbrew/Cellar/libxcb/1.11.1/include -I/home/why/.linuxbrew/Cellar/libxau/1.0.8/include -I/home/why/.linuxbrew/Cellar/libxdmcp/1.1.2/include -I/home/why/.linuxbrew/Cellar/kbproto/1.0.7/include -I/home/why/.linuxbrew/Cellar/xproto/7.0.28/include -I/home/why/.linuxbrew/Cellar/fixesproto/5.0/include -I/home/why/.linuxbrew/Cellar/xextproto/7.3.0/include -I/home/why/.linuxbrew/Cellar/inputproto/2.3.1/include -I/home/why/.linuxbrew/Cellar/recordproto/1.14.2/include -Os -w -pipe -march=native -Wall -W -Wextra -Wno-long-long -Wno-overlength-strings -Wunsafe-loop-optimizations -Wundef -Wformat=2 -Wlogical-op -Wsign-compare -Wformat-security -Wmissing-include-dirs -Wformat-nonliteral -Wold-style-definition -Wpointer-arith -Winit-self -Wdeclaration-after-statement -Wfloat-equal -Wmissing-prototypes -Wstrict-prototypes -Wredundant-decls -Wmissing-declarations -Wmissing-noreturn -Wshadow -Wendif-labels -Wcast-align -Wstrict-aliasing -Wwrite-strings -Wno-unused-parameter -ffast-math -fno-common -fdiagnostics-show-option -fdiagnostics-color=auto -c -o utils/pax11publish-pax11publish.o `test -f 'utils/pax11publish.c' || echo './'`utils/pax11publish.c
/bin/sed -e "s|@pkglibdir[@]|/home/why/.linuxbrew/Cellar/pulseaudio/9.0/lib/pulseaudio|g" utils/padsp.in > padsp
make[3]: *** No rule to make target 'daemon/pulseaudio.desktop', needed by 'all-am'. Stop.
make[3]: *** Waiting for unfinished jobs....
make[3]: Leaving directory '/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src'
Makefile:10988: recipe for target 'install' failed
make[2]: *** [install] Error 2
make[2]: Leaving directory '/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0/src'
Makefile:807: recipe for target 'install-recursive' failed
make[1]: *** [install-recursive] Error 1
make[1]: Leaving directory '/tmp/pulseaudio-20161030-10635-1yml191/pulseaudio-9.0'
Makefile:1106: recipe for target 'install' failed
make: *** [install] Error 2
READ THIS: https://github.com/Linuxbrew/brew/blob/master/docs/Troubleshooting.md#troubleshooting
If reporting this issue please do so at (not Homebrew/brew):
https://github.com/Linuxbrew/homebrew-core/issues
```
And `brew gist-logs pulseaudio` gives:
https://gist.github.com/48113901bcb95dd12c4019c2fdf20097
|
non_defect
|
pulseaudio does not compile on ubuntu lts ran brew update and retried your prior step ran brew doctor fixed as many issues as possible and retried your prior step bug reports brew install pulseaudio gives downloading already downloaded home why cache homebrew pulseaudio tar xz downloading already downloaded home why cache homebrew pulseaudio patch patch patching applying patch patching file src pulsecore svolume mmx c patching file src pulsecore svolume sse c configure disable silent rules prefix home why linuxbrew cellar puls make install last lines from home why cache homebrew logs pulseaudio make usr bin gcc dhave config h i i i src i src modules i src modules dpa alsa paths dir dpa alsa profile sets dir dpa srcdir tmp pulseaudio pulseaudio src dpa builddir tmp pulseaudio pulseaudio src dpulse localedir home why linuxbrew cellar pulseaudio share locale isystem home why linuxbrew include dfastpath u fortify source d fortify source std pthread os w pipe march native wall w wextra wno long long wno overlength strings wunsafe loop optimizations wundef wformat wlogical op wsign compare wformat security wmissing include dirs wformat nonliteral wold style definition wpointer arith winit self wdeclaration after statement wfloat equal wmissing prototypes wstrict prototypes wredundant decls wmissing declarations wmissing noreturn wshadow wendif labels wcast align wstrict aliasing wwrite strings wno unused parameter ffast math fno common fdiagnostics show option fdiagnostics color auto c o utils pasuspender pasuspender o test f utils pasuspender c echo utils pasuspender c usr bin gcc dhave config h i i i src i src modules i src modules dpa alsa paths dir dpa alsa profile sets dir dpa srcdir tmp pulseaudio pulseaudio src dpa builddir tmp pulseaudio pulseaudio src dpulse localedir home why linuxbrew cellar pulseaudio share locale isystem home why linuxbrew include dfastpath u fortify source d fortify source std pthread os w pipe march native wall w wextra wno long long wno overlength strings wunsafe loop optimizations wundef wformat wlogical op wsign compare wformat security wmissing include dirs wformat nonliteral wold style definition wpointer arith winit self wdeclaration after statement wfloat equal wmissing prototypes wstrict prototypes wredundant decls wmissing declarations wmissing noreturn wshadow wendif labels wcast align wstrict aliasing wwrite strings wno unused parameter ffast math fno common fdiagnostics show option fdiagnostics color auto c o utils pacmd pacmd o test f utils pacmd c echo utils pacmd c usr bin gcc dhave config h i i i src i src modules i src modules dpa alsa paths dir dpa alsa profile sets dir dpa srcdir tmp pulseaudio pulseaudio src dpa builddir tmp pulseaudio pulseaudio src dpulse localedir home why linuxbrew cellar pulseaudio share locale isystem home why linuxbrew include dfastpath u fortify source d fortify source std pthread i home why linuxbrew cellar include i home why linuxbrew cellar libsm include i home why linuxbrew cellar libice include i home why linuxbrew cellar libxtst include i home why linuxbrew cellar libxi include i home why linuxbrew cellar libxext include i home why linuxbrew cellar libxfixes include i home why linuxbrew cellar include i home why linuxbrew cellar libxcb include i home why linuxbrew cellar libxau include i home why linuxbrew cellar libxdmcp include i home why linuxbrew cellar kbproto include i home why linuxbrew cellar xproto include i home why linuxbrew cellar fixesproto include i home why linuxbrew cellar xextproto include i home why linuxbrew cellar inputproto include i home why linuxbrew cellar recordproto include os w pipe march native wall w wextra wno long long wno overlength strings wunsafe loop optimizations wundef wformat wlogical op wsign compare wformat security wmissing include dirs wformat nonliteral wold style definition wpointer arith winit self wdeclaration after statement wfloat equal wmissing prototypes wstrict prototypes wredundant decls wmissing declarations wmissing noreturn wshadow wendif labels wcast align wstrict aliasing wwrite strings wno unused parameter ffast math fno common fdiagnostics show option fdiagnostics color auto c o utils o test f utils c echo utils c bin sed e s pkglibdir home why linuxbrew cellar pulseaudio lib pulseaudio g utils padsp in padsp make no rule to make target daemon pulseaudio desktop needed by all am stop make waiting for unfinished jobs make leaving directory tmp pulseaudio pulseaudio src makefile recipe for target install failed make error make leaving directory tmp pulseaudio pulseaudio src makefile recipe for target install recursive failed make error make leaving directory tmp pulseaudio pulseaudio makefile recipe for target install failed make error read this if reporting this issue please do so at not homebrew brew and brew gist logs pulseaudio gives
| 0
|
674,450
| 23,051,082,180
|
IssuesEvent
|
2022-07-24 16:35:02
|
EmulatorNexus/VeniceUnleashed
|
https://api.github.com/repos/EmulatorNexus/VeniceUnleashed
|
closed
|
[Crash] Crash when calling `RCON:GetServerGuid()` in the `OnLoadResources` event
|
bug high priority VeniceEXT
|
There seems to be a silent crash when trying to receive the server GUID using RCON. 1/10 times it will return nil, all the other times the server will just silently crash.
[CrashTest.zip](https://github.com/EmulatorNexus/VeniceUnleashed/files/9114835/CrashTest.zip)
|
1.0
|
[Crash] Crash when calling `RCON:GetServerGuid()` in the `OnLoadResources` event - There seems to be a silent crash when trying to receive the server GUID using RCON. 1/10 times it will return nil, all the other times the server will just silently crash.
[CrashTest.zip](https://github.com/EmulatorNexus/VeniceUnleashed/files/9114835/CrashTest.zip)
|
non_defect
|
crash when calling rcon getserverguid in the onloadresources event there seems to be a silent crash when trying to receive the server guid using rcon times it will return nil all the other times the server will just silently crash
| 0
|
2,003
| 3,026,092,214
|
IssuesEvent
|
2015-08-03 13:16:02
|
lionheart/openradar-mirror
|
https://api.github.com/repos/lionheart/openradar-mirror
|
opened
|
20189542: Current selection not visible after switching sort order of message list
|
classification:ui/usability reproducible:always status:open
|
#### Description
Summary:
If you have a particular message selected in Mail and change the sort order of the list, Mail does not ensure that the selected message remains in the visible area of the list afterwards.
Steps to Reproduce:
1. Select a mailbox with a fairly large number of messages, so that there are significantly more than fit in the visible portion of the list.
2. Select a message in the list
3. Use the View > Sort By menu to change how the list is sorted (e.g. switching from sorting by date to sorting by "From")
Expected Results:
After sorting the list, Mail should scroll the list so that the message you had selected before is still visible.
Actual Results:
The scroll view remains at whatever position it was before, so the selected message will often no longer be visible, and require either manual scrolling or poking at the up/down arrow keys to get back to the selected message.
Version:
Mail 8.2 (2089)/OS X 10.10.3 (14D98g)
Notes:
The use case for this situation is as follows.
1. You receive a new message from a particular person.
2. You want to take a look at other recent correspondance with that person, so you sort the list by "From", so that other email from that person are listed right next to the new message that you're looking at.
3. Once you're done, you then switch back to sorting the list by date.
However, having the selected message disappear from the visible part of the list defeats this method, which is meant to be able to quickly see that person's messages without having to scroll around at all.
Configuration:
This happens both in the default layout and the "classic" layout modes in Mail.
-
Product Version: Mail 8.2 (2089)/OS X 10.10.3 (14D98g)
Created: 2015-03-17T16:43:19.342224
Originated: 2015-03-17T09:41:00
Open Radar Link: http://www.openradar.me/20189542
|
True
|
20189542: Current selection not visible after switching sort order of message list - #### Description
Summary:
If you have a particular message selected in Mail and change the sort order of the list, Mail does not ensure that the selected message remains in the visible area of the list afterwards.
Steps to Reproduce:
1. Select a mailbox with a fairly large number of messages, so that there are significantly more than fit in the visible portion of the list.
2. Select a message in the list
3. Use the View > Sort By menu to change how the list is sorted (e.g. switching from sorting by date to sorting by "From")
Expected Results:
After sorting the list, Mail should scroll the list so that the message you had selected before is still visible.
Actual Results:
The scroll view remains at whatever position it was before, so the selected message will often no longer be visible, and require either manual scrolling or poking at the up/down arrow keys to get back to the selected message.
Version:
Mail 8.2 (2089)/OS X 10.10.3 (14D98g)
Notes:
The use case for this situation is as follows.
1. You receive a new message from a particular person.
2. You want to take a look at other recent correspondance with that person, so you sort the list by "From", so that other email from that person are listed right next to the new message that you're looking at.
3. Once you're done, you then switch back to sorting the list by date.
However, having the selected message disappear from the visible part of the list defeats this method, which is meant to be able to quickly see that person's messages without having to scroll around at all.
Configuration:
This happens both in the default layout and the "classic" layout modes in Mail.
-
Product Version: Mail 8.2 (2089)/OS X 10.10.3 (14D98g)
Created: 2015-03-17T16:43:19.342224
Originated: 2015-03-17T09:41:00
Open Radar Link: http://www.openradar.me/20189542
|
non_defect
|
current selection not visible after switching sort order of message list description summary if you have a particular message selected in mail and change the sort order of the list mail does not ensure that the selected message remains in the visible area of the list afterwards steps to reproduce select a mailbox with a fairly large number of messages so that there are significantly more than fit in the visible portion of the list select a message in the list use the view sort by menu to change how the list is sorted e g switching from sorting by date to sorting by from expected results after sorting the list mail should scroll the list so that the message you had selected before is still visible actual results the scroll view remains at whatever position it was before so the selected message will often no longer be visible and require either manual scrolling or poking at the up down arrow keys to get back to the selected message version mail os x notes the use case for this situation is as follows you receive a new message from a particular person you want to take a look at other recent correspondance with that person so you sort the list by from so that other email from that person are listed right next to the new message that you re looking at once you re done you then switch back to sorting the list by date however having the selected message disappear from the visible part of the list defeats this method which is meant to be able to quickly see that person s messages without having to scroll around at all configuration this happens both in the default layout and the classic layout modes in mail product version mail os x created originated open radar link
| 0
|
57,250
| 15,727,912,775
|
IssuesEvent
|
2021-03-29 13:14:32
|
danmar/testissues
|
https://api.github.com/repos/danmar/testissues
|
opened
|
Token::link() return NULL for a function (Trac #178)
|
Incomplete Migration Migrated from Trac Other defect noone
|
Migrated from https://trac.cppcheck.net/ticket/178
```json
{
"status": "closed",
"changetime": "2009-03-15T08:24:33",
"description": "Token::link() will return NULL for the function ShowYGV608Registers() in the attached file (tokenized lines 1671 - 1799).",
"reporter": "kidkat",
"cc": "",
"resolution": "wontfix",
"_ts": "1237105473000000",
"component": "Other",
"summary": "Token::link() return NULL for a function",
"priority": "",
"keywords": "",
"time": "2009-03-14T23:27:54",
"milestone": "",
"owner": "noone",
"type": "defect"
}
```
|
1.0
|
Token::link() return NULL for a function (Trac #178) - Migrated from https://trac.cppcheck.net/ticket/178
```json
{
"status": "closed",
"changetime": "2009-03-15T08:24:33",
"description": "Token::link() will return NULL for the function ShowYGV608Registers() in the attached file (tokenized lines 1671 - 1799).",
"reporter": "kidkat",
"cc": "",
"resolution": "wontfix",
"_ts": "1237105473000000",
"component": "Other",
"summary": "Token::link() return NULL for a function",
"priority": "",
"keywords": "",
"time": "2009-03-14T23:27:54",
"milestone": "",
"owner": "noone",
"type": "defect"
}
```
|
defect
|
token link return null for a function trac migrated from json status closed changetime description token link will return null for the function in the attached file tokenized lines reporter kidkat cc resolution wontfix ts component other summary token link return null for a function priority keywords time milestone owner noone type defect
| 1
|
52,831
| 22,412,592,353
|
IssuesEvent
|
2022-06-19 00:23:14
|
rtbf-ir/rtbf.ir
|
https://api.github.com/repos/rtbf-ir/rtbf.ir
|
closed
|
new website/service - from email
|
new website/service
|
servicename=سایت بهینشو
websiteurl=https://behinsu24.com/
deleteurl=ندارد
issuedescription=نیاز به ارسال پیام/تیکت به پشتیبانی
submit=ثبت سایت/سرویس جدید
|
1.0
|
new website/service - from email - servicename=سایت بهینشو
websiteurl=https://behinsu24.com/
deleteurl=ندارد
issuedescription=نیاز به ارسال پیام/تیکت به پشتیبانی
submit=ثبت سایت/سرویس جدید
|
non_defect
|
new website service from email servicename سایت بهینشو websiteurl deleteurl ندارد issuedescription نیاز به ارسال پیام تیکت به پشتیبانی submit ثبت سایت سرویس جدید
| 0
|
77,733
| 9,608,873,487
|
IssuesEvent
|
2019-05-12 10:36:31
|
thespis-rs/thespis_impl_remote
|
https://api.github.com/repos/thespis-rs/thespis_impl_remote
|
opened
|
Feature: Discovery of services
|
Contribute ⦔ Help Welcome Difficulty ⦔ Software Design Domain ⦔ Usability Est. Work ⦔ 2w OS ⦔ All Priority ⦔ Normal State ⦔ Open Type ⦔ Feature
|
We now have the possibility to send messages to remote services, but we have no discovery. This could be a directory, a DHT, tor hidden service, ... but some way to find the ip address of services.
|
1.0
|
Feature: Discovery of services - We now have the possibility to send messages to remote services, but we have no discovery. This could be a directory, a DHT, tor hidden service, ... but some way to find the ip address of services.
|
non_defect
|
feature discovery of services we now have the possibility to send messages to remote services but we have no discovery this could be a directory a dht tor hidden service but some way to find the ip address of services
| 0
|
6,307
| 2,610,240,194
|
IssuesEvent
|
2015-02-26 19:16:36
|
chrsmith/jsjsj122
|
https://api.github.com/repos/chrsmith/jsjsj122
|
opened
|
台州割包茎手术价钱
|
auto-migrated Priority-Medium Type-Defect
|
```
台州割包茎手术价钱【台州五洲生殖医院】24小时健康咨询热
线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地址:台州市椒
江区枫南路229号(枫南大转盘旁)乘车线路:乘坐104、108、118�
��198及椒江一金清公交车直达枫南小区,乘坐107、105、109、112
、901、 902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 11:57
|
1.0
|
台州割包茎手术价钱 - ```
台州割包茎手术价钱【台州五洲生殖医院】24小时健康咨询热
线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地址:台州市椒
江区枫南路229号(枫南大转盘旁)乘车线路:乘坐104、108、118�
��198及椒江一金清公交车直达枫南小区,乘坐107、105、109、112
、901、 902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 11:57
|
defect
|
台州割包茎手术价钱 台州割包茎手术价钱【台州五洲生殖医院】 线 微信号tzwzszyy 医院地址 台州市椒 (枫南大转盘旁)乘车线路 、 、 � �� , 、 、 、 、 、 ,步行即可到院。 诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,�� �精,无精。包皮包茎,精索静脉曲张,淋病等。 台州五洲生殖医院是台州最大的男科医院,权威专家在线免�� �咨询,拥有专业完善的男科检查治疗设备,严格按照国家标� ��收费。尖端医疗设备,与世界同步。权威专家,成就专业典 范。人性化服务,一切以患者为中心。 看男科就选台州五洲生殖医院,专业男科为男人。 original issue reported on code google com by poweragr gmail com on may at
| 1
|
31,501
| 6,541,850,802
|
IssuesEvent
|
2017-09-01 22:18:29
|
SublimeText/PackageDev
|
https://api.github.com/repos/SublimeText/PackageDev
|
closed
|
sublime-syntax (oniguruma regex): nested char classes aren't always scoped properly
|
defect
|
For example, [this regex](https://github.com/sublimehq/Packages/blob/3889492cac950cb90b98db53d94ef47a6b0bd5fe/CSS/CSS.sublime-syntax#L16):
`(?:[[-\w]{{nonascii}}]|{{escape}})`
It seems it detects the `[-\w]` as a range rather than a nested character class, and the `]` is scoped as a literal. It gets it right if you remove the `-` or move it to after the `\w`.
|
1.0
|
sublime-syntax (oniguruma regex): nested char classes aren't always scoped properly - For example, [this regex](https://github.com/sublimehq/Packages/blob/3889492cac950cb90b98db53d94ef47a6b0bd5fe/CSS/CSS.sublime-syntax#L16):
`(?:[[-\w]{{nonascii}}]|{{escape}})`
It seems it detects the `[-\w]` as a range rather than a nested character class, and the `]` is scoped as a literal. It gets it right if you remove the `-` or move it to after the `\w`.
|
defect
|
sublime syntax oniguruma regex nested char classes aren t always scoped properly for example nonascii escape it seems it detects the as a range rather than a nested character class and the is scoped as a literal it gets it right if you remove the or move it to after the w
| 1
|
33,922
| 7,299,888,668
|
IssuesEvent
|
2018-02-26 21:37:53
|
Openki/Openki
|
https://api.github.com/repos/Openki/Openki
|
closed
|
JSON API: Sorting broken
|
Defect Urgent ☠
|
Looks like sorting is broken ?
https://openki.net/api/0/json/events?groups=90aa41f3&after=2018-02-12&before=2018-02-13&sort=startLocal
This commit ? --->
https://github.com/Openki/Openki/commit/47c00e61c79be91aa51d5614cb3845490d19a402
Sorting still works with adding limit parameter? But only <=10 ?
|
1.0
|
JSON API: Sorting broken - Looks like sorting is broken ?
https://openki.net/api/0/json/events?groups=90aa41f3&after=2018-02-12&before=2018-02-13&sort=startLocal
This commit ? --->
https://github.com/Openki/Openki/commit/47c00e61c79be91aa51d5614cb3845490d19a402
Sorting still works with adding limit parameter? But only <=10 ?
|
defect
|
json api sorting broken looks like sorting is broken this commit sorting still works with adding limit parameter but only
| 1
|
201,961
| 23,048,156,254
|
IssuesEvent
|
2022-07-24 08:03:42
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
[Cloud Posture] Setup breadcrumbs for new navigation
|
Team:Cloud Security Posture 8.4 candidate
|
**Motivation**
We want to have breadcrumbs for the cloud security posture app work in a similar way to other pages in the security solution
**Definition of Done**
- [ ] No longer use the CSP breadcrumbs utility
- [ ] Also applies to document title
- [ ] Breadcrumbs are handled by the security solution implementation:
```plaintext
x-pack/plugins/security_solution/public/common/components/navigation/breadcrumbs/index.ts you'll need to add your breadcrumbs logic there. Cases has its own breadcrumbs implementation (that's why it returns null), it lives inside their own plugin, you can choose to do the same, or you can choose to implement it in this file, it is up to you
```
|
True
|
[Cloud Posture] Setup breadcrumbs for new navigation - **Motivation**
We want to have breadcrumbs for the cloud security posture app work in a similar way to other pages in the security solution
**Definition of Done**
- [ ] No longer use the CSP breadcrumbs utility
- [ ] Also applies to document title
- [ ] Breadcrumbs are handled by the security solution implementation:
```plaintext
x-pack/plugins/security_solution/public/common/components/navigation/breadcrumbs/index.ts you'll need to add your breadcrumbs logic there. Cases has its own breadcrumbs implementation (that's why it returns null), it lives inside their own plugin, you can choose to do the same, or you can choose to implement it in this file, it is up to you
```
|
non_defect
|
setup breadcrumbs for new navigation motivation we want to have breadcrumbs for the cloud security posture app work in a similar way to other pages in the security solution definition of done no longer use the csp breadcrumbs utility also applies to document title breadcrumbs are handled by the security solution implementation plaintext x pack plugins security solution public common components navigation breadcrumbs index ts you ll need to add your breadcrumbs logic there cases has its own breadcrumbs implementation that s why it returns null it lives inside their own plugin you can choose to do the same or you can choose to implement it in this file it is up to you
| 0
|
6,839
| 3,060,241,050
|
IssuesEvent
|
2015-08-14 19:29:45
|
OpenSMTPD/OpenSMTPD
|
https://api.github.com/repos/OpenSMTPD/OpenSMTPD
|
reopened
|
"table senders" regression: syntax error
|
bug documentation
|
Hi,
The following configuration files:
`smtpd.conf`:
```
listen on lo0 # or localhost on Debian
table aliases db:/etc/mail/aliases.db # or "table aliases file:/etc/mail/aliases" on Debian
table senders file:/etc/mail/senders
accept for local alias <aliases> deliver to mbox
accept from local for any relay
```
`/etc/mail/senders`:
```
@something.example.com
```
work fine on OpenBSD 5.8 with OpenSMTPD 5.4.4, as well as Debian 7.8 with OpenSMTPD 5.4.4p1 (no syntax error).
But the same file doesn't work on Debian 7.8 with OpenSMTPD 5.7.1p1:
```
# smtpd -d -v
/etc/smtpd.conf:11: syntax error
```
Reading OpenSMTPD 5.7.1p1 `smtpd.conf(5)` and `table(5)`, I can't see what is wrong here. Unless I'm mistaken, it looks like a regression, otherwise if it's a matter of a new syntax, this probably needs to be documented.
Thank you.
|
1.0
|
"table senders" regression: syntax error - Hi,
The following configuration files:
`smtpd.conf`:
```
listen on lo0 # or localhost on Debian
table aliases db:/etc/mail/aliases.db # or "table aliases file:/etc/mail/aliases" on Debian
table senders file:/etc/mail/senders
accept for local alias <aliases> deliver to mbox
accept from local for any relay
```
`/etc/mail/senders`:
```
@something.example.com
```
work fine on OpenBSD 5.8 with OpenSMTPD 5.4.4, as well as Debian 7.8 with OpenSMTPD 5.4.4p1 (no syntax error).
But the same file doesn't work on Debian 7.8 with OpenSMTPD 5.7.1p1:
```
# smtpd -d -v
/etc/smtpd.conf:11: syntax error
```
Reading OpenSMTPD 5.7.1p1 `smtpd.conf(5)` and `table(5)`, I can't see what is wrong here. Unless I'm mistaken, it looks like a regression, otherwise if it's a matter of a new syntax, this probably needs to be documented.
Thank you.
|
non_defect
|
table senders regression syntax error hi the following configuration files smtpd conf listen on or localhost on debian table aliases db etc mail aliases db or table aliases file etc mail aliases on debian table senders file etc mail senders accept for local alias deliver to mbox accept from local for any relay etc mail senders something example com work fine on openbsd with opensmtpd as well as debian with opensmtpd no syntax error but the same file doesn t work on debian with opensmtpd smtpd d v etc smtpd conf syntax error reading opensmtpd smtpd conf and table i can t see what is wrong here unless i m mistaken it looks like a regression otherwise if it s a matter of a new syntax this probably needs to be documented thank you
| 0
|
67,293
| 20,961,598,894
|
IssuesEvent
|
2022-03-27 21:46:51
|
abedmaatalla/imsdroid
|
https://api.github.com/repos/abedmaatalla/imsdroid
|
closed
|
Ekiga Account
|
Priority-Medium Type-Defect auto-migrated
|
```
What steps will reproduce the problem?
1.Ekiga Account
2.
3.
What is the expected output? What do you see instead?
I tried to configure the account of Ekiga on IMS droid. I tried to put my
account as follows:
Display Name : Aabhas Garg
Public Identity: aabhasgarg@ekiga.net
Private Identity : aabhasgarg
Password : -----
Realm: sip:ekiga.net
Uncheck 3GPP Early IMS Security
Check either Enable WiFi or 3G
Proxy CSCF Host:ekiga.net
Proxy-CSCF Port: 5060
Transport: UDP
Proxy-CSCF Discovery:None
Uncheck Enable SigComp
Return to Home screen and click the sign in and say " Trying to register" but
it did not work and say the message "UnRegistered: Dialog Terminated" " and
then again to return the home screen and tried to configure NATT as I suspected
:
Check Enable STUN/TURN
Check the radio button : Use this STUN/TURN Server
Server Address : stun.ekiga.net
Server Port: 3478
and then return to home screen again and click the sign in and say the same
message as above.
Please let me know if you have any idea.
Aabhas
What version of the product are you using? On what operating system?
Android EVO
Please provide any additional information below.
```
Original issue reported on code.google.com by `AabhasG...@gmail.com` on 5 Aug 2010 at 10:11
|
1.0
|
Ekiga Account - ```
What steps will reproduce the problem?
1.Ekiga Account
2.
3.
What is the expected output? What do you see instead?
I tried to configure the account of Ekiga on IMS droid. I tried to put my
account as follows:
Display Name : Aabhas Garg
Public Identity: aabhasgarg@ekiga.net
Private Identity : aabhasgarg
Password : -----
Realm: sip:ekiga.net
Uncheck 3GPP Early IMS Security
Check either Enable WiFi or 3G
Proxy CSCF Host:ekiga.net
Proxy-CSCF Port: 5060
Transport: UDP
Proxy-CSCF Discovery:None
Uncheck Enable SigComp
Return to Home screen and click the sign in and say " Trying to register" but
it did not work and say the message "UnRegistered: Dialog Terminated" " and
then again to return the home screen and tried to configure NATT as I suspected
:
Check Enable STUN/TURN
Check the radio button : Use this STUN/TURN Server
Server Address : stun.ekiga.net
Server Port: 3478
and then return to home screen again and click the sign in and say the same
message as above.
Please let me know if you have any idea.
Aabhas
What version of the product are you using? On what operating system?
Android EVO
Please provide any additional information below.
```
Original issue reported on code.google.com by `AabhasG...@gmail.com` on 5 Aug 2010 at 10:11
|
defect
|
ekiga account what steps will reproduce the problem ekiga account what is the expected output what do you see instead i tried to configure the account of ekiga on ims droid i tried to put my account as follows display name aabhas garg public identity aabhasgarg ekiga net private identity aabhasgarg password realm sip ekiga net uncheck early ims security check either enable wifi or proxy cscf host ekiga net proxy cscf port transport udp proxy cscf discovery none uncheck enable sigcomp return to home screen and click the sign in and say trying to register but it did not work and say the message unregistered dialog terminated and then again to return the home screen and tried to configure natt as i suspected check enable stun turn check the radio button use this stun turn server server address stun ekiga net server port and then return to home screen again and click the sign in and say the same message as above please let me know if you have any idea aabhas what version of the product are you using on what operating system android evo please provide any additional information below original issue reported on code google com by aabhasg gmail com on aug at
| 1
|
72,446
| 24,121,231,294
|
IssuesEvent
|
2022-09-20 18:54:35
|
department-of-veterans-affairs/va.gov-team
|
https://api.github.com/repos/department-of-veterans-affairs/va.gov-team
|
opened
|
FE | Profile | Screenreader Accessibility Issue | Account Security
|
508/Accessibility vsa authenticated-experience needs-grooming profile 508-defect-3 sprint-planning
|
## Background
The Accessibility team informed us that when navigating Profile by heading it is easy to skip all the main fields when using a screenreader. The recommend updating the headings to improve the page.
Recommendation from Angela: "You can put the email address change at the top under H1 under an H2 and then another that says completed tasks (H2) - or put the email part somewhere else and then get rid of the rest"
## Tasks
- [ ] Code Changes
- [ ] Conduct Unit tests
- [ ] Run E2E tests covering current code and regression
- [ ] All axe checks pass
- [ ] Deploy to Staging behind feature flag
## Acceptance Criteria
- [ ] Complete all unit testing
- [ ] End-to-end tests showing 0 violations
- [ ] Notify Product Manager that the change is ready to be validated on Staging
- [ ] Incorporate any changes resulting from Staging review & re-test & have re-validated
- [ ] Deploy changes to Production using feature flag
## Validation
- Who can validate this ticket? (FE, BE, Design, PM)?
- How can this work be validated?
- What updates need to be made (e.g. product outline, use cases, contact center guide)
|
1.0
|
FE | Profile | Screenreader Accessibility Issue | Account Security - ## Background
The Accessibility team informed us that when navigating Profile by heading it is easy to skip all the main fields when using a screenreader. The recommend updating the headings to improve the page.
Recommendation from Angela: "You can put the email address change at the top under H1 under an H2 and then another that says completed tasks (H2) - or put the email part somewhere else and then get rid of the rest"
## Tasks
- [ ] Code Changes
- [ ] Conduct Unit tests
- [ ] Run E2E tests covering current code and regression
- [ ] All axe checks pass
- [ ] Deploy to Staging behind feature flag
## Acceptance Criteria
- [ ] Complete all unit testing
- [ ] End-to-end tests showing 0 violations
- [ ] Notify Product Manager that the change is ready to be validated on Staging
- [ ] Incorporate any changes resulting from Staging review & re-test & have re-validated
- [ ] Deploy changes to Production using feature flag
## Validation
- Who can validate this ticket? (FE, BE, Design, PM)?
- How can this work be validated?
- What updates need to be made (e.g. product outline, use cases, contact center guide)
|
defect
|
fe profile screenreader accessibility issue account security background the accessibility team informed us that when navigating profile by heading it is easy to skip all the main fields when using a screenreader the recommend updating the headings to improve the page recommendation from angela you can put the email address change at the top under under an and then another that says completed tasks or put the email part somewhere else and then get rid of the rest tasks code changes conduct unit tests run tests covering current code and regression all axe checks pass deploy to staging behind feature flag acceptance criteria complete all unit testing end to end tests showing violations notify product manager that the change is ready to be validated on staging incorporate any changes resulting from staging review re test have re validated deploy changes to production using feature flag validation who can validate this ticket fe be design pm how can this work be validated what updates need to be made e g product outline use cases contact center guide
| 1
|
24,414
| 17,212,943,163
|
IssuesEvent
|
2021-07-19 07:53:02
|
azure-deprecation/infrastructure
|
https://api.github.com/repos/azure-deprecation/infrastructure
|
opened
|
Provide automated deployment for Azure infrastructure
|
help wanted infrastructure
|
Provide automated deployment for Azure infrastructure for every main commit by using GitHub Actions.
|
1.0
|
Provide automated deployment for Azure infrastructure - Provide automated deployment for Azure infrastructure for every main commit by using GitHub Actions.
|
non_defect
|
provide automated deployment for azure infrastructure provide automated deployment for azure infrastructure for every main commit by using github actions
| 0
|
6,131
| 2,583,429,060
|
IssuesEvent
|
2015-02-16 05:51:16
|
FWAJL/FieldWorkAssistantMVC
|
https://api.github.com/repos/FWAJL/FieldWorkAssistantMVC
|
closed
|
Run DB script for Feb-16 2015
|
priority:very high status:live version:1.6.4
|
Related to #487.
Update the unique constraints on field_analyte and lab_analyte tables to take into account the PM_ID.
The file: [Database/Scripts/Updates/2015-02-16.sql](https://github.com/FWAJL/FieldWorkAssistantMVC/blob/dev/Database/Scripts/Updates/2015-02-16.sql)
|
1.0
|
Run DB script for Feb-16 2015 - Related to #487.
Update the unique constraints on field_analyte and lab_analyte tables to take into account the PM_ID.
The file: [Database/Scripts/Updates/2015-02-16.sql](https://github.com/FWAJL/FieldWorkAssistantMVC/blob/dev/Database/Scripts/Updates/2015-02-16.sql)
|
non_defect
|
run db script for feb related to update the unique constraints on field analyte and lab analyte tables to take into account the pm id the file
| 0
|
55,659
| 14,620,342,923
|
IssuesEvent
|
2020-12-22 19:32:38
|
primefaces/primefaces
|
https://api.github.com/repos/primefaces/primefaces
|
closed
|
TextEditor: Wrong placement of resize handle when used with toolbarVisible=true inside a p:resizable
|
defect
|
## 1) Environment
- PrimeFaces version: 7.0.4
- Does it work on the newest released PrimeFaces version? No
- Does it work on the newest sources in GitHub? Not tested yet
- Application server + version: Tomcat 9
- Affected browsers: all
## 2) Expected behavior
The component should behave correct for both settings of the toolbarVisible attribute.
## 3) Actual behavior
The component does not behave correct for toolbarVisible=true.
## 4) Steps to reproduce
First of all: This works perfectly when the toolbarVisible attribute is set to false. But if set to true, the handle is initially placed at the right position, but once LMB is pressed the handle position suddenly jumps up by the amount of pixels corresponding to the height of the toolbar. The handle then stays inside the textEditor area.
## 5) Sample XHTML
```xml
<h:panelGroup layout="block">
<p:textEditor id="descriptionTextEditor" value="#{jobDetailsBean.description}"
toolbarVisible="true" />
<p:resizable for="descriptionTextEditor" minHeight="100" maxHeight="800" handles="s" />
</h:panelGroup>
```
## 6) Sample bean
Not needed here.
|
1.0
|
TextEditor: Wrong placement of resize handle when used with toolbarVisible=true inside a p:resizable - ## 1) Environment
- PrimeFaces version: 7.0.4
- Does it work on the newest released PrimeFaces version? No
- Does it work on the newest sources in GitHub? Not tested yet
- Application server + version: Tomcat 9
- Affected browsers: all
## 2) Expected behavior
The component should behave correct for both settings of the toolbarVisible attribute.
## 3) Actual behavior
The component does not behave correct for toolbarVisible=true.
## 4) Steps to reproduce
First of all: This works perfectly when the toolbarVisible attribute is set to false. But if set to true, the handle is initially placed at the right position, but once LMB is pressed the handle position suddenly jumps up by the amount of pixels corresponding to the height of the toolbar. The handle then stays inside the textEditor area.
## 5) Sample XHTML
```xml
<h:panelGroup layout="block">
<p:textEditor id="descriptionTextEditor" value="#{jobDetailsBean.description}"
toolbarVisible="true" />
<p:resizable for="descriptionTextEditor" minHeight="100" maxHeight="800" handles="s" />
</h:panelGroup>
```
## 6) Sample bean
Not needed here.
|
defect
|
texteditor wrong placement of resize handle when used with toolbarvisible true inside a p resizable environment primefaces version does it work on the newest released primefaces version no does it work on the newest sources in github not tested yet application server version tomcat affected browsers all expected behavior the component should behave correct for both settings of the toolbarvisible attribute actual behavior the component does not behave correct for toolbarvisible true steps to reproduce first of all this works perfectly when the toolbarvisible attribute is set to false but if set to true the handle is initially placed at the right position but once lmb is pressed the handle position suddenly jumps up by the amount of pixels corresponding to the height of the toolbar the handle then stays inside the texteditor area sample xhtml xml p texteditor id descriptiontexteditor value jobdetailsbean description toolbarvisible true sample bean not needed here
| 1
|
50,951
| 13,187,995,102
|
IssuesEvent
|
2020-08-13 05:15:05
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
closed
|
building multiple copies of HTML docs (Trac #1723)
|
Migrated from Trac defect other
|
with the addition of sphinx-apidoc in r2580/IceTray we're now building two copies of the docs that step on each other and barf warnings everywhere
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1723">https://code.icecube.wisc.edu/ticket/1723</a>, reported by nega and owned by kjmeagher</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-06-09T14:50:36",
"description": "with the addition of sphinx-apidoc in r2580/IceTray we're now building two copies of the docs that step on each other and barf warnings everywhere",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1465483836277867",
"component": "other",
"summary": "building multiple copies of HTML docs",
"priority": "normal",
"keywords": "documentation sphinx html",
"time": "2016-06-01T21:10:22",
"milestone": "",
"owner": "kjmeagher",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
building multiple copies of HTML docs (Trac #1723) - with the addition of sphinx-apidoc in r2580/IceTray we're now building two copies of the docs that step on each other and barf warnings everywhere
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1723">https://code.icecube.wisc.edu/ticket/1723</a>, reported by nega and owned by kjmeagher</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-06-09T14:50:36",
"description": "with the addition of sphinx-apidoc in r2580/IceTray we're now building two copies of the docs that step on each other and barf warnings everywhere",
"reporter": "nega",
"cc": "",
"resolution": "fixed",
"_ts": "1465483836277867",
"component": "other",
"summary": "building multiple copies of HTML docs",
"priority": "normal",
"keywords": "documentation sphinx html",
"time": "2016-06-01T21:10:22",
"milestone": "",
"owner": "kjmeagher",
"type": "defect"
}
```
</p>
</details>
|
defect
|
building multiple copies of html docs trac with the addition of sphinx apidoc in icetray we re now building two copies of the docs that step on each other and barf warnings everywhere migrated from json status closed changetime description with the addition of sphinx apidoc in icetray we re now building two copies of the docs that step on each other and barf warnings everywhere reporter nega cc resolution fixed ts component other summary building multiple copies of html docs priority normal keywords documentation sphinx html time milestone owner kjmeagher type defect
| 1
|
14,133
| 2,789,928,823
|
IssuesEvent
|
2015-05-08 22:29:13
|
google/google-visualization-api-issues
|
https://api.github.com/repos/google/google-visualization-api-issues
|
opened
|
ScatterChart curveType:'function' fails with undefined object
|
Priority-Medium Type-Defect
|
Original [issue 545](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=545) created by orwant on 2011-03-10T01:31:08.000Z:
<b>What steps will reproduce the problem? Please provide a link to a</b>
<b>demonstration page if at all possible, or attach code.</b>
1. Use the ScatterChart example [1] as a base.
2. Add curveType: 'function' to the draw call.
3. Opera will report "Cannot convert 'this.ra' to object", Chrome similarly "Cannot call method 'S' of undefined".
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
ScatterChart
<b>Are you using the test environment (version 1.1)?</b>
<b>(If you are not sure, answer NO)</b>
NO
<b>What operating system and browser are you using?</b>
Seen under both Windows 7, Opera 11.01/Chrome 9 and Linux Opera/10.63, Chrome 10
JS stacktrace from Windows7/Opera attached.
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
|
1.0
|
ScatterChart curveType:'function' fails with undefined object - Original [issue 545](https://code.google.com/p/google-visualization-api-issues/issues/detail?id=545) created by orwant on 2011-03-10T01:31:08.000Z:
<b>What steps will reproduce the problem? Please provide a link to a</b>
<b>demonstration page if at all possible, or attach code.</b>
1. Use the ScatterChart example [1] as a base.
2. Add curveType: 'function' to the draw call.
3. Opera will report "Cannot convert 'this.ra' to object", Chrome similarly "Cannot call method 'S' of undefined".
<b>What component is this issue related to (PieChart, LineChart, DataTable,</b>
<b>Query, etc)?</b>
ScatterChart
<b>Are you using the test environment (version 1.1)?</b>
<b>(If you are not sure, answer NO)</b>
NO
<b>What operating system and browser are you using?</b>
Seen under both Windows 7, Opera 11.01/Chrome 9 and Linux Opera/10.63, Chrome 10
JS stacktrace from Windows7/Opera attached.
<b>*********************************************************</b>
<b>For developers viewing this issue: please click the 'star' icon to be</b>
<b>notified of future changes, and to let us know how many of you are</b>
<b>interested in seeing it resolved.</b>
<b>*********************************************************</b>
|
defect
|
scatterchart curvetype function fails with undefined object original created by orwant on what steps will reproduce the problem please provide a link to a demonstration page if at all possible or attach code use the scatterchart example as a base add curvetype function to the draw call opera will report quot cannot convert this ra to object quot chrome similarly quot cannot call method s of undefined quot what component is this issue related to piechart linechart datatable query etc scatterchart are you using the test environment version if you are not sure answer no no what operating system and browser are you using seen under both windows opera chrome and linux opera chrome js stacktrace from opera attached for developers viewing this issue please click the star icon to be notified of future changes and to let us know how many of you are interested in seeing it resolved
| 1
|
62,950
| 17,267,620,871
|
IssuesEvent
|
2021-07-22 15:29:35
|
idaholab/moose
|
https://api.github.com/repos/idaholab/moose
|
opened
|
Execute_on FINAL returns all 0s with PerProcessorRayTracingResultsVectorPostprocessor
|
P: normal T: defect
|
## Bug Description
Full on 0s for PerProcessorRayTracingResultsVectorPostprocessor
## Steps to Reproduce
Go to ray_tracing/test/tests/vector_postprocessors/per_processor_ray_tracing_results_vector_postprocessor and try it out
## Impact
Temporary confusion. Nothing major
|
1.0
|
Execute_on FINAL returns all 0s with PerProcessorRayTracingResultsVectorPostprocessor - ## Bug Description
Full on 0s for PerProcessorRayTracingResultsVectorPostprocessor
## Steps to Reproduce
Go to ray_tracing/test/tests/vector_postprocessors/per_processor_ray_tracing_results_vector_postprocessor and try it out
## Impact
Temporary confusion. Nothing major
|
defect
|
execute on final returns all with perprocessorraytracingresultsvectorpostprocessor bug description full on for perprocessorraytracingresultsvectorpostprocessor steps to reproduce go to ray tracing test tests vector postprocessors per processor ray tracing results vector postprocessor and try it out impact temporary confusion nothing major
| 1
|
16,900
| 23,263,118,599
|
IssuesEvent
|
2022-08-04 14:59:59
|
haubna/PhysicsMod
|
https://api.github.com/repos/haubna/PhysicsMod
|
closed
|
Bug with POM shader packs
|
compatibility
|
**Describe the bug**
Bug with POM shader packs
**To Reproduce**
Enable shader pack and setup POM (I used chaoptic13 v9)
Block and mobs physics works buggy
**Screenshots**


Minecraft Version: 1.18
Physics Mod Version: v19
|
True
|
Bug with POM shader packs - **Describe the bug**
Bug with POM shader packs
**To Reproduce**
Enable shader pack and setup POM (I used chaoptic13 v9)
Block and mobs physics works buggy
**Screenshots**


Minecraft Version: 1.18
Physics Mod Version: v19
|
non_defect
|
bug with pom shader packs describe the bug bug with pom shader packs to reproduce enable shader pack and setup pom i used block and mobs physics works buggy screenshots minecraft version physics mod version
| 0
|
130,585
| 27,725,185,362
|
IssuesEvent
|
2023-03-15 01:14:17
|
phetsims/calculus-grapher
|
https://api.github.com/repos/phetsims/calculus-grapher
|
closed
|
Code review
|
dev:code-review
|
---
# PhET Code-Review Checklist (a.k.a "CRC")
* Irrelevant items have been crossed off using ~~strikethrough text~~.
* See **NOTE** comments for additional info that is specific to this sim.
* A checked-off item doesn't mean "no problem here", it means "it was reviewed".
* Problems can be noted in side issues that reference this issue, or through `// REVIEW` comments in the code
## Specific Instructions
We would like to being dev testing by Monday, March 20, and we will need a few days to address review feedback. So this review needs to be **completed by end-of-day Wednesday, March 15**. If you do not think that is possible, please contact @kathy-phet or @pixelzoom ASAP to discuss.
If you have questions, or would like someone to tag along during the code review, please contact @pixelzoom or @veillette on Slack.
## GitHub Issues
The following standard GitHub issues should exist.
⚠️ **NOTE: These issues exist, but have not been completed. Please continue with the code review anyway, and we will be addressing them in parallel.**
- [ ] model.md, see https://github.com/phetsims/calculus-grapher/issues/263. Familiarize yourself with the model by reading model.md. Does it adequately describe the model, in terms appropriate for teachers? Has it been reviewed by the sim designer? **NOTE: model.md is currently incomplete, but is worth skimming.**
- [ ] implementation-notes.md, see https://github.com/phetsims/calculus-grapher/issues/264. Familiarize yourself with the implementation by reading implementation-notes.md. Does it provide an overview that will be useful to future maintainers? **NOTE: implementation-notes.md is currently incomplete, but is worth skimming. TransformedCurve.ts is where most of the interesting stuff lives.**
- [ ] results of memory testing for `brands=phet`, see https://github.com/phetsims/calculus-grapher/issues/265
- [ ] results of memory testing for `brands=phet-io` (if the sim is instrumented for PhET-iO), see https://github.com/phetsims/calculus-grapher/issues/265
- [ ] performance testing and sign-off, see https://github.com/phetsims/calculus-grapher/issues/266
- [ ] review of pointer areas, see https://github.com/phetsims/calculus-grapher/issues/267
- [ ] credits (will not be completed until after RC testing), see https://github.com/phetsims/calculus-grapher/issues/202
## **Build and Run Checks**
If any of these items fail, pause code review.
- [x] Does the sim build without warnings or errors?
- [ ] Does the html file size seem reasonable, compared to other similar sims?
- [x] Does the sim start up? (unbuilt and built versions)
- [x] Does the sim experience any assertion failures? (run with query parameter `ea`)
- [x] Does the sim pass a scenery fuzz test? (run with query parameters `fuzz&ea`)
- [x] Does the sim behave correctly when listener order is shuffled? (run with query parameters `ea&shuffleListeners` and `ea&shuffleListeners&fuzz`)
- [x] Does the sim output any deprecation warnings? Run with `?deprecationWarnings`. Do not use deprecated methods in new code.
## **Memory Leaks**
- [ ] ~~Does a heap comparison using Chrome Developer Tools indicate a memory leak? (This process is described [here](https://github.com/phetsims/QA/blob/master/documentation/qa-book.md#47-memory-leak-testing).) Test on a version built using `grunt --minify.mangle=false`. Compare to testing results done by the responsible developer. Results can be found in https://github.com/phetsims/calculus-grapher/issues/265~~
- [x] For each common-code component (sun, scenery-phet, vegas, …) that opaquely registers observers or listeners, is
there a call to that component’s `dispose` function, or is it obvious why it isn't necessary, or is there documentation
about why `dispose` isn't called? An example of why no call to `dispose` is needed is if the component is used in
a `ScreenView` that would never be removed from the scene graph. Note that it's also acceptable (and encouraged!) to describe what needs to be disposed in implementation-notes.md.
- [ ] Are there leaks due to registering observers or listeners? The following guidelines should be followed unless documentation (in-line or in implementation-notes.md) describes why following them is not necessary.
* AXON: `Property.link` or `lazyLink` is accompanied by `unlink`.
* AXON: `Multilink.multilink` is accompanied by `unmultilink`.
* AXON: Creation of `Multilink` is accompanied by `dispose`.
* AXON: Creation of `DerivedProperty` is accompanied by `dispose`.
* AXON: `Emitter.addListener` is accompanied by `removeListener`.
* AXON: `ObservableArrayDef.element*Emitter.addListener` is accompanied by `ObservableArrayDef.element*Emitter.removeListener`
* SCENERY: `Node.addInputListener` is accompanied by `removeInputListener`
* TANDEM: Creation of an instrumented `PhetioObject` is accompanied by `dispose`.
- [x] Do all types that require a `dispose` function have one? This should expose a public `dispose` function that calls `this.disposeMyType()`, where `disposeMyType` is a private function declared in the constructor. `MyType` should exactly match the filename.
## **Performance**
- [x] Play with sim, identify any obvious performance issues. Examples: animation that slows down with large numbers of objects; animation that pauses or "hitches" during garbage collection.
- [ ] ~~If the sim uses WebGL, does it have a fallback? Does the fallback perform reasonably well? (run with query parameter `webgl=false`)~~
## **Usability**
- [x] Are UI components sufficiently responsive? (especially continuous UI components, such as sliders)
- [x] Are pointer areas optimized, especially for touch? (run with query parameter `showPointerAreas`)
- [x] Do pointer areas overlap? (run with query parameter `showPointerAreas`) Overlap may be OK in some cases, depending on the z-ordering (if the front-most object is supposed to occlude pointer areas) and whether objects can be moved. **NOTE: There is one place where a bit of overlap is acceptable. See https://github.com/phetsims/calculus-grapher/issues/267#issuecomment-1459197100**
## **Internationalization**
- [x] Are there any strings that are not internationalized, and does the sim layout gracefully handle internationalized strings that are shorter than the English strings? (run with query parameter `stringTest=X`. You should see nothing but 'X' strings.)
- [x] Does the sim layout gracefully handle internationalized strings that are longer than the English strings? (run with query parameters `stringTest=double` and `stringTest=long`)
- [x] Does the sim stay on the sim page (doesn't redirect to an external page) when running with the query parameter
`stringTest=xss`? This test passes if sim does not redirect, OK if sim crashes or fails to fully start. Only test on one
desktop platform. For PhET-iO sims, additionally test `?stringTest=xss` in Studio to make sure i18n strings didn't leak
to phetioDocumentation, see https://github.com/phetsims/phet-io/issues/1377
- [x] Avoid using concatenation to create strings that will be visible in the user interface. Use `StringUtils.fillIn` and a string pattern to ensure that strings are properly localized.
- [x] Use named placeholders (e.g. `"{{value}} {{units}}"`) instead of numbered placeholders (e.g. `"{0} {1}"`). **NOTE: There is only one of these. See `"predictPreferenceDescription"` in calculus-grapher-strings_en.json.**
- [x] Make sure the string keys are all perfect, because they are difficult to change after 1.0.0 is published. Guidelines for string keys are:
(1) Strings keys should generally match their values. E.g.:
```js
"helloWorld": {
value: "Hello World!"
},
"quadraticTerms": {
value: "Quadratic Terms"
}
```
(2) If a string key would be exceptionally long, use a key name that is an abbreviated form of the string value, or that captures the purpose/essence of the value. E.g.:
```js
// key is abbreviated
"iWentToTheStore": {
value: "I went to the store to get milk, eggs, butter, and sugar."
},
// key is based on purpose
"describeTheScreen": {
value: "The Play Area is a small room. The Control Panel has buttons, a checkbox, and radio buttons to change conditions in the room."
}
```
(3) If string key names would collide, use your judgment to disambiguate. E.g.:
```js
"simplifyTitle": {
value: "Simplify!"
},
"simplifyCheckbox": {
value: "simplify"
}
```
(4) String keys for screen names should have the general form `"screen.{{screenName}}"`. E.g.:
```js
"screen.explore": {
"value": "Explore"
},
```
(5) String patterns that contain placeholders (e.g. `"My name is {{first}} {{last}}"`) should use keys that are unlikely to conflict with strings that might be needed in the future. For example, for `"{{price}}"` consider using key `"pricePattern"` instead of `"price"`, if you think there might be a future need for a `"price"` string.
(6) It is acceptable to prefix families of strings with a prefix, like so:
```json
"material.water": {
"value": "Water"
},
"material.wood": {
"value": "Wood"
},
"shape.block": {
"value": "Block"
},
"shape.cone": {
"value": "Cone"
},
```
Nested substructure is not yet fully supported.
- [ ] ~~If the sim was already released, make sure none of the original string keys have changed. If they have changed, make sure any changes have a good reason and have been discussed with @jbphet (it is likely that an issue like https://github.com/phetsims/gravity-force-lab/issues/166 should be created).~~
## **Repository Structure**
- [x] The repository name should correspond to the sim title. For example, if the sim title is "Wave Interference", then the repository name should be "wave-interference".
- [x] Are all required files and directories present?
For a sim repository named “my-repo”, the general structure should look like this (where assets/, images/, mipmaps/ or sounds/ may be omitted if the sim doesn’t have those types of resource files). **NOTE that this sim has no image/, mipmaps/, or sounds/.**
```
my-repo/
assets/
doc/
images/
*see annotation
model.md
implementation-notes.md
images/
license.json
js/
(see section below)
mipmaps/
license.json
sound/
license.json
dependencies.json
.gitignore
my-repo_en.html
my-repo-strings_en.json
Gruntfile.js
LICENSE
package.json
README.md
```
*~~Any images used in model.md or implementation-notes.md should be added here. Images specific to aiding with documentation do not need their own license.~~
- [ ] ~~Verify that the same image file is not present in both images/ and mipmaps/. If you need a mipmap, use it for all occurrences of the image.~~
- [x] Is the js/ directory properly structured?
All JavaScript source should be in the js/ directory. There should be a subdirectory for each screen (this also applies for single-screen sims, where the subdirectory matches the repo name). For a multi-screen sim, code shared by 2 or more screens should be in a js/common/ subdirectory. Model and view code should be in model/ and view/ subdirectories for each screen and common/. For example, for a sim with screens “Introduction” and “Lab”, the general directory structure should look like this:
```
my-repo/
js/
common/
model/
view/
introduction/
model/
view/
lab/
model/
view/
my-repo-main.js
myRepo.js
myRepoStrings.js
```
- [x] Do filenames use an appropriate prefix? Some filenames may be prefixed with the repository name,
e.g. `MolarityConstants.js` in molarity. If the repository name is long, the developer may choose to abbreviate the
repository name, e.g. `EEConstants.js` in expression-exchange. If the abbreviation is already used by another
repository, then the full name must be used. For example, if the "EE" abbreviation is already used by
expression-exchange, then it should not be used in equality-explorer. Whichever convention is used, it should be used
consistently within a repository - don't mix abbreviations and full names.
- [ ] ~~Is there a file in assets/ for every resource file in sound/ and images/? Note that there is *not necessarily* a
1:1 correspondence between asset and resource files; for example, several related images may be in the same .ai file.
Check license.json for possible documentation of why some resources might not have a corresponding asset file.~~
- [x] For simulations, was the README.md generated by `grunt published-README` or `grunt unpublished-README`? Common
code repos can have custom README files.
- [x] Does package.json refer to any dependencies that are not used by the sim?
- [x] Is the LICENSE file correct? (Generally GPL v3 for sims and MIT for common code,
see [this thread](https://github.com/phetsims/tasks/issues/875#issuecomment-312168646) for additional information).
- [x] Does .gitignore match the one in simula-rasa?
- [ ] ~~In GitHub, verify that all non-release branches have an associated issue that describes their purpose.~~
- [ ] ~~Are there any GitHub branches that are no longer needed and should be deleted?~~
- [x] Sim-specific query parameters (if any) should be identified and documented in one .js file in js/common/ or js/ (
if there is no common/). The .js file should be named `{{PREFIX}}QueryParameters.js`, for example
ArithmeticQueryParameters.js for the arithmetic repository, or FBQueryParameters.js for Function Builder (where
the `FB` prefix is used).
- [x] Query parameters that are public-facing should be identified using `public: true` in the schema.
- [x] All sims should use a color file named `MyRepoColors.js` or, if using abbreviations, `MRColors.js`, and
use `ProfileColorProperty` where appropriate, even if they have a single (default) profile (to support color editing
and PhET-iO Studio). The `ColorProfile` pattern was converted to `*Colors.js` files in
https://github.com/phetsims/scenery-phet/issues/515. Please see
[GasPropertiesColors.js](https://github.com/phetsims/gas-properties/blob/master/js/common/GasPropertiesColors.js)
for a good example.
## **Coding Conventions**
- [x] Are coding conventions outlined in [PhET's Coding Conventions Document](https://github.com/phetsims/phet-info/blob/master/doc/coding-conventions.md) followed and adhered to? This document
deals with PhET coding conventions. You do not need to exhaustively check every item in this section, nor do you
necessarily need to check these items one at a time. The goal is to determine whether the code generally meets PhET standards.
## **TypeScript Conventions**
- [x] Are TypeScript conventions outlined in the [TypeScript Conventions Document](https://github.com/phetsims/phet-info/blob/master/doc/typescript-conventions.md) followed and adhered to?
## **Math Libraries**
- [x] `DOT/Utils.toFixed` or `DOT/Utils.toFixedNumber` should be used instead of `toFixed`. JavaScript's `toFixed` is notoriously buggy. Behavior differs depending on browser, because the spec doesn't specify whether to round or floor.
## IE11
- [ ] ~~IE is no longer supported. With that in mind remove IE-specific workarounds~~
- [ ] ~~Use `string.includes` and `string.startsWith` where possible.~~
## **Organization, Readability, and Maintainability**
- [x] Does the organization and structure of the code make sense? Do the model and view contain types that you would expect (or guess!) by looking at the sim? Do the names of things correspond to the names that you see in the user interface?
- [x] Are appropriate design patterns used? See [phet-software-design-patterns.md](https://github.com/phetsims/phet-info/blob/master/doc/phet-software-design-patterns.md). If new or inappropriate patterns are identified, create an issue.
- [x] Is inheritance used where appropriate? Does the type hierarchy make sense?
- [x] Is composition favored over inheritance where appropriate? See https://en.wikipedia.org/wiki/Composition_over_inheritance.
- [x] Is there any unnecessary coupling? (e.g., by passing large objects to constructors, or exposing unnecessary properties/functions). In TypeScript, you can decouple by narrowing the API like so:
```ts
public constructor( tickMarksVisibleProperty: Property<boolean>,
model: Pick<IntroModel, 'changeWaterLevel'>, // <-- Note the call site can pass the whole model, but we declare we will only use this part of it
waterCup: WaterCup, modelViewTransform: ModelViewTransform2,
providedOptions?: WaterCup3DNodeOptions ) {
```
- [x] Is there too much unnecessary decoupling? (e.g. by passing all of the properties of an object independently instead of passing the object itself)?
- [x] Are the source files reasonable in size? Scrutinize large files with too many responsibilities - can responsibilities be broken into smaller delegates? **NOTE: 84 of 87 files are under 300 lines. TransformedCurve.ts is justifiably the largest file (720 lines) because that's where all of the curve manipulation lives. To see file sizes, run this shell command:**
```
cd calculus-grapher/js ; wc -l `find . -name "*.ts" -print` | sort
```
- [x] Are any significant chunks of code duplicated? In addition to manual identification, tools include: WebStorm _Code > Analyze Code > Locate Duplicates_ and https://github.com/danielstjules/jsinspect. **NOTE: WebStorm _Code > Analyze Code > Locate Duplicates_ identifies a handful of duplicate lines of code, but no "significant chunks". I recommend "do nothing" about these. But feel free to inspect and recommend otherwise.**
- [x] Is there anything that should be generalized and migrated to common code?
- [x] Are there any `TODO` or `FIXME` or `REVIEW` comments in the code? They should be addressed or promoted to GitHub issues.
- [x] Are there any [magic numbers](https://en.wikipedia.org/wiki/Magic_number_(programming)) that should be factored out as constants and documented?
- [x] Are there any constants that are duplicated in multiple files that should be factored out into a `{{REPO}}Constants.js` file?
- [x] Does the implementation rely on any specific constant values that are likely to change in the future? Identify constants that might be changed in the future. (Use your judgement about which constants are likely candidates.) Does changing the values of these constants break the sim? For example, see https://github.com/phetsims/plinko-probability/issues/84.
- [x] Is [PhetColorScheme](https://github.com/phetsims/scenery-phet/blob/master/js/PhetColorScheme.ts) used where appropriate? Verify that the sim is not inventing/creating its own colors for things that have been standardized in `PhetColorScheme`. Identify any colors that might be worth adding to `PhetColorScheme`.
- [x] Are all dependent Properties modeled as `DerivedProperty` instead of `Property`?
- [x] All dynamics should be called from Sim.step(dt), do not use window.setTimeout or window.setInterval. This will help support Legends of Learning and PhET-iO.
## **Accessibility**
Not supported.
## **PhET-iO**
This section may be omitted if the sim has not been instrumented for PhET-iO, but is likely good to glance at no matter.
- [x] Does instrumentation follow the conventions described in [PhET-iO Instrumentation Guide](https://github.com/phetsims/phet-io/blob/master/doc/phet-io-instrumentation-technical-guide.md)?
This could be an extensive bullet. At the very least, be sure to know what amount of instrumentation this sim
supports. Describing this further goes beyond the scope of this document.
- [ ] PhET-iO instantiates different objects and wires up listeners that are not present in the PhET-branded simulation.
It needs to be tested separately for memory leaks. To help isolate the nature of the memory leak, this test should
be run separately from the PhET brand memory leak test. Test with a colorized Data Stream, and Studio (easily
accessed from phetmarks). Compare to testing results done by the responsible developer and previous releases.
- [ ] ~~Make sure unused `PhetioObject` instances are disposed, which unregisters their tandems.~~
- [x] Make sure JOIST `dt` values are used instead of `Date.now()` or other Date functions. Perhaps try
`phet.joist.elapsedTime`. Though this has already been mentioned, it is necessary for reproducible playback via input
events and deserves a comment in this PhET-iO section.
- [ ] ~~Are random numbers using `DOT/dotRandom` as an imported module (not a global), and all doing so after modules are declared (non-statically)? For
example, the following methods (and perhaps others) should not be used: `Math.random`, `_.shuffle`, `_.sample`, `_.random`. This also deserves re-iteration due to its effect on record/playback for PhET-iO.~~
- [x] Like JSON, keys for `undefined` values are omitted when serializing objects across frames. Consider this when
determining whether `toStateObject` should use `null` or `undefined` values.
- [x] PhET prefers to use the term "position" to refer to the physical (x,y) position of objects. This applies to both
brands, but is more important for the PhET-iO API. See https://github.com/phetsims/phet-info/issues/126
- [ ] Are your IOType state methods violating the API of the core type by accessing private fields? **NOTE: This sim has only 2 custom IOTypes: `CurvePoint.CurvePointIO` and `GraphSet.GraphSetIO`.**
- [ ] ~~When defining a boolean Property to indicate whether something is enabled, use `AXON/EnabledProperty`. This
should be done in both the model and the view. If you're using a DerivedProperty, skip this item.~~
- [x] Do not use translated strings in `phetioDocumentaton` - it changes the PhET-iO API!
|
1.0
|
Code review - ---
# PhET Code-Review Checklist (a.k.a "CRC")
* Irrelevant items have been crossed off using ~~strikethrough text~~.
* See **NOTE** comments for additional info that is specific to this sim.
* A checked-off item doesn't mean "no problem here", it means "it was reviewed".
* Problems can be noted in side issues that reference this issue, or through `// REVIEW` comments in the code
## Specific Instructions
We would like to being dev testing by Monday, March 20, and we will need a few days to address review feedback. So this review needs to be **completed by end-of-day Wednesday, March 15**. If you do not think that is possible, please contact @kathy-phet or @pixelzoom ASAP to discuss.
If you have questions, or would like someone to tag along during the code review, please contact @pixelzoom or @veillette on Slack.
## GitHub Issues
The following standard GitHub issues should exist.
⚠️ **NOTE: These issues exist, but have not been completed. Please continue with the code review anyway, and we will be addressing them in parallel.**
- [ ] model.md, see https://github.com/phetsims/calculus-grapher/issues/263. Familiarize yourself with the model by reading model.md. Does it adequately describe the model, in terms appropriate for teachers? Has it been reviewed by the sim designer? **NOTE: model.md is currently incomplete, but is worth skimming.**
- [ ] implementation-notes.md, see https://github.com/phetsims/calculus-grapher/issues/264. Familiarize yourself with the implementation by reading implementation-notes.md. Does it provide an overview that will be useful to future maintainers? **NOTE: implementation-notes.md is currently incomplete, but is worth skimming. TransformedCurve.ts is where most of the interesting stuff lives.**
- [ ] results of memory testing for `brands=phet`, see https://github.com/phetsims/calculus-grapher/issues/265
- [ ] results of memory testing for `brands=phet-io` (if the sim is instrumented for PhET-iO), see https://github.com/phetsims/calculus-grapher/issues/265
- [ ] performance testing and sign-off, see https://github.com/phetsims/calculus-grapher/issues/266
- [ ] review of pointer areas, see https://github.com/phetsims/calculus-grapher/issues/267
- [ ] credits (will not be completed until after RC testing), see https://github.com/phetsims/calculus-grapher/issues/202
## **Build and Run Checks**
If any of these items fail, pause code review.
- [x] Does the sim build without warnings or errors?
- [ ] Does the html file size seem reasonable, compared to other similar sims?
- [x] Does the sim start up? (unbuilt and built versions)
- [x] Does the sim experience any assertion failures? (run with query parameter `ea`)
- [x] Does the sim pass a scenery fuzz test? (run with query parameters `fuzz&ea`)
- [x] Does the sim behave correctly when listener order is shuffled? (run with query parameters `ea&shuffleListeners` and `ea&shuffleListeners&fuzz`)
- [x] Does the sim output any deprecation warnings? Run with `?deprecationWarnings`. Do not use deprecated methods in new code.
## **Memory Leaks**
- [ ] ~~Does a heap comparison using Chrome Developer Tools indicate a memory leak? (This process is described [here](https://github.com/phetsims/QA/blob/master/documentation/qa-book.md#47-memory-leak-testing).) Test on a version built using `grunt --minify.mangle=false`. Compare to testing results done by the responsible developer. Results can be found in https://github.com/phetsims/calculus-grapher/issues/265~~
- [x] For each common-code component (sun, scenery-phet, vegas, …) that opaquely registers observers or listeners, is
there a call to that component’s `dispose` function, or is it obvious why it isn't necessary, or is there documentation
about why `dispose` isn't called? An example of why no call to `dispose` is needed is if the component is used in
a `ScreenView` that would never be removed from the scene graph. Note that it's also acceptable (and encouraged!) to describe what needs to be disposed in implementation-notes.md.
- [ ] Are there leaks due to registering observers or listeners? The following guidelines should be followed unless documentation (in-line or in implementation-notes.md) describes why following them is not necessary.
* AXON: `Property.link` or `lazyLink` is accompanied by `unlink`.
* AXON: `Multilink.multilink` is accompanied by `unmultilink`.
* AXON: Creation of `Multilink` is accompanied by `dispose`.
* AXON: Creation of `DerivedProperty` is accompanied by `dispose`.
* AXON: `Emitter.addListener` is accompanied by `removeListener`.
* AXON: `ObservableArrayDef.element*Emitter.addListener` is accompanied by `ObservableArrayDef.element*Emitter.removeListener`
* SCENERY: `Node.addInputListener` is accompanied by `removeInputListener`
* TANDEM: Creation of an instrumented `PhetioObject` is accompanied by `dispose`.
- [x] Do all types that require a `dispose` function have one? This should expose a public `dispose` function that calls `this.disposeMyType()`, where `disposeMyType` is a private function declared in the constructor. `MyType` should exactly match the filename.
## **Performance**
- [x] Play with sim, identify any obvious performance issues. Examples: animation that slows down with large numbers of objects; animation that pauses or "hitches" during garbage collection.
- [ ] ~~If the sim uses WebGL, does it have a fallback? Does the fallback perform reasonably well? (run with query parameter `webgl=false`)~~
## **Usability**
- [x] Are UI components sufficiently responsive? (especially continuous UI components, such as sliders)
- [x] Are pointer areas optimized, especially for touch? (run with query parameter `showPointerAreas`)
- [x] Do pointer areas overlap? (run with query parameter `showPointerAreas`) Overlap may be OK in some cases, depending on the z-ordering (if the front-most object is supposed to occlude pointer areas) and whether objects can be moved. **NOTE: There is one place where a bit of overlap is acceptable. See https://github.com/phetsims/calculus-grapher/issues/267#issuecomment-1459197100**
## **Internationalization**
- [x] Are there any strings that are not internationalized, and does the sim layout gracefully handle internationalized strings that are shorter than the English strings? (run with query parameter `stringTest=X`. You should see nothing but 'X' strings.)
- [x] Does the sim layout gracefully handle internationalized strings that are longer than the English strings? (run with query parameters `stringTest=double` and `stringTest=long`)
- [x] Does the sim stay on the sim page (doesn't redirect to an external page) when running with the query parameter
`stringTest=xss`? This test passes if sim does not redirect, OK if sim crashes or fails to fully start. Only test on one
desktop platform. For PhET-iO sims, additionally test `?stringTest=xss` in Studio to make sure i18n strings didn't leak
to phetioDocumentation, see https://github.com/phetsims/phet-io/issues/1377
- [x] Avoid using concatenation to create strings that will be visible in the user interface. Use `StringUtils.fillIn` and a string pattern to ensure that strings are properly localized.
- [x] Use named placeholders (e.g. `"{{value}} {{units}}"`) instead of numbered placeholders (e.g. `"{0} {1}"`). **NOTE: There is only one of these. See `"predictPreferenceDescription"` in calculus-grapher-strings_en.json.**
- [x] Make sure the string keys are all perfect, because they are difficult to change after 1.0.0 is published. Guidelines for string keys are:
(1) Strings keys should generally match their values. E.g.:
```js
"helloWorld": {
value: "Hello World!"
},
"quadraticTerms": {
value: "Quadratic Terms"
}
```
(2) If a string key would be exceptionally long, use a key name that is an abbreviated form of the string value, or that captures the purpose/essence of the value. E.g.:
```js
// key is abbreviated
"iWentToTheStore": {
value: "I went to the store to get milk, eggs, butter, and sugar."
},
// key is based on purpose
"describeTheScreen": {
value: "The Play Area is a small room. The Control Panel has buttons, a checkbox, and radio buttons to change conditions in the room."
}
```
(3) If string key names would collide, use your judgment to disambiguate. E.g.:
```js
"simplifyTitle": {
value: "Simplify!"
},
"simplifyCheckbox": {
value: "simplify"
}
```
(4) String keys for screen names should have the general form `"screen.{{screenName}}"`. E.g.:
```js
"screen.explore": {
"value": "Explore"
},
```
(5) String patterns that contain placeholders (e.g. `"My name is {{first}} {{last}}"`) should use keys that are unlikely to conflict with strings that might be needed in the future. For example, for `"{{price}}"` consider using key `"pricePattern"` instead of `"price"`, if you think there might be a future need for a `"price"` string.
(6) It is acceptable to prefix families of strings with a prefix, like so:
```json
"material.water": {
"value": "Water"
},
"material.wood": {
"value": "Wood"
},
"shape.block": {
"value": "Block"
},
"shape.cone": {
"value": "Cone"
},
```
Nested substructure is not yet fully supported.
- [ ] ~~If the sim was already released, make sure none of the original string keys have changed. If they have changed, make sure any changes have a good reason and have been discussed with @jbphet (it is likely that an issue like https://github.com/phetsims/gravity-force-lab/issues/166 should be created).~~
## **Repository Structure**
- [x] The repository name should correspond to the sim title. For example, if the sim title is "Wave Interference", then the repository name should be "wave-interference".
- [x] Are all required files and directories present?
For a sim repository named “my-repo”, the general structure should look like this (where assets/, images/, mipmaps/ or sounds/ may be omitted if the sim doesn’t have those types of resource files). **NOTE that this sim has no image/, mipmaps/, or sounds/.**
```
my-repo/
assets/
doc/
images/
*see annotation
model.md
implementation-notes.md
images/
license.json
js/
(see section below)
mipmaps/
license.json
sound/
license.json
dependencies.json
.gitignore
my-repo_en.html
my-repo-strings_en.json
Gruntfile.js
LICENSE
package.json
README.md
```
*~~Any images used in model.md or implementation-notes.md should be added here. Images specific to aiding with documentation do not need their own license.~~
- [ ] ~~Verify that the same image file is not present in both images/ and mipmaps/. If you need a mipmap, use it for all occurrences of the image.~~
- [x] Is the js/ directory properly structured?
All JavaScript source should be in the js/ directory. There should be a subdirectory for each screen (this also applies for single-screen sims, where the subdirectory matches the repo name). For a multi-screen sim, code shared by 2 or more screens should be in a js/common/ subdirectory. Model and view code should be in model/ and view/ subdirectories for each screen and common/. For example, for a sim with screens “Introduction” and “Lab”, the general directory structure should look like this:
```
my-repo/
js/
common/
model/
view/
introduction/
model/
view/
lab/
model/
view/
my-repo-main.js
myRepo.js
myRepoStrings.js
```
- [x] Do filenames use an appropriate prefix? Some filenames may be prefixed with the repository name,
e.g. `MolarityConstants.js` in molarity. If the repository name is long, the developer may choose to abbreviate the
repository name, e.g. `EEConstants.js` in expression-exchange. If the abbreviation is already used by another
repository, then the full name must be used. For example, if the "EE" abbreviation is already used by
expression-exchange, then it should not be used in equality-explorer. Whichever convention is used, it should be used
consistently within a repository - don't mix abbreviations and full names.
- [ ] ~~Is there a file in assets/ for every resource file in sound/ and images/? Note that there is *not necessarily* a
1:1 correspondence between asset and resource files; for example, several related images may be in the same .ai file.
Check license.json for possible documentation of why some resources might not have a corresponding asset file.~~
- [x] For simulations, was the README.md generated by `grunt published-README` or `grunt unpublished-README`? Common
code repos can have custom README files.
- [x] Does package.json refer to any dependencies that are not used by the sim?
- [x] Is the LICENSE file correct? (Generally GPL v3 for sims and MIT for common code,
see [this thread](https://github.com/phetsims/tasks/issues/875#issuecomment-312168646) for additional information).
- [x] Does .gitignore match the one in simula-rasa?
- [ ] ~~In GitHub, verify that all non-release branches have an associated issue that describes their purpose.~~
- [ ] ~~Are there any GitHub branches that are no longer needed and should be deleted?~~
- [x] Sim-specific query parameters (if any) should be identified and documented in one .js file in js/common/ or js/ (
if there is no common/). The .js file should be named `{{PREFIX}}QueryParameters.js`, for example
ArithmeticQueryParameters.js for the arithmetic repository, or FBQueryParameters.js for Function Builder (where
the `FB` prefix is used).
- [x] Query parameters that are public-facing should be identified using `public: true` in the schema.
- [x] All sims should use a color file named `MyRepoColors.js` or, if using abbreviations, `MRColors.js`, and
use `ProfileColorProperty` where appropriate, even if they have a single (default) profile (to support color editing
and PhET-iO Studio). The `ColorProfile` pattern was converted to `*Colors.js` files in
https://github.com/phetsims/scenery-phet/issues/515. Please see
[GasPropertiesColors.js](https://github.com/phetsims/gas-properties/blob/master/js/common/GasPropertiesColors.js)
for a good example.
## **Coding Conventions**
- [x] Are coding conventions outlined in [PhET's Coding Conventions Document](https://github.com/phetsims/phet-info/blob/master/doc/coding-conventions.md) followed and adhered to? This document
deals with PhET coding conventions. You do not need to exhaustively check every item in this section, nor do you
necessarily need to check these items one at a time. The goal is to determine whether the code generally meets PhET standards.
## **TypeScript Conventions**
- [x] Are TypeScript conventions outlined in the [TypeScript Conventions Document](https://github.com/phetsims/phet-info/blob/master/doc/typescript-conventions.md) followed and adhered to?
## **Math Libraries**
- [x] `DOT/Utils.toFixed` or `DOT/Utils.toFixedNumber` should be used instead of `toFixed`. JavaScript's `toFixed` is notoriously buggy. Behavior differs depending on browser, because the spec doesn't specify whether to round or floor.
## IE11
- [ ] ~~IE is no longer supported. With that in mind remove IE-specific workarounds~~
- [ ] ~~Use `string.includes` and `string.startsWith` where possible.~~
## **Organization, Readability, and Maintainability**
- [x] Does the organization and structure of the code make sense? Do the model and view contain types that you would expect (or guess!) by looking at the sim? Do the names of things correspond to the names that you see in the user interface?
- [x] Are appropriate design patterns used? See [phet-software-design-patterns.md](https://github.com/phetsims/phet-info/blob/master/doc/phet-software-design-patterns.md). If new or inappropriate patterns are identified, create an issue.
- [x] Is inheritance used where appropriate? Does the type hierarchy make sense?
- [x] Is composition favored over inheritance where appropriate? See https://en.wikipedia.org/wiki/Composition_over_inheritance.
- [x] Is there any unnecessary coupling? (e.g., by passing large objects to constructors, or exposing unnecessary properties/functions). In TypeScript, you can decouple by narrowing the API like so:
```ts
public constructor( tickMarksVisibleProperty: Property<boolean>,
model: Pick<IntroModel, 'changeWaterLevel'>, // <-- Note the call site can pass the whole model, but we declare we will only use this part of it
waterCup: WaterCup, modelViewTransform: ModelViewTransform2,
providedOptions?: WaterCup3DNodeOptions ) {
```
- [x] Is there too much unnecessary decoupling? (e.g. by passing all of the properties of an object independently instead of passing the object itself)?
- [x] Are the source files reasonable in size? Scrutinize large files with too many responsibilities - can responsibilities be broken into smaller delegates? **NOTE: 84 of 87 files are under 300 lines. TransformedCurve.ts is justifiably the largest file (720 lines) because that's where all of the curve manipulation lives. To see file sizes, run this shell command:**
```
cd calculus-grapher/js ; wc -l `find . -name "*.ts" -print` | sort
```
- [x] Are any significant chunks of code duplicated? In addition to manual identification, tools include: WebStorm _Code > Analyze Code > Locate Duplicates_ and https://github.com/danielstjules/jsinspect. **NOTE: WebStorm _Code > Analyze Code > Locate Duplicates_ identifies a handful of duplicate lines of code, but no "significant chunks". I recommend "do nothing" about these. But feel free to inspect and recommend otherwise.**
- [x] Is there anything that should be generalized and migrated to common code?
- [x] Are there any `TODO` or `FIXME` or `REVIEW` comments in the code? They should be addressed or promoted to GitHub issues.
- [x] Are there any [magic numbers](https://en.wikipedia.org/wiki/Magic_number_(programming)) that should be factored out as constants and documented?
- [x] Are there any constants that are duplicated in multiple files that should be factored out into a `{{REPO}}Constants.js` file?
- [x] Does the implementation rely on any specific constant values that are likely to change in the future? Identify constants that might be changed in the future. (Use your judgement about which constants are likely candidates.) Does changing the values of these constants break the sim? For example, see https://github.com/phetsims/plinko-probability/issues/84.
- [x] Is [PhetColorScheme](https://github.com/phetsims/scenery-phet/blob/master/js/PhetColorScheme.ts) used where appropriate? Verify that the sim is not inventing/creating its own colors for things that have been standardized in `PhetColorScheme`. Identify any colors that might be worth adding to `PhetColorScheme`.
- [x] Are all dependent Properties modeled as `DerivedProperty` instead of `Property`?
- [x] All dynamics should be called from Sim.step(dt), do not use window.setTimeout or window.setInterval. This will help support Legends of Learning and PhET-iO.
## **Accessibility**
Not supported.
## **PhET-iO**
This section may be omitted if the sim has not been instrumented for PhET-iO, but is likely good to glance at no matter.
- [x] Does instrumentation follow the conventions described in [PhET-iO Instrumentation Guide](https://github.com/phetsims/phet-io/blob/master/doc/phet-io-instrumentation-technical-guide.md)?
This could be an extensive bullet. At the very least, be sure to know what amount of instrumentation this sim
supports. Describing this further goes beyond the scope of this document.
- [ ] PhET-iO instantiates different objects and wires up listeners that are not present in the PhET-branded simulation.
It needs to be tested separately for memory leaks. To help isolate the nature of the memory leak, this test should
be run separately from the PhET brand memory leak test. Test with a colorized Data Stream, and Studio (easily
accessed from phetmarks). Compare to testing results done by the responsible developer and previous releases.
- [ ] ~~Make sure unused `PhetioObject` instances are disposed, which unregisters their tandems.~~
- [x] Make sure JOIST `dt` values are used instead of `Date.now()` or other Date functions. Perhaps try
`phet.joist.elapsedTime`. Though this has already been mentioned, it is necessary for reproducible playback via input
events and deserves a comment in this PhET-iO section.
- [ ] ~~Are random numbers using `DOT/dotRandom` as an imported module (not a global), and all doing so after modules are declared (non-statically)? For
example, the following methods (and perhaps others) should not be used: `Math.random`, `_.shuffle`, `_.sample`, `_.random`. This also deserves re-iteration due to its effect on record/playback for PhET-iO.~~
- [x] Like JSON, keys for `undefined` values are omitted when serializing objects across frames. Consider this when
determining whether `toStateObject` should use `null` or `undefined` values.
- [x] PhET prefers to use the term "position" to refer to the physical (x,y) position of objects. This applies to both
brands, but is more important for the PhET-iO API. See https://github.com/phetsims/phet-info/issues/126
- [ ] Are your IOType state methods violating the API of the core type by accessing private fields? **NOTE: This sim has only 2 custom IOTypes: `CurvePoint.CurvePointIO` and `GraphSet.GraphSetIO`.**
- [ ] ~~When defining a boolean Property to indicate whether something is enabled, use `AXON/EnabledProperty`. This
should be done in both the model and the view. If you're using a DerivedProperty, skip this item.~~
- [x] Do not use translated strings in `phetioDocumentaton` - it changes the PhET-iO API!
|
non_defect
|
code review phet code review checklist a k a crc irrelevant items have been crossed off using strikethrough text see note comments for additional info that is specific to this sim a checked off item doesn t mean no problem here it means it was reviewed problems can be noted in side issues that reference this issue or through review comments in the code specific instructions we would like to being dev testing by monday march and we will need a few days to address review feedback so this review needs to be completed by end of day wednesday march if you do not think that is possible please contact kathy phet or pixelzoom asap to discuss if you have questions or would like someone to tag along during the code review please contact pixelzoom or veillette on slack github issues the following standard github issues should exist ⚠️ note these issues exist but have not been completed please continue with the code review anyway and we will be addressing them in parallel model md see familiarize yourself with the model by reading model md does it adequately describe the model in terms appropriate for teachers has it been reviewed by the sim designer note model md is currently incomplete but is worth skimming implementation notes md see familiarize yourself with the implementation by reading implementation notes md does it provide an overview that will be useful to future maintainers note implementation notes md is currently incomplete but is worth skimming transformedcurve ts is where most of the interesting stuff lives results of memory testing for brands phet see results of memory testing for brands phet io if the sim is instrumented for phet io see performance testing and sign off see review of pointer areas see credits will not be completed until after rc testing see build and run checks if any of these items fail pause code review does the sim build without warnings or errors does the html file size seem reasonable compared to other similar sims does the sim start up unbuilt and built versions does the sim experience any assertion failures run with query parameter ea does the sim pass a scenery fuzz test run with query parameters fuzz ea does the sim behave correctly when listener order is shuffled run with query parameters ea shufflelisteners and ea shufflelisteners fuzz does the sim output any deprecation warnings run with deprecationwarnings do not use deprecated methods in new code memory leaks does a heap comparison using chrome developer tools indicate a memory leak this process is described test on a version built using grunt minify mangle false compare to testing results done by the responsible developer results can be found in for each common code component sun scenery phet vegas … that opaquely registers observers or listeners is there a call to that component’s dispose function or is it obvious why it isn t necessary or is there documentation about why dispose isn t called an example of why no call to dispose is needed is if the component is used in a screenview that would never be removed from the scene graph note that it s also acceptable and encouraged to describe what needs to be disposed in implementation notes md are there leaks due to registering observers or listeners the following guidelines should be followed unless documentation in line or in implementation notes md describes why following them is not necessary axon property link or lazylink is accompanied by unlink axon multilink multilink is accompanied by unmultilink axon creation of multilink is accompanied by dispose axon creation of derivedproperty is accompanied by dispose axon emitter addlistener is accompanied by removelistener axon observablearraydef element emitter addlistener is accompanied by observablearraydef element emitter removelistener scenery node addinputlistener is accompanied by removeinputlistener tandem creation of an instrumented phetioobject is accompanied by dispose do all types that require a dispose function have one this should expose a public dispose function that calls this disposemytype where disposemytype is a private function declared in the constructor mytype should exactly match the filename performance play with sim identify any obvious performance issues examples animation that slows down with large numbers of objects animation that pauses or hitches during garbage collection if the sim uses webgl does it have a fallback does the fallback perform reasonably well run with query parameter webgl false usability are ui components sufficiently responsive especially continuous ui components such as sliders are pointer areas optimized especially for touch run with query parameter showpointerareas do pointer areas overlap run with query parameter showpointerareas overlap may be ok in some cases depending on the z ordering if the front most object is supposed to occlude pointer areas and whether objects can be moved note there is one place where a bit of overlap is acceptable see internationalization are there any strings that are not internationalized and does the sim layout gracefully handle internationalized strings that are shorter than the english strings run with query parameter stringtest x you should see nothing but x strings does the sim layout gracefully handle internationalized strings that are longer than the english strings run with query parameters stringtest double and stringtest long does the sim stay on the sim page doesn t redirect to an external page when running with the query parameter stringtest xss this test passes if sim does not redirect ok if sim crashes or fails to fully start only test on one desktop platform for phet io sims additionally test stringtest xss in studio to make sure strings didn t leak to phetiodocumentation see avoid using concatenation to create strings that will be visible in the user interface use stringutils fillin and a string pattern to ensure that strings are properly localized use named placeholders e g value units instead of numbered placeholders e g note there is only one of these see predictpreferencedescription in calculus grapher strings en json make sure the string keys are all perfect because they are difficult to change after is published guidelines for string keys are strings keys should generally match their values e g js helloworld value hello world quadraticterms value quadratic terms if a string key would be exceptionally long use a key name that is an abbreviated form of the string value or that captures the purpose essence of the value e g js key is abbreviated iwenttothestore value i went to the store to get milk eggs butter and sugar key is based on purpose describethescreen value the play area is a small room the control panel has buttons a checkbox and radio buttons to change conditions in the room if string key names would collide use your judgment to disambiguate e g js simplifytitle value simplify simplifycheckbox value simplify string keys for screen names should have the general form screen screenname e g js screen explore value explore string patterns that contain placeholders e g my name is first last should use keys that are unlikely to conflict with strings that might be needed in the future for example for price consider using key pricepattern instead of price if you think there might be a future need for a price string it is acceptable to prefix families of strings with a prefix like so json material water value water material wood value wood shape block value block shape cone value cone nested substructure is not yet fully supported if the sim was already released make sure none of the original string keys have changed if they have changed make sure any changes have a good reason and have been discussed with jbphet it is likely that an issue like should be created repository structure the repository name should correspond to the sim title for example if the sim title is wave interference then the repository name should be wave interference are all required files and directories present for a sim repository named “my repo” the general structure should look like this where assets images mipmaps or sounds may be omitted if the sim doesn’t have those types of resource files note that this sim has no image mipmaps or sounds my repo assets doc images see annotation model md implementation notes md images license json js see section below mipmaps license json sound license json dependencies json gitignore my repo en html my repo strings en json gruntfile js license package json readme md any images used in model md or implementation notes md should be added here images specific to aiding with documentation do not need their own license verify that the same image file is not present in both images and mipmaps if you need a mipmap use it for all occurrences of the image is the js directory properly structured all javascript source should be in the js directory there should be a subdirectory for each screen this also applies for single screen sims where the subdirectory matches the repo name for a multi screen sim code shared by or more screens should be in a js common subdirectory model and view code should be in model and view subdirectories for each screen and common for example for a sim with screens “introduction” and “lab” the general directory structure should look like this my repo js common model view introduction model view lab model view my repo main js myrepo js myrepostrings js do filenames use an appropriate prefix some filenames may be prefixed with the repository name e g molarityconstants js in molarity if the repository name is long the developer may choose to abbreviate the repository name e g eeconstants js in expression exchange if the abbreviation is already used by another repository then the full name must be used for example if the ee abbreviation is already used by expression exchange then it should not be used in equality explorer whichever convention is used it should be used consistently within a repository don t mix abbreviations and full names is there a file in assets for every resource file in sound and images note that there is not necessarily a correspondence between asset and resource files for example several related images may be in the same ai file check license json for possible documentation of why some resources might not have a corresponding asset file for simulations was the readme md generated by grunt published readme or grunt unpublished readme common code repos can have custom readme files does package json refer to any dependencies that are not used by the sim is the license file correct generally gpl for sims and mit for common code see for additional information does gitignore match the one in simula rasa in github verify that all non release branches have an associated issue that describes their purpose are there any github branches that are no longer needed and should be deleted sim specific query parameters if any should be identified and documented in one js file in js common or js if there is no common the js file should be named prefix queryparameters js for example arithmeticqueryparameters js for the arithmetic repository or fbqueryparameters js for function builder where the fb prefix is used query parameters that are public facing should be identified using public true in the schema all sims should use a color file named myrepocolors js or if using abbreviations mrcolors js and use profilecolorproperty where appropriate even if they have a single default profile to support color editing and phet io studio the colorprofile pattern was converted to colors js files in please see for a good example coding conventions are coding conventions outlined in followed and adhered to this document deals with phet coding conventions you do not need to exhaustively check every item in this section nor do you necessarily need to check these items one at a time the goal is to determine whether the code generally meets phet standards typescript conventions are typescript conventions outlined in the followed and adhered to math libraries dot utils tofixed or dot utils tofixednumber should be used instead of tofixed javascript s tofixed is notoriously buggy behavior differs depending on browser because the spec doesn t specify whether to round or floor ie is no longer supported with that in mind remove ie specific workarounds use string includes and string startswith where possible organization readability and maintainability does the organization and structure of the code make sense do the model and view contain types that you would expect or guess by looking at the sim do the names of things correspond to the names that you see in the user interface are appropriate design patterns used see if new or inappropriate patterns are identified create an issue is inheritance used where appropriate does the type hierarchy make sense is composition favored over inheritance where appropriate see is there any unnecessary coupling e g by passing large objects to constructors or exposing unnecessary properties functions in typescript you can decouple by narrowing the api like so ts public constructor tickmarksvisibleproperty property model pick note the call site can pass the whole model but we declare we will only use this part of it watercup watercup modelviewtransform providedoptions is there too much unnecessary decoupling e g by passing all of the properties of an object independently instead of passing the object itself are the source files reasonable in size scrutinize large files with too many responsibilities can responsibilities be broken into smaller delegates note of files are under lines transformedcurve ts is justifiably the largest file lines because that s where all of the curve manipulation lives to see file sizes run this shell command cd calculus grapher js wc l find name ts print sort are any significant chunks of code duplicated in addition to manual identification tools include webstorm code analyze code locate duplicates and note webstorm code analyze code locate duplicates identifies a handful of duplicate lines of code but no significant chunks i recommend do nothing about these but feel free to inspect and recommend otherwise is there anything that should be generalized and migrated to common code are there any todo or fixme or review comments in the code they should be addressed or promoted to github issues are there any that should be factored out as constants and documented are there any constants that are duplicated in multiple files that should be factored out into a repo constants js file does the implementation rely on any specific constant values that are likely to change in the future identify constants that might be changed in the future use your judgement about which constants are likely candidates does changing the values of these constants break the sim for example see is used where appropriate verify that the sim is not inventing creating its own colors for things that have been standardized in phetcolorscheme identify any colors that might be worth adding to phetcolorscheme are all dependent properties modeled as derivedproperty instead of property all dynamics should be called from sim step dt do not use window settimeout or window setinterval this will help support legends of learning and phet io accessibility not supported phet io this section may be omitted if the sim has not been instrumented for phet io but is likely good to glance at no matter does instrumentation follow the conventions described in this could be an extensive bullet at the very least be sure to know what amount of instrumentation this sim supports describing this further goes beyond the scope of this document phet io instantiates different objects and wires up listeners that are not present in the phet branded simulation it needs to be tested separately for memory leaks to help isolate the nature of the memory leak this test should be run separately from the phet brand memory leak test test with a colorized data stream and studio easily accessed from phetmarks compare to testing results done by the responsible developer and previous releases make sure unused phetioobject instances are disposed which unregisters their tandems make sure joist dt values are used instead of date now or other date functions perhaps try phet joist elapsedtime though this has already been mentioned it is necessary for reproducible playback via input events and deserves a comment in this phet io section are random numbers using dot dotrandom as an imported module not a global and all doing so after modules are declared non statically for example the following methods and perhaps others should not be used math random shuffle sample random this also deserves re iteration due to its effect on record playback for phet io like json keys for undefined values are omitted when serializing objects across frames consider this when determining whether tostateobject should use null or undefined values phet prefers to use the term position to refer to the physical x y position of objects this applies to both brands but is more important for the phet io api see are your iotype state methods violating the api of the core type by accessing private fields note this sim has only custom iotypes curvepoint curvepointio and graphset graphsetio when defining a boolean property to indicate whether something is enabled use axon enabledproperty this should be done in both the model and the view if you re using a derivedproperty skip this item do not use translated strings in phetiodocumentaton it changes the phet io api
| 0
|
13,787
| 2,784,101,456
|
IssuesEvent
|
2015-05-07 07:13:58
|
sylingd/phpsocks5
|
https://api.github.com/repos/sylingd/phpsocks5
|
closed
|
无法使用服务……
|
auto-migrated Priority-Medium Type-Defect
|
```
建立数据表成功,但启用时显示以下提示,服务器无法使用��
�不知道是什么原因?
DNS: couldn't open /etc/resolv.conf: No such file or directory
Disabling disk cache: No such file or directory
Disabling local tree: No such file or directory
Established listening socket on port 10088
```
Original issue reported on code.google.com by `liruc...@gmail.com` on 14 Mar 2011 at 5:16
|
1.0
|
无法使用服务…… - ```
建立数据表成功,但启用时显示以下提示,服务器无法使用��
�不知道是什么原因?
DNS: couldn't open /etc/resolv.conf: No such file or directory
Disabling disk cache: No such file or directory
Disabling local tree: No such file or directory
Established listening socket on port 10088
```
Original issue reported on code.google.com by `liruc...@gmail.com` on 14 Mar 2011 at 5:16
|
defect
|
无法使用服务…… 建立数据表成功,但启用时显示以下提示,服务器无法使用�� �不知道是什么原因? dns couldn t open etc resolv conf no such file or directory disabling disk cache no such file or directory disabling local tree no such file or directory established listening socket on port original issue reported on code google com by liruc gmail com on mar at
| 1
|
59,255
| 17,016,731,935
|
IssuesEvent
|
2021-07-02 13:06:39
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
opened
|
Mark only one side in one-way streets
|
Component: opencyclemap Priority: major Type: defect
|
**[Submitted to the original trac issue database at 5.41am, Sunday, 2nd October 2016]**
A road with cycleway:right/left=track/lane is marked on one side of the road. Can you do it for a one-way street tagget with cycleway=track/lane, so that only one side is marked in the main direction.
Incidentally, this is already done in JOSM.
|
1.0
|
Mark only one side in one-way streets - **[Submitted to the original trac issue database at 5.41am, Sunday, 2nd October 2016]**
A road with cycleway:right/left=track/lane is marked on one side of the road. Can you do it for a one-way street tagget with cycleway=track/lane, so that only one side is marked in the main direction.
Incidentally, this is already done in JOSM.
|
defect
|
mark only one side in one way streets a road with cycleway right left track lane is marked on one side of the road can you do it for a one way street tagget with cycleway track lane so that only one side is marked in the main direction incidentally this is already done in josm
| 1
|
289,273
| 24,973,023,382
|
IssuesEvent
|
2022-11-02 04:10:44
|
milvus-io/milvus
|
https://api.github.com/repos/milvus-io/milvus
|
opened
|
[Bug]: [benchmark][standalone] serial search raise error: Unavailable desc = keepalive ping failed to receive ACK within timeout
|
kind/bug needs-triage test/benchmark
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Environment
```markdown
- Milvus version:master-20221101-29f2ed67
- Deployment mode(standalone or cluster):standalone
- SDK version(e.g. pymilvus v2.0.0rc2):2.2.0.dev63
- OS(Ubuntu or CentOS):
- CPU/Memory:
- GPU:
- Others:
```
### Current Behavior
argo task: fouramf-cron-1667318400
test case: test_recall_glove_hnsw_standalone
server:
```
[2022-11-02 02:01:39,575 - INFO - fouram]: [Base] Deploy initial state:
I1102 00:38:45.694957 16377 request.go:665] Waited for 1.119360694s due to client-side throttling, not priority and fairness, request: GET:https://kubernetes.default.svc.cluster.local/apis/policy/v1?timeout=32s
NAME READY STATUS RESTARTS AGE IP NODE NOMINATED NODE READINESS GATES
fouramf-cron-1667318400-1-31-9427-etcd-0 1/1 Running 0 118s 10.104.9.8 4am-node14 <none> <none>
fouramf-cron-1667318400-1-31-9427-milvus-standalone-858688mwjt8 1/1 Running 0 118s 10.104.5.10 4am-node12 <none> <none>
fouramf-cron-1667318400-1-31-9427-minio-67bcfd6cd8-hl8tn 1/1 Running 0 118s 10.104.5.9 4am-node12 <none> <none> (base.py:126)
[2022-11-02 02:01:39,575 - INFO - fouram]: [Cmd Exe] kubectl get pods -n qa-milvus -o wide | grep -E 'STATUS|fouramf-cron-1667318400-1-31-9427' (util_cmd.py:14)
[2022-11-02 02:01:45,492 - INFO - fouram]: [CliClient] pod details of release(fouramf-cron-1667318400-1-31-9427):
I1102 02:01:40.818647 19034 request.go:665] Waited for 1.165695239s due to client-side throttling, not priority and fairness, request: GET:https://kubernetes.default.svc.cluster.local/apis/node.k8s.io/v1beta1?timeout=32s
NAME READY STATUS RESTARTS AGE IP NODE NOMINATED NODE READINESS GATES
fouramf-cron-1667318400-1-31-9427-etcd-0 1/1 Running 0 84m 10.104.9.8 4am-node14 <none> <none>
fouramf-cron-1667318400-1-31-9427-milvus-standalone-858688mwjt8 1/1 Running 0 84m 10.104.5.10 4am-node12 <none> <none>
fouramf-cron-1667318400-1-31-9427-minio-67bcfd6cd8-hl8tn 1/1 Running 0 84m 10.104.5.9 4am-node12 <none> <none> (cli_client.py:123)
[2022-11-02 02:01:45,494 - INFO - fouram]: [Base] Start deleting services: fouramf-cron-1667318400-1-31-9427 (base.py:129)
[2022-11-02 02:01:45,494 - INFO - fouram]: [Cmd Exe] kubectl get pvc -n qa-milvus | grep -E 'STATUS|fouramf-cron-1667318400-1-31-9427' (util_cmd.py:14)
[2022-11-02 02:01:51,369 - INFO - fouram]: [CliClient] pvc storage class of release(fouramf-cron-1667318400-1-31-9427):
I1102 02:01:46.742240 19075 request.go:665] Waited for 1.162965014s due to client-side throttling, not priority and fairness, request: GET:https://kubernetes.default.svc.cluster.local/apis/policy/v1?timeout=32s
NAME STATUS VOLUME CAPACITY ACCESS MODES STORAGECLASS AGE
data-fouramf-cron-1667318400-1-31-9427-etcd-0 Bound pvc-d88d0be7-b223-4d63-a4ea-2887d96e1df1 10Gi RWO local-path 84m
fouramf-cron-1667318400-1-31-9427-milvus Bound pvc-f5c61bbb-9047-48f2-b97e-a7279f76d4cb 50Gi RWO local-path 84m
fouramf-cron-1667318400-1-31-9427-minio Bound pvc-e439a16c-3a01-4182-91e6-b2357afb354d 500Gi RWO local-path 84m (cli_client.py:131)
```
<img width="1868" alt="截屏2022-11-02 12 08 16" src="https://user-images.githubusercontent.com/26307815/199393614-7c41cbb3-8abe-41c4-9b6a-b9b9955ece26.png">
<img width="1869" alt="截屏2022-11-02 12 09 06" src="https://user-images.githubusercontent.com/26307815/199393740-0c152208-6d0b-4db4-b50d-4859f91fb7cf.png">
client log:
[test_recall_glove_hnsw_standalone_1.zip](https://github.com/milvus-io/milvus/files/9915863/test_recall_glove_hnsw_standalone_1.zip)
```
[2022-11-02 01:48:00,880 - INFO - fouram]: [PerfTemplate] Actual parameters used: {'dataset_params': {'dim': 200, 'dataset_name': 'glove-200-angular', 'ni_per': 10000}, 'collection_params': {'other_fields': []}, 'load_params': {'replica_number': 1}, 'search_params': {'top_k': 10, 'nq': 10000, 'search_param': {'ef': 256}}, 'index_params': {'index_type': 'HNSW', 'index_param': {'M': 36, 'efConstruction': 500}}} (performance_template.py:59)
[2022-11-02 01:48:00,884 - INFO - fouram]: [AccCases] Params of search: {'data': array([[-0.02773983, 0.0795716 , -0.07341436, ..., 0.0432257 ,
-0.09719583, -0.08906623],
[-0.00701086, -0.03715183, 0.0632261 , ..., -0.0334666 ,
-0.09059103, 0.05058862],
[ 0.04278408, -0.0195656 , -0.0546626 , ..., 0.0624952 ,
-0.13264737, -0.01105194],
...,
[ 0.03442654, 0.13952227, 0.02177458, ..., 0.04648568,
-0.05786795, -0.06752396],
[ 0.04487952, -0.03255282, 0.07743784, ..., 0.08966012,
-0.01795752, 0.07155532],
[-0.0895235 , -0.12985045, 0.11885931, ..., 0.0558663 ,
0.07499653, -0.15403427]], dtype=float32), 'anns_field': 'float_vector', 'param': {'metric_type': 'IP', 'params': {'ef': 256}}, 'limit': 10} (accuracy_cases.py:145)
[2022-11-02 01:48:00,884 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 256}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:49:53,745 - INFO - fouram]: [Time] Collection.search run in 112.8602s (api_request.py:29)
[2022-11-02 01:49:53,745 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 256}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:50:35,318 - ERROR - fouram]: Traceback (most recent call last):
File "/src/fouram/client/util/api_request.py", line 21, in inner_wrapper
res = func(*args, **kwargs)
File "/src/fouram/client/util/api_request.py", line 57, in api_request
return func(*arg, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/orm/collection.py", line 719, in search
res = conn.search(self._name, data, anns_field, param, limit, expr,
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 109, in handler
raise e
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 105, in handler
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 136, in handler
ret = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 85, in handler
raise e
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 50, in handler
return func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 476, in search
return self._execute_search_requests(requests, timeout, **_kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 440, in _execute_search_requests
raise pre_err
File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 431, in _execute_search_requests
raise MilvusException(response.status.error_code, response.status.reason)
pymilvus.exceptions.MilvusException: <MilvusException: (code=1, message=fail to search on all shard leaders, err=Channel: by-dev-rootcoord-dml_0_437085668438704132v0 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
Channel: by-dev-rootcoord-dml_1_437085668438704132v1 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
)>
(api_request.py:35)
[2022-11-02 01:50:35,319 - ERROR - fouram]: (api_response) : <MilvusException: (code=1, message=fail to search on all shard leaders, err=Channel: by-dev-rootcoord-dml_0_437085668438704132v0 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
Channel: by-dev-rootcoord-dml_1_437085668438704132v1 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
)> (api_request.py:36)
[2022-11-02 01:50:35,319 - ERROR - fouram]: [CheckFunc] search request check failed, response:<MilvusException: (code=1, message=fail to search on all shard leaders, err=Channel: by-dev-rootcoord-dml_0_437085668438704132v0 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
Channel: by-dev-rootcoord-dml_1_437085668438704132v1 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
)> (func_check.py:40)
[2022-11-02 01:50:35,319 - ERROR - fouram]: [AccCases] Search raise error: (accuracy_cases.py:173)
[2022-11-02 01:50:35,319 - INFO - fouram]: [PerfTemplate] Actual parameters used: {'dataset_params': {'dim': 200, 'dataset_name': 'glove-200-angular', 'ni_per': 10000}, 'collection_params': {'other_fields': []}, 'load_params': {'replica_number': 1}, 'search_params': {'top_k': 10, 'nq': 10000, 'search_param': {'ef': 512}}, 'index_params': {'index_type': 'HNSW', 'index_param': {'M': 36, 'efConstruction': 500}}} (performance_template.py:59)
[2022-11-02 01:50:35,322 - INFO - fouram]: [AccCases] Params of search: {'data': array([[-0.02773983, 0.0795716 , -0.07341436, ..., 0.0432257 ,
-0.09719583, -0.08906623],
[-0.00701086, -0.03715183, 0.0632261 , ..., -0.0334666 ,
-0.09059103, 0.05058862],
[ 0.04278408, -0.0195656 , -0.0546626 , ..., 0.0624952 ,
-0.13264737, -0.01105194],
...,
[ 0.03442654, 0.13952227, 0.02177458, ..., 0.04648568,
-0.05786795, -0.06752396],
[ 0.04487952, -0.03255282, 0.07743784, ..., 0.08966012,
-0.01795752, 0.07155532],
[-0.0895235 , -0.12985045, 0.11885931, ..., 0.0558663 ,
0.07499653, -0.15403427]], dtype=float32), 'anns_field': 'float_vector', 'param': {'metric_type': 'IP', 'params': {'ef': 512}}, 'limit': 10} (accuracy_cases.py:145)
[2022-11-02 01:50:35,323 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:54:20,089 - INFO - fouram]: [Time] Collection.search run in 224.7647s (api_request.py:29)
[2022-11-02 01:54:20,089 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:56:21,880 - INFO - fouram]: [Time] Collection.search run in 121.7901s (api_request.py:29)
[2022-11-02 01:56:21,887 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:58:18,088 - INFO - fouram]: [Time] Collection.search run in 116.2s (api_request.py:29)
[2022-11-02 01:58:18,095 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:59:59,238 - INFO - fouram]: [Time] Collection.search run in 101.1411s (api_request.py:29)
[2022-11-02 01:59:59,245 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 02:01:37,642 - INFO - fouram]: [Time] Collection.search run in 98.3958s (api_request.py:29)
[2022-11-02 02:01:37,718 - INFO - fouram]: [AccCases] Search result:{'Recall': 0.977, 'RT': 132.4583, 'LastRT': 98.3958} (accuracy_cases.py:169)
[2022-11-02 02:01:37,734 - INFO - fouram]: [PerfTemplate] Report data:
{'server': {'deploy_tool': 'helm',
'deploy_mode': 'standalone',
'config_name': 'standalone_8c16m',
'config': {'standalone': {'resources': {'limits': {'cpu': '8.0',
'memory': '16Gi'},
'requests': {'cpu': '5.0',
'memory': '9Gi'}},
'persistence': {'persistentVolumeClaim': {'storageClass': 'local-path'}}},
'cluster': {'enabled': False},
'etcd': {'replicaCount': 1,
'global': {'storageClass': 'local-path'},
'metrics': {'enabled': True,
'podMonitor': {'enabled': True}}},
'minio': {'mode': 'standalone',
'persistence': {'storageClass': 'local-path'},
'metrics': {'serviceMonitor': {'enabled': False}}},
'pulsar': {'enabled': False},
'metrics': {'serviceMonitor': {'enabled': True}},
'image': {'all': {'repository': 'harbor.milvus.io/dockerhub/milvusdb/milvus',
'tag': 'master-20221101-29f2ed67'}}},
'host': 'fouramf-cron-1667318400-1-31-9427-milvus.qa-milvus.svc.cluster.local'},
'client': {'test_case_type': 'AccCases',
'test_case_name': 'test_recall_glove_hnsw_standalone',
'test_case_params': {'dataset_params': {'dim': 200,
'dataset_name': 'glove-200-angular',
'ni_per': 10000},
'collection_params': {'other_fields': []},
'load_params': {'replica_number': 1},
'search_params': {'top_k': 10,
'nq': 10000,
'search_param': {'ef': 512}},
'index_params': {'index_type': 'HNSW',
'index_param': {'M': 36,
'efConstruction': 500}}},
'run_id': 2022110294117861,
'datetime': '2022-11-02 00:36:51.144710',
'client_version': '2.2'},
'result': {'test_result': {'ann_insert': {'total_time': 28.6393},
'index': {'build_time': 1271.2825},
'search': {'Recall': 0.977,
'RT': 132.4583,
'LastRT': 98.3958}}}} (performance_template.py:67)
```
### Expected Behavior
_No response_
### Steps To Reproduce
```markdown
1. create a collection or use an existing collection
2. insert training dataset
3. flush collection
4. clean index and build new index : HNSW
5. load collection
6. search with different parameters
7. clean all collections or not
```
### Milvus Log
milvus deplocy:
```
helm -n qa-milvus upgrade --install --set standalone.resources.limits.cpu=8.0,standalone.resources.limits.memory=16Gi,standalone.resources.requests.cpu=5.0,standalone.resources.requests.memory=9Gi,standalone.persistence.persistentVolumeClaim.storageClass=local-path,cluster.enabled=False,etcd.replicaCount=1,etcd.global.storageClass=local-path,etcd.metrics.enabled=True,etcd.metrics.podMonitor.enabled=True,minio.mode=standalone,minio.persistence.storageClass=local-path,minio.metrics.serviceMonitor.enabled=False,pulsar.enabled=False,metrics.serviceMonitor.enabled=True,image.all.repository=harbor.milvus.io/dockerhub/milvusdb/milvus,image.all.tag=master-20221101-29f2ed67 --wait --timeout 30m fouramf-cron-1667318400-1-31-9427 /home/helm/charts/milvus/
```
### Anything else?
case params:
```
{
"dataset_params": {
"dim": 200,
"dataset_name": "glove-200-angular",
"ni_per": 10000
},
"collection_params": {
"other_fields": []
},
"load_params": {
"replica_number": 1
},
"search_params": {
"top_k": [
10
],
"nq": [
10000
],
"search_param": {
"ef": [
10,
16,
32,
64,
128,
256,
512
]
}
},
"index_params": {
"index_type": "HNSW",
"index_param": {
"M": 36,
"efConstruction": 500
}
}
}
```
|
1.0
|
[Bug]: [benchmark][standalone] serial search raise error: Unavailable desc = keepalive ping failed to receive ACK within timeout - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Environment
```markdown
- Milvus version:master-20221101-29f2ed67
- Deployment mode(standalone or cluster):standalone
- SDK version(e.g. pymilvus v2.0.0rc2):2.2.0.dev63
- OS(Ubuntu or CentOS):
- CPU/Memory:
- GPU:
- Others:
```
### Current Behavior
argo task: fouramf-cron-1667318400
test case: test_recall_glove_hnsw_standalone
server:
```
[2022-11-02 02:01:39,575 - INFO - fouram]: [Base] Deploy initial state:
I1102 00:38:45.694957 16377 request.go:665] Waited for 1.119360694s due to client-side throttling, not priority and fairness, request: GET:https://kubernetes.default.svc.cluster.local/apis/policy/v1?timeout=32s
NAME READY STATUS RESTARTS AGE IP NODE NOMINATED NODE READINESS GATES
fouramf-cron-1667318400-1-31-9427-etcd-0 1/1 Running 0 118s 10.104.9.8 4am-node14 <none> <none>
fouramf-cron-1667318400-1-31-9427-milvus-standalone-858688mwjt8 1/1 Running 0 118s 10.104.5.10 4am-node12 <none> <none>
fouramf-cron-1667318400-1-31-9427-minio-67bcfd6cd8-hl8tn 1/1 Running 0 118s 10.104.5.9 4am-node12 <none> <none> (base.py:126)
[2022-11-02 02:01:39,575 - INFO - fouram]: [Cmd Exe] kubectl get pods -n qa-milvus -o wide | grep -E 'STATUS|fouramf-cron-1667318400-1-31-9427' (util_cmd.py:14)
[2022-11-02 02:01:45,492 - INFO - fouram]: [CliClient] pod details of release(fouramf-cron-1667318400-1-31-9427):
I1102 02:01:40.818647 19034 request.go:665] Waited for 1.165695239s due to client-side throttling, not priority and fairness, request: GET:https://kubernetes.default.svc.cluster.local/apis/node.k8s.io/v1beta1?timeout=32s
NAME READY STATUS RESTARTS AGE IP NODE NOMINATED NODE READINESS GATES
fouramf-cron-1667318400-1-31-9427-etcd-0 1/1 Running 0 84m 10.104.9.8 4am-node14 <none> <none>
fouramf-cron-1667318400-1-31-9427-milvus-standalone-858688mwjt8 1/1 Running 0 84m 10.104.5.10 4am-node12 <none> <none>
fouramf-cron-1667318400-1-31-9427-minio-67bcfd6cd8-hl8tn 1/1 Running 0 84m 10.104.5.9 4am-node12 <none> <none> (cli_client.py:123)
[2022-11-02 02:01:45,494 - INFO - fouram]: [Base] Start deleting services: fouramf-cron-1667318400-1-31-9427 (base.py:129)
[2022-11-02 02:01:45,494 - INFO - fouram]: [Cmd Exe] kubectl get pvc -n qa-milvus | grep -E 'STATUS|fouramf-cron-1667318400-1-31-9427' (util_cmd.py:14)
[2022-11-02 02:01:51,369 - INFO - fouram]: [CliClient] pvc storage class of release(fouramf-cron-1667318400-1-31-9427):
I1102 02:01:46.742240 19075 request.go:665] Waited for 1.162965014s due to client-side throttling, not priority and fairness, request: GET:https://kubernetes.default.svc.cluster.local/apis/policy/v1?timeout=32s
NAME STATUS VOLUME CAPACITY ACCESS MODES STORAGECLASS AGE
data-fouramf-cron-1667318400-1-31-9427-etcd-0 Bound pvc-d88d0be7-b223-4d63-a4ea-2887d96e1df1 10Gi RWO local-path 84m
fouramf-cron-1667318400-1-31-9427-milvus Bound pvc-f5c61bbb-9047-48f2-b97e-a7279f76d4cb 50Gi RWO local-path 84m
fouramf-cron-1667318400-1-31-9427-minio Bound pvc-e439a16c-3a01-4182-91e6-b2357afb354d 500Gi RWO local-path 84m (cli_client.py:131)
```
<img width="1868" alt="截屏2022-11-02 12 08 16" src="https://user-images.githubusercontent.com/26307815/199393614-7c41cbb3-8abe-41c4-9b6a-b9b9955ece26.png">
<img width="1869" alt="截屏2022-11-02 12 09 06" src="https://user-images.githubusercontent.com/26307815/199393740-0c152208-6d0b-4db4-b50d-4859f91fb7cf.png">
client log:
[test_recall_glove_hnsw_standalone_1.zip](https://github.com/milvus-io/milvus/files/9915863/test_recall_glove_hnsw_standalone_1.zip)
```
[2022-11-02 01:48:00,880 - INFO - fouram]: [PerfTemplate] Actual parameters used: {'dataset_params': {'dim': 200, 'dataset_name': 'glove-200-angular', 'ni_per': 10000}, 'collection_params': {'other_fields': []}, 'load_params': {'replica_number': 1}, 'search_params': {'top_k': 10, 'nq': 10000, 'search_param': {'ef': 256}}, 'index_params': {'index_type': 'HNSW', 'index_param': {'M': 36, 'efConstruction': 500}}} (performance_template.py:59)
[2022-11-02 01:48:00,884 - INFO - fouram]: [AccCases] Params of search: {'data': array([[-0.02773983, 0.0795716 , -0.07341436, ..., 0.0432257 ,
-0.09719583, -0.08906623],
[-0.00701086, -0.03715183, 0.0632261 , ..., -0.0334666 ,
-0.09059103, 0.05058862],
[ 0.04278408, -0.0195656 , -0.0546626 , ..., 0.0624952 ,
-0.13264737, -0.01105194],
...,
[ 0.03442654, 0.13952227, 0.02177458, ..., 0.04648568,
-0.05786795, -0.06752396],
[ 0.04487952, -0.03255282, 0.07743784, ..., 0.08966012,
-0.01795752, 0.07155532],
[-0.0895235 , -0.12985045, 0.11885931, ..., 0.0558663 ,
0.07499653, -0.15403427]], dtype=float32), 'anns_field': 'float_vector', 'param': {'metric_type': 'IP', 'params': {'ef': 256}}, 'limit': 10} (accuracy_cases.py:145)
[2022-11-02 01:48:00,884 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 256}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:49:53,745 - INFO - fouram]: [Time] Collection.search run in 112.8602s (api_request.py:29)
[2022-11-02 01:49:53,745 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 256}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:50:35,318 - ERROR - fouram]: Traceback (most recent call last):
File "/src/fouram/client/util/api_request.py", line 21, in inner_wrapper
res = func(*args, **kwargs)
File "/src/fouram/client/util/api_request.py", line 57, in api_request
return func(*arg, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/orm/collection.py", line 719, in search
res = conn.search(self._name, data, anns_field, param, limit, expr,
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 109, in handler
raise e
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 105, in handler
return func(*args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 136, in handler
ret = func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 85, in handler
raise e
File "/usr/local/lib/python3.8/dist-packages/pymilvus/decorators.py", line 50, in handler
return func(self, *args, **kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 476, in search
return self._execute_search_requests(requests, timeout, **_kwargs)
File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 440, in _execute_search_requests
raise pre_err
File "/usr/local/lib/python3.8/dist-packages/pymilvus/client/grpc_handler.py", line 431, in _execute_search_requests
raise MilvusException(response.status.error_code, response.status.reason)
pymilvus.exceptions.MilvusException: <MilvusException: (code=1, message=fail to search on all shard leaders, err=Channel: by-dev-rootcoord-dml_0_437085668438704132v0 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
Channel: by-dev-rootcoord-dml_1_437085668438704132v1 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
)>
(api_request.py:35)
[2022-11-02 01:50:35,319 - ERROR - fouram]: (api_response) : <MilvusException: (code=1, message=fail to search on all shard leaders, err=Channel: by-dev-rootcoord-dml_0_437085668438704132v0 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
Channel: by-dev-rootcoord-dml_1_437085668438704132v1 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
)> (api_request.py:36)
[2022-11-02 01:50:35,319 - ERROR - fouram]: [CheckFunc] search request check failed, response:<MilvusException: (code=1, message=fail to search on all shard leaders, err=Channel: by-dev-rootcoord-dml_0_437085668438704132v0 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
Channel: by-dev-rootcoord-dml_1_437085668438704132v1 returns err: err: rpc error: code = Unavailable desc = keepalive ping failed to receive ACK within timeout
, /go/src/github.com/milvus-io/milvus/internal/util/trace/stack_trace.go:51 github.com/milvus-io/milvus/internal/util/trace.StackTrace
/go/src/github.com/milvus-io/milvus/internal/util/grpcclient/client.go:277 github.com/milvus-io/milvus/internal/util/grpcclient.(*ClientBase[...]).Call
/go/src/github.com/milvus-io/milvus/internal/distributed/querynode/client/client.go:254 github.com/milvus-io/milvus/internal/distributed/querynode/client.(*Client).Search
/go/src/github.com/milvus-io/milvus/internal/proxy/task_search.go:484 github.com/milvus-io/milvus/internal/proxy.(*searchTask).searchShard
/go/src/github.com/milvus-io/milvus/internal/proxy/task_policies.go:127 github.com/milvus-io/milvus/internal/proxy.mergeRoundRobinPolicy.func1
/usr/local/go/src/runtime/asm_amd64.s:1571 runtime.goexit
)> (func_check.py:40)
[2022-11-02 01:50:35,319 - ERROR - fouram]: [AccCases] Search raise error: (accuracy_cases.py:173)
[2022-11-02 01:50:35,319 - INFO - fouram]: [PerfTemplate] Actual parameters used: {'dataset_params': {'dim': 200, 'dataset_name': 'glove-200-angular', 'ni_per': 10000}, 'collection_params': {'other_fields': []}, 'load_params': {'replica_number': 1}, 'search_params': {'top_k': 10, 'nq': 10000, 'search_param': {'ef': 512}}, 'index_params': {'index_type': 'HNSW', 'index_param': {'M': 36, 'efConstruction': 500}}} (performance_template.py:59)
[2022-11-02 01:50:35,322 - INFO - fouram]: [AccCases] Params of search: {'data': array([[-0.02773983, 0.0795716 , -0.07341436, ..., 0.0432257 ,
-0.09719583, -0.08906623],
[-0.00701086, -0.03715183, 0.0632261 , ..., -0.0334666 ,
-0.09059103, 0.05058862],
[ 0.04278408, -0.0195656 , -0.0546626 , ..., 0.0624952 ,
-0.13264737, -0.01105194],
...,
[ 0.03442654, 0.13952227, 0.02177458, ..., 0.04648568,
-0.05786795, -0.06752396],
[ 0.04487952, -0.03255282, 0.07743784, ..., 0.08966012,
-0.01795752, 0.07155532],
[-0.0895235 , -0.12985045, 0.11885931, ..., 0.0558663 ,
0.07499653, -0.15403427]], dtype=float32), 'anns_field': 'float_vector', 'param': {'metric_type': 'IP', 'params': {'ef': 512}}, 'limit': 10} (accuracy_cases.py:145)
[2022-11-02 01:50:35,323 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:54:20,089 - INFO - fouram]: [Time] Collection.search run in 224.7647s (api_request.py:29)
[2022-11-02 01:54:20,089 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:56:21,880 - INFO - fouram]: [Time] Collection.search run in 121.7901s (api_request.py:29)
[2022-11-02 01:56:21,887 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:58:18,088 - INFO - fouram]: [Time] Collection.search run in 116.2s (api_request.py:29)
[2022-11-02 01:58:18,095 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 01:59:59,238 - INFO - fouram]: [Time] Collection.search run in 101.1411s (api_request.py:29)
[2022-11-02 01:59:59,245 - INFO - fouram]: [Base] Params of search: nq:10000, anns_field:float_vector, param:{'metric_type': 'IP', 'params': {'ef': 512}}, limit:10, expr:"None" (base.py:268)
[2022-11-02 02:01:37,642 - INFO - fouram]: [Time] Collection.search run in 98.3958s (api_request.py:29)
[2022-11-02 02:01:37,718 - INFO - fouram]: [AccCases] Search result:{'Recall': 0.977, 'RT': 132.4583, 'LastRT': 98.3958} (accuracy_cases.py:169)
[2022-11-02 02:01:37,734 - INFO - fouram]: [PerfTemplate] Report data:
{'server': {'deploy_tool': 'helm',
'deploy_mode': 'standalone',
'config_name': 'standalone_8c16m',
'config': {'standalone': {'resources': {'limits': {'cpu': '8.0',
'memory': '16Gi'},
'requests': {'cpu': '5.0',
'memory': '9Gi'}},
'persistence': {'persistentVolumeClaim': {'storageClass': 'local-path'}}},
'cluster': {'enabled': False},
'etcd': {'replicaCount': 1,
'global': {'storageClass': 'local-path'},
'metrics': {'enabled': True,
'podMonitor': {'enabled': True}}},
'minio': {'mode': 'standalone',
'persistence': {'storageClass': 'local-path'},
'metrics': {'serviceMonitor': {'enabled': False}}},
'pulsar': {'enabled': False},
'metrics': {'serviceMonitor': {'enabled': True}},
'image': {'all': {'repository': 'harbor.milvus.io/dockerhub/milvusdb/milvus',
'tag': 'master-20221101-29f2ed67'}}},
'host': 'fouramf-cron-1667318400-1-31-9427-milvus.qa-milvus.svc.cluster.local'},
'client': {'test_case_type': 'AccCases',
'test_case_name': 'test_recall_glove_hnsw_standalone',
'test_case_params': {'dataset_params': {'dim': 200,
'dataset_name': 'glove-200-angular',
'ni_per': 10000},
'collection_params': {'other_fields': []},
'load_params': {'replica_number': 1},
'search_params': {'top_k': 10,
'nq': 10000,
'search_param': {'ef': 512}},
'index_params': {'index_type': 'HNSW',
'index_param': {'M': 36,
'efConstruction': 500}}},
'run_id': 2022110294117861,
'datetime': '2022-11-02 00:36:51.144710',
'client_version': '2.2'},
'result': {'test_result': {'ann_insert': {'total_time': 28.6393},
'index': {'build_time': 1271.2825},
'search': {'Recall': 0.977,
'RT': 132.4583,
'LastRT': 98.3958}}}} (performance_template.py:67)
```
### Expected Behavior
_No response_
### Steps To Reproduce
```markdown
1. create a collection or use an existing collection
2. insert training dataset
3. flush collection
4. clean index and build new index : HNSW
5. load collection
6. search with different parameters
7. clean all collections or not
```
### Milvus Log
milvus deplocy:
```
helm -n qa-milvus upgrade --install --set standalone.resources.limits.cpu=8.0,standalone.resources.limits.memory=16Gi,standalone.resources.requests.cpu=5.0,standalone.resources.requests.memory=9Gi,standalone.persistence.persistentVolumeClaim.storageClass=local-path,cluster.enabled=False,etcd.replicaCount=1,etcd.global.storageClass=local-path,etcd.metrics.enabled=True,etcd.metrics.podMonitor.enabled=True,minio.mode=standalone,minio.persistence.storageClass=local-path,minio.metrics.serviceMonitor.enabled=False,pulsar.enabled=False,metrics.serviceMonitor.enabled=True,image.all.repository=harbor.milvus.io/dockerhub/milvusdb/milvus,image.all.tag=master-20221101-29f2ed67 --wait --timeout 30m fouramf-cron-1667318400-1-31-9427 /home/helm/charts/milvus/
```
### Anything else?
case params:
```
{
"dataset_params": {
"dim": 200,
"dataset_name": "glove-200-angular",
"ni_per": 10000
},
"collection_params": {
"other_fields": []
},
"load_params": {
"replica_number": 1
},
"search_params": {
"top_k": [
10
],
"nq": [
10000
],
"search_param": {
"ef": [
10,
16,
32,
64,
128,
256,
512
]
}
},
"index_params": {
"index_type": "HNSW",
"index_param": {
"M": 36,
"efConstruction": 500
}
}
}
```
|
non_defect
|
serial search raise error unavailable desc keepalive ping failed to receive ack within timeout is there an existing issue for this i have searched the existing issues environment markdown milvus version master deployment mode standalone or cluster standalone sdk version e g pymilvus os ubuntu or centos cpu memory gpu others current behavior argo task fouramf cron test case test recall glove hnsw standalone server deploy initial state request go waited for due to client side throttling not priority and fairness request get name ready status restarts age ip node nominated node readiness gates fouramf cron etcd running fouramf cron milvus standalone running fouramf cron minio running base py kubectl get pods n qa milvus o wide grep e status fouramf cron util cmd py pod details of release fouramf cron request go waited for due to client side throttling not priority and fairness request get name ready status restarts age ip node nominated node readiness gates fouramf cron etcd running fouramf cron milvus standalone running fouramf cron minio running cli client py start deleting services fouramf cron base py kubectl get pvc n qa milvus grep e status fouramf cron util cmd py pvc storage class of release fouramf cron request go waited for due to client side throttling not priority and fairness request get name status volume capacity access modes storageclass age data fouramf cron etcd bound pvc rwo local path fouramf cron milvus bound pvc rwo local path fouramf cron minio bound pvc rwo local path cli client py img width alt src img width alt src client log actual parameters used dataset params dim dataset name glove angular ni per collection params other fields load params replica number search params top k nq search param ef index params index type hnsw index param m efconstruction performance template py params of search data array dtype anns field float vector param metric type ip params ef limit accuracy cases py params of search nq anns field float vector param metric type ip params ef limit expr none base py collection search run in api request py params of search nq anns field float vector param metric type ip params ef limit expr none base py traceback most recent call last file src fouram client util api request py line in inner wrapper res func args kwargs file src fouram client util api request py line in api request return func arg kwargs file usr local lib dist packages pymilvus orm collection py line in search res conn search self name data anns field param limit expr file usr local lib dist packages pymilvus decorators py line in handler raise e file usr local lib dist packages pymilvus decorators py line in handler return func args kwargs file usr local lib dist packages pymilvus decorators py line in handler ret func self args kwargs file usr local lib dist packages pymilvus decorators py line in handler raise e file usr local lib dist packages pymilvus decorators py line in handler return func self args kwargs file usr local lib dist packages pymilvus client grpc handler py line in search return self execute search requests requests timeout kwargs file usr local lib dist packages pymilvus client grpc handler py line in execute search requests raise pre err file usr local lib dist packages pymilvus client grpc handler py line in execute search requests raise milvusexception response status error code response status reason pymilvus exceptions milvusexception milvusexception code message fail to search on all shard leaders err channel by dev rootcoord dml returns err err rpc error code unavailable desc keepalive ping failed to receive ack within timeout go src github com milvus io milvus internal util trace stack trace go github com milvus io milvus internal util trace stacktrace go src github com milvus io milvus internal util grpcclient client go github com milvus io milvus internal util grpcclient clientbase call go src github com milvus io milvus internal distributed querynode client client go github com milvus io milvus internal distributed querynode client client search go src github com milvus io milvus internal proxy task search go github com milvus io milvus internal proxy searchtask searchshard go src github com milvus io milvus internal proxy task policies go github com milvus io milvus internal proxy mergeroundrobinpolicy usr local go src runtime asm s runtime goexit channel by dev rootcoord dml returns err err rpc error code unavailable desc keepalive ping failed to receive ack within timeout go src github com milvus io milvus internal util trace stack trace go github com milvus io milvus internal util trace stacktrace go src github com milvus io milvus internal util grpcclient client go github com milvus io milvus internal util grpcclient clientbase call go src github com milvus io milvus internal distributed querynode client client go github com milvus io milvus internal distributed querynode client client search go src github com milvus io milvus internal proxy task search go github com milvus io milvus internal proxy searchtask searchshard go src github com milvus io milvus internal proxy task policies go github com milvus io milvus internal proxy mergeroundrobinpolicy usr local go src runtime asm s runtime goexit api request py api response milvusexception code message fail to search on all shard leaders err channel by dev rootcoord dml returns err err rpc error code unavailable desc keepalive ping failed to receive ack within timeout go src github com milvus io milvus internal util trace stack trace go github com milvus io milvus internal util trace stacktrace go src github com milvus io milvus internal util grpcclient client go github com milvus io milvus internal util grpcclient clientbase call go src github com milvus io milvus internal distributed querynode client client go github com milvus io milvus internal distributed querynode client client search go src github com milvus io milvus internal proxy task search go github com milvus io milvus internal proxy searchtask searchshard go src github com milvus io milvus internal proxy task policies go github com milvus io milvus internal proxy mergeroundrobinpolicy usr local go src runtime asm s runtime goexit channel by dev rootcoord dml returns err err rpc error code unavailable desc keepalive ping failed to receive ack within timeout go src github com milvus io milvus internal util trace stack trace go github com milvus io milvus internal util trace stacktrace go src github com milvus io milvus internal util grpcclient client go github com milvus io milvus internal util grpcclient clientbase call go src github com milvus io milvus internal distributed querynode client client go github com milvus io milvus internal distributed querynode client client search go src github com milvus io milvus internal proxy task search go github com milvus io milvus internal proxy searchtask searchshard go src github com milvus io milvus internal proxy task policies go github com milvus io milvus internal proxy mergeroundrobinpolicy usr local go src runtime asm s runtime goexit api request py search request check failed response milvusexception code message fail to search on all shard leaders err channel by dev rootcoord dml returns err err rpc error code unavailable desc keepalive ping failed to receive ack within timeout go src github com milvus io milvus internal util trace stack trace go github com milvus io milvus internal util trace stacktrace go src github com milvus io milvus internal util grpcclient client go github com milvus io milvus internal util grpcclient clientbase call go src github com milvus io milvus internal distributed querynode client client go github com milvus io milvus internal distributed querynode client client search go src github com milvus io milvus internal proxy task search go github com milvus io milvus internal proxy searchtask searchshard go src github com milvus io milvus internal proxy task policies go github com milvus io milvus internal proxy mergeroundrobinpolicy usr local go src runtime asm s runtime goexit channel by dev rootcoord dml returns err err rpc error code unavailable desc keepalive ping failed to receive ack within timeout go src github com milvus io milvus internal util trace stack trace go github com milvus io milvus internal util trace stacktrace go src github com milvus io milvus internal util grpcclient client go github com milvus io milvus internal util grpcclient clientbase call go src github com milvus io milvus internal distributed querynode client client go github com milvus io milvus internal distributed querynode client client search go src github com milvus io milvus internal proxy task search go github com milvus io milvus internal proxy searchtask searchshard go src github com milvus io milvus internal proxy task policies go github com milvus io milvus internal proxy mergeroundrobinpolicy usr local go src runtime asm s runtime goexit func check py search raise error accuracy cases py actual parameters used dataset params dim dataset name glove angular ni per collection params other fields load params replica number search params top k nq search param ef index params index type hnsw index param m efconstruction performance template py params of search data array dtype anns field float vector param metric type ip params ef limit accuracy cases py params of search nq anns field float vector param metric type ip params ef limit expr none base py collection search run in api request py params of search nq anns field float vector param metric type ip params ef limit expr none base py collection search run in api request py params of search nq anns field float vector param metric type ip params ef limit expr none base py collection search run in api request py params of search nq anns field float vector param metric type ip params ef limit expr none base py collection search run in api request py params of search nq anns field float vector param metric type ip params ef limit expr none base py collection search run in api request py search result recall rt lastrt accuracy cases py report data server deploy tool helm deploy mode standalone config name standalone config standalone resources limits cpu memory requests cpu memory persistence persistentvolumeclaim storageclass local path cluster enabled false etcd replicacount global storageclass local path metrics enabled true podmonitor enabled true minio mode standalone persistence storageclass local path metrics servicemonitor enabled false pulsar enabled false metrics servicemonitor enabled true image all repository harbor milvus io dockerhub milvusdb milvus tag master host fouramf cron milvus qa milvus svc cluster local client test case type acccases test case name test recall glove hnsw standalone test case params dataset params dim dataset name glove angular ni per collection params other fields load params replica number search params top k nq search param ef index params index type hnsw index param m efconstruction run id datetime client version result test result ann insert total time index build time search recall rt lastrt performance template py expected behavior no response steps to reproduce markdown create a collection or use an existing collection insert training dataset flush collection clean index and build new index hnsw load collection search with different parameters clean all collections or not milvus log milvus deplocy helm n qa milvus upgrade install set standalone resources limits cpu standalone resources limits memory standalone resources requests cpu standalone resources requests memory standalone persistence persistentvolumeclaim storageclass local path cluster enabled false etcd replicacount etcd global storageclass local path etcd metrics enabled true etcd metrics podmonitor enabled true minio mode standalone minio persistence storageclass local path minio metrics servicemonitor enabled false pulsar enabled false metrics servicemonitor enabled true image all repository harbor milvus io dockerhub milvusdb milvus image all tag master wait timeout fouramf cron home helm charts milvus anything else case params dataset params dim dataset name glove angular ni per collection params other fields load params replica number search params top k nq search param ef index params index type hnsw index param m efconstruction
| 0
|
42,594
| 11,161,745,774
|
IssuesEvent
|
2019-12-26 15:01:17
|
primefaces/primefaces
|
https://api.github.com/repos/primefaces/primefaces
|
closed
|
DataTable: multiViewState reset doesn't work correctly
|
defect
|
Checkout and open:
http://localhost:8080/showcase/ui/data/datatable/tableState.xhtml
If you click on "Clear table state" in the second table, only one clientId is processed in #showMessage.
@Rapster can you check that?
|
1.0
|
DataTable: multiViewState reset doesn't work correctly - Checkout and open:
http://localhost:8080/showcase/ui/data/datatable/tableState.xhtml
If you click on "Clear table state" in the second table, only one clientId is processed in #showMessage.
@Rapster can you check that?
|
defect
|
datatable multiviewstate reset doesn t work correctly checkout and open if you click on clear table state in the second table only one clientid is processed in showmessage rapster can you check that
| 1
|
306,011
| 9,379,532,765
|
IssuesEvent
|
2019-04-04 15:06:55
|
StrangeLoopGames/EcoIssues
|
https://api.github.com/repos/StrangeLoopGames/EcoIssues
|
closed
|
[0.8.1.0-release-preview] No road tool animation in 3d person view
|
Medium Priority Quality Assurance
|
[0.8.1.0-release-preview]
no mods
**Steps**
1. Go in 3d person view mode
2. Take road tool
3. Road it
**Expected result**
animation for road tool
**Actual result**
No actions at all. Not even for picking a tool.

|
1.0
|
[0.8.1.0-release-preview] No road tool animation in 3d person view - [0.8.1.0-release-preview]
no mods
**Steps**
1. Go in 3d person view mode
2. Take road tool
3. Road it
**Expected result**
animation for road tool
**Actual result**
No actions at all. Not even for picking a tool.

|
non_defect
|
no road tool animation in person view no mods steps go in person view mode take road tool road it expected result animation for road tool actual result no actions at all not even for picking a tool
| 0
|
21,626
| 3,525,845,634
|
IssuesEvent
|
2016-01-14 00:24:52
|
gadLinux/hibernate-generic-dao
|
https://api.github.com/repos/gadLinux/hibernate-generic-dao
|
closed
|
save() doesn't get the id of the entity
|
auto-migrated Priority-Low Type-Defect
|
```
Caused by: java.lang.NullPointerException
at
com.trg.dao.hibernate.HibernateBaseDAO._saveOrUpdateIsNew(HibernateBaseDAO.java:
158)
at com.trg.dao.hibernate.GeneralDAOImpl.save(GeneralDAOImpl.java:87)
at model.db.dbOperations.changeUserPass(dbOperations.java:67)
the method save was called in this way:
public static boolean changeUserPass(String username, String newPass) {
UniversalDAO dao=new UniversalDAO();
User u=dao.find(User.class, username);
u.setPassword(utils.CryptoUtils.hexMD5(newPass));
dao.save(u);
return true;
}
and the entity is mapped by annotations:
@Id
@Column(name="username", unique=true, nullable=false, length=20)
public String getUsername() {
return this.username;
}
in Hibernate General DAOs why don't you use the native SaveOrUpdate()
method of Hibernate Session for this purpose?
in my app i resolved in this way (waiting for your bugfix)
/**
*
* @author stefano
*/
public class UniversalDAO extends GeneralDAOImpl {
@Override
protected Session getSession() {
return SessionManager.currentSession();
}
public boolean save2(Object entity) {
Session sess=getSession();
Transaction tx = sess.beginTransaction();
tx.begin();
sess.saveOrUpdate(entity);
tx.commit();
return true;
}
}
```
Original issue reported on code.google.com by `rastrano` on 25 Aug 2009 at 12:09
|
1.0
|
save() doesn't get the id of the entity - ```
Caused by: java.lang.NullPointerException
at
com.trg.dao.hibernate.HibernateBaseDAO._saveOrUpdateIsNew(HibernateBaseDAO.java:
158)
at com.trg.dao.hibernate.GeneralDAOImpl.save(GeneralDAOImpl.java:87)
at model.db.dbOperations.changeUserPass(dbOperations.java:67)
the method save was called in this way:
public static boolean changeUserPass(String username, String newPass) {
UniversalDAO dao=new UniversalDAO();
User u=dao.find(User.class, username);
u.setPassword(utils.CryptoUtils.hexMD5(newPass));
dao.save(u);
return true;
}
and the entity is mapped by annotations:
@Id
@Column(name="username", unique=true, nullable=false, length=20)
public String getUsername() {
return this.username;
}
in Hibernate General DAOs why don't you use the native SaveOrUpdate()
method of Hibernate Session for this purpose?
in my app i resolved in this way (waiting for your bugfix)
/**
*
* @author stefano
*/
public class UniversalDAO extends GeneralDAOImpl {
@Override
protected Session getSession() {
return SessionManager.currentSession();
}
public boolean save2(Object entity) {
Session sess=getSession();
Transaction tx = sess.beginTransaction();
tx.begin();
sess.saveOrUpdate(entity);
tx.commit();
return true;
}
}
```
Original issue reported on code.google.com by `rastrano` on 25 Aug 2009 at 12:09
|
defect
|
save doesn t get the id of the entity caused by java lang nullpointerexception at com trg dao hibernate hibernatebasedao saveorupdateisnew hibernatebasedao java at com trg dao hibernate generaldaoimpl save generaldaoimpl java at model db dboperations changeuserpass dboperations java the method save was called in this way public static boolean changeuserpass string username string newpass universaldao dao new universaldao user u dao find user class username u setpassword utils cryptoutils newpass dao save u return true and the entity is mapped by annotations id column name username unique true nullable false length public string getusername return this username in hibernate general daos why don t you use the native saveorupdate method of hibernate session for this purpose in my app i resolved in this way waiting for your bugfix author stefano public class universaldao extends generaldaoimpl override protected session getsession return sessionmanager currentsession public boolean object entity session sess getsession transaction tx sess begintransaction tx begin sess saveorupdate entity tx commit return true original issue reported on code google com by rastrano on aug at
| 1
|
170,983
| 14,274,154,550
|
IssuesEvent
|
2020-11-22 02:00:19
|
peterthehan/create-discord-bot
|
https://api.github.com/repos/peterthehan/create-discord-bot
|
closed
|
Improve README readability
|
documentation wontfix
|
Existing README assumes some technical knowledge of the user and is not layman-friendly. The target audience should be lowered considering the context of the project and/or there should be multiple versions of the README that expands/condenses the information appropriately from which the user can choose from.
|
1.0
|
Improve README readability - Existing README assumes some technical knowledge of the user and is not layman-friendly. The target audience should be lowered considering the context of the project and/or there should be multiple versions of the README that expands/condenses the information appropriately from which the user can choose from.
|
non_defect
|
improve readme readability existing readme assumes some technical knowledge of the user and is not layman friendly the target audience should be lowered considering the context of the project and or there should be multiple versions of the readme that expands condenses the information appropriately from which the user can choose from
| 0
|
71,414
| 9,522,930,185
|
IssuesEvent
|
2019-04-27 13:09:37
|
sequelize/sequelize
|
https://api.github.com/repos/sequelize/sequelize
|
closed
|
Stick to TypeScript for types declarations in comments
|
documentation suggestion
|
The comment above `Model.save` documents the return value this way:
```js
/**
* Validate this instance,
...
* @return {Promise<this|Errors.ValidationError>}
*/
save(options) {
```
But [in TypeScript](https://github.com/Microsoft/TypeScript/issues/7588#issuecomment-198700729) the generic type `T` in `Promise<T>` only documents the type of the success path of a promise.
Instead, `ValidationError` goes on the failure path because in `instance-validator.js` we read
```js
if (this.errors.length) {
throw new sequelizeError.ValidationError(null, this.errors);
}
```
Besides, `ValidationError` is just one of many thrown errors.
Additionally, `this` is not a type (class) but a value (instance).
Thus, the correct return type for `Model.save` which makes sense in TypeScript is `Promise<Model>`.
A correct documentation could be:
```js
/**
* Validate this instance,
...
* @return {Promise<Model>} success: this, failure: Error | Errors.ValidationError
*/
save(options) {
```
|
1.0
|
Stick to TypeScript for types declarations in comments - The comment above `Model.save` documents the return value this way:
```js
/**
* Validate this instance,
...
* @return {Promise<this|Errors.ValidationError>}
*/
save(options) {
```
But [in TypeScript](https://github.com/Microsoft/TypeScript/issues/7588#issuecomment-198700729) the generic type `T` in `Promise<T>` only documents the type of the success path of a promise.
Instead, `ValidationError` goes on the failure path because in `instance-validator.js` we read
```js
if (this.errors.length) {
throw new sequelizeError.ValidationError(null, this.errors);
}
```
Besides, `ValidationError` is just one of many thrown errors.
Additionally, `this` is not a type (class) but a value (instance).
Thus, the correct return type for `Model.save` which makes sense in TypeScript is `Promise<Model>`.
A correct documentation could be:
```js
/**
* Validate this instance,
...
* @return {Promise<Model>} success: this, failure: Error | Errors.ValidationError
*/
save(options) {
```
|
non_defect
|
stick to typescript for types declarations in comments the comment above model save documents the return value this way js validate this instance return promise save options but the generic type t in promise only documents the type of the success path of a promise instead validationerror goes on the failure path because in instance validator js we read js if this errors length throw new sequelizeerror validationerror null this errors besides validationerror is just one of many thrown errors additionally this is not a type class but a value instance thus the correct return type for model save which makes sense in typescript is promise a correct documentation could be js validate this instance return promise success this failure error errors validationerror save options
| 0
|
104,093
| 22,589,062,968
|
IssuesEvent
|
2022-06-28 17:56:02
|
creativecommons/sre-salt-prime
|
https://api.github.com/repos/creativecommons/sre-salt-prime
|
closed
|
Update OS of `openglam__prod__us-east-2`
|
🟨 priority: medium 🏁 status: ready for work ✨ goal: improvement 💻 aspect: code 🔒 staff only
|
## Description
Rebuild `openglam__prod__us-east-2` to update OS (data preserved in EBS volume)
## Alternatives
<!-- Describe any alternative solutions or features you have considered. How is this feature better? -->
## Additional context
<!-- Add any other context about the feature here; or delete the section entirely. -->
|
1.0
|
Update OS of `openglam__prod__us-east-2` - ## Description
Rebuild `openglam__prod__us-east-2` to update OS (data preserved in EBS volume)
## Alternatives
<!-- Describe any alternative solutions or features you have considered. How is this feature better? -->
## Additional context
<!-- Add any other context about the feature here; or delete the section entirely. -->
|
non_defect
|
update os of openglam prod us east description rebuild openglam prod us east to update os data preserved in ebs volume alternatives additional context
| 0
|
47,129
| 13,056,037,138
|
IssuesEvent
|
2020-07-30 03:27:47
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
closed
|
nohup ./bin/jeb-server-loop doesn't disconnect properly (Trac #34)
|
Migrated from Trac defect jeb-server
|
same error with ./bin/i3daqdispatch-loop.
Once started like:
nohup ./bin/jeb-server-loop &
works fine, but if you send a start signal:
./bin/jeb-server-start
then disconnect the terminal w/ loop running in background, server hangs
Migrated from https://code.icecube.wisc.edu/ticket/34
```json
{
"status": "closed",
"changetime": "2007-11-09T22:43:57",
"description": "same error with ./bin/i3daqdispatch-loop.\n\nOnce started like:\nnohup ./bin/jeb-server-loop &\n\nworks fine, but if you send a start signal:\n./bin/jeb-server-start\n\nthen disconnect the terminal w/ loop running in background, server hangs",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"_ts": "1194648237000000",
"component": "jeb-server",
"summary": "nohup ./bin/jeb-server-loop doesn't disconnect properly",
"priority": "normal",
"keywords": "",
"time": "2007-06-05T19:08:14",
"milestone": "",
"owner": "blaufuss",
"type": "defect"
}
```
|
1.0
|
nohup ./bin/jeb-server-loop doesn't disconnect properly (Trac #34) - same error with ./bin/i3daqdispatch-loop.
Once started like:
nohup ./bin/jeb-server-loop &
works fine, but if you send a start signal:
./bin/jeb-server-start
then disconnect the terminal w/ loop running in background, server hangs
Migrated from https://code.icecube.wisc.edu/ticket/34
```json
{
"status": "closed",
"changetime": "2007-11-09T22:43:57",
"description": "same error with ./bin/i3daqdispatch-loop.\n\nOnce started like:\nnohup ./bin/jeb-server-loop &\n\nworks fine, but if you send a start signal:\n./bin/jeb-server-start\n\nthen disconnect the terminal w/ loop running in background, server hangs",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"_ts": "1194648237000000",
"component": "jeb-server",
"summary": "nohup ./bin/jeb-server-loop doesn't disconnect properly",
"priority": "normal",
"keywords": "",
"time": "2007-06-05T19:08:14",
"milestone": "",
"owner": "blaufuss",
"type": "defect"
}
```
|
defect
|
nohup bin jeb server loop doesn t disconnect properly trac same error with bin loop once started like nohup bin jeb server loop works fine but if you send a start signal bin jeb server start then disconnect the terminal w loop running in background server hangs migrated from json status closed changetime description same error with bin loop n nonce started like nnohup bin jeb server loop n nworks fine but if you send a start signal n bin jeb server start n nthen disconnect the terminal w loop running in background server hangs reporter blaufuss cc resolution fixed ts component jeb server summary nohup bin jeb server loop doesn t disconnect properly priority normal keywords time milestone owner blaufuss type defect
| 1
|
330,397
| 28,374,562,627
|
IssuesEvent
|
2023-04-12 19:43:41
|
sanger-tol/variantcalling
|
https://api.github.com/repos/sanger-tol/variantcalling
|
closed
|
Medium test: `Erithacus rubecula`
|
invalid test
|
### Description of feature
Organism: `Erithacus rubecula`
ID: `bEriRub2`
Main datatype: Illumina
|
1.0
|
Medium test: `Erithacus rubecula` - ### Description of feature
Organism: `Erithacus rubecula`
ID: `bEriRub2`
Main datatype: Illumina
|
non_defect
|
medium test erithacus rubecula description of feature organism erithacus rubecula id main datatype illumina
| 0
|
692,132
| 23,723,862,981
|
IssuesEvent
|
2022-08-30 17:40:34
|
thoth-station/jupyterlab-requirements
|
https://api.github.com/repos/thoth-station/jupyterlab-requirements
|
closed
|
[2pt] plugin is unable to send request to Thoth engine
|
priority/critical-urgent kind/bug sig/user-experience lifecycle/active triage/accepted
|
**Describe the bug**
The plugin is failing to send the locking request to Project Thoth, as it is not updated with the latest user-api changes
it is still sending the same parameters, and causing extra parameter issues.
`Extra query parameter(s) count not in spec",\n "status": 400`
The count parameter was removed in v0.35.2:
https://github.com/thoth-station/user-api/commit/1615c40699c204825cb087640cf66569def22367
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'any notebook'
2. execute the dependency management with plugin
3. See error
**Expected behavior**
Work without issue
**Screenshots**
```
%horus lock
Exception('Error locking dependencies, check pod logs for more details about the error. (400)\nReason: BAD REQUEST\nHTTP response headers: HTTPHeaderDict({\'server\': \'gunicorn\', \'date\': \'Tue, 14 Jun 2022 06:00:38 GMT\', \'content-type\': \'application/problem+json\', \'content-length\': \'120\', \'access-control-allow-origin\': \'*\', \'x-thoth-version\': \'0.35.2\', \'x-user-api-service-version\': \'0.35.2+messaging.0.16.1.storages.0.72.1.common.0.36.2.python.0.16.10\', \'x-thoth-search-ui-url\': \'[https://thoth-station.ninja/search/\](https://thoth-station.ninja/search//)', \'set-cookie\': \'99770cb82864be05282857f803e02327=71283d09b941169601817f89b67f6781; path=/; HttpOnly; Secure; SameSite=None\'})\nHTTP response body: {\n "detail": "Extra query parameter(s) count not in spec",\n "status": 400,\n "title": null,\n "type": "about:blank"\n}\n\n')
```
**Environment information**
Describe the environment from where you are using the extension:
- jupyterlab version: 3.1.14
- jupyterlab-requirements version: 0.11.0
|
1.0
|
[2pt] plugin is unable to send request to Thoth engine - **Describe the bug**
The plugin is failing to send the locking request to Project Thoth, as it is not updated with the latest user-api changes
it is still sending the same parameters, and causing extra parameter issues.
`Extra query parameter(s) count not in spec",\n "status": 400`
The count parameter was removed in v0.35.2:
https://github.com/thoth-station/user-api/commit/1615c40699c204825cb087640cf66569def22367
**To Reproduce**
Steps to reproduce the behavior:
1. Go to 'any notebook'
2. execute the dependency management with plugin
3. See error
**Expected behavior**
Work without issue
**Screenshots**
```
%horus lock
Exception('Error locking dependencies, check pod logs for more details about the error. (400)\nReason: BAD REQUEST\nHTTP response headers: HTTPHeaderDict({\'server\': \'gunicorn\', \'date\': \'Tue, 14 Jun 2022 06:00:38 GMT\', \'content-type\': \'application/problem+json\', \'content-length\': \'120\', \'access-control-allow-origin\': \'*\', \'x-thoth-version\': \'0.35.2\', \'x-user-api-service-version\': \'0.35.2+messaging.0.16.1.storages.0.72.1.common.0.36.2.python.0.16.10\', \'x-thoth-search-ui-url\': \'[https://thoth-station.ninja/search/\](https://thoth-station.ninja/search//)', \'set-cookie\': \'99770cb82864be05282857f803e02327=71283d09b941169601817f89b67f6781; path=/; HttpOnly; Secure; SameSite=None\'})\nHTTP response body: {\n "detail": "Extra query parameter(s) count not in spec",\n "status": 400,\n "title": null,\n "type": "about:blank"\n}\n\n')
```
**Environment information**
Describe the environment from where you are using the extension:
- jupyterlab version: 3.1.14
- jupyterlab-requirements version: 0.11.0
|
non_defect
|
plugin is unable to send request to thoth engine describe the bug the plugin is failing to send the locking request to project thoth as it is not updated with the latest user api changes it is still sending the same parameters and causing extra parameter issues extra query parameter s count not in spec n status the count parameter was removed in to reproduce steps to reproduce the behavior go to any notebook execute the dependency management with plugin see error expected behavior work without issue screenshots horus lock exception error locking dependencies check pod logs for more details about the error nreason bad request nhttp response headers httpheaderdict server gunicorn date tue jun gmt content type application problem json content length access control allow origin x thoth version x user api service version messaging storages common python x thoth search ui url set cookie path httponly secure samesite none nhttp response body n detail extra query parameter s count not in spec n status n title null n type about blank n n n environment information describe the environment from where you are using the extension jupyterlab version jupyterlab requirements version
| 0
|
437,339
| 30,594,913,083
|
IssuesEvent
|
2023-07-21 20:49:14
|
DCC-EX/dcc-ex.github.io
|
https://api.github.com/repos/DCC-EX/dcc-ex.github.io
|
closed
|
[Documentation Update]: Caveats on using some I²C board in series.
|
Documentation
|
### Documentation details
Message from Discord re limitations of some I²C boards being used in series.
https://discord.com/channels/713189617066836079/735194734611398676/1042877309180448778
Not every I2C-expander is designed to be used in series. The PCA9515 NOT, the PCA9511/12/13 yes. See page 24, question 2 of this document https://www.nxp.com/docs/en/application-note/AN255.pdf
### Page with issues
_No response_
|
1.0
|
[Documentation Update]: Caveats on using some I²C board in series. - ### Documentation details
Message from Discord re limitations of some I²C boards being used in series.
https://discord.com/channels/713189617066836079/735194734611398676/1042877309180448778
Not every I2C-expander is designed to be used in series. The PCA9515 NOT, the PCA9511/12/13 yes. See page 24, question 2 of this document https://www.nxp.com/docs/en/application-note/AN255.pdf
### Page with issues
_No response_
|
non_defect
|
caveats on using some i²c board in series documentation details message from discord re limitations of some i²c boards being used in series not every expander is designed to be used in series the not the yes see page question of this document page with issues no response
| 0
|
137,266
| 11,105,234,471
|
IssuesEvent
|
2019-12-17 09:26:19
|
JuliaDynamics/DrWatson.jl
|
https://api.github.com/repos/JuliaDynamics/DrWatson.jl
|
closed
|
Keyword problem with `@tagsave`
|
bug tests
|
I am doing
```
@tagsave(datadir("husimi_$(fname).bson"), @dict Q1 Q2 r0s ks1 ks2 σ; storepatch=false)
```
which gives:
```
ERROR: MethodError: no method matching tagsave(::String, ::Dict{Symbol,Any}, ::String, ::Bool, ::LineNumberNode; storepatch=false)
Closest candidates are:
tagsave(::Any, ::Any, ::Bool, ::Any, ::Any) at deprecated.jl:65 got unsupported keyword argument "storepatch"
tagsave(::Any, ::Any, ::Bool, ::Any, ::Any, ::Any) at deprecated.jl:65 got unsupported keyword argument "storepatch"
tagsave(::Any, ::Any, ::String) at deprecated.jl:65 got unsupported keyword argument "storepatch"
...
```
But using directly `tagsave` works.
@sebastianpech I think at some point you mentioned that we have to remove non-keyword versions. Is this the time to do it?
|
1.0
|
Keyword problem with `@tagsave` - I am doing
```
@tagsave(datadir("husimi_$(fname).bson"), @dict Q1 Q2 r0s ks1 ks2 σ; storepatch=false)
```
which gives:
```
ERROR: MethodError: no method matching tagsave(::String, ::Dict{Symbol,Any}, ::String, ::Bool, ::LineNumberNode; storepatch=false)
Closest candidates are:
tagsave(::Any, ::Any, ::Bool, ::Any, ::Any) at deprecated.jl:65 got unsupported keyword argument "storepatch"
tagsave(::Any, ::Any, ::Bool, ::Any, ::Any, ::Any) at deprecated.jl:65 got unsupported keyword argument "storepatch"
tagsave(::Any, ::Any, ::String) at deprecated.jl:65 got unsupported keyword argument "storepatch"
...
```
But using directly `tagsave` works.
@sebastianpech I think at some point you mentioned that we have to remove non-keyword versions. Is this the time to do it?
|
non_defect
|
keyword problem with tagsave i am doing tagsave datadir husimi fname bson dict σ storepatch false which gives error methoderror no method matching tagsave string dict symbol any string bool linenumbernode storepatch false closest candidates are tagsave any any bool any any at deprecated jl got unsupported keyword argument storepatch tagsave any any bool any any any at deprecated jl got unsupported keyword argument storepatch tagsave any any string at deprecated jl got unsupported keyword argument storepatch but using directly tagsave works sebastianpech i think at some point you mentioned that we have to remove non keyword versions is this the time to do it
| 0
|
438,882
| 30,667,163,311
|
IssuesEvent
|
2023-07-25 19:12:51
|
gravitational/teleport
|
https://api.github.com/repos/gravitational/teleport
|
closed
|
Refresh/redesign SSO documentation
|
documentation sso time-to-value
|
## Details
The Teleport SSO guides - GitHub and Enterprise versions - have grown organically over the years. This led to a drift in page content and a lack of uniformity. For example:
- Different page titles: "Teleport SSO Authentication with GitLab" vs. "SSH Authentication with Google Workspace (G Suite)".
- Some guides offer numbered steps, some don't.
- The links have outdated titles and the listings are incomplete (e.g. https://goteleport.com/docs/enterprise/sso/?scope=enterprise links to just 4 guides in the body text).
There are more issues than those; more generally though, Teleport capabilities have grown and so did the complexity of the SSO guides. The guides should be streamlined and common parts between guides shared or unified. Furthermore, the new commands `tctl sso configure github|saml|oidc` and `tctl sso test` should be used in the guides for better experience:
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-configure-github
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-configure-oidc
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-configure-saml
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-test
Pages affected:
- Enterprise SSO overview: https://goteleport.com/docs/enterprise/sso/?scope=enterprise
- Github SSO guide: https://goteleport.com/docs/setup/admin/github-sso/?scope=enterprise
- Individual SAML/OIDC providers guides, e.g: https://goteleport.com/docs/enterprise/sso/google-workspace/?scope=enterprise https://goteleport.com/docs/enterprise/sso/gitlab/?scope=enterprise
### Category
- Improve Existing
- Remove Outdated
## Progress
- [x] Azure Active Directory (AD)
- [x] Active Directory (ADFS)
- [x] Google Workspace
- [x] GitHub
- [x] GitLab
- [x] OneLogin
- [x] OIDC
- [x] Okta
|
1.0
|
Refresh/redesign SSO documentation - ## Details
The Teleport SSO guides - GitHub and Enterprise versions - have grown organically over the years. This led to a drift in page content and a lack of uniformity. For example:
- Different page titles: "Teleport SSO Authentication with GitLab" vs. "SSH Authentication with Google Workspace (G Suite)".
- Some guides offer numbered steps, some don't.
- The links have outdated titles and the listings are incomplete (e.g. https://goteleport.com/docs/enterprise/sso/?scope=enterprise links to just 4 guides in the body text).
There are more issues than those; more generally though, Teleport capabilities have grown and so did the complexity of the SSO guides. The guides should be streamlined and common parts between guides shared or unified. Furthermore, the new commands `tctl sso configure github|saml|oidc` and `tctl sso test` should be used in the guides for better experience:
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-configure-github
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-configure-oidc
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-configure-saml
- https://goteleport.com/docs/setup/reference/cli/#tctl-sso-test
Pages affected:
- Enterprise SSO overview: https://goteleport.com/docs/enterprise/sso/?scope=enterprise
- Github SSO guide: https://goteleport.com/docs/setup/admin/github-sso/?scope=enterprise
- Individual SAML/OIDC providers guides, e.g: https://goteleport.com/docs/enterprise/sso/google-workspace/?scope=enterprise https://goteleport.com/docs/enterprise/sso/gitlab/?scope=enterprise
### Category
- Improve Existing
- Remove Outdated
## Progress
- [x] Azure Active Directory (AD)
- [x] Active Directory (ADFS)
- [x] Google Workspace
- [x] GitHub
- [x] GitLab
- [x] OneLogin
- [x] OIDC
- [x] Okta
|
non_defect
|
refresh redesign sso documentation details the teleport sso guides github and enterprise versions have grown organically over the years this led to a drift in page content and a lack of uniformity for example different page titles teleport sso authentication with gitlab vs ssh authentication with google workspace g suite some guides offer numbered steps some don t the links have outdated titles and the listings are incomplete e g links to just guides in the body text there are more issues than those more generally though teleport capabilities have grown and so did the complexity of the sso guides the guides should be streamlined and common parts between guides shared or unified furthermore the new commands tctl sso configure github saml oidc and tctl sso test should be used in the guides for better experience pages affected enterprise sso overview github sso guide individual saml oidc providers guides e g category improve existing remove outdated progress azure active directory ad active directory adfs google workspace github gitlab onelogin oidc okta
| 0
|
17,962
| 2,615,160,071
|
IssuesEvent
|
2015-03-01 06:38:41
|
chrsmith/html5rocks
|
https://api.github.com/repos/chrsmith/html5rocks
|
opened
|
Slide for css mask
|
auto-migrated Milestone-X New Priority-P2 Slides Type-Feature
|
```
mask using css gradient
- mask using image
```
Original issue reported on code.google.com by `ericbide...@html5rocks.com` on 21 Mar 2011 at 9:10
|
1.0
|
Slide for css mask - ```
mask using css gradient
- mask using image
```
Original issue reported on code.google.com by `ericbide...@html5rocks.com` on 21 Mar 2011 at 9:10
|
non_defect
|
slide for css mask mask using css gradient mask using image original issue reported on code google com by ericbide com on mar at
| 0
|
56,401
| 15,075,255,764
|
IssuesEvent
|
2021-02-05 01:38:17
|
DDCorkum/CTMod
|
https://api.github.com/repos/DDCorkum/CTMod
|
closed
|
Restore group labels to custom raid frames
|
- Defect CT_RaidAssist
|
**Issue:**
- Group/role/class labels no longer appear in the latest CTRA version, except briefly when a new group is added for the first time
**Reference:**
- Requested by Francomarx at CurseForge [comment 1072](https://www.curseforge.com/wow/addons/ctmod?comment=1072)
|
1.0
|
Restore group labels to custom raid frames - **Issue:**
- Group/role/class labels no longer appear in the latest CTRA version, except briefly when a new group is added for the first time
**Reference:**
- Requested by Francomarx at CurseForge [comment 1072](https://www.curseforge.com/wow/addons/ctmod?comment=1072)
|
defect
|
restore group labels to custom raid frames issue group role class labels no longer appear in the latest ctra version except briefly when a new group is added for the first time reference requested by francomarx at curseforge
| 1
|
65,552
| 19,571,464,179
|
IssuesEvent
|
2022-01-04 10:26:02
|
vector-im/element-android
|
https://api.github.com/repos/vector-im/element-android
|
opened
|
Breaks speaker phone on 3rd party applications
|
T-Defect
|
### Steps to reproduce
1. I use SIP voicecalls (Zoiper) . Enable speaker phone
2. Elements received message
3. speaker phone is turned off
or if I switch to elements while on speaker phone, the speaker phone is turned off. Same things happens during regular phone calls.
It affects my Nokia 7.1 with Android 10 or an old phone from 5 years ago with older Android.
### Outcome
Do not adjust sound devices when application becomes active.
### Your phone model
_No response_
### Operating system version
_No response_
### Application version and app store
_No response_
### Homeserver
_No response_
### Will you send logs?
No
|
1.0
|
Breaks speaker phone on 3rd party applications - ### Steps to reproduce
1. I use SIP voicecalls (Zoiper) . Enable speaker phone
2. Elements received message
3. speaker phone is turned off
or if I switch to elements while on speaker phone, the speaker phone is turned off. Same things happens during regular phone calls.
It affects my Nokia 7.1 with Android 10 or an old phone from 5 years ago with older Android.
### Outcome
Do not adjust sound devices when application becomes active.
### Your phone model
_No response_
### Operating system version
_No response_
### Application version and app store
_No response_
### Homeserver
_No response_
### Will you send logs?
No
|
defect
|
breaks speaker phone on party applications steps to reproduce i use sip voicecalls zoiper enable speaker phone elements received message speaker phone is turned off or if i switch to elements while on speaker phone the speaker phone is turned off same things happens during regular phone calls it affects my nokia with android or an old phone from years ago with older android outcome do not adjust sound devices when application becomes active your phone model no response operating system version no response application version and app store no response homeserver no response will you send logs no
| 1
|
61,210
| 17,023,636,888
|
IssuesEvent
|
2021-07-03 03:02:34
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
Allow Bot for dynamical lists
|
Component: wiki Priority: major Resolution: fixed Type: defect
|
**[Submitted to the original trac issue database at 9.34am, Thursday, 23rd September 2010]**
Hi,
user:Tordanik and myself !i! migrate the static software lists to dynamical ones generated by a bot. Please mark the user account as bot to work around the captchas like requested here:
http://wiki.openstreetmap.org/wiki/Talk:Wiki#Bot_flag_request:_TTTBot
|
1.0
|
Allow Bot for dynamical lists - **[Submitted to the original trac issue database at 9.34am, Thursday, 23rd September 2010]**
Hi,
user:Tordanik and myself !i! migrate the static software lists to dynamical ones generated by a bot. Please mark the user account as bot to work around the captchas like requested here:
http://wiki.openstreetmap.org/wiki/Talk:Wiki#Bot_flag_request:_TTTBot
|
defect
|
allow bot for dynamical lists hi user tordanik and myself i migrate the static software lists to dynamical ones generated by a bot please mark the user account as bot to work around the captchas like requested here
| 1
|
252,261
| 27,235,264,095
|
IssuesEvent
|
2023-02-21 15:53:51
|
ministryofjustice/hmpps-probation-integration-services
|
https://api.github.com/repos/ministryofjustice/hmpps-probation-integration-services
|
closed
|
CVE-2022-44571 (person-search-index-from-delius)
|
dependencies security
|
There is a denial of service vulnerability in the Content-Disposition ...
* Project: person-search-index-from-delius
* Package: `rack:2.2.4`
* Location: `usr/share/logstash/vendor/bundle/jruby/2.6.0/specifications/rack-2.2.4.gemspec`
>There is a denial of service vulnerability in the Content-Disposition parsingcomponent of Rack fixed in 2.0.9.2, 2.1.4.2, 2.2.4.1, 3.0.0.1. This could allow an attacker to craft an input that can cause Content-Disposition header parsing in Rackto take an unexpected amount of time, possibly resulting in a denial ofservice attack vector. This header is used typically used in multipartparsing. Any applications that parse multipart posts using Rack (virtuallyall Rails applications) are impacted.
https://avd.aquasec.com/nvd/cve-2022-44571
If the vulnerability does not impact the `person-search-index-from-delius` project, you can suppress this alert by adding a comment starting with `Suppress`. For example, "Suppressed because we do not process any untrusted XML content".
|
True
|
CVE-2022-44571 (person-search-index-from-delius) - There is a denial of service vulnerability in the Content-Disposition ...
* Project: person-search-index-from-delius
* Package: `rack:2.2.4`
* Location: `usr/share/logstash/vendor/bundle/jruby/2.6.0/specifications/rack-2.2.4.gemspec`
>There is a denial of service vulnerability in the Content-Disposition parsingcomponent of Rack fixed in 2.0.9.2, 2.1.4.2, 2.2.4.1, 3.0.0.1. This could allow an attacker to craft an input that can cause Content-Disposition header parsing in Rackto take an unexpected amount of time, possibly resulting in a denial ofservice attack vector. This header is used typically used in multipartparsing. Any applications that parse multipart posts using Rack (virtuallyall Rails applications) are impacted.
https://avd.aquasec.com/nvd/cve-2022-44571
If the vulnerability does not impact the `person-search-index-from-delius` project, you can suppress this alert by adding a comment starting with `Suppress`. For example, "Suppressed because we do not process any untrusted XML content".
|
non_defect
|
cve person search index from delius there is a denial of service vulnerability in the content disposition project person search index from delius package rack location usr share logstash vendor bundle jruby specifications rack gemspec there is a denial of service vulnerability in the content disposition parsingcomponent of rack fixed in this could allow an attacker to craft an input that can cause content disposition header parsing in rackto take an unexpected amount of time possibly resulting in a denial ofservice attack vector this header is used typically used in multipartparsing any applications that parse multipart posts using rack virtuallyall rails applications are impacted if the vulnerability does not impact the person search index from delius project you can suppress this alert by adding a comment starting with suppress for example suppressed because we do not process any untrusted xml content
| 0
|
3,786
| 2,610,069,000
|
IssuesEvent
|
2015-02-26 18:20:15
|
chrsmith/jsjsj122
|
https://api.github.com/repos/chrsmith/jsjsj122
|
opened
|
黄岩治前列腺炎哪里效果比较好
|
auto-migrated Priority-Medium Type-Defect
|
```
黄岩治前列腺炎哪里效果比较好【台州五洲生殖医院】24小时
健康咨询热线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地�
��:台州市椒江区枫南路229号(枫南大转盘旁)乘车线路:乘坐1
04、108、118、198及椒江一金清公交车直达枫南小区,乘坐107、
105、109、112、901、
902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 9:13
|
1.0
|
黄岩治前列腺炎哪里效果比较好 - ```
黄岩治前列腺炎哪里效果比较好【台州五洲生殖医院】24小时
健康咨询热线:0576-88066933-(扣扣800080609)-(微信号tzwzszyy)医院地�
��:台州市椒江区枫南路229号(枫南大转盘旁)乘车线路:乘坐1
04、108、118、198及椒江一金清公交车直达枫南小区,乘坐107、
105、109、112、901、
902公交车到星星广场下车,步行即可到院。
诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,��
�精,无精。包皮包茎,精索静脉曲张,淋病等。
台州五洲生殖医院是台州最大的男科医院,权威专家在线免��
�咨询,拥有专业完善的男科检查治疗设备,严格按照国家标�
��收费。尖端医疗设备,与世界同步。权威专家,成就专业典
范。人性化服务,一切以患者为中心。
看男科就选台州五洲生殖医院,专业男科为男人。
```
-----
Original issue reported on code.google.com by `poweragr...@gmail.com` on 30 May 2014 at 9:13
|
defect
|
黄岩治前列腺炎哪里效果比较好 黄岩治前列腺炎哪里效果比较好【台州五洲生殖医院】 健康咨询热线 微信号tzwzszyy 医院地� �� (枫南大转盘旁)乘车线路 、 、 、 , 、 、 、 、 、 ,步行即可到院。 诊疗项目:阳痿,早泄,前列腺炎,前列腺增生,龟头炎,�� �精,无精。包皮包茎,精索静脉曲张,淋病等。 台州五洲生殖医院是台州最大的男科医院,权威专家在线免�� �咨询,拥有专业完善的男科检查治疗设备,严格按照国家标� ��收费。尖端医疗设备,与世界同步。权威专家,成就专业典 范。人性化服务,一切以患者为中心。 看男科就选台州五洲生殖医院,专业男科为男人。 original issue reported on code google com by poweragr gmail com on may at
| 1
|
30,522
| 24,892,240,102
|
IssuesEvent
|
2022-10-28 13:02:07
|
woocommerce/woocommerce
|
https://api.github.com/repos/woocommerce/woocommerce
|
opened
|
WooCommerce legacy assets missing SCSS files after being packaged
|
tool: monorepo infrastructure
|
### Prerequisites
- [X] I have carried out troubleshooting steps and I believe I have found a bug.
- [X] I have searched for similar bugs in both open and closed issues and cannot find a duplicate.
### Describe the bug
When running `pnpm run build:zip` for WooCommerce, `SCSS` files are missing from the build package of `assets/css`.
### Expected behavior
`SCSS` files should still be there.
### Actual behavior
`SCSS` files are missing.
### Steps to reproduce
1. `pnpm install`
2. `cd plugins/woocommerce`
3. `pnpm run build:zip`
4. Unzip the package and look in `assets/css`. `SCSS` files are missing.
### WordPress Environment
N/A
### Isolating the problem
- [X] I have deactivated other plugins and confirmed this bug occurs when only WooCommerce plugin is active.
- [X] This bug happens with a default WordPress theme active, or [Storefront](https://woocommerce.com/storefront/).
- [X] I can reproduce this bug consistently using the steps above.
|
1.0
|
WooCommerce legacy assets missing SCSS files after being packaged - ### Prerequisites
- [X] I have carried out troubleshooting steps and I believe I have found a bug.
- [X] I have searched for similar bugs in both open and closed issues and cannot find a duplicate.
### Describe the bug
When running `pnpm run build:zip` for WooCommerce, `SCSS` files are missing from the build package of `assets/css`.
### Expected behavior
`SCSS` files should still be there.
### Actual behavior
`SCSS` files are missing.
### Steps to reproduce
1. `pnpm install`
2. `cd plugins/woocommerce`
3. `pnpm run build:zip`
4. Unzip the package and look in `assets/css`. `SCSS` files are missing.
### WordPress Environment
N/A
### Isolating the problem
- [X] I have deactivated other plugins and confirmed this bug occurs when only WooCommerce plugin is active.
- [X] This bug happens with a default WordPress theme active, or [Storefront](https://woocommerce.com/storefront/).
- [X] I can reproduce this bug consistently using the steps above.
|
non_defect
|
woocommerce legacy assets missing scss files after being packaged prerequisites i have carried out troubleshooting steps and i believe i have found a bug i have searched for similar bugs in both open and closed issues and cannot find a duplicate describe the bug when running pnpm run build zip for woocommerce scss files are missing from the build package of assets css expected behavior scss files should still be there actual behavior scss files are missing steps to reproduce pnpm install cd plugins woocommerce pnpm run build zip unzip the package and look in assets css scss files are missing wordpress environment n a isolating the problem i have deactivated other plugins and confirmed this bug occurs when only woocommerce plugin is active this bug happens with a default wordpress theme active or i can reproduce this bug consistently using the steps above
| 0
|
51,970
| 10,739,779,315
|
IssuesEvent
|
2019-10-29 16:57:21
|
spyder-ide/spyder
|
https://api.github.com/repos/spyder-ide/spyder
|
closed
|
Autocompletions not working inside function calls
|
component:Code Completion
|
<!--- **PLEASE READ:** When submitting here, please ensure you've completed the following checklist and checked the boxes to confirm. Issue reports without it may be closed. Thanks! --->
## Problem Description
Autocompletions for variables don't work inside function calls.
### What steps reproduce the problem?
1. Start with the file:
```python
my_list = [1, 2, 3]
my_list_copy = list(my_$)
```
2. Observe that no autocompletions are shown.
3. However, if you manually request completions via `ctrl + space`, you will see the completions for `my_list` and `my_list_copy`.
### What is the expected output? What do you see instead?
__Expected:__ With autocompletions enabled, I expect to see the completion for `my_list`.
__Actual:__ No autocompletions are shown, even though manual completions work.
## Versions
<!--- You can get this information from Help > About Spyder...
or (if Spyder won't launch) the "conda list" command
from the Anaconda Prompt/Terminal/command line. --->
* Spyder version: 4.0.0.dev0 (Commit: c2b7673b1)
* Python version: 3.7.4 64-bit
* Qt version: Qt 5.9.6
* PyQt version: PyQt5 5.9.2
* Operating System name/version: Darwin 18.7.0
### Dependencies
<!--- Please go to the menu entry Help > Dependencies,
press the Copy to clipboard button and paste below --->
```
cloudpickle >=0.5.0 : 1.2.2 (OK)
pygments >=2.0 : 2.4.2 (OK)
qtconsole >=4.5.5 : 4.5.5 (OK)
nbconvert >=4.0 : 5.6.0 (OK)
sphinx >=0.6.6 : 2.2.0 (OK)
pylint >=0.25 : 2.4.2 (OK)
psutil >=0.3 : 5.6.3 (OK)
qtawesome >=0.5.7 : 0.6.0 (OK)
qtpy >=1.5.0 : 1.9.0 (OK)
pickleshare >=0.4 : 0.7.5 (OK)
zmq >=17 : 18.1.0 (OK)
chardet >=2.0.0 : 3.0.4 (OK)
numpydoc >=0.6.0 : 0.9.1 (OK)
spyder_kernels >=1.5.0;<2.0.0: 1.6.0 (OK)
qdarkstyle >=2.7 : 2.7 (OK)
atomicwrites >=1.2.0 : 1.3.0 (OK)
diff_match_patch >=20181111 : 20181111 (OK)
watchdog : None (OK)
keyring : None (OK)
pexpect >=4.4.0 : 4.7.0 (OK)
pympler : None (OK)
sympy >=0.7.3 : None (NOK)
cython >=0.21 : None (NOK)
IPython >=4.0 : 7.8.0 (OK)
matplotlib >=2.0.0 : None (NOK)
pandas >=0.13.1 : None (NOK)
numpy >=1.7 : None (NOK)
scipy >=0.17.0 : None (NOK)
pyls >=0.28.2;<0.29.0 : 0.28.3 (OK)
rtree >=0.8.3 : 0.8.3 (OK)
```
|
1.0
|
Autocompletions not working inside function calls - <!--- **PLEASE READ:** When submitting here, please ensure you've completed the following checklist and checked the boxes to confirm. Issue reports without it may be closed. Thanks! --->
## Problem Description
Autocompletions for variables don't work inside function calls.
### What steps reproduce the problem?
1. Start with the file:
```python
my_list = [1, 2, 3]
my_list_copy = list(my_$)
```
2. Observe that no autocompletions are shown.
3. However, if you manually request completions via `ctrl + space`, you will see the completions for `my_list` and `my_list_copy`.
### What is the expected output? What do you see instead?
__Expected:__ With autocompletions enabled, I expect to see the completion for `my_list`.
__Actual:__ No autocompletions are shown, even though manual completions work.
## Versions
<!--- You can get this information from Help > About Spyder...
or (if Spyder won't launch) the "conda list" command
from the Anaconda Prompt/Terminal/command line. --->
* Spyder version: 4.0.0.dev0 (Commit: c2b7673b1)
* Python version: 3.7.4 64-bit
* Qt version: Qt 5.9.6
* PyQt version: PyQt5 5.9.2
* Operating System name/version: Darwin 18.7.0
### Dependencies
<!--- Please go to the menu entry Help > Dependencies,
press the Copy to clipboard button and paste below --->
```
cloudpickle >=0.5.0 : 1.2.2 (OK)
pygments >=2.0 : 2.4.2 (OK)
qtconsole >=4.5.5 : 4.5.5 (OK)
nbconvert >=4.0 : 5.6.0 (OK)
sphinx >=0.6.6 : 2.2.0 (OK)
pylint >=0.25 : 2.4.2 (OK)
psutil >=0.3 : 5.6.3 (OK)
qtawesome >=0.5.7 : 0.6.0 (OK)
qtpy >=1.5.0 : 1.9.0 (OK)
pickleshare >=0.4 : 0.7.5 (OK)
zmq >=17 : 18.1.0 (OK)
chardet >=2.0.0 : 3.0.4 (OK)
numpydoc >=0.6.0 : 0.9.1 (OK)
spyder_kernels >=1.5.0;<2.0.0: 1.6.0 (OK)
qdarkstyle >=2.7 : 2.7 (OK)
atomicwrites >=1.2.0 : 1.3.0 (OK)
diff_match_patch >=20181111 : 20181111 (OK)
watchdog : None (OK)
keyring : None (OK)
pexpect >=4.4.0 : 4.7.0 (OK)
pympler : None (OK)
sympy >=0.7.3 : None (NOK)
cython >=0.21 : None (NOK)
IPython >=4.0 : 7.8.0 (OK)
matplotlib >=2.0.0 : None (NOK)
pandas >=0.13.1 : None (NOK)
numpy >=1.7 : None (NOK)
scipy >=0.17.0 : None (NOK)
pyls >=0.28.2;<0.29.0 : 0.28.3 (OK)
rtree >=0.8.3 : 0.8.3 (OK)
```
|
non_defect
|
autocompletions not working inside function calls problem description autocompletions for variables don t work inside function calls what steps reproduce the problem start with the file python my list my list copy list my observe that no autocompletions are shown however if you manually request completions via ctrl space you will see the completions for my list and my list copy what is the expected output what do you see instead expected with autocompletions enabled i expect to see the completion for my list actual no autocompletions are shown even though manual completions work versions about spyder or if spyder won t launch the conda list command from the anaconda prompt terminal command line spyder version commit python version bit qt version qt pyqt version operating system name version darwin dependencies dependencies press the copy to clipboard button and paste below cloudpickle ok pygments ok qtconsole ok nbconvert ok sphinx ok pylint ok psutil ok qtawesome ok qtpy ok pickleshare ok zmq ok chardet ok numpydoc ok spyder kernels ok qdarkstyle ok atomicwrites ok diff match patch ok watchdog none ok keyring none ok pexpect ok pympler none ok sympy none nok cython none nok ipython ok matplotlib none nok pandas none nok numpy none nok scipy none nok pyls ok rtree ok
| 0
|
15,603
| 2,862,030,231
|
IssuesEvent
|
2015-06-04 00:25:12
|
dart-lang/sdk
|
https://api.github.com/repos/dart-lang/sdk
|
closed
|
html5lib test case where two equivalent selectors give different result
|
Area-Pkg Pkg-Html5Lib Priority-Unassigned Triaged Type-Defect
|
*This issue was originally filed by jirkad...@gmail.com*
_____
**What steps will reproduce the problem?**
import 'package:html5lib/dom.dart' as dom;
import 'package:unittest/unittest.dart';
main() {
group('Select b inside span', () {
final div = new dom.Element.html(
r"""<div><span class="prep_nove"><b>1</b></span></div>""");
test('using two separate selectors', () {
var b = div.querySelector("span.prep_nove").querySelector("b");
expect(b, isNotNull);
expect(b.text, '1');
});
test('using one combined selector', () {
var b = div.querySelector("span.prep_nove b");
expect(b, isNotNull);
expect(b.text, '1');
});
});
}
**What is the expected output? What do you see instead?**
Observatory listening on http://127.0.0.1:56811
unittest-suite-wait-for-done
PASS: Select b inside span using two separate selectors
FAIL: Select b inside span using one combined selector
Expected: not null
Actual: <null>
**What version of the product are you using?**
Dart Editor version 1.9.0.dev_00_00 (DEV)
Dart SDK version 1.9.0-dev.0.0
html5lib 0.12.0
|
1.0
|
html5lib test case where two equivalent selectors give different result - *This issue was originally filed by jirkad...@gmail.com*
_____
**What steps will reproduce the problem?**
import 'package:html5lib/dom.dart' as dom;
import 'package:unittest/unittest.dart';
main() {
group('Select b inside span', () {
final div = new dom.Element.html(
r"""<div><span class="prep_nove"><b>1</b></span></div>""");
test('using two separate selectors', () {
var b = div.querySelector("span.prep_nove").querySelector("b");
expect(b, isNotNull);
expect(b.text, '1');
});
test('using one combined selector', () {
var b = div.querySelector("span.prep_nove b");
expect(b, isNotNull);
expect(b.text, '1');
});
});
}
**What is the expected output? What do you see instead?**
Observatory listening on http://127.0.0.1:56811
unittest-suite-wait-for-done
PASS: Select b inside span using two separate selectors
FAIL: Select b inside span using one combined selector
Expected: not null
Actual: <null>
**What version of the product are you using?**
Dart Editor version 1.9.0.dev_00_00 (DEV)
Dart SDK version 1.9.0-dev.0.0
html5lib 0.12.0
|
defect
|
test case where two equivalent selectors give different result this issue was originally filed by jirkad gmail com what steps will reproduce the problem import package dom dart as dom import package unittest unittest dart main nbsp nbsp group select b inside span nbsp nbsp nbsp nbsp final div new dom element html nbsp nbsp nbsp nbsp nbsp nbsp nbsp nbsp r quot quot quot lt div gt lt span class quot prep nove quot gt lt b gt lt b gt lt span gt lt div gt quot quot quot nbsp nbsp nbsp nbsp nbsp nbsp nbsp nbsp test using two separate selectors nbsp nbsp nbsp nbsp nbsp nbsp var b div queryselector quot span prep nove quot queryselector quot b quot nbsp nbsp nbsp nbsp nbsp nbsp expect b isnotnull nbsp nbsp nbsp nbsp nbsp nbsp expect b text nbsp nbsp nbsp nbsp nbsp nbsp nbsp nbsp test using one combined selector nbsp nbsp nbsp nbsp nbsp nbsp var b div queryselector quot span prep nove b quot nbsp nbsp nbsp nbsp nbsp nbsp expect b isnotnull nbsp nbsp nbsp nbsp nbsp nbsp expect b text nbsp nbsp nbsp nbsp nbsp nbsp what is the expected output what do you see instead observatory listening on unittest suite wait for done pass select b inside span using two separate selectors fail select b inside span using one combined selector nbsp nbsp expected not null nbsp nbsp nbsp nbsp actual lt null gt what version of the product are you using dart editor version dev dev dart sdk version dev
| 1
|
25,182
| 4,232,621,478
|
IssuesEvent
|
2016-07-05 01:27:49
|
schuel/hmmm
|
https://api.github.com/repos/schuel/hmmm
|
closed
|
Verify my E-Mail button doesn't do anything
|
defect
|
Clicking on the "Verify my E-Mail" button on the Account Settings page doesn't do anything. Also, following the E-Mail verification link in the original Welcome email message brings up the overview page but has no apparent effect. The "Verify my E-Mail" button remains visible on the Account Settings page.
|
1.0
|
Verify my E-Mail button doesn't do anything - Clicking on the "Verify my E-Mail" button on the Account Settings page doesn't do anything. Also, following the E-Mail verification link in the original Welcome email message brings up the overview page but has no apparent effect. The "Verify my E-Mail" button remains visible on the Account Settings page.
|
defect
|
verify my e mail button doesn t do anything clicking on the verify my e mail button on the account settings page doesn t do anything also following the e mail verification link in the original welcome email message brings up the overview page but has no apparent effect the verify my e mail button remains visible on the account settings page
| 1
|
78,422
| 27,516,133,614
|
IssuesEvent
|
2023-03-06 12:06:51
|
vector-im/element-x-android
|
https://api.github.com/repos/vector-im/element-x-android
|
opened
|
The context menu (tap-and-hold) for messages in the timeline is impossible to read on device dark theme
|
T-Defect
|
### Steps to reproduce

And the tap-and-hold message overlay is white text on a white background, making it impossible to make out the options.
1. Activate your device's dark theme (Android 13+).
2. Navigate to a message in the timeline.
3. Tap and hold on it.
### Outcome
#### What did you expect?
To be able to see the options in the tap-and-hold message menu.
#### What happened instead?
Only see a white box. The options are there and functional, but impossible to make out.
### Your phone model
Google Pixel 4a 5G
### Operating system version
Android 13.0
### Application version and app store
commit `4125035c1a827a5b62fa10fd30a901cc22e61a0d`
### Homeserver
_No response_
### Will you send logs?
No
### Are you willing to provide a PR?
No
|
1.0
|
The context menu (tap-and-hold) for messages in the timeline is impossible to read on device dark theme - ### Steps to reproduce

And the tap-and-hold message overlay is white text on a white background, making it impossible to make out the options.
1. Activate your device's dark theme (Android 13+).
2. Navigate to a message in the timeline.
3. Tap and hold on it.
### Outcome
#### What did you expect?
To be able to see the options in the tap-and-hold message menu.
#### What happened instead?
Only see a white box. The options are there and functional, but impossible to make out.
### Your phone model
Google Pixel 4a 5G
### Operating system version
Android 13.0
### Application version and app store
commit `4125035c1a827a5b62fa10fd30a901cc22e61a0d`
### Homeserver
_No response_
### Will you send logs?
No
### Are you willing to provide a PR?
No
|
defect
|
the context menu tap and hold for messages in the timeline is impossible to read on device dark theme steps to reproduce and the tap and hold message overlay is white text on a white background making it impossible to make out the options activate your device s dark theme android navigate to a message in the timeline tap and hold on it outcome what did you expect to be able to see the options in the tap and hold message menu what happened instead only see a white box the options are there and functional but impossible to make out your phone model google pixel operating system version android application version and app store commit homeserver no response will you send logs no are you willing to provide a pr no
| 1
|
182,847
| 14,167,532,316
|
IssuesEvent
|
2020-11-12 10:25:17
|
pandas-dev/pandas
|
https://api.github.com/repos/pandas-dev/pandas
|
closed
|
CI/TST: read_html test_banklist_url_positional_match failing with ResourceWarning on Travis
|
IO HTML Testing
|
Travis builds are recently failing, eg https://travis-ci.org/github/pandas-dev/pandas/jobs/742927007
```
=================================== FAILURES ===================================
_____________ TestReadHtml.test_banklist_url_positional_match[bs4] _____________
[gw0] linux -- Python 3.7.8 /home/travis/miniconda3/envs/pandas-dev/bin/python
self = <pandas.tests.io.test_html.TestReadHtml object at 0x7f3091661050>
@tm.network
def test_banklist_url_positional_match(self):
url = "http://www.fdic.gov/bank/individual/failed/banklist.html"
# Passing match argument as positional should cause a FutureWarning.
with tm.assert_produces_warning(FutureWarning):
df1 = self.read_html(
> url, "First Federal Bank of Florida", attrs={"id": "table"}
)
pandas/tests/io/test_html.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <contextlib._GeneratorContextManager object at 0x7f309168d4d0>
type = None, value = None, traceback = None
def __exit__(self, type, value, traceback):
if type is None:
try:
> next(self.gen)
E AssertionError: Caused unexpected warning(s): [('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=18, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43648), raddr=('172.217.212.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/html5lib/treebuilders/base.py', 38), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=16, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43650), raddr=('172.217.212.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/html5lib/treebuilders/base.py', 38), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=43, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43962), raddr=('172.217.214.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=42, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 46504), raddr=('74.125.124.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=41, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 46502), raddr=('74.125.124.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=15, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43640), raddr=('172.217.212.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335)]
../../../miniconda3/envs/pandas-dev/lib/python3.7/contextlib.py:119: AssertionError
```
|
1.0
|
CI/TST: read_html test_banklist_url_positional_match failing with ResourceWarning on Travis - Travis builds are recently failing, eg https://travis-ci.org/github/pandas-dev/pandas/jobs/742927007
```
=================================== FAILURES ===================================
_____________ TestReadHtml.test_banklist_url_positional_match[bs4] _____________
[gw0] linux -- Python 3.7.8 /home/travis/miniconda3/envs/pandas-dev/bin/python
self = <pandas.tests.io.test_html.TestReadHtml object at 0x7f3091661050>
@tm.network
def test_banklist_url_positional_match(self):
url = "http://www.fdic.gov/bank/individual/failed/banklist.html"
# Passing match argument as positional should cause a FutureWarning.
with tm.assert_produces_warning(FutureWarning):
df1 = self.read_html(
> url, "First Federal Bank of Florida", attrs={"id": "table"}
)
pandas/tests/io/test_html.py:130:
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _
self = <contextlib._GeneratorContextManager object at 0x7f309168d4d0>
type = None, value = None, traceback = None
def __exit__(self, type, value, traceback):
if type is None:
try:
> next(self.gen)
E AssertionError: Caused unexpected warning(s): [('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=18, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43648), raddr=('172.217.212.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/html5lib/treebuilders/base.py', 38), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=16, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43650), raddr=('172.217.212.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/html5lib/treebuilders/base.py', 38), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=43, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43962), raddr=('172.217.214.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=42, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 46504), raddr=('74.125.124.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=41, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 46502), raddr=('74.125.124.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335), ('ResourceWarning', ResourceWarning("unclosed <ssl.SSLSocket fd=15, family=AddressFamily.AF_INET, type=SocketKind.SOCK_STREAM, proto=6, laddr=('10.20.0.10', 43640), raddr=('172.217.212.95', 443)>"), '/home/travis/miniconda3/envs/pandas-dev/lib/python3.7/site-packages/bs4/builder/_html5lib.py', 335)]
../../../miniconda3/envs/pandas-dev/lib/python3.7/contextlib.py:119: AssertionError
```
|
non_defect
|
ci tst read html test banklist url positional match failing with resourcewarning on travis travis builds are recently failing eg failures testreadhtml test banklist url positional match linux python home travis envs pandas dev bin python self tm network def test banklist url positional match self url passing match argument as positional should cause a futurewarning with tm assert produces warning futurewarning self read html url first federal bank of florida attrs id table pandas tests io test html py self type none value none traceback none def exit self type value traceback if type is none try next self gen e assertionerror caused unexpected warning s envs pandas dev lib contextlib py assertionerror
| 0
|
54,597
| 3,070,120,602
|
IssuesEvent
|
2015-08-19 00:54:54
|
amzn/amazon-payments-magento-plugin
|
https://api.github.com/repos/amzn/amazon-payments-magento-plugin
|
closed
|
Auth Decline Errors could be more user friendly
|
bug Fix now high priority
|
The Auth decline error messages could be more user friendly. For common decline reasons, we should handle with a specific message, rather than building a generic error response using API response values.
|
1.0
|
Auth Decline Errors could be more user friendly - The Auth decline error messages could be more user friendly. For common decline reasons, we should handle with a specific message, rather than building a generic error response using API response values.
|
non_defect
|
auth decline errors could be more user friendly the auth decline error messages could be more user friendly for common decline reasons we should handle with a specific message rather than building a generic error response using api response values
| 0
|
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.