Dataset schema (column dtype and value statistics):

| column | dtype | values |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 distinct class |
| created_at | string | length 19 |
| repo | string | lengths 4 to 112 |
| repo_url | string | lengths 33 to 141 |
| action | string | 3 distinct classes |
| title | string | lengths 1 to 1.02k |
| labels | string | lengths 4 to 1.54k |
| body | string | lengths 1 to 262k |
| index | string | 17 distinct classes |
| text_combine | string | lengths 95 to 262k |
| label | string | 2 distinct classes |
| text | string | lengths 96 to 252k |
| binary_label | int64 | 0 to 1 |
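Given the schema above, the `binary_label` column appears to be a 0/1 encoding of the `label` column (`test` → 1, `non_test` → 0), which every sample row below is consistent with. A minimal sketch of that relationship, using two records modeled on the sample rows (the use of pandas is an assumption; the dataset's storage format is not shown):

```python
import pandas as pd

# Two records modeled on the rows below; only the columns needed
# for the label check are included.
rows = [
    {"id": 26779152219, "type": "IssuesEvent", "repo": "elastic/kibana",
     "label": "test", "binary_label": 1},
    {"id": 9197333230, "type": "IssuesEvent", "repo": "clifordsymack/Electron-Cash",
     "label": "non_test", "binary_label": 0},
]
df = pd.DataFrame(rows)

# binary_label looks like a 0/1 encoding of label: test -> 1, non_test -> 0.
derived = (df["label"] == "test").astype(int)
assert (derived == df["binary_label"]).all()
print(df[["repo", "label", "binary_label"]])
```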
Unnamed: 0: 311,241
id: 26,779,152,219
type: IssuesEvent
created_at: 2023-01-31 19:36:14
repo: elastic/kibana
repo_url: https://api.github.com/repos/elastic/kibana
action: closed
title: Failing test: Chrome UI Functional Tests.test/functional/apps/dashboard_elements/index·ts - dashboard elements "after all" hook: afterTestSuite.trigger in "dashboard elements"
labels: Team:Presentation failed-test
body:
A test failed on a tracked branch
```
NoSuchSessionError: invalid session id
at Object.throwDecodedError (node_modules/selenium-webdriver/lib/error.js:522:15)
at parseHttpResponse (node_modules/selenium-webdriver/lib/http.js:589:13)
at Executor.execute (node_modules/selenium-webdriver/lib/http.js:514:28)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Task.exec (test/functional/services/remote/prevent_parallel_calls.ts:28:20) {
remoteStacktrace: '#0 0x5632d74bd2c3 <unknown>\n' +
'#1 0x5632d72c6700 <unknown>\n' +
'#2 0x5632d72f2067 <unknown>\n' +
'#3 0x5632d731de3c <unknown>\n' +
'#4 0x5632d731ac90 <unknown>\n' +
'#5 0x5632d731a35b <unknown>\n' +
'#6 0x5632d729aab4 <unknown>\n' +
'#7 0x5632d729b8a3 <unknown>\n' +
'#8 0x5632d750b18e <unknown>\n' +
'#9 0x5632d750e622 <unknown>\n' +
'#10 0x5632d74f1aae <unknown>\n' +
'#11 0x5632d750f2a3 <unknown>\n' +
'#12 0x5632d74e5ecf <unknown>\n' +
'#13 0x5632d729a55c <unknown>\n' +
'#14 0x7f7c3584c083 <unknown>\n'
}
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/22680#0183f606-d795-4a7d-8714-b5d7165024ca)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/dashboard_elements/index·ts","test.name":"dashboard elements \"after all\" hook: afterTestSuite.trigger in \"dashboard elements\"","test.failCount":1}} -->
index: 1.0
text_combine:
Failing test: Chrome UI Functional Tests.test/functional/apps/dashboard_elements/index·ts - dashboard elements "after all" hook: afterTestSuite.trigger in "dashboard elements" - A test failed on a tracked branch
```
NoSuchSessionError: invalid session id
at Object.throwDecodedError (node_modules/selenium-webdriver/lib/error.js:522:15)
at parseHttpResponse (node_modules/selenium-webdriver/lib/http.js:589:13)
at Executor.execute (node_modules/selenium-webdriver/lib/http.js:514:28)
at runMicrotasks (<anonymous>)
at processTicksAndRejections (node:internal/process/task_queues:96:5)
at Task.exec (test/functional/services/remote/prevent_parallel_calls.ts:28:20) {
remoteStacktrace: '#0 0x5632d74bd2c3 <unknown>\n' +
'#1 0x5632d72c6700 <unknown>\n' +
'#2 0x5632d72f2067 <unknown>\n' +
'#3 0x5632d731de3c <unknown>\n' +
'#4 0x5632d731ac90 <unknown>\n' +
'#5 0x5632d731a35b <unknown>\n' +
'#6 0x5632d729aab4 <unknown>\n' +
'#7 0x5632d729b8a3 <unknown>\n' +
'#8 0x5632d750b18e <unknown>\n' +
'#9 0x5632d750e622 <unknown>\n' +
'#10 0x5632d74f1aae <unknown>\n' +
'#11 0x5632d750f2a3 <unknown>\n' +
'#12 0x5632d74e5ecf <unknown>\n' +
'#13 0x5632d729a55c <unknown>\n' +
'#14 0x7f7c3584c083 <unknown>\n'
}
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/22680#0183f606-d795-4a7d-8714-b5d7165024ca)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome UI Functional Tests.test/functional/apps/dashboard_elements/index·ts","test.name":"dashboard elements \"after all\" hook: afterTestSuite.trigger in \"dashboard elements\"","test.failCount":1}} -->
label: test
text:
failing test chrome ui functional tests test functional apps dashboard elements index·ts dashboard elements after all hook aftertestsuite trigger in dashboard elements a test failed on a tracked branch nosuchsessionerror invalid session id at object throwdecodederror node modules selenium webdriver lib error js at parsehttpresponse node modules selenium webdriver lib http js at executor execute node modules selenium webdriver lib http js at runmicrotasks at processticksandrejections node internal process task queues at task exec test functional services remote prevent parallel calls ts remotestacktrace n n n n n n n n n n n n n n n first failure
binary_label: 1

Unnamed: 0: 14,460
id: 9,197,333,230
type: IssuesEvent
created_at: 2019-03-07 09:45:08
repo: clifordsymack/Electron-Cash
repo_url: https://api.github.com/repos/clifordsymack/Electron-Cash
action: closed
title: Reduce amount of shuffle information in the history tab
labels: UI & Usability WantToDo enhancement
body:
Right now the shuffle tx info dominates. This makes it hard to see the information on spending and receiving transactions, which is probably more relevant to users.
One suggestion was to have the display of shuffle transactions be "toggleable", and when the toggle is "off" to replace all the Shuffle transaction with a "dash" (----)
Another suggested approach is to allow the shuffle transaction info to be collapsed and expanded for each set of "in-between" areas.
Another suggestion is to just remove shuffle transaction info from this tab altogether when the user selects "off".
index: True
text_combine:
Reduce amount of shuffle information in the history tab - Right now the shuffle tx info dominates. This makes it hard to see the information on spending and receiving transactions, which is probably more relevant to users.
One suggestion was to have the display of shuffle transactions be "toggleable", and when the toggle is "off" to replace all the Shuffle transaction with a "dash" (----)
Another suggested approach is to allow the shuffle transaction info to be collapsed and expanded for each set of "in-between" areas.
Another suggestion is to just remove shuffle transaction info from this tab altogether when the user selects "off".
label: non_test
text:
reduce amount of shuffle information in the history tab right now the shuffle tx info dominates this makes it hard to see the information on spending and receiving transactions which is probably more relevant to users one suggestion was to have the display of shuffle transactions be toggleable and when the toggle is off to replace all the shuffle transaction with a dash another suggested approach is to allow the shuffle transaction info to be collapsed and expanded for each set of in between areas another suggestion is to just remove shuffle transaction info from this tab altogether when the user selects off
binary_label: 0

Unnamed: 0: 306,344
id: 26,459,670,481
type: IssuesEvent
created_at: 2023-01-16 16:28:31
repo: mantidproject/mantid
repo_url: https://api.github.com/repos/mantidproject/mantid
action: closed
title: Manual Testing Project Recovery
labels: Manual Tests
body:
You have been assigned manual testing. The hope is to catch as many problems with the code before release, so it would be great if you can take some time to give a serious test to your assigned area. Thank you!!
The general guide to manual testing:
* The tests must be performed on the installer versions of the final release candidate. Not on local compiled code.
* Serious errors involving loss of functionality, crashes etc. should be raised
as issues with the current release as a milestone and an email sent to the project manager immediately.
* Minor and cosmetic issues should be raised as issues against the forthcoming
releases.
* First try things that should work, then try to break Mantid, e.g. entering invalid values, unexpected characters etc.
* Don't spend more than a few hours on the testing as fatigue will kick in.
* If you find errors in the documentation, please correct them.
* Comment against this ticket the OS environment you are testing against.
* Close the this issue once you are done.
### Specific Notes:
http://developer.mantidproject.org/Testing/ErrorReporter-ProjectRecovery/ProjectRecoveryTesting.html
index: 1.0
text_combine:
Manual Testing Project Recovery - You have been assigned manual testing. The hope is to catch as many problems with the code before release, so it would be great if you can take some time to give a serious test to your assigned area. Thank you!!
The general guide to manual testing:
* The tests must be performed on the installer versions of the final release candidate. Not on local compiled code.
* Serious errors involving loss of functionality, crashes etc. should be raised
as issues with the current release as a milestone and an email sent to the project manager immediately.
* Minor and cosmetic issues should be raised as issues against the forthcoming
releases.
* First try things that should work, then try to break Mantid, e.g. entering invalid values, unexpected characters etc.
* Don't spend more than a few hours on the testing as fatigue will kick in.
* If you find errors in the documentation, please correct them.
* Comment against this ticket the OS environment you are testing against.
* Close the this issue once you are done.
### Specific Notes:
http://developer.mantidproject.org/Testing/ErrorReporter-ProjectRecovery/ProjectRecoveryTesting.html
label: test
text:
manual testing project recovery you have been assigned manual testing the hope is to catch as many problems with the code before release so it would be great if you can take some time to give a serious test to your assigned area thank you the general guide to manual testing the tests must be performed on the installer versions of the final release candidate not on local compiled code serious errors involving loss of functionality crashes etc should be raised as issues with the current release as a milestone and an email sent to the project manager immediately minor and cosmetic issues should be raised as issues against the forthcoming releases first try things that should work then try to break mantid e g entering invalid values unexpected characters etc don t spend more than a few hours on the testing as fatigue will kick in if you find errors in the documentation please correct them comment against this ticket the os environment you are testing against close the this issue once you are done specific notes
binary_label: 1

Unnamed: 0: 230,169
id: 18,510,411,419
type: IssuesEvent
created_at: 2021-10-20 01:44:54
repo: pingcap/tidb
repo_url: https://api.github.com/repos/pingcap/tidb
action: closed
title: IT `insert_update` failed
labels: type/bug component/test severity/major
body:
## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
```
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="4 tests failed\n"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [role] err: sql:SET DEFAULT ROLE `rrrrr` TO u1@localhost;: failed to run query \n\"SET DEFAULT ROLE `rrrrr` TO u1@localhost;\" \n around line 466, \nwe need(104):\nError 3530: `rrrrr`@`%` is is not granted to u1@localhost\nSET DEFAULT ROLE `rrrrr` TO u1@localhost;\nErro\nbut got(104):\nSET DEFAULT ROLE `rrrrr` TO u1@localhost;\nError 1396: Operation SET DEFAULT ROLE failed for `rrrrr`@`%`\n\n"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [role2] err: sql:CREATE ROLE r1;: run \"CREATE ROLE r1;\" at line 0 err Error 1396: Operation CREATE ROLE failed for 'r1'@'%'"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [grant_dynamic] err: sql:CREATE USER 'u1'@'localhost' IDENTIFIED BY '123';: run \"CREATE USER 'u1'@'localhost' IDENTIFIED BY '123';\" at line 5 err Error 1396: Operation CREATE USER failed for 'u1'@'localhost'"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [insert_update] err: sql:UPDATE t1 SET a=x'8243' where a=x'8142';: run \"UPDATE t1 SET a=x'8243' where a=x'8142';\" at line 197 err Error 1267: Illegal mix of collations (ascii_bin,IMPLICIT) and (binary,COERCIBLE) for operation '='"
[2021-10-11T07:25:35.836Z] + echo 'tidb-server(PID: 463) stopped'
[2021-10-11T07:25:35.836Z] tidb-server(PID: 463) stopped
[2021-10-11T07:25:35.836Z] + kill -9 463
script returned exit code 1
```
ci: https://ci.pingcap.net/blue/organizations/jenkins/tidb_ghpr_integration_common_test/detail/tidb_ghpr_integration_common_test/6927/pipeline
pr: #27863
### 2. What did you expect to see? (Required)
### 3. What did you see instead (Required)
### 4. What is your TiDB version? (Required)
<!-- Paste the output of SELECT tidb_version() -->
index: 1.0
text_combine:
IT `insert_update` failed - ## Bug Report
Please answer these questions before submitting your issue. Thanks!
### 1. Minimal reproduce step (Required)
```
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="4 tests failed\n"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [role] err: sql:SET DEFAULT ROLE `rrrrr` TO u1@localhost;: failed to run query \n\"SET DEFAULT ROLE `rrrrr` TO u1@localhost;\" \n around line 466, \nwe need(104):\nError 3530: `rrrrr`@`%` is is not granted to u1@localhost\nSET DEFAULT ROLE `rrrrr` TO u1@localhost;\nErro\nbut got(104):\nSET DEFAULT ROLE `rrrrr` TO u1@localhost;\nError 1396: Operation SET DEFAULT ROLE failed for `rrrrr`@`%`\n\n"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [role2] err: sql:CREATE ROLE r1;: run \"CREATE ROLE r1;\" at line 0 err Error 1396: Operation CREATE ROLE failed for 'r1'@'%'"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [grant_dynamic] err: sql:CREATE USER 'u1'@'localhost' IDENTIFIED BY '123';: run \"CREATE USER 'u1'@'localhost' IDENTIFIED BY '123';\" at line 5 err Error 1396: Operation CREATE USER failed for 'u1'@'localhost'"
[2021-10-11T07:25:35.836Z] time="2021-10-11T15:25:35+08:00" level=error msg="run test [insert_update] err: sql:UPDATE t1 SET a=x'8243' where a=x'8142';: run \"UPDATE t1 SET a=x'8243' where a=x'8142';\" at line 197 err Error 1267: Illegal mix of collations (ascii_bin,IMPLICIT) and (binary,COERCIBLE) for operation '='"
[2021-10-11T07:25:35.836Z] + echo 'tidb-server(PID: 463) stopped'
[2021-10-11T07:25:35.836Z] tidb-server(PID: 463) stopped
[2021-10-11T07:25:35.836Z] + kill -9 463
script returned exit code 1
```
ci: https://ci.pingcap.net/blue/organizations/jenkins/tidb_ghpr_integration_common_test/detail/tidb_ghpr_integration_common_test/6927/pipeline
pr: #27863
### 2. What did you expect to see? (Required)
### 3. What did you see instead (Required)
### 4. What is your TiDB version? (Required)
<!-- Paste the output of SELECT tidb_version() -->
label: test
text:
it insert update failed bug report please answer these questions before submitting your issue thanks minimal reproduce step required time level error msg tests failed n time level error msg run test err sql set default role rrrrr to localhost failed to run query n set default role rrrrr to localhost n around line nwe need nerror rrrrr is is not granted to localhost nset default role rrrrr to localhost nerro nbut got nset default role rrrrr to localhost nerror operation set default role failed for rrrrr n n time level error msg run test err sql create role run create role at line err error operation create role failed for time level error msg run test err sql create user localhost identified by run create user localhost identified by at line err error operation create user failed for localhost time level error msg run test err sql update set a x where a x run update set a x where a x at line err error illegal mix of collations ascii bin implicit and binary coercible for operation echo tidb server pid stopped tidb server pid stopped kill script returned exit code ci pr what did you expect to see required what did you see instead required what is your tidb version required
binary_label: 1

Unnamed: 0: 345,267
id: 10,360,624,407
type: IssuesEvent
created_at: 2019-09-06 08:05:02
repo: grpc/grpc
repo_url: https://api.github.com/repos/grpc/grpc
action: closed
title: How to write middleware for grpc (node.js)
labels: disposition/stale kind/enhancement lang/node priority/P3
body:
I'm attaching a metadata to each client grpc service call (containing a jwt token), and I'd like to validate this token on the server, once globally (instead of repeating the validation code in each server service definition).
From my understanding, there is something called "Interceptor" for grpc, which is somewhat similar to http middleware.
Any example for Nodejs grpc interceptor? (or any other way to achieve something similar with http middleware?)
Many thanks!
index: 1.0
text_combine:
How to write middleware for grpc (node.js) - I'm attaching a metadata to each client grpc service call (containing a jwt token), and I'd like to validate this token on the server, once globally (instead of repeating the validation code in each server service definition).
From my understanding, there is something called "Interceptor" for grpc, which is somewhat similar to http middleware.
Any example for Nodejs grpc interceptor? (or any other way to achieve something similar with http middleware?)
Many thanks!
label: non_test
text:
how to write middleware for grpc node js i m attaching a metadata to each client grpc service call containing a jwt token and i d like to validate this token on the server once globally instead of repeating the validation code in each server service definition from my understanding there is something called interceptor for grpc which is somewhat similar to http middleware any example for nodejs grpc interceptor or any other way to achieve something similar with http middleware many thanks
binary_label: 0

Unnamed: 0: 7,284
id: 10,434,736,019
type: IssuesEvent
created_at: 2019-09-17 15:47:08
repo: qgis/QGIS
repo_url: https://api.github.com/repos/qgis/QGIS
action: closed
title: python error when using the QGIS "assign projection" tool in batch mode and output format is GPKG
labels: Bug Processing
body:
Author Name: **Giovanni Manghi** (@gioman)
Original Redmine Issue: [21498](https://issues.qgis.org/issues/21498)
Affected QGIS version: 3.4.5
Redmine category:processing/qgis
---
Input layers have already a "fid" field. Does not happen not using batch mode. Possibly affects other tools?
error is like
Algorithm Assign projection starting…
Input parameters:
{'CRS': <qgis._core.QgsCoordinateReferenceSystem object at 0x0000000010CABCA8>,
'INPUT': 'input1',
'OUTPUT': <QgsProcessingOutputLayerDefinition {'sink':C:/Users/qgis/test1.gpkg, 'createOptions': {}}>}
Could not create layer C:/Users/qgis/test1.gpkg: Creation of field fid failed (OGR error: Wrong field type for fid)
index: 1.0
text_combine:
python error when using the QGIS "assign projection" tool in batch mode and output format is GPKG - Author Name: **Giovanni Manghi** (@gioman)
Original Redmine Issue: [21498](https://issues.qgis.org/issues/21498)
Affected QGIS version: 3.4.5
Redmine category:processing/qgis
---
Input layers have already a "fid" field. Does not happen not using batch mode. Possibly affects other tools?
error is like
Algorithm Assign projection starting…
Input parameters:
{'CRS': <qgis._core.QgsCoordinateReferenceSystem object at 0x0000000010CABCA8>,
'INPUT': 'input1',
'OUTPUT': <QgsProcessingOutputLayerDefinition {'sink':C:/Users/qgis/test1.gpkg, 'createOptions': {}}>}
Could not create layer C:/Users/qgis/test1.gpkg: Creation of field fid failed (OGR error: Wrong field type for fid)
label: non_test
text:
python error when using the qgis assign projection tool in batch mode and output format is gpkg author name giovanni manghi gioman original redmine issue affected qgis version redmine category processing qgis input layers have already a fid field does not happen not using batch mode possibly affects other tools error is like algorithm assign projection starting… input parameters crs input output could not create layer c users qgis gpkg creation of field fid failed ogr error wrong field type for fid
binary_label: 0

Unnamed: 0: 448,347
id: 31,789,151,786
type: IssuesEvent
created_at: 2023-09-13 00:53:09
repo: Elias288/ElelisPage
repo_url: https://api.github.com/repos/Elias288/ElelisPage
action: opened
title: Publicacion Crear contendor con MongoDB
labels: documentation
body:
Crear un post en el blog explicando requisitos y paso a paso como crear un contenedor de base de datos local de Mongo DB
index: 1.0
text_combine:
Publicacion Crear contendor con MongoDB - Crear un post en el blog explicando requisitos y paso a paso como crear un contenedor de base de datos local de Mongo DB
label: non_test
text:
publicacion crear contendor con mongodb crear un post en el blog explicando requisitos y paso a paso como crear un contenedor de base de datos local de mongo db
binary_label: 0

Unnamed: 0: 214,710
id: 24,101,234,364
type: IssuesEvent
created_at: 2022-09-20 01:02:08
repo: LalithK90/w3Campus
repo_url: https://api.github.com/repos/LalithK90/w3Campus
action: opened
title: CVE-2022-31160 (Medium) detected in jquery-ui-1.12.1.jar
labels: security vulnerability
body:
## CVE-2022-31160 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-ui-1.12.1.jar</b></p></summary>
<p>WebJar for jQuery UI</p>
<p>Library home page: <a href="http://webjars.org">http://webjars.org</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /radle/caches/modules-2/files-2.1/org.webjars/jquery-ui/1.12.1/7251d21a1d8f78d5c99919954a16777ed8c7ec86/jquery-ui-1.12.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jquery-ui-1.12.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery UI is a curated set of user interface interactions, effects, widgets, and themes built on top of jQuery. Versions prior to 1.13.2 are potentially vulnerable to cross-site scripting. Initializing a checkboxradio widget on an input enclosed within a label makes that parent label contents considered as the input label. Calling `.checkboxradio( "refresh" )` on such a widget and the initial HTML contained encoded HTML entities will make them erroneously get decoded. This can lead to potentially executing JavaScript code. The bug has been patched in jQuery UI 1.13.2. To remediate the issue, someone who can change the initial HTML can wrap all the non-input contents of the `label` in a `span`.
<p>Publish Date: 2022-07-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31160>CVE-2022-31160</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160</a></p>
<p>Release Date: 2022-07-20</p>
<p>Fix Resolution: jquery-ui - 1.13.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
text_combine:
CVE-2022-31160 (Medium) detected in jquery-ui-1.12.1.jar - ## CVE-2022-31160 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-ui-1.12.1.jar</b></p></summary>
<p>WebJar for jQuery UI</p>
<p>Library home page: <a href="http://webjars.org">http://webjars.org</a></p>
<p>Path to dependency file: /build.gradle</p>
<p>Path to vulnerable library: /radle/caches/modules-2/files-2.1/org.webjars/jquery-ui/1.12.1/7251d21a1d8f78d5c99919954a16777ed8c7ec86/jquery-ui-1.12.1.jar</p>
<p>
Dependency Hierarchy:
- :x: **jquery-ui-1.12.1.jar** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery UI is a curated set of user interface interactions, effects, widgets, and themes built on top of jQuery. Versions prior to 1.13.2 are potentially vulnerable to cross-site scripting. Initializing a checkboxradio widget on an input enclosed within a label makes that parent label contents considered as the input label. Calling `.checkboxradio( "refresh" )` on such a widget and the initial HTML contained encoded HTML entities will make them erroneously get decoded. This can lead to potentially executing JavaScript code. The bug has been patched in jQuery UI 1.13.2. To remediate the issue, someone who can change the initial HTML can wrap all the non-input contents of the `label` in a `span`.
<p>Publish Date: 2022-07-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-31160>CVE-2022-31160</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-31160</a></p>
<p>Release Date: 2022-07-20</p>
<p>Fix Resolution: jquery-ui - 1.13.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_test
text:
cve medium detected in jquery ui jar cve medium severity vulnerability vulnerable library jquery ui jar webjar for jquery ui library home page a href path to dependency file build gradle path to vulnerable library radle caches modules files org webjars jquery ui jquery ui jar dependency hierarchy x jquery ui jar vulnerable library found in base branch master vulnerability details jquery ui is a curated set of user interface interactions effects widgets and themes built on top of jquery versions prior to are potentially vulnerable to cross site scripting initializing a checkboxradio widget on an input enclosed within a label makes that parent label contents considered as the input label calling checkboxradio refresh on such a widget and the initial html contained encoded html entities will make them erroneously get decoded this can lead to potentially executing javascript code the bug has been patched in jquery ui to remediate the issue someone who can change the initial html can wrap all the non input contents of the label in a span publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery ui step up your open source security game with mend
binary_label: 0

Unnamed: 0: 103,617
id: 8,923,785,729
type: IssuesEvent
created_at: 2019-01-21 16:33:38
repo: spring-projects/spring-framework
repo_url: https://api.github.com/repos/spring-projects/spring-framework
action: reopened
title: spring-test should handle @Mock [SPR-14083]
labels: in: core in: test status: bulk-closed
body:
**[Martin Meyer](https://jira.spring.io/secure/ViewProfile.jspa?name=elreydetodo)** opened **[SPR-14083](https://jira.spring.io/browse/SPR-14083?redirect=false)** and commented
If I have a Transaction Manager running in unit tests, it becomes impossible for Mockito to properly initialize and inject mocks. This problem seems to be related to usage of proxy classes, not specific to using a transaction manager.
Here's a basic example of a test class:
```java
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {EnableTransactionManager.class})
public class SomeTest {
@Mock
private MockBean someMockedBean;
@Autowired
@InjectMocks
private RealBean someRealBean;
@Before
public void before() {
MockitoAnnotations.initMocks(this);
}
@Test
public void someTest() {
someRealBean.doSomething();
}
}
```
What I see in the debugger is that on the `someRealBean.doSomething()` line, `someRealBean` contains a mock reference to my `MockBean` instance. But when I set the next breakpoint within `RealBean#doSomething()` and go there, the local field within the object is not a mock one, it's a real `@Autowire`'d bean.
In real terms, the impact on me is that tests fail whenever I try to implement a TransactionManager and have it configured during tests. I end up trying to call DAO methods that should have been mocked, and it tries to issue database calls on a DataSource that isn't really ready to service them.
I think the problem is a race between Mockito and Spring to performing their setup. I think the solution is for `SpringJUnit4ClassRunner` to automagically call the equivalent of `MockitoAnnotations.initMocks(this)` if `org.mockito.Mockito` is in the classpath. This would mean adding an optional dependency on Mockito.
There was [an issue opened against Mockito for this](https://github.com/mockito/mockito/issues/209), which was closed as WONTFIX. They suggested that it be fixed in Spring instead of bringing in a Spring dependency there.
The obvious workaround here is to not enable anything that will do AOP proxying during your unit tests.
---
**Affects:** 4.2.5
index: 1.0
text_combine:
spring-test should handle @Mock [SPR-14083] - **[Martin Meyer](https://jira.spring.io/secure/ViewProfile.jspa?name=elreydetodo)** opened **[SPR-14083](https://jira.spring.io/browse/SPR-14083?redirect=false)** and commented
If I have a Transaction Manager running in unit tests, it becomes impossible for Mockito to properly initialize and inject mocks. This problem seems to be related to usage of proxy classes, not specific to using a transaction manager.
Here's a basic example of a test class:
```java
@RunWith(SpringJUnit4ClassRunner.class)
@ContextConfiguration(classes = {EnableTransactionManager.class})
public class SomeTest {
@Mock
private MockBean someMockedBean;
@Autowired
@InjectMocks
private RealBean someRealBean;
@Before
public void before() {
MockitoAnnotations.initMocks(this);
}
@Test
public void someTest() {
someRealBean.doSomething();
}
}
```
What I see in the debugger is that on the `someRealBean.doSomething()` line, `someRealBean` contains a mock reference to my `MockBean` instance. But when I set the next breakpoint within `RealBean#doSomething()` and go there, the local field within the object is not a mock one, it's a real `@Autowire`'d bean.
In real terms, the impact on me is that tests fail whenever I try to implement a TransactionManager and have it configured during tests. I end up trying to call DAO methods that should have been mocked, and it tries to issue database calls on a DataSource that isn't really ready to service them.
I think the problem is a race between Mockito and Spring to performing their setup. I think the solution is for `SpringJUnit4ClassRunner` to automagically call the equivalent of `MockitoAnnotations.initMocks(this)` if `org.mockito.Mockito` is in the classpath. This would mean adding an optional dependency on Mockito.
There was [an issue opened against Mockito for this](https://github.com/mockito/mockito/issues/209), which was closed as WONTFIX. They suggested that it be fixed in Spring instead of bringing in a Spring dependency there.
The obvious workaround here is to not enable anything that will do AOP proxying during your unit tests.
---
**Affects:** 4.2.5
|
test
|
spring test should handle mock opened and commented if i have a transaction manager running in unit tests it becomes impossible for mockito to properly initialize and inject mocks this problem seems to be related to usage of proxy classes not specific to using a transaction manager here s a basic example of a test class java runwith class contextconfiguration classes enabletransactionmanager class public class sometest mock private mockbean somemockedbean autowired injectmocks private realbean somerealbean before public void before mockitoannotations initmocks this test public void sometest somerealbean dosomething what i see in the debugger is that on the somerealbean dosomething line somerealbean contains a mock reference to my mockbean instance but when i set the next breakpoint within realbean dosomething and go there the local field within the object is not a mock one it s a real autowire d bean in real terms the impact on me is that tests fail whenever i try to implement a transactionmanager and have it configured during tests i end up trying to call dao methods that should have been mocked and it tries to issue database calls on a datasource that isn t really ready to service them i think the problem is a race between mockito and spring to performing their setup i think the solution is for to automagically call the equivalent of mockitoannotations initmocks this if org mockito mockito is in the classpath this would mean adding an optional dependency on mockito there was which was closed as wontfix they suggested that it be fixed in spring instead of bringing in a spring dependency there the obvious workaround here is to not enable anything that will do aop proxying during your unit tests affects
| 1
|
699,488
| 24,018,339,351
|
IssuesEvent
|
2022-09-15 04:32:43
|
MathMarEcol/WSMPA2
|
https://api.github.com/repos/MathMarEcol/WSMPA2
|
closed
|
Order of features and targets
|
High Priority
|
In the `prioritizr::problem` call, ensure that we have a check for the order of features matching the order of targets. Otherwise the incorrect target could be applied.
They should all stay in the correct order but worth checking.
|
1.0
|
Order of features and targets - In the `prioritizr::problem` call, ensure that we have a check for the order of features matching the order of targets. Otherwise the incorrect target could be applied.
They should all stay in the correct order but worth checking.
|
non_test
|
order of features and targets in the prioritizr problem call ensure that we have a check for the order of features matching the order of targets otherwise the incorrect target could be applied they should all stay in the correct order but worth checking
| 0
|
309,518
| 26,667,268,558
|
IssuesEvent
|
2023-01-26 06:13:53
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
sql/tests: TestRandomSyntaxGeneration failed
|
C-test-failure O-robot branch-master
|
sql/tests.TestRandomSyntaxGeneration [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/8455619?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/8455619?buildTab=artifacts#/) on master @ [2ad8df3df3272110705984efc32f1453631ce602](https://github.com/cockroachdb/cockroach/commits/2ad8df3df3272110705984efc32f1453631ce602):
```
rsg_test.go:828: 2m35s of 5m0s: 445057 executions, 22568 successful
rsg_test.go:828: 2m40s of 5m0s: 459221 executions, 23269 successful
rsg_test.go:828: 2m45s of 5m0s: 472933 executions, 23928 successful
rsg_test.go:828: 2m50s of 5m0s: 486544 executions, 24538 successful
rsg_test.go:828: 2m55s of 5m0s: 501066 executions, 25159 successful
rsg_test.go:828: 3m0s of 5m0s: 515646 executions, 25862 successful
rsg_test.go:828: 3m5s of 5m0s: 530970 executions, 26603 successful
rsg_test.go:828: 3m10s of 5m0s: 546532 executions, 27285 successful
rsg_test.go:828: 3m15s of 5m0s: 561208 executions, 27985 successful
rsg_test.go:828: 3m20s of 5m0s: 575777 executions, 28646 successful
rsg_test.go:828: 3m25s of 5m0s: 591233 executions, 29315 successful
rsg_test.go:828: 3m30s of 5m0s: 607486 executions, 30045 successful
rsg_test.go:828: 3m35s of 5m0s: 621877 executions, 30688 successful
rsg_test.go:828: 3m40s of 5m0s: 635690 executions, 31350 successful
rsg_test.go:828: 3m45s of 5m0s: 649641 executions, 31984 successful
rsg_test.go:828: 3m50s of 5m0s: 663668 executions, 32607 successful
rsg_test.go:828: 3m55s of 5m0s: 678592 executions, 33291 successful
rsg_test.go:828: 4m0s of 5m0s: 692858 executions, 33976 successful
rsg_test.go:828: 4m5s of 5m0s: 708018 executions, 34654 successful
rsg_test.go:828: 4m10s of 5m0s: 723884 executions, 35312 successful
rsg_test.go:828: 4m15s of 5m0s: 739322 executions, 35978 successful
rsg_test.go:828: 4m20s of 5m0s: 753960 executions, 36618 successful
rsg_test.go:828: 4m25s of 5m0s: 769418 executions, 37362 successful
rsg_test.go:828: 4m30s of 5m0s: 784511 executions, 38046 successful
rsg_test.go:828: 4m35s of 5m0s: 799362 executions, 38711 successful
panic: ReturnType called on TypedExpr with empty typeAnnotation. Was the underlying Expr type-checked before asserting a type of TypedExpr?
goroutine 4489663 [running]:
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.typeAnnotation.assertTyped(...)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/expr.go:152
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.(*Subquery).TypeCheck(0xc002d06cc0?, {0x0?, 0x0?}, 0x0?, 0x0?)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:1491 +0xe5
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.(*ColumnAccessExpr).TypeCheck(0xc02caf2800, {0x6d7a3d8?, 0xc02caf5c20?}, 0x0?, 0x0?)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:778 +0x55
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.TypeCheck({0x6d7a3d8?, 0xc02caf5c20?}, {0x6d96640?, 0xc02caf2800?}, 0xc002208000?, 0xc001e2ef18?)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:272 +0x94
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.TypeCheckAndRequire({0x6d7a3d8?, 0xc02caf5c20?}, {0x6d96640?, 0xc02caf2800?}, 0x0?, 0x9b29040, {0x582115a, 0x18})
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:282 +0x65
github.com/cockroachdb/cockroach/pkg/sql.(*planner).analyzeExpr(0xc026537d50, {0x6d7a3d8, 0xc02caf5c20}, {0x6d96640?, 0xc02caf2800?}, 0xc02cbd9ec0?, {{0x0, 0x0, 0x0}, {0x0, ...}}, ...)
github.com/cockroachdb/cockroach/pkg/sql/analyze_expr.go:53 +0x133
github.com/cockroachdb/cockroach/pkg/sql.(*planner).planTenantSpec(0x582115a?, {0x6d7a3d8?, 0xc02caf5c20?}, 0xc002208000?, {0x582115a?, 0x40a0f33f16?})
github.com/cockroachdb/cockroach/pkg/sql/tenant_spec.go:50 +0x254
github.com/cockroachdb/cockroach/pkg/sql.(*planner).LookupTenantInfo(0x93450e9a9fdb9eb3?, {0x6d7a3d8, 0xc02caf5c20}, 0x582115a?, {0x582115a?, 0xc01d3eed28?})
github.com/cockroachdb/cockroach/pkg/sql/tenant_spec.go:153 +0x2d
github.com/cockroachdb/cockroach/pkg/ccl/streamingccl/streamingest.alterReplicationJobHook.func1({0x6d7a3d8, 0xc02caf5c20}, {0x6d7a3d8?, 0xc02cb11aa0?, 0xf0f000000000000?}, 0x6d7a3d8?)
github.com/cockroachdb/cockroach/pkg/ccl/streamingccl/streamingest/alter_replication_job.go:153 +0x12d
github.com/cockroachdb/cockroach/pkg/sql.(*hookFnNode).startExec.func1()
github.com/cockroachdb/cockroach/pkg/sql/planhook.go:205 +0xa4
created by github.com/cockroachdb/cockroach/pkg/sql.(*hookFnNode).startExec
github.com/cockroachdb/cockroach/pkg/sql/planhook.go:203 +0x1b2
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #95618 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot T-sql-sessions branch-release-22.2]
- #89363 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot T-sql-sessions branch-release-22.2.0]
- #87572 sql/tests: TestRandomSyntaxGeneration failed [DROP OWNED BY timeout] [C-test-failure O-robot T-sql-schema branch-release-22.2]
- #77893 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot T-sql-sessions branch-release-22.1]
- #74271 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot branch-release-21.2]
- #65210 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot branch-release-21.1]
</p>
</details>
/cc @cockroachdb/sql-sessions
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxGeneration.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
1.0
|
sql/tests: TestRandomSyntaxGeneration failed - sql/tests.TestRandomSyntaxGeneration [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/8455619?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RandomSyntaxTestsBazel/8455619?buildTab=artifacts#/) on master @ [2ad8df3df3272110705984efc32f1453631ce602](https://github.com/cockroachdb/cockroach/commits/2ad8df3df3272110705984efc32f1453631ce602):
```
rsg_test.go:828: 2m35s of 5m0s: 445057 executions, 22568 successful
rsg_test.go:828: 2m40s of 5m0s: 459221 executions, 23269 successful
rsg_test.go:828: 2m45s of 5m0s: 472933 executions, 23928 successful
rsg_test.go:828: 2m50s of 5m0s: 486544 executions, 24538 successful
rsg_test.go:828: 2m55s of 5m0s: 501066 executions, 25159 successful
rsg_test.go:828: 3m0s of 5m0s: 515646 executions, 25862 successful
rsg_test.go:828: 3m5s of 5m0s: 530970 executions, 26603 successful
rsg_test.go:828: 3m10s of 5m0s: 546532 executions, 27285 successful
rsg_test.go:828: 3m15s of 5m0s: 561208 executions, 27985 successful
rsg_test.go:828: 3m20s of 5m0s: 575777 executions, 28646 successful
rsg_test.go:828: 3m25s of 5m0s: 591233 executions, 29315 successful
rsg_test.go:828: 3m30s of 5m0s: 607486 executions, 30045 successful
rsg_test.go:828: 3m35s of 5m0s: 621877 executions, 30688 successful
rsg_test.go:828: 3m40s of 5m0s: 635690 executions, 31350 successful
rsg_test.go:828: 3m45s of 5m0s: 649641 executions, 31984 successful
rsg_test.go:828: 3m50s of 5m0s: 663668 executions, 32607 successful
rsg_test.go:828: 3m55s of 5m0s: 678592 executions, 33291 successful
rsg_test.go:828: 4m0s of 5m0s: 692858 executions, 33976 successful
rsg_test.go:828: 4m5s of 5m0s: 708018 executions, 34654 successful
rsg_test.go:828: 4m10s of 5m0s: 723884 executions, 35312 successful
rsg_test.go:828: 4m15s of 5m0s: 739322 executions, 35978 successful
rsg_test.go:828: 4m20s of 5m0s: 753960 executions, 36618 successful
rsg_test.go:828: 4m25s of 5m0s: 769418 executions, 37362 successful
rsg_test.go:828: 4m30s of 5m0s: 784511 executions, 38046 successful
rsg_test.go:828: 4m35s of 5m0s: 799362 executions, 38711 successful
panic: ReturnType called on TypedExpr with empty typeAnnotation. Was the underlying Expr type-checked before asserting a type of TypedExpr?
goroutine 4489663 [running]:
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.typeAnnotation.assertTyped(...)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/expr.go:152
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.(*Subquery).TypeCheck(0xc002d06cc0?, {0x0?, 0x0?}, 0x0?, 0x0?)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:1491 +0xe5
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.(*ColumnAccessExpr).TypeCheck(0xc02caf2800, {0x6d7a3d8?, 0xc02caf5c20?}, 0x0?, 0x0?)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:778 +0x55
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.TypeCheck({0x6d7a3d8?, 0xc02caf5c20?}, {0x6d96640?, 0xc02caf2800?}, 0xc002208000?, 0xc001e2ef18?)
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:272 +0x94
github.com/cockroachdb/cockroach/pkg/sql/sem/tree.TypeCheckAndRequire({0x6d7a3d8?, 0xc02caf5c20?}, {0x6d96640?, 0xc02caf2800?}, 0x0?, 0x9b29040, {0x582115a, 0x18})
github.com/cockroachdb/cockroach/pkg/sql/sem/tree/type_check.go:282 +0x65
github.com/cockroachdb/cockroach/pkg/sql.(*planner).analyzeExpr(0xc026537d50, {0x6d7a3d8, 0xc02caf5c20}, {0x6d96640?, 0xc02caf2800?}, 0xc02cbd9ec0?, {{0x0, 0x0, 0x0}, {0x0, ...}}, ...)
github.com/cockroachdb/cockroach/pkg/sql/analyze_expr.go:53 +0x133
github.com/cockroachdb/cockroach/pkg/sql.(*planner).planTenantSpec(0x582115a?, {0x6d7a3d8?, 0xc02caf5c20?}, 0xc002208000?, {0x582115a?, 0x40a0f33f16?})
github.com/cockroachdb/cockroach/pkg/sql/tenant_spec.go:50 +0x254
github.com/cockroachdb/cockroach/pkg/sql.(*planner).LookupTenantInfo(0x93450e9a9fdb9eb3?, {0x6d7a3d8, 0xc02caf5c20}, 0x582115a?, {0x582115a?, 0xc01d3eed28?})
github.com/cockroachdb/cockroach/pkg/sql/tenant_spec.go:153 +0x2d
github.com/cockroachdb/cockroach/pkg/ccl/streamingccl/streamingest.alterReplicationJobHook.func1({0x6d7a3d8, 0xc02caf5c20}, {0x6d7a3d8?, 0xc02cb11aa0?, 0xf0f000000000000?}, 0x6d7a3d8?)
github.com/cockroachdb/cockroach/pkg/ccl/streamingccl/streamingest/alter_replication_job.go:153 +0x12d
github.com/cockroachdb/cockroach/pkg/sql.(*hookFnNode).startExec.func1()
github.com/cockroachdb/cockroach/pkg/sql/planhook.go:205 +0xa4
created by github.com/cockroachdb/cockroach/pkg/sql.(*hookFnNode).startExec
github.com/cockroachdb/cockroach/pkg/sql/planhook.go:203 +0x1b2
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #95618 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot T-sql-sessions branch-release-22.2]
- #89363 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot T-sql-sessions branch-release-22.2.0]
- #87572 sql/tests: TestRandomSyntaxGeneration failed [DROP OWNED BY timeout] [C-test-failure O-robot T-sql-schema branch-release-22.2]
- #77893 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot T-sql-sessions branch-release-22.1]
- #74271 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot branch-release-21.2]
- #65210 sql/tests: TestRandomSyntaxGeneration failed [C-test-failure O-robot branch-release-21.1]
</p>
</details>
/cc @cockroachdb/sql-sessions
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestRandomSyntaxGeneration.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration with on master rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful rsg test go of executions successful panic returntype called on typedexpr with empty typeannotation was the underlying expr type checked before asserting a type of typedexpr goroutine github com cockroachdb cockroach pkg sql sem tree typeannotation asserttyped github com cockroachdb cockroach pkg sql sem tree expr go github com cockroachdb cockroach pkg sql sem tree subquery typecheck github com cockroachdb cockroach pkg sql sem tree type check go github com cockroachdb cockroach pkg sql sem tree columnaccessexpr typecheck github com cockroachdb cockroach pkg sql sem tree type check go github com cockroachdb cockroach pkg sql sem tree typecheck github com cockroachdb cockroach pkg sql sem tree type check go github com cockroachdb cockroach pkg sql sem tree typecheckandrequire github com cockroachdb cockroach pkg sql sem tree type check go github com cockroachdb cockroach pkg sql planner analyzeexpr github com cockroachdb cockroach pkg sql analyze expr go github com cockroachdb cockroach pkg sql 
planner plantenantspec github com cockroachdb cockroach pkg sql tenant spec go github com cockroachdb cockroach pkg sql planner lookuptenantinfo github com cockroachdb cockroach pkg sql tenant spec go github com cockroachdb cockroach pkg ccl streamingccl streamingest alterreplicationjobhook github com cockroachdb cockroach pkg ccl streamingccl streamingest alter replication job go github com cockroachdb cockroach pkg sql hookfnnode startexec github com cockroachdb cockroach pkg sql planhook go created by github com cockroachdb cockroach pkg sql hookfnnode startexec github com cockroachdb cockroach pkg sql planhook go help see also same failure on other branches sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration failed sql tests testrandomsyntaxgeneration failed cc cockroachdb sql sessions
| 1
|
115,744
| 9,810,542,587
|
IssuesEvent
|
2019-06-12 20:44:32
|
IBM-ICP-CoC/SumApp
|
https://api.github.com/repos/IBM-ICP-CoC/SumApp
|
closed
|
Automated Test Results
|
test
|
<h3>Test Results</h3><p><a href="null">Build: null</a></p>
<pre>
run
run op1 op2 result title result
-----------------------------------------------
0 4 9 [4 + 9 = 13] OK --> passed
1 9 2 [9 + 2 = 11] OK --> passed
2 1 5 [1 + 5 = 6 ] OK --> passed
3 8 5 [8 + 5 = 13] OK --> passed
4 7 4 [7 + 4 = 11] OK --> passed
-----------------------------------------------
Runs: 5
Passed: 5
</pre>
|
1.0
|
Automated Test Results - <h3>Test Results</h3><p><a href="null">Build: null</a></p>
<pre>
run
run op1 op2 result title result
-----------------------------------------------
0 4 9 [4 + 9 = 13] OK --> passed
1 9 2 [9 + 2 = 11] OK --> passed
2 1 5 [1 + 5 = 6 ] OK --> passed
3 8 5 [8 + 5 = 13] OK --> passed
4 7 4 [7 + 4 = 11] OK --> passed
-----------------------------------------------
Runs: 5
Passed: 5
</pre>
|
test
|
automated test results test results build null run run result title result ok passed ok passed ok passed ok passed ok passed runs passed
| 1
|
180,768
| 6,653,273,356
|
IssuesEvent
|
2017-09-29 07:39:40
|
a8cteam51/smittenkitchen
|
https://api.github.com/repos/a8cteam51/smittenkitchen
|
closed
|
sidebar and footer modules look wonky
|
high-priority
|
I don't know if it's a bad ad messing up the code, but the titles that usually center nicely over sidebar images are underneath, and the footer module has weirdly sized images. It looked fine 2 days ago. Thank you.
|
1.0
|
sidebar and footer modules look wonky - I don't know if it's a bad ad messing up the code, but the titles that usually center nicely over sidebar images are underneath, and the footer module has weirdly sized images. It looked fine 2 days ago. Thank you.
|
non_test
|
sidebar and footer modules look wonky i don t know if it s a bad ad messing up the code but the the titles that usually center nicely over sidebar images are underneath and the footer module has weirdly sized images it looked fine days ago thank you
| 0
|
174,699
| 13,505,356,872
|
IssuesEvent
|
2020-09-13 22:29:34
|
ericberglund117/Refactor-tractor-WC-EB-KM-CD
|
https://api.github.com/repos/ericberglund117/Refactor-tractor-WC-EB-KM-CD
|
closed
|
SPRINT 2 :: PANTRY CLASS :: Remove ingredients
|
TDD/Testing enhancement
|
CHECKLIST
- [x] test written
- [x] code written
- [x] test passes
As a user, I should be able to remove ingredients used for a given meal from my pantry once that meal has been cooked
|
1.0
|
SPRINT 2 :: PANTRY CLASS :: Remove ingredients - CHECKLIST
- [x] test written
- [x] code written
- [x] test passes
As a user, I should be able to remove ingredients used for a given meal from my pantry once that meal has been cooked
|
test
|
sprint pantry class remove ingredients checklist test written code written test passes as a user i should be able to remove ingredients used for a given meal from my pantry once that meal has been cooked
| 1
|
719,761
| 24,769,163,853
|
IssuesEvent
|
2022-10-22 23:16:30
|
JasonBock/Rocks
|
https://api.github.com/repos/JasonBock/Rocks
|
closed
|
Projected Delegates Need to Duplicate Type Constraints
|
bug Medium Priority
|
If a delegate needs to be generated, any type constraints from the method need to be duplicated on the delegate.
This currently causes the following issues with `SixLabors.ImageSharp.Processing.Processors.Dithering.IDither`:
```
Error - Id: CS8377, Description: Rocks\Rocks.RockCreateGenerator\IDither_Rock_Create.g.cs(11,204): error CS8377: The type 'TPixel' must be a non-nullable value type, along with all fields at any level of nesting, in order to use it as parameter 'TPixel' in the generic type or method 'ImageFrame<TPixel>'
Error - Id: CS8377, Description: Rocks\Rocks.RockCreateGenerator\IDither_Rock_Create.g.cs(11,268): error CS8377: The type 'TPixel' must be a non-nullable value type, along with all fields at any level of nesting, in order to use it as parameter 'TPixel' in the generic type or method 'IndexedImageFrame<TPixel>'
```
|
1.0
|
Projected Delegates Need to Duplicate Type Constraints - If a delegate needs to be generated, any type constraints from the method need to be duplicated on the delegate.
This currently causes the following issues with `SixLabors.ImageSharp.Processing.Processors.Dithering.IDither`:
```
Error - Id: CS8377, Description: Rocks\Rocks.RockCreateGenerator\IDither_Rock_Create.g.cs(11,204): error CS8377: The type 'TPixel' must be a non-nullable value type, along with all fields at any level of nesting, in order to use it as parameter 'TPixel' in the generic type or method 'ImageFrame<TPixel>'
Error - Id: CS8377, Description: Rocks\Rocks.RockCreateGenerator\IDither_Rock_Create.g.cs(11,268): error CS8377: The type 'TPixel' must be a non-nullable value type, along with all fields at any level of nesting, in order to use it as parameter 'TPixel' in the generic type or method 'IndexedImageFrame<TPixel>'
```
|
non_test
|
projected delegates need to duplicate type constraints if a delegate needs to be generated any type constraints from the method need to be duplicated on the delegate this currently causes the following issues with sixlabors imagesharp processing processors dithering idither error id description rocks rocks rockcreategenerator idither rock create g cs error the type tpixel must be a non nullable value type along with all fields at any level of nesting in order to use it as parameter tpixel in the generic type or method imageframe error id description rocks rocks rockcreategenerator idither rock create g cs error the type tpixel must be a non nullable value type along with all fields at any level of nesting in order to use it as parameter tpixel in the generic type or method indexedimageframe
| 0
|
676,682
| 23,134,061,745
|
IssuesEvent
|
2022-07-28 13:00:23
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
Add rotating hash to payload - followup to 23055
|
bug priority/P3 QA/Yes release-notes/exclude regression feature/ads OS/Desktop
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Launch browser
2. Join rewards
3. Navigate to brave://rewards-internals
## Actual result:
<!--Please add screenshots if needed-->
`Device Id: Unknown` is shown on Ad Diagnostics
## Expected result:
`Device Id: ...` should be shown on Ad Diagnostics, where ... is the device id (which never leaves the local machine)
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily reproduced
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? No
- Can you reproduce this issue with the beta channel? Yes
- Can you reproduce this issue with the nightly channel? Yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields? N/A
- Does the issue resolve itself when disabling Brave Rewards? N/A
- Is the issue reproducible on the latest version of Chrome? N/A
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
|
1.0
|
Add rotating hash to payload - followup to 23055 - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description
<!--Provide a brief description of the issue-->
## Steps to Reproduce
<!--Please add a series of steps to reproduce the issue-->
1. Launch browser
2. Join rewards
3. Navigate to brave://rewards-internals
## Actual result:
<!--Please add screenshots if needed-->
`Device Id: Unknown` is shown on Ad Diagnostics
## Expected result:
`Device Id: ...` should be shown on Ad Diagnostics, where ... is the device id (which never leaves the local machine)
## Reproduces how often:
<!--[Easily reproduced/Intermittent issue/No steps to reproduce]-->
Easily reproduced
## Brave version (brave://version info)
<!--For installed build, please copy Brave, Revision and OS from brave://version and paste here. If building from source please mention it along with brave://version details-->
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current release? No
- Can you reproduce this issue with the beta channel? Yes
- Can you reproduce this issue with the nightly channel? Yes
## Other Additional Information:
- Does the issue resolve itself when disabling Brave Shields? N/A
- Does the issue resolve itself when disabling Brave Rewards? N/A
- Is the issue reproducible on the latest version of Chrome? N/A
## Miscellaneous Information:
<!--Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue-->
|
non_test
|
add rotating hash to payload followup to have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description steps to reproduce launch browser join rewards navigate to brave rewards internals actual result device id unknown is shown on ad diagnostics expected result device id should be shown on ad diagnostics where is the device id which never leaves the local machine reproduces how often easily reproduced brave version brave version info version channel information can you reproduce this issue with the current release no can you reproduce this issue with the beta channel yes can you reproduce this issue with the nightly channel yes other additional information does the issue resolve itself when disabling brave shields n a does the issue resolve itself when disabling brave rewards n a is the issue reproducible on the latest version of chrome n a miscellaneous information
| 0
|
321,289
| 27,520,247,137
|
IssuesEvent
|
2023-03-06 14:36:43
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix non_linear_activation_functions.test_torch_batch_norm
|
PyTorch Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4106132353/jobs/7083996639" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4064792994/jobs/6998677575" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4106132353/jobs/7083998421" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4106132353/jobs/7083993203" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
|
1.0
|
Fix non_linear_activation_functions.test_torch_batch_norm - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4106132353/jobs/7083996639" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4064792994/jobs/6998677575" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4106132353/jobs/7083998421" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4106132353/jobs/7083993203" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
<details>
<summary>Not found</summary>
Not found
</details>
|
test
|
fix non linear activation functions test torch batch norm tensorflow img src torch img src numpy img src jax img src not found not found not found not found not found not found
| 1
|
208,321
| 16,110,524,198
|
IssuesEvent
|
2021-04-27 20:30:49
|
tsdataclinic/AfricaCovidDashboard
|
https://api.github.com/repos/tsdataclinic/AfricaCovidDashboard
|
closed
|
What is in the Indian Ocean region?
|
documentation
|
Region shown as an option in dropdown, but unclear what it includes
|
1.0
|
What is in the Indian Ocean region? - Region shown as an option in dropdown, but unclear what it includes
|
non_test
|
what is in the indian ocean region region shown as an option in dropdown but unclear what it includes
| 0
|
230,794
| 18,715,925,139
|
IssuesEvent
|
2021-11-03 04:40:34
|
devtranslate/design
|
https://api.github.com/repos/devtranslate/design
|
closed
|
Feature | Develop the Button component
|
addition: test addition: style addition: feature addition: accessibility hacktoberfest
|
Create the component following the layout of the button made in Figma:
* https://www.figma.com/file/przYUCKKXEun8ezPR0cP7D/Design-System?node-id=43%3A40
|
1.0
|
Feature | Develop the Button component - Create the component following the layout of the button made in Figma:
* https://www.figma.com/file/przYUCKKXEun8ezPR0cP7D/Design-System?node-id=43%3A40
|
test
|
feature desenvolver o componente button criar componente seguindo o layout do botão feito no figma
| 1
|
91,676
| 8,316,267,472
|
IssuesEvent
|
2018-09-25 08:33:19
|
betagouv/pass-culture-browser
|
https://api.github.com/repos/betagouv/pass-culture-browser
|
closed
|
[PRO][] OFFERER (page /structures)
|
tests
|
When I create an account and land on the structures page
- [x] I see a green banner, 'le rattachement de la structure, bla bla bla...'
- [x] I only see the message 'En cours de validation....'
- [x] I can add a venue even if my structure is not validated
When I create a new structure
- [x] If I add a structure that does not yet have a venue, we display "Créez un lieu pour pouvoir y associer des offres." No "nouvelle offre" button, no offer count, no venue count: you have to go to the structure's detail page to add the venue.
- [x] BUG? If I add another structure, the green banner telling me 'Votre structure a bien été enregistrée, elle est en cours de validation.' is displayed even though the structure is active by default #350
If my main structure is not yet validated
- [x] Handle the display differently depending on whether the structure is validated or not; for now the display is driven by isActive, and isValidated is not yet implemented on the API side @arnoo #349
If I have not added a venue
- [ ] if I click on offer but I have no venue > what happens? #353
**Notes**
- The number of offers is not always displayed correctly, but since that touches the selectors, I set it aside.
- An offer without a date does not appear either
Branch https://github.com/betagouv/pass-culture-pro/commits/fix-offerer-item-informations-display

|
1.0
|
[PRO][] OFFERER (page /structures) - When I create an account and land on the structures page
- [x] I see a green banner, 'le rattachement de la structure, bla bla bla...'
- [x] I only see the message 'En cours de validation....'
- [x] I can add a venue even if my structure is not validated
When I create a new structure
- [x] If I add a structure that does not yet have a venue, we display "Créez un lieu pour pouvoir y associer des offres." No "nouvelle offre" button, no offer count, no venue count: you have to go to the structure's detail page to add the venue.
- [x] BUG? If I add another structure, the green banner telling me 'Votre structure a bien été enregistrée, elle est en cours de validation.' is displayed even though the structure is active by default #350
If my main structure is not yet validated
- [x] Handle the display differently depending on whether the structure is validated or not; for now the display is driven by isActive, and isValidated is not yet implemented on the API side @arnoo #349
If I have not added a venue
- [ ] if I click on offer but I have no venue > what happens? #353
**Notes**
- The number of offers is not always displayed correctly, but since that touches the selectors, I set it aside.
- An offer without a date does not appear either
Branch https://github.com/betagouv/pass-culture-pro/commits/fix-offerer-item-informations-display

|
test
|
offerer page structures quand je créé un compte et que j arrive sur la page structures je vois un bandeau vert le rattachement de la structure bla bla bla je vois juste le message en cours de validation je peux ajouter un lieu même si ma structure n est pas validée quand je créé une nouvelle structure si j ajoute une structure mais qu elle n a pas encore de lieu on affiche créez un lieu pour pouvoir y associer des offres pas de btn nouvelle offre pas de décompte des offres pas de décompte des lieux il faut aller dans la page de détail de la structure pour ajouter le lieux bug si je rajoute une structure le bandeau vert qui m indique que votre structure a bien été enregistrée elle est en cours de validation s affiche alors que la structure est active par défaut si ma structure principale n est pas encore validée gérer l affichage différement si la structure est validée ou pas pour le moment l affichage est conditionné par isactive et le isvalidated n est pas encore implémenté côté api arnoo si je n ai pas ajouté de lieu si je clique sur offre mais que je n ai pas de lieu que se passe t il remarques ça n affiche pas toujours correctement le nombre d offres mais comme ça touche aux selectors j ai laissé de côté une offre sans date n apparaît pas non plus branche
| 1
|
152,277
| 12,100,024,586
|
IssuesEvent
|
2020-04-20 13:11:08
|
ethereum/solidity
|
https://api.github.com/repos/ethereum/solidity
|
closed
|
[Fuzzing] Permit linking of libraries in fuzzer harnesses
|
testing :hammer:
|
## Abstract
The solidity compilation framework class that is used by fuzzing code is currently capable of only generating EVM bytecode for a specified contract.
This issue tracks support for linking libraries. This support will be useful in #8636 .
|
1.0
|
[Fuzzing] Permit linking of libraries in fuzzer harnesses - ## Abstract
The solidity compilation framework class that is used by fuzzing code is currently capable of only generating EVM bytecode for a specified contract.
This issue tracks support for linking libraries. This support will be useful in #8636 .
|
test
|
permit linking of libraries in fuzzer harnesses abstract the solidity compilation framework class that is used by fuzzing code is currently capable of only generating evm bytecode for a specified contract this issue tracks support for linking libraries this support will be useful in
| 1
|
52,024
| 13,211,369,584
|
IssuesEvent
|
2020-08-15 22:38:36
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
cvmfs OpenCL detection looks for only libOpenCL.so.1 (Trac #1515)
|
Incomplete Migration Migrated from Trac cvmfs defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1515">https://code.icecube.wisc.edu/projects/icecube/ticket/1515</a>, reported by benedikt.riedeland owned by david.schultz</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-04-16T18:48:18",
"_ts": "1460832498369171",
"description": "OpenCL detection in cvmfs setup.sh depends on finding libOpenCL.so.1. On some systems libOpenCL.so.1 doesn't exist. Only libOpenCL.so is present on some systems, see\n\n\n{{{\n[briedel@parallel build]$ ll /global/software/cuda/6.0/lib64/libOpenCL*\n-rw-r--r-- 1 root root 26K May 29 2014 /global/software/cuda/6.0/lib64/libOpenCL.so\n}}}\n\n\nThe detection fails and causes problems with detecting the GPUs.",
"reporter": "benedikt.riedel",
"cc": "kclark, claudio.kopper",
"resolution": "fixed",
"time": "2016-01-19T23:39:05",
"component": "cvmfs",
"summary": "cvmfs OpenCL detection looks for only libOpenCL.so.1",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "david.schultz",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
cvmfs OpenCL detection looks for only libOpenCL.so.1 (Trac #1515) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1515">https://code.icecube.wisc.edu/projects/icecube/ticket/1515</a>, reported by benedikt.riedeland owned by david.schultz</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2016-04-16T18:48:18",
"_ts": "1460832498369171",
"description": "OpenCL detection in cvmfs setup.sh depends on finding libOpenCL.so.1. On some systems libOpenCL.so.1 doesn't exist. Only libOpenCL.so is present on some systems, see\n\n\n{{{\n[briedel@parallel build]$ ll /global/software/cuda/6.0/lib64/libOpenCL*\n-rw-r--r-- 1 root root 26K May 29 2014 /global/software/cuda/6.0/lib64/libOpenCL.so\n}}}\n\n\nThe detection fails and causes problems with detecting the GPUs.",
"reporter": "benedikt.riedel",
"cc": "kclark, claudio.kopper",
"resolution": "fixed",
"time": "2016-01-19T23:39:05",
"component": "cvmfs",
"summary": "cvmfs OpenCL detection looks for only libOpenCL.so.1",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "david.schultz",
"type": "defect"
}
```
</p>
</details>
|
non_test
|
cvmfs opencl detection looks for only libopencl so trac migrated from json status closed changetime ts description opencl detection in cvmfs setup sh depends on finding libopencl so on some systems libopencl so doesn t exist only libopencl so is present on some systems see n n n n ll global software cuda libopencl n rw r r root root may global software cuda libopencl so n n n nthe detection fails and causes problems with detecting the gpus reporter benedikt riedel cc kclark claudio kopper resolution fixed time component cvmfs summary cvmfs opencl detection looks for only libopencl so priority normal keywords milestone owner david schultz type defect
| 0
|
279,719
| 24,250,358,467
|
IssuesEvent
|
2022-09-27 13:47:58
|
lowRISC/opentitan
|
https://api.github.com/repos/lowRISC/opentitan
|
closed
|
[rom-e2e] rom_functests
|
Type:Task SW:ROM Milestone:V2 Component:RomE2eTest
|
### Test point name
[rom_functests](https://cs.opensource.google/opentitan/opentitan/+/master:sw/device/silicon_creator/rom/data/rom_testplan.hjson?q=rom_functests)
### Host side component
Unknown
### OpenTitanTool infrastructure implemented
Unknown
### Contact person
@alphan
### Checklist
Please fill out this checklist as items are completed. Link to PRs and issues as appropriate.
- [ ] Check if existing test covers most or all of this testpoint (if so, either extend said test to cover all points, or skip the next 3 checkboxes)
- [ ] Device-side (C) component developed
- [ ] Bazel build rules developed
- [ ] Host-side component developed
- [ ] HJSON test plan updated with test name (so it shows up in the dashboard)
### Determine which functests can be executed using ROM.
Functests test ROM components (e.g., drivers, libraries, etc.) work as intended on-chip.
However, unlike when these components are embedded in the ROM, functests are linked with
the OTTF, and run out of flash. Additionally, unlike the ROM E2E tests, functests are
booted by the test ROM.
Determine which functests can be executed using ROM to understand which tests can be
reused on the silicon.
|
1.0
|
[rom-e2e] rom_functests - ### Test point name
[rom_functests](https://cs.opensource.google/opentitan/opentitan/+/master:sw/device/silicon_creator/rom/data/rom_testplan.hjson?q=rom_functests)
### Host side component
Unknown
### OpenTitanTool infrastructure implemented
Unknown
### Contact person
@alphan
### Checklist
Please fill out this checklist as items are completed. Link to PRs and issues as appropriate.
- [ ] Check if existing test covers most or all of this testpoint (if so, either extend said test to cover all points, or skip the next 3 checkboxes)
- [ ] Device-side (C) component developed
- [ ] Bazel build rules developed
- [ ] Host-side component developed
- [ ] HJSON test plan updated with test name (so it shows up in the dashboard)
### Determine which functests can be executed using ROM.
Functests test ROM components (e.g., drivers, libraries, etc.) work as intended on-chip.
However, unlike when these components are embedded in the ROM, functests are linked with
the OTTF, and run out of flash. Additionally, unlike the ROM E2E tests, functests are
booted by the test ROM.
Determine which functests can be executed using ROM to understand which tests can be
reused on the silicon.
|
test
|
rom functests test point name host side component unknown opentitantool infrastructure implemented unknown contact person alphan checklist please fill out this checklist as items are completed link to prs and issues as appropriate check if existing test covers most or all of this testpoint if so either extend said test to cover all points or skip the next checkboxes device side c component developed bazel build rules developed host side component developed hjson test plan updated with test name so it shows up in the dashboard determine which functests can be executed using rom functests test rom components e g drivers libraries etc work as intended on chip however unlike when these components are embedded in the rom functests are linked with the ottf and run out of flash additionally unlike the rom tests functests are booted by the test rom determine which functests can be executed using rom to understand which tests can be reused on the silicon
| 1
|
495,516
| 14,283,396,797
|
IssuesEvent
|
2020-11-23 10:57:25
|
MrDaGree/ELS-FiveM
|
https://api.github.com/repos/MrDaGree/ELS-FiveM
|
closed
|
ELS script Error 444 & ELS files can't be read
|
priority: medium status: on hold type: question
|
Hello, I actually got two bugs... I searched for solutions for both but I couldn't find anything.
I'm using FivePD on my server; there is no ESX installed.
So the first bug is the following one:
https://prnt.sc/vir65j
I'm not really familiar with the code stuff, so I have no idea what it means.
The second problem is some ELS vehicle files. Every ELS file for a replaced vehicle (e.g. police, police2) is working. But for addons it's a little bit different - some ELS files are working, some others not. There is the ELS menu in the right corner on every car, so I think ELS is at least detected. I change the steps of lighting (from 0 to 3). The siren is working too. But the lights on the roof aren't working. For some vehicles the front flashers are also working, but not the lights on the roof. I have no idea what it depends on, because some addon cars are working, others don't. In singleplayer every car works without any problems. I didn't change anything in the .xml files.
I tried a fix I found on YT to change the .xml file names from normal.xml to BIG.xml, but it didn't change anything.
It would be great if someone has an idea why some files are working and others don't.
I fully removed and installed ELS again, but nothing changed - same issues.
### ELS Information
* Version: newest version
* Server Version #: FXServer-master v1.0.0.2967 linux
* Screenshots of config.lua, vcf.lua, vcf folder:
https://prnt.sc/vir6tz
https://prnt.sc/vir6hb
https://prnt.sc/vir742
Hope someone can help me, thanks :)
|
1.0
|
ELS script Error 444 & ELS files can't be read - Hello, I actually got two bugs... I searched for solutions for both but I couldn't find anything.
I'm using FivePD on my server; there is no ESX installed.
So the first bug is the following one:
https://prnt.sc/vir65j
I'm not really familiar with the code stuff, so I have no idea what it means.
The second problem is some ELS vehicle files. Every ELS file for a replaced vehicle (e.g. police, police2) is working. But for addons it's a little bit different - some ELS files are working, some others not. There is the ELS menu in the right corner on every car, so I think ELS is at least detected. I change the steps of lighting (from 0 to 3). The siren is working too. But the lights on the roof aren't working. For some vehicles the front flashers are also working, but not the lights on the roof. I have no idea what it depends on, because some addon cars are working, others don't. In singleplayer every car works without any problems. I didn't change anything in the .xml files.
I tried a fix I found on YT to change the .xml file names from normal.xml to BIG.xml, but it didn't change anything.
It would be great if someone has an idea why some files are working and others don't.
I fully removed and installed ELS again, but nothing changed - same issues.
### ELS Information
* Version: newest version
* Server Version #: FXServer-master v1.0.0.2967 linux
* Screenshots of config.lua, vcf.lua, vcf folder:
https://prnt.sc/vir6tz
https://prnt.sc/vir6hb
https://prnt.sc/vir742
Hope someone can help me, thanks :)
|
non_test
|
els script error els files cant be read hello i got actually two bugs i searched for solutions for both but i couldn t find anything im using fivepd on my server there is no esx installed so the first bug is the following one im not really familiar with the code stuff so i have no idea what it means second problem are some els vehicles files every els file for a replaced vehicle police e g is working but for addons its a little bit different some els files are working some others not there is the els menu in the right corner on every car so i think els is at least detected i change the steps of lighting from to siren is working too but the lights on the roof aren t working for some vehicles also the front flashers are working but not the lights on the roof i have no idea on what it depends because some addon cars are working others dont in singleplayer every car is working without any problems i didn t change anything in the xml files i tried a fix i found on yt to change the xml file names from normal xml to big xml but it didn t change anything would be great if someone has an idea why some files are working and others don t i fully removed and installed els again but nothing changed same issues els information version newest version server version fxserver master linux screenshots of config lua vcf lua vcf folder hope someone can help me thanks
| 0
|
202,065
| 7,043,726,268
|
IssuesEvent
|
2017-12-31 11:46:16
|
EmulatorNexus/VeniceUnleashed
|
https://api.github.com/repos/EmulatorNexus/VeniceUnleashed
|
closed
|
hanging flag-indicator during taking a flag
|
bug low priority
|
As I already reported yesterday via gamechat, here is the bug report on git:
Sometimes (it doesn't really matter what gamemode, as long as it features flags) the flag indicator which gets filled up/down randomly hangs even if the flag is still being taken.
Happens in about 1/5 of games, I would say.
Video: http://a.pomf.se/iufwqt.webm
|
1.0
|
hanging flag-indicator during taking a flag - As I already reported yesterday via gamechat, here is the bug report on git:
Sometimes (it doesn't really matter what gamemode, as long as it features flags) the flag indicator which gets filled up/down randomly hangs even if the flag is still being taken.
Happens in about 1/5 of games, I would say.
Video: http://a.pomf.se/iufwqt.webm
|
non_test
|
hanging flag indicator during taking a flag as i already reported yesterday via gamechat here is the bugreport on git sometimes doesn t really matter what gamemode as long as it features flags the flag indicator which gets filled up down randomly hangs even if the flag is still beeing taken happens in about games i would say video
| 0
|
352,416
| 32,065,039,261
|
IssuesEvent
|
2023-09-25 01:45:32
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix jax_lax_operators.test_jax_nextafter
|
JAX Frontend Sub Task Failing Test
|
| | |
|---|---|
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|
1.0
|
Fix jax_lax_operators.test_jax_nextafter - | | |
|---|---|
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/6293456359/job/17083995787"><img src=https://img.shields.io/badge/-failure-red></a>
|
test
|
fix jax lax operators test jax nextafter numpy a href src jax a href src tensorflow a href src torch a href src paddle a href src
| 1
|
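Each record's final lowercase field tracks its title and body with URLs, digits, and punctuation stripped. The function below is a minimal sketch of that normalization (my reconstruction from the visible rows — the dataset's actual preprocessing code is not shown here):

```python
import re

def normalize(text: str) -> str:
    """Approximate the lowercase text field seen in the records above:
    drop URLs, turn every non-letter run into a single space, lowercase."""
    text = re.sub(r"https?://\S+", " ", text)   # strip URLs first
    text = re.sub(r"[^A-Za-z]+", " ", text)     # digits/punctuation -> spaces
    return re.sub(r"\s+", " ", text).strip().lower()

print(normalize("Fix jax_lax_operators.test_jax_nextafter"))
# -> fix jax lax operators test jax nextafter
```

Applied to the title of the record above, it reproduces the start of that record's normalized field; the body continues in the same way.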
84,887
| 24,461,661,513
|
IssuesEvent
|
2022-10-07 11:44:18
|
microsoft/vscode
|
https://api.github.com/repos/microsoft/vscode
|
closed
|
build: Egress is over the account limit.
|
vscode-build
|
```
[13:23:42] 'vscode-win32-x64-min-ci' errored after 11 s
[13:23:42] HTTPError: Response code 503 (Egress is over the account limit.)
at EventEmitter.<anonymous> (D:\a\_work\1\s\node_modules\got\source\as-stream.js:35:24)
at EventEmitter.emit (node:events:513:28)
at EventEmitter.emit (node:domain:552:15)
at module.exports (D:\a\_work\1\s\node_modules\got\source\get-response.js:22:10)
at ClientRequest.handleResponse (D:\a\_work\1\s\node_modules\got\source\request-as-event-emitter.js:155:5)
at Object.onceWrapper (node:events:628:26)
at ClientRequest.emit (node:events:525:35)
at ClientRequest.emit (node:domain:552:15)
at ClientRequest.origin.emit (D:\a\_work\1\s\node_modules\@szmarczak\http-timer\source\index.js:37:11)
at HTTPParser.parserOnIncomingClient [as onIncoming] (node:_http_client:674:27)
error Command failed with exit code 1.
```
Refs https://dev.azure.com/monacotools/Monaco/_build/results?buildId=186983&view=results
|
1.0
|
build: Egress is over the account limit. - ```
[13:23:42] 'vscode-win32-x64-min-ci' errored after 11 s
[13:23:42] HTTPError: Response code 503 (Egress is over the account limit.)
at EventEmitter.<anonymous> (D:\a\_work\1\s\node_modules\got\source\as-stream.js:35:24)
at EventEmitter.emit (node:events:513:28)
at EventEmitter.emit (node:domain:552:15)
at module.exports (D:\a\_work\1\s\node_modules\got\source\get-response.js:22:10)
at ClientRequest.handleResponse (D:\a\_work\1\s\node_modules\got\source\request-as-event-emitter.js:155:5)
at Object.onceWrapper (node:events:628:26)
at ClientRequest.emit (node:events:525:35)
at ClientRequest.emit (node:domain:552:15)
at ClientRequest.origin.emit (D:\a\_work\1\s\node_modules\@szmarczak\http-timer\source\index.js:37:11)
at HTTPParser.parserOnIncomingClient [as onIncoming] (node:_http_client:674:27)
error Command failed with exit code 1.
```
Refs https://dev.azure.com/monacotools/Monaco/_build/results?buildId=186983&view=results
|
non_test
|
build egress is over the account limit vscode min ci errored after s httperror response code egress is over the account limit at eventemitter d a work s node modules got source as stream js at eventemitter emit node events at eventemitter emit node domain at module exports d a work s node modules got source get response js at clientrequest handleresponse d a work s node modules got source request as event emitter js at object oncewrapper node events at clientrequest emit node events at clientrequest emit node domain at clientrequest origin emit d a work s node modules szmarczak http timer source index js at httpparser parseronincomingclient node http client error command failed with exit code refs
| 0
|
18,015
| 3,663,254,750
|
IssuesEvent
|
2016-02-19 04:22:55
|
nodejs/node
|
https://api.github.com/repos/nodejs/node
|
closed
|
test-debug-no-context segfaulting
|
arm debugger lts-watch-v4.x test windows
|
It looks like there may be more needed than 25776f3ea14 to keep `test-debug-no-context` from segfaulting as I just noticed tonight that it segfaulted on two separate occasions ([here](https://ci.nodejs.org/job/node-test-binary-windows/335/RUN_SUBSET=0,VS_VERSION=vs2015,label=win10/console) and [here](https://ci.nodejs.org/job/node-test-binary-windows/336/RUN_SUBSET=0,VS_VERSION=vs2015,label=win10/console)).
/cc @bnoordhuis @jasnell @evanlucas @nodejs/platform-windows
|
1.0
|
test-debug-no-context segfaulting - It looks like there may be more needed than 25776f3ea14 to keep `test-debug-no-context` from segfaulting as I just noticed tonight that it segfaulted on two separate occasions ([here](https://ci.nodejs.org/job/node-test-binary-windows/335/RUN_SUBSET=0,VS_VERSION=vs2015,label=win10/console) and [here](https://ci.nodejs.org/job/node-test-binary-windows/336/RUN_SUBSET=0,VS_VERSION=vs2015,label=win10/console)).
/cc @bnoordhuis @jasnell @evanlucas @nodejs/platform-windows
|
test
|
test debug no context segfaulting it looks like there may be more needed than to keep test debug no context from segfaulting as i just noticed tonight that it segfaulted on two separate occasions and cc bnoordhuis jasnell evanlucas nodejs platform windows
| 1
|
155,010
| 19,765,645,096
|
IssuesEvent
|
2022-01-17 01:38:42
|
tuanducdesign/infomation-covid19
|
https://api.github.com/repos/tuanducdesign/infomation-covid19
|
closed
|
CVE-2021-23386 (Medium) detected in dns-packet-1.3.1.tgz - autoclosed
|
security vulnerability
|
## CVE-2021-23386 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dns-packet-1.3.1.tgz</b></p></summary>
<p>An abstract-encoding compliant module for encoding / decoding DNS packets</p>
<p>Library home page: <a href="https://registry.npmjs.org/dns-packet/-/dns-packet-1.3.1.tgz">https://registry.npmjs.org/dns-packet/-/dns-packet-1.3.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/dns-packet/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.3.tgz (Root Library)
- webpack-dev-server-3.11.0.tgz
- bonjour-3.5.0.tgz
- multicast-dns-6.2.3.tgz
- :x: **dns-packet-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/tuanducdesign/infomation-covid19/commit/ee46c4feb6053c8559c4199d4973822a30c8219f">ee46c4feb6053c8559c4199d4973822a30c8219f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package dns-packet before 5.2.2. It creates buffers with allocUnsafe and does not always fill them before forming network packets. This can expose internal application memory over unencrypted network when querying crafted invalid domain names.
<p>Publish Date: 2021-05-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23386>CVE-2021-23386</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23386">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23386</a></p>
<p>Release Date: 2021-05-20</p>
<p>Fix Resolution: dns-packet - 5.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2021-23386 (Medium) detected in dns-packet-1.3.1.tgz - autoclosed - ## CVE-2021-23386 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>dns-packet-1.3.1.tgz</b></p></summary>
<p>An abstract-encoding compliant module for encoding / decoding DNS packets</p>
<p>Library home page: <a href="https://registry.npmjs.org/dns-packet/-/dns-packet-1.3.1.tgz">https://registry.npmjs.org/dns-packet/-/dns-packet-1.3.1.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/dns-packet/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-3.4.3.tgz (Root Library)
- webpack-dev-server-3.11.0.tgz
- bonjour-3.5.0.tgz
- multicast-dns-6.2.3.tgz
- :x: **dns-packet-1.3.1.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/tuanducdesign/infomation-covid19/commit/ee46c4feb6053c8559c4199d4973822a30c8219f">ee46c4feb6053c8559c4199d4973822a30c8219f</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects the package dns-packet before 5.2.2. It creates buffers with allocUnsafe and does not always fill them before forming network packets. This can expose internal application memory over unencrypted network when querying crafted invalid domain names.
<p>Publish Date: 2021-05-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23386>CVE-2021-23386</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23386">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2021-23386</a></p>
<p>Release Date: 2021-05-20</p>
<p>Fix Resolution: dns-packet - 5.2.2</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in dns packet tgz autoclosed cve medium severity vulnerability vulnerable library dns packet tgz an abstract encoding compliant module for encoding decoding dns packets library home page a href path to dependency file package json path to vulnerable library node modules dns packet package json dependency hierarchy react scripts tgz root library webpack dev server tgz bonjour tgz multicast dns tgz x dns packet tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects the package dns packet before it creates buffers with allocunsafe and does not always fill them before forming network packets this can expose internal application memory over unencrypted network when querying crafted invalid domain names publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution dns packet step up your open source security game with whitesource
| 0
|
145,947
| 11,715,946,606
|
IssuesEvent
|
2020-03-09 14:54:55
|
phetsims/wave-interference
|
https://api.github.com/repos/phetsims/wave-interference
|
opened
|
CT There is an issue with the licenses for media types
|
type:automated-testing
|
Also in Intro
```
wave-interference : build : phet
Failure to grunt snapshot-1583748161897/wave-interference:
Running "lint-all" task
Running "report-media" task
>> not-annotated: wave-interference/sounds/slider-click-v2-left.mp3
>> not-annotated: wave-interference/sounds/slider-click-v2-right.mp3
Fatal error: There is an issue with the licenses for media types.
Approximately 3/9/2020, 4:02:41 AM
```
|
1.0
|
CT There is an issue with the licenses for media types - Also in Intro
```
wave-interference : build : phet
Failure to grunt snapshot-1583748161897/wave-interference:
Running "lint-all" task
Running "report-media" task
>> not-annotated: wave-interference/sounds/slider-click-v2-left.mp3
>> not-annotated: wave-interference/sounds/slider-click-v2-right.mp3
Fatal error: There is an issue with the licenses for media types.
Approximately 3/9/2020, 4:02:41 AM
```
|
test
|
ct there is an issue with the licenses for media types also in intro wave interference build phet failure to grunt snapshot wave interference running lint all task running report media task not annotated wave interference sounds slider click left not annotated wave interference sounds slider click right fatal error there is an issue with the licenses for media types approximately am
| 1
|
127,851
| 10,490,807,842
|
IssuesEvent
|
2019-09-25 09:44:26
|
Joystream/joystream
|
https://api.github.com/repos/Joystream/joystream
|
opened
|
WIP: Create a repo for the new Joystream Content System
|
release-plan rome-testnet suggestions
|
# Background
When Rome goes live, our current content system will be replaced by a new, "dynamic, flexible and user friendly content directory". See #97 for tasks related to this.
Although different actors, ie. `content curators`, `content creators` and content consumers, can and will require different levels of understanding, we need to provide the resources for the current and future community to understand the system.
# Proposal
Create a repo that contains:
- high level explanation of the system
- workflow for publishing content
- a place to request improvements, request new classes, properties, entities etc.
- up-to-date overview of all available schemas
- resources for curators to create new JSON schemas
Some of this would overlap with what the [helpdesk](https://github.com/JoyStream/helpdesk) repo is meant to do, so it might make sense to keep the high level stuff in there.
Not sure if I should propose the structure in this repo, or create a proposal for review.
Name:
`Joystream/joystream-content-directory`
|
1.0
|
WIP: Create a repo for the new Joystream Content System - # Background
When Rome goes live, our current content system will be replaced by a new, "dynamic, flexible and user friendly content directory". See #97 for tasks related to this.
Although different actors, ie. `content curators`, `content creators` and content consumers, can and will require different levels of understanding, we need to provide the resources for the current and future community to understand the system.
# Proposal
Create a repo that contains:
- high level explanation of the system
- workflow for publishing content
- a place to request improvements, request new classes, properties, entities etc.
- up-to-date overview of all available schemas
- resources for curators to create new JSON schemas
Some of this would overlap with what the [helpdesk](https://github.com/JoyStream/helpdesk) repo is meant to do, so it might make sense to keep the high level stuff in there.
Not sure if I should propose the structure in this repo, or create a proposal for review.
Name:
`Joystream/joystream-content-directory`
|
test
|
wip create a repo for the new joystream content system background when rome goes live our current content system will be replaced by a new dynamic flexible and user friendly content directory see for tasks related to this although different actors ie content curators content creators and content consumers can and will require different levels of understanding we need to provide the resources for the current and future community to understand the system proposal create a repo that contains high level explanation of the system workflow for publishing content a place to request improvements request new classes properties entities etc up to date overview of all available schemas resources for curators to create new json schemas some of this would overlap with what the repo is meant to do so it might make sense to keep the high level stuff in there not sure if i should propose the structure in this repo or create a proposal for review name joystream joystream content directory
| 1
|
208,585
| 16,130,446,911
|
IssuesEvent
|
2021-04-29 03:16:55
|
baagaard-usgs/geomodelgrids
|
https://api.github.com/repos/baagaard-usgs/geomodelgrids
|
closed
|
Update documentation for Borehole application
|
documentation
|
* Column headers in output
* Add units to output header
|
1.0
|
Update documentation for Borehole application - * Column headers in output
* Add units to output header
|
non_test
|
update documentation for borehole application column headers in output add units to output header
| 0
|
97,056
| 10,980,366,233
|
IssuesEvent
|
2019-11-30 13:48:59
|
Students-of-the-city-of-Kostroma/Ray-of-hope
|
https://api.github.com/repos/Students-of-the-city-of-Kostroma/Ray-of-hope
|
closed
|
Working with server entity descriptions
|
Documentation Epic InTesting O8 PR4 Sprint 5 Sprint 6 Sprint 7
|
The main entities that will exist on the server are described.
https://drive.google.com/drive/folders/18JOX9crY4tql8sDR8QqhZZwwZIUUcS6q
Of interest is the "Entity structure" diagram.
It is written not in UML notation; rather, it is a planning diagram of the form
"What the object means for us at the moment, which fields we need
(are important, necessary to describe the object at this stage), and which we do not."
As an example -- at the top level of a User we are only interested in the fact
that it has a global Profile entity, because every User must have a profile.
But later, when we talk about a given User, and specifically about their Profile, we understand that every User must have a name (title).
In addition, there is a file with a minimal description of the proposed abstraction levels, "Entity structure explanation".
Task:
(IMPORTANT) Understand the principles behind both the diagram and the documentation. Please.
Test the documentation developed for the main entities accordingly.
Identify the places not written up to the required level.
If necessary, identify unintuitively named additional global entities that require the same kind of description.
|
1.0
|
Working with server entity descriptions - The main entities that will exist on the server are described.
https://drive.google.com/drive/folders/18JOX9crY4tql8sDR8QqhZZwwZIUUcS6q
Of interest is the "Entity structure" diagram.
It is written not in UML notation; rather, it is a planning diagram of the form
"What the object means for us at the moment, which fields we need
(are important, necessary to describe the object at this stage), and which we do not."
As an example -- at the top level of a User we are only interested in the fact
that it has a global Profile entity, because every User must have a profile.
But later, when we talk about a given User, and specifically about their Profile, we understand that every User must have a name (title).
In addition, there is a file with a minimal description of the proposed abstraction levels, "Entity structure explanation".
Task:
(IMPORTANT) Understand the principles behind both the diagram and the documentation. Please.
Test the documentation developed for the main entities accordingly.
Identify the places not written up to the required level.
If necessary, identify unintuitively named additional global entities that require the same kind of description.
|
non_test
|
working with server entity descriptions the main entities that will exist on the server are described of interest is the entity structure diagram it is written not in uml notation rather it is a planning diagram of the form what the object means for us at the moment which fields we need important necessary to describe the object at this stage and which not as an example at the top level of a user we are only interested in the fact that it has a global profile entity because every user must have a profile but later when we talk about a given user and specifically about their profile we understand that every user must have a name title in addition there is a file with a minimal description of the proposed abstraction levels entity structure explanation task important understand the principles behind both the diagram and the documentation please test the documentation developed for the main entities accordingly identify the places not written up to the required level if necessary identify unintuitively named additional global entities that require the same kind of description
| 0
|
114,322
| 11,843,162,297
|
IssuesEvent
|
2020-03-24 01:23:54
|
RPHolloway/expertiza
|
https://api.github.com/repos/RPHolloway/expertiza
|
opened
|
Wiki must include a 'Test Plan' section
|
documentation
|
Over the approach of all existing and forthcoming tests.
|
1.0
|
Wiki must include a 'Test Plan' section - Over the approach of all existing and forthcoming tests.
|
non_test
|
wiki must include a test plan section over the approach of all existing and forthcoming tests
| 0
|
62,372
| 6,795,012,987
|
IssuesEvent
|
2017-11-01 14:23:21
|
EnMasseProject/enmasse
|
https://api.github.com/repos/EnMasseProject/enmasse
|
closed
|
System-tests: brokered: create test with scaling broker deployment up/down
|
test development
|
- add this functionality
- extend already created tests or develop new tests with scaling
|
1.0
|
System-tests: brokered: create test with scaling broker deployment up/down - - add this functionality
- extend already created tests or develop new tests with scaling
|
test
|
system tests brokered create test with scaling broker deployment up down add this functionality extend already created tests or develop new tests with scaling
| 1
|
24,169
| 4,063,839,977
|
IssuesEvent
|
2016-05-26 02:18:22
|
briansmith/ring
|
https://api.github.com/repos/briansmith/ring
|
opened
|
Figure out what to do with x86 Poly1305 AVX implementation
|
performance static-analysis-and-type-safety test-coverage
|
This is about the `$avx` flag in crypto/poly1305/asm/poly1305-x86.pl. Currently it is set to zero. It could be set to 2.
BoringSSL has it disabled ("0") because they can't generate their test coverage reports for it and so they don't trust the code.
This code also seems to use only this build-time flag, and not runtime feature detection, to determine whether to try the AVX code. That seems wrong.
Maybe the best solution is to stick to the SSE2 code and then just delete the AVX code.
|
1.0
|
Figure out what to do with x86 Poly1305 AVX implementation - This is about the `$avx` flag in crypto/poly1305/asm/poly1305-x86.pl. Currently it is set to zero. It could be set to 2.
BoringSSL has it disabled ("0") because they can't generate their test coverage reports for it and so they don't trust the code.
This code also seems to use only this build-time flag, and not runtime feature detection, to determine whether to try the AVX code. That seems wrong.
Maybe the best solution is to stick to the SSE2 code and then just delete the AVX code.
|
test
|
figure out what to do with avx implementation this is about the avx flag in crypto asm pl currently it is set to zero it could be set to boringssl has it disabled because they can t generate their test coverage reports for it and so they don t trust the code this code also seems to use only this build time flag and not runtime feature detection to determine whether to try the avx code that seems wrong maybe the best solution is to stick to the code and then just delete the avx code
| 1
|
155,847
| 12,279,847,479
|
IssuesEvent
|
2020-05-08 13:03:38
|
kiwicom/schemathesis
|
https://api.github.com/repos/kiwicom/schemathesis
|
closed
|
Reduce usage of mocks in CLI tests
|
Priority: Low Type: Testing
|
It will be problematic to test on Windows when there will be a subprocess runner
|
1.0
|
Reduce usage of mocks in CLI tests - It will be problematic to test on Windows when there will be a subprocess runner
|
test
|
reduce usage of mocks in cli tests it will be problematic to test on windows when there will be a subprocess runner
| 1
|
138,337
| 11,199,302,903
|
IssuesEvent
|
2020-01-03 18:19:57
|
Singaporee/Singapore
|
https://api.github.com/repos/Singaporee/Singapore
|
closed
|
Recording service status of an Asset
|
Check Tests needed
|
<!-- Any section or subsection that is not applicable should simply be removed -->
<!-- Problem/feature description including what needs to be implemented -->
### Description
This issue creates the opportunity for a user to record the service status of an `Asset` in order to know the status of it, where it is, and whether it is currently in service and what maintenance it requires.
Steps to be completed:
- [x] Implement an entity `ServiceStatus` with the following properties:
- `name: String` -- a single key member that would identify the `Service Status` of an `Asset`.
- `desc: String` -- should be made required by using annotation `@DescRequired`.
- [x] Create `ServiceStatus` Master
- [x] Create `ServiceStatus` Centre
- [x] Associate `ServiceStatus` class to `Asset` class
### Functional Requirements
- `Asset` is associated with relevant `ServiceStatus`
- date/time of the association of change is recorded
<!-- Provided the benefits of implementing this issue -->
### Expected outcome
User is able to record the service status of an `Asset` in order to perform the tasks described in 'Description'.
|
1.0
|
Recording service status of an Asset - <!-- Any section or subsection that is not applicable should simply be removed -->
<!-- Problem/feature description including what needs to be implemented -->
### Description
This issue creates the opportunity for a user to record the service status of an `Asset` in order to know the status of it, where it is, and whether it is currently in service and what maintenance it requires.
Steps to be completed:
- [x] Implement an entity `ServiceStatus` with the following properties:
- `name: String` -- a single key member that would identify the `Service Status` of an `Asset`.
- `desc: String` -- should be made required by using annotation `@DescRequired`.
- [x] Create `ServiceStatus` Master
- [x] Create `ServiceStatus` Centre
- [x] Associate `ServiceStatus` class to `Asset` class
### Functional Requirements
- `Asset` is associated with relevant `ServiceStatus`
- date/time of the association of change is recorded
<!-- Provided the benefits of implementing this issue -->
### Expected outcome
User is able to record the service status of an `Asset` in order to perform the tasks described in 'Description'.
|
test
|
recording service status of an asset description this issue creates the opportunity for a user to record the service status of an asset in order to know the status of it where it is and whether it is currently in service and what maintenance it requires steps to be completed implement an entity servicestatus with the following properties name string a single key member that would identify the service status of an asset desc string should be made required by using annotation descrequired create servicestatus master create servicestatus centre associate servicestatus class to asset class functional requirements asset is associated with relevant servicestatus date time of the association of change is recorded expected outcome user is able to record the service status of an asset in order to perform the tasks described in description
| 1
|
445,403
| 31,236,972,352
|
IssuesEvent
|
2023-08-20 12:00:04
|
yop-me/IoTProyecto
|
https://api.github.com/repos/yop-me/IoTProyecto
|
closed
|
Functional requirements
|
documentation
|
Implement the following:
- Remote control of devices
- On/off scheduling
- Monitoring of device status
|
1.0
|
Functional requirements - Implement the following:
- Remote control of devices
- On/off scheduling
- Monitoring of device status
|
non_test
|
functional requirements implement the following remote control of devices on off scheduling monitoring of device status
| 0
|
107,247
| 13,446,095,894
|
IssuesEvent
|
2020-09-08 12:28:09
|
flutter/flutter
|
https://api.github.com/repos/flutter/flutter
|
closed
|
In Stocks app, switching tabs should remember ListView scroll position
|
f: material design framework team team: gallery
|
## Steps to Reproduce
Run stocks example app. Scroll in the market tab. Switch to portfolio tab and then again back to market tab. The market tab listview is scrolled to the top, instead of remembering the previous scroll position.
## Flutter Doctor
```
[✓] Flutter (on Mac OS X 10.12.5 16F73, locale en-US, channel alpha)
• Flutter at /Users/amardeep/soft/flutter
• Framework revision 1c372c6803 (4 days ago), 2017-08-31 15:54:45 -0700
• Engine revision f9e00a7c72
• Tools Dart version 1.25.0-dev.11.0
[✓] Android toolchain - develop for Android devices (Android SDK 26.0.1)
• Android SDK at /Users/amardeep/Library/Android/sdk
• Platform android-26, build-tools 26.0.1
• ANDROID_HOME = /Users/amardeep/Library/Android/sdk
• Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b06)
[✓] iOS toolchain - develop for iOS devices (Xcode 8.3.3)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Xcode 8.3.3, Build version 8E3004b
• ios-deploy 1.9.1
• CocoaPods version 1.2.1
[✓] Android Studio
• Android Studio at /Applications/Android Studio 3.0 Preview.app/Contents
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b736)
[✓] Android Studio (version 2.3)
• Android Studio at /Applications/Android Studio.app/Contents
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b06)
[✓] Connected devices
• iPhone 6s • 25A5BCDD-C0A9-47E2-854A-E4E395186423 • ios • iOS 10.3 (simulator)
```
|
1.0
|
In Stocks app, switching tabs should remember ListView scroll position - ## Steps to Reproduce
Run stocks example app. Scroll in the market tab. Switch to portfolio tab and then again back to market tab. The market tab listview is scrolled to the top, instead of remembering the previous scroll position.
## Flutter Doctor
```
[✓] Flutter (on Mac OS X 10.12.5 16F73, locale en-US, channel alpha)
• Flutter at /Users/amardeep/soft/flutter
• Framework revision 1c372c6803 (4 days ago), 2017-08-31 15:54:45 -0700
• Engine revision f9e00a7c72
• Tools Dart version 1.25.0-dev.11.0
[✓] Android toolchain - develop for Android devices (Android SDK 26.0.1)
• Android SDK at /Users/amardeep/Library/Android/sdk
• Platform android-26, build-tools 26.0.1
• ANDROID_HOME = /Users/amardeep/Library/Android/sdk
• Java binary at: /Applications/Android Studio.app/Contents/jre/jdk/Contents/Home/bin/java
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b06)
[✓] iOS toolchain - develop for iOS devices (Xcode 8.3.3)
• Xcode at /Applications/Xcode.app/Contents/Developer
• Xcode 8.3.3, Build version 8E3004b
• ios-deploy 1.9.1
• CocoaPods version 1.2.1
[✓] Android Studio
• Android Studio at /Applications/Android Studio 3.0 Preview.app/Contents
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b736)
[✓] Android Studio (version 2.3)
• Android Studio at /Applications/Android Studio.app/Contents
• Java version OpenJDK Runtime Environment (build 1.8.0_112-release-b06)
[✓] Connected devices
• iPhone 6s • 25A5BCDD-C0A9-47E2-854A-E4E395186423 • ios • iOS 10.3 (simulator)
```
|
non_test
|
in stocks app switching tabs should remember listview scroll position steps to reproduce run stocks example app scroll in the market tab switch to portfolio tab and then again back to market tab the market tab listview is scrolled to the top instead of remembering the previous scroll position flutter doctor flutter on mac os x locale en us channel alpha • flutter at users amardeep soft flutter • framework revision days ago • engine revision • tools dart version dev android toolchain develop for android devices android sdk • android sdk at users amardeep library android sdk • platform android build tools • android home users amardeep library android sdk • java binary at applications android studio app contents jre jdk contents home bin java • java version openjdk runtime environment build release ios toolchain develop for ios devices xcode • xcode at applications xcode app contents developer • xcode build version • ios deploy • cocoapods version android studio • android studio at applications android studio preview app contents • java version openjdk runtime environment build release android studio version • android studio at applications android studio app contents • java version openjdk runtime environment build release connected devices • iphone • • ios • ios simulator
| 0
|
219,962
| 17,139,199,966
|
IssuesEvent
|
2021-07-13 07:42:28
|
milvus-io/milvus
|
https://api.github.com/repos/milvus-io/milvus
|
opened
|
Support multi clients in test runners
|
sig/testing
|
**Please state your issue using the following template and, most importantly, in English.**
Support multi test clients in locust runner, used in load testing.
**What would you like to be added ?**
**Why is this needed ?**
|
1.0
|
Support multi clients in test runners - **Please state your issue using the following template and, most importantly, in English.**
Support multi test clients in locust runner, used in load testing.
**What would you like to be added ?**
**Why is this needed ?**
|
test
|
support multi clients in test runners please state your issue using the following template and most importantly in english support multi test clients in locust runner used in load testing what would you like to be added why is this needed
| 1
|
103,430
| 4,172,985,704
|
IssuesEvent
|
2016-06-21 08:54:14
|
fac-freelancers/website
|
https://api.github.com/repos/fac-freelancers/website
|
closed
|
Accessibility
|
in-progress priority-2 T1h
|
Need to be careful of using features of JS that aren't universally supported. Both http://kangax.github.io/compat-table/es6 and caniuse.com should be your bible. If you really want these things, you must include polyfills.
You should also be doing cross-browser testing, and across older versions, not just the version of chrome that you're using.
Examples:
- [x] Use of `const`
- [x] Use of `Array.from`
|
1.0
|
Accessibility - Need to be careful of using features of JS that aren't universally supported. Both http://kangax.github.io/compat-table/es6 and caniuse.com should be your bible. If you really want these things, you must include polyfills.
You should also be doing cross-browser testing, and across older versions, not just the version of chrome that you're using.
Examples:
- [x] Use of `const`
- [x] Use of `Array.from`
|
non_test
|
accessibility need to be careful of using features of js that aren t universally supported both and caniuse com should be your bible if you really want these things you must include polyfills you should also be doing cross browser testing and across older versions not just the version of chrome that you re using examples use of const use of array from
| 0
|
232,880
| 18,921,265,729
|
IssuesEvent
|
2021-11-17 02:09:30
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachtest: tpccbench/nodes=3/cpu=16/no-admission failed [perm denied #72635]
|
C-test-failure O-robot O-roachtest branch-master release-blocker
|
roachtest.tpccbench/nodes=3/cpu=16/no-admission [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3705715&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3705715&tab=artifacts#/tpccbench/nodes=3/cpu=16/no-admission) on master @ [2de17e7fbe66e14039fc7969a76139625761438f](https://github.com/cockroachdb/cockroach/commits/2de17e7fbe66e14039fc7969a76139625761438f):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/tpccbench/nodes=3/cpu=16/no-admission/run_1
cluster.go:1856,tpcc.go:1125,tpcc.go:1135,search.go:43,search.go:173,tpcc.go:1131,tpcc.go:905,test_runner.go:777: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --encrypt=false teamcity-3705715-1636529870-25-n4cpu16:1-3 returned: exit status 7
(1) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --encrypt=false teamcity-3705715-1636529870-25-n4cpu16:1-3 returned
| stderr:
|
| stdout:
| <... some data truncated by circular buffer; go to artifacts for details ...>
| nc9(0x1a19940, 0xc00007c4a0, 0x1, 0x2, 0xc0006bfa10, 0xc0006bfa38)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:457 +0xdf
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !main.wrap.func1(0x1a19940, 0xc00007c4a0, 0x1, 0x2)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:123 +0x6b
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !github.com/spf13/cobra.(*Command).execute(0x1a19940, 0xc00007c480, 0x2, 0x2, 0x1a19940, 0xc00007c480)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:856 +0x2c2
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !github.com/spf13/cobra.(*Command).ExecuteC(0x1a196c0, 0x0, 0x0, 0xc0004cc8c0)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:960 +0x375
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !github.com/spf13/cobra.(*Command).Execute(...)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:897
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !main.main()
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1170 +0x26a5
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !****************************************************************************
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !This node experienced a fatal error (printed above), and as a result the
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !process is terminating.
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !Fatal errors can occur due to faulty hardware (disks, memory, clocks) or a
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !problem in CockroachDB. With your help, the support team at Cockroach Labs
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !will try to determine the root cause, recommend next steps, and we can
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !improve CockroachDB based on your report.
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !Please submit a crash report by following the instructions here:
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! https://github.com/cockroachdb/cockroach/issues/new/choose
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !If you would rather not post publicly, please contact us directly at:
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! support@cockroachlabs.com
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !The Cockroach Labs team appreciates your feedback.
Wraps: (2) exit status 7
Error types: (1) *cluster.WithCommandDetails (2) *exec.ExitError
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
|
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*tpccbench/nodes=3/cpu=16/no-admission.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: tpccbench/nodes=3/cpu=16/no-admission failed [perm denied #72635] - roachtest.tpccbench/nodes=3/cpu=16/no-admission [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3705715&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3705715&tab=artifacts#/tpccbench/nodes=3/cpu=16/no-admission) on master @ [2de17e7fbe66e14039fc7969a76139625761438f](https://github.com/cockroachdb/cockroach/commits/2de17e7fbe66e14039fc7969a76139625761438f):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/tpccbench/nodes=3/cpu=16/no-admission/run_1
cluster.go:1856,tpcc.go:1125,tpcc.go:1135,search.go:43,search.go:173,tpcc.go:1131,tpcc.go:905,test_runner.go:777: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --encrypt=false teamcity-3705715-1636529870-25-n4cpu16:1-3 returned: exit status 7
(1) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod start --encrypt=false teamcity-3705715-1636529870-25-n4cpu16:1-3 returned
| stderr:
|
| stdout:
| <... some data truncated by circular buffer; go to artifacts for details ...>
| nc9(0x1a19940, 0xc00007c4a0, 0x1, 0x2, 0xc0006bfa10, 0xc0006bfa38)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:457 +0xdf
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !main.wrap.func1(0x1a19940, 0xc00007c4a0, 0x1, 0x2)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:123 +0x6b
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !github.com/spf13/cobra.(*Command).execute(0x1a19940, 0xc00007c480, 0x2, 0x2, 0x1a19940, 0xc00007c480)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:856 +0x2c2
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !github.com/spf13/cobra.(*Command).ExecuteC(0x1a196c0, 0x0, 0x0, 0xc0004cc8c0)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:960 +0x375
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !github.com/spf13/cobra.(*Command).Execute(...)
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:897
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !main.main()
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1170 +0x26a5
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !****************************************************************************
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !This node experienced a fatal error (printed above), and as a result the
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !process is terminating.
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !Fatal errors can occur due to faulty hardware (disks, memory, clocks) or a
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !problem in CockroachDB. With your help, the support team at Cockroach Labs
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !will try to determine the root cause, recommend next steps, and we can
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !improve CockroachDB based on your report.
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !Please submit a crash report by following the instructions here:
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! https://github.com/cockroachdb/cockroach/issues/new/choose
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !If you would rather not post publicly, please contact us directly at:
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 ! support@cockroachlabs.com
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !
| F211110 09:37:39.259452 1 roachprod/install/cluster_synced.go:1777 [-] 2 !The Cockroach Labs team appreciates your feedback.
Wraps: (2) exit status 7
Error types: (1) *cluster.WithCommandDetails (2) *exec.ExitError
```
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
|
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/kv-triage
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*tpccbench/nodes=3/cpu=16/no-admission.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
roachtest tpccbench nodes cpu no admission failed roachtest tpccbench nodes cpu no admission with on master the test failed on branch master cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts tpccbench nodes cpu no admission run cluster go tpcc go tpcc go search go search go tpcc go tpcc go test runner go home agent work go src github com cockroachdb cockroach bin roachprod start encrypt false teamcity returned exit status home agent work go src github com cockroachdb cockroach bin roachprod start encrypt false teamcity returned stderr stdout roachprod install cluster synced go home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go roachprod install cluster synced go main wrap roachprod install cluster synced go home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go roachprod install cluster synced go github com cobra command execute roachprod install cluster synced go home agent work go src github com cockroachdb cockroach vendor github com cobra command go roachprod install cluster synced go github com cobra command executec roachprod install cluster synced go home agent work go src github com cockroachdb cockroach vendor github com cobra command go roachprod install cluster synced go github com cobra command execute roachprod install cluster synced go home agent work go src github com cockroachdb cockroach vendor github com cobra command go roachprod install cluster synced go main main roachprod install cluster synced go home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go roachprod install cluster synced go roachprod install cluster synced go roachprod install cluster synced go roachprod install cluster synced go roachprod install cluster synced go this node experienced a fatal error printed above and as a result the roachprod install cluster synced go process is terminating roachprod install cluster synced go roachprod install cluster synced go fatal errors can occur due to faulty hardware disks memory clocks or a roachprod install cluster synced go problem in cockroachdb with your help the support team at cockroach labs roachprod install cluster synced go will try to determine the root cause recommend next steps and we can roachprod install cluster synced go improve cockroachdb based on your report roachprod install cluster synced go roachprod install cluster synced go please submit a crash report by following the instructions here roachprod install cluster synced go roachprod install cluster synced go roachprod install cluster synced go roachprod install cluster synced go if you would rather not post publicly please contact us directly at roachprod install cluster synced go roachprod install cluster synced go support cockroachlabs com roachprod install cluster synced go roachprod install cluster synced go the cockroach labs team appreciates your feedback wraps exit status error types cluster withcommanddetails exec exiterror help see see cc cockroachdb kv triage
| 1
|
26,122
| 26,458,811,460
|
IssuesEvent
|
2023-01-16 15:57:51
|
ElektraInitiative/libelektra
|
https://api.github.com/repos/ElektraInitiative/libelektra
|
reopened
|
COW metadata / no metadata on metakeys
|
bug usability stale
|
Because metadata on meta keys is not allowed we can use Copy-On-Write semantics for metadata. Therefore all keys of the namespace `KEY_NS_META` should _automatically_ have the flags `KEY_FLAG_RO_NAME`, `KEY_FLAG_RO_VALUE` and `KEY_FLAG_RO_META` set _during `keyVNew`_.
This ensures that `keySetMeta` will fail. It also avoids any problems with `ksDeepDup` (#3609), since the need for deep-copies of metadata is removed. Therefore `keyDup` (#3606) can just create a shallow-copy of the metadata without any further implications.
Additionally, `KEY_NS_META` keys should at all times fulfill `key->meta == NULL` and `keyMeta()` should return `NULL` instead of allocating a new `KeySet`.
|
True
|
COW metadata / no metadata on metakeys - Because metadata on meta keys is not allowed we can use Copy-On-Write semantics for metadata. Therefore all keys of the namespace `KEY_NS_META` should _automatically_ have the flags `KEY_FLAG_RO_NAME`, `KEY_FLAG_RO_VALUE` and `KEY_FLAG_RO_META` set _during `keyVNew`_.
This ensures that `keySetMeta` will fail. It also avoids any problems with `ksDeepDup` (#3609), since the need for deep-copies of metadata is removed. Therefore `keyDup` (#3606) can just create a shallow-copy of the metadata without any further implications.
Additionally, `KEY_NS_META` keys should at all times fulfill `key->meta == NULL` and `keyMeta()` should return `NULL` instead of allocating a new `KeySet`.
|
non_test
|
cow metadata no metadata on metakeys because metadata on meta keys is not allowed we can use copy on write semantics for metadata therefore all keys of the namespace key ns meta should automatically have the flags key flag ro name key flag ro value and key flag ro meta set during keyvnew this ensures that keysetmeta will fail it also avoids any problems with ksdeepdup since the need for deep copies of metadata is removed therefore keydup can just create a shallow copy of the metadata without any further implications additionally key ns meta keys should at all times fulfill key meta null and keymeta should return null instead of allocating a new keyset
| 0
|
123,930
| 10,291,673,671
|
IssuesEvent
|
2019-08-27 13:00:27
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
teamcity: failed test: _sequence_direct=false
|
C-test-failure O-robot
|
The following tests appear to have failed on master (testrace): _sequence_direct=false
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+_sequence_direct=false).
[#1452028](https://teamcity.cockroachdb.com/viewLog.html?buildId=1452028):
```
_sequence_direct=false
--- FAIL: testrace/TestImportData/PGDUMP:_sequence_direct=false (0.000s)
Test ended in panic.
------- Stdout: -------
I190823 23:16:23.525552 475 sql/event_log.go:130 [n1,client=127.0.0.1:51950,user=root] Event: "create_database", target: 140, info: {DatabaseName:d44 Statement:CREATE DATABASE d44 User:root}
I190823 23:16:23.838809 24934 storage/replica_command.go:284 [n1,s1,r114/1:/{Table/139/1-Max}] initiating a split of this range at key /Table/141/1 [r116] (manual)
I190823 23:16:23.873627 24933 ccl/importccl/read_import_base.go:66 [n1,import-distsql-ingest] could not fetch file size; falling back to per-file progress: bad ContentLength: -1
I190823 23:16:23.949883 24934 storage/replica_command.go:284 [n1,s1,r116/1:/{Table/141/1-Max}] initiating a split of this range at key /Table/142/1 [r117] (manual)
W190823 23:16:24.096938 24934 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
W190823 23:16:24.112067 142 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
W190823 23:16:24.112522 142 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
I190823 23:16:24.207236 24923 storage/replica_command.go:284 [n1,split,s1,r116/1:/Table/14{1/1-2/1}] initiating a split of this range at key /Table/142 [r118] (zone config)
I190823 23:16:24.271581 24942 storage/replica_command.go:284 [n1,split,s1,r114/1:/Table/1{39/1-41/1}] initiating a split of this range at key /Table/141 [r119] (zone config)
I190823 23:16:24.629731 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root] publish: descID=142 (t) version=3 mtime=2019-08-23 23:16:24.504169211 +0000 UTC
I190823 23:16:24.726501 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root] publish: descID=141 (i_seq) version=3 mtime=2019-08-23 23:16:24.504169211 +0000 UTC
I190823 23:16:24.804636 25187 storage/replica_command.go:598 [n1,merge,s1,r85/1:/Table/11{3-5}] initiating a merge of r88:/Table/11{5-7} [(n1,s1):1, next=2, gen=37] into this range (lhs+rhs has (size=0 B+0 B qps=0.00+0.00 --> 0.00qps) below threshold (size=0 B, qps=0.00))
I190823 23:16:24.897489 180 storage/store.go:2593 [n1,s1,r85/1:/Table/11{3-5}] removing replica r88/1
I190823 23:16:24.949151 475 sql/event_log.go:130 [n1,client=127.0.0.1:51950,user=root] Event: "drop_database", target: 140, info: {DatabaseName:d44 Statement:DROP DATABASE d44 User:root DroppedSchemaObjects:[t d44.public.i_seq]}
I190823 23:16:25.005587 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root,scExec] publish: descID=142 (t) version=4 mtime=2019-08-23 23:16:25.003845039 +0000 UTC
I190823 23:16:25.281927 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root,scExec] publish: descID=141 (i_seq) version=4 mtime=2019-08-23 23:16:25.268761053 +0000 UTC
```
Please assign, take a look and update the issue accordingly.
|
1.0
|
teamcity: failed test: _sequence_direct=false - The following tests appear to have failed on master (testrace): _sequence_direct=false
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+_sequence_direct=false).
[#1452028](https://teamcity.cockroachdb.com/viewLog.html?buildId=1452028):
```
_sequence_direct=false
--- FAIL: testrace/TestImportData/PGDUMP:_sequence_direct=false (0.000s)
Test ended in panic.
------- Stdout: -------
I190823 23:16:23.525552 475 sql/event_log.go:130 [n1,client=127.0.0.1:51950,user=root] Event: "create_database", target: 140, info: {DatabaseName:d44 Statement:CREATE DATABASE d44 User:root}
I190823 23:16:23.838809 24934 storage/replica_command.go:284 [n1,s1,r114/1:/{Table/139/1-Max}] initiating a split of this range at key /Table/141/1 [r116] (manual)
I190823 23:16:23.873627 24933 ccl/importccl/read_import_base.go:66 [n1,import-distsql-ingest] could not fetch file size; falling back to per-file progress: bad ContentLength: -1
I190823 23:16:23.949883 24934 storage/replica_command.go:284 [n1,s1,r116/1:/{Table/141/1-Max}] initiating a split of this range at key /Table/142/1 [r117] (manual)
W190823 23:16:24.096938 24934 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
W190823 23:16:24.112067 142 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
W190823 23:16:24.112522 142 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
I190823 23:16:24.207236 24923 storage/replica_command.go:284 [n1,split,s1,r116/1:/Table/14{1/1-2/1}] initiating a split of this range at key /Table/142 [r118] (zone config)
I190823 23:16:24.271581 24942 storage/replica_command.go:284 [n1,split,s1,r114/1:/Table/1{39/1-41/1}] initiating a split of this range at key /Table/141 [r119] (zone config)
I190823 23:16:24.629731 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root] publish: descID=142 (t) version=3 mtime=2019-08-23 23:16:24.504169211 +0000 UTC
I190823 23:16:24.726501 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root] publish: descID=141 (i_seq) version=3 mtime=2019-08-23 23:16:24.504169211 +0000 UTC
I190823 23:16:24.804636 25187 storage/replica_command.go:598 [n1,merge,s1,r85/1:/Table/11{3-5}] initiating a merge of r88:/Table/11{5-7} [(n1,s1):1, next=2, gen=37] into this range (lhs+rhs has (size=0 B+0 B qps=0.00+0.00 --> 0.00qps) below threshold (size=0 B, qps=0.00))
I190823 23:16:24.897489 180 storage/store.go:2593 [n1,s1,r85/1:/Table/11{3-5}] removing replica r88/1
I190823 23:16:24.949151 475 sql/event_log.go:130 [n1,client=127.0.0.1:51950,user=root] Event: "drop_database", target: 140, info: {DatabaseName:d44 Statement:DROP DATABASE d44 User:root DroppedSchemaObjects:[t d44.public.i_seq]}
I190823 23:16:25.005587 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root,scExec] publish: descID=142 (t) version=4 mtime=2019-08-23 23:16:25.003845039 +0000 UTC
I190823 23:16:25.281927 475 sql/sqlbase/structured.go:1511 [n1,client=127.0.0.1:51950,user=root,scExec] publish: descID=141 (i_seq) version=4 mtime=2019-08-23 23:16:25.268761053 +0000 UTC
```
Please assign, take a look and update the issue accordingly.
|
test
|
teamcity failed test sequence direct false the following tests appear to have failed on master testrace sequence direct false you may want to check sequence direct false fail testrace testimportdata pgdump sequence direct false test ended in panic stdout sql event log go event create database target info databasename statement create database user root storage replica command go initiating a split of this range at key table manual ccl importccl read import base go could not fetch file size falling back to per file progress bad contentlength storage replica command go initiating a split of this range at key table manual storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage replica command go initiating a split of this range at key table zone config storage replica command go initiating a split of this range at key table zone config sql sqlbase structured go publish descid t version mtime utc sql sqlbase structured go publish descid i seq version mtime utc storage replica command go initiating a merge of table into this range lhs rhs has size b b qps below threshold size b qps storage store go removing replica sql event log go event drop database target info databasename statement drop database user root droppedschemaobjects sql sqlbase structured go publish descid t version mtime utc sql sqlbase structured go publish descid i seq version mtime utc please assign take a look and update the issue accordingly
| 1
|
96,238
| 8,600,079,026
|
IssuesEvent
|
2018-11-16 05:50:34
|
ansible/ansible
|
https://api.github.com/repos/ansible/ansible
|
closed
|
sanity test: Most Python files should not be executable
|
affects_2.3 feature support:core test
|
##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
Tests
##### ANSIBLE VERSION
```
ansible 2.3.0 (devel 5502da3cf8) last updated 2016/10/25 11:22:53 (GMT +100)
lib/ansible/modules/core: (devel 4c020102a9) last updated 2016/10/25 11:22:57 (GMT +100)
lib/ansible/modules/extras: (devel 8f77a0e72a) last updated 2016/10/25 11:22:59 (GMT +100)
config file =
```
##### CONFIGURATION
##### OS / ENVIRONMENT
##### SUMMARY
A number of modules (in core & extras) are executable. This is difficult to spot during new module creation PR as GitHub doesn't display file mode flags.
Also I've seen some PRs that modify modules that end up changing the file mode.
Check that:
* No files are executable unless specified in the whitelist
* Ensure that files listed in the whitelist are executable
**EDIT:** Originally this issue suggested using `ansible-test`, though based on Matt Clay's suggestion I've updated this to say we should use sanity test + whitelist for the few files that must be executable.
|
1.0
|
sanity test: Most Python files should not be executable - ##### ISSUE TYPE
- Feature Idea
##### COMPONENT NAME
Tests
##### ANSIBLE VERSION
```
ansible 2.3.0 (devel 5502da3cf8) last updated 2016/10/25 11:22:53 (GMT +100)
lib/ansible/modules/core: (devel 4c020102a9) last updated 2016/10/25 11:22:57 (GMT +100)
lib/ansible/modules/extras: (devel 8f77a0e72a) last updated 2016/10/25 11:22:59 (GMT +100)
config file =
```
##### CONFIGURATION
##### OS / ENVIRONMENT
##### SUMMARY
A number of modules (in core & extras) are executable. This is difficult to spot during new module creation PR as GitHub doesn't display file mode flags.
Also I've seen some PRs that modify modules that end up changing the file mode.
Check that:
* No files are executable unless specified in the whitelist
* Ensure that files listed in the whitelist are executable
**EDIT:** Originally this issue suggested using `ansible-test`, though based on Matt Clay's suggestion I've updated this to say we should use sanity test + whitelist for the few files that must be executable.
|
test
|
sanity test most python files should not be executable issue type feature idea component name tests ansible version ansible devel last updated gmt lib ansible modules core devel last updated gmt lib ansible modules extras devel last updated gmt config file configuration os environment summary a number of modules in core extras are executable this is difficult to spot during new module creation pr as github doesn t display file mode flags also i ve seen some prs that modify modules that end up changing the file mode check that no files are executable unless specified in the whitelist ensure that files listed in the whitelist are executable edit originally this issue suggested using ansible test though based on matt clay s suggestion i ve updated this to say we should use sanity test whitelist for the few files that must be executable
| 1
|
271,675
| 20,710,585,167
|
IssuesEvent
|
2022-03-12 00:11:39
|
tom-ricci/octobox
|
https://api.github.com/repos/tom-ricci/octobox
|
closed
|
Create documentation
|
documentation
|
First of all, the current docs need to be deleted and rebuilt with v1.2.0 when it releases. Then, I need to actually make them.
|
1.0
|
Create documentation - First of all, the current docs need to be deleted and rebuilt with v1.2.0 when it releases. Then, I need to actually make them.
|
non_test
|
create documentation first of all the current docs need to be deleted and rebuilt with when it releases then i need to actually make them
| 0
|
134,668
| 10,926,281,948
|
IssuesEvent
|
2019-11-22 14:24:47
|
eclipse/openj9
|
https://api.github.com/repos/eclipse/openj9
|
closed
|
openjdknext_j9_sanity.functional_ppc64_aix omrport.359 * ** ASSERTION FAILED ** at ../../omr/port/common/omrmemtag.c:145: ((memoryCorruptionDetected))
|
test failure
|
Failure link
------------
https://ci.eclipse.org/openj9/job/Test_openjdknext_j9_sanity.functional_ppc64_aix_OpenJDK/13/consoleFull
Optional info
-------------
Failure output (captured from console output)
---------------------------------------------
```
02:35:14 ===============================================
02:35:14 Running test SharedCPEntryInvokerTests_1 ...
02:35:14 ===============================================
02:37:34 07:19:18.449 0x30008200 omrport.359 * ** ASSERTION FAILED ** at ../../omr/port/common/omrmemtag.c:145: ((memoryCorruptionDetected))
03:26:44 FAILED test targets:
03:26:44 SharedCPEntryInvokerTests_1
03:26:44
03:26:44 TOTAL: 232 EXECUTED: 137 PASSED: 136 FAILED: 1 DISABLED: 4 SKIPPED: 91
03:26:44 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
03:26:44 To rebuild the failed test in a jenkins job, copy the following link and fill out the <Jenkins URL> and <FAILED test target>:
03:26:44 <Jenkins URL>/parambuild/?JDK_VERSION=14&JDK_IMPL=openj9&BUILD_LIST=functional&JenkinsFile=openjdk_ppc64_aix&TARGET=<FAILED test target>
03:26:44
03:26:44 For example, to rebuild the failed tests in <Jenkins URL>=https://ci.adoptopenjdk.net/job/Grinder, use the following links:
03:26:44 https://ci.adoptopenjdk.net/job/Grinder/parambuild/?JDK_VERSION=14&JDK_IMPL=openj9&BUILD_LIST=functional&JenkinsFile=openjdk_ppc64_aix&TARGET=SharedCPEntryInvokerTests_1
03:26:44 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
```
|
1.0
|
openjdknext_j9_sanity.functional_ppc64_aix omrport.359 * ** ASSERTION FAILED ** at ../../omr/port/common/omrmemtag.c:145: ((memoryCorruptionDetected)) - Failure link
------------
https://ci.eclipse.org/openj9/job/Test_openjdknext_j9_sanity.functional_ppc64_aix_OpenJDK/13/consoleFull
Optional info
-------------
Failure output (captured from console output)
---------------------------------------------
```
02:35:14 ===============================================
02:35:14 Running test SharedCPEntryInvokerTests_1 ...
02:35:14 ===============================================
02:37:34 07:19:18.449 0x30008200 omrport.359 * ** ASSERTION FAILED ** at ../../omr/port/common/omrmemtag.c:145: ((memoryCorruptionDetected))
03:26:44 FAILED test targets:
03:26:44 SharedCPEntryInvokerTests_1
03:26:44
03:26:44 TOTAL: 232 EXECUTED: 137 PASSED: 136 FAILED: 1 DISABLED: 4 SKIPPED: 91
03:26:44 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
03:26:44 To rebuild the failed test in a jenkins job, copy the following link and fill out the <Jenkins URL> and <FAILED test target>:
03:26:44 <Jenkins URL>/parambuild/?JDK_VERSION=14&JDK_IMPL=openj9&BUILD_LIST=functional&JenkinsFile=openjdk_ppc64_aix&TARGET=<FAILED test target>
03:26:44
03:26:44 For example, to rebuild the failed tests in <Jenkins URL>=https://ci.adoptopenjdk.net/job/Grinder, use the following links:
03:26:44 https://ci.adoptopenjdk.net/job/Grinder/parambuild/?JDK_VERSION=14&JDK_IMPL=openj9&BUILD_LIST=functional&JenkinsFile=openjdk_ppc64_aix&TARGET=SharedCPEntryInvokerTests_1
03:26:44 ++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++++
```
|
test
|
openjdknext sanity functional aix omrport assertion failed at omr port common omrmemtag c memorycorruptiondetected failure link optional info failure output captured from console output running test sharedcpentryinvokertests omrport assertion failed at omr port common omrmemtag c memorycorruptiondetected failed test targets sharedcpentryinvokertests total executed passed failed disabled skipped to rebuild the failed test in a jenkins job copy the following link and fill out the and parambuild jdk version jdk impl build list functional jenkinsfile openjdk aix target for example to rebuild the failed tests in use the following links
| 1
|
183,183
| 14,214,682,642
|
IssuesEvent
|
2020-11-17 05:51:39
|
pingcap/tidb
|
https://api.github.com/repos/pingcap/tidb
|
closed
|
simple_test.go:testSuite3.TestDropStatsFromKV failed
|
component/test status/TODO
|
simple_test.go:testSuite3.TestDropStatsFromKV
```
[2020-09-01T10:55:02.473Z] ----------------------------------------------------------------------
[2020-09-01T10:55:02.473Z] FAIL: simple_test.go:581: testSuite3.TestDropStatsFromKV
[2020-09-01T10:55:02.473Z]
[2020-09-01T10:55:02.473Z] simple_test.go:594:
[2020-09-01T10:55:02.473Z] tk.MustQuery("select hist_id, bucket_id from mysql.stats_buckets where table_id = " + tblID).Check(
[2020-09-01T10:55:02.473Z] testkit.Rows("1 0",
[2020-09-01T10:55:02.473Z] [2020/09/01 18:54:27.975 +08:00] [ERROR] [mvcc_leveldb.go:785] ["ASSERTION FAIL!!!"] [mutation="op:Del key:\"t\\200\\000\\000\\000\\000\\000\\000\\033_r\\200\\000\\000\\000\\000\\000\\000\\003\" assertion:Exist "] [stack="github.com/pingcap/tidb/store/mockstore/mocktikv.prewriteMutation\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/mvcc_leveldb.go:785\ngithub.com/pingcap/tidb/store/mockstore/mocktikv.(*MVCCLevelDB).Prewrite\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/mvcc_leveldb.go:665\ngithub.com/pingcap/tidb/store/mockstore/mocktikv.(*rpcHandler).handleKvPrewrite\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/rpc.go:299\ngithub.com/pingcap/tidb/store/mockstore/mocktikv.(*RPCClient).SendRequest\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/rpc.go:743\ngithub.com/pingcap/tidb/store/tikv.(*RegionRequestSender).sendReqToRegion\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/region_request.go:177\ngithub.com/pingcap/tidb/store/tikv.(*RegionRequestSender).SendReqCtx\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/region_request.go:131\ngithub.com/pingcap/tidb/store/tikv.(*RegionRequestSender).SendReq\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/region_request.go:74\ngithub.com/pingcap/tidb/store/tikv.(*tikvStore).SendReq\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/kv.go:375\ngithub.com/pingcap/tidb/store/tikv.actionPrewrite.handleSingleBatch\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:610\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).d
oActionOnBatches\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:480\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).doActionOnKeys\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:469\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).prewriteKeys\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:1086\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).execute\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:1152\ngithub.com/pingcap/tidb/store/tikv.(*tikvTxn).Commit\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/txn.go:298\ngithub.com/pingcap/tidb/session.(*TxnState).Commit\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/txn.go:195\ngithub.com/pingcap/tidb/session.(*session).doCommit\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:446\ngithub.com/pingcap/tidb/session.(*session).doCommitWithRetry\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:460\ngithub.com/pingcap/tidb/session.(*session).CommitTxn\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:536\ngithub.com/pingcap/tidb/session.finishStmt\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:197\ngithub.com/pingcap/tidb/session.runStmt\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:283\ngithub.com/pingcap/tidb/session.(*session).executeStatement\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1031\ngithub.com/pingcap/tidb/session.(*session).execute\n\t/home/jenkins/agent/workspac
e/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1145\ngithub.com/pingcap/tidb/session.(*session).Execute\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1069\ngithub.com/pingcap/tidb/ddl/util.RemoveFromGCDeleteRange\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/util/util.go:118\ngithub.com/pingcap/tidb/ddl/util.CompleteDeleteRange\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/util/util.go:112\ngithub.com/pingcap/tidb/ddl.(*delRange).doTask\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/delete_range.go:223\ngithub.com/pingcap/tidb/ddl.(*delRange).doDelRangeWork\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/delete_range.go:177\ngithub.com/pingcap/tidb/ddl.(*delRange).startEmulator\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/delete_range.go:141"]
```
Latest failed builds:
https://internal.pingcap.net/idc-jenkins/job/tidb_ghpr_check/52411/display/redirect
https://internal.pingcap.net/idc-jenkins/job/tidb_ghpr_unit_test/48984/display/redirect
|
1.0
|
simple_test.go:testSuite3.TestDropStatsFromKV failed - simple_test.go:testSuite3.TestDropStatsFromKV
```
[2020-09-01T10:55:02.473Z] ----------------------------------------------------------------------
[2020-09-01T10:55:02.473Z] FAIL: simple_test.go:581: testSuite3.TestDropStatsFromKV
[2020-09-01T10:55:02.473Z]
[2020-09-01T10:55:02.473Z] simple_test.go:594:
[2020-09-01T10:55:02.473Z] tk.MustQuery("select hist_id, bucket_id from mysql.stats_buckets where table_id = " + tblID).Check(
[2020-09-01T10:55:02.473Z] testkit.Rows("1 0",
[2020-09-01T10:55:02.473Z] [2020/09/01 18:54:27.975 +08:00] [ERROR] [mvcc_leveldb.go:785] ["ASSERTION FAIL!!!"] [mutation="op:Del key:\"t\\200\\000\\000\\000\\000\\000\\000\\033_r\\200\\000\\000\\000\\000\\000\\000\\003\" assertion:Exist "] [stack="github.com/pingcap/tidb/store/mockstore/mocktikv.prewriteMutation\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/mvcc_leveldb.go:785\ngithub.com/pingcap/tidb/store/mockstore/mocktikv.(*MVCCLevelDB).Prewrite\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/mvcc_leveldb.go:665\ngithub.com/pingcap/tidb/store/mockstore/mocktikv.(*rpcHandler).handleKvPrewrite\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/rpc.go:299\ngithub.com/pingcap/tidb/store/mockstore/mocktikv.(*RPCClient).SendRequest\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/mockstore/mocktikv/rpc.go:743\ngithub.com/pingcap/tidb/store/tikv.(*RegionRequestSender).sendReqToRegion\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/region_request.go:177\ngithub.com/pingcap/tidb/store/tikv.(*RegionRequestSender).SendReqCtx\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/region_request.go:131\ngithub.com/pingcap/tidb/store/tikv.(*RegionRequestSender).SendReq\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/region_request.go:74\ngithub.com/pingcap/tidb/store/tikv.(*tikvStore).SendReq\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/kv.go:375\ngithub.com/pingcap/tidb/store/tikv.actionPrewrite.handleSingleBatch\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:610\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).doActionOnBatches\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:480\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).doActionOnKeys\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:469\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).prewriteKeys\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:1086\ngithub.com/pingcap/tidb/store/tikv.(*twoPhaseCommitter).execute\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/2pc.go:1152\ngithub.com/pingcap/tidb/store/tikv.(*tikvTxn).Commit\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/store/tikv/txn.go:298\ngithub.com/pingcap/tidb/session.(*TxnState).Commit\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/txn.go:195\ngithub.com/pingcap/tidb/session.(*session).doCommit\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:446\ngithub.com/pingcap/tidb/session.(*session).doCommitWithRetry\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:460\ngithub.com/pingcap/tidb/session.(*session).CommitTxn\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:536\ngithub.com/pingcap/tidb/session.finishStmt\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:197\ngithub.com/pingcap/tidb/session.runStmt\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/tidb.go:283\ngithub.com/pingcap/tidb/session.(*session).executeStatement\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1031\ngithub.com/pingcap/tidb/session.(*session).execute\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1145\ngithub.com/pingcap/tidb/session.(*session).Execute\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/session/session.go:1069\ngithub.com/pingcap/tidb/ddl/util.RemoveFromGCDeleteRange\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/util/util.go:118\ngithub.com/pingcap/tidb/ddl/util.CompleteDeleteRange\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/util/util.go:112\ngithub.com/pingcap/tidb/ddl.(*delRange).doTask\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/delete_range.go:223\ngithub.com/pingcap/tidb/ddl.(*delRange).doDelRangeWork\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/delete_range.go:177\ngithub.com/pingcap/tidb/ddl.(*delRange).startEmulator\n\t/home/jenkins/agent/workspace/tidb_ghpr_unit_test/go/src/github.com/pingcap/tidb/ddl/delete_range.go:141"]
```
Latest failed builds:
https://internal.pingcap.net/idc-jenkins/job/tidb_ghpr_check/52411/display/redirect
https://internal.pingcap.net/idc-jenkins/job/tidb_ghpr_unit_test/48984/display/redirect
|
test
|
simple test go testdropstatsfromkv failed simple test go testdropstatsfromkv fail simple test go testdropstatsfromkv simple test go tk mustquery select hist id bucket id from mysql stats buckets where table id tblid check testkit rows latest failed builds
| 1
|
250,043
| 21,259,213,158
|
IssuesEvent
|
2022-04-13 00:58:00
|
RamiMustafa/WAF_Sec_Test
|
https://api.github.com/repos/RamiMustafa/WAF_Sec_Test
|
opened
|
Automatically remove/obfuscate personally identifiable information (PII) for this workload
|
WARP-Import WAF_Sec_Test Security Health Modeling & Monitoring Application Level Monitoring
|
<a href="https://docs.microsoft.com/azure/search/cognitive-search-skill-pii-detection">Automatically remove/obfuscate personally identifiable information (PII) for this workload</a>
<p><b>Why Consider This?</b></p>
Extra care should be taken around logging of sensitive application areas. PII (contact information, payment information etc.) should not be stored in any application logs and protective measures should be applied (such as obfuscation).
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Deploy PII detection and removal/obfuscation solution for this workload.</span></p>
<p><b>Learn More</b></p>
<p><span>Machine learning tools like </span><a href="https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-pii-detection" target="_blank"><span>Cognitive Search PII detection</span></a><span> can help with this.</span></p>
|
1.0
|
Automatically remove/obfuscate personally identifiable information (PII) for this workload - <a href="https://docs.microsoft.com/azure/search/cognitive-search-skill-pii-detection">Automatically remove/obfuscate personally identifiable information (PII) for this workload</a>
<p><b>Why Consider This?</b></p>
Extra care should be taken around logging of sensitive application areas. PII (contact information, payment information etc.) should not be stored in any application logs and protective measures should be applied (such as obfuscation).
<p><b>Context</b></p>
<p><b>Suggested Actions</b></p>
<p><span>Deploy PII detection and removal/obfuscation solution for this workload.</span></p>
<p><b>Learn More</b></p>
<p><span>Machine learning tools like </span><a href="https://docs.microsoft.com/en-us/azure/search/cognitive-search-skill-pii-detection" target="_blank"><span>Cognitive Search PII detection</span></a><span> can help with this.</span></p>
|
test
|
automatically remove obfuscate personally identifiable information pii for this workload why consider this extra care should be taken around logging of sensitive application areas pii contact information payment information etc should not be stored in any application logs and protective measures should be applied such as obfuscation context suggested actions deploy pii detection and removal obfuscation solution for this workload learn more machine learning tools like cognitive search pii detection can help with this
| 1
|
279,190
| 24,205,710,638
|
IssuesEvent
|
2022-09-25 07:23:11
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachtest: sqlsmith/setup=seed/setting=no-ddl failed
|
C-test-failure O-robot O-roachtest branch-master T-sql-queries
|
roachtest.sqlsmith/setup=seed/setting=no-ddl [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6601244?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6601244?buildTab=artifacts#/sqlsmith/setup=seed/setting=no-ddl) on master @ [89f4ad907a1756551bd6864c3e8516eeff6b0e0a](https://github.com/cockroachdb/cockroach/commits/89f4ad907a1756551bd6864c3e8516eeff6b0e0a):
```
test artifacts and logs in: /artifacts/sqlsmith/setup=seed/setting=no-ddl/run_1
sqlsmith.go:266,sqlsmith.go:327,test_runner.go:928: error: pq: internal error: expected required columns to be a subset of output columns
stmt:
EXPLAIN SELECT
tab_2462._float4 AS col_5881,
'[[{"V``;": {"<Ev$7I2@": {}, "fAGw": {}}, "a": [[{"bar": []}]], "gdub^": false}, 0.04545953921641038, true], [], {}]':::JSONB
AS col_5882,
'2028-02-05 20:55:16.000655':::TIMESTAMP AS col_5883,
'117.177.214.75/24':::INET AS col_5884
FROM
defaultdb.public.seed@seed__int8__float8__date_idx AS tab_2462
WHERE
tab_2462._bool
ORDER BY
tab_2462._float4, tab_2462._bool ASC, tab_2462._enum, tab_2462._decimal DESC;
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=seed/setting=no-ddl.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
2.0
|
roachtest: sqlsmith/setup=seed/setting=no-ddl failed - roachtest.sqlsmith/setup=seed/setting=no-ddl [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6601244?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyGceBazel/6601244?buildTab=artifacts#/sqlsmith/setup=seed/setting=no-ddl) on master @ [89f4ad907a1756551bd6864c3e8516eeff6b0e0a](https://github.com/cockroachdb/cockroach/commits/89f4ad907a1756551bd6864c3e8516eeff6b0e0a):
```
test artifacts and logs in: /artifacts/sqlsmith/setup=seed/setting=no-ddl/run_1
sqlsmith.go:266,sqlsmith.go:327,test_runner.go:928: error: pq: internal error: expected required columns to be a subset of output columns
stmt:
EXPLAIN SELECT
tab_2462._float4 AS col_5881,
'[[{"V``;": {"<Ev$7I2@": {}, "fAGw": {}}, "a": [[{"bar": []}]], "gdub^": false}, 0.04545953921641038, true], [], {}]':::JSONB
AS col_5882,
'2028-02-05 20:55:16.000655':::TIMESTAMP AS col_5883,
'117.177.214.75/24':::INET AS col_5884
FROM
defaultdb.public.seed@seed__int8__float8__date_idx AS tab_2462
WHERE
tab_2462._bool
ORDER BY
tab_2462._float4, tab_2462._bool ASC, tab_2462._enum, tab_2462._decimal DESC;
```
<p>Parameters: <code>ROACHTEST_cloud=gce</code>
, <code>ROACHTEST_cpu=4</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/sql-queries
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*sqlsmith/setup=seed/setting=no-ddl.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
|
test
|
roachtest sqlsmith setup seed setting no ddl failed roachtest sqlsmith setup seed setting no ddl with on master test artifacts and logs in artifacts sqlsmith setup seed setting no ddl run sqlsmith go sqlsmith go test runner go error pq internal error expected required columns to be a subset of output columns stmt explain select tab as col gdub false true jsonb as col timestamp as col inet as col from defaultdb public seed seed date idx as tab where tab bool order by tab tab bool asc tab enum tab decimal desc parameters roachtest cloud gce roachtest cpu roachtest encrypted false roachtest ssd help see see cc cockroachdb sql queries
| 1
|
176,488
| 21,411,531,491
|
IssuesEvent
|
2022-04-22 06:39:12
|
pazhanivel07/frameworks_av-CVE-2020-0218
|
https://api.github.com/repos/pazhanivel07/frameworks_av-CVE-2020-0218
|
opened
|
CVE-2020-0141 (Medium) detected in avandroid-10.0.0_r44
|
security vulnerability
|
## CVE-2020-0141 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>avandroid-10.0.0_r44</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/av>https://android.googlesource.com/platform/frameworks/av</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/frameworks_av-CVE-2020-0218/commit/c7c80110bfccffb246ac43055e660ac5605596f5">c7c80110bfccffb246ac43055e660ac5605596f5</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/media/codec2/sfplugin/CCodecBuffers.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In OutputBuffersArray::realloc of CCodecBuffers.cpp, there is a possible heap disclosure due to a race condition. This could lead to remote information disclosure with System execution privileges needed. User interaction is needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-142544793
<p>Publish Date: 2020-06-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0141>CVE-2020-0141</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37">https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37</a></p>
<p>Release Date: 2020-06-11</p>
<p>Fix Resolution: android-10.0.0_r37</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-0141 (Medium) detected in avandroid-10.0.0_r44 - ## CVE-2020-0141 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>avandroid-10.0.0_r44</b></p></summary>
<p>
<p>Library home page: <a href=https://android.googlesource.com/platform/frameworks/av>https://android.googlesource.com/platform/frameworks/av</a></p>
<p>Found in HEAD commit: <a href="https://github.com/pazhanivel07/frameworks_av-CVE-2020-0218/commit/c7c80110bfccffb246ac43055e660ac5605596f5">c7c80110bfccffb246ac43055e660ac5605596f5</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/media/codec2/sfplugin/CCodecBuffers.cpp</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In OutputBuffersArray::realloc of CCodecBuffers.cpp, there is a possible heap disclosure due to a race condition. This could lead to remote information disclosure with System execution privileges needed. User interaction is needed for exploitation.Product: AndroidVersions: Android-10Android ID: A-142544793
<p>Publish Date: 2020-06-11
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-0141>CVE-2020-0141</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>4.4</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37">https://android.googlesource.com/platform/frameworks/av/+/refs/tags/android-10.0.0_r37</a></p>
<p>Release Date: 2020-06-11</p>
<p>Fix Resolution: android-10.0.0_r37</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in avandroid cve medium severity vulnerability vulnerable library avandroid library home page a href found in head commit a href found in base branch master vulnerable source files media sfplugin ccodecbuffers cpp vulnerability details in outputbuffersarray realloc of ccodecbuffers cpp there is a possible heap disclosure due to a race condition this could lead to remote information disclosure with system execution privileges needed user interaction is needed for exploitation product androidversions android id a publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android step up your open source security game with whitesource
| 0
|
803,469
| 29,178,226,054
|
IssuesEvent
|
2023-05-19 09:41:07
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
[Android] Rewards panel needs better handling for smaller screen sizes
|
polish feature/rewards priority/P4 QA/Yes release-notes/exclude OS/Android
|
<!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description <!-- Provide a brief description of the issue -->
If you are using a newer Android device (ex. Google Pixel 3XL with Android 12), the rewards panel looks fine - most text is on 1 line and the panel doesn't take up the whole screen. However, on an older Android device (Samsung J7 Neo with Android 7), the panel doesn't look as nice. Lots of text goes onto 2 lines. This is ok in English but could cause problems with non-english languages as the panel grows longer. Also, the panel just doesn't look as nice this way.
## Steps to reproduce <!-- Please add a series of steps to reproduce the issue -->
pre-req: be using older Android device
1. Clean install 1.35.x
2. Enable staging env in QA Prefs, relaunch as necessary
3. Enable rewards
4. Look at panel
## Actual result <!-- Please add screenshots if needed -->
Light theme | Dark theme
----- | -----
 | 
## Expected result
Pixel for comparison:

## Issue reproduces how often <!-- [Easily reproduced/Intermittent issue/No steps to reproduce] -->
easily
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current Play Store version? n/a
- Can you reproduce this issue with the current Play Store Beta version? yes
- Can you reproduce this issue with the current Play Store Nightly version? yes
## Device details
- Install type (ARM, x86): ARM
- Device type (Phone, Tablet, Phablet): all
- Android version: older, such as 7
## Brave version
1.35.x
### Website problems only
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
### Additional information
<!-- Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue -->
viewport size on Pixel 3XL: 412px x 693px, screen size is 412px x 846px
viewport size on J7 Neo: 360px x 560px, screen size is 360px x 640px
cc @Miyayes @deeppandya
|
1.0
|
[Android] Rewards panel needs better handling for smaller screen sizes - <!-- Have you searched for similar issues? Before submitting this issue, please check the open issues and add a note before logging a new issue.
PLEASE USE THE TEMPLATE BELOW TO PROVIDE INFORMATION ABOUT THE ISSUE.
INSUFFICIENT INFO WILL GET THE ISSUE CLOSED. IT WILL ONLY BE REOPENED AFTER SUFFICIENT INFO IS PROVIDED-->
## Description <!-- Provide a brief description of the issue -->
If you are using a newer Android device (ex. Google Pixel 3XL with Android 12), the rewards panel looks fine - most text is on 1 line and the panel doesn't take up the whole screen. However, on an older Android device (Samsung J7 Neo with Android 7), the panel doesn't look as nice. Lots of text goes onto 2 lines. This is ok in English but could cause problems with non-english languages as the panel grows longer. Also, the panel just doesn't look as nice this way.
## Steps to reproduce <!-- Please add a series of steps to reproduce the issue -->
pre-req: be using older Android device
1. Clean install 1.35.x
2. Enable staging env in QA Prefs, relaunch as necessary
3. Enable rewards
4. Look at panel
## Actual result <!-- Please add screenshots if needed -->
Light theme | Dark theme
----- | -----
 | 
## Expected result
Pixel for comparison:

## Issue reproduces how often <!-- [Easily reproduced/Intermittent issue/No steps to reproduce] -->
easily
## Version/Channel Information:
<!--Does this issue happen on any other channels? Or is it specific to a certain channel?-->
- Can you reproduce this issue with the current Play Store version? n/a
- Can you reproduce this issue with the current Play Store Beta version? yes
- Can you reproduce this issue with the current Play Store Nightly version? yes
## Device details
- Install type (ARM, x86): ARM
- Device type (Phone, Tablet, Phablet): all
- Android version: older, such as 7
## Brave version
1.35.x
### Website problems only
- Does the issue resolve itself when disabling Brave Shields?
- Does the issue resolve itself when disabling Brave Rewards?
- Is the issue reproducible on the latest version of Chrome?
### Additional information
<!-- Any additional information, related issues, extra QA steps, configuration or data that might be necessary to reproduce the issue -->
viewport size on Pixel 3XL: 412px x 693px, screen size is 412px x 846px
viewport size on J7 Neo: 360px x 560px, screen size is 360px x 640px
cc @Miyayes @deeppandya
|
non_test
|
rewards panel needs better handling for smaller screen sizes have you searched for similar issues before submitting this issue please check the open issues and add a note before logging a new issue please use the template below to provide information about the issue insufficient info will get the issue closed it will only be reopened after sufficient info is provided description if you are using a newer android device ex google pixel with android the rewards panel looks fine most text is on line and the panel doesn t take up the whole screen however on an older android device samsung neo with android the panel doesn t look at nice lots of text goes onto lines this is ok in english but could cause problems with non english languages as the panel grows longer also the panel just doesn t look as nice this way steps to reproduce pre req be using older android device clean install x enable staging env in qa prefs relaunch as necessary enable rewards look at panel actual result light theme dark theme expected result pixel for comparison issue reproduces how often easily version channel information can you reproduce this issue with the current play store version n a can you reproduce this issue with the current play store beta version yes can you reproduce this issue with the current play store nightly version yes device details install type arm arm device type phone tablet phablet all android version older such as brave version x website problems only does the issue resolve itself when disabling brave shields does the issue resolve itself when disabling brave rewards is the issue reproducible on the latest version of chrome additional information viewport size on pixel x screen size is x viewport size on neo x screen size is x cc miyayes deeppandya
| 0
|
178,242
| 13,769,659,257
|
IssuesEvent
|
2020-10-07 18:58:18
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
closed
|
roachtest: sqlalchemy failed
|
C-test-failure O-roachtest O-robot branch-master
|
[(roachtest).sqlalchemy failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2328528&tab=buildLog) on [master@0c12d8b101a853f441926ad758c4c2882ecd33ee](https://github.com/cockroachdb/cockroach/commits/0c12d8b101a853f441926ad758c4c2882ecd33ee):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/sqlalchemy/run_1
orm_helpers.go:178,orm_helpers.go:154,sqlalchemy.go:212,sqlalchemy.go:38,test_runner.go:755: No tests ran! Fix the testing commands.
```
<details><summary>More</summary><p>
Artifacts: [/sqlalchemy](https://teamcity.cockroachdb.com/viewLog.html?buildId=2328528&tab=artifacts#/sqlalchemy)
Related:
- #54917 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Asqlalchemy.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
2.0
|
roachtest: sqlalchemy failed - [(roachtest).sqlalchemy failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2328528&tab=buildLog) on [master@0c12d8b101a853f441926ad758c4c2882ecd33ee](https://github.com/cockroachdb/cockroach/commits/0c12d8b101a853f441926ad758c4c2882ecd33ee):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/sqlalchemy/run_1
orm_helpers.go:178,orm_helpers.go:154,sqlalchemy.go:212,sqlalchemy.go:38,test_runner.go:755: No tests ran! Fix the testing commands.
```
<details><summary>More</summary><p>
Artifacts: [/sqlalchemy](https://teamcity.cockroachdb.com/viewLog.html?buildId=2328528&tab=artifacts#/sqlalchemy)
Related:
- #54917 roachtest: sqlalchemy failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-release-20.1](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-release-20.1) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Asqlalchemy.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
|
test
|
roachtest sqlalchemy failed on the test failed on branch master cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts sqlalchemy run orm helpers go orm helpers go sqlalchemy go sqlalchemy go test runner go no tests ran fix the testing commands more artifacts related roachtest sqlalchemy failed powered by
| 1
|
50,801
| 12,560,407,606
|
IssuesEvent
|
2020-06-07 22:02:50
|
curl/curl
|
https://api.github.com/repos/curl/curl
|
closed
|
Code does not compile with -Werror
|
build
|
### I did this
Compile with gcc 8 by adding -Werror
### I expected the following
I expected at most to have to add -Wno-error=?? related flags to allow for the build pass under a more strict build environment.
### What I saw
```
libtool: compile: /opt/rh/devtoolset-8/root/usr/bin/gcc -DHAVE_CONFIG_H -I../include -I../lib -I../lib -DBUILDING_LIBCURL -DCURL_HIDDEN_SYMBOLS -O0 -Wall -Werror -Wextra -Wno-error=unused-variable -fvisibility=hidden -Wno-error=unused-variable -Werror-implicit-function-declaration -Wno-system-headers -pthread -MT libcurl_la-altsvc.lo -MD -MP -MF .deps/libcurl_la-altsvc.Tpo -c altsvc.c -fPIC -DPIC -o .libs/libcurl_la-altsvc.o
In file included from curl_setup.h:673,
from altsvc.c:26:
curl_setup_once.h:109:8: error: redefinition of 'struct timeval'
struct timeval {
^~~~~~~
In file included from /usr/include/sys/select.h:45,
from /usr/include/sys/types.h:219,
from ../include/curl/system.h:430,
from ../include/curl/curl.h:38,
from curl_setup.h:157,
from altsvc.c:26:
/usr/include/bits/time.h:30:8: note: originally defined here
struct timeval
```
It would be nice if curl could compile with stricter settings.
|
1.0
|
Code does not compile with -Werror - ### I did this
Compile with gcc 8 by adding -Werror
### I expected the following
I expected at most to have to add -Wno-error=?? related flags to allow for the build pass under a more strict build environment.
### What I saw
```
libtool: compile: /opt/rh/devtoolset-8/root/usr/bin/gcc -DHAVE_CONFIG_H -I../include -I../lib -I../lib -DBUILDING_LIBCURL -DCURL_HIDDEN_SYMBOLS -O0 -Wall -Werror -Wextra -Wno-error=unused-variable -fvisibility=hidden -Wno-error=unused-variable -Werror-implicit-function-declaration -Wno-system-headers -pthread -MT libcurl_la-altsvc.lo -MD -MP -MF .deps/libcurl_la-altsvc.Tpo -c altsvc.c -fPIC -DPIC -o .libs/libcurl_la-altsvc.o
In file included from curl_setup.h:673,
from altsvc.c:26:
curl_setup_once.h:109:8: error: redefinition of 'struct timeval'
struct timeval {
^~~~~~~
In file included from /usr/include/sys/select.h:45,
from /usr/include/sys/types.h:219,
from ../include/curl/system.h:430,
from ../include/curl/curl.h:38,
from curl_setup.h:157,
from altsvc.c:26:
/usr/include/bits/time.h:30:8: note: originally defined here
struct timeval
```
It would be nice if curl could compile with stricter settings.
|
non_test
|
code does not compile with werror i did this compile with gcc by adding werror i expected the following i expected at most to have to add wno error related flags to allow for the build pass under a more strict build environment what i saw libtool compile opt rh devtoolset root usr bin gcc dhave config h i include i lib i lib dbuilding libcurl dcurl hidden symbols wall werror wextra wno error unused variable fvisibility hidden wno error unused variable werror implicit function declaration wno system headers pthread mt libcurl la altsvc lo md mp mf deps libcurl la altsvc tpo c altsvc c fpic dpic o libs libcurl la altsvc o in file included from curl setup h from altsvc c curl setup once h error redefinition of struct timeval struct timeval in file included from usr include sys select h from usr include sys types h from include curl system h from include curl curl h from curl setup h from altsvc c usr include bits time h note originally defined here struct timeval it would be nice if curl could compile with stricter settings
| 0
|
101,154
| 8,780,195,494
|
IssuesEvent
|
2018-12-19 16:39:53
|
SME-Issues/issues
|
https://api.github.com/repos/SME-Issues/issues
|
closed
|
Intent Errors (5004 Gold) - 19/12/2018
|
NLP Api PETEDEV pulse_tests
|
|Expression|Result|
|---|---|
| _is the business account close to the overdraft limit_ | <ul><li>expected intent to be `query_balance` but found `query_payment`</li><li>Expected to find entity 'overdraft_limit' but found only {text, pay_to, vbz, acct_type}</li><li>Response: _I can't find that company in your account._</li></ul> |
| _are the savings account funds doing alright_<details><summary>After pre-processing</summary>_are the savings account funds doing ok_</details> | <ul><li>expected intent to be `query_balance` but found `query_payment` (originally `query_acct_name`)</li><li>Account type reference mismatch. Looking for: 'savings account', found: (none)</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _by what date should I pay alpha ltd_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Right, let me just look…{nl} I've gone through the most recent figures and your last payment to Alpha Ltd was £662.84 on the 14th October 2018. You have three unpaid invoices from them totalling £2,273.50, all of which are overdue._</li></ul> |
| (**g**) _does anyone expect payment from me today_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'datetime_now' but found only {datetime, text, pay_to}</li><li>Response: _OK hold on, I'm checking now…{nl} I've determined how things are. There were no outgoing transactions for today._</li></ul> |
| (**g**) _has anyone given us a payment deadline_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find one of entities {bill_from, pay_to} but found only {text, paid_yet, date_in, date_mod, datetime}.</li><li>Response: _Let's check that out for you…{nl} All right, I've taken stock and your last payment was to Delta Ltd for £159.93 on the 14th December 2018._</li></ul> |
| (**g**) _how much money from me are people waiting on_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _OK hold on, I'm checking now…{nl} I've just reviewed your accounts and since 20th November 2018 your outgoings were nine payments totalling £4,296.79._</li></ul> |
| _I'd like to see every sale I've made_<details><summary>After pre-processing</summary>_i would like to see every sale i have made_</details> | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _OK, let's see…{nl} I've had a scoot around and your last received payment was from Zeta and Son for £1,491.24 on the 14th December 2018._</li></ul> |
| (**g**) _is anyone expecting money from me_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Let's check that out for you…{nl} I've evaluated the state of affairs and since 20th November 2018 your spending was nine payments totalling £4,296.79._</li></ul> |
| (**g**) _is anyone expecting money from me right now_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'datetime_now' but found only {text, money, pay_to, date_in, date_mod, datetime}</li><li>Response: _Righty ho, let's take a look…{nl} I've had a scoot around and since 20th November 2018 your outgoings were nine payments totalling £4,296.79._</li></ul> |
| (**g**) _what number of outstanding invoices do we have_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {unknown_entity, bill_from, date_in, date_mod, datetime}</li><li>Response: _Regrettably, I didn't recognise everything you said, however since 20th November 2018 your expenditure was nine payments totalling £4,296.79._</li></ul> |
| _when is the money from ABC ltd coming in_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {text, pay_from, money}</li><li>Response: _OK, I'm on the case…{nl} After checking, I can verify that there were no incoming transactions for ABC Ltd._</li></ul> |
| (**g**) _when will I get the money in from ABC ltd_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {when, text, pay_from, money}</li><li>Response: _Ok, I'm examining that now…{nl} All right, I've taken stock and you have never received payment from ABC Ltd. You have one unpaid bill to them which is for the amount of £22.80 and was due on the 14th December 2018._</li></ul> |
| _when will I recieve the money from ABC ltd_<details><summary>After pre-processing</summary>_when will I receive the money from ABC ltd_</details> | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {when, text, pay_from, money}</li><li>Response: _Let's check that out for you…{nl} I've analysed the state of play and you have never received payment from ABC Ltd. You have one unpaid bill to them which is for the amount of £22.80 and was due on the 14th December 2018._</li></ul> |
| (**g**) _which bills do you think we should pay this week_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Right, let me just look…{nl} I've performed a quick investigation and there were no outgoing bill transactions this week._</li></ul> |
| (**g**) _which invoices should we pay on Wed_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {datetime, text, pay_to, paid_yet, dt, pay_type}</li><li>Response: _Let me check that out…{nl} Well, I've checked the most up-to-date records and there were no outgoing creditor invoice transactions on Wednesday (12th December)._</li></ul> |
| (**g**) _which supplier invoices have yet to be paid_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Just a second, I'm exploring now…{nl} I can see your last paid an invoice on the 14th December 2018 for the amount of £159.93._</li></ul> |
| _how much does my landlord want this month_ | <ul><li>expected intent to be `query_invoices` but found `query_payment` (originally `query_acct_name`)</li><li>Expected to find one of entities {bill_from, pay_to} but found only {date_mod, datetime, text, acct_name}.</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _what kind of money is my landlord demanding this month_ | <ul><li>expected intent to be `query_invoices` but found `query_payment` (originally `query_acct_name`)</li><li>Expected to find one of entities {bill_from, pay_to} but found only {text, date_mod, datetime, wp, acct_name}.</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _what will be the value of screw small businesses this year_ | <ul><li>expected intent to be `query_invoices` but found `query_payment` (originally `query_acct_name`)</li><li>Expected to find one of entities {bill_to, pay_from} but found only {date_mod, datetime, text, acct_name}.</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _can you compute the total on invoices paid in the second quarter_ | <ul><li>expected intent to be `query_payment` but found `query_invoices`</li><li>Expected to find entity 'closed_mod' but found only {date_mod, datetime_quarter, datetime, text, pay_type, query_total}</li><li>Response: _I'll look right now…{nl} I've inspected and in Q2 2018 you received 21 invoices totalling £10,938.95, all of which have been paid and you raised 27 invoices totalling £35,869.50, all of which have been paid._</li></ul> |
| (**g**) _have you got anything on bills paid in the last 4 weeks_ | <ul><li>expected intent to be `query_payment` but found `query_invoices`</li><li>Response: _I'm determining that now…{nl} I've had a quick look and in the last four weeks you received five invoices totalling £2,370.32, none of which have been paid and you raised two invoices totalling £848.92, none of which have been paid._</li></ul> |
| _it would be good to know the total of bills paid last qtr_<details><summary>After pre-processing</summary>_it would be good to know the total of bills paid last quarter_</details> | <ul><li>expected intent to be `query_payment` but found `query_invoices`</li><li>Response: _Right, let me just look…{nl} I've gone through the most recent figures and in Q3 2018 you received 23 invoices totalling £8,212.60, all of which have been paid and you raised 27 invoices totalling £32,335.40, all of which have been paid._</li></ul> |
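The dominant pattern in the table above — `query_invoices` utterances being routed to `query_payment` — can be summarised mechanically. A minimal sketch using a hand-distilled subset of the (expected, found) intent pairs from the table (not the full result set):

```python
from collections import Counter

# A hand-distilled subset of (expected, found) intent pairs from the table above.
results = [
    ("query_balance", "query_payment"),
    ("query_invoices", "query_payment"),
    ("query_invoices", "query_payment"),
    ("query_invoices", "query_payment"),
    ("query_payment", "query_invoices"),
]

# Tally each (expected, found) confusion and pull out the most frequent one.
confusions = Counter(results)
worst_pair, worst_count = confusions.most_common(1)[0]
print(worst_pair, worst_count)
```

Run over the full table, a tally like this makes it easy to see which intent pair is responsible for most failures in a given test run.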
|
1.0
|
Intent Errors (5004 Gold) - 19/12/2018 - |Expression|Result|
|---|---|
| _is the business account close to the overdraft limit_ | <ul><li>expected intent to be `query_balance` but found `query_payment`</li><li>Expected to find entity 'overdraft_limit' but found only {text, pay_to, vbz, acct_type}</li><li>Response: _I can't find that company in your account._</li></ul> |
| _are the savings account funds doing alright_<details><summary>After pre-processing</summary>_are the savings account funds doing ok_</details> | <ul><li>expected intent to be `query_balance` but found `query_payment` (originally `query_acct_name`)</li><li>Account type reference mismatch. Looking for: 'savings account', found: (none)</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _by what date should I pay alpha ltd_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Right, let me just look…{nl} I've gone through the most recent figures and your last payment to Alpha Ltd was £662.84 on the 14th October 2018. You have three unpaid invoices from them totalling £2,273.50, all of which are overdue._</li></ul> |
| (**g**) _does anyone expect payment from me today_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'datetime_now' but found only {datetime, text, pay_to}</li><li>Response: _OK hold on, I'm checking now…{nl} I've determined how things are. There were no outgoing transactions for today._</li></ul> |
| (**g**) _has anyone given us a payment deadline_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find one of entities {bill_from, pay_to} but found only {text, paid_yet, date_in, date_mod, datetime}.</li><li>Response: _Let's check that out for you…{nl} All right, I've taken stock and your last payment was to Delta Ltd for £159.93 on the 14th December 2018._</li></ul> |
| (**g**) _how much money from me are people waiting on_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _OK hold on, I'm checking now…{nl} I've just reviewed your accounts and since 20th November 2018 your outgoings were nine payments totalling £4,296.79._</li></ul> |
| _I'd like to see every sale I've made_<details><summary>After pre-processing</summary>_i would like to see every sale i have made_</details> | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _OK, let's see…{nl} I've had a scoot around and your last received payment was from Zeta and Son for £1,491.24 on the 14th December 2018._</li></ul> |
| (**g**) _is anyone expecting money from me_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Let's check that out for you…{nl} I've evaluated the state of affairs and since 20th November 2018 your spending was nine payments totalling £4,296.79._</li></ul> |
| (**g**) _is anyone expecting money from me right now_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'datetime_now' but found only {text, money, pay_to, date_in, date_mod, datetime}</li><li>Response: _Righty ho, let's take a look…{nl} I've had a scoot around and since 20th November 2018 your outgoings were nine payments totalling £4,296.79._</li></ul> |
| (**g**) _what number of outstanding invoices do we have_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {unknown_entity, bill_from, date_in, date_mod, datetime}</li><li>Response: _Regrettably, I didn't recognise everything you said, however since 20th November 2018 your expenditure was nine payments totalling £4,296.79._</li></ul> |
| _when is the money from ABC ltd coming in_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {text, pay_from, money}</li><li>Response: _OK, I'm on the case…{nl} After checking, I can verify that there were no incoming transactions for ABC Ltd._</li></ul> |
| (**g**) _when will I get the money in from ABC ltd_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {when, text, pay_from, money}</li><li>Response: _Ok, I'm examining that now…{nl} All right, I've taken stock and you have never received payment from ABC Ltd. You have one unpaid bill to them which is for the amount of £22.80 and was due on the 14th December 2018._</li></ul> |
| _when will I recieve the money from ABC ltd_<details><summary>After pre-processing</summary>_when will I receive the money from ABC ltd_</details> | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {when, text, pay_from, money}</li><li>Response: _Let's check that out for you…{nl} I've analysed the state of play and you have never received payment from ABC Ltd. You have one unpaid bill to them which is for the amount of £22.80 and was due on the 14th December 2018._</li></ul> |
| (**g**) _which bills do you think we should pay this week_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Right, let me just look…{nl} I've performed a quick investigation and there were no outgoing bill transactions this week._</li></ul> |
| (**g**) _which invoices should we pay on Wed_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Expected to find entity 'open_mod' but found only {datetime, text, pay_to, paid_yet, dt, pay_type}</li><li>Response: _Let me check that out…{nl} Well, I've checked the most up-to-date records and there were no outgoing creditor invoice transactions on Wednesday (12th December)._</li></ul> |
| (**g**) _which supplier invoices have yet to be paid_ | <ul><li>expected intent to be `query_invoices` but found `query_payment`</li><li>Response: _Just a second, I'm exploring now…{nl} I can see your last paid an invoice on the 14th December 2018 for the amount of £159.93._</li></ul> |
| _how much does my landlord want this month_ | <ul><li>expected intent to be `query_invoices` but found `query_payment` (originally `query_acct_name`)</li><li>Expected to find one of entities {bill_from, pay_to} but found only {date_mod, datetime, text, acct_name}.</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _what kind of money is my landlord demanding this month_ | <ul><li>expected intent to be `query_invoices` but found `query_payment` (originally `query_acct_name`)</li><li>Expected to find one of entities {bill_from, pay_to} but found only {text, date_mod, datetime, wp, acct_name}.</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _what will be the value of screw small businesses this year_ | <ul><li>expected intent to be `query_invoices` but found `query_payment` (originally `query_acct_name`)</li><li>Expected to find one of entities {bill_to, pay_from} but found only {date_mod, datetime, text, acct_name}.</li><li>Response: _I can't find that company in your account._</li></ul> |
| (**g**) _can you compute the total on invoices paid in the second quarter_ | <ul><li>expected intent to be `query_payment` but found `query_invoices`</li><li>Expected to find entity 'closed_mod' but found only {date_mod, datetime_quarter, datetime, text, pay_type, query_total}</li><li>Response: _I'll look right now…{nl} I've inspected and in Q2 2018 you received 21 invoices totalling £10,938.95, all of which have been paid and you raised 27 invoices totalling £35,869.50, all of which have been paid._</li></ul> |
| (**g**) _have you got anything on bills paid in the last 4 weeks_ | <ul><li>expected intent to be `query_payment` but found `query_invoices`</li><li>Response: _I'm determining that now…{nl} I've had a quick look and in the last four weeks you received five invoices totalling £2,370.32, none of which have been paid and you raised two invoices totalling £848.92, none of which have been paid._</li></ul> |
| _it would be good to know the total of bills paid last qtr_<details><summary>After pre-processing</summary>_it would be good to know the total of bills paid last quarter_</details> | <ul><li>expected intent to be `query_payment` but found `query_invoices`</li><li>Response: _Right, let me just look…{nl} I've gone through the most recent figures and in Q3 2018 you received 23 invoices totalling £8,212.60, all of which have been paid and you raised 27 invoices totalling £32,335.40, all of which have been paid._</li></ul> |
|
test
|
intent errors gold expression result is the business account close to the overdraft limit expected intent to be query balance but found query payment expected to find entity overdraft limit but found only text pay to vbz acct type response i can t find that company in your account are the savings account funds doing alright after pre procesing are the savings account funds doing ok expected intent to be query balance but found query payment originally query acct name account type reference mismatch hairsp looking for savings account found none response i can t find that company in your account g by what date should i pay alpha ltd expected intent to be query invoices but found query payment response right let me just look… nl i ve gone through the most recent figures and your last payment to alpha ltd was £ on the october you have three unpaid invoices from them totalling £ all of which are overdue g does anyone expect payment from me today expected intent to be query invoices but found query payment expected to find entity datetime now but found only datetime text pay to response ok hold on i m checking now… nl i ve determined how things are there were no outgoing transactions for today g has anyone given us a payment deadline expected intent to be query invoices but found query payment expected to find one of entities bill from pay to but found only text paid yet date in date mod datetime response let s check that out for you… nl all right i ve taken stock and your last payment was to delta ltd for £ on the december g how much money from me are people waiting on expected intent to be query invoices but found query payment response ok hold on i m checking now… nl i ve just reviewed your accounts and since november your outgoings were nine payments totalling £ i d like to see every sale i ve made after pre procesing i would like to see every sale i have made expected intent to be query invoices but found query payment response ok let s see… nl i ve had a scoot 
around and your last received payment was from zeta and son for £ on the december g is anyone expecting money from me expected intent to be query invoices but found query payment response let s check that out for you… nl i ve evaluated the state of affairs and since november your spending was nine payments totalling £ g is anyone expecting money from me right now expected intent to be query invoices but found query payment expected to find entity datetime now but found only text money pay to date in date mod datetime response righty ho let s take a look… nl i ve had a scoot around and since november your outgoings were nine payments totalling £ g what number of outstanding invoices do we have expected intent to be query invoices but found query payment expected to find entity open mod but found only unknown entity bill from date in date mod datetime response regrettably i didn t recognise everything you said however since november your expenditure was nine payments totalling £ when is the money from abc ltd coming in expected intent to be query invoices but found query payment expected to find entity open mod but found only text pay from money response ok i m on the case… nl after checking i can verify that there were no incoming transactions for abc ltd g when will i get the money in from abc ltd expected intent to be query invoices but found query payment expected to find entity open mod but found only when text pay from money response ok i m examining that now… nl all right i ve taken stock and you have never received payment from abc ltd you have one unpaid bill to them which is for the amount of £ and was due on the december when will i recieve the money from abc ltd after pre procesing when will i receive the money from abc ltd expected intent to be query invoices but found query payment expected to find entity open mod but found only when text pay from money response let s check that out for you… nl i ve analysed the state of play and you have never received 
payment from abc ltd you have one unpaid bill to them which is for the amount of £ and was due on the december g which bills do you think we should pay this week expected intent to be query invoices but found query payment response right let me just look… nl i ve performed a quick investigation and there were no outgoing bill transactions this week g which invoices should we pay on wed expected intent to be query invoices but found query payment expected to find entity open mod but found only datetime text pay to paid yet dt pay type response let me check that out… nl well i ve checked the most up to date records and there were no outgoing creditor invoice transactions on wednesday december g which supplier invoices have yet to be paid expected intent to be query invoices but found query payment response just a second i m exploring now… nl i can see your last paid an invoice on the december for the amount of £ how much does my landlord want this month expected intent to be query invoices but found query payment originally query acct name expected to find one of entities bill from pay to but found only date mod datetime text acct name response i can t find that company in your account g what kind of money is my landlord demanding this month expected intent to be query invoices but found query payment originally query acct name expected to find one of entities bill from pay to but found only text date mod datetime wp acct name response i can t find that company in your account g what will be the value of screw small businesses this year expected intent to be query invoices but found query payment originally query acct name expected to find one of entities bill to pay from but found only date mod datetime text acct name response i can t find that company in your account g can you compute the total on invoices paid in the second quarter expected intent to be query payment but found query invoices expected to find entity closed mod but found only date mod datetime 
quarter datetime text pay type query total response i ll look right now… nl i ve inspected and in you received invoices totalling £ all of which have been paid and you raised invoices totalling £ all of which have been paid g have you got anything on bills paid in the last weeks expected intent to be query payment but found query invoices response i m determining that now… nl i ve had a quick look and in the last four weeks you received five invoices totalling £ none of which have been paid and you raised two invoices totalling £ none of which have been paid it would be good to know the total of bills paid last qtr after pre procesing it would be good to know the total of bills paid last quarter expected intent to be query payment but found query invoices response right let me just look… nl i ve gone through the most recent figures and in you received invoices totalling £ all of which have been paid and you raised invoices totalling £ all of which have been paid
| 1
|
26,485
| 4,227,165,351
|
IssuesEvent
|
2016-07-03 00:32:08
|
exercism/todo
|
https://api.github.com/repos/exercism/todo
|
closed
|
Extract shared inputs/outputs for exercise Sublist
|
shared-test-data
|
Goal: https://github.com/exercism/todo/issues/13
--------------
**Skillset:** This is mostly a matter of reading code (test suites) in several languages and creating some JSON.
---------------
Create a file `sublist.json` containing the inputs and expected outputs for Sublist.
Sublist has been implemented in the following languages:
- https://github.com/exercism/xelixir/tree/master/sublist
- https://github.com/exercism/xhaskell/tree/master/sublist
- https://github.com/exercism/xperl5/tree/master/sublist
- https://github.com/exercism/xpython/tree/master/sublist
- https://github.com/exercism/xscala/tree/master/sublist
See the following files for some examples:
- https://github.com/exercism/x-common/blob/master/bob.json
- https://github.com/exercism/x-common/blob/master/clock.json
- https://github.com/exercism/x-common/blob/master/custom-set.json
- https://github.com/exercism/x-common/blob/master/gigasecond.json
- https://github.com/exercism/x-common/blob/master/hamming.json
- https://github.com/exercism/x-common/blob/master/leap.json
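Since the issue asks for a `sublist.json` without fixing a schema, the sketch below shows one plausible shape for the shared cases, together with a reference implementation used to sanity-check the expected outputs. The field names (`description`, `listOne`, `listTwo`, `expected`) are assumptions modelled loosely on the other x-common files, not an agreed format.

```python
# Hypothetical shape for sublist.json; the real field names would need to
# follow whatever convention the other x-common files settle on.
cases = [
    {"description": "empty list within non-empty list",
     "listOne": [], "listTwo": [1, 2, 3], "expected": "sublist"},
    {"description": "equal lists",
     "listOne": [1, 2, 3], "listTwo": [1, 2, 3], "expected": "equal"},
    {"description": "proper superlist",
     "listOne": [1, 2, 3], "listTwo": [2, 3], "expected": "superlist"},
    {"description": "disjoint lists",
     "listOne": [1, 2], "listTwo": [3, 4], "expected": "unequal"},
]

def classify(a, b):
    """Reference answer: the relationship of list a to list b."""
    def contains(big, small):
        # True when small appears as a contiguous run inside big.
        n = len(small)
        return any(big[i:i + n] == small for i in range(len(big) - n + 1))
    if a == b:
        return "equal"
    if contains(b, a):
        return "sublist"
    if contains(a, b):
        return "superlist"
    return "unequal"

# Sanity-check every expected output against the reference implementation.
for case in cases:
    assert classify(case["listOne"], case["listTwo"]) == case["expected"]
```

Cross-checking the expected outputs against a reference implementation like this catches transcription mistakes before the JSON is shared across the five language tracks.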
|
1.0
|
Extract shared inputs/outputs for exercise Sublist - Goal: https://github.com/exercism/todo/issues/13
--------------
**Skillset:** This is mostly a matter of reading code (test suites) in several languages and creating some JSON.
---------------
Create a file `sublist.json` containing the inputs and expected outputs for Sublist.
Sublist has been implemented in the following languages:
- https://github.com/exercism/xelixir/tree/master/sublist
- https://github.com/exercism/xhaskell/tree/master/sublist
- https://github.com/exercism/xperl5/tree/master/sublist
- https://github.com/exercism/xpython/tree/master/sublist
- https://github.com/exercism/xscala/tree/master/sublist
See the following files for some examples:
- https://github.com/exercism/x-common/blob/master/bob.json
- https://github.com/exercism/x-common/blob/master/clock.json
- https://github.com/exercism/x-common/blob/master/custom-set.json
- https://github.com/exercism/x-common/blob/master/gigasecond.json
- https://github.com/exercism/x-common/blob/master/hamming.json
- https://github.com/exercism/x-common/blob/master/leap.json
|
test
|
extract shared inputs outputs for exercise sublist goal skillset this is mostly a matter of reading code test suites in several languages and creating some json create a file sublist json containing the inputs and expected outputs for sublist sublist has been implemented in the following languages see the following files for some examples
| 1
|
294,286
| 25,359,614,891
|
IssuesEvent
|
2022-11-20 18:47:36
|
carlonicora/obsidian-rpg-manager
|
https://api.github.com/repos/carlonicora/obsidian-rpg-manager
|
opened
|
[Minor Bugs] 3.3 Beta Testing Minor Issues Thread
|
Bug Testing
|
## Scope
This thread is for very minor issues. Typos, minor graphical glitches, and other small things that can be fixed in one go.
Major issues, such as a broad "This is absolutely not working", should continue to be reported in their own threads, as in-depth back-and-forth conversation may need to happen.
### How to Report
1. Each comment contains one issue.
2. Include in your comment the issue in a checkbox format:
- [ ] Typo on this page
3. Include screenshots if able
4. Include whether you are on mobile or desktop, and whether you are running a test vault with a custom theme and other add-ons enabled.
|
1.0
|
[Minor Bugs] 3.3 Beta Testing Minor Issues Thread - ## Scope
This thread is for very minor issues. Typos, minor graphical glitches, and other small things that can be fixed in one go.
Major issues, such as a broad "This is absolutely not working", should continue to be reported in their own threads, as in-depth back-and-forth conversation may need to happen.
### How to Report
1. Each comment contains one issue.
2. Include in your comment the issue in a checkbox format:
- [ ] Typo on this page
3. Include screenshots if able
4. Include whether you are on mobile or desktop, and whether you are running a test vault with a custom theme and other add-ons enabled.
|
test
|
beta testing minor issues thread scope this thread is for very minor issues typos minor graphical glitches and other small things that can be fixed in one go major issues such as large this is absolutely not working should continue to be reported in their own threads as in depth back and forth conversation may need to happen how to report each comment contains one issue include in your comment the issue in a checkbox format typo on this page include screenshots if able include whether mobile or desktop and if running a test vault that has a custom theme and other adding enabled
| 1
|
375,544
| 11,104,940,742
|
IssuesEvent
|
2019-12-17 08:49:25
|
internetarchive/openlibrary
|
https://api.github.com/repos/internetarchive/openlibrary
|
closed
|
Add GitHub tips for beginners
|
Priority 3: Normal State: Work In Progress Theme: Development Type: Feature
|
Getting started as a contributor can be tricky and some people have trouble with setting up a repository if they are new to GitHub.
### Describe the solution you'd like
Per community call discussion on 10 December 2019 I've created a step-by-step guide for setting up an openlibrary repository for beginners. This PR is for modifying the CONTRIBUTING file with the link to the [tips wiki page](https://github.com/internetarchive/openlibrary/wiki/New-to-Git%3F).
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
<!-- Which suggestions or requirements should be considered for how feature needs to appear or be implemented? -->
### Additional context
<!-- Add any other context or screenshots about the feature request here. -->
### Stakeholders
Link added per @cdrini 's request.
|
1.0
|
Add GitHub tips for beginners - Getting started as a contributor can be tricky and some people have trouble with setting up a repository if they are new to GitHub.
### Describe the solution you'd like
Per community call discussion on 10 December 2019 I've created a step-by-step guide for setting up an openlibrary repository for beginners. This PR is for modifying the CONTRIBUTING file with the link to the [tips wiki page](https://github.com/internetarchive/openlibrary/wiki/New-to-Git%3F).
<!-- What is the proposed solution / implementation? Is there a precedent of this approach succeeding elsewhere? -->
<!-- Which suggestions or requirements should be considered for how feature needs to appear or be implemented? -->
### Additional context
<!-- Add any other context or screenshots about the feature request here. -->
### Stakeholders
Link added per @cdrini 's request.
|
non_test
|
add github tips for beginners getting started as a contributor can be tricky and some people have trouble with setting up a repository if they are new to github describe the solution you d like per community call discussion on december i ve created a step by step guide for setting up an openlibrary repository for beginners this pr is for modifying the contributing file with the link to the additional context stakeholders link added per cdrini s request
| 0
|
2,402
| 2,684,244,726
|
IssuesEvent
|
2015-03-28 19:59:46
|
gregorio-project/gregorio
|
https://api.github.com/repos/gregorio-project/gregorio
|
closed
|
\includescore with option semantics have changed 2.4.2
|
documentation
|
In 2.4.2, the `[f]` option for `\includescore` returned that macro to its original semantics. In 3.0.0, the equivalent option is `[n]`, and `[f]` actually forces the new autocompile semantics. Should this be explicitly pointed out in `UPGRADE.md` and/or `CHANGELOG.md`?
~~Note: the earlier fix for scribus added `[f]` and should now use `[n]` instead. I will create a pull request to fix that.~~
(edited: the Scribus scripts have already been corrected)
|
1.0
|
\includescore with option semantics have changed 2.4.2 - In 2.4.2, the `[f]` option for `\includescore` returned that macro to its original semantics. In 3.0.0, the equivalent option is `[n]`, and `[f]` actually forces the new autocompile semantics. Should this be explicitly pointed out in `UPGRADE.md` and/or `CHANGELOG.md`?
~~Note: the earlier fix for scribus added `[f]` and should now use `[n]` instead. I will create a pull request to fix that.~~
(edited: the Scribus scripts have already been corrected)
|
non_test
|
includescore with option semantics have changed in the option for includescore returned that macro it its original semantics in the equivalent option is and actually forces the new autocompile semantics should this be explicitly pointed out in upgrade md and or changelog md note the earlier fix for scribus added and should now use instead i will create a pull request to fix that edited the scribus scripts have already been corrected
| 0
|
162,399
| 12,664,474,354
|
IssuesEvent
|
2020-06-18 04:47:10
|
BEXIS2/Core
|
https://api.github.com/repos/BEXIS2/Core
|
closed
|
Import xsd - Fail
|
High TestQuality bug resolution_Fixed
|
the system is not able to convert the root element to a group because it is an element: z:617 - XsdSchemamanager
|
1.0
|
Import xsd - Fail - the system is not able to convert the root element to a group because it is an element: z:617 - XsdSchemamanager
|
test
|
import xsd fail the system is not able convert the root element to a group because its a element z xsdschemamanager
| 1
|
349,346
| 31,794,477,619
|
IssuesEvent
|
2023-09-13 07:02:26
|
HSLdevcom/jore4
|
https://api.github.com/repos/HSLdevcom/jore4
|
closed
|
Add component tests for day type timetable card
|
frontend Timetables testing chore
|
https://github.com/HSLdevcom/jore4/issues/1155
Update on 20.6.2023: Add component tests for the timetable card on line timetables.
I think this card was meant to add component tests for `VehicleJourneyGroupInfo`, since that is what contains the information from #1155
|
1.0
|
Add component tests for day type timetable card - https://github.com/HSLdevcom/jore4/issues/1155
Update on 20.6.2023: Add component tests for the timetable card on line timetables.
I think this card was meant to add component tests for `VehicleJourneyGroupInfo`, since that is what contains the information from #1155
|
test
|
add component tests for day type timetable card update on add component tests for the timetable card on line timetables i think this card was meant to add component tests for vehiclejourneygroupinfo since that is what contains the information from
| 1
|
527,718
| 15,351,225,714
|
IssuesEvent
|
2021-03-01 04:33:40
|
magento/magento2
|
https://api.github.com/repos/magento/magento2
|
closed
|
shipping information API validation
|
Component: Webapi Issue: Confirmed Priority: P3 Progress: ready for dev Reproduced on 2.2.x Reproduced on 2.3.x Severity: S3 stale issue
|
<!---
Thank you for contributing to Magento.
To help us process this issue we recommend that you add the following information:
- Summary of the issue,
- Information on your environment,
- Steps to reproduce,
- Expected and actual results,
Please also have a look at our guidelines article before adding a new issue https://github.com/magento/magento2/wiki/Issue-reporting-guidelines
-->
### Preconditions
<!---
Please provide as detailed information about your environment as possible.
For example Magento version, tag, HEAD, PHP & MySQL version, etc..
-->
1. Magento 2.2.2
### Steps to reproduce
<!---
It is important to provide a set of clear steps to reproduce this bug.
If relevant please include code samples
-->
1. Generate cart using POST rest/V1/guest-carts
2. Add product to cart POST rest/V1/guest-carts/[cart-id]/items
**REQUEST**
`
{
"cartItem": {
"sku": "[sku]",
"qty": 1,
"quote_id": "[cart-id]"
}
}
`
3. Make a request to endpoint POST rest/V1/guest-carts/[:cart-id]/shipping-information
**REQUEST**
```
{
"addressInformation":{
"shippingAddress":{
"country_id":"IN",
"region":"XYZ",
"region_id":0,
"street":["test"],
"telephone":"99040503256",
"postcode":"1234",
"city":"",
"firstname":"Kandarp",
"lastname":"Patel",
"middlename":"Test"
},
"shipping_method_code":"flatrate",
"shipping_carrier_code":"flatrate"
}
}
```
### Expected result
<!--- Tell us what should happen -->
1. The API should not allow saving the shipping address, as the city is blank.
### Actual result
<!--- Tell us what happens instead -->
1. The API allows the shipping address to be saved anyway.
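The expected behavior can be sketched as a small server-side check that rejects blank required fields. This is a minimal illustration, not Magento's actual validation code; the field list and function name are hypothetical:

```python
# Hypothetical sketch of the missing validation: report required shipping
# address fields that are absent or blank (including blank list entries).
REQUIRED_FIELDS = (
    "country_id", "street", "city", "firstname",
    "lastname", "telephone", "postcode",
)

def validate_shipping_address(address: dict) -> list:
    """Return the names of required fields that are missing or blank."""
    missing = []
    for field in REQUIRED_FIELDS:
        value = address.get(field)
        if value is None or (isinstance(value, str) and not value.strip()):
            missing.append(field)
        elif isinstance(value, list) and not any(str(v).strip() for v in value):
            missing.append(field)
    return missing

# The request body from the reproduction steps: city is blank.
address = {
    "country_id": "IN", "street": ["test"], "telephone": "99040503256",
    "postcode": "1234", "city": "", "firstname": "Kandarp", "lastname": "Patel",
}
print(validate_shipping_address(address))  # ['city']
```

With a check like this, the request in step 3 would be rejected instead of silently saving an address with an empty city.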
|
1.0
|
shipping information API validation - <!---
Thank you for contributing to Magento.
To help us process this issue we recommend that you add the following information:
- Summary of the issue,
- Information on your environment,
- Steps to reproduce,
- Expected and actual results,
Please also have a look at our guidelines article before adding a new issue https://github.com/magento/magento2/wiki/Issue-reporting-guidelines
-->
### Preconditions
<!---
Please provide as detailed information about your environment as possible.
For example Magento version, tag, HEAD, PHP & MySQL version, etc..
-->
1. Magento 2.2.2
### Steps to reproduce
<!---
It is important to provide a set of clear steps to reproduce this bug.
If relevant please include code samples
-->
1. Generate cart using POST rest/V1/guest-carts
2. Add product to cart POST rest/V1/guest-carts/[cart-id]/items
**REQUEST**
`
{
"cartItem": {
"sku": "[sku]",
"qty": 1,
"quote_id": "[cart-id]"
}
}
`
3. Make a request to endpoint POST rest/V1/guest-carts/[:cart-id]/shipping-information
**REQUEST**
```
{
"addressInformation":{
"shippingAddress":{
"country_id":"IN",
"region":"XYZ",
"region_id":0,
"street":["test"],
"telephone":"99040503256",
"postcode":"1234",
"city":"",
"firstname":"Kandarp",
"lastname":"Patel",
"middlename":"Test"
},
"shipping_method_code":"flatrate",
"shipping_carrier_code":"flatrate"
}
}
```
### Expected result
<!--- Tell us what should happen -->
1. The API should not allow saving the shipping address, as the city is blank.
### Actual result
<!--- Tell us what happens instead -->
1. The API allows the shipping address to be saved anyway.
|
non_test
|
shipping information api validation thank you for contributing to magento to help us process this issue we recommend that you add the following information summary of the issue information on your environment steps to reproduce expected and actual results please also have a look at our guidelines article before adding a new issue preconditions please provide as detailed information about your environment as possible for example magento version tag head php mysql version etc magento steps to reproduce it is important to provide a set of clear steps to reproduce this bug if relevant please include code samples generate cart using post rest guest carts add product to cart post rest guest carts items request cartitem sku qty quote id make a request to endpoint post rest guest carts shipping information request addressinformation shippingaddress country id in region xyz region id street telephone postcode city firstname kandarp lastname patel middlename test shipping method code flatrate shipping carrier code flatrate expected result api should not allow the save shipping address as city is blank actual result api allow to save save shipping address
| 0
|
116,601
| 17,379,925,631
|
IssuesEvent
|
2021-07-31 13:41:32
|
sap-labs-france/ev-server
|
https://api.github.com/repos/sap-labs-france/ev-server
|
closed
|
Security > Integrate SAP IDM role management
|
feature security
|
That permits us to have a workflow that tracks I- / D- and C-User role attribution in the application in an SGS-compliant way. It will be needed to validate the security concepts approval. Probably a per-tenant setting. The integration details will have to be thought through.
SAP IDM: https://jam4.sapjam.com/groups/O44DvPsGH6fPFhSDlWR2ZN/overview_page/jzddw6bx5xGEtpHAZRHrHT
|
True
|
Security > Integrate SAP IDM role management - That permits us to have a workflow that tracks I- / D- and C-User role attribution in the application in an SGS-compliant way. It will be needed to validate the security concepts approval. Probably a per-tenant setting. The integration details will have to be thought through.
SAP IDM: https://jam4.sapjam.com/groups/O44DvPsGH6fPFhSDlWR2ZN/overview_page/jzddw6bx5xGEtpHAZRHrHT
|
non_test
|
security integrate sap idm role management that permits to have a worklfow that tracks i d and c user role attribution in the application in a sgs compliant way it will be needed to validate the security concepts approval probably a per tenant setting the integration details will have to be thought sap idm
| 0
|
175,265
| 21,300,911,474
|
IssuesEvent
|
2022-04-15 02:53:32
|
turkdevops/web.dev
|
https://api.github.com/repos/turkdevops/web.dev
|
opened
|
CVE-2021-43138 (High) detected in multiple libraries
|
security vulnerability
|
## CVE-2021-43138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>async-2.6.3.tgz</b>, <b>async-3.2.0.tgz</b>, <b>async-1.5.2.tgz</b></p></summary>
<p>
<details><summary><b>async-2.6.3.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-2.6.3.tgz">https://registry.npmjs.org/async/-/async-2.6.3.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/streamroller/node_modules/async/package.json,/node_modules/portfinder/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.4.1.tgz (Root Library)
- log4js-4.5.1.tgz
- streamroller-1.0.6.tgz
- :x: **async-2.6.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-3.2.0.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-3.2.0.tgz">https://registry.npmjs.org/async/-/async-3.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/archiver/node_modules/async/package.json,/node_modules/winston/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- firebase-tools-9.16.0.tgz (Root Library)
- winston-3.3.3.tgz
- :x: **async-3.2.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-1.5.2.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.5.2.tgz">https://registry.npmjs.org/async/-/async-1.5.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- eleventy-0.12.1.tgz (Root Library)
- browser-sync-2.27.4.tgz
- portscanner-2.1.1.tgz
- :x: **async-1.5.2.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2), which could let a malicious user obtain privileges via the mapValues() method.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution (async): 3.2.2</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p><p>Fix Resolution (async): 3.2.2</p>
<p>Direct dependency fix Resolution (@11ty/eleventy): 2.0.0-canary.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
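If upgrading the root libraries is not immediately possible, one stop-gap (a sketch, assuming npm 8.3+ with `overrides` support, and assuming nothing in the tree relies on async 1.x/2.x-only behavior) is to pin the transitive dependency in package.json:

```json
{
  "overrides": {
    "async": "3.2.2"
  }
}
```

Because async 1.5.2 and 2.6.3 sit behind major-version boundaries, forcing 3.2.2 can break consumers; the direct-dependency upgrades listed in the suggested fix (karma 5.0.8, @11ty/eleventy 2.0.0-canary.1) are the safer route.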
|
True
|
CVE-2021-43138 (High) detected in multiple libraries - ## CVE-2021-43138 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>async-2.6.3.tgz</b>, <b>async-3.2.0.tgz</b>, <b>async-1.5.2.tgz</b></p></summary>
<p>
<details><summary><b>async-2.6.3.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-2.6.3.tgz">https://registry.npmjs.org/async/-/async-2.6.3.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/streamroller/node_modules/async/package.json,/node_modules/portfinder/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- karma-4.4.1.tgz (Root Library)
- log4js-4.5.1.tgz
- streamroller-1.0.6.tgz
- :x: **async-2.6.3.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-3.2.0.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-3.2.0.tgz">https://registry.npmjs.org/async/-/async-3.2.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/archiver/node_modules/async/package.json,/node_modules/winston/node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- firebase-tools-9.16.0.tgz (Root Library)
- winston-3.3.3.tgz
- :x: **async-3.2.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>async-1.5.2.tgz</b></p></summary>
<p>Higher-order functions and common patterns for asynchronous code</p>
<p>Library home page: <a href="https://registry.npmjs.org/async/-/async-1.5.2.tgz">https://registry.npmjs.org/async/-/async-1.5.2.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/async/package.json</p>
<p>
Dependency Hierarchy:
- eleventy-0.12.1.tgz (Root Library)
- browser-sync-2.27.4.tgz
- portscanner-2.1.1.tgz
- :x: **async-1.5.2.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>main</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A vulnerability exists in Async through 3.2.1 (fixed in 3.2.2), which could let a malicious user obtain privileges via the mapValues() method.
<p>Publish Date: 2022-04-06
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-43138>CVE-2021-43138</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2021-43138">https://nvd.nist.gov/vuln/detail/CVE-2021-43138</a></p>
<p>Release Date: 2022-04-06</p>
<p>Fix Resolution (async): 3.2.2</p>
<p>Direct dependency fix Resolution (karma): 5.0.8</p><p>Fix Resolution (async): 3.2.2</p>
<p>Direct dependency fix Resolution (@11ty/eleventy): 2.0.0-canary.1</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries async tgz async tgz async tgz async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules streamroller node modules async package json node modules portfinder node modules async package json dependency hierarchy karma tgz root library tgz streamroller tgz x async tgz vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules archiver node modules async package json node modules winston node modules async package json dependency hierarchy firebase tools tgz root library winston tgz x async tgz vulnerable library async tgz higher order functions and common patterns for asynchronous code library home page a href path to dependency file package json path to vulnerable library node modules async package json dependency hierarchy eleventy tgz root library browser sync tgz portscanner tgz x async tgz vulnerable library found in base branch main vulnerability details a vulnerability exists in async through fixed in which could let a malicious user obtain privileges via the mapvalues method publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution async direct dependency fix resolution karma fix resolution async direct dependency fix resolution eleventy canary step up your open source security game with whitesource
| 0
|
326,421
| 27,990,268,832
|
IssuesEvent
|
2023-03-27 02:39:10
|
Sars9588/mywebclass-simulation
|
https://api.github.com/repos/Sars9588/mywebclass-simulation
|
closed
|
Testing clicking home button in navigation path of Our Story page
|
Test
|
Name of Test Developer: Meet
Test Name: Clicking the home button in the navigation path of the Our Story page
Test Type: Button Click
|
1.0
|
Testing clicking home button in navigation path of Our Story page - Name of Test Developer: Meet
Test Name: Clicking the home button in the navigation path of the Our Story page
Test Type: Button Click
|
test
|
testing clicking home button in navigation path of our story page name of test developer meet test name clicking the home button in the navigation path of the our story page test type button click
| 1
|
400,754
| 11,780,298,550
|
IssuesEvent
|
2020-03-16 19:46:05
|
pytorch/pytorch
|
https://api.github.com/repos/pytorch/pytorch
|
reopened
|
[Feature Request] Make nn layers accept empty batch size
|
feature high priority module: nn module: operators triaged
|
Now that we have support for tensors with zero in its size, I believe it would be very handy to have support for accepting batches of size 0 in `nn.functional` functions.
A (non-exhaustive) list of functions that would be good supporting:
- [x] `conv{1-2-3}d`
- [x] `conv_transpose{1-2-3}d`
- [x] `batch_norm`
- [x] `interpolate`
Handling the losses is a bit trickier, because it generally involves computing a `.mean()`, which results in `NaN` due to 0 / 0 division. I'd expect having a 0 loss for empty batches to make sense, but that's debatable so might be worth postponing this decision.
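The 0 / 0 issue can be sketched in plain Python (standing in for the tensor `.mean()` call; this is not PyTorch's actual implementation, just an illustration of the proposed "0 loss for empty batches" convention):

```python
# Averaging zero per-sample losses is a 0 / 0 division, so an explicit
# guard defines the empty-batch loss as 0 instead.
def batch_mean_loss(per_sample_losses: list) -> float:
    if not per_sample_losses:  # empty batch: define the loss as 0
        return 0.0
    return sum(per_sample_losses) / len(per_sample_losses)

print(batch_mean_loss([0.5, 1.5]))  # 2.0 / 2 -> 1.0
print(batch_mean_loss([]))          # guarded: 0.0 instead of NaN / ZeroDivisionError
```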
|
1.0
|
[Feature Request] Make nn layers accept empty batch size - Now that we have support for tensors with zero in its size, I believe it would be very handy to have support for accepting batches of size 0 in `nn.functional` functions.
A (non-exhaustive) list of functions that would be good supporting:
- [x] `conv{1-2-3}d`
- [x] `conv_transpose{1-2-3}d`
- [x] `batch_norm`
- [x] `interpolate`
Handling the losses is a bit trickier, because it generally involves computing a `.mean()`, which results in `NaN` due to 0 / 0 division. I'd expect having a 0 loss for empty batches to make sense, but that's debatable so might be worth postponing this decision.
|
non_test
|
make nn layers accept empty batch size now that we have support for tensors with zero in its size i believe it would be very handy to have support for accepting batches of size in nn functional functions a non exhaustive list of functions that would be good supporting conv d conv transpose d batch norm interpolate handling the losses is a bit trickier because it generally involves computing a mean which results in nan due to division i d expect having a loss for empty batches to make sense but that s debatable so might be worth postponing this decision
| 0
|
40,363
| 9,967,792,540
|
IssuesEvent
|
2019-07-08 14:19:16
|
ShaikASK/Testing
|
https://api.github.com/repos/ShaikASK/Testing
|
opened
|
FF Browser : Document : Map Fields : Two Vertical Scroll bar is being displayed
|
Beta Release #5 Defect Document Map fields HR Admin Module HR User Module P3
|
Steps To Replicate:
1. Switch to 1024 x 768 resolution
2. Launch the URL
3. Sign in as an HR admin user
4. Upload a document and click on Save and Map Fields
Experienced Behavior: Observed that two vertical scroll bars are displayed on the Map Fields screen when checked with the Firefox browser
Expected Behavior: Ensure that two vertical scroll bars are not displayed on the Map Fields screen


|
1.0
|
FF Browser : Document : Map Fields : Two Vertical Scroll bar is being displayed - Steps To Replicate:
1. Switch to 1024 x 768 resolution
2. Launch the URL
3. Sign in as an HR admin user
4. Upload a document and click on Save and Map Fields
Experienced Behavior: Observed that two vertical scroll bars are displayed on the Map Fields screen when checked with the Firefox browser
Expected Behavior: Ensure that two vertical scroll bars are not displayed on the Map Fields screen


|
non_test
|
ff browser document map fields two vertical scroll bar is being displayed steps to replicate switch to x resolution launch the url sign in hr admin user upload a document and click on save and map fields experienced behavior observed that two vertical scroll bar is being displayed against map fields screen when check with firefox browser expected behavior ensure that two vertical scroll bar should not be displayed against map fields screen
| 0
|
436,581
| 12,550,967,507
|
IssuesEvent
|
2020-06-06 13:07:10
|
googleapis/elixir-google-api
|
https://api.github.com/repos/googleapis/elixir-google-api
|
opened
|
Synthesis failed for IdentityToolkit
|
api: identitytoolkit autosynth failure priority: p1 type: bug
|
Hello! Autosynth couldn't regenerate IdentityToolkit. :broken_heart:
Here's the output from running `synth.py`:
```
)
* Getting mime (Hex package)
* Getting google_gax (Hex package)
[33mThe mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.[0m
==> temp
Compiling 3 files (.ex)
Generated temp app
===> Compiling parse_trans
===> Compiling mimerl
===> Compiling metrics
===> Compiling unicode_util_compat
===> Compiling idna
==> jason
Compiling 8 files (.ex)
Generated jason app
warning: String.strip/1 is deprecated. Use String.trim/1 instead
/workspace/deps/poison/mix.exs:4
==> poison
Compiling 4 files (.ex)
warning: Integer.to_char_list/2 is deprecated. Use Integer.to_charlist/2 instead
lib/poison/encoder.ex:173
Generated poison app
==> ssl_verify_fun
Compiling 7 files (.erl)
Generated ssl_verify_fun app
===> Compiling certifi
===> Compiling hackney
==> oauth2
Compiling 13 files (.ex)
Generated oauth2 app
==> mime
Compiling 2 files (.ex)
Generated mime app
==> tesla
Compiling 26 files (.ex)
Generated tesla app
==> google_gax
Compiling 5 files (.ex)
Generated google_gax app
==> google_api_discovery
Compiling 21 files (.ex)
Generated google_api_discovery app
==> google_apis
Compiling 27 files (.ex)
warning: System.cwd/0 is deprecated. Use File.cwd/0 instead
lib/google_apis/publisher.ex:24
Generated google_apis app
13:07:05.472 [info] FETCHING: https://identitytoolkit.googleapis.com/$discovery/GOOGLE_REST_SIMPLE_URI?version=v3
13:07:05.590 [info] FETCHING: https://identitytoolkit.googleapis.com/$discovery/rest?version=v3
13:07:05.603 [info] FOUND: https://identitytoolkit.googleapis.com/$discovery/rest?version=v3
Revision check: old=20180723, new=20180723, generating=true
Creating leading directories
Writing CreateAuthUriResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/create_auth_uri_response.ex.
Writing DeleteAccountResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/delete_account_response.ex.
Writing DownloadAccountResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/download_account_response.ex.
Writing EmailLinkSigninResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/email_link_signin_response.ex.
Writing EmailTemplate to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/email_template.ex.
Writing GetAccountInfoResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/get_account_info_response.ex.
Writing GetOobConfirmationCodeResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/get_oob_confirmation_code_response.ex.
Writing GetRecaptchaParamResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/get_recaptcha_param_response.ex.
Writing IdentitytoolkitRelyingpartyCreateAuthUriRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_create_auth_uri_request.ex.
Writing IdentitytoolkitRelyingpartyDeleteAccountRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_delete_account_request.ex.
Writing IdentitytoolkitRelyingpartyDownloadAccountRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_download_account_request.ex.
Writing IdentitytoolkitRelyingpartyEmailLinkSigninRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_email_link_signin_request.ex.
Writing IdentitytoolkitRelyingpartyGetAccountInfoRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_get_account_info_request.ex.
Writing IdentitytoolkitRelyingpartyGetProjectConfigResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_get_project_config_response.ex.
Writing IdentitytoolkitRelyingpartyGetPublicKeysResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_get_public_keys_response.ex.
Writing IdentitytoolkitRelyingpartyResetPasswordRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_reset_password_request.ex.
Writing IdentitytoolkitRelyingpartySendVerificationCodeRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_send_verification_code_request.ex.
Writing IdentitytoolkitRelyingpartySendVerificationCodeResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_send_verification_code_response.ex.
Writing IdentitytoolkitRelyingpartySetAccountInfoRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_set_account_info_request.ex.
Writing IdentitytoolkitRelyingpartySetProjectConfigRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_set_project_config_request.ex.
Writing IdentitytoolkitRelyingpartySetProjectConfigResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_set_project_config_response.ex.
Writing IdentitytoolkitRelyingpartySignOutUserRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_sign_out_user_request.ex.
Writing IdentitytoolkitRelyingpartySignOutUserResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_sign_out_user_response.ex.
Writing IdentitytoolkitRelyingpartySignupNewUserRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_signup_new_user_request.ex.
Writing IdentitytoolkitRelyingpartyUploadAccountRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_upload_account_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyAssertionRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_assertion_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyCustomTokenRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_custom_token_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyPasswordRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_password_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyPhoneNumberRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_phone_number_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyPhoneNumberResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_phone_number_response.ex.
Writing IdpConfig to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/idp_config.ex.
Writing Relyingparty to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/relyingparty.ex.
Writing ResetPasswordResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/reset_password_response.ex.
Writing SetAccountInfoResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/set_account_info_response.ex.
Writing SetAccountInfoResponseProviderUserInfo to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/set_account_info_response_provider_user_info.ex.
Writing SignupNewUserResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/signup_new_user_response.ex.
Writing UploadAccountResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/upload_account_response.ex.
Writing UploadAccountResponseError to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/upload_account_response_error.ex.
Writing UserInfo to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/user_info.ex.
Writing UserInfoProviderUserInfo to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/user_info_provider_user_info.ex.
Writing VerifyAssertionResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_assertion_response.ex.
Writing VerifyCustomTokenResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_custom_token_response.ex.
Writing VerifyPasswordResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_password_response.ex.
Writing Relyingparty to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/api/relyingparty.ex.
Writing connection.ex.
Writing metadata.ex.
Writing mix.exs
Writing README.md
Writing LICENSE
Writing .gitignore
Writing config/config.exs
Writing test/test_helper.exs
13:07:06.038 [info] Found only discovery_revision and/or formatting changes. Not significant enough for a PR.
fixing file permissions
2020-06-06 06:07:09,130 synthtool [DEBUG] > Wrote metadata to clients/identity_toolkit/synth.metadata.
DEBUG:synthtool:Wrote metadata to clients/identity_toolkit/synth.metadata.
2020-06-06 06:07:09,155 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 615, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 476, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 555, in _inner_main
).synthesize(base_synth_log_path)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 121, in synthesize
with open(log_file_path, "rt") as fp:
IsADirectoryError: [Errno 21] Is a directory: '/tmpfs/src/github/synthtool/logs/googleapis/elixir-google-api'
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/567e2ab8-5e4f-4fb0-8bae-9d0cc90aa1af/targets/github%2Fsynthtool;config=default/tests;query=elixir-google-api;failed=false).
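The `IsADirectoryError` in the traceback above comes from `synthesizer.py` calling `open()` on a path that is actually a directory. A minimal sketch of a guard for that failure mode (the fallback filename `sponge_log.log` and the function name are hypothetical, not the real synthtool code):

```python
import os

def read_synth_log(log_file_path):
    """Read a synth log, guarding against the path being a directory
    (the IsADirectoryError failure mode in the traceback above)."""
    if os.path.isdir(log_file_path):
        # Hypothetical recovery: fall back to a log file inside the directory.
        log_file_path = os.path.join(log_file_path, "sponge_log.log")
    if not os.path.isfile(log_file_path):
        return ""
    with open(log_file_path, "rt") as fp:
        return fp.read()
```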
|
1.0
|
Synthesis failed for IdentityToolkit - Hello! Autosynth couldn't regenerate IdentityToolkit. :broken_heart:
Here's the output from running `synth.py`:
```
)
* Getting mime (Hex package)
* Getting google_gax (Hex package)
The mix.lock file was generated with a newer version of Hex. Update your client by running `mix local.hex` to avoid losing data.
==> temp
Compiling 3 files (.ex)
Generated temp app
===> Compiling parse_trans
===> Compiling mimerl
===> Compiling metrics
===> Compiling unicode_util_compat
===> Compiling idna
==> jason
Compiling 8 files (.ex)
Generated jason app
warning: String.strip/1 is deprecated. Use String.trim/1 instead
/workspace/deps/poison/mix.exs:4
==> poison
Compiling 4 files (.ex)
warning: Integer.to_char_list/2 is deprecated. Use Integer.to_charlist/2 instead
lib/poison/encoder.ex:173
Generated poison app
==> ssl_verify_fun
Compiling 7 files (.erl)
Generated ssl_verify_fun app
===> Compiling certifi
===> Compiling hackney
==> oauth2
Compiling 13 files (.ex)
Generated oauth2 app
==> mime
Compiling 2 files (.ex)
Generated mime app
==> tesla
Compiling 26 files (.ex)
Generated tesla app
==> google_gax
Compiling 5 files (.ex)
Generated google_gax app
==> google_api_discovery
Compiling 21 files (.ex)
Generated google_api_discovery app
==> google_apis
Compiling 27 files (.ex)
warning: System.cwd/0 is deprecated. Use File.cwd/0 instead
lib/google_apis/publisher.ex:24
Generated google_apis app
13:07:05.472 [info] FETCHING: https://identitytoolkit.googleapis.com/$discovery/GOOGLE_REST_SIMPLE_URI?version=v3
13:07:05.590 [info] FETCHING: https://identitytoolkit.googleapis.com/$discovery/rest?version=v3
13:07:05.603 [info] FOUND: https://identitytoolkit.googleapis.com/$discovery/rest?version=v3
Revision check: old=20180723, new=20180723, generating=true
Creating leading directories
Writing CreateAuthUriResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/create_auth_uri_response.ex.
Writing DeleteAccountResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/delete_account_response.ex.
Writing DownloadAccountResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/download_account_response.ex.
Writing EmailLinkSigninResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/email_link_signin_response.ex.
Writing EmailTemplate to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/email_template.ex.
Writing GetAccountInfoResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/get_account_info_response.ex.
Writing GetOobConfirmationCodeResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/get_oob_confirmation_code_response.ex.
Writing GetRecaptchaParamResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/get_recaptcha_param_response.ex.
Writing IdentitytoolkitRelyingpartyCreateAuthUriRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_create_auth_uri_request.ex.
Writing IdentitytoolkitRelyingpartyDeleteAccountRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_delete_account_request.ex.
Writing IdentitytoolkitRelyingpartyDownloadAccountRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_download_account_request.ex.
Writing IdentitytoolkitRelyingpartyEmailLinkSigninRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_email_link_signin_request.ex.
Writing IdentitytoolkitRelyingpartyGetAccountInfoRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_get_account_info_request.ex.
Writing IdentitytoolkitRelyingpartyGetProjectConfigResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_get_project_config_response.ex.
Writing IdentitytoolkitRelyingpartyGetPublicKeysResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_get_public_keys_response.ex.
Writing IdentitytoolkitRelyingpartyResetPasswordRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_reset_password_request.ex.
Writing IdentitytoolkitRelyingpartySendVerificationCodeRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_send_verification_code_request.ex.
Writing IdentitytoolkitRelyingpartySendVerificationCodeResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_send_verification_code_response.ex.
Writing IdentitytoolkitRelyingpartySetAccountInfoRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_set_account_info_request.ex.
Writing IdentitytoolkitRelyingpartySetProjectConfigRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_set_project_config_request.ex.
Writing IdentitytoolkitRelyingpartySetProjectConfigResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_set_project_config_response.ex.
Writing IdentitytoolkitRelyingpartySignOutUserRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_sign_out_user_request.ex.
Writing IdentitytoolkitRelyingpartySignOutUserResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_sign_out_user_response.ex.
Writing IdentitytoolkitRelyingpartySignupNewUserRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_signup_new_user_request.ex.
Writing IdentitytoolkitRelyingpartyUploadAccountRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_upload_account_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyAssertionRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_assertion_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyCustomTokenRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_custom_token_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyPasswordRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_password_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyPhoneNumberRequest to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_phone_number_request.ex.
Writing IdentitytoolkitRelyingpartyVerifyPhoneNumberResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/identitytoolkit_relyingparty_verify_phone_number_response.ex.
Writing IdpConfig to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/idp_config.ex.
Writing Relyingparty to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/relyingparty.ex.
Writing ResetPasswordResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/reset_password_response.ex.
Writing SetAccountInfoResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/set_account_info_response.ex.
Writing SetAccountInfoResponseProviderUserInfo to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/set_account_info_response_provider_user_info.ex.
Writing SignupNewUserResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/signup_new_user_response.ex.
Writing UploadAccountResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/upload_account_response.ex.
Writing UploadAccountResponseError to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/upload_account_response_error.ex.
Writing UserInfo to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/user_info.ex.
Writing UserInfoProviderUserInfo to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/user_info_provider_user_info.ex.
Writing VerifyAssertionResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_assertion_response.ex.
Writing VerifyCustomTokenResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_custom_token_response.ex.
Writing VerifyPasswordResponse to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/model/verify_password_response.ex.
Writing Relyingparty to clients/identity_toolkit/lib/google_api/identity_toolkit/v3/api/relyingparty.ex.
Writing connection.ex.
Writing metadata.ex.
Writing mix.exs
Writing README.md
Writing LICENSE
Writing .gitignore
Writing config/config.exs
Writing test/test_helper.exs
13:07:06.038 [info] Found only discovery_revision and/or formatting changes. Not significant enough for a PR.
fixing file permissions
2020-06-06 06:07:09,130 synthtool [DEBUG] > Wrote metadata to clients/identity_toolkit/synth.metadata.
DEBUG:synthtool:Wrote metadata to clients/identity_toolkit/synth.metadata.
2020-06-06 06:07:09,155 autosynth [DEBUG] > Running: git clean -fdx
Removing __pycache__/
Traceback (most recent call last):
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 193, in _run_module_as_main
"__main__", mod_spec)
File "/home/kbuilder/.pyenv/versions/3.6.9/lib/python3.6/runpy.py", line 85, in _run_code
exec(code, run_globals)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 615, in <module>
main()
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 476, in main
return _inner_main(temp_dir)
File "/tmpfs/src/github/synthtool/autosynth/synth.py", line 555, in _inner_main
).synthesize(base_synth_log_path)
File "/tmpfs/src/github/synthtool/autosynth/synthesizer.py", line 121, in synthesize
with open(log_file_path, "rt") as fp:
IsADirectoryError: [Errno 21] Is a directory: '/tmpfs/src/github/synthtool/logs/googleapis/elixir-google-api'
```
Google internal developers can see the full log [here](http://sponge2/results/invocations/567e2ab8-5e4f-4fb0-8bae-9d0cc90aa1af/targets/github%2Fsynthtool;config=default/tests;query=elixir-google-api;failed=false).
|
non_test
|
synthesis failed for identitytoolkit hello autosynth couldn t regenerate identitytoolkit broken heart here s the output from running synth py getting mime hex package getting google gax hex package mix lock file was generated with a newer version of hex update your client by running mix local hex to avoid losing data temp compiling files ex generated temp app compiling parse trans compiling mimerl compiling metrics compiling unicode util compat compiling idna jason compiling files ex generated jason app warning string strip is deprecated use string trim instead workspace deps poison mix exs poison compiling files ex warning integer to char list is deprecated use integer to charlist instead lib poison encoder ex generated poison app ssl verify fun compiling files erl generated ssl verify fun app compiling certifi compiling hackney compiling files ex generated app mime compiling files ex generated mime app tesla compiling files ex generated tesla app google gax compiling files ex generated google gax app google api discovery compiling files ex generated google api discovery app google apis compiling files ex warning system cwd is deprecated use file cwd instead lib google apis publisher ex generated google apis app fetching fetching found revision check old new generating true creating leading directories writing createauthuriresponse to clients identity toolkit lib google api identity toolkit model create auth uri response ex writing deleteaccountresponse to clients identity toolkit lib google api identity toolkit model delete account response ex writing downloadaccountresponse to clients identity toolkit lib google api identity toolkit model download account response ex writing emaillinksigninresponse to clients identity toolkit lib google api identity toolkit model email link signin response ex writing emailtemplate to clients identity toolkit lib google api identity toolkit model email template ex writing getaccountinforesponse to clients identity toolkit lib 
google api identity toolkit model get account info response ex writing getoobconfirmationcoderesponse to clients identity toolkit lib google api identity toolkit model get oob confirmation code response ex writing getrecaptchaparamresponse to clients identity toolkit lib google api identity toolkit model get recaptcha param response ex writing identitytoolkitrelyingpartycreateauthurirequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty create auth uri request ex writing identitytoolkitrelyingpartydeleteaccountrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty delete account request ex writing identitytoolkitrelyingpartydownloadaccountrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty download account request ex writing identitytoolkitrelyingpartyemaillinksigninrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty email link signin request ex writing identitytoolkitrelyingpartygetaccountinforequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty get account info request ex writing identitytoolkitrelyingpartygetprojectconfigresponse to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty get project config response ex writing identitytoolkitrelyingpartygetpublickeysresponse to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty get public keys response ex writing identitytoolkitrelyingpartyresetpasswordrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty reset password request ex writing identitytoolkitrelyingpartysendverificationcoderequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty send verification code request ex writing 
identitytoolkitrelyingpartysendverificationcoderesponse to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty send verification code response ex writing identitytoolkitrelyingpartysetaccountinforequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty set account info request ex writing identitytoolkitrelyingpartysetprojectconfigrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty set project config request ex writing identitytoolkitrelyingpartysetprojectconfigresponse to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty set project config response ex writing identitytoolkitrelyingpartysignoutuserrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty sign out user request ex writing identitytoolkitrelyingpartysignoutuserresponse to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty sign out user response ex writing identitytoolkitrelyingpartysignupnewuserrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty signup new user request ex writing identitytoolkitrelyingpartyuploadaccountrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty upload account request ex writing identitytoolkitrelyingpartyverifyassertionrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty verify assertion request ex writing identitytoolkitrelyingpartyverifycustomtokenrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty verify custom token request ex writing identitytoolkitrelyingpartyverifypasswordrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty verify password request ex writing 
identitytoolkitrelyingpartyverifyphonenumberrequest to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty verify phone number request ex writing identitytoolkitrelyingpartyverifyphonenumberresponse to clients identity toolkit lib google api identity toolkit model identitytoolkit relyingparty verify phone number response ex writing idpconfig to clients identity toolkit lib google api identity toolkit model idp config ex writing relyingparty to clients identity toolkit lib google api identity toolkit model relyingparty ex writing resetpasswordresponse to clients identity toolkit lib google api identity toolkit model reset password response ex writing setaccountinforesponse to clients identity toolkit lib google api identity toolkit model set account info response ex writing setaccountinforesponseprovideruserinfo to clients identity toolkit lib google api identity toolkit model set account info response provider user info ex writing signupnewuserresponse to clients identity toolkit lib google api identity toolkit model signup new user response ex writing uploadaccountresponse to clients identity toolkit lib google api identity toolkit model upload account response ex writing uploadaccountresponseerror to clients identity toolkit lib google api identity toolkit model upload account response error ex writing userinfo to clients identity toolkit lib google api identity toolkit model user info ex writing userinfoprovideruserinfo to clients identity toolkit lib google api identity toolkit model user info provider user info ex writing verifyassertionresponse to clients identity toolkit lib google api identity toolkit model verify assertion response ex writing verifycustomtokenresponse to clients identity toolkit lib google api identity toolkit model verify custom token response ex writing verifypasswordresponse to clients identity toolkit lib google api identity toolkit model verify password response ex writing relyingparty to 
clients identity toolkit lib google api identity toolkit api relyingparty ex writing connection ex writing metadata ex writing mix exs writing readme md writing license writing gitignore writing config config exs writing test test helper exs found only discovery revision and or formatting changes not significant enough for a pr fixing file permissions synthtool wrote metadata to clients identity toolkit synth metadata debug synthtool wrote metadata to clients identity toolkit synth metadata autosynth running git clean fdx removing pycache traceback most recent call last file home kbuilder pyenv versions lib runpy py line in run module as main main mod spec file home kbuilder pyenv versions lib runpy py line in run code exec code run globals file tmpfs src github synthtool autosynth synth py line in main file tmpfs src github synthtool autosynth synth py line in main return inner main temp dir file tmpfs src github synthtool autosynth synth py line in inner main synthesize base synth log path file tmpfs src github synthtool autosynth synthesizer py line in synthesize with open log file path rt as fp isadirectoryerror is a directory tmpfs src github synthtool logs googleapis elixir google api google internal developers can see the full log
| 0
|
128,031
| 10,512,198,625
|
IssuesEvent
|
2019-09-27 17:17:16
|
ExchangeUnion/xud-docker
|
https://api.github.com/repos/ExchangeUnion/xud-docker
|
closed
|
xud master password integration
|
P1 enhancement mainnet testnet
|
Now that we pushed back the deadline, let's go for the integration of https://github.com/ExchangeUnion/xud/pull/1006. It'd be good to have some people test this on testnet.
After selecting testnet/mainnet - check if lndbtc wallet exists in `~/.xud-docker/testnet` / `~/.xud-docker/mainnet` (we'll make lndbtc a **must** in all later setups since lndbtc is required anyways to generate the seed).
If wallet does not exist:
* `No existing wallets found. Creating wallets...`
* Call `CreateNode`, `Please set a password (it secures xud and all your wallets):`
* *print potential errors* and exit
* If no errors, continue with `Please write down this 24 word mnemonic. You will be able to recover xud and funds of all your wallets with it, in case you forget your master password or lose your device. Keep it somewhere safe, it's your ONLY backup`: `CreateNode` output
* Did you write down your 24 word mnemonic? You will not be able to view it again after this prompt. [Y/n] n -> show again
* Y -> `All wallets successfully created.`
* start
If lndbtc wallet exists:
* call `unlock`, `Please enter your xud password:`
* start
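The branching described above (create wallets when no lndbtc wallet exists, otherwise unlock) can be sketched as follows; `create_node` and `unlock` are hypothetical stand-ins for the real xud gRPC calls, and the directory layout is assumed from the issue text:

```python
import os

def xud_setup_flow(network_dir, create_node, unlock):
    """Sketch of the setup branching described above.
    `create_node` and `unlock` are hypothetical callables standing in
    for the actual xud API; they are not the real interface."""
    wallet_exists = os.path.isdir(os.path.join(network_dir, "lndbtc"))
    if not wallet_exists:
        print("No existing wallets found. Creating wallets...")
        mnemonic = create_node()  # would prompt for the master password
        return ("created", mnemonic)
    unlock()  # would prompt: "Please enter your xud password:"
    return ("unlocked", None)
```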
|
1.0
|
xud master password integration - Now that we pushed back the deadline, let's go for the integration of https://github.com/ExchangeUnion/xud/pull/1006. It'd be good to have some people test this on testnet.
After selecting testnet/mainnet - check if lndbtc wallet exists in `~/.xud-docker/testnet` / `~/.xud-docker/mainnet` (we'll make lndbtc a **must** in all later setups since lndbtc is required anyways to generate the seed).
If wallet does not exist:
* `No existing wallets found. Creating wallets...`
* Call `CreateNode`, `Please set a password (it secures xud and all your wallets):`
* *print potential errors* and exit
* If no errors, continue with `Please write down this 24 word mnemonic. You will be able to recover xud and funds of all your wallets with it, in case you forget your master password or lose your device. Keep it somewhere safe, it's your ONLY backup`: `CreateNode` output
* Did you write down your 24 word mnemonic? You will not be able to view it again after this prompt. [Y/n] n -> show again
* Y -> `All wallets successfully created.`
* start
If lndbtc wallet exists:
* call `unlock`, `Please enter your xud password:`
* start
|
test
|
xud master password integration now that we pushed back the deadline let s go for the integration of it d be good to have some people test this on testnet after selecting testnet mainnet check if lndbtc wallet exists in xud docker testnet xud docker mainnet we ll make lndbtc a must in all later setups since lndbtc is required anyways to generate the seed if wallet does not exist no existing wallets found creating wallets call createnode please set a password it secures xud and all your wallets print potential errors and exit if no errors continue with please write down this word mnemonic you will be able to recover xud and funds of all your wallets with it in case you forget your master password or lose your device keep it somewhere safe it s your only backup createnode output did you write down your word mnemonic you will not be able to view it again after this prompt n show again y all wallets successfully created start if lndbtc wallet exists call unlock please enter your xud password start
| 1
|
24,574
| 4,098,586,920
|
IssuesEvent
|
2016-06-03 09:00:57
|
CartoDB/cartodb
|
https://api.github.com/repos/CartoDB/cartodb
|
reopened
|
Fix raster tests for multiple CI configs
|
bug test
|
This test is failing in CI in two different ways:
```
08:20:21 1) raster2pgsql acceptance tests keeps the original table unaltered regardless of overviews
08:20:21 Failure/Error: }).first
08:20:21 Sequel::DatabaseError:
08:20:21 PG::Error: ERROR: invalid input syntax for type double precision: "0.148148148148133,10"
08:20:21 CONTEXT: SQL function "_raster_constraint_info_scale" statement 1
08:20:21 # ./services/importer/spec/acceptance/raster2pgsql_spec.rb:134:in `block (2 levels) in <top (required)>'
08:20:21
08:20:21 Finished in 2 minutes 15.09 seconds
08:20:21 29 examples, 1 failure, 7 pending
```
```
08:49:34 Failures:
08:49:34
08:49:34 1) raster2pgsql acceptance tests keeps the original table unaltered regardless of overviews
08:49:34 Failure/Error: metadata[:srid].should eq 4326
08:49:34 NoMethodError:
08:49:34 undefined method `[]' for nil:NilClass
08:49:34 # ./services/importer/spec/acceptance/raster2pgsql_spec.rb:135:in `block (2 levels) in <top (required)>'
08:49:34
08:49:34 Finished in 4 minutes 7.4 seconds
08:49:34 35 examples, 1 failure
```
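The first failure above shows PostgreSQL rejecting `"0.148148148148133,10"` as a `double precision` value: two comma-joined numbers reached a cast that expected one scalar. An illustrative parser that splits such a raster-scale string into its x/y components (this is only a sketch of the data shape, not CartoDB's actual fix):

```python
def parse_scale(value):
    """Split a raster scale value like '0.148148148148133,10'
    (the string that failed the double-precision cast above)
    into (scale_x, scale_y). Illustrative only."""
    parts = [float(p) for p in str(value).split(",")]
    if len(parts) == 1:
        # A single number means the scale is the same on both axes.
        return parts[0], parts[0]
    return parts[0], parts[1]
```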
|
1.0
|
Fix raster tests for multiple CI configs - This test is failing in CI in two different ways:
```
08:20:21 1) raster2pgsql acceptance tests keeps the original table unaltered regardless of overviews
08:20:21 Failure/Error: }).first
08:20:21 Sequel::DatabaseError:
08:20:21 PG::Error: ERROR: invalid input syntax for type double precision: "0.148148148148133,10"
08:20:21 CONTEXT: SQL function "_raster_constraint_info_scale" statement 1
08:20:21 # ./services/importer/spec/acceptance/raster2pgsql_spec.rb:134:in `block (2 levels) in <top (required)>'
08:20:21
08:20:21 Finished in 2 minutes 15.09 seconds
08:20:21 29 examples, 1 failure, 7 pending
```
```
08:49:34 Failures:
08:49:34
08:49:34 1) raster2pgsql acceptance tests keeps the original table unaltered regardless of overviews
08:49:34 Failure/Error: metadata[:srid].should eq 4326
08:49:34 NoMethodError:
08:49:34 undefined method `[]' for nil:NilClass
08:49:34 # ./services/importer/spec/acceptance/raster2pgsql_spec.rb:135:in `block (2 levels) in <top (required)>'
08:49:34
08:49:34 Finished in 4 minutes 7.4 seconds
08:49:34 35 examples, 1 failure
```
|
test
|
fix raster tests for multiple ci configs this test is failing in ci in two different ways acceptance tests keeps the original table unaltered regardless of overviews failure error first sequel databaseerror pg error error invalid input syntax for type double precision context sql function raster constraint info scale statement services importer spec acceptance spec rb in block levels in finished in minutes seconds examples failure pending failures acceptance tests keeps the original table unaltered regardless of overviews failure error metadata should eq nomethoderror undefined method for nil nilclass services importer spec acceptance spec rb in block levels in finished in minutes seconds examples failure
| 1
|
354,979
| 10,575,326,538
|
IssuesEvent
|
2019-10-07 15:32:15
|
canonical-web-and-design/deployment-configs
|
https://api.github.com/repos/canonical-web-and-design/deployment-configs
|
closed
|
Add TALISKER_NETWORKS to relevant services
|
Priority: Medium
|
Add `TALISKER_NETWORKS` environment variable to the config for relevant services, to allow querying from all internal IPs:
```
TALISKER_NETWORKS='10.0.0.0/8 172.16.0.0/12 192.168.0.0/16'
```
To services:
- [x] snapcraft.io
- [x] insights.ubuntu.com
- [ ] cn.ubuntnu.com
- [ ] www.canonical.com
- [ ] developer.ubuntu.com
- [ ] maas.io
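The `TALISKER_NETWORKS` value above is a space-separated list of CIDR ranges. A minimal sketch of parsing it and checking whether a client IP falls inside one of the allowed networks, using the standard-library `ipaddress` module (this is not talisker's own implementation, just an illustration of the variable's format):

```python
import ipaddress
import os

def allowed_networks(raw=None):
    """Parse a space-separated CIDR list in the TALISKER_NETWORKS format.
    Falls back to the environment variable when no string is given."""
    if raw is None:
        raw = os.environ.get("TALISKER_NETWORKS", "")
    return [ipaddress.ip_network(n) for n in raw.split()]

def is_internal(ip, networks):
    """Return True when `ip` belongs to any of the parsed networks."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)
```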
|
1.0
|
Add TALISKER_NETWORKS to relevant services - Add `TALISKER_NETWORKS` environment variable to the config for relevant services, to allow querying from all internal IPs:
```
TALISKER_NETWORKS='10.0.0.0/8 172.16.0.0/12 192.168.0.0/16'
```
To services:
- [x] snapcraft.io
- [x] insights.ubuntu.com
- [ ] cn.ubuntnu.com
- [ ] www.canonical.com
- [ ] developer.ubuntu.com
- [ ] maas.io
|
non_test
|
add talisker networks to relevant services add talisker networks environment variable to the config for relevant services to allow querying from all internal ips talisker networks to services snapcraft io insights ubuntu com cn ubuntnu com developer ubuntu com maas io
| 0
|
81,627
| 7,788,470,065
|
IssuesEvent
|
2018-06-07 04:54:54
|
purebred-mua/purebred
|
https://api.github.com/repos/purebred-mua/purebred
|
closed
|
Move user acceptance tests into its own repository?
|
refactor tests
|
I'd like to open this issue to collect cases where a separate repository would be beneficial for the user acceptance tests instead of them currently being part of purebred.
|
1.0
|
Move user acceptance tests into its own repository? - I'd like to open this issue to collect cases where a separate repository would be beneficial for the user acceptance tests instead of them currently being part of purebred.
|
test
|
move user acceptance tests into its own repository i d like to open this issue to collect cases where a separate repository would be beneficial for the user acceptance tests instead of them currently being part of purebred
| 1
|
127,184
| 18,010,324,184
|
IssuesEvent
|
2021-09-16 07:51:02
|
maddyCode23/linux-4.1.15
|
https://api.github.com/repos/maddyCode23/linux-4.1.15
|
opened
|
CVE-2020-11668 (High) detected in linux-stable-rtv4.1.33
|
security vulnerability
|
## CVE-2020-11668 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/maddyCode23/linux-4.1.15/commit/f1f3d2b150be669390b32dfea28e773471bdd6e7">f1f3d2b150be669390b32dfea28e773471bdd6e7</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/media/usb/gspca/xirlink_cit.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/media/usb/gspca/xirlink_cit.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel before 5.6.1, drivers/media/usb/gspca/xirlink_cit.c (aka the Xirlink camera USB driver) mishandles invalid descriptors, aka CID-a246b4d54770.
<p>Publish Date: 2020-04-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11668>CVE-2020-11668</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/torvalds/linux/commit/a246b4d547708f33ff4d4b9a7a5dbac741dc89d8">https://github.com/torvalds/linux/commit/a246b4d547708f33ff4d4b9a7a5dbac741dc89d8</a></p>
<p>Release Date: 2020-03-12</p>
<p>Fix Resolution: Replace or update the following file: xirlink_cit.c</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-11668 (High) detected in linux-stable-rtv4.1.33 - ## CVE-2020-11668 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-stable-rtv4.1.33</b></p></summary>
<p>
<p>Julia Cartwright's fork of linux-stable-rt.git</p>
<p>Library home page: <a href=https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git>https://git.kernel.org/pub/scm/linux/kernel/git/julia/linux-stable-rt.git</a></p>
<p>Found in HEAD commit: <a href="https://github.com/maddyCode23/linux-4.1.15/commit/f1f3d2b150be669390b32dfea28e773471bdd6e7">f1f3d2b150be669390b32dfea28e773471bdd6e7</a></p>
</p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/media/usb/gspca/xirlink_cit.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/media/usb/gspca/xirlink_cit.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In the Linux kernel before 5.6.1, drivers/media/usb/gspca/xirlink_cit.c (aka the Xirlink camera USB driver) mishandles invalid descriptors, aka CID-a246b4d54770.
<p>Publish Date: 2020-04-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-11668>CVE-2020-11668</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Change files</p>
<p>Origin: <a href="https://github.com/torvalds/linux/commit/a246b4d547708f33ff4d4b9a7a5dbac741dc89d8">https://github.com/torvalds/linux/commit/a246b4d547708f33ff4d4b9a7a5dbac741dc89d8</a></p>
<p>Release Date: 2020-03-12</p>
<p>Fix Resolution: Replace or update the following file: xirlink_cit.c</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in linux stable cve high severity vulnerability vulnerable library linux stable julia cartwright s fork of linux stable rt git library home page a href found in head commit a href vulnerable source files drivers media usb gspca xirlink cit c drivers media usb gspca xirlink cit c vulnerability details in the linux kernel before drivers media usb gspca xirlink cit c aka the xirlink camera usb driver mishandles invalid descriptors aka cid publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact high for more information on scores click a href suggested fix type change files origin a href release date fix resolution replace or update the following file xirlink cit c step up your open source security game with whitesource
| 0
|
41,061
| 10,606,525,088
|
IssuesEvent
|
2019-10-10 23:44:18
|
metabase/metabase
|
https://api.github.com/repos/metabase/metabase
|
closed
|
Can't remove sort from table output Question
|
Priority:P2 Query Builder Type:Bug
|
When ordering is applied to a Question's table output (i.e. sorting a Count column), that ordering can never be removed. It can be changed from DESC to ASC, and vice-versa, but it can never be removed.
This causes problems when you need to remove it (for work-arounds - see #7456)
- Your databases: MSSQL
- Metabase version: 0.28.5
|
1.0
|
Can't remove sort from table output Question - When ordering is applied to a Question's table output (i.e. sorting a Count column), that ordering can never be removed. It can be changed from DESC to ASC, and vice-versa, but it can never be removed.
This causes problems when you need to remove it (for work-arounds - see #7456)
- Your databases: MSSQL
- Metabase version: 0.28.5
|
non_test
|
can t remove sort from table output question when ordering is applied to a question s table output i e sorting a count column that ordering can never be removed it can be changed from desc to asc and vice versa but it can never be removed this causes problems when you need to remove it for work arounds see your databases mssql metabase version
| 0
|
71,522
| 23,673,590,436
|
IssuesEvent
|
2022-08-27 18:49:29
|
dkfans/keeperfx
|
https://api.github.com/repos/dkfans/keeperfx
|
closed
|
Placing doors leaves an opening on top
|
Type-Defect
|

Place a door inside the level and there's no dirt above the door, making it possible for units to fly over it. It should be filled.
Traced the issue back to this commit: https://github.com/dkfans/keeperfx/commit/0ca73ffac519d7358ba1923165f604ca4b55040b
|
1.0
|
Placing doors leaves an opening on top - 
Place a door inside the level and there's no dirt above the door, making it possible for units to fly over it. It should be filled.
Traced the issue back to this commit: https://github.com/dkfans/keeperfx/commit/0ca73ffac519d7358ba1923165f604ca4b55040b
|
non_test
|
placing doors leaves an opening on top place a door inside the level and there s no dirt above the door making it possible for units to fly over it it should be filled traced the issue back to this commit
| 0
|
379,360
| 11,220,552,769
|
IssuesEvent
|
2020-01-07 16:01:27
|
zephyrproject-rtos/zephyr
|
https://api.github.com/repos/zephyrproject-rtos/zephyr
|
closed
|
GATT: gatt_write_ccc_rsp with error (0x0e) removes always beginning from subscriptions head
|
area: Bluetooth bug has-pr priority: medium
|
**Describe the bug**
In a multicentral application when `gatt_write_ccc_rsp` is called with error "0x0e", the `subscriptions` always get removed beginning from the head, since in the call of `gatt_subscription_remove` `tmp` is provided as the second parameter instead of `prev`.
**To Reproduce**
Not sure, what error "0x0e" is actually.
I'm testing around with many (20+) connections from one central with one subscription to each peripheral device. I guess it happens on enabling the notifications to the last connected device (or during connection).
To see the issue, i added a counter to `bt_gatt_notification` to see, how many elements are present in `subscriptions.` (see log output below, `bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 44` --> 44 subscriptions; later on only 1 subscription).
**Expected behavior**
Only the subscription of the disconnected device is removed.
**Impact**
No notifications passed to devices connected before this happens.
**Screenshots or console output**
```
[00:07:29.383,483] <dbg> bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 44
[00:07:29.383,483] <dbg> bt_conn.bt_conn_unref: handle 36 ref 2
[00:07:29.492,736] <dbg> bt_conn.bt_conn_ref: handle 44 ref 3
[00:07:29.492,767] <dbg> bt_conn.bt_conn_unref: handle 44 ref 2
[00:07:29.492,797] <dbg> bt_conn.tx_complete_work: conn 0x20002fbc
[00:07:29.492,797] <dbg> bt_conn.tx_notify: conn 0x20002fbc
[00:07:29.492,828] <dbg> bt_conn.tx_notify: tx 0x2000142c cb 0x0000890d user_data 0x00000000
[00:07:29.588,165] <dbg> bt_conn.bt_conn_ref: handle 28 ref 3
[00:07:29.588,195] <dbg> bt_conn.tx_notify: conn 0x200025bc
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: handle 28 len 22 flags 02
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: First, len 22 final 18
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: rx_len 0
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: Successfully parsed 22 byte L2CAP packet
[00:07:29.588,348] <inf> mc: notification from e1:f5:e4:a9:21:37 (random) length 15
[00:07:29.588,378] <dbg> bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 44
[00:07:29.588,409] <dbg> bt_conn.bt_conn_unref: handle 28 ref 2
[00:07:29.740,081] <dbg> bt_conn.bt_conn_ref: handle 43 ref 3
[00:07:29.740,112] <dbg> bt_conn.bt_conn_set_state: connected -> disconnected
[00:07:29.740,142] <dbg> bt_conn.tx_notify: conn 0x20002f1c
[00:07:29.740,142] <dbg> bt_conn.bt_conn_unref: handle 0 ref 2
[00:07:29.740,203] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.740,386] <dbg> bt_conn.conn_update_timeout: conn 0x20002f1c
[00:07:29.740,417] <dbg> bt_gatt.gatt_write_ccc_rsp: err 0x0e
[00:07:29.740,417] <dbg> bt_gatt.gatt_subscription_remove: gatt_subscription_remove
[00:07:29.740,509] <inf> mc: [UNSUBSCRIBED] eb:c1:0d:41:e5:26 (random)
[00:07:29.740,509] <dbg> bt_gatt.bt_gatt_disconnected: conn 0x20002f1c
[00:07:29.740,692] <dbg> bt_gatt.remove_subscriptions: remove_subscriptions
[00:07:29.740,692] <dbg> bt_gatt.gatt_subscription_remove: gatt_subscription_remove
[00:07:29.740,783] <inf> mc: [UNSUBSCRIBED] eb:c1:0d:41:e5:26 (random)
[00:07:29.740,997] <dbg> bt_conn.bt_conn_unref: handle 0 ref 1
[00:07:29.740,997] <inf> mc: Disconnected: eb:c1:0d:41:e5:26 (random) (reason 0x08)(44/64)
[00:07:29.741,027] <dbg> bt_conn.bt_conn_unref: handle 0 ref 0
[00:07:29.867,767] <dbg> bt_conn.bt_conn_ref: handle 44 ref 3
[00:07:29.867,767] <dbg> bt_conn.bt_conn_unref: handle 44 ref 2
[00:07:29.868,225] <dbg> bt_conn.bt_conn_ref: handle 44 ref 3
[00:07:29.868,225] <dbg> bt_conn.tx_notify: conn 0x20002fbc
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: handle 44 len 14 flags 02
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: First, len 14 final 10
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: rx_len 0
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: Successfully parsed 14 byte L2CAP packet
[00:07:29.868,255] <dbg> bt_gatt.gatt_find_info_rsp: err 0x00
[00:07:29.868,286] <dbg> bt_gatt.gatt_find_info_rsp: handle 0x000e uuid 2902
[00:07:29.868,408] <inf> xibu: [e2:7c:1a:18:93:28 (random)]: xibu_discover_func (536968620)
[00:07:29.868,438] <dbg> bt_gatt.gatt_write_ccc: handle 0x000e value 0x0001
[00:07:29.868,438] <inf> xibu: up descriptor discovered, subscribe returned 0
[00:07:29.868,469] <dbg> bt_conn.bt_conn_le_param_update: conn 0x20002fbc features 0x25 params (2800-3200 0 1000)
[00:07:29.868,469] <dbg> bt_conn.send_conn_le_param_update: conn 0x20002fbc features 0x25 params (2800-3200 0 1000)
[00:07:29.868,591] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.868,743] <dbg> bt_conn.bt_conn_send_cb: conn handle 44 buf len 9 cb 0x0000890d user_data 0x00000000
[00:07:29.868,774] <dbg> bt_conn.bt_conn_unref: handle 44 ref 2
[00:07:29.868,835] <dbg> bt_conn.bt_conn_process_tx: conn 0x20002fbc
[00:07:29.868,835] <dbg> bt_conn.send_buf: conn 0x20002fbc buf 0x200199a8 len 9
[00:07:29.868,865] <dbg> bt_conn.send_frag: conn 0x20002fbc buf 0x200199a8 len 9 flags 0x00
[00:07:29.868,865] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.869,018] <inf> mc:
Starting scanner (44/64)
[00:07:29.869,171] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.869,445] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.869,659] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.170,898] <inf> mc: Connecting to device f8:2d:b3:63:1e:c8 (random)
[00:07:30.171,142] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.171,447] <dbg> bt_conn.bt_conn_set_state: disconnected -> connect-scan
[00:07:30.171,447] <dbg> bt_conn.bt_conn_ref: handle 0 ref 2
[00:07:30.171,508] <dbg> bt_conn.bt_conn_ref: handle 0 ref 3
[00:07:30.171,508] <dbg> bt_conn.bt_conn_unref: handle 0 ref 2
[00:07:30.171,630] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.171,905] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.172,088] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.184,448] <dbg> bt_conn.bt_conn_ref: handle 12 ref 3
[00:07:30.184,478] <dbg> bt_conn.tx_notify: conn 0x20001bbc
[00:07:30.184,478] <dbg> bt_conn.bt_conn_recv: handle 12 len 22 flags 02
[00:07:30.184,478] <dbg> bt_conn.bt_conn_recv: First, len 22 final 18
[00:07:30.184,478] <dbg> bt_conn.bt_conn_recv: rx_len 0
[00:07:30.184,509] <dbg> bt_conn.bt_conn_recv: Successfully parsed 22 byte L2CAP packet
[00:07:30.184,509] <dbg> bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 1
```
**Environment (please complete the following information):**
- OS: Linux
- Toolchain Zephyr SDK
- Commit SHA 5af54075703c248fae2bf69a50909bf2e3cc977e
- Target: nrf52840_pca10056
**Additional context**
I'll provide a pull-request for this soon.
|
1.0
|
GATT: gatt_write_ccc_rsp with error (0x0e) removes always beginning from subscriptions head - **Describe the bug**
In a multicentral application when `gatt_write_ccc_rsp` is called with error "0x0e", the `subscriptions` always get removed beginning from the head, since in the call of `gatt_subscription_remove` `tmp` is provided as the second parameter instead of `prev`.
**To Reproduce**
Not sure, what error "0x0e" is actually.
I'm testing around with many (20+) connections from one central with one subscription to each peripheral device. I guess it happens on enabling the notifications to the last connected device (or during connection).
To see the issue, i added a counter to `bt_gatt_notification` to see, how many elements are present in `subscriptions.` (see log output below, `bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 44` --> 44 subscriptions; later on only 1 subscription).
**Expected behavior**
Only the subscription of the disconnected device is removed.
**Impact**
No notifications passed to devices connected before this happens.
**Screenshots or console output**
```
[00:07:29.383,483] <dbg> bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 44
[00:07:29.383,483] <dbg> bt_conn.bt_conn_unref: handle 36 ref 2
[00:07:29.492,736] <dbg> bt_conn.bt_conn_ref: handle 44 ref 3
[00:07:29.492,767] <dbg> bt_conn.bt_conn_unref: handle 44 ref 2
[00:07:29.492,797] <dbg> bt_conn.tx_complete_work: conn 0x20002fbc
[00:07:29.492,797] <dbg> bt_conn.tx_notify: conn 0x20002fbc
[00:07:29.492,828] <dbg> bt_conn.tx_notify: tx 0x2000142c cb 0x0000890d user_data 0x00000000
[00:07:29.588,165] <dbg> bt_conn.bt_conn_ref: handle 28 ref 3
[00:07:29.588,195] <dbg> bt_conn.tx_notify: conn 0x200025bc
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: handle 28 len 22 flags 02
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: First, len 22 final 18
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: rx_len 0
[00:07:29.588,195] <dbg> bt_conn.bt_conn_recv: Successfully parsed 22 byte L2CAP packet
[00:07:29.588,348] <inf> mc: notification from e1:f5:e4:a9:21:37 (random) length 15
[00:07:29.588,378] <dbg> bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 44
[00:07:29.588,409] <dbg> bt_conn.bt_conn_unref: handle 28 ref 2
[00:07:29.740,081] <dbg> bt_conn.bt_conn_ref: handle 43 ref 3
[00:07:29.740,112] <dbg> bt_conn.bt_conn_set_state: connected -> disconnected
[00:07:29.740,142] <dbg> bt_conn.tx_notify: conn 0x20002f1c
[00:07:29.740,142] <dbg> bt_conn.bt_conn_unref: handle 0 ref 2
[00:07:29.740,203] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.740,386] <dbg> bt_conn.conn_update_timeout: conn 0x20002f1c
[00:07:29.740,417] <dbg> bt_gatt.gatt_write_ccc_rsp: err 0x0e
[00:07:29.740,417] <dbg> bt_gatt.gatt_subscription_remove: gatt_subscription_remove
[00:07:29.740,509] <inf> mc: [UNSUBSCRIBED] eb:c1:0d:41:e5:26 (random)
[00:07:29.740,509] <dbg> bt_gatt.bt_gatt_disconnected: conn 0x20002f1c
[00:07:29.740,692] <dbg> bt_gatt.remove_subscriptions: remove_subscriptions
[00:07:29.740,692] <dbg> bt_gatt.gatt_subscription_remove: gatt_subscription_remove
[00:07:29.740,783] <inf> mc: [UNSUBSCRIBED] eb:c1:0d:41:e5:26 (random)
[00:07:29.740,997] <dbg> bt_conn.bt_conn_unref: handle 0 ref 1
[00:07:29.740,997] <inf> mc: Disconnected: eb:c1:0d:41:e5:26 (random) (reason 0x08)(44/64)
[00:07:29.741,027] <dbg> bt_conn.bt_conn_unref: handle 0 ref 0
[00:07:29.867,767] <dbg> bt_conn.bt_conn_ref: handle 44 ref 3
[00:07:29.867,767] <dbg> bt_conn.bt_conn_unref: handle 44 ref 2
[00:07:29.868,225] <dbg> bt_conn.bt_conn_ref: handle 44 ref 3
[00:07:29.868,225] <dbg> bt_conn.tx_notify: conn 0x20002fbc
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: handle 44 len 14 flags 02
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: First, len 14 final 10
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: rx_len 0
[00:07:29.868,225] <dbg> bt_conn.bt_conn_recv: Successfully parsed 14 byte L2CAP packet
[00:07:29.868,255] <dbg> bt_gatt.gatt_find_info_rsp: err 0x00
[00:07:29.868,286] <dbg> bt_gatt.gatt_find_info_rsp: handle 0x000e uuid 2902
[00:07:29.868,408] <inf> xibu: [e2:7c:1a:18:93:28 (random)]: xibu_discover_func (536968620)
[00:07:29.868,438] <dbg> bt_gatt.gatt_write_ccc: handle 0x000e value 0x0001
[00:07:29.868,438] <inf> xibu: up descriptor discovered, subscribe returned 0
[00:07:29.868,469] <dbg> bt_conn.bt_conn_le_param_update: conn 0x20002fbc features 0x25 params (2800-3200 0 1000)
[00:07:29.868,469] <dbg> bt_conn.send_conn_le_param_update: conn 0x20002fbc features 0x25 params (2800-3200 0 1000)
[00:07:29.868,591] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.868,743] <dbg> bt_conn.bt_conn_send_cb: conn handle 44 buf len 9 cb 0x0000890d user_data 0x00000000
[00:07:29.868,774] <dbg> bt_conn.bt_conn_unref: handle 44 ref 2
[00:07:29.868,835] <dbg> bt_conn.bt_conn_process_tx: conn 0x20002fbc
[00:07:29.868,835] <dbg> bt_conn.send_buf: conn 0x20002fbc buf 0x200199a8 len 9
[00:07:29.868,865] <dbg> bt_conn.send_frag: conn 0x20002fbc buf 0x200199a8 len 9 flags 0x00
[00:07:29.868,865] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.869,018] <inf> mc:
Starting scanner (44/64)
[00:07:29.869,171] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.869,445] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:29.869,659] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.170,898] <inf> mc: Connecting to device f8:2d:b3:63:1e:c8 (random)
[00:07:30.171,142] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.171,447] <dbg> bt_conn.bt_conn_set_state: disconnected -> connect-scan
[00:07:30.171,447] <dbg> bt_conn.bt_conn_ref: handle 0 ref 2
[00:07:30.171,508] <dbg> bt_conn.bt_conn_ref: handle 0 ref 3
[00:07:30.171,508] <dbg> bt_conn.bt_conn_unref: handle 0 ref 2
[00:07:30.171,630] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.171,905] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.172,088] <dbg> bt_conn.bt_conn_prepare_events:
[00:07:30.184,448] <dbg> bt_conn.bt_conn_ref: handle 12 ref 3
[00:07:30.184,478] <dbg> bt_conn.tx_notify: conn 0x20001bbc
[00:07:30.184,478] <dbg> bt_conn.bt_conn_recv: handle 12 len 22 flags 02
[00:07:30.184,478] <dbg> bt_conn.bt_conn_recv: First, len 22 final 18
[00:07:30.184,478] <dbg> bt_conn.bt_conn_recv: rx_len 0
[00:07:30.184,509] <dbg> bt_conn.bt_conn_recv: Successfully parsed 22 byte L2CAP packet
[00:07:30.184,509] <dbg> bt_gatt.bt_gatt_notification: handle 0x000d length 15 it 1
```
**Environment (please complete the following information):**
- OS: Linux
- Toolchain Zephyr SDK
- Commit SHA 5af54075703c248fae2bf69a50909bf2e3cc977e
- Target: nrf52840_pca10056
**Additional context**
I'll provide a pull-request for this soon.
|
non_test
|
gatt gatt write ccc rsp with error removes always beginning from subscriptions head describe the bug in a multicentral application when gatt write ccc rsp is called with error the subscriptions always get removed beginning from the head since in the call of gatt subscription remove tmp is provided as the second parameter instead of prev to reproduce not sure what error is actually i m testing around with many connections from one central with one subscription to each peripheral device i guess it hapens on enabling the notifications to the last connected device or during connection to see the issue i added a counter to bt gatt notification to see how many elements are present in subscriptions see log output below bt gatt bt gatt notification handle length it subscriptions later on only subscription expected behavior only the subscription of the disconnected device is removed impact no notifications passed to devices connected before this happens screenshots or console output bt gatt bt gatt notification handle length it bt conn bt conn unref handle ref bt conn bt conn ref handle ref bt conn bt conn unref handle ref bt conn tx complete work conn bt conn tx notify conn bt conn tx notify tx cb user data bt conn bt conn ref handle ref bt conn tx notify conn bt conn bt conn recv handle len flags bt conn bt conn recv first len final bt conn bt conn recv rx len bt conn bt conn recv successfully parsed byte packet mc notification from random length bt gatt bt gatt notification handle length it bt conn bt conn unref handle ref bt conn bt conn ref handle ref bt conn bt conn set state connected disconnected bt conn tx notify conn bt conn bt conn unref handle ref bt conn bt conn prepare events bt conn conn update timeout conn bt gatt gatt write ccc rsp err bt gatt gatt subscription remove gatt subscription remove mc eb random bt gatt bt gatt disconnected conn bt gatt remove subscriptions remove subscriptions bt gatt gatt subscription remove gatt subscription remove mc eb random 
bt conn bt conn unref handle ref mc disconnected eb random reason bt conn bt conn unref handle ref bt conn bt conn ref handle ref bt conn bt conn unref handle ref bt conn bt conn ref handle ref bt conn tx notify conn bt conn bt conn recv handle len flags bt conn bt conn recv first len final bt conn bt conn recv rx len bt conn bt conn recv successfully parsed byte packet bt gatt gatt find info rsp err bt gatt gatt find info rsp handle uuid xibu xibu discover func bt gatt gatt write ccc handle value xibu up descriptor discovered subscribe returned bt conn bt conn le param update conn features params bt conn send conn le param update conn features params bt conn bt conn prepare events bt conn bt conn send cb conn handle buf len cb user data bt conn bt conn unref handle ref bt conn bt conn process tx conn bt conn send buf conn buf len bt conn send frag conn buf len flags bt conn bt conn prepare events mc starting scanner bt conn bt conn prepare events bt conn bt conn prepare events bt conn bt conn prepare events mc connecting to device random bt conn bt conn prepare events bt conn bt conn set state disconnected connect scan bt conn bt conn ref handle ref bt conn bt conn ref handle ref bt conn bt conn unref handle ref bt conn bt conn prepare events bt conn bt conn prepare events bt conn bt conn prepare events bt conn bt conn ref handle ref bt conn tx notify conn bt conn bt conn recv handle len flags bt conn bt conn recv first len final bt conn bt conn recv rx len bt conn bt conn recv successfully parsed byte packet bt gatt bt gatt notification handle length it environment please complete the following information os linux toolchain zephyr sdk commit sha target additional context i ll provide a pull request for this soon
| 0
|
84,101
| 7,890,267,337
|
IssuesEvent
|
2018-06-28 08:17:16
|
voyages-sncf-technologies/hesperides
|
https://api.github.com/repos/voyages-sncf-technologies/hesperides
|
closed
|
GET /applications/{application_name}/platforms/{platform_name}
|
level easy test-ready us platforms v4
|
Retrieves the details of a platform.
**Input**
Optional query parameter: timestamp
**Output**
Code 200 + structure identical to that of platform creation:
{
"application_name": "",
"application_version": "",
"platform_name": "",
"production": false,
"version_id": 0,
"modules": [
{
"id": 0,
"name": "",
"version": "",
"path": "",
"working_copy": false,
"properties_path": "",
"instances": [
{
"name": "",
"key_values": [
{
"value": "",
"name": ""
}
]
}
]
}
]
}
|
1.0
|
GET /applications/{application_name}/platforms/{platform_name} - Retrieves the details of a platform.
**Input**
Optional query parameter: timestamp
**Output**
Code 200 + structure identical to that of platform creation:
{
"application_name": "",
"application_version": "",
"platform_name": "",
"production": false,
"version_id": 0,
"modules": [
{
"id": 0,
"name": "",
"version": "",
"path": "",
"working_copy": false,
"properties_path": "",
"instances": [
{
"name": "",
"key_values": [
{
"value": "",
"name": ""
}
]
}
]
}
]
}
|
test
|
get applications application name platforms platform name retrieves the details of a platform input optional query parameter timestamp output code structure identical to that of platform creation application name application version platform name production false version id modules id name version path working copy false properties path instances name key values value name
| 1
|
778,967
| 27,334,547,863
|
IssuesEvent
|
2023-02-26 02:43:40
|
School-Simplified-HR-Automations/HR-Automation
|
https://api.github.com/repos/School-Simplified-HR-Automations/HR-Automation
|
closed
|
Lookup does not offer "back" button to main page after using select menu
|
bug help wanted low priority semver: patch
|
When a user runs the lookup command they are presented with a main page. After making a specific selection there is no way for them to go back to the main page. This should be implemented either as a button or selection.
|
1.0
|
Lookup does not offer "back" button to main page after using select menu - When a user runs the lookup command they are presented with a main page. After making a specific selection there is no way for them to go back to the main page. This should be implemented either as a button or selection.
|
non_test
|
lookup does not offer back button to main page after using select menu when a user runs the lookup command they are presented with a main page after making a specific selection there is no way for them to go back to the main page this should be implemented either as a button or selection
| 0
|
105,844
| 13,223,644,569
|
IssuesEvent
|
2020-08-17 17:36:43
|
cgeo/cgeo
|
https://api.github.com/repos/cgeo/cgeo
|
closed
|
Find better UI implementation for WP upload
|
Frontend Design
|
Currently the upload to personal note for user defined waypoints is placed between the existing buttons for Edit and Upload. I do see some problems with that:
- As the buttons are already rather small the users might unintentionally click the wrong button
- Users might not understand the function of this button as the page, where it is shown is unrelated to the data source used for it
On the other hand users will currently easily find the new function, but still I would like to suggest to find an alternative place for that function in the UI.
How about a third button below "Add current location" on top of the waypoint tab of a cache?
This would also have the advantage that longer strings fit in, such as "Copy user modified waypoints to personal note", which makes the function more self explanatory.
@eddiemuc
|
1.0
|
Find better UI implementation for WP upload - Currently the upload to personal note for user defined waypoints is placed between the existing buttons for Edit and Upload. I do see some problems with that:
- As the buttons are already rather small the users might unintentionally click the wrong button
- Users might not understand the function of this button as the page, where it is shown is unrelated to the data source used for it
On the other hand users will currently easily find the new function, but still I would like to suggest to find an alternative place for that function in the UI.
How about a third button below "Add current location" on top of the waypoint tab of a cache?
This would also have the advantage that longer strings fit in, such as "Copy user modified waypoints to personal note", which makes the function more self explanatory.
@eddiemuc
| non_test |
find better ui implementation for wp upload currently the upload to personal note for user defined waypoints is placed between the existing buttons for edit and upload i do see some problems with that as the buttons are already rather small the users might unintentionally click the wrong button users might not understand the function of this button as the page where it is shown is unrelated to the data source used for it on the other hand users will currently easily find the new function but still i would like to suggest to find an alternative place for that function in the ui how about a third button below add current location on top of the waypoint tab of a cache this would also have the advantage that longer strings fit in such as copy user modified waypoints to personal note which makes the function more self explanatory eddiemuc
| 0
| 4,843 | 7,326,875,622 | IssuesEvent | 2018-03-04 01:51:05 | theworkingmen/idb | https://api.github.com/repos/theworkingmen/idb | opened | Create a UML diagram | Important Requirement |
If you know/are willing to learn how to make UML diagrams, please assign yourself.
| 1.0 |
Create a UML diagram - If you know/are willing to learn how to make UML diagrams, please assign yourself.
| non_test |
create a uml diagram if you know are willing to learn how to make uml diagrams please assign yourself
| 0
| 504,723 | 14,620,785,502 | IssuesEvent | 2020-12-22 20:22:38 | grannypron/uaf_levels | https://api.github.com/repos/grannypron/uaf_levels | closed | Flawed blacksmith smallpic | game39 low priority |
There is a tiny little shield icon in smallpic_Blacksmith.png from my screenshot. Haha. Should re-grab it
| 1.0 |
Flawed blacksmith smallpic - There is a tiny little shield icon in smallpic_Blacksmith.png from my screenshot. Haha. Should re-grab it
| non_test |
flawed blacksmith smallpic there is a tiny little shield icon in smallpic blacksmith png from my screenshot haha should re grab it
| 0
| 124,357 | 10,309,996,747 | IssuesEvent | 2019-08-29 14:21:33 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | storage: TestRangeTransferLeaseExpirationBased failed under stress | C-test-failure O-robot |
SHA: https://github.com/cockroachdb/cockroach/commits/01ee0704865391599abef3bbc89f462117f8007a
Parameters:
```
TAGS=
GOFLAGS=-parallel=4
```
To repro, try:
```
# Don't forget to check out a clean suitable branch and experiment with the
# stress invocation until the desired results present themselves. For example,
# using stress instead of stressrace and passing the '-p' stressflag which
# controls concurrency.
./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh
cd ~/go/src/github.com/cockroachdb/cockroach && \
stdbuf -oL -eL \
make stressrace TESTS=TestRangeTransferLeaseExpirationBased PKG=github.com/cockroachdb/cockroach/pkg/storage TESTTIMEOUT=5m STRESSFLAGS='-maxtime 20m -timeout 10m' 2>&1 | tee /tmp/stress.log
```
Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=1445911&tab=buildLog
```
=== RUN TestRangeTransferLeaseExpirationBased
--- FAIL: TestRangeTransferLeaseExpirationBased (0.57s)
=== RUN TestRangeTransferLeaseExpirationBased/DrainTransfer
I190820 06:51:19.943095 430128 gossip/gossip.go:394 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:36493" > attrs:<> locality:<> ServerVersion:<major_val:0 minor_val:0 patch:0 unstable:0 > build_tag:"" started_at:0 cluster_name:"" sql_address:<network_field:"" address_field:"" >
W190820 06:51:19.980699 430128 gossip/gossip.go:1517 [n2] no incoming or outgoing connections
I190820 06:51:19.980785 430128 gossip/gossip.go:394 [n2] NodeDescriptor set to node_id:2 address:<network_field:"tcp" address_field:"127.0.0.1:36593" > attrs:<> locality:<> ServerVersion:<major_val:0 minor_val:0 patch:0 unstable:0 > build_tag:"" started_at:0 cluster_name:"" sql_address:<network_field:"" address_field:"" >
I190820 06:51:19.983001 430452 gossip/client.go:124 [n2] started gossip client to 127.0.0.1:36493
I190820 06:51:20.026753 430128 storage/client_test.go:495 gossip network initialized
I190820 06:51:20.027647 430128 storage/replica_command.go:1263 [s1,r1/1:/M{in-ax}] change replicas (add [(n2,s2):2LEARNER] remove []): existing descriptor r1:/M{in-ax} [(n1,s1):1, next=2, gen=0]
I190820 06:51:20.029045 430128 storage/replica_raft.go:291 [s1,r1/1:/M{in-ax},txn=0af6926a] proposing ADD_REPLICA[(n2,s2):2LEARNER]: after=[(n1,s1):1 (n2,s2):2LEARNER] next=3
I190820 06:51:20.032437 430128 storage/store_snapshot.go:995 [s1,r1/1:/M{in-ax}] sending LEARNER snapshot 548aaee1 at applied index 18
I190820 06:51:20.032668 430128 storage/store_snapshot.go:1038 [s1,r1/1:/M{in-ax}] streamed snapshot to (n2,s2):2: kv pairs: 55, log entries: 0, rate-limit: 8.0 MiB/sec, 0.00s
I190820 06:51:20.034197 430654 storage/replica_raftstorage.go:808 [s2,r1/2:{-}] applying LEARNER snapshot [id=548aaee1 index=18]
I190820 06:51:20.034745 430284 storage/replica_command.go:1263 [replicate,s1,r1/1:/M{in-ax}] change replicas (add [] remove [(n2,s2):2LEARNER]): existing descriptor r1:/M{in-ax} [(n1,s1):1, (n2,s2):2LEARNER, next=3, gen=1]
W190820 06:51:20.035114 430654 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
I190820 06:51:20.035419 430654 storage/replica_raftstorage.go:829 [s2,r1/2:/M{in-ax}] applied LEARNER snapshot [total=1ms ingestion=4@1ms id=548aaee1 index=18]
I190820 06:51:20.054182 430284 storage/replica_raft.go:291 [replicate,s1,r1/1:/M{in-ax},txn=2361e7dc] proposing REMOVE_REPLICA[(n2,s2):2LEARNER]: after=[(n1,s1):1] next=3
I190820 06:51:20.055148 430128 storage/replica_command.go:948 [s1,r1/1:/M{in-ax}] could not promote [n2,s2] to voter, rolling back: change replicas of r1 failed: descriptor changed: [expected] r1:/M{in-ax} [(n1,s1):1, (n2,s2):2LEARNER, next=3, gen=1] != [actual] r1:/M{in-ax} [(n1,s1):1, next=3, gen=2]
--- FAIL: TestRangeTransferLeaseExpirationBased/DrainTransfer (0.15s)
client_replica_test.go:563: change replicas of r1 failed: descriptor changed: [expected] r1:/M{in-ax} [(n1,s1):1, (n2,s2):2LEARNER, next=3, gen=1] != [actual] r1:/M{in-ax} [(n1,s1):1, next=3, gen=2]
```
| 1.0 |
storage: TestRangeTransferLeaseExpirationBased failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/01ee0704865391599abef3bbc89f462117f8007a
Parameters:
```
TAGS=
GOFLAGS=-parallel=4
```
To repro, try:
```
# Don't forget to check out a clean suitable branch and experiment with the
# stress invocation until the desired results present themselves. For example,
# using stress instead of stressrace and passing the '-p' stressflag which
# controls concurrency.
./scripts/gceworker.sh start && ./scripts/gceworker.sh mosh
cd ~/go/src/github.com/cockroachdb/cockroach && \
stdbuf -oL -eL \
make stressrace TESTS=TestRangeTransferLeaseExpirationBased PKG=github.com/cockroachdb/cockroach/pkg/storage TESTTIMEOUT=5m STRESSFLAGS='-maxtime 20m -timeout 10m' 2>&1 | tee /tmp/stress.log
```
Failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=1445911&tab=buildLog
```
=== RUN TestRangeTransferLeaseExpirationBased
--- FAIL: TestRangeTransferLeaseExpirationBased (0.57s)
=== RUN TestRangeTransferLeaseExpirationBased/DrainTransfer
I190820 06:51:19.943095 430128 gossip/gossip.go:394 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:36493" > attrs:<> locality:<> ServerVersion:<major_val:0 minor_val:0 patch:0 unstable:0 > build_tag:"" started_at:0 cluster_name:"" sql_address:<network_field:"" address_field:"" >
W190820 06:51:19.980699 430128 gossip/gossip.go:1517 [n2] no incoming or outgoing connections
I190820 06:51:19.980785 430128 gossip/gossip.go:394 [n2] NodeDescriptor set to node_id:2 address:<network_field:"tcp" address_field:"127.0.0.1:36593" > attrs:<> locality:<> ServerVersion:<major_val:0 minor_val:0 patch:0 unstable:0 > build_tag:"" started_at:0 cluster_name:"" sql_address:<network_field:"" address_field:"" >
I190820 06:51:19.983001 430452 gossip/client.go:124 [n2] started gossip client to 127.0.0.1:36493
I190820 06:51:20.026753 430128 storage/client_test.go:495 gossip network initialized
I190820 06:51:20.027647 430128 storage/replica_command.go:1263 [s1,r1/1:/M{in-ax}] change replicas (add [(n2,s2):2LEARNER] remove []): existing descriptor r1:/M{in-ax} [(n1,s1):1, next=2, gen=0]
I190820 06:51:20.029045 430128 storage/replica_raft.go:291 [s1,r1/1:/M{in-ax},txn=0af6926a] proposing ADD_REPLICA[(n2,s2):2LEARNER]: after=[(n1,s1):1 (n2,s2):2LEARNER] next=3
I190820 06:51:20.032437 430128 storage/store_snapshot.go:995 [s1,r1/1:/M{in-ax}] sending LEARNER snapshot 548aaee1 at applied index 18
I190820 06:51:20.032668 430128 storage/store_snapshot.go:1038 [s1,r1/1:/M{in-ax}] streamed snapshot to (n2,s2):2: kv pairs: 55, log entries: 0, rate-limit: 8.0 MiB/sec, 0.00s
I190820 06:51:20.034197 430654 storage/replica_raftstorage.go:808 [s2,r1/2:{-}] applying LEARNER snapshot [id=548aaee1 index=18]
I190820 06:51:20.034745 430284 storage/replica_command.go:1263 [replicate,s1,r1/1:/M{in-ax}] change replicas (add [] remove [(n2,s2):2LEARNER]): existing descriptor r1:/M{in-ax} [(n1,s1):1, (n2,s2):2LEARNER, next=3, gen=1]
W190820 06:51:20.035114 430654 storage/engine/rocksdb.go:116 [rocksdb] [db/version_set.cc:3086] More existing levels in DB than needed. max_bytes_for_level_multiplier may not be guaranteed.
I190820 06:51:20.035419 430654 storage/replica_raftstorage.go:829 [s2,r1/2:/M{in-ax}] applied LEARNER snapshot [total=1ms ingestion=4@1ms id=548aaee1 index=18]
I190820 06:51:20.054182 430284 storage/replica_raft.go:291 [replicate,s1,r1/1:/M{in-ax},txn=2361e7dc] proposing REMOVE_REPLICA[(n2,s2):2LEARNER]: after=[(n1,s1):1] next=3
I190820 06:51:20.055148 430128 storage/replica_command.go:948 [s1,r1/1:/M{in-ax}] could not promote [n2,s2] to voter, rolling back: change replicas of r1 failed: descriptor changed: [expected] r1:/M{in-ax} [(n1,s1):1, (n2,s2):2LEARNER, next=3, gen=1] != [actual] r1:/M{in-ax} [(n1,s1):1, next=3, gen=2]
--- FAIL: TestRangeTransferLeaseExpirationBased/DrainTransfer (0.15s)
client_replica_test.go:563: change replicas of r1 failed: descriptor changed: [expected] r1:/M{in-ax} [(n1,s1):1, (n2,s2):2LEARNER, next=3, gen=1] != [actual] r1:/M{in-ax} [(n1,s1):1, next=3, gen=2]
```
| test |
storage testrangetransferleaseexpirationbased failed under stress sha parameters tags goflags parallel to repro try don t forget to check out a clean suitable branch and experiment with the stress invocation until the desired results present themselves for example using stress instead of stressrace and passing the p stressflag which controls concurrency scripts gceworker sh start scripts gceworker sh mosh cd go src github com cockroachdb cockroach stdbuf ol el make stressrace tests testrangetransferleaseexpirationbased pkg github com cockroachdb cockroach pkg storage testtimeout stressflags maxtime timeout tee tmp stress log failed test run testrangetransferleaseexpirationbased fail testrangetransferleaseexpirationbased run testrangetransferleaseexpirationbased draintransfer gossip gossip go nodedescriptor set to node id address attrs locality serverversion build tag started at cluster name sql address gossip gossip go no incoming or outgoing connections gossip gossip go nodedescriptor set to node id address attrs locality serverversion build tag started at cluster name sql address gossip client go started gossip client to storage client test go gossip network initialized storage replica command go change replicas add remove existing descriptor m in ax storage replica raft go proposing add replica after next storage store snapshot go sending learner snapshot at applied index storage store snapshot go streamed snapshot to kv pairs log entries rate limit mib sec storage replica raftstorage go applying learner snapshot storage replica command go change replicas add remove existing descriptor m in ax storage engine rocksdb go more existing levels in db than needed max bytes for level multiplier may not be guaranteed storage replica raftstorage go applied learner snapshot storage replica raft go proposing remove replica after next storage replica command go could not promote to voter rolling back change replicas of failed descriptor changed m in ax m in ax fail 
testrangetransferleaseexpirationbased draintransfer client replica test go change replicas of failed descriptor changed m in ax m in ax
| 1
| 284,515 | 24,604,921,435 | IssuesEvent | 2022-10-14 15:22:54 | ValveSoftware/Dota-2 | https://api.github.com/repos/ValveSoftware/Dota-2 | closed | Random Crashes since 7.31d | Need Retest |
#### Your system information
* System information from steam (`Steam` -> `Help` -> `System Information`) in a [gist](https://gist.github.com/BrodieSutherland/61c61aa75254d6dcdebb5fa2b6fc81fb):
* Have you checked for system updates?: Yes
* Are you using the latest stable video driver available for your system? Yes
* Have you verified the game files?: Yes
#### Please describe your issue in as much detail as possible:
Getting some random crashes since 7.31d that weren't present before, hoping I can help you guys find whats up here.
Game crashes when spectating or playing a live game, but not when spectating a replay (at least from my experience anyway). I crashed 3 times in my example Match ID (during the final fight too, RIP), no commonality between them in terms of behaviour immediately before crashes. I have attached the last crash's dump file here:
[crash_20220611224522_2.dmp.zip](https://github.com/jeffhill/Dota2/files/8884892/crash_20220611224522_2.dmp.zip)
Things I've tried:
Validating game integrity
Full game reinstall
Disabling any and all overlays
Running the game on lowest graphical settings
Switching renderer from Vulkan to OpenGL
Running Dota via Proton (didn't launch at all via this method actually lol)
Crying
Example Match ID (and possibly Timestamp)
6612040440 - 1:05, 40:00, 43:20
#### Steps for reproducing this issue:
1. Play Dota
2. Spectate live game or play a game
3. Wait for something to go wrong
| 1.0 |
Random Crashes since 7.31d - #### Your system information
* System information from steam (`Steam` -> `Help` -> `System Information`) in a [gist](https://gist.github.com/BrodieSutherland/61c61aa75254d6dcdebb5fa2b6fc81fb):
* Have you checked for system updates?: Yes
* Are you using the latest stable video driver available for your system? Yes
* Have you verified the game files?: Yes
#### Please describe your issue in as much detail as possible:
Getting some random crashes since 7.31d that weren't present before, hoping I can help you guys find whats up here.
Game crashes when spectating or playing a live game, but not when spectating a replay (at least from my experience anyway). I crashed 3 times in my example Match ID (during the final fight too, RIP), no commonality between them in terms of behaviour immediately before crashes. I have attached the last crash's dump file here:
[crash_20220611224522_2.dmp.zip](https://github.com/jeffhill/Dota2/files/8884892/crash_20220611224522_2.dmp.zip)
Things I've tried:
Validating game integrity
Full game reinstall
Disabling any and all overlays
Running the game on lowest graphical settings
Switching renderer from Vulkan to OpenGL
Running Dota via Proton (didn't launch at all via this method actually lol)
Crying
Example Match ID (and possibly Timestamp)
6612040440 - 1:05, 40:00, 43:20
#### Steps for reproducing this issue:
1. Play Dota
2. Spectate live game or play a game
3. Wait for something to go wrong
| test |
random crashes since your system information system information from steam steam help system information in a have you checked for system updates yes are you using the latest stable video driver available for your system yes have you verified the game files yes please describe your issue in as much detail as possible getting some random crashes since that weren t present before hoping i can help you guys find whats up here game crashes when spectating or playing a live game but not when spectating a replay at least from my experience anyway i crashed times in my example match id during the final fight too rip no commonality between them in terms of behaviour immediately before crashes i have attached the last crash s dump file here things i ve tried validating game integrity full game reinstall disabling any and all overlays running the game on lowest graphical settings switching renderer from vulkan to opengl running dota via proton didn t launch at all via this method actually lol crying example match id and possibly timestamp steps for reproducing this issue play dota spectate live game or play a game wait for something to go wrong
| 1
| 97,438 | 20,260,556,830 | IssuesEvent | 2022-02-15 06:49:50 | fmnas/fmnas-site | https://api.github.com/repos/fmnas/fmnas-site | opened | Add tests for GCP services. | backend code health medium (3-8h) |
* Go unit tests
* Build tests
* Integration tests with endpoint
* Domain mapping tests?
| 1.0 |
Add tests for GCP services. - * Go unit tests
* Build tests
* Integration tests with endpoint
* Domain mapping tests?
| non_test |
add tests for gcp services go unit tests build tests integration tests with endpoint domain mapping tests
| 0
| 12,846 | 4,543,709,966 | IssuesEvent | 2016-09-10 08:36:34 | Nukepayload2/Nukepayload2.N2Engine | https://api.github.com/repos/Nukepayload2/Nukepayload2.N2Engine | opened | Behavior of visible elements | code-support-c# code-support-vb feature specification |
For visible elements that are not layout elements, their behavior can be defined.
Built-in behaviors:
* Player-controllable behavior
* Enemy automatic-action behavior
* Ordinary-NPC automatic-action behavior
* Damage-the-player behavior
* Damage-the-enemy behavior
Work on this feature has not yet started.
| 2.0 |
Behavior of visible elements - For visible elements that are not layout elements, their behavior can be defined.
Built-in behaviors:
* Player-controllable behavior
* Enemy automatic-action behavior
* Ordinary-NPC automatic-action behavior
* Damage-the-player behavior
* Damage-the-enemy behavior
Work on this feature has not yet started.
| non_test |
behavior of visible elements for visible elements that are not layout elements their behavior can be defined built in behaviors player controllable behavior enemy automatic action behavior ordinary npc automatic action behavior damage player behavior damage enemy behavior work on this feature has not yet started
| 0
| 15,388 | 3,461,882,092 | IssuesEvent | 2015-12-20 13:28:45 | colobot/colobot | https://api.github.com/repos/colobot/colobot | closed | "Help about selected object" button doesn't load SatCom page | bug filesystem latest dev only SatCom ui |
In the logs I've got this kind of errors:
```
[ERROR]: Failed to load text file `n¨/E/object/human.txt
```
SatCom screen appears, but is empty.
| 1.0 |
"Help about selected object" button doesn't load SatCom page - In the logs I've got this kind of errors:
```
[ERROR]: Failed to load text file `n¨/E/object/human.txt
```
SatCom screen appears, but is empty.
| test |
help about selected object button doesn t load satcom page in the logs i ve got this kind of errors failed to load text file n¨ e object human txt satcom screen appears but is empty
| 1
| 303,570 | 26,217,099,379 | IssuesEvent | 2023-01-04 11:56:09 | saleor/saleor-dashboard | https://api.github.com/repos/saleor/saleor-dashboard | closed | Cypress test fail: should be able to update product type with variant attribute. TC: SALEOR_1504 | tests |
**Known bug for versions:**
v35: true
**Additional Info:**
Spec: As an admin I want to manage attributes in product types
**Created Ticked**
For: QA
Link: https://github.com/saleor/saleor-dashboard/issues/2780
| 1.0 |
Cypress test fail: should be able to update product type with variant attribute. TC: SALEOR_1504 - **Known bug for versions:**
v35: true
**Additional Info:**
Spec: As an admin I want to manage attributes in product types
**Created Ticked**
For: QA
Link: https://github.com/saleor/saleor-dashboard/issues/2780
| test |
cypress test fail should be able to update product type with variant attribute tc saleor known bug for versions true additional info spec as an admin i want to manage attributes in product types created ticked for qa link
| 1
| 215,894 | 16,721,367,394 | IssuesEvent | 2021-06-10 07:44:10 | WoWManiaUK/Redemption | https://api.github.com/repos/WoWManiaUK/Redemption | closed | [Spell] Shadow Priest - Devouring Plague / heal | Fixed on PTR - Tester Confirmed |
**Links:**
http://www.wow-mania.com/armory?spell=19280
**What is Happening:**
spell Devouring Plague on shadow priest heal effect can not be critical hit
**What Should happen:**
Devouring Plague heal should be able to be critical heal
| 1.0 |
[Spell] Shadow Priest - Devouring Plague / heal - **Links:**
http://www.wow-mania.com/armory?spell=19280
**What is Happening:**
spell Devouring Plague on shadow priest heal effect can not be critical hit
**What Should happen:**
Devouring Plague heal should be able to be critical heal
| test |
shadow priest devouring plague heal links what is happening spell devouring plague on shadow priest heal effect can not be critical hit what should happen devouring plague heal should be able to be critical heal
| 1
| 107,347 | 13,452,295,866 | IssuesEvent | 2020-09-08 21:52:28 | discreetlogcontracts/dlcspecs | https://api.github.com/repos/discreetlogcontracts/dlcspecs | opened | Pretty Pictures! | design good first issue help wanted |
At some point I (or someone else ... @Ichiro0219 I know your diagrams are prettier than mine haha) should invest some time in making some nice pictures to go along with explanations and resources on this repo. New contributors and users love pictures!
| 1.0 |
Pretty Pictures! - At some point I (or someone else ... @Ichiro0219 I know your diagrams are prettier than mine haha) should invest some time in making some nice pictures to go along with explanations and resources on this repo. New contributors and users love pictures!
| non_test |
pretty pictures at some point i or someone else i know your diagrams are prettier than mine haha should invest some time in making some nice pictures to go along with explanations and resources on this repo new contributors and users love pictures
| 0
| 149,968 | 11,941,256,946 | IssuesEvent | 2020-04-02 18:07:18 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | roachtest: alterpk-tpcc failed | C-test-failure O-roachtest O-robot branch-provisional_202003181957_v20.1.0-beta.3 release-blocker |
[(roachtest).alterpk-tpcc failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816942&tab=buildLog) on [provisional_202003181957_v20.1.0-beta.3@37f2a6964760b5d72bef655342628e31601b8978](https://github.com/cockroachdb/cockroach/commits/37f2a6964760b5d72bef655342628e31601b8978):
```
The test failed on branch=provisional_202003181957_v20.1.0-beta.3, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20200319-1816942/alterpk-tpcc/run_1
test_runner.go:800: test timed out (10h0m0s)
```
<details><summary>More</summary><p>
Artifacts: [/alterpk-tpcc](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816942&tab=artifacts#/alterpk-tpcc)
Related:
- #46136 roachtest: alterpk-tpcc failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202003112118_v20.1.0-beta.3](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202003112118_v20.1.0-beta.3) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #45812 roachtest: alterpk-tpcc failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Aalterpk-tpcc.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| 2.0 |
roachtest: alterpk-tpcc failed - [(roachtest).alterpk-tpcc failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816942&tab=buildLog) on [provisional_202003181957_v20.1.0-beta.3@37f2a6964760b5d72bef655342628e31601b8978](https://github.com/cockroachdb/cockroach/commits/37f2a6964760b5d72bef655342628e31601b8978):
```
The test failed on branch=provisional_202003181957_v20.1.0-beta.3, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/20200319-1816942/alterpk-tpcc/run_1
test_runner.go:800: test timed out (10h0m0s)
```
<details><summary>More</summary><p>
Artifacts: [/alterpk-tpcc](https://teamcity.cockroachdb.com/viewLog.html?buildId=1816942&tab=artifacts#/alterpk-tpcc)
Related:
- #46136 roachtest: alterpk-tpcc failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-provisional_202003112118_v20.1.0-beta.3](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-provisional_202003112118_v20.1.0-beta.3) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
- #45812 roachtest: alterpk-tpcc failed [C-test-failure](https://api.github.com/repos/cockroachdb/cockroach/labels/C-test-failure) [O-roachtest](https://api.github.com/repos/cockroachdb/cockroach/labels/O-roachtest) [O-robot](https://api.github.com/repos/cockroachdb/cockroach/labels/O-robot) [branch-master](https://api.github.com/repos/cockroachdb/cockroach/labels/branch-master) [release-blocker](https://api.github.com/repos/cockroachdb/cockroach/labels/release-blocker)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Aalterpk-tpcc.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| test |
roachtest alterpk tpcc failed on the test failed on branch provisional beta cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts alterpk tpcc run test runner go test timed out more artifacts related roachtest alterpk tpcc failed roachtest alterpk tpcc failed powered by
| 1
| 162,331 | 12,651,356,810 | IssuesEvent | 2020-06-17 00:03:45 | nearprotocol/nearcore | https://api.github.com/repos/nearprotocol/nearcore | closed | Test zero downtime upgrade for validators | testing |
In https://github.com/near/docs/issues/341, @ilblackdragon described the way for validators to seamlessly upgrade their nodes (by spinning up a new node and stake with a new key). However, we have not yet done any testing to make sure that this works in practice. There should at least be some sanity test to show that the procedure described in https://github.com/near/docs/issues/341 isn't trivially broken.
| 1.0 |
Test zero downtime upgrade for validators - In https://github.com/near/docs/issues/341, @ilblackdragon described the way for validators to seamlessly upgrade their nodes (by spinning up a new node and stake with a new key). However, we have not yet done any testing to make sure that this works in practice. There should at least be some sanity test to show that the procedure described in https://github.com/near/docs/issues/341 isn't trivially broken.
| test |
test zero downtime upgrade for validators in ilblackdragon described the way for validators to seamlessly upgrade their nodes by spinning up a new node and stake with a new key however we have not yet done any testing to make sure that this works in practice there should at least be some sanity test to show that the procedure described in isn t trivially broken
| 1
| 89,067 | 8,189,131,585 | IssuesEvent | 2018-08-30 06:10:35 | dotnet/docs | https://api.github.com/repos/dotnet/docs | closed | b | test-issue |
# Issue Title
Be as descriptive as you can with your title. If your issue requests
a new topic, preface the title with [NewTopic].
# General
We use issues to drive the discussion for changes to
existing topics and the creation of new topics.
This way, we have those discussions before significant
effort is spent writing a new topic. Click on the Guidelines
for Contributing link above for details.
## Issues with Existing Topics
If the issue is a simple typo or similar correction, you can just submit a PR.
Otherwise, let us know what's wrong or what we should improve. Include a
link to the topic.
## Requests for new Topics
1. Tell us where this topic should go in the Table of Contents.
- Consider where you looked for this information before opening an issue.
2. Write an abstract
- In one **short** paragraph, describe what this topic will cover.
3. Fill in an outline
- Write a complete outline for the new topic. We'll help review the outline and approve that before anyone writes a topic.
4. Suggest reviewers
- If you know someone who can provide feedback, use '@' to ask them to review.
| 1.0 |
b - # Issue Title
Be as descriptive as you can with your title. If your issue requests
a new topic, preface the title with [NewTopic].
# General
We use issues to drive the discussion for changes to
existing topics and the creation of new topics.
This way, we have those discussions before significant
effort is spent writing a new topic. Click on the Guidelines
for Contributing link above for details.
## Issues with Existing Topics
If the issue is a simple typo or similar correction, you can just submit a PR.
Otherwise, let us know what's wrong or what we should improve. Include a
link to the topic.
## Requests for new Topics
1. Tell us where this topic should go in the Table of Contents.
- Consider where you looked for this information before opening an issue.
2. Write an abstract
- In one **short** paragraph, describe what this topic will cover.
3. Fill in an outline
- Write a complete outline for the new topic. We'll help review the outline and approve that before anyone writes a topic.
4. Suggest reviewers
- If you know someone who can provide feedback, use '@' to ask them to review.
| test |
b issue title be as descriptive as you can with your title if your issue requests a new topic preface the title with general we use issues to drive the discussion for changes to existing topics and the creation of new topics this way we have those discussions before significant effort is spent writing a new topic click on the guidelines for contributing link above for details issues with existing topics if the issue is a simple typo or similar correction you can just submit a pr otherwise let us know what s wrong or what we should improve include a link to the topic requests for new topics tell us where this topic should go in the table of contents consider where you looked for this information before opening an issue write an abstract in one short paragraph describe what this topic will cover fill in an outline write a complete outline for the new topic we ll help review the outline and approve that before anyone writes a topic suggest reviewers if you know someone who can provide feedback use to ask them to review
| 1
| 85,312 | 7,966,081,010 | IssuesEvent | 2018-07-14 17:15:17 | bitcoin/bitcoin | https://api.github.com/repos/bitcoin/bitcoin | closed | Bootstrap new node test | Tests good first issue |
We should have test infrastructure that bootstraps a new node (both on mainnet and on testnet). People (e.g. myself) have been doing this less formally prior to release, but we should be doing it continuously.
Should also probably test bootstrapping over tor.
Collecting some metrics like time to bootstrap (both to some specified heights as well as to completion) along the way would probably be useful.
| 1.0 |
Bootstrap new node test - We should have test infrastructure that bootstraps a new node (both on mainnet and on testnet). People (e.g. myself) have been doing this less formally prior to release, but we should be doing it continuously.
Should also probably test bootstrapping over tor.
Collecting some metrics like time to bootstrap (both to some specified heights as well as to completion) along the way would probably be useful.
|
test
|
bootstrap new node test we should have test infrastructure that bootstraps a new node both on mainnet and on testnet people e g myself have been doing this less formally prior to release but we should be doing it continuously should also probably test bootstrapping over tor collecting some metrics like time to bootstrap both to some specified heights as well as to completion along the way would probably be useful
| 1
|
223,036
| 7,445,865,713
|
IssuesEvent
|
2018-03-28 07:03:26
|
gluster/glusterd2
|
https://api.github.com/repos/gluster/glusterd2
|
closed
|
HA: Notify glusterfs clients of newly added glusterd2 instances
|
FW: Cluster Management FW: RPC priority: low
|
When a glusterfs client (fuse or otherwise) is connected to glusterd2 instance, it needs to be notified of other available (backup) glusterd2 instances that it can connect to. One way to do this is to notify all connected clients when a new peer (glusterd2 instance) is added to the cluster.
Reported by `rtalur`
|
1.0
|
HA: Notify glusterfs clients of newly added glusterd2 instances - When a glusterfs client (fuse or otherwise) is connected to glusterd2 instance, it needs to be notified of other available (backup) glusterd2 instances that it can connect to. One way to do this is to notify all connected clients when a new peer (glusterd2 instance) is added to the cluster.
Reported by `rtalur`
|
non_test
|
ha notify glusterfs clients of newly added instances when a glusterfs client fuse or otherwise is connected to instance it needs to be notified of other available backup instances that it can connect to one way to do this is to notify all connected clients when a new peer instance is added to the cluster reported by rtalur
| 0
|
87,719
| 8,120,153,586
|
IssuesEvent
|
2018-08-16 00:56:25
|
equella/Equella
|
https://api.github.com/repos/equella/Equella
|
closed
|
APIDocs page refreshes on button clicks with old UI
|
Backport-6.6-Stable Ready for Testing bug
|
The form submission isn't being disabled for normal button clicks
|
1.0
|
APIDocs page refreshes on button clicks with old UI - The form submission isn't being disabled for normal button clicks
|
test
|
apidocs page refreshes on button clicks with old ui the form submission isn t being disabled for normal button clicks
| 1
|
204,449
| 15,443,708,552
|
IssuesEvent
|
2021-03-08 09:28:36
|
mozilla-mobile/fenix
|
https://api.github.com/repos/mozilla-mobile/fenix
|
closed
|
verifyRateOnGooglePlayRedirect test triggers a Google Play ToS dialog that blocks other tests
|
eng:ui-test
|
### Firebase Test Run:
https://console.firebase.google.com/u/0/project/moz-fenix/testlab/histories/bh.66b7091e15d53d45/matrices/7755699007721319983/executions/bs.6059beeab37a8c34/test-cases
https://console.firebase.google.com/project/moz-fenix/testlab/histories/bh.66b7091e15d53d45/matrices/7890252434993435457/executions/bs.ad28916039cc29ee/test-cases
Screenshot of the dialog covering the app:

|
1.0
|
verifyRateOnGooglePlayRedirect test triggers a Google Play ToS dialog that blocks other tests - ### Firebase Test Run:
https://console.firebase.google.com/u/0/project/moz-fenix/testlab/histories/bh.66b7091e15d53d45/matrices/7755699007721319983/executions/bs.6059beeab37a8c34/test-cases
https://console.firebase.google.com/project/moz-fenix/testlab/histories/bh.66b7091e15d53d45/matrices/7890252434993435457/executions/bs.ad28916039cc29ee/test-cases
Screenshot of the dialog covering the app:

|
test
|
verifyrateongoogleplayredirect test triggers a google play tos dialog that blocks other tests firebase test run screenshot of the dialog covering the app
| 1
|
612,514
| 19,024,247,413
|
IssuesEvent
|
2021-11-24 00:07:22
|
google/ground-android
|
https://api.github.com/repos/google/ground-android
|
opened
|
[Code health] Fix duplicate / unclear `EditObservation*`flows
|
type: cleanup priority: p2
|
There are several problems with the lifecycle of `EditObservationFragment` and `EditObservationViewModel` while working on #1077:
* `initialize()` is called when the ViewModel is being restored, even if it was already initialized. As a result, the Form is re-rendered multiple times: once when the ViewModel is observed, and once when the Observation reloads.
* There are also several click event streams which use LiveData, which replays events which could lead to unexpected behaviors.
* New subscriptions are added each time the form is rebuilt, but are only disposed when the Fragment is destroyed. This causes a leak, and for event handlers to be called multiple times (once for each subscription created on each config change). One approach would be to add them to a `CompositeDisposable` that gets cleared each time the form is rebuilt.
* How the `form` LiveData is derived from the initialize stream is unintuitive; might be better to rely on side effects in this case (`MutableLiveData.setValue()`).
@shobhitagarwal1612 FYI.
|
1.0
|
[Code health] Fix duplicate / unclear `EditObservation*`flows - There are several problems with the lifecycle of `EditObservationFragment` and `EditObservationViewModel` while working on #1077:
* `initialize()` is called when the ViewModel is being restored, even if it was already initialized. As a result, the Form is re-rendered multiple times: once when the ViewModel is observed, and once when the Observation reloads.
* There are also several click event streams which use LiveData, which replays events which could lead to unexpected behaviors.
* New subscriptions are added each time the form is rebuilt, but are only disposed when the Fragment is destroyed. This causes a leak, and for event handlers to be called multiple times (once for each subscription created on each config change). One approach would be to add them to a `CompositeDisposable` that gets cleared each time the form is rebuilt.
* How the `form` LiveData is derived from the initialize stream is unintuitive; might be better to rely on side effects in this case (`MutableLiveData.setValue()`).
@shobhitagarwal1612 FYI.
|
non_test
|
fix duplicate unclear editobservation flows there are several problems with the lifecycle of editobservationfragment and editobservationviewmodel while working on initialize is called when the viewmodel is being restored even if it was already initialized as a result the form is re rendered multiple times once when the viewmodel is observed and once when the observation reloads there are also several click event streams which use livedata which replays events which could lead to unexpected behaviors new subscriptions are added each time the form is rebuilt but are only disposed when the fragment is destroyed this causes a leak and for event handlers to be called multiple times once for each subscription created on each config change one approach would be to add them to a compositedisposable that gets cleared each time the form is rebuilt how the form livedata is derived from the initialize stream is unintuitive might be better to rely on side effects in this case mutablelivedata setvalue fyi
| 0
|
157,965
| 12,397,121,435
|
IssuesEvent
|
2020-05-20 21:56:52
|
brave/brave-browser
|
https://api.github.com/repos/brave/brave-browser
|
closed
|
Have automated tests use IME or a different locale
|
l10n stale tests
|
Carried over from https://github.com/brave/browser-laptop/issues/9045
## Description
We should have tests that are running using a different system locale (such as Japanese). We should also have automated tests which try to visit IDNs and possibly form fill using IME
|
1.0
|
Have automated tests use IME or a different locale - Carried over from https://github.com/brave/browser-laptop/issues/9045
## Description
We should have tests that are running using a different system locale (such as Japanese). We should also have automated tests which try to visit IDNs and possibly form fill using IME
|
test
|
have automated tests use ime or a different locale carried over from description we should have tests that are running using a different system locale such as japanese we should also have automated tests which try to visit idns and possibly form fill using ime
| 1
|
189,527
| 14,512,806,642
|
IssuesEvent
|
2020-12-13 02:46:07
|
cpalmer718/book.parsing
|
https://api.github.com/repos/cpalmer718/book.parsing
|
closed
|
Add tests
|
tests
|
Not sure how to best go about this yet as the input data need to be kept private, which is fine but also any comparisons need to be kept private. Might need to generate a dummy file with representative formatting.
|
1.0
|
Add tests - Not sure how to best go about this yet as the input data need to be kept private, which is fine but also any comparisons need to be kept private. Might need to generate a dummy file with representative formatting.
|
test
|
add tests not sure how to best go about this yet as the input data need to be kept private which is fine but also any comparisons need to be kept private might need to generate a dummy file with representative formatting
| 1
|