Unnamed: 0 int64 0 832k | id float64 2.49B 32.1B | type stringclasses 1 value | created_at stringlengths 19 19 | repo stringlengths 4 112 | repo_url stringlengths 33 141 | action stringclasses 3 values | title stringlengths 1 1.02k | labels stringlengths 4 1.54k | body stringlengths 1 262k | index stringclasses 17 values | text_combine stringlengths 95 262k | label stringclasses 2 values | text stringlengths 96 252k | binary_label int64 0 1 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
680,535 | 23,275,581,958 | IssuesEvent | 2022-08-05 06:48:02 | webcompat/web-bugs | https://api.github.com/repos/webcompat/web-bugs | closed | www.oregon.gov - see bug description | browser-firefox priority-normal type-no-css engine-gecko | <!-- @browser: Firefox 103.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:103.0) Gecko/20100101 Firefox/103.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/108526 -->
**URL**: https://www.oregon.gov/DEQ/Pages/index.aspx?utm_source=pocket_mylist
**Browser / Version**: Firefox 103.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Something else
**Description**: on English pages in large font, I see this A LOT "arrow_drop_down"
**Steps to Reproduce**:
I opened my firefox browser and went to a website and saw "arrow_drop_down" here and there on the websites I visit. It's clearly not supposed to be there.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/8/db7c01ae-5e10-4be0-83e8-f66ef16a1044.jpg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | 1.0 | www.oregon.gov - see bug description - <!-- @browser: Firefox 103.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:103.0) Gecko/20100101 Firefox/103.0 -->
<!-- @reported_with: unknown -->
<!-- @public_url: https://github.com/webcompat/web-bugs/issues/108526 -->
**URL**: https://www.oregon.gov/DEQ/Pages/index.aspx?utm_source=pocket_mylist
**Browser / Version**: Firefox 103.0
**Operating System**: Windows 10
**Tested Another Browser**: Yes Edge
**Problem type**: Something else
**Description**: on English pages in large font, I see this A LOT "arrow_drop_down"
**Steps to Reproduce**:
I opened my firefox browser and went to a website and saw "arrow_drop_down" here and there on the websites I visit. It's clearly not supposed to be there.
<details>
<summary>View the screenshot</summary>
<img alt="Screenshot" src="https://webcompat.com/uploads/2022/8/db7c01ae-5e10-4be0-83e8-f66ef16a1044.jpg">
</details>
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_ | non_test | see bug description url browser version firefox operating system windows tested another browser yes edge problem type something else description on english pages in large font i see this a lot arrow drop down steps to reproduce i opened my firefox browser and went to a website and saw arrow drop down here and there on the websites i visit it s clearly not supposed to be there view the screenshot img alt screenshot src browser configuration none from with ❤️ | 0 |
469,919 | 13,527,949,149 | IssuesEvent | 2020-09-15 16:03:20 | wso2/product-apim | https://api.github.com/repos/wso2/product-apim | opened | Missing link in 3.2 doc | Docs/Has Impact Priority/Normal Type/Docs | ### Description:
Link on the following page
https://apim.docs.wso2.com/en/latest/develop/extending-api-manager/extending-workflows/configuring-http-redirection-for-workflows/
on last paragraf
Invoking the API Manager¶
To invoke the API Manager from a third party entity, see **Invoking the API Manager from the BPEL Engine** .
link is not working
https://apim.docs.wso2.com/en/latest/develop/extending-api-manager/extending-workflows/configuring-http-redirection-for-workflows/invoking-the-api-manager-from-the-bpel-engine
| 1.0 | Missing link in 3.2 doc - ### Description:
Link on the following page
https://apim.docs.wso2.com/en/latest/develop/extending-api-manager/extending-workflows/configuring-http-redirection-for-workflows/
on last paragraf
Invoking the API Manager¶
To invoke the API Manager from a third party entity, see **Invoking the API Manager from the BPEL Engine** .
link is not working
https://apim.docs.wso2.com/en/latest/develop/extending-api-manager/extending-workflows/configuring-http-redirection-for-workflows/invoking-the-api-manager-from-the-bpel-engine
| non_test | missing link in doc description link on the following page on last paragraf invoking the api manager¶ to invoke the api manager from a third party entity see invoking the api manager from the bpel engine link is not working | 0 |
346,126 | 24,886,567,819 | IssuesEvent | 2022-10-28 08:17:59 | craeyeons/ped | https://api.github.com/repos/craeyeons/ped | opened | Misuse of Square Brackets in UG | type.DocumentationBug severity.Low | 
For example in the "adding user" command, name is not an optional field, however square brackets are used to surround it which contradicts the instructions of the user guide.
<!--session: 1666944929838-03c33bcc-827d-4101-bd42-4c0ffb008d04-->
<!--Version: Web v3.4.4--> | 1.0 | Misuse of Square Brackets in UG - 
For example in the "adding user" command, name is not an optional field, however square brackets are used to surround it which contradicts the instructions of the user guide.
<!--session: 1666944929838-03c33bcc-827d-4101-bd42-4c0ffb008d04-->
<!--Version: Web v3.4.4--> | non_test | misuse of square brackets in ug for example in the adding user command name is not an optional field however square brackets are used to surround it which contradicts the instructions of the user guide | 0 |
238,186 | 19,701,966,777 | IssuesEvent | 2022-01-12 17:29:35 | WordPress/gutenberg | https://api.github.com/repos/WordPress/gutenberg | closed | Removing title support will partially hide the first block floating menu | Needs Testing | ### Description
When registering a new custom post type through [register_post_type()](https://developer.wordpress.org/reference/functions/register_post_type/), and setting the `support` parameter, removing the `title` core feature will partially hide the first page block floating menu.
### Step-by-step reproduction instructions
1. Remove support for the `title` feature from a post type.
```php
<?php
add_action( 'init', function () {
remove_post_type_support( 'post', 'title' );
} );
```
2. Add a new post.
3. The first block floating menu is partially hidden.
### Screenshots, screen recording, code snippet

### Environment info
- Wordpress 5.8.2, matching Gutenberg on default theme.
- Chrome Version 96.0.4664.110 (Build officiel) (64 bits)
- Desktop Windows 10
### Please confirm that you have searched existing issues in the repo.
Yes
### Please confirm that you have tested with all plugins deactivated except Gutenberg.
Yes | 1.0 | Removing title support will partially hide the first block floating menu - ### Description
When registering a new custom post type through [register_post_type()](https://developer.wordpress.org/reference/functions/register_post_type/), and setting the `support` parameter, removing the `title` core feature will partially hide the first page block floating menu.
### Step-by-step reproduction instructions
1. Remove support for the `title` feature from a post type.
```php
<?php
add_action( 'init', function () {
remove_post_type_support( 'post', 'title' );
} );
```
2. Add a new post.
3. The first block floating menu is partially hidden.
### Screenshots, screen recording, code snippet

### Environment info
- Wordpress 5.8.2, matching Gutenberg on default theme.
- Chrome Version 96.0.4664.110 (Build officiel) (64 bits)
- Desktop Windows 10
### Please confirm that you have searched existing issues in the repo.
Yes
### Please confirm that you have tested with all plugins deactivated except Gutenberg.
Yes | test | removing title support will partially hide the first block floating menu description when registering a new custom post type through and setting the support parameter removing the title core feature will partially hide the first page block floating menu step by step reproduction instructions remove support for the title feature from a post type php php add action init function remove post type support post title add a new post the first block floating menu is partially hidden screenshots screen recording code snippet environment info wordpress matching gutenberg on default theme chrome version build officiel bits desktop windows please confirm that you have searched existing issues in the repo yes please confirm that you have tested with all plugins deactivated except gutenberg yes | 1 |
124,711 | 16,638,290,754 | IssuesEvent | 2021-06-04 03:53:30 | PostHog/posthog | https://api.github.com/repos/PostHog/posthog | opened | Legend key in chart | UI/UX core-experience design enhancement | ## Is your feature request related to a problem?
Currently we show the legend table describing each graph series at the bottom of the graph. This is not ideal because a) you have to know it's there, b) you have to scroll which makes visualization more difficult and c) doesn't work well for screenshots.
## Describe the solution you'd like
I'd like us to introduce a legend key within the graph. See one example below.
<img width="578" alt="Screen Shot 2021-06-03 at 8 43 36 PM" src="https://user-images.githubusercontent.com/25164963/120729143-6736dd80-c4ac-11eb-93ec-3a4c6b35713d.png">
## Describe alternatives you've considered
Keep only the legend table at the bottom.
## Additional context
Related to #4492.
#### *Thank you* for your feature request – we love each and every one!
| 1.0 | Legend key in chart - ## Is your feature request related to a problem?
Currently we show the legend table describing each graph series at the bottom of the graph. This is not ideal because a) you have to know it's there, b) you have to scroll which makes visualization more difficult and c) doesn't work well for screenshots.
## Describe the solution you'd like
I'd like us to introduce a legend key within the graph. See one example below.
<img width="578" alt="Screen Shot 2021-06-03 at 8 43 36 PM" src="https://user-images.githubusercontent.com/25164963/120729143-6736dd80-c4ac-11eb-93ec-3a4c6b35713d.png">
## Describe alternatives you've considered
Keep only the legend table at the bottom.
## Additional context
Related to #4492.
#### *Thank you* for your feature request – we love each and every one!
| non_test | legend key in chart is your feature request related to a problem currently we show the legend table describing each graph series at the bottom of the graph this is not ideal because a you have to know it s there b you have to scroll which makes visualization more difficult and c doesn t work well for screenshots describe the solution you d like i d like us to introduce a legend key within the graph see one example below img width alt screen shot at pm src describe alternatives you ve considered keep only the legend table at the bottom additional context related to thank you for your feature request – we love each and every one | 0 |
203,510 | 15,372,903,736 | IssuesEvent | 2021-03-02 11:52:57 | chartjs/Chart.js | https://api.github.com/repos/chartjs/Chart.js | closed | chart.js:7 Uncaught (in promise) TypeError: Cannot set property '_index' of undefined | status: needs test case type: bug | ## Environment
Chart.js 2.9.4
Google Chrome Version 88.0.4324.190 (Official Build) (64-bit)
```
function update(unit, window) {
unit = unit || this.root.find(`.controllers .units .active`).attr('value');
window = window || this.root.find(`.controllers .windows .active`).attr('value');
if (unit && window) {
this.currentUnit = unit;
this.currentWindow = window;
for (const metal of this.keys) {
const metalDatasetIndex = this.lineChart.datasetIndexes[metal];
if (metalDatasetIndex != undefined && metalDatasetIndex != null) {
const dataset = this.lineChart.data.datasets[metalDatasetIndex];
if (dataset) {
dataset.data = getData(metal, unit, window);
if (dataset.data.length > 600) dataset.pointRadius = 0;
else dataset.pointRadius = 1;
}
}
}
}
this.lineChart.update(); // error happens here
//this.lineChart.resetZoom();
}
```
Looks like something with my data but I can't debug it. | 1.0 | chart.js:7 Uncaught (in promise) TypeError: Cannot set property '_index' of undefined - ## Environment
Chart.js 2.9.4
Google Chrome Version 88.0.4324.190 (Official Build) (64-bit)
```
function update(unit, window) {
unit = unit || this.root.find(`.controllers .units .active`).attr('value');
window = window || this.root.find(`.controllers .windows .active`).attr('value');
if (unit && window) {
this.currentUnit = unit;
this.currentWindow = window;
for (const metal of this.keys) {
const metalDatasetIndex = this.lineChart.datasetIndexes[metal];
if (metalDatasetIndex != undefined && metalDatasetIndex != null) {
const dataset = this.lineChart.data.datasets[metalDatasetIndex];
if (dataset) {
dataset.data = getData(metal, unit, window);
if (dataset.data.length > 600) dataset.pointRadius = 0;
else dataset.pointRadius = 1;
}
}
}
}
this.lineChart.update(); // error happens here
//this.lineChart.resetZoom();
}
```
Looks like something with my data but I can't debug it. | test | chart js uncaught in promise typeerror cannot set property index of undefined environment chart js google chrome version official build bit function update unit window unit unit this root find controllers units active attr value window window this root find controllers windows active attr value if unit window this currentunit unit this currentwindow window for const metal of this keys const metaldatasetindex this linechart datasetindexes if metaldatasetindex undefined metaldatasetindex null const dataset this linechart data datasets if dataset dataset data getdata metal unit window if dataset data length dataset pointradius else dataset pointradius this linechart update error happens here this linechart resetzoom looks like something with my data but i can t debug it | 1 |
25,305 | 12,236,714,990 | IssuesEvent | 2020-05-04 16:48:13 | cityofaustin/atd-data-tech | https://api.github.com/repos/cityofaustin/atd-data-tech | opened | Changes and Reports Section Access Rules | Impact: 3-Minor Need: 3-Could Have Product: Vision Zero Crash Data System Service: Dev Type: Enhancement Workgroup: VZ | Make the Changes and Reports sections in VZE only accessible to Admins and IT Supervisors | 1.0 | Changes and Reports Section Access Rules - Make the Changes and Reports sections in VZE only accessible to Admins and IT Supervisors | non_test | changes and reports section access rules make the changes and reports sections in vze only accessible to admins and it supervisors | 0 |
1,953 | 2,579,451,029 | IssuesEvent | 2015-02-13 10:21:01 | ajency/Foodstree | https://api.github.com/repos/ajency/Foodstree | closed | Error while creating a seller if the 'Check if billing address same as registered address' is checked | bug Pushed to live site Pushed to test site | Steps:
1 Create a new seller
2.Check the field 'Check if billing address same as registered address' field
3. Enter the passwords such that there is mismatch
Current behaviour: On entering the correct password the field 'Check if billing address same as registered address' stays checked and the fields below it are also displayed
Expected behaviour:The fields under shipping address should not be displayed if 'Check if billing address same as registered address' is checked.

| 1.0 | Error while creating a seller if the 'Check if billing address same as registered address' is checked - Steps:
1 Create a new seller
2.Check the field 'Check if billing address same as registered address' field
3. Enter the passwords such that there is mismatch
Current behaviour: On entering the correct password the field 'Check if billing address same as registered address' stays checked and the fields below it are also displayed
Expected behaviour:The fields under shipping address should not be displayed if 'Check if billing address same as registered address' is checked.

| test | error while creating a seller if the check if billing address same as registered address is checked steps create a new seller check the field check if billing address same as registered address field enter the passwords such that there is mismatch current behaviour on entering the correct password the field check if billing address same as registered address stays checked and the fields below it are also displayed expected behaviour the fields under shipping address should not be displayed if check if billing address same as registered address is checked | 1 |
379,055 | 11,212,590,711 | IssuesEvent | 2020-01-06 17:57:29 | googleapis/nodejs-logging-bunyan | https://api.github.com/repos/googleapis/nodejs-logging-bunyan | closed | NodeJS 10: Async bunyan logging crashes the cloud function if await is not used while logging | priority: p2 type: bug | Issue:
- nodeJS 10 Cloud function runs but results in a crash while using bunyan logs and publishing to another topic!
## Environment details
- OS: Google cloud function
- Node.js version: v10.14.2
- npm version: 6.4.1
- `@google-cloud/logging-bunyan` version: 2.0.0
#### Steps to reproduce
1. Create a cloud function using nodeJS 10 runtime with pubsub as trigger
2. use bunyan logging and [redirect logs](https://github.com/googleapis/nodejs-logging-bunyan/issues/291#issuecomment-526758642) to cloud functions.
3. Try and publish to a sample topic and use bunyan logs right after publish
Error reported:
```
Ignoring extra callback call
```
```
Function execution took 1852 ms, finished with status: 'crash'
```
```
{
insertId: "000000-811aac99-2369-4f5b-801f-19364c10437c"
labels: {…}
logName: "projects/some-project/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
receiveTimestamp: "2019-12-03T18:22:41.491846254Z"
resource: {
labels: {
function_name: "test-logger-lib-publish-duplicate"
project_id: "some-project"
region: "us-central1"
}
type: "cloud_function"
}
severity: "ERROR"
textPayload: "Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
at GoogleAuth.getApplicationDefaultAsync (/srv/functions/node_modules/google-auth-library/build/src/auth/googleauth.js:161:19)
at process._tickCallback (internal/process/next_tick.js:68:7)"
timestamp: "2019-12-03T18:22:40.664Z"
trace: "projects/some-project/traces/ce459298d8da7569d7fb40b07785d594"
}
```
Sample Code:
index.js
```const { PubSub } = require('@google-cloud/pubsub');
const bunyan = require('bunyan');
const { LoggingBunyan } = require('@google-cloud/logging-bunyan');
const loggingBunyan = new LoggingBunyan({
logName: process.env.LOG_NAME
});
function getLogger(logginglevel) {
let logger = bunyan.createLogger({
name: process.env.FUNCTION_TARGET,
level: logginglevel,
streams: [
loggingBunyan.stream()
]
});
logger.debug(`${process.env.FUNCTION_TARGET} :: ${process.env.LOG_NAME}`);
return logger;
}
const pubsub = new PubSub({
projectId: 'some-project'
});
exports.helloPubSub = async(event, context) => {
console.log('I am in!');
const logger = getLogger('debug');
const test_topic = pubsub.topic('logger-topic');
logger.debug('debug level test');
const data = Buffer.from('interesting', 'utf8');
await test_topic.publish(data);
logger.error('error level test');
logger.info('I am out');
};
```
package.json
```{
"name": "sample-pubsub",
"version": "0.0.1",
"dependencies": {
"@google-cloud/pubsub": "^1.0.0",
"bunyan": "^1.8.12",
"@google-cloud/logging-bunyan": "^2.0.0"
}
}
```
Note:
- The same code works fine for nodejs 8 runtime! This issue is only with nodejs 10 runtime.
- In the above index.js file, if I use await on the last logger.info('I am out') or all the logger calls, the function works like a charm!
Could anyone help me with what's wrong here?
Reference issues:
- [Async logging nodeJS 10](https://github.com/googleapis/nodejs-logging-bunyan/issues/304)
- [Redirect logs explicitly to cloud functions nodeJS 10](https://github.com/googleapis/nodejs-logging-bunyan/issues/291#issuecomment-526758642) | 1.0 | NodeJS 10: Async bunyan logging crashes the cloud function if await is not used while logging - Issue:
- nodeJS 10 Cloud function runs but results in a crash while using bunyan logs and publishing to another topic!
## Environment details
- OS: Google cloud function
- Node.js version: v10.14.2
- npm version: 6.4.1
- `@google-cloud/logging-bunyan` version: 2.0.0
#### Steps to reproduce
1. Create a cloud function using nodeJS 10 runtime with pubsub as trigger
2. use bunyan logging and [redirect logs](https://github.com/googleapis/nodejs-logging-bunyan/issues/291#issuecomment-526758642) to cloud functions.
3. Try and publish to a sample topic and use bunyan logs right after publish
Error reported:
```
Ignoring extra callback call
```
```
Function execution took 1852 ms, finished with status: 'crash'
```
```
{
insertId: "000000-811aac99-2369-4f5b-801f-19364c10437c"
labels: {…}
logName: "projects/some-project/logs/cloudfunctions.googleapis.com%2Fcloud-functions"
receiveTimestamp: "2019-12-03T18:22:41.491846254Z"
resource: {
labels: {
function_name: "test-logger-lib-publish-duplicate"
project_id: "some-project"
region: "us-central1"
}
type: "cloud_function"
}
severity: "ERROR"
textPayload: "Error: Could not load the default credentials. Browse to https://cloud.google.com/docs/authentication/getting-started for more information.
at GoogleAuth.getApplicationDefaultAsync (/srv/functions/node_modules/google-auth-library/build/src/auth/googleauth.js:161:19)
at process._tickCallback (internal/process/next_tick.js:68:7)"
timestamp: "2019-12-03T18:22:40.664Z"
trace: "projects/some-project/traces/ce459298d8da7569d7fb40b07785d594"
}
```
Sample Code:
index.js
```const { PubSub } = require('@google-cloud/pubsub');
const bunyan = require('bunyan');
const { LoggingBunyan } = require('@google-cloud/logging-bunyan');
const loggingBunyan = new LoggingBunyan({
logName: process.env.LOG_NAME
});
function getLogger(logginglevel) {
let logger = bunyan.createLogger({
name: process.env.FUNCTION_TARGET,
level: logginglevel,
streams: [
loggingBunyan.stream()
]
});
logger.debug(`${process.env.FUNCTION_TARGET} :: ${process.env.LOG_NAME}`);
return logger;
}
const pubsub = new PubSub({
projectId: 'some-project'
});
exports.helloPubSub = async(event, context) => {
console.log('I am in!');
const logger = getLogger('debug');
const test_topic = pubsub.topic('logger-topic');
logger.debug('debug level test');
const data = Buffer.from('interesting', 'utf8');
await test_topic.publish(data);
logger.error('error level test');
logger.info('I am out');
};
```
package.json
```{
"name": "sample-pubsub",
"version": "0.0.1",
"dependencies": {
"@google-cloud/pubsub": "^1.0.0",
"bunyan": "^1.8.12",
"@google-cloud/logging-bunyan": "^2.0.0"
}
}
```
Note:
- The same code works fine for nodejs 8 runtime! This issue is only with nodejs 10 runtime.
- In the above index.js file, if I use await on the last logger.info('I am out') or all the logger calls, the function works like a charm!
Could anyone help me with what's wrong here?
Reference issues:
- [Async logging nodeJS 10](https://github.com/googleapis/nodejs-logging-bunyan/issues/304)
- [Redirect logs explicitly to cloud functions nodeJS 10](https://github.com/googleapis/nodejs-logging-bunyan/issues/291#issuecomment-526758642) | non_test | nodejs async bunyan logging crashes the cloud function if await is not used while logging issue nodejs cloud function runs but results in a crash while using bunyan logs and publishing to another topic environment details os google cloud function node js version npm version google cloud logging bunyan version steps to reproduce create a cloud function using nodejs runtime with pubsub as trigger use bunyan logging and to cloud functions try and publish to a sample topic and use bunyan logs right after publish error reported ignoring extra callback call function execution took ms finished with status crash insertid labels … logname projects some project logs cloudfunctions googleapis com functions receivetimestamp resource labels function name test logger lib publish duplicate project id some project region us type cloud function severity error textpayload error could not load the default credentials browse to for more information at googleauth getapplicationdefaultasync srv functions node modules google auth library build src auth googleauth js at process tickcallback internal process next tick js timestamp trace projects some project traces sample code index js const pubsub require google cloud pubsub const bunyan require bunyan const loggingbunyan require google cloud logging bunyan const loggingbunyan new loggingbunyan logname process env log name function getlogger logginglevel let logger bunyan createlogger name process env function target level logginglevel streams loggingbunyan stream logger debug process env function target process env log name return logger const pubsub new pubsub projectid some project exports hellopubsub async event context console log i am in const logger getlogger debug const test topic pubsub topic logger topic logger debug debug level test const data buffer from interesting await test topic publish data logger error error level test logger info i am out package json name sample pubsub version dependencies google cloud pubsub bunyan google cloud logging bunyan note the same code works fine for nodejs runtime this issue is only with nodejs runtime in the above index js file if i use await on the last logger info i am out or all the logger calls the function works like a charm could anyone help me with what s wrong here reference issues | 0 |
189,048 | 14,484,090,977 | IssuesEvent | 2020-12-10 15:55:27 | kalexmills/github-vet-tests-dec2020 | https://api.github.com/repos/kalexmills/github-vet-tests-dec2020 | closed | gowroc/meetups: slices_and_strings/slices_iteration_benchmark/slices_interation_test.go; 6 LoC | fresh test tiny |
Found a possible issue in [gowroc/meetups](https://www.github.com/gowroc/meetups) at [slices_and_strings/slices_iteration_benchmark/slices_interation_test.go](https://github.com/gowroc/meetups/blob/2204bf02b581a3d700b5bb3e75eb7ded9e2ba538/slices_and_strings/slices_iteration_benchmark/slices_interation_test.go#L26-L31)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call at line 30 may store a reference to b
[Click here to see the code in its original context.](https://github.com/gowroc/meetups/blob/2204bf02b581a3d700b5bb3e75eb7ded9e2ba538/slices_and_strings/slices_iteration_benchmark/slices_interation_test.go#L26-L31)
<details>
<summary>Click here to show the 6 line(s) of Go which triggered the analyzer.</summary>
```go
for i, b := range bigs {
if i > n {
return
}
accept(&b)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 2204bf02b581a3d700b5bb3e75eb7ded9e2ba538
| 1.0 | gowroc/meetups: slices_and_strings/slices_iteration_benchmark/slices_interation_test.go; 6 LoC -
Found a possible issue in [gowroc/meetups](https://www.github.com/gowroc/meetups) at [slices_and_strings/slices_iteration_benchmark/slices_interation_test.go](https://github.com/gowroc/meetups/blob/2204bf02b581a3d700b5bb3e75eb7ded9e2ba538/slices_and_strings/slices_iteration_benchmark/slices_interation_test.go#L26-L31)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> function call at line 30 may store a reference to b
[Click here to see the code in its original context.](https://github.com/gowroc/meetups/blob/2204bf02b581a3d700b5bb3e75eb7ded9e2ba538/slices_and_strings/slices_iteration_benchmark/slices_interation_test.go#L26-L31)
<details>
<summary>Click here to show the 6 line(s) of Go which triggered the analyzer.</summary>
```go
for i, b := range bigs {
if i > n {
return
}
accept(&b)
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: 2204bf02b581a3d700b5bb3e75eb7ded9e2ba538
| test | gowroc meetups slices and strings slices iteration benchmark slices interation test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message function call at line may store a reference to b click here to show the line s of go which triggered the analyzer go for i b range bigs if i n return accept b leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id | 1 |
327,290 | 28,051,999,939 | IssuesEvent | 2023-03-29 06:42:16 | ALTA-LapakUMKM-Group-2/LapakUMKM-APITesting | https://api.github.com/repos/ALTA-LapakUMKM-Group-2/LapakUMKM-APITesting | closed | [User-A013] GET User With Invalid Parameter | Manual Api Testing | **Given** Get list users with invalid parameter
**And** Send Get users
**Then** API should return respon 404 Not Found | 1.0 | [User-A013] GET User With Invalid Parameter - **Given** Get list users with invalid parameter
**And** Send Get users
**Then** API should return respon 404 Not Found | test | get user with invalid parameter given get list users with invalid parameter and send get users then api should return respon not found | 1 |
303,485 | 26,213,059,809 | IssuesEvent | 2023-01-04 08:38:30 | ballerina-platform/ballerina-lang | https://api.github.com/repos/ballerina-platform/ballerina-lang | closed | [Improvement]: Provide the Graal VM native support for function mocking | Type/Improvement Team/DevTools Area/TestFramework | ### Description
$subject
Related to #37690
### Describe your problem(s)
_No response_
### Describe your solution(s)
_No response_
### Related area
-> Test Framework
### Related issue(s) (optional)
_No response_
### Suggested label(s) (optional)
_No response_
### Suggested assignee(s) (optional)
_No response_ | 1.0 | [Improvement]: Provide the Graal VM native support for function mocking - ### Description
$subject
Related to #37690
### Describe your problem(s)
_No response_
### Describe your solution(s)
_No response_
### Related area
-> Test Framework
### Related issue(s) (optional)
_No response_
### Suggested label(s) (optional)
_No response_
### Suggested assignee(s) (optional)
_No response_ | test | provide the graal vm native support for function mocking description subject related to describe your problem s no response describe your solution s no response related area test framework related issue s optional no response suggested label s optional no response suggested assignee s optional no response | 1 |
391,722 | 26,905,300,387 | IssuesEvent | 2023-02-06 18:33:26 | DD2480-group30/CI | https://api.github.com/repos/DD2480-group30/CI | closed | Document the headers from Github's webhooks | documentation | Add documentation about how the HTTP POST request is formed from Github.
If such documentation is available online, add a link to it. Otherwise, write our own documentation. | 1.0 | Document the headers from Github's webhooks - Add documentation about how the HTTP POST request is formed from Github.
If such documentation is available online, add a link to it. Otherwise, write our own documentation. | non_test | document the headers from github s webhooks add documentation about how the http post request is formed from github if such documentation is available online add a link to it otherwise write our own documentation | 0 |
7,620 | 18,670,903,799 | IssuesEvent | 2021-10-30 17:36:48 | jtkaufman737/CMSC-495 | https://api.github.com/repos/jtkaufman737/CMSC-495 | closed | Deployment options | discussion-needed backend architecture | I was revisiting AWS stuff the other day briefly and they do have some "Free forever" tier options. The main thing I was hoping to get in a deployment option would be that it is free forever so if any group member wants to include it as part of a professional portfolio, there are no worries about it going offline when a free trial expires.
Unfortunately I think all the cloud providers (azure, GCP, AWS maaaaybe although I need to look more at the free forever tier) have some trip point in time where you would have to pay for deployments. Maybe one task can be looking into that.
There are also things like pythonanywhere and heroku which could definitely be used free indefinitely | 1.0 | Deployment options - I was revisiting AWS stuff the other day briefly and they do have some "Free forever" tier options. The main thing I was hoping to get in a deployment option would be that it is free forever so if any group member wants to include it as part of a professional portfolio, there are no worries about it going offline when a free trial expires.
Unfortunately I think all the cloud providers (azure, GCP, AWS maaaaybe although I need to look more at the free forever tier) have some trip point in time where you would have to pay for deployments. Maybe one task can be looking into that.
There are also things like pythonanywhere and heroku which could definitely be used free indefinitely | non_test | deployment options i was revisiting aws stuff the other day briefly and they do have some free forever tier options the main thing i was hoping to get in a deployment option would be that it is free forever so if any group member wants to include it as part of a professional portfolio there are no worries about it going offline when a free trial expires unfortunately i think all the cloud providers azure gcp aws maaaaybe although i need to look more at the free forever tier have some trip point in time where you would have to pay for deployments maybe one task can be looking into that there are also things like pythonanywhere and heroku which could definitely be used free indefinitely | 0 |
164,253 | 20,364,418,332 | IssuesEvent | 2022-02-21 02:45:18 | faizulho/sanity-gatsby-blog | https://api.github.com/repos/faizulho/sanity-gatsby-blog | opened | CVE-2022-0512 (High) detected in url-parse-1.4.7.tgz | security vulnerability | ## CVE-2022-0512 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /studio/package.json</p>
<p>Path to vulnerable library: /studio/node_modules/url-parse/package.json,/web/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- gatsby-source-sanity-7.0.7.tgz (Root Library)
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.6.
<p>Publish Date: 2022-02-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0512>CVE-2022-0512</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0512">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0512</a></p>
<p>Release Date: 2022-02-14</p>
<p>Fix Resolution: url-parse - 1.5.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-0512 (High) detected in url-parse-1.4.7.tgz - ## CVE-2022-0512 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>url-parse-1.4.7.tgz</b></p></summary>
<p>Small footprint URL parser that works seamlessly across Node.js and browser environments</p>
<p>Library home page: <a href="https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz">https://registry.npmjs.org/url-parse/-/url-parse-1.4.7.tgz</a></p>
<p>Path to dependency file: /studio/package.json</p>
<p>Path to vulnerable library: /studio/node_modules/url-parse/package.json,/web/node_modules/url-parse/package.json</p>
<p>
Dependency Hierarchy:
- gatsby-source-sanity-7.0.7.tgz (Root Library)
- :x: **url-parse-1.4.7.tgz** (Vulnerable Library)
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Authorization Bypass Through User-Controlled Key in NPM url-parse prior to 1.5.6.
<p>Publish Date: 2022-02-14
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-0512>CVE-2022-0512</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0512">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-0512</a></p>
<p>Release Date: 2022-02-14</p>
<p>Fix Resolution: url-parse - 1.5.6</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve high detected in url parse tgz cve high severity vulnerability vulnerable library url parse tgz small footprint url parser that works seamlessly across node js and browser environments library home page a href path to dependency file studio package json path to vulnerable library studio node modules url parse package json web node modules url parse package json dependency hierarchy gatsby source sanity tgz root library x url parse tgz vulnerable library found in base branch master vulnerability details authorization bypass through user controlled key in npm url parse prior to publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution url parse step up your open source security game with whitesource | 0 |
336,876 | 30,225,480,014 | IssuesEvent | 2023-07-05 23:50:45 | unifyai/ivy | https://api.github.com/repos/unifyai/ivy | closed | Examine `test_jax_numpy_statistical` tests | JAX Frontend Testing Test Sweep ToDo_internal | Ensure a comprehensive examination of all tests to confirm their alignment with the current test writing policy. Ensure the tests follow the following guidelines, among others:
1. Avoid restricting the use of `get_dtypes` to specific kinds (`float`, `int`) unless absolutely necessary; the default value should be used when possible.
2. Avoid the use of `assume` to bypass specific data types.
3. Unless necessary, avoid restricting the generation of numbers with `min_value` or `max_value`. For example, it is acceptable to set a maximum value for the `pad` function if the `pad_value` value can halt the test.
4. Ensure proper use of strategies.
5. Verify that all tests are executed rather than skipped by `test_values=False`, unless the test logic is implemented within the test body itself.
6. Confirm that testing is comprehensive and complete, meaning that all possible combinations of inputs are tested.
7. Avoid manually setting test parameters, including native array flags, container flags, the number of positional arguments, etc.
8. Ensure that no parameters are skipped during testing.
> Always refer to the documentation for **Ivy Tests** and **Ivy Frontend Tests** for complete test writing policy.
If a test function does not require update, simply mark the function in the list as completed, otherwise create an issue for that test function.
- [x] test_jax_numpy_einsum
- [x] test_jax_numpy_mean
- [x] test_jax_numpy_var
- [x] #16277
- [x] test_jax_numpy_argmin
- [x] test_jax_numpy_bincount
- [x] test_jax_numpy_cumprod
- [ ] #16285
- [x] test_jax_numpy_sum
- [x] test_jax_numpy_min
- [x] test_jax_numpy_max
- [x] test_jax_numpy_average
- [x] #16276
- [x] #16278
- [x] test_numpy_nanmin
| 2.0 | Examine `test_jax_numpy_statistical` tests - Ensure a comprehensive examination of all tests to confirm their alignment with the current test writing policy. Ensure the tests follow the following guidelines, among others:
1. Avoid restricting the use of `get_dtypes` to specific kinds (`float`, `int`) unless absolutely necessary; the default value should be used when possible.
2. Avoid the use of `assume` to bypass specific data types.
3. Unless necessary, avoid restricting the generation of numbers with `min_value` or `max_value`. For example, it is acceptable to set a maximum value for the `pad` function if the `pad_value` value can halt the test.
4. Ensure proper use of strategies.
5. Verify that all tests are executed rather than skipped by `test_values=False`, unless the test logic is implemented within the test body itself.
6. Confirm that testing is comprehensive and complete, meaning that all possible combinations of inputs are tested.
7. Avoid manually setting test parameters, including native array flags, container flags, the number of positional arguments, etc.
8. Ensure that no parameters are skipped during testing.
> Always refer to the documentation for **Ivy Tests** and **Ivy Frontend Tests** for complete test writing policy.
If a test function does not require update, simply mark the function in the list as completed, otherwise create an issue for that test function.
- [x] test_jax_numpy_einsum
- [x] test_jax_numpy_mean
- [x] test_jax_numpy_var
- [x] #16277
- [x] test_jax_numpy_argmin
- [x] test_jax_numpy_bincount
- [x] test_jax_numpy_cumprod
- [ ] #16285
- [x] test_jax_numpy_sum
- [x] test_jax_numpy_min
- [x] test_jax_numpy_max
- [x] test_jax_numpy_average
- [x] #16276
- [x] #16278
- [x] test_numpy_nanmin
| test | examine test jax numpy statistical tests ensure a comprehensive examination of all tests to confirm their alignment with the current test writing policy ensure the tests follow the following guidelines among others avoid restricting the use of get dtypes to specific kinds float int unless absolutely necessary the default value should be used when possible avoid the use of assume to bypass specific data types unless necessary avoid restricting the generation of numbers with min value or max value for example it is acceptable to set a maximum value for the pad function if the pad value value can halt the test ensure proper use of strategies verify that all tests are executed rather than skipped by test values false unless the test logic is implemented within the test body itself confirm that testing is comprehensive and complete meaning that all possible combinations of inputs are tested avoid manually setting test parameters including native array flags container flags the number of positional arguments etc ensure that no parameters are skipped during testing always refer to the documentation for ivy tests and ivy frontend tests for complete test writing policy if a test function does not require update simply mark the function in the list as completed otherwise create an issue for that test function test jax numpy einsum test jax numpy mean test jax numpy var test jax numpy argmin test jax numpy bincount test jax numpy cumprod test jax numpy sum test jax numpy min test jax numpy max test jax numpy average test numpy nanmin | 1 |
224,022 | 17,651,923,438 | IssuesEvent | 2021-08-20 14:16:52 | elastic/kibana | https://api.github.com/repos/elastic/kibana | opened | [test-failed]: Chrome X-Pack UI Functional Tests1.x-pack/test/functional/apps/graph/feature_controls/graph_security·ts - graph app feature controls security global graph read-only privileges "before all" hook for "shows graph navlink" | failed-test test-cloud | **Version: 7.15.0**
**Class: Chrome X-Pack UI Functional Tests1.x-pack/test/functional/apps/graph/feature_controls/graph_security·ts**
**Stack Trace:**
```
Error: timed out waiting for logout button visible -- last error: Error: retry.try timeout: Error: expected testSubject(userMenu) to exist
at TestSubjects.existOrFail (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/functional/services/common/test_subjects.ts:45:13)
at /var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/x-pack/test/functional/services/user_menu.js:48:9
at runAttempt (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:27:15)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:66:21)
at RetryService.try (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:31:12)
at UserMenu._ensureMenuOpen (test/functional/services/user_menu.js:46:7)
at UserMenu.logoutLinkExists (test/functional/services/user_menu.js:28:7)
at /var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/x-pack/test/functional/page_objects/security_page.ts:249:19
at runAttempt (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:27:15)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:66:21)
at retryForTruthy (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:27:3)
at RetryService.waitFor (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:59:5)
at SecurityPageObject.login (test/functional/page_objects/security_page.ts:247:5)
at Context.<anonymous> (test/functional/apps/graph/feature_controls/graph_security.ts:116:9)
at Object.apply (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
at onFailure (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:17:9)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:57:13)
at RetryService.try (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:31:12)
at UserMenu._ensureMenuOpen (test/functional/services/user_menu.js:46:7)
at UserMenu.logoutLinkExists (test/functional/services/user_menu.js:28:7)
at /var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/x-pack/test/functional/page_objects/security_page.ts:249:19
at runAttempt (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:27:15)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:66:21)
at retryForTruthy (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:27:3)
at RetryService.waitFor (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:59:5)
at SecurityPageObject.login (test/functional/page_objects/security_page.ts:247:5)
at Context.<anonymous> (test/functional/apps/graph/feature_controls/graph_security.ts:116:9)
at Object.apply (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
at onFailure (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:39:13)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:57:13)
at retryForTruthy (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:27:3)
at RetryService.waitFor (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:59:5)
at SecurityPageObject.login (test/functional/page_objects/security_page.ts:247:5)
at Context.<anonymous> (test/functional/apps/graph/feature_controls/graph_security.ts:116:9)
at Object.apply (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
```
**Other test failures:**
- graph app feature controls security no graph privileges "before all" hook for "doesn't show graph navlink"
_Test Report: https://internal-ci.elastic.co/view/Stack%20Tests/job/elastic+estf-cloud-kibana-tests/2193/testReport/_ | 2.0 | [test-failed]: Chrome X-Pack UI Functional Tests1.x-pack/test/functional/apps/graph/feature_controls/graph_security·ts - graph app feature controls security global graph read-only privileges "before all" hook for "shows graph navlink" - **Version: 7.15.0**
**Class: Chrome X-Pack UI Functional Tests1.x-pack/test/functional/apps/graph/feature_controls/graph_security·ts**
**Stack Trace:**
```
Error: timed out waiting for logout button visible -- last error: Error: retry.try timeout: Error: expected testSubject(userMenu) to exist
at TestSubjects.existOrFail (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/functional/services/common/test_subjects.ts:45:13)
at /var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/x-pack/test/functional/services/user_menu.js:48:9
at runAttempt (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:27:15)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:66:21)
at RetryService.try (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:31:12)
at UserMenu._ensureMenuOpen (test/functional/services/user_menu.js:46:7)
at UserMenu.logoutLinkExists (test/functional/services/user_menu.js:28:7)
at /var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/x-pack/test/functional/page_objects/security_page.ts:249:19
at runAttempt (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:27:15)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:66:21)
at retryForTruthy (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:27:3)
at RetryService.waitFor (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:59:5)
at SecurityPageObject.login (test/functional/page_objects/security_page.ts:247:5)
at Context.<anonymous> (test/functional/apps/graph/feature_controls/graph_security.ts:116:9)
at Object.apply (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
at onFailure (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:17:9)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:57:13)
at RetryService.try (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:31:12)
at UserMenu._ensureMenuOpen (test/functional/services/user_menu.js:46:7)
at UserMenu.logoutLinkExists (test/functional/services/user_menu.js:28:7)
at /var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/x-pack/test/functional/page_objects/security_page.ts:249:19
at runAttempt (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:27:15)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:66:21)
at retryForTruthy (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:27:3)
at RetryService.waitFor (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:59:5)
at SecurityPageObject.login (test/functional/page_objects/security_page.ts:247:5)
at Context.<anonymous> (test/functional/apps/graph/feature_controls/graph_security.ts:116:9)
at Object.apply (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
at onFailure (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:39:13)
at retryForSuccess (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_success.ts:57:13)
at retryForTruthy (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry_for_truthy.ts:27:3)
at RetryService.waitFor (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/test/common/services/retry/retry.ts:59:5)
at SecurityPageObject.login (test/functional/page_objects/security_page.ts:247:5)
at Context.<anonymous> (test/functional/apps/graph/feature_controls/graph_security.ts:116:9)
at Object.apply (/var/lib/jenkins/workspace/elastic+estf-cloud-kibana-tests/JOB/xpackGrp2/TASK/saas_run_kibana_tests/node/ess-testing/ci/cloud/common/build/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
```
**Other test failures:**
- graph app feature controls security no graph privileges "before all" hook for "doesn't show graph navlink"
_Test Report: https://internal-ci.elastic.co/view/Stack%20Tests/job/elastic+estf-cloud-kibana-tests/2193/testReport/_ | test | chrome x pack ui functional x pack test functional apps graph feature controls graph security·ts graph app feature controls security global graph read only privileges before all hook for shows graph navlink version class chrome x pack ui functional x pack test functional apps graph feature controls graph security·ts stack trace error timed out waiting for logout button visible last error error retry try timeout error expected testsubject usermenu to exist at testsubjects existorfail var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test functional services common test subjects ts at var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana x pack test functional services user menu js at runattempt var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryforsuccess var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryservice try var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry ts at usermenu ensuremenuopen test functional services user menu js at usermenu logoutlinkexists test functional services user menu js at var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana x pack test functional page objects security page ts at runattempt var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryforsuccess var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryfortruthy var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for truthy ts at retryservice waitfor var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry ts at securitypageobject login test functional page objects security page ts at context test functional apps graph feature controls graph security ts at object apply var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana node modules kbn test target node functional test runner lib mocha wrap function js at onfailure var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryforsuccess var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryservice try var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry ts at usermenu ensuremenuopen test functional services user menu js at usermenu logoutlinkexists test functional services user menu js at var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana x pack test functional page objects security page ts at runattempt var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryforsuccess var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryfortruthy var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for truthy ts at retryservice waitfor var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry ts at securitypageobject login test functional page objects security page ts at context test functional apps graph feature controls graph security ts at object apply var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana node modules kbn test target node functional test runner lib mocha wrap function js at onfailure var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for truthy ts at retryforsuccess var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for success ts at retryfortruthy var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry for truthy ts at retryservice waitfor var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana test common services retry retry ts at securitypageobject login test functional page objects security page ts at context test functional apps graph feature controls graph security ts at object apply var lib jenkins workspace elastic estf cloud kibana tests job task saas run kibana tests node ess testing ci cloud common build kibana node modules kbn test target node functional test runner lib mocha wrap function js other test failures graph app feature controls security no graph privileges before all hook for doesn t show graph navlink test report | 1 |
650,878 | 21,427,824,648 | IssuesEvent | 2022-04-23 00:24:26 | metabase/metabase | https://api.github.com/repos/metabase/metabase | closed | Custom Expression not available in filters of Segment or Metric | Type:Bug Priority:P2 Administration/Metrics & Segments .Reproduced .Regression Querying/Notebook/Custom Expression | **Describe the bug**
Custom Expression not available in filters of Segment or Metric
**To Reproduce**
Steps to reproduce the behavior:
1. Go to Admin > Data Model > Segment/Metric > Create new or edit one of the existing
2. Click "Filtered by", doesn't show Custom Expression

**Information about your Metabase Installation:**
Metabase 0.36.0, 0.36.3 - it does exist on 0.35.4
**Additional context**
Related to #4038 and #12899 | 1.0 | Custom Expression not available in filters of Segment or Metric - **Describe the bug**
Custom Expression not available in filters of Segment or Metric
**To Reproduce**
Steps to reproduce the behavior:
1. Go to Admin > Data Model > Segment/Metric > Create new or edit one of the existing
2. Click "Filtered by", doesn't show Custom Expression

**Information about your Metabase Installation:**
Metabase 0.36.0, 0.36.3 - it does exist on 0.35.4
**Additional context**
Related to #4038 and #12899 | non_test | custom expression not available in filters of segment or metric describe the bug custom expression not available in filters of segment or metric to reproduce steps to reproduce the behavior go to admin data model segment metric create new or edit one of the existing click filtered by doesn t show custom expression information about your metabase installation metabase it does exist on additional context related to and | 0 |
23,877 | 4,050,217,727 | IssuesEvent | 2016-05-23 17:24:13 | phetsims/capacitor-lab-basics | https://api.github.com/repos/phetsims/capacitor-lab-basics | closed | Plate charge density is too high | status:fixed-pending-testing type:bug | It looks like the plate charge density is a bit higher than it was in previous dev versions. For comparison, here is the capacitor at 1.5 V connected to the battery in version 18:

Here is the capacitor connected to the battery at 1.5 V in version 21:

| 1.0 | Plate charge density is too high - It looks like the plate charge density is a bit higher than it was in previous dev versions. For comparison, here is the capacitor at 1.5 V connected to the battery in version 18:

Here is the capacitor connected to the battery at 1.5 V in version 21:

| test | plate charge density is too high it looks like the plate charge density is a bit higher than it was in previous dev versions for comparison here is the capacitor at v connected to the battery in version here is the capacitor connected to the battery at v in version | 1 |
52,242 | 10,790,734,900 | IssuesEvent | 2019-11-05 15:28:52 | eclipse-theia/theia | https://api.github.com/repos/eclipse-theia/theia | opened | [vscode] builtin extensions not recognized | plug-in system vscode | **Description**
I've created a custom VS Code [`plugin`](https://github.com/vince-fugnitto/ts-tools-plugin) which simply verifies that the `vscode-builtin-typescript-language-features` builtin extension correctly works. The plugin itself works perfectly in VS Code while in Theia it fails to find the extension.
**Screenshots**
_VS Code:_
<div align='center'>

</div>
_Theia:_
<div align='center'>

</div>
**Setup**
The following updates were made to the example-browser [package.json](https://github.com/eclipse-theia/theia/blob/master/examples/browser/package.json):
_Additions:_
```json
"@theia/vscode-builtin-typescript": "0.2.1",
"@theia/vscode-builtin-typescript-language-features": "0.2.1",
```
_Deletions:_
```json
"@theia/typescript": "^0.12.0"
```
---
Am I doing something wrong when attempting to consume builtin extensions in Theia? | 1.0 | [vscode] builtin extensions not recognized - **Description**
I've created a custom VS Code [`plugin`](https://github.com/vince-fugnitto/ts-tools-plugin) which simply verifies that the `vscode-builtin-typescript-language-features` builtin extension correctly works. The plugin itself works perfectly in VS Code while in Theia it fails to find the extension.
**Screenshots**
_VS Code:_
<div align='center'>

</div>
_Theia:_
<div align='center'>

</div>
**Setup**
The following updates were made to the example-browser [package.json](https://github.com/eclipse-theia/theia/blob/master/examples/browser/package.json):
_Additions:_
```json
"@theia/vscode-builtin-typescript": "0.2.1",
"@theia/vscode-builtin-typescript-language-features": "0.2.1",
```
_Deletions:_
```json
"@theia/typescript": "^0.12.0"
```
---
Am I doing something wrong when attempting to consume builtin extensions in Theia? | non_test | builtin extensions not recognized description i ve created a custom vs code which simply verifies that the vscode builtin typescript language features builtin extension correctly works the plugin itself works perfectly in vs code while in theia it fails to find the extension screenshots vs code theia setup the following updates were made to the example browser additions json theia vscode builtin typescript theia vscode builtin typescript language features deletions json theia typescript am i doing something wrong when attempting to consume builtin extensions in theia | 0 |
95,207 | 8,552,705,132 | IssuesEvent | 2018-11-07 21:54:35 | aeternity/elixir-node | https://api.github.com/repos/aeternity/elixir-node | opened | Channel compatibility tests - Contracts offchain | blocked channel latest-compatibility | Channels contracts updates offchain compatibility should be tested and tests created. This excludes force progress.
Blocked by #762 | 1.0 | Channel compatibility tests - Contracts offchain - Channels contracts updates offchain compatibility should be tested and tests created. This excludes force progress.
Blocked by #762 | test | channel compatibility tests contracts offchain channels contracts updates offchain compatibility should be tested and tests created this excludes force progress blocked by | 1 |
769,268 | 26,998,998,093 | IssuesEvent | 2023-02-10 05:27:37 | infor-design/enterprise-ng | https://api.github.com/repos/infor-design/enterprise-ng | closed | Update SohoDataGrid dataset removes SohoToolbarFlex | type: bug :bug: type: patch [2] type: regression bug :leftwards_arrow_with_hook: priority: high | **Describe the bug**
For some reason the toolbar disappears when being interacted with (buttons are pressed)
**To Reproduce**
Steps to reproduce the behavior:
1. Go to https://stackblitz.com/edit/ids-quick-start-1480-1t7bj3?file=src/app/app.component.ts
2. Click on the toolbar button
4. See error -> toolbar disappears
**Expected behavior**
Should not disappear and it didnt in the previous version.
**Version**
- ids-enterprise-ng: 15.0.1
**Additional context**
- Issue described [here]
- Narrowed it down to setting `toolbar: { keywordFilter: true }` | 1.0 | Update SohoDataGrid dataset removes SohoToolbarFlex - **Describe the bug**
For some reason the toolbar disappears when being interacted with (buttons are pressed)
**To Reproduce**
Steps to reproduce the behavior:
1. Go to https://stackblitz.com/edit/ids-quick-start-1480-1t7bj3?file=src/app/app.component.ts
2. Click on the toolbar button
4. See error -> toolbar disappears
**Expected behavior**
Should not disappear and it didnt in the previous version.
**Version**
- ids-enterprise-ng: 15.0.1
**Additional context**
- Issue described [here]
- Narrowed it down to setting `toolbar: { keywordFilter: true }` | non_test | update sohodatagrid dataset removes sohotoolbarflex describe the bug for some reason the toolbar disappears when being interacted with buttons are pressed to reproduce steps to reproduce the behavior go to click on the toolbar button see error toolbar disappears expected behavior should not disappear and it didnt in the previous version version ids enterprise ng additional context issue described narrowed it down to setting toolbar keywordfilter true | 0 |
120,873 | 10,137,852,618 | IssuesEvent | 2019-08-02 16:17:58 | RianAndEricReview/back-of-the-card | https://api.github.com/repos/RianAndEricReview/back-of-the-card | closed | Testing - game POST route: game creation | Testing | Test that the api/games POST route returns a newly created game with the expected game object data. | 1.0 | Testing - game POST route: game creation - Test that the api/games POST route returns a newly created game with the expected game object data. | test | testing game post route game creation test that the api games post route returns a newly created game with the expected game object data | 1 |
256,469 | 22,054,378,997 | IssuesEvent | 2022-05-30 11:32:24 | mountaincharlie/project-four-cook-ebook | https://api.github.com/repos/mountaincharlie/project-four-cook-ebook | opened | MANUAL TESTING - delete ingredient button deletes ingredient | Testing | - [ ] image before clicking delete ingredient button
- [ ] image after clicking delete ingredient button | 1.0 | MANUAL TESTING - delete ingredient button deletes ingredient - - [ ] image before clicking delete ingredient button
- [ ] image after clicking delete ingredient button | test | manual testing delete ingredient button deletes ingredient image before clicking delete ingredient button image after clicking delete ingredient button | 1 |
345,277 | 30,796,166,216 | IssuesEvent | 2023-07-31 20:04:11 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | : failed | C-test-failure O-robot branch-master T-testeng | . [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/11130618?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/11130618?buildTab=artifacts#/) on master @ [b27094b0ded0d37a56d3e8dd31e2e02514ee0eff](https://github.com/cockroachdb/cockroach/commits/b27094b0ded0d37a56d3e8dd31e2e02514ee0eff):
```
stdout:
, stderr:
```
<p>Parameters: <code>TAGS=bazel,gss</code>
, <code>stress=true</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #107852 : failed [C-test-failure O-robot T-testeng branch-release-23.1]
</p>
</details>
/cc @cockroachdb/test-eng
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | : failed - . [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/11130618?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_StressBazel/11130618?buildTab=artifacts#/) on master @ [b27094b0ded0d37a56d3e8dd31e2e02514ee0eff](https://github.com/cockroachdb/cockroach/commits/b27094b0ded0d37a56d3e8dd31e2e02514ee0eff):
```
stdout:
, stderr:
```
<p>Parameters: <code>TAGS=bazel,gss</code>
, <code>stress=true</code>
</p>
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #107852 : failed [C-test-failure O-robot T-testeng branch-release-23.1]
</p>
</details>
/cc @cockroachdb/test-eng
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | failed with on master stdout stderr parameters tags bazel gss stress true help see also same failure on other branches failed cc cockroachdb test eng | 1 |
297,229 | 25,711,857,160 | IssuesEvent | 2022-12-07 07:22:50 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | kv/kvserver: TestProtectedTimestamps failed | C-test-failure O-robot branch-release-22.1 | kv/kvserver.TestProtectedTimestamps [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=7867889&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=7867889&tab=artifacts#/) on release-22.1 @ [3b76f78d724dfc1e7bc8d697f5a7de960d8d1e98](https://github.com/cockroachdb/cockroach/commits/3b76f78d724dfc1e7bc8d697f5a7de960d8d1e98):
```
=== RUN TestProtectedTimestamps
test_log_scope.go:79: test logs captured to: /artifacts/tmp/_tmp/751d67000aac5f3394c2369309253f02/logTestProtectedTimestamps1961543987
test_log_scope.go:80: use -show-logs to present logs inline
client_protectedts_test.go:267:
Error Trace: client_protectedts_test.go:267
soon.go:71
retry.go:207
soon.go:77
soon.go:58
soon.go:41
client_protectedts_test.go:260
Error: Should be true
Test: TestProtectedTimestamps
Messages: 1670396258.196688150,0 >= 1670396258.122760171,0
panic.go:642: -- test log scope end --
--- FAIL: TestProtectedTimestamps (11.12s)
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
Parameters in this failure:
- TAGS=bazel,gss,deadlock
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestProtectedTimestamps.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 1.0 | kv/kvserver: TestProtectedTimestamps failed - kv/kvserver.TestProtectedTimestamps [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=7867889&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=7867889&tab=artifacts#/) on release-22.1 @ [3b76f78d724dfc1e7bc8d697f5a7de960d8d1e98](https://github.com/cockroachdb/cockroach/commits/3b76f78d724dfc1e7bc8d697f5a7de960d8d1e98):
```
=== RUN TestProtectedTimestamps
test_log_scope.go:79: test logs captured to: /artifacts/tmp/_tmp/751d67000aac5f3394c2369309253f02/logTestProtectedTimestamps1961543987
test_log_scope.go:80: use -show-logs to present logs inline
client_protectedts_test.go:267:
Error Trace: client_protectedts_test.go:267
soon.go:71
retry.go:207
soon.go:77
soon.go:58
soon.go:41
client_protectedts_test.go:260
Error: Should be true
Test: TestProtectedTimestamps
Messages: 1670396258.196688150,0 >= 1670396258.122760171,0
panic.go:642: -- test log scope end --
--- FAIL: TestProtectedTimestamps (11.12s)
```
<details><summary>Help</summary>
<p>
See also: [How To Investigate a Go Test Failure \(internal\)](https://cockroachlabs.atlassian.net/l/c/HgfXfJgM)
Parameters in this failure:
- TAGS=bazel,gss,deadlock
</p>
</details>
/cc @cockroachdb/kv
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*TestProtectedTimestamps.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | kv kvserver testprotectedtimestamps failed kv kvserver testprotectedtimestamps with on release run testprotectedtimestamps test log scope go test logs captured to artifacts tmp tmp test log scope go use show logs to present logs inline client protectedts test go error trace client protectedts test go soon go retry go soon go soon go soon go client protectedts test go error should be true test testprotectedtimestamps messages panic go test log scope end fail testprotectedtimestamps help see also parameters in this failure tags bazel gss deadlock cc cockroachdb kv | 1 |
137,688 | 11,150,074,806 | IssuesEvent | 2019-12-23 21:04:04 | kubernetes/test-infra | https://api.github.com/repos/kubernetes/test-infra | closed | Unit test prow flags | area/prow good first issue help wanted kind/feature lifecycle/rotten sig/testing | We should validate that flag values work as expected.
See https://github.com/kubernetes/test-infra/blob/86a523febb3c6af836859dda5624dcbea7204f7a/prow/cmd/build/main_test.go#L25-L66 for an example of what this will look like
Also likely will require passing in a flagset and args to the gather method: https://github.com/kubernetes/test-infra/blob/86a523febb3c6af836859dda5624dcbea7204f7a/prow/cmd/build/main.go#L72
P0
- [x] build
- [ ] crier
- [ ] deck
- [ ] hook
- [x] horologium
- [ ] plank
- [x] sinker
- [ ] tide
P1
- [ ] checkconfig
- [ ] clonerefs
- [ ] entrypoint
- [ ] gcsupload
- [ ] initupload
- [ ] sidecar
P2
- [ ] artifact-uploader
- [ ] branch-protector
- [ ] config-bootstrapper
- [ ] gerrit
- [ ] jenkins-operator
- [ ] status-reconciler
- [ ] sub
P3
- [ ] mkbuild-cluster
- [ ] mkpj
- [ ] mkpod
- [ ] peribolos
- [ ] phony
- [ ] tackle
- [ ] tot
/area prow | 1.0 | Unit test prow flags - We should validate that flag values work as expected.
See https://github.com/kubernetes/test-infra/blob/86a523febb3c6af836859dda5624dcbea7204f7a/prow/cmd/build/main_test.go#L25-L66 for an example of what this will look like
Also likely will require passing in a flagset and args to the gather method: https://github.com/kubernetes/test-infra/blob/86a523febb3c6af836859dda5624dcbea7204f7a/prow/cmd/build/main.go#L72
P0
- [x] build
- [ ] crier
- [ ] deck
- [ ] hook
- [x] horologium
- [ ] plank
- [x] sinker
- [ ] tide
P1
- [ ] checkconfig
- [ ] clonerefs
- [ ] entrypoint
- [ ] gcsupload
- [ ] initupload
- [ ] sidecar
P2
- [ ] artifact-uploader
- [ ] branch-protector
- [ ] config-bootstrapper
- [ ] gerrit
- [ ] jenkins-operator
- [ ] status-reconciler
- [ ] sub
P3
- [ ] mkbuild-cluster
- [ ] mkpj
- [ ] mkpod
- [ ] peribolos
- [ ] phony
- [ ] tackle
- [ ] tot
/area prow | test | unit test prow flags we should validate that flag values work as expected see for an example of what this will look like also likely will require passing in a flagset and args to the gather method build crier deck hook horologium plank sinker tide checkconfig clonerefs entrypoint gcsupload initupload sidecar artifact uploader branch protector config bootstrapper gerrit jenkins operator status reconciler sub mkbuild cluster mkpj mkpod peribolos phony tackle tot area prow | 1 |
42,476 | 11,009,014,204 | IssuesEvent | 2019-12-04 11:43:55 | JabRef/jabref | https://api.github.com/repos/JabRef/jabref | closed | Increase build number for each commit to master | bug 🤹♂️ build-system installation | The MSI installer needs an increase in the version to properly upgrade a previous installation (otherwise you could end up with two installations in different locations).
There are a few tools that provide such a functionality based on the git commit (i.e GitVersion WeightedPreReleaseNumber). Maybe it's also a good moment to switch to semantic versioning? | 1.0 | Increase build number for each commit to master - The MSI installer needs an increase in the version to properly upgrade a previous installation (otherwise you could end up with two installations in different locations).
There are a few tools that provide such a functionality based on the git commit (i.e GitVersion WeightedPreReleaseNumber). Maybe it's also a good moment to switch to semantic versioning? | non_test | increase build number for each commit to master the msi installer needs an increase in the version to properly upgrade a previous installation otherwise you could end up with two installations in different locations there are a few tools that provide such a functionality based on the git commit i e gitversion weightedprereleasenumber maybe it s also a good moment to switch to semantic versioning | 0 |
221,158 | 17,295,063,146 | IssuesEvent | 2021-07-25 14:59:14 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: import/tpch/nodes=8 failed | C-test-failure O-roachtest O-robot branch-master release-blocker | roachtest.import/tpch/nodes=8 [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3221075&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3221075&tab=artifacts#/import/tpch/nodes=8) on master @ [9baaa282b3a09977b96bd3e5ae6e2346adfa2c16](https://github.com/cockroachdb/cockroach/commits/9baaa282b3a09977b96bd3e5ae6e2346adfa2c16):
```
| | main.main
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:2087
| | runtime.main
| | /usr/local/go/src/runtime/proc.go:225
| | runtime.goexit
| | /usr/local/go/src/runtime/asm_amd64.s:1371
| Wraps: (2) 4: dead (exit status 134)
| Error types: (1) *withstack.withStack (2) *errutil.leafError
Wraps: (4) secondary error attachment
| 6: dead (exit status 134)
| (1) attached stack trace
| -- stack trace:
| | main.glob..func14
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1168
| | main.wrap.func1
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:276
| | github.com/spf13/cobra.(*Command).execute
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:856
| | github.com/spf13/cobra.(*Command).ExecuteC
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:960
| | github.com/spf13/cobra.(*Command).Execute
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:897
| | main.main
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:2087
| | runtime.main
| | /usr/local/go/src/runtime/proc.go:225
| | runtime.goexit
| | /usr/local/go/src/runtime/asm_amd64.s:1371
| Wraps: (2) 6: dead (exit status 134)
| Error types: (1) *withstack.withStack (2) *errutil.leafError
Wraps: (5) attached stack trace
-- stack trace:
| main.glob..func14
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1168
| main.wrap.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:276
| github.com/spf13/cobra.(*Command).execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:856
| github.com/spf13/cobra.(*Command).ExecuteC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:960
| github.com/spf13/cobra.(*Command).Execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:897
| main.main
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:2087
| runtime.main
| /usr/local/go/src/runtime/proc.go:225
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1371
Wraps: (6) 5: dead (exit status 134)
Error types: (1) errors.Unclassified (2) *secondary.withSecondaryError (3) *secondary.withSecondaryError (4) *secondary.withSecondaryError (5) *withstack.withStack (6) *errutil.leafError
```
<details><summary>Reproduce</summary>
<p>
<p>To reproduce, try:
```bash
## Simple repro (linux-only):
$ make cockroachshort bin/worklaod bin/roachprod bin/roachtest
$ PATH=$PWD/bin:$PATH roachtest run import/tpch/nodes=8 --local
## Proper repro probably needs more roachtest flags, or running
## the programs remotely on GCE. For more details, refer to
## pkg/cmd/roachtest/README.md.
```
</p>
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #66263 roachtest: import/tpch/nodes=8 failed [C-test-failure O-roachtest O-robot branch-release-20.1 release-blocker]
</p>
</details>
/cc @cockroachdb/bulk-io
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*import/tpch/nodes=8.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | roachtest: import/tpch/nodes=8 failed - roachtest.import/tpch/nodes=8 [failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=3221075&tab=buildLog) with [artifacts](https://teamcity.cockroachdb.com/viewLog.html?buildId=3221075&tab=artifacts#/import/tpch/nodes=8) on master @ [9baaa282b3a09977b96bd3e5ae6e2346adfa2c16](https://github.com/cockroachdb/cockroach/commits/9baaa282b3a09977b96bd3e5ae6e2346adfa2c16):
```
| | main.main
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:2087
| | runtime.main
| | /usr/local/go/src/runtime/proc.go:225
| | runtime.goexit
| | /usr/local/go/src/runtime/asm_amd64.s:1371
| Wraps: (2) 4: dead (exit status 134)
| Error types: (1) *withstack.withStack (2) *errutil.leafError
Wraps: (4) secondary error attachment
| 6: dead (exit status 134)
| (1) attached stack trace
| -- stack trace:
| | main.glob..func14
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1168
| | main.wrap.func1
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:276
| | github.com/spf13/cobra.(*Command).execute
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:856
| | github.com/spf13/cobra.(*Command).ExecuteC
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:960
| | github.com/spf13/cobra.(*Command).Execute
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:897
| | main.main
| | /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:2087
| | runtime.main
| | /usr/local/go/src/runtime/proc.go:225
| | runtime.goexit
| | /usr/local/go/src/runtime/asm_amd64.s:1371
| Wraps: (2) 6: dead (exit status 134)
| Error types: (1) *withstack.withStack (2) *errutil.leafError
Wraps: (5) attached stack trace
-- stack trace:
| main.glob..func14
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:1168
| main.wrap.func1
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:276
| github.com/spf13/cobra.(*Command).execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:856
| github.com/spf13/cobra.(*Command).ExecuteC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:960
| github.com/spf13/cobra.(*Command).Execute
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/vendor/github.com/spf13/cobra/command.go:897
| main.main
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachprod/main.go:2087
| runtime.main
| /usr/local/go/src/runtime/proc.go:225
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1371
Wraps: (6) 5: dead (exit status 134)
Error types: (1) errors.Unclassified (2) *secondary.withSecondaryError (3) *secondary.withSecondaryError (4) *secondary.withSecondaryError (5) *withstack.withStack (6) *errutil.leafError
```
<details><summary>Reproduce</summary>
<p>
<p>To reproduce, try:
```bash
## Simple repro (linux-only):
$ make cockroachshort bin/worklaod bin/roachprod bin/roachtest
$ PATH=$PWD/bin:$PATH roachtest run import/tpch/nodes=8 --local
## Proper repro probably needs more roachtest flags, or running
## the programs remotely on GCE. For more details, refer to
## pkg/cmd/roachtest/README.md.
```
</p>
</p>
</details>
<details><summary>Same failure on other branches</summary>
<p>
- #66263 roachtest: import/tpch/nodes=8 failed [C-test-failure O-roachtest O-robot branch-release-20.1 release-blocker]
</p>
</details>
/cc @cockroachdb/bulk-io
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*import/tpch/nodes=8.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | roachtest import tpch nodes failed roachtest import tpch nodes with on master main main home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go runtime main usr local go src runtime proc go runtime goexit usr local go src runtime asm s wraps dead exit status error types withstack withstack errutil leaferror wraps secondary error attachment dead exit status attached stack trace stack trace main glob home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go main wrap home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go github com cobra command execute home agent work go src github com cockroachdb cockroach vendor github com cobra command go github com cobra command executec home agent work go src github com cockroachdb cockroach vendor github com cobra command go github com cobra command execute home agent work go src github com cockroachdb cockroach vendor github com cobra command go main main home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go runtime main usr local go src runtime proc go runtime goexit usr local go src runtime asm s wraps dead exit status error types withstack withstack errutil leaferror wraps attached stack trace stack trace main glob home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go main wrap home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go github com cobra command execute home agent work go src github com cockroachdb cockroach vendor github com cobra command go github com cobra command executec home agent work go src github com cockroachdb cockroach vendor github com cobra command go github com cobra command execute home agent work go src github com cockroachdb cockroach vendor github com cobra command go main main home agent work go src github com cockroachdb cockroach pkg cmd roachprod main go runtime main usr local go src runtime proc go runtime goexit usr local go src runtime asm s wraps dead exit status error types errors unclassified secondary withsecondaryerror secondary withsecondaryerror secondary withsecondaryerror withstack withstack errutil leaferror reproduce to reproduce try bash simple repro linux only make cockroachshort bin worklaod bin roachprod bin roachtest path pwd bin path roachtest run import tpch nodes local proper repro probably needs more roachtest flags or running the programs remotely on gce for more details refer to pkg cmd roachtest readme md same failure on other branches roachtest import tpch nodes failed cc cockroachdb bulk io | 1 |
46,543 | 13,055,929,979 | IssuesEvent | 2020-07-30 03:09:07 | icecube-trac/tix2 | https://api.github.com/repos/icecube-trac/tix2 | opened | [DomTools] Bring back I3IsolatedHitsCutModule (Trac #1397) | Incomplete Migration Migrated from Trac combo core defect | Migrated from https://code.icecube.wisc.edu/ticket/1397
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:07",
"description": "Or at least provide an alternative for filter scripts IceCube_BaseProc.py.",
"reporter": "olivas",
"cc": "",
"resolution": "fixed",
"_ts": "1458335647931556",
"component": "combo core",
"summary": "[DomTools] Bring back I3IsolatedHitsCutModule",
"priority": "blocker",
"keywords": "",
"time": "2015-10-14T07:06:54",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
| 1.0 | [DomTools] Bring back I3IsolatedHitsCutModule (Trac #1397) - Migrated from https://code.icecube.wisc.edu/ticket/1397
```json
{
"status": "closed",
"changetime": "2016-03-18T21:14:07",
"description": "Or at least provide an alternative for filter scripts IceCube_BaseProc.py.",
"reporter": "olivas",
"cc": "",
"resolution": "fixed",
"_ts": "1458335647931556",
"component": "combo core",
"summary": "[DomTools] Bring back I3IsolatedHitsCutModule",
"priority": "blocker",
"keywords": "",
"time": "2015-10-14T07:06:54",
"milestone": "",
"owner": "olivas",
"type": "defect"
}
```
| non_test | bring back trac migrated from json status closed changetime description or at least provide an alternative for filter scripts icecube baseproc py reporter olivas cc resolution fixed ts component combo core summary bring back priority blocker keywords time milestone owner olivas type defect | 0 |
130,834 | 5,134,116,904 | IssuesEvent | 2017-01-11 07:39:45 | aaberg/sql2o | https://api.github.com/repos/aaberg/sql2o | closed | Transaction management | feature request priority-normal | I'm a big fun of try-with-resource and propose this pattern for transaction management for new release
``` java
try(Transaction tr= cn.beginTran(TransactionIsolation.REPEATABLEREAD){
// some actions
tr.commit();
} // rollback if not commited in close()
```
| 1.0 | Transaction management - I'm a big fun of try-with-resource and propose this pattern for transaction management for new release
``` java
try(Transaction tr= cn.beginTran(TransactionIsolation.REPEATABLEREAD){
// some actions
tr.commit();
} // rollback if not commited in close()
```
| non_test | transaction management i m a big fun of try with resource and propose this pattern for transaction management for new release java try transaction tr cn begintran transactionisolation repeatableread some actions tr commit rollback if not commited in close | 0 |
677,769 | 23,174,138,926 | IssuesEvent | 2022-07-31 06:41:13 | FEeasy404/GameUs | https://api.github.com/repos/FEeasy404/GameUs | opened | 내비게이션 바의 프로필 페이지 버그 | 🐛bug 🖐Priority: Medium | ## 버그 발생 설명
- 타인의 프로필 페이지에서 프로필 버튼을 눌러 내 페이지로 이동할 때 페이지가 제대로 바뀌지 않는 버그가 있습니다.
- 타인의 프로필이 존재하지 않는 상황(404페이지)에서 프로필 버튼을 눌러 내 페이지로 이동할 때 프로필이 뜨지 않는 버그가 있습니다.
## 할 일
- [ ] 내비게이션 프로필 버튼을 눌렀을 때의 버그 수정
## ETC
- 새로고침하면 잘 되는데 왜일까요!!! | 1.0 | 내비게이션 바의 프로필 페이지 버그 - ## 버그 발생 설명
- 타인의 프로필 페이지에서 프로필 버튼을 눌러 내 페이지로 이동할 때 페이지가 제대로 바뀌지 않는 버그가 있습니다.
- 타인의 프로필이 존재하지 않는 상황(404페이지)에서 프로필 버튼을 눌러 내 페이지로 이동할 때 프로필이 뜨지 않는 버그가 있습니다.
## 할 일
- [ ] 내비게이션 프로필 버튼을 눌렀을 때의 버그 수정
## ETC
- 새로고침하면 잘 되는데 왜일까요!!! | non_test | 내비게이션 바의 프로필 페이지 버그 버그 발생 설명 타인의 프로필 페이지에서 프로필 버튼을 눌러 내 페이지로 이동할 때 페이지가 제대로 바뀌지 않는 버그가 있습니다 타인의 프로필이 존재하지 않는 상황 에서 프로필 버튼을 눌러 내 페이지로 이동할 때 프로필이 뜨지 않는 버그가 있습니다 할 일 내비게이션 프로필 버튼을 눌렀을 때의 버그 수정 etc 새로고침하면 잘 되는데 왜일까요 | 0 |
200,718 | 15,145,843,769 | IssuesEvent | 2021-02-11 05:43:56 | elastic/elasticsearch | https://api.github.com/repos/elastic/elasticsearch | closed | TokenBackwardsCompatibilityIT failures | :Security/Security >test-failure Team:Security | TokenBackwardsCompatibilityIT » testGeneratingTokensInMixedCluster
TokenBackwardsCompatibilityIT » testRefreshingTokensInMixedCluster
**Build scan**: https://gradle-enterprise.elastic.co/s/7wjt3rnujlkae
**Applicable branches**: 7.x
**Failure history**:
Started once #68375 was backported to 7.x
**Failure excerpt**:
```
org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT > testGeneratingTokensInMixedCluster FAILED |
-- | --
| org.elasticsearch.client.ResponseException: method [POST], host [http://[::1]:38011], URI [/_security/oauth2/token], status line [HTTP/1.1 500 Internal Server Error] |
| {"error":{"root_cause":[{"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"}],"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"},"status":500} |
| at __randomizedtesting.SeedInfo.seed([4392CA99489E4B54:BCDB25AF041027C4]:0) |
| at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:296) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.createTokens(TokenBackwardsCompatibilityIT.java:397) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.testGeneratingTokensInMixedCluster(TokenBackwardsCompatibilityIT.java:215) |
| REPRODUCE WITH: ./gradlew ':x-pack:qa:rolling-upgrade:v7.10.3#twoThirdsUpgradedTest' -Dtests.class="org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT" -Dtests.method="testGeneratingTokensInMixedCluster" -Dtests.seed=4392CA99489E4B54 -Dtests.security.manager=true -Dtests.bwc=true -Dtests.locale=ja -Dtests.timezone=America/Creston -Druntime.java=8 |
| |
| org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT > testRefreshingTokensInMixedCluster FAILED |
| org.elasticsearch.client.ResponseException: method [POST], host [http://127.0.0.1:34445], URI [/_security/oauth2/token], status line [HTTP/1.1 500 Internal Server Error] |
| {"error":{"root_cause":[{"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"}],"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"},"status":500} |
| at __randomizedtesting.SeedInfo.seed([4392CA99489E4B54:A1F36B143CFB8E73]:0) |
| at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:296) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.refreshToken(TokenBackwardsCompatibilityIT.java:429) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.testRefreshingTokensInMixedCluster(TokenBackwardsCompatibilityIT.java:246) |
| REPRODUCE WITH: ./gradlew ':x-pack:qa:rolling-upgrade:v7.10.3#twoThirdsUpgradedTest' -Dtests.class="org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT" -Dtests.method="testRefreshingTokensInMixedCluster" -Dtests.seed=4392CA99489E4B54 -Dtests.security.manager=true -Dtests.bwc=true -Dtests.locale=ja -Dtests.timezone=America/Creston -Druntime.java=8
```
| 1.0 | TokenBackwardsCompatibilityIT failures - TokenBackwardsCompatibilityIT » testGeneratingTokensInMixedCluster
TokenBackwardsCompatibilityIT » testRefreshingTokensInMixedCluster
**Build scan**: https://gradle-enterprise.elastic.co/s/7wjt3rnujlkae
**Applicable branches**: 7.x
**Failure history**:
Started once #68375 was backported to 7.x
**Failure excerpt**:
```
org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT > testGeneratingTokensInMixedCluster FAILED |
-- | --
| org.elasticsearch.client.ResponseException: method [POST], host [http://[::1]:38011], URI [/_security/oauth2/token], status line [HTTP/1.1 500 Internal Server Error] |
| {"error":{"root_cause":[{"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"}],"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"},"status":500} |
| at __randomizedtesting.SeedInfo.seed([4392CA99489E4B54:BCDB25AF041027C4]:0) |
| at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:296) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.createTokens(TokenBackwardsCompatibilityIT.java:397) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.testGeneratingTokensInMixedCluster(TokenBackwardsCompatibilityIT.java:215) |
| REPRODUCE WITH: ./gradlew ':x-pack:qa:rolling-upgrade:v7.10.3#twoThirdsUpgradedTest' -Dtests.class="org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT" -Dtests.method="testGeneratingTokensInMixedCluster" -Dtests.seed=4392CA99489E4B54 -Dtests.security.manager=true -Dtests.bwc=true -Dtests.locale=ja -Dtests.timezone=America/Creston -Druntime.java=8 |
| |
| org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT > testRefreshingTokensInMixedCluster FAILED |
| org.elasticsearch.client.ResponseException: method [POST], host [http://127.0.0.1:34445], URI [/_security/oauth2/token], status line [HTTP/1.1 500 Internal Server Error] |
| {"error":{"root_cause":[{"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"}],"type":"illegal_state_exception","reason":"Cannot update mappings in [.security-tokens-7]: system indices can only use mappings from their descriptors, but the mappings in the request did not match those in the descriptors(s)"},"status":500} |
| at __randomizedtesting.SeedInfo.seed([4392CA99489E4B54:A1F36B143CFB8E73]:0) |
| at org.elasticsearch.client.RestClient.convertResponse(RestClient.java:326) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:296) |
| at org.elasticsearch.client.RestClient.performRequest(RestClient.java:270) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.refreshToken(TokenBackwardsCompatibilityIT.java:429) |
| at org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT.testRefreshingTokensInMixedCluster(TokenBackwardsCompatibilityIT.java:246) |
| REPRODUCE WITH: ./gradlew ':x-pack:qa:rolling-upgrade:v7.10.3#twoThirdsUpgradedTest' -Dtests.class="org.elasticsearch.upgrades.TokenBackwardsCompatibilityIT" -Dtests.method="testRefreshingTokensInMixedCluster" -Dtests.seed=4392CA99489E4B54 -Dtests.security.manager=true -Dtests.bwc=true -Dtests.locale=ja -Dtests.timezone=America/Creston -Druntime.java=8
```
| test | tokenbackwardscompatibilityit failures tokenbackwardscompatibilityit » testgeneratingtokensinmixedcluster tokenbackwardscompatibilityit » testrefreshingtokensinmixedcluster build scan applicable branches x failure history started once was backported to x failure excerpt org elasticsearch upgrades tokenbackwardscompatibilityit testgeneratingtokensinmixedcluster failed org elasticsearch client responseexception method host uri status line error root cause system indices can only use mappings from their descriptors but the mappings in the request did not match those in the descriptors s type illegal state exception reason cannot update mappings in system indices can only use mappings from their descriptors but the mappings in the request did not match those in the descriptors s status at randomizedtesting seedinfo seed at org elasticsearch client restclient convertresponse restclient java at org elasticsearch client restclient performrequest restclient java at org elasticsearch client restclient performrequest restclient java at org elasticsearch upgrades tokenbackwardscompatibilityit createtokens tokenbackwardscompatibilityit java at org elasticsearch upgrades tokenbackwardscompatibilityit testgeneratingtokensinmixedcluster tokenbackwardscompatibilityit java reproduce with gradlew x pack qa rolling upgrade twothirdsupgradedtest dtests class org elasticsearch upgrades tokenbackwardscompatibilityit dtests method testgeneratingtokensinmixedcluster dtests seed dtests security manager true dtests bwc true dtests locale ja dtests timezone america creston druntime java org elasticsearch upgrades tokenbackwardscompatibilityit testrefreshingtokensinmixedcluster failed org elasticsearch client responseexception method host uri status line error root cause system indices can only use mappings from their descriptors but the mappings in the request did not match those in the descriptors s type illegal state exception reason cannot update mappings in system indices can only use mappings from their descriptors but the mappings in the request did not match those in the descriptors s status at randomizedtesting seedinfo seed at org elasticsearch client restclient convertresponse restclient java at org elasticsearch client restclient performrequest restclient java at org elasticsearch client restclient performrequest restclient java at org elasticsearch upgrades tokenbackwardscompatibilityit refreshtoken tokenbackwardscompatibilityit java at org elasticsearch upgrades tokenbackwardscompatibilityit testrefreshingtokensinmixedcluster tokenbackwardscompatibilityit java reproduce with gradlew x pack qa rolling upgrade twothirdsupgradedtest dtests class org elasticsearch upgrades tokenbackwardscompatibilityit dtests method testrefreshingtokensinmixedcluster dtests seed dtests security manager true dtests bwc true dtests locale ja dtests timezone america creston druntime java | 1 |
29,800 | 24,283,839,345 | IssuesEvent | 2022-09-28 19:58:00 | dotnet/project-system | https://api.github.com/repos/dotnet/project-system | closed | Add VS insertion detailed description | Area-Infrastructure Triage-Approved | As per the changes in [this PR](https://github.com/dotnet/project-system/pull/8420), we moved to using MicroBuild to create our VS insertion PRs. MicroBuild does not provide detailed descriptions for their VS insertion PRs. To provide the same convince that we had prior, we'll need to add this mechanism to our repo.
- Isolate the code from [Roslyn Insertion Tool](https://github.com/dotnet/roslyn-tools/tree/main/src/RoslynInsertionTool) that created this description
- Create a small, single purpose tool in our repo that simply spits out this description string
- Assign description string to a variable in the pipeline
- Assign the variable to the `InsertionDescription` value in the MicroBuild Insertion task | 1.0 | Add VS insertion detailed description - As per the changes in [this PR](https://github.com/dotnet/project-system/pull/8420), we moved to using MicroBuild to create our VS insertion PRs. MicroBuild does not provide detailed descriptions for their VS insertion PRs. To provide the same convince that we had prior, we'll need to add this mechanism to our repo.
- Isolate the code from [Roslyn Insertion Tool](https://github.com/dotnet/roslyn-tools/tree/main/src/RoslynInsertionTool) that created this description
- Create a small, single purpose tool in our repo that simply spits out this description string
- Assign description string to a variable in the pipeline
- Assign the variable to the `InsertionDescription` value in the MicroBuild Insertion task | non_test | add vs insertion detailed description as per the changes in we moved to using microbuild to create our vs insertion prs microbuild does not provide detailed descriptions for their vs insertion prs to provide the same convince that we had prior we ll need to add this mechanism to our repo isolate the code from that created this description create a small single purpose tool in our repo that simply spits out this description string assign description string to a variable in the pipeline assign the variable to the insertiondescription value in the microbuild insertion task | 0 |
11,040 | 9,206,114,886 | IssuesEvent | 2019-03-08 12:46:36 | qissue-bot/QGIS | https://api.github.com/repos/qissue-bot/QGIS | closed | Add ability to use socks5 proxy | Category: Web Services clients/WMS Component: Easy fix? Component: Pull Request or Patch supplied Component: Resolution Priority: Low Project: QGIS Application Status: Closed Tracker: Feature request | ---
Author Name: **cfarmer -** (cfarmer -)
Original Redmine Issue: 1422, https://issues.qgis.org/issues/1422
Original Assignee: nobody -
---
The attached diff adds the ability to use socks5 proxying to qgis for users working behind one of these proxy types. The fix also contains changes to the python plugin installer. Changes should not affect anything that does not explicitly recognize the change, so wms provider etc will not be affected...
---
- [bug_1422_fix.diff](https://issues.qgis.org/attachments/download/2179/bug_1422_fix.diff) (cfarmer -)
- [bug_1422_fix_2.diff](https://issues.qgis.org/attachments/download/2178/bug_1422_fix_2.diff) (cfarmer -) | 1.0 | Add ability to use socks5 proxy - ---
Author Name: **cfarmer -** (cfarmer -)
Original Redmine Issue: 1422, https://issues.qgis.org/issues/1422
Original Assignee: nobody -
---
The attached diff adds the ability to use socks5 proxying to qgis for users working behind one of these proxy types. The fix also contains changes to the python plugin installer. Changes should not affect anything that does not explicitly recognize the change, so wms provider etc will not be affected...
---
- [bug_1422_fix.diff](https://issues.qgis.org/attachments/download/2179/bug_1422_fix.diff) (cfarmer -)
- [bug_1422_fix_2.diff](https://issues.qgis.org/attachments/download/2178/bug_1422_fix_2.diff) (cfarmer -) | non_test | add ability to use proxy author name cfarmer cfarmer original redmine issue original assignee nobody the attached diff adds the ability to use proxying to qgis for users working behind one of these proxy types the fix also contains changes to the python plugin installer changes should not affect anything that does not explicitly recognize the change so wms provider etc will not be affected cfarmer cfarmer | 0 |
225,435 | 17,858,956,145 | IssuesEvent | 2021-09-05 15:38:45 | vladimirdimov99/Teodor.bg | https://api.github.com/repos/vladimirdimov99/Teodor.bg | opened | Suite - Authorization - Sign Up, Test ID - TEODOR - 010, Name - Sign Up with password under 4 characters | negative test case | **Description**
Validating that user cannot sign up with password under 4 characters
**Author**
Vladimir Dimov
**Priority**
High
**Behavior**
Negative
**Type**
Functional
**Preconditions**
1. Open https://teodor.bg/
2. Click on the profile icon at the top right.
3. Click on the “Sign Up“ button.
**Steps to reproduce**
1. Type valid credentials in the First Name, Last Name and Email required input fields.
**Expected result**
Crendetials are accepted
2. Type less than 4 characters in the password input field and the same for Confirm the password input field.
**Expected result**
Error message appears under password input field:
"Минимална дължина на това поле трябва да бъде равен или по-голяма от 4 символи. Водещите и крайните интервали ще бъдат игнорирани."
3. Tick the required options.
**Expected result**
Required options are ticked
4. Click on the “Sign Up“ button.
**Expected result**
User cannot Sign Up to the website | 1.0 | Suite - Authorization - Sign Up, Test ID - TEODOR - 010, Name - Sign Up with password under 4 characters - **Description**
Validating that user cannot sign up with password under 4 characters
**Author**
Vladimir Dimov
**Priority**
High
**Behavior**
Negative
**Type**
Functional
**Preconditions**
1. Open https://teodor.bg/
2. Click on the profile icon at the top right.
3. Click on the “Sign Up“ button.
**Steps to reproduce**
1. Type valid credentials in the First Name, Last Name and Email required input fields.
**Expected result**
Crendetials are accepted
2. Type less than 4 characters in the password input field and the same for Confirm the password input field.
**Expected result**
Error message appears under password input field:
"Минимална дължина на това поле трябва да бъде равен или по-голяма от 4 символи. Водещите и крайните интервали ще бъдат игнорирани."
3. Tick the required options.
**Expected result**
Required options are ticked
4. Click on the “Sign Up“ button.
**Expected result**
User cannot Sign Up to the website | test | suite authorization sign up test id teodor name sign up with password under characters description validating that user cannot sign up with password under characters author vladimir dimov priority high behavior negative type functional preconditions open click on the profile icon at the top right click on the “sign up“ button steps to reproduce type valid credentials in the first name last name and email required input fields expected result crendetials are accepted type less than characters in the password input field and the same for confirm the password input field expected result error message appears under password input field минимална дължина на това поле трябва да бъде равен или по голяма от символи водещите и крайните интервали ще бъдат игнорирани tick the required options expected result required options are ticked click on the “sign up“ button expected result user cannot sign up to the website | 1 |
9,080 | 12,150,601,447 | IssuesEvent | 2020-04-24 18:17:02 | googleapis/google-cloud-cpp-common | https://api.github.com/repos/googleapis/google-cloud-cpp-common | closed | all: use shfmt to format shell scripts | type: process | Carlos suggested setting up [shfmt](https://github.com/mvdan/sh) to automatically format our .sh files
From their doc page:
> to get the formatting appropriate for Google's Style guide, use `shfmt -i 2 -ci` | 1.0 | all: use shfmt to format shell scripts - Carlos suggested setting up [shfmt](https://github.com/mvdan/sh) to automatically format our .sh files
From their doc page:
> to get the formatting appropriate for Google's Style guide, use `shfmt -i 2 -ci` | non_test | all use shfmt to format shell scripts carlos suggested setting up to automatically format our sh files from their doc page to get the formatting appropriate for google s style guide use shfmt i ci | 0 |
105,351 | 9,071,880,658 | IssuesEvent | 2019-02-15 00:16:42 | spring-cloud/spring-cloud-dataflow | https://api.github.com/repos/spring-cloud/spring-cloud-dataflow | closed | Test stream/task deployments against multiple cf/k8s platform backends | in progress test-coverage | As a developer, I'd like to define multiple cf/k8s platform backends in SCDF (task config) +Skipper (stream config), so I can create, deploy or launch stream and tasks against different platforms from the Shell/UI.
**Acceptance:**
- Define 2 CF accounts (perhaps also in different pas environments)
- Define 2 K8s accounts (even on different K8s clusters)
- Deploy and Launch Streams and Tasks against different CF/K8s accounts
- Verify the deployment and launches were successful
- Verify the status and audit-history is correctly recorded | 1.0 | Test stream/task deployments against multiple cf/k8s platform backends - As a developer, I'd like to define multiple cf/k8s platform backends in SCDF (task config) +Skipper (stream config), so I can create, deploy or launch stream and tasks against different platforms from the Shell/UI.
**Acceptance:**
- Define 2 CF accounts (perhaps also in different pas environments)
- Define 2 K8s accounts (even on different K8s clusters)
- Deploy and Launch Streams and Tasks against different CF/K8s accounts
- Verify the deployment and launches were successful
- Verify the status and audit-history is correctly recorded | test | test stream task deployments against multiple cf platform backends as a developer i d like to define multiple cf platform backends in scdf task config skipper stream config so i can create deploy or launch stream and tasks against different platforms from the shell ui acceptance define cf accounts perhaps also in different pas environments define accounts even on different clusters deploy and launch streams and tasks against different cf accounts verify the deployment and launches were successful verify the status and audit history is correctly recorded | 1 |
32,810 | 4,791,220,390 | IssuesEvent | 2016-10-31 11:46:02 | leeensminger/DelDOT-NPDES-Field-Tool | https://api.github.com/repos/leeensminger/DelDOT-NPDES-Field-Tool | closed | Pipe Connection Structure Not Displaying on Map | Version 2.0 - Ready for testing in version 2.0 release | I created a structure=pipe connection, filled out all required fields and when I hit save, there was no symbol on the map where I had placed the pipe connection. I then attempted to modify the geometry and the spot where I had originally placed it was highlighted but if i cleared the graphics using the eraser on the toolbar, then the map would go back to showing no indication of a pipe connection. Pipe Connection is listed in the table of contents and has an assigned symbology.
Attempting to modify geometry

Modified Geometry

After Clearing Graphics

| 1.0 | Pipe Connection Structure Not Displaying on Map - I created a structure=pipe connection, filled out all required fields and when I hit save, there was no symbol on the map where I had placed the pipe connection. I then attempted to modify the geometry and the spot where I had originally placed it was highlighted but if i cleared the graphics using the eraser on the toolbar, then the map would go back to showing no indication of a pipe connection. Pipe Connection is listed in the table of contents and has an assigned symbology.
Attempting to modify geometry

Modified Geometry

After Clearing Graphics

| test | pipe connection structure not displaying on map i created a structure pipe connection filled out all required fields and when i hit save there was no symbol on the map where i had placed the pipe connection i then attempted to modify the geometry and the spot where i had originally placed it was highlighted but if i cleared the graphics using the eraser on the toolbar then the map would go back to showing no indication of a pipe connection pipe connection is listed in the table of contents and has an assigned symbology attempting to modify geometry modified geometry after clearing graphics | 1 |
333,724 | 29,803,691,238 | IssuesEvent | 2023-06-16 10:00:05 | Khalon-Bridge/gitunioin-test-specs | https://api.github.com/repos/Khalon-Bridge/gitunioin-test-specs | opened | Customize Notification Settings | currency: USD tasks:4 feature repo:gitunioin-test gitunion gitunion-app owner:Khalon-Bridge private org tech: react/nodejs bounty: 150 | ### Description
Allow users to customize their notification settings for emails.
### Acceptance criteria
- [ ] Users should be able to choose which email accounts they want notifications for.
- [ ] Users should be able to choose the frequency of notifications.
- [ ] Users should be able to customize the notification sound.
- [ ] Settings should be saved for each user. | 1.0 | Customize Notification Settings - ### Description
Allow users to customize their notification settings for emails.
### Acceptance criteria
- [ ] Users should be able to choose which email accounts they want notifications for.
- [ ] Users should be able to choose the frequency of notifications.
- [ ] Users should be able to customize the notification sound.
- [ ] Settings should be saved for each user. | test | customize notification settings description allow users to customize their notification settings for emails acceptance criteria users should be able to choose which email accounts they want notifications for users should be able to choose the frequency of notifications users should be able to customize the notification sound settings should be saved for each user | 1 |
99,943 | 8,718,634,494 | IssuesEvent | 2018-12-07 21:07:50 | rancher/rancher | https://api.github.com/repos/rancher/rancher | closed | Azure cluster with cloud provider - worker nodes fails to start because of kubelet restarting constantly with cloud provider option set. | kind/bug-qa priority/0 releases/alpha1 status/resolved status/to-test version/2.0 | Rancher server - Build from master - Dec 5
Steps to reproduce the problem:
Provision a Azure cluster with cloud provider with following node configuration with Azure cloud provider option set:
1 control node
1 etcd node
3 worker nodes.
Azure cluster gets to "Active" sate but worker nodes continue to be stuck in "Unavailable" state
with error message
```Kubelet stopped posting node status.```
or
```
Container runtime is down,PLEG is not healthy: pleg was last seen active 2562047h47m16.854775807s ago; threshold is 3m0s,runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
```
Rancher server logs:
```
018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
```
Agent logs:
```
time="2018-12-05T20:48:44Z" level=info msg="For process kubelet, Env has changed from [RKE_CLOUD_CONFIG_CHECKSUM=da94b08b5b80f8b5d5440cfd61760890] to [RKE_CLOUD_CONFIG_CHECKSUM=a2a7f88d0f324000661600c46fbe024a PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin]"
```
| 1.0 | Azure cluster with cloud provider - worker nodes fails to start because of kubelet restarting constantly with cloud provider option set. - Rancher server - Build from master - Dec 5
Steps to reproduce the problem:
Provision a Azure cluster with cloud provider with following node configuration with Azure cloud provider option set:
1 control node
1 etcd node
3 worker nodes.
Azure cluster gets to "Active" sate but worker nodes continue to be stuck in "Unavailable" state
with error message
```Kubelet stopped posting node status.```
or
```
Container runtime is down,PLEG is not healthy: pleg was last seen active 2562047h47m16.854775807s ago; threshold is 3m0s,runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:docker: network plugin is not ready: cni config uninitialized
```
Rancher server logs:
```
018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioning cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Updating cluster [c-bcgdr]
2018/12/05 20:46:51 [INFO] Provisioned cluster [c-bcgdr]
```
Agent logs:
```
time="2018-12-05T20:48:44Z" level=info msg="For process kubelet, Env has changed from [RKE_CLOUD_CONFIG_CHECKSUM=da94b08b5b80f8b5d5440cfd61760890] to [RKE_CLOUD_CONFIG_CHECKSUM=a2a7f88d0f324000661600c46fbe024a PATH=/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin]"
```
| test | azure cluster with cloud provider worker nodes fails to start because of kubelet restarting constantly with cloud provider option set rancher server build from master dec steps to reproduce the problem provision a azure cluster with cloud provider with following node configuration with azure cloud provider option set control node etcd node worker nodes azure cluster gets to active sate but worker nodes continue to be stuck in unavailable state with error message kubelet stopped posting node status or container runtime is down pleg is not healthy pleg was last seen active ago threshold is runtime network not ready networkready false reason networkpluginnotready message docker network plugin is not ready cni config uninitialized rancher server logs provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster provisioning cluster updating cluster provisioned cluster agent logs time level info msg for process kubelet env has changed from to | 1 |
326,132 | 27,976,233,282 | IssuesEvent | 2023-03-25 16:12:29 | unifyai/ivy | https://api.github.com/repos/unifyai/ivy | reopened | Fix elementwise.test_divide | Sub Task Failing Test | | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4519909128/jobs/7960677995" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4488346476/jobs/7892829556" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4488346476/jobs/7892829556" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4488346476/jobs/7892829556" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
<details>
<summary>Not found</summary>
Not found
</details>
| 1.0 | Fix elementwise.test_divide - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4519909128/jobs/7960677995" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4488346476/jobs/7892829556" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4488346476/jobs/7892829556" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4488346476/jobs/7892829556" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-success-success></a>
<details>
<summary>Not found</summary>
Not found
</details>
| test | fix elementwise test divide tensorflow img src torch img src numpy img src jax img src not found not found | 1 |
135,798 | 18,722,107,763 | IssuesEvent | 2021-11-03 12:58:29 | KDWSS/dd-trace-java | https://api.github.com/repos/KDWSS/dd-trace-java | opened | CVE-2021-35515 (High) detected in multiple libraries | security vulnerability | ## CVE-2021-35515 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>commons-compress-1.20.jar</b>, <b>commons-compress-1.10.jar</b>, <b>commons-compress-1.19.jar</b></p></summary>
<p>
<details><summary><b>commons-compress-1.20.jar</b></p></summary>
<p>Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.</p>
<p>Library home page: <a href="https://commons.apache.org/proper/commons-compress/">https://commons.apache.org/proper/commons-compress/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/rabbitmq-amqp-2.7/rabbitmq-amqp-2.7.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar</p>
<p>
Dependency Hierarchy:
- testcontainers-1.15.0-rc2.jar (Root Library)
- :x: **commons-compress-1.20.jar** (Vulnerable Library)
</details>
<details><summary><b>commons-compress-1.10.jar</b></p></summary>
<p>Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
lzma, xz, Snappy, traditional Unix Compress, DEFLATE and ar, cpio,
jar, tar, zip, dump, 7z, arj.</p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/mongo/driver-3.1-core-test/driver-3.1-core-test.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar</p>
<p>
Dependency Hierarchy:
- de.flapdoodle.embed.mongo-1.50.5.jar (Root Library)
- de.flapdoodle.embed.process-1.50.2.jar
- :x: **commons-compress-1.10.jar** (Vulnerable Library)
</details>
<details><summary><b>commons-compress-1.19.jar</b></p></summary>
<p>Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.</p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/mule-4/mule-4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.19/commons-compress-1.19.jar</p>
<p>
Dependency Hierarchy:
- mule-service-weave-2.2.2.jar (Root Library)
- avro-module-2.2.2.jar
- avro-1.10.0-MULE_3.jar
- :x: **commons-compress-1.19.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
When reading a specially crafted 7Z archive, the construction of the list of codecs that decompress an entry can result in an infinite loop. This could be used to mount a denial of service attack against services that use Compress' sevenz package.
<p>Publish Date: 2021-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35515>CVE-2021-35515</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://commons.apache.org/proper/commons-compress/security-reports.html">https://commons.apache.org/proper/commons-compress/security-reports.html</a></p>
<p>Release Date: 2021-07-13</p>
<p>Fix Resolution: org.apache.commons:commons-compress:1.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.commons","packageName":"commons-compress","packageVersion":"1.20","packageFilePaths":["/dd-java-agent/instrumentation/rabbitmq-amqp-2.7/rabbitmq-amqp-2.7.gradle","/dd-java-agent/instrumentation/spymemcached-2.12/spymemcached-2.12.gradle","/dd-java-agent/instrumentation/aerospike-4/aerospike-4.gradle","/dd-smoke-tests/springboot-mongo/springboot-mongo.gradle","/dd-trace-core/dd-trace-core.gradle","/dd-java-agent/instrumentation/spring-rabbit/spring-rabbit.gradle","/dd-java-agent/instrumentation/vertx-mysql-client-3.9/vertx-mysql-client-3.9.gradle","/dd-java-agent/instrumentation/jdbc/jdbc.gradle","/dd-smoke-tests/spring-boot-rabbit/spring-boot-rabbit.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.testcontainers:testcontainers:1.15.0-rc2;org.apache.commons:commons-compress:1.20","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.commons:commons-compress:1.21"},{"packageType":"Java","groupId":"org.apache.commons","packageName":"commons-compress","packageVersion":"1.10","packageFilePaths":["/dd-java-agent/instrumentation/mongo/driver-3.1-core-test/driver-3.1-core-test.gradle","/dd-java-agent/instrumentation/mongo/mongo.gradle","/dd-java-agent/instrumentation/mongo/driver-4.0-test/driver-4.0-test.gradle","/dd-java-agent/instrumentation/mongo/driver-3.4/driver-3.4.gradle","/dd-java-agent/appsec/weblog/weblog-spring-app/weblog-spring-app.gradle","/dd-java-agent/instrumentation/mongo/driver-3.3-async-test/driver-3.3-async-test.gradle","/dd-java-agent/instrumentation/mongo/driver-3.1/driver-3.1.gradle","/dd-java-agent/instrumentation/mongo/driver-3.7-core-test/driver-3.7-core-test.gradle","/dd-java-agent/instrumentation/mongo/driver-3.10-sync-test/driver-3.10-sync-test.gradle"],"isTransitiveDependency":true,"dependencyTree":"de.flapdoodle.embed:de.flapdoodle.embed.mongo:1.50.5;de.flapdoodle.embed:de.flapdoodle.embed.process:1.50.2;org.apache.commons:commons-compress:1.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.commons:commons-compress:1.21"},{"packageType":"Java","groupId":"org.apache.commons","packageName":"commons-compress","packageVersion":"1.19","packageFilePaths":["/dd-java-agent/instrumentation/mule-4/mule-4.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.mule.services:mule-service-weave:2.2.2;org.mule.weave:avro-module:2.2.2;org.apache.avro:avro:1.10.0-MULE_3;org.apache.commons:commons-compress:1.19","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.commons:commons-compress:1.21"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-35515","vulnerabilityDetails":"When reading a specially crafted 7Z archive, the construction of the list of codecs that decompress an entry can result in an infinite loop. This could be used to mount a denial of service attack against services that use Compress\u0027 sevenz package.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35515","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | True | CVE-2021-35515 (High) detected in multiple libraries - ## CVE-2021-35515 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>commons-compress-1.20.jar</b>, <b>commons-compress-1.10.jar</b>, <b>commons-compress-1.19.jar</b></p></summary>
<p>
<details><summary><b>commons-compress-1.20.jar</b></p></summary>
<p>Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.</p>
<p>Library home page: <a href="https://commons.apache.org/proper/commons-compress/">https://commons.apache.org/proper/commons-compress/</a></p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/rabbitmq-amqp-2.7/rabbitmq-amqp-2.7.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar,/home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.20/commons-compress-1.20.jar</p>
<p>
Dependency Hierarchy:
- testcontainers-1.15.0-rc2.jar (Root Library)
- :x: **commons-compress-1.20.jar** (Vulnerable Library)
</details>
<details><summary><b>commons-compress-1.10.jar</b></p></summary>
<p>Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
lzma, xz, Snappy, traditional Unix Compress, DEFLATE and ar, cpio,
jar, tar, zip, dump, 7z, arj.</p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/mongo/driver-3.1-core-test/driver-3.1-core-test.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar,/home/wss-scanner/.gradle/caches/modules-2/files-2.1/org.apache.commons/commons-compress/1.10/5eeb27c57eece1faf2d837868aeccc94d84dcc9a/commons-compress-1.10.jar</p>
<p>
Dependency Hierarchy:
- de.flapdoodle.embed.mongo-1.50.5.jar (Root Library)
- de.flapdoodle.embed.process-1.50.2.jar
- :x: **commons-compress-1.10.jar** (Vulnerable Library)
</details>
<details><summary><b>commons-compress-1.19.jar</b></p></summary>
<p>Apache Commons Compress software defines an API for working with
compression and archive formats. These include: bzip2, gzip, pack200,
lzma, xz, Snappy, traditional Unix Compress, DEFLATE, DEFLATE64, LZ4,
Brotli, Zstandard and ar, cpio, jar, tar, zip, dump, 7z, arj.</p>
<p>Path to dependency file: dd-trace-java/dd-java-agent/instrumentation/mule-4/mule-4.gradle</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/org/apache/commons/commons-compress/1.19/commons-compress-1.19.jar</p>
<p>
Dependency Hierarchy:
- mule-service-weave-2.2.2.jar (Root Library)
- avro-module-2.2.2.jar
- avro-1.10.0-MULE_3.jar
- :x: **commons-compress-1.19.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/KDWSS/dd-trace-java/commit/2819174635979a19573ec0ce8e3e2b63a3848079">2819174635979a19573ec0ce8e3e2b63a3848079</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
When reading a specially crafted 7Z archive, the construction of the list of codecs that decompress an entry can result in an infinite loop. This could be used to mount a denial of service attack against services that use Compress' sevenz package.
<p>Publish Date: 2021-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35515>CVE-2021-35515</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://commons.apache.org/proper/commons-compress/security-reports.html">https://commons.apache.org/proper/commons-compress/security-reports.html</a></p>
<p>Release Date: 2021-07-13</p>
<p>Fix Resolution: org.apache.commons:commons-compress:1.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"org.apache.commons","packageName":"commons-compress","packageVersion":"1.20","packageFilePaths":["/dd-java-agent/instrumentation/rabbitmq-amqp-2.7/rabbitmq-amqp-2.7.gradle","/dd-java-agent/instrumentation/spymemcached-2.12/spymemcached-2.12.gradle","/dd-java-agent/instrumentation/aerospike-4/aerospike-4.gradle","/dd-smoke-tests/springboot-mongo/springboot-mongo.gradle","/dd-trace-core/dd-trace-core.gradle","/dd-java-agent/instrumentation/spring-rabbit/spring-rabbit.gradle","/dd-java-agent/instrumentation/vertx-mysql-client-3.9/vertx-mysql-client-3.9.gradle","/dd-java-agent/instrumentation/jdbc/jdbc.gradle","/dd-smoke-tests/spring-boot-rabbit/spring-boot-rabbit.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.testcontainers:testcontainers:1.15.0-rc2;org.apache.commons:commons-compress:1.20","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.commons:commons-compress:1.21"},{"packageType":"Java","groupId":"org.apache.commons","packageName":"commons-compress","packageVersion":"1.10","packageFilePaths":["/dd-java-agent/instrumentation/mongo/driver-3.1-core-test/driver-3.1-core-test.gradle","/dd-java-agent/instrumentation/mongo/mongo.gradle","/dd-java-agent/instrumentation/mongo/driver-4.0-test/driver-4.0-test.gradle","/dd-java-agent/instrumentation/mongo/driver-3.4/driver-3.4.gradle","/dd-java-agent/appsec/weblog/weblog-spring-app/weblog-spring-app.gradle","/dd-java-agent/instrumentation/mongo/driver-3.3-async-test/driver-3.3-async-test.gradle","/dd-java-agent/instrumentation/mongo/driver-3.1/driver-3.1.gradle","/dd-java-agent/instrumentation/mongo/driver-3.7-core-test/driver-3.7-core-test.gradle","/dd-java-agent/instrumentation/mongo/driver-3.10-sync-test/driver-3.10-sync-test.gradle"],"isTransitiveDependency":true,"dependencyTree":"de.flapdoodle.embed:de.flapdoodle.embed.mongo:1.50.5;de.flapdoodle.embed:de.flapdoodle.embed.process:1.50.2;org.apache.commons:commons-compress:1.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.commons:commons-compress:1.21"},{"packageType":"Java","groupId":"org.apache.commons","packageName":"commons-compress","packageVersion":"1.19","packageFilePaths":["/dd-java-agent/instrumentation/mule-4/mule-4.gradle"],"isTransitiveDependency":true,"dependencyTree":"org.mule.services:mule-service-weave:2.2.2;org.mule.weave:avro-module:2.2.2;org.apache.avro:avro:1.10.0-MULE_3;org.apache.commons:commons-compress:1.19","isMinimumFixVersionAvailable":true,"minimumFixVersion":"org.apache.commons:commons-compress:1.21"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2021-35515","vulnerabilityDetails":"When reading a specially crafted 7Z archive, the construction of the list of codecs that decompress an entry can result in an infinite loop. This could be used to mount a denial of service attack against services that use Compress\u0027 sevenz package.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-35515","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> --> | non_test | cve high detected in multiple libraries cve high severity vulnerability vulnerable libraries commons compress jar commons compress jar commons compress jar commons compress jar apache commons compress software defines an api for working with compression and archive formats these include gzip lzma xz snappy traditional unix compress deflate brotli zstandard and ar cpio jar tar zip dump arj library home page a href path to dependency file dd trace java dd java agent instrumentation rabbitmq amqp rabbitmq amqp gradle path to vulnerable library home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar home wss scanner repository org apache commons commons compress commons compress jar dependency hierarchy testcontainers jar root library x commons compress jar vulnerable library commons compress jar apache commons compress software defines an api for working with compression and archive formats these include gzip lzma xz snappy traditional unix compress deflate and ar cpio jar tar zip dump arj path to dependency file dd trace java dd java agent instrumentation mongo driver core test driver core test gradle path to vulnerable library home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar home wss scanner gradle caches modules files org apache commons commons compress commons compress jar dependency hierarchy de flapdoodle embed mongo jar root library de flapdoodle embed process jar x commons compress jar vulnerable library commons compress jar apache commons compress software defines an api for working with compression and archive formats these include gzip lzma xz snappy traditional unix compress deflate brotli zstandard and ar cpio jar tar zip dump arj path to dependency file dd trace java dd java agent instrumentation mule mule gradle path to vulnerable library home wss scanner repository org apache commons commons compress commons compress jar dependency hierarchy mule service weave jar root library avro module jar avro mule jar x commons compress jar vulnerable library found in head commit a href found in base branch master vulnerability details when reading a specially crafted archive the construction of the list of codecs that decompress an entry can result in an infinite loop this could be used to mount a denial of service attack against services that use compress sevenz package publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution org apache commons commons compress isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org testcontainers testcontainers org apache commons commons compress isminimumfixversionavailable true minimumfixversion org apache commons commons compress packagetype java groupid org apache commons packagename commons compress packageversion packagefilepaths istransitivedependency true dependencytree de flapdoodle embed de flapdoodle embed mongo de flapdoodle embed de flapdoodle embed process org apache commons commons compress isminimumfixversionavailable true minimumfixversion org apache commons commons compress packagetype java groupid org apache commons packagename commons compress packageversion packagefilepaths istransitivedependency true dependencytree org mule services mule service weave org mule weave avro module org apache avro avro mule org apache commons commons compress isminimumfixversionavailable true minimumfixversion org apache commons commons compress basebranches vulnerabilityidentifier cve vulnerabilitydetails when reading a specially crafted archive the construction of the list of codecs that decompress an entry can result in an infinite loop this could be used to mount a denial of service attack against services that use compress sevenz package vulnerabilityurl | 0 |
2,759 | 2,642,559,175 | IssuesEvent | 2015-03-12 01:11:05 | Annexa/Moki-Ecommerce | https://api.github.com/repos/Annexa/Moki-Ecommerce | opened | Check all taxes are calculated correctly based on international location | Page design | International distros: 0% tax | 1.0 | Check all taxes are calculated correctly based on international location - International distros: 0% tax | non_test | check all taxes are calculated correctly based on international location international distros tax | 0 |
20,630 | 3,829,678,103 | IssuesEvent | 2016-03-31 11:48:17 | docker/docker | https://api.github.com/repos/docker/docker | opened | Unit test TestOverlayCreateSnap fails (on Fedora) | area/storage/overlay kind/flakytest | This test is consistently failing on my host with:
```
--- FAIL: TestOverlayCreateSnap (0.00s)
graphtest_unix.go:122: stat /var/tmp/docker-graphtest-723995303/overlay/Snap/merged/a subdir: no such file or directory
FAIL
coverage: 31.9% of statements
```
docker info
```
Containers: 1
Running: 1
Paused: 0
Stopped: 0
Images: 167
Server Version: 1.11.0-dev
Storage Driver: overlay
Backing Filesystem: extfs
Logging Driver: journald
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge null host
Kernel Version: 4.4.6-300.fc23.x86_64
Operating System: Fedora 23 (Workstation Edition)
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 11.44 GiB
Name: localhost.localdomain
ID: JQDM:ZDU3:BYSR:SIMJ:QGDH:GBGW:MF7Q:VYLQ:UYFD:W5KY:H4WG:PIMF
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): true
File Descriptors: 17
Goroutines: 48
System Time: 2016-03-31T13:47:54.765083779+02:00
EventsListeners: 0
Registry: https://index.docker.io/v1/
Experimental: true
```
docker version
```
Client:
Version: 1.11.0-dev
API version: 1.24
Go version: go1.5.3
Git commit: 47fa54a
Built: Thu Mar 31 13:44:02 2016
OS/Arch: linux/amd64
Experimental: true
Server:
Version: 1.11.0-dev
API version: 1.24
Go version: go1.5.3
Git commit: 47fa54a
Built: Thu Mar 31 13:44:02 2016
OS/Arch: linux/amd64
Experimental: true
```
I don't have any clue about this - might be kernel version (?)
ping @LK4D4 @estesp @calavera @icecrime | 1.0 | Unit test TestOverlayCreateSnap fails (on Fedora) - This test is consistently failing on my host with:
```
--- FAIL: TestOverlayCreateSnap (0.00s)
graphtest_unix.go:122: stat /var/tmp/docker-graphtest-723995303/overlay/Snap/merged/a subdir: no such file or directory
FAIL
coverage: 31.9% of statements
```
docker info
```
Containers: 1
Running: 1
Paused: 0
Stopped: 0
Images: 167
Server Version: 1.11.0-dev
Storage Driver: overlay
Backing Filesystem: extfs
Logging Driver: journald
Cgroup Driver: cgroupfs
Plugins:
Volume: local
Network: bridge null host
Kernel Version: 4.4.6-300.fc23.x86_64
Operating System: Fedora 23 (Workstation Edition)
OSType: linux
Architecture: x86_64
CPUs: 4
Total Memory: 11.44 GiB
Name: localhost.localdomain
ID: JQDM:ZDU3:BYSR:SIMJ:QGDH:GBGW:MF7Q:VYLQ:UYFD:W5KY:H4WG:PIMF
Docker Root Dir: /var/lib/docker
Debug Mode (client): false
Debug Mode (server): true
File Descriptors: 17
Goroutines: 48
System Time: 2016-03-31T13:47:54.765083779+02:00
EventsListeners: 0
Registry: https://index.docker.io/v1/
Experimental: true
```
docker version
```
Client:
Version: 1.11.0-dev
API version: 1.24
Go version: go1.5.3
Git commit: 47fa54a
Built: Thu Mar 31 13:44:02 2016
OS/Arch: linux/amd64
Experimental: true
Server:
Version: 1.11.0-dev
API version: 1.24
Go version: go1.5.3
Git commit: 47fa54a
Built: Thu Mar 31 13:44:02 2016
OS/Arch: linux/amd64
Experimental: true
```
I don't have any clue about this - might be kernel version (?)
ping @LK4D4 @estesp @calavera @icecrime | test | unit test testoverlaycreatesnap fails on fedora this test is consistently failing on my host with fail testoverlaycreatesnap graphtest unix go stat var tmp docker graphtest overlay snap merged a subdir no such file or directory fail coverage of statements docker info containers running paused stopped images server version dev storage driver overlay backing filesystem extfs logging driver journald cgroup driver cgroupfs plugins volume local network bridge null host kernel version operating system fedora workstation edition ostype linux architecture cpus total memory gib name localhost localdomain id jqdm bysr simj qgdh gbgw vylq uyfd pimf docker root dir var lib docker debug mode client false debug mode server true file descriptors goroutines system time eventslisteners registry experimental true docker version client version dev api version go version git commit built thu mar os arch linux experimental true server version dev api version go version git commit built thu mar os arch linux experimental true i don t have any clue about this might be kernel version ping estesp calavera icecrime | 1 |
172,025 | 13,259,747,421 | IssuesEvent | 2020-08-20 17:09:40 | fetchai/agents-aea | https://api.github.com/repos/fetchai/agents-aea | closed | Test against Python 3.5 | test | **Is your feature request related to a problem? Please describe.**
The framework has not been tested with Python 3.5
**Describe the solution you'd like**
Test the framework with Python 3.5
**Describe alternatives you've considered**
**Additional context**
| 1.0 | Test against Python 3.5 - **Is your feature request related to a problem? Please describe.**
The framework has not been tested with Python 3.5
**Describe the solution you'd like**
Test the framework with Python 3.5
**Describe alternatives you've considered**
**Additional context**
| test | test against python is your feature request related to a problem please describe the framework has not been tested with python describe the solution you d like test the framework with python describe alternatives you ve considered additional context | 1 |
203,834 | 15,391,251,611 | IssuesEvent | 2021-03-03 14:22:01 | WoWManiaUK/Redemption | https://api.github.com/repos/WoWManiaUK/Redemption | closed | Bite them hard! | Fixed on PTR - Tester Confirmed | **Links:** Can not find
**What is Happening:** Quest not need to be taken but it marks automatically complete after winning battleground. Second problem is after quest given to the npc and rewarad taken if win another battleground the npc will have question mark on head and can talk to him for quest but then you can not turn in
**What Should happen:** Quest must be taken to be complete before winning battleground and there must not be question mark after win another battleground
| 1.0 | Bite them hard! - **Links:** Can not find
**What is Happening:** Quest not need to be taken but it marks automatically complete after winning battleground. Second problem is after quest given to the npc and rewarad taken if win another battleground the npc will have question mark on head and can talk to him for quest but then you can not turn in
**What Should happen:** Quest must be taken to be complete before winning battleground and there must not be question mark after win another battleground
| test | bite them hard links can not find what is happening quest not need to be taken but it marks automatically complete after winning battleground second problem is after quest given to the npc and rewarad taken if win another battleground the npc will have question mark on head and can talk to him for quest but then you can not turn in what should happen quest must be taken to be complete before winning battleground and there must not be question mark after win another battleground | 1 |
47,568 | 5,902,742,089 | IssuesEvent | 2017-05-19 02:57:33 | tgstation/tgstation | https://api.github.com/repos/tgstation/tgstation | opened | Projectile dampener field broken? | Bug Needs Reproducing/Testing | 
I have no idea how this happened, but the peacekeeper borg isn't in sight. | 1.0 | Projectile dampener field broken? - 
I have no idea how this happened, but the peacekeeper borg isn't in sight. | test | projectile dampener field broken i have no idea how this happened but the peacekeeper borg isn t in sight | 1 |
186,142 | 15,048,692,135 | IssuesEvent | 2021-02-03 10:30:34 | cosmo-epfl/chemiscope | https://api.github.com/repos/cosmo-epfl/chemiscope | opened | Typo in tutorial "input file format for chemiscope" | documentation | In the first pseudo script under the section "Creating an input file"
the ase.io import needs to be changed to
```py
import ase.io
```
same for sklearn import
```py
import sklearn.decomposition
```
(or you change the usage later in the pseudo script) | 1.0 | Typo in tutorial "input file format for chemiscope" - In the first pseudo script under the section "Creating an input file"
the ase.io import needs to be changed to
```py
import ase.io
```
same for sklearn import
```py
import sklearn.decomposition
```
(or you change the usage later in the pseudo script) | non_test | typo in tutorial input file format for chemiscope in the first pseudo script under the section creating an input file the ase io import needs to be changed to py import ase io same for sklearn import py import sklearn decomposition or you change the usage later in the pseudo script | 0 |
283,979 | 24,574,568,681 | IssuesEvent | 2022-10-13 11:18:02 | kyma-project/api-gateway | https://api.github.com/repos/kyma-project/api-gateway | opened | Add test coverage output to test pipeline | kind/feature area/ci area/api-gateway area/tests | <!-- Thank you for your contribution. Before you submit the issue:
1. Search open and closed issues for duplicates.
2. Read the contributing guidelines.
-->
**Description**
Add html test coverage report as output of `pull-api-gateway-lint`
**Reasons**
Allow PR's to be easily checked whether the implemented functionality is covered with a unit test
<!-- Explain why we should add this feature. Provide use cases to illustrate its benefits. -->
**Attachments**
https://status.build.kyma-project.io/view/gs/kyma-prow-logs/pr-logs/pull/kyma-project_api-gateway/42/pre-kyma-project-api-gateway/1575843552640897024
<!-- Attach any files, links, code samples, or screenshots that will convince us to your idea. -->
| 1.0 | Add test coverage output to test pipeline - <!-- Thank you for your contribution. Before you submit the issue:
1. Search open and closed issues for duplicates.
2. Read the contributing guidelines.
-->
**Description**
Add html test coverage report as output of `pull-api-gateway-lint`
**Reasons**
Allow PR's to be easily checked whether the implemented functionality is covered with a unit test
<!-- Explain why we should add this feature. Provide use cases to illustrate its benefits. -->
**Attachments**
https://status.build.kyma-project.io/view/gs/kyma-prow-logs/pr-logs/pull/kyma-project_api-gateway/42/pre-kyma-project-api-gateway/1575843552640897024
<!-- Attach any files, links, code samples, or screenshots that will convince us to your idea. -->
| test | add test coverage output to test pipeline thank you for your contribution before you submit the issue search open and closed issues for duplicates read the contributing guidelines description add html test coverage report as output of pull api gateway lint reasons allow pr s to be easily checked whether the implemented functionality is covered with a unit test attachments | 1 |
38,120 | 5,166,847,150 | IssuesEvent | 2017-01-17 17:11:57 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | github.com/cockroachdb/cockroach/pkg/storage: TestRefreshPendingCommands failed under stress | Robot test-failure | SHA: https://github.com/cockroachdb/cockroach/commits/6dbac80561b565d2f5e8304b9f36c4ed9112f0f2
Parameters:
```
COCKROACH_PROPOSER_EVALUATED_KV=true
TAGS=deadlock
GOFLAGS=
```
Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=117253&tab=buildLog
```
I170113 10:05:48.200608 426585 storage/store.go:1250 [n1,s1]: failed initial metrics computation: [n1,s1]: system config not yet available
I170113 10:05:48.200928 426585 gossip/gossip.go:292 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:59802" > attrs:<> locality:<>
W170113 10:05:48.214283 426585 gossip/gossip.go:1138 [n?] no incoming or outgoing connections
I170113 10:05:48.216986 427677 gossip/client.go:125 [n2] started gossip client to 127.0.0.1:59802
I170113 10:05:48.217608 426585 storage/store.go:1250 [n2,s2]: failed initial metrics computation: [n2,s2]: system config not yet available
I170113 10:05:48.217741 426585 gossip/gossip.go:292 [n2] NodeDescriptor set to node_id:2 address:<network_field:"tcp" address_field:"127.0.0.1:33925" > attrs:<> locality:<>
W170113 10:05:48.225063 426585 gossip/gossip.go:1138 [n?] no incoming or outgoing connections
I170113 10:05:48.227954 428483 gossip/client.go:125 [n3] started gossip client to 127.0.0.1:59802
I170113 10:05:48.228099 426585 storage/store.go:1250 [n3,s3]: failed initial metrics computation: [n3,s3]: system config not yet available
I170113 10:05:48.228194 426585 gossip/gossip.go:292 [n3] NodeDescriptor set to node_id:3 address:<network_field:"tcp" address_field:"127.0.0.1:60336" > attrs:<> locality:<>
I170113 10:05:48.234485 426585 storage/replica_raftstorage.go:410 [s1,r1/1:/M{in-ax},@c428b44c00] generated preemptive snapshot 1124dfc0 at index 15
I170113 10:05:48.235907 426585 storage/store.go:3275 [s1,r1/1:/M{in-ax},@c428b44c00] streamed snapshot: kv pairs: 34, log entries: 5, 1ms
I170113 10:05:48.236469 429173 storage/replica_raftstorage.go:575 [s2,r1/?:{-},@c42a55e000] applying preemptive snapshot at index 15 (id=1124dfc0, encoded size=6643, 1 rocksdb batches, 5 log entries)
I170113 10:05:48.237030 429173 storage/replica_raftstorage.go:583 [s2,r1/?:/M{in-ax},@c42a55e000] applied preemptive snapshot in 0ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I170113 10:05:48.238325 426585 storage/replica_command.go:3210 [s1,r1/1:/M{in-ax},@c428b44c00] change replicas (remove {2 2 2}): read existing descriptor range_id:1 start_key:"" end_key:"\377\377" replicas:<node_id:1 store_id:1 replica_id:1 > next_replica_id:2
I170113 10:05:48.242318 429428 storage/replica.go:2371 [s1,r1/1:/M{in-ax},@c428b44c00] proposing ADD_REPLICA {NodeID:2 StoreID:2 ReplicaID:2}: [{NodeID:1 StoreID:1 ReplicaID:1} {NodeID:2 StoreID:2 ReplicaID:2}]
I170113 10:05:48.246949 426585 storage/replica_raftstorage.go:410 [s1,r1/1:/M{in-ax},@c428b44c00] generated preemptive snapshot a7d765b2 at index 17
I170113 10:05:48.247585 426585 storage/store.go:3275 [s1,r1/1:/M{in-ax},@c428b44c00] streamed snapshot: kv pairs: 37, log entries: 7, 0ms
I170113 10:05:48.248796 429017 storage/replica_raftstorage.go:575 [s3,r1/?:{-},@c428b26900] applying preemptive snapshot at index 17 (id=a7d765b2, encoded size=8825, 1 rocksdb batches, 7 log entries)
I170113 10:05:48.249527 429017 storage/replica_raftstorage.go:583 [s3,r1/?:/M{in-ax},@c428b26900] applied preemptive snapshot in 1ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I170113 10:05:48.250671 426585 storage/replica_command.go:3210 [s1,r1/1:/M{in-ax},@c428b44c00] change replicas (remove {3 3 3}): read existing descriptor range_id:1 start_key:"" end_key:"\377\377" replicas:<node_id:1 store_id:1 replica_id:1 > replicas:<node_id:2 store_id:2 replica_id:2 > next_replica_id:3
I170113 10:05:48.252947 430063 storage/raft_transport.go:437 raft transport stream to node 1 established
I170113 10:05:48.258817 430374 storage/replica.go:2371 [s1,r1/1:/M{in-ax},@c428b44c00] proposing ADD_REPLICA {NodeID:3 StoreID:3 ReplicaID:3}: [{NodeID:1 StoreID:1 ReplicaID:1} {NodeID:2 StoreID:2 ReplicaID:2} {NodeID:3 StoreID:3 ReplicaID:3}]
W170113 10:05:48.340140 431141 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.340305 431120 storage/store.go:3135 [s1] raft error: node 3 claims to not contain store 3 for replica {3 3 3}: store 3 was not found
W170113 10:05:48.340365 431112 storage/raft_transport.go:443 raft transport stream to node 3 failed: store 3 was not found
I170113 10:05:48.342277 432207 storage/raft_transport.go:437 raft transport stream to node 3 established
W170113 10:05:48.343660 432249 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.343800 432249 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.343888 432234 storage/store.go:3135 [s1] raft error: node 3 claims to not contain store 3 for replica {3 3 3}: store 3 was not found
I170113 10:05:48.346673 432512 storage/raft_transport.go:437 raft transport stream to node 3 established
W170113 10:05:48.347194 430083 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:2 StoreID:2 ReplicaID:2}: no handler registered for {NodeID:1 StoreID:1 ReplicaID:1}
W170113 10:05:48.347432 432567 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.347504 430085 storage/store.go:3135 [s2] raft error: node 1 claims to not contain store 1 for replica {1 1 1}: store 1 was not found
W170113 10:05:48.347618 430063 storage/raft_transport.go:443 raft transport stream to node 1 failed: store 1 was not found
W170113 10:05:48.347713 432573 storage/raft_transport.go:478 no handler found for store 1 in response range_id:1 from_replica:<node_id:3 store_id:3 replica_id:3 > to_replica:<node_id:1 store_id:1 replica_id:1 > union:<error:<message:"store 3 was not found" transaction_restart:NONE origin_node:0 detail:<store_not_found:<store_id:3 > > now:<wall_time:0 logical:0 > > >
I170113 10:05:48.718120 434258 storage/raft_transport.go:437 raft transport stream to node 1 established
I170113 10:05:48.721683 427858 storage/replica_raftstorage.go:410 [raftsnapshot,s2,r1/2:/M{in-ax},@c42a55e000] generated Raft snapshot 3ddb665c at index 23
I170113 10:05:48.722379 427858 storage/store.go:3275 [raftsnapshot,s2,r1/2:/M{in-ax},@c42a55e000] streamed snapshot: kv pairs: 42, log entries: 2, 0ms
I170113 10:05:48.722767 434441 storage/replica_raftstorage.go:575 [s3,r1/3:/M{in-ax},@c425017200] applying Raft snapshot at index 23 (id=3ddb665c, encoded size=4802, 1 rocksdb batches, 2 log entries)
I170113 10:05:48.723324 434441 storage/replica_raftstorage.go:583 [s3,r1/3:/M{in-ax},@c425017200] applied Raft snapshot in 0ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I170113 10:05:48.867851 433304 storage/replica.go:2432 [s3,r1/3:/M{in-ax},@c425017200] not quiescing: 1 pending commands
W170113 10:05:48.975037 435353 storage/replica.go:2088 [hb,s3,r1/3:/M{in-ax},@c425017200] context cancellation after 0.1s of attempting command [txn: 3a324b71], BeginTransaction [/System/NodeLiveness/1,/Min), ConditionalPut [/System/NodeLiveness/1,/Min), EndTransaction [/System/NodeLiveness/1,/Min)
E170113 10:05:48.975575 435082 storage/client_test.go:1132 [hb] result is ambiguous (context canceled)
E170113 10:05:48.975662 435082 storage/client_test.go:1132 [hb] context canceled
E170113 10:05:48.975979 435082 storage/client_test.go:1132 [hb] context canceled
I170113 10:05:49.449821 428471 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:60336->127.0.0.1:42905: use of closed network connection
I170113 10:05:49.449952 427669 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:33925->127.0.0.1:46456: use of closed network connection
I170113 10:05:49.450067 426902 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:59802->127.0.0.1:39489: use of closed network connection
``` | 1.0 | github.com/cockroachdb/cockroach/pkg/storage: TestRefreshPendingCommands failed under stress - SHA: https://github.com/cockroachdb/cockroach/commits/6dbac80561b565d2f5e8304b9f36c4ed9112f0f2
Parameters:
```
COCKROACH_PROPOSER_EVALUATED_KV=true
TAGS=deadlock
GOFLAGS=
```
Stress build found a failed test: https://teamcity.cockroachdb.com/viewLog.html?buildId=117253&tab=buildLog
```
I170113 10:05:48.200608 426585 storage/store.go:1250 [n1,s1]: failed initial metrics computation: [n1,s1]: system config not yet available
I170113 10:05:48.200928 426585 gossip/gossip.go:292 [n1] NodeDescriptor set to node_id:1 address:<network_field:"tcp" address_field:"127.0.0.1:59802" > attrs:<> locality:<>
W170113 10:05:48.214283 426585 gossip/gossip.go:1138 [n?] no incoming or outgoing connections
I170113 10:05:48.216986 427677 gossip/client.go:125 [n2] started gossip client to 127.0.0.1:59802
I170113 10:05:48.217608 426585 storage/store.go:1250 [n2,s2]: failed initial metrics computation: [n2,s2]: system config not yet available
I170113 10:05:48.217741 426585 gossip/gossip.go:292 [n2] NodeDescriptor set to node_id:2 address:<network_field:"tcp" address_field:"127.0.0.1:33925" > attrs:<> locality:<>
W170113 10:05:48.225063 426585 gossip/gossip.go:1138 [n?] no incoming or outgoing connections
I170113 10:05:48.227954 428483 gossip/client.go:125 [n3] started gossip client to 127.0.0.1:59802
I170113 10:05:48.228099 426585 storage/store.go:1250 [n3,s3]: failed initial metrics computation: [n3,s3]: system config not yet available
I170113 10:05:48.228194 426585 gossip/gossip.go:292 [n3] NodeDescriptor set to node_id:3 address:<network_field:"tcp" address_field:"127.0.0.1:60336" > attrs:<> locality:<>
I170113 10:05:48.234485 426585 storage/replica_raftstorage.go:410 [s1,r1/1:/M{in-ax},@c428b44c00] generated preemptive snapshot 1124dfc0 at index 15
I170113 10:05:48.235907 426585 storage/store.go:3275 [s1,r1/1:/M{in-ax},@c428b44c00] streamed snapshot: kv pairs: 34, log entries: 5, 1ms
I170113 10:05:48.236469 429173 storage/replica_raftstorage.go:575 [s2,r1/?:{-},@c42a55e000] applying preemptive snapshot at index 15 (id=1124dfc0, encoded size=6643, 1 rocksdb batches, 5 log entries)
I170113 10:05:48.237030 429173 storage/replica_raftstorage.go:583 [s2,r1/?:/M{in-ax},@c42a55e000] applied preemptive snapshot in 0ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I170113 10:05:48.238325 426585 storage/replica_command.go:3210 [s1,r1/1:/M{in-ax},@c428b44c00] change replicas (remove {2 2 2}): read existing descriptor range_id:1 start_key:"" end_key:"\377\377" replicas:<node_id:1 store_id:1 replica_id:1 > next_replica_id:2
I170113 10:05:48.242318 429428 storage/replica.go:2371 [s1,r1/1:/M{in-ax},@c428b44c00] proposing ADD_REPLICA {NodeID:2 StoreID:2 ReplicaID:2}: [{NodeID:1 StoreID:1 ReplicaID:1} {NodeID:2 StoreID:2 ReplicaID:2}]
I170113 10:05:48.246949 426585 storage/replica_raftstorage.go:410 [s1,r1/1:/M{in-ax},@c428b44c00] generated preemptive snapshot a7d765b2 at index 17
I170113 10:05:48.247585 426585 storage/store.go:3275 [s1,r1/1:/M{in-ax},@c428b44c00] streamed snapshot: kv pairs: 37, log entries: 7, 0ms
I170113 10:05:48.248796 429017 storage/replica_raftstorage.go:575 [s3,r1/?:{-},@c428b26900] applying preemptive snapshot at index 17 (id=a7d765b2, encoded size=8825, 1 rocksdb batches, 7 log entries)
I170113 10:05:48.249527 429017 storage/replica_raftstorage.go:583 [s3,r1/?:/M{in-ax},@c428b26900] applied preemptive snapshot in 1ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I170113 10:05:48.250671 426585 storage/replica_command.go:3210 [s1,r1/1:/M{in-ax},@c428b44c00] change replicas (remove {3 3 3}): read existing descriptor range_id:1 start_key:"" end_key:"\377\377" replicas:<node_id:1 store_id:1 replica_id:1 > replicas:<node_id:2 store_id:2 replica_id:2 > next_replica_id:3
I170113 10:05:48.252947 430063 storage/raft_transport.go:437 raft transport stream to node 1 established
I170113 10:05:48.258817 430374 storage/replica.go:2371 [s1,r1/1:/M{in-ax},@c428b44c00] proposing ADD_REPLICA {NodeID:3 StoreID:3 ReplicaID:3}: [{NodeID:1 StoreID:1 ReplicaID:1} {NodeID:2 StoreID:2 ReplicaID:2} {NodeID:3 StoreID:3 ReplicaID:3}]
W170113 10:05:48.340140 431141 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.340305 431120 storage/store.go:3135 [s1] raft error: node 3 claims to not contain store 3 for replica {3 3 3}: store 3 was not found
W170113 10:05:48.340365 431112 storage/raft_transport.go:443 raft transport stream to node 3 failed: store 3 was not found
I170113 10:05:48.342277 432207 storage/raft_transport.go:437 raft transport stream to node 3 established
W170113 10:05:48.343660 432249 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.343800 432249 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.343888 432234 storage/store.go:3135 [s1] raft error: node 3 claims to not contain store 3 for replica {3 3 3}: store 3 was not found
I170113 10:05:48.346673 432512 storage/raft_transport.go:437 raft transport stream to node 3 established
W170113 10:05:48.347194 430083 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:2 StoreID:2 ReplicaID:2}: no handler registered for {NodeID:1 StoreID:1 ReplicaID:1}
W170113 10:05:48.347432 432567 storage/raft_transport.go:258 unable to accept Raft message from {NodeID:1 StoreID:1 ReplicaID:1}: no handler registered for {NodeID:3 StoreID:3 ReplicaID:3}
W170113 10:05:48.347504 430085 storage/store.go:3135 [s2] raft error: node 1 claims to not contain store 1 for replica {1 1 1}: store 1 was not found
W170113 10:05:48.347618 430063 storage/raft_transport.go:443 raft transport stream to node 1 failed: store 1 was not found
W170113 10:05:48.347713 432573 storage/raft_transport.go:478 no handler found for store 1 in response range_id:1 from_replica:<node_id:3 store_id:3 replica_id:3 > to_replica:<node_id:1 store_id:1 replica_id:1 > union:<error:<message:"store 3 was not found" transaction_restart:NONE origin_node:0 detail:<store_not_found:<store_id:3 > > now:<wall_time:0 logical:0 > > >
I170113 10:05:48.718120 434258 storage/raft_transport.go:437 raft transport stream to node 1 established
I170113 10:05:48.721683 427858 storage/replica_raftstorage.go:410 [raftsnapshot,s2,r1/2:/M{in-ax},@c42a55e000] generated Raft snapshot 3ddb665c at index 23
I170113 10:05:48.722379 427858 storage/store.go:3275 [raftsnapshot,s2,r1/2:/M{in-ax},@c42a55e000] streamed snapshot: kv pairs: 42, log entries: 2, 0ms
I170113 10:05:48.722767 434441 storage/replica_raftstorage.go:575 [s3,r1/3:/M{in-ax},@c425017200] applying Raft snapshot at index 23 (id=3ddb665c, encoded size=4802, 1 rocksdb batches, 2 log entries)
I170113 10:05:48.723324 434441 storage/replica_raftstorage.go:583 [s3,r1/3:/M{in-ax},@c425017200] applied Raft snapshot in 0ms [clear=0ms batch=0ms entries=0ms commit=0ms]
I170113 10:05:48.867851 433304 storage/replica.go:2432 [s3,r1/3:/M{in-ax},@c425017200] not quiescing: 1 pending commands
W170113 10:05:48.975037 435353 storage/replica.go:2088 [hb,s3,r1/3:/M{in-ax},@c425017200] context cancellation after 0.1s of attempting command [txn: 3a324b71], BeginTransaction [/System/NodeLiveness/1,/Min), ConditionalPut [/System/NodeLiveness/1,/Min), EndTransaction [/System/NodeLiveness/1,/Min)
E170113 10:05:48.975575 435082 storage/client_test.go:1132 [hb] result is ambiguous (context canceled)
E170113 10:05:48.975662 435082 storage/client_test.go:1132 [hb] context canceled
E170113 10:05:48.975979 435082 storage/client_test.go:1132 [hb] context canceled
I170113 10:05:49.449821 428471 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:60336->127.0.0.1:42905: use of closed network connection
I170113 10:05:49.449952 427669 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:33925->127.0.0.1:46456: use of closed network connection
I170113 10:05:49.450067 426902 vendor/google.golang.org/grpc/transport/http2_server.go:320 transport: http2Server.HandleStreams failed to read frame: read tcp 127.0.0.1:59802->127.0.0.1:39489: use of closed network connection
``` | test | github com cockroachdb cockroach pkg storage testrefreshpendingcommands failed under stress sha parameters cockroach proposer evaluated kv true tags deadlock goflags stress build found a failed test storage store go failed initial metrics computation system config not yet available gossip gossip go nodedescriptor set to node id address attrs locality gossip gossip go no incoming or outgoing connections gossip client go started gossip client to storage store go failed initial metrics computation system config not yet available gossip gossip go nodedescriptor set to node id address attrs locality gossip gossip go no incoming or outgoing connections gossip client go started gossip client to storage store go failed initial metrics computation system config not yet available gossip gossip go nodedescriptor set to node id address attrs locality storage replica raftstorage go generated preemptive snapshot at index storage store go streamed snapshot kv pairs log entries storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas remove read existing descriptor range id start key end key replicas next replica id storage replica go proposing add replica nodeid storeid replicaid storage replica raftstorage go generated preemptive snapshot at index storage store go streamed snapshot kv pairs log entries storage replica raftstorage go applying preemptive snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied preemptive snapshot in storage replica command go change replicas remove read existing descriptor range id start key end key replicas replicas next replica id storage raft transport go raft transport stream to node established storage replica go proposing add replica nodeid storeid replicaid storage raft transport go unable to accept raft message from nodeid storeid replicaid no handler registered for nodeid storeid replicaid storage store go raft error node claims to not contain store for replica store was not found storage raft transport go raft transport stream to node failed store was not found storage raft transport go raft transport stream to node established storage raft transport go unable to accept raft message from nodeid storeid replicaid no handler registered for nodeid storeid replicaid storage raft transport go unable to accept raft message from nodeid storeid replicaid no handler registered for nodeid storeid replicaid storage store go raft error node claims to not contain store for replica store was not found storage raft transport go raft transport stream to node established storage raft transport go unable to accept raft message from nodeid storeid replicaid no handler registered for nodeid storeid replicaid storage raft transport go unable to accept raft message from nodeid storeid replicaid no handler registered for nodeid storeid replicaid storage store go raft error node claims to not contain store for replica store was not found storage raft transport go raft transport stream to node failed store was not found storage raft transport go no handler found for store in response range id from replica to replica union now storage raft transport go raft transport stream to node established storage replica raftstorage go generated raft snapshot at index storage store go streamed snapshot kv pairs log entries storage replica raftstorage go applying raft snapshot at index id encoded size rocksdb batches log entries storage replica raftstorage go applied raft snapshot in storage replica go not quiescing pending commands storage replica go context cancellation after of attempting command begintransaction system nodeliveness min conditionalput system nodeliveness min endtransaction system nodeliveness min storage client test go result is ambiguous context canceled storage client test go context canceled storage client test go context canceled vendor google golang org grpc transport server go transport handlestreams failed to read frame read tcp use of closed network connection vendor google golang org grpc transport server go transport handlestreams failed to read frame read tcp use of closed network connection vendor google golang org grpc transport server go transport handlestreams failed to read frame read tcp use of closed network connection | 1 |
566,529 | 16,823,614,438 | IssuesEvent | 2021-06-17 15:41:23 | betagouv/service-national-universel | https://api.github.com/repos/betagouv/service-national-universel | closed | Elastic Search search problem | priority-HIGH | - Les recherches elastic search ne sont pas try/catch = gros problème.
- Ces erreurs viennent du fait qu'il y a bcp trop de requêtes faites à un instant t levant une net::ERR_INSUFFICIENT_RESOURCES
- Sans try/catch l'application crash

| 1.0 | Elastic Search search problem - - Les recherches elastic search ne sont pas try/catch = gros problème.
- Ces erreurs viennent du fait qu'il y a bcp trop de requêtes faites à un instant t levant une net::ERR_INSUFFICIENT_RESOURCES
- Sans try/catch l'application crash

| non_test | elastic search search problem les recherches elastic search ne sont pas try catch gros problème ces erreurs viennent du fait qu il y a bcp trop de requêtes faites à un instant t levant une net err insufficient resources sans try catch l application crash | 0 |
270,681 | 20,605,338,755 | IssuesEvent | 2022-03-06 22:05:50 | dartmouth-cs98/21f-visuol | https://api.github.com/repos/dartmouth-cs98/21f-visuol | closed | Final Success Metrics and Validation Plan | documentation | # Success Metrics and Validation Plan
## Team Goals
*Discuss and identify any team goals that you think would be important to tackle*
* We finish complex features of comparison tools, health insurance glossary, and negotiation
* We want to publish this on the web
* We want to hopefully finish our stretch goal of adding housing
## Success Metrics
*what are some success metrics that you might use for your product / customers / your team / your cs98?*
* We want to have 50 plus users
* write a medium post about our product and it gets 10 plus claps
* every teammate feels like their individual goals for the course were accomplished
## Validation Plan
*how do you get from goals to success metrics that are validated? this can be user testing, performance metrics, a public demo, etc. this is your implementation plan to gather the above.*
* User testing among our peers
* Have at least 20 people say our project helped them better understand their offer
* Create a demo video
* Put up posters in Novack | 1.0 | Final Success Metrics and Validation Plan - # Success Metrics and Validation Plan
## Team Goals
*Discuss and identify any team goals that you think would be important to tackle*
* We finish complex features of comparison tools, health insurance glossary, and negotiation
* We want to publish this on the web
* We want to hopefully finish our stretch goal of adding housing
## Success Metrics
*what are some success metrics that you might use for your product / customers / your team / your cs98?*
* We want to have 50 plus users
* write a medium post about our product and it gets 10 plus claps
* every teammate feels like their individual goals for the course were accomplished
## Validation Plan
*how do you get from goals to success metrics that are validated? this can be user testing, performance metrics, a public demo, etc. this is your implementation plan to gather the above.*
* User testing among our peers
* Have at least 20 people say our project helped them better understand their offer
* Create a demo video
* Put up posters in Novack | non_test | final success metrics and validation plan success metrics and validation plan team goals discuss and identify any team goals that you think would be important to tackle we finish complex features of comparison tools health insurance glossary and negotiation we want to publish this on the web we want to hopefully finish our stretch goal of adding housing success metrics what are some success metrics that you might use for your product customers your team your we want to have plus users write a medium post about our product and it gets plus claps every teammate feels like their individual goals for the course were accomplished validation plan how do you get from goals to success metrics that are validated this can be user testing performance metrics a public demo etc this is your implementation plan to gather the above user testing among our peers have at least people say our project helped them better understand their offer create a demo video put up posters in novack | 0 |
349,290 | 31,790,904,323 | IssuesEvent | 2023-09-13 03:10:24 | TencentBlueKing/bk-cmdb | https://api.github.com/repos/TencentBlueKing/bk-cmdb | closed | 【CMDB+3.11.2-alpha1】只更改属性配置的内容,节点信息这里的同步集群没有同步 | for test | 问题描述:
只更改属性配置的内容,节点信息这里的同步集群没有同步
一,前置条件:
集群模板属性设置做了变更

二,操作步骤:
1.点击实例对应的所属拓扑

2.点击节点信息

三,预期结果:
同步集群显示待同步状态
四,实际结果:
“同步集群”没有实时更新 | 1.0 | 【CMDB+3.11.2-alpha1】只更改属性配置的内容,节点信息这里的同步集群没有同步 - 问题描述:
只更改属性配置的内容,节点信息这里的同步集群没有同步
一,前置条件:
集群模板属性设置做了变更

二,操作步骤:
1.点击实例对应的所属拓扑

2.点击节点信息

三,预期结果:
同步集群显示待同步状态
四,实际结果:
“同步集群”没有实时更新 | test | 【cmdb 】只更改属性配置的内容,节点信息这里的同步集群没有同步 问题描述: 只更改属性配置的内容,节点信息这里的同步集群没有同步 一,前置条件: 集群模板属性设置做了变更 二,操作步骤: 点击实例对应的所属拓扑 点击节点信息 三,预期结果: 同步集群显示待同步状态 四,实际结果: “同步集群”没有实时更新 | 1 |
272,945 | 20,764,701,039 | IssuesEvent | 2022-03-15 19:30:17 | zulip/zulip | https://api.github.com/repos/zulip/zulip | closed | Fix user group management help center docs | area: documentation (user) in progress release goal | 1. https://zulip.com/help/restrict-user-group-management should be incorporated as a section into https://zulip.com/help/user-groups; it's not worth having a separate page.
2. Both of these pages currently make incorrect statements about what kinds of user group management permissions are possible, which should be fixed.
| 1.0 | Fix user group management help center docs - 1. https://zulip.com/help/restrict-user-group-management should be incorporated as a section into https://zulip.com/help/user-groups; it's not worth having a separate page.
2. Both of these pages currently make incorrect statements about what kinds of user group management permissions are possible, which should be fixed.
| non_test | fix user group management help center docs should be incorporated as a section into it s not worth having a separate page both of these pages currently make incorrect statements about what kinds of user group management permissions are possible which should be fixed | 0 |
119,220 | 10,029,043,281 | IssuesEvent | 2019-07-17 13:08:09 | ethereum/solidity | https://api.github.com/repos/ethereum/solidity | closed | [Testing] Add bools and arrays of bools and dynamic byte arrays to abiv2 protobuf spec | testing :hammer: | ## Abstract
This issue tracks support for
- bools
- arrays of bools and dynamic byte arrays (`bytes` and `string` types)
in the abiv2 coder protobuf specification | 1.0 | [Testing] Add bools and arrays of bools and dynamic byte arrays to abiv2 protobuf spec - ## Abstract
This issue tracks support for
- bools
- arrays of bools and dynamic byte arrays (`bytes` and `string` types)
in the abiv2 coder protobuf specification | test | add bools and arrays of bools and dynamic byte arrays to protobuf spec abstract this issue tracks support for bools arrays of bools and dynamic byte arrays bytes and string types in the coder protobuf specification | 1 |
16,449 | 3,521,950,406 | IssuesEvent | 2016-01-13 06:20:09 | sIKE23/Mage-Wars | https://api.github.com/repos/sIKE23/Mage-Wars | closed | Player Request: Make the action markers in octgn a bit more colored at the edge when flipped to the white star side | Enhancement In/Needs Testing Low | http://forum.arcanewonders.com/index.php?topic=16239.msg60647#new | 1.0 | Player Request: Make the action markers in octgn a bit more colored at the edge when flipped to the white star side - http://forum.arcanewonders.com/index.php?topic=16239.msg60647#new | test | player request make the action markers in octgn a bit more colored at the edge when flipped to the white star side | 1 |
41,454 | 5,357,068,500 | IssuesEvent | 2017-02-20 17:15:57 | TEAMMATES/teammates | https://api.github.com/repos/TEAMMATES/teammates | closed | InstructorStudentListPageUiTest failing on live server | a-Testing f-Courses p.Medium | v5.96
```
java.lang.AssertionError: expected:<CCSDetailsUiT.alice.tmms@gmail.tmt
benny.c.tmms@gmail.tmt
hugh.i.tmms@gmail.tmt
ivan.j.tmms@gmail.tmt
jack.k.tmms@gmail.tmt
CCSDetailsUiT.charlie.tmms@gmail.tmt
denny.c.tmms@gmail.tmt> but was:<CCSDetailsUiT.alice.tmms@gmail.tmt
benny.c.tmms@gmail.tmt
hugh.i.tmms@gmail.tmt
ivan.j.tmms@gmail.tmt
jack.k.tmms@gmail.tmt
CCSDetailsUiT.charlie.tmms@gmail.tmt
denny.c.tmms@gmail.tmt>
at org.testng.AssertJUnit.assertEquals(AssertJUnit.java:101)
at org.testng.AssertJUnit.assertEquals(AssertJUnit.java:108)
at teammates.test.cases.BaseTestCase.assertEquals(BaseTestCase.java:133)
at teammates.test.cases.browsertests.InstructorStudentListPageUiTest.testContent(InstructorStudentListPageUiTest.java:110)
at teammates.test.cases.browsertests.InstructorStudentListPageUiTest.testAll(InstructorStudentListPageUiTest.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:85)
at org.testng.internal.Invoker.invokeMethod(Invoker.java:659)
at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:845)
at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1153)
at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:125)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
at org.testng.TestRunner.privateRun(TestRunner.java:771)
at org.testng.TestRunner.run(TestRunner.java:621)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:357)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:352)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:310)
at org.testng.SuiteRunner.run(SuiteRunner.java:259)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1199)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1124)
at org.testng.TestNG.run(TestNG.java:1032)
at org.testng.remote.RemoteTestNG.run(RemoteTestNG.java:111)
at org.testng.remote.RemoteTestNG.initAndRun(RemoteTestNG.java:204)
at org.testng.remote.RemoteTestNG.main(RemoteTestNG.java:175)
```
| 1.0 | InstructorStudentListPageUiTest failing on live server - v5.96
```
java.lang.AssertionError: expected:<CCSDetailsUiT.alice.tmms@gmail.tmt
benny.c.tmms@gmail.tmt
hugh.i.tmms@gmail.tmt
ivan.j.tmms@gmail.tmt
jack.k.tmms@gmail.tmt
CCSDetailsUiT.charlie.tmms@gmail.tmt
denny.c.tmms@gmail.tmt> but was:<CCSDetailsUiT.alice.tmms@gmail.tmt
benny.c.tmms@gmail.tmt
hugh.i.tmms@gmail.tmt
ivan.j.tmms@gmail.tmt
jack.k.tmms@gmail.tmt
CCSDetailsUiT.charlie.tmms@gmail.tmt
denny.c.tmms@gmail.tmt>
at org.testng.AssertJUnit.assertEquals(AssertJUnit.java:101)
at org.testng.AssertJUnit.assertEquals(AssertJUnit.java:108)
at teammates.test.cases.BaseTestCase.assertEquals(BaseTestCase.java:133)
at teammates.test.cases.browsertests.InstructorStudentListPageUiTest.testContent(InstructorStudentListPageUiTest.java:110)
at teammates.test.cases.browsertests.InstructorStudentListPageUiTest.testAll(InstructorStudentListPageUiTest.java:52)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.testng.internal.MethodInvocationHelper.invokeMethod(MethodInvocationHelper.java:85)
at org.testng.internal.Invoker.invokeMethod(Invoker.java:659)
at org.testng.internal.Invoker.invokeTestMethod(Invoker.java:845)
at org.testng.internal.Invoker.invokeTestMethods(Invoker.java:1153)
at org.testng.internal.TestMethodWorker.invokeTestMethods(TestMethodWorker.java:125)
at org.testng.internal.TestMethodWorker.run(TestMethodWorker.java:108)
at org.testng.TestRunner.privateRun(TestRunner.java:771)
at org.testng.TestRunner.run(TestRunner.java:621)
at org.testng.SuiteRunner.runTest(SuiteRunner.java:357)
at org.testng.SuiteRunner.runSequentially(SuiteRunner.java:352)
at org.testng.SuiteRunner.privateRun(SuiteRunner.java:310)
at org.testng.SuiteRunner.run(SuiteRunner.java:259)
at org.testng.SuiteRunnerWorker.runSuite(SuiteRunnerWorker.java:52)
at org.testng.SuiteRunnerWorker.run(SuiteRunnerWorker.java:86)
at org.testng.TestNG.runSuitesSequentially(TestNG.java:1199)
at org.testng.TestNG.runSuitesLocally(TestNG.java:1124)
at org.testng.TestNG.run(TestNG.java:1032)
at org.testng.remote.RemoteTestNG.run(RemoteTestNG.java:111)
at org.testng.remote.RemoteTestNG.initAndRun(RemoteTestNG.java:204)
at org.testng.remote.RemoteTestNG.main(RemoteTestNG.java:175)
```
| test | instructorstudentlistpageuitest failing on live server java lang assertionerror expected ccsdetailsuit alice tmms gmail tmt benny c tmms gmail tmt hugh i tmms gmail tmt ivan j tmms gmail tmt jack k tmms gmail tmt ccsdetailsuit charlie tmms gmail tmt denny c tmms gmail tmt but was ccsdetailsuit alice tmms gmail tmt benny c tmms gmail tmt hugh i tmms gmail tmt ivan j tmms gmail tmt jack k tmms gmail tmt ccsdetailsuit charlie tmms gmail tmt denny c tmms gmail tmt at org testng assertjunit assertequals assertjunit java at org testng assertjunit assertequals assertjunit java at teammates test cases basetestcase assertequals basetestcase java at teammates test cases browsertests instructorstudentlistpageuitest testcontent instructorstudentlistpageuitest java at teammates test cases browsertests instructorstudentlistpageuitest testall instructorstudentlistpageuitest java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at org testng internal methodinvocationhelper invokemethod methodinvocationhelper java at org testng internal invoker invokemethod invoker java at org testng internal invoker invoketestmethod invoker java at org testng internal invoker invoketestmethods invoker java at org testng internal testmethodworker invoketestmethods testmethodworker java at org testng internal testmethodworker run testmethodworker java at org testng testrunner privaterun testrunner java at org testng testrunner run testrunner java at org testng suiterunner runtest suiterunner java at org testng suiterunner runsequentially suiterunner java at org testng suiterunner privaterun suiterunner java at org testng suiterunner run suiterunner java at org testng suiterunnerworker runsuite suiterunnerworker java at org testng suiterunnerworker run suiterunnerworker java at org testng testng runsuitessequentially testng java at org testng testng runsuiteslocally testng java at org testng testng run testng java at org testng remote remotetestng run remotetestng java at org testng remote remotetestng initandrun remotetestng java at org testng remote remotetestng main remotetestng java | 1 |
292,036 | 21,947,434,446 | IssuesEvent | 2022-05-24 03:12:19 | ploomber/ploomber | https://api.github.com/repos/ploomber/ploomber | closed | copying code in console snippets ignores line break | documentation good first issue | when clicking on console snippets:

https://docs.ploomber.io/en/docs/use-cases/research.html
The contents are copied to the clipboard, but the line break is ignored, causing the clipboard to contain:
```
pip install ploomberploomber examples -n cookbook/grid -o grid
```
when it should be:
```
pip install ploomber
ploomber examples -n cookbook/grid -o grid
```
| 1.0 | copying code in console snippets ignores line break - when clicking on console snippets:

https://docs.ploomber.io/en/docs/use-cases/research.html
The contents are copied to the clipboard, but the line break is ignored, causing the clipboard to contain:
```
pip install ploomberploomber examples -n cookbook/grid -o grid
```
when it should be:
```
pip install ploomber
ploomber examples -n cookbook/grid -o grid
```
| non_test | copying code in console snippets ignores line break when clicking on console snippets the contents are copied to the clipboard but the line break is ignored causing the clipboard to contain pip install ploomberploomber examples n cookbook grid o grid when it should be pip install ploomber ploomber examples n cookbook grid o grid | 0 |
16,036 | 3,494,263,786 | IssuesEvent | 2016-01-05 09:30:36 | paperjs/paper.js | https://api.github.com/repos/paperjs/paper.js | closed | Intersections of rectangles don't make sense | boolean-operations unit-test-missing | Rectangles sharing two parallel co-linear segments don't intersect in the way I'd expect.
Here's a [sketch](http://sketch.paperjs.org/#S/xVffa9swEP5XhF9qU8/YaU0hWTdGGWUw2Fgf9tDkQbHlWI0jBVlpWUv+953kOPKvMMGczbTBPp1O33336WS/OQxviDN1HtZEJrnjOwlP1fNW8CeSyCApCBauN5uzOct2LJGUM1Tm/OVe0NSlTBLxjAsflfSVeOhtzhBcz1igleC7LbpFjLyge3V/CKLGMy6Qq5woONQxZvD0XsdRd5dm4BhWXTpsgNP0LqdFKghzH82gutR637HMg6+UEVc/cYjkhj6ino+MQa2kbJ5vF4D6KGwFoIesPTN/oXJUN/s61QpvKQVfk580lTlkHM36Y3e8AFIquvS9AhwGsdfyTXGZfxIC/wLPx9hHk0U1rH8EkTvBKk8w73sl+6IILYk2lG6Bl0QVLoL/Sat0tOkHK5VRsCKyPRumHJAl8MgLEhR8VcVEl2jufEDdKVOw+u3YRhEtcwD6+IyT3K3Ru83hlhxOLU7Lb89EFHj7sbtocBxyoZpqdKuqedvz0+Y6yZoabTyUSYvkjgrYIu5bW0MJUZGmAwE7YhM4pTugJu7YG6KYtjQR+pGvZGHc90fNGTqPVayqd4TRKJu2dcR3sSxwsr4Y8DDSRQdlVXucpmqFuh3EoNk4DCscygE3qfoBADBbGbY0IVMQsp62OFCg9hQYo47VwDBsnWAJFB1Wf5qovcGztMJzMzIemBj18fR35NzJuaCvnElcwG5OZKlEiX20NBkk1oxGvRS0NRyZ0tQe0E08HiB7TmGzS5o0GU18lFb+VQrEKoVJCKtNwp4uwjPoNLNDNIlHRjTEahYILrEk7rvr+DTLkhaSpIZjOFqzJscrq4yuwsGtdw6KcztAiuKebkdmeJjRJYeOm1JRn8M8y0oiEa8OL8P0ykd5k2lqvSGvhqUzeot4suu6wDTQ/T+oPrYIykqagnQbffgU7fD299SkfW23ZTXvY2Z5mvbirxAp4Z+bd5kLQpCivIQ3uQLetwH2keG1jwqTzcZuv8b/oklvei1RIWR2CJXO+0fhyDpnDYTwzRJUL6R/aDbdHr6BmSY9bn9KXvcKcI6Xua39IdkHdNPu62eRN2eVuAe0zeGzw5vB5/ZSELzWWEtn+rjY/wY=)
And the code for reference:
```
project.clear();
function showGrid(interval, size) {
var group = new Group();
for (var i = interval; i < size; i += interval) {
group.addChildren([
new Path.Line(new Point(0, i), new Point(size, i)),
new Path.Line(new Point(i, 0), new Point(i, size))
]);
}
group.strokeWidth = 1;
group.strokeColor = new Color(0, 0.5);
group.dashArray = [5, 2];
return group;
}
function showIntersections(label, s1, s2) {
var intersections = s1.getIntersections(s2);
console.log(label + "> getIntersections: ", intersections);
intersections.forEach(function(intersection) {
console.log(label + "> isOverlap?", intersection.isOverlap(), ", point=", intersection.point);
var point = new Path.Circle({
center: intersection.point,
radius: 5,
strokeColor: new Color(0,0,1,0.5)
});
});
var inter = s1.intersect(s2);
inter.strokeColor = 'black';
inter.strokeWidth = 1;
}
var grid = showGrid(50, 500);
var a = new Path.Rectangle({
point: [50, 50],
size: [150, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var b = new Path.Rectangle({
point: [70, 50],
size: [150, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("horizontal rects", a, b);
var c = new Path.Rectangle({
point: [50, 150],
size: [50, 100],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var d = new Path.Rectangle({
point: [50, 175],
size: [50, 100],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("vertical rects", c, d);
var e = new Path.Rectangle({
point: [200, 200],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var f = new Path.Rectangle({
point: [225, 200],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
f.rotate(-45);
showIntersections("tilted rects", e, f);
var g = new Path.Rectangle({
point: [300, 50],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var h = new Path.Rectangle({
point: [325, 75],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("both directions offset overlap rects", g, h);
var i = new Path.Rectangle({
point: [50, 300],
size: [100, 100],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var j = new Path.Rectangle({
point: [75, 325],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("vertical inside, horizontal offset overlap rects", i, j);
var k = new Path.Rectangle({
point: [250, 325],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var l = new Path.Rectangle({
point: [250, 325],
size: [125, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("three sides colinear rects", k, l);
var m = new Path.Rectangle({
point: [350, 200],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
m.rotate(-45);
var n = new Path.Rectangle({
point: [375, 175],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
n.rotate(-45, n.center);
showIntersections("both tilted rects", m, n);
var o = new Path.Rectangle({
point: [200, 400],
size: [50, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var p = new Path.Rectangle({
point: [225, 400],
size: [75, 75],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("one side colinear rects", o, p);
```
Am I just completely misunderstanding how this works? As an example, here's a [fiddle](https://jsfiddle.net/gg2rw8p4/) using basic canvas detecting a similar overlapping rectangular area (as the top-left set of rectangles). | 1.0 | Intersections of rectangles don't make sense - Rectangles sharing two parallel co-linear segments don't intersect in the way I'd expect.
Here's a [sketch](http://sketch.paperjs.org/#S/xVffa9swEP5XhF9qU8/YaU0hWTdGGWUw2Fgf9tDkQbHlWI0jBVlpWUv+953kOPKvMMGczbTBPp1O33336WS/OQxviDN1HtZEJrnjOwlP1fNW8CeSyCApCBauN5uzOct2LJGUM1Tm/OVe0NSlTBLxjAsflfSVeOhtzhBcz1igleC7LbpFjLyge3V/CKLGMy6Qq5woONQxZvD0XsdRd5dm4BhWXTpsgNP0LqdFKghzH82gutR637HMg6+UEVc/cYjkhj6ino+MQa2kbJ5vF4D6KGwFoIesPTN/oXJUN/s61QpvKQVfk580lTlkHM36Y3e8AFIquvS9AhwGsdfyTXGZfxIC/wLPx9hHk0U1rH8EkTvBKk8w73sl+6IILYk2lG6Bl0QVLoL/Sat0tOkHK5VRsCKyPRumHJAl8MgLEhR8VcVEl2jufEDdKVOw+u3YRhEtcwD6+IyT3K3Ru83hlhxOLU7Lb89EFHj7sbtocBxyoZpqdKuqedvz0+Y6yZoabTyUSYvkjgrYIu5bW0MJUZGmAwE7YhM4pTugJu7YG6KYtjQR+pGvZGHc90fNGTqPVayqd4TRKJu2dcR3sSxwsr4Y8DDSRQdlVXucpmqFuh3EoNk4DCscygE3qfoBADBbGbY0IVMQsp62OFCg9hQYo47VwDBsnWAJFB1Wf5qovcGztMJzMzIemBj18fR35NzJuaCvnElcwG5OZKlEiX20NBkk1oxGvRS0NRyZ0tQe0E08HiB7TmGzS5o0GU18lFb+VQrEKoVJCKtNwp4uwjPoNLNDNIlHRjTEahYILrEk7rvr+DTLkhaSpIZjOFqzJscrq4yuwsGtdw6KcztAiuKebkdmeJjRJYeOm1JRn8M8y0oiEa8OL8P0ykd5k2lqvSGvhqUzeot4suu6wDTQ/T+oPrYIykqagnQbffgU7fD299SkfW23ZTXvY2Z5mvbirxAp4Z+bd5kLQpCivIQ3uQLetwH2keG1jwqTzcZuv8b/oklvei1RIWR2CJXO+0fhyDpnDYTwzRJUL6R/aDbdHr6BmSY9bn9KXvcKcI6Xua39IdkHdNPu62eRN2eVuAe0zeGzw5vB5/ZSELzWWEtn+rjY/wY=)
And the code for reference:
```
project.clear();
function showGrid(interval, size) {
var group = new Group();
for (var i = interval; i < size; i += interval) {
group.addChildren([
new Path.Line(new Point(0, i), new Point(size, i)),
new Path.Line(new Point(i, 0), new Point(i, size))
]);
}
group.strokeWidth = 1;
group.strokeColor = new Color(0, 0.5);
group.dashArray = [5, 2];
return group;
}
function showIntersections(label, s1, s2) {
var intersections = s1.getIntersections(s2);
console.log(label + "> getIntersections: ", intersections);
intersections.forEach(function(intersection) {
console.log(label + "> isOverlap?", intersection.isOverlap(), ", point=", intersection.point);
var point = new Path.Circle({
center: intersection.point,
radius: 5,
strokeColor: new Color(0,0,1,0.5)
});
});
var inter = s1.intersect(s2);
inter.strokeColor = 'black';
inter.strokeWidth = 1;
}
var grid = showGrid(50, 500);
var a = new Path.Rectangle({
point: [50, 50],
size: [150, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var b = new Path.Rectangle({
point: [70, 50],
size: [150, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("horizontal rects", a, b);
var c = new Path.Rectangle({
point: [50, 150],
size: [50, 100],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var d = new Path.Rectangle({
point: [50, 175],
size: [50, 100],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("vertical rects", c, d);
var e = new Path.Rectangle({
point: [200, 200],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var f = new Path.Rectangle({
point: [225, 200],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
f.rotate(-45);
showIntersections("tilted rects", e, f);
var g = new Path.Rectangle({
point: [300, 50],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var h = new Path.Rectangle({
point: [325, 75],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("both directions offset overlap rects", g, h);
var i = new Path.Rectangle({
point: [50, 300],
size: [100, 100],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var j = new Path.Rectangle({
point: [75, 325],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("vertical inside, horizontal offset overlap rects", i, j);
var k = new Path.Rectangle({
point: [250, 325],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var l = new Path.Rectangle({
point: [250, 325],
size: [125, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("three sides colinear rects", k, l);
var m = new Path.Rectangle({
point: [350, 200],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
m.rotate(-45);
var n = new Path.Rectangle({
point: [375, 175],
size: [100, 50],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
n.rotate(-45, n.center);
showIntersections("both tilted rects", m, n);
var o = new Path.Rectangle({
point: [200, 400],
size: [50, 50],
strokeWidth: 5,
strokeColor: new Color(1, 0, 0, 0.5)
});
var p = new Path.Rectangle({
point: [225, 400],
size: [75, 75],
strokeWidth: 5,
strokeColor: new Color(0, 1, 0, 0.5)
});
showIntersections("one side colinear rects", o, p);
```
Am I just completely misunderstanding how this works? As an example, here's a [fiddle](https://jsfiddle.net/gg2rw8p4/) using basic canvas detecting a similar overlapping rectangular area (as the top-left set of rectangles). | test | intersections of rectangles don t make sense rectangles sharing two parallel co linear segments don t intersect in the way i d expect here s a and the code for reference project clear function showgrid interval size var group new group for var i interval i size i interval group addchildren new path line new point i new point size i new path line new point i new point i size group strokewidth group strokecolor new color group dasharray return group function showintersections label var intersections getintersections console log label getintersections intersections intersections foreach function intersection console log label isoverlap intersection isoverlap point intersection point var point new path circle center intersection point radius strokecolor new color var inter intersect inter strokecolor black inter strokewidth var grid showgrid var a new path rectangle point size strokewidth strokecolor new color var b new path rectangle point size strokewidth strokecolor new color showintersections horizontal rects a b var c new path rectangle point size strokewidth strokecolor new color var d new path rectangle point size strokewidth strokecolor new color showintersections vertical rects c d var e new path rectangle point size strokewidth strokecolor new color var f new path rectangle point size strokewidth strokecolor new color f rotate showintersections tilted rects e f var g new path rectangle point size strokewidth strokecolor new color var h new path rectangle point size strokewidth strokecolor new color showintersections both directions offset overlap rects g h var i new path rectangle point size strokewidth strokecolor new color var j new path rectangle point size strokewidth strokecolor new color showintersections vertical inside horizontal offset overlap rects i j var k new path rectangle point size strokewidth strokecolor new color var l new path rectangle point size strokewidth strokecolor new color showintersections three sides colinear rects k l var m new path rectangle point size strokewidth strokecolor new color m rotate var n new path rectangle point size strokewidth strokecolor new color n rotate n center showintersections both tilted rects m n var o new path rectangle point size strokewidth strokecolor new color var p new path rectangle point size strokewidth strokecolor new color showintersections one side colinear rects o p am i just completely misunderstanding how this works as an example here s a using basic canvas detecting a similar overlapping rectangular area as the top left set of rectangles | 1 |
196,392 | 14,857,661,108 | IssuesEvent | 2021-01-18 15:43:21 | VolmitSoftware/Iris | https://api.github.com/repos/VolmitSoftware/Iris | closed | trees partitially cutted off at a side in 1.2.5 | Bug Needs Testing | **Describe the bug**
some trees at biomeborders are some times missing parts of the side, i only have seen it till now for birches
**To Reproduce**
search biomeborder birchforrest
**Screenshots or Video Recordings**
If applicable, add screenshots or video recordings to help explain your problem.


**Server and Plugin Informations**
- Iris Version: 1.2.5
- Server Platform and Version [eg: PaperSpigot 1.16.3 #240]: paper 1.16.4
| 1.0 | trees partitially cutted off at a side in 1.2.5 - **Describe the bug**
some trees at biomeborders are some times missing parts of the side, i only have seen it till now for birches
**To Reproduce**
search biomeborder birchforrest
**Screenshots or Video Recordings**
If applicable, add screenshots or video recordings to help explain your problem.


**Server and Plugin Informations**
- Iris Version: 1.2.5
- Server Platform and Version [eg: PaperSpigot 1.16.3 #240]: paper 1.16.4
| test | trees partitially cutted off at a side in describe the bug some trees at biomeborders are some times missing parts of the side i only have seen it till now for birches to reproduce search biomeborder birchforrest screenshots or video recordings if applicable add screenshots or video recordings to help explain your problem server and plugin informations iris version server platform and version paper | 1 |
49,021 | 12,267,704,034 | IssuesEvent | 2020-05-07 11:10:46 | armbian/build | https://api.github.com/repos/armbian/build | opened | Build options LIB_TAG is ignored | bug build scripts | I tried to build an image based on a branch (e.g AR-159) and I just realized that LIB_TAG param is not taken in consideration anymore. So it seems I have to merge first into master localy.
I haven't looked into the build script yet to find the reason.
Wondering if it is a known issue. I remember the build script use to check out by default into master branch and do a git pull. | 1.0 | Build options LIB_TAG is ignored - I tried to build an image based on a branch (e.g AR-159) and I just realized that LIB_TAG param is not taken in consideration anymore. So it seems I have to merge first into master localy.
I haven't looked into the build script yet to find the reason.
Wondering if it is a known issue. I remember the build script use to check out by default into master branch and do a git pull. | non_test | build options lib tag is ignored i tried to build an image based on a branch e g ar and i just realized that lib tag param is not taken in consideration anymore so it seems i have to merge first into master localy i haven t looked into the build script yet to find the reason wondering if it is a known issue i remember the build script use to check out by default into master branch and do a git pull | 0 |
114,219 | 9,693,216,747 | IssuesEvent | 2019-05-24 15:32:04 | eclipse/openj9 | https://api.github.com/repos/eclipse/openj9 | closed | Test-sanity.system-JDK8-linux_ppc-64_cmprssptrs_le MauveMultiThreadLoadTest_0 java.security.AccessControlException: access denied | comp:test test failure | https://ci.eclipse.org/openj9/job/Test-sanity.system-JDK8-linux_ppc-64_cmprssptrs_le/234
```
01:20:29.105 - First failure detected by thread: load-2. Not creating dumps as no dump generation is requested for this load test
01:20:29.107 - Test failed
Failure num. = 1
Test number = 3033
Test details = 'Mauve[gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest]'
Suite number = 0
Thread number = 2
>>> Captured test output >>>
PASS: gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest: Default properties (number 0)
Test failed:
java.lang.ExceptionInInitializerError
at java.lang.J9VMInternals.ensureError(J9VMInternals.java:146)
at java.lang.J9VMInternals.recordInitializationFailure(J9VMInternals.java:135)
at sun.net.www.protocol.http.Handler.openConnection(Handler.java:62)
at sun.net.www.protocol.http.Handler.openConnection(Handler.java:57)
at java.net.URL.openConnection(URL.java:979)
at gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest.test_Properties(requestPropertiesTest.java:63)
at gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest.test(requestPropertiesTest.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at net.adoptopenjdk.loadTest.adaptors.MauveAdaptor.executeTest(MauveAdaptor.java:74)
at net.adoptopenjdk.loadTest.LoadTestRunner$2.run(LoadTestRunner.java:182)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:813)
Caused by: java.security.AccessControlException: access denied
at gnu.testlet.java.util.logging.Handler.TestSecurityManager.checkPermission(TestSecurityManager.java:48)
at java.util.logging.LogManager.checkPermission(LogManager.java:1586)
at java.util.logging.Logger.setParent(Logger.java:2035)
at java.util.logging.LogManager$7.run(LogManager.java:1205)
at java.security.AccessController.doPrivileged(AccessController.java:647)
at java.util.logging.LogManager.doSetParent(LogManager.java:1202)
at java.util.logging.LogManager.access$1200(LogManager.java:145)
at java.util.logging.LogManager$LoggerContext.addLocalLogger(LogManager.java:830)
at java.util.logging.LogManager$LoggerContext.addLocalLogger(LogManager.java:758)
at java.util.logging.LogManager$SystemLoggerContext.demandLogger(LogManager.java:927)
at java.util.logging.LogManager.demandSystemLogger(LogManager.java:581)
at java.util.logging.Logger.getPlatformLogger(Logger.java:576)
at java.util.logging.LoggingProxyImpl.getLogger(LoggingProxyImpl.java:41)
at sun.util.logging.LoggingSupport.getLogger(LoggingSupport.java:100)
at sun.util.logging.PlatformLogger$JavaLoggerProxy.<init>(PlatformLogger.java:602)
at sun.util.logging.PlatformLogger$JavaLoggerProxy.<init>(PlatformLogger.java:597)
at sun.util.logging.PlatformLogger.<init>(PlatformLogger.java:239)
at sun.util.logging.PlatformLogger.getLogger(PlatformLogger.java:198)
at sun.net.www.protocol.http.HttpURLConnection.<clinit>(HttpURLConnection.java:431)
... 14 more
<<<
``` | 2.0 | Test-sanity.system-JDK8-linux_ppc-64_cmprssptrs_le MauveMultiThreadLoadTest_0 java.security.AccessControlException: access denied - https://ci.eclipse.org/openj9/job/Test-sanity.system-JDK8-linux_ppc-64_cmprssptrs_le/234
```
01:20:29.105 - First failure detected by thread: load-2. Not creating dumps as no dump generation is requested for this load test
01:20:29.107 - Test failed
Failure num. = 1
Test number = 3033
Test details = 'Mauve[gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest]'
Suite number = 0
Thread number = 2
>>> Captured test output >>>
PASS: gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest: Default properties (number 0)
Test failed:
java.lang.ExceptionInInitializerError
at java.lang.J9VMInternals.ensureError(J9VMInternals.java:146)
at java.lang.J9VMInternals.recordInitializationFailure(J9VMInternals.java:135)
at sun.net.www.protocol.http.Handler.openConnection(Handler.java:62)
at sun.net.www.protocol.http.Handler.openConnection(Handler.java:57)
at java.net.URL.openConnection(URL.java:979)
at gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest.test_Properties(requestPropertiesTest.java:63)
at gnu.testlet.java.net.HttpURLConnection.requestPropertiesTest.test(requestPropertiesTest.java:44)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at net.adoptopenjdk.loadTest.adaptors.MauveAdaptor.executeTest(MauveAdaptor.java:74)
at net.adoptopenjdk.loadTest.LoadTestRunner$2.run(LoadTestRunner.java:182)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:813)
Caused by: java.security.AccessControlException: access denied
at gnu.testlet.java.util.logging.Handler.TestSecurityManager.checkPermission(TestSecurityManager.java:48)
at java.util.logging.LogManager.checkPermission(LogManager.java:1586)
at java.util.logging.Logger.setParent(Logger.java:2035)
at java.util.logging.LogManager$7.run(LogManager.java:1205)
at java.security.AccessController.doPrivileged(AccessController.java:647)
at java.util.logging.LogManager.doSetParent(LogManager.java:1202)
at java.util.logging.LogManager.access$1200(LogManager.java:145)
at java.util.logging.LogManager$LoggerContext.addLocalLogger(LogManager.java:830)
at java.util.logging.LogManager$LoggerContext.addLocalLogger(LogManager.java:758)
at java.util.logging.LogManager$SystemLoggerContext.demandLogger(LogManager.java:927)
at java.util.logging.LogManager.demandSystemLogger(LogManager.java:581)
at java.util.logging.Logger.getPlatformLogger(Logger.java:576)
at java.util.logging.LoggingProxyImpl.getLogger(LoggingProxyImpl.java:41)
at sun.util.logging.LoggingSupport.getLogger(LoggingSupport.java:100)
at sun.util.logging.PlatformLogger$JavaLoggerProxy.<init>(PlatformLogger.java:602)
at sun.util.logging.PlatformLogger$JavaLoggerProxy.<init>(PlatformLogger.java:597)
at sun.util.logging.PlatformLogger.<init>(PlatformLogger.java:239)
at sun.util.logging.PlatformLogger.getLogger(PlatformLogger.java:198)
at sun.net.www.protocol.http.HttpURLConnection.<clinit>(HttpURLConnection.java:431)
... 14 more
<<<
``` | test | test sanity system linux ppc cmprssptrs le mauvemultithreadloadtest java security accesscontrolexception access denied first failure detected by thread load not creating dumps as no dump generation is requested for this load test test failed failure num test number test details mauve suite number thread number captured test output pass gnu testlet java net httpurlconnection requestpropertiestest default properties number test failed java lang exceptionininitializererror at java lang ensureerror java at java lang recordinitializationfailure java at sun net at sun net at java net url openconnection url java at gnu testlet java net httpurlconnection requestpropertiestest test properties requestpropertiestest java at gnu testlet java net httpurlconnection requestpropertiestest test requestpropertiestest java at sun reflect nativemethodaccessorimpl native method at sun reflect nativemethodaccessorimpl invoke nativemethodaccessorimpl java at sun reflect delegatingmethodaccessorimpl invoke delegatingmethodaccessorimpl java at java lang reflect method invoke method java at net adoptopenjdk loadtest adaptors mauveadaptor executetest mauveadaptor java at net adoptopenjdk loadtest loadtestrunner run loadtestrunner java at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java caused by java security accesscontrolexception access denied at gnu testlet java util logging handler testsecuritymanager checkpermission testsecuritymanager java at java util logging logmanager checkpermission logmanager java at java util logging logger setparent logger java at java util logging logmanager run logmanager java at java security accesscontroller doprivileged accesscontroller java at java util logging logmanager dosetparent logmanager java at java util logging logmanager access logmanager java at java util logging logmanager loggercontext addlocallogger logmanager java at java util logging logmanager loggercontext addlocallogger logmanager java at java util logging logmanager systemloggercontext demandlogger logmanager java at java util logging logmanager demandsystemlogger logmanager java at java util logging logger getplatformlogger logger java at java util logging loggingproxyimpl getlogger loggingproxyimpl java at sun util logging loggingsupport getlogger loggingsupport java at sun util logging platformlogger javaloggerproxy platformlogger java at sun util logging platformlogger javaloggerproxy platformlogger java at sun util logging platformlogger platformlogger java at sun util logging platformlogger getlogger platformlogger java at sun net more | 1 |
247,578 | 26,718,535,703 | IssuesEvent | 2023-01-28 21:04:46 | samq-ghdemo/webgoat8.1 | https://api.github.com/repos/samq-ghdemo/webgoat8.1 | opened | guava-18.0.jar: 2 vulnerabilities (highest severity is: 5.9) | security vulnerability | <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>guava-18.0.jar</b></p></summary>
<p>Guava is a suite of core and expanded libraries that include
utility classes, google's collections, io classes, and much
much more.
Guava has only one code dependency - javax.annotation,
per the JSR-305 spec.</p>
<p>Library home page: <a href="http://code.google.com/p/guava-libraries">http://code.google.com/p/guava-libraries</a></p>
<p>Path to dependency file: /webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar,/home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/webgoat8.1/commit/60c081f3022a72fded7021d121b82532f51772cf">60c081f3022a72fded7021d121b82532f51772cf</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (guava version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2018-10237](https://www.mend.io/vulnerability-database/CVE-2018-10237) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.9 | guava-18.0.jar | Direct | 24.1.1-android | ✅ |
| [CVE-2020-8908](https://www.mend.io/vulnerability-database/CVE-2020-8908) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 3.3 | guava-18.0.jar | Direct | 30.0-android | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-10237</summary>
### Vulnerable Library - <b>guava-18.0.jar</b></p>
<p>Guava is a suite of core and expanded libraries that include
utility classes, google's collections, io classes, and much
much more.
Guava has only one code dependency - javax.annotation,
per the JSR-305 spec.</p>
<p>Library home page: <a href="http://code.google.com/p/guava-libraries">http://code.google.com/p/guava-libraries</a></p>
<p>Path to dependency file: /webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar,/home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **guava-18.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/webgoat8.1/commit/60c081f3022a72fded7021d121b82532f51772cf">60c081f3022a72fded7021d121b82532f51772cf</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Unbounded memory allocation in Google Guava 11.0 through 24.x before 24.1.1 allows remote attackers to conduct denial of service attacks against servers that depend on this library and deserialize attacker-provided data, because the AtomicDoubleArray class (when serialized with Java serialization) and the CompoundOrdering class (when serialized with GWT serialization) perform eager allocation without appropriate checks on what a client has sent and whether the data size is reasonable.
<p>Publish Date: 2018-04-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-10237>CVE-2018-10237</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-10237">https://nvd.nist.gov/vuln/detail/CVE-2018-10237</a></p>
<p>Release Date: 2018-04-26</p>
<p>Fix Resolution: 24.1.1-android</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2020-8908</summary>
### Vulnerable Library - <b>guava-18.0.jar</b></p>
<p>Guava is a suite of core and expanded libraries that include
utility classes, google's collections, io classes, and much
much more.
Guava has only one code dependency - javax.annotation,
per the JSR-305 spec.</p>
<p>Library home page: <a href="http://code.google.com/p/guava-libraries">http://code.google.com/p/guava-libraries</a></p>
<p>Path to dependency file: /webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar,/home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **guava-18.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/webgoat8.1/commit/60c081f3022a72fded7021d121b82532f51772cf">60c081f3022a72fded7021d121b82532f51772cf</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A temp directory creation vulnerability exists in all versions of Guava, allowing an attacker with access to the machine to potentially access data in a temporary directory created by the Guava API com.google.common.io.Files.createTempDir(). By default, on unix-like systems, the created directory is world-readable (readable by an attacker with access to the system). The method in question has been marked @Deprecated in versions 30.0 and later and should not be used. For Android developers, we recommend choosing a temporary directory API provided by Android, such as context.getCacheDir(). For other Java developers, we recommend migrating to the Java 7 API java.nio.file.Files.createTempDirectory() which explicitly configures permissions of 700, or configuring the Java runtime's java.io.tmpdir system property to point to a location whose permissions are appropriately configured.
<p>Publish Date: 2020-12-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-8908>CVE-2020-8908</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>3.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8908">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8908</a></p>
<p>Release Date: 2020-12-10</p>
<p>Fix Resolution: 30.0-android</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | True | guava-18.0.jar: 2 vulnerabilities (highest severity is: 5.9) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>guava-18.0.jar</b></p></summary>
<p>Guava is a suite of core and expanded libraries that include
utility classes, google's collections, io classes, and much
much more.
Guava has only one code dependency - javax.annotation,
per the JSR-305 spec.</p>
<p>Library home page: <a href="http://code.google.com/p/guava-libraries">http://code.google.com/p/guava-libraries</a></p>
<p>Path to dependency file: /webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar,/home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/webgoat8.1/commit/60c081f3022a72fded7021d121b82532f51772cf">60c081f3022a72fded7021d121b82532f51772cf</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in (guava version) | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | ------------- | --- |
| [CVE-2018-10237](https://www.mend.io/vulnerability-database/CVE-2018-10237) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 5.9 | guava-18.0.jar | Direct | 24.1.1-android | ✅ |
| [CVE-2020-8908](https://www.mend.io/vulnerability-database/CVE-2020-8908) | <img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Low | 3.3 | guava-18.0.jar | Direct | 30.0-android | ✅ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-10237</summary>
### Vulnerable Library - <b>guava-18.0.jar</b></p>
<p>Guava is a suite of core and expanded libraries that include
utility classes, google's collections, io classes, and much
much more.
Guava has only one code dependency - javax.annotation,
per the JSR-305 spec.</p>
<p>Library home page: <a href="http://code.google.com/p/guava-libraries">http://code.google.com/p/guava-libraries</a></p>
<p>Path to dependency file: /webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar,/home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **guava-18.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/webgoat8.1/commit/60c081f3022a72fded7021d121b82532f51772cf">60c081f3022a72fded7021d121b82532f51772cf</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
Unbounded memory allocation in Google Guava 11.0 through 24.x before 24.1.1 allows remote attackers to conduct denial of service attacks against servers that depend on this library and deserialize attacker-provided data, because the AtomicDoubleArray class (when serialized with Java serialization) and the CompoundOrdering class (when serialized with GWT serialization) perform eager allocation without appropriate checks on what a client has sent and whether the data size is reasonable.
<p>Publish Date: 2018-04-26
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2018-10237>CVE-2018-10237</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>5.9</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2018-10237">https://nvd.nist.gov/vuln/detail/CVE-2018-10237</a></p>
<p>Release Date: 2018-04-26</p>
<p>Fix Resolution: 24.1.1-android</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> CVE-2020-8908</summary>
### Vulnerable Library - <b>guava-18.0.jar</b></p>
<p>Guava is a suite of core and expanded libraries that include
utility classes, google's collections, io classes, and much
much more.
Guava has only one code dependency - javax.annotation,
per the JSR-305 spec.</p>
<p>Library home page: <a href="http://code.google.com/p/guava-libraries">http://code.google.com/p/guava-libraries</a></p>
<p>Path to dependency file: /webwolf/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar,/home/wss-scanner/.m2/repository/com/google/guava/guava/18.0/guava-18.0.jar</p>
<p>
Dependency Hierarchy:
- :x: **guava-18.0.jar** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/webgoat8.1/commit/60c081f3022a72fded7021d121b82532f51772cf">60c081f3022a72fded7021d121b82532f51772cf</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
A temp directory creation vulnerability exists in all versions of Guava, allowing an attacker with access to the machine to potentially access data in a temporary directory created by the Guava API com.google.common.io.Files.createTempDir(). By default, on unix-like systems, the created directory is world-readable (readable by an attacker with access to the system). The method in question has been marked @Deprecated in versions 30.0 and later and should not be used. For Android developers, we recommend choosing a temporary directory API provided by Android, such as context.getCacheDir(). For other Java developers, we recommend migrating to the Java 7 API java.nio.file.Files.createTempDirectory() which explicitly configures permissions of 700, or configuring the Java runtime's java.io.tmpdir system property to point to a location whose permissions are appropriately configured.
<p>Publish Date: 2020-12-10
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2020-8908>CVE-2020-8908</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>3.3</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8908">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2020-8908</a></p>
<p>Release Date: 2020-12-10</p>
<p>Fix Resolution: 30.0-android</p>
</p>
<p></p>
:rescue_worker_helmet: Automatic Remediation is available for this issue
</details>
***
<p>:rescue_worker_helmet: Automatic Remediation is available for this issue.</p> | non_test | guava jar vulnerabilities highest severity is vulnerable library guava jar guava is a suite of core and expanded libraries that include utility classes google s collections io classes and much much more guava has only one code dependency javax annotation per the jsr spec library home page a href path to dependency file webwolf pom xml path to vulnerable library home wss scanner repository com google guava guava guava jar home wss scanner repository com google guava guava guava jar found in head commit a href vulnerabilities cve severity cvss dependency type fixed in guava version remediation available medium guava jar direct android low guava jar direct android details cve vulnerable library guava jar guava is a suite of core and expanded libraries that include utility classes google s collections io classes and much much more guava has only one code dependency javax annotation per the jsr spec library home page a href path to dependency file webwolf pom xml path to vulnerable library home wss scanner repository com google guava guava guava jar home wss scanner repository com google guava guava guava jar dependency hierarchy x guava jar vulnerable library found in head commit a href found in base branch main vulnerability details unbounded memory allocation in google guava through x before allows remote attackers to conduct denial of service attacks against servers that depend on this library and deserialize attacker provided data because the atomicdoublearray class when serialized with java serialization and the compoundordering class when serialized with gwt serialization perform eager allocation without appropriate checks on what a client has sent and whether the data size is reasonable publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android rescue worker helmet automatic remediation is available for this issue cve vulnerable library guava jar guava is a suite of core and expanded libraries that include utility classes google s collections io classes and much much more guava has only one code dependency javax annotation per the jsr spec library home page a href path to dependency file webwolf pom xml path to vulnerable library home wss scanner repository com google guava guava guava jar home wss scanner repository com google guava guava guava jar dependency hierarchy x guava jar vulnerable library found in head commit a href found in base branch main vulnerability details a temp directory creation vulnerability exists in all versions of guava allowing an attacker with access to the machine to potentially access data in a temporary directory created by the guava api com google common io files createtempdir by default on unix like systems the created directory is world readable readable by an attacker with access to the system the method in question has been marked deprecated in versions and later and should not be used for android developers we recommend choosing a temporary directory api provided by android such as context getcachedir for other java developers we recommend migrating to the java api java nio file files createtempdirectory which explicitly configures permissions of or configuring the java runtime s java io tmpdir system property to point to a location whose permissions are appropriately configured publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution android rescue worker helmet automatic remediation is available for this issue rescue worker helmet automatic remediation is available for this issue | 0 |
26,251 | 2,684,269,517 | IssuesEvent | 2015-03-28 20:29:56 | ConEmu/old-issues | https://api.github.com/repos/ConEmu/old-issues | closed | Strange behaviour with Powershell | 1 star bug imported Priority-Medium | _From [JuhaPear...@gmail.com](https://code.google.com/u/100193669704337374304/) on January 17, 2013 08:13:34_
Required information! OS version: Win7 SP1 x64 ConEmu version: 120727c
Far version (if you are using Far Manager): N/A *Bug description* Running Netsh command in Powershell produces a different result than running the same command in the native Powershell console. *Steps to reproduction* 1. netsh wlan add profile filename="foo.xml"
Result:
Profile format error 0x80420011:
The network connection profile is corrupted.
Running the same command in Powershell native results in:
Profile foo is added on interface Wireless Network Connection
Extract from debug log:
16:12:30;536:32;Inject;;32;3;C:\Windows\SysWOW64\netsh.exe;"C:\Windows\system32\netsh.exe" wlan add profile "filename=Wireless Network Connection.xml";;0x00000003;0x00000007;0x0000000B
16:12:30;6896:32;SrLoad;;32;;Alloc: 0x000C0000
16:12:30;6896:32;Create;;32;3;;"C:\Windows\system32\netsh.exe" wlan add profile "filename=Wireless Network Connection.xml";Sw:1 ;;;
_Original issue: http://code.google.com/p/conemu-maximus5/issues/detail?id=891_ | 1.0 | Strange behaviour with Powershell - _From [JuhaPear...@gmail.com](https://code.google.com/u/100193669704337374304/) on January 17, 2013 08:13:34_
Required information! OS version: Win7 SP1 x64 ConEmu version: 120727c
Far version (if you are using Far Manager): N/A *Bug description* Running Netsh command in Powershell produces a different result than running the same command in the native Powershell console. *Steps to reproduction* 1. netsh wlan add profile filename="foo.xml"
Result:
Profile format error 0x80420011:
The network connection profile is corrupted.
Running the same command in Powershell native results in:
Profile foo is added on interface Wireless Network Connection
Extract from debug log:
16:12:30;536:32;Inject;;32;3;C:\Windows\SysWOW64\netsh.exe;"C:\Windows\system32\netsh.exe" wlan add profile "filename=Wireless Network Connection.xml";;0x00000003;0x00000007;0x0000000B
16:12:30;6896:32;SrLoad;;32;;Alloc: 0x000C0000
16:12:30;6896:32;Create;;32;3;;"C:\Windows\system32\netsh.exe" wlan add profile "filename=Wireless Network Connection.xml";Sw:1 ;;;
_Original issue: http://code.google.com/p/conemu-maximus5/issues/detail?id=891_ | non_test | strange behaviour with powershell from on january required information os version conemu version far version if you are using far manager n a bug description running netsh command in powershell produces a different result than running the same command in the native powershell console steps to reproduction netsh wlan add profile filename foo xml result profile format error the network connection profile is corrupted running the same command in powershell native results in profile foo is added on interface wireless network connection extract from debug log inject c windows netsh exe c windows netsh exe wlan add profile filename wireless network connection xml srload alloc create c windows netsh exe wlan add profile filename wireless network connection xml sw original issue | 0 |
3,752 | 2,910,313,374 | IssuesEvent | 2015-06-21 16:42:22 | joomla/joomla-cms | https://api.github.com/repos/joomla/joomla-cms | closed | Can't open administrator page | No Code Attached Yet | #### Steps to reproduce the issue
I can't open administrator page it always show this message
Warning: require_once(/home4/snackman/public_html/www.icansurvive.com/administrator/components/com_users/models/user.php): failed to open stream: No such file or directory in /home4/snackman/public_html/www.icansurvive.com/plugins/authentication/joomla/joomla.php on line 105
Fatal error: require_once(): Failed opening required '/home4/snackman/public_html/www.icansurvive.com/administrator/components/com_users/models/user.php' (include_path='.:/opt/php54/lib/php') in /home4/snackman/public_html/www.icansurvive.com/plugins/authentication/joomla/joomla.php on line 105
What I have to do ?
#### Expected result
#### Actual result
#### System information (as much as possible)
#### Additional comments
| 1.0 | Can't open administrator page - #### Steps to reproduce the issue
I can't open administrator page it always show this message
Warning: require_once(/home4/snackman/public_html/www.icansurvive.com/administrator/components/com_users/models/user.php): failed to open stream: No such file or directory in /home4/snackman/public_html/www.icansurvive.com/plugins/authentication/joomla/joomla.php on line 105
Fatal error: require_once(): Failed opening required '/home4/snackman/public_html/www.icansurvive.com/administrator/components/com_users/models/user.php' (include_path='.:/opt/php54/lib/php') in /home4/snackman/public_html/www.icansurvive.com/plugins/authentication/joomla/joomla.php on line 105
What I have to do ?
#### Expected result
#### Actual result
#### System information (as much as possible)
#### Additional comments
| non_test | can t open administrator page steps to reproduce the issue i can t open administrator page it always show this message warning require once snackman public html failed to open stream no such file or directory in snackman public html on line fatal error require once failed opening required snackman public html include path opt lib php in snackman public html on line what i have to do expected result actual result system information as much as possible additional comments | 0 |
304,024 | 26,246,463,849 | IssuesEvent | 2023-01-05 15:37:45 | zcash/zcash | https://api.github.com/repos/zcash/zcash | opened | Remove `z_gettotalbalance` usage from RPC tests | A-testing A-rpc-interface C-deprecation | In #6282 we disabled the `z_gettotalbalance` RPC by default. To minimise the effect on our codebase, we explicitly re-enabled it in every RPC test that uses it. We need to go through and replace usages of `z_getnewaddress` in our tests before we can remove it. | 1.0 | Remove `z_gettotalbalance` usage from RPC tests - In #6282 we disabled the `z_gettotalbalance` RPC by default. To minimise the effect on our codebase, we explicitly re-enabled it in every RPC test that uses it. We need to go through and replace usages of `z_getnewaddress` in our tests before we can remove it. | test | remove z gettotalbalance usage from rpc tests in we disabled the z gettotalbalance rpc by default to minimise the effect on our codebase we explicitly re enabled it in every rpc test that uses it we need to go through and replace usages of z getnewaddress in our tests before we can remove it | 1 |
341,154 | 30,569,006,462 | IssuesEvent | 2023-07-20 20:16:40 | web-platform-tests/interop | https://api.github.com/repos/web-platform-tests/interop | closed | Make color-mix() and RCS using rgb, hsl and hwb return round-trippable, unbounded results in color(srgb ...) | test-change-proposal focus area: Color Spaces and Functions | ### Test List
https://wpt.fyi/results/css/css-color/parsing/color-computed-color-mix-function.html
https://wpt.fyi/results/css/css-color/parsing/color-mix-out-of-gamut.html
~~https://wpt.fyi/results/css/css-color/parsing/color-computed-relative-color.html
https://wpt.fyi/results/css/css-color/parsing/color-valid-relative-color.html~~
These will all need to change to return `color(srgb ...)` for inputs with `rgb`, `hsl` and `hwb`, and that no gamut mapping is applied. Also "gamut-mapping.html" should be renamed to "color-mix-out-of-gamut.html".
### Rationale
From the discussion here: https://github.com/w3c/csswg-drafts/issues/8444
The following resolution was reached: https://github.com/mozilla/wg-decisions/issues/1125
Parsed and computed results of `color-mix` and relative color syntax for legacy colors should return colors in the format `color(srgb ....)`. | 1.0 | Make color-mix() and RCS using rgb, hsl and hwb return round-trippable, unbounded results in color(srgb ...) - ### Test List
https://wpt.fyi/results/css/css-color/parsing/color-computed-color-mix-function.html
https://wpt.fyi/results/css/css-color/parsing/color-mix-out-of-gamut.html
~~https://wpt.fyi/results/css/css-color/parsing/color-computed-relative-color.html
https://wpt.fyi/results/css/css-color/parsing/color-valid-relative-color.html~~
These will all need to change to return `color(srgb ...)` for inputs with `rgb`, `hsl` and `hwb`, and that no gamut mapping is applied. Also "gamut-mapping.html" should be renamed to "color-mix-out-of-gamut.html".
### Rationale
From the discussion here: https://github.com/w3c/csswg-drafts/issues/8444
The following resolution was reached: https://github.com/mozilla/wg-decisions/issues/1125
Parsed and computed results of `color-mix` and relative color syntax for legacy colors should return colors in the format `color(srgb ....)`. | test | make color mix and rcs using rgb hsl and hwb return round trippable unbounded results in color srgb test list these will all need to change to return color srgb for inputs with rgb hsl and hwb and that no gamut mapping is applied also gamut mapping html should be renamed to color mix out of gamut html rationale from the discussion here the following resolution was reached parsed and computed results of color mix and relative color syntax for legacy colors should return colors in the format color srgb | 1 |
231,729 | 18,790,984,127 | IssuesEvent | 2021-11-08 16:45:49 | CAKES-coding/swe574-group2 | https://api.github.com/repos/CAKES-coding/swe574-group2 | opened | Admin can not view delete user button within user list | bug effort: 1 priority: Low testing | **Describe the bug**
Admin can not view delete user button within user list
**To Reproduce**
Steps to reproduce the behavior:
1. Login with an admin user
2. Click on user list
3. At the right of the user list
4. Delete user button is missing
**Expected behavior**
There should be a delete button and is not applicable
**Desktop (please complete the following information):**
- OS: All
- Browser All
- Version All
| 1.0 | Admin can not view delete user button within user list - **Describe the bug**
Admin can not view delete user button within user list
**To Reproduce**
Steps to reproduce the behavior:
1. Login with an admin user
2. Click on user list
3. At the right of the user list
4. Delete user button is missing
**Expected behavior**
There should be a delete button and is not applicable
**Desktop (please complete the following information):**
- OS: All
- Browser All
- Version All
| test | admin can not view delete user button within user list describe the bug admin can not view delete user button within user list to reproduce steps to reproduce the behavior login with an admin user click on user list at the right of the user list delete user button is missing expected behavior there should be a delete button and is not applicable desktop please complete the following information os all browser all version all | 1 |
174,385 | 14,480,281,969 | IssuesEvent | 2020-12-10 10:57:26 | geosolutions-it/C179-DBIAIT | https://api.github.com/repos/geosolutions-it/C179-DBIAIT | closed | Design the database schemas | Task documentation | We need to design the database for the following schemas:
- ANALYSIS: contains tables for analysis purposes
- FREEZE: contains (partitioned) tables to store data per years
- SYSTEM: contains tables used by the webapp to manage the scheduling of tasks and monitoring them | 1.0 | Design the database schemas - We need to design the database for the following schemas:
- ANALYSIS: contains tables for analysis purposes
- FREEZE: contains (partitioned) tables to store data per years
- SYSTEM: contains tables used by the webapp to manage the scheduling of tasks and monitoring them | non_test | design the database schemas we need to design the database for the following schemas analysis contains tables for analysis purposes freeze contains partitioned tables to store data per years system contains tables used by the webapp to manage the scheduling of tasks and monitoring them | 0 |
282,750 | 8,710,085,272 | IssuesEvent | 2018-12-06 15:35:06 | googleapis/nodejs-bigtable | https://api.github.com/repos/googleapis/nodejs-bigtable | closed | Incorrect hyperlinks in documentation | :rotating_light: docs priority: p2 type: bug | The following hyperlinks in parameter description are incorrect
| Page Address | Nonworking Hyperlinks |
| --------------- | -------------------------|
|[ createInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#createInstance )| Hyperlink to Instance is 404 |
|[ getInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getInstance )| Hyperlink to Instance is 404 |
|[ listInstances(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listInstances )| Hyperlink to ListInstancesResponse is 404 |
|[ updateInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#updateInstance )| Hyperlink to Type,State and Instance is 404 |
|[ partialUpdateInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#partialUpdateInstance )| Hyperlink to Instance and FieldMask is 404 |
|[ createCluster(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#createCluster )| Hyperlink to Cluster is 404 |
|[ getCluster(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getCluster )| Hyperlink to Cluster is 404 |
|[ listClusters(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listClusters )| Hyperlink to ListInstancesResponse is 404 |
|[ updateCluster(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#updateCluster )| Hyperlink to State is 404 |
|[ createAppProfile(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#createAppProfile )| Hyperlink to AppProfile is 404 |
|[ getAppProfile(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getAppProfile )| Hyperlink to AppProfile is 404 |
|[ listAppProfiles(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listAppProfiles )| Hyperlink to AppProfile and ListAppProfilesResponse is 404 |
|[ listAppProfilesStream(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listAppProfilesStream )| Hyperlink to AppProfile is 404 |
|[ updateAppProfile(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#updateAppProfile )| Hyperlink to AppProfile and FieldMask is 404 |
|[ getIamPolicy(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getIamPolicy )| Hyperlink to Policy is 404 |
|[ setIamPolicy(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#setIamPolicy )| Hyperlink to Policy is 404 |
|[ createTable(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#createTable )| Hyperlink to Table and Split is 404 |
|[ listTables(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listTables )| Hyperlink to View,Table and ListTableResponse is 404 |
|[ listTablesStream(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listTablesStream )| Hyperlink to View and Table is 404 |
|[ getTable(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#getTable )| Hyperlink to View and Table is 404 |
|[ modifyColumnFamilies(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#modifyColumnFamilies )| Hyperlink to Modification and Table is 404 |
|[ generateConsistencyToken(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#generateConsistencyToken )| Hyperlink to GenerateConsistencyTokenResponse is 404 |
|[ checkConsistency(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#checkConsistency )| Hyperlink to CheckConsistencyResponse is 404 |
|[ snapshotTable(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#snapshotTable )| Hyperlink to Duration is 404 |
|[ getSnapshot(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#getSnapshot )| Hyperlink to Snapshot is 404 |
|[ listSnapshots(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listSnapshots )| Hyperlink to Snapshot and ListSnapshotsResponse is 404 |
|[ listSnapshotsStream(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listSnapshotsStream )| Hyperlink to Snapshot is 404 |
|[ readRows(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#readRows )| Hyperlink to RowSet,RowFilter and ReadRowsResponse is 404 |
|[ sampleRowKeys(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#sampleRowKeys )| Hyperlink to SampleRowKeysResponse is 404 |
|[ mutateRow(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#mutateRow )| Hyperlink to Mutation and MutateRowResponse is 404 |
|[ mutateRows(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#mutateRows )| Hyperlink to MutateRowsResponse is 404 |
|[ checkAndMutateRow(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#checkAndMutateRow )| Hyperlink to RowFilter,Mutation and CheckAndMutateRowResponse is 404 |
|[ readModifyWriteRow(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#readModifyWriteRow )| Hyperlink to ReadModifyWriteRule and ReadModifyWriteRowResponse is 404 |
| 1.0 | Incorrect hyperlinks in documentation - The following hyperlinks in parameter description are incorrect
| Page Address | Nonworking Hyperlinks |
| --------------- | -------------------------|
|[ createInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#createInstance )| Hyperlink to Instance is 404 |
|[ getInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getInstance )| Hyperlink to Instance is 404 |
|[ listInstances(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listInstances )| Hyperlink to ListInstancesResponse is 404 |
|[ updateInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#updateInstance )| Hyperlink to Type,State and Instance is 404 |
|[ partialUpdateInstance(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#partialUpdateInstance )| Hyperlink to Instance and FieldMask is 404 |
|[ createCluster(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#createCluster )| Hyperlink to Cluster is 404 |
|[ getCluster(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getCluster )| Hyperlink to Cluster is 404 |
|[ listClusters(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listClusters )| Hyperlink to ListInstancesResponse is 404 |
|[ updateCluster(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#updateCluster )| Hyperlink to State is 404 |
|[ createAppProfile(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#createAppProfile )| Hyperlink to AppProfile is 404 |
|[ getAppProfile(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getAppProfile )| Hyperlink to AppProfile is 404 |
|[ listAppProfiles(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listAppProfiles )| Hyperlink to AppProfile and ListAppProfilesResponse is 404 |
|[ listAppProfilesStream(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#listAppProfilesStream )| Hyperlink to AppProfile is 404 |
|[ updateAppProfile(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#updateAppProfile )| Hyperlink to AppProfile and FieldMask is 404 |
|[ getIamPolicy(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#getIamPolicy )| Hyperlink to Policy is 404 |
|[ setIamPolicy(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableInstanceAdminClient#setIamPolicy )| Hyperlink to Policy is 404 |
|[ createTable(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#createTable )| Hyperlink to Table and Split is 404 |
|[ listTables(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listTables )| Hyperlink to View,Table and ListTableResponse is 404 |
|[ listTablesStream(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listTablesStream )| Hyperlink to View and Table is 404 |
|[ getTable(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#getTable )| Hyperlink to View and Table is 404 |
|[ modifyColumnFamilies(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#modifyColumnFamilies )| Hyperlink to Modification and Table is 404 |
|[ generateConsistencyToken(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#generateConsistencyToken )| Hyperlink to GenerateConsistencyTokenResponse is 404 |
|[ checkConsistency(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#checkConsistency )| Hyperlink to CheckConsistencyResponse is 404 |
|[ snapshotTable(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#snapshotTable )| Hyperlink to Duration is 404 |
|[ getSnapshot(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#getSnapshot )| Hyperlink to Snapshot is 404 |
|[ listSnapshots(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listSnapshots )| Hyperlink to Snapshot and ListSnapshotsResponse is 404 |
|[ listSnapshotsStream(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableTableAdminClient#listSnapshotsStream )| Hyperlink to Snapshot is 404 |
|[ readRows(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#readRows )| Hyperlink to RowSet,RowFilter and ReadRowsResponse is 404 |
|[ sampleRowKeys(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#sampleRowKeys )| Hyperlink to SampleRowKeysResponse is 404 |
|[ mutateRow(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#mutateRow )| Hyperlink to Mutation and MutateRowResponse is 404 |
|[ mutateRows(request, options) returns Stream ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#mutateRows )| Hyperlink to MutateRowsResponse is 404 |
|[ checkAndMutateRow(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#checkAndMutateRow )| Hyperlink to RowFilter,Mutation and CheckAndMutateRowResponse is 404 |
|[ readModifyWriteRow(request, options, callback) returns Promise ]( https://cloud.google.com/nodejs/docs/reference/bigtable/0.13.x/v2.BigtableClient#readModifyWriteRow )| Hyperlink to ReadModifyWriteRule and ReadModifyWriteRowResponse is 404 |
| non_test | incorrect hyperlinks in documentation the following hyperlinks in parameter description are incorrect page address nonworking hyperlinks hyperlink to instance is hyperlink to instance is hyperlink to listinstancesresponse is hyperlink to type state and instance is hyperlink to instance and fieldmask is hyperlink to cluster is hyperlink to cluster is hyperlink to listinstancesresponse is hyperlink to state is hyperlink to appprofile is hyperlink to appprofile is hyperlink to appprofile and listappprofilesresponse is hyperlink to appprofile is hyperlink to appprofile and fieldmask is hyperlink to policy is hyperlink to policy is hyperlink to table and split is hyperlink to view table and listtableresponse is hyperlink to view and table is hyperlink to view and table is hyperlink to modification and table is hyperlink to generateconsistencytokenresponse is hyperlink to checkconsistencyresponse is hyperlink to duration is hyperlink to snapshot is hyperlink to snapshot and listsnapshotsresponse is hyperlink to snapshot is hyperlink to rowset rowfilter and readrowsresponse is hyperlink to samplerowkeysresponse is hyperlink to mutation and mutaterowresponse is hyperlink to mutaterowsresponse is hyperlink to rowfilter mutation and checkandmutaterowresponse is hyperlink to readmodifywriterule and readmodifywriterowresponse is | 0 |
430,871 | 30,204,518,129 | IssuesEvent | 2023-07-05 08:31:56 | HeyChobe/bombastic-ui | https://api.github.com/repos/HeyChobe/bombastic-ui | closed | Add Contributin Guidelines | documentation | Add Contributin Guidelines that describes aspects like:
- [x] Using Issue Tracker
- [x] Bug Reports
- [x] Contributing
- [x] License | 1.0 | Add Contributin Guidelines - Add Contributin Guidelines that describes aspects like:
- [x] Using Issue Tracker
- [x] Bug Reports
- [x] Contributing
- [x] License | non_test | add contributin guidelines add contributin guidelines that describes aspects like using issue tracker bug reports contributing license | 0 |
39,667 | 9,607,988,340 | IssuesEvent | 2019-05-12 00:46:18 | PureDarwin/PureFoundation | https://api.github.com/repos/PureDarwin/PureFoundation | closed | [NSMutableArray removeObject:inRange:] incorrect | Priority-Medium Type-Defect auto-migrated | ```
Looking at the source code of [NSMutableArray removeObject:inRange:]
(https://code.google.com/p/purefoundation/source/browse/trunk/NSArray.m#1328),
it appears that it only removes one occurrence of the given object.
However, in Apple's documentation for this method
(http://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation
/Classes/NSMutableArray_Class/Reference/Reference.html#//apple_ref/occ/instm/NSM
utableArray/removeObject:inRange:), it says that it "removes all occurrences"
[NSMutableArray removeObject:] has the same problem
```
Original issue reported on code.google.com by `spoon.re...@gmail.com` on 23 Mar 2011 at 1:49
| 1.0 | [NSMutableArray removeObject:inRange:] incorrect - ```
Looking at the source code of [NSMutableArray removeObject:inRange:]
(https://code.google.com/p/purefoundation/source/browse/trunk/NSArray.m#1328),
it appears that it only removes one occurrence of the given object.
However, in Apple's documentation for this method
(http://developer.apple.com/library/mac/documentation/Cocoa/Reference/Foundation
/Classes/NSMutableArray_Class/Reference/Reference.html#//apple_ref/occ/instm/NSM
utableArray/removeObject:inRange:), it says that it "removes all occurrences"
[NSMutableArray removeObject:] has the same problem
```
Original issue reported on code.google.com by `spoon.re...@gmail.com` on 23 Mar 2011 at 1:49
| non_test | incorrect looking at the source code of it appears that it only removes one occurrence of the given object however in apple s documentation for this method classes nsmutablearray class reference reference html apple ref occ instm nsm utablearray removeobject inrange it says that it removes all occurrences has the same problem original issue reported on code google com by spoon re gmail com on mar at | 0 |
101,789 | 8,797,645,804 | IssuesEvent | 2018-12-23 22:35:30 | JBaczuk/bitcoin_script_js | https://api.github.com/repos/JBaczuk/bitcoin_script_js | opened | Travis CI | tests tooling | Put this build tag on the README and run all unit tests in node.js and all browsers we want to be compatible with. | 1.0 | Travis CI - Put this build tag on the README and run all unit tests in node.js and all browsers we want to be compatible with. | test | travis ci put this build tag on the readme and run all unit tests in node js and all browsers we want to be compatible with | 1 |
88,098 | 8,131,774,596 | IssuesEvent | 2018-08-18 02:04:22 | adventurerscodex/uat | https://api.github.com/repos/adventurerscodex/uat | closed | Test: Add Maps and Images | severity-low-priority type-test | ### Test File
test_encounters.py
### User Story from UAT Doc
As a dm, I can add maps and images to an encounter and the data persists | 1.0 | Test: Add Maps and Images - ### Test File
test_encounters.py
### User Story from UAT Doc
As a dm, I can add maps and images to an encounter and the data persists | test | test add maps and images test file test encounters py user story from uat doc as a dm i can add maps and images to an encounter and the data persists | 1 |
175,635 | 14,533,995,796 | IssuesEvent | 2020-12-15 01:56:22 | daejo/deep-thoughts | https://api.github.com/repos/daejo/deep-thoughts | opened | Client-side login/sign up | documentation | **User Stories**
* As a user, I can create an account and login to the application through the front end | 1.0 | Client-side login/sign up - **User Stories**
* As a user, I can create an account and login to the application through the front end | non_test | client side login sign up user stories as a user i can create an account and login to the application through the front end | 0 |
28,564 | 4,422,354,998 | IssuesEvent | 2016-08-16 02:05:24 | SlateFoundation/slate-cbl | https://api.github.com/repos/SlateFoundation/slate-cbl | closed | Scaffold backend data model for To-Do | ready for testing | Students are able to create To-Dos for themselves. These are shown on their dashboard and exist outside the influence of the Current Section dropdown in the header. To-Dos are separated into several lists, one for each course the student is enrolled in and a final personal list that isn't associated with a course.
Each To-Do has five attributes:
- section (can be null for personal)
- description (text)
- due date
- complete boolean
- cleared boolean
The section is determined based on which header you've created the task under. Once a To-Do is created the student cannot edit the description or due date. It is up to the student to mark a To-Do as complete or incomplete. To get rid of the To-Do completely a student can Clear all To-Dos marked as complete for a given section.
This task revolves around setting up the basic To-Do model
## StudentToDo
### StudentID
- type: integer
### SectionID
- type: integer (nullable for Personal)
### Description
- type: string
### DueDate
- type: timestamp
### Completed
- type: bool
- default: false
### Cleared
- type: bool
- default: false
Ignore the Competencies listed above each To-Do in the designs below:

| 1.0 | Scaffold backend data model for To-Do - Students are able to create To-Dos for themselves. These are shown on their dashboard and exist outside the influence of the Current Section dropdown in the header. To-Dos are separated into several lists, one for each course the student is enrolled in and a final personal list that isn't associated with a course.
Each To-Do has five attributes:
- section (can be null for personal)
- description (text)
- due date
- complete boolean
- cleared boolean
The section is determined based on which header you've created the task under. Once a To-Do is created the student cannot edit the description or due date. It is up to the student to mark a To-Do as complete or incomplete. To get rid of the To-Do completely a student can Clear all To-Dos marked as complete for a given section.
This task revolves around setting up the basic To-Do model
## StudentToDo
### StudentID
- type: integer
### SectionID
- type: integer (nullable for Personal)
### Description
- type: string
### DueDate
- type: timestamp
### Completed
- type: bool
- default: false
### Cleared
- type: bool
- default: false
Ignore the Competencies listed above each To-Do in the designs below:

| test | scaffold backend data model for to do students are able to create to dos for themselves these are shown on their dashboard and exist outside the influence of the current section dropdown in the header to dos are separated into several lists one for each course the student is enrolled in and a final personal list that isn t associated with a course each to do has five attributes section can be null for personal description text due date complete boolean cleared boolean the section is determined based on which header you ve created the task under once a to do is created the student cannot edit the description or due date it is up to the student to mark a to do as complete or incomplete to get rid of the to do completely a student can clear all to dos marked as complete for a given section this task revolves around setting up the basic to do model studenttodo studentid type integer sectionid type integer nullable for personal description type string duedate type timestamp completed type bool default false cleared type bool default false ignore the competencies listed above each to do in the designs below | 1 |
326,310 | 27,982,505,200 | IssuesEvent | 2023-03-26 10:20:49 | py-pdf/pypdf | https://api.github.com/repos/py-pdf/pypdf | closed | Killing Security Mutants | nf-testing is-maintenance | Mutation testing applies common mistakes to the codebase and checks if the tests capture them. Those changes are called "mutations" (or mutants). Killing a mutant means to add a test which would fail if the mutation is applied.
## _writer.py::encrypt (V2)
```
--- PyPDF2/_writer.py
+++ PyPDF2/_writer.py
@@ -709,7 +709,7 @@
encrypt[NameObject(SA.FILTER)] = NameObject("/Standard")
encrypt[NameObject("/V")] = NumberObject(V)
if V == 2:
- encrypt[NameObject(SA.LENGTH)] = NumberObject(keylen * 8)
+ encrypt[NameObject(SA.LENGTH)] = NumberObject(keylen * 9)
encrypt[NameObject(ED.R)] = NumberObject(rev)
encrypt[NameObject(ED.O)] = ByteStringObject(O)
encrypt[NameObject(ED.U)] = ByteStringObject(U)
```
## Mutant #2325
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -78,7 +78,7 @@
m.update(id1_entry.original_bytes)
# 6. (Revision 3 or greater) If document metadata is not being encrypted,
# pass 4 bytes with the value 0xFFFFFFFF to the MD5 hash function.
- if rev >= 3 and not metadata_encrypt:
+ if rev > 3 and not metadata_encrypt:
m.update(b"\xff\xff\xff\xff")
# 7. Finish the hash.
md5_hash = m.digest()
```
## Mutant 2326
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -78,7 +78,7 @@
m.update(id1_entry.original_bytes)
# 6. (Revision 3 or greater) If document metadata is not being encrypted,
# pass 4 bytes with the value 0xFFFFFFFF to the MD5 hash function.
- if rev >= 3 and not metadata_encrypt:
+ if rev >= 4 and not metadata_encrypt:
m.update(b"\xff\xff\xff\xff")
# 7. Finish the hash.
md5_hash = m.digest()
```
## Mutant 2337
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -106,7 +106,7 @@
key = _alg33_1(owner_pwd, rev, keylen)
# 5. Pad or truncate the user password string as described in step 1 of
# algorithm 3.2.
- user_pwd_bytes = b_((user_pwd + str_(_encryption_padding))[:32])
+ user_pwd_bytes = b_((user_pwd + str_(_encryption_padding))[:33])
# 6. Encrypt the result of step 5, using an RC4 encryption function with
# the encryption key obtained in step 4.
val = RC4_encrypt(key, user_pwd_bytes)
```
## Mutant 2340 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -116,7 +116,7 @@
# taking each byte of the encryption key obtained in step 4 and performing
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
- if rev >= 3:
+ if rev > 3:
for i in range(1, 20):
new_key = ""
for key_char in key:
```
## Mutant 2341 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -116,7 +116,7 @@
# taking each byte of the encryption key obtained in step 4 and performing
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
- if rev >= 3:
+ if rev >= 4:
for i in range(1, 20):
new_key = ""
for key_char in key:
```
## Mutant 2342 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -117,7 +117,7 @@
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
if rev >= 3:
- for i in range(1, 20):
+ for i in range(2, 20):
new_key = ""
for key_char in key:
new_key += chr(ord_(key_char) ^ i)
```
## Mutant 2343
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -117,7 +117,7 @@
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
if rev >= 3:
- for i in range(1, 20):
+ for i in range(1, 21):
new_key = ""
for key_char in key:
new_key += chr(ord_(key_char) ^ i)
```
## Mutant 2383 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -226,7 +226,7 @@
def RC4_encrypt(key: Union[str, bytes], plaintext: bytes) -> bytes: # TODO
- S = list(range(256))
+ S = list(range(257))
j = 0
for i in range(256):
j = (j + S[i] + ord_(key[i % len(key)])) % 256
``` | 1.0 | Killing Security Mutants - Mutation testing applies common mistakes to the codebase and checks if the tests capture them. Those changes are called "mutations" (or mutants). Killing a mutant means to add a test which would fail if the mutation is applied.
## _writer.py::encrypt (V2)
```
--- PyPDF2/_writer.py
+++ PyPDF2/_writer.py
@@ -709,7 +709,7 @@
encrypt[NameObject(SA.FILTER)] = NameObject("/Standard")
encrypt[NameObject("/V")] = NumberObject(V)
if V == 2:
- encrypt[NameObject(SA.LENGTH)] = NumberObject(keylen * 8)
+ encrypt[NameObject(SA.LENGTH)] = NumberObject(keylen * 9)
encrypt[NameObject(ED.R)] = NumberObject(rev)
encrypt[NameObject(ED.O)] = ByteStringObject(O)
encrypt[NameObject(ED.U)] = ByteStringObject(U)
```
## Mutant #2325
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -78,7 +78,7 @@
m.update(id1_entry.original_bytes)
# 6. (Revision 3 or greater) If document metadata is not being encrypted,
# pass 4 bytes with the value 0xFFFFFFFF to the MD5 hash function.
- if rev >= 3 and not metadata_encrypt:
+ if rev > 3 and not metadata_encrypt:
m.update(b"\xff\xff\xff\xff")
# 7. Finish the hash.
md5_hash = m.digest()
```
## Mutant 2326
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -78,7 +78,7 @@
m.update(id1_entry.original_bytes)
# 6. (Revision 3 or greater) If document metadata is not being encrypted,
# pass 4 bytes with the value 0xFFFFFFFF to the MD5 hash function.
- if rev >= 3 and not metadata_encrypt:
+ if rev >= 4 and not metadata_encrypt:
m.update(b"\xff\xff\xff\xff")
# 7. Finish the hash.
md5_hash = m.digest()
```
## Mutant 2337
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -106,7 +106,7 @@
key = _alg33_1(owner_pwd, rev, keylen)
# 5. Pad or truncate the user password string as described in step 1 of
# algorithm 3.2.
- user_pwd_bytes = b_((user_pwd + str_(_encryption_padding))[:32])
+ user_pwd_bytes = b_((user_pwd + str_(_encryption_padding))[:33])
# 6. Encrypt the result of step 5, using an RC4 encryption function with
# the encryption key obtained in step 4.
val = RC4_encrypt(key, user_pwd_bytes)
```
## Mutant 2340 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -116,7 +116,7 @@
# taking each byte of the encryption key obtained in step 4 and performing
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
- if rev >= 3:
+ if rev > 3:
for i in range(1, 20):
new_key = ""
for key_char in key:
```
## Mutant 2341 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -116,7 +116,7 @@
# taking each byte of the encryption key obtained in step 4 and performing
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
- if rev >= 3:
+ if rev >= 4:
for i in range(1, 20):
new_key = ""
for key_char in key:
```
## Mutant 2342 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -117,7 +117,7 @@
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
if rev >= 3:
- for i in range(1, 20):
+ for i in range(2, 20):
new_key = ""
for key_char in key:
new_key += chr(ord_(key_char) ^ i)
```
## Mutant 2343
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -117,7 +117,7 @@
# an XOR operation between that byte and the single-byte value of the
# iteration counter (from 1 to 19).
if rev >= 3:
- for i in range(1, 20):
+ for i in range(1, 21):
new_key = ""
for key_char in key:
new_key += chr(ord_(key_char) ^ i)
```
## Mutant 2383 :heavy_check_mark:
```
--- PyPDF2/_security.py
+++ PyPDF2/_security.py
@@ -226,7 +226,7 @@
def RC4_encrypt(key: Union[str, bytes], plaintext: bytes) -> bytes: # TODO
- S = list(range(256))
+ S = list(range(257))
j = 0
for i in range(256):
j = (j + S[i] + ord_(key[i % len(key)])) % 256
``` | test | killing security mutants mutation testing applies common mistakes to the codebase and checks if the tests capture them those changes are called mutations or mutants killing a mutant means to add a test which would fail if the mutation is applied writer py encrypt writer py writer py encrypt nameobject standard encrypt numberobject v if v encrypt numberobject keylen encrypt numberobject keylen encrypt numberobject rev encrypt bytestringobject o encrypt bytestringobject u mutant security py security py m update entry original bytes revision or greater if document metadata is not being encrypted pass bytes with the value to the hash function if rev and not metadata encrypt if rev and not metadata encrypt m update b xff xff xff xff finish the hash hash m digest mutant security py security py m update entry original bytes revision or greater if document metadata is not being encrypted pass bytes with the value to the hash function if rev and not metadata encrypt if rev and not metadata encrypt m update b xff xff xff xff finish the hash hash m digest mutant security py security py key owner pwd rev keylen pad or truncate the user password string as described in step of algorithm user pwd bytes b user pwd str encryption padding user pwd bytes b user pwd str encryption padding encrypt the result of step using an encryption function with the encryption key obtained in step val encrypt key user pwd bytes mutant heavy check mark security py security py taking each byte of the encryption key obtained in step and performing an xor operation between that byte and the single byte value of the iteration counter from to if rev if rev for i in range new key for key char in key mutant heavy check mark security py security py taking each byte of the encryption key obtained in step and performing an xor operation between that byte and the single byte value of the iteration counter from to if rev if rev for i in range new key for key char in key mutant heavy check mark security py security py an xor operation between that byte and the single byte value of the iteration counter from to if rev for i in range for i in range new key for key char in key new key chr ord key char i mutant security py security py an xor operation between that byte and the single byte value of the iteration counter from to if rev for i in range for i in range new key for key char in key new key chr ord key char i mutant heavy check mark security py security py def encrypt key union plaintext bytes bytes todo s list range s list range j for i in range j j s ord key | 1 |
274,608 | 23,852,976,465 | IssuesEvent | 2022-09-06 19:50:17 | microsoft/vscode-jupyter | https://api.github.com/repos/microsoft/vscode-jupyter | closed | "Initial Notebook Cell Execution Perf Test" flaky | triage-needed flaky test | Saw this test fail multiple times recently
https://github.com/microsoft/vscode-jupyter/runs/8213023011?check_suite_focus=true
https://github.com/microsoft/vscode-jupyter/runs/8165563331?check_suite_focus=true
https://github.com/microsoft/vscode-jupyter/runs/8166064024?check_suite_focus=true | 1.0 | "Initial Notebook Cell Execution Perf Test" flaky - Saw this test fail multiple times recently
https://github.com/microsoft/vscode-jupyter/runs/8213023011?check_suite_focus=true
https://github.com/microsoft/vscode-jupyter/runs/8165563331?check_suite_focus=true
https://github.com/microsoft/vscode-jupyter/runs/8166064024?check_suite_focus=true | test | initial notebook cell execution perf test flaky saw this test fail multiple times recently | 1 |
337,020 | 30,234,112,937 | IssuesEvent | 2023-07-06 09:02:45 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | opened | roachtest: restore/tpce/400GB/aws/nodes=4/cpus=8 failed | C-test-failure O-robot O-roachtest release-blocker T-disaster-recovery branch-release-23.1 | roachtest.restore/tpce/400GB/aws/nodes=4/cpus=8 [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyAwsBazel/10797190?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyAwsBazel/10797190?buildTab=artifacts#/restore/tpce/400GB/aws/nodes=4/cpus=8) on release-23.1 @ [fa2d7f7c9894d701ac4a393f058aa84552957087](https://github.com/cockroachdb/cockroach/commits/fa2d7f7c9894d701ac4a393f058aa84552957087):
```
(test_runner.go:1073).runTest: test timed out (1h0m0s)
(monitor.go:137).Wait: monitor failure: monitor task failed: output in run_080048.232311134_n1_cockroach-sql-insecu: ./cockroach sql --insecure -e "RESTORE FROM LATEST IN 's3://cockroach-fixtures-us-east-2/backups/tpc-e/customers=25000/v22.2.0/inc-count=48?AUTH=implicit' AS OF SYSTEM TIME '2022-12-21T02:00:00Z' " returned: COMMAND_PROBLEM: exit status 137
test artifacts and logs in: /artifacts/restore/tpce/400GB/aws/nodes=4/cpus=8/run_1
```
<p>Parameters: <code>ROACHTEST_arch=amd64</code>
, <code>ROACHTEST_cloud=aws</code>
, <code>ROACHTEST_cpu=8</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_fs=ext4</code>
, <code>ROACHTEST_localSSD=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/disaster-recovery
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*restore/tpce/400GB/aws/nodes=4/cpus=8.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| 2.0 | roachtest: restore/tpce/400GB/aws/nodes=4/cpus=8 failed - roachtest.restore/tpce/400GB/aws/nodes=4/cpus=8 [failed](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyAwsBazel/10797190?buildTab=log) with [artifacts](https://teamcity.cockroachdb.com/buildConfiguration/Cockroach_Nightlies_RoachtestNightlyAwsBazel/10797190?buildTab=artifacts#/restore/tpce/400GB/aws/nodes=4/cpus=8) on release-23.1 @ [fa2d7f7c9894d701ac4a393f058aa84552957087](https://github.com/cockroachdb/cockroach/commits/fa2d7f7c9894d701ac4a393f058aa84552957087):
```
(test_runner.go:1073).runTest: test timed out (1h0m0s)
(monitor.go:137).Wait: monitor failure: monitor task failed: output in run_080048.232311134_n1_cockroach-sql-insecu: ./cockroach sql --insecure -e "RESTORE FROM LATEST IN 's3://cockroach-fixtures-us-east-2/backups/tpc-e/customers=25000/v22.2.0/inc-count=48?AUTH=implicit' AS OF SYSTEM TIME '2022-12-21T02:00:00Z' " returned: COMMAND_PROBLEM: exit status 137
test artifacts and logs in: /artifacts/restore/tpce/400GB/aws/nodes=4/cpus=8/run_1
```
<p>Parameters: <code>ROACHTEST_arch=amd64</code>
, <code>ROACHTEST_cloud=aws</code>
, <code>ROACHTEST_cpu=8</code>
, <code>ROACHTEST_encrypted=false</code>
, <code>ROACHTEST_fs=ext4</code>
, <code>ROACHTEST_localSSD=false</code>
, <code>ROACHTEST_ssd=0</code>
</p>
<details><summary>Help</summary>
<p>
See: [roachtest README](https://github.com/cockroachdb/cockroach/blob/master/pkg/cmd/roachtest/README.md)
See: [How To Investigate \(internal\)](https://cockroachlabs.atlassian.net/l/c/SSSBr8c7)
</p>
</details>
/cc @cockroachdb/disaster-recovery
<sub>
[This test on roachdash](https://roachdash.crdb.dev/?filter=status:open%20t:.*restore/tpce/400GB/aws/nodes=4/cpus=8.*&sort=title+created&display=lastcommented+project) | [Improve this report!](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)
</sub>
| test | roachtest restore tpce aws nodes cpus failed roachtest restore tpce aws nodes cpus with on release test runner go runtest test timed out monitor go wait monitor failure monitor task failed output in run cockroach sql insecu cockroach sql insecure e restore from latest in cockroach fixtures us east backups tpc e customers inc count auth implicit as of system time returned command problem exit status test artifacts and logs in artifacts restore tpce aws nodes cpus run parameters roachtest arch roachtest cloud aws roachtest cpu roachtest encrypted false roachtest fs roachtest localssd false roachtest ssd help see see cc cockroachdb disaster recovery | 1 |
135,223 | 19,532,333,184 | IssuesEvent | 2021-12-30 19:34:28 | Esri/calcite-components | https://api.github.com/repos/Esri/calcite-components | opened | [calcite-loader] Drop `active` in favor of `hidden` | discussion design refactor 0 - new needs triage | ### Description <!--( What needs to change? )-->
Stems from https://github.com/Esri/calcite-components/pull/3778/files#r774706383
It seems that `calcite-loader[active]` is used to toggle visibility, so maybe we should deprecate/remove it in favor of the global `hidden` attribute.
https://codepen.io/jcfranco/pen/KKXQErV?editors=1000
### Proposed Advantages <!--(Why should the code change? )-->
One less custom prop to manage, and it aligns with existing HTML conventions.
### Relevant Info <!--(e.g. Dependencies, Blockers, Helpful Details)-->
N/A
#### Which Component
`calcite-loader`
| 1.0 | [calcite-loader] Drop `active` in favor of `hidden` - ### Description <!--( What needs to change? )-->
Stems from https://github.com/Esri/calcite-components/pull/3778/files#r774706383
It seems that `calcite-loader[active]` is used to toggle visibility, so maybe we should deprecate/remove it in favor of the global `hidden` attribute.
https://codepen.io/jcfranco/pen/KKXQErV?editors=1000
### Proposed Advantages <!--(Why should the code change? )-->
One less custom prop to manage, and it aligns with existing HTML conventions.
### Relevant Info <!--(e.g. Dependencies, Blockers, Helpful Details)-->
N/A
#### Which Component
`calcite-loader`
| non_test | drop active in favor of hidden description stems from it seems that calcite loader is used to toggle visibility so maybe we should deprecate remove it in favor of the global hidden attribute proposed advantages one less custom prop to manage and it aligns with existing html conventions relevant info n a which component calcite loader | 0 |
88,833 | 17,671,455,350 | IssuesEvent | 2021-08-23 06:51:02 | flutter/website | https://api.github.com/repos/flutter/website | closed | Codelab code not working | p2-medium e1-hours codelab-external | [Using a plugin with a Flutter web app](https://codelabs.developers.google.com/codelabs/web-url-launcher/index.html?index=..%2F..index#9)
The following **Codelab** code is not working
```
onPressed: () => launch(
'/#/privacypolicy', // <--------Not Working
enableJavaScript: true,
enableDomStorage: true,
),
```
After this change code is working fine
```
onPressed: () => launch(
'/privacy_policy.html', //<--------------Change Required
enableJavaScript: true,
enableDomStorage: true,
),
```
I cross-check these pieces of code multiple times and then I'm writing this issue.
If I'm missing anything please let me know.
Thanks In Advance.
| 1.0 | Codelab code not working - [Using a plugin with a Flutter web app](https://codelabs.developers.google.com/codelabs/web-url-launcher/index.html?index=..%2F..index#9)
The following **Codelab** code is not working
```
onPressed: () => launch(
'/#/privacypolicy', // <--------Not Working
enableJavaScript: true,
enableDomStorage: true,
),
```
After this change code is working fine
```
onPressed: () => launch(
'/privacy_policy.html', //<--------------Change Required
enableJavaScript: true,
enableDomStorage: true,
),
```
I cross-check these pieces of code multiple times and then I'm writing this issue.
If I'm missing anything please let me know.
Thanks In Advance.
| non_test | codelab code not working the following codelab code is not working onpressed launch privacypolicy not working enablejavascript true enabledomstorage true after this change code is working fine onpressed launch privacy policy html change required enablejavascript true enabledomstorage true i cross check these pieces of code multiple times and then i m writing this issue if i m missing anything please let me know thanks in advance | 0 |
88,535 | 8,154,232,492 | IssuesEvent | 2018-08-23 02:06:33 | OpenZeppelin/openzeppelin-solidity | https://api.github.com/repos/OpenZeppelin/openzeppelin-solidity | opened | Use expectEvent to test logs | area:tests good first issue kind:improvement | Most log tests directly access the log array: we should instead use the `expectEvent` helper to do that.
Example from the `Escrow` tests:
```
const { logs } = await this.escrow.deposit(payee1, { from: owner, value: amount });
expectEvent.inLogs(logs, 'Deposited', { payee: payee1 });
```
Part of #1091. | 1.0 | Use expectEvent to test logs - Most log tests directly access the log array: we should instead use the `expectEvent` helper to do that.
Example from the `Escrow` tests:
```
const { logs } = await this.escrow.deposit(payee1, { from: owner, value: amount });
expectEvent.inLogs(logs, 'Deposited', { payee: payee1 });
```
Part of #1091. | test | use expectevent to test logs most log tests directly access the log array we should instead use the expectevent helper to do that example from the escrow tests const logs await this escrow deposit from owner value amount expectevent inlogs logs deposited payee part of | 1 |
73,495 | 8,882,229,123 | IssuesEvent | 2019-01-14 12:36:27 | owncloud/ios-app | https://api.github.com/repos/owncloud/ios-app | opened | [UI] Offline mode status message overlaps with the latest item in the list | Design bug discussion p4-low | Offline mode status message hides belowest item in the file list

List can be pulled up to check the info (as workaround), but other design solutions can be discussed
| 1.0 | [UI] Offline mode status message overlaps with the latest item in the list - Offline mode status message hides belowest item in the file list

List can be pulled up to check the info (as workaround), but other design solutions can be discussed
| non_test | offline mode status message overlaps with the latest item in the list offline mode status message hides belowest item in the file list list can be pulled up to check the info as workaround but other design solutions can be discussed | 0 |
20,114 | 10,465,157,068 | IssuesEvent | 2019-09-21 08:24:39 | Orange-Management/Modules | https://api.github.com/repos/Orange-Management/Modules | opened | [ItemManagement] Re-structure item views | env_development exp_medium sev_low stat_backlog time_medium type_feature type_security | The item management tool currently creates links to the views in the sales and purchase category.
This is fine but it should maybe also create it's own category where it can create different view links:
* General [all information]
* Sales [only sales information = important to hide purchase information]
* Purchase [only purchase information = important to hide sales information]
* Warehouse [only warehouse information = important to hide most of the purchase/sales information]
Either this should be done or every view needs to handle read permissions for the different tabs and ui components which get shown and hidden based on user permission. | True | [ItemManagement] Re-structure item views - The item management tool currently creates links to the views in the sales and purchase category.
This is fine but it should maybe also create it's own category where it can create different view links:
* General [all information]
* Sales [only sales information = important to hide purchase information]
* Purchase [only purchase information = important to hide sales information]
* Warehouse [only warehouse information = important to hide most of the purchase/sales information]
Either this should be done or every view needs to handle read permissions for the different tabs and ui components which get shown and hidden based on user permission. | non_test | re structure item views the item management tool currently creates links to the views in the sales and purchase category this is fine but it should maybe also create it s own category where it can create different view links general sales purchase warehouse either this should be done or every view needs to handle read permissions for the different tabs and ui components which get shown and hidden based on user permission | 0 |
744,009 | 25,923,535,204 | IssuesEvent | 2022-12-16 00:56:21 | Polygon-Solutions/city-keep | https://api.github.com/repos/Polygon-Solutions/city-keep | opened | Code Abstraction | priority | Most recent commit: 80415c8d5db9446f64d54543d5517f5268c7b5a1
The app currently has a lot of code that should be abstracted (ie. the same code is found in multiple locations).
Ways the code can be abstracted:
- Using styleOverrides in the MUI theme
- Enveloping an MUI component in a custom component with modifications to the CSS (using sx prop or other)
- Storing variables that represent a consistent property across multiple components or elements (eg. storing the Accordion component padding) | 1.0 | Code Abstraction - Most recent commit: 80415c8d5db9446f64d54543d5517f5268c7b5a1
The app currently has a lot of code that should be abstracted (ie. the same code is found in multiple locations).
Ways the code can be abstracted:
- Using styleOverrides in the MUI theme
- Enveloping an MUI component in a custom component with modifications to the CSS (using sx prop or other)
- Storing variables that represent a consistent property across multiple components or elements (eg. storing the Accordion component padding) | non_test | code abstraction most recent commit the app currently has a lot of code that should be abstracted ie the same code is found in multiple locations ways the code can be abstracted using styleoverrides in the mui theme enveloping an mui component in a custom component with modifications to the css using sx prop or other storing variables that represent a consistent property across multiple components or elements eg storing the accordion component padding | 0 |
301,754 | 26,094,713,365 | IssuesEvent | 2022-12-26 17:19:29 | godotengine/godot | https://api.github.com/repos/godotengine/godot | closed | Audio category is selectable in project settings and display some parameters | topic:editor needs testing | ### Godot version
v4.0.beta1
### System information
Windows 10
### Issue description
In project settings dialog box, the "Audio" category is selectable and displays some parameter values.
Some of them don't have the same value than those displayed in sub-categories.
This issue doesn't happen for all other categories.
### Steps to reproduce
- Open the "Project Settings" dialog box.
- Activate the "Advanced Settings" view.
- Select the "Audio" category by clicking the label.
### Minimal reproduction project
_No response_ | 1.0 | Audio category is selectable in project settings and display some parameters - ### Godot version
v4.0.beta1
### System information
Windows 10
### Issue description
In project settings dialog box, the "Audio" category is selectable and displays some parameter values.
Some of them don't have the same value than those displayed in sub-categories.
This issue doesn't happen for all other categories.
### Steps to reproduce
- Open the "Project Settings" dialog box.
- Activate the "Advanced Settings" view.
- Select the "Audio" category by clicking the label.
### Minimal reproduction project
_No response_ | test | audio category is selectable in project settings and display some parameters godot version system information windows issue description in project settings dialog box the audio category is selectable and displays some parameter values some of them don t have the same value than those displayed in sub categories this issue doesn t happen for all other categories steps to reproduce open the project settings dialog box activate the advanced settings view select the audio category by clicking the label minimal reproduction project no response | 1 |
38,729 | 15,785,926,286 | IssuesEvent | 2021-04-01 17:00:29 | microsoft/vscode-cpptools | https://api.github.com/repos/microsoft/vscode-cpptools | opened | -std=c++ should be filtered out when doing -x c compiler querying | Feature: Configuration Language Service bug | Otherwise, it'll fail to get the system defines for .c files and it'll give a configuration warning.
The opposite issue occurs when -std=c args are used for -x c++.
| 1.0 | -std=c++ should be filtered out when doing -x c compiler querying - Otherwise, it'll fail to get the system defines for .c files and it'll give a configuration warning.
The opposite issue occurs when -std=c args are used for -x c++.
| non_test | std c should be filtered out when doing x c compiler querying otherwise it ll fail to get the system defines for c files and it ll give a configuration warning the opposite issue occurs when std c args are used for x c | 0 |
172,201 | 13,281,600,640 | IssuesEvent | 2020-08-23 18:09:42 | Thy-Vipe/BeastsOfBermuda-issues | https://api.github.com/repos/Thy-Vipe/BeastsOfBermuda-issues | closed | [Bug] Herbs behaving strange when put down | Gameplay bug public_testing | _Originally written by **Shain | 76561198142835886**_
Game Version: 1.1.985
*===== System Specs =====
CPU Brand: Intel(R) Core(TM) i5-4670K CPU @ 3.40GHz
Vendor: GenuineIntel
GPU Brand: AMD Radeon R7 250 Series
GPU Driver Info: Unknown
Num CPU Cores: 4
===================*
Context: **orycto public test (orycto)**
Map: Ancestral_Plains
When I pick up a flower and put it down on some ground surfaces they look like they "replant" themselves and become unable to be picked up oor eaten. When I put the plants down inside burrows or on rocks, they behave normally.
Location: X=109814.750 Y=-135443.719 Z=4390.231 | 1.0 | [Bug] Herbs behaving strange when put down - _Originally written by **Shain | 76561198142835886**_
Game Version: 1.1.985
*===== System Specs =====
CPU Brand: Intel(R) Core(TM) i5-4670K CPU @ 3.40GHz
Vendor: GenuineIntel
GPU Brand: AMD Radeon R7 250 Series
GPU Driver Info: Unknown
Num CPU Cores: 4
===================*
Context: **orycto public test (orycto)**
Map: Ancestral_Plains
When I pick up a flower and put it down on some ground surfaces they look like they "replant" themselves and become unable to be picked up oor eaten. When I put the plants down inside burrows or on rocks, they behave normally.
Location: X=109814.750 Y=-135443.719 Z=4390.231 | test | herbs behaving strange when put down originally written by shain game version system specs cpu brand intel r core tm cpu vendor genuineintel gpu brand amd radeon series gpu driver info unknown num cpu cores context orycto public test orycto map ancestral plains when i pick up a flower and put it down on some ground surfaces they look like they replant themselves and become unable to be picked up oor eaten when i put the plants down inside burrows or on rocks they behave normally location x y z | 1 |
273,626 | 29,831,044,891 | IssuesEvent | 2023-06-18 09:22:57 | RG4421/ampere-centos-kernel | https://api.github.com/repos/RG4421/ampere-centos-kernel | closed | CVE-2022-3643 (Medium) detected in linuxv5.2 - autoclosed | Mend: dependency security vulnerability | ## CVE-2022-3643 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/xen-netback/netback.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/xen-netback/netback.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/xen-netback/netback.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Guests can trigger NIC interface reset/abort/crash via netback It is possible for a guest to trigger a NIC interface reset/abort/crash in a Linux based network backend by sending certain kinds of packets. It appears to be an (unwritten?) assumption in the rest of the Linux network stack that packet protocol headers are all contained within the linear section of the SKB and some NICs behave badly if this is not the case. This has been reported to occur with Cisco (enic) and Broadcom NetXtrem II BCM5780 (bnx2x) though it may be an issue with other NICs/drivers as well. In case the frontend is sending requests with split headers, netback will forward those violating above mentioned assumption to the networking core, resulting in said misbehavior.
<p>Publish Date: 2022-12-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3643>CVE-2022-3643</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-3643">https://www.linuxkernelcves.com/cves/CVE-2022-3643</a></p>
<p>Release Date: 2022-12-07</p>
<p>Fix Resolution: v4.9.336,v4.14.302,v4.19.269,v5.4.227,v5.10.159,v5.15.83,v6.0.13,v6.1</p>
</p>
</details>
<p></p>
| True | CVE-2022-3643 (Medium) detected in linuxv5.2 - autoclosed - ## CVE-2022-3643 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxv5.2</b></p></summary>
<p>
<p>Linux kernel source tree</p>
<p>Library home page: <a href=https://github.com/torvalds/linux.git>https://github.com/torvalds/linux.git</a></p>
<p>Found in base branch: <b>amp-centos-8.0-kernel</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (3)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/xen-netback/netback.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/xen-netback/netback.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/drivers/net/xen-netback/netback.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png?' width=19 height=20> Vulnerability Details</summary>
<p>
Guests can trigger NIC interface reset/abort/crash via netback It is possible for a guest to trigger a NIC interface reset/abort/crash in a Linux based network backend by sending certain kinds of packets. It appears to be an (unwritten?) assumption in the rest of the Linux network stack that packet protocol headers are all contained within the linear section of the SKB and some NICs behave badly if this is not the case. This has been reported to occur with Cisco (enic) and Broadcom NetXtrem II BCM5780 (bnx2x) though it may be an issue with other NICs/drivers as well. In case the frontend is sending requests with split headers, netback will forward those violating above mentioned assumption to the networking core, resulting in said misbehavior.
<p>Publish Date: 2022-12-07
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3643>CVE-2022-3643</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://www.linuxkernelcves.com/cves/CVE-2022-3643">https://www.linuxkernelcves.com/cves/CVE-2022-3643</a></p>
<p>Release Date: 2022-12-07</p>
<p>Fix Resolution: v4.9.336,v4.14.302,v4.19.269,v5.4.227,v5.10.159,v5.15.83,v6.0.13,v6.1</p>
</p>
</details>
<p></p>
| non_test | cve medium detected in autoclosed cve medium severity vulnerability vulnerable library linux kernel source tree library home page a href found in base branch amp centos kernel vulnerable source files drivers net xen netback netback c drivers net xen netback netback c drivers net xen netback netback c vulnerability details guests can trigger nic interface reset abort crash via netback it is possible for a guest to trigger a nic interface reset abort crash in a linux based network backend by sending certain kinds of packets it appears to be an unwritten assumption in the rest of the linux network stack that packet protocol headers are all contained within the linear section of the skb and some nics behave badly if this is not the case this has been reported to occur with cisco enic and broadcom netxtrem ii though it may be an issue with other nics drivers as well in case the frontend is sending requests with split headers netback will forward those violating above mentioned assumption to the networking core resulting in said misbehavior publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope changed impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution | 0 |
351,526 | 32,007,664,594 | IssuesEvent | 2023-09-21 15:47:01 | Quadratic-Labs/VerkleTries_Besu | https://api.github.com/repos/Quadratic-Labs/VerkleTries_Besu | opened | Tests commitments against another lib | good first issue tests | LibIpaMultiPoint nativelib and JNI wrapper computes commitments for 32-le-bytes values.
We need to generate test cases that concorde with results from another implementation such as rust-verkle one: https://github.com/crate-crypto/rust-verkle/tree/master | 1.0 | Tests commitments against another lib - LibIpaMultiPoint nativelib and JNI wrapper computes commitments for 32-le-bytes values.
We need to generate test cases that concorde with results from another implementation such as rust-verkle one: https://github.com/crate-crypto/rust-verkle/tree/master | test | tests commitments against another lib libipamultipoint nativelib and jni wrapper computes commitments for le bytes values we need to generate test cases that concorde with results from another implementation such as rust verkle one | 1 |
151,529 | 12,042,176,144 | IssuesEvent | 2020-04-14 10:05:51 | hibernate/hibernate-rx | https://api.github.com/repos/hibernate/hibernate-rx | closed | Test IdClass | good first issue testing | We need to add tests for when @IdClass is used. In particular, there is a line of code that we converted from ORM that I'm not sure it's correct: https://github.com/hibernate/hibernate-rx/blob/master/hibernate-rx-core/src/main/java/org/hibernate/rx/event/impl/DefaultRxLoadEventListener.java#L141
The mapping to test this should be (but I haven't tried it yet) something like:
```
@Entity
@IdClass(AnEntityId)
public class AnEntity {
@Id
@OneToOne
private OtherEntity otherEntity;
}
public class AnEntityId implements Serializable {
private int otherEntityId;
// getter/setter
}
@Entity
public class OtherEntity {
@Id
private int id;
}
```
and/or
```
@Entity
public class AnEntity {
@Id
@OneToOne
private OtherEntity otherEntity;
}
@Entity
public class OtherEntity {
@Id
private int id;
}
``` | 1.0 | Test IdClass - We need to add tests for when @IdClass is used. In particular, there is a line of code that we converted from ORM that I'm not sure it's correct: https://github.com/hibernate/hibernate-rx/blob/master/hibernate-rx-core/src/main/java/org/hibernate/rx/event/impl/DefaultRxLoadEventListener.java#L141
The mapping to test this should be (but I haven't tried it yet) something like:
```
@Entity
@IdClass(AnEntityId)
public class AnEntity {
@Id
@OneToOne
private OtherEntity otherEntity;
}
public class AnEntityId implements Serializable {
private int otherEntityId;
// getter/setter
}
@Entity
public class OtherEntity {
@Id
private int id;
}
```
and/or
```
@Entity
public class AnEntity {
@Id
@OneToOne
private OtherEntity otherEntity;
}
@Entity
public class OtherEntity {
@Id
private int id;
}
``` | test | test idclass we need to add tests for when idclass is used in particular there is a line of code that we converted from orm that i m not sure it s correct the mapping to test this should be but i haven t tried it yet something like entity idclass anentityid public class anentity id onetoone private otherentity otherentity public class anentityid implements serializable private int otherentityid getter setter entity public class otherentity id private int id and or entity public class anentity id onetoone private otherentity otherentity entity public class otherentity id private int id | 1 |
53,473 | 22,821,023,327 | IssuesEvent | 2022-07-12 02:19:28 | Azure/AppConfiguration | https://api.github.com/repos/Azure/AppConfiguration | closed | Locking a feature flag / key leads to invalid state | bug service | To reproduce:
- Go to Feature manager
- Right click a feature / key -> Lock
- Everything seems fine (Success message appears)
- Refresh page
- You an error "Some of your feature flags are not valid. Please navigate to Advanced Edit for more details."
The name + id of the flag are missing in the list.
If you open it in Advanced edit you see random (hashed?) characters like
"BGNCL2sUEmbC8Uxx [...] rtB66" | 1.0 | Locking a feature flag / key leads to invalid state - To reproduce:
- Go to Feature manager
- Right click a feature / key -> Lock
- Everything seems fine (Success message appears)
- Refresh page
- You an error "Some of your feature flags are not valid. Please navigate to Advanced Edit for more details."
The name + id of the flag are missing in the list.
If you open it in Advanced edit you see random (hashed?) characters like
"BGNCL2sUEmbC8Uxx [...] rtB66" | non_test | locking a feature flag key leads to invalid state to reproduce go to feature manager right click a feature key lock everything seems fine success message appears refresh page you an error some of your feature flags are not valid please navigate to advanced edit for more details the name id of the flag are missing in the list if you open it in advanced edit you see random hashed characters like | 0 |
292,841 | 25,244,349,602 | IssuesEvent | 2022-11-15 09:59:23 | elastic/kibana | https://api.github.com/repos/elastic/kibana | closed | Failing test: Jest Integration Tests.src/core/server/integration_tests/saved_objects/migrations/actions - migration actions initAction resolves right record with found indices | Team:Core failed-test | A test failed on a tracked branch
```
Error: thrown: "Exceeded timeout of 280000 ms for a hook.
Use jest.setTimeout(newTimeout) to increase the timeout value, if this is a long-running test."
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/src/core/server/integration_tests/saved_objects/migrations/actions/actions.test.ts:62:3
at _dispatchDescribe (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/index.js:98:26)
at describe (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/index.js:60:5)
at Object.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/src/core/server/integration_tests/saved_objects/migrations/actions/actions.test.ts:59:1)
at Runtime._execModule (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runtime/build/index.js:1646:24)
at Runtime._loadModule (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runtime/build/index.js:1185:12)
at Runtime.requireModule (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runtime/build/index.js:1009:12)
at jestAdapter (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapter.js:79:13)
at runTestInternal (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:389:16)
at runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:475:34)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/23497#01847704-a6ed-451b-a0ae-f94694f8b3b3)
<!-- kibanaCiData = {"failed-test":{"test.class":"Jest Integration Tests.src/core/server/integration_tests/saved_objects/migrations/actions","test.name":"migration actions initAction resolves right record with found indices","test.failCount":1}} --> | 1.0 | Failing test: Jest Integration Tests.src/core/server/integration_tests/saved_objects/migrations/actions - migration actions initAction resolves right record with found indices - A test failed on a tracked branch
```
Error: thrown: "Exceeded timeout of 280000 ms for a hook.
Use jest.setTimeout(newTimeout) to increase the timeout value, if this is a long-running test."
at /var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/src/core/server/integration_tests/saved_objects/migrations/actions/actions.test.ts:62:3
at _dispatchDescribe (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/index.js:98:26)
at describe (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/index.js:60:5)
at Object.<anonymous> (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/src/core/server/integration_tests/saved_objects/migrations/actions/actions.test.ts:59:1)
at Runtime._execModule (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runtime/build/index.js:1646:24)
at Runtime._loadModule (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runtime/build/index.js:1185:12)
at Runtime.requireModule (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runtime/build/index.js:1009:12)
at jestAdapter (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-circus/build/legacy-code-todo-rewrite/jestAdapter.js:79:13)
at runTestInternal (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:389:16)
at runTest (/var/lib/buildkite-agent/builds/kb-n2-4-spot-8e651ab3d6147ec4/elastic/kibana-on-merge/kibana/node_modules/jest-runner/build/runTest.js:475:34)
```
First failure: [CI Build - main](https://buildkite.com/elastic/kibana-on-merge/builds/23497#01847704-a6ed-451b-a0ae-f94694f8b3b3)
<!-- kibanaCiData = {"failed-test":{"test.class":"Jest Integration Tests.src/core/server/integration_tests/saved_objects/migrations/actions","test.name":"migration actions initAction resolves right record with found indices","test.failCount":1}} --> | test | failing test jest integration tests src core server integration tests saved objects migrations actions migration actions initaction resolves right record with found indices a test failed on a tracked branch error thrown exceeded timeout of ms for a hook use jest settimeout newtimeout to increase the timeout value if this is a long running test at var lib buildkite agent builds kb spot elastic kibana on merge kibana src core server integration tests saved objects migrations actions actions test ts at dispatchdescribe var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build index js at describe var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build index js at object var lib buildkite agent builds kb spot elastic kibana on merge kibana src core server integration tests saved objects migrations actions actions test ts at runtime execmodule var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runtime build index js at runtime loadmodule var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runtime build index js at runtime requiremodule var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runtime build index js at jestadapter var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest circus build legacy code todo rewrite jestadapter js at runtestinternal var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runner build runtest js at runtest var lib buildkite agent builds kb spot elastic kibana on merge kibana node modules jest runner build runtest js first failure | 1 |
603,266 | 18,536,377,459 | IssuesEvent | 2021-10-21 11:59:50 | Megabit/Blazorise | https://api.github.com/repos/Megabit/Blazorise | closed | [Feature] Full featured work with DataTable in DataGrid | Type: Feature Request ⛱ Priority: Low ♻ | **Is your feature request related to a problem? Please describe.**
`DataGrid `is very useful and simple to use when working with strongly typed collection of objects. Sometimes (e.g. user configurable reports) is useful to use a simple (old?) [DataTable ](https://docs.microsoft.com/en-us/dotnet/api/system.data.datatable?view=netcore-3.1) or [DataView](https://docs.microsoft.com/en-us/dotnet/api/system.data.dataview?view=netcore-3.1).
Both can be used in `Data` property of `DataGrid`. But through Field is possible to access `DataRow` properties not the `DataTable` columns. When I use a column name I get:
System.ArgumentException: 'col1' is not a member of type 'System.Data.DataRowView' (Parameter 'propertyOrFieldName')
at System.Linq.Expressions.Expression.PropertyOrField(Expression expression, String propertyOrFieldName)
at Blazorise.DataGrid.Utils.FunctionCompiler.GetSafeField(Expression item, String fieldName)
at Blazorise.DataGrid.Utils.FunctionCompiler.CreateValueGetter[TItem](String fieldName)
**Describe the solution you'd like**
The solution would be that for `DataRowView` or `DataRow` objects the component would use `Item[Field]` instead of `Item.Field`. My estimation from quick look to [FunctionCompiler.cs](https://github.com/stsrki/Blazorise/blob/master/Source/Extensions/Blazorise.DataGrid/Utils/FunctionCompiler.cs) is that simple decision by type in `CreateValueGetter` could do the job. WPF or WebForms `GridView` probably do similar decision because they are both able to use collection as well as `DataTable`.
### Additional context
DataTable has a collection of `DataRow`s which can be accessed strongly typed through extension method `.AsEnumerable()`. `DataView` is a collection of DataRowView but is necessary to use `.Cast<DataRowView>`
Code Sample (a dummy property is used so no sorting or filtering is available):
<DataGrid Data="dataView.Cast<DataRowView>()" TItem="DataRowView" IsBordered="true"
IsHoverable="true" ShowPager="true">
<DataGridColumn TItem="DataRowView" Caption="col1" Field="@nameof(DataRowView.RowVersion)" Context="item">
<DisplayTemplate>
@($"{item["col1"]}")
</DisplayTemplate>
</DataGridColumn>
<DataGridColumn TItem="DataRowView" Caption="col2" Context="item" Field="@nameof(DataRowView.RowVersion)">
<DisplayTemplate>
@($"{item["col2"]}")
</DisplayTemplate>
</DataGridColumn>
</DataGrid>
@code {
DataTable myTable;
DataView dataView;
protected override void OnInitialized()
{
base.OnInitialized();
myTable = new DataTable();
myTable.Columns.Add("col1", typeof(string));
myTable.Columns.Add("col2", typeof(string));
myTable.Rows.Add("a1", "bb");
myTable.Rows.Add("a2", "cc");
dataView = myTable.DefaultView;
}
}
And I Imagine that the Grid markup will be
<DataGrid Data="dataView.Cast<DataRowView>()" TItem="DataRowView" IsBordered="true"
IsHoverable="true" ShowPager="true">
<DataGridColumn TItem="DataRowView" Caption="col1" Field="col1"/>
<DataGridColumn TItem="DataRowView" Caption="col2" Context="item" Field="col2" />
</DataGrid>
| 1.0 | [Feature] Full featured work with DataTable in DataGrid - **Is your feature request related to a problem? Please describe.**
`DataGrid `is very useful and simple to use when working with strongly typed collection of objects. Sometimes (e.g. user configurable reports) is useful to use a simple (old?) [DataTable ](https://docs.microsoft.com/en-us/dotnet/api/system.data.datatable?view=netcore-3.1) or [DataView](https://docs.microsoft.com/en-us/dotnet/api/system.data.dataview?view=netcore-3.1).
Both can be used in `Data` property of `DataGrid`. But through Field is possible to access `DataRow` properties not the `DataTable` columns. When I use a column name I get:
System.ArgumentException: 'col1' is not a member of type 'System.Data.DataRowView' (Parameter 'propertyOrFieldName')
at System.Linq.Expressions.Expression.PropertyOrField(Expression expression, String propertyOrFieldName)
at Blazorise.DataGrid.Utils.FunctionCompiler.GetSafeField(Expression item, String fieldName)
at Blazorise.DataGrid.Utils.FunctionCompiler.CreateValueGetter[TItem](String fieldName)
**Describe the solution you'd like**
The solution would be that for `DataRowView` or `DataRow` objects the component would use `Item[Field]` instead of `Item.Field`. My estimation from quick look to [FunctionCompiler.cs](https://github.com/stsrki/Blazorise/blob/master/Source/Extensions/Blazorise.DataGrid/Utils/FunctionCompiler.cs) is that simple decision by type in `CreateValueGetter` could do the job. WPF or WebForms `GridView` probably do similar decision because they are both able to use collection as well as `DataTable`.
### Additional context
DataTable has a collection of `DataRow`s which can be accessed strongly typed through extension method `.AsEnumerable()`. `DataView` is a collection of DataRowView but is necessary to use `.Cast<DataRowView>`
Code Sample (a dummy property is used so no sorting or filtering is available):
<DataGrid Data="dataView.Cast<DataRowView>()" TItem="DataRowView" IsBordered="true"
IsHoverable="true" ShowPager="true">
<DataGridColumn TItem="DataRowView" Caption="col1" Field="@nameof(DataRowView.RowVersion)" Context="item">
<DisplayTemplate>
@($"{item["col1"]}")
</DisplayTemplate>
</DataGridColumn>
<DataGridColumn TItem="DataRowView" Caption="col2" Context="item" Field="@nameof(DataRowView.RowVersion)">
<DisplayTemplate>
@($"{item["col2"]}")
</DisplayTemplate>
</DataGridColumn>
</DataGrid>
@code {
DataTable myTable;
DataView dataView;
protected override void OnInitialized()
{
base.OnInitialized();
myTable = new DataTable();
myTable.Columns.Add("col1", typeof(string));
myTable.Columns.Add("col2", typeof(string));
myTable.Rows.Add("a1", "bb");
myTable.Rows.Add("a2", "cc");
dataView = myTable.DefaultView;
}
}
And I Imagine that the Grid markup will be
<DataGrid Data="dataView.Cast<DataRowView>()" TItem="DataRowView" IsBordered="true"
IsHoverable="true" ShowPager="true">
<DataGridColumn TItem="DataRowView" Caption="col1" Field="col1"/>
<DataGridColumn TItem="DataRowView" Caption="col2" Context="item" Field="col2" />
</DataGrid>
| non_test | full featured work with datatable in datagrid is your feature request related to a problem please describe datagrid is very useful and simple to use when working with strongly typed collection of objects sometimes e g user configurable reports is useful to use a simple old or both can be used in data property of datagrid but through field is possible to access datarow properties not the datatable columns when i use a column name i get system argumentexception is not a member of type system data datarowview parameter propertyorfieldname at system linq expressions expression propertyorfield expression expression string propertyorfieldname at blazorise datagrid utils functioncompiler getsafefield expression item string fieldname at blazorise datagrid utils functioncompiler createvaluegetter string fieldname describe the solution you d like the solution would be that for datarowview or datarow objects the component would use item instead of item field my estimation from quick look to is that simple decision by type in createvaluegetter could do the job wpf or webforms gridview probably do similar decision because they are both able to use collection as well as datatable additional context datatable has a collection of datarow s which can be accessed strongly typed through extension method asenumerable dataview is a collection of datarowview but is necessary to use cast code sample a dummy property is used so no sorting or filtering is available titem datarowview isbordered true ishoverable true showpager true item item code datatable mytable dataview dataview protected override void oninitialized base oninitialized mytable new datatable mytable columns add typeof string mytable columns add typeof string mytable rows add bb mytable rows add cc dataview mytable defaultview and i imagine that the grid markup will be titem datarowview isbordered true ishoverable true showpager true | 0 |
48,443 | 2,998,170,998 | IssuesEvent | 2015-07-23 12:41:12 | jayway/powermock | https://api.github.com/repos/jayway/powermock | closed | Add a PrepareEverythingForTest annotation | enhancement imported Milestone-Release1.0 Priority-High | _From [johan.ha...@gmail.com](https://code.google.com/u/105676376875942041029/) on November 07, 2008 21:08:05_
to modify all classes.
_Original issue: http://code.google.com/p/powermock/issues/detail?id=66_ | 1.0 | Add a PrepareEverythingForTest annotation - _From [johan.ha...@gmail.com](https://code.google.com/u/105676376875942041029/) on November 07, 2008 21:08:05_
to modify all classes.
_Original issue: http://code.google.com/p/powermock/issues/detail?id=66_ | non_test | add a prepareeverythingfortest annotation from on november to modify all classes original issue | 0 |
120,962 | 17,644,542,255 | IssuesEvent | 2021-08-20 02:43:17 | Killy85/game_ai_trainer | https://api.github.com/repos/Killy85/game_ai_trainer | opened | CVE-2021-37659 (High) detected in tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl | security vulnerability | ## CVE-2021-37659 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
TensorFlow is an end-to-end open source platform for machine learning. In affected versions an attacker can cause undefined behavior via binding a reference to null pointer in all binary cwise operations that don't require broadcasting (e.g., gradients of binary cwise operations). The [implementation](https://github.com/tensorflow/tensorflow/blob/84d053187cb80d975ef2b9684d4b61981bca0c41/tensorflow/core/kernels/cwise_ops_common.h#L264) assumes that the two inputs have exactly the same number of elements but does not check that. Hence, when the eigen functor executes it triggers heap OOB reads and undefined behavior due to binding to nullptr. We have patched the issue in GitHub commit 93f428fd1768df147171ed674fee1fc5ab8309ec. The fix will be included in TensorFlow 2.6.0. We will also cherrypick this commit on TensorFlow 2.5.1, TensorFlow 2.4.3, and TensorFlow 2.3.4, as these are also affected and still in supported range.
<p>Publish Date: 2021-08-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37659>CVE-2021-37659</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-q3g3-h9r4-prrc">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-q3g3-h9r4-prrc</a></p>
<p>Release Date: 2021-08-12</p>
<p>Fix Resolution: tensorflow - 2.3.4, 2.4.3, 2.5.1, 2.6.0, tensorflow-cpu - 2.3.4, 2.4.3, 2.5.1, 2.6.0, tensorflow-gpu - 2.3.4, 2.4.3, 2.5.1, 2.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2021-37659 (High) detected in tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl - ## CVE-2021-37659 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</b></p></summary>
<p>TensorFlow is an open source machine learning framework for everyone.</p>
<p>Library home page: <a href="https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl">https://files.pythonhosted.org/packages/d2/ea/ab2c8c0e81bd051cc1180b104c75a865ab0fc66c89be992c4b20bbf6d624/tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl</a></p>
<p>
Dependency Hierarchy:
- :x: **tensorflow-1.13.1-cp27-cp27mu-manylinux1_x86_64.whl** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
TensorFlow is an end-to-end open source platform for machine learning. In affected versions an attacker can cause undefined behavior via binding a reference to null pointer in all binary cwise operations that don't require broadcasting (e.g., gradients of binary cwise operations). The [implementation](https://github.com/tensorflow/tensorflow/blob/84d053187cb80d975ef2b9684d4b61981bca0c41/tensorflow/core/kernels/cwise_ops_common.h#L264) assumes that the two inputs have exactly the same number of elements but does not check that. Hence, when the eigen functor executes it triggers heap OOB reads and undefined behavior due to binding to nullptr. We have patched the issue in GitHub commit 93f428fd1768df147171ed674fee1fc5ab8309ec. The fix will be included in TensorFlow 2.6.0. We will also cherrypick this commit on TensorFlow 2.5.1, TensorFlow 2.4.3, and TensorFlow 2.3.4, as these are also affected and still in supported range.
<p>Publish Date: 2021-08-12
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-37659>CVE-2021-37659</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/tensorflow/tensorflow/security/advisories/GHSA-q3g3-h9r4-prrc">https://github.com/tensorflow/tensorflow/security/advisories/GHSA-q3g3-h9r4-prrc</a></p>
<p>Release Date: 2021-08-12</p>
<p>Fix Resolution: tensorflow - 2.3.4, 2.4.3, 2.5.1, 2.6.0, tensorflow-cpu - 2.3.4, 2.4.3, 2.5.1, 2.6.0, tensorflow-gpu - 2.3.4, 2.4.3, 2.5.1, 2.6.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve high detected in tensorflow whl cve high severity vulnerability vulnerable library tensorflow whl tensorflow is an open source machine learning framework for everyone library home page a href dependency hierarchy x tensorflow whl vulnerable library vulnerability details tensorflow is an end to end open source platform for machine learning in affected versions an attacker can cause undefined behavior via binding a reference to null pointer in all binary cwise operations that don t require broadcasting e g gradients of binary cwise operations the assumes that the two inputs have exactly the same number of elements but does not check that hence when the eigen functor executes it triggers heap oob reads and undefined behavior due to binding to nullptr we have patched the issue in github commit the fix will be included in tensorflow we will also cherrypick this commit on tensorflow tensorflow and tensorflow as these are also affected and still in supported range publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact low availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution tensorflow tensorflow cpu tensorflow gpu step up your open source security game with whitesource | 0 |
233,989 | 25,793,362,124 | IssuesEvent | 2022-12-10 09:32:13 | turkdevops/karma-jasmine | https://api.github.com/repos/turkdevops/karma-jasmine | closed | CVE-2022-37602 (High) detected in grunt-karma-4.0.0.tgz - autoclosed | security vulnerability | ## CVE-2022-37602 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>grunt-karma-4.0.0.tgz</b></p></summary>
<p>grunt plugin for karma test runner</p>
<p>Library home page: <a href="https://registry.npmjs.org/grunt-karma/-/grunt-karma-4.0.0.tgz">https://registry.npmjs.org/grunt-karma/-/grunt-karma-4.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/grunt-karma/package.json</p>
<p>
Dependency Hierarchy:
- :x: **grunt-karma-4.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/karma-jasmine/commit/c7d3ef39e2adf00f4ceec1ffae2d36ce56c19cc0">c7d3ef39e2adf00f4ceec1ffae2d36ce56c19cc0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in karma-runner grunt-karma 4.0.1 via the key variable in grunt-karma.js.
<p>Publish Date: 2022-10-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-37602>CVE-2022-37602</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | True | CVE-2022-37602 (High) detected in grunt-karma-4.0.0.tgz - autoclosed - ## CVE-2022-37602 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>grunt-karma-4.0.0.tgz</b></p></summary>
<p>grunt plugin for karma test runner</p>
<p>Library home page: <a href="https://registry.npmjs.org/grunt-karma/-/grunt-karma-4.0.0.tgz">https://registry.npmjs.org/grunt-karma/-/grunt-karma-4.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/grunt-karma/package.json</p>
<p>
Dependency Hierarchy:
- :x: **grunt-karma-4.0.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/turkdevops/karma-jasmine/commit/c7d3ef39e2adf00f4ceec1ffae2d36ce56c19cc0">c7d3ef39e2adf00f4ceec1ffae2d36ce56c19cc0</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in karma-runner grunt-karma 4.0.1 via the key variable in grunt-karma.js.
<p>Publish Date: 2022-10-14
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-37602>CVE-2022-37602</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github) | non_test | cve high detected in grunt karma tgz autoclosed cve high severity vulnerability vulnerable library grunt karma tgz grunt plugin for karma test runner library home page a href path to dependency file package json path to vulnerable library node modules grunt karma package json dependency hierarchy x grunt karma tgz vulnerable library found in head commit a href found in base branch master vulnerability details prototype pollution vulnerability in karma runner grunt karma via the key variable in grunt karma js publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href step up your open source security game with mend | 0 |
449,437 | 31,844,582,879 | IssuesEvent | 2023-09-14 18:52:44 | Julybui199x/Julubui9x | https://api.github.com/repos/Julybui199x/Julubui9x | reopened | Bvphuoc | bug documentation duplicate enhancement help wanted good first issue invalid question wontfix |
09-12 23:32:34.348 5176 5314 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:34.328 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: ViewPostIme pointer 1
09-12 23:32:34.219 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: ViewPostIme pointer 0
09-12 23:32:34.085 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:33.731 5176 5314 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:33.536 5176 5176 I MSHandlerLifeCycle: removeMultiSplitHandler: no exist. decor=DecorView@3210dde[OverviewActivity]
09-12 23:32:33.534 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: WindowStopped on org.walleth/org.walleth.overview.OverviewActivity set to true
09-12 23:32:33.534 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: stopped(true) old = false
09-12 23:32:33.533 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)8 dur=3 res=0x2 s={false 0x0} ch=true seqId=0
09-12 23:32:33.529 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: performTraversals mFirst=false windowShouldResize=false viewVisibilityChanged=true mForceNextWindowRelayout=false params=null
09-12 23:32:33.528 5176 5218 D OpenGLRenderer: destroyEglSurface
09-12 23:32:33.528 5176 5218 D OpenGLRenderer: setSurface() destroyed EGLSurface
09-12 23:32:33.528 5176 5218 D OpenGLRenderer: setSurface called with nullptr
09-12 23:32:33.511 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: handleAppVisibility mAppVisible = true visible = false
09-12 23:32:33.370 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)0 dur=8 res=0x0 s={true 0xb40000744e780290} ch=false seqId=0
09-12 23:32:33.370 5176 5176 I BLASTBufferQueue: update, w= 1080 h= 2400 mName = ViewRootImpl@7e61af[OverviewActivity] mNativeObject= 0xb4000073be74d270 sc.mNativeObject= 0xb40000734e75a7a0 format= -1 caller= android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2898 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301
09-12 23:32:33.369 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: updateBlastSurfaceIfNeeded mBlastBufferQueue=0xb4000073be74d270 isSameSurfaceControl=true
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: sfl=100000}
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fitSides= naviIconColor=0
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: bhv=DEFAULT
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: pfl=12020040
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fl=81812100
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: performTraversals mFirst=false windowShouldResize=false viewVisibilityChanged=false mForceNextWindowRelayout=false params={(0,0)(fillxfill) sim={adjust=pan} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:33.160 5176 5176 I InputMethodManager: startInputInner - mService.startInputOrWindowGainedFocus
09-12 23:32:33.160 5176 5176 D InputMethodManager: startInputInner - Id : 0
09-12 23:32:33.160 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: mThreadedRenderer.initializeIfNeeded()#2 mSurface={isValid=true 0xb40000744e77be70}
09-12 23:32:33.160 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: MSG_WINDOW_FOCUS_CHANGED 1 0
09-12 23:32:33.077 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: MSG_WINDOW_FOCUS_CHANGED 0 0
09-12 23:32:33.056 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: reportDrawFinished seqId=0 mSyncId=-1 fn=1 mSurfaceChangedTransaction=0xb40000737e758010
09-12 23:32:33.056 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: setupSync seqId=0 mSyncId=0 fn=1 caller=android.view.ViewRootImpl$$ExternalSyntheticLambda11.accept:6 android.window.SurfaceSyncer.lambda$setupSync$1$android-window-SurfaceSyncer:128 android.window.SurfaceSyncer$$ExternalSyntheticLambda1.accept:8 android.window.SurfaceSyncer$SyncSet.checkIfSyncIsComplete:382 android.window.SurfaceSyncer$SyncSet.markSyncReady:359 android.window.SurfaceSyncer.markSyncReady:151 android.view.ViewRootImpl.performTraversals:4503
09-12 23:32:33.056 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: onSyncComplete
09-12 23:32:33.056 5176 5218 W Parcel : Expecting binder but got null!
09-12 23:32:33.055 5176 5218 D OpenGLRenderer: CFMS:: SetUp Pid : 5176 Tid : 5218
09-12 23:32:33.054 5176 5218 I ViewRootImpl@c29d430[DebugWallethActivity]: Received frameCommittedCallback lastAttemptedDrawFrameNum=1 didProduceBuffer=true
09-12 23:32:33.054 5176 5218 I BLASTBufferQueue: [ViewRootImpl@c29d430[DebugWallethActivity]#2](f:0,a:0) onFrameAvailable the first frame is available
09-12 23:32:33.039 5176 5273 I ViewRootImpl@c29d430[DebugWallethActivity]: Setting up sync and frameCommitCallback
09-12 23:32:33.039 5176 5273 I ViewRootImpl@c29d430[DebugWallethActivity]: Received frameDrawingCallback syncResult=0 frameNum=1.
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: registerCallbacksForSync syncBuffer=false
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: Setting syncFrameCallback
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: Setup new sync id=0
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: reportNextDraw android.view.ViewRootImpl.performTraversals:4438 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301 android.view.Choreographer$CallbackRecord.run:1309
09-12 23:32:33.035 5176 5176 D ScrollView: onsize change changed
09-12 23:32:33.034 5176 5218 D OpenGLRenderer: eglCreateWindowSurface
09-12 23:32:33.034 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: mThreadedRenderer.initialize() mSurface={isValid=true 0xb40000744e77be70} hwInitialized=true
09-12 23:32:33.034 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)0 dur=10 res=0x3 s={true 0xb40000744e77be70} ch=true seqId=0
09-12 23:32:33.034 5176 5176 I BLASTBufferQueue: update, w= 1080 h= 2400 mName = ViewRootImpl@c29d430[DebugWallethActivity] mNativeObject= 0xb4000073be75dc10 sc.mNativeObject= 0xb40000734e76d830 format= -1 caller= android.graphics.BLASTBufferQueue.<init>:84 android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2909 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885
09-12 23:32:33.033 5176 5176 I BLASTBufferQueue: new BLASTBufferQueue, mName= ViewRootImpl@c29d430[DebugWallethActivity] mNativeObject= 0xb4000073be75dc10 sc.mNativeObject= 0xb40000734e76d830 caller= android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2909 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301 android.view.Choreographer$CallbackRecord.run:1309 android.view.Choreographer.doCallbacks:923 android.view.Choreographer.doFrame:852 android.view.Choreographer$FrameDisplayEventReceiver.run:1283
09-12 23:32:33.033 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: updateBlastSurfaceIfNeeded mBlastBufferQueue=null isSameSurfaceControl=false
09-12 23:32:33.033 5176 5176 D InsetsController: onStateChanged: InsetsState: {mDisplayFrame=Rect(0, 0 - 1080, 2400), mDisplayCutout=DisplayCutout{insets=Rect(0, 80 - 0, 0) waterfall=Insets{left=0, top=0, right=0, bottom=0} boundingRect={Bounds=[Rect(0, 0 - 0, 0), Rect(468, 0 - 612, 80), Rect(0, 0 - 0, 0), Rect(0, 0 - 0, 0)]} cutoutPathParserInfo={CutoutPathParserInfo{displayWidth=1080 displayHeight=2400 physicalDisplayWidth=1080 physicalDisplayHeight=2400 density={2.8125} cutoutSpec={M 0,0 H -25.6 V 28.44444444444444 H 25.6 V 0 H 0 Z @dp} rotation={0} scale={1.0} physicalPixelDisplaySizeRatio={1.0}}}}, mRoundedCorners=RoundedCorners{[RoundedCorner{position=TopLeft, radius=90, center=Point(90, 90)}, RoundedCorner{position=TopRight, radius=90, center=Point(990, 90)}, RoundedCorner{position=BottomRight, radius=90, center=Point(990, 2310)}, RoundedCorner{position=BottomLeft, radius=90, center=Point(90, 2310)}]} mRoundedCornerFrame=Rect(0, 0 - 1080, 2400), mPrivacyIndicatorBounds=PrivacyIndicatorBounds {static bounds=Rect(956, 0 - 1080, 80) rotation=0}, mSources= { InsetsSource: {mType=ITYPE_STATUS_BAR, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_NAVIGATION_BAR, mFrame=[0,2358][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_GESTURES, mFrame=[0,0][84,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_GESTURES, mFrame=[996,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_MANDATORY_GESTURES, mFrame=[0,0][1080,114], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_MANDATORY_GESTURES, mFrame=[0,2310][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_DISPLAY_CUTOUT, mFrame=[0,0][-100000,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_DISPLAY_CUTOUT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_DISPLAY_CUTOUT, mFrame=[100000,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_DISPLAY_CUTOUT, mFrame=[0,100000][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_TAPPABLE_ELEMENT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_TAPPABLE_ELEMENT, mFrame=[0,0][0,0], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_IME, mFrame=[0,0][0,0], mVisibleFrame=[0,1392][1080,2400], mVisible=false, mInsetsRoundedCornerFrame=false} } host=org.walleth/org.walleth.debug.DebugWallethActivity from=android.view.ViewRootImpl.relayoutWindow:9802
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: sfl=100000}
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fitSides= naviIconColor=0
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: bhv=DEFAULT
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: pfl=12020040
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fl=81812100
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: performTraversals mFirst=true windowShouldResize=true viewVisibilityChanged=false mForceNextWindowRelayout=false params={(0,0)(fillxfill) sim={adjust=resize forwardNavigation} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: sfl=100000}
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fitSides= naviIconColor=0
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: bhv=DEFAULT
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: pfl=12020040
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fl=81812100
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: performTraversals params={(0,0)(fillxfill) sim={adjust=resize forwardNavigation} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:33.018 5176 5176 I MSHandlerLifeCycle: removeMultiSplitHandler: no exist. decor=DecorView@5644a93[DebugWallethActivity]
09-12 23:32:33.018 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: setView = com.android.internal.policy.DecorView@5644a93 TM=true
09-12 23:32:33.016 5176 5176 D InsetsController: onStateChanged: InsetsState: {mDisplayFrame=Rect(0, 0 - 1080, 2400), mDisplayCutout=DisplayCutout{insets=Rect(0, 80 - 0, 0) waterfall=Insets{left=0, top=0, right=0, bottom=0} boundingRect={Bounds=[Rect(0, 0 - 0, 0), Rect(468, 0 - 612, 80), Rect(0, 0 - 0, 0), Rect(0, 0 - 0, 0)]} cutoutPathParserInfo={CutoutPathParserInfo{displayWidth=1080 displayHeight=2400 physicalDisplayWidth=1080 physicalDisplayHeight=2400 density={2.8125} cutoutSpec={M 0,0 H -25.6 V 28.44444444444444 H 25.6 V 0 H 0 Z @dp} rotation={0} scale={1.0} physicalPixelDisplaySizeRatio={1.0}}}}, mRoundedCorners=RoundedCorners{[RoundedCorner{position=TopLeft, radius=90, center=Point(90, 90)}, RoundedCorner{position=TopRight, radius=90, center=Point(990, 90)}, RoundedCorner{position=BottomRight, radius=90, center=Point(990, 2310)}, RoundedCorner{position=BottomLeft, radius=90, center=Point(90, 2310)}]} mRoundedCornerFrame=Rect(0, 0 - 1080, 2400), mPrivacyIndicatorBounds=PrivacyIndicatorBounds {static bounds=Rect(956, 0 - 1080, 80) rotation=0}, mSources= { InsetsSource: {mType=ITYPE_STATUS_BAR, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_NAVIGATION_BAR, mFrame=[0,2358][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_GESTURES, mFrame=[0,0][84,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_GESTURES, mFrame=[996,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_MANDATORY_GESTURES, mFrame=[0,0][1080,114], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_MANDATORY_GESTURES, mFrame=[0,2310][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_DISPLAY_CUTOUT, mFrame=[0,0][-100000,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_DISPLAY_CUTOUT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_DISPLAY_CUTOUT, mFrame=[100000,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_DISPLAY_CUTOUT, mFrame=[0,100000][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_TAPPABLE_ELEMENT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_TAPPABLE_ELEMENT, mFrame=[0,0][0,0], mVisible=true, mInsetsRoundedCornerFrame=false} } host=org.walleth/org.walleth.debug.DebugWallethActivity from=android.view.ViewRootImpl.setView:1732
09-12 23:32:33.012 5176 5218 D NativeCustomFrequencyManager: [NativeCFMS] BpCustomFrequencyManager::BpCustomFrequencyManager()
09-12 23:32:33.010 5176 5176 I MSHandlerLifeCycle: removeMultiSplitHandler: no exist. decor=DecorView@5644a93[]
09-12 23:32:33.010 5176 5176 I MSHandlerLifeCycle: check: return. pkg=org.walleth parent=null callers=com.android.internal.policy.DecorView.setVisibility:4416 android.app.ActivityThread.handleResumeActivity:5476 android.app.servertransaction.ResumeActivityItem.execute:54 android.app.servertransaction.ActivityTransactionItem.execute:45 android.app.servertransaction.TransactionExecutor.executeLifecycleState:176
09-12 23:32:33.004 5176 5176 D ScrollView: initGoToTop
09-12 23:32:33.003 5176 5176 D CompatibilityChangeReporter: Compat change id reported: 171228096; UID 10570; state: ENABLED
09-12 23:32:32.983 5176 5176 I AppCompatViewInflater: app:theme is now deprecated. Please move to using android:theme instead.
09-12 23:32:32.981 5176 5176 I DecorView: setWindowBackground: isPopOver=false color=ffffffff d=android.graphics.drawable.ColorDrawable@5e11ac9
09-12 23:32:32.981 5176 5176 I DecorView: getCurrentDensityDpi: from real metrics. densityDpi=450 msg=resources_loaded
09-12 23:32:32.981 5176 5176 D DecorView: setCaptionType = 0, this = DecorView@5644a93[]
09-12 23:32:32.981 5176 5176 I DecorView: updateCaptionType: isFloating=false isApplication=true hasWindowDecorCaption=false this=DecorView@5644a93[]
09-12 23:32:32.981 5176 5176 I DecorView: [INFO] isPopOver=false config=true
09-12 23:32:32.935 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 1
09-12 23:32:32.823 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 0
09-12 23:32:31.424 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)0 dur=4 res=0x0 s={true 0xb40000744e780290} ch=false seqId=0
09-12 23:32:31.424 5176 5176 I BLASTBufferQueue: update, w= 1080 h= 2400 mName = ViewRootImpl@7e61af[OverviewActivity] mNativeObject= 0xb4000073be74d270 sc.mNativeObject= 0xb40000734e761180 format= -1 caller= android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2898 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301
09-12 23:32:31.424 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: updateBlastSurfaceIfNeeded mBlastBufferQueue=0xb4000073be74d270 isSameSurfaceControl=true
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: sfl=100000}
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fitSides= naviIconColor=0
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: bhv=DEFAULT
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: pfl=12020040
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fl=81812100
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: performTraversals mFirst=false windowShouldResize=false viewVisibilityChanged=false mForceNextWindowRelayout=false params={(0,0)(fillxfill) sim={adjust=resize} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:31.398 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 1
09-12 23:32:31.252 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 0
09-12 23:32:30.523 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:29.634 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:29.258 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:28.997 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 1
09-12 23:32:28.874 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 0
09-12 23:32:26.761 5176 5176 D InsetsController: onStateChanged: InsetsState: {mDisplayFrame=Rect(0, 0 - 1080, 2400), mDisplayCutout=DisplayCutout{insets=Rect(0, 80 - 0, 0) waterfall=Insets{left=0, top=0, right=0, bottom=0} boundingRect={Bounds=[Rect(0, 0 - 0, 0), Rect(468, 0 - 612, 80), Rect(0, 0 - 0, 0), Rect(0, 0 - 0, 0)]} cutoutPathParserInfo={CutoutPathParserInfo{displayWidth=1080 displayHeight=2400 physicalDisplayWidth=1080 physicalDisplayHeight=2400 density={2.8125} cutoutSpec={M 0,0 H -25.6 V 28.44444444444444 H 25.6 V 0 H 0 Z @dp} rotation={0} scale={1.0} physicalPixelDisplaySizeRatio={1.0}}}}, mRoundedCorners=RoundedCorners{[RoundedCorner{position=TopLeft, radius=90, center=Point(90, 90)}, RoundedCorner{position=TopRight, radius=90, center=Point(990, 90)}, RoundedCorner{position=BottomRight, radius=90, center=Point(990, 2310)}, RoundedCorner{position=BottomLeft, radius=90, center=Point(90, 2310)}]} mRoundedCornerFrame=Rect(0, 0 - 1080, 2400), mPrivacyIndicatorBounds=PrivacyIndicatorBounds {static bounds=Rect(956, 0 - 1080, 80) rotation=0}, mSources= { InsetsSource: {mType=ITYPE_STATUS_BAR, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_NAVIGATION_BAR, mFrame=[0,2358][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_GESTURES, mFrame=[0,0][84,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_GESTURES, mFrame=[996,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_MANDATORY_GESTURES, mFrame=[0,0][1080,114], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_MANDATORY_GESTURES, mFrame=[0,2310][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_DISPLAY_CUTOUT, mFrame=[0,0][-100000,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_DISPLAY_CUTOUT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_DISPLAY_CUTOUT, mFrame=[100000,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_DISPLAY_CUTOUT, mFrame=[0,100000][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_TAPPABLE_ELEMENT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_TAPPABLE_ELEMENT, mFrame=[0,0][0,0], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_IME, mFrame=[0,0][0,0], mVisibleFrame=[0,1392][1080,2400], mVisible=false, mInsetsRoundedCornerFrame=false} } host=org.walleth/org.walleth.overview.OverviewActivity from=android.view.ViewRootImpl$ViewRootHandler.handleMessageImpl:6740
09-12 23:32:26.714 5176 5176 D InputMethodManager: startInputInner - Id : 0
09-12 23:32:26.694 5176 5176 I InputMethodManager: startInputInner - mService.startInputOrWindowGainedFocus
09-12 23:32:26.694 5176 5176 D InputMethodManager: startInputInner - Id : 0
09-12 23:32:26.693 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: mThreadedRenderer.initializeIfNeeded()#2 mSurface={isValid=true 0xb40000744e780290}
09-12 23:32:26.693 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: MSG_WINDOW_FOCUS_CHANGED 1 0
09-12 23:32:26.689 5176 5176 D InputTransport: Input channel destroyed: '2008fe6', fd=101
| 1.0 | Bvphuoc -
09-12 23:32:34.348 5176 5314 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:34.328 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: ViewPostIme pointer 1
09-12 23:32:34.219 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: ViewPostIme pointer 0
09-12 23:32:34.085 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:33.731 5176 5314 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:33.536 5176 5176 I MSHandlerLifeCycle: removeMultiSplitHandler: no exist. decor=DecorView@3210dde[OverviewActivity]
09-12 23:32:33.534 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: WindowStopped on org.walleth/org.walleth.overview.OverviewActivity set to true
09-12 23:32:33.534 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: stopped(true) old = false
09-12 23:32:33.533 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)8 dur=3 res=0x2 s={false 0x0} ch=true seqId=0
09-12 23:32:33.529 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: performTraversals mFirst=false windowShouldResize=false viewVisibilityChanged=true mForceNextWindowRelayout=false params=null
09-12 23:32:33.528 5176 5218 D OpenGLRenderer: destroyEglSurface
09-12 23:32:33.528 5176 5218 D OpenGLRenderer: setSurface() destroyed EGLSurface
09-12 23:32:33.528 5176 5218 D OpenGLRenderer: setSurface called with nullptr
09-12 23:32:33.511 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: handleAppVisibility mAppVisible = true visible = false
09-12 23:32:33.370 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)0 dur=8 res=0x0 s={true 0xb40000744e780290} ch=false seqId=0
09-12 23:32:33.370 5176 5176 I BLASTBufferQueue: update, w= 1080 h= 2400 mName = ViewRootImpl@7e61af[OverviewActivity] mNativeObject= 0xb4000073be74d270 sc.mNativeObject= 0xb40000734e75a7a0 format= -1 caller= android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2898 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301
09-12 23:32:33.369 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: updateBlastSurfaceIfNeeded mBlastBufferQueue=0xb4000073be74d270 isSameSurfaceControl=true
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: sfl=100000}
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fitSides= naviIconColor=0
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: bhv=DEFAULT
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: pfl=12020040
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fl=81812100
09-12 23:32:33.361 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: performTraversals mFirst=false windowShouldResize=false viewVisibilityChanged=false mForceNextWindowRelayout=false params={(0,0)(fillxfill) sim={adjust=pan} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:33.160 5176 5176 I InputMethodManager: startInputInner - mService.startInputOrWindowGainedFocus
09-12 23:32:33.160 5176 5176 D InputMethodManager: startInputInner - Id : 0
09-12 23:32:33.160 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: mThreadedRenderer.initializeIfNeeded()#2 mSurface={isValid=true 0xb40000744e77be70}
09-12 23:32:33.160 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: MSG_WINDOW_FOCUS_CHANGED 1 0
09-12 23:32:33.077 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: MSG_WINDOW_FOCUS_CHANGED 0 0
09-12 23:32:33.056 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: reportDrawFinished seqId=0 mSyncId=-1 fn=1 mSurfaceChangedTransaction=0xb40000737e758010
09-12 23:32:33.056 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: setupSync seqId=0 mSyncId=0 fn=1 caller=android.view.ViewRootImpl$$ExternalSyntheticLambda11.accept:6 android.window.SurfaceSyncer.lambda$setupSync$1$android-window-SurfaceSyncer:128 android.window.SurfaceSyncer$$ExternalSyntheticLambda1.accept:8 android.window.SurfaceSyncer$SyncSet.checkIfSyncIsComplete:382 android.window.SurfaceSyncer$SyncSet.markSyncReady:359 android.window.SurfaceSyncer.markSyncReady:151 android.view.ViewRootImpl.performTraversals:4503
09-12 23:32:33.056 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: onSyncComplete
09-12 23:32:33.056 5176 5218 W Parcel : Expecting binder but got null!
09-12 23:32:33.055 5176 5218 D OpenGLRenderer: CFMS:: SetUp Pid : 5176 Tid : 5218
09-12 23:32:33.054 5176 5218 I ViewRootImpl@c29d430[DebugWallethActivity]: Received frameCommittedCallback lastAttemptedDrawFrameNum=1 didProduceBuffer=true
09-12 23:32:33.054 5176 5218 I BLASTBufferQueue: [ViewRootImpl@c29d430[DebugWallethActivity]#2](f:0,a:0) onFrameAvailable the first frame is available
09-12 23:32:33.039 5176 5273 I ViewRootImpl@c29d430[DebugWallethActivity]: Setting up sync and frameCommitCallback
09-12 23:32:33.039 5176 5273 I ViewRootImpl@c29d430[DebugWallethActivity]: Received frameDrawingCallback syncResult=0 frameNum=1.
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: registerCallbacksForSync syncBuffer=false
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: Setting syncFrameCallback
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: Setup new sync id=0
09-12 23:32:33.036 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: reportNextDraw android.view.ViewRootImpl.performTraversals:4438 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301 android.view.Choreographer$CallbackRecord.run:1309
09-12 23:32:33.035 5176 5176 D ScrollView: onsize change changed
09-12 23:32:33.034 5176 5218 D OpenGLRenderer: eglCreateWindowSurface
09-12 23:32:33.034 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: mThreadedRenderer.initialize() mSurface={isValid=true 0xb40000744e77be70} hwInitialized=true
09-12 23:32:33.034 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)0 dur=10 res=0x3 s={true 0xb40000744e77be70} ch=true seqId=0
09-12 23:32:33.034 5176 5176 I BLASTBufferQueue: update, w= 1080 h= 2400 mName = ViewRootImpl@c29d430[DebugWallethActivity] mNativeObject= 0xb4000073be75dc10 sc.mNativeObject= 0xb40000734e76d830 format= -1 caller= android.graphics.BLASTBufferQueue.<init>:84 android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2909 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885
09-12 23:32:33.033 5176 5176 I BLASTBufferQueue: new BLASTBufferQueue, mName= ViewRootImpl@c29d430[DebugWallethActivity] mNativeObject= 0xb4000073be75dc10 sc.mNativeObject= 0xb40000734e76d830 caller= android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2909 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301 android.view.Choreographer$CallbackRecord.run:1309 android.view.Choreographer.doCallbacks:923 android.view.Choreographer.doFrame:852 android.view.Choreographer$FrameDisplayEventReceiver.run:1283
09-12 23:32:33.033 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: updateBlastSurfaceIfNeeded mBlastBufferQueue=null isSameSurfaceControl=false
09-12 23:32:33.033 5176 5176 D InsetsController: onStateChanged: InsetsState: {mDisplayFrame=Rect(0, 0 - 1080, 2400), mDisplayCutout=DisplayCutout{insets=Rect(0, 80 - 0, 0) waterfall=Insets{left=0, top=0, right=0, bottom=0} boundingRect={Bounds=[Rect(0, 0 - 0, 0), Rect(468, 0 - 612, 80), Rect(0, 0 - 0, 0), Rect(0, 0 - 0, 0)]} cutoutPathParserInfo={CutoutPathParserInfo{displayWidth=1080 displayHeight=2400 physicalDisplayWidth=1080 physicalDisplayHeight=2400 density={2.8125} cutoutSpec={M 0,0 H -25.6 V 28.44444444444444 H 25.6 V 0 H 0 Z @dp} rotation={0} scale={1.0} physicalPixelDisplaySizeRatio={1.0}}}}, mRoundedCorners=RoundedCorners{[RoundedCorner{position=TopLeft, radius=90, center=Point(90, 90)}, RoundedCorner{position=TopRight, radius=90, center=Point(990, 90)}, RoundedCorner{position=BottomRight, radius=90, center=Point(990, 2310)}, RoundedCorner{position=BottomLeft, radius=90, center=Point(90, 2310)}]} mRoundedCornerFrame=Rect(0, 0 - 1080, 2400), mPrivacyIndicatorBounds=PrivacyIndicatorBounds {static bounds=Rect(956, 0 - 1080, 80) rotation=0}, mSources= { InsetsSource: {mType=ITYPE_STATUS_BAR, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_NAVIGATION_BAR, mFrame=[0,2358][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_GESTURES, mFrame=[0,0][84,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_GESTURES, mFrame=[996,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_MANDATORY_GESTURES, mFrame=[0,0][1080,114], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_MANDATORY_GESTURES, mFrame=[0,2310][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_DISPLAY_CUTOUT, mFrame=[0,0][-100000,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_DISPLAY_CUTOUT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_DISPLAY_CUTOUT, mFrame=[100000,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_DISPLAY_CUTOUT, mFrame=[0,100000][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_TAPPABLE_ELEMENT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_TAPPABLE_ELEMENT, mFrame=[0,0][0,0], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_IME, mFrame=[0,0][0,0], mVisibleFrame=[0,1392][1080,2400], mVisible=false, mInsetsRoundedCornerFrame=false} } host=org.walleth/org.walleth.debug.DebugWallethActivity from=android.view.ViewRootImpl.relayoutWindow:9802
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: sfl=100000}
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fitSides= naviIconColor=0
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: bhv=DEFAULT
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: pfl=12020040
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fl=81812100
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: performTraversals mFirst=true windowShouldResize=true viewVisibilityChanged=false mForceNextWindowRelayout=false params={(0,0)(fillxfill) sim={adjust=resize forwardNavigation} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: sfl=100000}
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fitSides= naviIconColor=0
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: bhv=DEFAULT
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: pfl=12020040
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: fl=81812100
09-12 23:32:33.023 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: performTraversals params={(0,0)(fillxfill) sim={adjust=resize forwardNavigation} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:33.018 5176 5176 I MSHandlerLifeCycle: removeMultiSplitHandler: no exist. decor=DecorView@5644a93[DebugWallethActivity]
09-12 23:32:33.018 5176 5176 I ViewRootImpl@c29d430[DebugWallethActivity]: setView = com.android.internal.policy.DecorView@5644a93 TM=true
09-12 23:32:33.016 5176 5176 D InsetsController: onStateChanged: InsetsState: {mDisplayFrame=Rect(0, 0 - 1080, 2400), mDisplayCutout=DisplayCutout{insets=Rect(0, 80 - 0, 0) waterfall=Insets{left=0, top=0, right=0, bottom=0} boundingRect={Bounds=[Rect(0, 0 - 0, 0), Rect(468, 0 - 612, 80), Rect(0, 0 - 0, 0), Rect(0, 0 - 0, 0)]} cutoutPathParserInfo={CutoutPathParserInfo{displayWidth=1080 displayHeight=2400 physicalDisplayWidth=1080 physicalDisplayHeight=2400 density={2.8125} cutoutSpec={M 0,0 H -25.6 V 28.44444444444444 H 25.6 V 0 H 0 Z @dp} rotation={0} scale={1.0} physicalPixelDisplaySizeRatio={1.0}}}}, mRoundedCorners=RoundedCorners{[RoundedCorner{position=TopLeft, radius=90, center=Point(90, 90)}, RoundedCorner{position=TopRight, radius=90, center=Point(990, 90)}, RoundedCorner{position=BottomRight, radius=90, center=Point(990, 2310)}, RoundedCorner{position=BottomLeft, radius=90, center=Point(90, 2310)}]} mRoundedCornerFrame=Rect(0, 0 - 1080, 2400), mPrivacyIndicatorBounds=PrivacyIndicatorBounds {static bounds=Rect(956, 0 - 1080, 80) rotation=0}, mSources= { InsetsSource: {mType=ITYPE_STATUS_BAR, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_NAVIGATION_BAR, mFrame=[0,2358][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_GESTURES, mFrame=[0,0][84,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_GESTURES, mFrame=[996,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_MANDATORY_GESTURES, mFrame=[0,0][1080,114], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_MANDATORY_GESTURES, mFrame=[0,2310][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_DISPLAY_CUTOUT, mFrame=[0,0][-100000,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_DISPLAY_CUTOUT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_DISPLAY_CUTOUT, mFrame=[100000,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_DISPLAY_CUTOUT, mFrame=[0,100000][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_TAPPABLE_ELEMENT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_TAPPABLE_ELEMENT, mFrame=[0,0][0,0], mVisible=true, mInsetsRoundedCornerFrame=false} } host=org.walleth/org.walleth.debug.DebugWallethActivity from=android.view.ViewRootImpl.setView:1732
09-12 23:32:33.012 5176 5218 D NativeCustomFrequencyManager: [NativeCFMS] BpCustomFrequencyManager::BpCustomFrequencyManager()
09-12 23:32:33.010 5176 5176 I MSHandlerLifeCycle: removeMultiSplitHandler: no exist. decor=DecorView@5644a93[]
09-12 23:32:33.010 5176 5176 I MSHandlerLifeCycle: check: return. pkg=org.walleth parent=null callers=com.android.internal.policy.DecorView.setVisibility:4416 android.app.ActivityThread.handleResumeActivity:5476 android.app.servertransaction.ResumeActivityItem.execute:54 android.app.servertransaction.ActivityTransactionItem.execute:45 android.app.servertransaction.TransactionExecutor.executeLifecycleState:176
09-12 23:32:33.004 5176 5176 D ScrollView: initGoToTop
09-12 23:32:33.003 5176 5176 D CompatibilityChangeReporter: Compat change id reported: 171228096; UID 10570; state: ENABLED
09-12 23:32:32.983 5176 5176 I AppCompatViewInflater: app:theme is now deprecated. Please move to using android:theme instead.
09-12 23:32:32.981 5176 5176 I DecorView: setWindowBackground: isPopOver=false color=ffffffff d=android.graphics.drawable.ColorDrawable@5e11ac9
09-12 23:32:32.981 5176 5176 I DecorView: getCurrentDensityDpi: from real metrics. densityDpi=450 msg=resources_loaded
09-12 23:32:32.981 5176 5176 D DecorView: setCaptionType = 0, this = DecorView@5644a93[]
09-12 23:32:32.981 5176 5176 I DecorView: updateCaptionType: isFloating=false isApplication=true hasWindowDecorCaption=false this=DecorView@5644a93[]
09-12 23:32:32.981 5176 5176 I DecorView: [INFO] isPopOver=false config=true
09-12 23:32:32.935 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 1
09-12 23:32:32.823 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 0
09-12 23:32:31.424 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: Relayout returned: old=(0,0,1080,2400) new=(0,0,1080,2400) req=(1080,2400)0 dur=4 res=0x0 s={true 0xb40000744e780290} ch=false seqId=0
09-12 23:32:31.424 5176 5176 I BLASTBufferQueue: update, w= 1080 h= 2400 mName = ViewRootImpl@7e61af[OverviewActivity] mNativeObject= 0xb4000073be74d270 sc.mNativeObject= 0xb40000734e761180 format= -1 caller= android.view.ViewRootImpl.updateBlastSurfaceIfNeeded:2898 android.view.ViewRootImpl.relayoutWindow:9847 android.view.ViewRootImpl.performTraversals:3884 android.view.ViewRootImpl.doTraversal:3116 android.view.ViewRootImpl$TraversalRunnable.run:10885 android.view.Choreographer$CallbackRecord.run:1301
09-12 23:32:31.424 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: updateBlastSurfaceIfNeeded mBlastBufferQueue=0xb4000073be74d270 isSameSurfaceControl=true
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: sfl=100000}
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fitSides= naviIconColor=0
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: bhv=DEFAULT
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: pfl=12020040
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: fl=81812100
09-12 23:32:31.420 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: performTraversals mFirst=false windowShouldResize=false viewVisibilityChanged=false mForceNextWindowRelayout=false params={(0,0)(fillxfill) sim={adjust=resize} ty=BASE_APPLICATION wanim=0x1030309
09-12 23:32:31.398 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 1
09-12 23:32:31.252 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 0
09-12 23:32:30.523 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:29.634 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:29.258 5176 5318 W EtherScanAPI: Problem with JSON from EtherScan: Value Max rate limit reached, please use API Key for higher rate limit at result of type java.lang.String cannot be converted to JSONArray
09-12 23:32:28.997 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 1
09-12 23:32:28.874 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: ViewPostIme pointer 0
09-12 23:32:26.761 5176 5176 D InsetsController: onStateChanged: InsetsState: {mDisplayFrame=Rect(0, 0 - 1080, 2400), mDisplayCutout=DisplayCutout{insets=Rect(0, 80 - 0, 0) waterfall=Insets{left=0, top=0, right=0, bottom=0} boundingRect={Bounds=[Rect(0, 0 - 0, 0), Rect(468, 0 - 612, 80), Rect(0, 0 - 0, 0), Rect(0, 0 - 0, 0)]} cutoutPathParserInfo={CutoutPathParserInfo{displayWidth=1080 displayHeight=2400 physicalDisplayWidth=1080 physicalDisplayHeight=2400 density={2.8125} cutoutSpec={M 0,0 H -25.6 V 28.44444444444444 H 25.6 V 0 H 0 Z @dp} rotation={0} scale={1.0} physicalPixelDisplaySizeRatio={1.0}}}}, mRoundedCorners=RoundedCorners{[RoundedCorner{position=TopLeft, radius=90, center=Point(90, 90)}, RoundedCorner{position=TopRight, radius=90, center=Point(990, 90)}, RoundedCorner{position=BottomRight, radius=90, center=Point(990, 2310)}, RoundedCorner{position=BottomLeft, radius=90, center=Point(90, 2310)}]} mRoundedCornerFrame=Rect(0, 0 - 1080, 2400), mPrivacyIndicatorBounds=PrivacyIndicatorBounds {static bounds=Rect(956, 0 - 1080, 80) rotation=0}, mSources= { InsetsSource: {mType=ITYPE_STATUS_BAR, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_NAVIGATION_BAR, mFrame=[0,2358][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_GESTURES, mFrame=[0,0][84,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_GESTURES, mFrame=[996,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_MANDATORY_GESTURES, mFrame=[0,0][1080,114], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_MANDATORY_GESTURES, mFrame=[0,2310][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_LEFT_DISPLAY_CUTOUT, mFrame=[0,0][-100000,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_DISPLAY_CUTOUT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_RIGHT_DISPLAY_CUTOUT, mFrame=[100000,0][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_DISPLAY_CUTOUT, mFrame=[0,100000][1080,2400], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_TOP_TAPPABLE_ELEMENT, mFrame=[0,0][1080,80], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_BOTTOM_TAPPABLE_ELEMENT, mFrame=[0,0][0,0], mVisible=true, mInsetsRoundedCornerFrame=false}, InsetsSource: {mType=ITYPE_IME, mFrame=[0,0][0,0], mVisibleFrame=[0,1392][1080,2400], mVisible=false, mInsetsRoundedCornerFrame=false} } host=org.walleth/org.walleth.overview.OverviewActivity from=android.view.ViewRootImpl$ViewRootHandler.handleMessageImpl:6740
09-12 23:32:26.714 5176 5176 D InputMethodManager: startInputInner - Id : 0
09-12 23:32:26.694 5176 5176 I InputMethodManager: startInputInner - mService.startInputOrWindowGainedFocus
09-12 23:32:26.694 5176 5176 D InputMethodManager: startInputInner - Id : 0
09-12 23:32:26.693 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: mThreadedRenderer.initializeIfNeeded()#2 mSurface={isValid=true 0xb40000744e780290}
09-12 23:32:26.693 5176 5176 I ViewRootImpl@7e61af[OverviewActivity]: MSG_WINDOW_FOCUS_CHANGED 1 0
09-12 23:32:26.689 5176 5176 D InputTransport: Input channel destroyed: '2008fe6', fd=101
| non_test | bvphuoc w etherscanapi problem with json from etherscan value max rate limit reached please use api key for higher rate limit at result of type java lang string cannot be converted to jsonarray i viewrootimpl viewpostime pointer i viewrootimpl viewpostime pointer w etherscanapi problem with json from etherscan value max rate limit reached please use api key for higher rate limit at result of type java lang string cannot be converted to jsonarray w etherscanapi problem with json from etherscan value max rate limit reached please use api key for higher rate limit at result of type java lang string cannot be converted to jsonarray i mshandlerlifecycle removemultisplithandler no exist decor decorview i viewrootimpl windowstopped on org walleth org walleth overview overviewactivity set to true i viewrootimpl stopped true old false i viewrootimpl relayout returned old new req dur res s false ch true seqid i viewrootimpl performtraversals mfirst false windowshouldresize false viewvisibilitychanged true mforcenextwindowrelayout false params null d openglrenderer destroyeglsurface d openglrenderer setsurface destroyed eglsurface d openglrenderer setsurface called with nullptr i viewrootimpl handleappvisibility mappvisible true visible false i viewrootimpl relayout returned old new req dur res s true ch false seqid i blastbufferqueue update w h mname viewrootimpl mnativeobject sc mnativeobject format caller android view viewrootimpl updateblastsurfaceifneeded android view viewrootimpl relayoutwindow android view viewrootimpl performtraversals android view viewrootimpl dotraversal android view viewrootimpl traversalrunnable run android view choreographer callbackrecord run i viewrootimpl updateblastsurfaceifneeded mblastbufferqueue issamesurfacecontrol true i viewrootimpl sfl i viewrootimpl fitsides naviiconcolor i viewrootimpl bhv default i viewrootimpl pfl i viewrootimpl fl i viewrootimpl performtraversals mfirst false windowshouldresize false viewvisibilitychanged false mforcenextwindowrelayout false params fillxfill sim adjust pan ty base application wanim i inputmethodmanager startinputinner mservice startinputorwindowgainedfocus d inputmethodmanager startinputinner id i viewrootimpl mthreadedrenderer initializeifneeded msurface isvalid true i viewrootimpl msg window focus changed i viewrootimpl msg window focus changed i viewrootimpl reportdrawfinished seqid msyncid fn msurfacechangedtransaction i viewrootimpl setupsync seqid msyncid fn caller android view viewrootimpl accept android window surfacesyncer lambda setupsync android window surfacesyncer android window surfacesyncer accept android window surfacesyncer syncset checkifsynciscomplete android window surfacesyncer syncset marksyncready android window surfacesyncer marksyncready android view viewrootimpl performtraversals i viewrootimpl onsynccomplete w parcel expecting binder but got null d openglrenderer cfms setup pid tid i viewrootimpl received framecommittedcallback lastattempteddrawframenum didproducebuffer true i blastbufferqueue f a onframeavailable the first frame is available i viewrootimpl setting up sync and framecommitcallback i viewrootimpl received framedrawingcallback syncresult framenum i viewrootimpl registercallbacksforsync syncbuffer false i viewrootimpl setting syncframecallback i viewrootimpl setup new sync id i viewrootimpl reportnextdraw android view viewrootimpl performtraversals android view viewrootimpl dotraversal android view viewrootimpl traversalrunnable run android view choreographer callbackrecord run android view choreographer callbackrecord run d scrollview onsize change changed d openglrenderer eglcreatewindowsurface i viewrootimpl mthreadedrenderer initialize msurface isvalid true hwinitialized true i viewrootimpl relayout returned old new req dur res s true ch true seqid i blastbufferqueue update w h mname viewrootimpl mnativeobject sc mnativeobject format caller android graphics blastbufferqueue android view viewrootimpl updateblastsurfaceifneeded android view viewrootimpl relayoutwindow android view viewrootimpl performtraversals android view viewrootimpl dotraversal android view viewrootimpl traversalrunnable run i blastbufferqueue new blastbufferqueue mname viewrootimpl mnativeobject sc mnativeobject caller android view viewrootimpl updateblastsurfaceifneeded android view viewrootimpl relayoutwindow android view viewrootimpl performtraversals android view viewrootimpl dotraversal android view viewrootimpl traversalrunnable run android view choreographer callbackrecord run android view choreographer callbackrecord run android view choreographer docallbacks android view choreographer doframe android view choreographer framedisplayeventreceiver run i viewrootimpl updateblastsurfaceifneeded mblastbufferqueue null issamesurfacecontrol false d insetscontroller onstatechanged insetsstate mdisplayframe rect mdisplaycutout displaycutout insets rect waterfall insets left top right bottom boundingrect bounds cutoutpathparserinfo cutoutpathparserinfo displaywidth displayheight physicaldisplaywidth physicaldisplayheight density cutoutspec m h v h v h z dp rotation scale physicalpixeldisplaysizeratio mroundedcorners roundedcorners mroundedcornerframe rect mprivacyindicatorbounds privacyindicatorbounds static bounds rect rotation msources insetssource mtype itype status bar mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype navigation bar mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype left gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype right gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top mandatory gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom mandatory gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype left display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype right display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top tappable element mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom tappable element mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype ime mframe mvisibleframe mvisible false minsetsroundedcornerframe false host org walleth org walleth debug debugwallethactivity from android view viewrootimpl relayoutwindow i viewrootimpl sfl i viewrootimpl fitsides naviiconcolor i viewrootimpl bhv default i viewrootimpl pfl i viewrootimpl fl i viewrootimpl performtraversals mfirst true windowshouldresize true viewvisibilitychanged false mforcenextwindowrelayout false params fillxfill sim adjust resize forwardnavigation ty base application wanim i viewrootimpl sfl i viewrootimpl fitsides naviiconcolor i viewrootimpl bhv default i viewrootimpl pfl i viewrootimpl fl i viewrootimpl performtraversals params fillxfill sim adjust resize forwardnavigation ty base application wanim i mshandlerlifecycle removemultisplithandler no exist decor decorview i viewrootimpl setview com android internal policy decorview tm true d insetscontroller onstatechanged insetsstate mdisplayframe rect mdisplaycutout displaycutout insets rect waterfall insets left top right bottom boundingrect bounds cutoutpathparserinfo cutoutpathparserinfo displaywidth displayheight physicaldisplaywidth physicaldisplayheight density cutoutspec m h v h v h z dp rotation scale physicalpixeldisplaysizeratio mroundedcorners roundedcorners mroundedcornerframe rect mprivacyindicatorbounds privacyindicatorbounds static bounds rect rotation msources insetssource mtype itype status bar mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype navigation bar mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype left gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype right gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top mandatory gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom mandatory gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype left display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype right display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top tappable element mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom tappable element mframe mvisible true minsetsroundedcornerframe false host org walleth org walleth debug debugwallethactivity from android view viewrootimpl setview d nativecustomfrequencymanager bpcustomfrequencymanager bpcustomfrequencymanager i mshandlerlifecycle removemultisplithandler no exist decor decorview i mshandlerlifecycle check return pkg org walleth parent null callers com android internal policy decorview setvisibility android app activitythread handleresumeactivity android app servertransaction resumeactivityitem execute android app servertransaction activitytransactionitem execute android app servertransaction transactionexecutor executelifecyclestate d scrollview initgototop d compatibilitychangereporter compat change id reported uid state enabled i appcompatviewinflater app theme is now deprecated please move to using android theme instead i decorview setwindowbackground ispopover false color ffffffff d android graphics drawable colordrawable i decorview getcurrentdensitydpi from real metrics densitydpi msg resources loaded d decorview setcaptiontype this decorview i decorview updatecaptiontype isfloating false isapplication true haswindowdecorcaption false this decorview i decorview ispopover false config true i viewrootimpl viewpostime pointer i viewrootimpl viewpostime pointer i viewrootimpl relayout returned old new req dur res s true ch false seqid i blastbufferqueue update w h mname viewrootimpl mnativeobject sc mnativeobject format caller android view viewrootimpl updateblastsurfaceifneeded android view viewrootimpl relayoutwindow android view viewrootimpl performtraversals android view viewrootimpl dotraversal android view viewrootimpl traversalrunnable run android view choreographer callbackrecord run i viewrootimpl updateblastsurfaceifneeded mblastbufferqueue issamesurfacecontrol true i viewrootimpl sfl i viewrootimpl fitsides naviiconcolor i viewrootimpl bhv default i viewrootimpl pfl i viewrootimpl fl i viewrootimpl performtraversals mfirst false windowshouldresize false viewvisibilitychanged false mforcenextwindowrelayout false params fillxfill sim adjust resize ty base application wanim i viewrootimpl viewpostime pointer i viewrootimpl viewpostime pointer w etherscanapi problem with json from etherscan value max rate limit reached please use api key for higher rate limit at result of type java lang string cannot be converted to jsonarray w etherscanapi problem with json from etherscan value max rate limit reached please use api key for higher rate limit at result of type java lang string cannot be converted to jsonarray w etherscanapi problem with json from etherscan value max rate limit reached please use api key for higher rate limit at result of type java lang string cannot be converted to jsonarray i viewrootimpl viewpostime pointer i viewrootimpl viewpostime pointer d insetscontroller onstatechanged insetsstate mdisplayframe rect mdisplaycutout displaycutout insets rect waterfall insets left top right bottom boundingrect bounds cutoutpathparserinfo cutoutpathparserinfo displaywidth displayheight physicaldisplaywidth physicaldisplayheight density cutoutspec m h v h v h z dp rotation scale physicalpixeldisplaysizeratio mroundedcorners roundedcorners mroundedcornerframe rect mprivacyindicatorbounds privacyindicatorbounds static bounds rect rotation msources insetssource mtype itype status bar mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype navigation bar mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype left gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype right gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top mandatory gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom mandatory gestures mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype left display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype right display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom display cutout mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype top tappable element mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype bottom tappable element mframe mvisible true minsetsroundedcornerframe false insetssource mtype itype ime mframe mvisibleframe mvisible false minsetsroundedcornerframe false host org walleth org walleth overview overviewactivity from android view viewrootimpl viewroothandler handlemessageimpl d inputmethodmanager startinputinner id i inputmethodmanager startinputinner mservice startinputorwindowgainedfocus d inputmethodmanager startinputinner id i viewrootimpl mthreadedrenderer initializeifneeded msurface isvalid true i viewrootimpl msg window focus changed d inputtransport input channel destroyed fd | 0 |
175,173 | 13,537,957,081 | IssuesEvent | 2020-09-16 11:20:31 | cockroachdb/cockroach | https://api.github.com/repos/cockroachdb/cockroach | closed | roachtest: tpcc/interleaved/nodes=3/cpu=16/w=500 failed | C-test-failure O-roachtest O-robot branch-master | [(roachtest).tpcc/interleaved/nodes=3/cpu=16/w=500 failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2159276&tab=buildLog) on [master@d310eb8eb6da5df4ac4407a2c90d898acdfdddf3](https://github.com/cockroachdb/cockroach/commits/d310eb8eb6da5df4ac4407a2c90d898acdfdddf3):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/tpcc/interleaved/nodes=3/cpu=16/w=500/run_1
test_runner.go:801: test timed out (6h0m0s)
cluster.go:2126,tpcc.go:189,tpcc.go:363,test_runner.go:754: output in run_092727.938_n4_workload_check_tpcc: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2159276-1596781348-25-n4cpu16:4 -- ./workload check tpcc --warehouses=500 {pgurl:1} returned: context canceled
(1) attached stack trace
| main.(*cluster).RunE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2204
| main.(*cluster).Run
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2124
| main.runTPCC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:189
| main.registerTPCC.func6
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:363
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:754
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1373
Wraps: (2) 2 safe details enclosed
Wraps: (3) output in run_092727.938_n4_workload_check_tpcc
Wraps: (4) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2159276-1596781348-25-n4cpu16:4 -- ./workload check tpcc --warehouses=500 {pgurl:1} returned
| stderr:
|
| stdout:
Wraps: (5) secondary error attachment
| signal: killed
| (1) signal: killed
| Error types: (1) *exec.ExitError
Wraps: (6) context canceled
Error types: (1) *withstack.withStack (2) *safedetails.withSafeDetails (3) *errutil.withMessage (4) *main.withCommandDetails (5) *secondary.withSecondaryError (6) *errors.errorString
```
<details><summary>More</summary><p>
Artifacts: [/tpcc/interleaved/nodes=3/cpu=16/w=500](https://teamcity.cockroachdb.com/viewLog.html?buildId=2159276&tab=artifacts#/tpcc/interleaved/nodes=3/cpu=16/w=500)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Atpcc%2Finterleaved%2Fnodes%3D3%2Fcpu%3D16%2Fw%3D500.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| 2.0 | roachtest: tpcc/interleaved/nodes=3/cpu=16/w=500 failed - [(roachtest).tpcc/interleaved/nodes=3/cpu=16/w=500 failed](https://teamcity.cockroachdb.com/viewLog.html?buildId=2159276&tab=buildLog) on [master@d310eb8eb6da5df4ac4407a2c90d898acdfdddf3](https://github.com/cockroachdb/cockroach/commits/d310eb8eb6da5df4ac4407a2c90d898acdfdddf3):
```
The test failed on branch=master, cloud=gce:
test artifacts and logs in: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/artifacts/tpcc/interleaved/nodes=3/cpu=16/w=500/run_1
test_runner.go:801: test timed out (6h0m0s)
cluster.go:2126,tpcc.go:189,tpcc.go:363,test_runner.go:754: output in run_092727.938_n4_workload_check_tpcc: /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2159276-1596781348-25-n4cpu16:4 -- ./workload check tpcc --warehouses=500 {pgurl:1} returned: context canceled
(1) attached stack trace
| main.(*cluster).RunE
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2204
| main.(*cluster).Run
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/cluster.go:2124
| main.runTPCC
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:189
| main.registerTPCC.func6
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/tpcc.go:363
| main.(*testRunner).runTest.func2
| /home/agent/work/.go/src/github.com/cockroachdb/cockroach/pkg/cmd/roachtest/test_runner.go:754
| runtime.goexit
| /usr/local/go/src/runtime/asm_amd64.s:1373
Wraps: (2) 2 safe details enclosed
Wraps: (3) output in run_092727.938_n4_workload_check_tpcc
Wraps: (4) /home/agent/work/.go/src/github.com/cockroachdb/cockroach/bin/roachprod run teamcity-2159276-1596781348-25-n4cpu16:4 -- ./workload check tpcc --warehouses=500 {pgurl:1} returned
| stderr:
|
| stdout:
Wraps: (5) secondary error attachment
| signal: killed
| (1) signal: killed
| Error types: (1) *exec.ExitError
Wraps: (6) context canceled
Error types: (1) *withstack.withStack (2) *safedetails.withSafeDetails (3) *errutil.withMessage (4) *main.withCommandDetails (5) *secondary.withSecondaryError (6) *errors.errorString
```
<details><summary>More</summary><p>
Artifacts: [/tpcc/interleaved/nodes=3/cpu=16/w=500](https://teamcity.cockroachdb.com/viewLog.html?buildId=2159276&tab=artifacts#/tpcc/interleaved/nodes=3/cpu=16/w=500)
[See this test on roachdash](https://roachdash.crdb.dev/?filter=status%3Aopen+t%3A.%2Atpcc%2Finterleaved%2Fnodes%3D3%2Fcpu%3D16%2Fw%3D500.%2A&sort=title&restgroup=false&display=lastcommented+project)
<sub>powered by [pkg/cmd/internal/issues](https://github.com/cockroachdb/cockroach/tree/master/pkg/cmd/internal/issues)</sub></p></details>
| test | roachtest tpcc interleaved nodes cpu w failed on the test failed on branch master cloud gce test artifacts and logs in home agent work go src github com cockroachdb cockroach artifacts tpcc interleaved nodes cpu w run test runner go test timed out cluster go tpcc go tpcc go test runner go output in run workload check tpcc home agent work go src github com cockroachdb cockroach bin roachprod run teamcity workload check tpcc warehouses pgurl returned context canceled attached stack trace main cluster rune home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main cluster run home agent work go src github com cockroachdb cockroach pkg cmd roachtest cluster go main runtpcc home agent work go src github com cockroachdb cockroach pkg cmd roachtest tpcc go main registertpcc home agent work go src github com cockroachdb cockroach pkg cmd roachtest tpcc go main testrunner runtest home agent work go src github com cockroachdb cockroach pkg cmd roachtest test runner go runtime goexit usr local go src runtime asm s wraps safe details enclosed wraps output in run workload check tpcc wraps home agent work go src github com cockroachdb cockroach bin roachprod run teamcity workload check tpcc warehouses pgurl returned stderr stdout wraps secondary error attachment signal killed signal killed error types exec exiterror wraps context canceled error types withstack withstack safedetails withsafedetails errutil withmessage main withcommanddetails secondary withsecondaryerror errors errorstring more artifacts powered by | 1 |
Subsets and Splits
No community queries yet
The top public SQL queries from the community will appear here once available.