| Column | Dtype | Stats |
|---|---|---|
| Unnamed: 0 | int64 | 0–832k |
| id | float64 | 2.49B–32.1B |
| type | string | 1 class |
| created_at | string | lengths 19–19 |
| repo | string | lengths 4–112 |
| repo_url | string | lengths 33–141 |
| action | string | 3 classes |
| title | string | lengths 1–1.02k |
| labels | string | lengths 4–1.54k |
| body | string | lengths 1–262k |
| index | string | 17 classes |
| text_combine | string | lengths 95–262k |
| label | string | 2 classes |
| text | string | lengths 96–252k |
| binary_label | int64 | 0–1 |
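The rows below suggest how two of the columns are derived from the raw fields: `text_combine` equals the `title` and `body` joined with `" - "`, and `binary_label` is 1 exactly when `label` is `"test"` (0 for `"non_test"`). A minimal sketch of that apparent derivation; the helper name is illustrative, not from the dataset:

```python
def derive_columns(row: dict) -> dict:
    """Hypothetical helper recreating the derived columns seen in the sample rows.

    text_combine: title and body joined with " - ".
    binary_label: 1 when label == "test", else 0.
    """
    row["text_combine"] = f'{row["title"]} - {row["body"]}'
    row["binary_label"] = int(row["label"] == "test")
    return row

derive_columns({"title": "t", "body": "b", "label": "test"})
# → {"title": "t", "body": "b", "label": "test",
#    "text_combine": "t - b", "binary_label": 1}
```

Both relationships hold for all three rows shown, but the actual preprocessing code is not part of this preview.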
**Row 1**
- Unnamed: 0: 16,835
- id: 10,573,292,688
- type: IssuesEvent
- created_at: 2019-10-07 11:37:53
- repo: terraform-providers/terraform-provider-azurerm
- repo_url: https://api.github.com/repos/terraform-providers/terraform-provider-azurerm
- action: closed
- title: azurerm_virtual_machine_scale_set.scaleset: diffs didn't match during apply
- labels: bug service/vmss
- body:
_This issue was originally opened by @agolomoodysaada as hashicorp/terraform#17291. It was migrated here as a result of the [provider split](https://www.hashicorp.com/blog/upcoming-provider-changes-in-terraform-0-10/). The original body of the issue is below._
<hr>
```
Terraform Version: 0.11.3
Resource ID: azurerm_virtual_machine_scale_set.scaleset
Mismatch reason: attribute mismatch: network_profile.3458559321.ip_configuration.#
Diff One (usually from plan): *terraform.InstanceDiff{mu:sync.Mutex{state:0, sema:0x0}, Attributes:map[string]*terraform.ResourceAttrDiff{"network_profile.~3458559321.ip_configuration.0.primary":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.accelerated_networking":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.#":*terraform.ResourceAttrDiff{Old:"0", New:"1", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.public_ip_address_configuration.#":*terraform.ResourceAttrDiff{Old:"0", New:"0", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.2187753327":*terraform.ResourceAttrDiff{Old:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/historical-filters/providers/Microsoft.Network/loadBalancers/agolo-staging-historical-filters/backendAddressPools/BackEndAddressPool", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.#":*terraform.ResourceAttrDiff{Old:"1", New:"0", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.primary":*terraform.ResourceAttrDiff{Old:"true", New:"false", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, 
"network_profile.~3458559321.ip_configuration.0.name":*terraform.ResourceAttrDiff{Old:"", New:"historical-filters-ipconfig", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.primary":*terraform.ResourceAttrDiff{Old:"", New:"true", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.#":*terraform.ResourceAttrDiff{Old:"1", New:"0", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.network_security_group_id":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.accelerated_networking":*terraform.ResourceAttrDiff{Old:"false", New:"false", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.subnet_id":*terraform.ResourceAttrDiff{Old:"", New:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/general/providers/Microsoft.Network/virtualNetworks/agolo/subnets/historical-filters-subnet", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.primary":*terraform.ResourceAttrDiff{Old:"false", New:"false", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.#":*terraform.ResourceAttrDiff{Old:"0", New:"1", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil),RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.public_ip_address_configuration.#":*terraform.ResourceAttrDiff{Old:"0", New:"0", 
NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.subnet_id":*terraform.ResourceAttrDiff{Old:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/general/providers/Microsoft.Network/virtualNetworks/agolo/subnets/historical-filters-subnet", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.name":*terraform.ResourceAttrDiff{Old:"", New:"historical-filters-nic", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.network_security_group_id":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.2187753327":*terraform.ResourceAttrDiff{Old:"", New:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/historical-filters/providers/Microsoft.Network/loadBalancers/agolo-staging-historical-filters/backendAddressPools/BackEndAddressPool", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.load_balancer_inbound_nat_rules_ids.#":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:true, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.name":*terraform.ResourceAttrDiff{Old:"historical-filters-ipconfig", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.name":*terraform.ResourceAttrDiff{Old:"historical-filters-nic", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface 
{}(nil), RequiresNew:false, Sensitive:false, Type:0x0}}, Destroy:false, DestroyDeposed:false, DestroyTainted:false, Meta:map[string]interface {}(nil)}
Diff Two (usually from apply): *terraform.InstanceDiff{mu:sync.Mutex{state:0, sema:0x0}, Attributes:map[string]*terraform.ResourceAttrDiff(nil), Destroy:false, DestroyDeposed:false, DestroyTainted:false, Meta:map[string]interface {}(nil)}
```
- index: 1.0
- text_combine:
azurerm_virtual_machine_scale_set.scaleset: diffs didn't match during apply - _This issue was originally opened by @agolomoodysaada as hashicorp/terraform#17291. It was migrated here as a result of the [provider split](https://www.hashicorp.com/blog/upcoming-provider-changes-in-terraform-0-10/). The original body of the issue is below._
<hr>
```
Terraform Version: 0.11.3
Resource ID: azurerm_virtual_machine_scale_set.scaleset
Mismatch reason: attribute mismatch: network_profile.3458559321.ip_configuration.#
Diff One (usually from plan): *terraform.InstanceDiff{mu:sync.Mutex{state:0, sema:0x0}, Attributes:map[string]*terraform.ResourceAttrDiff{"network_profile.~3458559321.ip_configuration.0.primary":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.accelerated_networking":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.#":*terraform.ResourceAttrDiff{Old:"0", New:"1", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.public_ip_address_configuration.#":*terraform.ResourceAttrDiff{Old:"0", New:"0", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.2187753327":*terraform.ResourceAttrDiff{Old:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/historical-filters/providers/Microsoft.Network/loadBalancers/agolo-staging-historical-filters/backendAddressPools/BackEndAddressPool", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.#":*terraform.ResourceAttrDiff{Old:"1", New:"0", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.primary":*terraform.ResourceAttrDiff{Old:"true", New:"false", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, 
"network_profile.~3458559321.ip_configuration.0.name":*terraform.ResourceAttrDiff{Old:"", New:"historical-filters-ipconfig", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.primary":*terraform.ResourceAttrDiff{Old:"", New:"true", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.#":*terraform.ResourceAttrDiff{Old:"1", New:"0", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.network_security_group_id":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.accelerated_networking":*terraform.ResourceAttrDiff{Old:"false", New:"false", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.subnet_id":*terraform.ResourceAttrDiff{Old:"", New:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/general/providers/Microsoft.Network/virtualNetworks/agolo/subnets/historical-filters-subnet", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.primary":*terraform.ResourceAttrDiff{Old:"false", New:"false", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.#":*terraform.ResourceAttrDiff{Old:"0", New:"1", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil),RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.public_ip_address_configuration.#":*terraform.ResourceAttrDiff{Old:"0", New:"0", 
NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.subnet_id":*terraform.ResourceAttrDiff{Old:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/general/providers/Microsoft.Network/virtualNetworks/agolo/subnets/historical-filters-subnet", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.name":*terraform.ResourceAttrDiff{Old:"", New:"historical-filters-nic", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.network_security_group_id":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.load_balancer_backend_address_pool_ids.2187753327":*terraform.ResourceAttrDiff{Old:"", New:"/subscriptions/93c2ebb5-31e9-487d-9f19-f1716b0673ce/resourceGroups/historical-filters/providers/Microsoft.Network/loadBalancers/agolo-staging-historical-filters/backendAddressPools/BackEndAddressPool", NewComputed:false, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.~3458559321.ip_configuration.0.load_balancer_inbound_nat_rules_ids.#":*terraform.ResourceAttrDiff{Old:"", New:"", NewComputed:true, NewRemoved:false, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.ip_configuration.0.name":*terraform.ResourceAttrDiff{Old:"historical-filters-ipconfig", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface {}(nil), RequiresNew:false, Sensitive:false, Type:0x0}, "network_profile.3458559321.name":*terraform.ResourceAttrDiff{Old:"historical-filters-nic", New:"", NewComputed:false, NewRemoved:true, NewExtra:interface 
{}(nil), RequiresNew:false, Sensitive:false, Type:0x0}}, Destroy:false, DestroyDeposed:false, DestroyTainted:false, Meta:map[string]interface {}(nil)}
Diff Two (usually from apply): *terraform.InstanceDiff{mu:sync.Mutex{state:0, sema:0x0}, Attributes:map[string]*terraform.ResourceAttrDiff(nil), Destroy:false, DestroyDeposed:false, DestroyTainted:false, Meta:map[string]interface {}(nil)}
```
- label: non_test
- text:
azurerm virtual machine scale set scaleset diffs didn t match during apply this issue was originally opened by agolomoodysaada as hashicorp terraform it was migrated here as a result of the the original body of the issue is below terraform version resource id azurerm virtual machine scale set scaleset mismatch reason attribute mismatch network profile ip configuration diff one usually from plan terraform instancediff mu sync mutex state sema attributes map terraform resourceattrdiff network profile ip configuration primary terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile accelerated networking terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration load balancer backend address pool ids terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration public ip address configuration terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration load balancer backend address pool ids terraform resourceattrdiff old subscriptions resourcegroups historical filters providers microsoft network loadbalancers agolo staging historical filters backendaddresspools backendaddresspool new newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type network profile ip configuration load balancer backend address pool ids terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile primary terraform resourceattrdiff old true new false newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type network profile ip 
configuration name terraform resourceattrdiff old new historical filters ipconfig newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile primary terraform resourceattrdiff old new true newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile network security group id terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile accelerated networking terraform resourceattrdiff old false new false newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type network profile ip configuration subnet id terraform resourceattrdiff old new subscriptions resourcegroups general providers microsoft network virtualnetworks agolo subnets historical filters subnet newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration primary terraform resourceattrdiff old false new false newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type network profile ip configuration terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration public ip address configuration terraform resourceattrdiff old new newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration subnet id terraform resourceattrdiff old subscriptions resourcegroups general providers microsoft network virtualnetworks agolo subnets historical filters subnet new newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type 
network profile name terraform resourceattrdiff old new historical filters nic newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile network security group id terraform resourceattrdiff old new newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type network profile ip configuration load balancer backend address pool ids terraform resourceattrdiff old new subscriptions resourcegroups historical filters providers microsoft network loadbalancers agolo staging historical filters backendaddresspools backendaddresspool newcomputed false newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration load balancer inbound nat rules ids terraform resourceattrdiff old new newcomputed true newremoved false newextra interface nil requiresnew false sensitive false type network profile ip configuration name terraform resourceattrdiff old historical filters ipconfig new newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type network profile name terraform resourceattrdiff old historical filters nic new newcomputed false newremoved true newextra interface nil requiresnew false sensitive false type destroy false destroydeposed false destroytainted false meta map interface nil diff two usually from apply terraform instancediff mu sync mutex state sema attributes map terraform resourceattrdiff nil destroy false destroydeposed false destroytainted false meta map interface nil
- binary_label: 0
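Comparing the `body` and `text` columns of the row above suggests the text-normalization step: lowercase everything and collapse each run of non-letter characters (digits, punctuation, URLs, markdown) into a single space. A sketch of that inferred cleaning, not the dataset's actual code:

```python
import re

def clean_text(raw: str) -> str:
    # Lowercase, then replace every run of non a-z characters with one space.
    return re.sub(r"[^a-z]+", " ", raw.lower()).strip()

clean_text("azurerm_virtual_machine_scale_set.scaleset: diffs didn't match")
# → "azurerm virtual machine scale set scaleset diffs didn t match"
```

Note that this also explains why issue numbers like `#17291` vanish from the `text` column: digits are replaced by spaces along with the punctuation.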
**Row 2**
- Unnamed: 0: 65,335
- id: 6,955,906,281
- type: IssuesEvent
- created_at: 2017-12-07 09:37:56
- repo: CLARIAH/wp5_mediasuite
- repo_url: https://api.github.com/repos/CLARIAH/wp5_mediasuite
- action: closed
- title: Improve "detailed resource viewer"(s) and the result list
- labels: Done & tested! Function: result list Importance: medium MS-Component-function MSv2 Work: interface
- body:
==summary of this request:
1. Delete eye icon
2. Make description-metadata viewable only if you click on an arrow icon (expand-collapse) that shows the information-metadata that currently pops-up when people click on eye icon)
==
In the search result list of the Comparative search and Multi-layered Single collection search recipes, you find two options to get the detailed record (i.e., the "detailed resource viewer"): (option 1) clicking on the "eye" icon, or (2) clicking on the title itself:

**Problem 1**: When clicking on the icon (option 1), a pop-up window shows up, which you can close with the X on top. But, when you click on the title (option 2) you see the detailed record, and when you hit the back button, your query disappears.
Solution to problem 1: What is needed is that the query and the result list stays after going back from the detailed record view to the query.
**Problem 2**: When the user clicks on the "eye" icon (option 1) she gets a pop-up window with the metadata, and the option to play the media at the end of this window:

When the user clicks on the title itself (option 2), she gets the media first (with video display and active annotation functionalities), and the metadata ("all data") below the media file:

The problem here is that the user finds two ways of arriving to the "detailed record view" and to the resource, while it would be more clear to present only one.
The proposed solution for problem 2 is:
1) When the user is inspecting the result list, she finds an arrow (collapsible menu) or tab to display the metadata for each individual under the result, without being prompted to any pop-up window or extra window outside the search results. Example from Scopus database, with collapsible menu when clicking on "view abstract"):


Another example from the UvA library catalog with a "Details" tab:

This embedded display can be more clear than the current pop-up window.
2) Then, if the pop-up window is replaced, the Eye icon disappears, and the user can arrive to the detailed record view of individual search results only in one way: e.g., clicking on the title in the search result:

- index: 1.0
- text_combine:
Improve "detailed resource viewer"(s) and the result list - ==summary of this request:
1. Delete eye icon
2. Make description-metadata viewable only if you click on an arrow icon (expand-collapse) that shows the information-metadata that currently pops-up when people click on eye icon)
==
In the search result list of the Comparative search and Multi-layered Single collection search recipes, you find two options to get the detailed record (i.e., the "detailed resource viewer"): (option 1) clicking on the "eye" icon, or (2) clicking on the title itself:

**Problem 1**: When clicking on the icon (option 1), a pop-up window shows up, which you can close with the X on top. But, when you click on the title (option 2) you see the detailed record, and when you hit the back button, your query disappears.
Solution to problem 1: What is needed is that the query and the result list stays after going back from the detailed record view to the query.
**Problem 2**: When the user clicks on the "eye" icon (option 1) she gets a pop-up window with the metadata, and the option to play the media at the end of this window:

When the user clicks on the title itself (option 2), she gets the media first (with video display and active annotation functionalities), and the metadata ("all data") below the media file:

The problem here is that the user finds two ways of arriving to the "detailed record view" and to the resource, while it would be more clear to present only one.
The proposed solution for problem 2 is:
1) When the user is inspecting the result list, she finds an arrow (collapsible menu) or tab to display the metadata for each individual under the result, without being prompted to any pop-up window or extra window outside the search results. Example from Scopus database, with collapsible menu when clicking on "view abstract"):


Another example from the UvA library catalog with a "Details" tab:

This embedded display can be more clear than the current pop-up window.
2) Then, if the pop-up window is replaced, the Eye icon disappears, and the user can arrive to the detailed record view of individual search results only in one way: e.g., clicking on the title in the search result:

- label: test
- text:
improve detailed resource viewer s and the result list summary of this request delete eye icon make description metadata viewable only if you click on an arrow icon expand collapse that shows the information metadata that currently pops up when people click on eye icon in the search result list of the comparative search and multi layered single collection search recipes you find two options to get the detailed record i e the detailed resource viewer option clicking on the eye icon or clicking on the title itself problem when clicking on the icon option a pop up window shows up which you can close with the x on top but when you click on the title option you see the detailed record and when you hit the back button your query disappears solution to problem what is needed is that the query and the result list stays after going back from the detailed record view to the query problem when the user clicks on the eye icon option she gets a pop up window with the metadata and the option to play the media at the end of this window when the user clicks on the title itself option she gets the media first with video display and active annotation functionalities and the metadata all data below the media file the problem here is that the user finds two ways of arriving to the detailed record view and to the resource while it would be more clear to present only one the proposed solution for problem is when the user is inspecting the result list she finds an arrow collapsible menu or tab to display the metadata for each individual under the result without being prompted to any pop up window or extra window outside the search results example from scopus database with collapsible menu when clicking on view abstract another example from the uva library catalog with a details tab this embedded display can be more clear than the current pop up window then if the pop up window is replaced the eye icon disappears and the user can arrive to the detailed record view of individual search 
results only in one way e g clicking on the title in the search result
- binary_label: 1
**Row 3**
- Unnamed: 0: 204,794
- id: 23,280,869,233
- type: IssuesEvent
- created_at: 2022-08-05 11:57:24
- repo: MendDemo-josh/IdentityServer4
- repo_url: https://api.github.com/repos/MendDemo-josh/IdentityServer4
- action: opened
- title: bootstrap-3.3.6.min.js: 6 vulnerabilities (highest severity is: 6.1)
- labels: security vulnerability
- body:
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.6.min.js</b></p></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2019-8331](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1 | ❌ |
| [CVE-2018-14040](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0 | ❌ |
| [CVE-2018-20677](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0 | ❌ |
| [CVE-2018-14042](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | org.webjars.npm:bootstrap:4.1.2.org.webjars:bootstrap:3.4.0 | ❌ |
| [CVE-2018-20676](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | bootstrap - 3.4.0 | ❌ |
| [CVE-2016-10735](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | bootstrap - 3.4.0, 4.0.0-beta.2 | ❌ |
## Details
<details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2019-8331</summary>
### Vulnerable Library - <b>bootstrap-3.3.6.min.js</b></p>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Bootstrap before 3.4.1 and 4.3.x before 4.3.1, XSS is possible in the tooltip or popover data-template attribute.
<p>Publish Date: 2019-02-20
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331>CVE-2019-8331</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
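The 6.1 score reported above can be reproduced from those metrics (AV:N / AC:L / PR:N / UI:R / S:C / C:L / I:L / A:N) with the CVSS v3.0 base-score equations; the constants come from the FIRST specification. This sketch is an editorial illustration, not part of the Mend report:

```python
import math

def cvss3_base(av, ac, pr, ui, s, c, i, a):
    """CVSS v3.0 base score; metric weights per the FIRST specification."""
    AV = {"N": 0.85, "A": 0.62, "L": 0.55, "P": 0.2}[av]
    AC = {"L": 0.77, "H": 0.44}[ac]
    # PR weight depends on whether scope is changed.
    PR = ({"N": 0.85, "L": 0.68, "H": 0.50} if s == "C"
          else {"N": 0.85, "L": 0.62, "H": 0.27})[pr]
    UI = {"N": 0.85, "R": 0.62}[ui]
    CIA = {"N": 0.0, "L": 0.22, "H": 0.56}
    iss = 1 - (1 - CIA[c]) * (1 - CIA[i]) * (1 - CIA[a])
    if s == "C":
        impact = 7.52 * (iss - 0.029) - 3.25 * (iss - 0.02) ** 15
    else:
        impact = 6.42 * iss
    if impact <= 0:
        return 0.0
    expl = 8.22 * AV * AC * PR * UI
    score = min(1.08 * (impact + expl), 10) if s == "C" else min(impact + expl, 10)
    return math.ceil(score * 10) / 10  # spec rounds up to one decimal

cvss3_base("N", "L", "N", "R", "C", "L", "L", "N")  # → 6.1
```

Plugging in the vector shown yields exactly the 6.1 the table reports for each of these six CVEs.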
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2019-02-20</p>
<p>Fix Resolution: bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-14040</summary>
### Vulnerable Library - <b>bootstrap-3.3.6.min.js</b>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Bootstrap before 4.1.2, XSS is possible in the collapse data-parent attribute.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040>CVE-2018-14040</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-20677</summary>
### Vulnerable Library - <b>bootstrap-3.3.6.min.js</b>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Bootstrap before 3.4.0, XSS is possible in the affix configuration target property.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677>CVE-2018-20677</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20677</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-14042</summary>
### Vulnerable Library - <b>bootstrap-3.3.6.min.js</b>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Bootstrap before 4.1.2, XSS is possible in the data-container property of tooltip.
<p>Publish Date: 2018-07-13
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042>CVE-2018-14042</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2018-07-13</p>
<p>Fix Resolution: org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2018-20676</summary>
### Vulnerable Library - <b>bootstrap-3.3.6.min.js</b>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Bootstrap before 3.4.0, XSS is possible in the tooltip data-viewport attribute.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676>CVE-2018-20676</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20676">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2018-20676</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0</p>
</p>
<p></p>
</details><details>
<summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> CVE-2016-10735</summary>
### Vulnerable Library - <b>bootstrap-3.3.6.min.js</b>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
Dependency Hierarchy:
- :x: **bootstrap-3.3.6.min.js** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p>
<p>Found in base branch: <b>main</b></p>
</p>
<p></p>
### Vulnerability Details
<p>
In Bootstrap 3.x before 3.4.0 and 4.x-beta before 4.0.0-beta.2, XSS is possible in the data-target attribute, a different vulnerability than CVE-2018-14041.
<p>Publish Date: 2019-01-09
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735>CVE-2016-10735</a></p>
</p>
<p></p>
### CVSS 3 Score Details (<b>6.1</b>)
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
<p></p>
### Suggested Fix
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2016-10735</a></p>
<p>Release Date: 2019-01-09</p>
<p>Fix Resolution: bootstrap - 3.4.0, 4.0.0-beta.2</p>
</p>
<p></p>
</details>
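All six CVEs above share one failure mode: Bootstrap interpolated attacker-controllable `data-*` attribute values (`data-template`, `data-parent`, `data-container`, `data-viewport`, `data-target`) into the DOM without escaping. The following is a minimal, framework-independent sketch (not Bootstrap's actual code, and `escapeHtml` is an illustrative helper, not a Bootstrap API) showing why verbatim insertion is exploitable and how HTML-escaping neutralizes it; the fixed Bootstrap releases (3.4.x/4.3.x) address this with a built-in sanitizer rather than simple escaping.

```javascript
// Illustrative helper: escape the characters HTML treats as markup.
function escapeHtml(s) {
  return s.replace(/[&<>"']/g, (c) => ({
    '&': '&amp;', '<': '&lt;', '>': '&gt;', '"': '&quot;', "'": '&#39;'
  }[c]));
}

// An attribute value an attacker controls, e.g. via a reflected parameter.
const attacker = '<img src=x onerror=alert(1)>';

// Inserted verbatim into innerHTML (the vulnerable pattern):
const unsafe = '<div class="tooltip-inner">' + attacker + '</div>';

// Escaped before insertion: the payload renders as inert text.
const safe = '<div class="tooltip-inner">' + escapeHtml(attacker) + '</div>';
```

In the `unsafe` string the `<img>` tag is still live markup, so assigning it to `innerHTML` fires the `onerror` handler; in the `safe` string the tag is reduced to text (`&lt;img ...&gt;`) and cannot execute.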
|
True
|
bootstrap-3.3.6.min.js: 6 vulnerabilities (highest severity is: 6.1) - <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>bootstrap-3.3.6.min.js</b></summary>
<p>The most popular front-end framework for developing responsive, mobile first projects on the web.</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js">https://cdnjs.cloudflare.com/ajax/libs/twitter-bootstrap/3.3.6/js/bootstrap.min.js</a></p>
<p>Path to vulnerable library: /samples/Clients/src/MvcHybridBackChannel/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybridAutomaticRefresh/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js,/samples/Clients/old/MvcHybrid/wwwroot/lib/bootstrap/dist/js/bootstrap.min.js</p>
<p>
<p>Found in HEAD commit: <a href="https://github.com/MendDemo-josh/IdentityServer4/commit/ff53f0b986cff715c1a22e58306a6985498ab04f">ff53f0b986cff715c1a22e58306a6985498ab04f</a></p></details>
## Vulnerabilities
| CVE | Severity | <img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS | Dependency | Type | Fixed in | Remediation Available |
| ------------- | ------------- | ----- | ----- | ----- | --- | --- |
| [CVE-2019-8331](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-8331) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | bootstrap - 3.4.1,4.3.1;bootstrap-sass - 3.4.1,4.3.1 | ❌ |
| [CVE-2018-14040](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14040) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0 | ❌ |
| [CVE-2018-20677](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20677) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | Bootstrap - v3.4.0;NorDroN.AngularTemplate - 0.1.6;Dynamic.NET.Express.ProjectTemplates - 0.8.0;dotnetng.template - 1.0.0.4;ZNxtApp.Core.Module.Theme - 1.0.9-Beta;JMeter - 5.0.0 | ❌ |
| [CVE-2018-14042](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-14042) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | org.webjars.npm:bootstrap:4.1.2,org.webjars:bootstrap:3.4.0 | ❌ |
| [CVE-2018-20676](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-20676) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | bootstrap - 3.4.0 | ❌ |
| [CVE-2016-10735](https://vuln.whitesourcesoftware.com/vulnerability/CVE-2016-10735) | <img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Medium | 6.1 | bootstrap-3.3.6.min.js | Direct | bootstrap - 3.4.0, 4.0.0-beta.2 | ❌ |
|
non_test
|
| 0
|
310,497
| 26,719,178,215
|
IssuesEvent
|
2023-01-28 23:06:53
|
PalisadoesFoundation/talawa-api
|
https://api.github.com/repos/PalisadoesFoundation/talawa-api
|
closed
|
Test: src/lib/resolvers/index.ts
|
good first issue unapproved points 01 test
|
- Please coordinate **issue assignment** and **PR reviews** with the contributors listed in this issue https://github.com/PalisadoesFoundation/talawa/issues/359
The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage.
- Tests need to be written for file `src/lib/resolvers/index.ts`
- We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed.
- When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a
single file with the name `__tests__/resolvers/index.spec.ts`. You may need to create the appropriate directory structure to do this.
### IMPORTANT:
Please refer to the parent issue on how to implement these tests correctly:
- https://github.com/PalisadoesFoundation/talawa-api/issues/490
### PR Acceptance Criteria
- When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR.
- [The current code coverage for the file can be found by visiting this page](https://app.codecov.io/gh/PalisadoesFoundation/talawa-api?displayType=list). Login using your GitHub credentials.
- Create your own `codecov.io` to help with testing.
- The PR you create will show a report for the code coverage for the file you have added. You can also use that as a guide.
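Tests for an aggregator file like `src/lib/resolvers/index.ts` usually just assert that every resolver group is re-exported and non-empty. A minimal sketch of that shape (in Python for brevity; the real test would be Jest/TypeScript, and every name below is hypothetical):

```python
# Hypothetical stand-in for what src/lib/resolvers/index.ts would export:
# a map of resolver groups, each mapping field names to resolver functions.
resolvers = {
    "Query": {"me": lambda: "..."},
    "Mutation": {"login": lambda: "..."},
    "Subscription": {"messageSent": lambda: "..."},
}

# The "index" test simply checks the aggregation: every expected group is
# present and non-empty, and every entry is callable.
expected_groups = ["Query", "Mutation", "Subscription"]
for group in expected_groups:
    assert group in resolvers and resolvers[group], group
    for name, fn in resolvers[group].items():
        assert callable(fn), name
print("index exports all expected resolver groups")
```

Because the index file only aggregates, a test of this shape reaches 100% coverage of the file itself; the individual resolvers are covered by their own test files.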
|
1.0
|
Test: src/lib/resolvers/index.ts - - Please coordinate **issue assignment** and **PR reviews** with the contributors listed in this issue https://github.com/PalisadoesFoundation/talawa/issues/359
The Talawa-API code base needs to be 100% reliable. This means we need to have 100% test code coverage.
- Tests need to be written for file `src/lib/resolvers/index.ts`
- We will need the API to be refactored for all methods, classes and/or functions found in this file for testing to be correctly executed.
- When complete, all methods, classes and/or functions in the refactored file will need to be tested. These tests must be placed in a
single file with the name `__tests__/resolvers/index.spec.ts`. You may need to create the appropriate directory structure to do this.
### IMPORTANT:
Please refer to the parent issue on how to implement these tests correctly:
- https://github.com/PalisadoesFoundation/talawa-api/issues/490
### PR Acceptance Criteria
- When complete this file must show **100%** coverage when merged into the code base. This will be clearly visible when you submit your PR.
- [The current code coverage for the file can be found by visiting this page](https://app.codecov.io/gh/PalisadoesFoundation/talawa-api?displayType=list). Login using your GitHub credentials.
- Create your own `codecov.io` to help with testing.
- The PR you create will show a report for the code coverage for the file you have added. You can also use that as a guide.
|
test
|
test src lib resolvers index ts please coordinate issue assignment and pr reviews with the contributors listed in this issue the talawa api code base needs to be reliable this means we need to have test code coverage tests need to be written for file src lib resolvers index ts we will need the api to be refactored for all methods classes and or functions found in this file for testing to be correctly executed when complete all methods classes and or functions in the refactored file will need to be tested these tests must be placed in a single file with the name tests resolvers index spec ts you may need to create the appropriate directory structure to do this important please refer to the parent issue on how to implement these tests correctly pr acceptance criteria when complete this file must show coverage when merged into the code base this will be clearly visible when you submit your pr login using your github credentials create your own codecov io to help with testing the pr you create will show a report for the code coverage for the file you have added you can also use that as a guide
| 1
|
802,209
| 28,781,234,562
|
IssuesEvent
|
2023-05-02 00:59:58
|
Dotori-app/Dotori-iOS
|
https://api.github.com/repos/Dotori-app/Dotori-iOS
|
closed
|
Change Then to Configure
|
⚙ Setting 2️⃣Priority: Medium ⚡️ Simple
|
### Describe
Change our hand-implemented Then to use
https://github.com/GSM-MSG/Configure instead
### Additional
_No response_
|
1.0
|
Change Then to Configure - ### Describe
Change our hand-implemented Then to use
https://github.com/GSM-MSG/Configure instead
### Additional
_No response_
|
non_test
|
change then to configure describe change our hand implemented then to use additional no response
| 0
|
95,285
| 16,084,499,655
|
IssuesEvent
|
2021-04-26 09:35:37
|
NixOS/nixpkgs
|
https://api.github.com/repos/NixOS/nixpkgs
|
opened
|
migrate away from ffmpeg_3
|
1.severity: security
|
`ffmpeg_3` has many open vulnerabilities (see #94003 and #120372). There seems to be no effort to add patches for these, so we should drop `ffmpeg_3` or at least mark it as insecure.
In https://github.com/NixOS/nixpkgs/pull/89264, `ffmpeg_3` was made the de facto default by making every package that depends on `ffmpeg` depend on `ffmpeg_3` instead. I think that was a bad idea given that the Ffmpeg packages aren't well maintained.
Most packages should build just fine with `ffmpeg` but someone needs to test them.
Is there an easy way to obtain a list of packages using `ffmpeg_3` and ping their maintainers?
cc @doronbehar @codyopel
|
True
|
migrate away from ffmpeg_3 - `ffmpeg_3` has many open vulnerabilities (see #94003 and #120372). There seems to be no effort to add patches for these, so we should drop `ffmpeg_3` or at least mark it as insecure.
In https://github.com/NixOS/nixpkgs/pull/89264, `ffmpeg_3` was made the de facto default by making every package that depends on `ffmpeg` depend on `ffmpeg_3` instead. I think that was a bad idea given that the Ffmpeg packages aren't well maintained.
Most packages should build just fine with `ffmpeg` but someone needs to test them.
Is there an easy way to obtain a list of packages using `ffmpeg_3` and ping their maintainers?
cc @doronbehar @codyopel
|
non_test
|
migrate away from ffmpeg ffmpeg has many open vulnerabilities see and there seems to be no effort to add patches for these so we should drop ffmpeg or at least mark it as insecure in ffmpeg was made the de facto default by making every package that depends on ffmpeg depend on ffmpeg instead i think that was a bad idea given that the ffmpeg packages aren t well maintained most packages should build just fine with ffmpeg but someone needs to test them is there an easy way to obtain a list of packages using ffmpeg and ping their maintainers cc doronbehar codyopel
| 0
|
180,010
| 13,915,349,995
|
IssuesEvent
|
2020-10-21 00:23:32
|
nicorithner/rails_engine
|
https://api.github.com/repos/nicorithner/rails_engine
|
closed
|
Relationships
|
API Tests database
|
These endpoints should show related records. The relationship endpoints you should expose are:
- [ ] GET /api/v1/merchants/:id/items - return all items associated with a merchant.
- [ ] GET /api/v1/items/:id/merchants - return the merchant associated with an item
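The two lookups behind these relationship endpoints can be sketched with an in-memory model (Python for brevity; the project itself is Rails, and the sample data below is hypothetical):

```python
# Hypothetical in-memory sketch of the two relationship lookups:
# GET /api/v1/merchants/:id/items  and  GET /api/v1/items/:id/merchants.
merchants = {1: {"id": 1, "name": "Schroeder-Jerde"}}
items = [
    {"id": 10, "name": "Widget", "merchant_id": 1},
    {"id": 11, "name": "Gadget", "merchant_id": 1},
]

def items_for_merchant(merchant_id):
    """Return all items associated with a merchant."""
    return [item for item in items if item["merchant_id"] == merchant_id]

def merchant_for_item(item_id):
    """Return the merchant associated with an item."""
    item = next(i for i in items if i["id"] == item_id)
    return merchants[item["merchant_id"]]

print(len(items_for_merchant(1)))     # 2
print(merchant_for_item(10)["name"])  # Schroeder-Jerde
```

In Rails these reduce to `merchant.items` and `item.merchant` on `has_many`/`belongs_to` associations; the controllers just serialize those results.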
|
1.0
|
Relationships - These endpoints should show related records. The relationship endpoints you should expose are:
- [ ] GET /api/v1/merchants/:id/items - return all items associated with a merchant.
- [ ] GET /api/v1/items/:id/merchants - return the merchant associated with an item
|
test
|
relationships these endpoints should show related records the relationship endpoints you should expose are get api merchants id items return all items associated with a merchant get api items id merchants return the merchant associated with an item
| 1
|
331,275
| 28,807,832,013
|
IssuesEvent
|
2023-05-03 00:14:01
|
NCAR/ucomp-pipeline
|
https://api.github.com/repos/NCAR/ucomp-pipeline
|
closed
|
Fix incorrect FLTFILE1 and MFLTEXT1 values
|
bug needs testing
|
For the run for 20220901 with intermediate products, the following information about the flats used for `20220901.182014.ucomp.1074.l1.3.fts` was:
FLTFILE1= '20220901.173314.67.ucomp.530.l0.fts' / name of raw flat file used
FLTEXTS1= '2,8,14,20' / ext in 20220901.173314.67.ucomp.530.l0.fts used
MFLTEXT1= '122,125 ' / ext in 20220901.ucomp.flat.1074.fts, wt 1.00
For the `FLTFILE1` value, this is indicating that a 530 flat was used to flat correct a 1074 science image. Either that is true and an incorrect operation, or it is recorded in the headers incorrectly.
The `MFLTEXT1` extensions are not possible, as `20220901.ucomp.flat.1074.fts` only has 42 flat extensions stored in it.
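A header sanity check that would flag both problems can be sketched as follows (the header values come from this issue; the wave-region parsing is an assumption based on the filename pattern shown above):

```python
# Sketch of a sanity check for flat-related FITS header values.
# Filenames encode a wave region (e.g. "530" or "1074"); MFLTEXT1 lists
# extensions into the master flat file, which here has 42 extensions.
def wave_region(filename):
    # e.g. "20220901.173314.67.ucomp.530.l0.fts" -> "530"
    parts = filename.split(".")
    return parts[parts.index("ucomp") + 1]

def check_flat_headers(science_region, fltfile1, mfltext1, n_flat_exts):
    problems = []
    if wave_region(fltfile1) != science_region:
        problems.append("FLTFILE1 wave region does not match science file")
    exts = [int(e) for e in mfltext1.split(",")]
    if any(e > n_flat_exts for e in exts):
        problems.append("MFLTEXT1 lists extensions beyond the flat file")
    return problems

issues = check_flat_headers(
    science_region="1074",
    fltfile1="20220901.173314.67.ucomp.530.l0.fts",
    mfltext1="122,125",
    n_flat_exts=42,
)
print(issues)  # both problems described in this issue are flagged
```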
|
1.0
|
Fix incorrect FLTFILE1 and MFLTEXT1 values - For the run for 20220901 with intermediate products, the following information about the flats used for `20220901.182014.ucomp.1074.l1.3.fts` was:
FLTFILE1= '20220901.173314.67.ucomp.530.l0.fts' / name of raw flat file used
FLTEXTS1= '2,8,14,20' / ext in 20220901.173314.67.ucomp.530.l0.fts used
MFLTEXT1= '122,125 ' / ext in 20220901.ucomp.flat.1074.fts, wt 1.00
For the `FLTFILE1` value, this is indicating that a 530 flat was used to flat correct a 1074 science image. Either that is true and an incorrect operation, or it is recorded in the headers incorrectly.
The `MFLTEXT1` extensions are not possible, as `20220901.ucomp.flat.1074.fts` only has 42 flat extensions stored in it.
|
test
|
fix incorrect and values for the run for with intermediate products the following information about the flats used for ucomp fts was ucomp fts name of raw flat file used ext in ucomp fts used ext in ucomp flat fts wt for the value this is indicating that a flat was used to flat correct a science image either that is true and an incorrect operation or it is recorded in the headers incorrectly the extensions are not possible as ucomp flat fts only has flat extensions stored in it
| 1
|
79,007
| 9,815,804,579
|
IssuesEvent
|
2019-06-13 13:26:43
|
xi-editor/xi-editor
|
https://api.github.com/repos/xi-editor/xi-editor
|
closed
|
Displaying spaces, tabs (and other usually not drawn characters?)
|
feature request needs design
|
It would be really nice if Xi supported drawing tabs, spaces etc. I've (sort of) implemented support for this in gxi by replacing spaces with a dot during rendering, but this comes with some problems (namely that I have to carefully space them to not mess up the cursor position and that theming them doesn't work properly because they're wider than spaces). It would be great if Xi supported this natively and I'd very much like to work on this, but I'm not quite sure how to implement this.
|
1.0
|
Displaying spaces, tabs (and other usually not drawn characters?) - It would be really nice if Xi supported drawing tabs, spaces etc. I've (sort of) implemented support for this in gxi by replacing spaces with a dot during rendering, but this comes with some problems (namely that I have to carefully space them to not mess up the cursor position and that theming them doesn't work properly because they're wider than spaces). It would be great if Xi supported this natively and I'd very much like to work on this, but I'm not quite sure how to implement this.
|
non_test
|
displaying spaces tabs and other usually not drawn characters it would be really nice if xi supported drawing tabs spaces etc i ve sort of implemented support for this in gxi by replacing spaces with a dot during rendering but this comes with some problems namely that i have to carefully space them to not mess up the cursor position and that theming them doesn t work properly because they re wider than spaces it would be great if xi supported this natively and i d very much like to work on this but i m not quite sure how to implement this
| 0
|
111,450
| 4,473,202,714
|
IssuesEvent
|
2016-08-26 02:22:56
|
GandaG/fomod-designer
|
https://api.github.com/repos/GandaG/fomod-designer
|
closed
|
Refresh rate changed in settings but not taking effect
|
bug mid priority
|
Per request, opening a new ticket.
Changed the Preview Refresh Rate from On Property Editing to On Node Select, but the preview window is still updated with each change in the property editor. A restart of the utility did not change behavior.
|
1.0
|
Refresh rate changed in settings but not taking effect - Per request, opening a new ticket.
Changed the Preview Refresh Rate from On Property Editing to On Node Select, but the preview window is still updated with each change in the property editor. A restart of the utility did not change behavior.
|
non_test
|
refresh rate changed in settings but not taking effect per request opening a new ticket changed the preview refresh rate from on property editing to on node select but the preview window is still updated with each change in the property editor a restart of the utility did not change behavior
| 0
|
82,315
| 10,240,479,025
|
IssuesEvent
|
2019-08-19 20:54:45
|
GoogleChrome/devsummit
|
https://api.github.com/repos/GoogleChrome/devsummit
|
opened
|
Schedule page design v2
|
💀 design
|
- [ ] current session
- [ ] current time
- [ ] past session
- [ ] sessions with stream links
any other interesting states or functionality we want to strive for?
|
1.0
|
Schedule page design v2 - - [ ] current session
- [ ] current time
- [ ] past session
- [ ] sessions with stream links
any other interesting states or functionality we want to strive for?
|
non_test
|
schedule page design current session current time past session sessions with stream links any other interesting states or functionality we want to strive for
| 0
|
86,635
| 8,042,427,538
|
IssuesEvent
|
2018-07-31 08:07:02
|
alibaba/pouch
|
https://api.github.com/repos/alibaba/pouch
|
closed
|
[help wanted] add unit-test for modifyContainerNamespaceOptions
|
areas/test
|
### Ⅰ. Issue Description
Add unit-test for `modifyContainerNamespaceOptions` method which is located in cri/v1alpha1/cri_utils.go.
You can take [env_test.go](https://github.com/alibaba/pouch/blob/master/apis/opts/env_test.go) for reference.
### Ⅱ. Describe what happened
### Ⅲ. Describe what you expected to happen
### Ⅳ. How to reproduce it (as minimally and precisely as possible)
1.
2.
3.
### Ⅴ. Anything else we need to know?
### Ⅵ. Environment:
- pouch version (use `pouch version`):
- OS (e.g. from /etc/os-release):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
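The referenced `env_test.go` uses Go's table-driven style, and the requested test would follow the same shape. A minimal sketch of that pattern (in Python for brevity; the stand-in function and cases below are hypothetical, not the real CRI logic):

```python
# Table-driven test sketch (the real test is Go; names here are hypothetical).
def modify_ns_mode(mode, sandbox_id):
    """Toy stand-in: expand a namespace mode into a runtime option string."""
    if mode == "POD":
        return "container:" + sandbox_id
    if mode == "NODE":
        return "host"
    return ""  # default: container-private namespace

# Each case names an input scenario and the expected output, so a failure
# message points at exactly one row of the table.
cases = [
    {"name": "pod namespace",  "mode": "POD",  "want": "container:sandbox-1"},
    {"name": "node namespace", "mode": "NODE", "want": "host"},
    {"name": "private",        "mode": "",     "want": ""},
]
for case in cases:
    got = modify_ns_mode(case["mode"], "sandbox-1")
    assert got == case["want"], case["name"]
print("all cases passed")
```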
|
1.0
|
[help wanted] add unit-test for modifyContainerNamespaceOptions - ### Ⅰ. Issue Description
Add unit-test for `modifyContainerNamespaceOptions` method which is located in cri/v1alpha1/cri_utils.go.
You can take [env_test.go](https://github.com/alibaba/pouch/blob/master/apis/opts/env_test.go) for reference.
### Ⅱ. Describe what happened
### Ⅲ. Describe what you expected to happen
### Ⅳ. How to reproduce it (as minimally and precisely as possible)
1.
2.
3.
### Ⅴ. Anything else we need to know?
### Ⅵ. Environment:
- pouch version (use `pouch version`):
- OS (e.g. from /etc/os-release):
- Kernel (e.g. `uname -a`):
- Install tools:
- Others:
|
test
|
add unit test for modifycontainernamespaceoptions ⅰ issue description add unit test for modifycontainernamespaceoptions method which is located in cri cri utils go you can take for reference ⅱ describe what happened ⅲ describe what you expected to happen ⅳ how to reproduce it as minimally and precisely as possible ⅴ anything else we need to know ⅵ environment pouch version use pouch version os e g from etc os release kernel e g uname a install tools others
| 1
|
101,877
| 8,806,665,464
|
IssuesEvent
|
2018-12-27 05:46:23
|
humera987/FXLabs-Test-Automation
|
https://api.github.com/repos/humera987/FXLabs-Test-Automation
|
closed
|
testing 3 : ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue
|
testing 3
|
Project : testing 3
Job : Default
Env : Default
Region : US_WEST
Result : fail
Status Code : 200
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=ZTk4NmZmMjMtMTkyYS00YTFhLWEwOGUtODNiZGUyMzJiNTYz; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 27 Dec 2018 05:38:27 GMT]}
Endpoint : http://13.56.210.25/api/v1/abac/project/null/add-abacpositive-rules
Request :
Response :
{
"requestId" : "None",
"requestTime" : "2018-12-27T05:38:27.881+0000",
"errors" : true,
"messages" : [ {
"type" : "ERROR",
"key" : null,
"value" : "ABAC Positive Generator not found in project null"
} ],
"data" : null,
"totalPages" : 0,
"totalElements" : 0
}
Logs :
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : URL [http://13.56.210.25/api/v1/abac/project/null/add-abacpositive-rules]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Method [GET]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Request []
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Request-Headers [{Content-Type=[application/json], Accept=[application/json], Authorization=[Basic SHVtZXJhLy9odW1lcmFAZnhsYWJzLmlvOmh1bWVyYTEyMyQ=]}]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Response [{
"requestId" : "None",
"requestTime" : "2018-12-27T05:38:27.881+0000",
"errors" : true,
"messages" : [ {
"type" : "ERROR",
"key" : null,
"value" : "ABAC Positive Generator not found in project null"
} ],
"data" : null,
"totalPages" : 0,
"totalElements" : 0
}]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Response-Headers [{X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=ZTk4NmZmMjMtMTkyYS00YTFhLWEwOGUtODNiZGUyMzJiNTYz; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 27 Dec 2018 05:38:27 GMT]}]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : StatusCode [200]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Time [1574]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Size [225]
2018-12-27 05:38:27 INFO [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 401] resolved-to [200 != 401] result [Passed]
2018-12-27 05:38:27 INFO [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 500] resolved-to [200 != 500] result [Passed]
2018-12-27 05:38:27 ERROR [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 200] resolved-to [200 != 200] result [Failed]
2018-12-27 05:38:27 INFO [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 404] resolved-to [200 != 404] result [Passed]
--- FX Bot ---
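The `Assertion [...] resolved-to [...] result [...]` lines above follow a simple substitute-then-compare scheme; a minimal re-implementation (an illustrative sketch, not FX Bot's actual code) reproduces the four results:

```python
# Minimal sketch of how the logged assertions resolve:
# "@StatusCode != 200" with StatusCode=200 -> "200 != 200" -> Failed.
def evaluate(assertion, status_code):
    resolved = assertion.replace("@StatusCode", str(status_code))
    left, op, right = resolved.split()
    outcome = (left != right) if op == "!=" else (left == right)
    return resolved, "Passed" if outcome else "Failed"

for a in ["@StatusCode != 401", "@StatusCode != 500",
          "@StatusCode != 200", "@StatusCode != 404"]:
    resolved, result = evaluate(a, 200)
    print(f"Assertion [{a}] resolved-to [{resolved}] result [{result}]")
```

The run fails because this suite expects a non-200 response for a null path parameter, while the endpoint returned HTTP 200 with the error only in the JSON body.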
|
1.0
|
testing 3 : ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue - Project : testing 3
Job : Default
Env : Default
Region : US_WEST
Result : fail
Status Code : 200
Headers : {X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=ZTk4NmZmMjMtMTkyYS00YTFhLWEwOGUtODNiZGUyMzJiNTYz; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 27 Dec 2018 05:38:27 GMT]}
Endpoint : http://13.56.210.25/api/v1/abac/project/null/add-abacpositive-rules
Request :
Response :
{
"requestId" : "None",
"requestTime" : "2018-12-27T05:38:27.881+0000",
"errors" : true,
"messages" : [ {
"type" : "ERROR",
"key" : null,
"value" : "ABAC Positive Generator not found in project null"
} ],
"data" : null,
"totalPages" : 0,
"totalElements" : 0
}
Logs :
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : URL [http://13.56.210.25/api/v1/abac/project/null/add-abacpositive-rules]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Method [GET]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Request []
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Request-Headers [{Content-Type=[application/json], Accept=[application/json], Authorization=[Basic SHVtZXJhLy9odW1lcmFAZnhsYWJzLmlvOmh1bWVyYTEyMyQ=]}]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Response [{
"requestId" : "None",
"requestTime" : "2018-12-27T05:38:27.881+0000",
"errors" : true,
"messages" : [ {
"type" : "ERROR",
"key" : null,
"value" : "ABAC Positive Generator not found in project null"
} ],
"data" : null,
"totalPages" : 0,
"totalElements" : 0
}]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Response-Headers [{X-Content-Type-Options=[nosniff], X-XSS-Protection=[1; mode=block], Cache-Control=[no-cache, no-store, max-age=0, must-revalidate], Pragma=[no-cache], Expires=[0], X-Frame-Options=[DENY], Set-Cookie=[SESSION=ZTk4NmZmMjMtMTkyYS00YTFhLWEwOGUtODNiZGUyMzJiNTYz; Path=/; HttpOnly], Content-Type=[application/json;charset=UTF-8], Transfer-Encoding=[chunked], Date=[Thu, 27 Dec 2018 05:38:27 GMT]}]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : StatusCode [200]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Time [1574]
2018-12-27 05:38:27 DEBUG [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Size [225]
2018-12-27 05:38:27 INFO [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 401] resolved-to [200 != 401] result [Passed]
2018-12-27 05:38:27 INFO [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 500] resolved-to [200 != 500] result [Passed]
2018-12-27 05:38:27 ERROR [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 200] resolved-to [200 != 200] result [Failed]
2018-12-27 05:38:27 INFO [ApiV1AbacProjectProjectidAddAbacpositiveRulesGetPathParamProjectidNullValue] : Assertion [@StatusCode != 404] resolved-to [200 != 404] result [Passed]
--- FX Bot ---
|
test
|
testing project testing job default env default region us west result fail status code headers x content type options x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date endpoint request response requestid none requesttime errors true messages type error key null value abac positive generator not found in project null data null totalpages totalelements logs debug url debug method debug request debug request headers accept authorization debug response requestid none requesttime errors true messages type error key null value abac positive generator not found in project null data null totalpages totalelements debug response headers x xss protection cache control pragma expires x frame options set cookie content type transfer encoding date debug statuscode debug time debug size info assertion resolved to result info assertion resolved to result error assertion resolved to result info assertion resolved to result fx bot
| 1
|
723,990
| 24,914,971,831
|
IssuesEvent
|
2022-10-30 09:39:55
|
apache/hudi
|
https://api.github.com/repos/apache/hudi
|
closed
|
[SUPPORT] Executor OOM upserting 20M records from Kafka
|
performance priority:major
|
**Describe the problem you faced**
While upserting Mongo oplogs from Kafka to Blob, facing Executor OOM
**Environment Description**
* Hudi version : 0.9.0
* Spark version : 2.4.4
* Hive version : 3.1.2
* Hadoop version : 2.7.3
* Storage (HDFS/S3/GCS..) : Azure Blob
* Running on Docker? (yes/no) : K8s
**Additional context**
Spark K8s yaml file
```
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
name: hudi-ss-ah-ds-{{ ti.job_id }}
namespace: dataplatform
labels:
spark_name: hudi-ss-ah-ds-{{ ti.job_id }}
dag_name: hudi-ss-ah
task_name: ds
environment: "prod"
cloud: "aws"
tier: "t2"
team: "dataplatform"
service_type: "airflow"
k8s_cluster_name: "tapi"
plip_version: 0.1.10-dp-ev
spec:
type: Java
mode: cluster
image: "hudi-ds:4"
imagePullPolicy: Always
mainClass: org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer
mainApplicationFile: "local:///opt/spark/hudi/hudi-utilities-bundle_2.11-0.9.0-SNAPSHOT.jar"
deps:
packages:
- org.apache.spark:spark-avro_2.11:2.4.4
sparkConf:
"spark.serializer": "org.apache.spark.serializer.KryoSerializer"
"spark.memory.fraction": "0.2"
"spark.memory.storageFraction": "0.2"
arguments:
- "--table-type"
- "COPY_ON_WRITE"
- "--props"
- "/opt/spark/hudi/config/source.properties"
- "--schemaprovider-class"
- "org.apache.hudi.utilities.schema.SchemaRegistryProvider"
- "--source-class"
- "org.apache.hudi.utilities.sources.JsonKafkaSource"
- "--target-base-path"
- "s3a://<ourbucket>/fusion/mongo/data/application_histories"
- "--target-table"
- "application_histories"
- "--op"
- "UPSERT"
- "--source-ordering-field"
- "__ts_ms"
- "--continuous"
- "--min-sync-interval-seconds"
- "60"
sparkVersion: "2.4.4"
restartPolicy:
type: Always
onFailureRetries: 100000
onFailureRetryInterval: 60
onSubmissionFailureRetries: 100000
onSubmissionFailureRetryInterval: 60
timeToLiveSeconds: 3600
volumes:
- name: hudi-ss-ah-ds
configMap:
name: hudi-ss-ah-ds
driver:
env:
- name: HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_impl
value: org.apache.hadoop.fs.s3a.S3AFileSystem
cores: 1
coreLimit: "1200m"
memory: "4G"
serviceAccount: "dataplatform"
volumeMounts:
- name: hudi-ss-ah-ds
mountPath: /opt/spark/hudi/config
subpath: config.yaml
javaOptions: "-Dnetworkaddress.cache.ttl=60 -Duser.timezone=IST -XX:+PrintGCApplicationConcurrentTime -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/varadarb_ds_driver.hprof"
executor:
env:
- name: HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_impl
value: org.apache.hadoop.fs.s3a.S3AFileSystem
cores: 1
instances: 20
memory: "6G"
volumeMounts:
- name: hudi-ss-ah-ds
mountPath: /opt/spark/hudi/config
subpath: config.yaml
javaOptions: "-Dnetworkaddress.cache.ttl=60 -Duser.timezone=IST -XX:+PrintGCApplicationConcurrentTime -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/varadarb_ds_driver.hprof"
sparkUIOptions:
ingressAnnotations:
kubernetes.io/ingress.class: nginx
monitoring:
exposeDriverMetrics: true
exposeExecutorMetrics: true
prometheus:
jmxExporterJar: "/opt/spark/hudi/prometheus/jmx_prometheus_javaagent-0.16.1.jar"
port: 8090
```
source.properties
```
#base properties
hoodie.upsert.shuffle.parallelism=500
hoodie.insert.shuffle.parallelism=50
hoodie.delete.shuffle.parallelism=50
hoodie.bulkinsert.shuffle.parallelism=10
hoodie.embed.timeline.server=true
hoodie.filesystem.view.type=EMBEDDED_KV_STORE
hoodie.compact.inline=false
#datasource properties
hoodie.deltastreamer.schemaprovider.registry.url=http://localhost:8081/subjects/mongo.self_signup.application_histories-value/versions/latest
hoodie.datasource.write.recordkey.field=id
hoodie.datasource.write.partitionpath.field=
hoodie.deltastreamer.source.kafka.topic=self_signup.application_histories
hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.NonpartitionedKeyGenerator
hoodie.deltastreamer.kafka.source.maxEvents=50000
#cleaning
hoodie.cleaner.policy=KEEP_LATEST_COMMITS
hoodie.cleaner.commits.retained=1
hoodie.clean.async=true
#archival
hoodie.keep.min.commits=12
hoodie.keep.max.commits=15
#kafka props
bootstrap.servers=localhost:9092
auto.offset.reset=earliest
schema.registry.url=http://localhost:8081
#prometheus
hoodie.metrics.on=true
hoodie.metrics.reporter.type=PROMETHEUS_PUSHGATEWAY
hoodie.metrics.pushgateway.host=k8s-prometheus-pushgateway.observability.svc.cluster.local
hoodie.metrics.pushgateway.port=9091
hoodie.metrics.pushgateway.delete.on.shutdown=false
hoodie.metrics.pushgateway.random.job.name.suffix=false
hoodie.metrics.pushgateway.job.name=hudi-ss-ah
```
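With `spark.memory.fraction=0.2`, the unified region shared by execution and storage on a 6 GB executor is small, which is consistent with shuffle-fetch OOMs. A rough estimate (not a diagnosis) using Spark's standard formula, `(heap - 300 MB reserved) * spark.memory.fraction`:

```python
# Back-of-the-envelope unified-memory estimate for the executor settings above.
RESERVED_MB = 300  # Spark's fixed reserved memory

def unified_memory_mb(heap_gb, memory_fraction):
    return (heap_gb * 1024 - RESERVED_MB) * memory_fraction

m = unified_memory_mb(heap_gb=6, memory_fraction=0.2)
print(round(m))  # ~1169 MB shared by execution + storage
```

Of that ~1.2 GB, `spark.memory.storageFraction=0.2` protects only about 234 MB for storage; raising `spark.memory.fraction` back toward its 0.6 default is the usual first knob before adding executor memory.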
**Stacktrace**
```
22/01/28 17:40:05 INFO DAGScheduler: ShuffleMapStage 39 (countByKey at SparkHoodieBloomIndex.java:114) failed in 5.523 s due to org.apache.spark.shuffle.FetchFailedException: Failure while fetching StreamChunkId{streamId=489876428219, chunkIndex=0}: java.io.IOException: Out of memory
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.FileDispatcherImpl.read(FileDispatcherImpl.java:46)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:197)
at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:159)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
at java.io.DataInputStream.readFully(DataInputStream.java:195)
at java.io.DataInputStream.readLong(DataInputStream.java:416)
at org.apache.spark.shuffle.IndexShuffleBlockResolver.getBlockData(IndexShuffleBlockResolver.scala:208)
at org.apache.spark.storage.BlockManager.getBlockData(BlockManager.scala:382)
at org.apache.spark.network.netty.NettyBlockRpcServer$$anonfun$1.apply(NettyBlockRpcServer.scala:61)
at org.apache.spark.network.netty.NettyBlockRpcServer$$anonfun$1.apply(NettyBlockRpcServer.scala:60)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:31)
at org.apache.spark.network.server.OneForOneStreamManager.getChunk(OneForOneStreamManager.java:87)
at org.apache.spark.network.server.TransportRequestHandler.processFetchRequest(TransportRequestHandler.java:130)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:101)
at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:554)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:485)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:64)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:156)
at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:50)
at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:84)
at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:105)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
```
|
1.0
|
[SUPPORT] Executor OOM upserting 20M records from Kafka - **Describe the problem you faced**
While upserting Mongo oplogs from Kafka to Blob, facing Executor OOM
**Environment Description**
* Hudi version : 0.9.0
* Spark version : 2.4.4
* Hive version : 3.1.2
* Hadoop version : 2.7.3
* Storage (HDFS/S3/GCS..) : Azure Blob
* Running on Docker? (yes/no) : K8s
**Additional context**
Spark K8s yaml file
```
apiVersion: "sparkoperator.k8s.io/v1beta2"
kind: SparkApplication
metadata:
name: hudi-ss-ah-ds-{{ ti.job_id }}
namespace: dataplatform
labels:
spark_name: hudi-ss-ah-ds-{{ ti.job_id }}
dag_name: hudi-ss-ah
task_name: ds
environment: "prod"
cloud: "aws"
tier: "t2"
team: "dataplatform"
service_type: "airflow"
k8s_cluster_name: "tapi"
plip_version: 0.1.10-dp-ev
spec:
type: Java
mode: cluster
image: "hudi-ds:4"
imagePullPolicy: Always
mainClass: org.apache.hudi.utilities.deltastreamer.HoodieDeltaStreamer
mainApplicationFile: "local:///opt/spark/hudi/hudi-utilities-bundle_2.11-0.9.0-SNAPSHOT.jar"
deps:
packages:
- org.apache.spark:spark-avro_2.11:2.4.4
sparkConf:
"spark.serializer": "org.apache.spark.serializer.KryoSerializer"
"spark.memory.fraction": "0.2"
"spark.memory.storageFraction": "0.2"
arguments:
- "--table-type"
- "COPY_ON_WRITE"
- "--props"
- "/opt/spark/hudi/config/source.properties"
- "--schemaprovider-class"
- "org.apache.hudi.utilities.schema.SchemaRegistryProvider"
- "--source-class"
- "org.apache.hudi.utilities.sources.JsonKafkaSource"
- "--target-base-path"
- "s3a://<ourbucket>/fusion/mongo/data/application_histories"
- "--target-table"
- "application_histories"
- "--op"
- "UPSERT"
- "--source-ordering-field"
- "__ts_ms"
- "--continuous"
- "--min-sync-interval-seconds"
- "60"
sparkVersion: "2.4.4"
restartPolicy:
type: Always
onFailureRetries: 100000
onFailureRetryInterval: 60
onSubmissionFailureRetries: 100000
onSubmissionFailureRetryInterval: 60
timeToLiveSeconds: 3600
volumes:
- name: hudi-ss-ah-ds
configMap:
name: hudi-ss-ah-ds
driver:
env:
- name: HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_impl
value: org.apache.hadoop.fs.s3a.S3AFileSystem
cores: 1
coreLimit: "1200m"
memory: "4G"
serviceAccount: "dataplatform"
volumeMounts:
- name: hudi-ss-ah-ds
mountPath: /opt/spark/hudi/config
subpath: config.yaml
javaOptions: "-Dnetworkaddress.cache.ttl=60 -Duser.timezone=IST -XX:+PrintGCApplicationConcurrentTime -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/varadarb_ds_driver.hprof"
executor:
env:
- name: HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_access_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key
value: {{ var.value.HOODIE_ENV_fs_DOT_s3a_DOT_secret_DOT_key }}
- name: HOODIE_ENV_fs_DOT_s3a_DOT_impl
value: org.apache.hadoop.fs.s3a.S3AFileSystem
cores: 1
instances: 20
memory: "6G"
volumeMounts:
- name: hudi-ss-ah-ds
mountPath: /opt/spark/hudi/config
subpath: config.yaml
javaOptions: "-Dnetworkaddress.cache.ttl=60 -Duser.timezone=IST -XX:+PrintGCApplicationConcurrentTime -XX:+PrintGCTimeStamps -XX:+HeapDumpOnOutOfMemoryError -XX:HeapDumpPath=/tmp/varadarb_ds_driver.hprof"
sparkUIOptions:
ingressAnnotations:
kubernetes.io/ingress.class: nginx
monitoring:
exposeDriverMetrics: true
exposeExecutorMetrics: true
prometheus:
jmxExporterJar: "/opt/spark/hudi/prometheus/jmx_prometheus_javaagent-0.16.1.jar"
port: 8090
```
source.properties
```
#base properties
hoodie.upsert.shuffle.parallelism=500
hoodie.insert.shuffle.parallelism=50
hoodie.delete.shuffle.parallelism=50
hoodie.bulkinsert.shuffle.parallelism=10
hoodie.embed.timeline.server=true
hoodie.filesystem.view.type=EMBEDDED_KV_STORE
hoodie.compact.inline=false
#datasource properties
hoodie.deltastreamer.schemaprovider.registry.url=http://localhost:8081/subjects/mongo.self_signup.application_histories-value/versions/latest
hoodie.datasource.write.recordkey.field=id
hoodie.datasource.write.partitionpath.field=
hoodie.deltastreamer.source.kafka.topic=self_signup.application_histories
hoodie.datasource.write.keygenerator.class=org.apache.hudi.keygen.NonpartitionedKeyGenerator
hoodie.deltastreamer.kafka.source.maxEvents=50000
#cleaning
hoodie.cleaner.policy=KEEP_LATEST_COMMITS
hoodie.cleaner.commits.retained=1
hoodie.clean.async=true
#archival
hoodie.keep.min.commits=12
hoodie.keep.max.commits=15
#kafka props
bootstrap.servers=localhost:9092
auto.offset.reset=earliest
schema.registry.url=http://localhost:8081
#prometheus
hoodie.metrics.on=true
hoodie.metrics.reporter.type=PROMETHEUS_PUSHGATEWAY
hoodie.metrics.pushgateway.host=k8s-prometheus-pushgateway.observability.svc.cluster.local
hoodie.metrics.pushgateway.port=9091
hoodie.metrics.pushgateway.delete.on.shutdown=false
hoodie.metrics.pushgateway.random.job.name.suffix=false
hoodie.metrics.pushgateway.job.name=hudi-ss-ah
```
**Stacktrace**
```
22/01/28 17:40:05 INFO DAGScheduler: ShuffleMapStage 39 (countByKey at SparkHoodieBloomIndex.java:114) failed in 5.523 s due to org.apache.spark.shuffle.FetchFailedException: Failure while fetching StreamChunkId{streamId=489876428219, chunkIndex=0}: java.io.IOException: Out of memory
at sun.nio.ch.FileDispatcherImpl.read0(Native Method)
at sun.nio.ch.FileDispatcherImpl.read(FileDispatcherImpl.java:46)
at sun.nio.ch.IOUtil.readIntoNativeBuffer(IOUtil.java:223)
at sun.nio.ch.IOUtil.read(IOUtil.java:197)
at sun.nio.ch.FileChannelImpl.read(FileChannelImpl.java:159)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:65)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:109)
at sun.nio.ch.ChannelInputStream.read(ChannelInputStream.java:103)
at java.io.DataInputStream.readFully(DataInputStream.java:195)
at java.io.DataInputStream.readLong(DataInputStream.java:416)
at org.apache.spark.shuffle.IndexShuffleBlockResolver.getBlockData(IndexShuffleBlockResolver.scala:208)
at org.apache.spark.storage.BlockManager.getBlockData(BlockManager.scala:382)
at org.apache.spark.network.netty.NettyBlockRpcServer$$anonfun$1.apply(NettyBlockRpcServer.scala:61)
at org.apache.spark.network.netty.NettyBlockRpcServer$$anonfun$1.apply(NettyBlockRpcServer.scala:60)
at scala.collection.Iterator$$anon$11.next(Iterator.scala:410)
at scala.collection.convert.Wrappers$IteratorWrapper.next(Wrappers.scala:31)
at org.apache.spark.network.server.OneForOneStreamManager.getChunk(OneForOneStreamManager.java:87)
at org.apache.spark.network.server.TransportRequestHandler.processFetchRequest(TransportRequestHandler.java:130)
at org.apache.spark.network.server.TransportRequestHandler.handle(TransportRequestHandler.java:101)
at org.apache.spark.network.server.TransportChannelHandler.channelRead(TransportChannelHandler.java:118)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.timeout.IdleStateHandler.channelRead(IdleStateHandler.java:286)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.handler.codec.MessageToMessageDecoder.channelRead(MessageToMessageDecoder.java:102)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at org.apache.spark.network.util.TransportFrameDecoder.channelRead(TransportFrameDecoder.java:85)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.AbstractChannelHandlerContext.fireChannelRead(AbstractChannelHandlerContext.java:340)
at io.netty.channel.DefaultChannelPipeline$HeadContext.channelRead(DefaultChannelPipeline.java:1359)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:362)
at io.netty.channel.AbstractChannelHandlerContext.invokeChannelRead(AbstractChannelHandlerContext.java:348)
at io.netty.channel.DefaultChannelPipeline.fireChannelRead(DefaultChannelPipeline.java:935)
at io.netty.channel.nio.AbstractNioByteChannel$NioByteUnsafe.read(AbstractNioByteChannel.java:138)
at io.netty.channel.nio.NioEventLoop.processSelectedKey(NioEventLoop.java:645)
at io.netty.channel.nio.NioEventLoop.processSelectedKeysOptimized(NioEventLoop.java:580)
at io.netty.channel.nio.NioEventLoop.processSelectedKeys(NioEventLoop.java:497)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:459)
at io.netty.util.concurrent.SingleThreadEventExecutor$5.run(SingleThreadEventExecutor.java:858)
at io.netty.util.concurrent.DefaultThreadFactory$DefaultRunnableDecorator.run(DefaultThreadFactory.java:138)
at java.lang.Thread.run(Thread.java:748)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.throwFetchFailedException(ShuffleBlockFetcherIterator.scala:554)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:485)
at org.apache.spark.storage.ShuffleBlockFetcherIterator.next(ShuffleBlockFetcherIterator.scala:64)
at scala.collection.Iterator$$anon$12.nextCur(Iterator.scala:435)
at scala.collection.Iterator$$anon$12.hasNext(Iterator.scala:441)
at scala.collection.Iterator$$anon$11.hasNext(Iterator.scala:409)
at org.apache.spark.util.CompletionIterator.hasNext(CompletionIterator.scala:31)
at org.apache.spark.InterruptibleIterator.hasNext(InterruptibleIterator.scala:37)
at org.apache.spark.util.collection.ExternalAppendOnlyMap.insertAll(ExternalAppendOnlyMap.scala:156)
at org.apache.spark.Aggregator.combineCombinersByKey(Aggregator.scala:50)
at org.apache.spark.shuffle.BlockStoreShuffleReader.read(BlockStoreShuffleReader.scala:84)
at org.apache.spark.rdd.ShuffledRDD.compute(ShuffledRDD.scala:105)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:337)
at org.apache.spark.rdd.RDD$$anonfun$7.apply(RDD.scala:335)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1182)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:1091)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:1156)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:882)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:335)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:286)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:52)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:324)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:288)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:99)
at org.apache.spark.scheduler.ShuffleMapTask.runTask(ShuffleMapTask.scala:55)
at org.apache.spark.scheduler.Task.run(Task.scala:123)
at org.apache.spark.executor.Executor$TaskRunner$$anonfun$10.apply(Executor.scala:408)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1360)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:414)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
```
|
non_test
|
executor oom upserting records from kafka describe the problem you faced while upserting mongo oplogs from kafka to blob facing executor oom environment description hudi version spark version hive version hadoop version storage hdfs gcs azure blob running on docker yes no additional context spark yaml file apiversion sparkoperator io kind sparkapplication metadata name hudi ss ah ds ti job id namespace dataplatform labels spark name hudi ss ah ds ti job id dag name hudi ss ah task name ds environment prod cloud aws tier team dataplatform service type airflow cluster name tapi plip version dp ev spec type java mode cluster image hudi ds imagepullpolicy always mainclass org apache hudi utilities deltastreamer hoodiedeltastreamer mainapplicationfile local opt spark hudi hudi utilities bundle snapshot jar deps packages org apache spark spark avro sparkconf spark serializer org apache spark serializer kryoserializer spark memory fraction spark memory storagefraction arguments table type copy on write props opt spark hudi config source properties schemaprovider class org apache hudi utilities schema schemaregistryprovider source class org apache hudi utilities sources jsonkafkasource target base path fusion mongo data application histories target table application histories op upsert source ordering field ts ms continuous min sync interval seconds sparkversion restartpolicy type always onfailureretries onfailureretryinterval onsubmissionfailureretries onsubmissionfailureretryinterval timetoliveseconds volumes name hudi ss ah ds configmap name hudi ss ah ds driver env name hoodie env fs dot dot access dot key value var value hoodie env fs dot dot access dot key name hoodie env fs dot dot secret dot key value var value hoodie env fs dot dot secret dot key name hoodie env fs dot dot impl value org apache hadoop fs cores corelimit memory serviceaccount dataplatform volumemounts name hudi ss ah ds mountpath opt spark hudi config subpath config yaml javaoptions dnetworkaddress 
cache ttl duser timezone ist xx printgcapplicationconcurrenttime xx printgctimestamps xx heapdumponoutofmemoryerror xx heapdumppath tmp varadarb ds driver hprof executor env name hoodie env fs dot dot access dot key value var value hoodie env fs dot dot access dot key name hoodie env fs dot dot secret dot key value var value hoodie env fs dot dot secret dot key name hoodie env fs dot dot impl value org apache hadoop fs cores instances memory volumemounts name hudi ss ah ds mountpath opt spark hudi config subpath config yaml javaoptions dnetworkaddress cache ttl duser timezone ist xx printgcapplicationconcurrenttime xx printgctimestamps xx heapdumponoutofmemoryerror xx heapdumppath tmp varadarb ds driver hprof sparkuioptions ingressannotations kubernetes io ingress class nginx monitoring exposedrivermetrics true exposeexecutormetrics true prometheus jmxexporterjar opt spark hudi prometheus jmx prometheus javaagent jar port source properties base properties hoodie upsert shuffle parallelism hoodie insert shuffle parallelism hoodie delete shuffle parallelism hoodie bulkinsert shuffle parallelism hoodie embed timeline server true hoodie filesystem view type embedded kv store hoodie compact inline false datasource properties hoodie deltastreamer schemaprovider registry url hoodie datasource write recordkey field id hoodie datasource write partitionpath field hoodie deltastreamer source kafka topic self signup application histories hoodie datasource write keygenerator class org apache hudi keygen nonpartitionedkeygenerator hoodie deltastreamer kafka source maxevents cleaning hoodie cleaner policy keep latest commits hoodie cleaner commits retained hoodie clean async true archival hoodie keep min commits hoodie keep max commits kafka props bootstrap servers localhost auto offset reset earliest schema registry url prometheus hoodie metrics on true hoodie metrics reporter type prometheus pushgateway hoodie metrics pushgateway host prometheus pushgateway observability svc 
cluster local hoodie metrics pushgateway port hoodie metrics pushgateway delete on shutdown false hoodie metrics pushgateway random job name suffix false hoodie metrics pushgateway job name hudi ss ah stacktrace info dagscheduler shufflemapstage countbykey at sparkhoodiebloomindex java failed in s due to org apache spark shuffle fetchfailedexception failure while fetching streamchunkid streamid chunkindex java io ioexception out of memory at sun nio ch filedispatcherimpl native method at sun nio ch filedispatcherimpl read filedispatcherimpl java at sun nio ch ioutil readintonativebuffer ioutil java at sun nio ch ioutil read ioutil java at sun nio ch filechannelimpl read filechannelimpl java at sun nio ch channelinputstream read channelinputstream java at sun nio ch channelinputstream read channelinputstream java at sun nio ch channelinputstream read channelinputstream java at java io datainputstream readfully datainputstream java at java io datainputstream readlong datainputstream java at org apache spark shuffle indexshuffleblockresolver getblockdata indexshuffleblockresolver scala at org apache spark storage blockmanager getblockdata blockmanager scala at org apache spark network netty nettyblockrpcserver anonfun apply nettyblockrpcserver scala at org apache spark network netty nettyblockrpcserver anonfun apply nettyblockrpcserver scala at scala collection iterator anon next iterator scala at scala collection convert wrappers iteratorwrapper next wrappers scala at org apache spark network server oneforonestreammanager getchunk oneforonestreammanager java at org apache spark network server transportrequesthandler processfetchrequest transportrequesthandler java at org apache spark network server transportrequesthandler handle transportrequesthandler java at org apache spark network server transportchannelhandler channelread transportchannelhandler java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io 
netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler timeout idlestatehandler channelread idlestatehandler java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty handler codec messagetomessagedecoder channelread messagetomessagedecoder java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at org apache spark network util transportframedecoder channelread transportframedecoder java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext firechannelread abstractchannelhandlercontext java at io netty channel defaultchannelpipeline headcontext channelread defaultchannelpipeline java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel abstractchannelhandlercontext invokechannelread abstractchannelhandlercontext java at io netty channel defaultchannelpipeline firechannelread defaultchannelpipeline java at io netty channel nio abstractniobytechannel niobyteunsafe read abstractniobytechannel java at io netty channel nio nioeventloop processselectedkey nioeventloop java at io netty channel nio nioeventloop 
processselectedkeysoptimized nioeventloop java at io netty channel nio nioeventloop processselectedkeys nioeventloop java at io netty channel nio nioeventloop run nioeventloop java at io netty util concurrent singlethreadeventexecutor run singlethreadeventexecutor java at io netty util concurrent defaultthreadfactory defaultrunnabledecorator run defaultthreadfactory java at java lang thread run thread java at org apache spark storage shuffleblockfetcheriterator throwfetchfailedexception shuffleblockfetcheriterator scala at org apache spark storage shuffleblockfetcheriterator next shuffleblockfetcheriterator scala at org apache spark storage shuffleblockfetcheriterator next shuffleblockfetcheriterator scala at scala collection iterator anon nextcur iterator scala at scala collection iterator anon hasnext iterator scala at scala collection iterator anon hasnext iterator scala at org apache spark util completioniterator hasnext completioniterator scala at org apache spark interruptibleiterator hasnext interruptibleiterator scala at org apache spark util collection externalappendonlymap insertall externalappendonlymap scala at org apache spark aggregator combinecombinersbykey aggregator scala at org apache spark shuffle blockstoreshufflereader read blockstoreshufflereader scala at org apache spark rdd shuffledrdd compute shuffledrdd scala at org apache spark rdd rdd computeorreadcheckpoint rdd scala at org apache spark rdd rdd iterator rdd scala at org apache spark rdd mappartitionsrdd compute mappartitionsrdd scala at org apache spark rdd rdd computeorreadcheckpoint rdd scala at org apache spark rdd rdd anonfun apply rdd scala at org apache spark rdd rdd anonfun apply rdd scala at org apache spark storage blockmanager anonfun doputiterator apply blockmanager scala at org apache spark storage blockmanager anonfun doputiterator apply blockmanager scala at org apache spark storage blockmanager doput blockmanager scala at org apache spark storage blockmanager 
doputiterator blockmanager scala at org apache spark storage blockmanager getorelseupdate blockmanager scala at org apache spark rdd rdd getorcompute rdd scala at org apache spark rdd rdd iterator rdd scala at org apache spark rdd mappartitionsrdd compute mappartitionsrdd scala at org apache spark rdd rdd computeorreadcheckpoint rdd scala at org apache spark rdd rdd iterator rdd scala at org apache spark rdd mappartitionsrdd compute mappartitionsrdd scala at org apache spark rdd rdd computeorreadcheckpoint rdd scala at org apache spark rdd rdd iterator rdd scala at org apache spark scheduler shufflemaptask runtask shufflemaptask scala at org apache spark scheduler shufflemaptask runtask shufflemaptask scala at org apache spark scheduler task run task scala at org apache spark executor executor taskrunner anonfun apply executor scala at org apache spark util utils trywithsafefinally utils scala at org apache spark executor executor taskrunner run executor scala at java util concurrent threadpoolexecutor runworker threadpoolexecutor java at java util concurrent threadpoolexecutor worker run threadpoolexecutor java at java lang thread run thread java
| 0
|
314,795
| 9,603,170,428
|
IssuesEvent
|
2019-05-10 16:18:31
|
threefoldtech/jumpscaleX
|
https://api.github.com/repos/threefoldtech/jumpscaleX
|
opened
|
jsx tfchain client lists available outputs that are actually spent
|
priority_critical type_bug
|
need to investigate why and fix it
|
1.0
|
jsx tfchain client lists available outputs that are actually spent - need to investigate why and fix it
|
non_test
|
jsx tfchain client lists available outputs that are actually spent need to investigate why and fix it
| 0
|
15,694
| 3,481,393,577
|
IssuesEvent
|
2015-12-29 15:51:52
|
slivne/try_git
|
https://api.github.com/repos/slivne/try_git
|
opened
|
repair : repair_while_new_node_is_added_test
|
dtest repair
|
Check that repair is accomplished while a new node is added
1. Create a cluster of 2 nodes with rf=2
2. Stop node 2
3. Insert data
4. Start node 2
5. Start repair
6. Create a new node and start it
|
1.0
|
repair : repair_while_new_node_is_added_test - Check that repair is accomplished while a new node is added
1. Create a cluster of 2 nodes with rf=2
2. Stop node 2
3. Insert data
4. Start node 2
5. Start repair
6. Create a new node and start it
|
test
|
repair repair while new node is added test check that repair is accomplished while new node is added create a cluster of nodes with rf stop node insert data start node start repair create a new node and start it
| 1
|
344,255
| 10,341,954,801
|
IssuesEvent
|
2019-09-04 04:30:28
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.pubgmobile.com - A zoomed in version of the site is displayed
|
browser-fenix engine-gecko priority-normal severity-important
|
<!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.pubgmobile.com/act/a20180515iggamepc/
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Design is broken
**Description**: site isn't displaying properly
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.pubgmobile.com - A zoomed in version of the site is displayed - <!-- @browser: Firefox Mobile 68.0 -->
<!-- @ua_header: Mozilla/5.0 (Android 9; Mobile; rv:68.0) Gecko/68.0 Firefox/68.0 -->
<!-- @reported_with: -->
<!-- @extra_labels: browser-fenix -->
**URL**: https://www.pubgmobile.com/act/a20180515iggamepc/
**Browser / Version**: Firefox Mobile 68.0
**Operating System**: Android
**Tested Another Browser**: Yes
**Problem type**: Design is broken
**Description**: site isn't displaying properly
**Steps to Reproduce**:
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_test
|
a zoomed in version of the site is displayed url browser version firefox mobile operating system android tested another browser yes problem type design is broken description site isn t displaying properly steps to reproduce browser configuration none from with ❤️
| 0
|
301,974
| 26,113,956,637
|
IssuesEvent
|
2022-12-28 01:54:41
|
atsushieno/android-audio-plugin-framework
|
https://api.github.com/repos/atsushieno/android-audio-plugin-framework
|
closed
|
privatize Binder instance within connection manager and provide proxy instead
|
bug testing
|
Currently `AAPClientContext` in binder-client-as-plugin acquires `AIBinder* binder` from plugin factory at `aap_client_as_plugin_new()` and instantiate a strongly typed proxy (class created by AIDL) locally. The proxy instance is then destroyed when the `AAPClientContext` is destroyed.
It worked when there is only one instance per client. But that is wrong. A proxy is tied to a binder, which is instantiated only once for a client. It is per connection, not per plugin instance. We have to support more than one instances on a client.
It is currently the core of the issue that prevents multiple instantiation at a client, and therefore prevents basic `connectedCheck`s.
|
1.0
|
privatize Binder instance within connection manager and provide proxy instead - Currently `AAPClientContext` in binder-client-as-plugin acquires `AIBinder* binder` from plugin factory at `aap_client_as_plugin_new()` and instantiate a strongly typed proxy (class created by AIDL) locally. The proxy instance is then destroyed when the `AAPClientContext` is destroyed.
It worked when there is only one instance per client. But that is wrong. A proxy is tied to a binder, which is instantiated only once for a client. It is per connection, not per plugin instance. We have to support more than one instances on a client.
It is currently the core of the issue that prevents multiple instantiation at a client, and therefore prevents basic `connectedCheck`s.
|
test
|
privatize binder instance within connection manager and provide proxy instead currently aapclientcontext in binder client as plugin acquires aibinder binder from plugin factory at aap client as plugin new and instantiate a strongly typed proxy class created by aidl locally the proxy instance is then destroyed when the aapclientcontext is destroyed it worked when there is only one instance per client but that is wrong a proxy is tied to a binder which is instantiated only once for a client it is per connection not per plugin instance we have to support more than one instances on a client it is currently the core of the issue that prevents multiple instantiation at a client and therefore prevents basic connectedcheck s
| 1
|
104,431
| 8,972,329,529
|
IssuesEvent
|
2019-01-29 17:59:44
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
closed
|
visualize app pie chart other bucket should apply correct filter on other bucket
|
:KibanaApp :Visualizations failed-test high test triaged
|
└- ✖ fail: "visualize app pie chart other bucket should apply correct filter on other bucket"
23:58:17 │ pie chart
23:58:17 │ other bucket
23:58:17 │ should apply correct filter on other bucket:
23:58:17 │
23:58:17 │ Error: expected [ 'win 8', 'win xp', 'win 7', 'ios', 'Missing', 'Other' ] to sort of equal [ 'Missing', 'osx' ]
23:58:17 │ + expected - actual
23:58:17 │
23:58:17 │ [
23:58:17 │ - "win 8"
23:58:17 │ - "win xp"
23:58:17 │ - "win 7"
23:58:17 │ - "ios"
23:58:17 │ "Missing"
23:58:17 │ - "Other"
23:58:17 │ + "osx"
23:58:17 │ ]
23:58:17 │
23:58:17 │ at Assertion.assert (node_modules/expect.js/index.js:96:13)
23:58:17 │ at Assertion.eql (node_modules/expect.js/index.js:230:10)
23:58:17 │ at Context.it (test/functional/apps/visualize/_pie_chart.js:125:28)
23:58:17 │ at <anonymous>
23:58:17 │ at process._tickCallback (internal/process/next_tick.js:188:7)
23:58:17 │
Failing on cloud 6.5.1
|
2.0
|
visualize app pie chart other bucket should apply correct filter on other bucket - └- ✖ fail: "visualize app pie chart other bucket should apply correct filter on other bucket"
23:58:17 │ pie chart
23:58:17 │ other bucket
23:58:17 │ should apply correct filter on other bucket:
23:58:17 │
23:58:17 │ Error: expected [ 'win 8', 'win xp', 'win 7', 'ios', 'Missing', 'Other' ] to sort of equal [ 'Missing', 'osx' ]
23:58:17 │ + expected - actual
23:58:17 │
23:58:17 │ [
23:58:17 │ - "win 8"
23:58:17 │ - "win xp"
23:58:17 │ - "win 7"
23:58:17 │ - "ios"
23:58:17 │ "Missing"
23:58:17 │ - "Other"
23:58:17 │ + "osx"
23:58:17 │ ]
23:58:17 │
23:58:17 │ at Assertion.assert (node_modules/expect.js/index.js:96:13)
23:58:17 │ at Assertion.eql (node_modules/expect.js/index.js:230:10)
23:58:17 │ at Context.it (test/functional/apps/visualize/_pie_chart.js:125:28)
23:58:17 │ at <anonymous>
23:58:17 │ at process._tickCallback (internal/process/next_tick.js:188:7)
23:58:17 │
Failing on cloud 6.5.1
|
test
|
visualize app pie chart other bucket should apply correct filter on other bucket └ ✖ fail visualize app pie chart other bucket should apply correct filter on other bucket │ pie chart │ other bucket │ should apply correct filter on other bucket │ │ error expected to sort of equal │ expected actual │ │ │ win │ win xp │ win │ ios │ missing │ other │ osx │ │ │ at assertion assert node modules expect js index js │ at assertion eql node modules expect js index js │ at context it test functional apps visualize pie chart js │ at │ at process tickcallback internal process next tick js │ failing on cloud
| 1
|
198,001
| 14,953,091,376
|
IssuesEvent
|
2021-01-26 16:16:53
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
opened
|
Failing test: Jest Tests.src/core/server/http - Cookie based SessionStorage #get() reads from session storage
|
failed-test
|
A test failed on a tracked branch
```
Error: connect ECONNRESET 127.0.0.1:37795
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1146:16)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.11/234/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.src/core/server/http","test.name":"Cookie based SessionStorage #get() reads from session storage","test.failCount":1}} -->
|
1.0
|
Failing test: Jest Tests.src/core/server/http - Cookie based SessionStorage #get() reads from session storage - A test failed on a tracked branch
```
Error: connect ECONNRESET 127.0.0.1:37795
at TCPConnectWrap.afterConnect [as oncomplete] (net.js:1146:16)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+7.11/234/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Jest Tests.src/core/server/http","test.name":"Cookie based SessionStorage #get() reads from session storage","test.failCount":1}} -->
|
test
|
failing test jest tests src core server http cookie based sessionstorage get reads from session storage a test failed on a tracked branch error connect econnreset at tcpconnectwrap afterconnect net js first failure
| 1
|
151,925
| 12,065,932,399
|
IssuesEvent
|
2020-04-16 10:50:28
|
dso-toolkit/dso-toolkit
|
https://api.github.com/repos/dso-toolkit/dso-toolkit
|
closed
|
Login and Authorization (Machtigen) design discussion
|
question status:testable
|
Due to new functionality, the login pages are being redesigned. The redesign is based on the login system of the tax authority (discussed with the Belastingdienst).
A prototype has been made in a branch release:
https://dso-toolkit.nl/_489-Machtigen-Inloggen-Herontwerp/components/detail/machtigen.html
The main questions about these designs are:
- Is this the correct implementation of the toolkit @tfrijsewijk
- Should this be a pattern added to the toolkit, or are there individual components?
- Are the pages digitally accessible @timveld
Custom CSS has been applied, and some parts do not work properly yet.
- The highlight box has custom margins
- The alignment of the lnk-buttons in the highlight box cannot be vertically centered
- The pages also do not work well on mobile yet, because buttons cannot wrap. Should buttons be able to wrap, or do we have to prescribe max lengths?
Visual remarks:
When the last tab is active, a small hook is still visible at the back; ideally this would be a straight line, as at the front when the first tab is active.
Visited states on tabs: is this desirable?
|
1.0
|
Login and Authorization (Machtigen) design discussion - Due to new functionality, the login pages are being redesigned. The redesign is based on the login system of the tax authority (discussed with the Belastingdienst).
A prototype has been made in a branch release:
https://dso-toolkit.nl/_489-Machtigen-Inloggen-Herontwerp/components/detail/machtigen.html
The main questions about these designs are:
- Is this the correct implementation of the toolkit @tfrijsewijk
- Should this be a pattern added to the toolkit, or are there individual components?
- Are the pages digitally accessible @timveld
Custom CSS has been applied, and some parts do not work properly yet.
- The highlight box has custom margins
- The alignment of the lnk-buttons in the highlight box cannot be vertically centered
- The pages also do not work well on mobile yet, because buttons cannot wrap. Should buttons be able to wrap, or do we have to prescribe max lengths?
Visual remarks:
When the last tab is active, a small hook is still visible at the back; ideally this would be a straight line, as at the front when the first tab is active.
Visited states on tabs: is this desirable?
|
test
|
login and authorization machtigen design discussion due to new functionality the login pages are being redesigned the redesign is based on the login system of the tax authority discussed with the belastingdienst a prototype has been made in a branch release the main questions about these designs are is this the correct implementation of the toolkit tfrijsewijk should this be a pattern added to the toolkit or are there individual components are the pages digitally accessible timveld custom css has been applied and some parts do not work properly yet the highlight box has custom margins the alignment of the lnk buttons in the highlight box cannot be vertically centered the pages also do not work well on mobile yet because buttons cannot wrap should buttons be able to wrap or do we have to prescribe max lengths visual remarks when the last tab is active a small hook is still visible at the back ideally this would be a straight line as at the front when the first tab is active visited states on tabs is this desirable
| 1
|
322,056
| 23,887,362,734
|
IssuesEvent
|
2022-09-08 08:44:59
|
IBM-Cloud/terraform-provider-ibm
|
https://api.github.com/repos/IBM-Cloud/terraform-provider-ibm
|
opened
|
Confusing wording in resource description
|
documentation
|
<!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### New or Affected Resource(s) or Datasource(s)
<!--- Please list the new or affected resources and data sources. --->
https://registry.terraform.io/providers/IBM-Cloud/ibm/latest/docs/resources/iam_authorization_policy_detach

The linked documentation does not mention any "detach" (https://cloud.ibm.com/docs/account?topic=account-serviceauth&interface=ui).
What is the purpose of this resource? What is meant with "This allows authorization policy to delete."???
|
1.0
|
Confusing wording in resource description - <!--- Please keep this note for the community --->
### Community Note
* Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or other comments that do not add relevant new information or questions, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment
<!--- Thank you for keeping this note for the community --->
### New or Affected Resource(s) or Datasource(s)
<!--- Please list the new or affected resources and data sources. --->
https://registry.terraform.io/providers/IBM-Cloud/ibm/latest/docs/resources/iam_authorization_policy_detach

The linked documentation does not mention any "detach" (https://cloud.ibm.com/docs/account?topic=account-serviceauth&interface=ui).
What is the purpose of this resource? What is meant with "This allows authorization policy to delete."???
|
non_test
|
confusing wording in resource description community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or other comments that do not add relevant new information or questions they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment new or affected resource s or datasource s the linked documentation does not mention any detach what is the purpose of this resource what is meant with this allows authorization policy to delete
| 0
|
189,098
| 14,485,047,560
|
IssuesEvent
|
2020-12-10 17:05:55
|
compare-ci/admin
|
https://api.github.com/repos/compare-ci/admin
|
closed
|
Automated test 1607619864.7172809
|
Test
|
This is a tracking issue for the automated tests being run. Test id: `automated-test-1607619864.7172809`
|[python-sum](https://github.com/compare-ci/python-sum/pull/1386)|Pull Created|Check Start|Check End|Total|Check|
|-|-|-|-|-|-|
|CircleCI Checks|17:04:29|17:04:30|17:05:30|0:01:01|0:01:00|
|Travis CI|17:04:29|17:05:05|17:05:24|0:00:55|0:00:19|
|GitHub Actions|17:04:29|17:04:44|17:04:48|0:00:19|0:00:04|
|Azure Pipelines|17:04:29|17:04:56|17:05:07|0:00:38|0:00:11|
|[node-sum](https://github.com/compare-ci/node-sum/pull/1367)|Pull Created|Check Start|Check End|Total|Check|
|-|-|-|-|-|-|
|CircleCI Checks|17:04:34|17:04:35|17:05:35|0:01:01|0:01:00|
|GitHub Actions|17:04:34|17:04:48|17:05:09|0:00:35|0:00:21|
|Azure Pipelines|17:04:34|17:04:57|17:05:20|0:00:46|0:00:23|
|Travis CI|17:04:34|17:05:05|17:05:44|0:01:10|0:00:39|
|
1.0
|
Automated test 1607619864.7172809 - This is a tracking issue for the automated tests being run. Test id: `automated-test-1607619864.7172809`
|[python-sum](https://github.com/compare-ci/python-sum/pull/1386)|Pull Created|Check Start|Check End|Total|Check|
|-|-|-|-|-|-|
|CircleCI Checks|17:04:29|17:04:30|17:05:30|0:01:01|0:01:00|
|Travis CI|17:04:29|17:05:05|17:05:24|0:00:55|0:00:19|
|GitHub Actions|17:04:29|17:04:44|17:04:48|0:00:19|0:00:04|
|Azure Pipelines|17:04:29|17:04:56|17:05:07|0:00:38|0:00:11|
|[node-sum](https://github.com/compare-ci/node-sum/pull/1367)|Pull Created|Check Start|Check End|Total|Check|
|-|-|-|-|-|-|
|CircleCI Checks|17:04:34|17:04:35|17:05:35|0:01:01|0:01:00|
|GitHub Actions|17:04:34|17:04:48|17:05:09|0:00:35|0:00:21|
|Azure Pipelines|17:04:34|17:04:57|17:05:20|0:00:46|0:00:23|
|Travis CI|17:04:34|17:05:05|17:05:44|0:01:10|0:00:39|
|
test
|
automated test this is a tracking issue for the automated tests being run test id automated test created check start check end total check circleci checks travis ci github actions azure pipelines created check start check end total check circleci checks github actions azure pipelines travis ci
| 1
|
22,042
| 3,932,149,294
|
IssuesEvent
|
2016-04-25 14:53:03
|
Microsoft/vscode
|
https://api.github.com/repos/Microsoft/vscode
|
closed
|
Test API: Add support for internal links in previewHtml
|
testplan-item
|
Test plan item for #3676
@jrieken Please complete...
|
1.0
|
Test API: Add support for internal links in previewHtml - Test plan item for #3676
@jrieken Please complete...
|
test
|
test api add support for internal links in previewhtml test plan item for jrieken please complete
| 1
|
51,775
| 10,723,770,774
|
IssuesEvent
|
2019-10-27 21:01:10
|
comphack/comp_hack
|
https://api.github.com/repos/comphack/comp_hack
|
closed
|
Parallel Boss Defeat Bug
|
bug code
|
(On the Re:Imagine server if this matters at all)
At seemingly random times since the latest content release, every single one of my/my demons abilities will just seem to stop working. I can click a skill, click a demon to summon/desummon, anything. It will look like it worked but it does absolutely nothing except let me incant the skill over and over to no effect. I just got a report in that this was happening to another player as well. The only similar thing I seem to get is "it happens after lag" or "it happens after zoning" and I'm having difficulty recreating it. The only way to fix it is to log off.
Sorry for just a blob of ranting instead of a real way to get this to happen, but that's all I got so far. Hopefully I'll have it happen again or get more reports of it soon.
|
1.0
|
Parallel Boss Defeat Bug - (On the Re:Imagine server if this matters at all)
At seemingly random times since the latest content release, every single one of my/my demons abilities will just seem to stop working. I can click a skill, click a demon to summon/desummon, anything. It will look like it worked but it does absolutely nothing except let me incant the skill over and over to no effect. I just got a report in that this was happening to another player as well. The only similar thing I seem to get is "it happens after lag" or "it happens after zoning" and I'm having difficulty recreating it. The only way to fix it is to log off.
Sorry for just a blob of ranting instead of a real way to get this to happen, but that's all I got so far. Hopefully I'll have it happen again or get more reports of it soon.
|
non_test
|
parallel boss defeat bug on the re imagine server if this matters at all at seemingly random times since the latest content release every single one of my my demons abilities will just seem to stop working i can click a skill click a demon to summon desummon anything it will look like it worked but it does absolutely nothing except let me incant the skill over and over to no effect i just got a report in that this was happening to another player as well the only similar thing i seem to get is it happens after lag or it happens after zoning and i m having difficulty recreating it the only way to fix it is to log off sorry for just a blob of ranting instead of a real way to get this to happen but that s all i got so far hopefully i ll have it happen again or get more reports of it soon
| 0
|
133,151
| 10,797,904,757
|
IssuesEvent
|
2019-11-06 08:58:03
|
whatTool/Validation
|
https://api.github.com/repos/whatTool/Validation
|
closed
|
Option to add requirements in __construct()
|
TODO Test solution
|
Add an option for requirements within the constructor so $v->requirements($rules) isn't needed.
|
1.0
|
Option to add requirements in __construct() - Add an option for requirements within the constructor so $v->requirements($rules) isn't needed.
|
test
|
option to add requirements in construct add an option for requirements within the constructor so v requirements rules isn t needed
| 1
|
47,283
| 2,974,618,168
|
IssuesEvent
|
2015-07-15 02:26:54
|
palantir/tslint
|
https://api.github.com/repos/palantir/tslint
|
closed
|
class-name rule fails on default class export
|
Bug ES6+ Syntax High Priority
|
this code breaks on the "class-name" rule:
```ts
export default class {
...
}
```
|
1.0
|
class-name rule fails on default class export - this code breaks on the "class-name" rule:
```ts
export default class {
...
}
```
|
non_test
|
class name rule fails on default class export this code breaks on the class name rule ts export default class
| 0
|
14,803
| 3,422,590,344
|
IssuesEvent
|
2015-12-08 23:50:29
|
radare/radare2
|
https://api.github.com/repos/radare/radare2
|
reopened
|
Broken search when mapping a file in an address where it collides with its header
|
bug test-required
|
**UPDATE**: Detailed description of this bug is in this comment: https://github.com/radare/radare2/issues/3788#issuecomment-163042481
Checkout latest radare2-regressions repo and `cd` to:
```
$ cd radare2-regressions/bins/vsf
$ r2 ./c64-rambo2-rom.vsf
```
It has 3 sections, by default the last section is enabled, RAM.
```
[0x0000c2cd]> S
[00] . 0x0001209d mr-x va=0x0000a000 sz=0x2000 vsz=0x2000 BASIC
[01] . 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[02] * 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
```
Now, I will try to search for something that I know is RAM:
```
[0x0000c2cd]> s 0
[0x00000000]> px 32
- offset - 0 1 2 3 4 5 6 7 8 9 A B C D E F 0123456789ABCDEF
0x00000000 ff14 0000 0d0e 000e 0007 0007 0000 0001 ................
0x00000010 0000 0000 0000 0000 0000 0000 0000 0000 ................
```
So, I search for `0xff` and this happens
```
[0x00000000]> /x ff
Searching 1 bytes...
# 7 [0xa000-0x10000]
hits: 900
0x0000a336 hit0_0 ff
0x0000a449 hit0_1 ff
0x0000a47f hit0_2 ff
0x0000a491 hit0_3 ff
...
```
**BUG 1**: The range is from 0xa000 to 0x10000, and not from 0x0000 to 0x10000.
Now I will delete the sections BASIC and section KERNAL so that only section RAM is present:
```
[0x0000c2cd]> S
[00] . 0x0001209d mr-x va=0x0000a000 sz=0x2000 vsz=0x2000 BASIC
[01] . 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[02] * 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
[0x0000c2cd]> s 0xa000
[0x0000a000]> S
[00] * 0x0001209d mr-x va=0x0000a000 sz=0x2000 vsz=0x2000 BASIC
[01] . 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[02] . 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
[0x0000a000]> S-
[0x0000a000]> s 0xe000
[0x0000e000]> S
[00] * 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[01] . 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
[0x0000e000]> S-
[0x0000e000]> S
[00] * 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
```
Now I will search for the first 2 bytes that are in RAM:
```
[0x00000000]> px 2
- offset - 0 1 2 3 4 5 6 7 8 9 A B C D E F 0123456789ABCDEF
0x00000000 ff14
[0x00000000]> /x ff14
Searching 2 bytes...
# 7 [0x0-0x10000]
hits: 3
0x000099cd hit0_0 2085
0x0000a691 hit0_1 00ff
0x0000b691 hit0_2 00ff
```
**BUG 2**: I can't find "ff14" at 0x0... and also it reports findings that I'm not interested in like `2085`, `00ff` and `00ff`
However, the range seems to be fixed.
These are my esil search settings:
```
[0x00000000]> e~search
anal.searchstringrefs = false
search.align = 0
search.chunk = 0
search.contiguous = true
search.count = 0
search.distance = 0
search.esilcombo = 8
search.flags = true
search.from = 0xffffffffffffffff
search.in = file
search.kwidx = 1
search.maxhits = 0
search.prefix = hit
search.show = true
search.to = 0xffffffffffffffff
```
|
1.0
|
Broken search when mapping a file in an address where it collides with its header - **UPDATE**: Detailed description of this bug is in this comment: https://github.com/radare/radare2/issues/3788#issuecomment-163042481
Checkout latest radare2-regressions repo and `cd` to:
```
$ cd radare2-regressions/bins/vsf
$ r2 ./c64-rambo2-rom.vsf
```
It has 3 sections, by default the last section is enabled, RAM.
```
[0x0000c2cd]> S
[00] . 0x0001209d mr-x va=0x0000a000 sz=0x2000 vsz=0x2000 BASIC
[01] . 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[02] * 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
```
Now, I will try to search for something that I know is RAM:
```
[0x0000c2cd]> s 0
[0x00000000]> px 32
- offset - 0 1 2 3 4 5 6 7 8 9 A B C D E F 0123456789ABCDEF
0x00000000 ff14 0000 0d0e 000e 0007 0007 0000 0001 ................
0x00000010 0000 0000 0000 0000 0000 0000 0000 0000 ................
```
So, I search for `0xff` and this happens
```
[0x00000000]> /x ff
Searching 1 bytes...
# 7 [0xa000-0x10000]
hits: 900
0x0000a336 hit0_0 ff
0x0000a449 hit0_1 ff
0x0000a47f hit0_2 ff
0x0000a491 hit0_3 ff
...
```
**BUG 1**: The range is from 0xa000 to 0x10000, and not from 0x0000 to 0x10000.
Now I will delete the sections BASIC and section KERNAL so that only section RAM is present:
```
[0x0000c2cd]> S
[00] . 0x0001209d mr-x va=0x0000a000 sz=0x2000 vsz=0x2000 BASIC
[01] . 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[02] * 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
[0x0000c2cd]> s 0xa000
[0x0000a000]> S
[00] * 0x0001209d mr-x va=0x0000a000 sz=0x2000 vsz=0x2000 BASIC
[01] . 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[02] . 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
[0x0000a000]> S-
[0x0000a000]> s 0xe000
[0x0000e000]> S
[00] * 0x0001009d mr-x va=0x0000e000 sz=0x2000 vsz=0x2000 KERNAL
[01] . 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
[0x0000e000]> S-
[0x0000e000]> S
[00] * 0x00000084 mrwx va=0x00000000 sz=0x10000 vsz=0x10000 RAM
```
Now I will search for the first 2 bytes that are in RAM:
```
[0x00000000]> px 2
- offset - 0 1 2 3 4 5 6 7 8 9 A B C D E F 0123456789ABCDEF
0x00000000 ff14
[0x00000000]> /x ff14
Searching 2 bytes...
# 7 [0x0-0x10000]
hits: 3
0x000099cd hit0_0 2085
0x0000a691 hit0_1 00ff
0x0000b691 hit0_2 00ff
```
**BUG 2**: I can't find "ff14" at 0x0... and also it reports findings that I'm not interested in like `2085`, `00ff` and `00ff`
However, the range seems to be fixed.
These are my esil search settings:
```
[0x00000000]> e~search
anal.searchstringrefs = false
search.align = 0
search.chunk = 0
search.contiguous = true
search.count = 0
search.distance = 0
search.esilcombo = 8
search.flags = true
search.from = 0xffffffffffffffff
search.in = file
search.kwidx = 1
search.maxhits = 0
search.prefix = hit
search.show = true
search.to = 0xffffffffffffffff
```
|
test
|
broken search when mapping a file in an address where it collides with its header udpate detailed description of this bug is in this comment checkout latest regressions repo and cd to cd regressions bins vsf rom vsf it has sections by default the last section is enabled ram s mr x va sz vsz basic mr x va sz vsz kernal mrwx va sz vsz ram now i will try to search for something that i know is ram s px offset a b c d e f so i search for and this happens x ff searching bytes hits ff ff ff ff bug the range is from to and not from to now i will delete the sections basic and section kernal so that only section ram is present s mr x va sz vsz basic mr x va sz vsz kernal mrwx va sz vsz ram s s mr x va sz vsz basic mr x va sz vsz kernal mrwx va sz vsz ram s s s mr x va sz vsz kernal mrwx va sz vsz ram s s mrwx va sz vsz ram now i will search for the first bytes that are in ram px offset a b c d e f x searching bytes hits bug i can t find at and also it reports findings that i m not interested in like and however the range seems to be fixed these are my esil search settings e search anal searchstringrefs false search align search chunk search contiguous true search count search distance search esilcombo search flags true search from search in file search kwidx search maxhits search prefix hit search show true search to
| 1
|
176,499
| 13,644,106,504
|
IssuesEvent
|
2020-09-25 18:18:07
|
softmatterlab/Braph-2.0-Matlab
|
https://api.github.com/repos/softmatterlab/Braph-2.0-Matlab
|
closed
|
Add Average Overlapping In-Strength and Average Overlapping Out-Strength
|
measure test
|
- [x] OverlappingInStrengthAv.m
- [x] test_OverlappingInStrengthAv.m
- [x] OverlappingOutStrengthAv.m
- [x] test_OverlappingOutStrengthAv.m
Branch from develop.
|
1.0
|
Add Average Overlapping In-Strength and Average Overlapping Out-Strength - - [x] OverlappingInStrengthAv.m
- [x] test_OverlappingInStrengthAv.m
- [x] OverlappingOutStrengthAv.m
- [x] test_OverlappingOutStrengthAv.m
Branch from develop.
|
test
|
add average overlapping in strength and average overlapping out strength overlappinginstrengthav m test overlappinginstrengthav m overlappingoutstrengthav m test overlappingoutstrengthav m branch from develop
| 1
|
65,624
| 6,970,858,863
|
IssuesEvent
|
2017-12-11 11:53:32
|
joserogerio/promocaldasSite
|
https://api.github.com/repos/joserogerio/promocaldasSite
|
closed
|
Validate CNPJ
|
melhoramento test
|
Validate the CNPJ when registering and updating a company, because if the CNPJ is not valid, generating the boleto will produce an error.
|
1.0
|
Validate CNPJ - Validate the CNPJ when registering and updating a company, because if the CNPJ is not valid, generating the boleto will produce an error.
|
test
|
validate cnpj validate the cnpj when registering and updating a company because if the cnpj is not valid generating the boleto will produce an error
| 1
|
15,413
| 9,549,226,551
|
IssuesEvent
|
2019-05-02 08:32:12
|
talent-wins/devjobv2
|
https://api.github.com/repos/talent-wins/devjobv2
|
closed
|
CVE-2015-9251 Medium Severity Vulnerability detected by WhiteSource
|
security vulnerability
|
## CVE-2015-9251 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js</a></p>
<p>Path to dependency file: /devjobv2/public/admin_layout/js/plugins/codemirror/mode/slim/index.html</p>
<p>Path to vulnerable library: /devjobv2/public/admin_layout/js/plugins/codemirror/mode/slim/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.1.min.js** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-9251>CVE-2015-9251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: 3.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isOpenPROnNewVersion":false,"isPackageBased":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.1","isTransitiveDependency":true,"dependencyTree":"jquery:1.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2015-9251 Medium Severity Vulnerability detected by WhiteSource - ## CVE-2015-9251 - Medium Severity Vulnerability
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.1.min.js</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js</a></p>
<p>Path to dependency file: /devjobv2/public/admin_layout/js/plugins/codemirror/mode/slim/index.html</p>
<p>Path to vulnerable library: /devjobv2/public/admin_layout/js/plugins/codemirror/mode/slim/index.html</p>
<p>
Dependency Hierarchy:
- :x: **jquery-1.11.1.min.js** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.
<p>Publish Date: 2018-01-18
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2015-9251>CVE-2015-9251</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Changed
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://www.whitesourcesoftware.com/wp-content/uploads/2018/10/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p>
<p>Release Date: 2018-01-18</p>
<p>Fix Resolution: 3.0.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isOpenPROnNewVersion":false,"isPackageBased":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.1","isTransitiveDependency":true,"dependencyTree":"jquery:1.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve medium severity vulnerability detected by whitesource cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to dependency file public admin layout js plugins codemirror mode slim index html path to vulnerable library public admin layout js plugins codemirror mode slim index html dependency hierarchy x jquery min js vulnerable library vulnerability details jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource isopenpronvulnerability true isopenpronnewversion false ispackagebased true packages vulnerabilityidentifier cve vulnerabilitydetails jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed medium a none ac low pr none s changed c low ui required av network i low extradata
| 0
|
121,219
| 10,153,985,647
|
IssuesEvent
|
2019-08-06 06:46:51
|
OWASP/NodeGoat
|
https://api.github.com/repos/OWASP/NodeGoat
|
opened
|
Improve Cypress script
|
enhancement help wanted priority: HIGH testing
|
### Context
- This is part of `release-1.5` #148
- Critical task
### Tasks
- [ ] Improve the speed of the e2e tests (almost 8 minutes now). Maybe videos are generated (ci/local envs)
- [ ] Add test that are missing (fast check)
- [ ] Remove unnecessary/repetitive tests
- [ ] Improve the [command `dbReset`](https://github.com/OWASP/NodeGoat/blob/master/test/e2e/support/commands.js#L28). Missing validation due Travis conflicts
- [ ] Check that the npm tasks are working as expected
- [ ] Update the `readme.md` with the extra relevant info (if needed)
### Assignation
- This tasks is open for assignation, just claim it (as reply to this) and submit your PR ;-)
### Important
- Check the current roadmap #148
- Don't forget to check the [Contributing Guidelines](https://github.com/OWASP/NodeGoat/blob/master/CONTRIBUTING.md) and follow the [Code of Conduct](https://github.com/OWASP/NodeGoat/blob/master/CONTRIBUTING.md)
|
1.0
|
Improve Cypress script - ### Context
- This is part of `release-1.5` #148
- Critical task
### Tasks
- [ ] Improve the speed of the e2e tests (almost 8 minutes now). Maybe videos are generated (ci/local envs)
- [ ] Add test that are missing (fast check)
- [ ] Remove unnecessary/repetitive tests
- [ ] Improve the [command `dbReset`](https://github.com/OWASP/NodeGoat/blob/master/test/e2e/support/commands.js#L28). Missing validation due Travis conflicts
- [ ] Check that the npm tasks are working as expected
- [ ] Update the `readme.md` with the extra relevant info (if needed)
### Assignation
- This tasks is open for assignation, just claim it (as reply to this) and submit your PR ;-)
### Important
- Check the current roadmap #148
- Don't forget to check the [Contributing Guidelines](https://github.com/OWASP/NodeGoat/blob/master/CONTRIBUTING.md) and follow the [Code of Conduct](https://github.com/OWASP/NodeGoat/blob/master/CONTRIBUTING.md)
|
test
|
improve cypress script context this is part of release critical task tasks improve the speed of the tests almost minutes now maybe videos are generated ci local envs add test that are missing fast check remove unnecessary repetitive tests improve the missing validation due travis conflicts check that the npm tasks are working as expected update the readme md with the extra relevant info if needed assignation this tasks is open for assignation just claim it as reply to this and submit your pr important check the current roadmap don t forget to check the and follow the
| 1
|
626,568
| 19,828,128,112
|
IssuesEvent
|
2022-01-20 09:10:00
|
canonical-web-and-design/snapcraft.io
|
https://api.github.com/repos/canonical-web-and-design/snapcraft.io
|
closed
|
Closing and reopening the "Add member" panel in brand store doesn't clear roles
|
Priority: High
|
- Try to add a new member
- Close and reopen the new member panel
- The roles are still populated
- Filling in the email address doesn't enable the save button until there is a change in the roles
|
1.0
|
Closing and reopening the "Add member" panel in brand store doesn't clear roles - - Try to add a new member
- Close and reopen the new member panel
- The roles are still populated
- Filling in the email address doesn't enable the save button until there is a change in the roles
|
non_test
|
closing and reopening the add member panel in brand store doesn t clear roles try to add a new member close and reopen the new member panel the roles are still populated filling in the email address doesn t enable the save button until there is a change in the roles
| 0
|
218,168
| 24,351,807,075
|
IssuesEvent
|
2022-10-03 01:21:28
|
snowdensb/jpo-ode
|
https://api.github.com/repos/snowdensb/jpo-ode
|
opened
|
CVE-2022-42004 (Medium) detected in jackson-databind-2.12.3.jar
|
security vulnerability
|
## CVE-2022-42004 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.12.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /jpo-ode-common/pom.xml</p>
<p>Path to vulnerable library: /m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.12.3.jar** (Vulnerable Library)
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In FasterXML jackson-databind before 2.13.4, resource exhaustion can occur because of a lack of a check in BeanDeserializer._deserializeFromArray to prevent use of deeply nested arrays. An application is vulnerable only with certain customized choices for deserialization.
<p>Publish Date: 2022-10-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-42004>CVE-2022-42004</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.13.4</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
|
True
|
CVE-2022-42004 (Medium) detected in jackson-databind-2.12.3.jar - ## CVE-2022-42004 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jackson-databind-2.12.3.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: /jpo-ode-common/pom.xml</p>
<p>Path to vulnerable library: /m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.12.3/jackson-databind-2.12.3.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.12.3.jar** (Vulnerable Library)
<p>Found in base branch: <b>dev</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
In FasterXML jackson-databind before 2.13.4, resource exhaustion can occur because of a lack of a check in BeanDeserializer._deserializeFromArray to prevent use of deeply nested arrays. An application is vulnerable only with certain customized choices for deserialization.
<p>Publish Date: 2022-10-02
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2022-42004>CVE-2022-42004</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: Required
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Release Date: 2022-10-02</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.13.4</p>
</p>
</details>
<p></p>
***
:rescue_worker_helmet: Automatic Remediation is available for this issue
|
non_test
|
cve medium detected in jackson databind jar cve medium severity vulnerability vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file jpo ode common pom xml path to vulnerable library repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in base branch dev vulnerability details in fasterxml jackson databind before resource exhaustion can occur because of a lack of a check in beandeserializer deserializefromarray to prevent use of deeply nested arrays an application is vulnerable only with certain customized choices for deserialization publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version release date fix resolution com fasterxml jackson core jackson databind rescue worker helmet automatic remediation is available for this issue
| 0
|
341,992
| 30,607,076,344
|
IssuesEvent
|
2023-07-23 06:07:29
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
reopened
|
Fix tensor.test_torch_special_gt
|
PyTorch Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5634798880"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5629743168/job/15255044367"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5629743168/job/15255044367"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5634798880"><img src=https://img.shields.io/badge/-failure-red></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5634798880"><img src=https://img.shields.io/badge/-failure-red></a>
|
1.0
|
Fix tensor.test_torch_special_gt - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5634798880"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/5629743168/job/15255044367"><img src=https://img.shields.io/badge/-success-success></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/5629743168/job/15255044367"><img src=https://img.shields.io/badge/-success-success></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/5634798880"><img src=https://img.shields.io/badge/-failure-red></a>
|paddle|<a href="https://github.com/unifyai/ivy/actions/runs/5634798880"><img src=https://img.shields.io/badge/-failure-red></a>
|
test
|
fix tensor test torch special gt tensorflow a href src jax a href src numpy a href src torch a href src paddle a href src
| 1
|
183,979
| 14,964,705,406
|
IssuesEvent
|
2021-01-27 12:22:23
|
alphagov/govuk-frontend
|
https://api.github.com/repos/alphagov/govuk-frontend
|
opened
|
Update macros content to specify when you use `text` or `html`
|
awaiting triage documentation
|
## What
While working on cookie banner macros, I got some feedback that could apply to all other macros content.
A reviewer asked why users should choose either `text` or `html`, as we hadn't specified this in the cookie content. I then learned that if users ever attempt to pass both `text` and `html`, `html` will win every time.
We then decided to add this sentence to the `html` Description: "If you are not passing HTML, use `text`."
All of our components contain `text` and `html` options. However, it's only in the cookie banner content that we explain when to use one or the other.
If we updated the other components to match, it would entail 28 issues and pull requests. I'm happy to make those changes, but I'd like to find out from the team whether you consider it a good use of time. Or should we investigate whether there are other changes that apply across the board? Then wait until we have a comprehensive list of changes to make?
## Why
To clarify for users when they should use either `text` or `html`.
## Who needs to know about this
Technical Writer, Developers
## Done when
- [ ] Technical writer drafts update
- [ ] Developers review and approve update
|
1.0
|
Update macros content to specify when you use `text` or `html` - ## What
While working on cookie banner macros, I got some feedback that could apply to all other macros content.
A reviewer asked why users should choose either `text` or `html`, as we hadn't specified this in the cookie content. I then learned that if users ever attempt to pass both `text` and `html`, `html` will win every time.
We then decided to add this sentence to the `html` Description: "If you are not passing HTML, use `text`."
All of our components contain `text` and `html` options. However, it's only in the cookie banner content that we explain when to use one or the other.
If we updated the other components to match, it would entail 28 issues and pull requests. I'm happy to make those changes, but I'd like to find out from the team whether you consider it a good use of time. Or should we investigate whether there are other changes that apply across the board? Then wait until we have a comprehensive list of changes to make?
## Why
To clarify for users when they should use either `text` or `html`.
## Who needs to know about this
Technical Writer, Developers
## Done when
- [ ] Technical writer drafts update
- [ ] Developers review and approve update
|
non_test
|
update macros content to specify when you use text or html what while working on cookie banner macros i got some feedback that could apply to all other macros content a reviewer asked why users should choose either text or html as we hadn t specified this in the cookie content i then learned that if users ever attempt to pass both text and html html will win every time we then decided to add this sentence to the html description if you are not passing html use text all of our components contain text and html options however it s only in the cookie banner content that we explain when to use one or the other if we updated the other components to match it would entail issues and pull requests i m happy to make those changes but i d like to find out from the team whether you consider it a good use of time or should we investigate whether there are other changes that apply across the board then wait until we have a comprehensive list of changes to make why to clarify for users when they should use either text or html who needs to know about this technical writer developers done when technical writer drafts update developers review and approve update
| 0
|
188,097
| 14,438,559,633
|
IssuesEvent
|
2020-12-07 13:13:25
|
kalexmills/github-vet-tests-dec2020
|
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
|
closed
|
yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t: src/sync/atomic/atomic_test.go; 21 LoC
|
fresh small test
|
Found a possible issue in [yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t](https://www.github.com/yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t) at [src/sync/atomic/atomic_test.go](https://github.com/yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t/blob/df1163c14a959edb3cdcae1dd58b339b623e6c9b/src/sync/atomic/atomic_test.go#L884-L904)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable testf used in defer or goroutine at line 895
[Click here to see the code in its original context.](https://github.com/yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t/blob/df1163c14a959edb3cdcae1dd58b339b623e6c9b/src/sync/atomic/atomic_test.go#L884-L904)
<details>
<summary>Click here to show the 21 line(s) of Go which triggered the analyzer.</summary>
```go
for name, testf := range hammer32 {
c := make(chan int)
var val uint32
for i := 0; i < p; i++ {
go func() {
defer func() {
if err := recover(); err != nil {
t.Error(err.(string))
}
c <- 1
}()
testf(&val, n)
}()
}
for i := 0; i < p; i++ {
<-c
}
if !strings.HasPrefix(name, "Swap") && val != uint32(n)*p {
t.Fatalf("%s: val=%d want %d", name, val, n*p)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: df1163c14a959edb3cdcae1dd58b339b623e6c9b
|
1.0
|
yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t: src/sync/atomic/atomic_test.go; 21 LoC -
Found a possible issue in [yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t](https://www.github.com/yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t) at [src/sync/atomic/atomic_test.go](https://github.com/yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t/blob/df1163c14a959edb3cdcae1dd58b339b623e6c9b/src/sync/atomic/atomic_test.go#L884-L904)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable testf used in defer or goroutine at line 895
[Click here to see the code in its original context.](https://github.com/yasya1100/RnJvbSAyMTgyYmIwOTdkYzVjYWNmNTU2Zjg5ZTRlN2QyY2ZkZDk2ODgyMjM3IE1vbiBTZXAgMTcgMDA6MDA6MDAgMjAwMQpGcm9t/blob/df1163c14a959edb3cdcae1dd58b339b623e6c9b/src/sync/atomic/atomic_test.go#L884-L904)
<details>
<summary>Click here to show the 21 line(s) of Go which triggered the analyzer.</summary>
```go
for name, testf := range hammer32 {
c := make(chan int)
var val uint32
for i := 0; i < p; i++ {
go func() {
defer func() {
if err := recover(); err != nil {
t.Error(err.(string))
}
c <- 1
}()
testf(&val, n)
}()
}
for i := 0; i < p; i++ {
<-c
}
if !strings.HasPrefix(name, "Swap") && val != uint32(n)*p {
t.Fatalf("%s: val=%d want %d", name, val, n*p)
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: df1163c14a959edb3cdcae1dd58b339b623e6c9b
|
test
|
src sync atomic atomic test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable testf used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for name testf range c make chan int var val for i i p i go func defer func if err recover err nil t error err string c testf val n for i i p i c if strings hasprefix name swap val n p t fatalf s val d want d name val n p leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 1
|
52,331
| 3,022,642,565
|
IssuesEvent
|
2015-07-31 21:39:25
|
information-artifact-ontology/IAO
|
https://api.github.com/repos/information-artifact-ontology/IAO
|
opened
|
synonym
|
imported Priority-Medium Type-Term
|
_From [dosu...@gmail.com](https://code.google.com/u/102674886352087815907/) on March 10, 2011 09:53:44_
*label for the new term*
synonym
*Background*
This is intended as an IAO mapping for the OBO tag synonym.
This tag is used to relate ontology terms to the language that people use. But that language is not always precise. Words do not come with necessary and sufficient conditions to specify their referents. In regular language, and at least some scientific language, there is no way to define crisp boundaries for the referent of a word that will satisfy most users conception of what that word refers to.
Even within a single scientific community there are often subtle distinctions of usage. For a term used in anatomy, biologists often differ in how they define spatial and or temporal boundaries for the referents of a term. Or they might differ slightly in what subclasses they consider should be subsumed by a class. Or they may use variants of a term to refer to the same thing, or some completely different term for identically or similarly defined structures . We can't possibly add terms for all such cases and deal with the resulting overlapping partonomies or fantastically detailed classifications. Such an ontology would be unmaintainable and probably unusable.
So, for a whole set of closely related terms we choose one definition and and one term and add the rest as synonyms. That way, users can find candidate terms by searching with a term that is familiar to them. It may be useful to further classify (scope) synonyms as having spatial, temporal boundaries or class referents that are broader, narrower, the same as or just slightly different from the referents of the ontology term. But note that only where we can link to a particular usage can we be precise about this (in reference X Michael Ashburner uses term Y as an exact synonym of term Z). The rest of the time, scoping synonyms in this way can only be a matter of judgment about usage by the community of users. Given that, any definitions using quantifiers will be useless. Instead, we need qualitative terms like 'generally' or 'largely'.
Finally, a note on definitions
Defining these synonyms as ontology terms is far from trivial because it implicitly requires some stance on what words in regular language actually refer to. This is not exactly an uncontroversial subject. I have neither the time nor inclination to get involved in a long debate on this matter before we can add this term to the IAO
*Textual definition*
def: "Information artifact that links an ontology term to a term in language that, within some specified discipline or publication, has referents that largely or completely correspond to the referents of the ontology term."
comment: "Publication can be specified directly via a database cross reference, domain Please add an example of usage for that term Please see several 10s of thousands off instances of use in OBO ontologies.
_Original issue: http://code.google.com/p/information-artifact-ontology/issues/detail?id=103_
|
1.0
|
synonym - _From [dosu...@gmail.com](https://code.google.com/u/102674886352087815907/) on March 10, 2011 09:53:44_
*label for the new term*
synonym
*Background*
This is intended as an IAO mapping for the OBO tag synonym.
This tag is used to relate ontology terms to the language that people use. But that language is not always precise. Words do not come with necessary and sufficient conditions to specify their referents. In regular language, and at least some scientific language, there is no way to define crisp boundaries for the referent of a word that will satisfy most users conception of what that word refers to.
Even within a single scientific community there are often subtle distinctions of usage. For a term used in anatomy, biologists often differ in how they define spatial and or temporal boundaries for the referents of a term. Or they might differ slightly in what subclasses they consider should be subsumed by a class. Or they may use variants of a term to refer to the same thing, or some completely different term for identically or similarly defined structures. We can't possibly add terms for all such cases and deal with the resulting overlapping partonomies or fantastically detailed classifications. Such an ontology would be unmaintainable and probably unusable.
So, for a whole set of closely related terms we choose one definition and one term and add the rest as synonyms. That way, users can find candidate terms by searching with a term that is familiar to them. It may be useful to further classify (scope) synonyms as having spatial, temporal boundaries or class referents that are broader, narrower, the same as or just slightly different from the referents of the ontology term. But note that only where we can link to a particular usage can we be precise about this (in reference X Michael Ashburner uses term Y as an exact synonym of term Z). The rest of the time, scoping synonyms in this way can only be a matter of judgment about usage by the community of users. Given that, any definitions using quantifiers will be useless. Instead, we need qualitative terms like 'generally' or 'largely'.
Finally, a note on definitions
Defining these synonyms as ontology terms is far from trivial because it implicitly requires some stance on what words in regular language actually refer to. This is not exactly an uncontroversial subject. I have neither the time nor inclination to get involved in a long debate on this matter before we can add this term to the IAO
*Textual definition*
def: "Information artifact that links an ontology term to a term in language that, within some specified discipline or publication, has referents that largely or completely correspond to the referents of the ontology term."
comment: "Publication can be specified directly via a database cross reference, domain Please add an example of usage for that term Please see several 10s of thousands of instances of use in OBO ontologies.
_Original issue: http://code.google.com/p/information-artifact-ontology/issues/detail?id=103_
|
non_test
|
synonym from on march label for the new term synonym background this is intended as an iao mapping for the obo tag synonym this tag is used to relate ontology terms to the language that people use but that language is not always precise words do not come with necessary and sufficient conditions to specify their referents in regular language and at least some scientific language there is no way to define crisp boundaries for the referent of a word that will satisfy most users conception of what that word refers to even within a single scientific community there are often subtle distinctions of usage for a term used in anatomy biologists often differ in how they define spatial and or temporal boundaries for the referents of a term or they might differ slightly in what subclasses they consider should be subsumed by a class or they may use variants of a term to refer to the same thing or some completely different term for identically or similarly defined structures we can t possibly add terms for all such cases and deal with the resulting overlapping partonomies or fantastically detailed classifications such an ontology would be unmaintainable and probably unusable so for a whole set of closely related terms we choose one definition and and one term and add the rest as synonyms that way users can find candidate terms by searching with a term that is familiar to them it may be useful to further classify scope synonyms as having spatial temporal boundaries or class referents that are broader narrower the same as or just slightly different from the referents of the ontology term but note that only where we can link to a particular usage can we be precise about this in reference x michael ashburner uses term y as an exact synonym of term z the rest of the time scoping synonyms in this way can only be a matter of judgment about usage by the community of users given that any definitions using quantifiers will be useless instead we need qualitative terms like generally or 
largely finally a note on definitions defining these synonyms as ontology terms is far from trivial because it implicitly requires some stance on what words in regular language actually refer to this is not exactly an uncontroversial subject i have neither the time nor inclination to get involved in a long debate on this matter before we can add this term to the iao textual definition def information artifact that links an ontology term to a term in language that within some specified discipline or publication has referents that largely or completely correspond to the referents of the ontology term comment publication can be specified directly via a database cross reference domain please add an example of usage for that term please see several of thousands off instances of use in obo ontologies original issue
| 0
|
348,295
| 24,911,104,075
|
IssuesEvent
|
2022-10-29 21:43:06
|
altair-viz/altair
|
https://api.github.com/repos/altair-viz/altair
|
closed
|
Add dict to allow a mapping from nominal names to what's displayed in a legend (or similar)
|
vega-lite-related documentation
|
I'd like to have a little flexibility to specify a mapping between a nominal data type and what's actually displayed. For instance, I might have:
```
color=Color('first_names:N',
legend=Legend(title='First Names'),
),
```
but my `first_names` are `tom`, `dick`, and `harry` and I want to display `Thomas`, `Richard`, and `Harrison`. It'd be nice to be able to pass in a dict:
```
d = { 'tom' : 'Thomas', 'dick' : 'Richard', ... }
... legend=Legend(title='First Names', mapping=d), ...
```
It would seem like this is probably already possible, but it's not obvious in the docs or the examples, sorry to have missed it if so.
|
1.0
|
Add dict to allow a mapping from nominal names to what's displayed in a legend (or similar) - I'd like to have a little flexibility to specify a mapping between a nominal data type and what's actually displayed. For instance, I might have:
```
color=Color('first_names:N',
legend=Legend(title='First Names'),
),
```
but my `first_names` are `tom`, `dick`, and `harry` and I want to display `Thomas`, `Richard`, and `Harrison`. It'd be nice to be able to pass in a dict:
```
d = { 'tom' : 'Thomas', 'dick' : 'Richard', ... }
... legend=Legend(title='First Names', mapping=d), ...
```
It would seem like this is probably already possible, but it's not obvious in the docs or the examples, sorry to have missed it if so.
|
non_test
|
add dict to allow a mapping from nominal names to what s displayed in a legend or similar i d like to have a little flexibility to specify a mapping between a nominal data type and what s actually displayed for instance i might have color color first names n legend legend title first names but my first names are tom dick and harry and i want to display thomas richard and harrison it d be nice to be able to pass in a dict d tom thomas dick richard legend legend title first names mapping d it would seem like this is probably already possible but it s not obvious in the docs or the examples sorry to have missed it if so
| 0
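The Altair record above asks for a value-to-label mapping so that raw nominal codes render as friendly legend entries. A minimal pre-processing sketch (plain Python, hypothetical helper name, not the Altair API): remap the nominal column in the data itself before charting, so the legend shows the display names.

```python
# Hypothetical helper: not part of Altair. Remaps one column of a
# list-of-dicts dataset using a display-name mapping.
display_names = {"tom": "Thomas", "dick": "Richard", "harry": "Harrison"}

def remap(records, column, mapping):
    """Return new records with `column` values replaced via `mapping`.

    Values missing from the mapping are kept as-is, so a partial
    mapping is safe to apply."""
    return [
        {**row, column: mapping.get(row[column], row[column])}
        for row in records
    ]

data = [{"first_names": "tom", "n": 3}, {"first_names": "harry", "n": 5}]
remapped = remap(data, "first_names", display_names)
# remapped[0]["first_names"] == "Thomas"
```

The remapped records can then be fed to the chart unchanged; the original keys survive anywhere the mapping has no entry.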
|
4,433
| 7,308,529,931
|
IssuesEvent
|
2018-02-28 08:42:48
|
UKHomeOffice/dq-aws-transition
|
https://api.github.com/repos/UKHomeOffice/dq-aws-transition
|
closed
|
Configure Maytech Mock Connectivity from NotProd Ingest Linux Server
|
DQ Data Ingest DQ Tranche 1 Production SSM processing
|
Private Key Migration
- [x] Private Key for Linux Ingest NotProd (/home/SSM/.ssh/id_rsa) for Mock Maytech Server
Configure NotProd ssh_remote_* parameters in sftp_oag_client_maytech.py
- [x] ssh_remote_host: <mock SFTP server>
- [x] ssh_remote_user: <mock oag user>
- [x] ssh_remote_key: /home/SSM/.ssh/id_rsa
|
1.0
|
Configure Maytech Mock Connectivity from NotProd Ingest Linux Server - Private Key Migration
- [x] Private Key for Linux Ingest NotProd (/home/SSM/.ssh/id_rsa) for Mock Maytech Server
Configure NotProd ssh_remote_* parameters in sftp_oag_client_maytech.py
- [x] ssh_remote_host: <mock SFTP server>
- [x] ssh_remote_user: <mock oag user>
- [x] ssh_remote_key: /home/SSM/.ssh/id_rsa
|
non_test
|
configure maytech mock connectivity from notprod ingest linux server private key migration private key for linux ingest notprod home ssm ssh id rsa for mock maytech server configure notprod ssh remote parameters in sftp oag client maytech py ssh remote host ssh remote user ssh remote key home ssm ssh id rsa
| 0
|
259,978
| 22,581,532,141
|
IssuesEvent
|
2022-06-28 12:05:58
|
4team-final/client
|
https://api.github.com/repos/4team-final/client
|
closed
|
feat: 로그인 client-server 연결 및 토큰 관리
|
🙂 FEAT 📜 TEST
|
## 💡 Issue
server-client 로그인 서비스 연동 및 쿠키와 리덕스를 통한 토큰 관리
## 📝 todo
- [x] 로그인 server-client 연결
- [x] 로그인 테스트
- [x] Access / Refresh 토큰 분할 관리 구현
- [x] 토큰 저장 테스트
|
1.0
|
feat: 로그인 client-server 연결 및 토큰 관리 - ## 💡 Issue
server-client 로그인 서비스 연동 및 쿠키와 리덕스를 통한 토큰 관리
## 📝 todo
- [x] 로그인 server-client 연결
- [x] 로그인 테스트
- [x] Access / Refresh 토큰 분할 관리 구현
- [x] 토큰 저장 테스트
|
test
|
feat 로그인 client server 연결 및 토큰 관리 💡 issue server client 로그인 서비스 연동 및 쿠키와 리덕스를 통한 토큰 관리 📝 todo 로그인 server client 연결 로그인 테스트 access refresh 토큰 분할 관리 구현 토큰 저장 테스트
| 1
|
198,384
| 22,634,541,229
|
IssuesEvent
|
2022-06-30 17:34:24
|
devonfw/devon4j
|
https://api.github.com/repos/devonfw/devon4j
|
closed
|
Add fido2 support
|
enhancement security
|
The standards fido2 and webauthn are the new hot topics revolutionizing authentication getting rid of passwords:
https://fidoalliance.org/fido2/
We should add a module/starter for devon4j supporting this OOTB.
|
True
|
Add fido2 support - The standards fido2 and webauthn are the new hot topics revolutionizing authentication getting rid of passwords:
https://fidoalliance.org/fido2/
We should add a module/starter for devon4j supporting this OOTB.
|
non_test
|
add support the standards and webauthn are the new hot topics revolutionizing authentication getting rid of passwords we should add a module starter for supporting this ootb
| 0
|
215,112
| 16,637,042,828
|
IssuesEvent
|
2021-06-04 01:11:23
|
Azure/azure-sdk-for-js
|
https://api.github.com/repos/Azure/azure-sdk-for-js
|
closed
|
Azure Ai Text Analytics Readme Issue
|
Client Cognitive - Text Analytics Docs bug test-manual-pass
|
1.
Section [link](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics#recognize-pii-entities):

Reason:
The ` TextAnalyticsApiKeyCredential ` is not in the ` @azure/ai-text-analytics` .
Suggestion:
Update to:
```
const { TextAnalyticsClient, AzureKeyCredential } = require("@azure/ai-text-analytics");
const client = new TextAnalyticsClient(
"<endpoint>",
new AzureKeyCredential("<API key>")
);
```
@jongio for notification.
|
1.0
|
Azure Ai Text Analytics Readme Issue - 1.
Section [link](https://github.com/Azure/azure-sdk-for-js/tree/master/sdk/textanalytics/ai-text-analytics#recognize-pii-entities):

Reason:
The ` TextAnalyticsApiKeyCredential ` is not in the ` @azure/ai-text-analytics` .
Suggestion:
Update to:
```
const { TextAnalyticsClient, AzureKeyCredential } = require("@azure/ai-text-analytics");
const client = new TextAnalyticsClient(
"<endpoint>",
new AzureKeyCredential("<API key>")
);
```
@jongio for notification.
|
test
|
azure ai text analytics readme issue section reason the textanalyticsapikeycredential is not in the azure ai text analytics suggestion update to const textanalyticsclient azurekeycredential require azure ai text analytics const client new textanalyticsclient new azurekeycredential jongio for notification
| 1
|
358,297
| 25,186,142,155
|
IssuesEvent
|
2022-11-11 18:10:25
|
r3bl-org/r3bl_rs_utils
|
https://api.github.com/repos/r3bl-org/r3bl_rs_utils
|
opened
|
Add docs for algorithms
|
documentation enhancement
|
1. Layout algorithm
- Why not use a DOM like tree?
- Pros / cons of using a stack instead of a tree?
- Cons: can't look up nodes by id?
- Pros: less memory consumption?
- How do algorithms for non-binary tree match up w/ stack based ones for eg: layout pass, tree walking, etc?
3. Tree walking w/ memory arena non-binary-tree
- describe the algorithms for tree walking
|
1.0
|
Add docs for algorithms - 1. Layout algorithm
- Why not use a DOM like tree?
- Pros / cons of using a stack instead of a tree?
- Cons: can't look up nodes by id?
- Pros: less memory consumption?
- How do algorithms for non-binary tree match up w/ stack based ones for eg: layout pass, tree walking, etc?
3. Tree walking w/ memory arena non-binary-tree
- describe the algorithms for tree walking
|
non_test
|
add docs for algorithms layout algorithm why not use a dom like tree pros cons of using a stack instead of a tree cons can t look up nodes by id pros less memory consumption how do algorithms for non binary tree match up w stack based ones for eg layout pass tree walking etc tree walking w memory arena non binary tree describe the algorithms for tree walking
| 0
|
23,840
| 11,963,082,045
|
IssuesEvent
|
2020-04-05 14:43:36
|
badges/shields
|
https://api.github.com/repos/badges/shields
|
closed
|
Badge request: POEditor
|
good first issue service-badge
|
:clipboard: **Description**
Showing translation progress per language, from the online platform l10n platform <https://poeditor.com>.
Right now, I make my own static badges for this. For example:







These contain:
- language name
- flag of country’s language (encoded utf8 symbols)
- progress as a percentage, and color-coded from red (0%) to green (100%)
:link: **Data**
The API
- is public,
- requires:
- a project id
- a (read-only) API key,
- is described at <https://poeditor.com/docs/api>, and
- works using _HTTP POST_ requests.
For example for one of my projects, and with my public read-only API key, the request for Chinese would be:
$ curl -s https://api.poeditor.com/v2/languages/list -d id=323337 -d api_token=7a666b44c0985d16a7b59748f488275c | jq '.result.languages[] | select(.code == "zh-CN")'
{
"name": "Chinese",
"code": "zh-CN",
"translations": 22,
"percentage": 22,
"updated": "2020-03-12T12:29:04+0000"
}
The equivalent jsonpath expression is `$.result.languages[?(@.code=="zh-CN")]`.
I suppose the basic URL could be something like `/poeditor/:project-id/:language-code`.
---
The flags are of course very optional but they look nice. They are just the 2 country code letters (in the chinese example case CN) encoded as utf-8 “REGIONAL INDICATOR SYMBOL LETTER”. E.g.:
function getUTF8Flag(countryCode) {
const firstLetter = Character.codePointAt(countryCode, 0) - 0x41 + 0x1F1E6
const secondLetter = Character.codePointAt(countryCode, 1) - 0x41 + 0x1F1E6
return String(Character.toChars(firstLetter)) + String(Character.toChars(secondLetter))
}
(0x41 represents uppercase A letter and 0x1F1E6 is REGIONAL INDICATOR SYMBOL LETTER A in the Unicode table.)
:microphone: **Motivation**
I like these badges to easily show to potential contributors what they can do to help out on FOSS projects.
|
1.0
|
Badge request: POEditor - :clipboard: **Description**
Showing translation progress per language, from the online platform l10n platform <https://poeditor.com>.
Right now, I make my own static badges for this. For example:







These contain:
- language name
- flag of country’s language (encoded utf8 symbols)
- progress as a percentage, and color-coded from red (0%) to green (100%)
:link: **Data**
The API
- is public,
- requires:
- a project id
- a (read-only) API key,
- is described at <https://poeditor.com/docs/api>, and
- works using _HTTP POST_ requests.
For example for one of my projects, and with my public read-only API key, the request for Chinese would be:
$ curl -s https://api.poeditor.com/v2/languages/list -d id=323337 -d api_token=7a666b44c0985d16a7b59748f488275c | jq '.result.languages[] | select(.code == "zh-CN")'
{
"name": "Chinese",
"code": "zh-CN",
"translations": 22,
"percentage": 22,
"updated": "2020-03-12T12:29:04+0000"
}
The equivalent jsonpath expression is `$.result.languages[?(@.code=="zh-CN")]`.
I suppose the basic URL could be something like `/poeditor/:project-id/:language-code`.
---
The flags are of course very optional but they look nice. They are just the 2 country code letters (in the chinese example case CN) encoded as utf-8 “REGIONAL INDICATOR SYMBOL LETTER”. E.g.:
function getUTF8Flag(countryCode) {
const firstLetter = Character.codePointAt(countryCode, 0) - 0x41 + 0x1F1E6
const secondLetter = Character.codePointAt(countryCode, 1) - 0x41 + 0x1F1E6
return String(Character.toChars(firstLetter)) + String(Character.toChars(secondLetter))
}
(0x41 represents uppercase A letter and 0x1F1E6 is REGIONAL INDICATOR SYMBOL LETTER A in the Unicode table.)
:microphone: **Motivation**
I like these badges to easily show to potential contributors what they can do to help out on FOSS projects.
|
non_test
|
badge request poeditor clipboard description showing translation progress per language from the online platform platform right now i make my own static badges for this for example these contain language name flag of country’s language encoded symbols progress as a percentage and color coded from red to green link data the api is public requires a project id a read only api key is described at and works using http post requests for example for one of my projects and with my public read only api key the request for chinese would be curl s d id d api token jq result languages select code zh cn name chinese code zh cn translations percentage updated the equivalent jsonpath expression is result languages i suppose the basic url could be something like poeditor project id language code the flags are of course very optional but they look nice they are just the country code letters in the chinese example case cn encoded as utf “regional indicator symbol letter” e g function countrycode const firstletter character codepointat countrycode const secondletter character codepointat countrycode return string character tochars firstletter string character tochars secondletter represents uppercase a letter and is regional indicator symbol letter a in the unicode table microphone motivation i like these badges to easily show to potential contributors what they can do to help out on foss projects
| 0
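The POEditor record above encodes a 2-letter country code as a pair of Unicode regional indicator symbols (the snippet in the issue mixes Java's `Character` API into JavaScript). The same arithmetic in a self-contained Python sketch, under the issue's stated offsets (0x41 for 'A', 0x1F1E6 for REGIONAL INDICATOR SYMBOL LETTER A):

```python
def flag_emoji(country_code):
    """Map a 2-letter ISO country code to its regional-indicator pair.

    Each ASCII letter is shifted into the U+1F1E6..U+1F1FF block;
    renderers display adjacent pairs as a national flag."""
    base = 0x1F1E6  # REGIONAL INDICATOR SYMBOL LETTER A
    return "".join(chr(base + ord(c) - ord("A")) for c in country_code.upper())
```

For example, `flag_emoji("CN")` yields the two code points U+1F1E8 U+1F1F3, which render as the Chinese flag.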
|
330,999
| 10,058,644,870
|
IssuesEvent
|
2019-07-22 14:19:44
|
INN/umbrella-currentorg
|
https://api.github.com/repos/INN/umbrella-currentorg
|
closed
|
Hide radio button for thumbnail image on submit a listing page
|
Estimate < 2 Hours Priority: High
|
We want this radio button to be hidden so it does not cause confusion for users.

|
1.0
|
Hide radio button for thumbnail image on submit a listing page - We want this radio button to be hidden so it does not cause confusion for users.

|
non_test
|
hide radio button for thumbnail image on submit a listing page we want this radio button to be hidden so it does not cause confusion for users
| 0
|
94,929
| 10,861,875,563
|
IssuesEvent
|
2019-11-14 12:04:23
|
CHEF-KOCH/Windows-10-hardening
|
https://api.github.com/repos/CHEF-KOCH/Windows-10-hardening
|
opened
|
LoLBins
|
Documentation Script
|
## Overview
LOLBins are on my to-do list for a very long time. The only reliable solution I can find is to block the critical files such as powershell.exe, regedit, etc. and/or restrict their internet access.
## Problems
While this is problematic (you might wanna actually work with those things) there should be a "workaround" script added.
Another "_solution_" is to work with AVs and their cloud protection (_I'm not much of a fan of it_).
## Reference
https://blog.talosintelligence.com/2019/11/hunting-for-lolbins.html
|
1.0
|
LoLBins - ## Overview
LOLBins are on my to-do list for a very long time. The only reliable solution I can find is to block the critical files such as powershell.exe, regedit, etc. and/or restrict their internet access.
## Problems
While this is problematic (you might wanna actually work with those things) there should be a "workaround" script added.
Another "_solution_" is to work with AVs and their cloud protection (_I'm not much of a fan of it_).
## Reference
https://blog.talosintelligence.com/2019/11/hunting-for-lolbins.html
|
non_test
|
lolbins overview lolbins are on my to do list for a very long time the only reliable solution i can find is to block the critical files such as powershell exe regedit etc and or restrict their internet access problems while this is problematic you might wanna actually work with those things there should be a workaround script added another solution is to work with avs and their cloud protection i m not much of a fan of it reference
| 0
|
9,509
| 3,051,267,054
|
IssuesEvent
|
2015-08-12 07:18:25
|
medic/medic-webapp
|
https://api.github.com/repos/medic/medic-webapp
|
closed
|
enketo date-picker is missing its styles
|
3 - Acceptance testing
|
* date picker needs class `enketo` applied to inherit the styles correctly
* when this fix has been made, date-picker `z-index` can be a substyle of `.enketo`
|
1.0
|
enketo date-picker is missing its styles - * date picker needs class `enketo` applied to inherit the styles correctly
* when this fix has been made, date-picker `z-index` can be a substyle of `.enketo`
|
test
|
enketo date picker is missing its styles date picker needs class enketo applied to inherit the styles correctly when this fix has been made date picker z index can be a substyle of enketo
| 1
|
43,538
| 11,255,290,736
|
IssuesEvent
|
2020-01-12 08:01:54
|
minecraft-dev/MinecraftDev
|
https://api.github.com/repos/minecraft-dev/MinecraftDev
|
closed
|
Gradle error after creating project
|
build: gradle status: stale status: waiting reply
|
Please include the following information in all issues:
* Minecraft Development for IntelliJ plugin version: 1.2.23.1
* IntelliJ version: 2019.2.1 Ultimate
* Operating System: Windows 10
* Target platforms: Forge 1.12.2-14.23.5.2838
When I create a new project and run the gradle sync this error comes:
Unable to find method 'org.gradle.api.internal.TaskOutputsInternal.dir(Ljava/lang/Object;)Lorg/gradle/api/tasks/TaskOutputs;'.
Possible causes for this unexpected error include:
Gradle's dependency cache may be corrupt (this sometimes occurs after a network connection timeout.)
Re-download dependencies and sync project (requires network)
The state of a Gradle build process (daemon) may be corrupt. Stopping all Gradle daemons may solve this problem.
Stop Gradle build processes (requires restart)
Your project may be using a third-party plugin which is not compatible with the other plugins in the project or the version of Gradle requested by the project.
In the case of corrupt Gradle processes, you can also try closing the IDE and then killing all Java processes.
|
1.0
|
Gradle error after creating project - Please include the following information in all issues:
* Minecraft Development for IntelliJ plugin version: 1.2.23.1
* IntelliJ version: 2019.2.1 Ultimate
* Operating System: Windows 10
* Target platforms: Forge 1.12.2-14.23.5.2838
When I create a new project and run the gradle sync this error comes:
Unable to find method 'org.gradle.api.internal.TaskOutputsInternal.dir(Ljava/lang/Object;)Lorg/gradle/api/tasks/TaskOutputs;'.
Possible causes for this unexpected error include:
Gradle's dependency cache may be corrupt (this sometimes occurs after a network connection timeout.)
Re-download dependencies and sync project (requires network)
The state of a Gradle build process (daemon) may be corrupt. Stopping all Gradle daemons may solve this problem.
Stop Gradle build processes (requires restart)
Your project may be using a third-party plugin which is not compatible with the other plugins in the project or the version of Gradle requested by the project.
In the case of corrupt Gradle processes, you can also try closing the IDE and then killing all Java processes.
|
non_test
|
gradle error after creating project please include the following information in all issues minecraft development for intellij plugin version intellij version ultimate operating system windows target platforms forge when i create a new project and run the gradle sync this error comes unable to find method org gradle api internal taskoutputsinternal dir ljava lang object lorg gradle api tasks taskoutputs possible causes for this unexpected error include gradle s dependency cache may be corrupt this sometimes occurs after a network connection timeout re download dependencies and sync project requires network the state of a gradle build process daemon may be corrupt stopping all gradle daemons may solve this problem stop gradle build processes requires restart your project may be using a third party plugin which is not compatible with the other plugins in the project or the version of gradle requested by the project in the case of corrupt gradle processes you can also try closing the ide and then killing all java processes
| 0
|
190,325
| 14,542,428,648
|
IssuesEvent
|
2020-12-15 15:42:13
|
kalexmills/github-vet-tests-dec2020
|
https://api.github.com/repos/kalexmills/github-vet-tests-dec2020
|
closed
|
sourcegraph/go-vcs: vcs/diff_test.go; 59 LoC
|
fresh medium test
|
Found a possible issue in [sourcegraph/go-vcs](https://www.github.com/sourcegraph/go-vcs) at [vcs/diff_test.go](https://github.com/sourcegraph/go-vcs/blob/d784c9520ccdd19883f59efd0a2ae4441f576582/vcs/diff_test.go#L284-L342)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable test used in defer or goroutine at line 306
[Click here to see the code in its original context.](https://github.com/sourcegraph/go-vcs/blob/d784c9520ccdd19883f59efd0a2ae4441f576582/vcs/diff_test.go#L284-L342)
<details>
<summary>Click here to show the 59 line(s) of Go which triggered the analyzer.</summary>
```go
for label, test := range tests {
baseCommitID, err := test.baseRepo.ResolveRevision(test.base)
if err != nil {
t.Errorf("%s: ResolveRevision(%q) on base: %s", label, test.base, err)
continue
}
headCommitID, err := test.headRepo.ResolveRevision(test.head)
if err != nil {
t.Errorf("%s: ResolveRevision(%q) on head: %s", label, test.head, err)
continue
}
// Try calling CrossRepoDiff a lot. The git impls do some
// global state stuff (creating a new remote, fetching into
// the base).
const n = 100
var wg sync.WaitGroup
for i := 0; i < n; i++ {
wg.Add(1)
go func() {
defer wg.Done()
_, err := test.baseRepo.CrossRepoDiff(baseCommitID, test.headRepo, headCommitID, test.opt)
if err != nil {
t.Errorf("%s: in concurrency test for CrossRepoDiff(%s, %v, %s, %v): %s", label, baseCommitID, test.headRepo, headCommitID, test.opt, err)
}
}()
}
wg.Wait()
diff, err := test.baseRepo.CrossRepoDiff(baseCommitID, test.headRepo, headCommitID, test.opt)
if err != nil {
t.Errorf("%s: CrossRepoDiff(%s, %v, %s, %v): %s", label, baseCommitID, test.headRepo, headCommitID, test.opt, err)
continue
}
// Substitute for easier test expectation definition. See the
// wantDiff field doc for more info.
test.wantDiff.Raw = strings.Replace(test.wantDiff.Raw, "%(baseCommitID)", string(baseCommitID), -1)
test.wantDiff.Raw = strings.Replace(test.wantDiff.Raw, "%(headCommitID)", string(headCommitID), -1)
if !reflect.DeepEqual(diff, test.wantDiff) {
t.Errorf("%s: diff != wantDiff\n\ndiff ==========\n%s\n\nwantDiff ==========\n%s", label, asJSON(diff), asJSON(test.wantDiff))
}
if _, err := test.baseRepo.CrossRepoDiff(nonexistentCommitID, test.headRepo, headCommitID, test.opt); err != vcs.ErrCommitNotFound {
t.Errorf("%s: CrossRepoDiff with bad base commit ID: want ErrCommitNotFound, got %v", label, err)
continue
}
if _, err := test.baseRepo.CrossRepoDiff(baseCommitID, test.headRepo, nonexistentCommitID, test.opt); err != vcs.ErrCommitNotFound {
if label == "git cmd" {
t.Log("skipping failure on git cmd because unimplemented")
continue
}
t.Errorf("%s: CrossRepoDiff with bad head commit ID: want ErrCommitNotFound, got %v", label, err)
continue
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: d784c9520ccdd19883f59efd0a2ae4441f576582
|
1.0
|
sourcegraph/go-vcs: vcs/diff_test.go; 59 LoC -
Found a possible issue in [sourcegraph/go-vcs](https://www.github.com/sourcegraph/go-vcs) at [vcs/diff_test.go](https://github.com/sourcegraph/go-vcs/blob/d784c9520ccdd19883f59efd0a2ae4441f576582/vcs/diff_test.go#L284-L342)
Below is the message reported by the analyzer for this snippet of code. Beware that the analyzer only reports the first
issue it finds, so please do not limit your consideration to the contents of the below message.
> range-loop variable test used in defer or goroutine at line 306
[Click here to see the code in its original context.](https://github.com/sourcegraph/go-vcs/blob/d784c9520ccdd19883f59efd0a2ae4441f576582/vcs/diff_test.go#L284-L342)
<details>
<summary>Click here to show the 59 line(s) of Go which triggered the analyzer.</summary>
```go
for label, test := range tests {
baseCommitID, err := test.baseRepo.ResolveRevision(test.base)
if err != nil {
t.Errorf("%s: ResolveRevision(%q) on base: %s", label, test.base, err)
continue
}
headCommitID, err := test.headRepo.ResolveRevision(test.head)
if err != nil {
t.Errorf("%s: ResolveRevision(%q) on head: %s", label, test.head, err)
continue
}
// Try calling CrossRepoDiff a lot. The git impls do some
// global state stuff (creating a new remote, fetching into
// the base).
const n = 100
var wg sync.WaitGroup
for i := 0; i < n; i++ {
wg.Add(1)
go func() {
defer wg.Done()
_, err := test.baseRepo.CrossRepoDiff(baseCommitID, test.headRepo, headCommitID, test.opt)
if err != nil {
t.Errorf("%s: in concurrency test for CrossRepoDiff(%s, %v, %s, %v): %s", label, baseCommitID, test.headRepo, headCommitID, test.opt, err)
}
}()
}
wg.Wait()
diff, err := test.baseRepo.CrossRepoDiff(baseCommitID, test.headRepo, headCommitID, test.opt)
if err != nil {
t.Errorf("%s: CrossRepoDiff(%s, %v, %s, %v): %s", label, baseCommitID, test.headRepo, headCommitID, test.opt, err)
continue
}
// Substitute for easier test expectation definition. See the
// wantDiff field doc for more info.
test.wantDiff.Raw = strings.Replace(test.wantDiff.Raw, "%(baseCommitID)", string(baseCommitID), -1)
test.wantDiff.Raw = strings.Replace(test.wantDiff.Raw, "%(headCommitID)", string(headCommitID), -1)
if !reflect.DeepEqual(diff, test.wantDiff) {
t.Errorf("%s: diff != wantDiff\n\ndiff ==========\n%s\n\nwantDiff ==========\n%s", label, asJSON(diff), asJSON(test.wantDiff))
}
if _, err := test.baseRepo.CrossRepoDiff(nonexistentCommitID, test.headRepo, headCommitID, test.opt); err != vcs.ErrCommitNotFound {
t.Errorf("%s: CrossRepoDiff with bad base commit ID: want ErrCommitNotFound, got %v", label, err)
continue
}
if _, err := test.baseRepo.CrossRepoDiff(baseCommitID, test.headRepo, nonexistentCommitID, test.opt); err != vcs.ErrCommitNotFound {
if label == "git cmd" {
t.Log("skipping failure on git cmd because unimplemented")
continue
}
t.Errorf("%s: CrossRepoDiff with bad head commit ID: want ErrCommitNotFound, got %v", label, err)
continue
}
}
```
</details>
Leave a reaction on this issue to contribute to the project by classifying this instance as a **Bug** :-1:, **Mitigated** :+1:, or **Desirable Behavior** :rocket:
See the descriptions of the classifications [here](https://github.com/github-vet/rangeclosure-findings#how-can-i-help) for more information.
commit ID: d784c9520ccdd19883f59efd0a2ae4441f576582
|
test
|
sourcegraph go vcs vcs diff test go loc found a possible issue in at below is the message reported by the analyzer for this snippet of code beware that the analyzer only reports the first issue it finds so please do not limit your consideration to the contents of the below message range loop variable test used in defer or goroutine at line click here to show the line s of go which triggered the analyzer go for label test range tests basecommitid err test baserepo resolverevision test base if err nil t errorf s resolverevision q on base s label test base err continue headcommitid err test headrepo resolverevision test head if err nil t errorf s resolverevision q on head s label test head err continue try calling crossrepodiff a lot the git impls do some global state stuff creating a new remote fetching into the base const n var wg sync waitgroup for i i n i wg add go func defer wg done err test baserepo crossrepodiff basecommitid test headrepo headcommitid test opt if err nil t errorf s in concurrency test for crossrepodiff s v s v s label basecommitid test headrepo headcommitid test opt err wg wait diff err test baserepo crossrepodiff basecommitid test headrepo headcommitid test opt if err nil t errorf s crossrepodiff s v s v s label basecommitid test headrepo headcommitid test opt err continue substitute for easier test expectation definition see the wantdiff field doc for more info test wantdiff raw strings replace test wantdiff raw basecommitid string basecommitid test wantdiff raw strings replace test wantdiff raw headcommitid string headcommitid if reflect deepequal diff test wantdiff t errorf s diff wantdiff n ndiff n s n nwantdiff n s label asjson diff asjson test wantdiff if err test baserepo crossrepodiff nonexistentcommitid test headrepo headcommitid test opt err vcs errcommitnotfound t errorf s crossrepodiff with bad base commit id want errcommitnotfound got v label err continue if err test baserepo crossrepodiff basecommitid test headrepo 
nonexistentcommitid test opt err vcs errcommitnotfound if label git cmd t log skipping failure on git cmd because unimplemented continue t errorf s crossrepodiff with bad head commit id want errcommitnotfound got v label err continue leave a reaction on this issue to contribute to the project by classifying this instance as a bug mitigated or desirable behavior rocket see the descriptions of the classifications for more information commit id
| 1
|
286,102
| 24,719,346,904
|
IssuesEvent
|
2022-10-20 09:28:21
|
matrix-org/dendrite
|
https://api.github.com/repos/matrix-org/dendrite
|
closed
|
CS peek sytest flakes about 10% of the time with db deadlock in sqlite
|
tests
|
Sytest is from https://github.com/matrix-org/sytest/pull/944/files (in this instance, it's the test to check that peeking by alias works, but it can happen to any of them)
It looks like this when it jams:
```
[server]: time="2020-09-07T22:43:25.649467000Z" level=info msg="Sent event to roomserver" func="SendEvent\n\t" file=" [github.com/matrix-org/dendrite@/clientapi/routing/sendevent.go:108]" event_id="$HctHckxg92w8Kouy4abmX1OGXuSyxGxyvTpvsrv8kf0" req.id=myw1xGEtMaf4 req.method=PUT req.path="/_matrix/client/r0/rooms/!rHrmkzrfrnSrVMYS:localhost:8800/send/m.room.message/2" room_id="!rHrmkzrfrnSrVMYS:localhost:8800" room_version=5 user_id="@anon-20200907_224324-9:localhost:8800"
[server]: time="2020-09-07T22:43:25.649561000Z" level=trace msg="Responding (59 bytes)" func="respond\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:173]" code=200 req.id=myw1xGEtMaf4 req.method=PUT req.path="/_matrix/client/r0/rooms/!rHrmkzrfrnSrVMYS:localhost:8800/send/m.room.message/2"
[server]: time="2020-09-07T22:43:25.650986000Z" level=trace msg="Incoming request" func="RequestWithLogging\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:123]" req.id=kZzE1VI6ry7s req.method=GET req.path=/_matrix/client/r0/sync
[server]: time="2020-09-07T22:43:25.651898000Z" level=debug msg="QueryKeyChanges request p=0,off=26,to=-1 response p=0 off=0 uids=[]" func="DeviceListCatchup\n\t" file=" [github.com/matrix-org/dendrite@/syncapi/internal/keychange.go:103]" context=missing
[server]: time="2020-09-07T22:43:25.651961000Z" level=info msg=Responding func="OnIncomingSyncRequest\n\t" file=" [github.com/matrix-org/dendrite@/syncapi/sync/requestpool.go:140]" device_id=yhfw5OUI limit=20 next=s97_0.dl-0-0 req.id=kZzE1VI6ry7s req.method=GET req.path=/_matrix/client/r0/sync since=s96_0.dl-0-26 timed_out=false timeout=1s user_id="@anon-20200907_224324-9:localhost:8800"
[server]: time="2020-09-07T22:43:25.652003000Z" level=trace msg="Responding (804 bytes)" func="respond\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:173]" code=200 req.id=kZzE1VI6ry7s req.method=GET req.path=/_matrix/client/r0/sync
[server]: time="2020-09-07T22:43:25.653582000Z" level=trace msg="Incoming request" func="RequestWithLogging\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:123]" req.id=m0r6ydUASlKl req.method=POST req.path="/_matrix/client/r0/peek/#peektest-20200907_224324:localhost:8800"
[server]: time="2020-09-07T22:43:25.654129000Z" level=info msg="Producing to topic 'DendriteOutputRoomEvent'" func="WriteOutputEvents\n\t" file=" [github.com/matrix-org/dendrite@/roomserver/internal/input/input.go:98]" room_id="!rHrmkzrfrnSrVMYS:localhost:8800" type=new_peek
[server]: time="2020-09-07T22:43:25.654621000Z" level=trace msg="Responding (46 bytes)" func="respond\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:173]" code=200 req.id=m0r6ydUASlKl req.method=POST req.path="/_matrix/client/r0/peek/#peektest-20200907_224324:localhost:8800"
[server]: time="2020-09-07T22:43:25.654624000Z" level=debug msg="roomserver output log: ignoring unknown output type" func="onMessage\n\t" file=" [github.com/matrix-org/dendrite@/currentstateserver/consumers/roomserver.go:68]" type=new_peek
[server]: time="2020-09-07T22:43:25.654628000Z" level=debug msg="roomserver output log: ignoring unknown output type" func="onMessage\n\t" file=" [github.com/matrix-org/dendrite@/federationsender/consumers/roomserver.go:101]" type=new_peek
[server]: time="2020-09-07T22:43:25.654633000Z" level=debug msg="roomserver output log: ignoring unknown output type" func="onMessage\n\t" file=" [github.com/matrix-org/dendrite@/appservice/consumers/roomserver.go:85]" type=new_peek
[server]: time="2020-09-07T22:43:25.655983000Z" level=trace msg="Incoming request" func="RequestWithLogging\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:123]" req.id=0lh8OYBFOoDe req.method=GET req.path=/_matrix/client/r0/sync
[server]: panic: the ContinualConsumer in "syncapi/roomserver" failed to SetPartitionOffset: database is locked
[server]:
[server]: goroutine 160 [running]:
[server]: github.com/matrix-org/dendrite/internal.(*ContinualConsumer).consumePartition(0xc0001e4550, 0x4f027a0, 0xc0003f26e0)
[server]: github.com/matrix-org/dendrite@/internal/consumers.go:115 +0x2fc
[server]: created by github.com/matrix-org/dendrite/internal.(*ContinualConsumer).StartOffsets
```
|
1.0
|
CS peek sytest flakes about 10% of the time with db deadlock in sqlite - Sytest is from https://github.com/matrix-org/sytest/pull/944/files (in this instance, it's the test to check that peeking by alias works, but it can happen to any of them)
It looks like this when it jams:
```
[server]: time="2020-09-07T22:43:25.649467000Z" level=info msg="Sent event to roomserver" func="SendEvent\n\t" file=" [github.com/matrix-org/dendrite@/clientapi/routing/sendevent.go:108]" event_id="$HctHckxg92w8Kouy4abmX1OGXuSyxGxyvTpvsrv8kf0" req.id=myw1xGEtMaf4 req.method=PUT req.path="/_matrix/client/r0/rooms/!rHrmkzrfrnSrVMYS:localhost:8800/send/m.room.message/2" room_id="!rHrmkzrfrnSrVMYS:localhost:8800" room_version=5 user_id="@anon-20200907_224324-9:localhost:8800"
[server]: time="2020-09-07T22:43:25.649561000Z" level=trace msg="Responding (59 bytes)" func="respond\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:173]" code=200 req.id=myw1xGEtMaf4 req.method=PUT req.path="/_matrix/client/r0/rooms/!rHrmkzrfrnSrVMYS:localhost:8800/send/m.room.message/2"
[server]: time="2020-09-07T22:43:25.650986000Z" level=trace msg="Incoming request" func="RequestWithLogging\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:123]" req.id=kZzE1VI6ry7s req.method=GET req.path=/_matrix/client/r0/sync
[server]: time="2020-09-07T22:43:25.651898000Z" level=debug msg="QueryKeyChanges request p=0,off=26,to=-1 response p=0 off=0 uids=[]" func="DeviceListCatchup\n\t" file=" [github.com/matrix-org/dendrite@/syncapi/internal/keychange.go:103]" context=missing
[server]: time="2020-09-07T22:43:25.651961000Z" level=info msg=Responding func="OnIncomingSyncRequest\n\t" file=" [github.com/matrix-org/dendrite@/syncapi/sync/requestpool.go:140]" device_id=yhfw5OUI limit=20 next=s97_0.dl-0-0 req.id=kZzE1VI6ry7s req.method=GET req.path=/_matrix/client/r0/sync since=s96_0.dl-0-26 timed_out=false timeout=1s user_id="@anon-20200907_224324-9:localhost:8800"
[server]: time="2020-09-07T22:43:25.652003000Z" level=trace msg="Responding (804 bytes)" func="respond\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:173]" code=200 req.id=kZzE1VI6ry7s req.method=GET req.path=/_matrix/client/r0/sync
[server]: time="2020-09-07T22:43:25.653582000Z" level=trace msg="Incoming request" func="RequestWithLogging\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:123]" req.id=m0r6ydUASlKl req.method=POST req.path="/_matrix/client/r0/peek/#peektest-20200907_224324:localhost:8800"
[server]: time="2020-09-07T22:43:25.654129000Z" level=info msg="Producing to topic 'DendriteOutputRoomEvent'" func="WriteOutputEvents\n\t" file=" [github.com/matrix-org/dendrite@/roomserver/internal/input/input.go:98]" room_id="!rHrmkzrfrnSrVMYS:localhost:8800" type=new_peek
[server]: time="2020-09-07T22:43:25.654621000Z" level=trace msg="Responding (46 bytes)" func="respond\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:173]" code=200 req.id=m0r6ydUASlKl req.method=POST req.path="/_matrix/client/r0/peek/#peektest-20200907_224324:localhost:8800"
[server]: time="2020-09-07T22:43:25.654624000Z" level=debug msg="roomserver output log: ignoring unknown output type" func="onMessage\n\t" file=" [github.com/matrix-org/dendrite@/currentstateserver/consumers/roomserver.go:68]" type=new_peek
[server]: time="2020-09-07T22:43:25.654628000Z" level=debug msg="roomserver output log: ignoring unknown output type" func="onMessage\n\t" file=" [github.com/matrix-org/dendrite@/federationsender/consumers/roomserver.go:101]" type=new_peek
[server]: time="2020-09-07T22:43:25.654633000Z" level=debug msg="roomserver output log: ignoring unknown output type" func="onMessage\n\t" file=" [github.com/matrix-org/dendrite@/appservice/consumers/roomserver.go:85]" type=new_peek
[server]: time="2020-09-07T22:43:25.655983000Z" level=trace msg="Incoming request" func="RequestWithLogging\n\t" file=" [github.com/matrix-org/util@v0.0.0-20200807132607-55161520e1d4/json.go:123]" req.id=0lh8OYBFOoDe req.method=GET req.path=/_matrix/client/r0/sync
[server]: panic: the ContinualConsumer in "syncapi/roomserver" failed to SetPartitionOffset: database is locked
[server]:
[server]: goroutine 160 [running]:
[server]: github.com/matrix-org/dendrite/internal.(*ContinualConsumer).consumePartition(0xc0001e4550, 0x4f027a0, 0xc0003f26e0)
[server]: github.com/matrix-org/dendrite@/internal/consumers.go:115 +0x2fc
[server]: created by github.com/matrix-org/dendrite/internal.(*ContinualConsumer).StartOffsets
```
|
test
|
cs peek sytest flakes about of the time with db deadlock in sqlite sytest is from in this instance it s the test to check that peeking by alias works but it can happen to any of them it looks like this when it jams time level info msg sent event to roomserver func sendevent n t file event id req id req method put req path matrix client rooms rhrmkzrfrnsrvmys localhost send m room message room id rhrmkzrfrnsrvmys localhost room version user id anon localhost time level trace msg responding bytes func respond n t file code req id req method put req path matrix client rooms rhrmkzrfrnsrvmys localhost send m room message time level trace msg incoming request func requestwithlogging n t file req id req method get req path matrix client sync time level debug msg querykeychanges request p off to response p off uids func devicelistcatchup n t file context missing time level info msg responding func onincomingsyncrequest n t file device id limit next dl req id req method get req path matrix client sync since dl timed out false timeout user id anon localhost time level trace msg responding bytes func respond n t file code req id req method get req path matrix client sync time level trace msg incoming request func requestwithlogging n t file req id req method post req path matrix client peek peektest localhost time level info msg producing to topic dendriteoutputroomevent func writeoutputevents n t file room id rhrmkzrfrnsrvmys localhost type new peek time level trace msg responding bytes func respond n t file code req id req method post req path matrix client peek peektest localhost time level debug msg roomserver output log ignoring unknown output type func onmessage n t file type new peek time level debug msg roomserver output log ignoring unknown output type func onmessage n t file type new peek time level debug msg roomserver output log ignoring unknown output type func onmessage n t file type new peek time level trace msg incoming request func requestwithlogging n t 
file req id req method get req path matrix client sync panic the continualconsumer in syncapi roomserver failed to setpartitionoffset database is locked goroutine github com matrix org dendrite internal continualconsumer consumepartition github com matrix org dendrite internal consumers go created by github com matrix org dendrite internal continualconsumer startoffsets
| 1
|
134,013
| 12,559,840,738
|
IssuesEvent
|
2020-06-07 20:12:39
|
gcm1001/TFG-CeniehAriadne
|
https://api.github.com/repos/gcm1001/TFG-CeniehAriadne
|
opened
|
Actualizar memoria y anexos
|
documentation
|
## Memoria
**Actualizar** los puntos:
- 1 Introducción.
- 2 Objetivos del proyecto.
- 3 Conceptos teóricos.
- 4 Técnicas y herramientas.
**Empezar** los puntos:
- 5 Aspectos relevantes del desarrollo del proyecto
## Anexos
**Actualizar** los puntos:
A. Plan proyecto.
**Empezar** los puntos:
- E Manual de usuario
|
1.0
|
Actualizar memoria y anexos - ## Memoria
**Actualizar** los puntos:
- 1 Introducción.
- 2 Objetivos del proyecto.
- 3 Conceptos teóricos.
- 4 Técnicas y herramientas.
**Empezar** los puntos:
- 5 Aspectos relevantes del desarrollo del proyecto
## Anexos
**Actualizar** los puntos:
A. Plan proyecto.
**Empezar** los puntos:
- E Manual de usuario
|
non_test
|
actualizar memoria y anexos memoria actualizar los puntos introducción objetivos del proyecto conceptos teóricos técnicas y herramientas empezar los puntos aspectos relevantes del desarrollo del proyecto anexos actualizar los puntos a plan proyecto empezar los puntos e manual de usuario
| 0
|
343,431
| 30,665,580,890
|
IssuesEvent
|
2023-07-25 18:01:03
|
opensearch-project/dashboards-visualizations
|
https://api.github.com/repos/opensearch-project/dashboards-visualizations
|
closed
|
[AUTOCUT] Integration Test failed for ganttChartDashboards: 2.9.0 tar distribution
|
untriaged autocut integ-test-failure v2.9.0
|
The integration test failed at distribution level for component ganttChartDashboards<br>Version: 2.9.0<br>Distribution: tar<br>Architecture: arm64<br>Platform: linux<br><br>Please check the logs: https://build.ci.opensearch.org/job/integ-test-opensearch-dashboards/3691/display/redirect<br><br> * Steps to reproduce: See https://github.com/opensearch-project/opensearch-build/tree/main/src/test_workflow#integration-tests<br>* See all log files:<br> - [With security](https://ci.opensearch.org/ci/dbc/integ-test-opensearch-dashboards/2.9.0/6388/linux/arm64/tar/test-results/3691/integ-test/ganttChartDashboards/with-security/ganttChartDashboards.yml) (if applicable)<br> - [Without security](https://ci.opensearch.org/ci/dbc/integ-test-opensearch-dashboards/2.9.0/6388/linux/arm64/tar/test-results/3691/integ-test/ganttChartDashboards/without-security/ganttChartDashboards.yml) (if applicable)<br><br>
|
1.0
|
[AUTOCUT] Integration Test failed for ganttChartDashboards: 2.9.0 tar distribution - The integration test failed at distribution level for component ganttChartDashboards<br>Version: 2.9.0<br>Distribution: tar<br>Architecture: arm64<br>Platform: linux<br><br>Please check the logs: https://build.ci.opensearch.org/job/integ-test-opensearch-dashboards/3691/display/redirect<br><br> * Steps to reproduce: See https://github.com/opensearch-project/opensearch-build/tree/main/src/test_workflow#integration-tests<br>* See all log files:<br> - [With security](https://ci.opensearch.org/ci/dbc/integ-test-opensearch-dashboards/2.9.0/6388/linux/arm64/tar/test-results/3691/integ-test/ganttChartDashboards/with-security/ganttChartDashboards.yml) (if applicable)<br> - [Without security](https://ci.opensearch.org/ci/dbc/integ-test-opensearch-dashboards/2.9.0/6388/linux/arm64/tar/test-results/3691/integ-test/ganttChartDashboards/without-security/ganttChartDashboards.yml) (if applicable)<br><br>
|
test
|
integration test failed for ganttchartdashboards tar distribution the integration test failed at distribution level for component ganttchartdashboards version distribution tar architecture platform linux please check the logs steps to reproduce see see all log files if applicable if applicable
| 1
|
55,768
| 14,020,550,292
|
IssuesEvent
|
2020-10-29 19:51:49
|
anyulled/react-skeleton
|
https://api.github.com/repos/anyulled/react-skeleton
|
opened
|
CVE-2020-7751 (Medium) detected in pathval-1.1.0.tgz
|
security vulnerability
|
## CVE-2020-7751 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>pathval-1.1.0.tgz</b></p></summary>
<p>Object value retrieval given a string path</p>
<p>Library home page: <a href="https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz">https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz</a></p>
<p>Path to dependency file: react-skeleton/client/package.json</p>
<p>Path to vulnerable library: react-skeleton/client/node_modules/pathval/package.json</p>
<p>
Dependency Hierarchy:
- chai-4.2.0.tgz (Root Library)
- :x: **pathval-1.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/anyulled/react-skeleton/commit/9adc88615191ad561d2a8a874dfc37d4fc4c235e">9adc88615191ad561d2a8a874dfc37d4fc4c235e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects all versions of package pathval.
<p>Publish Date: 2020-10-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7751>CVE-2020-7751</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2020-7751 (Medium) detected in pathval-1.1.0.tgz - ## CVE-2020-7751 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>pathval-1.1.0.tgz</b></p></summary>
<p>Object value retrieval given a string path</p>
<p>Library home page: <a href="https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz">https://registry.npmjs.org/pathval/-/pathval-1.1.0.tgz</a></p>
<p>Path to dependency file: react-skeleton/client/package.json</p>
<p>Path to vulnerable library: react-skeleton/client/node_modules/pathval/package.json</p>
<p>
Dependency Hierarchy:
- chai-4.2.0.tgz (Root Library)
- :x: **pathval-1.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/anyulled/react-skeleton/commit/9adc88615191ad561d2a8a874dfc37d4fc4c235e">9adc88615191ad561d2a8a874dfc37d4fc4c235e</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
This affects all versions of package pathval.
<p>Publish Date: 2020-10-26
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-7751>CVE-2020-7751</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: High
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: Low
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve medium detected in pathval tgz cve medium severity vulnerability vulnerable library pathval tgz object value retrieval given a string path library home page a href path to dependency file react skeleton client package json path to vulnerable library react skeleton client node modules pathval package json dependency hierarchy chai tgz root library x pathval tgz vulnerable library found in head commit a href found in base branch master vulnerability details this affects all versions of package pathval publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact low integrity impact low availability impact high for more information on scores click a href step up your open source security game with whitesource
| 0
|
182,867
| 14,169,139,667
|
IssuesEvent
|
2020-11-12 12:48:22
|
cnigfr/PCRS
|
https://api.github.com/repos/cnigfr/PCRS
|
closed
|
Jeu de test : nombreux namespaces définis sans nécessité
|
jeux tests
|
Dans le jeu de tests JeuxTestv2.gml des espaces de nommages non nécessaires sont définis de nombreuses fois.
`
<featureMember xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:fn="http://www.w3.org/2005/xpath-functions" xmlns:pcrs="http://cnig.gouv.fr/pcrs">
`
Les espaces de nommage xmlns:xs="http://www.w3.org/2001/XMLSchema" et xmlns:fn="http://www.w3.org/2005/xpath-functions" ne sont pas utilisés ; ils n’ont pas besoin d’être définis à chaque featureMember.
(idem pour xmlns:pcrs="http://cnig.gouv.fr/pcrs" mais traité un peu différemment dans l'"issue" 2.)
Correction suggérée : supprimer [xmlns:xs="http://www.w3.org/2001/XMLSchema"] et [xmlns:fn="http://www.w3.org/2005/xpath-functions"] dans tout le document.
|
1.0
|
Jeu de test : nombreux namespaces définis sans nécessité - Dans le jeu de tests JeuxTestv2.gml des espaces de nommages non nécessaires sont définis de nombreuses fois.
`
<featureMember xmlns:xs="http://www.w3.org/2001/XMLSchema" xmlns:fn="http://www.w3.org/2005/xpath-functions" xmlns:pcrs="http://cnig.gouv.fr/pcrs">
`
Les espaces de nommage xmlns:xs="http://www.w3.org/2001/XMLSchema" et xmlns:fn="http://www.w3.org/2005/xpath-functions" ne sont pas utilisés ; ils n’ont pas besoin d’être définis à chaque featureMember.
(idem pour xmlns:pcrs="http://cnig.gouv.fr/pcrs" mais traité un peu différemment dans l'"issue" 2.)
Correction suggérée : supprimer [xmlns:xs="http://www.w3.org/2001/XMLSchema"] et [xmlns:fn="http://www.w3.org/2005/xpath-functions"] dans tout le document.
|
test
|
jeu de test nombreux namespaces définis sans nécessité dans le jeu de tests gml des espaces de nommages non nécessaires sont définis de nombreuses fois featuremember xmlns xs xmlns fn xmlns pcrs les espaces de nommage xmlns xs et xmlns fn ne sont pas utilisés ils n’ont pas besoin d’être définis à chaque featuremember idem pour xmlns pcrs mais traité un peu différemment dans l issue correction suggérée supprimer et dans tout le document
| 1
|
299,011
| 25,875,251,508
|
IssuesEvent
|
2022-12-14 07:19:51
|
zephyrproject-rtos/test_results
|
https://api.github.com/repos/zephyrproject-rtos/test_results
|
closed
|
tests-ci : portability: posix: eventfd_basic.newlib.posix_api test No Console Output(Timeout)
|
bug area: Tests
|
**Describe the bug**
eventfd_basic.newlib.posix_api test is No Console Output(Timeout) on zephyr-v3.2.0-2490-ga1b4896efe46 on mimxrt1170_evk_cm7
see logs for details
**To Reproduce**
1.
```
scripts/twister --device-testing --device-serial /dev/ttyACM0 -p mimxrt1170_evk_cm7 --sub-test portability.posix
```
2. See error
**Expected behavior**
test pass
**Impact**
**Logs and console output**
```
E: ***** USAGE FAULT *****
E: Illegal use of the EPSR
E: r0/a1: 0xac335cd7 r1/a2: 0xe000ed00 r2/a3: 0xffffffe8
E: r3/a4: 0x8000496c r12/ip: 0x40460000 r14/lr: 0x3000d199
E: xpsr: 0x60000000
E: Faulting instruction address (r15/pc): 0x00000000
E: >>> ZEPHYR FATAL ERROR 0: CPU exception on CPU 0
E: Current thread: 0x800022c8 (main)
E: Halting system
```
**Environment (please complete the following information):**
- OS: (e.g. Linux )
- Toolchain (e.g Zephyr SDK)
- Commit SHA or Version used: zephyr-v3.2.0-2490-ga1b4896efe46
|
1.0
|
tests-ci : portability: posix: eventfd_basic.newlib.posix_api test No Console Output(Timeout)
-
**Describe the bug**
eventfd_basic.newlib.posix_api test is No Console Output(Timeout) on zephyr-v3.2.0-2490-ga1b4896efe46 on mimxrt1170_evk_cm7
see logs for details
**To Reproduce**
1.
```
scripts/twister --device-testing --device-serial /dev/ttyACM0 -p mimxrt1170_evk_cm7 --sub-test portability.posix
```
2. See error
**Expected behavior**
test pass
**Impact**
**Logs and console output**
```
E: ***** USAGE FAULT *****
E: Illegal use of the EPSR
E: r0/a1: 0xac335cd7 r1/a2: 0xe000ed00 r2/a3: 0xffffffe8
E: r3/a4: 0x8000496c r12/ip: 0x40460000 r14/lr: 0x3000d199
E: xpsr: 0x60000000
E: Faulting instruction address (r15/pc): 0x00000000
E: >>> ZEPHYR FATAL ERROR 0: CPU exception on CPU 0
E: Current thread: 0x800022c8 (main)
E: Halting system
```
**Environment (please complete the following information):**
- OS: (e.g. Linux )
- Toolchain (e.g Zephyr SDK)
- Commit SHA or Version used: zephyr-v3.2.0-2490-ga1b4896efe46
|
test
|
tests ci portability posix eventfd basic newlib posix api test no console output timeout describe the bug eventfd basic newlib posix api test is no console output timeout on zephyr on evk see logs for details to reproduce scripts twister device testing device serial dev p evk sub test portability posix see error expected behavior test pass impact logs and console output e usage fault e illegal use of the epsr e e ip lr e xpsr e faulting instruction address pc e zephyr fatal error cpu exception on cpu e current thread main e halting system environment please complete the following information os e g linux toolchain e g zephyr sdk commit sha or version used zephyr
| 1
|
216,747
| 16,814,905,774
|
IssuesEvent
|
2021-06-17 05:55:46
|
datafuselabs/datafuse
|
https://api.github.com/repos/datafuselabs/datafuse
|
closed
|
[tests] Tests support error code match rather than error text.
|
easy-task feature testing
|
**Summary**
Tests support error code match rather than error text.
```
SELECT x, number FROM system.numbers LIMIT 1; -- { serverError 47 }
```
ClickHouse: https://github.com/ClickHouse/ClickHouse/blob/master/tests/queries/0_stateless/00002_system_numbers.sql#L9-L14
|
1.0
|
[tests] Tests support error code match rather than error text. - **Summary**
Tests support error code match rather than error text.
```
SELECT x, number FROM system.numbers LIMIT 1; -- { serverError 47 }
```
ClickHouse: https://github.com/ClickHouse/ClickHouse/blob/master/tests/queries/0_stateless/00002_system_numbers.sql#L9-L14
|
test
|
tests support error code match rather than error text summary tests support error code match rather than error text select x number from system numbers limit servererror clickhouse
| 1
|
90,923
| 8,287,392,222
|
IssuesEvent
|
2018-09-19 08:44:51
|
Mojo1917/LocBase
|
https://api.github.com/repos/Mojo1917/LocBase
|
closed
|
Database-Umschaltung cleverer machen
|
Feature zum Test für Matthias
|
Lass uns oben, beim Data-Base-Umschalten eine ähnliche Logik wie bei dem (neuen) Filterbutton anweden:
Gehen wir davon aus, dass es 5 Datenbanken gibt und ich mich in der 2. befinde. Dann kann ich horizontal nach rechts und links wischen. In gewissen Abständen switche ich (nur den Titel) zu den anderen Datenbanekn, was auch mit einer Vibration/Klick bestätigt wird. Links kann ich noch einmal gehen, rechts aber drei mal. Aber auch wieder zurück etc. - aber erst beim LOslassen wird (nochmals Vibration/Klick) die Datenbank angesprungen.
Diese Bedienung ist mir serh wichtig für die Zukunft, weil die Bedieung unserer ToDo-App sehr auf diesem Prinzip beruhen wird.
|
1.0
|
Database-Umschaltung cleverer machen - Lass uns oben, beim Data-Base-Umschalten eine ähnliche Logik wie bei dem (neuen) Filterbutton anweden:
Gehen wir davon aus, dass es 5 Datenbanken gibt und ich mich in der 2. befinde. Dann kann ich horizontal nach rechts und links wischen. In gewissen Abständen switche ich (nur den Titel) zu den anderen Datenbanekn, was auch mit einer Vibration/Klick bestätigt wird. Links kann ich noch einmal gehen, rechts aber drei mal. Aber auch wieder zurück etc. - aber erst beim LOslassen wird (nochmals Vibration/Klick) die Datenbank angesprungen.
Diese Bedienung ist mir serh wichtig für die Zukunft, weil die Bedieung unserer ToDo-App sehr auf diesem Prinzip beruhen wird.
|
test
|
database umschaltung cleverer machen lass uns oben beim data base umschalten eine ähnliche logik wie bei dem neuen filterbutton anweden gehen wir davon aus dass es datenbanken gibt und ich mich in der befinde dann kann ich horizontal nach rechts und links wischen in gewissen abständen switche ich nur den titel zu den anderen datenbanekn was auch mit einer vibration klick bestätigt wird links kann ich noch einmal gehen rechts aber drei mal aber auch wieder zurück etc aber erst beim loslassen wird nochmals vibration klick die datenbank angesprungen diese bedienung ist mir serh wichtig für die zukunft weil die bedieung unserer todo app sehr auf diesem prinzip beruhen wird
| 1
|
678,279
| 23,191,254,704
|
IssuesEvent
|
2022-08-01 12:51:49
|
cheminfo/nmrium
|
https://api.github.com/repos/cheminfo/nmrium
|
closed
|
Rename workspace
|
enhancement Priority
|
Process 1D workspace is probably useless and we can rename it to '1D multiple spectra analysis' (I don't think we need to repeat in the menu 'workspace'. Just keep it for the first item (Default workspace)).
Here are the active tools / panels :


|
1.0
|
Rename workspace - Process 1D workspace is probably useless and we can rename it to '1D multiple spectra analysis' (I don't think we need to repeat in the menu 'workspace'. Just keep it for the first item (Default workspace)).
Here are the active tools / panels :


|
non_test
|
rename workspace process workspace is probably useless and we can rename it to multiple spectra analysis i don t think we need to repeat in the menu workspace just keep it for the first item default workspace here are the active tools panels
| 0
|
426,611
| 12,375,123,413
|
IssuesEvent
|
2020-05-19 03:42:07
|
QingCloudAppcenter/QKE
|
https://api.github.com/repos/QingCloudAppcenter/QKE
|
closed
|
registry mirror 易配错,增加文案提示
|
kind/feature priority/important-soon
|
<!-- Please only use this template for submitting enhancement requests -->
**What would you like to be added**:
目前:镜像服务地址,多个用 1 个空格隔开
修改后:镜像 mirror 地址,多个用 1 个空格隔开。非必填项,不是镜像仓库地址。
**Why is this needed**:
很多用户填错此项,造成集群创建失败
|
1.0
|
registry mirror 易配错,增加文案提示 - <!-- Please only use this template for submitting enhancement requests -->
**What would you like to be added**:
目前:镜像服务地址,多个用 1 个空格隔开
修改后:镜像 mirror 地址,多个用 1 个空格隔开。非必填项,不是镜像仓库地址。
**Why is this needed**:
很多用户填错此项,造成集群创建失败
|
non_test
|
registry mirror 易配错,增加文案提示 what would you like to be added 目前:镜像服务地址,多个用 个空格隔开 修改后:镜像 mirror 地址,多个用 个空格隔开。非必填项,不是镜像仓库地址。 why is this needed 很多用户填错此项,造成集群创建失败
| 0
|
326,954
| 28,034,418,541
|
IssuesEvent
|
2023-03-28 14:16:23
|
SPW-DIG/metawal-core-geonetwork
|
https://api.github.com/repos/SPW-DIG/metawal-core-geonetwork
|
closed
|
Webcomponent - affichage des relations dans un tableau
|
Env test - OK Env valid - OK Env prod - OK WebComponent
|
On veut afficher les ressources en relation avec une autre ressource dans un tableau. Est-ce possible actuellement avec les WC ? Comment ?
| Nom donnée | Fait partie de | Est constitué de| Sert de source pour | Dérivé de | Est une révision de |
| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |
| Donnée 1 | Série A | | Donnée 2, Donnée 3 | | |
| Série A | | Donnée 1, Donnée 4, Donnée 5 | | | Série B |
|
1.0
|
Webcomponent - affichage des relations dans un tableau - On veut afficher les ressources en relation avec une autre ressource dans un tableau. Est-ce possible actuellement avec les WC ? Comment ?
| Nom donnée | Fait partie de | Est constitué de| Sert de source pour | Dérivé de | Est une révision de |
| ------------- | ------------- | ------------- | ------------- | ------------- | ------------- |
| Donnée 1 | Série A | | Donnée 2, Donnée 3 | | |
| Série A | | Donnée 1, Donnée 4, Donnée 5 | | | Série B |
|
test
|
webcomponent affichage des relations dans un tableau on veut afficher les ressources en relation avec une autre ressource dans un tableau est ce possible actuellement avec les wc comment nom donnée fait partie de est constitué de sert de source pour dérivé de est une révision de donnée série a donnée donnée série a donnée donnée donnée série b
| 1
|
5,281
| 2,770,146,398
|
IssuesEvent
|
2015-05-01 11:33:44
|
AAndharia/iManager
|
https://api.github.com/repos/AAndharia/iManager
|
reopened
|
Get Employee Personal Details
|
Status - Tested & Working Type - Wiki
|
User must fill in the details on this page; he cannot access any other page until he saves all data here. If the user doesn't save the data and closes the browser, or logs out and logs in again, he'll see this page again and will not be able to continue until submitting the data on this page.
**Get Employee Personal Details**
create new page for getting information from employee.
Field List
**Personal Email** text-box control required
**Date Of Birth** text-box date-picker required
**PAN Card** text-box control optional
**Do you have passport** check-box control
**passport number** text-box control required depends on "Do you have passport" field.
**Bank Name** text-box control optional
**Bank A/C NO.** text-box control optional
**Are you married?** check-box control
**Anniversary Date** text-box date-picker control required depends on "Are you married?" field.
**Create Address Custom Directive** see task no #45
**Permanent Address** Address custom directive.
**Same as Permanent Address** check-box control. by default unchecked.
**Present Address** Address custom directive. hide this control when user is checked "Same as Permanent Address" check box.
Click on save or submit button then update each field on the relevant table: user or person or employee etc...
after save, the next process is explained in task no #46
UI look like

|
1.0
|
Get Employee Personal Details - User must fill in the details on this page; he cannot access any other page until he saves all data here. If the user doesn't save the data and closes the browser, or logs out and logs in again, he'll see this page again and will not be able to continue until submitting the data on this page.
**Get Employee Personal Details**
create new page for getting information from employee.
Field List
**Personal Email** text-box control required
**Date Of Birth** text-box date-picker required
**PAN Card** text-box control optional
**Do you have passport** check-box control
**passport number** text-box control required depends on "Do you have passport" field.
**Bank Name** text-box control optional
**Bank A/C NO.** text-box control optional
**Are you married?** check-box control
**Anniversary Date** text-box date-picker control required depends on "Are you married?" field.
**Create Address Custom Directive** see task no #45
**Permanent Address** Address custom directive.
**Same as Permanent Address** check-box control. by default unchecked.
**Present Address** Address custom directive. hide this control when user is checked "Same as Permanent Address" check box.
Click on save or submit button then update each field on the relevant table: user or person or employee etc...
after save, the next process is explained in task no #46
UI look like

|
test
|
get employee personal details user must have to fill details in this page he cannot access any other page until he saves all data in this page if user doesn t save data and close browser or logout and login again then again he ll see this page and will not be able to continue until submitting data on this page get employee personal details create new page for getting information from employee filed list personal email text box control required date of birth text box date picker required pan card text box control optional do you have passport check box control passport number text box control required depends on do you have passport filed bank name text box control optional bank a c no text box control optional are you married check box control anniversary date text box date picker control required depends on are you married filed create address custom directive see task no permanent address address custom directive same as permanent address check box control by default unchecked present address address custom directive hide this control when user is checked same as permanent address check box click on save or submit button then update to each filed on relevant table user or parson or employee etc after save next process is explain in task no ui look like
| 1
|
132,115
| 10,731,395,304
|
IssuesEvent
|
2019-10-28 19:27:26
|
canjs/canjs
|
https://api.github.com/repos/canjs/canjs
|
closed
|
Test the production webpack code
|
testing webpack
|
It looks like we test building webpack in production mode, but maybe we don’t test that the code actually works?
Unfortunately I didn’t have time to dig into this, but I came across an issue where the webpack build passed for an example app but the code didn’t work because some code was compiled out of the production build, causing a runtime error. Here’s what I fixed: https://github.com/canjs/can-connect/pull/506/files
|
1.0
|
Test the production webpack code - It looks like we test building webpack in production mode, but maybe we don’t test that the code actually works?
Unfortunately I didn’t have time to dig into this, but I came across an issue where the webpack build passed for an example app but the code didn’t work because some code was compiled out of the production build, causing a runtime error. Here’s what I fixed: https://github.com/canjs/can-connect/pull/506/files
|
test
|
test the production webpack code it looks like we test building webpack in production mode but maybe we don’t test that the code actually works unfortunately i didn’t have time to dig into this but i came across an issue where the webpack build passed for an example app but the code didn’t work because some code was compiled out of the production build causing a runtime error here’s what i fixed
| 1
|
309,788
| 26,678,584,333
|
IssuesEvent
|
2023-01-26 15:59:00
|
wazuh/wazuh-qa
|
https://api.github.com/repos/wazuh/wazuh-qa
|
opened
|
Amazon Linux 2022 SCA policy - check 4.2 to 4.3
|
team/qa type/dev-testing status/not-tracked
|
| Target version | Related issue | Related PR |
|---|---|---|
| 4.4.x | #3838 | https://github.com/wazuh/wazuh/pull/15681 |
|Check Id and Name| Status| Extra|
|---|---|---|
|4.2 Configure Logging||||
|4.2.1 Configure rsyslog||||
|4.2.1.1 Ensure rsyslog is installed (Automated)||||
|4.2.1.2 Ensure rsyslog service is enabled (Automated)||||
|4.2.1.3 Ensure journald is configured to send logs to rsyslog (Manual)||||
|4.2.1.4 Ensure rsyslog default file permissions are configured (Automated)||||
|4.2.1.5 Ensure logging is configured (Manual)||||
|4.2.1.6 Ensure rsyslog is configured to send logs to a remote log host (Manual)||||
|4.2.1.7 Ensure rsyslog is not configured to receive logs from a remote client (Automated)||||
|4.2.2 Configure journald||||
|4.2.2.1 Ensure journald is configured to send logs to a remote log host||||
|4.2.2.1.1 Ensure systemd-journal-remote is installed (Manual)||||
|4.2.2.1.2 Ensure systemd-journal-remote is configured (Manual)||||
|4.2.2.1.3 Ensure systemd-journal-remote is enabled (Manual)||||
|4.2.2.1.4 Ensure journald is not configured to receive logs from a remote client (Automated)||||
|4.2.2.2 Ensure journald service is enabled (Automated)||||
|4.2.2.3 Ensure journald is configured to compress large log files (Automated)||||
|4.2.2.4 Ensure journald is configured to write logfiles to persistent disk (Automated)||||
|4.2.2.5 Ensure journald is not configured to send logs to rsyslog (Manual)||||
|4.2.2.6 Ensure journald log rotation is configured per site policy (Manual)||||
|4.2.2.7 Ensure journald default file permissions configured (Manual)||||
|4.2.3 Ensure permissions on all logfiles are configured (Automated)||||
|4.3 Ensure logrotate is configured (Manual)||||
|
1.0
|
Amazon Linux 2022 SCA policy - check 4.2 to 4.3 - | Target version | Related issue | Related PR |
|---|---|---|
| 4.4.x | #3838 | https://github.com/wazuh/wazuh/pull/15681 |
|Check Id and Name| Status| Extra|
|---|---|---|
|4.2 Configure Logging||||
|4.2.1 Configure rsyslog||||
|4.2.1.1 Ensure rsyslog is installed (Automated)||||
|4.2.1.2 Ensure rsyslog service is enabled (Automated)||||
|4.2.1.3 Ensure journald is configured to send logs to rsyslog (Manual)||||
|4.2.1.4 Ensure rsyslog default file permissions are configured (Automated)||||
|4.2.1.5 Ensure logging is configured (Manual)||||
|4.2.1.6 Ensure rsyslog is configured to send logs to a remote log host (Manual)||||
|4.2.1.7 Ensure rsyslog is not configured to receive logs from a remote client (Automated)||||
|4.2.2 Configure journald||||
|4.2.2.1 Ensure journald is configured to send logs to a remote log host||||
|4.2.2.1.1 Ensure systemd-journal-remote is installed (Manual)||||
|4.2.2.1.2 Ensure systemd-journal-remote is configured (Manual)||||
|4.2.2.1.3 Ensure systemd-journal-remote is enabled (Manual)||||
|4.2.2.1.4 Ensure journald is not configured to receive logs from a remote client (Automated)||||
|4.2.2.2 Ensure journald service is enabled (Automated)||||
|4.2.2.3 Ensure journald is configured to compress large log files (Automated)||||
|4.2.2.4 Ensure journald is configured to write logfiles to persistent disk (Automated)||||
|4.2.2.5 Ensure journald is not configured to send logs to rsyslog (Manual)||||
|4.2.2.6 Ensure journald log rotation is configured per site policy (Manual)||||
|4.2.2.7 Ensure journald default file permissions configured (Manual)||||
|4.2.3 Ensure permissions on all logfiles are configured (Automated)||||
|4.3 Ensure logrotate is configured (Manual)||||
|
test
|
amazon linux sca policy check to target version related issue related pr x check id and name status extra configure logging configure rsyslog ensure rsyslog is installed automated ensure rsyslog service is enabled automated ensure journald is configured to send logs to rsyslog manual ensure rsyslog default file permissions are configured automated ensure logging is configured manual ensure rsyslog is configured to send logs to a remote log host manual ensure rsyslog is not configured to receive logs from a remote client automated configure journald ensure journald is configured to send logs to a remote log host ensure systemd journal remote is installed manual ensure systemd journal remote is configured manual ensure systemd journal remote is enabled manual ensure journald is not configured to receive logs from a remote client automated ensure journald service is enabled automated ensure journald is configured to compress large log files automated ensure journald is configured to write logfiles to persistent disk automated ensure journald is not configured to send logs to rsyslog manual ensure journald log rotation is configured per site policy manual ensure journald default file permissions configured manual ensure permissions on all logfiles are configured automated ensure logrotate is configured manual
| 1
|
195,915
| 15,560,443,148
|
IssuesEvent
|
2021-03-16 12:43:40
|
alphagov/govuk-design-system
|
https://api.github.com/repos/alphagov/govuk-design-system
|
closed
|
Update working group page with content about submissions
|
documentation
|
## What
Add content about submissions to [Community page about the working group](https://design-system.service.gov.uk/community/design-system-working-group/).
## Why
Our Drive contains a doc about [what should, and should not, go to the working group](https://docs.google.com/document/d/1xL-0WBl2hct1eiIBbW7rY6xExfopQHP85JqJOg1cf_U/edit?ts=6038cc4a). It's only had a few viewers since a prior team-member created it. If this info were visible publicly, it could tell more users about the scope of the working group.
## Who needs to know about this
Community Manager, Content Designer, Delivery Manager, Designers, Developers, Head of Interaction Design, Senior Product Manager, Technical Writer
## Done when
- [x] Content Designer drafts update
- [x] Team reviews update
- [x] Update gets 2i, if needed
- [ ] Update published
|
1.0
|
Update working group page with content about submissions - ## What
Add content about submissions to [Community page about the working group](https://design-system.service.gov.uk/community/design-system-working-group/).
## Why
Our Drive contains a doc about [what should, and should not, go to the working group](https://docs.google.com/document/d/1xL-0WBl2hct1eiIBbW7rY6xExfopQHP85JqJOg1cf_U/edit?ts=6038cc4a). It's only had a few viewers since a prior team-member created it. If this info were visible publicly, it could tell more users about the scope of the working group.
## Who needs to know about this
Community Manager, Content Designer, Delivery Manager, Designers, Developers, Head of Interaction Design, Senior Product Manager, Technical Writer
## Done when
- [x] Content Designer drafts update
- [x] Team reviews update
- [x] Update gets 2i, if needed
- [ ] Update published
|
non_test
|
update working group page with content about submissions what add content about submissions to why our drive contains a doc about it s only had a few viewers since a prior team member created it if this info were visible publicly it could tell more users about the scope of the working group who needs to know about this community manager content designer delivery manager designers developers head of interaction design senior product manager technical writer done when content designer drafts update team reviews update update gets if needed update published
| 0
|
288,138
| 24,882,768,787
|
IssuesEvent
|
2022-10-28 03:47:11
|
MPMG-DCC-UFMG/F01
|
https://api.github.com/repos/MPMG-DCC-UFMG/F01
|
closed
|
Teste de generalizacao para a tag Orçamento - Execução - Capela Nova
|
generalization test development template - Memory (66) tag - Orçamento subtag - Execução
|
DoD: Realizar o teste de Generalização do validador da tag Orçamento - Execução para o Município de Capela Nova.
|
1.0
|
Teste de generalizacao para a tag Orçamento - Execução - Capela Nova - DoD: Realizar o teste de Generalização do validador da tag Orçamento - Execução para o Município de Capela Nova.
|
test
|
teste de generalizacao para a tag orçamento execução capela nova dod realizar o teste de generalização do validador da tag orçamento execução para o município de capela nova
| 1
|
421,371
| 12,256,179,867
|
IssuesEvent
|
2020-05-06 11:36:14
|
wp-media/wp-rocket
|
https://api.github.com/repos/wp-media/wp-rocket
|
closed
|
Deprecated: tag_row_actions after update to WooCommerce 4.0
|
community effort: [S] module: cache priority: low type: bug
|

This occurs when Wp-rocket 3.5.2 is active
Deprecated: tag_row_actions has been deprecated since version 3.0.0. Use {$taxonomy}_row_actions instead.
**Describe the bug**
A message displayed from a function listed as "deprecated"
**To Reproduce**
Steps to reproduce the behavior:
1. With WooCommerce installed, we go to products
2. We go to categories
3. Just below the category name you see the message
4. See error
|
1.0
|
Deprecated: tag_row_actions after update to WooCommerce 4.0 - 
This occurs when Wp-rocket 3.5.2 is active
Deprecated: tag_row_actions has been deprecated since version 3.0.0. Use {$taxonomy}_row_actions instead.
**Describe the bug**
A message displayed from a function listed as "deprecated"
**To Reproduce**
Steps to reproduce the behavior:
1. With WooCommerce installed, we go to products
2. We go to categories
3. Just below the category name you see the message
4. See error
|
non_test
|
deprecated tag row actions after update to woocommerce this occurs when wp rocket is active deprecated tag row actions has been deprecated since version use taxonomy row actions instead describe the bug a message displayed from a function listed as deprecated to reproduce steps to reproduce the behavior with woocommerce installed we go to products we go to categories just below the category name you see the message see error
| 0
|
163,121
| 12,704,638,757
|
IssuesEvent
|
2020-06-23 02:04:41
|
kubernetes/kubernetes
|
https://api.github.com/repos/kubernetes/kubernetes
|
closed
|
Code deficiencies of various kinds and severities
|
area/test-infra kind/feature lifecycle/frozen sig/testing
|
Hi,
I ran [staticcheck](https://github.com/dominikh/go-staticcheck) on Kubernetes at commit 2110f72, filtered out benign issues and false positives and assembled the following report, grouped by category and sorted by severity in descending order. I hope this report is useful to you.
# defer mu.Lock() (SA2003)
The following line of code defers `Lock` instead of `Unlock`:
```
k8s.io/kubernetes/pkg/registry/registrytest/service.go:124:2: deferring Lock right after having locked already; did you mean to defer Unlock? (SA2003)
```
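For illustration, the SA2003 pattern and its fix can be sketched with a toy type (the `counter` struct below is hypothetical, not the `registrytest` service itself):

```go
package main

import (
	"fmt"
	"sync"
)

// counter guards n with a mutex; a stand-in for any mutex-protected type.
type counter struct {
	mu sync.Mutex
	n  int
}

// Inc shows the fix for SA2003: defer Unlock, not Lock. Writing
// `defer c.mu.Lock()` here would re-acquire the mutex on return and
// deadlock the next caller.
func (c *counter) Inc() int {
	c.mu.Lock()
	defer c.mu.Unlock()
	c.n++
	return c.n
}

func main() {
	c := &counter{}
	c.Inc()
	fmt.Println(c.Inc()) // 2
}
```

The deferred `Unlock` guarantees the mutex is released on every return path, which is the whole point of deferring it.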
# Deferring before checking for errors (SA5001)
The following lines of code open files or connections and defer `f.Close` before checking the returned error. This will lead to panics in case of an error.
```
k8s.io/kubernetes/cmd/kubeadm/app/cmd/reset_test.go:159:4: should check returned error before deferring f.Close() (SA5001)
k8s.io/kubernetes/pkg/client/unversioned/clientcmd/validation.go:187:3: should check returned error before deferring clientCertCA.Close() (SA5001)
k8s.io/kubernetes/pkg/client/unversioned/clientcmd/validation.go:225:4: should check returned error before deferring clientCertFile.Close() (SA5001)
k8s.io/kubernetes/pkg/client/unversioned/clientcmd/validation.go:232:4: should check returned error before deferring clientKeyFile.Close() (SA5001)
k8s.io/kubernetes/pkg/kubectl/cmd/util/helpers.go:453:2: should check returned error before deferring f.Close() (SA5001)
k8s.io/kubernetes/test/images/netexec/netexec.go:360:2: should check returned error before deferring serverConn.Close() (SA5001)
```
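A minimal sketch of the corrected ordering (the `readFile` helper is hypothetical, not code from the files listed above):

```go
package main

import (
	"fmt"
	"io"
	"os"
)

// readFile checks the error from os.Open *before* deferring f.Close().
// If the defer ran first and Open failed, the deferred call would
// operate on a nil *os.File.
func readFile(path string) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()
	b, err := io.ReadAll(f)
	if err != nil {
		return "", err
	}
	return string(b), nil
}

func main() {
	_, err := readFile("/no/such/file")
	fmt.Println(err != nil) // true: the error path returns cleanly, nothing is deferred on a nil handle
}
```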
# Ineffective field assignments (SA4005)
In the following instances, values are being assigned to struct fields in methods with value receivers, without ever reading the fields again. This either suggests unnecessary assignments, or receivers that should be pointers instead.
```
k8s.io/kubernetes/pkg/cloudprovider/providers/mesos/client_test.go:159:5: ineffective assignment to field callback (SA4005)
k8s.io/kubernetes/pkg/kubectl/service_basic.go:103:4: ineffective assignment to field Name (SA4005)
k8s.io/kubernetes/pkg/kubectl/service_basic.go:104:4: ineffective assignment to field TCP (SA4005)
k8s.io/kubernetes/pkg/kubectl/service_basic.go:105:4: ineffective assignment to field ClusterIP (SA4005)
k8s.io/kubernetes/pkg/util/term/term_writer.go:120:5: ineffective assignment to field written (SA4005)
k8s.io/kubernetes/pkg/util/term/term_writer.go:121:5: ineffective assignment to field currentWidth (SA4005)
```
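The receiver issue can be reproduced with a toy struct (`widget` is a hypothetical stand-in for the types in the report):

```go
package main

import "fmt"

type widget struct{ name string }

// renameByValue has a value receiver: the assignment mutates a copy,
// which is lost on return -- exactly the "ineffective assignment to
// field" that SA4005 flags.
func (w widget) renameByValue(n string) { w.name = n }

// rename uses a pointer receiver, so the caller sees the change.
func (w *widget) rename(n string) { w.name = n }

func main() {
	w := widget{name: "old"}
	w.renameByValue("new")
	fmt.Println(w.name) // old -- the write was lost
	w.rename("new")
	fmt.Println(w.name) // new
}
```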
# Calling T.FailNow (and related) in goroutines (SA2002)
Per the testing.T documentation, T.FailNow, and by extension Fatal, Fatalf and so on, must be called on the same goroutine as the test, not in other goroutines. That invariant is violated in the following places:
```
k8s.io/kubernetes/pkg/apiserver/watch_test.go:674:2: the goroutine calls T.Fatal, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/apiserver/watch_test.go:714:2: the goroutine calls T.Fatal, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/apiserver/watch_test.go:763:2: the goroutine calls T.Fatal, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/registry/generic/registry/store_test.go:1079:4: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/registry/generic/registry/store_test.go:237:2: the goroutine calls T.Fatal, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/httpstream/spdy/connection_test.go:120:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/httpstream/spdy/connection_test.go:125:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/wait/wait_test.go:352:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/wait/wait_test.go:388:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/wsstream/conn_test.go:140:2: the goroutine calls T.Fatal, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/wsstream/conn_test.go:157:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/wsstream/conn_test.go:58:2: the goroutine calls T.Fatal, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/util/wsstream/conn_test.go:75:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/watch/versioned/decoder_test.go:44:3: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/pkg/watch/versioned/decoder_test.go:60:3: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/plugin/pkg/scheduler/scheduler_test.go:196:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/test/integration/discoverysummarizer/discoverysummarizer_test.go:57:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/test/integration/discoverysummarizer/discoverysummarizer_test.go:73:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/test/integration/examples/apiserver_test.go:46:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/test/integration/examples/apiserver_test.go:64:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
k8s.io/kubernetes/test/integration/federation/server_test.go:95:2: the goroutine calls T.Fatalf, which must be called in the same goroutine as the test (SA2002)
```
# Printf with dynamic string and no arguments (SA1006)
The following instances use Printf even though they don't need format strings. In some cases, the inputs are user provided, which can lead to incorrect output:
```
k8s.io/kubernetes/cmd/kubeadm/app/util/error.go:89:14: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/cmd/libs/go2idl/go-to-protobuf/protobuf/cmd.go:230:15: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/cmd/libs/go2idl/go-to-protobuf/protobuf/cmd.go:251:15: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/cmd/libs/go2idl/go-to-protobuf/protobuf/cmd.go:262:15: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/examples/https-nginx/make_secret.go:69:13: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/examples/sharing-clusters/make_secret.go:62:13: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/pkg/kubectl/cmd/annotate.go:291:44: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/pkg/kubectl/cmd/edit.go:100:24: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/pkg/kubectl/cmd/util/helpers.go:594:40: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/pkg/kubectl/cmd/util/helpers.go:604:39: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/pkg/kubelet/util/format/pod.go:71:21: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
k8s.io/kubernetes/test/e2e_node/remote/remote.go:338:21: printf-style function with dynamic first argument and no further arguments should use print-style function instead (SA1006)
```
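The failure mode is easy to demonstrate: if the dynamic string contains a `%`, Printf parses it as a verb. A sketch of the safe forms (the `render` helper is hypothetical):

```go
package main

import "fmt"

// render passes dynamic text as an argument, never as the format
// string. With user input like "100% done", calling fmt.Sprintf(msg)
// directly would parse the '%' as a verb and emit garbage.
func render(msg string) string {
	return fmt.Sprintf("%s", msg) // equivalently: fmt.Sprint(msg)
}

func main() {
	fmt.Println(render("progress: 100% done")) // prints "progress: 100% done"
}
```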
# identical expressions on the left and right side of the '==' operator (SA4000)
The following condition compares `rightDirective` with itself. That's either pointless, or a possible copy & paste error:
```
k8s.io/kubernetes/pkg/util/strategicpatch/patch.go:1213:56: identical expressions on the left and right side of the '==' operator (SA4000)
```
# Errors that are assigned but never checked or returned (SA4006)
These values of `err` are never evaluated:
```
k8s.io/kubernetes/pkg/api/validation/validation.go:1206:2: this value of allErrs is never used (SA4006)
k8s.io/kubernetes/pkg/controller/daemon/daemoncontroller.go:292:3: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/controller/replicaset/replica_set.go:274:3: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/conversion/converter_test.go:571:2: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/kubectl/cmd/scale.go:130:2: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/kubelet/dockertools/exec.go:78:3: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/kubelet/kubelet.go:1554:5: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/persistentvolumeclaim/etcd/etcd_test.go:155:2: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/volume/host_path/host_path_test.go:151:2: this value of err is never used (SA4006)
k8s.io/kubernetes/pkg/volume/volume.go:321:4: this value of err is never used (SA4006)
```
# Loops that don't loop (SA4004, SA4008)
The following loops always loop at most once, indicating bugs or unnecessary control flow constructs:
```
k8s.io/kubernetes/federation/pkg/kubefed/init/init_test.go:866:4: the surrounding loop is unconditionally terminated (SA4004)
k8s.io/kubernetes/pkg/cloudprovider/providers/aws/aws.go:1923:3: the surrounding loop is unconditionally terminated (SA4004)
k8s.io/kubernetes/pkg/volume/rbd/rbd_util.go:390:18: variable in loop condition never changes (SA4008)
```
# Arguments being overwritten before first use (SA4009)
The following arguments are being overwritten before their first use. Judging from context, this is either because of missing pointer dereferences, or because of useless arguments:
```
k8s.io/kubernetes/pkg/api/conversion.go:232:70: argument out is overwritten before first use (SA4009)
k8s.io/kubernetes/pkg/util/jsonpath/parser.go:179:34: argument cur is overwritten before first use (SA4009)
k8s.io/kubernetes/pkg/volume/flexvolume/flexvolume_test.go:308:64: argument tmpDir is overwritten before first use (SA4009)
```
# Variable assignments with no effect (SA4006)
The same error class as the assigned-but-never-read errors above, but indicating left-over useless code rather than missing error checking:
```
k8s.io/kubernetes/cmd/hyperkube/hyperkube.go:147:4: this value of baseCommand is never used (SA4006)
k8s.io/kubernetes/federation/pkg/federation-controller/service/servicecontroller.go:689:3: this value of servicesToUpdate is never used (SA4006)
k8s.io/kubernetes/pkg/apiserver/proxy.go:116:3: this value of httpCode is never used (SA4006)
k8s.io/kubernetes/pkg/apiserver/proxy.go:123:3: this value of httpCode is never used (SA4006)
k8s.io/kubernetes/pkg/apiserver/proxy.go:156:3: this value of httpCode is never used (SA4006)
k8s.io/kubernetes/pkg/credentialprovider/config_test.go:42:2: this value of preferredPaths is never used (SA4006)
k8s.io/kubernetes/pkg/kubectl/stop.go:337:3: this value of timeout is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/pod/etcd/etcd_test.go:392:2: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/pod/etcd/etcd_test.go:422:2: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/pod/etcd/etcd_test.go:466:2: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/pod/etcd/etcd_test.go:533:2: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/pod/etcd/etcd_test.go:593:3: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/pod/etcd/etcd_test.go:621:2: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/service/ipallocator/allocator_test.go:204:2: this value of other is never used (SA4006)
k8s.io/kubernetes/pkg/registry/core/service/portallocator/allocator_test.go:140:2: this value of other is never used (SA4006)
k8s.io/kubernetes/pkg/registry/extensions/deployment/etcd/etcd_test.go:341:3: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/registry/extensions/deployment/etcd/etcd_test.go:372:2: this value of key is never used (SA4006)
k8s.io/kubernetes/pkg/util/framer/framer.go:135:4: this value of data is never used (SA4006)
k8s.io/kubernetes/pkg/util/framer/framer.go:141:3: this value of data is never used (SA4006)
k8s.io/kubernetes/pkg/util/framer/framer.go:158:3: this value of data is never used (SA4006)
k8s.io/kubernetes/pkg/util/jsonpath/parser.go:216:3: this value of r is never used (SA4006)
k8s.io/kubernetes/pkg/volume/azure_dd/attacher.go:261:3: this value of instanceid is never used (SA4006)
k8s.io/kubernetes/plugin/pkg/admission/podnodeselector/admission.go:189:2: this value of labelsMap is never used (SA4006)
k8s.io/kubernetes/test/integration/federation/server_test.go:380:2: this value of found is never used (SA4006)
k8s.io/kubernetes/third_party/forked/golang/json/fields.go:136:2: this value of count is never used (SA4006)
```
# appends whose results are never meaningfully used (SA4010)
All of these slices are never used. Most are lists of things that are discarded, some are values that are subsequently overwritten unconditionally.
```
k8s.io/kubernetes/pkg/controller/disruption/disruption_test.go:394:3: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/credentialprovider/config_test.go:42:2: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/kubelet/container/testing/fake_runtime.go:296:3: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/labels/selector_test.go:252:4: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/proxy/iptables/proxier.go:1218:5: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/util/framer/framer.go:135:4: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/util/framer/framer.go:141:3: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/util/framer/framer.go:158:3: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/volume/quobyte/quobyte.go:440:4: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/volume/quobyte/quobyte.go:443:4: this result of append is never used, except maybe in other appends (SA4010)
k8s.io/kubernetes/pkg/volume/quobyte/quobyte.go:446:4: this result of append is never used, except maybe in other appends (SA4010)
```
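For reference, SA4010 fires when every use of a slice is another `append` to it, so the collected values can never be observed. A hypothetical Go sketch of the pattern (not code from the flagged Kubernetes files):

```go
package main

import "fmt"

// buggyCollect appends to a slice that is never read afterwards; the
// appended values are discarded, which is exactly what SA4010 flags.
func buggyCollect(items []string) int {
	var discarded []string
	for _, it := range items {
		discarded = append(discarded, it) // result of append never meaningfully used
	}
	return len(items)
}

// fixedCollect returns the slice it builds, so the appends matter.
func fixedCollect(items []string) []string {
	out := make([]string, 0, len(items))
	for _, it := range items {
		out = append(out, it)
	}
	return out
}

func main() {
	fmt.Println(buggyCollect([]string{"a", "b"}), fixedCollect([]string{"a", "b"}))
}
```

In the real findings the fix is usually either to delete the dead slice entirely or, as in `fixedCollect`, to actually return or consume it.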
Unnamed: 0: 121,495 | id: 10,170,396,014 | type: IssuesEvent | created_at: 2019-08-08 05:07:58
repo: cockroachdb/cockroach | repo_url: https://api.github.com/repos/cockroachdb/cockroach
action: closed
title: teamcity: failed test: TestStoreMetrics
labels: C-test-failure O-robot
The following tests appear to have failed on master (test): TestStoreMetrics
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestStoreMetrics).
[#1428229](https://teamcity.cockroachdb.com/viewLog.html?buildId=1428229):
```
TestStoreMetrics
.../client_metrics_test.go:61 SucceedsSoon: expected intent count to be zero, was -1
I190807 16:21:25.336597 414544 storage/client_metrics_test.go:61 SucceedsSoon: expected intent count to be zero, was -1
I190807 16:21:26.337059 414544 storage/client_metrics_test.go:61 SucceedsSoon: expected intent count to be zero, was -1
W190807 16:21:26.929918 422603 storage/store_rebalancer.go:219 [s2,store-rebalancer] StorePool missing descriptor for local store
I190807 16:21:27.337301 728842 util/stop/stopper.go:542 quiescing; tasks left:
1 [async] intent_resolver_ir_batcher
1 [async] intent_resolver_gc_batcher
I190807 16:21:27.337314 728840 util/stop/stopper.go:542 quiescing; tasks left:
1 [async] intent_resolver_ir_batcher
1 [async] intent_resolver_gc_batcher
I190807 16:21:27.337340 728841 util/stop/stopper.go:542 quiescing; tasks left:
1 [async] intent_resolver_ir_batcher
1 [async] intent_resolver_gc_batcher
I190807 16:21:27.338274 728839 util/stop/stopper.go:542 quiescing; tasks left:
7 rpc heartbeat
I190807 16:21:27.338727 728839 util/stop/stopper.go:542 quiescing; tasks left:
6 rpc heartbeat
I190807 16:21:27.338807 728839 util/stop/stopper.go:542 quiescing; tasks left:
5 rpc heartbeat
W190807 16:21:27.338890 416587 storage/raft_transport.go:620 while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
I190807 16:21:27.338919 728839 util/stop/stopper.go:542 quiescing; tasks left:
4 rpc heartbeat
I190807 16:21:27.338959 728839 util/stop/stopper.go:542 quiescing; tasks left:
3 rpc heartbeat
W190807 16:21:27.339010 416823 storage/raft_transport.go:620 while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
W190807 16:21:27.339081 415386 gossip/gossip.go:1498 [n3] no incoming or outgoing connections
W190807 16:21:27.339124 416085 storage/raft_transport.go:620 while processing outgoing Raft queue to node 1: rpc error: code = Unavailable desc = transport is closing:
W190807 16:21:27.339180 414827 gossip/gossip.go:1498 [n2] no incoming or outgoing connections
W190807 16:21:27.339286 416769 storage/raft_transport.go:620 while processing outgoing Raft queue to node 1: rpc error: code = Canceled desc = grpc: the client connection is closing:
I190807 16:21:27.339298 728839 util/stop/stopper.go:542 quiescing; tasks left:
2 rpc heartbeat
I190807 16:21:27.339507 728839 util/stop/stopper.go:542 quiescing; tasks left:
1 rpc heartbeat
W190807 16:21:27.339346 416029 storage/raft_transport.go:620 while processing outgoing Raft queue to node 2: rpc error: code = Unavailable desc = transport is closing:
W190807 16:21:27.339416 416595 storage/raft_transport.go:620 while processing outgoing Raft queue to node 2: rpc error: code = Unavailable desc = transport is closing:
soon.go:35: condition failed to evaluate within 45s: expected intent count to be zero, was -1
goroutine 414544 [running]:
runtime/debug.Stack(0x3dadd60, 0xc00175a200, 0xc001eaef60)
/usr/local/go/src/runtime/debug/stack.go:24 +0x9d
github.com/cockroachdb/cockroach/pkg/testutils.SucceedsSoon(0x3dadd60, 0xc00175a200, 0xc001eaef60)
/go/src/github.com/cockroachdb/cockroach/pkg/testutils/soon.go:36 +0x6b
github.com/cockroachdb/cockroach/pkg/storage_test.verifyStats(0xc00175a200, 0xc000518380, 0xc001b43c88, 0x3, 0x3)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/client_metrics_test.go:61 +0x1e6
github.com/cockroachdb/cockroach/pkg/storage_test.TestStoreMetrics(0xc00175a200)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/client_metrics_test.go:323 +0xbe5
testing.tRunner(0xc00175a200, 0x3654748)
/usr/local/go/src/testing/testing.go:865 +0xc0
created by testing.(*T).Run
/usr/local/go/src/testing/testing.go:916 +0x35a
```
Please assign, take a look and update the issue accordingly.
|
1.0
|
teamcity: failed test: TestStoreMetrics - The following tests appear to have failed on master (test): TestStoreMetrics
You may want to check [for open issues](https://github.com/cockroachdb/cockroach/issues?q=is%3Aissue+is%3Aopen+TestStoreMetrics).
[#1428229](https://teamcity.cockroachdb.com/viewLog.html?buildId=1428229):
```
TestStoreMetrics
.../client_metrics_test.go:61 SucceedsSoon: expected intent count to be zero, was -1
I190807 16:21:25.336597 414544 storage/client_metrics_test.go:61 SucceedsSoon: expected intent count to be zero, was -1
I190807 16:21:26.337059 414544 storage/client_metrics_test.go:61 SucceedsSoon: expected intent count to be zero, was -1
W190807 16:21:26.929918 422603 storage/store_rebalancer.go:219 [s2,store-rebalancer] StorePool missing descriptor for local store
I190807 16:21:27.337301 728842 util/stop/stopper.go:542 quiescing; tasks left:
1 [async] intent_resolver_ir_batcher
1 [async] intent_resolver_gc_batcher
I190807 16:21:27.337314 728840 util/stop/stopper.go:542 quiescing; tasks left:
1 [async] intent_resolver_ir_batcher
1 [async] intent_resolver_gc_batcher
I190807 16:21:27.337340 728841 util/stop/stopper.go:542 quiescing; tasks left:
1 [async] intent_resolver_ir_batcher
1 [async] intent_resolver_gc_batcher
I190807 16:21:27.338274 728839 util/stop/stopper.go:542 quiescing; tasks left:
7 rpc heartbeat
I190807 16:21:27.338727 728839 util/stop/stopper.go:542 quiescing; tasks left:
6 rpc heartbeat
I190807 16:21:27.338807 728839 util/stop/stopper.go:542 quiescing; tasks left:
5 rpc heartbeat
W190807 16:21:27.338890 416587 storage/raft_transport.go:620 while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
I190807 16:21:27.338919 728839 util/stop/stopper.go:542 quiescing; tasks left:
4 rpc heartbeat
I190807 16:21:27.338959 728839 util/stop/stopper.go:542 quiescing; tasks left:
3 rpc heartbeat
W190807 16:21:27.339010 416823 storage/raft_transport.go:620 while processing outgoing Raft queue to node 3: rpc error: code = Canceled desc = grpc: the client connection is closing:
W190807 16:21:27.339081 415386 gossip/gossip.go:1498 [n3] no incoming or outgoing connections
W190807 16:21:27.339124 416085 storage/raft_transport.go:620 while processing outgoing Raft queue to node 1: rpc error: code = Unavailable desc = transport is closing:
W190807 16:21:27.339180 414827 gossip/gossip.go:1498 [n2] no incoming or outgoing connections
W190807 16:21:27.339286 416769 storage/raft_transport.go:620 while processing outgoing Raft queue to node 1: rpc error: code = Canceled desc = grpc: the client connection is closing:
I190807 16:21:27.339298 728839 util/stop/stopper.go:542 quiescing; tasks left:
2 rpc heartbeat
I190807 16:21:27.339507 728839 util/stop/stopper.go:542 quiescing; tasks left:
1 rpc heartbeat
W190807 16:21:27.339346 416029 storage/raft_transport.go:620 while processing outgoing Raft queue to node 2: rpc error: code = Unavailable desc = transport is closing:
W190807 16:21:27.339416 416595 storage/raft_transport.go:620 while processing outgoing Raft queue to node 2: rpc error: code = Unavailable desc = transport is closing:
soon.go:35: condition failed to evaluate within 45s: expected intent count to be zero, was -1
goroutine 414544 [running]:
runtime/debug.Stack(0x3dadd60, 0xc00175a200, 0xc001eaef60)
/usr/local/go/src/runtime/debug/stack.go:24 +0x9d
github.com/cockroachdb/cockroach/pkg/testutils.SucceedsSoon(0x3dadd60, 0xc00175a200, 0xc001eaef60)
/go/src/github.com/cockroachdb/cockroach/pkg/testutils/soon.go:36 +0x6b
github.com/cockroachdb/cockroach/pkg/storage_test.verifyStats(0xc00175a200, 0xc000518380, 0xc001b43c88, 0x3, 0x3)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/client_metrics_test.go:61 +0x1e6
github.com/cockroachdb/cockroach/pkg/storage_test.TestStoreMetrics(0xc00175a200)
/go/src/github.com/cockroachdb/cockroach/pkg/storage/client_metrics_test.go:323 +0xbe5
testing.tRunner(0xc00175a200, 0x3654748)
/usr/local/go/src/testing/testing.go:865 +0xc0
created by testing.(*T).Run
/usr/local/go/src/testing/testing.go:916 +0x35a
```
Please assign, take a look and update the issue accordingly.
|
test
|
teamcity failed test teststoremetrics the following tests appear to have failed on master test teststoremetrics you may want to check teststoremetrics client metrics test go succeedssoon expected intent count to be zero was storage client metrics test go succeedssoon expected intent count to be zero was storage client metrics test go succeedssoon expected intent count to be zero was storage store rebalancer go storepool missing descriptor for local store util stop stopper go quiescing tasks left intent resolver ir batcher intent resolver gc batcher util stop stopper go quiescing tasks left intent resolver ir batcher intent resolver gc batcher util stop stopper go quiescing tasks left intent resolver ir batcher intent resolver gc batcher util stop stopper go quiescing tasks left rpc heartbeat util stop stopper go quiescing tasks left rpc heartbeat util stop stopper go quiescing tasks left rpc heartbeat storage raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing util stop stopper go quiescing tasks left rpc heartbeat util stop stopper go quiescing tasks left rpc heartbeat storage raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing gossip gossip go no incoming or outgoing connections storage raft transport go while processing outgoing raft queue to node rpc error code unavailable desc transport is closing gossip gossip go no incoming or outgoing connections storage raft transport go while processing outgoing raft queue to node rpc error code canceled desc grpc the client connection is closing util stop stopper go quiescing tasks left rpc heartbeat util stop stopper go quiescing tasks left rpc heartbeat storage raft transport go while processing outgoing raft queue to node rpc error code unavailable desc transport is closing storage raft transport go while processing outgoing raft queue to node rpc error code 
unavailable desc transport is closing soon go condition failed to evaluate within expected intent count to be zero was goroutine runtime debug stack usr local go src runtime debug stack go github com cockroachdb cockroach pkg testutils succeedssoon go src github com cockroachdb cockroach pkg testutils soon go github com cockroachdb cockroach pkg storage test verifystats go src github com cockroachdb cockroach pkg storage client metrics test go github com cockroachdb cockroach pkg storage test teststoremetrics go src github com cockroachdb cockroach pkg storage client metrics test go testing trunner usr local go src testing testing go created by testing t run usr local go src testing testing go please assign take a look and update the issue accordingly
| 1
|
703,572
| 24,166,352,709
|
IssuesEvent
|
2022-09-22 15:19:23
|
webcompat/web-bugs
|
https://api.github.com/repos/webcompat/web-bugs
|
closed
|
www.bankrate.com - Missaligned buttons overlapping page elemtents
|
browser-firefox priority-normal severity-minor engine-gecko
|
<!-- @browser: Firefox 91.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0 -->
<!-- @reported_with: unknown -->
**URL**: https://www.bankrate.com/retirement/calculators/roth-ira-plan-calculator/
**Browser / Version**: Firefox Nightly 95.0a1 (2021-10-08) (64-bit)
**Operating System**: Windows 10
**Tested Another Browser**: Yes Chrome
**Problem type**: Design is broken
**Description**: Items are overlapped
**Steps to Reproduce**:
Buttons "Calculate" and "View report" overlap the sentence "At retirement..."
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
1.0
|
www.bankrate.com - Missaligned buttons overlapping page elemtents - <!-- @browser: Firefox 91.0 -->
<!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; rv:91.0) Gecko/20100101 Firefox/91.0 -->
<!-- @reported_with: unknown -->
**URL**: https://www.bankrate.com/retirement/calculators/roth-ira-plan-calculator/
**Browser / Version**: Firefox Nightly 95.0a1 (2021-10-08) (64-bit)
**Operating System**: Windows 10
**Tested Another Browser**: Yes Chrome
**Problem type**: Design is broken
**Description**: Items are overlapped
**Steps to Reproduce**:
Buttons "Calculate" and "View report" overlap the sentence "At retirement..."
<details>
<summary>Browser Configuration</summary>
<ul>
<li>None</li>
</ul>
</details>
_From [webcompat.com](https://webcompat.com/) with ❤️_
|
non_test
|
missaligned buttons overlapping page elemtents url browser version firefox nightly bit operating system windows tested another browser yes chrome problem type design is broken description items are overlapped steps to reproduce buttons calculate and view report overlap the sentence at retirement browser configuration none from with ❤️
| 0
|
185
| 2,573,331,948
|
IssuesEvent
|
2015-02-11 08:56:03
|
molgenis/molgenis
|
https://api.github.com/repos/molgenis/molgenis
|
opened
|
Inactive user can still request a new password (which doesn't work)
|
enhancement security
|
To reproduce:
Set a user (for example admin) to active = 0.
Request a new password at login.
A new password is sent to to the admin email address.
Password will not be valid.
A message telling the user the account is inactive should be shown.
|
True
|
Inactive user can still request a new password (which doesn't work) - To reproduce:
Set a user (for example admin) to active = 0.
Request a new password at login.
A new password is sent to to the admin email address.
Password will not be valid.
A message telling the user the account is inactive should be shown.
|
non_test
|
inactive user can still request a new password which doesn t work to reproduce set a user for example admin to active request a new password at login a new password is sent to to the admin email address password will not be valid a message telling the user the account is inactive should be shown
| 0
|
137,591
| 11,145,975,180
|
IssuesEvent
|
2019-12-23 08:23:50
|
hazelcast/hazelcast
|
https://api.github.com/repos/hazelcast/hazelcast
|
closed
|
Test failure ClientRegressionWithRealNetworkTest.testConnectionCountAfterClientReconnect_memberHostname_clientHostname
|
Source: Internal Team: Client Type: Test-Failure
|
http://jenkins.hazelcast.com/job/Hazelcast-pr-builder/5127/testReport/junit/com.hazelcast.client/ClientRegressionWithRealNetworkTest/testConnectionCountAfterClientReconnect_memberHostname_clientHostname/
```
Error Message
expected:<1> but was:<0>
Stacktrace
java.lang.AssertionError: expected:<1> but was:<0>
at com.hazelcast.client.ClientRegressionWithRealNetworkTest.testConnectionCountAfterClientReconnect(ClientRegressionWithRealNetworkTest.java:164)
at com.hazelcast.client.ClientRegressionWithRealNetworkTest.testConnectionCountAfterClientReconnect_memberHostname_clientHostname(ClientRegressionWithRealNetworkTest.java:117)
```
|
1.0
|
Test failure ClientRegressionWithRealNetworkTest.testConnectionCountAfterClientReconnect_memberHostname_clientHostname - http://jenkins.hazelcast.com/job/Hazelcast-pr-builder/5127/testReport/junit/com.hazelcast.client/ClientRegressionWithRealNetworkTest/testConnectionCountAfterClientReconnect_memberHostname_clientHostname/
```
Error Message
expected:<1> but was:<0>
Stacktrace
java.lang.AssertionError: expected:<1> but was:<0>
at com.hazelcast.client.ClientRegressionWithRealNetworkTest.testConnectionCountAfterClientReconnect(ClientRegressionWithRealNetworkTest.java:164)
at com.hazelcast.client.ClientRegressionWithRealNetworkTest.testConnectionCountAfterClientReconnect_memberHostname_clientHostname(ClientRegressionWithRealNetworkTest.java:117)
```
|
test
|
test failure clientregressionwithrealnetworktest testconnectioncountafterclientreconnect memberhostname clienthostname error message expected but was stacktrace java lang assertionerror expected but was at com hazelcast client clientregressionwithrealnetworktest testconnectioncountafterclientreconnect clientregressionwithrealnetworktest java at com hazelcast client clientregressionwithrealnetworktest testconnectioncountafterclientreconnect memberhostname clienthostname clientregressionwithrealnetworktest java
| 1
|
143,594
| 11,570,268,295
|
IssuesEvent
|
2020-02-20 19:11:32
|
cdnjs/cdnjs
|
https://api.github.com/repos/cdnjs/cdnjs
|
closed
|
[Test] Make sure libs are under ajax/libs
|
:bulb: Help wanted :sunglasses: Nice to Have :traffic_light: Test
|
Sometimes contributor put the libraries at the wrong place - `ajax/lib`, then the currently test process will totally miss that library and just let the test pass, I didn't come up with a great solution yet, but maybe we can just make sure there is no other files than `libs` under `ajax`.
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/40208613-test-make-sure-libs-are-under-ajax-libs?utm_campaign=plugin&utm_content=tracker%2F32893&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F32893&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
1.0
|
[Test] Make sure libs are under ajax/libs - Sometimes contributor put the libraries at the wrong place - `ajax/lib`, then the currently test process will totally miss that library and just let the test pass, I didn't come up with a great solution yet, but maybe we can just make sure there is no other files than `libs` under `ajax`.
<bountysource-plugin>
---
Want to back this issue? **[Post a bounty on it!](https://www.bountysource.com/issues/40208613-test-make-sure-libs-are-under-ajax-libs?utm_campaign=plugin&utm_content=tracker%2F32893&utm_medium=issues&utm_source=github)** We accept bounties via [Bountysource](https://www.bountysource.com/?utm_campaign=plugin&utm_content=tracker%2F32893&utm_medium=issues&utm_source=github).
</bountysource-plugin>
|
test
|
make sure libs are under ajax libs sometimes contributor put the libraries at the wrong place ajax lib then the currently test process will totally miss that library and just let the test pass i didn t come up with a great solution yet but maybe we can just make sure there is no other files than libs under ajax want to back this issue we accept bounties via
| 1
|
39,551
| 5,102,008,430
|
IssuesEvent
|
2017-01-04 17:00:43
|
fsr-itse/1327
|
https://api.github.com/repos/fsr-itse/1327
|
opened
|
show last level of navigation in breadcrumbs
|
[P] nice to have [T] design
|
Currently the last level is not shown in the breadcrumbs. It should be shown if the breadcrumbs itself are shown (so on a page on the main level breadcrumbs should still be shown)
|
1.0
|
show last level of navigation in breadcrumbs - Currently the last level is not shown in the breadcrumbs. It should be shown if the breadcrumbs itself are shown (so on a page on the main level breadcrumbs should still be shown)
|
non_test
|
show last level of navigation in breadcrumbs currently the last level is not shown in the breadcrumbs it should be shown if the breadcrumbs itself are shown so on a page on the main level breadcrumbs should still be shown
| 0
|
61,410
| 14,986,142,635
|
IssuesEvent
|
2021-01-28 20:51:30
|
codereport/jsource
|
https://api.github.com/repos/codereport/jsource
|
closed
|
More folder restructuring
|
HIGH PRIORITY ci / build good first issue
|
Add the new folders:
* `words`
* `debugging` (:star: note **debugging** not **debug**)
* `format`
* `parsing`
* `representations`
* `xenos`
With the following files:
```
./wc.c:4:/* Words: Control Words */
./ws.c:4:/* Words: Spelling */
./w.c:4:/* Words: Word Formation */
./wn.c:4:/* Words: Numeric Input Conversion */
./dss.c:4:/* Debug: Single Step */
./dsusp.c:4:/* Debug: Suspension */
./dstop.c:4:/* Debug: Stops */
./d.c:4:/* Debug: Error Signalling and Display */
./dc.c:4:/* Debug: Function Call Information */
./d.h:4:/* Debug */
./f.c:4:/* Format: ": Monad */
./fbu.c:4:/* Format: ": Monad boxed unicode */
./f2.c:4:/* Format: ": Dyad */
./p.c:4:/* Parsing; see APL Dictionary, pp. 12-13 & 38. */
./p.h:4:/* Parsing: Macros and Defined Constants */
./pv.c:4:/* Parsing: Tacit Verb Translator (13 : ) */
// maybe px.c ???
./r.c:4:/* Representations: Atomic, Boxed, and 5!:0 */
./rt.c:4:/* Representations: Tree */
./rl.c:4:/* Representations: Linear and Paren */
./xo.c:4:/* Xenos: File Open/Close */
./xl.c:4:/* Xenos: File Lock/Unlock */
./xcrc.c:4:/* Xenos: CRC calculation and base64 encode/decode */
./xc.c:4:/* Xenos: Custom */
./xi.c:4:/* Xenos: Implementation Internals */
./xc.h:4:/* Xenos: Custom */
./x15.c:4:/* Xenos: DLL call driver */
./xaes.c:4:/* Xenos: AES calculation */
./xh.c:4:/* Xenos: Host Command Facilities */
./xb.c:4:/* Xenos: Binary Representation */
./xf.c:4:/* Xenos: Files */
./xd.c:4:/* Xenos: file directory, attributes, & permission */
./x.h:4:/* Xenos: Macros and Defined Constants for !: */
./xfmt.c:4:/* Xenos: 8!:x formatting stuff */
./xu.c:4:/* Xenos: u: conversions */
./xa.c:4:/* Xenos: Miscellaneous */
./xt.c:4:/* Xenos: time and space */
./xs.c:4:/* Xenos: Scripts */
./xsha.c:4:/* Xenos: SHA calculation */
```
|
1.0
|
More folder restructuring - Add the new folders:
* `words`
* `debugging` (:star: note **debugging** not **debug**)
* `format`
* `parsing`
* `representations`
* `xenos`
With the following files:
```
./wc.c:4:/* Words: Control Words */
./ws.c:4:/* Words: Spelling */
./w.c:4:/* Words: Word Formation */
./wn.c:4:/* Words: Numeric Input Conversion */
./dss.c:4:/* Debug: Single Step */
./dsusp.c:4:/* Debug: Suspension */
./dstop.c:4:/* Debug: Stops */
./d.c:4:/* Debug: Error Signalling and Display */
./dc.c:4:/* Debug: Function Call Information */
./d.h:4:/* Debug */
./f.c:4:/* Format: ": Monad */
./fbu.c:4:/* Format: ": Monad boxed unicode */
./f2.c:4:/* Format: ": Dyad */
./p.c:4:/* Parsing; see APL Dictionary, pp. 12-13 & 38. */
./p.h:4:/* Parsing: Macros and Defined Constants */
./pv.c:4:/* Parsing: Tacit Verb Translator (13 : ) */
// maybe px.c ???
./r.c:4:/* Representations: Atomic, Boxed, and 5!:0 */
./rt.c:4:/* Representations: Tree */
./rl.c:4:/* Representations: Linear and Paren */
./xo.c:4:/* Xenos: File Open/Close */
./xl.c:4:/* Xenos: File Lock/Unlock */
./xcrc.c:4:/* Xenos: CRC calculation and base64 encode/decode */
./xc.c:4:/* Xenos: Custom */
./xi.c:4:/* Xenos: Implementation Internals */
./xc.h:4:/* Xenos: Custom */
./x15.c:4:/* Xenos: DLL call driver */
./xaes.c:4:/* Xenos: AES calculation */
./xh.c:4:/* Xenos: Host Command Facilities */
./xb.c:4:/* Xenos: Binary Representation */
./xf.c:4:/* Xenos: Files */
./xd.c:4:/* Xenos: file directory, attributes, & permission */
./x.h:4:/* Xenos: Macros and Defined Constants for !: */
./xfmt.c:4:/* Xenos: 8!:x formatting stuff */
./xu.c:4:/* Xenos: u: conversions */
./xa.c:4:/* Xenos: Miscellaneous */
./xt.c:4:/* Xenos: time and space */
./xs.c:4:/* Xenos: Scripts */
./xsha.c:4:/* Xenos: SHA calculation */
```
|
non_test
|
more folder restructuring add the new folders words debugging star note debugging not debug format parsing representations xenos with the following files wc c words control words ws c words spelling w c words word formation wn c words numeric input conversion dss c debug single step dsusp c debug suspension dstop c debug stops d c debug error signalling and display dc c debug function call information d h debug f c format monad fbu c format monad boxed unicode c format dyad p c parsing see apl dictionary pp p h parsing macros and defined constants pv c parsing tacit verb translator maybe px c r c representations atomic boxed and rt c representations tree rl c representations linear and paren xo c xenos file open close xl c xenos file lock unlock xcrc c xenos crc calculation and encode decode xc c xenos custom xi c xenos implementation internals xc h xenos custom c xenos dll call driver xaes c xenos aes calculation xh c xenos host command facilities xb c xenos binary representation xf c xenos files xd c xenos file directory attributes permission x h xenos macros and defined constants for xfmt c xenos x formatting stuff xu c xenos u conversions xa c xenos miscellaneous xt c xenos time and space xs c xenos scripts xsha c xenos sha calculation
| 0
|
77,721
| 7,600,914,691
|
IssuesEvent
|
2018-04-28 07:51:20
|
openshift/origin
|
https://api.github.com/repos/openshift/origin
|
closed
|
Operation cannot be fulfilled on namespaces (extended image ecosystem tests failure)
|
kind/test-flake priority/P1
|
"Operation cannot be fulfilled on namespaces "<namespace>": The system is ensuring all content is removed from this namespace. Upon completion, this namespace will automatically be purged by the system.
https://ci.openshift.redhat.com/jenkins/job/test_branch_origin_extended_image_ecosystem/458/
|
1.0
|
Operation cannot be fulfilled on namespaces (extended image ecosystem tests failure) - "Operation cannot be fulfilled on namespaces "<namespace>": The system is ensuring all content is removed from this namespace. Upon completion, this namespace will automatically be purged by the system.
https://ci.openshift.redhat.com/jenkins/job/test_branch_origin_extended_image_ecosystem/458/
|
test
|
operation cannot be fulfilled on namespaces extended image ecosystem tests failure operation cannot be fulfilled on namespaces the system is ensuring all content is removed from this namespace upon completion this namespace will automatically be purged by the system
| 1
|
81,141
| 7,768,458,863
|
IssuesEvent
|
2018-06-03 18:14:42
|
cerberustesting/cerberus-source
|
https://api.github.com/repos/cerberustesting/cerberus-source
|
closed
|
Allways Execute a step at the end of test case execution
|
Perim : ENGINETransversal Perim : GUITest Prio : 1 high+
|
Add possibility to always execute a step at the end of test case execution :
- [x] Add a new check box on a step `Force this step if Testcase is not OK`
- [x] Add possibility to create a `PostTesting` Testcase link to an application. This `PostTesting` Testcase will executed at the end of a TestCase. If `Force this step if Testcase is not OK` is checked, it will be allways executed, else only if all steps are OK.
|
1.0
|
Allways Execute a step at the end of test case execution -
Add possibility to always execute a step at the end of test case execution :
- [x] Add a new check box on a step `Force this step if Testcase is not OK`
- [x] Add possibility to create a `PostTesting` Testcase link to an application. This `PostTesting` Testcase will executed at the end of a TestCase. If `Force this step if Testcase is not OK` is checked, it will be allways executed, else only if all steps are OK.
|
test
|
allways execute a step at the end of test case execution add possibility to always execute a step at the end of test case execution add a new check box on a step force this step if testcase is not ok add possibility to create a posttesting testcase link to an application this posttesting testcase will executed at the end of a testcase if force this step if testcase is not ok is checked it will be allways executed else only if all steps are ok
| 1
|
51,711
| 6,194,013,069
|
IssuesEvent
|
2017-07-05 08:51:29
|
Kademi/kademi-dev
|
https://api.github.com/repos/Kademi/kademi-dev
|
closed
|
leadman theme - Switch to another goal Popup: wrong message
|
Ready to Test - Dev
|
https://github.com/Kademi/kademi-dev/issues/3431

profile to login: http://lantest1.admin.kademi-prod.com/manageUsers/74549
lead: http://hattonfake001.kademi-prod.com/leads/117242/
current, this popup's showing message of "next node" function
|
1.0
|
leadman theme - Switch to another goal Popup: wrong message - https://github.com/Kademi/kademi-dev/issues/3431

profile to login: http://lantest1.admin.kademi-prod.com/manageUsers/74549
lead: http://hattonfake001.kademi-prod.com/leads/117242/
current, this popup's showing message of "next node" function
|
test
|
leadman theme switch to another goal popup wrong message profile to login lead current this popup s showing message of next node function
| 1
|
19,380
| 3,769,105,730
|
IssuesEvent
|
2016-03-16 09:16:58
|
Microsoft/vscode
|
https://api.github.com/repos/Microsoft/vscode
|
closed
|
[Localization] Extensions icon hover tooltip not localized
|
v-test
|
- VSCode Version: 0.10.12-alpha --locale=fr (or any locale)
- OS Version: Windows 10
Steps to Reproduce:
1. Launch a localized VS Code instance
2. Hover over the extensions icon in the bottom left
3. "Extensions" tooltip is not localized

|
1.0
|
[Localization] Extensions icon hover tooltip not localized - - VSCode Version: 0.10.12-alpha --locale=fr (or any locale)
- OS Version: Windows 10
Steps to Reproduce:
1. Launch a localized VS Code instance
2. Hover over the extensions icon in the bottom left
3. "Extensions" tooltip is not localized

|
test
|
extensions icon hover tooltip not localized vscode version alpha locale fr or any locale os version windows steps to reproduce launch a localized vs code instance hover over the extensions icon in the bottom left extensions tooltip is not localized
| 1
|
26,723
| 4,240,986,134
|
IssuesEvent
|
2016-07-06 15:02:57
|
rlf/uSkyBlock
|
https://api.github.com/repos/rlf/uSkyBlock
|
closed
|
Island party bug
|
S duplicate T tested awaiting reporter
|
_Please paste the output from `/usb version` below_
```
�Name: �uSkyBlock�
�Version: �2.6.12�
�Description: �Ultimate SkyBlock v2.6.12-9e2d30-413�
�Language: �cs (cs)�
�------------------------------�
�Server: �CraftBukkit git-Spigot-c3e4052-1953f52 (MC: 1.10)�
�------------------------------�
��Vault �1.5.6-b49 �(�ENABLED�)�
��WorldEdit �6.1.3;7a097ca �(�ENABLED�)�
��WorldGuard �6.1.2;e38d98d �(�ENABLED�)�
��AsyncWorldEdit �2.2.2 �(�ENABLED�)�
��Multiverse-Core �2.5-b717 �(�ENABLED�)�
��Multiverse-NetherPortals �2.5-b675 �(�ENABLED�)�
�------------------------------�
```
_What steps will reproduce the problem?_
1. Use command /is party
2. Click to player head
3. Give player permissions for setbiome, lock island... not working.
This isn't working only for minimum players. Here config for island: http://pastebin.com/ggu0fugH
Problem is for player Lawkam, leader can set other player permissions. but for player Lawkam can't.
|
1.0
|
Island party bug - _Please paste the output from `/usb version` below_
```
�Name: �uSkyBlock�
�Version: �2.6.12�
�Description: �Ultimate SkyBlock v2.6.12-9e2d30-413�
�Language: �cs (cs)�
�------------------------------�
�Server: �CraftBukkit git-Spigot-c3e4052-1953f52 (MC: 1.10)�
�------------------------------�
��Vault �1.5.6-b49 �(�ENABLED�)�
��WorldEdit �6.1.3;7a097ca �(�ENABLED�)�
��WorldGuard �6.1.2;e38d98d �(�ENABLED�)�
��AsyncWorldEdit �2.2.2 �(�ENABLED�)�
��Multiverse-Core �2.5-b717 �(�ENABLED�)�
��Multiverse-NetherPortals �2.5-b675 �(�ENABLED�)�
�------------------------------�
```
_What steps will reproduce the problem?_
1. Use command /is party
2. Click to player head
3. Give player permissions for setbiome, lock island... not working.
This isn't working only for minimum players. Here config for island: http://pastebin.com/ggu0fugH
Problem is for player Lawkam, leader can set other player permissions. but for player Lawkam can't.
|
test
|
island party bug please paste the output from usb version below �name �uskyblock� �version � � �description �ultimate skyblock � �language �cs cs � � � �server �craftbukkit git spigot mc � � � ��vault � � �enabled� � ��worldedit � � �enabled� � ��worldguard � � �enabled� � ��asyncworldedit � � �enabled� � ��multiverse core � � �enabled� � ��multiverse netherportals � � �enabled� � � � what steps will reproduce the problem use command is party click to player head give player permissions for setbiome lock island not working this isn t working only for minimum players here config for island problem is for player lawkam leader can set other player permissions but for player lawkam can t
| 1
|
67,537
| 17,009,505,189
|
IssuesEvent
|
2021-07-02 00:37:17
|
spack/spack
|
https://api.github.com/repos/spack/spack
|
closed
|
Installation issue: libflame fails to build on Cray EX with "undefined reference to `main'"
|
build-error
|
### Steps to reproduce the issue
<!-- Fill in the exact spec you are trying to build and the relevant part of the error message -->
```console
$ spack install amdlibflame threads=openmp
[...]
ranlib lib/x86_64-unknown-linux-gnu/libflame.a
/project/d110/timuel/spack/lib/spack/env/gcc/gcc -shared -Wl,-soname,libflame.so.3 -o lib/x86_64-unknown-linux-gnu/libflame.so -Wl,--whole-archive,libflame.a,--no-whole-archive -L/opt/cray/pe/mpich/8.1.4/ofi/gnu/9.1/lib -L/opt/cray/pe/dsmml/0.1.4/dsmml//lib -L/opt/cray/xpmem/2.2.40-7.0.1.0_1.9__g1d7a24d.shasta/lib64 -L/opt/gcc/10.2.0/snos/lib/gcc/x86_64-suse-linux/10.2.0 -L/opt/gcc/10.2.0/snos/lib/gcc/x86_64-suse-linux/10.2.0/../../../../lib64 -L/lib/../lib64 -L/usr/lib/../lib64 -L/opt/gcc/10.2.0/snos/lib/gcc/x86_64-suse-linux/10.2.0/../../.. -lblis-mt -lmpifort_gnu_91 -lmpi_gnu_91 -lxpmem -ldsmml -lgfortran -lquadmath -lpthread -lm -fopenmp
/usr/bin/ld: /usr/lib/../lib64/crt1.o: in function `_start':
/home/abuild/rpmbuild/BUILD/glibc-2.26/csu/../sysdeps/x86_64/start.S:110: undefined reference to `main'
collect2: error: ld returned 1 exit status
make: *** [Makefile:593: lib/x86_64-unknown-linux-gnu/libflame.so] Error 1
```
### Information on your system
```
* **Spack:** 0.16.2-3059-e321578bbe
* **Python:** 3.6.12
* **Platform:** cray-sles15-zen2
* **Concretizer:** original
```
```yaml
packages:
...
cray-mpich:
externals:
- spec: cray-mpich@8.1.4%aocc
prefix: /opt/cray/pe/mpich/8.1.4/ofi/aocc/2.2
- spec: cray-mpich@8.1.4%gcc
prefix: /opt/cray/pe/mpich/8.1.4/ofi/gnu/9.1
buildable: False
...
```
```yaml
compilers:
- compiler:
spec: gcc@10.2.0
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-gnu
- gcc/10.2.0
environment: {}
extra_rpaths: []
```
### Additional information
<!-- Please upload the following files. They should be present in the stage directory of the failing build. Also upload any config.log or similar file if one exists. -->
* [spack-build-out.txt](https://github.com/spack/spack/files/6619683/spack-build-out.txt)
* [spack-build-env.txt](https://github.com/spack/spack/files/6619686/spack-build-env.txt)
@amd-toolchain-support ... my guess is that the Cray compiler wrappers interfere and switch to executable output mode with the `-o ...` occurring after the `-shared`, but that's really just a guess.
### General information
<!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue. -->
- [x] I have run `spack debug report` and reported the version of Spack/Python/Platform
- [x] I have run `spack maintainers <name-of-the-package>` and @mentioned any maintainers
- [x] I have uploaded the build log and environment files
- [x] I have searched the issues of this repo and believe this is not a duplicate
|
1.0
|
Installation issue: libflame fails to build on Cray EX with "undefined reference to `main'" - ### Steps to reproduce the issue
<!-- Fill in the exact spec you are trying to build and the relevant part of the error message -->
```console
$ spack install amdlibflame threads=openmp
[...]
ranlib lib/x86_64-unknown-linux-gnu/libflame.a
/project/d110/timuel/spack/lib/spack/env/gcc/gcc -shared -Wl,-soname,libflame.so.3 -o lib/x86_64-unknown-linux-gnu/libflame.so -Wl,--whole-archive,libflame.a,--no-whole-archive -L/opt/cray/pe/mpich/8.1.4/ofi/gnu/9.1/lib -L/opt/cray/pe/dsmml/0.1.4/dsmml//lib -L/opt/cray/xpmem/2.2.40-7.0.1.0_1.9__g1d7a24d.shasta/lib64 -L/opt/gcc/10.2.0/snos/lib/gcc/x86_64-suse-linux/10.2.0 -L/opt/gcc/10.2.0/snos/lib/gcc/x86_64-suse-linux/10.2.0/../../../../lib64 -L/lib/../lib64 -L/usr/lib/../lib64 -L/opt/gcc/10.2.0/snos/lib/gcc/x86_64-suse-linux/10.2.0/../../.. -lblis-mt -lmpifort_gnu_91 -lmpi_gnu_91 -lxpmem -ldsmml -lgfortran -lquadmath -lpthread -lm -fopenmp
/usr/bin/ld: /usr/lib/../lib64/crt1.o: in function `_start':
/home/abuild/rpmbuild/BUILD/glibc-2.26/csu/../sysdeps/x86_64/start.S:110: undefined reference to `main'
collect2: error: ld returned 1 exit status
make: *** [Makefile:593: lib/x86_64-unknown-linux-gnu/libflame.so] Error 1
```
### Information on your system
```
* **Spack:** 0.16.2-3059-e321578bbe
* **Python:** 3.6.12
* **Platform:** cray-sles15-zen2
* **Concretizer:** original
```
```yaml
packages:
...
cray-mpich:
externals:
- spec: cray-mpich@8.1.4%aocc
prefix: /opt/cray/pe/mpich/8.1.4/ofi/aocc/2.2
- spec: cray-mpich@8.1.4%gcc
prefix: /opt/cray/pe/mpich/8.1.4/ofi/gnu/9.1
buildable: False
...
```
```yaml
compilers:
- compiler:
spec: gcc@10.2.0
paths:
cc: cc
cxx: CC
f77: ftn
fc: ftn
flags: {}
operating_system: sles15
target: any
modules:
- PrgEnv-gnu
- gcc/10.2.0
environment: {}
extra_rpaths: []
```
### Additional information
<!-- Please upload the following files. They should be present in the stage directory of the failing build. Also upload any config.log or similar file if one exists. -->
* [spack-build-out.txt](https://github.com/spack/spack/files/6619683/spack-build-out.txt)
* [spack-build-env.txt](https://github.com/spack/spack/files/6619686/spack-build-env.txt)
@amd-toolchain-support ... my guess is that the Cray compiler wrappers interfere and switch to executable output mode with the `-o ...` occurring after the `-shared`, but that's really just a guess.
### General information
<!-- These boxes can be checked by replacing [ ] with [x] or by clicking them after submitting the issue. -->
- [x] I have run `spack debug report` and reported the version of Spack/Python/Platform
- [x] I have run `spack maintainers <name-of-the-package>` and @mentioned any maintainers
- [x] I have uploaded the build log and environment files
- [x] I have searched the issues of this repo and believe this is not a duplicate
|
non_test
|
installation issue libflame fails to build on cray ex with undefined reference to main steps to reproduce the issue console spack install amdlibflame threads openmp ranlib lib unknown linux gnu libflame a project timuel spack lib spack env gcc gcc shared wl soname libflame so o lib unknown linux gnu libflame so wl whole archive libflame a no whole archive l opt cray pe mpich ofi gnu lib l opt cray pe dsmml dsmml lib l opt cray xpmem shasta l opt gcc snos lib gcc suse linux l opt gcc snos lib gcc suse linux l lib l usr lib l opt gcc snos lib gcc suse linux lblis mt lmpifort gnu lmpi gnu lxpmem ldsmml lgfortran lquadmath lpthread lm fopenmp usr bin ld usr lib o in function start home abuild rpmbuild build glibc csu sysdeps start s undefined reference to main error ld returned exit status make error information on your system spack python platform cray concretizer original yaml packages cray mpich externals spec cray mpich aocc prefix opt cray pe mpich ofi aocc spec cray mpich gcc prefix opt cray pe mpich ofi gnu buildable false yaml compilers compiler spec gcc paths cc cc cxx cc ftn fc ftn flags operating system target any modules prgenv gnu gcc environment extra rpaths additional information amd toolchain support my guess is that the cray compiler wrappers interfere and switch to executable output mode with the o occurring after the shared but that s really just a guess general information i have run spack debug report and reported the version of spack python platform i have run spack maintainers and mentioned any maintainers i have uploaded the build log and environment files i have searched the issues of this repo and believe this is not a duplicate
| 0
|
54,095
| 6,363,621,487
|
IssuesEvent
|
2017-07-31 17:51:06
|
phetsims/gene-expression-essentials
|
https://api.github.com/repos/phetsims/gene-expression-essentials
|
closed
|
Sim won't load in IE, math.log10 not supported
|
status:fixed-pending-testing type:bug
|
The load bar gets about 80% full then stops when trying to open on Internet Explorer. Looking into the console shows math.log10 is not supported.
related to phetsims/tasks#776.
Sim would not load so not all trouble shooting information could be gathered
Name: Gene Expression Essentials
URL: http://www.colorado.edu/physics/phet/dev/html/gene-expression-essentials/1.0.0-dev.2/gene-expression-essentials_en.html
Version: 1.0.0-dev.2 2017-01-30 19:21:44 UTC
Features missing: touch
|
1.0
|
Sim won't load in IE, math.log10 not supported - The load bar gets about 80% full then stops when trying to open on Internet Explorer. Looking into the console shows math.log10 is not supported.
related to phetsims/tasks#776.
Sim would not load so not all trouble shooting information could be gathered
Name: Gene Expression Essentials
URL: http://www.colorado.edu/physics/phet/dev/html/gene-expression-essentials/1.0.0-dev.2/gene-expression-essentials_en.html
Version: 1.0.0-dev.2 2017-01-30 19:21:44 UTC
Features missing: touch
|
test
|
sim won t load in ie math not supported the load bar gets about full then stops when trying to open on internet explorer looking into the console shows math is not supported related to phetsims tasks sim would not load so not all trouble shooting information could be gathered name gene expression essentials url version dev utc features missing touch
| 1
|
147,696
| 13,212,608,643
|
IssuesEvent
|
2020-08-16 08:05:47
|
usc-psychsim/psychsim
|
https://api.github.com/repos/usc-psychsim/psychsim
|
closed
|
thresholdRow() computes >= operation?
|
documentation question
|
From my tests, it seems that the `thresholdRow()` function computes the `>=` (greater than or equal to) operation and not the `>` operation as indicated in the documentation (both the code documentation and the readthedocs: https://psychsim.readthedocs.io/en/latest/modeling.html#hyperplanes).
Just to confirm if this is true or I am doing something wrong.
|
1.0
|
thresholdRow() computes >= operation? - From my tests, it seems that the `thresholdRow()` function computes the `>=` (greater than or equal to) operation and not the `>` operation as indicated in the documentation (both the code documentation and the readthedocs: https://psychsim.readthedocs.io/en/latest/modeling.html#hyperplanes).
Just to confirm if this is true or I am doing something wrong.
|
non_test
|
thresholdrow computes operation from my tests it seems that the thresholdrow function computes the greater than or equal to operation and not the operation as indicated in the documentation both the code documentation and the readthedocs just to confirm if this is true or i am doing something wrong
| 0
|
87,823
| 8,123,281,464
|
IssuesEvent
|
2018-08-16 14:12:45
|
NetsBlox/NetsBlox
|
https://api.github.com/repos/NetsBlox/NetsBlox
|
opened
|
onClose is not a fn error logs when running tests
|
bug minor testing
|
Some of the tests can result in error messages about `onClose` not being a function. This is a result of the `onClose` function being applied by the NetworkTopology and can cause problems when only the RPC tests are run
|
1.0
|
onClose is not a fn error logs when running tests - Some of the tests can result in error messages about `onClose` not being a function. This is a result of the `onClose` function being applied by the NetworkTopology and can cause problems when only the RPC tests are run
|
test
|
onclose is not a fn error logs when running tests some of the tests can result in error messages about onclose not being a function this is a result of the onclose function being applied by the networktopology and can cause problems when only the rpc tests are run
| 1
|
326,430
| 24,084,247,911
|
IssuesEvent
|
2022-09-19 09:28:29
|
yunnsann/fastcampus-project-board
|
https://api.github.com/repos/yunnsann/fastcampus-project-board
|
closed
|
깃헙 프로젝트와 이슈 정리하기
|
documentation
|
깃헙 프로젝트 세팅하고, 카드를 만들어 정리하자.
* [x] 프로젝트 베타 만들기
* [x] 카드 목록 만들기 - 강의 커리큘럼 참고
* [x] 이슈로 적절히 바꾸기
|
1.0
|
깃헙 프로젝트와 이슈 정리하기 - 깃헙 프로젝트 세팅하고, 카드를 만들어 정리하자.
* [x] 프로젝트 베타 만들기
* [x] 카드 목록 만들기 - 강의 커리큘럼 참고
* [x] 이슈로 적절히 바꾸기
|
non_test
|
깃헙 프로젝트와 이슈 정리하기 깃헙 프로젝트 세팅하고 카드를 만들어 정리하자 프로젝트 베타 만들기 카드 목록 만들기 강의 커리큘럼 참고 이슈로 적절히 바꾸기
| 0
|
35,435
| 17,083,039,941
|
IssuesEvent
|
2021-07-08 08:19:13
|
pixiebrix/pixiebrix-extension
|
https://api.github.com/repos/pixiebrix/pixiebrix-extension
|
opened
|
Should injection be avoided on origins where they're not required?
|
performance question
|
Currently `webext-dynamic-content-scripts` indiscriminately injects the content scripts on every extra origin we have permission to. This generally matches the intent, but also means that the content script will be injected on some pages that either no longer have any bricks running or that are just side-permissions for API access, for example.
Unknowns:
- how much of a problem this is in practice (other than forgotten/unused permissions)
- whether parts of the extension require/benefit from the content script loading in certain tabs regardless of bricks being run
Related:
- #588
- #579
|
True
|
Should injection be avoided on origins where they're not required? - Currently `webext-dynamic-content-scripts` indiscriminately injects the content scripts on every extra origin we have permission to. This generally matches the intent, but also means that the content script will be injected on some pages that either no longer have any bricks running or that are just side-permissions for API access, for example.
Unknowns:
- how much of a problem this is in practice (other than forgotten/unused permissions)
- whether parts of the extension require/benefit from the content script loading in certain tabs regardless of bricks being run
Related:
- #588
- #579
|
non_test
|
should injection be avoided on origins where they re not required currently webext dynamic content scripts indiscriminately injects the content scripts on every extra origin we have permission to this generally matches the intent but also means that the content script will be injected on some pages that either no longer have any bricks running or that are just side permissions for api access for example unknowns how much of a problem this is in practice other than forgotten unused permissions whether parts of the extension require benefit from the content script loading in certain tabs regardless of bricks being run related
| 0
|
162,018
| 12,604,037,178
|
IssuesEvent
|
2020-06-11 14:23:16
|
cockroachdb/cockroach
|
https://api.github.com/repos/cockroachdb/cockroach
|
opened
|
kvserver: TestClosedTimestampCanServeWithConflictingIntent fails in testrace
|
C-test-failure
|
This test is failing nearly every Publish Bleeding Edge run in testrace:
For example, https://teamcity.cockroachdb.com/viewLog.html?buildId=2005819&buildTypeId=Cockroach_MergeToMaster
Looks like it times out.
|
1.0
|
kvserver: TestClosedTimestampCanServeWithConflictingIntent fails in testrace - This test is failing nearly every Publish Bleeding Edge run in testrace:
For example, https://teamcity.cockroachdb.com/viewLog.html?buildId=2005819&buildTypeId=Cockroach_MergeToMaster
Looks like it times out.
|
test
|
kvserver testclosedtimestampcanservewithconflictingintent fails in testrace this test is failing nearly every publish bleeding edge run in testrace for example looks like it times out
| 1
|
105,256
| 9,048,938,749
|
IssuesEvent
|
2019-02-12 02:14:19
|
Microsoft/vscode-python
|
https://api.github.com/repos/Microsoft/vscode-python
|
closed
|
Add "Configure Unit Tests" Command
|
feature-testing needs PR type-enhancement
|
(task from #1983)
The "Configure Unit Tests" command will allow user to force configuration of unit tests if for whatever reason it can't be done automatically. A test will need to be added.
|
1.0
|
Add "Configure Unit Tests" Command - (task from #1983)
The "Configure Unit Tests" command will allow user to force configuration of unit tests if for whatever reason it can't be done automatically. A test will need to be added.
|
test
|
add configure unit tests command task from the configure unit tests command will allow user to force configuration of unit tests if for whatever reason it can t be done automatically a test will need to be added
| 1
|
248,758
| 21,055,630,719
|
IssuesEvent
|
2022-04-01 02:44:39
|
ventoy/Ventoy
|
https://api.github.com/repos/ventoy/Ventoy
|
closed
|
[Success Image Report]: Xubuntu 20.04.4 on a IA-32 UEFI device
|
【Tested Image Report】
|
### Official Website List
- [X] I have checked the list in official website and the image file is not listed there.
### Ventoy Version
1.0.72
### BIOS Mode
UEFI Mode
### Partition Style
MBR
### Image file name
xubuntu-20.04.4-desktop-amd64.iso
### Image file checksum type
SHA256
### Image file checksum value
3265c2a2d8393c92452cff2cc0606c3d6eb326f11b0b435e47b841ffd73b37a5
### Image file download link (if applicable)
http://ubuntutym2.u-toyama.ac.jp/xubuntu/20.04/release/xubuntu-20.04.4-desktop-amd64.iso
### Test envrionment
Genuine GenPad E10T3W
### More Details?
This image file booted successfully on a IA-32 UEFI device.
|
1.0
|
[Success Image Report]: Xubuntu 20.04.4 on a IA-32 UEFI device - ### Official Website List
- [X] I have checked the list in official website and the image file is not listed there.
### Ventoy Version
1.0.72
### BIOS Mode
UEFI Mode
### Partition Style
MBR
### Image file name
xubuntu-20.04.4-desktop-amd64.iso
### Image file checksum type
SHA256
### Image file checksum value
3265c2a2d8393c92452cff2cc0606c3d6eb326f11b0b435e47b841ffd73b37a5
### Image file download link (if applicable)
http://ubuntutym2.u-toyama.ac.jp/xubuntu/20.04/release/xubuntu-20.04.4-desktop-amd64.iso
### Test envrionment
Genuine GenPad E10T3W
### More Details?
This image file booted successfully on a IA-32 UEFI device.
|
test
|
xubuntu on a ia uefi device official website list i have checked the list in official website and the image file is not listed there ventoy version bios mode uefi mode partition style mbr image file name xubuntu desktop iso image file checksum type image file checksum value image file download link if applicable test envrionment genuine genpad more details this image file booted successfully on a ia uefi device
| 1
|
156,508
| 13,649,969,977
|
IssuesEvent
|
2020-09-26 16:52:39
|
DiptoChakrabarty/Resume-Generator
|
https://api.github.com/repos/DiptoChakrabarty/Resume-Generator
|
closed
|
Documentation errors (installation)
|
documentation good first issue hacktoberfest
|
I noticed some discrepancies in the installation part of the README.md file:
- ```git clone https://github.com/DiptoChakrabarty/website``` Although it says "website", the link still works, but "Resume-Generator" might be better in the end.
- There is no directory named ```website``` in the repository. This seems to be a renaming issue, as with point 1.
- Point 4 has a typo, it should be ```rm resume/site.db```.
- In point 5, the following can be excluded from the coding block:
```create file .env inside folder resume```
```Add the following ```
|
1.0
|
Documentation errors (installation) - I noticed some discrepancies in the installation part of the README.md file:
- ```git clone https://github.com/DiptoChakrabarty/website``` Although it says "website", the link still works, but "Resume-Generator" might be better in the end.
- There is no directory named ```website``` in the repository. This seems to be a renaming issue, as with point 1.
- Point 4 has a typo, it should be ```rm resume/site.db```.
- In point 5, the following can be excluded from the coding block:
```create file .env inside folder resume```
```Add the following ```
|
non_test
|
documentation errors installation i noticed some discrepancies in the installation part of the readme md file git clone although it says website the link still works but resume generator might be better in the end there is no directory named website in the repository this seems to be a renaming issue as with point point has a typo it should be rm resume site db in point the following can be excluded from the coding block create file env inside folder resume add the following
| 0
|
118,239
| 4,733,327,118
|
IssuesEvent
|
2016-10-19 10:47:35
|
japanesemediamanager/jmmserver
|
https://api.github.com/repos/japanesemediamanager/jmmserver
|
closed
|
Export & import via API will stop JMM Server from working
|
Bug - Low Priority Enhancement - Core Change - APIv2
|
As the Export use ServerSettings.ToContract() that dont include database configuration those settings will prevent JMM from running after import.
Solutions:
- make ToContract() throw all data that are used in settings.json
|
1.0
|
Export & import via API will stop JMM Server from working - As the Export use ServerSettings.ToContract() that dont include database configuration those settings will prevent JMM from running after import.
Solutions:
- make ToContract() throw all data that are used in settings.json
|
non_test
|
export import via api will stop jmm server from working as the export use serversettings tocontract that dont include database configuration those settings will prevent jmm from running after import solutions make tocontract throw all data that are used in settings json
| 0
|
211,981
| 16,386,369,658
|
IssuesEvent
|
2021-05-17 10:59:49
|
spack/spack
|
https://api.github.com/repos/spack/spack
|
opened
|
Perform smoke tests in temporary directory
|
feature smoke-tests
|
Currently post-installation tests are usually run inside the install prefix and must be cleaned before reusing. I suggest a mechanism for running inside a temporary directory.
### Rationale
Installation tests, especially for libraries, can be messy (e.g. create a `build` dir with many files), and the result of one build might affect another build resulting in a false positive or negative.
Furthermore, it's possible that users of a shared spack installation other than the original installer would want to test, or spack is installed onto a read-only device. Currently the test system assumes the user testing has full write privileges on the installation prefix.
### Description
I think a `self.test_build_dir` or something could easily replace the handrolled smoke test "build" dirs in place.
### General information
- [x] Spack version 0.16.1-2652-39a4f3ba88
- [x] I have searched the issues of this repo and believe this is not a duplicate
|
1.0
|
Perform smoke tests in temporary directory - Currently post-installation tests are usually run inside the install prefix and must be cleaned before reusing. I suggest a mechanism for running inside a temporary directory.
### Rationale
Installation tests, especially for libraries, can be messy (e.g. create a `build` dir with many files), and the result of one build might affect another build resulting in a false positive or negative.
Furthermore, it's possible that users of a shared spack installation other than the original installer would want to test, or spack is installed onto a read-only device. Currently the test system assumes the user testing has full write privileges on the installation prefix.
### Description
I think a `self.test_build_dir` or something could easily replace the handrolled smoke test "build" dirs in place.
### General information
- [x] Spack version 0.16.1-2652-39a4f3ba88
- [x] I have searched the issues of this repo and believe this is not a duplicate
|
test
|
perform smoke tests in temporary directory currently post installation tests are usually run inside the install prefix and must be cleaned before reusing i suggest a mechanism for running inside a temporary directory rationale installation tests especially for libraries can be messy e g create a build dir with many files and the result of one build might affect another build resulting in a false positive or negative furthermore it s possible that users of a shared spack installation other than the original installer would want to test or spack is installed onto a read only device currently the test system assumes the user testing has full write privileges on the installation prefix description i think a self test build dir or something could easily replace the handrolled smoke test build dirs in place general information spack version i have searched the issues of this repo and believe this is not a duplicate
| 1
|
216,434
| 16,659,215,317
|
IssuesEvent
|
2021-06-06 03:39:00
|
crankyoldgit/IRremoteESP8266
|
https://api.github.com/repos/crankyoldgit/IRremoteESP8266
|
closed
|
Frigidaire FHPC102AB1 Compatibility Update for Documentation
|
Documentation
|
Just a FYI. I was able to use the Electra / [Electra Library](https://github.com/crankyoldgit/IRremoteESP8266/blob/master/src/ir_Electra.h) and the Web Server with a Frigidaire FGPC102AB1. Specifically [this one](https://www.amazon.com/dp/B07ZDTBC9Y?psc=1&ref=ppx_yo2_dt_b_product_details) from Amazon. Works Great! Sorry if this isn't where this goes. Just wanted to help keep the compatibility list growing
|
1.0
|
Frigidaire FHPC102AB1 Compatibility Update for Documentation - Just a FYI. I was able to use the Electra / [Electra Library](https://github.com/crankyoldgit/IRremoteESP8266/blob/master/src/ir_Electra.h) and the Web Server with a Frigidaire FGPC102AB1. Specifically [this one](https://www.amazon.com/dp/B07ZDTBC9Y?psc=1&ref=ppx_yo2_dt_b_product_details) from Amazon. Works Great! Sorry if this isn't where this goes. Just wanted to help keep the compatibility list growing
|
non_test
|
frigidaire compatibility update for documentation just a fyi i was able to use the electra and the web server with a frigidaire specifically from amazon works great sorry if this isn t where this goes just wanted to help keep the compatibility list growing
| 0
|
223,596
| 17,611,128,099
|
IssuesEvent
|
2021-08-18 01:30:58
|
elastic/kibana
|
https://api.github.com/repos/elastic/kibana
|
opened
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/lens/smokescreen·ts - lens app lens smokescreen tests should transition from line chart to donut chart and to bar chart
|
failed-test
|
A test failed on a tracked branch
```
Error: retry.try timeout: TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="tableListSearchBox"])
Wait timed out after 10008ms
at /dev/shm/workspace/parallel/21/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
at runMicrotasks (<anonymous>)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at onFailure (/dev/shm/workspace/parallel/21/kibana/test/common/services/retry/retry_for_success.ts:17:9)
at retryForSuccess (/dev/shm/workspace/parallel/21/kibana/test/common/services/retry/retry_for_success.ts:57:13)
at RetryService.try (/dev/shm/workspace/parallel/21/kibana/test/common/services/retry/retry.ts:31:12)
at ListingTableService.searchForItemWithName (/dev/shm/workspace/parallel/21/kibana/test/functional/services/listing_table.ts:107:5)
at Context.<anonymous> (test/functional/apps/lens/smokescreen.ts:335:7)
at Object.apply (/dev/shm/workspace/parallel/21/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/16353/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/lens/smokescreen·ts","test.name":"lens app lens smokescreen tests should transition from line chart to donut chart and to bar chart","test.failCount":1}} -->
|
1.0
|
Failing test: Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/lens/smokescreen·ts - lens app lens smokescreen tests should transition from line chart to donut chart and to bar chart - A test failed on a tracked branch
```
Error: retry.try timeout: TimeoutError: Waiting for element to be located By(css selector, [data-test-subj="tableListSearchBox"])
Wait timed out after 10008ms
at /dev/shm/workspace/parallel/21/kibana/node_modules/selenium-webdriver/lib/webdriver.js:842:17
at runMicrotasks (<anonymous>)
at processTicksAndRejections (internal/process/task_queues.js:95:5)
at onFailure (/dev/shm/workspace/parallel/21/kibana/test/common/services/retry/retry_for_success.ts:17:9)
at retryForSuccess (/dev/shm/workspace/parallel/21/kibana/test/common/services/retry/retry_for_success.ts:57:13)
at RetryService.try (/dev/shm/workspace/parallel/21/kibana/test/common/services/retry/retry.ts:31:12)
at ListingTableService.searchForItemWithName (/dev/shm/workspace/parallel/21/kibana/test/functional/services/listing_table.ts:107:5)
at Context.<anonymous> (test/functional/apps/lens/smokescreen.ts:335:7)
at Object.apply (/dev/shm/workspace/parallel/21/kibana/node_modules/@kbn/test/target_node/functional_test_runner/lib/mocha/wrap_function.js:87:16)
```
First failure: [Jenkins Build](https://kibana-ci.elastic.co/job/elastic+kibana+master/16353/)
<!-- kibanaCiData = {"failed-test":{"test.class":"Chrome X-Pack UI Functional Tests.x-pack/test/functional/apps/lens/smokescreen·ts","test.name":"lens app lens smokescreen tests should transition from line chart to donut chart and to bar chart","test.failCount":1}} -->
|
test
|
failing test chrome x pack ui functional tests x pack test functional apps lens smokescreen·ts lens app lens smokescreen tests should transition from line chart to donut chart and to bar chart a test failed on a tracked branch error retry try timeout timeouterror waiting for element to be located by css selector wait timed out after at dev shm workspace parallel kibana node modules selenium webdriver lib webdriver js at runmicrotasks at processticksandrejections internal process task queues js at onfailure dev shm workspace parallel kibana test common services retry retry for success ts at retryforsuccess dev shm workspace parallel kibana test common services retry retry for success ts at retryservice try dev shm workspace parallel kibana test common services retry retry ts at listingtableservice searchforitemwithname dev shm workspace parallel kibana test functional services listing table ts at context test functional apps lens smokescreen ts at object apply dev shm workspace parallel kibana node modules kbn test target node functional test runner lib mocha wrap function js first failure
| 1
|
95,198
| 16,074,155,358
|
IssuesEvent
|
2021-04-25 02:41:53
|
samq-ghdemo/JS-Demo
|
https://api.github.com/repos/samq-ghdemo/JS-Demo
|
closed
|
CVE-2020-28282 (High) detected in getobject-0.1.0.tgz - autoclosed
|
security vulnerability
|
## CVE-2020-28282 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>getobject-0.1.0.tgz</b></p></summary>
<p>get.and.set.deep.objects.easily = true</p>
<p>Library home page: <a href="https://registry.npmjs.org/getobject/-/getobject-0.1.0.tgz">https://registry.npmjs.org/getobject/-/getobject-0.1.0.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/getobject/package.json</p>
<p>
Dependency Hierarchy:
- grunt-1.0.3.tgz (Root Library)
- grunt-legacy-util-1.1.1.tgz
- :x: **getobject-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/JS-Demo/commit/46781df511f58c350408cb5158290290709b373c">46781df511f58c350408cb5158290290709b373c</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in 'getobject' version 0.1.0 allows an attacker to cause a denial of service and may lead to remote code execution.
<p>Publish Date: 2020-12-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28282>CVE-2020-28282</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"getobject","packageVersion":"0.1.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:1.0.3;grunt-legacy-util:1.1.1;getobject:0.1.0","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-28282","vulnerabilityDetails":"Prototype pollution vulnerability in \u0027getobject\u0027 version 0.1.0 allows an attacker to cause a denial of service and may lead to remote code execution.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28282","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-28282 (High) detected in getobject-0.1.0.tgz - autoclosed - ## CVE-2020-28282 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>getobject-0.1.0.tgz</b></p></summary>
<p>get.and.set.deep.objects.easily = true</p>
<p>Library home page: <a href="https://registry.npmjs.org/getobject/-/getobject-0.1.0.tgz">https://registry.npmjs.org/getobject/-/getobject-0.1.0.tgz</a></p>
<p>Path to dependency file: JS-Demo/package.json</p>
<p>Path to vulnerable library: JS-Demo/node_modules/getobject/package.json</p>
<p>
Dependency Hierarchy:
- grunt-1.0.3.tgz (Root Library)
- grunt-legacy-util-1.1.1.tgz
- :x: **getobject-0.1.0.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://github.com/samq-ghdemo/JS-Demo/commit/46781df511f58c350408cb5158290290709b373c">46781df511f58c350408cb5158290290709b373c</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Prototype pollution vulnerability in 'getobject' version 0.1.0 allows an attacker to cause a denial of service and may lead to remote code execution.
<p>Publish Date: 2020-12-29
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28282>CVE-2020-28282</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"getobject","packageVersion":"0.1.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"grunt:1.0.3;grunt-legacy-util:1.1.1;getobject:0.1.0","isMinimumFixVersionAvailable":false}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-28282","vulnerabilityDetails":"Prototype pollution vulnerability in \u0027getobject\u0027 version 0.1.0 allows an attacker to cause a denial of service and may lead to remote code execution.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28282","cvss3Severity":"high","cvss3Score":"9.8","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_test
|
cve high detected in getobject tgz autoclosed cve high severity vulnerability vulnerable library getobject tgz get and set deep objects easily true library home page a href path to dependency file js demo package json path to vulnerable library js demo node modules getobject package json dependency hierarchy grunt tgz root library grunt legacy util tgz x getobject tgz vulnerable library found in head commit a href found in base branch master vulnerability details prototype pollution vulnerability in getobject version allows an attacker to cause a denial of service and may lead to remote code execution publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree grunt grunt legacy util getobject isminimumfixversionavailable false basebranches vulnerabilityidentifier cve vulnerabilitydetails prototype pollution vulnerability in version allows an attacker to cause a denial of service and may lead to remote code execution vulnerabilityurl
| 0
|
76,264
| 15,495,903,372
|
IssuesEvent
|
2021-03-11 01:42:20
|
yadavrahul12957/coupon
|
https://api.github.com/repos/yadavrahul12957/coupon
|
opened
|
CVE-2019-20149 (High) detected in kind-of-6.0.2.tgz
|
security vulnerability
|
## CVE-2019-20149 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /coupon/package.json</p>
<p>Path to vulnerable library: coupon/node_modules/http-proxy-middleware/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- sass-loader-7.1.0.tgz
- clone-deep-2.0.2.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p>
<p>Release Date: 2019-12-30</p>
<p>Fix Resolution: 6.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-20149 (High) detected in kind-of-6.0.2.tgz - ## CVE-2019-20149 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>kind-of-6.0.2.tgz</b></p></summary>
<p>Get the native type of a value.</p>
<p>Library home page: <a href="https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz">https://registry.npmjs.org/kind-of/-/kind-of-6.0.2.tgz</a></p>
<p>Path to dependency file: /coupon/package.json</p>
<p>Path to vulnerable library: coupon/node_modules/http-proxy-middleware/node_modules/kind-of/package.json</p>
<p>
Dependency Hierarchy:
- react-scripts-2.1.8.tgz (Root Library)
- sass-loader-7.1.0.tgz
- clone-deep-2.0.2.tgz
- :x: **kind-of-6.0.2.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ctorName in index.js in kind-of v6.0.2 allows external user input to overwrite certain internal attributes via a conflicting name, as demonstrated by 'constructor': {'name':'Symbol'}. Hence, a crafted payload can overwrite this builtin attribute to manipulate the type detection result.
<p>Publish Date: 2019-12-30
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-20149>CVE-2019-20149</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: High
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2019-20149</a></p>
<p>Release Date: 2019-12-30</p>
<p>Fix Resolution: 6.0.3</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_test
|
cve high detected in kind of tgz cve high severity vulnerability vulnerable library kind of tgz get the native type of a value library home page a href path to dependency file coupon package json path to vulnerable library coupon node modules http proxy middleware node modules kind of package json dependency hierarchy react scripts tgz root library sass loader tgz clone deep tgz x kind of tgz vulnerable library vulnerability details ctorname in index js in kind of allows external user input to overwrite certain internal attributes via a conflicting name as demonstrated by constructor name symbol hence a crafted payload can overwrite this builtin attribute to manipulate the type detection result publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact high availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
483,391
| 13,923,975,437
|
IssuesEvent
|
2020-10-21 15:01:19
|
eyebeam/eyebeam.org
|
https://api.github.com/repos/eyebeam/eyebeam.org
|
opened
|
section header won't render (Pleading Face on Apple iOS 13.3)
|
High Priority
|
it won't even give you the option to type
<img width="1102" alt="Screen Shot 2020-10-21 at 11 00 31 AM" src="https://user-images.githubusercontent.com/31375201/96738457-bcdecf80-138c-11eb-84e9-30499a1d7986.png">
|
1.0
|
section header won't render (Pleading Face on Apple iOS 13.3) - it won't even give you the option to type
<img width="1102" alt="Screen Shot 2020-10-21 at 11 00 31 AM" src="https://user-images.githubusercontent.com/31375201/96738457-bcdecf80-138c-11eb-84e9-30499a1d7986.png">
|
non_test
|
section header won t render pleading face on apple ios it won t even give you the option to type img width alt screen shot at am src
| 0
|
22,141
| 3,603,361,097
|
IssuesEvent
|
2016-02-03 18:46:51
|
extnet/Ext.NET
|
https://api.github.com/repos/extnet/Ext.NET
|
opened
|
ImageButton does not handle the target= property
|
2.x 3.x 4.x defect
|
Reported in this forum thread: [ImageButton and HrefTarget](http://forums.ext.net/showthread.php?60570).
The problem raised when comparing a normal `<ext:Button />` to an `<ext:ImageButton />`. While both controls accept the `target=` property, only the former works with it, correctly handling the chosen target string (like `_blank` to open on a new tab).
The problem happens because the `ImageButton` control is emitted as an `<img />` tag to the page. It does not support the `target=` property at all.
Interesting fact is that it supports -- at least on IE11 -- the `href=` prop -- like the `<a />` tag. But the `href` property is not present in [IMG tag's HTML5 reference documentation](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/img). This indicates that also supporting the `href=` property on `ImageButton` should be discontinued.
It should be taken at least one of the approaches below to address the issue:
- Support the `target=` property, by wrapping the component in `<a>` tag, like the current button implementation does.
- No longer export the `target=` property, as it does not really work to `<img>` elements.
|
1.0
|
ImageButton does not handle the target= property - Reported in this forum thread: [ImageButton and HrefTarget](http://forums.ext.net/showthread.php?60570).
The problem raised when comparing a normal `<ext:Button />` to an `<ext:ImageButton />`. While both controls accept the `target=` property, only the former works with it, correctly handling the chosen target string (like `_blank` to open on a new tab).
The problem happens because the `ImageButton` control is emitted as an `<img />` tag to the page. It does not support the `target=` property at all.
Interesting fact is that it supports -- at least on IE11 -- the `href=` prop -- like the `<a />` tag. But the `href` property is not present in [IMG tag's HTML5 reference documentation](https://developer.mozilla.org/en-US/docs/Web/HTML/Element/img). This indicates that also supporting the `href=` property on `ImageButton` should be discontinued.
It should be taken at least one of the approaches below to address the issue:
- Support the `target=` property, by wrapping the component in `<a>` tag, like the current button implementation does.
- No longer export the `target=` property, as it does not really work to `<img>` elements.
|
non_test
|
imagebutton does not handle the target property reported in this forum thread the problem raised when comparing a normal to an while both controls accept the target property only the former works with it correctly handling the chosen target string like blank to open on a new tab the problem happens because the imagebutton control is emitted as an tag to the page it does not support the target property at all interesting fact is that it supports at least on the href prop like the tag but the href property is not present in this indicates that also supporting the href property on imagebutton should be discontinued it should be taken at least one of the approaches below to address the issue support the target property by wrapping the component in tag like the current button implementation does no longer export the target property as it does not really work to elements
| 0
|
322,395
| 27,598,334,896
|
IssuesEvent
|
2023-03-09 08:20:17
|
unifyai/ivy
|
https://api.github.com/repos/unifyai/ivy
|
opened
|
Fix jax_numpy_math.test_jax_numpy_radians
|
JAX Frontend Sub Task Failing Test
|
| | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
|
1.0
|
Fix jax_numpy_math.test_jax_numpy_radians - | | |
|---|---|
|tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|torch|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|numpy|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
|jax|<a href="https://github.com/unifyai/ivy/actions/runs/4369918880/jobs/7644261475" rel="noopener noreferrer" target="_blank"><img src=https://img.shields.io/badge/-failure-red></a>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
<details>
<summary>FAILED ivy_tests/test_ivy/test_frontends/test_jax/test_jax_numpy_math.py::test_jax_numpy_radians[cpu-ivy.functional.backends.jax-False-False]</summary>
2023-03-09T01:08:40.1530411Z E jax._src.traceback_util.UnfilteredStackTrace: TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1530942Z E
2023-03-09T01:08:40.1531405Z E The stack trace below excludes JAX-internal frames.
2023-03-09T01:08:40.1531903Z E The preceding is the original exception that occurred, unmodified.
2023-03-09T01:08:40.1532468Z E
2023-03-09T01:08:40.1532827Z E --------------------
2023-03-09T01:08:40.1540807Z E TypeError: deg2rad() got some positional-only arguments passed as keyword arguments: 'x'
2023-03-09T01:08:40.1541351Z E Falsifying example: test_jax_numpy_radians(
2023-03-09T01:08:40.1541890Z E dtype_and_x=(['float32'], [array(-1., dtype=float32)]),
2023-03-09T01:08:40.1542622Z E fn_tree='ivy.functional.frontends.jax.numpy.radians',
2023-03-09T01:08:40.1543122Z E test_flags=FrontendFunctionTestFlags(
2023-03-09T01:08:40.1543689Z E num_positional_args=0,
2023-03-09T01:08:40.1544194Z E with_out=False,
2023-03-09T01:08:40.1544520Z E inplace=False,
2023-03-09T01:08:40.1544857Z E as_variable=[False],
2023-03-09T01:08:40.1545208Z E native_arrays=[False],
2023-03-09T01:08:40.1545537Z E ),
2023-03-09T01:08:40.1545892Z E on_device='cpu',
2023-03-09T01:08:40.1546264Z E frontend='jax',
2023-03-09T01:08:40.1546578Z E )
2023-03-09T01:08:40.1547005Z E
2023-03-09T01:08:40.1547715Z E You can reproduce this example by temporarily adding @reproduce_failure('6.68.2', b'AXicY2AAAkYGCADTAAAkAAM=') as a decorator on your test case
</details>
|
test
|
fix jax numpy math test jax numpy radians tensorflow img src torch img src numpy img src jax img src failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy radians e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments x e e the stack trace below excludes jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments x e falsifying example test jax numpy radians e dtype and x e fn tree ivy functional frontends jax numpy radians e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy radians e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments x e e the stack trace below excludes jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments x e falsifying example test jax numpy radians e dtype and x e fn tree ivy functional frontends jax numpy radians e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy radians e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments x e e the stack trace below excludes jax internal frames e the preceding is the 
original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments x e falsifying example test jax numpy radians e dtype and x e fn tree ivy functional frontends jax numpy radians e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case failed ivy tests test ivy test frontends test jax test jax numpy math py test jax numpy radians e jax src traceback util unfilteredstacktrace typeerror got some positional only arguments passed as keyword arguments x e e the stack trace below excludes jax internal frames e the preceding is the original exception that occurred unmodified e e e typeerror got some positional only arguments passed as keyword arguments x e falsifying example test jax numpy radians e dtype and x e fn tree ivy functional frontends jax numpy radians e test flags frontendfunctiontestflags e num positional args e with out false e inplace false e as variable e native arrays e e on device cpu e frontend jax e e e you can reproduce this example by temporarily adding reproduce failure b as a decorator on your test case
| 1
|
330,703
| 28,484,250,899
|
IssuesEvent
|
2023-04-18 06:37:16
|
choderalab/asapdiscovery
|
https://api.github.com/repos/choderalab/asapdiscovery
|
closed
|
Ship PDB/CIF input files as part of the asapdiscovery-data package
|
help wanted software testing
|
In order to run tests and provide examples, we need to be able to include some binary files (oedu, npy, etc) as well as large files (pdb, cif, sdf, csv) as a part of our repo. @ijpulidos has been very cautious about this as it results in GitHub keeping track of wayyyy too many changes.
We need some way to be able to ship these files as a part of the repo!
|
1.0
|
Ship PDB/CIF input files as part of the asapdiscovery-data package - In order to run tests and provide examples, we need to be able to include some binary files (oedu, npy, etc) as well as large files (pdb, cif, sdf, csv) as a part of our repo. @ijpulidos has been very cautious about this as it results in GitHub keeping track of wayyyy too many changes.
We need some way to be able to ship these files as a part of the repo!
|
test
|
ship pdb cif input files as part of the asapdiscovery data package in order to run tests and provide examples we need to be able to include some binary files oedu npy etc as well as large files pdb cif sdf csv as a part of our repo ijpulidos has been very cautious about this as it results in github keeping track of wayyyy too many changes we need some way to be able to ship these files as a part of the repo
| 1
|