Dataset schema (15 columns):

| column | dtype | range / classes |
|---|---|---|
| Unnamed: 0 | int64 | 0 to 832k |
| id | float64 | 2.49B to 32.1B |
| type | string | 1 class |
| created_at | string | length 19 |
| repo | string | length 5 to 112 |
| repo_url | string | length 34 to 141 |
| action | string | 3 classes |
| title | string | length 1 to 757 |
| labels | string | length 4 to 664 |
| body | string | length 3 to 261k |
| index | string | 10 classes |
| text_combine | string | length 96 to 261k |
| label | string | 2 classes |
| text | string | length 96 to 232k |
| binary_label | int64 | 0 or 1 |
22,204
| 11,700,339,712
|
IssuesEvent
|
2020-03-06 17:17:25
|
Azure/azure-sdk-for-net
|
https://api.github.com/repos/Azure/azure-sdk-for-net
|
opened
|
Set default value of "" on TrainingFileFilter.Path
|
Client Cognitive Services FormRecognizer
|
In CustomFormClient.StartTraining(), we check whether filter has been passed in, and if so, we add it to the trainRequest.
Once we are able to set a default value on TrainingFileFilter with code gen (https://github.com/Azure/autorest.csharp/issues/467), we can send this always.
We will want to decide whether it's better to send TrainingFileFilter on every request, or leave it unset on the TrainRequest if it's not passed in.
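The two strategies under discussion can be sketched as follows (in Python rather than the SDK's C#, with hypothetical field names standing in for the actual TrainRequest and TrainingFileFilter types):

```python
# Sketch of the two request-building strategies discussed above. The field
# names ("source", "sourceFilter", "path") are hypothetical stand-ins, not
# the actual Azure SDK wire format.

def build_train_request(source_url, file_filter=None, always_send_filter=False):
    """Build a training request body. Include the filter only when provided,
    or always send it with a default empty path when always_send_filter is True."""
    request = {"source": source_url}
    if file_filter is not None:
        request["sourceFilter"] = {"path": file_filter}
    elif always_send_filter:
        # proposed behaviour: codegen default of "" sent on every request
        request["sourceFilter"] = {"path": ""}
    return request
```

Either way the caller-supplied filter wins; the open question above is only what happens when no filter is passed in.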
|
1.0
|
Set default value of "" on TrainingFileFilter.Path - In CustomFormClient.StartTraining(), we check whether filter has been passed in, and if so, we add it to the trainRequest.
Once we are able to set a default value on TrainingFileFilter with code gen (https://github.com/Azure/autorest.csharp/issues/467), we can send this always.
We will want to decide whether it's better to send TrainingFileFilter on every request, or leave it unset on the TrainRequest if it's not passed in.
|
non_defect
|
set default value of on trainingfilefilter path in customformclient starttraining we check whether filter has been passed in and if so we add it to the trainrequest once we are able to set a default value on trainingfilefilter with code gen we can send this always we will want to decide whether it s better to send trainingfilefilter on every request or leave it unset on the trainrequest if it s not passed in
| 0
|
42,551
| 11,017,047,735
|
IssuesEvent
|
2019-12-05 07:22:41
|
microsoft/WindowsTemplateStudio
|
https://api.github.com/repos/microsoft/WindowsTemplateStudio
|
closed
|
Build dev.version_0.20.19339.01 failed
|
bug vsts-build
|
## Build dev.version_0.20.19339.01
- **Build result:** `failed`
- **Build queued:** 12/5/2019 3:00:02 AM
- **Build duration:** 1.64 minutes
### Details
Build [dev.version_0.20.19339.01](https://winappstudio.visualstudio.com/web/build.aspx?pcguid=a4ef43be-68ce-4195-a619-079b4d9834c2&builduri=vstfs%3a%2f%2f%2fBuild%2fBuild%2f32183) failed
+ RightClickActions.cs (206): code\src\UI\VisualStudio\RightClickActions.cs(206,71): Error CS0103: The name 'Assembly' does not exist in the current context
+ RightClickActions.cs (207): code\src\UI\VisualStudio\RightClickActions.cs(207,114): Error CS0246: The type or namespace name 'DigitalSignatureService' could not be found (are you missing a using directive or an assembly reference?)
+ Process 'msbuild.exe' exited with code '1'.
Find detailed information in the [build log files]()
|
1.0
|
Build dev.version_0.20.19339.01 failed - ## Build dev.version_0.20.19339.01
- **Build result:** `failed`
- **Build queued:** 12/5/2019 3:00:02 AM
- **Build duration:** 1.64 minutes
### Details
Build [dev.version_0.20.19339.01](https://winappstudio.visualstudio.com/web/build.aspx?pcguid=a4ef43be-68ce-4195-a619-079b4d9834c2&builduri=vstfs%3a%2f%2f%2fBuild%2fBuild%2f32183) failed
+ RightClickActions.cs (206): code\src\UI\VisualStudio\RightClickActions.cs(206,71): Error CS0103: The name 'Assembly' does not exist in the current context
+ RightClickActions.cs (207): code\src\UI\VisualStudio\RightClickActions.cs(207,114): Error CS0246: The type or namespace name 'DigitalSignatureService' could not be found (are you missing a using directive or an assembly reference?)
+ Process 'msbuild.exe' exited with code '1'.
Find detailed information in the [build log files]()
|
non_defect
|
build dev version failed build dev version build result failed build queued am build duration minutes details build failed rightclickactions cs code src ui visualstudio rightclickactions cs error the name assembly does not exist in the current context rightclickactions cs code src ui visualstudio rightclickactions cs error the type or namespace name digitalsignatureservice could not be found are you missing a using directive or an assembly reference process msbuild exe exited with code find detailed information in the
| 0
|
71,150
| 30,823,638,277
|
IssuesEvent
|
2023-08-01 18:14:09
|
hashicorp/terraform-provider-azurerm
|
https://api.github.com/repos/hashicorp/terraform-provider-azurerm
|
closed
|
Unable to import azurerm_pim_eligible_role_assignment resources
|
bug service/authorization v/3.x
|
### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment and review the [contribution guide](https://github.com/hashicorp/terraform-provider-azurerm/blob/main/contributing/README.md) to help.
<!--- Thank you for keeping this note for the community --->
### Terraform Version
1.5.3
### AzureRM Provider Version
3.65.0
### Affected Resource(s)/Data Source(s)
azurerm_pim_eligible_role_assignment
### Terraform Configuration Files
```hcl
variable "teams" {
type = map(object({
team_name = string
location = string
Owner = string
TechnicalContact = string
SecurityGroup = string
DepartmentName = string
City = string
ApplicationGroupType = string
ApplicationType = string
LoadBalancerType = string
VDIType = string
MaximumSessions = number
}))
}
data "azurerm_subscription" "currentSubscription" {}
resource "azurerm_resource_group" "vdi-rg" {
for_each = var.teams
name = "${each.value.team_name}-VDI"
location = coalesce(each.value.location, each.key)
tags = {
Owner = coalesce(each.value.Owner, each.key)
TechnicalContact = coalesce(each.value.TechnicalContact, each.key)
Location = coalesce(each.value.City, each.key)
DepartmentName = coalesce(each.value.DepartmentName, each.key)
TeamName = coalesce(each.value.team_name, each.key)
}
}
resource "azurerm_pim_eligible_role_assignment" "role-vdi-vmadminpim" {
for_each = var.teams
scope = "/subscriptions/subscriptionGUID/resourceGroups/${azurerm_resource_group.vdi-rg[each.key].name}"
role_definition_id = "Virtual Machine Administrator Login"
principal_id = coalesce(each.value.SecurityGroup, each.key)
}
```
### Debug Output/Panic Output
```shell
PS C:\Repos\Infra_Azure_Terraform_Source\Infra_Azure_Terraform_Source> terraform import --var-file=azureVDI.tfvars 'azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim[\"Team2\"]' "/subscriptions/subscriptionGUID/resourecGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID"
data.azurerm_subscription.currentSubscription: Reading...
data.azurerm_subscription.currentSubscription: Read complete after 0s [id=/subscriptions/subscriptionGUID]
azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim["Team2"]: Importing from ID "/subscriptions/subscriptionGUID/resourecGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID"...
azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim["Team2"]: Import prepared!
Prepared azurerm_pim_eligible_role_assignment for import
azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim["Team2"]: Refreshing state... [id=/subscriptions/subscriptionGUID/resourecGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID]
╷
│ Error: listing role assignments on scope Role Management Policy: (Principal Id "securityGroupGUID" / Scope "/subscriptions/subscriptionGUID/resourecGroupName" / Role Definition Id "/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4"): loading results: unexpected status 404 with response: {"message":"No HTTP resource was found that matches the request URI 'https://management.azure.com/subscriptions/subscriptionGUID/resourecGroupName/providers/Microsoft.Authorization/roleEligibilityScheduleInstances?%24filter=%28principalId+eq+%27securityGroupGUID%27+and+roleDefinitionId+eq+%27%2Fsubscriptions%2FsubscriptionGUID%2Fproviders%2FMicrosoft.Authorization%2FroleDefinitions%2F1c0163c0-47e6-4577-8991-ea5c82e286e4%27%29&api-version=2020-10-01'."}
│
│ listing role assignments on scope Role Management Policy: (Principal Id "securityGroupGUID" / Scope "/subscriptions/subscriptionGUID/resourecGroupName" / Role
│ Definition Id "/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4"): loading results: unexpected status 404 with
│ response: {"message":"No HTTP resource was found that matches the request URI
│ 'https://management.azure.com/subscriptions/subscriptionGUID/resourecGroupName/providers/Microsoft.Authorization/roleEligibilityScheduleInstances?%24filter=%28principalId+eq+%27securityGroupGUID%27+and+roleDefinitionId+eq+%27%2Fsubscriptions%2FsubscriptionGUID%2Fproviders%2FMicrosoft.Authorization%2FroleDefinitions%2F1c0163c0-47e6-4577-8991-ea5c82e286e4%27%29&api-version=2020-10-01'."}
```
### Expected Behaviour
Terraform should import the PIM eligible role assignment into the state with the map value specified in [brackets]
### Actual Behaviour
Terraform prepares to import the resource but then throws the 404 error shown above
### Steps to Reproduce
1. Run terraform apply --var-file=azureVDI.tfvars (or other applicable tfvars file) with the above configuration
2. Terraform creates the resource group but times out when creating the PIM assignment - the resource group is successfully added to the state but the PIM assignments are not, even though the PIM assignments exist in Azure
3. Run terraform import --var-file=azureVDI.tfvars 'azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim[\"Team2\"]' "/subscriptions/subscriptionGUID/resourceGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID" , replacing the following:
- Replace Team2 with the key from the mapped variable - whatever key you want that failed to create from steps 1 and 2
- Replace subscriptionGUID with the ID of the subscription you're working with
- Replace resourceGroupName with the name of the Azure resource group created by TF in the config from steps 1 and 2
4. Error occurs
### Important Factoids
_No response_
### References
Issue #22588 covers the problem described in Steps to Reproduce 1 and 2
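As a side note on step 3: the import ID is a pipe-delimited composite of scope, role definition ID, and principal ID. A minimal sketch of splitting such an ID, with a hypothetical helper name and a naive scope check (not the provider's actual parser):

```python
def parse_pim_import_id(composite_id):
    """Split an azurerm_pim_eligible_role_assignment import ID of the form
    <scope>|<role_definition_id>|<principal_id> into its three segments."""
    parts = composite_id.split("|")
    if len(parts) != 3:
        raise ValueError(f"expected 3 pipe-delimited segments, got {len(parts)}")
    scope, role_definition_id, principal_id = parts
    # Naive sanity check: a scope deeper than /subscriptions/<id> should
    # contain the /resourceGroups/ path element (the pasted command in the
    # debug output above omits it).
    if "/resourceGroups/" not in scope and scope.count("/") > 2:
        raise ValueError(f"scope is not a valid resource-group scope: {scope}")
    return scope, role_definition_id, principal_id
```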
|
1.0
|
Unable to import azurerm_pim_eligible_role_assignment resources - ### Is there an existing issue for this?
- [X] I have searched the existing issues
### Community Note
<!--- Please keep this note for the community --->
* Please vote on this issue by adding a :thumbsup: [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request
* Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request
* If you are interested in working on this issue or have submitted a pull request, please leave a comment and review the [contribution guide](https://github.com/hashicorp/terraform-provider-azurerm/blob/main/contributing/README.md) to help.
<!--- Thank you for keeping this note for the community --->
### Terraform Version
1.5.3
### AzureRM Provider Version
3.65.0
### Affected Resource(s)/Data Source(s)
azurerm_pim_eligible_role_assignment
### Terraform Configuration Files
```hcl
variable "teams" {
type = map(object({
team_name = string
location = string
Owner = string
TechnicalContact = string
SecurityGroup = string
DepartmentName = string
City = string
ApplicationGroupType = string
ApplicationType = string
LoadBalancerType = string
VDIType = string
MaximumSessions = number
}))
}
data "azurerm_subscription" "currentSubscription" {}
resource "azurerm_resource_group" "vdi-rg" {
for_each = var.teams
name = "${each.value.team_name}-VDI"
location = coalesce(each.value.location, each.key)
tags = {
Owner = coalesce(each.value.Owner, each.key)
TechnicalContact = coalesce(each.value.TechnicalContact, each.key)
Location = coalesce(each.value.City, each.key)
DepartmentName = coalesce(each.value.DepartmentName, each.key)
TeamName = coalesce(each.value.team_name, each.key)
}
}
resource "azurerm_pim_eligible_role_assignment" "role-vdi-vmadminpim" {
for_each = var.teams
scope = "/subscriptions/subscriptionGUID/resourceGroups/${azurerm_resource_group.vdi-rg[each.key].name}"
role_definition_id = "Virtual Machine Administrator Login"
principal_id = coalesce(each.value.SecurityGroup, each.key)
}
```
### Debug Output/Panic Output
```shell
PS C:\Repos\Infra_Azure_Terraform_Source\Infra_Azure_Terraform_Source> terraform import --var-file=azureVDI.tfvars 'azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim[\"Team2\"]' "/subscriptions/subscriptionGUID/resourecGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID"
data.azurerm_subscription.currentSubscription: Reading...
data.azurerm_subscription.currentSubscription: Read complete after 0s [id=/subscriptions/subscriptionGUID]
azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim["Team2"]: Importing from ID "/subscriptions/subscriptionGUID/resourecGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID"...
azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim["Team2"]: Import prepared!
Prepared azurerm_pim_eligible_role_assignment for import
azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim["Team2"]: Refreshing state... [id=/subscriptions/subscriptionGUID/resourecGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID]
╷
│ Error: listing role assignments on scope Role Management Policy: (Principal Id "securityGroupGUID" / Scope "/subscriptions/subscriptionGUID/resourecGroupName" / Role Definition Id "/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4"): loading results: unexpected status 404 with response: {"message":"No HTTP resource was found that matches the request URI 'https://management.azure.com/subscriptions/subscriptionGUID/resourecGroupName/providers/Microsoft.Authorization/roleEligibilityScheduleInstances?%24filter=%28principalId+eq+%27securityGroupGUID%27+and+roleDefinitionId+eq+%27%2Fsubscriptions%2FsubscriptionGUID%2Fproviders%2FMicrosoft.Authorization%2FroleDefinitions%2F1c0163c0-47e6-4577-8991-ea5c82e286e4%27%29&api-version=2020-10-01'."}
│
│ listing role assignments on scope Role Management Policy: (Principal Id "securityGroupGUID" / Scope "/subscriptions/subscriptionGUID/resourecGroupName" / Role
│ Definition Id "/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4"): loading results: unexpected status 404 with
│ response: {"message":"No HTTP resource was found that matches the request URI
│ 'https://management.azure.com/subscriptions/subscriptionGUID/resourecGroupName/providers/Microsoft.Authorization/roleEligibilityScheduleInstances?%24filter=%28principalId+eq+%27securityGroupGUID%27+and+roleDefinitionId+eq+%27%2Fsubscriptions%2FsubscriptionGUID%2Fproviders%2FMicrosoft.Authorization%2FroleDefinitions%2F1c0163c0-47e6-4577-8991-ea5c82e286e4%27%29&api-version=2020-10-01'."}
```
### Expected Behaviour
Terraform should import the PIM eligible role assignment into the state with the map value specified in [brackets]
### Actual Behaviour
Terraform prepares to import the resource but then throws the 404 error shown above
### Steps to Reproduce
1. Run terraform apply --var-file=azureVDI.tfvars (or other applicable tfvars file) with the above configuration
2. Terraform creates the resource group but times out when creating the PIM assignment - the resource group is successfully added to the state but the PIM assignments are not, even though the PIM assignments exist in Azure
3. Run terraform import --var-file=azureVDI.tfvars 'azurerm_pim_eligible_role_assignment.role-vdi-vmadminpim[\"Team2\"]' "/subscriptions/subscriptionGUID/resourceGroupName|/subscriptions/subscriptionGUID/providers/Microsoft.Authorization/roleDefinitions/1c0163c0-47e6-4577-8991-ea5c82e286e4|securityGroupGUID" , replacing the following:
- Replace Team2 with the key from the mapped variable - whatever key you want that failed to create from steps 1 and 2
- Replace subscriptionGUID with the ID of the subscription you're working with
- Replace resourceGroupName with the name of the Azure resource group created by TF in the config from steps 1 and 2
4. Error occurs
### Important Factoids
_No response_
### References
Issue #22588 covers the problem described in Steps to Reproduce 1 and 2
|
non_defect
|
unable to import azurerm pim eligible role assignment resources is there an existing issue for this i have searched the existing issues community note please vote on this issue by adding a thumbsup to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment and review the to help terraform version azurerm provider version affected resource s data source s azurerm pim eligible role assignment terraform configuration files hcl variable teams type map object team name string location string owner string technicalcontact string securitygroup string departmentname string city string applicationgrouptype string applicationtype string loadbalancertype string vditype string maximumsessions number data azurerm subscription currentsubscription resource azurerm resource group vdi rg for each var teams name each value team name vdi location coalesce each value location each key tags owner coalesce each value owner each key technicalcontact coalesce each value technicalcontact each key location coalesce each value city each key departmentname coalesce each value departmentname each key teamname coalesce each value team name each key resource azurerm pim eligible role assignment role vdi vmadminpim for each var teams scope subscriptions subscriptionguid resourcegroups azurerm resource group vdi rg name role definition id virtual machine administrator login principal id coalesce each value securitygroup each key debug output panic output shell ps c repos infra azure terraform source infra azure terraform source terraform import var file azurevdi tfvars azurerm pim eligible role assignment role vdi vmadminpim subscriptions subscriptionguid resourecgroupname subscriptions subscriptionguid providers microsoft authorization 
roledefinitions securitygroupguid data azurerm subscription currentsubscription reading data azurerm subscription currentsubscription read complete after azurerm pim eligible role assignment role vdi vmadminpim importing from id subscriptions subscriptionguid resourecgroupname subscriptions subscriptionguid providers microsoft authorization roledefinitions securitygroupguid azurerm pim eligible role assignment role vdi vmadminpim import prepared prepared azurerm pim eligible role assignment for import azurerm pim eligible role assignment role vdi vmadminpim refreshing state ╷ │ error listing role assignments on scope role management policy principal id securitygroupguid scope subscriptions subscriptionguid resourecgroupname role definition id subscriptions subscriptionguid providers microsoft authorization roledefinitions loading results unexpected status with response message no http resource was found that matches the request uri │ │ listing role assignments on scope role management policy principal id securitygroupguid scope subscriptions subscriptionguid resourecgroupname role │ definition id subscriptions subscriptionguid providers microsoft authorization roledefinitions loading results unexpected status with │ response message no http resource was found that matches the request uri │ expected behaviour terraform should import the pim eligible role assignment into the state with the map value specified in actual behaviour terraform prepares to import the resource but then throws the error shown above steps to reproduce run terraform apply var file azurevdi tfvars or other applicable tfvars file with the above configuration terraform creates the resource group but times out when creating the pim assignment the resource group is successfully added to the state but the pim assignments are not even though the pim assignments exist in azure run terraform import var file azurevdi tfvars azurerm pim eligible role assignment role vdi vmadminpim subscriptions 
subscriptionguid resourcegroupname subscriptions subscriptionguid providers microsoft authorization roledefinitions securitygroupguid replacing the following replace with the key from the mapped variable whatever key you want that failed to create from steps and replace subscriptionguid with the id of the subscription you re working with replace resourcegroupname with the name of the azure resource group created by tf in the config from steps and error occurs important factoids no response references issue covers the problem described in steps to reproduce and
| 0
|
19,583
| 3,227,224,116
|
IssuesEvent
|
2015-10-11 00:39:58
|
jsr107/jsr107spec
|
https://api.github.com/repos/jsr107/jsr107spec
|
closed
|
Wrong reference to put in CacheRemove/CacheRemoveAll
|
Defect
|
The Javadoc of `@CacheRemove#afterInvocation` refers to the put operation while it should probably be the remove operation. Excerpt:
_If true and the annotated method throws an exception the put will not be executed._
`CacheRemoveAll` is affected as well.
Besides, this statement does not take the `evictFor` and `noEvictFor` parameters into account.
|
1.0
|
Wrong reference to put in CacheRemove/CacheRemoveAll - The Javadoc of `@CacheRemove#afterInvocation` refers to the put operation while it should probably be the remove operation. Excerpt:
_If true and the annotated method throws an exception the put will not be executed._
`CacheRemoveAll` is affected as well.
Besides, this statement does not take the `evictFor` and `noEvictFor` parameters into account.
|
defect
|
wrong reference to put in cacheremove cacheremoveall the javadoc of cacheremove afterinvocation refers to the put operation while it should probably be the remove operation excerpt if true and the annotated method throws an exception the put will not be executed cacheremoveall is affected as well besides this statement does not take the evictfor and noevictfor parameters
| 1
|
9,828
| 2,615,175,556
|
IssuesEvent
|
2015-03-01 06:58:56
|
chrsmith/reaver-wps
|
https://api.github.com/repos/chrsmith/reaver-wps
|
opened
|
Reaver Alice
|
auto-migrated Priority-Triage Type-Defect
|
```
Hello everyone, I have been trying for a long time to find the password
of a friend's router, but I can't manage it. When I run the command:
reaver -i mon0 -b mac -p pin -vv, at the end I get the following and it
basically repeats forever:
Sending EAPOL START request
[+] Received identity request
[+] Sending identity response
[+] Received identity request
[+] Sending identity response
[+] Received M1 message
[+] Sending M2 message
[!] WARNING: Receive timeout occurred
[+] Sending WSC NACK
[!] WPS transaction failed (code: 0x02), re-trying last pin
Thanks in advance.
```
Original issue reported on code.google.com by `franci9...@gmail.com` on 17 Oct 2014 at 9:44
|
1.0
|
Reaver Alice - ```
Hello everyone, I have been trying for a long time to find the password
of a friend's router, but I can't manage it. When I run the command:
reaver -i mon0 -b mac -p pin -vv, at the end I get the following and it
basically repeats forever:
Sending EAPOL START request
[+] Received identity request
[+] Sending identity response
[+] Received identity request
[+] Sending identity response
[+] Received M1 message
[+] Sending M2 message
[!] WARNING: Receive timeout occurred
[+] Sending WSC NACK
[!] WPS transaction failed (code: 0x02), re-trying last pin
Thanks in advance.
```
Original issue reported on code.google.com by `franci9...@gmail.com` on 17 Oct 2014 at 9:44
|
defect
|
reaver alice hello everyone i have been trying for a long time to find the password of a friend s router but i can t manage it when i run the command reaver i b mac p pin vv at the end i get the following and it basically repeats forever sending eapol start request received identity request sending identity response received identity request sending identity response received message sending message warning receive timeout occurred sending wsc nack wps transaction failed code re trying last pin thanks in advance original issue reported on code google com by gmail com on oct at
| 1
|
10,812
| 2,622,191,249
|
IssuesEvent
|
2015-03-04 00:23:10
|
byzhang/cudpp
|
https://api.github.com/repos/byzhang/cudpp
|
closed
|
link error with CUDA 3.1
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Update to cuda 3.1
2. Compile with cuda 3.1 nvcc
3.
What is the expected output? What do you see instead?
The linking should produce a shared lib. Instead get a linking error:
ld: duplicate symbol OperatorMin<double>::identity() const in
./cudpp_generated_scan_app.cu.o and ./cudpp_generated_compact_app.cu.o
collect2: ld returned 1 exit status
make[2]: *** [lib/libcudpp.dylib] Error 1
What version of the product are you using? On what operating system?
1.1.1
Please provide any additional information below.
Prior to the 3.1 release, nvcc treated __device__ functions as implicitly static. This behavior has changed with the 3.1 release. As a result, the host linker will give a link error regarding multiple defined symbols, if two identical __device__ functions are defined in two different compilations units. For example, when including function definitions through the #include <> mechanism; or a __device__ function and an identical host function are defined in two different compilations units.
For both cases, declaring the __device__ function as static will make the
compilation succeed.
```
Original issue reported on code.google.com by `kashif.r...@gmail.com` on 27 Jun 2010 at 4:27
|
1.0
|
link error with CUDA 3.1 - ```
What steps will reproduce the problem?
1. Update to cuda 3.1
2. Compile with cuda 3.1 nvcc
3.
What is the expected output? What do you see instead?
The linking should produce a shared lib. Instead get a linking error:
ld: duplicate symbol OperatorMin<double>::identity() const in
./cudpp_generated_scan_app.cu.o and ./cudpp_generated_compact_app.cu.o
collect2: ld returned 1 exit status
make[2]: *** [lib/libcudpp.dylib] Error 1
What version of the product are you using? On what operating system?
1.1.1
Please provide any additional information below.
Prior to the 3.1 release, nvcc treated __device__ functions as implicitly static. This behavior has changed with the 3.1 release. As a result, the host linker will give a link error regarding multiple defined symbols, if two identical __device__ functions are defined in two different compilations units. For example, when including function definitions through the #include <> mechanism; or a __device__ function and an identical host function are defined in two different compilations units.
For both cases, declaring the __device__ function as static will make the
compilation succeed.
```
Original issue reported on code.google.com by `kashif.r...@gmail.com` on 27 Jun 2010 at 4:27
|
defect
|
link error with cuda what steps will reproduce the problem update to cuda compile with cuda nvcc what is the expected output what do you see instead the linking should produce a shared lib instead get a linking error ld duplicate symbol operatormin identity constin cudpp generated scan app cu o and cudpp generated compact app cu o ld returned exit status make error what version of the product are you using on what operating system please provide any additional information below prior to the release nvcc treated device functions as implicitly static this behavior has changed with the release as a result the host linker will give a link error regarding multiple defined symbols if two identical device functions are defined in two different compilations units for example when including function definitions through the include mechanism or a device function and an identical host function are defined in two different compilations units for both cases declaring the device function as static will make the compilation succeed original issue reported on code google com by kashif r gmail com on jun at
| 1
|
80,595
| 30,347,561,921
|
IssuesEvent
|
2023-07-11 16:26:11
|
vector-im/element-integration-manager
|
https://api.github.com/repos/vector-im/element-integration-manager
|
closed
|
We need consistent widget settings UX
|
T-Defect
|
- Some URL inputs allow typing, some only allow pasting
- Some have static URLs pre-generated even before clicking "Save", which is odd because why do I need to save something that's already known? This might be as simple as not using the word "Save".
Maybe these issues will go away when we have a generic "Here are the instances of this widget" view, and "here you can edit the settings for this instance" view.
|
1.0
|
We need consistent widget settings UX - - Some URL inputs allow typing, some only allow pasting
- Some have static URLs pre-generated even before clicking "Save", which is odd because why do I need to save something that's already known? This might be as simple as not using the word "Save".
Maybe these issues will go away when we have a generic "Here are the instances of this widget" view, and "here you can edit the settings for this instance" view.
|
defect
|
we need consistent widget settings ux some url inputs allow typing some only allow pasting some have static urls pre generated even before clicking save which is odd because why do i need to save something that s already known this might be as simple as not using the word save maybe these issues will go away when we have a generic here are the instances of this widget view and here you can edit the settings for this instance view
| 1
|
14,674
| 8,664,957,570
|
IssuesEvent
|
2018-11-28 21:45:17
|
keras-team/keras
|
https://api.github.com/repos/keras-team/keras
|
closed
|
TensorBoard Callback write_images
|
type:bug/performance type:tensorFlow
|
I want to use the TensorBoard callback to visualize my conv layer kernels. But I can only see the first conv layer kernel in TensorBoard and my Dense layers at the end. For the other conv layers I can just see the bias values and not the kernels.
Here is my sample code for the Keras model.
```
# Imports
import tensorflow as tf
import numpy as np
import os
from os import makedirs
from os.path import exists, join
from keras.datasets import mnist
import time
from keras.layers import *
from keras.activations import *
from keras.models import *
from keras.optimizers import *
from keras.initializers import *
from keras.callbacks import TensorBoard
from keras.callbacks import ModelCheckpoint
from keras.utils.np_utils import to_categorical
from plotting import *
log_dir = "./"
# Load MNIST dataset
(x_train, y_train), (x_test, y_test) = mnist.load_data()
batch_size = 128
epochs = 10
width = 28
height = 28
depth = 1
num_classes = 10
train_size = x_train.shape[0]
test_size = x_test.shape[0]
x_train = x_train.reshape(train_size, width, height, depth)
y_train = to_categorical(y_train, num_classes=num_classes)
x_test = x_test.reshape(test_size, width, height, depth)
y_test = to_categorical(y_test, num_classes=num_classes)
tb = TensorBoard(
log_dir=log_dir,
histogram_freq=1,
write_graph=True,
write_images=True)
# Define the DNN
model = Sequential()
model.add(Conv2D(filters=16, kernel_size=3, input_shape=(width, height, depth), name="conv1"))
model.add(Activation("relu"))
model.add(Conv2D(filters=20, kernel_size=3, name="conv2"))
model.add(Activation("relu"))
model.add(MaxPool2D())
model.add(Conv2D(filters=24, kernel_size=3, name="conv3"))
model.add(Activation("relu"))
model.add(Conv2D(filters=28, kernel_size=3, name="conv4"))
model.add(Activation("relu"))
model.add(MaxPool2D())
model.add(Flatten())
model.add(Dense(128))
model.add(Activation("relu"))
model.add(Dense(num_classes, name="features"))
model.add(Activation("softmax"))
# Print the DNN layers
model.summary()
# Train the DNN
lr = 1e-3
optimizer = Adam(lr=lr)
model.compile(loss="categorical_crossentropy", optimizer=optimizer, metrics=["accuracy"])
model.fit(x_train, y_train, verbose=1, batch_size=batch_size, epochs=epochs, validation_data=(x_test, y_test), callbacks=[tb])
# Test the DNN
score = model.evaluate(x_test, y_test, batch_size=batch_size)
print("Test performance: ", score)
```
Here is the resulting screenshot from TensorBoard.

|
True
|
TensorBoard Callback write_images - I want to use the TensorBoard callback to visualize my conv layer kernels. But i can only see the first conv layer kernel in TensorBoard and my Dense layers at the end. For the other conv layers i can just see the bias values and not the kernels.
Here is my sample code for the Keras model.
```
# Imports
import tensorflow as tf
import numpy as np
import os
from os import makedirs
from os.path import exists, join
from keras.datasets import mnist
import time
from keras.layers import *
from keras.activations import *
from keras.models import *
from keras.optimizers import *
from keras.initializers import *
from keras.callbacks import TensorBoard
from keras.callbacks import ModelCheckpoint
from keras.utils.np_utils import to_categorical
from plotting import *
log_dir = "./"
# Load MNIST dataset
(x_train, y_train), (x_test, y_test) = mnist.load_data()
batch_size = 128
epochs = 10
width = 28
height = 28
depth = 1
num_classes = 10
train_size = x_train.shape[0]
test_size = x_test.shape[0]
x_train = x_train.reshape(train_size, width, height, depth)
y_train = to_categorical(y_train, num_classes=num_classes)
x_test = x_test.reshape(test_size, width, height, depth)
y_test = to_categorical(y_test, num_classes=num_classes)
tb = TensorBoard(
log_dir=log_dir,
histogram_freq=1,
write_graph=True,
write_images=True)
# Define the DNN
model = Sequential()
model.add(Conv2D(filters=16, kernel_size=3, input_shape=(width, height, depth), name="conv1"))
model.add(Activation("relu"))
model.add(Conv2D(filters=20, kernel_size=3, name="conv2"))
model.add(Activation("relu"))
model.add(MaxPool2D())
model.add(Conv2D(filters=24, kernel_size=3, name="conv3"))
model.add(Activation("relu"))
model.add(Conv2D(filters=28, kernel_size=3, name="conv4"))
model.add(Activation("relu"))
model.add(MaxPool2D())
model.add(Flatten())
model.add(Dense(128))
model.add(Activation("relu"))
model.add(Dense(num_classes, name="features"))
model.add(Activation("softmax"))
# Print the DNN layers
model.summary()
# Train the DNN
lr = 1e-3
optimizer = Adam(lr=lr)
model.compile(loss="categorical_crossentropy", optimizer=optimizer, metrics=["accuracy"])
model.fit(x_train, y_train, verbose=1, batch_size=batch_size, epochs=epochs, validation_data=(x_test, y_test), callbacks=[tb])
# Test the DNN
score = model.evaluate(x_test, y_test, batch_size=batch_size)
print("Test performance: ", score)
```
Here is the resulting screenshot from TensorBoard.

|
non_defect
|
tensorboard callback write images i want to use the tensorboard callback to visualize my conv layer kernels but i can only see the first conv layer kernel in tensorboard and my dense layers at the end for the other conv layers i can just see the bias values and not the kernels here is my sample code for the keras model imports import tensorflow as tf import numpy as np import os from os import makedirs from os path import exists join from keras datasets import mnist import time from keras layers import from keras activations import from keras models import from keras optimizers import from keras initializers import from keras callbacks import tensorboard from keras callbacks import modelcheckpoint from keras utils np utils import to categorical from plotting import log dir load mnist dataset x train y train x test y test mnist load data batch size epochs width height depth num classes train size x train shape test size x test shape x train x train reshape train size width height depth y train to categorical y train num classes num classes x test x test reshape test size width height depth y test to categorical y test num classes num classes tb tensorboard log dir log dir histogram freq write graph true write images true define the dnn model sequential model add filters kernel size input shape width height depth name model add activation relu model add filters kernel size name model add activation relu model add model add filters kernel size name model add activation relu model add filters kernel size name model add activation relu model add model add flatten model add dense model add activation relu model add dense num classes name features model add activation softmax print the dnn layers model summary train the dnn lr optimizer adam lr lr model compile loss categorical crossentropy optimizer optimizer metrics model fit x train y train verbose batch size batch size epochs epochs validation data x test y test callbacks test the dnn score model evaluate x test y test batch size batch size print test performance score here is the resulting screenshot from tensorboard
| 0
|
212,399
| 7,236,712,414
|
IssuesEvent
|
2018-02-13 08:25:13
|
bmintz/emoji-connoisseur
|
https://api.github.com/repos/bmintz/emoji-connoisseur
|
closed
|
Can't test the bot without affecting prod
|
bug high priority
|
Unfortunately, the nature of the bot requires several backend servers that are owned by a specific person, and no other guilds that are owned by the bot owner. This means that in order to test the bot without affecting prod, I need a separate set of backend guilds, which requires another account.
I tried to create a new account but it got "verify locked" again, requiring yet another phone number to be put in. Hopefully support@discordapp.com will help me here.
This isn't really a code issue, although https://github.com/bmintz/emoji-connoisseur/blob/4846a970d8f4d382f3a18b7a0e854d84fcb5156f/utils.py#L14 will have to be changed to pull from data/config.json afterwards.
|
1.0
|
Can't test the bot without affecting prod - Unfortunately, the nature of the bot requires several backend servers that are owned by a specific person, and no other guilds that are owned by the bot owner. This means that in order to test the bot without affecting prod, I need a separate set of backend guilds, which requires another account.
I tried to create a new account but it got "verify locked" again, requiring yet another phone number to be put in. Hopefully support@discordapp.com will help me here.
This isn't really a code issue, although https://github.com/bmintz/emoji-connoisseur/blob/4846a970d8f4d382f3a18b7a0e854d84fcb5156f/utils.py#L14 will have to be changed to pull from data/config.json afterwards.
|
non_defect
|
can t test the bot without affecting prod unfortunately the nature of the bot requires several backend servers that are owned by a specific person and no other guilds that are owned by the bot owner this means that in order to test the bot without affecting prod i need a separate set of backend guilds which requires another account i tried to create a new account but it got verify locked again requiring yet another phone number to be put in hopefully support discordapp com will help me here this isn t really a code issue although will have to be changed to pull from data config json afterwards
| 0
|
4,986
| 2,610,163,421
|
IssuesEvent
|
2015-02-26 18:51:49
|
chrsmith/republic-at-war
|
https://api.github.com/repos/chrsmith/republic-at-war
|
closed
|
Text
|
auto-migrated Priority-Medium Type-Defect
|
```
BARC speeder individual unit description is missing
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 3 May 2011 at 6:42
|
1.0
|
Text - ```
BARC speeder individual unit description is missing
```
-----
Original issue reported on code.google.com by `z3r0...@gmail.com` on 3 May 2011 at 6:42
|
defect
|
text barc speeder individual unit description is missing original issue reported on code google com by gmail com on may at
| 1
|
198,151
| 15,702,014,730
|
IssuesEvent
|
2021-03-26 12:02:07
|
devflask/RoboFlask
|
https://api.github.com/repos/devflask/RoboFlask
|
closed
|
Documentation
|
Documentation
|
Initial commit for documentation:
Create a documentation,
Add license
Add README
|
1.0
|
Documentation - Initial commit for documentation:
Create a documentation,
Add license
Add README
|
non_defect
|
documentation initial commit for documentation create a documentation add license add readme
| 0
|
59,567
| 17,023,164,082
|
IssuesEvent
|
2021-07-03 00:39:43
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
ways created in reverse direction
|
Component: potlatch (flash editor) Priority: major Resolution: fixed Type: defect
|
**[Submitted to the original trac issue database at 1.45pm, Saturday, 19th May 2007]**
Potlatch seems to create ways in the reverse direction to the way in which they were drawn, so the last segment drawn is segment number 1
|
1.0
|
ways created in reverse direction - **[Submitted to the original trac issue database at 1.45pm, Saturday, 19th May 2007]**
Potlatch seems to create ways in the reverse direction to the way in which they were drawn, so the last segment drawn is segment number 1
|
defect
|
ways created in reverse direction potlatch seems to create ways in the reverse direction to the way in which they were drawn so the last segment drawn is segment number
| 1
|
210,547
| 7,190,798,454
|
IssuesEvent
|
2018-02-02 18:33:46
|
openshift/origin
|
https://api.github.com/repos/openshift/origin
|
closed
|
Make the Source secret key name consistent with the filename
|
kind/bug priority/P3
|
Currently, the 'key' name for the Source secret has to be `ssh-privatekey`. However, the file it contains is `id_rsa` or `id_dsa` and it actually can be whatever (I personally have >4 SSH private keys for different systems). So:
- We should make the 'key' name match the filename
- We should have a 'SSHKey' type secret
|
1.0
|
Make the Source secret key name consistent with the filename - Currently, the 'key' name for the Source secret has to be `ssh-privatekey`. However, the file it contains is `id_rsa` or `id_dsa` and it actually can be whatever (I personally have >4 SSH private keys for different systems). So:
- We should make the 'key' name match the filename
- We should have a 'SSHKey' type secret
|
non_defect
|
make the source secret key name consistent with the filename currently the key name for the source secret has to be ssh privatekey however the file it contains is id rsa or id dsa and it actually can be whatever i personally have ssh private keys for different systems so we should make the key name match the filename we should have a sshkey type secret
| 0
|
109,445
| 23,766,364,985
|
IssuesEvent
|
2022-09-01 13:07:01
|
WordPress/gutenberg
|
https://api.github.com/repos/WordPress/gutenberg
|
closed
|
Editor: Refactor currentPost state to use core-data package
|
[Type] Code Quality [Package] Core data [Package] Editor
|
Previously: https://github.com/WordPress/gutenberg/pull/16402#discussion_r301216604
The editor maintains a copy of the original post being edited in its state, as `currentPost`. This value is always representative of the last saved value of the post; edits are maintained separately. There should be effectively no difference between this value and what would be produced by...
```
wp.data.select( 'core' ).getEntityRecord( 'postType', 'post', wp.data.select( 'core/editor' ).getCurrentPostId() );
```
By using the entity provided from the `core-data` data store, we gain some benefit:
- There is only at most one copy of the post held in memory at a given time
- Updates to the canonical post are reflected in the editor, and vice-versa
- The implementation will likely push us to implement saving and setting behaviors for `core-data` entities (and a subsequent refactor of `savePost` to use `core-data` as well)
- Reduced implementation complexity and maintenance overhead of the editor reducer file
**Implementation Specifics:**
For the most part, the crux of the implementation requires changing...
https://github.com/WordPress/gutenberg/blob/8995efde26c7eea6790f0d593dc7bfb5137d0084/packages/editor/src/store/selectors.js#L156-L158
...to:
```js
export const getCurrentPost = createRegistrySelector( ( registry ) => ( state ) => {
return registry.select( 'core' ).getEntityRecord( 'postType', 'post' /* !!! */, getCurrentPostId( state ) );
} );
```
Some details:
- How do we know the post type? We may still need to track this in editor state, or as an edit. This is unfortunately required because an entity cannot be retrieved from the REST API without providing its post type ([related discussion](https://wordpress.slack.com/archives/C02RQC26G/p1546983215113700)).
- We should see about providing a simpler interface to retrieving entities (I thought we already had `getPost` etc. auto-generated for entities?)
|
1.0
|
Editor: Refactor currentPost state to use core-data package - Previously: https://github.com/WordPress/gutenberg/pull/16402#discussion_r301216604
The editor maintains a copy of the original post being edited in its state, as `currentPost`. This value is always representative of the last saved value of the post; edits are maintained separately. There should be effectively no difference between this value and what would be produced by...
```
wp.data.select( 'core' ).getEntityRecord( 'postType', 'post', wp.data.select( 'core/editor' ).getCurrentPostId() );
```
By using the entity provided from the `core-data` data store, we gain some benefit:
- There is only at most one copy of the post held in memory at a given time
- Updates to the canonical post are reflected in the editor, and vice-versa
- The implementation will likely push us to implement saving and setting behaviors for `core-data` entities (and a subsequent refactor of `savePost` to use `core-data` as well)
- Reduced implementation complexity and maintenance overhead of the editor reducer file
**Implementation Specifics:**
For the most part, the crux of the implementation requires changing...
https://github.com/WordPress/gutenberg/blob/8995efde26c7eea6790f0d593dc7bfb5137d0084/packages/editor/src/store/selectors.js#L156-L158
...to:
```js
export const getCurrentPost = createRegistrySelector( ( registry ) => ( state ) => {
return registry.select( 'core' ).getEntityRecord( 'postType', 'post' /* !!! */, getCurrentPostId( state ) );
} );
```
Some details:
- How do we know the post type? We may still need to track this in editor state, or as an edit. This is unfortunately required because an entity cannot be retrieved from the REST API without providing its post type ([related discussion](https://wordpress.slack.com/archives/C02RQC26G/p1546983215113700)).
- We should see about providing a simpler interface to retrieving entities (I thought we already had `getPost` etc. auto-generated for entities?)
|
non_defect
|
editor refactor currentpost state to use core data package previously the editor maintains a copy of the original post being edited in its state as currentpost this value is always representative of the last saved value of the post edits are maintained separately there should be effectively no difference between this value and what would be produced by wp data select core getentityrecord posttype post wp data select core editor getcurrentpostid by using the entity provided from the core data data store we gain some benefit there is only at most one copy of the post held in memory at a given time updates to the canonical post are reflected in the editor and vice versa the implementation will likely push us to implement saving and setting behaviors for core data entities and a subsequent refactor of savepost to use core data as well reduced implementation complexity and maintenance overhead of the editor reducer file implementation specifics for the most part the crux of the implementation requires changing to js export const getcurrentpost createregistryselector registry state return registry select core getentityrecord posttype post getcurrentpostid state some details how do we know the post type we may still need to track this in editor state or as an edit this is unfortunately required because an entity cannot be retrieved from the rest api without providing its post type we should see about providing a simpler interface to retrieving entities i thought we already had getpost etc auto generated for entities
| 0
|
246,530
| 18,846,857,902
|
IssuesEvent
|
2021-11-11 15:50:55
|
game-sales-analytics/userssrv
|
https://api.github.com/repos/game-sales-analytics/userssrv
|
opened
|
Add Protocol Buffer definitions
|
documentation enhancement
|
Currently, the following RPCs are required from the service:
- [ ] Register User:
Given user registration information along with their email and password, service must persist the user information, and return persisted user information as success response.
- [ ] Login User:
Given user email, password, and other contextual information, e.g., their IP address, device user-agent information, etc., service must verify provided credentials. Upon successful verification, it should generate an auth token and return it along with useful information about the generated token, e.g., token expiration time, as successful login result.
- [ ] Authenticate User
Given user auth token, service should verify the token, and upon successful verification, it should return useful information about the authenticated user, e.g., user ID, as successful authentication result to caller.
|
1.0
|
Add Protocol Buffer definitions - Currently, the following RPCs are required from the service:
- [ ] Register User:
Given user registration information along with their email and password, service must persist the user information, and return persisted user information as success response.
- [ ] Login User:
Given user email, password, and other contextual information, e.g., their IP address, device user-agent information, etc., service must verify provided credentials. Upon successful verification, it should generate an auth token and return it along with useful information about the generated token, e.g., token expiration time, as successful login result.
- [ ] Authenticate User
Given user auth token, service should verify the token, and upon successful verification, it should return useful information about the authenticated user, e.g., user ID, as successful authentication result to caller.
|
non_defect
|
add protocol buffer definitions currently the following rpcs are required from the service register user given user registration information along with their email and password service must persist the user information and return persisted user information as success response login user given user email password and other contextual information e g their ip address device user agent information etc service must verify provided credentials upon successful verification it should generate an auth token and return it along with useful information about the generated token e g token expiration time as successful login result authenticate user given user auth token service should verify the token and upon successful verification it should return useful information about the authenticated user e g user id as successful authentication result to caller
| 0
|
19,131
| 3,144,822,978
|
IssuesEvent
|
2015-09-14 15:09:31
|
ox-it/ords
|
https://api.github.com/repos/ox-it/ords
|
closed
|
Cannot save changes to records with Boolean data type if table also contains a Time field
|
auto-migrated Priority-Critical Type-Defect
|
```
What steps will reproduce the problem?
1. Open a table which contains a field with the Boolean data type, plus one
with the Time data type
2. Try to edit a field with the Boolean data type, and save the changes
3.
What is the expected output? What do you see instead?
For some reason, ORDS won't let changes to a Boolean field be saved if the same
table also contains a Time field. As there are also problems with Time fields
(see issue 702), possibly this is related?
It does seem to be possible to add a new record with the Boolean field checked
- just not to edit an existing record.
(We initially thought this problem only existed in Dev, but a bit more
experimentation indicates it's also true of App - just only when there's also a
Time field present.)
Please use labels and text to provide additional information.
```
Original issue reported on code.google.com by `meriel.p...@gmail.com` on 3 Aug 2015 at 1:24
|
1.0
|
Cannot save changes to records with Boolean data type if table also contains a Time field - ```
What steps will reproduce the problem?
1. Open a table which contains a field with the Boolean data type, plus one
with the Time data type
2. Try to edit a field with the Boolean data type, and save the changes
3.
What is the expected output? What do you see instead?
For some reason, ORDS won't let changes to a Boolean field be saved if the same
table also contains a Time field. As there are also problems with Time fields
(see issue 702), possibly this is related?
It does seem to be possible to add a new record with the Boolean field checked
- just not to edit an existing record.
(We initially thought this problem only existed in Dev, but a bit more
experimentation indicates it's also true of App - just only when there's also a
Time field present.)
Please use labels and text to provide additional information.
```
Original issue reported on code.google.com by `meriel.p...@gmail.com` on 3 Aug 2015 at 1:24
|
defect
|
cannot save changes to records with boolean data type if table also contains a time field what steps will reproduce the problem open a table which contains a field with the boolean data type plus one with the time data type try to edit a field with the boolean data type and save the changes what is the expected output what do you see instead for some reason ords won t let changes to a boolean field be saved if the same table also contains a time field as there are also problems with time fields see issue possibly this is related it does seem to be possible to add a new record with the boolean field checked just not to edit an existing record we initially thought this problem only existed in dev but a bit more experimentation indicates it s also true of app just only when there s also a time field present please use labels and text to provide additional information original issue reported on code google com by meriel p gmail com on aug at
| 1
|
30,375
| 6,123,330,137
|
IssuesEvent
|
2017-06-23 04:09:31
|
Advanced-Post-List/advanced-post-list
|
https://api.github.com/repos/Advanced-Post-List/advanced-post-list
|
closed
|
Change Admin Page for 0.4
|
P3 - Major T-Possible Defect
|
Redesign the admin page top-down following many of the standards and styles already set in place by WordPress. Eliminate any outside code ( jQuery UI & MultiSelect ).
Design UI/UX as Follows
- Hide **Post Type & Taxonomy** until clicked/check.
- Hide **Empty Message** Design.
- Collapse **Before** and **After**.
[Mock-up submitted by user](https://wordpress.org/support/topic/finally-a-good-powerful-list-plugin/#post-8402850).

|
1.0
|
Change Admin Page for 0.4 - Redesign the admin page top-down following many of the standards and styles already set in place by WordPress. Eliminate any outside code ( jQuery UI & MultiSelect ).
Design UI/UX as Follows
- Hide **Post Type & Taxonomy** until clicked/check.
- Hide **Empty Message** Design.
- Collapse **Before** and **After**.
[Mock-up submitted by user](https://wordpress.org/support/topic/finally-a-good-powerful-list-plugin/#post-8402850).

|
defect
|
change admin page for redesign the admin page top down following many of the standards and styles already set in place by wordpress eliminate any outside code jquery ui multiselect design ui ux as follows hide post type taxonomy until clicked check hide empty message design collapse before and after
| 1
|
51,212
| 13,207,395,224
|
IssuesEvent
|
2020-08-14 22:56:32
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
Request for I3RecoPulseMap typedef (Trac #48)
|
Incomplete Migration Migrated from Trac defect offline-software
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/48">https://code.icecube.wisc.edu/projects/icecube/ticket/48</a>, reported by blaufuss</summary>
<p>
```json
{
"status": "closed",
"changetime": "2007-11-11T03:51:18",
"_ts": "1194753078000000",
"description": "An old request from Tilo:\n\nHi Erik,\n\nat the moment we are cleaning up and restructuring the IceTop reconstruction \nmodules and we would like to add the\n\ntypedef I3Map<OMKey, I3RecoPulse> I3RecoPulseMap;\n\nand the corresponding pointer typedefs into the I3RecoPulse header file. We \nwant to switch from using I3RecoPulseSeriesMaps to I3RecoPulseMaps for two \nreasons:\n\n(1) In IceTop we always deal with single pulses and we never use second pulses \nin our analysis. We are only interested in the integrated charge. Therefore a \nsingle pulse per tank is sufficient.\n\n(2) The topeventbuilder module converts the pulse charges from PE into VEM. To \navoid confusions we want to store our final VEM pulses in a I3RecoPulseMap. \nThen they can't be confused anymore with PE pulse which are normally stored \nin I3RecoPulseSeriesMaps.\n\nCould you please add these few lines into the I3RecoPulse?\n\nCheers, Tilo",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"time": "2007-06-06T01:56:02",
"component": "offline-software",
"summary": "Request for I3RecoPulseMap typedef",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
Request for I3RecoPulseMap typedef (Trac #48) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/48">https://code.icecube.wisc.edu/projects/icecube/ticket/48</a>, reported by blaufuss</summary>
<p>
```json
{
"status": "closed",
"changetime": "2007-11-11T03:51:18",
"_ts": "1194753078000000",
"description": "An old request from Tilo:\n\nHi Erik,\n\nat the moment we are cleaning up and restructuring the IceTop reconstruction \nmodules and we would like to add the\n\ntypedef I3Map<OMKey, I3RecoPulse> I3RecoPulseMap;\n\nand the corresponding pointer typedefs into the I3RecoPulse header file. We \nwant to switch from using I3RecoPulseSeriesMaps to I3RecoPulseMaps for two \nreasons:\n\n(1) In IceTop we always deal with single pulses and we never use second pulses \nin our analysis. We are only interested in the integrated charge. Therefore a \nsingle pulse per tank is sufficient.\n\n(2) The topeventbuilder module converts the pulse charges from PE into VEM. To \navoid confusions we want to store our final VEM pulses in a I3RecoPulseMap. \nThen they can't be confused anymore with PE pulse which are normally stored \nin I3RecoPulseSeriesMaps.\n\nCould you please add these few lines into the I3RecoPulse?\n\nCheers, Tilo",
"reporter": "blaufuss",
"cc": "",
"resolution": "fixed",
"time": "2007-06-06T01:56:02",
"component": "offline-software",
"summary": "Request for I3RecoPulseMap typedef",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
defect
|
request for typedef trac migrated from json status closed changetime ts description an old request from tilo n nhi erik n nat the moment we are cleaning up and restructuring the icetop reconstruction nmodules and we would like to add the n ntypedef n nand the corresponding pointer typedefs into the header file we nwant to switch from using to for two nreasons n n in icetop we always deal with single pulses and we never use second pulses nin our analysis we are only interested in the integrated charge therefore a nsingle pulse per tank is sufficient n n the topeventbuilder module converts the pulse charges from pe into vem to navoid confusions we want to store our final vem pulses in a nthen they can t be confused anymore with pe pulse which are normally stored nin n ncould you please add these few lines into the n ncheers tilo reporter blaufuss cc resolution fixed time component offline software summary request for typedef priority normal keywords milestone owner type defect
| 1
|
641,670
| 20,831,870,933
|
IssuesEvent
|
2022-03-19 15:26:29
|
internet4refugees/beherbergung
|
https://api.github.com/repos/internet4refugees/beherbergung
|
opened
|
[Feature] Links between Rows in Table and Markers in Map
|
enhancement frontend medium priority
|
**Problem**
Easy jumping from one to another
**Suggested solution**
- [ ] Link from markers popup to anchor in row
- [ ] Onclick-event from row (or link from address-cells) to marker
- [ ] Update center map at this location
|
1.0
|
[Feature] Links between Rows in Table and Markers in Map - **Problem**
Easy jumping from one to another
**Suggested solution**
- [ ] Link from markers popup to anchor in row
- [ ] Onclick-event from row (or link from address-cells) to marker
- [ ] Update center map at this location
|
non_defect
|
links between rows in table and markers in map problem easy jumping from one to another suggested solution link from markers popup to anchor in row onclick event from row or link from address cells to marker update center map at this location
| 0
|
76,878
| 26,653,314,218
|
IssuesEvent
|
2023-01-25 15:09:25
|
hyperledger/iroha
|
https://api.github.com/repos/hyperledger/iroha
|
opened
|
[BUG] Failed tolerance doesn't work on the 7 peers.
|
Bug iroha2 Dev defect Pre-alpha defect QA-confirmed
|
### OS and Environment
MacOS, Docker Hub
### GIT commit hash
ac35bda6
### Minimum working example / Steps to reproduce
1. Run the `docker-compose.yml` with 7 peers below.
```yml
version: "3.8"
services:
iroha0:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha0:1337
TORII_API_URL: iroha0:8080
TORII_TELEMETRY_URL: iroha0:8180
IROHA_PUBLIC_KEY: "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "282ed9f3cf92811c3818dbc4ae594ed59dc1a2f78e4241e31924e101d6b1fb831c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
IROHA_GENESIS_ACCOUNT_PRIVATE_KEY: '{ "digest_function": "ed25519", "payload": "038ae16b219da35aa036335ed0a43c28a2cc737150112c78a7b8034b9d99c9023f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255" }'
ports:
- "1337:1337"
- "8080:8080"
- "8180:8180"
volumes:
- './configs/peer:/config'
init: true
command: iroha --submit-genesis
iroha1:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha1:1338
TORII_API_URL: iroha1:8081
TORII_TELEMETRY_URL: iroha1:8181
IROHA_PUBLIC_KEY: "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "3bac34cda9e3763fa069c1198312d1ec73b53023b8180c822ac355435edc4a24cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1338:1338"
- "8081:8081"
- "8181:8181"
volumes:
- './configs/peer:/config'
init: true
iroha2:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha2:1339
TORII_API_URL: iroha2:8082
TORII_TELEMETRY_URL: iroha2:8182
IROHA_PUBLIC_KEY: "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "1261a436d36779223d7d6cf20e8b644510e488e6a50bafd77a7485264d27197dfaca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1339:1339"
- "8082:8082"
- "8182:8182"
volumes:
- './configs/peer:/config'
init: true
iroha3:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha3:1340
TORII_API_URL: iroha3:8083
TORII_TELEMETRY_URL: iroha3:8183
IROHA_PUBLIC_KEY: "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "a70dab95c7482eb9f159111b65947e482108cfe67df877bd8d3b9441a781c7c98e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1340:1340"
- "8083:8083"
- "8183:8183"
volumes:
- './configs/peer:/config'
init: true
iroha4:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha4:1341
TORII_API_URL: iroha4:8084
TORII_TELEMETRY_URL: iroha4:8184
IROHA_PUBLIC_KEY: "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "449dab240ceba408d21cb450cd3107ad80c7a3ee0e309fb84f9a91726ed4eb2fe23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1341:1341"
- "8084:8084"
- "8184:8184"
volumes:
- './configs/peer:/config'
init: true
iroha5:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha5:1342
TORII_API_URL: iroha5:8095
TORII_TELEMETRY_URL: iroha5:8195
IROHA_PUBLIC_KEY: "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "44643be1ddd3a679a2a092dc021ad34c0faa3e1df6b6d88f345057861d577d8954b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1342:1342"
- "8095:8095"
- "8195:8195"
volumes:
- './configs/peer:/config'
init: true
iroha6:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha6:1343
TORII_API_URL: iroha6:8086
TORII_TELEMETRY_URL: iroha6:8186
IROHA_PUBLIC_KEY: "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "c4e1e9ad5d21763375f81780d2d5ee0c4fc678d328c3f2517b511642f1382bfe1feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1343:1343"
- "8086:8086"
- "8186:8186"
volumes:
- './configs/peer:/config'
init: true
```
2. Send transaction
```bash
./iroha_client_cli asset register \
--id="tea#wonderland" \
--value-type=Quantity
```
### Actual result
```bash
2023-01-25T12:49:22.822635Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
```
### Expected result
Successfully transaction
### Logs in JSON format
<details>
<summary>Log contents</summary>
```bash
2023-01-25T12:49:22.155425Z WARN run: iroha_core::sumeragi::main_loop: No block produced in due time, requesting view change... role=ProxyTail
2023-01-25T12:49:22.181677Z WARN iroha_p2p::network: Didn't find peer to send message peer.id=Id { address: "iroha5:1342", public_key: { digest: ed25519, payload: 54B2008E5D71065AE061131EEE052D282CDBA7DC0363A3D87A91409AD91A1DDA } }
2023-01-25T12:49:22.500266Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:22.563178Z WARN iroha_p2p::network: Unable to create peer error=Failed IO operation.
2023-01-25T12:49:22.563574Z WARN iroha_p2p::network: Didn't find peer to send message peer.id=Id { address: "iroha5:1342", public_key: { digest: ed25519, payload: 54B2008E5D71065AE061131EEE052D282CDBA7DC0363A3D87A91409AD91A1DDA } }
2023-01-25T12:49:22.822635Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.078555Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.317350Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.634494Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.686324Z WARN iroha_p2p::network: Unable to create peer error=Failed IO operation.
```
</details>
### Who can help to reproduce?
@astrokov7
### Notes
_No response_
|
2.0
|
[BUG] Failed tolerance doesn't work on the 7 peers. - ### OS and Environment
MacOS, Docker Hub
### GIT commit hash
ac35bda6
### Minimum working example / Steps to reproduce
1. Run the `docker-compose.yml` with 7 peers below.
```yml
version: "3.8"
services:
iroha0:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha0:1337
TORII_API_URL: iroha0:8080
TORII_TELEMETRY_URL: iroha0:8180
IROHA_PUBLIC_KEY: "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "282ed9f3cf92811c3818dbc4ae594ed59dc1a2f78e4241e31924e101d6b1fb831c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
IROHA_GENESIS_ACCOUNT_PRIVATE_KEY: '{ "digest_function": "ed25519", "payload": "038ae16b219da35aa036335ed0a43c28a2cc737150112c78a7b8034b9d99c9023f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255" }'
ports:
- "1337:1337"
- "8080:8080"
- "8180:8180"
volumes:
- './configs/peer:/config'
init: true
command: iroha --submit-genesis
iroha1:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha1:1338
TORII_API_URL: iroha1:8081
TORII_TELEMETRY_URL: iroha1:8181
IROHA_PUBLIC_KEY: "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "3bac34cda9e3763fa069c1198312d1ec73b53023b8180c822ac355435edc4a24cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1338:1338"
- "8081:8081"
- "8181:8181"
volumes:
- './configs/peer:/config'
init: true
iroha2:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha2:1339
TORII_API_URL: iroha2:8082
TORII_TELEMETRY_URL: iroha2:8182
IROHA_PUBLIC_KEY: "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "1261a436d36779223d7d6cf20e8b644510e488e6a50bafd77a7485264d27197dfaca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1339:1339"
- "8082:8082"
- "8182:8182"
volumes:
- './configs/peer:/config'
init: true
iroha3:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha3:1340
TORII_API_URL: iroha3:8083
TORII_TELEMETRY_URL: iroha3:8183
IROHA_PUBLIC_KEY: "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "a70dab95c7482eb9f159111b65947e482108cfe67df877bd8d3b9441a781c7c98e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1340:1340"
- "8083:8083"
- "8183:8183"
volumes:
- './configs/peer:/config'
init: true
iroha4:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha4:1341
TORII_API_URL: iroha4:8084
TORII_TELEMETRY_URL: iroha4:8184
IROHA_PUBLIC_KEY: "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "449dab240ceba408d21cb450cd3107ad80c7a3ee0e309fb84f9a91726ed4eb2fe23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1341:1341"
- "8084:8084"
- "8184:8184"
volumes:
- './configs/peer:/config'
init: true
iroha5:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha5:1342
TORII_API_URL: iroha5:8095
TORII_TELEMETRY_URL: iroha5:8195
IROHA_PUBLIC_KEY: "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "44643be1ddd3a679a2a092dc021ad34c0faa3e1df6b6d88f345057861d577d8954b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1342:1342"
- "8095:8095"
- "8195:8195"
volumes:
- './configs/peer:/config'
init: true
iroha6:
image: hyperledger/iroha2:dev
environment:
TORII_P2P_ADDR: iroha6:1343
TORII_API_URL: iroha6:8086
TORII_TELEMETRY_URL: iroha6:8186
IROHA_PUBLIC_KEY: "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"
IROHA_PRIVATE_KEY: '{"digest_function": "ed25519", "payload": "c4e1e9ad5d21763375f81780d2d5ee0c4fc678d328c3f2517b511642f1382bfe1feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}'
SUMERAGI_TRUSTED_PEERS: '[{"address":"iroha0:1337", "public_key": "ed01201c61faf8fe94e253b93114240394f79a607b7fa55f9e5a41ebec74b88055768b"}, {"address":"iroha1:1338", "public_key": "ed0120cc25624d62896d3a0bfd8940f928dc2abf27cc57cefeb442aa96d9081aae58a1"}, {"address": "iroha2:1339", "public_key": "ed0120faca9e8aa83225cb4d16d67f27dd4f93fc30ffa11adc1f5c88fd5495ecc91020"}, {"address": "iroha3:1340", "public_key": "ed01208e351a70b6a603ed285d666b8d689b680865913ba03ce29fb7d13a166c4e7f1f"}, {"address": "iroha4:1341", "public_key": "ed0120e23057aad8a49cb05028c16113306dca761a32f731ba2abde7d447ac9f5e347e"}, {"address": "iroha5:1342", "public_key": "ed012054b2008e5d71065ae061131eee052d282cdba7dc0363a3d87a91409ad91a1dda"}, {"address": "iroha6:1343", "public_key": "ed01201feeba0ba2238a5c63b7b9992c13da227c787e519a19c666de369c943ea876a8"}]'
IROHA_GENESIS_ACCOUNT_PUBLIC_KEY: 'ed01203f4e3e98571b55514edc5ccf7e53ca7509d89b2868e62921180a6f57c2f4e255'
ports:
- "1343:1343"
- "8086:8086"
- "8186:8186"
volumes:
- './configs/peer:/config'
init: true
```
2. Send transaction
```bash
./iroha_client_cli asset register \
--id="tea#wonderland" \
--value-type=Quantity
```
### Actual result
```bash
2023-01-25T12:49:22.822635Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
```
### Expected result
Successfully transaction
### Logs in JSON format
<details>
<summary>Log contents</summary>
```bash
2023-01-25T12:49:22.155425Z WARN run: iroha_core::sumeragi::main_loop: No block produced in due time, requesting view change... role=ProxyTail
2023-01-25T12:49:22.181677Z WARN iroha_p2p::network: Didn't find peer to send message peer.id=Id { address: "iroha5:1342", public_key: { digest: ed25519, payload: 54B2008E5D71065AE061131EEE052D282CDBA7DC0363A3D87A91409AD91A1DDA } }
2023-01-25T12:49:22.500266Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:22.563178Z WARN iroha_p2p::network: Unable to create peer error=Failed IO operation.
2023-01-25T12:49:22.563574Z WARN iroha_p2p::network: Didn't find peer to send message peer.id=Id { address: "iroha5:1342", public_key: { digest: ed25519, payload: 54B2008E5D71065AE061131EEE052D282CDBA7DC0363A3D87A91409AD91A1DDA } }
2023-01-25T12:49:22.822635Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.078555Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.317350Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.634494Z ERROR iroha_core::sumeragi: This peer is faulty. Incoming messages have to be dropped due to low processing speed.
2023-01-25T12:49:23.686324Z WARN iroha_p2p::network: Unable to create peer error=Failed IO operation.
```
</details>
### Who can help to reproduce?
@astrokov7
### Notes
_No response_
|
defect
|
failed tolerance doesn t work on the peers os and environment macos docker hub git commit hash minimum working example steps to reproduce run the docker compose yml with peers below yml version services image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key iroha genesis account private key digest function payload ports volumes configs peer config init true command iroha submit genesis image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key ports volumes configs peer config init true image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key ports volumes configs peer config init true image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key ports volumes configs peer config init true image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key ports volumes configs peer config init true image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key ports volumes configs peer config init true image hyperledger dev environment torii addr torii api url torii telemetry url iroha public key iroha private key digest function payload sumeragi trusted peers iroha genesis account public key ports volumes configs peer config init true send transaction bash iroha client cli asset register id tea wonderland value type quantity actual result bash error iroha core sumeragi this peer is faulty incoming messages have to be dropped due to low processing speed expected result successfully transaction logs in json format log contents bash warn run iroha core sumeragi main loop no block produced in due time requesting view change role proxytail warn iroha network didn t find peer to send message peer id id address public key digest payload error iroha core sumeragi this peer is faulty incoming messages have to be dropped due to low processing speed warn iroha network unable to create peer error failed io operation warn iroha network didn t find peer to send message peer id id address public key digest payload error iroha core sumeragi this peer is faulty incoming messages have to be dropped due to low processing speed error iroha core sumeragi this peer is faulty incoming messages have to be dropped due to low processing speed error iroha core sumeragi this peer is faulty incoming messages have to be dropped due to low processing speed error iroha core sumeragi this peer is faulty incoming messages have to be dropped due to low processing speed warn iroha network unable to create peer error failed io operation who can help to reproduce notes no response
| 1
|
446,924
| 12,879,733,909
|
IssuesEvent
|
2020-07-12 00:22:51
|
krsiakdaniel/movies
|
https://api.github.com/repos/krsiakdaniel/movies
|
closed
|
Convert JS => TypeScript
|
enhancement no-issue-activity priority
|
Convert `.js` => `.tsx` or `.ts`
- [x] model + page: HOME = https://github.com/krsiakdaniel/movies/pull/98
- [ ] model + page: MOVIE
- [ ] remove `PropTypes` + uninstall
|
1.0
|
Convert JS => TypeScript - Convert `.js` => `.tsx` or `.ts`
- [x] model + page: HOME = https://github.com/krsiakdaniel/movies/pull/98
- [ ] model + page: MOVIE
- [ ] remove `PropTypes` + uninstall
|
non_defect
|
convert js typescript convert js tsx or ts model page home model page movie remove proptypes uninstall
| 0
|
66,918
| 7,028,252,214
|
IssuesEvent
|
2017-12-25 08:14:27
|
renderforest/notification-hooks
|
https://api.github.com/repos/renderforest/notification-hooks
|
closed
|
add test (full coverage)
|
test
|
* test setup is done, remains write tests
Currently we have only slack hook (not too much to write)
|
1.0
|
add test (full coverage) - * test setup is done, remains write tests
Currently we have only slack hook (not too much to write)
|
non_defect
|
add test full coverage test setup is done remains write tests currently we have only slack hook not too much to write
| 0
|
360,550
| 10,694,188,216
|
IssuesEvent
|
2019-10-23 10:18:53
|
OpenSourceEconomics/soepy
|
https://api.github.com/repos/OpenSourceEconomics/soepy
|
closed
|
remove nuisance files
|
enhancement pb package priority low size small
|
There appear to be some files that are leftover from our attempts at the packaging. Please check whether they can be removed. For example, build.sh setup.cfg ...
|
1.0
|
remove nuisance files - There appear to be some files that are leftover from our attempts at the packaging. Please check whether they can be removed. For example, build.sh setup.cfg ...
|
non_defect
|
remove nuisance files there appear to be some files that are leftover from our attempts at the packaging please check whether they can be removed for example build sh setup cfg
| 0
|
240,049
| 26,254,316,772
|
IssuesEvent
|
2023-01-05 22:32:23
|
tamirverthim/gutenberg
|
https://api.github.com/repos/tamirverthim/gutenberg
|
opened
|
CVE-2021-3807 (High) detected in ansi-regex-3.0.0.tgz
|
security vulnerability
|
## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>
Dependency Hierarchy:
- @wordpress/blocks-file:packages/blocks.tgz (Root Library)
- showdown-1.8.6.tgz
- yargs-10.1.2.tgz
- cliui-4.1.0.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p>
</p>
</details>
<p></p>
|
True
|
CVE-2021-3807 (High) detected in ansi-regex-3.0.0.tgz - ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>
Dependency Hierarchy:
- @wordpress/blocks-file:packages/blocks.tgz (Root Library)
- showdown-1.8.6.tgz
- yargs-10.1.2.tgz
- cliui-4.1.0.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution: ansi-regex - 5.0.1,6.0.1</p>
</p>
</details>
<p></p>
|
non_defect
|
cve high detected in ansi regex tgz cve high severity vulnerability vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href dependency hierarchy wordpress blocks file packages blocks tgz root library showdown tgz yargs tgz cliui tgz strip ansi tgz x ansi regex tgz vulnerable library vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex
| 0
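The "Suggested Fix" above names 5.0.1 and 6.0.1 as the patched ansi-regex releases. A minimal sketch of the version check a dependency scanner might apply — the `is_patched` helper is hypothetical, not part of Mend or ansi-regex:

```python
# Hypothetical helper: decide whether an installed ansi-regex version is at or
# above one of the patched releases (5.0.1 / 6.0.1) quoted in the advisory.
FIXED_VERSIONS = [(5, 0, 1), (6, 0, 1)]

def parse(version: str) -> tuple:
    """Turn a dotted version string like '3.0.0' into a comparable int tuple."""
    return tuple(int(part) for part in version.split("."))

def is_patched(version: str) -> bool:
    """True only when the version sits on a fixed major line, at or past the fix."""
    v = parse(version)
    return any(v[0] == fix[0] and v >= fix for fix in FIXED_VERSIONS)

vulnerable = not is_patched("3.0.0")  # the library flagged in this report
```

Real scanners evaluate full semver ranges; this sketch collapses the advisory's two fix lines into plain tuple comparisons.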
|
49,457
| 13,186,741,137
|
IssuesEvent
|
2020-08-13 01:10:06
|
icecube-trac/tix3
|
https://api.github.com/repos/icecube-trac/tix3
|
opened
|
[steamshovel] QSpinBox of ShovelSlider is unreadable on Mac (Trac #1436)
|
Incomplete Migration Migrated from Trac combo core defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1436">https://code.icecube.wisc.edu/ticket/1436</a>, reported by hdembinski and owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-12-16T17:35:34",
"description": "The text inside the spinbox is cut off at the top and the bottom. Seems to be only a Mac issue (works on Ubuntu 14.04).",
"reporter": "hdembinski",
"cc": "delia.tosi, david.schultz",
"resolution": "fixed",
"_ts": "1450287334271104",
"component": "combo core",
"summary": "[steamshovel] QSpinBox of ShovelSlider is unreadable on Mac",
"priority": "normal",
"keywords": "",
"time": "2015-11-17T21:47:10",
"milestone": "Long-Term Future",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
[steamshovel] QSpinBox of ShovelSlider is unreadable on Mac (Trac #1436) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/ticket/1436">https://code.icecube.wisc.edu/ticket/1436</a>, reported by hdembinski and owned by hdembinski</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2015-12-16T17:35:34",
"description": "The text inside the spinbox is cut off at the top and the bottom. Seems to be only a Mac issue (works on Ubuntu 14.04).",
"reporter": "hdembinski",
"cc": "delia.tosi, david.schultz",
"resolution": "fixed",
"_ts": "1450287334271104",
"component": "combo core",
"summary": "[steamshovel] QSpinBox of ShovelSlider is unreadable on Mac",
"priority": "normal",
"keywords": "",
"time": "2015-11-17T21:47:10",
"milestone": "Long-Term Future",
"owner": "hdembinski",
"type": "defect"
}
```
</p>
</details>
|
defect
|
qspinbox of shovelslider is unreadable on mac trac migrated from json status closed changetime description the text inside the spinbox is cut off at the top and the bottom seems to be only a mac issue works on ubuntu reporter hdembinski cc delia tosi david schultz resolution fixed ts component combo core summary qspinbox of shovelslider is unreadable on mac priority normal keywords time milestone long term future owner hdembinski type defect
| 1
|
72,368
| 24,081,084,431
|
IssuesEvent
|
2022-09-19 06:38:56
|
martinrotter/rssguard
|
https://api.github.com/repos/martinrotter/rssguard
|
closed
|
[BUG]: RSS Guard misses new entries in specific feed
|
Type-Defect
|
### Brief description of the issue
I have https://www.thecodedmessage.com/index.xml as a feed, and it no longer grabs new articles ("Programming Portfolio" from 2022-06-23 is the latest one). "Fetch metadata" still works (and selects the ATOM 1.0 type), and both my browser and curl are still able to fetch it, showing it does have new entries, making me believe the issue lies with RSS Guard. I have tried adding the URL as a new feed, but the same issue occurs (it simply has no articles and doesn't get any). This is the only URL I have this issue with.
### How to reproduce the bug?
1. Add or have https://www.thecodedmessage.com/index.xml as a feed
2. Fetch entries for it
### What was the expected result?
The new articles show up in the articles view.
### What actually happened?
The feed name in the feed list becomes blue, but no new articles appear.
### Debug log
The most relevant line appears to be:
```
time=" 30.767" type="critical" -> database: Failed bulk insert of articles: 'CHECK constraint failed: date_created >= 0 Unable to fetch rowCHECK constraint failed: date_created >= 0Unable to fetch row'.
```
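That constraint failure can be reproduced with Python's stdlib `sqlite3`. The table below is an illustrative stand-in for RSS Guard's real schema, not a copy of it; only the `CHECK (date_created >= 0)` constraint mirrors the log:

```python
import sqlite3

# Illustrative stand-in for the failing table: a feed date that parses to a
# negative epoch value violates the CHECK constraint and aborts the batch.
con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE Messages ("
    " custom_id TEXT,"
    " date_created INTEGER CHECK (date_created >= 0))"
)

rows = [
    ("https://www.thecodedmessage.com/posts/strong-typing/", 1662933600),  # parses fine
    ("https://www.thecodedmessage.com/about/", -1),  # date mis-parsed to a negative epoch
]

err = None
try:
    with con:  # one transaction, as in a bulk insert
        con.executemany(
            "INSERT INTO Messages (custom_id, date_created) VALUES (?, ?)", rows
        )
except sqlite3.IntegrityError as exc:
    err = exc  # "CHECK constraint failed: ..." — the error class seen in the log

# The whole batch rolls back, so even the well-formed article is not stored,
# which matches the symptom: the feed turns blue but no new articles appear.
count = con.execute("SELECT COUNT(*) FROM Messages").fetchone()[0]
```

Because `executemany` runs inside a single transaction, one bad date discards all 66 downloaded messages, not just the offending one.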
```
time=" 29.712" type="debug" -> feed-downloader: Downloading new messages for feed ID '269' URL: 'https://www.thecodedmessage.com/index.xml' title: 'The Coded Message' in thread: '0x3988'.
time=" 29.712" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 29.713" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 29.713" type="debug" -> core: Downloading URL 'https://www.thecodedmessage.com/index.xml' to obtain feed data.
time=" 29.713" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time=" 30.471" type="debug" -> network: Destroying Downloader instance.
time=" 30.472" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time=" 30.606" type="debug" -> feed-downloader: Downloaded 66 messages for feed ID '269' URL: 'https://www.thecodedmessage.com/index.xml' title: 'The Coded Message' in thread: '0x3988'. Operation took 893939 microseconds.
time=" 30.613" type="debug" -> feed-downloader: Saving messages of feed ID '269' URL: 'https://www.thecodedmessage.com/index.xml' title: 'The Coded Message' in thread: '0x3988'.
time=" 30.613" type="debug" -> core: Updating messages in DB. Main thread: 'false'.
time=" 30.613" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.614" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.614" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/strong-typing/' is present in DB.
time=" 30.617" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/erased-serde/' is present in DB.
time=" 30.620" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/write-everything-down/' is present in DB.
time=" 30.622" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/blocking-sockets/' is present in DB.
time=" 30.624" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/2022-07-14-programming-unwrap/' is present in DB.
time=" 30.626" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/review-plain-truth/' is present in DB.
time=" 30.628" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/haskell-error-message-2/' is present in DB.
time=" 30.630" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/2022-06-16-programming-cli/' is present in DB.
time=" 30.632" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/grammar/' is present in DB.
time=" 30.633" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/trivia-rust-types/' is present in DB.
time=" 30.635" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/function-overloading-in-rust/' is present in DB.
time=" 30.638" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hugo-2022/' is present in DB.
time=" 30.640" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/netflix-tech/' is present in DB.
time=" 30.641" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/2022-05-11-programming-multiparadigm/' is present in DB.
time=" 30.643" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/process-checklist/' is present in DB.
time=" 30.645" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/patience/' is present in DB.
time=" 30.647" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/mortgage2/' is present in DB.
time=" 30.649" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/programming-integers/' is present in DB.
time=" 30.651" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hugo-2021/' is present in DB.
time=" 30.654" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/comic-beer/' is present in DB.
time=" 30.656" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/reproducibility/' is present in DB.
time=" 30.658" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/rust-map-entry/' is present in DB.
time=" 30.660" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/biking-to-philly/' is present in DB.
time=" 30.662" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/crank-em-out/' is present in DB.
time=" 30.664" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/qbasic-nostalgia/' is present in DB.
time=" 30.665" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/warnings/' is present in DB.
time=" 30.667" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/haskell-gripe/' is present in DB.
time=" 30.669" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/mortgage_interest/' is present in DB.
time=" 30.671" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/buried-lede/' is present in DB.
time=" 30.673" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/unsafe/' is present in DB.
time=" 30.675" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/async-colors/' is present in DB.
time=" 30.677" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/endian_polymorphism/' is present in DB.
time=" 30.678" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/cpp-move/' is present in DB.
time=" 30.680" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hello-rust/' is present in DB.
time=" 30.682" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/humpty_dumpty/' is present in DB.
time=" 30.684" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/apple_silicon/' is present in DB.
time=" 30.686" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/perpetual_quarantime/' is present in DB.
time=" 30.688" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/octopedian/' is present in DB.
time=" 30.690" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/rent_pause/' is present in DB.
time=" 30.692" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/web/' is present in DB.
time=" 30.694" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/bar/' is present in DB.
time=" 30.696" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hungarian/' is present in DB.
time=" 30.698" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/roots/' is present in DB.
time=" 30.700" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/os_tour/' is present in DB.
time=" 30.702" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/extra_version/' is present in DB.
time=" 30.704" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/father_forgive_them/' is present in DB.
time=" 30.706" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/swiss/' is present in DB.
time=" 30.708" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/apartments/' is present in DB.
time=" 30.711" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/current_os/' is present in DB.
time=" 30.713" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/music_words/' is present in DB.
time=" 30.715" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/operating_system/' is present in DB.
time=" 30.716" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/soulfully/' is present in DB.
time=" 30.718" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/major_country/' is present in DB.
time=" 30.721" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/money/' is present in DB.
time=" 30.723" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/sermon/' is present in DB.
time=" 30.725" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/function-ptrs/' is present in DB.
time=" 30.727" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/angels/' is present in DB.
time=" 30.729" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/are-you-sure/' is present in DB.
time=" 30.731" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/india2/' is present in DB.
time=" 30.733" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/india/' is present in DB.
time=" 30.736" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/india0/' is present in DB.
time=" 30.739" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/about/' is present in DB.
time=" 30.741" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/programming-portfolio/' is present in DB.
time=" 30.743" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/reading-log/' is present in DB.
time=" 30.745" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/programming-rec-reading/' is present in DB.
time=" 30.746" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/rust-opinions/' is present in DB.
time=" 30.767" type="critical" -> database: Failed bulk insert of articles: 'CHECK constraint failed: date_created >= 0 Unable to fetch rowCHECK constraint failed: date_created >= 0Unable to fetch row'.
time=" 30.771" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.771" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.776" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.776" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.779" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.780" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.782" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.783" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.784" type="debug" -> feed-downloader: Updating messages in DB took 171137 microseconds.
time=" 30.784" type="debug" -> feed-model: There is request to reload feed model, reloading the 5 items individually.
time=" 30.784" type="debug" -> feed-downloader: std::pair(66,66) messages for feed 269 stored in DB.
time=" 30.785" type="debug" -> core: Filter accepts row 'Recycle bin' and filter result is: 'true'.
time=" 30.785" type="debug" -> feed-downloader: Made progress in feed updates, total feeds count 1/1 (id of feed is 269).
time=" 30.785" type="debug" -> feed-downloader: Finished feed updates in thread: '0x3988'.
```
### Operating system and version
* OS: Windows 10
* RSS Guard version: 4.2.4
All the info from the about popup:
Version: 4.2.4 (built on Windows/AMD64)
Revision: 1f6d7c0b
Build date: 05/09/2022 11:59
Qt: 6.3.1 (compiled against 6.3.1)
|
1.0
|
[BUG]: RSS Guard misses new entries in specific feed - ### Brief description of the issue
I have https://www.thecodedmessage.com/index.xml as a feed, and it no longer grabs new articles ("Programming Portfolio" from 2022-06-23 is the latest one). "Fetch metadata" still works (and selects the ATOM 1.0 type), and both my browser and curl are still able to fetch it, showing it does have new entries, making me believe the issue lies with RSS Guard. I have tried adding the URL as a new feed, but the same issue occurs (it simply has no articles and doesn't get any). This is the only URL I have this issue with.
### How to reproduce the bug?
1. Add or have https://www.thecodedmessage.com/index.xml as a feed
2. Fetch entries for it
### What was the expected result?
The new articles show up in the articles view.
### What actually happened?
The feed name in the feed list becomes blue, but no new articles appear.
### Debug log
The most relevant line appears to be:
```
time=" 30.767" type="critical" -> database: Failed bulk insert of articles: 'CHECK constraint failed: date_created >= 0 Unable to fetch rowCHECK constraint failed: date_created >= 0Unable to fetch row'.
```
```
time=" 29.712" type="debug" -> feed-downloader: Downloading new messages for feed ID '269' URL: 'https://www.thecodedmessage.com/index.xml' title: 'The Coded Message' in thread: '0x3988'.
time=" 29.712" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 29.713" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 29.713" type="debug" -> core: Downloading URL 'https://www.thecodedmessage.com/index.xml' to obtain feed data.
time=" 29.713" type="debug" -> network: Settings of BaseNetworkAccessManager loaded.
time=" 30.471" type="debug" -> network: Destroying Downloader instance.
time=" 30.472" type="debug" -> network: Destroying SilentNetworkAccessManager instance.
time=" 30.606" type="debug" -> feed-downloader: Downloaded 66 messages for feed ID '269' URL: 'https://www.thecodedmessage.com/index.xml' title: 'The Coded Message' in thread: '0x3988'. Operation took 893939 microseconds.
time=" 30.613" type="debug" -> feed-downloader: Saving messages of feed ID '269' URL: 'https://www.thecodedmessage.com/index.xml' title: 'The Coded Message' in thread: '0x3988'.
time=" 30.613" type="debug" -> core: Updating messages in DB. Main thread: 'false'.
time=" 30.613" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.614" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.614" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/strong-typing/' is present in DB.
time=" 30.617" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/erased-serde/' is present in DB.
time=" 30.620" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/write-everything-down/' is present in DB.
time=" 30.622" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/blocking-sockets/' is present in DB.
time=" 30.624" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/2022-07-14-programming-unwrap/' is present in DB.
time=" 30.626" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/review-plain-truth/' is present in DB.
time=" 30.628" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/haskell-error-message-2/' is present in DB.
time=" 30.630" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/2022-06-16-programming-cli/' is present in DB.
time=" 30.632" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/grammar/' is present in DB.
time=" 30.633" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/trivia-rust-types/' is present in DB.
time=" 30.635" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/function-overloading-in-rust/' is present in DB.
time=" 30.638" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hugo-2022/' is present in DB.
time=" 30.640" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/netflix-tech/' is present in DB.
time=" 30.641" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/2022-05-11-programming-multiparadigm/' is present in DB.
time=" 30.643" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/process-checklist/' is present in DB.
time=" 30.645" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/patience/' is present in DB.
time=" 30.647" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/mortgage2/' is present in DB.
time=" 30.649" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/programming-integers/' is present in DB.
time=" 30.651" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hugo-2021/' is present in DB.
time=" 30.654" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/comic-beer/' is present in DB.
time=" 30.656" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/reproducibility/' is present in DB.
time=" 30.658" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/rust-map-entry/' is present in DB.
time=" 30.660" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/biking-to-philly/' is present in DB.
time=" 30.662" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/crank-em-out/' is present in DB.
time=" 30.664" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/qbasic-nostalgia/' is present in DB.
time=" 30.665" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/warnings/' is present in DB.
time=" 30.667" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/haskell-gripe/' is present in DB.
time=" 30.669" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/mortgage_interest/' is present in DB.
time=" 30.671" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/buried-lede/' is present in DB.
time=" 30.673" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/unsafe/' is present in DB.
time=" 30.675" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/async-colors/' is present in DB.
time=" 30.677" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/endian_polymorphism/' is present in DB.
time=" 30.678" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/cpp-move/' is present in DB.
time=" 30.680" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hello-rust/' is present in DB.
time=" 30.682" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/humpty_dumpty/' is present in DB.
time=" 30.684" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/apple_silicon/' is present in DB.
time=" 30.686" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/perpetual_quarantime/' is present in DB.
time=" 30.688" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/octopedian/' is present in DB.
time=" 30.690" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/rent_pause/' is present in DB.
time=" 30.692" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/web/' is present in DB.
time=" 30.694" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/bar/' is present in DB.
time=" 30.696" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/hungarian/' is present in DB.
time=" 30.698" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/roots/' is present in DB.
time=" 30.700" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/os_tour/' is present in DB.
time=" 30.702" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/extra_version/' is present in DB.
time=" 30.704" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/father_forgive_them/' is present in DB.
time=" 30.706" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/swiss/' is present in DB.
time=" 30.708" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/apartments/' is present in DB.
time=" 30.711" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/current_os/' is present in DB.
time=" 30.713" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/music_words/' is present in DB.
time=" 30.715" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/operating_system/' is present in DB.
time=" 30.716" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/soulfully/' is present in DB.
time=" 30.718" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/major_country/' is present in DB.
time=" 30.721" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/money/' is present in DB.
time=" 30.723" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/sermon/' is present in DB.
time=" 30.725" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/function-ptrs/' is present in DB.
time=" 30.727" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/angels/' is present in DB.
time=" 30.729" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/are-you-sure/' is present in DB.
time=" 30.731" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/india2/' is present in DB.
time=" 30.733" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/india/' is present in DB.
time=" 30.736" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/posts/india0/' is present in DB.
time=" 30.739" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/about/' is present in DB.
time=" 30.741" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/programming-portfolio/' is present in DB.
time=" 30.743" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/reading-log/' is present in DB.
time=" 30.745" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/programming-rec-reading/' is present in DB.
time=" 30.746" type="debug" -> database: Checking if message with feed-specific custom ID 'https://www.thecodedmessage.com/rust-opinions/' is present in DB.
time=" 30.767" type="critical" -> database: Failed bulk insert of articles: 'CHECK constraint failed: date_created >= 0 Unable to fetch rowCHECK constraint failed: date_created >= 0Unable to fetch row'.
time=" 30.771" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.771" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.776" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.776" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.779" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.780" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.782" type="debug" -> database: SQLite connection 'feed_upd' is already active.
time=" 30.783" type="debug" -> database: SQLite database connection 'feed_upd' to file 'C:/Users/MyUserName/AppData/Local/RSS Guard 4/database/database.db' seems to be established.
time=" 30.784" type="debug" -> feed-downloader: Updating messages in DB took 171137 microseconds.
time=" 30.784" type="debug" -> feed-model: There is request to reload feed model, reloading the 5 items individually.
time=" 30.784" type="debug" -> feed-downloader: std::pair(66,66) messages for feed 269 stored in DB.
time=" 30.785" type="debug" -> core: Filter accepts row 'Recycle bin' and filter result is: 'true'.
time=" 30.785" type="debug" -> feed-downloader: Made progress in feed updates, total feeds count 1/1 (id of feed is 269).
time=" 30.785" type="debug" -> feed-downloader: Finished feed updates in thread: '0x3988'.
```
### Operating system and version
* OS: Windows 10
* RSS Guard version: 4.2.4
All the info from the about popup:
Version: 4.2.4 (built on Windows/AMD64)
Revision: 1f6d7c0b
Build date: 05/09/2022 11:59
Qt: 6.3.1 (compiled against 6.3.1)
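
The critical line in the log above suggests the feed contained an article whose parsed date produced a negative timestamp, violating the `date_created >= 0` CHECK constraint and aborting the whole bulk insert. A minimal sketch of that failure mode (hypothetical simplified schema — the actual RSS Guard table differs):

```python
import sqlite3

# Simplified model of the constraint that aborts the bulk insert:
# one bad row (negative timestamp) fails the whole executemany() batch.
conn = sqlite3.connect(":memory:")
conn.execute(
    "CREATE TABLE Messages ("
    "  custom_id TEXT,"
    "  date_created INTEGER CHECK (date_created >= 0)"
    ")"
)

rows = [
    ("https://example.com/posts/ok/", 1662724740000),    # valid epoch ms
    ("https://example.com/posts/bad/", -62135596800000), # unparsable date -> negative epoch
]

err = None
try:
    with conn:  # transaction: commits on success, rolls back on error
        conn.executemany("INSERT INTO Messages VALUES (?, ?)", rows)
except sqlite3.IntegrityError as exc:
    err = exc
    print("bulk insert failed:", exc)

# The transaction rolled back, so even the valid article was not stored --
# which would explain why the feed stops gaining any new entries.
count = conn.execute("SELECT COUNT(*) FROM Messages").fetchone()[0]
print("rows stored:", count)
```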
|
defect
|
rss guard misses new entries in specific feed brief description of the issue i have as a feed and it longer grabs new articles programming portfolio from is the latest one fetch metadata still works and selects the atom type and both my browser and curl are still able to fetch it showing it does have new entries making me believe the issue lies with rss guard i have tried adding the url as a new feed but the same issue occurs it simply has no articles and doesn t get any this is the only url i have this issue with how to reproduce the bug add or have as a feed fetch entries for it what was the expected result the new articles show up in the articles view what actually happened the feed name in the feed list becomes blue now new articles appear debug log the most relevant line appears to be time type critical database failed bulk insert of articles check constraint failed date created unable to fetch rowcheck constraint failed date created to fetch row time type debug feed downloader downloading new messages for feed id url title the coded message in thread time type debug database sqlite connection feed upd is already active time type debug database sqlite database connection feed upd to file c users myusername appdata local rss guard database database db seems to be established time type debug core downloading url to obtain feed data time type debug network settings of basenetworkaccessmanager loaded time type debug network destroying downloader instance time type debug network destroying silentnetworkaccessmanager instance time type debug feed downloader downloaded messages for feed id url title the coded message in thread operation took microseconds time type debug feed downloader saving messages of feed id url title the coded message in thread time type debug core updating messages in db main thread false time type debug database sqlite connection feed upd is already active time type debug database sqlite database connection feed upd to file c users myusername 
appdata local rss guard database database db seems to be established time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug 
database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type 
debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time type debug database checking if message with feed specific custom id is present in db time 
type debug database checking if message with feed specific custom id is present in db time type critical database failed bulk insert of articles check constraint failed date created unable to fetch rowcheck constraint failed date created to fetch row time type debug database sqlite connection feed upd is already active time type debug database sqlite database connection feed upd to file c users myusername appdata local rss guard database database db seems to be established time type debug database sqlite connection feed upd is already active time type debug database sqlite database connection feed upd to file c users myusername appdata local rss guard database database db seems to be established time type debug database sqlite connection feed upd is already active time type debug database sqlite database connection feed upd to file c users myusername appdata local rss guard database database db seems to be established time type debug database sqlite connection feed upd is already active time type debug database sqlite database connection feed upd to file c users myusername appdata local rss guard database database db seems to be established time type debug feed downloader updating messages in db took microseconds time type debug feed model there is request to reload feed model reloading the items individually time type debug feed downloader std pair messages for feed stored in db time type debug core filter accepts row recycle bin and filter result is true time type debug feed downloader made progress in feed updates total feeds count id of feed is time type debug feed downloader finished feed updates in thread operating system and version os windows rss guard version all the info from the about popup version built on windows revision build date qt compiled against
| 1
|
241,370
| 18,448,056,615
|
IssuesEvent
|
2021-10-15 06:43:03
|
VincentChtln/NotaComp
|
https://api.github.com/repos/VincentChtln/NotaComp
|
opened
|
Write the documentation
|
documentation
|
### Description
Add documentation files to explain how the code works.
|
1.0
|
Write the documentation - ### Description
Add documentation files to explain how the code works.
|
non_defect
|
write the documentation description add documentation files to explain how the code works
| 0
|
276,970
| 30,581,327,619
|
IssuesEvent
|
2023-07-21 09:54:38
|
ministryofjustice/hmpps-probation-integration-services
|
https://api.github.com/repos/ministryofjustice/hmpps-probation-integration-services
|
closed
|
CVE-2023-34035 (sentence-plan-and-delius)
|
dependencies security
|
Spring Security's authorization rules can be misconfigured when using multiple servlets
* Project: sentence-plan-and-delius
* Package: `org.springframework.security:spring-security-config:6.1.1`
* Location: `app/libs/spring-security-config-6.1.1.jar`
>Spring Security versions 5.8 prior to 5.8.5, 6.0 prior to 6.0.5, and 6.1 prior to 6.1.2 could be susceptible to authorization rule misconfiguration if the application uses requestMatchers(String) and multiple servlets, one of them being Spring MVC’s DispatcherServlet. (DispatcherServlet is a Spring MVC component that maps HTTP endpoints to methods on @Controller-annotated classes.)
Specifically, an application is vulnerable when all of the following are true:
* Spring MVC is on the classpath
* Spring Security is securing more than one servlet in a single application (one of them being Spring MVC’s DispatcherServlet)
* The application uses requestMatchers(String) to refer to endpoints that are not Spring MVC endpoints
An application is not vulnerable if any of the following is true:
* The application does not have Spring MVC on the classpath
* The application secures no servlets other than Spring MVC’s DispatcherServlet
* The application uses requestMatchers(String) only for Spring MVC endpoints
https://avd.aquasec.com/nvd/cve-2023-34035
If the vulnerability does not impact the `sentence-plan-and-delius` project, you can suppress this alert by adding a comment starting with `Suppress`. For example, "Suppressed because we do not process any untrusted XML content".
|
True
|
CVE-2023-34035 (sentence-plan-and-delius) - Spring Security's authorization rules can be misconfigured when using multiple servlets
* Project: sentence-plan-and-delius
* Package: `org.springframework.security:spring-security-config:6.1.1`
* Location: `app/libs/spring-security-config-6.1.1.jar`
>Spring Security versions 5.8 prior to 5.8.5, 6.0 prior to 6.0.5, and 6.1 prior to 6.1.2 could be susceptible to authorization rule misconfiguration if the application uses requestMatchers(String) and multiple servlets, one of them being Spring MVC’s DispatcherServlet. (DispatcherServlet is a Spring MVC component that maps HTTP endpoints to methods on @Controller-annotated classes.)
Specifically, an application is vulnerable when all of the following are true:
* Spring MVC is on the classpath
* Spring Security is securing more than one servlet in a single application (one of them being Spring MVC’s DispatcherServlet)
* The application uses requestMatchers(String) to refer to endpoints that are not Spring MVC endpoints
An application is not vulnerable if any of the following is true:
* The application does not have Spring MVC on the classpath
* The application secures no servlets other than Spring MVC’s DispatcherServlet
* The application uses requestMatchers(String) only for Spring MVC endpoints
https://avd.aquasec.com/nvd/cve-2023-34035
If the vulnerability does not impact the `sentence-plan-and-delius` project, you can suppress this alert by adding a comment starting with `Suppress`. For example, "Suppressed because we do not process any untrusted XML content".
|
non_defect
|
cve sentence plan and delius spring security s authorization rules can be misconfigured when using multiple servlets project sentence plan and delius package org springframework security spring security config location app libs spring security config jar spring security versions prior to prior to and prior to could be susceptible to authorization rule misconfiguration if the application uses requestmatchers string and multiple servlets one of them being spring mvc’s dispatcherservlet dispatcherservlet is a spring mvc component that maps http endpoints to methods on controller annotated classes specifically an application is vulnerable when all of the following are true spring mvc is on the classpath spring security is securing more than one servlet in a single application one of them being spring mvc’s dispatcherservlet the application uses requestmatchers string to refer to endpoints that are not spring mvc endpoints an application is not vulnerable if any of the following is true the application does not have spring mvc on the classpath the application secures no servlets other than spring mvc’s dispatcherservlet the application uses requestmatchers string only for spring mvc endpoints if the vulnerability does not impact the sentence plan and delius project you can suppress this alert by adding a comment starting with suppress for example suppressed because we do not process any untrusted xml content
| 0
|
220
| 2,520,319,225
|
IssuesEvent
|
2015-01-19 00:07:01
|
GarageGames/Torque3D
|
https://api.github.com/repos/GarageGames/Torque3D
|
opened
|
Reduce duplication in script templates
|
Defect / improvement New feature
|
As per [this forum thread](http://www.garagegames.com/community/forums/viewthread/140846), we will introduce a Packages folder to split duplicated script modules into single locations. For example, where we now have something like:
Templates/
  Full/
    tools/
  Empty/
    tools/
This change will result in:
Templates/
  Full/
  Empty/
Packages/
  Editor/
    tools/
When a project is created, desired packages are copied into the project directory the same way templates currently are. This means that instead of changing template scripts in two locations, the package can be changed, and all projects will benefit from the update.
This change will be fully backwards-compatible with 3.6 at the project level - i.e. a project created in 3.7 will look identical to a project created in 3.6. However, user modifications to the templates will have to be ported to the appropriate package.
For associated project generator changes, see GarageGames/Torque3D-ProjectManager#36.
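
The copy step described above can be sketched as follows (hypothetical helper and paths for illustration only — the real project generator is a separate tool, per the linked ProjectManager issue):

```python
import shutil
import tempfile
from pathlib import Path

def create_project(template, packages_root, packages, dest):
    """Copy a template plus the desired shared packages into a new project.

    Hypothetical helper illustrating the Templates/ + Packages/ split:
    shared script modules live once under Packages/ and are merged in at
    project-creation time, so the resulting project directory looks the
    same as one built from a fully self-contained template.
    """
    shutil.copytree(template, dest)
    for name in packages:
        # dirs_exist_ok lets a package overlay directories the template
        # already created (e.g. an existing tools/ folder).
        shutil.copytree(packages_root / name, dest, dirs_exist_ok=True)

# Demo with throwaway directories standing in for Templates/ and Packages/.
root = Path(tempfile.mkdtemp())
(root / "Templates" / "Empty").mkdir(parents=True)
(root / "Packages" / "Editor" / "tools").mkdir(parents=True)
(root / "Packages" / "Editor" / "tools" / "editor.cs").write_text("// editor scripts")

project = root / "MyGame"
create_project(root / "Templates" / "Empty", root / "Packages", ["Editor"], project)
print((project / "tools" / "editor.cs").exists())
```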
|
1.0
|
Reduce duplication in script templates - As per [this forum thread](http://www.garagegames.com/community/forums/viewthread/140846), we will introduce a Packages folder to split duplicated script modules into single locations. For example, where we now have something like:
Templates/
  Full/
    tools/
  Empty/
    tools/
This change will result in:
Templates/
  Full/
  Empty/
Packages/
  Editor/
    tools/
When a project is created, desired packages are copied into the project directory the same way templates currently are. This means that instead of changing template scripts in two locations, the package can be changed, and all projects will benefit from the update.
This change will be fully backwards-compatible with 3.6 at the project level - i.e. a project created in 3.7 will look identical to a project created in 3.6. However, user modifications to the templates will have to be ported to the appropriate package.
For associated project generator changes, see GarageGames/Torque3D-ProjectManager#36.
|
defect
|
reduce duplication in script templates as per we will introduce a packages folder to split duplicated script modules into single locations for example where we now have something like templates full tools empty tools this change will result in templates full empty packages editor tools when a project is created desired packages are copied into the project directory the same way templates currently are this means that instead of changing template scripts in two locations the package can be changed and all projects will benefit from the update this change will be fully backwards compatible with at the project level i e a project created in will look identical to a project created in however user modifications to the templates will have to be ported to the appropriate package for associated project generator changes see garagegames projectmanager
| 1
|
30,014
| 5,971,372,274
|
IssuesEvent
|
2017-05-31 02:13:52
|
kaneless/mybatisnet
|
https://api.github.com/repos/kaneless/mybatisnet
|
closed
|
Work with SqlDependency
|
auto-migrated Priority-Low Type-Defect
|
```
Running MyBatis.NET v1.6.1
Hi,
Is there a way to use SqlDependency in iBatis, rather than working with
ADO.NET?
Thanks,
Aviad
```
Original issue reported on code.google.com by `Aviad...@gmail.com` on 19 Jan 2011 at 11:33
|
1.0
|
Work with SqlDependency - ```
Running MyBatis.NET v1.6.1
Hi,
Is there a way to use SqlDependency in iBatis, rather than working with
ADO.NET?
Thanks,
Aviad
```
Original issue reported on code.google.com by `Aviad...@gmail.com` on 19 Jan 2011 at 11:33
|
defect
|
work with sqldependency running mybatis net hi is there a way to use sqldependency in ibatis rather than working with ado net thanks aviad original issue reported on code google com by aviad gmail com on jan at
| 1
|
26,349
| 4,682,383,119
|
IssuesEvent
|
2016-10-09 08:00:03
|
luigirizzo/netmap
|
https://api.github.com/repos/luigirizzo/netmap
|
closed
|
kring error: hwcur 17 rcur 17 hwtail 18 head 18 cur 17 tail 18
|
auto-migrated Priority-Medium Type-Defect
|
```
Hello again!
I run netmap in my linux box with Intel 82599 but hit very strange issue:
/usr/src/netmap/examples/pkt-gen -i eth3 -f rx -p 8
068.342571 main [1624] interface is eth3
068.342807 extract_ip_range [275] range is 10.0.0.1:0 to 10.0.0.1:0
068.342812 extract_ip_range [275] range is 10.1.0.1:0 to 10.1.0.1:0
068.591115 main [1807] mapped 334980KB at 0x7f620e688000
Receiving from netmap:eth3: 8 queues, 8 threads and 1 cpus.
068.591150 main [1887] Wait 2 secs for phy reset
070.591210 main [1889] Ready...
070.591258 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591326 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591349 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591369 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591397 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591424 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591453 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591481 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
071.592511 main_thread [1421] 130833 pps (130964 pkts in 1001004 usec)
072.593532 main_thread [1421] 121625 pps (121749 pkts in 1001020 usec)
073.594558 main_thread [1421] 124048 pps (124175 pkts in 1001027 usec)
074.595584 main_thread [1421] 128057 pps (128188 pkts in 1001026 usec)
075.596594 main_thread [1421] 128582 pps (128712 pkts in 1001010 usec)
076.597607 main_thread [1421] 126655 pps (126783 pkts in 1001013 usec)
077.598497 main_thread [1421] 132176 pps (132294 pkts in 1000890 usec)
078.599508 main_thread [1421] 131634 pps (131767 pkts in 1001010 usec)
079.600521 main_thread [1421] 128732 pps (128863 pkts in 1001014 usec)
080.601544 main_thread [1421] 133364 pps (133500 pkts in 1001023 usec)
081.604340 main_thread [1421] 127227 pps (127583 pkts in 1002796 usec)
082.605351 main_thread [1421] 137244 pps (137383 pkts in 1001011 usec)
083.606365 main_thread [1421] 142815 pps (142960 pkts in 1001013 usec)
084.607385 main_thread [1421] 125526 pps (125654 pkts in 1001021 usec)
085.608397 main_thread [1421] 0 pps (0 pkts in 1001012 usec)
086.609408 main_thread [1421] 0 pps (0 pkts in 1001010 usec)
087.610419 main_thread [1421] 0 pps (0 pkts in 1001011 usec)
088.611431 main_thread [1421] 0 pps (0 pkts in 1001013 usec)
089.612441 main_thread [1421] 0 pps (0 pkts in 1001010 usec)
As you can see, from some moment all traffic was dropped (but it still exists on the
interface).
And I got following messages in dmesg:
[391294.336358] 139.602658 [1398] netmap_ring_reinit called for eth3
[391294.346908] 139.613234 [1373] nm_rxsync_prologue kring error: hwcur
255 rcur 255 hwtail 256 head 256 cur 255 tail 256
[391294.348009] 139.614342 [1398] netmap_ring_reinit called for eth3
[391294.356373] 139.622726 [1373] nm_rxsync_prologue kring error: hwcur
446 rcur 446 hwtail 448 head 448 cur 446 tail 448
[391294.357458] 139.623817 [1398] netmap_ring_reinit called for eth3
[391294.370475] 139.636871 [1373] nm_rxsync_prologue kring error: hwcur
245 rcur 244 hwtail 245 head 244 cur 244 tail 245
[391294.371550] 139.637950 [1398] netmap_ring_reinit called for eth3
[391294.383473] 139.649904 [1398] netmap_ring_reinit called for eth3
[391294.399922] 139.666403 [1398] netmap_ring_reinit called for eth3
[391294.403495] 139.669983 [1398] netmap_ring_reinit called for eth3
[391294.407691] 139.674194 [1398] netmap_ring_reinit called for eth3
[391294.418766] 139.685300 [1398] netmap_ring_reinit called for eth3
[391295.319042] 140.588149 [1373] nm_rxsync_prologue kring error: hwcur
185 rcur 185 hwtail 186 head 186 cur 185 tail 186
[391295.320174] 140.589287 [1398] netmap_ring_reinit called for eth3
[391295.344051] 140.613228 [1373] nm_rxsync_prologue kring error: hwcur
273 rcur 273 hwtail 275 head 275 cur 273 tail 275
[391295.345158] 140.614342 [1398] netmap_ring_reinit called for eth3
[391295.348072] 140.617259 [1373] nm_rxsync_prologue kring error: hwcur
119 rcur 119 hwtail 120 head 120 cur 119 tail 120
[391295.349180] 140.618376 [1398] netmap_ring_reinit called for eth3
[391295.358263] 140.627483 [1373] nm_rxsync_prologue kring error: hwcur
121 rcur 121 hwtail 122 head 122 cur 121 tail 122
[391295.359375] 140.628600 [1398] netmap_ring_reinit called for eth3
[391295.405420] 140.674772 [1373] nm_rxsync_prologue kring error: hwcur
260 rcur 260 hwtail 261 head 261 cur 260 tail 261
[391295.406509] 140.675869 [1398] netmap_ring_reinit called for eth3
[391295.407519] 140.676881 [1398] netmap_ring_reinit called for eth3
[391295.407850] 140.677212 [1398] netmap_ring_reinit called for eth3
[391295.418148] 140.687541 [1398] netmap_ring_reinit called for eth3
[391295.431216] 140.700645 [1398] netmap_ring_reinit called for eth3
[391295.440787] 140.710242 [1398] netmap_ring_reinit called for eth3
[391296.330430] 141.602426 [1373] nm_rxsync_prologue kring error: hwcur
408 rcur 408 hwtail 409 head 409 cur 408 tail 409
[391296.331513] 141.603518 [1398] netmap_ring_reinit called for eth3
[391296.340486] 141.612515 [1373] nm_rxsync_prologue kring error: hwcur
372 rcur 372 hwtail 373 head 373 cur 372 tail 373
[391296.341551] 141.613585 [1398] netmap_ring_reinit called for eth3
[391296.345355] 141.617389 [1373] nm_rxsync_prologue kring error: hwcur
434 rcur 434 hwtail 435 head 435 cur 434 tail 435
[391296.346690] 141.618737 [1398] netmap_ring_reinit called for eth3
[391296.361698] 141.633785 [1373] nm_rxsync_prologue kring error: hwcur
118 rcur 118 hwtail 119 head 119 cur 118 tail 119
[391296.362781] 141.634876 [1398] netmap_ring_reinit called for eth3
[391296.364596] 141.636694 [1373] nm_rxsync_prologue kring error: hwcur
229 rcur 229 hwtail 230 head 230 cur 229 tail 230
[391296.365658] 141.637760 [1398] netmap_ring_reinit called for eth3
[391296.366671] 141.638775 [1398] netmap_ring_reinit called for eth3
[391296.374807] 141.646933 [1398] netmap_ring_reinit called for eth3
[391296.395417] 141.667602 [1398] netmap_ring_reinit called for eth3
[391296.432846] 141.705138 [1398] netmap_ring_reinit called for eth3
[391296.436222] 141.708524 [1398] netmap_ring_reinit called for eth3
[391297.309841] 142.584641 [1373] nm_rxsync_prologue kring error: hwcur
445 rcur 445 hwtail 446 head 446 cur 445 tail 446
[391297.310885] 142.585691 [1398] netmap_ring_reinit called for eth3
[391297.354334] 142.629261 [1373] nm_rxsync_prologue kring error: hwcur
58 rcur 58 hwtail 59 head 59 cur 58 tail 59
[391297.355512] 142.630444 [1398] netmap_ring_reinit called for eth3
[391297.379930] 142.654931 [1373] nm_rxsync_prologue kring error: hwcur
171 rcur 171 hwtail 173 head 173 cur 171 tail 173
[391297.380997] 142.656002 [1398] netmap_ring_reinit called for eth3
[391297.382836] 142.657843 [1373] nm_rxsync_prologue kring error: hwcur
405 rcur 405 hwtail 406 head 406 cur 405 tail 406
[391297.383946] 142.658959 [1398] netmap_ring_reinit called for eth3
[391297.386143] 142.661159 [1373] nm_rxsync_prologue kring error: hwcur
17 rcur 17 hwtail 18 head 18 cur 17 tail 18
[391297.387224] 142.662248 [1398] netmap_ring_reinit called for eth3
[391297.388722] 142.663749 [1398] netmap_ring_reinit called for eth3
[391297.401513] 142.676577 [1398] netmap_ring_reinit called for eth3
[391297.407089] 142.682168 [1398] netmap_ring_reinit called for eth3
[391297.410863] 142.685951 [1398] netmap_ring_reinit called for eth3
[391297.422376] 142.697499 [1398] netmap_ring_reinit called for eth3
```
Original issue reported on code.google.com by `pavel.odintsov` on 29 Oct 2014 at 1:30
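
The `nm_rxsync_prologue` messages above indicate the application handed back ring pointers that violate netmap's invariant: `head` may only move forward within the kernel's `[hwcur, hwtail]` window, and `cur` must not fall behind `head`. In the quoted line, `head == hwtail == 18` but `cur == 17`, i.e. `cur` fell behind `head`, so the kernel reinitializes the ring. A loose model of that check (assumptions: 512-slot ring, and this only approximates the real C code — e.g. on RX rings the real check also allows `cur` past `hwtail` to request blocking):

```python
def circ_between(lo, x, hi, num_slots):
    # True when x lies in the circular interval [lo, hi] of a ring
    # with num_slots slots.
    return (x - lo) % num_slots <= (hi - lo) % num_slots

def rx_prologue_ok(hwcur, hwtail, head, cur, num_slots=512):
    """Loose model of netmap's nm_rxsync_prologue sanity check.

    The application may only advance head/cur: head must stay between
    the kernel's hwcur and hwtail, and cur must not fall behind head.
    """
    return (circ_between(hwcur, head, hwtail, num_slots)
            and circ_between(head, cur, hwtail, num_slots))

# Values from the logged error line: cur (17) fell behind head (18),
# so this check fails and the ring gets reinitialized.
print(rx_prologue_ok(hwcur=17, hwtail=18, head=18, cur=17))
```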
|
1.0
|
kring error: hwcur 17 rcur 17 hwtail 18 head 18 cur 17 tail 18 - ```
Hello again!
I run netmap in my linux box with Intel 82599 but hit very strange issue:
/usr/src/netmap/examples/pkt-gen -i eth3 -f rx -p 8
068.342571 main [1624] interface is eth3
068.342807 extract_ip_range [275] range is 10.0.0.1:0 to 10.0.0.1:0
068.342812 extract_ip_range [275] range is 10.1.0.1:0 to 10.1.0.1:0
068.591115 main [1807] mapped 334980KB at 0x7f620e688000
Receiving from netmap:eth3: 8 queues, 8 threads and 1 cpus.
068.591150 main [1887] Wait 2 secs for phy reset
070.591210 main [1889] Ready...
070.591258 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591326 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591349 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591369 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591397 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591424 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591453 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
070.591481 nm_open [457] overriding ifname eth3 ringid 0x0 flags 0x1
071.592511 main_thread [1421] 130833 pps (130964 pkts in 1001004 usec)
072.593532 main_thread [1421] 121625 pps (121749 pkts in 1001020 usec)
073.594558 main_thread [1421] 124048 pps (124175 pkts in 1001027 usec)
074.595584 main_thread [1421] 128057 pps (128188 pkts in 1001026 usec)
075.596594 main_thread [1421] 128582 pps (128712 pkts in 1001010 usec)
076.597607 main_thread [1421] 126655 pps (126783 pkts in 1001013 usec)
077.598497 main_thread [1421] 132176 pps (132294 pkts in 1000890 usec)
078.599508 main_thread [1421] 131634 pps (131767 pkts in 1001010 usec)
079.600521 main_thread [1421] 128732 pps (128863 pkts in 1001014 usec)
080.601544 main_thread [1421] 133364 pps (133500 pkts in 1001023 usec)
081.604340 main_thread [1421] 127227 pps (127583 pkts in 1002796 usec)
082.605351 main_thread [1421] 137244 pps (137383 pkts in 1001011 usec)
083.606365 main_thread [1421] 142815 pps (142960 pkts in 1001013 usec)
084.607385 main_thread [1421] 125526 pps (125654 pkts in 1001021 usec)
085.608397 main_thread [1421] 0 pps (0 pkts in 1001012 usec)
086.609408 main_thread [1421] 0 pps (0 pkts in 1001010 usec)
087.610419 main_thread [1421] 0 pps (0 pkts in 1001011 usec)
088.611431 main_thread [1421] 0 pps (0 pkts in 1001013 usec)
089.612441 main_thread [1421] 0 pps (0 pkts in 1001010 usec)
As you can see, from some moment on all traffic was dropped (but it still
exists on the interface).
And I got the following messages in dmesg:
[391294.336358] 139.602658 [1398] netmap_ring_reinit called for eth3
[391294.346908] 139.613234 [1373] nm_rxsync_prologue kring error: hwcur
255 rcur 255 hwtail 256 head 256 cur 255 tail 256
[391294.348009] 139.614342 [1398] netmap_ring_reinit called for eth3
[391294.356373] 139.622726 [1373] nm_rxsync_prologue kring error: hwcur
446 rcur 446 hwtail 448 head 448 cur 446 tail 448
[391294.357458] 139.623817 [1398] netmap_ring_reinit called for eth3
[391294.370475] 139.636871 [1373] nm_rxsync_prologue kring error: hwcur
245 rcur 244 hwtail 245 head 244 cur 244 tail 245
[391294.371550] 139.637950 [1398] netmap_ring_reinit called for eth3
[391294.383473] 139.649904 [1398] netmap_ring_reinit called for eth3
[391294.399922] 139.666403 [1398] netmap_ring_reinit called for eth3
[391294.403495] 139.669983 [1398] netmap_ring_reinit called for eth3
[391294.407691] 139.674194 [1398] netmap_ring_reinit called for eth3
[391294.418766] 139.685300 [1398] netmap_ring_reinit called for eth3
[391295.319042] 140.588149 [1373] nm_rxsync_prologue kring error: hwcur
185 rcur 185 hwtail 186 head 186 cur 185 tail 186
[391295.320174] 140.589287 [1398] netmap_ring_reinit called for eth3
[391295.344051] 140.613228 [1373] nm_rxsync_prologue kring error: hwcur
273 rcur 273 hwtail 275 head 275 cur 273 tail 275
[391295.345158] 140.614342 [1398] netmap_ring_reinit called for eth3
[391295.348072] 140.617259 [1373] nm_rxsync_prologue kring error: hwcur
119 rcur 119 hwtail 120 head 120 cur 119 tail 120
[391295.349180] 140.618376 [1398] netmap_ring_reinit called for eth3
[391295.358263] 140.627483 [1373] nm_rxsync_prologue kring error: hwcur
121 rcur 121 hwtail 122 head 122 cur 121 tail 122
[391295.359375] 140.628600 [1398] netmap_ring_reinit called for eth3
[391295.405420] 140.674772 [1373] nm_rxsync_prologue kring error: hwcur
260 rcur 260 hwtail 261 head 261 cur 260 tail 261
[391295.406509] 140.675869 [1398] netmap_ring_reinit called for eth3
[391295.407519] 140.676881 [1398] netmap_ring_reinit called for eth3
[391295.407850] 140.677212 [1398] netmap_ring_reinit called for eth3
[391295.418148] 140.687541 [1398] netmap_ring_reinit called for eth3
[391295.431216] 140.700645 [1398] netmap_ring_reinit called for eth3
[391295.440787] 140.710242 [1398] netmap_ring_reinit called for eth3
[391296.330430] 141.602426 [1373] nm_rxsync_prologue kring error: hwcur
408 rcur 408 hwtail 409 head 409 cur 408 tail 409
[391296.331513] 141.603518 [1398] netmap_ring_reinit called for eth3
[391296.340486] 141.612515 [1373] nm_rxsync_prologue kring error: hwcur
372 rcur 372 hwtail 373 head 373 cur 372 tail 373
[391296.341551] 141.613585 [1398] netmap_ring_reinit called for eth3
[391296.345355] 141.617389 [1373] nm_rxsync_prologue kring error: hwcur
434 rcur 434 hwtail 435 head 435 cur 434 tail 435
[391296.346690] 141.618737 [1398] netmap_ring_reinit called for eth3
[391296.361698] 141.633785 [1373] nm_rxsync_prologue kring error: hwcur
118 rcur 118 hwtail 119 head 119 cur 118 tail 119
[391296.362781] 141.634876 [1398] netmap_ring_reinit called for eth3
[391296.364596] 141.636694 [1373] nm_rxsync_prologue kring error: hwcur
229 rcur 229 hwtail 230 head 230 cur 229 tail 230
[391296.365658] 141.637760 [1398] netmap_ring_reinit called for eth3
[391296.366671] 141.638775 [1398] netmap_ring_reinit called for eth3
[391296.374807] 141.646933 [1398] netmap_ring_reinit called for eth3
[391296.395417] 141.667602 [1398] netmap_ring_reinit called for eth3
[391296.432846] 141.705138 [1398] netmap_ring_reinit called for eth3
[391296.436222] 141.708524 [1398] netmap_ring_reinit called for eth3
[391297.309841] 142.584641 [1373] nm_rxsync_prologue kring error: hwcur
445 rcur 445 hwtail 446 head 446 cur 445 tail 446
[391297.310885] 142.585691 [1398] netmap_ring_reinit called for eth3
[391297.354334] 142.629261 [1373] nm_rxsync_prologue kring error: hwcur
58 rcur 58 hwtail 59 head 59 cur 58 tail 59
[391297.355512] 142.630444 [1398] netmap_ring_reinit called for eth3
[391297.379930] 142.654931 [1373] nm_rxsync_prologue kring error: hwcur
171 rcur 171 hwtail 173 head 173 cur 171 tail 173
[391297.380997] 142.656002 [1398] netmap_ring_reinit called for eth3
[391297.382836] 142.657843 [1373] nm_rxsync_prologue kring error: hwcur
405 rcur 405 hwtail 406 head 406 cur 405 tail 406
[391297.383946] 142.658959 [1398] netmap_ring_reinit called for eth3
[391297.386143] 142.661159 [1373] nm_rxsync_prologue kring error: hwcur
17 rcur 17 hwtail 18 head 18 cur 17 tail 18
[391297.387224] 142.662248 [1398] netmap_ring_reinit called for eth3
[391297.388722] 142.663749 [1398] netmap_ring_reinit called for eth3
[391297.401513] 142.676577 [1398] netmap_ring_reinit called for eth3
[391297.407089] 142.682168 [1398] netmap_ring_reinit called for eth3
[391297.410863] 142.685951 [1398] netmap_ring_reinit called for eth3
[391297.422376] 142.697499 [1398] netmap_ring_reinit called for eth3
```
Original issue reported on code.google.com by `pavel.odintsov` on 29 Oct 2014 at 1:30
|
defect
|
kring error hwcur rcur hwtail head cur tail hello again i run netmap in my linux box with intel but hit very strange issue usr src netmap examples pkt gen i f rx p main interface is extract ip range range is to extract ip range range is to main mapped at receiving from netmap queues threads and cpus main wait secs for phy reset main ready nm open overriding ifname ringid flags nm open overriding ifname ringid flags nm open overriding ifname ringid flags nm open overriding ifname ringid flags nm open overriding ifname ringid flags nm open overriding ifname ringid flags nm open overriding ifname ringid flags nm open overriding ifname ringid flags main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec main thread pps pkts in usec as you can see from some moment all traffic was dropped but will exists on interface and i got following messages in dmesg netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring 
error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for nm rxsync prologue kring error hwcur rcur hwtail head cur tail netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for netmap ring reinit called for original issue reported on code google com by pavel odintsov on oct at
| 1
|
72,671
| 24,228,390,840
|
IssuesEvent
|
2022-09-26 16:04:28
|
MarcusWolschon/osmeditor4android
|
https://api.github.com/repos/MarcusWolschon/osmeditor4android
|
closed
|
Screen flies up after each property deletion
|
Defect Minor UI
|
Let's say we want to delete the _Games_ item. We tap its trash can:


The screen has now flown fully up to the top. And we have to pull it back down to where our eyes were manually:

|
1.0
|
Screen flies up after each property deletion - Let's say we want to delete the _Games_ item. We tap its trash can:


The screen has now flown fully up to the top. And we have to pull it back down to where our eyes were manually:

|
defect
|
screen flies up after each property deletion let s say we want to delete the games item we tap its trash can the screen has now flown fully up to the top and we have to pull it back down to where our eyes were manually
| 1
|
10,639
| 2,622,178,353
|
IssuesEvent
|
2015-03-04 00:17:41
|
byzhang/leveldb
|
https://api.github.com/repos/byzhang/leveldb
|
closed
|
Build issue on FreeBSD
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1. Run gmake
What is the expected output? What do you see instead?
Expected leveldb to build, instead getting repeated output like
In file included from ./port/port.h:14,
from ./db/filename.h:14,
from db/builder.cc:7:
./port/port_posix.h:67: error: '__BYTE_ORDER' was not declared in this scope
./port/port_posix.h:67: error: '__LITTLE_ENDIAN' was not declared in this scope
Rolling back to 85584d497e7b354853b72f450683d59fcf6b9c5c fixes the problem.
What version of the product are you using? On what operating system?
This happens with dd0d562b4d4fbd07db6a44f9e221f8d368fee8e4
```
Original issue reported on code.google.com by `jhans...@gmail.com` on 20 Jun 2012 at 9:24
|
1.0
|
Build issue on FreeBSD - ```
What steps will reproduce the problem?
1. Run gmake
What is the expected output? What do you see instead?
Expected leveldb to build, instead getting repeated output like
In file included from ./port/port.h:14,
from ./db/filename.h:14,
from db/builder.cc:7:
./port/port_posix.h:67: error: '__BYTE_ORDER' was not declared in this scope
./port/port_posix.h:67: error: '__LITTLE_ENDIAN' was not declared in this scope
Rolling back to 85584d497e7b354853b72f450683d59fcf6b9c5c fixes the problem.
What version of the product are you using? On what operating system?
This happens with dd0d562b4d4fbd07db6a44f9e221f8d368fee8e4
```
Original issue reported on code.google.com by `jhans...@gmail.com` on 20 Jun 2012 at 9:24
|
defect
|
build issue on freebsd what steps will reproduce the problem run gmake what is the expected output what do you see instead expected leveldb to build instead getting repeated output like in file included from port port h from db filename h from db builder cc port port posix h error byte order was not declared in this scope port port posix h error little endian was not declared in this scope rolling back to fixes the problem what version of the product are you using on what operating system this happens with original issue reported on code google com by jhans gmail com on jun at
| 1
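The FreeBSD build failure above comes from `__BYTE_ORDER` / `__LITTLE_ENDIAN` being glibc-style macros that FreeBSD exposes differently, so the compile-time endianness check in `port_posix.h` finds nothing declared. As an illustration only (not leveldb's actual fix, which adjusts the platform detection in its build files), a minimal runtime endianness probe can be sketched like this:

```python
import struct
import sys

def native_byte_order() -> str:
    """Detect host endianness without relying on platform macros.

    This answers the same question the missing __BYTE_ORDER check in
    port_posix.h decides at compile time, but portably at runtime.
    """
    # Pack the 16-bit value 0x0102 using native byte order ("=H") and
    # inspect which byte lands first in memory.
    first_byte = struct.pack("=H", 0x0102)[0]
    return "little" if first_byte == 0x02 else "big"

# Sanity check against the interpreter's own report of the host order.
assert native_byte_order() == sys.byteorder
print(native_byte_order())
```

In C, the portable equivalent is to include `<sys/endian.h>` on BSDs versus `<endian.h>` on glibc, which is the direction the referenced fix commit takes.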
|
51,585
| 13,207,530,473
|
IssuesEvent
|
2020-08-14 23:28:15
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
Python I3File should mix in non-native keys (Trac #656)
|
Incomplete Migration Migrated from Trac dataio defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/656">https://code.icecube.wisc.edu/projects/icecube/ticket/656</a>, reported by jvansanten</summary>
<p>
```json
{
"status": "closed",
"changetime": "2011-10-25T14:30:56",
"_ts": "1319553056000000",
"description": "I3Frames emitted by I3Modules contain the non-native keys (e.g. GCDQ) found in all previous frames. It would be convenient if the Python I3File had the same functionality.\n\nA hobo implementation can be found here:\n\nhttp://code.icecube.wisc.edu/projects/icecube/browser/sandbox/python-event-viewer/trunk/python/hobomux.py?rev=79730",
"reporter": "jvansanten",
"cc": "",
"resolution": "fixed",
"time": "2011-10-25T14:27:41",
"component": "dataio",
"summary": "Python I3File should mix in non-native keys",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
Python I3File should mix in non-native keys (Trac #656) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/656">https://code.icecube.wisc.edu/projects/icecube/ticket/656</a>, reported by jvansanten</summary>
<p>
```json
{
"status": "closed",
"changetime": "2011-10-25T14:30:56",
"_ts": "1319553056000000",
"description": "I3Frames emitted by I3Modules contain the non-native keys (e.g. GCDQ) found in all previous frames. It would be convenient if the Python I3File had the same functionality.\n\nA hobo implementation can be found here:\n\nhttp://code.icecube.wisc.edu/projects/icecube/browser/sandbox/python-event-viewer/trunk/python/hobomux.py?rev=79730",
"reporter": "jvansanten",
"cc": "",
"resolution": "fixed",
"time": "2011-10-25T14:27:41",
"component": "dataio",
"summary": "Python I3File should mix in non-native keys",
"priority": "normal",
"keywords": "",
"milestone": "",
"owner": "",
"type": "defect"
}
```
</p>
</details>
|
defect
|
python should mix in non native keys trac migrated from json status closed changetime ts description emitted by contain the non native keys e g gcdq found in all previous frames it would be convenient if the python had the same functionality n na hobo implementation can be found here n n reporter jvansanten cc resolution fixed time component dataio summary python should mix in non native keys priority normal keywords milestone owner type defect
| 1
|
45,795
| 13,055,750,074
|
IssuesEvent
|
2020-07-30 02:37:29
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
opened
|
changing I3_WORK to I3_SRC and I3_BUILD (Trac #65)
|
Incomplete Migration Migrated from Trac cmake defect
|
Migrated from https://code.icecube.wisc.edu/ticket/65
```json
{
"status": "closed",
"changetime": "2007-06-14T19:38:14",
"description": "With the current cmake compiling tool, we no longer have just the one main work directory (I3_WORK), but two relevant directories: source code dir (I3_SRC) and build dir (I3_BUILD).\nSome components of the code still expect I3_WORK, even though this is obsolete now.\nI created branches of both cmake and ithon called \"src-build\", which have these relevant changes.\nThese branches should be used in a release of offline-software and also merged with the trunk at some point.\n\nWith I3_WORK in place, pyton (via ithon) would return error message about missing .so files, which made no sense, and were impossible to figure out...",
"reporter": "dule",
"cc": "troy@icecube.umd.edu",
"resolution": "duplicate",
"_ts": "1181849894000000",
"component": "cmake",
"summary": "changing I3_WORK to I3_SRC and I3_BUILD",
"priority": "major",
"keywords": "",
"time": "2007-06-14T19:31:21",
"milestone": "",
"owner": "troy",
"type": "defect"
}
```
|
1.0
|
changing I3_WORK to I3_SRC and I3_BUILD (Trac #65) - Migrated from https://code.icecube.wisc.edu/ticket/65
```json
{
"status": "closed",
"changetime": "2007-06-14T19:38:14",
"description": "With the current cmake compiling tool, we no longer have just the one main work directory (I3_WORK), but two relevant directories: source code dir (I3_SRC) and build dir (I3_BUILD).\nSome components of the code still expect I3_WORK, even though this is obsolete now.\nI created branches of both cmake and ithon called \"src-build\", which have these relevant changes.\nThese branches should be used in a release of offline-software and also merged with the trunk at some point.\n\nWith I3_WORK in place, pyton (via ithon) would return error message about missing .so files, which made no sense, and were impossible to figure out...",
"reporter": "dule",
"cc": "troy@icecube.umd.edu",
"resolution": "duplicate",
"_ts": "1181849894000000",
"component": "cmake",
"summary": "changing I3_WORK to I3_SRC and I3_BUILD",
"priority": "major",
"keywords": "",
"time": "2007-06-14T19:31:21",
"milestone": "",
"owner": "troy",
"type": "defect"
}
```
|
defect
|
changing work to src and build trac migrated from json status closed changetime description with the current cmake compiling tool we no longer have just the one main work directory work but two relevant directories source code dir src and build dir build nsome components of the code still expect work even though this is obsolete now ni created branches of both cmake and ithon called src build which have these relevant changes nthese branches should be used in a release of offline software and also merged with the trunk at some point n nwith work in place pyton via ithon would return error message about missing so files which made no sense and were impossible to figure out reporter dule cc troy icecube umd edu resolution duplicate ts component cmake summary changing work to src and build priority major keywords time milestone owner troy type defect
| 1
|
255,285
| 27,484,892,717
|
IssuesEvent
|
2023-03-04 01:31:06
|
panasalap/linux-4.1.15
|
https://api.github.com/repos/panasalap/linux-4.1.15
|
closed
|
CVE-2017-18202 (High) detected in linux-yocto-devv4.2.8 - autoclosed
|
security vulnerability
|
## CVE-2017-18202 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-devv4.2.8</b></p></summary>
<p>
<p>Linux Embedded Kernel - tracks the next mainline release</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-dev>https://git.yoctoproject.org/git/linux-yocto-dev</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.1.15/commit/aae4c2fa46027fd4c477372871df090c6b94f3f1">aae4c2fa46027fd4c477372871df090c6b94f3f1</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/mm/oom_kill.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/mm/oom_kill.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The __oom_reap_task_mm function in mm/oom_kill.c in the Linux kernel before 4.14.4 mishandles gather operations, which allows attackers to cause a denial of service (TLB entry leak or use-after-free) or possibly have unspecified other impact by triggering a copy_to_user call within a certain time window.
<p>Publish Date: 2018-02-27
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-18202>CVE-2017-18202</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-18202">https://nvd.nist.gov/vuln/detail/CVE-2017-18202</a></p>
<p>Release Date: 2018-02-27</p>
<p>Fix Resolution: 4.14.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2017-18202 (High) detected in linux-yocto-devv4.2.8 - autoclosed - ## CVE-2017-18202 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linux-yocto-devv4.2.8</b></p></summary>
<p>
<p>Linux Embedded Kernel - tracks the next mainline release</p>
<p>Library home page: <a href=https://git.yoctoproject.org/git/linux-yocto-dev>https://git.yoctoproject.org/git/linux-yocto-dev</a></p>
<p>Found in HEAD commit: <a href="https://github.com/panasalap/linux-4.1.15/commit/aae4c2fa46027fd4c477372871df090c6b94f3f1">aae4c2fa46027fd4c477372871df090c6b94f3f1</a></p>
<p>Found in base branch: <b>master</b></p></p>
</details>
</p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (2)</summary>
<p></p>
<p>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/mm/oom_kill.c</b>
<img src='https://s3.amazonaws.com/wss-public/bitbucketImages/xRedImage.png' width=19 height=20> <b>/mm/oom_kill.c</b>
</p>
</details>
<p></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
The __oom_reap_task_mm function in mm/oom_kill.c in the Linux kernel before 4.14.4 mishandles gather operations, which allows attackers to cause a denial of service (TLB entry leak or use-after-free) or possibly have unspecified other impact by triggering a copy_to_user call within a certain time window.
<p>Publish Date: 2018-02-27
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2017-18202>CVE-2017-18202</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Local
- Attack Complexity: High
- Privileges Required: Low
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2017-18202">https://nvd.nist.gov/vuln/detail/CVE-2017-18202</a></p>
<p>Release Date: 2018-02-27</p>
<p>Fix Resolution: 4.14.4</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve high detected in linux yocto autoclosed cve high severity vulnerability vulnerable library linux yocto linux embedded kernel tracks the next mainline release library home page a href found in head commit a href found in base branch master vulnerable source files mm oom kill c mm oom kill c vulnerability details the oom reap task mm function in mm oom kill c in the linux kernel before mishandles gather operations which allows attackers to cause a denial of service tlb entry leak or use after free or possibly have unspecified other impact by triggering a copy to user call within a certain time window publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with mend
| 0
|
7,081
| 2,597,119,905
|
IssuesEvent
|
2015-02-21 03:14:55
|
phetsims/tasks
|
https://api.github.com/repos/phetsims/tasks
|
closed
|
Dev Test Least-Squares Regression 1.0.0-dev.6
|
High Priority QA
|
Please dev test the latest version of Least-Squares Regression (http://www.colorado.edu/physics/phet/dev/html/least-squares-regression/1.0.0-dev.6/)
- [x] Windows + Chrome
- [x] OS X + Safari 7.1+
- [x] iPad iOS 8 + Safari
I know this testing has already begun, just creating an issue to track progress.
|
1.0
|
Dev Test Least-Squares Regression 1.0.0-dev.6 - Please dev test the latest version of Least-Squares Regression (http://www.colorado.edu/physics/phet/dev/html/least-squares-regression/1.0.0-dev.6/)
- [x] Windows + Chrome
- [x] OS X + Safari 7.1+
- [x] iPad iOS 8 + Safari
I know this testing has already begun, just creating an issue to track progress.
|
non_defect
|
dev test least squares regression dev please dev test the latest version of least squares regression windows chrome os x safari ipad ios safari i know this testing has already begun just creating an issue to track progress
| 0
|
40,716
| 2,868,938,048
|
IssuesEvent
|
2015-06-05 22:04:25
|
dart-lang/pub
|
https://api.github.com/repos/dart-lang/pub
|
closed
|
Pub validator test is failing on mac
|
bug Fixed Priority-Medium
|
<a href="https://github.com/nex3"><img src="https://avatars.githubusercontent.com/u/188?v=3" align="left" width="96" height="96"hspace="10"></img></a> **Issue by [nex3](https://github.com/nex3)**
_Originally opened as dart-lang/sdk#7330_
----
http://build.chromium.org/p/client.dart/builders/pub-mac/builds/1441/steps/pub%20tests/logs/stdio
For some reason, it doesn't seem to be recognizing a "src" directory beneath "lib" as "src".
|
1.0
|
Pub validator test is failing on mac - <a href="https://github.com/nex3"><img src="https://avatars.githubusercontent.com/u/188?v=3" align="left" width="96" height="96"hspace="10"></img></a> **Issue by [nex3](https://github.com/nex3)**
_Originally opened as dart-lang/sdk#7330_
----
http://build.chromium.org/p/client.dart/builders/pub-mac/builds/1441/steps/pub%20tests/logs/stdio
For some reason, it doesn't seem to be recognizing a "src" directory beneath "lib" as "src".
|
non_defect
|
pub validator test is failing on mac issue by originally opened as dart lang sdk for some reason it doesn t seem to be recognizing a quot src quot directory beneath quot lib quot as quot src quot
| 0
|
56,500
| 15,114,037,759
|
IssuesEvent
|
2021-02-09 00:58:56
|
playframework/playframework
|
https://api.github.com/repos/playframework/playframework
|
closed
|
QueryStringBindable.bindableString.unbind() produces URL-encoded keys, other binders - don't
|
type:defect
|
### Play Version
play-`2.8.2`
### API
Scala
### Operating System
`Linux pc 5.7.6-arch1-1 #1 SMP PREEMPT Thu, 25 Jun 2020 00:14:47 +0000 x86_64 GNU/Linux`
### JDK
```
openjdk version "1.8.0_252"
OpenJDK Runtime Environment (build 1.8.0_252-b09)
OpenJDK 64-Bit Server VM (build 25.252-b09, mixed mode)
```
### Library Dependencies
None
### Expected Behavior
`implicitly[QueryStringBindable[String]].unbind()` should NOT URL-encode key-part.
```
$ sbt console
import play.api.mvc.QueryStringBindable
scala> QueryStringBindable.bindableChar.unbind( "items[1]", '1' )
val res1: String = items[1]=1
scala> QueryStringBindable.bindableInt.unbind( "items[1]", 1 )
val res2: String = items[1]=1
scala> QueryStringBindable.bindableString.unbind( "items[1]", "1" )
val res3: String = items%5B1%5D=1
scala> QueryStringBindable.bindableDouble.unbind( "items[1]", 1 )
val res4: String = items[1]=1.0
```
### Actual Behavior
As you can see, the `res3` contains unexpected URL-encoded characters for `[` and `]` in key part: `items%5B1%5D`.
Other binders work as expected: `items[1]`.
Problem is in [play/api/mvc/Binders.scala#L312](https://github.com/playframework/playframework/blob/ddf3a7ee4285212ec665826ec268ef32b5a76000/core/play/src/main/scala/play/api/mvc/Binders.scala#L312).
|
1.0
|
QueryStringBindable.bindableString.unbind() produces URL-encoded keys, other binders - don't - ### Play Version
play-`2.8.2`
### API
Scala
### Operating System
`Linux pc 5.7.6-arch1-1 #1 SMP PREEMPT Thu, 25 Jun 2020 00:14:47 +0000 x86_64 GNU/Linux`
### JDK
```
openjdk version "1.8.0_252"
OpenJDK Runtime Environment (build 1.8.0_252-b09)
OpenJDK 64-Bit Server VM (build 25.252-b09, mixed mode)
```
### Library Dependencies
None
### Expected Behavior
`implicitly[QueryStringBindable[String]].unbind()` should NOT URL-encode key-part.
```
$ sbt console
import play.api.mvc.QueryStringBindable
scala> QueryStringBindable.bindableChar.unbind( "items[1]", '1' )
val res1: String = items[1]=1
scala> QueryStringBindable.bindableInt.unbind( "items[1]", 1 )
val res2: String = items[1]=1
scala> QueryStringBindable.bindableString.unbind( "items[1]", "1" )
val res3: String = items%5B1%5D=1
scala> QueryStringBindable.bindableDouble.unbind( "items[1]", 1 )
val res4: String = items[1]=1.0
```
### Actual Behavior
As you can see, the `res3` contains unexpected URL-encoded characters for `[` and `]` in key part: `items%5B1%5D`.
Other binders work as expected: `items[1]`.
Problem is in [play/api/mvc/Binders.scala#L312](https://github.com/playframework/playframework/blob/ddf3a7ee4285212ec665826ec268ef32b5a76000/core/play/src/main/scala/play/api/mvc/Binders.scala#L312).
|
defect
|
querystringbindable bindablestring unbind produces url encoded keys other binders don t play version play api scala operating system linux pc smp preempt thu jun gnu linux jdk openjdk version openjdk runtime environment build openjdk bit server vm build mixed mode library dependencies none expected behavior implicitly unbind should not url encode key part sbt console import play api mvc querystringbindable scala querystringbindable bindablechar unbind items val string items scala querystringbindable bindableint unbind items val string items scala querystringbindable bindablestring unbind items val string items scala querystringbindable bindabledouble unbind items val string items actual behavior as you can see the contains unexpected url encoded characters for in key part items other binders works as expected items problem is in
| 1
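The encoding mismatch reported in the record above can be reproduced outside the JVM. This is an illustrative sketch, not part of the dataset: Python's `urllib.parse.quote_plus` applies the same form-style percent-encoding as `java.net.URLEncoder`, which Play's `bindableString.unbind` uses on both the key and the value — the hypothetical helper names below are mine, not Play's.

```python
# Sketch of the reported Play behavior: bindableString.unbind() percent-encodes
# '[' and ']' in the *key* part, unlike the other binders. urllib.parse.quote_plus
# mirrors java.net.URLEncoder's form encoding, so it shows the same effect.
from urllib.parse import quote_plus

def unbind_string(key: str, value: str) -> str:
    """Mimics the reported buggy behavior: key is encoded along with the value."""
    return f"{quote_plus(key)}={quote_plus(value)}"

def unbind_string_fixed(key: str, value: str) -> str:
    """Expected behavior per the report: only the value part is encoded."""
    return f"{key}={quote_plus(value)}"

print(unbind_string("items[1]", "1"))        # items%5B1%5D=1  (reported bug)
print(unbind_string_fixed("items[1]", "1"))  # items[1]=1      (expected)
```

With the fixed variant, `items[1]` survives as a literal key, matching what `bindableInt`, `bindableChar`, and `bindableDouble` already produce in the REPL transcript above.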
|
9,799
| 2,615,175,127
|
IssuesEvent
|
2015-03-01 06:58:21
|
chrsmith/reaver-wps
|
https://api.github.com/repos/chrsmith/reaver-wps
|
opened
|
d
|
auto-migrated Priority-Triage Type-Defect
|
```
A few things to consider before submitting an issue:
0. We write documentation for a reason, if you have not read it and are
having problems with Reaver these pages are required reading before
submitting an issue:
http://code.google.com/p/reaver-wps/wiki/HintsAndTips
http://code.google.com/p/reaver-wps/wiki/README
http://code.google.com/p/reaver-wps/wiki/FAQ
http://code.google.com/p/reaver-wps/wiki/SupportedWirelessDrivers
1. Reaver will only work if your card is in monitor mode. If you do not
know what monitor mode is then you should learn more about 802.11 hacking
in linux before using Reaver.
2. Using Reaver against access points you do not own or have permission to
attack is illegal. If you cannot answer basic questions (i.e. model
number, distance away, etc) about the device you are attacking then do not
post your issue here. We will not help you break the law.
3. Please look through issues that have already been posted and make sure
your question has not already been asked here: http://code.google.com/p
/reaver-wps/issues/list
4. Often times we need packet captures of mon0 while Reaver is running to
troubleshoot the issue (tcpdump -i mon0 -s0 -w broken_reaver.pcap). Issue
reports with pcap files attached will receive more serious consideration.
Answer the following questions for every issue submitted:
0. What version of Reaver are you using? (Only defects against the latest
version will be considered.)
1. What operating system are you using (Linux is the only supported OS)?
2. Is your wireless card in monitor mode (yes/no)?
3. What is the signal strength of the Access Point you are trying to crack?
4. What is the manufacturer and model # of the device you are trying to
crack?
5. What is the entire command line string you are supplying to reaver?
6. Please describe what you think the issue is.
7. Paste the output from Reaver below.
```
Original issue reported on code.google.com by `gregory...@gmail.com` on 29 Apr 2014 at 12:36
|
1.0
|
d - ```
A few things to consider before submitting an issue:
0. We write documentation for a reason, if you have not read it and are
having problems with Reaver these pages are required reading before
submitting an issue:
http://code.google.com/p/reaver-wps/wiki/HintsAndTips
http://code.google.com/p/reaver-wps/wiki/README
http://code.google.com/p/reaver-wps/wiki/FAQ
http://code.google.com/p/reaver-wps/wiki/SupportedWirelessDrivers
1. Reaver will only work if your card is in monitor mode. If you do not
know what monitor mode is then you should learn more about 802.11 hacking
in linux before using Reaver.
2. Using Reaver against access points you do not own or have permission to
attack is illegal. If you cannot answer basic questions (i.e. model
number, distance away, etc) about the device you are attacking then do not
post your issue here. We will not help you break the law.
3. Please look through issues that have already been posted and make sure
your question has not already been asked here: http://code.google.com/p
/reaver-wps/issues/list
4. Often times we need packet captures of mon0 while Reaver is running to
troubleshoot the issue (tcpdump -i mon0 -s0 -w broken_reaver.pcap). Issue
reports with pcap files attached will receive more serious consideration.
Answer the following questions for every issue submitted:
0. What version of Reaver are you using? (Only defects against the latest
version will be considered.)
1. What operating system are you using (Linux is the only supported OS)?
2. Is your wireless card in monitor mode (yes/no)?
3. What is the signal strength of the Access Point you are trying to crack?
4. What is the manufacturer and model # of the device you are trying to
crack?
5. What is the entire command line string you are supplying to reaver?
6. Please describe what you think the issue is.
7. Paste the output from Reaver below.
```
Original issue reported on code.google.com by `gregory...@gmail.com` on 29 Apr 2014 at 12:36
|
defect
|
d a few things to consider before submitting an issue we write documentation for a reason if you have not read it and are having problems with reaver these pages are required reading before submitting an issue reaver will only work if your card is in monitor mode if you do not know what monitor mode is then you should learn more about hacking in linux before using reaver using reaver against access points you do not own or have permission to attack is illegal if you cannot answer basic questions i e model number distance away etc about the device you are attacking then do not post your issue here we will not help you break the law please look through issues that have already been posted and make sure your question has not already been asked here reaver wps issues list often times we need packet captures of while reaver is running to troubleshoot the issue tcpdump i w broken reaver pcap issue reports with pcap files attached will receive more serious consideration answer the following questions for every issue submitted what version of reaver are you using only defects against the latest version will be considered what operating system are you using linux is the only supported os is your wireless card in monitor mode yes no what is the signal strength of the access point you are trying to crack what is the manufacturer and model of the device you are trying to crack what is the entire command line string you are supplying to reaver please describe what you think the issue is paste the output from reaver below original issue reported on code google com by gregory gmail com on apr at
| 1
|
418,946
| 12,215,788,147
|
IssuesEvent
|
2020-05-01 13:46:00
|
onaio/rdt-standard
|
https://api.github.com/repos/onaio/rdt-standard
|
closed
|
Xapiens and ZebraX requesting that they need access to the most recent OpenSRP Repo
|
Priority - high covid response
|
On-call Xapeins and ZebraX request that they have most recent OpenSRP repo and have admin access to OpenSRP staging server and OpenMRS.
|
1.0
|
Xapiens and ZebraX requesting that they need access to the most recent OpenSRP Repo - On-call Xapeins and ZebraX request that they have most recent OpenSRP repo and have admin access to OpenSRP staging server and OpenMRS.
|
non_defect
|
xapiens and zebrax requesting that they need access to the most recent opensrp repo on call xapeins and zebrax request that they have most recent opensrp repo and have admin access to opensrp staging server and openmrs
| 0
|
379,598
| 11,223,918,179
|
IssuesEvent
|
2020-01-08 00:18:30
|
kubeflow/pipelines
|
https://api.github.com/repos/kubeflow/pipelines
|
closed
|
Is there a way to specify "privileged" for a containerOp?
|
priority/p2
|
cc @sethjuarez
--------------
Afaict the relevant permissions are svc-account roles and instance scopes. the latter can't be change on running nodes.
I tried updating and re-applying the cluster `.yaml` to grant the new permissions.
The svc-accts _may_ have picked up the new roles on the next pipeline run attempt.
I believe the nodes did not get the new scopes, and I had to start a fresh cluster with the changes made manually between the `generate` and `apply` `platform` kfctl steps
making this more transparent and tweakable definitely seems important for usability
|
1.0
|
Is there a way to specify "privileged" for a containerOp? - cc @sethjuarez
--------------
Afaict the relevant permissions are svc-account roles and instance scopes. the latter can't be change on running nodes.
I tried updating and re-applying the cluster `.yaml` to grant the new permissions.
The svc-accts _may_ have picked up the new roles on the next pipeline run attempt.
I believe the nodes did not get the new scopes, and I had to start a fresh cluster with the changes made manually between the `generate` and `apply` `platform` kfctl steps
making this more transparent and tweakable definitely seems important for usability
|
non_defect
|
is there a way to specify privileged for a containerop cc sethjuarez afaict the relevant permissions are svc account roles and instance scopes the latter can t be change on running nodes i tried updating and re applying the cluster yaml to grant the new permissions the svc accts may have picked up the new roles on the next pipeline run attempt i believe the nodes did not get the new scopes and i had to start a fresh cluster with the changes made manually between the generate and apply platform kfctl steps making this more transparent and tweakable definitely seems important for usability
| 0
|
156,332
| 12,305,680,694
|
IssuesEvent
|
2020-05-11 23:10:45
|
hkdobrev/notetaker
|
https://api.github.com/repos/hkdobrev/notetaker
|
opened
|
Add integration tests
|
tests
|
It would be great to have integration tests in the repo running in CI with example notes files and expected results.
|
1.0
|
Add integration tests - It would be great to have integration tests in the repo running in CI with example notes files and expected results.
|
non_defect
|
add integration tests it would be great to have integration tests in the repo running in ci with example notes files and expected results
| 0
|
78,005
| 27,273,654,037
|
IssuesEvent
|
2023-02-23 01:44:04
|
zed-industries/community
|
https://api.github.com/repos/zed-industries/community
|
closed
|
Vim mode `r` replaces with the text "enter" or "escape"
|
defect vim
|
### Check for existing issues
- [X] Completed
### Describe the bug / provide steps to reproduce it
Similar to #881, same fix may resolve both of these issues
When using replace in vim mode (by hitting <kbd>r</kbd>), if you hit either <kbd>return</kbd> to insert a newline or <kbd>esc</kbd> to cancel the action, instead you will insert the literal text `enter` or `escape`, respectively.
### Environment
Zed: v0.71.3 (stable)
OS: macOS 13.1.0
Memory: 16 GiB
Architecture: aarch64
### If applicable, add mockups / screenshots to help explain present your vision of the feature
_No response_
### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue.
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
_No response_
|
1.0
|
Vim mode `r` replaces with the text "enter" or "escape" - ### Check for existing issues
- [X] Completed
### Describe the bug / provide steps to reproduce it
Similar to #881, same fix may resolve both of these issues
When using replace in vim mode (by hitting <kbd>r</kbd>), if you hit either <kbd>return</kbd> to insert a newline or <kbd>esc</kbd> to cancel the action, instead you will insert the literal text `enter` or `escape`, respectively.
### Environment
Zed: v0.71.3 (stable)
OS: macOS 13.1.0
Memory: 16 GiB
Architecture: aarch64
### If applicable, add mockups / screenshots to help explain present your vision of the feature
_No response_
### If applicable, attach your `~/Library/Logs/Zed/Zed.log` file to this issue.
If you only need the most recent lines, you can run the `zed: open log` command palette action to see the last 1000.
_No response_
|
defect
|
vim mode r replaces with the text enter or escape check for existing issues completed describe the bug provide steps to reproduce it similar to same fix may resolve both of these issues when using replace in vim mode by hitting r if you hit either return to insert a newline or esc to cancel the action instead you will insert the literal text enter or escape respectively environment zed stable os macos memory gib architecture if applicable add mockups screenshots to help explain present your vision of the feature no response if applicable attach your library logs zed zed log file to this issue if you only need the most recent lines you can run the zed open log command palette action to see the last no response
| 1
|
48,080
| 13,067,431,198
|
IssuesEvent
|
2020-07-31 00:25:56
|
icecube-trac/tix2
|
https://api.github.com/repos/icecube-trac/tix2
|
closed
|
I3Particle documentation incomplete (Trac #1746)
|
Migrated from Trac combo core defect
|
Per Alex Olivas ... there is missing documentation on the cascade
Migrated from https://code.icecube.wisc.edu/ticket/1746
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:47",
"description": "\nPer Alex Olivas ... there is missing documentation on the cascade\n",
"reporter": "pmeade",
"cc": "olivas, gmaggi@vub.ac.be",
"resolution": "fixed",
"_ts": "1550067167842669",
"component": "combo core",
"summary": "I3Particle documentation incomplete",
"priority": "normal",
"keywords": "",
"time": "2016-06-14T19:36:30",
"milestone": "",
"owner": "gmaggi",
"type": "defect"
}
```
|
1.0
|
I3Particle documentation incomplete (Trac #1746) -
Per Alex Olivas ... there is missing documentation on the cascade
Migrated from https://code.icecube.wisc.edu/ticket/1746
```json
{
"status": "closed",
"changetime": "2019-02-13T14:12:47",
"description": "\nPer Alex Olivas ... there is missing documentation on the cascade\n",
"reporter": "pmeade",
"cc": "olivas, gmaggi@vub.ac.be",
"resolution": "fixed",
"_ts": "1550067167842669",
"component": "combo core",
"summary": "I3Particle documentation incomplete",
"priority": "normal",
"keywords": "",
"time": "2016-06-14T19:36:30",
"milestone": "",
"owner": "gmaggi",
"type": "defect"
}
```
|
defect
|
documentation incomplete trac per alex olivas there is missing documentation on the cascade migrated from json status closed changetime description nper alex olivas there is missing documentation on the cascade n reporter pmeade cc olivas gmaggi vub ac be resolution fixed ts component combo core summary documentation incomplete priority normal keywords time milestone owner gmaggi type defect
| 1
|
77,723
| 27,131,754,923
|
IssuesEvent
|
2023-02-16 10:16:26
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
opened
|
When sorting invites by activity, newest are now at the bottom which isn't helpful when you stack them up because ignoring invites still doesn't exist....
|
T-Defect
|
### Steps to reproduce
See rageshake: https://github.com/matrix-org/element-web-rageshakes/issues/20150
When sorting invites by activity, newest are now at the bottom which isn't helpful when you stack them up because ignoring invites still doesn't exist....
### Outcome
.
### Operating system
.
### Browser information
.
### URL for webapp
.
### Application version
2023021601
### Homeserver
.
### Will you send logs?
Yes
|
1.0
|
When sorting invites by activity, newest are now at the bottom which isn't helpful when you stack them up because ignoring invites still doesn't exist.... - ### Steps to reproduce
See rageshake: https://github.com/matrix-org/element-web-rageshakes/issues/20150
When sorting invites by activity, newest are now at the bottom which isn't helpful when you stack them up because ignoring invites still doesn't exist....
### Outcome
.
### Operating system
.
### Browser information
.
### URL for webapp
.
### Application version
2023021601
### Homeserver
.
### Will you send logs?
Yes
|
defect
|
when sorting invites by activity newest are now at the bottom which isn t helpful when you stack them up because ignoring invites still doesn t exist steps to reproduce see rageshake when sorting invites by activity newest are now at the bottom which isn t helpful when you stack them up because ignoring invites still doesn t exist outcome operating system browser information url for webapp application version homeserver will you send logs yes
| 1
|
59,865
| 17,023,270,340
|
IssuesEvent
|
2021-07-03 01:09:10
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
Umlaut problem while typing
|
Component: potlatch (flash editor) Priority: major Resolution: wontfix Type: defect
|
**[Submitted to the original trac issue database at 4.35pm, Wednesday, 9th July 2008]**
As i want to give a street a name, i want to write "Hhenstrae" but in the field i read "Hhenstrae".
I found out, that if i write "Hhentstrae" into a file and copy & paste it to the name-tag, it works Oo.
|
1.0
|
Umlaut problem while typing - **[Submitted to the original trac issue database at 4.35pm, Wednesday, 9th July 2008]**
As i want to give a street a name, i want to write "Hhenstrae" but in the field i read "Hhenstrae".
I found out, that if i write "Hhentstrae" into a file and copy & paste it to the name-tag, it works Oo.
|
defect
|
umlaut problem while typing as i want to give a street a name i want to write hhenstrae but in the field i read hhenstrae i found out that if i write hhentstrae into a file and copy paste it to the name tag it works oo
| 1
|
387,400
| 26,722,289,663
|
IssuesEvent
|
2023-01-29 09:28:41
|
curl/curl
|
https://api.github.com/repos/curl/curl
|
closed
|
CURLOPT_READFUNCTION documentation return value confuses bytes with objects
|
documentation
|
### I did this
I read the documentation at https://curl.se/libcurl/c/CURLOPT_READFUNCTION.html.
### I expected the following
I expected to understand if the return value should be the number of _bytes_ read, like the return value of [read](https://pubs.opengroup.org/onlinepubs/9699919799/functions/read.html), or the number of _objects_ read, like the return value of [fread](https://pubs.opengroup.org/onlinepubs/9699919799/functions/fread.html).
This is confusing because the documentation is not consistent with itself.
First, it says the return value should be the the number of bytes read:
https://github.com/curl/curl/blob/c12fb3ddaf48e709a7a4deaa55ec485e4df163ee/docs/libcurl/opts/CURLOPT_READFUNCTION.3#L47-L49
However the example it provides uses `fread`, which returns the number of objects read:
https://github.com/curl/curl/blob/c12fb3ddaf48e709a7a4deaa55ec485e4df163ee/docs/libcurl/opts/CURLOPT_READFUNCTION.3#L88
If the return value is indeed `bytes`, then it seems the example should instead be something like:
```C
size_t retcode = fread(ptr, 1, size*nmemb, readhere);
```
Scanning through the codebase, it looks like one of those arguments is always `1`, so in practice the difference might not matter. If that's the case, explicitly saying that the `size` argument is always `1` would also be helpful to make it less ambiguous.
|
1.0
|
CURLOPT_READFUNCTION documentation return value confuses bytes with objects - ### I did this
I read the documentation at https://curl.se/libcurl/c/CURLOPT_READFUNCTION.html.
### I expected the following
I expected to understand if the return value should be the number of _bytes_ read, like the return value of [read](https://pubs.opengroup.org/onlinepubs/9699919799/functions/read.html), or the number of _objects_ read, like the return value of [fread](https://pubs.opengroup.org/onlinepubs/9699919799/functions/fread.html).
This is confusing because the documentation is not consistent with itself.
First, it says the return value should be the the number of bytes read:
https://github.com/curl/curl/blob/c12fb3ddaf48e709a7a4deaa55ec485e4df163ee/docs/libcurl/opts/CURLOPT_READFUNCTION.3#L47-L49
However the example it provides uses `fread`, which returns the number of objects read:
https://github.com/curl/curl/blob/c12fb3ddaf48e709a7a4deaa55ec485e4df163ee/docs/libcurl/opts/CURLOPT_READFUNCTION.3#L88
If the return value is indeed `bytes`, then it seems the example should instead be something like:
```C
size_t retcode = fread(ptr, 1, size*nmemb, readhere);
```
Scanning through the codebase, it looks like one of those arguments is always `1`, so in practice the difference might not matter. If that's the case, explicitly saying that the `size` argument is always `1` would also be helpful to make it less ambiguous.
|
non_defect
|
curlopt readfunction documentation return value confuses bytes with objects i did this i read the documentation at i expected the following i expected to understand if the return value should be the number of bytes read like the return value of or the number of objects read like the return value of this is confusing because the documentation is not consistent with itself first it says the return value should be the the number of bytes read however the example it provides uses fread which returns the number of objects read if the return value is indeed bytes then it seems the example should instead be something like c size t retcode fread ptr size nmemb readhere scanning through the codebase it looks like one of those arguments is always so in practice the difference might not matter if that s the case explicitly saying that the size argument is always would also be helpful to make it less ambiguous
| 0
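The bytes-versus-objects distinction raised in the curl record above can be made concrete with a small sketch. This is illustrative only (not libcurl code): per C stdio semantics, `fread(ptr, size, nmemb, f)` returns the number of *complete objects* read, so its return value only equals a byte count when `size == 1` — which is why the documentation example happens to work in practice.

```python
# Sketch of fread's return-value semantics: it counts complete objects of
# `size` bytes, capped at `nmemb`, rather than raw bytes.
def fread_count(available_bytes: int, size: int, nmemb: int) -> int:
    """Number of complete objects fread would report for this many bytes."""
    return min(available_bytes // size, nmemb)

# 12 bytes available, reading 4 objects of 3 bytes each:
print(fread_count(12, size=3, nmemb=4))   # 4 objects, but 12 bytes were consumed

# With size=1 (as the suggested corrected example uses), objects == bytes:
print(fread_count(12, size=1, nmemb=12))  # 12 -- matches the byte count
```

So a `CURLOPT_READFUNCTION` callback that delegates to `fread` reports the correct byte count only under the `size == 1` convention the issue asks the docs to state explicitly.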
|
122,968
| 12,179,792,841
|
IssuesEvent
|
2020-04-28 11:18:49
|
RPR-2019/nrs_projekat_tim3
|
https://api.github.com/repos/RPR-2019/nrs_projekat_tim3
|
closed
|
Kreirati dokument dizajn sistema
|
Prioritet 1 documentation
|
Kreirati dokument "Dizajn sistema" koji će predstavljati detaljan opis dizajna buduće aplikacije. Dokument obavezno mora sadržavati minimalno:
* dijagram entitet-veza
* dijagram klasa
* dijagram raspoređivanja
* dijagram komponenti
Pored toga, dopunite dokument i drugim dijagramima za koje smatrate da su bitne.
|
1.0
|
Kreirati dokument dizajn sistema - Kreirati dokument "Dizajn sistema" koji će predstavljati detaljan opis dizajna buduće aplikacije. Dokument obavezno mora sadržavati minimalno:
* dijagram entitet-veza
* dijagram klasa
* dijagram raspoređivanja
* dijagram komponenti
Pored toga, dopunite dokument i drugim dijagramima za koje smatrate da su bitne.
|
non_defect
|
kreirati dokument dizajn sistema kreirati dokument dizajn sistema koji će predstavljati detaljan opis dizajna buduće aplikacije dokument obavezno mora sadržavati minimalno dijagram entitet veza dijagram klasa dijagram raspoređivanja dijagram komponenti pored toga dopunite dokument i drugim dijagramima za koje smatrate da su bitne
| 0
|
125,129
| 12,247,887,773
|
IssuesEvent
|
2020-05-05 16:34:58
|
iml-wg/HEPML-LivingReview
|
https://api.github.com/repos/iml-wg/HEPML-LivingReview
|
opened
|
Add Zenodo support
|
documentation
|
In the same manner that the HEP-ML-Resources has a Zenodo DOI [](https://doi.org/10.5281/zenodo.3626294) the HEMPL-LivingReview should have one as well. Similar to HEP-ML-Resources, creating annual (or maybe now quarterly) tags should be sufficient.
|
1.0
|
Add Zenodo support - In the same manner that the HEP-ML-Resources has a Zenodo DOI [](https://doi.org/10.5281/zenodo.3626294) the HEMPL-LivingReview should have one as well. Similar to HEP-ML-Resources, creating annual (or maybe now quarterly) tags should be sufficient.
|
non_defect
|
add zenodo support in the same manner that the hep ml resources has a zenodo doi the hempl livingreview should have one as well similar to hep ml resources creating annual or maybe now quarterly tags should be sufficient
| 0
|
38,550
| 8,887,576,192
|
IssuesEvent
|
2019-01-15 06:31:43
|
primefaces/primeng
|
https://api.github.com/repos/primefaces/primeng
|
closed
|
ClearState method is cleaning all states. It should clears only stateKey state.
|
defect
|
Reported by a PRO user;
> "clearState()" method in the "table" component is supposed to clear the saved state of the table by using the "stateKey" given in the template. But instead, it deleting everything.
|
1.0
|
ClearState method is cleaning all states. It should clears only stateKey state. - Reported by a PRO user;
> "clearState()" method in the "table" component is supposed to clear the saved state of the table by using the "stateKey" given in the template. But instead, it deleting everything.
|
defect
|
clearstate method is cleaning all states it should clears only statekey state reported by a pro user clearstate method in the table component is supposed to clear the saved state of the table by using the statekey given in the template but instead it deleting everything
| 1
|
10,869
| 2,622,337,198
|
IssuesEvent
|
2015-03-04 01:40:39
|
0xtob/nitrotracker
|
https://api.github.com/repos/0xtob/nitrotracker
|
opened
|
Folders having a special character as a first character, appear before [..] link
|
auto-migrated Priority-Medium Type-Defect
|
```
If the first character in a folder name is a special character, like an
exclamation mark (which I often use to keep certain folders at the top of
alphabetized lists), it will appear above the [..] link.
```
Original issue reported on code.google.com by `zoik...@yahoo.com` on 7 Apr 2010 at 12:41
|
1.0
|
Folders having a special character as a first character, appear before [..] link - ```
If the first character in a folder name is a special character, like an
exclamation mark (which I often use to keep certain folders at the top of
alphabetized lists), it will appear above the [..] link.
```
Original issue reported on code.google.com by `zoik...@yahoo.com` on 7 Apr 2010 at 12:41
|
defect
|
folders having a special character as a first character appear before link if the first character in a folder name is a special character like an exclamation mark which i often use to keep certain folders at the top of alphabetized lists it will appear above the link original issue reported on code google com by zoik yahoo com on apr at
| 1
|
67,924
| 21,317,787,167
|
IssuesEvent
|
2022-04-16 15:36:42
|
cakephp/bake
|
https://api.github.com/repos/cakephp/bake
|
closed
|
If you create an entity with `bin/cake bake model`, the Value of `$_accessible` is generated with "1". I want boolean.
|
Defect
|
### Description
ex
```sh
$ bin/cake bake model users
```
```php
class User extends Entity
{
// ...
protected $_accessible = [
'id' => 1,
'email' => 1,
'password' => 1,
];
// ...
}
```
Here's what I want.
```
class User extends Entity
{
// ...
protected $_accessible = [
'id' => true,
'email' => true,
'password' => true,
];
// ...
}
```
### CakePHP Version
4.3.0
### PHP Version
8.0
|
1.0
|
If you create an entity with `bin/cake bake model`, the Value of `$_accessible` is generated with "1". I want boolean. - ### Description
ex
```sh
$ bin/cake bake model users
```
```php
class User extends Entity
{
// ...
protected $_accessible = [
'id' => 1,
'email' => 1,
'password' => 1,
];
// ...
}
```
Here's what I want.
```
class User extends Entity
{
// ...
protected $_accessible = [
'id' => true,
'email' => true,
'password' => true,
];
// ...
}
```
### CakePHP Version
4.3.0
### PHP Version
8.0
|
defect
|
if you create an entity with bin cake bake model the value of accessible is generated with i want boolean description ex sh bin cake bake model users php class user extends entity protected accessible id email password here s what i want class user extends entity protected accessible id true email true password true cakephp version php version
| 1
|
76,114
| 26,247,949,602
|
IssuesEvent
|
2023-01-05 16:42:11
|
vector-im/element-android
|
https://api.github.com/repos/vector-im/element-android
|
closed
|
[Session manager] Missing info when a session does not support encryption
|
T-Defect S-Minor O-Uncommon A-Settings
|
### Steps to reproduce
- Create a session that does not support encryption using the following command line:
`curl -d '{ "type": "m.login.password", "identifier": { "type": "m.id.user", "user": "..." }, "password": "..." }' -H "Content-Type: application/json" -X POST https://matrix-client.matrix.org/_matrix/client/v3/login`
- Enable the new session manager in labs settings
- Go to Settings -> Security & Privacy -> Show all sessions
- Go to the other sessions list to find the new created session from command line
- Press the item of this session
### Outcome
#### What did you expect?
When a session does not support encryption, in the new session manager:
- the session name should not be empty and should be the session ID in worst case scenario
- session info is displayed in the session overview screen
- session appears in the unverified sessions list
Also, when the last activity timestamp of a session is not known, it should not be considered as inactive.
#### What happened instead?
When a session does not support encryption, in the new session manager:
- the session name is empty
- there is no info displayed in the session overview screen
- the session does not appear in the unverified sessions list whereas it is considered as unverified
Also, when the last activity timestamp of a session is not known, it is considered as inactive whereas we cannot say anything about the inactivity status of the session.
### Your phone model
_No response_
### Operating system version
_No response_
### Application version and app store
_No response_
### Homeserver
_No response_
### Will you send logs?
No
### Are you willing to provide a PR?
Yes
|
1.0
|
[Session manager] Missing info when a session does not support encryption - ### Steps to reproduce
- Create a session that does not support encryption using the following command line:
`curl -d '{ "type": "m.login.password", "identifier": { "type": "m.id.user", "user": "..." }, "password": "..." }' -H "Content-Type: application/json" -X POST https://matrix-client.matrix.org/_matrix/client/v3/login`
- Enable the new session manager in labs settings
- Go to Settings -> Security & Privacy -> Show all sessions
- Go to the other sessions list to find the new created session from command line
- Press the item of this session
### Outcome
#### What did you expect?
When a session does not support encryption, in the new session manager:
- the session name should not be empty and should be the session ID in worst case scenario
- session info is displayed in the session overview screen
- session appears in the unverified sessions list
Also, when the last activity timestamp of a session is not known, it should not be considered as inactive.
#### What happened instead?
When a session does not support encryption, in the new session manager:
- the session name is empty
- there is no info displayed in the session overview screen
- the session does not appear in the unverified sessions list whereas it is considered as unverified
Also, when the last activity timestamp of a session is not known, it is considered as inactive whereas we cannot say anything about the inactivity status of the session.
### Your phone model
_No response_
### Operating system version
_No response_
### Application version and app store
_No response_
### Homeserver
_No response_
### Will you send logs?
No
### Are you willing to provide a PR?
Yes
|
defect
|
missing info when a session does not support encryption steps to reproduce create a session that does not support encryption using the following command line curl d type m login password identifier type m id user user password h content type application json x post enable the new session manager in labs settings go to settings security privacy show all sessions go to the other sessions list to find the new created session from command line press the item of this session outcome what did you expect when a session does not support encryption in the new session manager the session name should not be empty and should be the session id in worst case scenario session info is displayed in the session overview screen session appears in the unverified sessions list also when the last activity timestamp of a session is not known it should not be considered as inactive what happened instead when a session does not support encryption in the new session manager the session name is empty there is no info displayed in the session overview screen the session does not appear in the unverified sessions list whereas it is considered as unverified also when the last activity timestamp of a session is not known it is considered as inactive whereas we cannot say anything about the inactivity status of the session your phone model no response operating system version no response application version and app store no response homeserver no response will you send logs no are you willing to provide a pr yes
| 1
|
206,745
| 15,772,211,196
|
IssuesEvent
|
2021-03-31 21:29:52
|
pathfinder-for-autonomous-navigation/FlightSoftware
|
https://api.github.com/repos/pathfinder-for-autonomous-navigation/FlightSoftware
|
closed
|
Update QuakeFaultHandler behavior to modify QuakeManager
|
enhancement functional testing
|
The current implementation is based on the assumption that the QuakeFaultHandler can only power cycle when the radio is in the Wait state.
After more examination, we determined that this underlying behavior needed to be changed. When the QuakeFaultHandler is about to power cycle, it should instead be setting the radio to be in the Wait state. This is related to the closed ticket #509 which tested this old behavior.
|
1.0
|
Update QuakeFaultHandler behavior to modify QuakeManager - The current implementation is based on the assumption that the QuakeFaultHandler can only power cycle when the radio is in the Wait state.
After more examination, we determined that this underlying behavior needed to be changed. When the QuakeFaultHandler is about to power cycle, it should instead be setting the radio to be in the Wait state. This is related to the closed ticket #509 which tested this old behavior.
|
non_defect
|
update quakefaulthandler behavior to modify quakemanager the current implementation is based on the assumption that the quakefaulthandler can only power cycle when the radio is in the wait state after more examination we determined that this underlying behavior needed to be changed when the quakefaulthandler is about to power cycle it should instead be setting the radio to be in the wait state this is related to the closed ticket which tested this old behavior
| 0
|
196,460
| 14,860,830,548
|
IssuesEvent
|
2021-01-18 21:22:10
|
ESMValGroup/ESMValTool
|
https://api.github.com/repos/ESMValGroup/ESMValTool
|
closed
|
Tests are failing on Circle because the cached test code uses an ancient fiona
|
test
|
https://app.circleci.com/pipelines/github/ESMValGroup/ESMValTool/4012/workflows/d9967e41-789d-4a98-9493-e1d113a129ea/jobs/41067/steps
Keep calm and update fiona! Don't worry about the the Circle test fails, the `fiona` that gets installed on Circle is `1.8.13` which is ancient, the testing suite works perfectly fine with `1.8.18` - in fact in https://github.com/ESMValGroup/ESMValCore/pull/885 I have added a note about it saying we may have to switch to install it from conda in the very near future anyway
|
1.0
|
Tests are failing on Circle because the cached test code uses an ancient fiona - https://app.circleci.com/pipelines/github/ESMValGroup/ESMValTool/4012/workflows/d9967e41-789d-4a98-9493-e1d113a129ea/jobs/41067/steps
Keep calm and update fiona! Don't worry about the the Circle test fails, the `fiona` that gets installed on Circle is `1.8.13` which is ancient, the testing suite works perfectly fine with `1.8.18` - in fact in https://github.com/ESMValGroup/ESMValCore/pull/885 I have added a note about it saying we may have to switch to install it from conda in the very near future anyway
|
non_defect
|
tests are failing on circle because the cached test code uses an ancient fiona keep calm and update fiona don t worry about the the circle test fails the fiona that gets installed on circle is which is ancient the testing suite works perfectly fine with in fact in i have added a note about it saying we may have to switch to install it from conda in the very near future anyway
| 0
|
119,121
| 15,415,482,806
|
IssuesEvent
|
2021-03-05 02:42:51
|
SasanLabs/VulnerableApp
|
https://api.github.com/repos/SasanLabs/VulnerableApp
|
closed
|
In future this project might become heavy weight so we need to think on making it light weight.
|
Analysis Future Goal design-document documentation
|
Going further in future this project might become very heavy weight so we need to think on making it in such a way that it will be light weight.
|
1.0
|
In future this project might become heavy weight so we need to think on making it light weight. - Going further in future this project might become very heavy weight so we need to think on making it in such a way that it will be light weight.
|
non_defect
|
in future this project might become heavy weight so we need to think on making it light weight going further in future this project might become very heavy weight so we need to think on making it in such a way that it will be light weight
| 0
|
71,735
| 23,778,397,119
|
IssuesEvent
|
2022-09-02 00:04:59
|
CorfuDB/CorfuDB
|
https://api.github.com/repos/CorfuDB/CorfuDB
|
closed
|
Propose in Paxos produces a false negative OutrankedException is certain cases
|
defect
|
## Overview
```
CFUtils.getUninterruptibly(CompletableFuture.anyOf(proposeList),
OutrankedException.class, TimeoutException.class, NetworkException.class,
WrongEpochException.class);
```
In the propose code of the LayoutView wrongfully throws OutRankedException in certain scenarios. Example:
* A, B, C at epoch 0, A has finished state transfer and is about to run this method.
* A invokes either layoutManagementView.mergeSegments or layoutManagementView.runLayoutReconfiguration does not matter.
* In both of these methods, it calls sealEpoch.
* B & C seal immediately, now they are at epoch 1, and we only need a majority for this method to return. A is still at 0 as its layout server is still processing it.
* A calls attemptConsensus on the new layout.
* It sends prepare to A, B, C, gets a quorum responses from B and C. So continues to propose phase.
* When the layout_prepare request hits A's layout server and it it's still not sealed, we reject the prepare request with a WEE.
* A runs the second phase (propose), since the majority is sealed, and finds out that the phase1Rank is null (cause it has been previously rejected to the WEE) it throws OutRankedException within the CompletableFuture.anyOf(proposeList).
|
1.0
|
Propose in Paxos produces a false negative OutrankedException is certain cases - ## Overview
```
CFUtils.getUninterruptibly(CompletableFuture.anyOf(proposeList),
OutrankedException.class, TimeoutException.class, NetworkException.class,
WrongEpochException.class);
```
In the propose code of the LayoutView wrongfully throws OutRankedException in certain scenarios. Example:
* A, B, C at epoch 0, A has finished state transfer and is about to run this method.
* A invokes either layoutManagementView.mergeSegments or layoutManagementView.runLayoutReconfiguration does not matter.
* In both of these methods, it calls sealEpoch.
* B & C seal immediately, now they are at epoch 1, and we only need a majority for this method to return. A is still at 0 as its layout server is still processing it.
* A calls attemptConsensus on the new layout.
* It sends prepare to A, B, C, gets a quorum responses from B and C. So continues to propose phase.
* When the layout_prepare request hits A's layout server and it it's still not sealed, we reject the prepare request with a WEE.
* A runs the second phase (propose), since the majority is sealed, and finds out that the phase1Rank is null (cause it has been previously rejected to the WEE) it throws OutRankedException within the CompletableFuture.anyOf(proposeList).
|
defect
|
propose in paxos produces a false negative outrankedexception is certain cases overview cfutils getuninterruptibly completablefuture anyof proposelist outrankedexception class timeoutexception class networkexception class wrongepochexception class in the propose code of the layoutview wrongfully throws outrankedexception in certain scenarios example a b c at epoch a has finished state transfer and is about to run this method a invokes either layoutmanagementview mergesegments or layoutmanagementview runlayoutreconfiguration does not matter in both of these methods it calls sealepoch b c seal immediately now they are at epoch and we only need a majority for this method to return a is still at as its layout server is still processing it a calls attemptconsensus on the new layout it sends prepare to a b c gets a quorum responses from b and c so continues to propose phase when the layout prepare request hits a s layout server and it it s still not sealed we reject the prepare request with a wee a runs the second phase propose since the majority is sealed and finds out that the is null cause it has been previously rejected to the wee it throws outrankedexception within the completablefuture anyof proposelist
| 1
|
199,900
| 22,715,374,377
|
IssuesEvent
|
2022-07-06 01:09:37
|
RG4421/openedr
|
https://api.github.com/repos/RG4421/openedr
|
opened
|
CVE-2021-22924 (Low) detected in multiple libraries
|
security vulnerability
|
## CVE-2021-22924 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
libcurl keeps previously used connections in a connection pool for subsequenttransfers to reuse, if one of them matches the setup.Due to errors in the logic, the config matching function did not take 'issuercert' into account and it compared the involved paths *case insensitively*,which could lead to libcurl reusing wrong connections.File paths are, or can be, case sensitive on many systems but not all, and caneven vary depending on used file systems.The comparison also didn't include the 'issuer cert' which a transfer can setto qualify how to verify the server certificate.
<p>Publish Date: 2021-08-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-22924>CVE-2021-22924</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://curl.se/docs/CVE-2021-22924.html">https://curl.se/docs/CVE-2021-22924.html</a></p>
<p>Release Date: 2021-08-05</p>
<p>Fix Resolution: curl-7_78_0</p>
</p>
</details>
<p></p>
|
True
|
CVE-2021-22924 (Low) detected in multiple libraries - ## CVE-2021-22924 - Low Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b>, <b>curlcurl-7_63_0</b></p></summary>
<p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/low_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
libcurl keeps previously used connections in a connection pool for subsequenttransfers to reuse, if one of them matches the setup.Due to errors in the logic, the config matching function did not take 'issuercert' into account and it compared the involved paths *case insensitively*,which could lead to libcurl reusing wrong connections.File paths are, or can be, case sensitive on many systems but not all, and caneven vary depending on used file systems.The comparison also didn't include the 'issuer cert' which a transfer can setto qualify how to verify the server certificate.
<p>Publish Date: 2021-08-05
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-22924>CVE-2021-22924</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>3.7</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: Low
- Integrity Impact: None
- Availability Impact: None
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://curl.se/docs/CVE-2021-22924.html">https://curl.se/docs/CVE-2021-22924.html</a></p>
<p>Release Date: 2021-08-05</p>
<p>Fix Resolution: curl-7_78_0</p>
</p>
</details>
<p></p>
|
non_defect
|
cve low detected in multiple libraries cve low severity vulnerability vulnerable libraries curlcurl curlcurl curlcurl curlcurl curlcurl curlcurl vulnerability details libcurl keeps previously used connections in a connection pool for subsequenttransfers to reuse if one of them matches the setup due to errors in the logic the config matching function did not take issuercert into account and it compared the involved paths case insensitively which could lead to libcurl reusing wrong connections file paths are or can be case sensitive on many systems but not all and caneven vary depending on used file systems the comparison also didn t include the issuer cert which a transfer can setto qualify how to verify the server certificate publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact low integrity impact none availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution curl
| 0
|
57,813
| 16,085,899,610
|
IssuesEvent
|
2021-04-26 11:09:25
|
primefaces/primereact
|
https://api.github.com/repos/primefaces/primereact
|
closed
|
Tooltip: Fixed tooltip doesnt work with elements inside Tooltip children ( autoHide = false )
|
defect
|
**I'm submitting a ...** (check one with "x")
```
[x] bug report
[ ] feature request
[ ] support request => Please do not submit support request here, instead see https://forum.primefaces.org/viewforum.php?f=57
```
**Codesandbox Case (Bug Reports)**
https://codesandbox.io/s/gifted-chandrasekhar-nepgn?file=/src/demo/TooltipDemo.js
**Current behavior**
<!-- Describe how the bug manifests. -->
Autohide false
**Expected behavior**
<!-- Describe what the behavior would be without the bug. -->
When a element exists inside Tooltip content, the tooltip will close.
**Minimal reproduction of the problem with instructions**
The tooltip shouldnt close for elements inside the tooltip tree
|
1.0
|
Tooltip: Fixed tooltip doesnt work with elements inside Tooltip children ( autoHide = false ) - **I'm submitting a ...** (check one with "x")
```
[x] bug report
[ ] feature request
[ ] support request => Please do not submit support request here, instead see https://forum.primefaces.org/viewforum.php?f=57
```
**Codesandbox Case (Bug Reports)**
https://codesandbox.io/s/gifted-chandrasekhar-nepgn?file=/src/demo/TooltipDemo.js
**Current behavior**
<!-- Describe how the bug manifests. -->
Autohide false
**Expected behavior**
<!-- Describe what the behavior would be without the bug. -->
When a element exists inside Tooltip content, the tooltip will close.
**Minimal reproduction of the problem with instructions**
The tooltip shouldnt close for elements inside the tooltip tree
|
defect
|
tooltip fixed tooltip doesnt work with elements inside tooltip children autohide false i m submitting a check one with x bug report feature request support request please do not submit support request here instead see codesandbox case bug reports current behavior autohide false expected behavior when a element exists inside tooltip content the tooltip will close minimal reproduction of the problem with instructions the tooltip shouldnt close for elements inside the tooltip tree
| 1
|
32,684
| 6,892,617,653
|
IssuesEvent
|
2017-11-22 21:54:06
|
jquery/esprima
|
https://api.github.com/repos/jquery/esprima
|
closed
|
`a <!-- b` in modules are actually equivalent to `a < !(--b)`
|
defect
|
See: https://github.com/tc39/ecma262/issues/949
An error is generated for `a <!-- b` with `scriptType: "module"`, when it should parse like `a < !(--b)`.
|
1.0
|
`a <!-- b` in modules are actually equivalent to `a < !(--b)` - See: https://github.com/tc39/ecma262/issues/949
An error is generated for `a <!-- b` with `scriptType: "module"`, when it should parse like `a < !(--b)`.
|
defect
|
a b in modules are actually equivalent to a b see an error is generated for a b with scripttype module when it should parse like a b
| 1
|
16,768
| 2,942,244,513
|
IssuesEvent
|
2015-07-02 13:19:33
|
icatproject/ijp.torque
|
https://api.github.com/repos/icatproject/ijp.torque
|
closed
|
Clean up mechanism for interactive users inadequate
|
Type-Defect
|
If an interactive user fails to log off properly the system will not notice and will
not put the machine back on line.
This can be seen by tailing portal.log
Workaround:
Log onto the machine and kill all processes belonging to that user
|
1.0
|
Clean up mechanism for interactive users inadequate - If an interactive user fails to log off properly the system will not notice and will
not put the machine back on line.
This can be seen by tailing portal.log
Workaround:
Log onto the machine and kill all processes belonging to that user
|
defect
|
clean up mechanism for interactive users inadequate if an interactive user fails to log off properly the system will not notice and will not put the machine back on line this can be seen by tailing portal log workaround log onto the machine and kill all processes belonging to that user
| 1
|
508,850
| 14,707,011,544
|
IssuesEvent
|
2021-01-04 20:51:35
|
internetarchive/openlibrary
|
https://api.github.com/repos/internetarchive/openlibrary
|
opened
|
Give Nick full access to blog.openlibrary.org
|
1-off tasks Lead: @mekarpeles Priority: 1
|
This is blocking Nick from contributing blog posts efficiently. The process was weird; he has an account blog.archive.org, but can't log in to blog.openlibrary.org , I believe.
|
1.0
|
Give Nick full access to blog.openlibrary.org - This is blocking Nick from contributing blog posts efficiently. The process was weird; he has an account blog.archive.org, but can't log in to blog.openlibrary.org , I believe.
|
non_defect
|
give nick full access to blog openlibrary org this is blocking nick from contributing blog posts efficiently the process was weird he has an account blog archive org but can t log in to blog openlibrary org i believe
| 0
|
6,627
| 7,714,719,499
|
IssuesEvent
|
2018-05-23 03:48:32
|
aws/aws-sdk-go
|
https://api.github.com/repos/aws/aws-sdk-go
|
closed
|
SQS TLS handshake timeout
|
Service API
|
### Version of AWS SDK for Go?
v1.13.32
### Version of Go (`go version`)?
1.9.2
### What issue did you see?
Getting a handshake timeout when trying to connect to SQS:
RequestError: send request failed
caused by: Post https://sqs.ap-southeast-2.amazonaws.com/: net/http: TLS handshake timeout
### Steps to reproduce
Using snippet:
`
resultURL, err := channel.GetQueueUrl(&sqs.GetQueueUrlInput{
QueueName: aws.String(queueName),
})
`
|
1.0
|
SQS TLS handshake timeout - ### Version of AWS SDK for Go?
v1.13.32
### Version of Go (`go version`)?
1.9.2
### What issue did you see?
Getting a handshake timeout when trying to connect to SQS:
RequestError: send request failed
caused by: Post https://sqs.ap-southeast-2.amazonaws.com/: net/http: TLS handshake timeout
### Steps to reproduce
Using snippet:
`
resultURL, err := channel.GetQueueUrl(&sqs.GetQueueUrlInput{
QueueName: aws.String(queueName),
})
`
|
non_defect
|
sqs tls handshake timeout version of aws sdk for go version of go go version what issue did you see getting a handshake timeout when trying to connect to sqs requesterror send request failed caused by post net http tls handshake timeout steps to reproduce using snippet resulturl err channel getqueueurl sqs getqueueurlinput queuename aws string queuename
| 0
|
50,590
| 6,103,284,066
|
IssuesEvent
|
2017-06-20 18:22:28
|
kubernetes/kubernetes
|
https://api.github.com/repos/kubernetes/kubernetes
|
closed
|
What's the specification of the slave that runs "bazel build" in your jenkins CI ?
|
sig/testing
|
I find it takes long time to build kubernetes. For some reason, I want to know the specification of the slave that runs bazel-build in the your CI. What's more, why your jenkins runs "bazel clean" before "bazel build" every time?
|
1.0
|
What's the specification of the slave that runs "bazel build" in your jenkins CI ? - I find it takes long time to build kubernetes. For some reason, I want to know the specification of the slave that runs bazel-build in the your CI. What's more, why your jenkins runs "bazel clean" before "bazel build" every time?
|
non_defect
|
what s the specification of the slave that runs bazel build in your jenkins ci i find it takes long time to build kubernetes for some reason i want to know the specification of the slave that runs bazel build in the your ci what s more why your jenkins runs bazel clean before bazel build every time
| 0
|
177,041
| 6,573,491,077
|
IssuesEvent
|
2017-09-11 08:59:51
|
gdgphilippines/devfest
|
https://api.github.com/repos/gdgphilippines/devfest
|
closed
|
Different sizes of pictures in different browe
|
bug Priority Medium review
|
<!-- Instructions: https://github.com/gdgphilippines/devfest/blob/master/CONTRIBUTING.md#using-the-issue-tracker -->
<!-- Copied from Firebase Polyfire template -->
### Description
Different sizes of pictures and Google chrome don't have a hamburger.
### Expected outcome
The same sizes
### Actual outcome
Different sizes
### Live Demo
<!-- Example: https://jsbin.com/cagaye/edit?html,output -->
### Steps to reproduce
<!-- Example
1. Put a `paper-foo` element in the page.
2. Open the page in a web browser.
3. Click the `paper-foo` element.
-->
### Browsers Affected
<!-- Check all that apply -->
- [ ] Chrome
- [ ] Firefox
- [ ] Edge

702/30250460-40cc736a-9681-11e7-99bc-def284d66244.png)


|
1.0
|
Different sizes of pictures in different browe - <!-- Instructions: https://github.com/gdgphilippines/devfest/blob/master/CONTRIBUTING.md#using-the-issue-tracker -->
<!-- Copied from Firebase Polyfire template -->
### Description
Different sizes of pictures and Google chrome don't have a hamburger.
### Expected outcome
The same sizes
### Actual outcome
Different sizes
### Live Demo
<!-- Example: https://jsbin.com/cagaye/edit?html,output -->
### Steps to reproduce
<!-- Example
1. Put a `paper-foo` element in the page.
2. Open the page in a web browser.
3. Click the `paper-foo` element.
-->
### Browsers Affected
<!-- Check all that apply -->
- [ ] Chrome
- [ ] Firefox
- [ ] Edge

702/30250460-40cc736a-9681-11e7-99bc-def284d66244.png)


|
non_defect
|
different sizes of pictures in different browe description different sizes of pictures and google chrome don t have a hamburger expected outcome the same sizes actual outcome different sizes live demo steps to reproduce example put a paper foo element in the page open the page in a web browser click the paper foo element browsers affected chrome firefox edge png
| 0
|
12,467
| 2,700,606,630
|
IssuesEvent
|
2015-04-04 10:40:14
|
bridgedotnet/Bridge
|
https://api.github.com/repos/bridgedotnet/Bridge
|
closed
|
Inline comments are broken to next line in generated JavaScript file
|
defect
|
The C# code:
```
namespace InLineCMT
{
public class Commenter
{
private static void Main()
{
var a = 1; // inits a with one
}
}
}
```
Results in:
```
Bridge.define('InLineCMT.Commenter', {
statics: {
main: function () {
var a = 1;
// inits a with one
}
}
});
```
This makes nicely inline comments on C# look strange on the output JavaScript.
|
1.0
|
Inline comments are broken to next line in generated JavaScript file - The C# code:
```
namespace InLineCMT
{
public class Commenter
{
private static void Main()
{
var a = 1; // inits a with one
}
}
}
```
Results in:
```
Bridge.define('InLineCMT.Commenter', {
statics: {
main: function () {
var a = 1;
// inits a with one
}
}
});
```
This makes nicely inline comments on C# look strange on the output JavaScript.
|
defect
|
inline comments are broken to next line in generated javascript file the c code namespace inlinecmt public class commenter private static void main var a inits a with one results in bridge define inlinecmt commenter statics main function var a inits a with one this makes nicely inline comments on c look strange on the output javascript
| 1
|
57,423
| 15,777,933,771
|
IssuesEvent
|
2021-04-01 07:02:24
|
primefaces/primeng
|
https://api.github.com/repos/primefaces/primeng
|
closed
|
Pick list events emit inconsistent types
|
LTS-PORTABLE defect
|
**I'm submitting a ...**
```
[x] bug report => Search github for a similar issue or PR before submitting
[ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap
[ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35
```
**Plunkr Case (Bug Reports)**
https://stackblitz.com/edit/github-rb49kh?file=src%2Fapp%2Fapp.component.html
**Current behavior**
When dragging and dropping a picklist item, an array is emitted, where as when moving via double click or using a transfer button, an object with an items array property is emitted.
The documentation currently states the following:
```onMoveToTarget | event.items: Moved items array```
```onMoveToSource | event.items: Moved items array```
**Expected behavior**
When moving items in either fashion, the types emitted should be the same, as per the documentation.
```
{
items: any[]
}
```
**Minimal reproduction of the problem with instructions**
* Add a picklist to your component
* Setup event bindings for the onMoveToTarget and onMoveToSource ouputs.
* Observe the outputs for these event bindings when doing a drag and drop, as well as when doing a transfer using the transfer buttons or double click.
**What is the motivation / use case for changing the behavior?**
To ensure that the documentation is accurate and the behaviour is consistent across all events that use the same event emitters
**Please tell us about your environment:**
Windows 10
VSCode
npm
* **Angular version:** 11.2.7
* **PrimeNG version:** 11.3.1
* **Browser:** all
<!-- All browsers where this could be reproduced -->
* **Language:** TypeScript 4.1.5
* **Node (for AoT issues):** `node --version` = v12.20.1
|
1.0
|
Pick list events emit inconsistent types - **I'm submitting a ...**
```
[x] bug report => Search github for a similar issue or PR before submitting
[ ] feature request => Please check if request is not on the roadmap already https://github.com/primefaces/primeng/wiki/Roadmap
[ ] support request => Please do not submit support request here, instead see http://forum.primefaces.org/viewforum.php?f=35
```
**Plunkr Case (Bug Reports)**
https://stackblitz.com/edit/github-rb49kh?file=src%2Fapp%2Fapp.component.html
**Current behavior**
When dragging and dropping a picklist item, an array is emitted, where as when moving via double click or using a transfer button, an object with an items array property is emitted.
The documentation currently states the following:
```onMoveToTarget | event.items: Moved items array```
```onMoveToSource | event.items: Moved items array```
**Expected behavior**
When moving items in either fashion, the types emitted should be the same, as per the documentation.
```
{
items: any[]
}
```
**Minimal reproduction of the problem with instructions**
* Add a picklist to your component
* Setup event bindings for the onMoveToTarget and onMoveToSource ouputs.
* Observe the outputs for these event bindings when doing a drag and drop, as well as when doing a transfer using the transfer buttons or double click.
**What is the motivation / use case for changing the behavior?**
To ensure that the documentation is accurate and the behaviour is consistent across all events that use the same event emitters
**Please tell us about your environment:**
Windows 10
VSCode
npm
* **Angular version:** 11.2.7
* **PrimeNG version:** 11.3.1
* **Browser:** all
<!-- All browsers where this could be reproduced -->
* **Language:** TypeScript 4.1.5
* **Node (for AoT issues):** `node --version` = v12.20.1
|
defect
|
pick list events emit inconsistent types i m submitting a bug report search github for a similar issue or pr before submitting feature request please check if request is not on the roadmap already support request please do not submit support request here instead see plunkr case bug reports current behavior when dragging and dropping a picklist item an array is emitted where as when moving via double click or using a transfer button an object with an items array property is emitted the documentation currently states the following onmovetotarget event items moved items array onmovetosource event items moved items array expected behavior when moving items in either fashion the types emitted should be the same as per the documentation items any minimal reproduction of the problem with instructions add a picklist to your component setup event bindings for the onmovetotarget and onmovetosource ouputs observe the outputs for these event bindings when doing a drag and drop as well as when doing a transfer using the transfer buttons or double click what is the motivation use case for changing the behavior to ensure that the documentation is accurate and the behaviour is consistent across all events that use the same event emitters please tell us about your environment windows vscode npm angular version primeng version browser all language typescript node for aot issues node version
| 1
|
32,282
| 6,758,304,834
|
IssuesEvent
|
2017-10-24 13:50:05
|
BOINC/boinc
|
https://api.github.com/repos/BOINC/boinc
|
closed
|
Commas in usernames break private messages
|
C: Web - Private Messages E: 1 day P: Minor T: Defect
|
**Reported by ToeBee on 8 Dec 38260853 16:00 UTC**
Commas are used on the "Send Private Message" page (pm.php?action=new) to allow you to send messages to multiple recipients. However this causes a problem if the user you are trying to send a message to has a comma in their username. It tries to find two users to send the message to. As a workaround you can remove the name and only use the user ID but when you click on a user's name from the forum both are included.
Either a different delimiter needs to be used or the username needs to be enclosed in something or maybe commas in usernames should be escaped. Currently the username is enclosed in parens but the parsing of usernames apparently doesn't take them into account. Also, what if a username has parens in their name? Should some of these characters be disallowed?
Migrated-From: http://boinc.berkeley.edu/trac/ticket/594
|
1.0
|
Commas in usernames break private messages - **Reported by ToeBee on 8 Dec 38260853 16:00 UTC**
Commas are used on the "Send Private Message" page (pm.php?action=new) to allow you to send messages to multiple recipients. However this causes a problem if the user you are trying to send a message to has a comma in their username. It tries to find two users to send the message to. As a workaround you can remove the name and only use the user ID but when you click on a user's name from the forum both are included.
Either a different delimiter needs to be used or the username needs to be enclosed in something or maybe commas in usernames should be escaped. Currently the username is enclosed in parens but the parsing of usernames apparently doesn't take them into account. Also, what if a username has parens in their name? Should some of these characters be disallowed?
Migrated-From: http://boinc.berkeley.edu/trac/ticket/594
|
defect
|
commas in usernames break private messages reported by toebee on dec utc commas are used on the send private message page pm php action new to allow you to send messages to multiple recipients however this causes a problem if the user you are trying to send a message to has a comma in their username it tries to find two users to send the message to as a workaround you can remove the name and only use the user id but when you click on a user s name from the forum both are included either a different delimiter needs to be used or the username needs to be enclosed in something or maybe commas in usernames should be escaped currently the username is enclosed in parens but the parsing of usernames apparently doesn t take them into account also what if a username has parens in their name should some of these characters be disallowed migrated from
| 1
|
80,354
| 30,246,123,695
|
IssuesEvent
|
2023-07-06 16:36:27
|
gperftools/gperftools
|
https://api.github.com/repos/gperftools/gperftools
|
closed
|
Failed to build with lib musl
|
Type-Defect Priority-Medium Status-New
|
Originally reported on Google Code with ID 690
```
What steps will reproduce the problem?
1. unpack source onto x86_64, musl based system
2. apply patch to define __off64_t
3. try to build
What is the expected output? What do you see instead?
Build fails instead of normal completion:
src/malloc_hook_mmap_linux.h: In function 'void* mmap(void*, size_t, int, int, int,
off_t)':
src/malloc_hook_mmap_linux.h:169:18: error: redefinition of 'void* mmap(void*, size_t,
int, int, int, off_t)'
extern "C" void* mmap(void *start, size_t length, int prot, int flags,
^
In file included from src/malloc_hook.cc:41:0:
src/malloc_hook_mmap_linux.h:155:18: error: 'void* mmap(void*, size_t, int, int, int,
__off64_t)' previously defined here
extern "C" void* mmap64(void *start, size_t length, int prot, int flags,
^
Makefile:4515: recipe for target 'src/libtcmalloc_minimal_internal_la-malloc_hook.lo'
failed
make: *** [src/libtcmalloc_minimal_internal_la-malloc_hook.lo] Error 1
What version of the product are you using? On what operating system?
Alpine linux 3.0
gcc 4.8.2
musl 1.1.4
Please provide any additional information below.
1. _off64_t is not defined, so I patched base/linux_syscall_support.h to define it
into off64_t
2. musl sys/mman.h is quite different from glibc, so I guess the problem of redefinition
comes from that fact.
I would appreciate any advice on how it could be fixed. Full configure & make logs
attached
```
Reported by `filipp.andronov` on 2015-05-15 17:59:04
<hr>
- _Attachment: [build.log](https://storage.googleapis.com/google-code-attachments/gperftools/issue-690/comment-0/build.log)_
- _Attachment: [10-define-off64-t.patch](https://storage.googleapis.com/google-code-attachments/gperftools/issue-690/comment-0/10-define-off64-t.patch)_
|
1.0
|
Failed to build with lib musl - Originally reported on Google Code with ID 690
```
What steps will reproduce the problem?
1. unpack source onto x86_64, musl based system
2. apply patch to define __off64_t
3. try to build
What is the expected output? What do you see instead?
Build fails instead of normal completion:
src/malloc_hook_mmap_linux.h: In function 'void* mmap(void*, size_t, int, int, int,
off_t)':
src/malloc_hook_mmap_linux.h:169:18: error: redefinition of 'void* mmap(void*, size_t,
int, int, int, off_t)'
extern "C" void* mmap(void *start, size_t length, int prot, int flags,
^
In file included from src/malloc_hook.cc:41:0:
src/malloc_hook_mmap_linux.h:155:18: error: 'void* mmap(void*, size_t, int, int, int,
__off64_t)' previously defined here
extern "C" void* mmap64(void *start, size_t length, int prot, int flags,
^
Makefile:4515: recipe for target 'src/libtcmalloc_minimal_internal_la-malloc_hook.lo'
failed
make: *** [src/libtcmalloc_minimal_internal_la-malloc_hook.lo] Error 1
What version of the product are you using? On what operating system?
Alpine linux 3.0
gcc 4.8.2
musl 1.1.4
Please provide any additional information below.
1. _off64_t is not defined, so I patched base/linux_syscall_support.h to define it
into off64_t
2. musl sys/mman.h is quite different from glibc, so I guess the problem of redefinition
comes from that fact.
I would appreciate any advice on how it could be fixed. Full configure & make logs
attached
```
Reported by `filipp.andronov` on 2015-05-15 17:59:04
<hr>
- _Attachment: [build.log](https://storage.googleapis.com/google-code-attachments/gperftools/issue-690/comment-0/build.log)_
- _Attachment: [10-define-off64-t.patch](https://storage.googleapis.com/google-code-attachments/gperftools/issue-690/comment-0/10-define-off64-t.patch)_
|
defect
|
failed to build with lib musl originally reported on google code with id what steps will reproduce the problem unpack source onto musl based system apply patch to define t try to build what is the expected output what do you see instead build fails instead of normal completion src malloc hook mmap linux h in function void mmap void size t int int int off t src malloc hook mmap linux h error redefinition of void mmap void size t int int int off t extern c void mmap void start size t length int prot int flags in file included from src malloc hook cc src malloc hook mmap linux h error void mmap void size t int int int t previously defined here extern c void void start size t length int prot int flags makefile recipe for target src libtcmalloc minimal internal la malloc hook lo failed make error what version of the product are you using on what operating system alpine linux gcc musl please provide any additional information below t is not defined so i patched base linux syscall support h to define it into t musl sys mman h is quite different from glibc so i guess the problem of redefinition comes from that fact i would appreciate any advice on how it could be fixed full configure make logs attached reported by filipp andronov on attachment attachment
| 1
|
35,603
| 7,787,729,923
|
IssuesEvent
|
2018-06-07 00:05:00
|
jccastillo0007/eFacturaT
|
https://api.github.com/repos/jccastillo0007/eFacturaT
|
opened
|
OPTIBELT - THE VAT BASE REPORTED IN THE PDF IS INCORRECT
|
bug defect
|
The PDF generated on the platform (I am not sure whether the error comes from the connector) is incorrect.
Instead of sending the VAT base that corresponds, it sends the base multiplied by the quantity.
I don't know how the XML comes through, but I assume it is fine; otherwise an error would already have been flagged.
I will send you an email with a concrete example.
|
1.0
|
OPTIBELT - THE VAT BASE REPORTED IN THE PDF IS INCORRECT - The PDF generated on the platform (I am not sure whether the error comes from the connector) is incorrect.
Instead of sending the VAT base that corresponds, it sends the base multiplied by the quantity.
I don't know how the XML comes through, but I assume it is fine; otherwise an error would already have been flagged.
I will send you an email with a concrete example.
|
defect
|
optibelt the vat base reported in the pdf is incorrect the pdf generated on the platform i am not sure whether the error comes from the connector is incorrect instead of sending the vat base that corresponds it sends the base multiplied by the quantity i do not know how the xml comes through but i assume it is fine otherwise an error would already have been flagged i will send you an email with a concrete example
| 1
|
107,619
| 16,761,611,994
|
IssuesEvent
|
2021-06-13 22:31:19
|
gms-ws-demo/nibrs
|
https://api.github.com/repos/gms-ws-demo/nibrs
|
closed
|
CVE-2020-36179 (High) detected in multiple libraries - autoclosed
|
security vulnerability
|
## CVE-2020-36179 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.8.jar</b>, <b>jackson-databind-2.9.5.jar</b>, <b>jackson-databind-2.9.6.jar</b>, <b>jackson-databind-2.8.0.jar</b>, <b>jackson-databind-2.8.10.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-summary-report-common/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.5.RELEASE.jar (Root Library)
- spring-boot-starter-json-2.1.5.RELEASE.jar
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.5.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-flatfile/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar</p>
<p>
Dependency Hierarchy:
- tika-parsers-1.18.jar (Root Library)
- :x: **jackson-databind-2.9.5.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-staging-data/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,nibrs/web/nibrs-web/target/nibrs-web/WEB-INF/lib/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.6.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.0.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-common/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.0/jackson-databind-2.8.0.jar</p>
<p>
Dependency Hierarchy:
- tika-parsers-1.18.jar (Root Library)
- :x: **jackson-databind-2.8.0.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.10.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.10/jackson-databind-2.8.10.jar,nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/jackson-databind-2.8.10.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.8.10.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/gms-ws-demo/nibrs/commit/9fb1c19bd26c2113d1961640de126a33eacdc946">9fb1c19bd26c2113d1961640de126a33eacdc946</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.commons.dbcp.cpdsadapter.DriverAdapterCPDS.
<p>Publish Date: 2021-01-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36179>CVE-2020-36179</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/3004">https://github.com/FasterXML/jackson-databind/issues/3004</a></p>
<p>Release Date: 2021-01-07</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","packageFilePaths":["/tools/nibrs-summary-report-common/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.1.5.RELEASE;org.springframework.boot:spring-boot-starter-json:2.1.5.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.5","packageFilePaths":["/tools/nibrs-flatfile/pom.xml","/tools/nibrs-validate-common/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.tika:tika-parsers:1.18;com.fasterxml.jackson.core:jackson-databind:2.9.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","packageFilePaths":["/tools/nibrs-staging-data/pom.xml","/tools/nibrs-summary-report/pom.xml","/tools/nibrs-route/pom.xml","/tools/nibrs-staging-data-common/pom.xml","/tools/nibrs-xmlfile/pom.xml","/tools/nibrs-validation/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.0","packageFilePaths":["/tools/nibrs-common/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.tika:tika-parsers:1.18;com.fasterxml.jackson.core:jackson-databind:2.8.0","isMinimumFixVersionAvailable":true,"minimumFixVe
rsion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.10","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.8.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36179","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.commons.dbcp.cpdsadapter.DriverAdapterCPDS.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36179","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-36179 (High) detected in multiple libraries - autoclosed - ## CVE-2020-36179 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>jackson-databind-2.9.8.jar</b>, <b>jackson-databind-2.9.5.jar</b>, <b>jackson-databind-2.9.6.jar</b>, <b>jackson-databind-2.8.0.jar</b>, <b>jackson-databind-2.8.10.jar</b></p></summary>
<p>
<details><summary><b>jackson-databind-2.9.8.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-summary-report-common/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.8/jackson-databind-2.9.8.jar</p>
<p>
Dependency Hierarchy:
- spring-boot-starter-web-2.1.5.RELEASE.jar (Root Library)
- spring-boot-starter-json-2.1.5.RELEASE.jar
- :x: **jackson-databind-2.9.8.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.5.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-flatfile/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.5/jackson-databind-2.9.5.jar</p>
<p>
Dependency Hierarchy:
- tika-parsers-1.18.jar (Root Library)
- :x: **jackson-databind-2.9.5.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.9.6.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-staging-data/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,nibrs/web/nibrs-web/target/nibrs-web/WEB-INF/lib/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar,/home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.9.6/jackson-databind-2.9.6.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.9.6.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.0.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-common/pom.xml</p>
<p>Path to vulnerable library: /home/wss-scanner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.0/jackson-databind-2.8.0.jar</p>
<p>
Dependency Hierarchy:
- tika-parsers-1.18.jar (Root Library)
- :x: **jackson-databind-2.8.0.jar** (Vulnerable Library)
</details>
<details><summary><b>jackson-databind-2.8.10.jar</b></p></summary>
<p>General data-binding functionality for Jackson: works on core streaming API</p>
<p>Library home page: <a href="http://github.com/FasterXML/jackson">http://github.com/FasterXML/jackson</a></p>
<p>Path to dependency file: nibrs/tools/nibrs-fbi-service/pom.xml</p>
<p>Path to vulnerable library: canner/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.8.10/jackson-databind-2.8.10.jar,nibrs/tools/nibrs-fbi-service/target/nibrs-fbi-service-1.0.0/WEB-INF/lib/jackson-databind-2.8.10.jar</p>
<p>
Dependency Hierarchy:
- :x: **jackson-databind-2.8.10.jar** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/gms-ws-demo/nibrs/commit/9fb1c19bd26c2113d1961640de126a33eacdc946">9fb1c19bd26c2113d1961640de126a33eacdc946</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.commons.dbcp.cpdsadapter.DriverAdapterCPDS.
<p>Publish Date: 2021-01-07
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36179>CVE-2020-36179</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.1</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: High
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: High
- Integrity Impact: High
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/FasterXML/jackson-databind/issues/3004">https://github.com/FasterXML/jackson-databind/issues/3004</a></p>
<p>Release Date: 2021-01-07</p>
<p>Fix Resolution: com.fasterxml.jackson.core:jackson-databind:2.9.10.8</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.8","packageFilePaths":["/tools/nibrs-summary-report-common/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.springframework.boot:spring-boot-starter-web:2.1.5.RELEASE;org.springframework.boot:spring-boot-starter-json:2.1.5.RELEASE;com.fasterxml.jackson.core:jackson-databind:2.9.8","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.5","packageFilePaths":["/tools/nibrs-flatfile/pom.xml","/tools/nibrs-validate-common/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.tika:tika-parsers:1.18;com.fasterxml.jackson.core:jackson-databind:2.9.5","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.9.6","packageFilePaths":["/tools/nibrs-staging-data/pom.xml","/tools/nibrs-summary-report/pom.xml","/tools/nibrs-route/pom.xml","/tools/nibrs-staging-data-common/pom.xml","/tools/nibrs-xmlfile/pom.xml","/tools/nibrs-validation/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.9.6","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.0","packageFilePaths":["/tools/nibrs-common/pom.xml"],"isTransitiveDependency":true,"dependencyTree":"org.apache.tika:tika-parsers:1.18;com.fasterxml.jackson.core:jackson-databind:2.8.0","isMinimumFixVersionAvailable":true,"minimumFixVe
rsion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"},{"packageType":"Java","groupId":"com.fasterxml.jackson.core","packageName":"jackson-databind","packageVersion":"2.8.10","packageFilePaths":["/tools/nibrs-fbi-service/pom.xml"],"isTransitiveDependency":false,"dependencyTree":"com.fasterxml.jackson.core:jackson-databind:2.8.10","isMinimumFixVersionAvailable":true,"minimumFixVersion":"com.fasterxml.jackson.core:jackson-databind:2.9.10.8"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-36179","vulnerabilityDetails":"FasterXML jackson-databind 2.x before 2.9.10.8 mishandles the interaction between serialization gadgets and typing, related to oadd.org.apache.commons.dbcp.cpdsadapter.DriverAdapterCPDS.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-36179","cvss3Severity":"high","cvss3Score":"8.1","cvss3Metrics":{"A":"High","AC":"High","PR":"None","S":"Unchanged","C":"High","UI":"None","AV":"Network","I":"High"},"extraData":{}}</REMEDIATE> -->
|
non_defect
|
cve high detected in multiple libraries autoclosed cve high severity vulnerability vulnerable libraries jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file nibrs tools nibrs summary report common pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy spring boot starter web release jar root library spring boot starter json release jar x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file nibrs tools nibrs flatfile pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy tika parsers jar root library x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file nibrs tools nibrs staging data pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar nibrs web nibrs web target nibrs web web inf lib jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar canner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar home wss 
scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file nibrs tools nibrs common pom xml path to vulnerable library home wss scanner repository com fasterxml jackson core jackson databind jackson databind jar dependency hierarchy tika parsers jar root library x jackson databind jar vulnerable library jackson databind jar general data binding functionality for jackson works on core streaming api library home page a href path to dependency file nibrs tools nibrs fbi service pom xml path to vulnerable library canner repository com fasterxml jackson core jackson databind jackson databind jar nibrs tools nibrs fbi service target nibrs fbi service web inf lib jackson databind jar dependency hierarchy x jackson databind jar vulnerable library found in head commit a href found in base branch master vulnerability details fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to oadd org apache commons dbcp cpdsadapter driveradaptercpds publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity high privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com fasterxml jackson core jackson databind isopenpronvulnerability false ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree org springframework boot spring boot starter web release org springframework boot spring boot starter json release com fasterxml jackson core jackson databind isminimumfixversionavailable true 
minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org apache tika tika parsers com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency true dependencytree org apache tika tika parsers com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind packagetype java groupid com fasterxml jackson core packagename jackson databind packageversion packagefilepaths istransitivedependency false dependencytree com fasterxml jackson core jackson databind isminimumfixversionavailable true minimumfixversion com fasterxml jackson core jackson databind basebranches vulnerabilityidentifier cve vulnerabilitydetails fasterxml jackson databind x before mishandles the interaction between serialization gadgets and typing related to oadd org apache commons dbcp cpdsadapter driveradaptercpds vulnerabilityurl
| 0
|
459,384
| 13,192,288,525
|
IssuesEvent
|
2020-08-13 13:33:39
|
yalla-coop/presspad
|
https://api.github.com/repos/yalla-coop/presspad
|
closed
|
Create Intern Settings (FRONT END)
|
5-points Frontend backlog priority-3
|
- [ ] Set up Settings route
- [ ] Settings page in line with wireframes: https://www.figma.com/file/CMkMSsbTLjpitcetLUunz9/PressPad?node-id=2878%3A52971
**Across all**
- [ ] Any changes show changes saved when user clicks save button
- [ ] Client side validation to ensure user doesn't delete any fields that are required
- [ ] If any section isn't complete (this might happen is a user skipped a section in sign up process) show the incomplete prompt message at top (same as example you see in host): https://www.figma.com/file/CMkMSsbTLjpitcetLUunz9/PressPad?node-id=3250%3A48406
**My Account**
- [ ] Clicking change my password shows two inputs for user to enter old and new password
- [ ] Client side validation to ensure old and new passwords aren't the same
- [ ] Password validation - At least 8 characters, 1 upper case, 1 lower case and 1 number
**About Me**
This is the section in the sign up profile
- [ ] Pre-fill inputs with answers from when the user signed up
**My Listing**
This is the section in the sign up profile
- [ ] Pre-fill inputs with answers from when the user signed up
**Verifications**
This is the section in the sign up profile
- [ ] Pre-fill inputs with answers from when the user signed up
**Changes to dbs details**
- [ ] When hitting save user taken to message page to let them know their profile is under review again (you can see this in the host pages as it's the same: https://www.figma.com/file/CMkMSsbTLjpitcetLUunz9/PressPad?node-id=3251%3A0)
|
1.0
|
Create Intern Settings (FRONT END) - - [ ] Set up Settings route
- [ ] Settings page in line with wireframes: https://www.figma.com/file/CMkMSsbTLjpitcetLUunz9/PressPad?node-id=2878%3A52971
**Across all**
- [ ] Any changes show changes saved when user clicks save button
- [ ] Client side validation to ensure user doesn't delete any fields that are required
- [ ] If any section isn't complete (this might happen is a user skipped a section in sign up process) show the incomplete prompt message at top (same as example you see in host): https://www.figma.com/file/CMkMSsbTLjpitcetLUunz9/PressPad?node-id=3250%3A48406
**My Account**
- [ ] Clicking change my password shows two inputs for user to enter old and new password
- [ ] Client side validation to ensure old and new passwords aren't the same
- [ ] Password validation - At least 8 characters, 1 upper case, 1 lower case and 1 number
**About Me**
This is the section in the sign up profile
- [ ] Pre-fill inputs with answers from when the user signed up
**My Listing**
This is the section in the sign up profile
- [ ] Pre-fill inputs with answers from when the user signed up
**Verifications**
This is the section in the sign up profile
- [ ] Pre-fill inputs with answers from when the user signed up
**Changes to dbs details**
- [ ] When hitting save user taken to message page to let them know their profile is under review again (you can see this in the host pages as it's the same: https://www.figma.com/file/CMkMSsbTLjpitcetLUunz9/PressPad?node-id=3251%3A0)
|
non_defect
|
create intern settings front end set up settings route settings page in line with wireframes across all any changes show changes saved when user clicks save button client side validation to ensure user doesn t delete any fields that are required if any section isn t complete this might happen is a user skipped a section in sign up process show the incomplete prompt message at top same as example you see in host my account clicking change my password shows two inputs for user to enter old and new password client side validation to ensure old and new passwords aren t the same password validation at least characters upper case lower case and number about me this is the section in the sign up profile pre fill inputs with answers from when the user signed up my listing this is the section in the sign up profile pre fill inputs with answers from when the user signed up verifications this is the section in the sign up profile pre fill inputs with answers from when the user signed up changes to dbs details when hitting save user taken to message page to let them know their profile is under review again you can see this in the host pages as it s the same
| 0
|
184,594
| 21,784,915,073
|
IssuesEvent
|
2022-05-14 01:47:26
|
n-devs/freebitco.in-mobile
|
https://api.github.com/repos/n-devs/freebitco.in-mobile
|
closed
|
CVE-2019-5428 (Medium) detected in jquery-2.2.4.tgz - autoclosed
|
security vulnerability
|
## CVE-2019-5428 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-2.2.4.tgz</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://registry.npmjs.org/jquery/-/jquery-2.2.4.tgz">https://registry.npmjs.org/jquery/-/jquery-2.2.4.tgz</a></p>
<p>Path to dependency file: /freebitco.in-mobile/package.json</p>
<p>Path to vulnerable library: /freebitco.in-mobile/node_modules/jquery/package.json</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.2.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/n-psk/freebitco.in-mobile/commits/72a833fc650d6f78ba14880fedbd1662570968f8">72a833fc650d6f78ba14880fedbd1662570968f8</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A prototype pollution vulnerability exists in jQuery versions < 3.4.0 that allows an attacker to inject properties on Object.prototype.
<p>Publish Date: 2019-04-22
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-5428>CVE-2019-5428</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2019/04/10/jquery-3-4-0-released/">https://blog.jquery.com/2019/04/10/jquery-3-4-0-released/</a></p>
<p>Release Date: 2019-04-22</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
True
|
CVE-2019-5428 (Medium) detected in jquery-2.2.4.tgz - autoclosed - ## CVE-2019-5428 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-2.2.4.tgz</b></p></summary>
<p>JavaScript library for DOM operations</p>
<p>Library home page: <a href="https://registry.npmjs.org/jquery/-/jquery-2.2.4.tgz">https://registry.npmjs.org/jquery/-/jquery-2.2.4.tgz</a></p>
<p>Path to dependency file: /freebitco.in-mobile/package.json</p>
<p>Path to vulnerable library: /freebitco.in-mobile/node_modules/jquery/package.json</p>
<p>
Dependency Hierarchy:
- :x: **jquery-2.2.4.tgz** (Vulnerable Library)
<p>Found in HEAD commit: <a href="https://api.github.com/repos/n-psk/freebitco.in-mobile/commits/72a833fc650d6f78ba14880fedbd1662570968f8">72a833fc650d6f78ba14880fedbd1662570968f8</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
A prototype pollution vulnerability exists in jQuery versions < 3.4.0 that allows an attacker to inject properties on Object.prototype.
<p>Publish Date: 2019-04-22
<p>URL: <a href=https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-5428>CVE-2019-5428</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 2 Score Details (<b>5.6</b>)</summary>
<p>
Base Score Metrics not available</p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://blog.jquery.com/2019/04/10/jquery-3-4-0-released/">https://blog.jquery.com/2019/04/10/jquery-3-4-0-released/</a></p>
<p>Release Date: 2019-04-22</p>
<p>Fix Resolution: 3.4.0</p>
</p>
</details>
<p></p>
***
Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
|
non_defect
|
cve medium detected in jquery tgz autoclosed cve medium severity vulnerability vulnerable library jquery tgz javascript library for dom operations library home page a href path to dependency file freebitco in mobile package json path to vulnerable library freebitco in mobile node modules jquery package json dependency hierarchy x jquery tgz vulnerable library found in head commit a href vulnerability details a prototype pollution vulnerability exists in jquery versions that allows an attacker to inject properties on object prototype publish date url a href cvss score details base score metrics not available suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
| 0
|
59,000
| 16,997,060,422
|
IssuesEvent
|
2021-07-01 07:56:27
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
closed
|
scipy.io.loadmat failure in 1.7.0
|
defect scipy.io
|
My issue is about a failure to load MATLAB files with scipy 1.7.0. I use `scipy.io.loadmat` (with `squeeze_me=True`) to load MATLAB files & it works just fine in 1.6.3.
I traced the issue down to a change in `scipy.io.matlab.mio5.MatFile5Reader`'s `get_variables` method. I noticed when diffing the 1.7.0 branch with the 1.6.3 branch that [line 313](https://github.com/scipy/scipy/blob/f5e613cd5bd505f68c5a23be06d7622327ad93f8/scipy/io/matlab/mio5.py#L313) was changed from `asstr(hdr.name)` (`from numpy.compat import asstr`) to `hdr.name.decode('latin1')`. Looks like this change was introduced in the following [commit](https://github.com/scipy/scipy/commit/cee52900f78d157ed216db6d6f25366cd557441d).
It appears as though `asstr` handles the case when `hdr.name is None`, which occurs with some of the MATLAB files I am working with.
Reverting to using `asstr` resolves the issue & so does changing the line to `hdr.name.decode('latin1') if hdr.name is not None else 'None'` (`asstr(None) == 'None')`). Not sure which would be more appropriate, or if `hdr.name` even should ever be `None`.
#### Error message:
```
Error
Traceback (most recent call last):
(redacted a few unrelated traceback lines)
File "/usr/lib/python3.8/site-packages/scipy/io/matlab/mio.py", line 226, in loadmat
matfile_dict = MR.get_variables(variable_names)
File "/usr/lib/python3.8/site-packages/scipy/io/matlab/mio5.py", line 313, in get_variables
name = hdr.name.decode('latin1')
AttributeError: 'NoneType' object has no attribute 'decode'
```
#### Scipy/Numpy/Python version information:
1.7.0 1.20.3 sys.version_info(major=3, minor=8, micro=5, releaselevel='final', serial=0)
|
1.0
|
scipy.io.loadmat failure in 1.7.0 - My issue is about a failure to load MATLAB files with scipy 1.7.0. I use `scipy.io.loadmat` (with `squeeze_me=True`) to load MATLAB files & it works just fine in 1.6.3.
I traced the issue down to a change in `scipy.io.matlab.mio5.MatFile5Reader`'s `get_variables` method. I noticed when diffing the 1.7.0 branch with the 1.6.3 branch that [line 313](https://github.com/scipy/scipy/blob/f5e613cd5bd505f68c5a23be06d7622327ad93f8/scipy/io/matlab/mio5.py#L313) was changed from `asstr(hdr.name)` (`from numpy.compat import asstr`) to `hdr.name.decode('latin1')`. Looks like this change was introduced in the following [commit](https://github.com/scipy/scipy/commit/cee52900f78d157ed216db6d6f25366cd557441d).
It appears as though `asstr` handles the case when `hdr.name is None`, which occurs with some of the MATLAB files I am working with.
Reverting to using `asstr` resolves the issue & so does changing the line to `hdr.name.decode('latin1') if hdr.name is not None else 'None'` (`asstr(None) == 'None')`). Not sure which would be more appropriate, or if `hdr.name` even should ever be `None`.
#### Error message:
```
Error
Traceback (most recent call last):
(redacted a few unrelated traceback lines)
File "/usr/lib/python3.8/site-packages/scipy/io/matlab/mio.py", line 226, in loadmat
matfile_dict = MR.get_variables(variable_names)
File "/usr/lib/python3.8/site-packages/scipy/io/matlab/mio5.py", line 313, in get_variables
name = hdr.name.decode('latin1')
AttributeError: 'NoneType' object has no attribute 'decode'
```
#### Scipy/Numpy/Python version information:
1.7.0 1.20.3 sys.version_info(major=3, minor=8, micro=5, releaselevel='final', serial=0)
|
defect
|
scipy io loadmat failure in my issue is about a failure to load matlab files with scipy i use scipy io loadmat with squeeze me true to load matlab files it works just fine in i traced the issue down to a change in scipy io matlab s get variables method i noticed when diffing the branch with the branch that was changed from asstr hdr name from numpy compat import asstr to hdr name decode looks like this change was introduced in the following it appears as though asstr handles the case when hdr name is none which occurs with some of the matlab files i am working with reverting to using asstr resolves the issue so does changing the line to hdr name decode if hdr name is not none else none asstr none none not sure which would be more appropriate or if hdr name even should ever be none error message error traceback most recent call last redacted a few unrelated traceback lines file usr lib site packages scipy io matlab mio py line in loadmat matfile dict mr get variables variable names file usr lib site packages scipy io matlab py line in get variables name hdr name decode attributeerror nonetype object has no attribute decode scipy numpy python version information sys version info major minor micro releaselevel final serial
| 1
|
52,152
| 3,021,888,282
|
IssuesEvent
|
2015-07-31 17:10:11
|
joefutrelle/domdb
|
https://api.github.com/repos/joefutrelle/domdb
|
closed
|
wiki installation instructions need updating
|
priority:high
|
Right now the installation instructions still describe the vagrant file.
|
1.0
|
wiki installation instructions need updating - Right now the installation instructions still describe the vagrant file.
|
non_defect
|
wiki installation instructions need updating right now the installation instructions still describe the vagrant file
| 0
|
33,268
| 7,065,612,435
|
IssuesEvent
|
2018-01-06 22:17:59
|
bridgedotnet/Bridge
|
https://api.github.com/repos/bridgedotnet/Bridge
|
closed
|
DateTime sorting fails after modifying (via AddMinutes(), etc.)
|
defect in progress
|
IOrderedEnumerable can't compare DateTimes after they've been modified with AddMinutes(), etc. It attempts to compare ticks that have been removed from DateTime instances (as a result of being modified by their adjustment methods).
### Steps To Reproduce
https://deck.net/2a202202c30951cc29fb109379f42ee0
```csharp
public class App
{
public static void Main()
{
List<DateTime> times = new List<DateTime>();
DateTime dt1 = DateTime.UtcNow;
times.Add(dt1);
DateTime dt2 = dt1.AddMinutes(-10);
times.Add(dt2);
times = times.OrderBy(dt => dt).ToList();
Console.WriteLine(times[0]);
Console.WriteLine(times[1]);
}
}
```
### Expected Result
```js
12/12/2017 00:05:45
12/12/2017 00:15:45
```
### Actual Result
```js
System.Exception: TypeError: Cannot read property 'low' of null
at o (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:119820)
at i.n.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:122194)
at System.Int64.compareTo (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:128325)
at Object.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:17168)
at Object.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:16772)
at ctor.<anonymous> (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:235941)
at v.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:361451)
at https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:360913
at Array.sort (native)
at https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:360885
```
|
1.0
|
DateTime sorting fails after modifying (via AddMinutes(), etc.) - IOrderedEnumerable can't compare DateTimes after they've been modified with AddMinutes(), etc. It attempts to compare ticks that have been removed from DateTime instances (as a result of being modified by their adjustment methods).
### Steps To Reproduce
https://deck.net/2a202202c30951cc29fb109379f42ee0
```csharp
public class App
{
public static void Main()
{
List<DateTime> times = new List<DateTime>();
DateTime dt1 = DateTime.UtcNow;
times.Add(dt1);
DateTime dt2 = dt1.AddMinutes(-10);
times.Add(dt2);
times = times.OrderBy(dt => dt).ToList();
Console.WriteLine(times[0]);
Console.WriteLine(times[1]);
}
}
```
### Expected Result
```js
12/12/2017 00:05:45
12/12/2017 00:15:45
```
### Actual Result
```js
System.Exception: TypeError: Cannot read property 'low' of null
at o (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:119820)
at i.n.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:122194)
at System.Int64.compareTo (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:128325)
at Object.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:17168)
at Object.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:16772)
at ctor.<anonymous> (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:235941)
at v.compare (https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:361451)
at https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:360913
at Array.sort (native)
at https://deck.net/resources/js/bridge/bridge.min.js?16.5.0:7:360885
```
|
defect
|
datetime sorting fails after modifying via addminutes etc iorderedenumerable can t compare datetimes after they ve been modified with addminutes etc it attempts to compare ticks that have been removed from datetime instances as a result of being modified by their adjustment methods steps to reproduce csharp public class app public static void main list times new list datetime datetime utcnow times add datetime addminutes times add times times orderby dt dt tolist console writeline times console writeline times expected result js actual result js system exception typeerror cannot read property low of null at o at i n compare at system compareto at object compare at object compare at ctor at v compare at at array sort native at
| 1
|
96,516
| 27,876,312,931
|
IssuesEvent
|
2023-03-21 16:13:52
|
xamarin/xamarin-android
|
https://api.github.com/repos/xamarin/xamarin-android
|
opened
|
Caused by: com.android.tools.r8.internal.f: Type kotlin.collections.ArraysUtilJVM is defined multiple times
|
Area: App+Library Build needs-triage
|
### Android application type
.NET Android (net7.0-android, etc.)
### Affected platform version
VSfM 2022 17.6 Preview 983
### Description
I tried out my binding library for Mapbox on .NET7.
Similar issues:
- https://github.com/xamarin/xamarin-android/issues/6219
- https://github.com/xamarin/GooglePlayServicesComponents/issues/648
```
Exception in thread "main" java.lang.RuntimeException: com.android.tools.r8.CompilationFailedException: Compilation failed to complete, origin: /Users/tuyen/.nuget/packages/xamarin.kotlin.stdlib/1.8.10/buildTransitive/net6.0-android31.0/../../jar/org.jetbrains.kotlin.kotlin-stdlib-1.8.10.jar:kotlin/collections/ArraysUtilJVM.class
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:98)
at com.android.tools.r8.D8.main(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:4)
Caused by: com.android.tools.r8.CompilationFailedException: Compilation failed to complete, origin: /Users/tuyen/.nuget/packages/xamarin.kotlin.stdlib/1.8.10/buildTransitive/net6.0-android31.0/../../jar/org.jetbrains.kotlin.kotlin-stdlib-1.8.10.jar:kotlin/collections/ArraysUtilJVM.class
at Version.fakeStackEntry(Version_3.3.28.java:0)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:75)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:28)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:27)
at com.android.tools.r8.internal.Bj.b(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:2)
at com.android.tools.r8.D8.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:22)
at com.android.tools.r8.D8.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:17)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:85)
... 1 more
Caused by: com.android.tools.r8.internal.f: Type kotlin.collections.ArraysUtilJVM is defined multiple times: /Users/tuyen/.nuget/packages/xamarin.kotlin.stdlib/1.8.10/buildTransitive/net6.0-android31.0/../../jar/org.jetbrains.kotlin.kotlin-stdlib-1.8.10.jar:kotlin/collections/ArraysUtilJVM.class, obj/Debug/net6.0-android33.0/lp/111/jl/libs/E85EFB03D25E2E39.jar:kotlin/collections/ArraysUtilJVM.class
at com.android.tools.r8.internal.DT.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:14)
at com.android.tools.r8.internal.DT.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:22)
at com.android.tools.r8.internal.CN.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:33)
at com.android.tools.r8.internal.CN.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:10)
at java.base/java.util.concurrent.ConcurrentHashMap.merge(ConcurrentHashMap.java:2048)
at com.android.tools.r8.internal.CN.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:6)
at com.android.tools.r8.graph.B2$a.e(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:4)
at com.android.tools.r8.dex.b.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:105)
at com.android.tools.r8.dex.b.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:28)
at com.android.tools.r8.D8.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:25)
at com.android.tools.r8.D8.d(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:606)
at com.android.tools.r8.D8.c(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:1)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:24)
```
### Steps to Reproduce
- Create binding library for Mapbox
- Try it out in .NET7 app
### Did you find any workaround?
_No response_
### Relevant log output
_No response_
|
1.0
|
Caused by: com.android.tools.r8.internal.f: Type kotlin.collections.ArraysUtilJVM is defined multiple times - ### Android application type
.NET Android (net7.0-android, etc.)
### Affected platform version
VSfM 2022 17.6 Preview 983
### Description
I tried out my binding library for Mapbox on .NET7.
Similar issues:
- https://github.com/xamarin/xamarin-android/issues/6219
- https://github.com/xamarin/GooglePlayServicesComponents/issues/648
```
Exception in thread "main" java.lang.RuntimeException: com.android.tools.r8.CompilationFailedException: Compilation failed to complete, origin: /Users/tuyen/.nuget/packages/xamarin.kotlin.stdlib/1.8.10/buildTransitive/net6.0-android31.0/../../jar/org.jetbrains.kotlin.kotlin-stdlib-1.8.10.jar:kotlin/collections/ArraysUtilJVM.class
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:98)
at com.android.tools.r8.D8.main(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:4)
Caused by: com.android.tools.r8.CompilationFailedException: Compilation failed to complete, origin: /Users/tuyen/.nuget/packages/xamarin.kotlin.stdlib/1.8.10/buildTransitive/net6.0-android31.0/../../jar/org.jetbrains.kotlin.kotlin-stdlib-1.8.10.jar:kotlin/collections/ArraysUtilJVM.class
at Version.fakeStackEntry(Version_3.3.28.java:0)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:75)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:28)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:27)
at com.android.tools.r8.internal.Bj.b(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:2)
at com.android.tools.r8.D8.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:22)
at com.android.tools.r8.D8.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:17)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:85)
... 1 more
Caused by: com.android.tools.r8.internal.f: Type kotlin.collections.ArraysUtilJVM is defined multiple times: /Users/tuyen/.nuget/packages/xamarin.kotlin.stdlib/1.8.10/buildTransitive/net6.0-android31.0/../../jar/org.jetbrains.kotlin.kotlin-stdlib-1.8.10.jar:kotlin/collections/ArraysUtilJVM.class, obj/Debug/net6.0-android33.0/lp/111/jl/libs/E85EFB03D25E2E39.jar:kotlin/collections/ArraysUtilJVM.class
at com.android.tools.r8.internal.DT.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:14)
at com.android.tools.r8.internal.DT.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:22)
at com.android.tools.r8.internal.CN.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:33)
at com.android.tools.r8.internal.CN.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:10)
at java.base/java.util.concurrent.ConcurrentHashMap.merge(ConcurrentHashMap.java:2048)
at com.android.tools.r8.internal.CN.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:6)
at com.android.tools.r8.graph.B2$a.e(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:4)
at com.android.tools.r8.dex.b.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:105)
at com.android.tools.r8.dex.b.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:28)
at com.android.tools.r8.D8.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:25)
at com.android.tools.r8.D8.d(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:606)
at com.android.tools.r8.D8.c(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:1)
at com.android.tools.r8.internal.Bj.a(R8_3.3.28_2aaf796388b4e9f6bed752d926eca110512a53a3f09a8d755196089c1cfdf799:24)
```
### Steps to Reproduce
- Create binding library for Mapbox
- Try it out in .NET7 app
### Did you find any workaround?
_No response_
### Relevant log output
_No response_
|
non_defect
|
caused by com android tools internal f type kotlin collections arraysutiljvm is defined multiple times android application type net android android etc affected platform version vsfm preview description i tried out my binding library for mapbox on similar issues exception in thread main java lang runtimeexception com android tools compilationfailedexception compilation failed to complete origin users tuyen nuget packages xamarin kotlin stdlib buildtransitive jar org jetbrains kotlin kotlin stdlib jar kotlin collections arraysutiljvm class at com android tools internal bj a at com android tools main caused by com android tools compilationfailedexception compilation failed to complete origin users tuyen nuget packages xamarin kotlin stdlib buildtransitive jar org jetbrains kotlin kotlin stdlib jar kotlin collections arraysutiljvm class at version fakestackentry version java at com android tools internal bj a at com android tools internal bj a at com android tools internal bj a at com android tools internal bj b at com android tools a at com android tools a at com android tools internal bj a more caused by com android tools internal f type kotlin collections arraysutiljvm is defined multiple times users tuyen nuget packages xamarin kotlin stdlib buildtransitive jar org jetbrains kotlin kotlin stdlib jar kotlin collections arraysutiljvm class obj debug lp jl libs jar kotlin collections arraysutiljvm class at com android tools internal dt a at com android tools internal dt a at com android tools internal cn a at com android tools internal cn a at java base java util concurrent concurrenthashmap merge concurrenthashmap java at com android tools internal cn a at com android tools graph a e at com android tools dex b a at com android tools dex b a at com android tools a at com android tools d at com android tools c at com android tools internal bj a steps to reproduce create binding library for mapbox try it out in app did you find any workaround no response relevant log 
output no response
| 0
|
194,740
| 15,438,645,758
|
IssuesEvent
|
2021-03-07 21:07:48
|
Syncplay/syncplay
|
https://api.github.com/repos/Syncplay/syncplay
|
closed
|
Manpages for client and server
|
Linux documentation
|
Hi,
for the Debian package I wrote two simple manpages, one for the client and one for the server. Please feel free to adapt and include them. You can find them here:
- https://salsa.debian.org/fuddl/syncplay/-/blob/master/debian/syncplay-server.1
- https://salsa.debian.org/fuddl/syncplay/-/blob/master/debian/syncplay.1
Cheers,
Bruno
|
1.0
|
Manpages for client and server - Hi,
for the Debian package I wrote two simple manpages, one for the client and one for the server. Please feel free to adapt and include them. You can find them here:
- https://salsa.debian.org/fuddl/syncplay/-/blob/master/debian/syncplay-server.1
- https://salsa.debian.org/fuddl/syncplay/-/blob/master/debian/syncplay.1
Cheers,
Bruno
|
non_defect
|
manpages for client and server hi for the debian package i wrote two simple manpages one for the client and one for the server please feel free to adapt and include them you can find them here cheers bruno
| 0
|
68,741
| 21,875,377,625
|
IssuesEvent
|
2022-05-19 09:39:24
|
primefaces/primefaces
|
https://api.github.com/repos/primefaces/primefaces
|
closed
|
FileUpload: First multiple upload after fresh deploy fails on a random number of files when `validateContenType=true`
|
defect
|
### Describe the bug
After a fresh deployment, when multiple files are being uploaded in the very first upload, a small random number of them fail to reach the fileuplaod component listener method, and the server ouputs the following log:
> Could not determine content type of uploaded file _filename_here_, consider plugging in an adequate FileTypeDetector implementation
The log appears the exact same number of times as the number of files that doesn't reach the fileupload component listener method.
After that operation, all uploads work properly until the next restart (redeployment or Application Server restart).
This doesn't happen if just one file is uploaded.
This happens only if `validateContenType="true"`
This happens with both Tika and mime-types providers.
**Cause Theory**
* `PrimeApplicationContext` is a Singleton, and hence, its `FileTypeDetector fileTypeDetector` is shared between different threads.
* Fileupload component sends each file in a different request, making them to arrive concurrently
* The `FileTypeDetector` (anonymous inner class in `PrimeApplicationContext`) uses an instance of `ServiceLoader<FileTypeDetector>` to resolve the service providers.
* According to the ServiceLoader official documentation (https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/util/ServiceLoader.html): _Instances of this class are not safe for use by multiple concurrent threads._
This means that the first service resolutions happen simultaneously (one per sent file) on the same ServiceLoader (instantiated once in a singleton class), and this is probably what causes trouble at that very moment: the first resolution during the first multiple upload.
This would explain why it doesn't happen when uploading only one file the first time, and why after the first upload all other uploads work properly, because the Service has already been resolved.
### Reproducer
I added two files
- `primefaces-test` contains the test with the primefaces template from its repository. With this one, I don't know exactly why (but I presume it is related to some kind of performance difference in ServiceLoader between Jetty and Wildfly), I could only reproduce it once. It didn't happen in any other try (approximately 10 times I tried)
- `primefaces-bug` contains a minimal reproducible example without the template. I use it to deploy to a Wildfly 20.0.1.Final, and there it reproduces the error each and every time.
In any of them, to try to reproduce it, just:
1. Open homepage
2. Click upload files
3. Select a large number of files, 10 for instance.
4. Click Upload
5. The listener method will output the log for some of them, and for the rest a `Could not determine content type of uploaded file _filename_here_, consider plugging in an adequate FileTypeDetector implementation` message will be shown in the server output.
[primefaces-bug.zip](https://github.com/primefaces/primefaces/files/8721787/primefaces-bug.zip)
[primefaces-test.zip](https://github.com/primefaces/primefaces/files/8721788/primefaces-test.zip)
### Expected behavior
Any multiple fileupload should work properly, even the first uploads after a fresh deploy.
### PrimeFaces edition
Community
### PrimeFaces version
tested on 10.0.0 and 11.0.0
### Theme
default
### JSF implementation
Mojarra
### JSF version
2.3
### Browser(s)
Chrome Version 100.0.4896.127 (Official Build) (64-bit)
|
1.0
|
FileUpload: First multiple upload after fresh deploy fails on a random number of files when `validateContenType=true` - ### Describe the bug
After a fresh deployment, when multiple files are uploaded in the very first upload, a small random number of them fail to reach the fileupload component listener method, and the server outputs the following log:
> Could not determine content type of uploaded file _filename_here_, consider plugging in an adequate FileTypeDetector implementation
The log appears the exact same number of times as the number of files that doesn't reach the fileupload component listener method.
After that operation, all uploads work properly until the next restart (redeployment or Application Server restart).
This doesn't happen if just one file is uploaded.
This happens only if `validateContenType="true"`
This happens with both Tika and mime-types providers.
**Cause Theory**
* `PrimeApplicationContext` is a Singleton, and hence, its `FileTypeDetector fileTypeDetector` is shared between different threads.
* Fileupload component sends each file in a different request, making them arrive concurrently
* The `FileTypeDetector` (anonymous inner class in `PrimeApplicationContext`) uses an instance of `ServiceLoader<FileTypeDetector>` to resolve the service providers.
* According to the ServiceLoader official documentation (https://docs.oracle.com/en/java/javase/11/docs/api/java.base/java/util/ServiceLoader.html): _Instances of this class are not safe for use by multiple concurrent threads._
This means that first services resolution are happening simultaneosly (each sent file) on the same ServiceLoader (one instantiation in a Singleton class), and probably this is causing trouble in this very moment, the first resolution in the first multiple upload.
This would explain why in doesn't happen when uplaoding only one file the first time, and why after the first upload all other uplaods work properly, because the Service has already been resolved.
### Reproducer
I added two files
- `primefaces-test` contains the test with the primefaces template from its repository. With this one, I don't know exactly why (but I presume is related to some kind of performance difference in ServiceLoader between Jetty and Wildfly), I could only reproduce it once. It didn't happen in any other try (aprocimately 10 times I tried)
- `primefaces-bug` contains a minimal reproducible example without the template, I use it to deploy it ina WIldfly 20.0.1.Final, and there it reproduces the error each and every time.
In any of them, to try to reproduce it, just:
1. Open homepage
2. Click upload files
3. Select a large number of files, 10 for instance.
4. Click Upload
5. The listener method will output the log for some of them, and for the rest a `Could not determine content type of uploaded file _filename_here_, consider plugging in an adequate FileTypeDetector implementation` message will be shown in the server output.
[primefaces-bug.zip](https://github.com/primefaces/primefaces/files/8721787/primefaces-bug.zip)
[primefaces-test.zip](https://github.com/primefaces/primefaces/files/8721788/primefaces-test.zip)
### Expected behavior
Any multiple fileupload should work properly, even the first uploads after a fresh deploy.
### PrimeFaces edition
Community
### PrimeFaces version
tested on 10.0.0 and 11.0.0
### Theme
default
### JSF implementation
Mojarra
### JSF version
2.3
### Browser(s)
Chrome Version 100.0.4896.127 (Official Build) (64-bit)
|
defect
|
fileupload first multiple upload after fresh deploy fails on a random number of files when validatecontentype true describe the bug after a fresh deployment when multiple files are being uploaded in the very first upload a small random number of them fail to reach the fileuplaod component listener method and the server ouputs the following log could not determine content type of uploaded file filename here consider plugging in an adequate filetypedetector implementation the log appears the exact same number of times as the number of files that doesn t reach the fileupload component listener method after that operation all uploads work properly until the next restart redeployment or application server restart this doesn t happen if just one file is uploaded this happens only if validatecontentype true this happens with both tika and mime types providers cause theory primeapplicationcontext is a singleton and hence its filetypedetector filetypedetector is shared between different threads fileupload component sends each file in a different request making them to arrive concurrently the filetypedetector anonymous inner class in primeapplicationcontext uses an instance of serviceloader to resolve the service providers according to the serviceloader official documentation instances of this class are not safe for use by multiple concurrent threads this means that first services resolution are happening simultaneosly each sent file on the same serviceloader one instantiation in a singleton class and probably this is causing trouble in this very moment the first resolution in the first multiple upload this would explain why in doesn t happen when uplaoding only one file the first time and why after the first upload all other uplaods work properly because the service has already been resolved reproducer i added two files primefaces test contains the test with the primefaces template from its repository with this one i don t know exactly why but i presume is related to some 
kind of performance difference in serviceloader between jetty and wildfly i could only reproduce it once it didn t happen in any other try aprocimately times i tried primefaces bug contains a minimal reproducible example without the template i use it to deploy it ina wildfly final and there it reproduces the error each and every time in any of them to try to reproduce it just open homepage click upload files select a large number of files for instance click upload the listener method will output the log for some of them and for the rest a could not determine content type of uploaded file filename here consider plugging in an adequate filetypedetector implementation message will be shown in the server output expected behavior any multiple fileupload should work properly even the first uploads after a fresh deploy primefaces edition community primefaces version tested on and theme default jsf implementation mojarra jsf version browser s chrome version official build bit
| 1
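The race described in the record above (a shared, lazily-initialized service resolver hit by several concurrent first requests) can be sketched in general terms. The snippet below is a minimal illustration of the pattern and its usual fix (resolve once, under a lock), not PrimeFaces code; all names here (`LazyDetector`, the factory) are hypothetical.

```python
import threading

class LazyDetector:
    """Thread-safe lazy initialization of a shared, expensive resource.

    Mirrors the fix implied by the report: resolve the service exactly once,
    under a lock, instead of letting concurrent first requests race on a
    non-thread-safe loader. Names are hypothetical, not PrimeFaces API.
    """
    def __init__(self, factory):
        self._factory = factory
        self._lock = threading.Lock()
        self._instance = None

    def get(self):
        # Double-checked locking: the fast path skips the lock once resolved.
        if self._instance is None:
            with self._lock:
                if self._instance is None:
                    self._instance = self._factory()
        return self._instance

calls = []
detector = LazyDetector(lambda: calls.append(1) or "tika-detector")

# Simulate many files arriving concurrently, each triggering resolution.
threads = [threading.Thread(target=detector.get) for _ in range(16)]
for t in threads: t.start()
for t in threads: t.join()

print(len(calls))  # -> 1: the factory ran exactly once despite 16 threads
```

Without the lock, two threads can both observe `None` and both run the factory, which matches the "first multiple upload fails, later uploads work" symptom in the report.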
|
4,449
| 2,610,094,279
|
IssuesEvent
|
2015-02-26 18:28:29
|
chrsmith/dsdsdaadf
|
https://api.github.com/repos/chrsmith/dsdsdaadf
|
opened
|
深圳红蓝光祛痘痘
|
auto-migrated Priority-Medium Type-Defect
|
```
深圳红蓝光祛痘痘【深圳韩方科颜全国热线400-869-1818,24小时
QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘��
�——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方�
��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健
康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业��
�疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘�
��。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 8:14
|
1.0
|
深圳红蓝光祛痘痘 - ```
深圳红蓝光祛痘痘【深圳韩方科颜全国热线400-869-1818,24小时
QQ4008691818】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘��
�——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方�
��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健
康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业��
�疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘�
��。
```
-----
Original issue reported on code.google.com by `szft...@163.com` on 14 May 2014 at 8:14
|
defect
|
深圳红蓝光祛痘痘 深圳红蓝光祛痘痘【 , 】深圳韩方科颜专业祛痘连锁机构,机构以韩国秘�� �——韩方科颜这一国妆准字号治疗型权威,祛痘佳品,韩方� ��颜专业祛痘连锁机构,采用韩国秘方配合专业“不反弹”健 康祛痘技术并结合先进“先进豪华彩光”仪,开创国内专业�� �疗粉刺、痤疮签约包治先河,成功消除了许多顾客脸上的痘� ��。 original issue reported on code google com by szft com on may at
| 1
|
585,388
| 17,496,216,412
|
IssuesEvent
|
2021-08-10 00:50:40
|
kubernetes/website
|
https://api.github.com/repos/kubernetes/website
|
closed
|
The labels should be changed in the `Expose Your App Publicly` section
|
kind/bug priority/important-longterm lifecycle/stale triage/needs-information language/en triage/accepted
|
**This is a Bug Report**
<!-- Thanks for filing an issue! Before submitting, please fill in the following information. -->
<!-- See https://kubernetes.io/docs/contribute/start/ for guidance on writing an actionable issue description. -->
<!--Required Information-->
**Problem:** The labels in the following commands are wrong -
```
kubectl get pods -l run=kubernetes-bootcamp
kubectl get services -l run=kubernetes-bootcamp
```
**Proposed Solution:**
The labels should be changed and the commands on the page should be -
```
kubectl get pods -l app=kubernetes-bootcamp
kubectl get services -l app=kubernetes-bootcamp
kubectl label pod $POD_NAME version=v1
kubectl get pods -l version=v1
```
Similarly, on the Step 3 of the tutorial, the command should change to the following -
```
kubectl delete service -l app=kubernetes-bootcamp
```
**Page to Update:**
https://kubernetes.io/docs/tutorials/kubernetes-basics/expose/expose-interactive/
<!--Optional Information (remove the comment tags around information you would like to include)-->
<!--Kubernetes Version:-->
<!--Additional Information:-->
I am using the latest version of minikube on Windows 10 -
```
PS C:\> minikube version
minikube version: v1.7.3
commit: 436667c819c324e35d7e839f8116b968a2d0a3ff
```
On a side note, is it possible to update the minikube version inside katacoda?
|
1.0
|
The labels should be changed in the `Expose Your App Publicly` section - **This is a Bug Report**
<!-- Thanks for filing an issue! Before submitting, please fill in the following information. -->
<!-- See https://kubernetes.io/docs/contribute/start/ for guidance on writing an actionable issue description. -->
<!--Required Information-->
**Problem:** The labels in the following commands are wrong -
```
kubectl get pods -l run=kubernetes-bootcamp
kubectl get services -l run=kubernetes-bootcamp
```
**Proposed Solution:**
The labels should be changed and the commands on the page should be -
```
kubectl get pods -l app=kubernetes-bootcamp
kubectl get services -l app=kubernetes-bootcamp
kubectl label pod $POD_NAME version=v1
kubectl get pods -l version=v1
```
Similarly, on the Step 3 of the tutorial, the command should change to the following -
```
kubectl delete service -l app=kubernetes-bootcamp
```
**Page to Update:**
https://kubernetes.io/docs/tutorials/kubernetes-basics/expose/expose-interactive/
<!--Optional Information (remove the comment tags around information you would like to include)-->
<!--Kubernetes Version:-->
<!--Additional Information:-->
I am using the latest version of minikube on Windows 10 -
```
PS C:\> minikube version
minikube version: v1.7.3
commit: 436667c819c324e35d7e839f8116b968a2d0a3ff
```
On a side note, is it possible to update the minikube version inside katacoda?
|
non_defect
|
the labels should be changed in the expose your app publicly section this is a bug report problem the labels in the following commands are wrong kubectl get pods l run kubernetes bootcamp kubectl get services l run kubernetes bootcamp proposed solution the labels should be changed and the commands on the page should be kubectl get pods l app kubernetes bootcamp kubectl get services l app kubernetes bootcamp kubectl label pod pod name version kubectl get pods l version similarly on the step of the tutorial the command should change to the following kubectl delete service l app kubernetes bootcamp page to update i am using the latest version of minikube on windows ps c minikube version minikube version commit on a side note is it possible to update the minikube version inside katacoda
| 0
|
18,057
| 3,022,280,412
|
IssuesEvent
|
2015-07-31 19:22:25
|
catmaid/CATMAID
|
https://api.github.com/repos/catmaid/CATMAID
|
closed
|
Annotation search doesn't recognize newly created annotations
|
priority: important status: done type: defect
|
The Neuron Search widget populates a list of all annotations when it's created, but it never refreshes it. Since it needs this to get the annotation for a search, it is impossible to search for newly created annotations without opening a new Neuron Search widget. It would be better if it re-pulled the list of annotations once you annotate something or maybe just after searches in case annotations were added via another widget.
|
1.0
|
Annotation search doesn't recognize newly created annotations - The Neuron Search widget populates a list of all annotations when it's created, but it never refreshes it. Since it needs this to get the annotation for a search, it is impossible to search for newly created annotations without opening a new Neuron Search widget. It would be better if it re-pulled the list of annotations once you annotate something or maybe just after searches in case annotations were added via another widget.
|
defect
|
annotation search doesn t recognize newly created annotations the neuron search widget populates a list of all annotations when it s created but it never refreshes it since it needs this to get the annotation for a search it is impossible to search for newly created annotations without opening a new neuron search widget it would be better if it re pulled the list of annotations once you annotate something or maybe just after searches in case annotations were added via another widget
| 1
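The stale-cache behavior in the record above (a widget populates its annotation list once and never refreshes it) follows a common pattern, sketched below with an explicit invalidation hook. This is a generic illustration, not CATMAID code; `fetch` stands in for the server call and all names are hypothetical.

```python
class AnnotationIndex:
    """Sketch of the refresh strategy the record suggests: re-pull the
    annotation list after a mutation (or before a search) instead of
    caching it forever. `fetch` is a hypothetical server call."""
    def __init__(self, fetch):
        self._fetch = fetch
        self._cache = None

    def all(self):
        # Lazily populate the cache on first use.
        if self._cache is None:
            self._cache = list(self._fetch())
        return self._cache

    def invalidate(self):
        # Call after annotating something so newly created annotations
        # become searchable without opening a new widget.
        self._cache = None

server = ["soma", "axon"]
idx = AnnotationIndex(lambda: server)
assert idx.all() == ["soma", "axon"]

server.append("dendrite")           # a new annotation is created
assert "dendrite" not in idx.all()  # stale cache: the reported bug
idx.invalidate()
assert "dendrite" in idx.all()      # refreshed after the mutation
```

Invalidating after every annotation write (or before every search) trades a little extra fetching for correctness, which is the trade-off the report proposes.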
|
7,512
| 2,610,403,457
|
IssuesEvent
|
2015-02-26 20:11:09
|
chrsmith/republic-at-war
|
https://api.github.com/repos/chrsmith/republic-at-war
|
closed
|
Ryloth Garrison
|
auto-migrated Priority-Medium Type-Defect
|
```
The land skirmish map 'Ryloth Garrison' has the Rebellion and Empire symbols in
the map preview.
```
-----
Original issue reported on code.google.com by `KillerHurdz@netscape.net` on 19 Jun 2011 at 7:44
|
1.0
|
Ryloth Garrison - ```
The land skirmish map 'Ryloth Garrison' has the Rebellion and Empire symbols in
the map preview.
```
-----
Original issue reported on code.google.com by `KillerHurdz@netscape.net` on 19 Jun 2011 at 7:44
|
defect
|
ryloth garrison the land skirmish map ryloth garrison has the rebellion and empire symbols in the map preview original issue reported on code google com by killerhurdz netscape net on jun at
| 1
|
61,355
| 17,023,673,637
|
IssuesEvent
|
2021-07-03 03:13:54
|
tomhughes/trac-tickets
|
https://api.github.com/repos/tomhughes/trac-tickets
|
closed
|
GPX upload fails, server returns 500
|
Component: api Priority: major Resolution: fixed Type: defect
|
**[Submitted to the original trac issue database at 9.45pm, Monday, 24th January 2011]**
IIRC it started yesterday. Might be a temporary administrative problem but why does it return a few times in a year? I'm considering that as "defect".
If uploading a GPX through JOSM with DirectUpload plugin, there's an error message about denied permissions on the server side. I'm attaching a protocol transcript captured by Wireshark.
If uploading a GPX through http://www.openstreetmap.org/trace/create, the new entry marked "PENDING" etc does not appear. There is only the green text "Your GPX file has been uploaded ..." which is misleading. No new entry even after considerable waiting. No reports in my mailbox.
A few other users are also affected: http://www.openstreetmap.org/user/RGPS/diary/12880
|
1.0
|
GPX upload fails, server returns 500 - **[Submitted to the original trac issue database at 9.45pm, Monday, 24th January 2011]**
IIRC it started yesterday. Might be a temporary administrative problem but why does it return a few times in a year? I'm considering that as "defect".
If uploading a GPX through JOSM with DirectUpload plugin, there's an error message about denied permissions on the server side. I'm attaching a protocol transcript captured by Wireshark.
If uploading a GPX through http://www.openstreetmap.org/trace/create, the new entry marked "PENDING" etc does not appear. There is only the green text "Your GPX file has been uploaded ..." which is misleading. No new entry even after considerable waiting. No reports in my mailbox.
A few other users are also affected: http://www.openstreetmap.org/user/RGPS/diary/12880
|
defect
|
gpx upload fails server returns iirc it started yesterday might be a temporary administrative problem but why does it return a few times in a year i m considering that as defect if uploading a gpx through josm with directupload plugin there s an error message about denied permissions on the server side i m attaching a protocol transcript captured by wireshark if uploading a gpx through the new entry marked pending etc does not appear there is only the green text your gpx file has been uploaded which is misleading no new entry even after considerable waiting no reports in my mailbox a few other users are also affected
| 1
|
256,611
| 27,561,698,385
|
IssuesEvent
|
2023-03-07 22:40:46
|
samqws-marketing/fico-xpress_vdlx-datagrid
|
https://api.github.com/repos/samqws-marketing/fico-xpress_vdlx-datagrid
|
closed
|
CVE-2021-3807 (High) detected in multiple libraries - autoclosed
|
security vulnerability
|
## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/purgecss/node_modules/ansi-regex/package.json,/node_modules/string-width/node_modules/ansi-regex/package.json,/node_modules/cliui/node_modules/ansi-regex/package.json,/node_modules/wrap-ansi/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- bestzip-2.1.5.tgz (Root Library)
- yargs-13.3.0.tgz
- cliui-5.0.0.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/jest-each/node_modules/ansi-regex/package.json,/node_modules/jest-validate/node_modules/ansi-regex/package.json,/node_modules/jest-leak-detector/node_modules/ansi-regex/package.json,/node_modules/jest-matcher-utils/node_modules/ansi-regex/package.json,/node_modules/jest-config/node_modules/ansi-regex/package.json,/node_modules/jest-runtime/node_modules/ansi-regex/package.json,/node_modules/@jest/core/node_modules/ansi-regex/package.json,/node_modules/pretty-format/node_modules/ansi-regex/package.json,/node_modules/jest/node_modules/ansi-regex/package.json,/node_modules/string-length/node_modules/ansi-regex/package.json,/node_modules/jest-jasmine2/node_modules/ansi-regex/package.json,/node_modules/jest-snapshot/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- jest-25.2.3.tgz (Root Library)
- pretty-format-25.5.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/strip-ansi/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- parcel-bundler-1.12.4.tgz (Root Library)
- logger-1.11.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/fico-xpress_vdlx-datagrid/commit/1034e9edaadc6cb260836b29dab13197a606790b">1034e9edaadc6cb260836b29dab13197a606790b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution (ansi-regex): 4.1.1</p>
<p>Direct dependency fix Resolution (bestzip): 2.2.0</p><p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (@types/jest): 26.0.0</p><p>Fix Resolution (ansi-regex): 3.0.1</p>
<p>Direct dependency fix Resolution (parcel-bundler): 1.12.5</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
True
|
CVE-2021-3807 (High) detected in multiple libraries - autoclosed - ## CVE-2021-3807 - High Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>ansi-regex-4.1.0.tgz</b>, <b>ansi-regex-5.0.0.tgz</b>, <b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>
<details><summary><b>ansi-regex-4.1.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-4.1.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/purgecss/node_modules/ansi-regex/package.json,/node_modules/string-width/node_modules/ansi-regex/package.json,/node_modules/cliui/node_modules/ansi-regex/package.json,/node_modules/wrap-ansi/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- bestzip-2.1.5.tgz (Root Library)
- yargs-13.3.0.tgz
- cliui-5.0.0.tgz
- strip-ansi-5.2.0.tgz
- :x: **ansi-regex-4.1.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-5.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-5.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/jest-each/node_modules/ansi-regex/package.json,/node_modules/jest-validate/node_modules/ansi-regex/package.json,/node_modules/jest-leak-detector/node_modules/ansi-regex/package.json,/node_modules/jest-matcher-utils/node_modules/ansi-regex/package.json,/node_modules/jest-config/node_modules/ansi-regex/package.json,/node_modules/jest-runtime/node_modules/ansi-regex/package.json,/node_modules/@jest/core/node_modules/ansi-regex/package.json,/node_modules/pretty-format/node_modules/ansi-regex/package.json,/node_modules/jest/node_modules/ansi-regex/package.json,/node_modules/string-length/node_modules/ansi-regex/package.json,/node_modules/jest-jasmine2/node_modules/ansi-regex/package.json,/node_modules/jest-snapshot/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- jest-25.2.3.tgz (Root Library)
- pretty-format-25.5.0.tgz
- :x: **ansi-regex-5.0.0.tgz** (Vulnerable Library)
</details>
<details><summary><b>ansi-regex-3.0.0.tgz</b></p></summary>
<p>Regular expression for matching ANSI escape codes</p>
<p>Library home page: <a href="https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz">https://registry.npmjs.org/ansi-regex/-/ansi-regex-3.0.0.tgz</a></p>
<p>Path to dependency file: /package.json</p>
<p>Path to vulnerable library: /node_modules/strip-ansi/node_modules/ansi-regex/package.json</p>
<p>
Dependency Hierarchy:
- parcel-bundler-1.12.4.tgz (Root Library)
- logger-1.11.1.tgz
- strip-ansi-4.0.0.tgz
- :x: **ansi-regex-3.0.0.tgz** (Vulnerable Library)
</details>
<p>Found in HEAD commit: <a href="https://github.com/samqws-marketing/fico-xpress_vdlx-datagrid/commit/1034e9edaadc6cb260836b29dab13197a606790b">1034e9edaadc6cb260836b29dab13197a606790b</a></p>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
ansi-regex is vulnerable to Inefficient Regular Expression Complexity
<p>Publish Date: 2021-09-17
<p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2021-3807>CVE-2021-3807</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: High
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/">https://huntr.dev/bounties/5b3cf33b-ede0-4398-9974-800876dfd994/</a></p>
<p>Release Date: 2021-09-17</p>
<p>Fix Resolution (ansi-regex): 4.1.1</p>
<p>Direct dependency fix Resolution (bestzip): 2.2.0</p><p>Fix Resolution (ansi-regex): 5.0.1</p>
<p>Direct dependency fix Resolution (@types/jest): 26.0.0</p><p>Fix Resolution (ansi-regex): 3.0.1</p>
<p>Direct dependency fix Resolution (parcel-bundler): 1.12.5</p>
</p>
</details>
<p></p>
***
<!-- REMEDIATE-OPEN-PR-START -->
- [ ] Check this box to open an automated fix PR
<!-- REMEDIATE-OPEN-PR-END -->
|
non_defect
|
cve high detected in multiple libraries autoclosed cve high severity vulnerability vulnerable libraries ansi regex tgz ansi regex tgz ansi regex tgz ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules purgecss node modules ansi regex package json node modules string width node modules ansi regex package json node modules cliui node modules ansi regex package json node modules wrap ansi node modules ansi regex package json dependency hierarchy bestzip tgz root library yargs tgz cliui tgz strip ansi tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules jest each node modules ansi regex package json node modules jest validate node modules ansi regex package json node modules jest leak detector node modules ansi regex package json node modules jest matcher utils node modules ansi regex package json node modules jest config node modules ansi regex package json node modules jest runtime node modules ansi regex package json node modules jest core node modules ansi regex package json node modules pretty format node modules ansi regex package json node modules jest node modules ansi regex package json node modules string length node modules ansi regex package json node modules jest node modules ansi regex package json node modules jest snapshot node modules ansi regex package json dependency hierarchy jest tgz root library pretty format tgz x ansi regex tgz vulnerable library ansi regex tgz regular expression for matching ansi escape codes library home page a href path to dependency file package json path to vulnerable library node modules strip ansi node modules ansi regex package json dependency hierarchy parcel bundler tgz root library logger tgz strip ansi tgz x ansi regex tgz vulnerable library found in 
head commit a href found in base branch master vulnerability details ansi regex is vulnerable to inefficient regular expression complexity publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution ansi regex direct dependency fix resolution bestzip fix resolution ansi regex direct dependency fix resolution types jest fix resolution ansi regex direct dependency fix resolution parcel bundler check this box to open an automated fix pr
| 0
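The CVE in the record above concerns a regular expression for ANSI escape codes whose structure allowed super-linear backtracking on crafted input. As a general sketch (not the library's actual pattern or fix), a bounded CSI-matching regex with no nested unbounded quantifiers over overlapping alternatives avoids that class of blow-up; the pattern and function names below are illustrative assumptions.

```python
import re

# A deliberately simple pattern for CSI escape sequences: ESC, '[',
# any run of digits/semicolons, one final letter. Because no part of
# the pattern can match the same text two ways, matching stays linear.
ANSI_CSI = re.compile(r"\x1b\[[0-9;]*[A-Za-z]")

def strip_ansi(s: str) -> str:
    """Remove CSI color/cursor sequences from a string (sketch only)."""
    return ANSI_CSI.sub("", s)

print(strip_ansi("\x1b[31mred\x1b[0m text"))  # -> "red text"
```

The vulnerable versions listed in the record were fixed upstream by tightening the pattern; upgrading per the "Suggested Fix" section remains the actual remediation.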
|
24,358
| 3,968,652,694
|
IssuesEvent
|
2016-05-03 20:26:09
|
lester88a/snova
|
https://api.github.com/repos/lester88a/snova
|
closed
|
nitrous io
|
auto-migrated Priority-Medium Type-Defect
|
```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `wssbwssb...@gmail.com` on 18 May 2014 at 8:10
|
1.0
|
nitrous io - ```
What steps will reproduce the problem?
1.
2.
3.
What is the expected output? What do you see instead?
What version of the product are you using? On what operating system?
Please provide any additional information below.
```
Original issue reported on code.google.com by `wssbwssb...@gmail.com` on 18 May 2014 at 8:10
|
defect
|
nitrous io what steps will reproduce the problem what is the expected output what do you see instead what version of the product are you using on what operating system please provide any additional information below original issue reported on code google com by wssbwssb gmail com on may at
| 1
|
43,142
| 11,498,898,899
|
IssuesEvent
|
2020-02-12 12:57:37
|
hazelcast/hazelcast-jet
|
https://api.github.com/repos/hazelcast/hazelcast-jet
|
closed
|
More Jet nodes started in daemon mode overwrite the same output file
|
defect
|
In case when more Jet nodes are started by `jet-start` in daemon mode then they overwrite the same log file. It means log is available only for one of those nodes.
We should allow to configure different name of log file for each node. It is currently impossible since `$JET_LOG` is rewritten even if it is configured outside of script.
|
1.0
|
More Jet nodes started in daemon mode overwrite the same output file - In case when more Jet nodes are started by `jet-start` in daemon mode then they overwrite the same log file. It means log is available only for one of those nodes.
We should allow to configure different name of log file for each node. It is currently impossible since `$JET_LOG` is rewritten even if it is configured outside of script.
|
defect
|
more jet nodes started in daemon mode overwrite the same output file in case when more jet nodes are started by jet start in daemon mode then they overwrite the same log file it means log is available only for one of those nodes we should allow to configure different name of log file for each node it is currently impossible since jet log is rewritten even if it is configured outside of script
| 1
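The Jet record above asks that `$JET_LOG` only be defaulted when it has not already been configured outside the script, and that each daemon node get its own file. A minimal Python sketch of that resolution logic (the function and file names here are illustrative, not taken from the actual `jet-start` script):

```python
import os


def jet_log_path(node_id, env=None):
    """Pick a log file per node, honouring a pre-configured JET_LOG.

    If JET_LOG is already set in the environment, keep it as-is;
    otherwise build a node-specific default so that several daemon
    nodes no longer overwrite the same file.
    """
    env = os.environ if env is None else env
    configured = env.get("JET_LOG")
    if configured:
        return configured
    return "jet-node-{}.log".format(node_id)
```

The same shape in shell would be the `: "${JET_LOG:=default}"` assign-if-unset idiom; the key design point either way is that the externally configured value wins over the script's default.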
|
314,899
| 9,604,209,961
|
IssuesEvent
|
2019-05-10 19:17:24
|
linkerd/linkerd2
|
https://api.github.com/repos/linkerd/linkerd2
|
closed
|
Add inject flag for disabling tap
|
area/cli priority/P1
|
Add a `--disable-tap` flag to inject which prevents the pod from being tapped by instructing the proxy to not expose a tap server.
* Introduce a proxy environment variable to disable the tap server
* Add an inject flag to set the proxy env var
* Also set an annotation on the pod to indicate that tap has been disabled on that pod so that the control plane knows to exclude that pod from any taps
|
1.0
|
Add inject flag for disabling tap - Add a `--disable-tap` flag to inject which prevents the pod from being tapped by instructing the proxy to not expose a tap server.
* Introduce a proxy environment variable to disable the tap server
* Add an inject flag to set the proxy env var
* Also set an annotation on the pod to indicate that tap has been disabled on that pod so that the control plane knows to exclude that pod from any taps
|
non_defect
|
add inject flag for disabling tap add a disable tap flag to inject which prevents the pod from being tapped by instructing the proxy to not expose a tap server introduce a proxy environment variable to disable the tap server add an inject flag to set the proxy env var also set an annotation on the pod to indicate that tap has been disabled on that pod so that the control plane knows to exclude that pod from any taps
| 0
|
16,312
| 2,889,333,372
|
IssuesEvent
|
2015-06-13 09:55:10
|
kuribot/boilerpipe
|
https://api.github.com/repos/kuribot/boilerpipe
|
closed
|
Support HTML5 elements
|
auto-migrated Priority-Medium Type-Defect
|
```
Now that HTML5 becomes more pervasive on the web, it might be worth considering
additional parsing support in places, one example being the recently added
image extractor. HTML5 includes <figure> and <figcaption> for adding semantics
to images, especially the figcaption element is of interest since the text
could be used to determine image relevancy in relation to the extracted
document text.
```
Original issue reported on code.google.com by `misja.ho...@gmail.com` on 18 Oct 2011 at 9:03
|
1.0
|
Support HTML5 elements - ```
Now that HTML5 becomes more pervasive on the web, it might be worth considering
additional parsing support in places, one example being the recently added
image extractor. HTML5 includes <figure> and <figcaption> for adding semantics
to images, especially the figcaption element is of interest since the text
could be used to determine image relevancy in relation to the extracted
document text.
```
Original issue reported on code.google.com by `misja.ho...@gmail.com` on 18 Oct 2011 at 9:03
|
defect
|
support elements now that becomes more pervasive on the web it might be worth considering additional parsing support in places one example being the recently added image extractor includes and for adding semantics to images especially the figcaption element is of interest since the text could be used to determine image relevancy in relation to the extracted document text original issue reported on code google com by misja ho gmail com on oct at
| 1
|
72,127
| 23,953,334,546
|
IssuesEvent
|
2022-09-12 13:16:45
|
vector-im/element-android
|
https://api.github.com/repos/vector-im/element-android
|
reopened
|
App Layout: Bottom sheet padding missing (and landscape mode issue)
|
T-Defect S-Major O-Frequent Team: Delight Z-AppLayout
|
### Steps to reproduce
When I open the bottom sheet there's no padding above all chats, it looks unpolished
Also, when in landscape mode the bottom sheet only shows all chats
| In product | In figma |
|---|---|
|  | <img width="1500" alt="Screenshot 2022-09-06 at 10 03 35" src="https://user-images.githubusercontent.com/89144281/188594278-0b4de16d-6755-45e5-9d04-73a8d350749a.png"> |

|
1.0
|
App Layout: Bottom sheet padding missing (and landscape mode issue) - ### Steps to reproduce
When I open the bottom sheet there's no padding above all chats, it looks unpolished
Also, when in landscape mode the bottom sheet only shows all chats
| In product | In figma |
|---|---|
|  | <img width="1500" alt="Screenshot 2022-09-06 at 10 03 35" src="https://user-images.githubusercontent.com/89144281/188594278-0b4de16d-6755-45e5-9d04-73a8d350749a.png"> |

|
defect
|
app layout bottom sheet padding missing and landscape mode issue steps to reproduce when i open the bottom sheet there s no padding above all chats it looks unpolished also when in landscape mode the bottom sheet only shows all chats in product in figma img width alt screenshot at src
| 1
|
81,279
| 30,780,731,280
|
IssuesEvent
|
2023-07-31 09:49:17
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
opened
|
Misalignment in topbar of mac application
|
T-Defect S-Minor A-Electron A-Appearance O-Frequent
|
### Steps to reproduce
1. Use Element Desktop on Mac
### Outcome
#### What did you expect?
I would expect to see better alignment in the top part.
It's a great touch to remove the topbar as it makes the app more integrated and makes better use of the available vertical space.
It would be great to see the controls aligned with the search bar, and maybe experiment with aligning the user's avatar with the breadcrumb
Other apps like Slack do create a more functional topbar. I am not suggesting to change to that, but it is interesting to see the design direction they took
<img width="1440" alt="Screenshot 2023-07-31 at 10 31 16" src="https://github.com/vector-im/element-web/assets/769871/dfdac1a3-d3f2-4914-8e31-9024b7e63f9b">
#### What happened instead?
<img width="495" alt="Screenshot 2023-07-31 at 10 33 21" src="https://github.com/vector-im/element-web/assets/769871/117c1156-b7bc-4701-b66b-dedba318a127">
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Application version
_No response_
### Homeserver
_No response_
### Will you send logs?
No
|
1.0
|
Misalignment in topbar of mac application - ### Steps to reproduce
1. Use Element Desktop on Mac
### Outcome
#### What did you expect?
I would expect to see better alignment in the top part.
It's a great touch to remove the topbar as it makes the app more integrated and makes better use of the available vertical space.
It would be great to see the controls aligned with the search bar, and maybe experiment with aligning the user's avatar with the breadcrumb
Other apps like Slack do create a more functional topbar. I am not suggesting to change to that, but it is interesting to see the design direction they took
<img width="1440" alt="Screenshot 2023-07-31 at 10 31 16" src="https://github.com/vector-im/element-web/assets/769871/dfdac1a3-d3f2-4914-8e31-9024b7e63f9b">
#### What happened instead?
<img width="495" alt="Screenshot 2023-07-31 at 10 33 21" src="https://github.com/vector-im/element-web/assets/769871/117c1156-b7bc-4701-b66b-dedba318a127">
### Operating system
_No response_
### Browser information
_No response_
### URL for webapp
_No response_
### Application version
_No response_
### Homeserver
_No response_
### Will you send logs?
No
|
defect
|
misalignment in topbar of mac application steps to reproduce use element desktop on mac outcome what did you expect i would expect to see better alignment in the top part it s a great touch to remove the topbar as it makes the app more integrated and makes better use of the available vertical space it would be great to see the controls aligned with the search bar and maybe experiment with aligning the user s avatar with the breadcrumb other apps like slack do create a more functional topbar i am not suggesting to change to that but it is interesting to see the design direction they took img width alt screenshot at src what happened instead img width alt screenshot at src operating system no response browser information no response url for webapp no response application version no response homeserver no response will you send logs no
| 1
|
52,085
| 13,211,386,968
|
IssuesEvent
|
2020-08-15 22:46:24
|
icecube-trac/tix4
|
https://api.github.com/repos/icecube-trac/tix4
|
opened
|
[topsimulator] initialize sampleIndex_ in constructor (Trac #1663)
|
Incomplete Migration Migrated from Trac combo simulation defect
|
<details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1663">https://code.icecube.wisc.edu/projects/icecube/ticket/1663</a>, reported by david.schultzand owned by jgonzalez</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:13:35",
"_ts": "1550067215093672",
"description": "Please initialize `sampleIndex_` in the constructor, so warnings like this don't appear.\n\nThe warning:\n{{{\n/home/dschultz/Documents/combo/trunk/src/topsimulator/private/topsimulator/injectors/I3CorsikaThinnedInjector.cxx:140:16:\n warning: operation on \u2018((I3CorsikaThinnedInjector *)this)->\n I3CorsikaThinnedInjector::sampleIndex_\u2019 may be undefined [-Wsequence-\n point]\nsampleIndex_ = ++sampleIndex_ % numSamples_;\n~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n}}}",
"reporter": "david.schultz",
"cc": "",
"resolution": "fixed",
"time": "2016-04-26T20:43:06",
"component": "combo simulation",
"summary": "[topsimulator] initialize sampleIndex_ in constructor",
"priority": "minor",
"keywords": "",
"milestone": "",
"owner": "jgonzalez",
"type": "defect"
}
```
</p>
</details>
|
1.0
|
[topsimulator] initialize sampleIndex_ in constructor (Trac #1663) - <details>
<summary><em>Migrated from <a href="https://code.icecube.wisc.edu/projects/icecube/ticket/1663">https://code.icecube.wisc.edu/projects/icecube/ticket/1663</a>, reported by david.schultzand owned by jgonzalez</em></summary>
<p>
```json
{
"status": "closed",
"changetime": "2019-02-13T14:13:35",
"_ts": "1550067215093672",
"description": "Please initialize `sampleIndex_` in the constructor, so warnings like this don't appear.\n\nThe warning:\n{{{\n/home/dschultz/Documents/combo/trunk/src/topsimulator/private/topsimulator/injectors/I3CorsikaThinnedInjector.cxx:140:16:\n warning: operation on \u2018((I3CorsikaThinnedInjector *)this)->\n I3CorsikaThinnedInjector::sampleIndex_\u2019 may be undefined [-Wsequence-\n point]\nsampleIndex_ = ++sampleIndex_ % numSamples_;\n~~~~~~~~~~~~~^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~\n}}}",
"reporter": "david.schultz",
"cc": "",
"resolution": "fixed",
"time": "2016-04-26T20:43:06",
"component": "combo simulation",
"summary": "[topsimulator] initialize sampleIndex_ in constructor",
"priority": "minor",
"keywords": "",
"milestone": "",
"owner": "jgonzalez",
"type": "defect"
}
```
</p>
</details>
|
defect
|
initialize sampleindex in constructor trac migrated from json status closed changetime ts description please initialize sampleindex in the constructor so warnings like this don t appear n nthe warning n n home dschultz documents combo trunk src topsimulator private topsimulator injectors cxx n warning operation on this n sampleindex may be undefined nsampleindex sampleindex numsamples n n reporter david schultz cc resolution fixed time component combo simulation summary initialize sampleindex in constructor priority minor keywords milestone owner jgonzalez type defect
| 1
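The warning quoted in the topsimulator record above comes from `sampleIndex_ = ++sampleIndex_ % numSamples_;`, which modifies the variable twice without an intervening sequence point and is therefore undefined behaviour in C++. The well-defined replacement reads the old value once and writes once: `sampleIndex_ = (sampleIndex_ + 1) % numSamples_;`. A small Python sketch of the same ring-counter update, including the constructor initialisation the ticket asks for (class and attribute names are illustrative):

```python
class SampleCycler:
    """Cycle an index over a fixed number of samples.

    Mirrors the well-defined C++ form
    `sampleIndex_ = (sampleIndex_ + 1) % numSamples_`:
    one read of the old value, one write of the new value.
    """

    def __init__(self, num_samples):
        self.num_samples = num_samples
        self.sample_index = 0  # initialised in the constructor, as requested

    def advance(self):
        self.sample_index = (self.sample_index + 1) % self.num_samples
        return self.sample_index
```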
|
82,284
| 15,646,536,535
|
IssuesEvent
|
2021-03-23 01:09:19
|
jgeraigery/TwxAzureDataLakeConnector
|
https://api.github.com/repos/jgeraigery/TwxAzureDataLakeConnector
|
opened
|
CVE-2020-28500 (Medium) detected in multiple libraries
|
security vulnerability
|
## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.17.14.tgz</b>, <b>lodash-4.17.15.tgz</b>, <b>lodash-4.17.11.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.17.14.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.14.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.14.tgz</a></p>
<p>Path to dependency file: TwxAzureDataLakeConnector/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json</p>
<p>Path to vulnerable library: TwxAzureDataLakeConnector/src/node_modules/ptc-flow-test-helper/node_modules/lodash/package.json,TwxAzureDataLakeConnector/src/node_modules/ptc-flow-test-helper/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- ptc-flow-sdk-2.0.61.tgz (Root Library)
- :x: **lodash-4.17.14.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-4.17.15.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p>
<p>Path to dependency file: TwxAzureDataLakeConnector/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json</p>
<p>Path to vulnerable library: TwxAzureDataLakeConnector/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- :x: **lodash-4.17.15.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-4.17.11.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz</a></p>
<p>Path to dependency file: TwxAzureDataLakeConnector/src/package.json</p>
<p>Path to vulnerable library: TwxAzureDataLakeConnector/src/node_modules/nyc/node_modules/lodash/package.json,TwxAzureDataLakeConnector/src/node_modules/nyc/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- nyc-13.3.0.tgz (Root Library)
- istanbul-reports-2.1.1.tgz
- handlebars-4.1.0.tgz
- async-2.6.2.tgz
- :x: **lodash-4.17.11.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7">https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash - 4.17.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.14","packageFilePaths":["/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json","/src/package.json"],"isTransitiveDependency":true,"dependencyTree":"ptc-flow-sdk:2.0.61;lodash:4.17.14","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"},{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.15","packageFilePaths":["/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json"],"isTransitiveDependency":false,"dependencyTree":"lodash:4.17.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"},{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.11","packageFilePaths":["/src/package.json","/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json"],"isTransitiveDependency":true,"dependencyTree":"nyc:13.3.0;istanbul-reports:2.1.1;handlebars:4.1.0;async:2.6.2;lodash:4.17.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-28500","vulnerabilityDetails":"Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
True
|
CVE-2020-28500 (Medium) detected in multiple libraries - ## CVE-2020-28500 - Medium Severity Vulnerability
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>lodash-4.17.14.tgz</b>, <b>lodash-4.17.15.tgz</b>, <b>lodash-4.17.11.tgz</b></p></summary>
<p>
<details><summary><b>lodash-4.17.14.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.14.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.14.tgz</a></p>
<p>Path to dependency file: TwxAzureDataLakeConnector/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json</p>
<p>Path to vulnerable library: TwxAzureDataLakeConnector/src/node_modules/ptc-flow-test-helper/node_modules/lodash/package.json,TwxAzureDataLakeConnector/src/node_modules/ptc-flow-test-helper/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- ptc-flow-sdk-2.0.61.tgz (Root Library)
- :x: **lodash-4.17.14.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-4.17.15.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.15.tgz</a></p>
<p>Path to dependency file: TwxAzureDataLakeConnector/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json</p>
<p>Path to vulnerable library: TwxAzureDataLakeConnector/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- :x: **lodash-4.17.15.tgz** (Vulnerable Library)
</details>
<details><summary><b>lodash-4.17.11.tgz</b></p></summary>
<p>Lodash modular utilities.</p>
<p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.11.tgz</a></p>
<p>Path to dependency file: TwxAzureDataLakeConnector/src/package.json</p>
<p>Path to vulnerable library: TwxAzureDataLakeConnector/src/node_modules/nyc/node_modules/lodash/package.json,TwxAzureDataLakeConnector/src/node_modules/nyc/node_modules/lodash/package.json</p>
<p>
Dependency Hierarchy:
- nyc-13.3.0.tgz (Root Library)
- istanbul-reports-2.1.1.tgz
- handlebars-4.1.0.tgz
- async-2.6.2.tgz
- :x: **lodash-4.17.11.tgz** (Vulnerable Library)
</details>
<p>Found in base branch: <b>master</b></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary>
<p>
Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.
<p>Publish Date: 2021-02-15
<p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500>CVE-2020-28500</a></p>
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary>
<p>
Base Score Metrics:
- Exploitability Metrics:
- Attack Vector: Network
- Attack Complexity: Low
- Privileges Required: None
- User Interaction: None
- Scope: Unchanged
- Impact Metrics:
- Confidentiality Impact: None
- Integrity Impact: None
- Availability Impact: Low
</p>
For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p>
</details>
<p></p>
<details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary>
<p>
<p>Type: Upgrade version</p>
<p>Origin: <a href="https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7">https://github.com/lodash/lodash/commit/02906b8191d3c100c193fe6f7b27d1c40f200bb7</a></p>
<p>Release Date: 2021-02-15</p>
<p>Fix Resolution: lodash - 4.17.21</p>
</p>
</details>
<p></p>
<!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.14","packageFilePaths":["/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json","/src/package.json"],"isTransitiveDependency":true,"dependencyTree":"ptc-flow-sdk:2.0.61;lodash:4.17.14","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"},{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.15","packageFilePaths":["/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json"],"isTransitiveDependency":false,"dependencyTree":"lodash:4.17.15","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"},{"packageType":"javascript/Node.js","packageName":"lodash","packageVersion":"4.17.11","packageFilePaths":["/src/package.json","/dist/ptc-adls-connector-1-0-0/ptc-adls-connector/package.json"],"isTransitiveDependency":true,"dependencyTree":"nyc:13.3.0;istanbul-reports:2.1.1;handlebars:4.1.0;async:2.6.2;lodash:4.17.11","isMinimumFixVersionAvailable":true,"minimumFixVersion":"lodash - 4.17.21"}],"baseBranches":["master"],"vulnerabilityIdentifier":"CVE-2020-28500","vulnerabilityDetails":"Lodash versions prior to 4.17.21 are vulnerable to Regular Expression Denial of Service (ReDoS) via the toNumber, trim and trimEnd functions.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2020-28500","cvss3Severity":"medium","cvss3Score":"5.3","cvss3Metrics":{"A":"Low","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
|
non_defect
|
cve medium detected in multiple libraries cve medium severity vulnerability vulnerable libraries lodash tgz lodash tgz lodash tgz lodash tgz lodash modular utilities library home page a href path to dependency file twxazuredatalakeconnector dist ptc adls connector ptc adls connector package json path to vulnerable library twxazuredatalakeconnector src node modules ptc flow test helper node modules lodash package json twxazuredatalakeconnector src node modules ptc flow test helper node modules lodash package json dependency hierarchy ptc flow sdk tgz root library x lodash tgz vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file twxazuredatalakeconnector dist ptc adls connector ptc adls connector package json path to vulnerable library twxazuredatalakeconnector dist ptc adls connector ptc adls connector node modules lodash package json dependency hierarchy x lodash tgz vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file twxazuredatalakeconnector src package json path to vulnerable library twxazuredatalakeconnector src node modules nyc node modules lodash package json twxazuredatalakeconnector src node modules nyc node modules lodash package json dependency hierarchy nyc tgz root library istanbul reports tgz handlebars tgz async tgz x lodash tgz vulnerable library found in base branch master vulnerability details lodash versions prior to are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution lodash isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree ptc flow sdk lodash isminimumfixversionavailable true minimumfixversion lodash packagetype javascript node js packagename lodash packageversion packagefilepaths istransitivedependency false dependencytree lodash isminimumfixversionavailable true minimumfixversion lodash packagetype javascript node js packagename lodash packageversion packagefilepaths istransitivedependency true dependencytree nyc istanbul reports handlebars async lodash isminimumfixversionavailable true minimumfixversion lodash basebranches vulnerabilityidentifier cve vulnerabilitydetails lodash versions prior to are vulnerable to regular expression denial of service redos via the tonumber trim and trimend functions vulnerabilityurl
| 0
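The remediation in the CVE-2020-28500 record above reduces to checking that every resolved `lodash` dependency is at or above the fixed release 4.17.21. A hedged pure-Python sketch of such a minimum-version gate (dotted-triple comparison only; real npm semver ranges with prerelease tags and `^`/`~` operators are richer than this):

```python
def parse_version(v):
    """Turn a dotted version string like '4.17.15' into (4, 17, 15)."""
    return tuple(int(part) for part in v.split("."))


def meets_fix(resolved, minimum="4.17.21"):
    """True when the resolved version is at or above the fixed release.

    Tuple comparison orders component-wise, so 4.18.0 > 4.17.21 even
    though a plain string comparison would get that wrong.
    """
    return parse_version(resolved) >= parse_version(minimum)
```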
|
17,332
| 2,999,709,551
|
IssuesEvent
|
2015-07-23 20:28:44
|
scipy/scipy
|
https://api.github.com/repos/scipy/scipy
|
closed
|
some scipy ufuncs have inconsistent output dtypes?
|
defect scipy.special
|
The `lambertw` function appears to have inconsistent output dtypes:
```python
>>> from scipy.special import lambertw
>>> lambertw(0, 0, 0)
0j
>>> lambertw([0], 0, 0)
array([ 0.+0.j])
>>> lambertw(0, [0], 0)
array([ 0.+0.j], dtype=complex64)
>>> lambertw(0, 0, [0])
array([ 0.+0.j])
```
This behavior is shared by `sph_harm` and is possibly at the root of https://github.com/scipy/scipy/issues/4887.
|
1.0
|
some scipy ufuncs have inconsistent output dtypes? - The `lambertw` function appears to have inconsistent output dtypes:
```python
>>> from scipy.special import lambertw
>>> lambertw(0, 0, 0)
0j
>>> lambertw([0], 0, 0)
array([ 0.+0.j])
>>> lambertw(0, [0], 0)
array([ 0.+0.j], dtype=complex64)
>>> lambertw(0, 0, [0])
array([ 0.+0.j])
```
This behavior is shared by `sph_harm` and is possibly at the root of https://github.com/scipy/scipy/issues/4887.
|
defect
|
some scipy ufuncs have inconsistent output dtypes the lambertw function appears to have inconsistent output dtypes python from scipy special import lambertw lambertw lambertw array lambertw array dtype lambertw array this behavior is shared by sph harm and is possibly at the root of
| 1
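The scipy record above shows `lambertw` returning a different dtype (`complex64` versus the default complex) depending on which argument happens to be a sequence. The invariant being requested is that every call path yields one consistent output type. A stdlib-only sketch of a wrapper enforcing that invariant on a toy function (`toy_lambertw` is a stand-in for illustration, not SciPy's implementation, which uses NumPy ufunc machinery):

```python
def uniform_complex(func):
    """Wrap a numeric function so every result is a Python complex.

    Scalar inputs give a complex scalar; list inputs give a list of
    complex values -- one output type regardless of which argument
    was passed as a sequence.
    """
    def wrapper(*args):
        # If any argument is a list, evaluate element-wise,
        # broadcasting the scalar arguments across each position.
        if any(isinstance(a, list) for a in args):
            length = max(len(a) for a in args if isinstance(a, list))
            rows = [
                [a[i] if isinstance(a, list) else a for a in args]
                for i in range(length)
            ]
            return [complex(func(*row)) for row in rows]
        return complex(func(*args))
    return wrapper


@uniform_complex
def toy_lambertw(z, k=0, tol=0):
    # Stand-in: the principal branch of W at 0 is 0.
    return z
```

With this wrapper, `toy_lambertw(0, 0, 0)`, `toy_lambertw([0], 0, 0)`, `toy_lambertw(0, [0], 0)` and `toy_lambertw(0, 0, [0])` all come back as `complex` values, which is the consistency the issue asks of the real ufunc.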
|
17,957
| 3,013,821,117
|
IssuesEvent
|
2015-07-29 11:28:34
|
yawlfoundation/yawl
|
https://api.github.com/repos/yawlfoundation/yawl
|
closed
|
Editor 3.0 creates unwanted out-mapping for input only variable
|
auto-migrated Priority-Medium Type-Defect
|
```
When using the new function to pull a net variable into a task, the mappings
are set automatically, nice thing.
But when the task variable is then set to input, the mapping is not removed in
the spec.
Editor latest build 469, see attached screenshot
```
Original issue reported on code.google.com by `anwim...@gmail.com` on 27 May 2014 at 7:25
Attachments:
* [unwanted_out_mapping.PNG](https://storage.googleapis.com/google-code-attachments/yawl/issue-524/comment-0/unwanted_out_mapping.PNG)
|
1.0
|
Editor 3.0 creates unwanted out-mapping for input only variable - ```
When using the new function to pull a net variable into a task, the mappings
are set automatically, nice thing.
But when the task variable is then set to input, the mapping is not removed in
the spec.
Editor latest build 469, see attached screenshot
```
Original issue reported on code.google.com by `anwim...@gmail.com` on 27 May 2014 at 7:25
Attachments:
* [unwanted_out_mapping.PNG](https://storage.googleapis.com/google-code-attachments/yawl/issue-524/comment-0/unwanted_out_mapping.PNG)
|
defect
|
editor creates unwanted out mapping for input only variable when using the new function to pull a net variable into a task the mappings are set automatically nice thing but when the task variable is then set to input the mapping is not removed in the spec editor latest build see attached screenshot original issue reported on code google com by anwim gmail com on may at attachments
| 1
|
26,939
| 4,838,302,454
|
IssuesEvent
|
2016-11-09 02:27:03
|
zealdocs/zeal
|
https://api.github.com/repos/zealdocs/zeal
|
closed
|
Sidebar does not show context relevant links
|
Component: Docset Registry Resolution: Fixed Type: Defect
|
In a build from latest master, the sidebar no longer shows methods or other links for the documentation being viewed.
Initially when a page is viewed in a new docset, the sidebar populates with items from that docset, but not necessarily items that are in any way relevant to that page. Additionally, when viewing new pages within that docset, the sidebar doesn't update at all. So far this occurs with every docset I have tried, including ruby 2, Bash, PHP, and Qt 5. The sidebar is actually completely non-present for other docsets, like Javascript, Java SE 8, and Python 2 (but I don't know if that's intentional).
Especially for the ruby 2 docset, this issue is crippling, because search doesn't work in that docset either (issue #641) so there's no way to jump to a specific part of the page.
|
1.0
|
Sidebar does not show context relevant links - In a build from latest master, the sidebar no longer shows methods or other links for the documentation being viewed.
Initially when a page is viewed in a new docset, the sidebar populates with items from that docset, but not necessarily items that are in any way relevant to that page. Additionally, when viewing new pages within that docset, the sidebar doesn't update at all. So far this occurs with every docset I have tried, including ruby 2, Bash, PHP, and Qt 5. The sidebar is actually completely non-present for other docsets, like Javascript, Java SE 8, and Python 2 (but I don't know if that's intentional).
Especially for the ruby 2 docset, this issue is crippling, because search doesn't work in that docset either (issue #641) so there's no way to jump to a specific part of the page.
|
defect
|
sidebar does not show context relevant links in a build from latest master the sidebar no longer shows methods or other links for the documentation being viewed initially when a page is viewed in a new docset the sidebar populates with items from that docset but not necessarily items that are in any way relevant to that page additionally when viewing new pages within that docset the sidebar doesn t update at all so far this occurs with every docset i have tried including ruby bash php and qt the sidebar is actually completely non present for other docsets like javascript java se and python but i don t know if that s intentional especially for the ruby docset this issue is crippling because search doesn t work in that docset either issue so there s no way to jump to a specific part of the page
| 1
|
313,705
| 23,488,956,925
|
IssuesEvent
|
2022-08-17 16:42:56
|
kubelt/kubelt
|
https://api.github.com/repos/kubelt/kubelt
|
closed
|
FEAT(onboarding): Update FAQ Copy
|
documentation
|
# Why
The onboarding dashboard FAQ helps users understand 3ID so that they can use it and earn invitations.
# What
Please update the FAQ copy so that it is more reflective of features and goals.
# How
> Please design, decompose, and discuss: break the logical **What** into a set of issues documenting physical implementation tasks.
- [ ] Issue #1
- [ ] Issue #2
- [ ] Issue #3
# Notes
- [User journey](https://www.notion.so/kubelt/3iD-Onboarding-18fd37f9dec340178dbe86d48afd2431).
- [Design assets](https://www.figma.com/file/EqUEbCGHGnZXDMSUbrxhX7/Kubelt?node-id=2540%3A10207).
|
1.0
|
FEAT(onboarding): Update FAQ Copy - # Why
The onboarding dashboard FAQ helps users understand 3ID so that they can use it and earn invitations.
# What
Please update the FAQ copy so that it is more reflective of features and goals.
# How
> Please design, decompose, and discuss: break the logical **What** into a set of issues documenting physical implementation tasks.
- [ ] Issue #1
- [ ] Issue #2
- [ ] Issue #3
# Notes
- [User journey](https://www.notion.so/kubelt/3iD-Onboarding-18fd37f9dec340178dbe86d48afd2431).
- [Design assets](https://www.figma.com/file/EqUEbCGHGnZXDMSUbrxhX7/Kubelt?node-id=2540%3A10207).
|
non_defect
|
feat onboarding update faq copy why the onboarding dashboard faq helps users understand so that they can use it and earn invitations what please update the faq copy so that it is more reflective of features and goals how please design decompose and discuss break the logical what into a set of issues documenting physical implementation tasks issue issue issue notes
| 0
|
10,216
| 3,369,486,484
|
IssuesEvent
|
2015-11-23 10:26:00
|
systemd/systemd
|
https://api.github.com/repos/systemd/systemd
|
closed
|
RFE: make systemd.directives html pages jump to the anchor in the target
|
documentation RFE
|
currently when you click on a link from http://www.freedesktop.org/software/systemd/man/systemd.directives.html
say, the first one, Accept=, it currently goes to http://www.freedesktop.org/software/systemd/man/systemd.socket.html
it would be useful if it went to http://www.freedesktop.org/software/systemd/man/systemd.socket.html#Accept=
(i don't know dockbook nor if that is easy)
|
1.0
|
RFE: make systemd.directives html pages jump to the anchor in the target - currently when you click on a link from http://www.freedesktop.org/software/systemd/man/systemd.directives.html
say, the first one, Accept=, it currently goes to http://www.freedesktop.org/software/systemd/man/systemd.socket.html
it would be useful if it went to http://www.freedesktop.org/software/systemd/man/systemd.socket.html#Accept=
(i don't know dockbook nor if that is easy)
|
non_defect
|
rfe make systemd directives html pages jump to the anchor in the target currently when you click on a link from say the first one accept it currently goes to it would be useful if it went to i don t know dockbook nor if that is easy
| 0
|
69,335
| 22,320,451,368
|
IssuesEvent
|
2022-06-14 05:39:21
|
vector-im/element-web
|
https://api.github.com/repos/vector-im/element-web
|
closed
|
AVIF Support - White image on preview
|
T-Defect X-Needs-Info Z-Platform-Specific S-Major A-Media Z-Upstream O-Uncommon
|
### Steps to reproduce
1. Upload a .avif image
2. Look at the chat and there will be a while preview (unless it is different on my version for some reason)
3. When you click on the image it displays alright
### What happened?
### What did you expect?
The preview to show the image
### What happened?
It was white
### Operating system
Popos 20.04
(using i3 with xorg)
### Application version
Element version: 1.8.5 Olm version: 3.2.3
### How did you install the app?
element.io
### Homeserver
matrix.org
### Have you submitted a rageshake?
No
|
1.0
|
AVIF Support - White image on preview - ### Steps to reproduce
1. Upload a .avif image
2. Look at the chat and there will be a while preview (unless it is different on my version for some reason)
3. When you click on the image it displays alright
### What happened?
### What did you expect?
The preview to show the image
### What happened?
It was white
### Operating system
Popos 20.04
(using i3 with xorg)
### Application version
Element version: 1.8.5 Olm version: 3.2.3
### How did you install the app?
element.io
### Homeserver
matrix.org
### Have you submitted a rageshake?
No
|
defect
|
avif support white image on preview steps to reproduce upload a avif image look at the chat and there will be a while preview unless it is different on my version for some reason when you click on the image it displays alright what happened what did you expect the preview to show the image what happened it was white operating system popos using with xorg application version element version olm version how did you install the app element io homeserver matrix org have you submitted a rageshake no
| 1