column        dtype          range / classes
Unnamed: 0    int64          0 .. 832k
id            float64        2.49B .. 32.1B
type          stringclasses  1 value
created_at    stringlengths  19 .. 19
repo          stringlengths  7 .. 112
repo_url      stringlengths  36 .. 141
action        stringclasses  3 values
title         stringlengths  1 .. 744
labels        stringlengths  4 .. 574
body          stringlengths  9 .. 211k
index         stringclasses  10 values
text_combine  stringlengths  96 .. 211k
label         stringclasses  2 values
text          stringlengths  96 .. 188k
binary_label  int64          0 .. 1
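The flattened summary above can be reconstructed as a typed table. A minimal sketch, assuming pandas is the consumer; the column names and ranges come from the summary itself, and the `stringclasses` / `stringlengths` columns are held as plain `object` dtype since the source does not state a string dtype:

```python
import pandas as pd

# Column schema inferred from the summary above; string-typed columns
# ("stringclasses" / "stringlengths") are kept as plain object columns.
schema = {
    "Unnamed: 0": "int64",     # row index, 0 .. ~832k
    "id": "float64",           # GitHub event id
    "type": "object",          # 1 class: "IssuesEvent"
    "created_at": "object",    # 19-character timestamp string
    "repo": "object",
    "repo_url": "object",
    "action": "object",        # 3 classes
    "title": "object",
    "labels": "object",
    "body": "object",
    "index": "object",         # 10 classes
    "text_combine": "object",  # title + " - " + body
    "label": "object",         # 2 classes: "process" / "non_process"
    "text": "object",          # cleaned text
    "binary_label": "int64",   # 0 or 1
}

# Build an empty, correctly-typed frame to validate incoming rows against.
df = pd.DataFrame({col: pd.Series(dtype=dt) for col, dt in schema.items()})
```

Reading rows into this frame (or checking a loaded CSV against it) catches dtype drift early, e.g. `binary_label` arriving as a string.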
Unnamed: 0: 756,930
id: 26,490,173,580
type: IssuesEvent
created_at: 2023-01-17 21:49:09
repo: yugabyte/yugabyte-db
repo_url: https://api.github.com/repos/yugabyte/yugabyte-db
action: closed
title: Key pattern search and delete in Redis.
labels: kind/enhancement priority/low area/ycql
body: Jira Link: [DB-4721](https://yugabyte.atlassian.net/browse/DB-4721) Please add the ability to do key pattern search and delete. ref: https://redis.io/commands/keys https://redis.io/commands/del
index: 1.0
text_combine: Key pattern search and delete in Redis. - Jira Link: [DB-4721](https://yugabyte.atlassian.net/browse/DB-4721) Please add the ability to do key pattern search and delete. ref: https://redis.io/commands/keys https://redis.io/commands/del
label: non_process
text: key pattern search and delete in redis jira link please add the ability to do key pattern search and delete ref
binary_label: 0
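The enhancement requested in this record (Redis `KEYS`-style pattern matching combined with `DEL`) can be sketched against an in-memory key space. This is an illustrative stand-in, not YugabyteDB's implementation; against a real Redis server the equivalent would be redis-py's `scan_iter(match=...)` followed by `delete(*keys)`:

```python
from fnmatch import fnmatchcase

def delete_matching(store: dict, pattern: str) -> int:
    """Delete every key matching a Redis-style glob pattern; return the count."""
    # Snapshot the matches first so the dict is never mutated while scanning it.
    matched = [key for key in store if fnmatchcase(key, pattern)]
    for key in matched:
        del store[key]
    return len(matched)
```

For example, `delete_matching(cache, "session:*")` removes every session key in one call, which is exactly the two-command workflow (`KEYS` then `DEL`) the issue asks for.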
Unnamed: 0: 124,605
id: 10,318,058,071
type: IssuesEvent
created_at: 2019-08-30 14:08:32
repo: bertvannuffelen/demo_oslodoc
repo_url: https://api.github.com/repos/bertvannuffelen/demo_oslodoc
action: closed
title: URI fragments of generated properties from associations with a direction indication should start with a lowercase letter
labels: Topic: testassociaties readyfortest
body: URI fragments of **generated** properties from associations **with a direction indication** should start with a **lowercase letter**. ![afbeelding](https://user-images.githubusercontent.com/6448245/60177699-d532ee80-9819-11e9-848b-9840067290f6.png) Example of how it is now: "https://data.vlaanderen.be/ns/mijndomein#Heeft01A" "https://data.vlaanderen.be/ns/mijndomein#Heeft01B" Proposed solution: have the toolchain do the conversion for generated properties from associations with a direction indication (not for URI fragments of existing properties!) "https://data.vlaanderen.be/ns/mijndomein#heeft01A" "https://data.vlaanderen.be/ns/mijndomein#heeft01B"
index: 2.0
text_combine: URI fragments of generated properties from associations with a direction indication should start with a lowercase letter - URI fragments of **generated** properties from associations **with a direction indication** should start with a **lowercase letter**. ![afbeelding](https://user-images.githubusercontent.com/6448245/60177699-d532ee80-9819-11e9-848b-9840067290f6.png) Example of how it is now: "https://data.vlaanderen.be/ns/mijndomein#Heeft01A" "https://data.vlaanderen.be/ns/mijndomein#Heeft01B" Proposed solution: have the toolchain do the conversion for generated properties from associations with a direction indication (not for URI fragments of existing properties!) "https://data.vlaanderen.be/ns/mijndomein#heeft01A" "https://data.vlaanderen.be/ns/mijndomein#heeft01B"
label: non_process
text: uri fragments of generated properties from associations with a direction indication should start with a lowercase letter uri fragments of generated properties from associations with a direction indication should start with a lowercase letter example of how it is now proposed solution have the toolchain do the conversion for generated properties from associations with a direction indication not for uri fragments of existing properties
binary_label: 0
Unnamed: 0: 43,608
id: 17,630,245,585
type: IssuesEvent
created_at: 2021-08-19 06:57:33
repo: hashicorp/terraform-provider-azurerm
repo_url: https://api.github.com/repos/hashicorp/terraform-provider-azurerm
action: closed
title: Error deleting non-last load balancer frontend IP config
labels: bug upstream-terraform service/load-balancers
body:
<!--- Please note the following potential times when an issue might be in Terraform core: * [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues * [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues * [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues * [Registry](https://registry.terraform.io/) issues * Spans resources across multiple providers If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. ---> <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform (and AzureRM Provider) Version ``` Terraform v0.12.9 + provider.aws v2.29.0 + provider.azurerm v1.34.0 + provider.null v2.1.2 ``` ### Affected Resource(s) * `azurerm_lb` ### Terraform Configuration Files At first, we create a load balancer with three frontend IP configs. Note that the third one has a public IP *prefix*, but the first two have public IP *addresses*. 
```json "azurerm_lb": { "my-lb1": { "frontend_ip_configuration": [ { "name": "my-lb1-ms4-frontend", "public_ip_address_id": "${azurerm_public_ip.my-ip-ms4.id}" }, { "name": "my-lb1-ms5-frontend", "public_ip_address_id": "${azurerm_public_ip.my-ip-ms5.id}" }, { "name": "my-snat", "public_ip_prefix_id": "${azurerm_public_ip_prefix.my.id}" } ], "location": "centraluseuap", "name": "my-lb1", "resource_group_name": "${azurerm_resource_group.my.name}", "sku": "Standard" } }, ``` Then, we modify the config, removing the first frontend IP config, leaving only one *prefix* and one *address*: ```json "azurerm_lb": { "my-lb1": { "frontend_ip_configuration": [ { "name": "my-lb1-ms5-frontend", "public_ip_address_id": "${azurerm_public_ip.my-ip-ms5.id}" }, { "name": "my-snat", "public_ip_prefix_id": "${azurerm_public_ip_prefix.my.id}" } ], "location": "centraluseuap", "name": "my-lb1", "resource_group_name": "${azurerm_resource_group.my.name}", "sku": "Standard" } }, ``` ### Debug Output ``` Terraform will perform the following actions: # azurerm_lb.my-lb1 will be updated in-place ~ resource "azurerm_lb" "my-lb1" { id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1" location = "centraluseuap" name = "my-lb1" private_ip_addresses = [] resource_group_name = "my" sku = "Standard" tags = {} ~ frontend_ip_configuration { inbound_nat_rules = [] load_balancer_rules = [ "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms4-ws7", "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms4-ws8", ] ~ name = "my-lb1-ms4-frontend" -> "my-lb1-ms5-frontend" outbound_rules = [] private_ip_address_allocation = "Dynamic" ~ public_ip_address_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms4" -> 
"/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms5" zones = [] } ~ frontend_ip_configuration { inbound_nat_rules = [] load_balancer_rules = [ "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms5-ws10", "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms5-ws9", ] ~ name = "my-lb1-ms5-frontend" -> "my-snat" outbound_rules = [] private_ip_address_allocation = "Dynamic" public_ip_address_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms5" + public_ip_prefix_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPPrefixes/my" zones = [] } - frontend_ip_configuration { - inbound_nat_rules = [] -> null - load_balancer_rules = [] -> null - name = "my-snat" -> null - outbound_rules = [ - "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/outboundRules/my-lb1", ] -> null - private_ip_address_allocation = "Dynamic" -> null - public_ip_prefix_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPPrefixes/my" -> null - zones = [] -> null } } ``` ... 
``` Error: Error Creating/Updating Load Balancer "my-lb1" (Resource Group "my"): network.LoadBalancersClient#CreateOrUpdate: Failure sending request: StatusCode=400 -- Original Error: Code="LoadBalancerFrontendIPConfigurationCannotReferencePublicIPAddressAndPublicIPPrefix" Message="Load Balancer Frontend IP Configuration /subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/frontendIPConfigurations/my-snat cannot reference Public IP Address /subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms4 and Public IP Prefix /subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPPrefixes/my" Details=[] ``` ### Panic Output <!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. ---> ### Expected Behavior Preferred behavior: Terraform realizes that I deleted one of the three frontend IP configurations, and deletes that one, without touching the other two. Acceptable behavior: Terraform doesn't realize I deleted one of the configs; it just does a straight diff and ends up deleting the last/third config, and "shifting" the settings of the first two configs one down. So 3 becomes 2, 2 becomes 1, and 1 is gone. ### Actual Behavior Terraform doesn't realize I deleted a config, so it tries to do what I said above in "acceptable behavior." However, it appears to miss the `public_ip_prefix`. That causes it to try and add a `public_ip_prefix` to a config that already has a `public_ip_address`. Azure, rightfully, refuses to do so. ### Steps to Reproduce 1. Use the first config above 1. `terraform apply` 1. Use the second config above 1. `terraform apply` ### Important Factoids <!--- Are there anything atypical about your accounts that we should know? For example: Running in a Azure China/Germany/Government? 
---> ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Such as vendor documentation? ---> * May be related to #1622
index: 1.0
text_combine:
Error deleting non-last load balancer frontend IP config - <!--- Please note the following potential times when an issue might be in Terraform core: * [Configuration Language](https://www.terraform.io/docs/configuration/index.html) or resource ordering issues * [State](https://www.terraform.io/docs/state/index.html) and [State Backend](https://www.terraform.io/docs/backends/index.html) issues * [Provisioner](https://www.terraform.io/docs/provisioners/index.html) issues * [Registry](https://registry.terraform.io/) issues * Spans resources across multiple providers If you are running into one of these scenarios, we recommend opening an issue in the [Terraform core repository](https://github.com/hashicorp/terraform/) instead. ---> <!--- Please keep this note for the community ---> ### Community Note * Please vote on this issue by adding a 👍 [reaction](https://blog.github.com/2016-03-10-add-reactions-to-pull-requests-issues-and-comments/) to the original issue to help the community and maintainers prioritize this request * Please do not leave "+1" or "me too" comments, they generate extra noise for issue followers and do not help prioritize the request * If you are interested in working on this issue or have submitted a pull request, please leave a comment <!--- Thank you for keeping this note for the community ---> ### Terraform (and AzureRM Provider) Version ``` Terraform v0.12.9 + provider.aws v2.29.0 + provider.azurerm v1.34.0 + provider.null v2.1.2 ``` ### Affected Resource(s) * `azurerm_lb` ### Terraform Configuration Files At first, we create a load balancer with three frontend IP configs. Note that the third one has a public IP *prefix*, but the first two have public IP *addresses*. 
```json "azurerm_lb": { "my-lb1": { "frontend_ip_configuration": [ { "name": "my-lb1-ms4-frontend", "public_ip_address_id": "${azurerm_public_ip.my-ip-ms4.id}" }, { "name": "my-lb1-ms5-frontend", "public_ip_address_id": "${azurerm_public_ip.my-ip-ms5.id}" }, { "name": "my-snat", "public_ip_prefix_id": "${azurerm_public_ip_prefix.my.id}" } ], "location": "centraluseuap", "name": "my-lb1", "resource_group_name": "${azurerm_resource_group.my.name}", "sku": "Standard" } }, ``` Then, we modify the config, removing the first frontend IP config, leaving only one *prefix* and one *address*: ```json "azurerm_lb": { "my-lb1": { "frontend_ip_configuration": [ { "name": "my-lb1-ms5-frontend", "public_ip_address_id": "${azurerm_public_ip.my-ip-ms5.id}" }, { "name": "my-snat", "public_ip_prefix_id": "${azurerm_public_ip_prefix.my.id}" } ], "location": "centraluseuap", "name": "my-lb1", "resource_group_name": "${azurerm_resource_group.my.name}", "sku": "Standard" } }, ``` ### Debug Output ``` Terraform will perform the following actions: # azurerm_lb.my-lb1 will be updated in-place ~ resource "azurerm_lb" "my-lb1" { id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1" location = "centraluseuap" name = "my-lb1" private_ip_addresses = [] resource_group_name = "my" sku = "Standard" tags = {} ~ frontend_ip_configuration { inbound_nat_rules = [] load_balancer_rules = [ "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms4-ws7", "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms4-ws8", ] ~ name = "my-lb1-ms4-frontend" -> "my-lb1-ms5-frontend" outbound_rules = [] private_ip_address_allocation = "Dynamic" ~ public_ip_address_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms4" -> 
"/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms5" zones = [] } ~ frontend_ip_configuration { inbound_nat_rules = [] load_balancer_rules = [ "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms5-ws10", "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/loadBalancingRules/my-lb1-ms5-ws9", ] ~ name = "my-lb1-ms5-frontend" -> "my-snat" outbound_rules = [] private_ip_address_allocation = "Dynamic" public_ip_address_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms5" + public_ip_prefix_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPPrefixes/my" zones = [] } - frontend_ip_configuration { - inbound_nat_rules = [] -> null - load_balancer_rules = [] -> null - name = "my-snat" -> null - outbound_rules = [ - "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/outboundRules/my-lb1", ] -> null - private_ip_address_allocation = "Dynamic" -> null - public_ip_prefix_id = "/subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPPrefixes/my" -> null - zones = [] -> null } } ``` ... 
``` Error: Error Creating/Updating Load Balancer "my-lb1" (Resource Group "my"): network.LoadBalancersClient#CreateOrUpdate: Failure sending request: StatusCode=400 -- Original Error: Code="LoadBalancerFrontendIPConfigurationCannotReferencePublicIPAddressAndPublicIPPrefix" Message="Load Balancer Frontend IP Configuration /subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/loadBalancers/my-lb1/frontendIPConfigurations/my-snat cannot reference Public IP Address /subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPAddresses/my-ip-ms4 and Public IP Prefix /subscriptions/subid-xxxx/resourceGroups/my/providers/Microsoft.Network/publicIPPrefixes/my" Details=[] ``` ### Panic Output <!--- If Terraform produced a panic, please provide a link to a GitHub Gist containing the output of the `crash.log`. ---> ### Expected Behavior Preferred behavior: Terraform realizes that I deleted one of the three frontend IP configurations, and deletes that one, without touching the other two. Acceptable behavior: Terraform doesn't realize I deleted one of the configs; it just does a straight diff and ends up deleting the last/third config, and "shifting" the settings of the first two configs one down. So 3 becomes 2, 2 becomes 1, and 1 is gone. ### Actual Behavior Terraform doesn't realize I deleted a config, so it tries to do what I said above in "acceptable behavior." However, it appears to miss the `public_ip_prefix`. That causes it to try and add a `public_ip_prefix` to a config that already has a `public_ip_address`. Azure, rightfully, refuses to do so. ### Steps to Reproduce 1. Use the first config above 1. `terraform apply` 1. Use the second config above 1. `terraform apply` ### Important Factoids <!--- Are there anything atypical about your accounts that we should know? For example: Running in a Azure China/Germany/Government? 
---> ### References <!--- Information about referencing Github Issues: https://help.github.com/articles/basic-writing-and-formatting-syntax/#referencing-issues-and-pull-requests Are there any other GitHub issues (open or closed) or pull requests that should be linked here? Such as vendor documentation? ---> * May be related to #1622
label: non_process
text:
error deleting non last load balancer frontend ip config please note the following potential times when an issue might be in terraform core or resource ordering issues and issues issues issues spans resources across multiple providers if you are running into one of these scenarios we recommend opening an issue in the instead community note please vote on this issue by adding a 👍 to the original issue to help the community and maintainers prioritize this request please do not leave or me too comments they generate extra noise for issue followers and do not help prioritize the request if you are interested in working on this issue or have submitted a pull request please leave a comment terraform and azurerm provider version terraform provider aws provider azurerm provider null affected resource s azurerm lb terraform configuration files at first we create a load balancer with three frontend ip configs note that the third one has a public ip prefix but the first two have public ip addresses json azurerm lb my frontend ip configuration name my frontend public ip address id azurerm public ip my ip id name my frontend public ip address id azurerm public ip my ip id name my snat public ip prefix id azurerm public ip prefix my id location centraluseuap name my resource group name azurerm resource group my name sku standard then we modify the config removing the first frontend ip config leaving only one prefix and one address json azurerm lb my frontend ip configuration name my frontend public ip address id azurerm public ip my ip id name my snat public ip prefix id azurerm public ip prefix my id location centraluseuap name my resource group name azurerm resource group my name sku standard debug output terraform will perform the following actions azurerm lb my will be updated in place resource azurerm lb my id subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my location centraluseuap name my private ip addresses resource group name my sku 
standard tags frontend ip configuration inbound nat rules load balancer rules subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my loadbalancingrules my subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my loadbalancingrules my name my frontend my frontend outbound rules private ip address allocation dynamic public ip address id subscriptions subid xxxx resourcegroups my providers microsoft network publicipaddresses my ip subscriptions subid xxxx resourcegroups my providers microsoft network publicipaddresses my ip zones frontend ip configuration inbound nat rules load balancer rules subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my loadbalancingrules my subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my loadbalancingrules my name my frontend my snat outbound rules private ip address allocation dynamic public ip address id subscriptions subid xxxx resourcegroups my providers microsoft network publicipaddresses my ip public ip prefix id subscriptions subid xxxx resourcegroups my providers microsoft network publicipprefixes my zones frontend ip configuration inbound nat rules null load balancer rules null name my snat null outbound rules subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my outboundrules my null private ip address allocation dynamic null public ip prefix id subscriptions subid xxxx resourcegroups my providers microsoft network publicipprefixes my null zones null error error creating updating load balancer my resource group my network loadbalancersclient createorupdate failure sending request statuscode original error code loadbalancerfrontendipconfigurationcannotreferencepublicipaddressandpublicipprefix message load balancer frontend ip configuration subscriptions subid xxxx resourcegroups my providers microsoft network loadbalancers my frontendipconfigurations my snat 
cannot reference public ip address subscriptions subid xxxx resourcegroups my providers microsoft network publicipaddresses my ip and public ip prefix subscriptions subid xxxx resourcegroups my providers microsoft network publicipprefixes my details panic output expected behavior preferred behavior terraform realizes that i deleted one of the three frontend ip configurations and deletes that one without touching the other two acceptable behavior terraform doesn t realize i deleted one of the configs it just does a straight diff and ends up deleting the last third config and shifting the settings of the first two configs one down so becomes becomes and is gone actual behavior terraform doesn t realize i deleted a config so it tries to do what i said above in acceptable behavior however it appears to miss the public ip prefix that causes it to try and add a public ip prefix to a config that already has a public ip address azure rightfully refuses to do so steps to reproduce use the first config above terraform apply use the second config above terraform apply important factoids references information about referencing github issues are there any other github issues open or closed or pull requests that should be linked here such as vendor documentation may be related to
binary_label: 0
Unnamed: 0: 19,233
id: 25,387,016,145
type: IssuesEvent
created_at: 2022-11-21 22:55:44
repo: open-telemetry/opentelemetry-collector-contrib
repo_url: https://api.github.com/repos/open-telemetry/opentelemetry-collector-contrib
action: closed
title: New component: Span Event Filter Processor
labels: priority:p2 processor/transform pkg/ottl
body: **Is your feature request related to a problem? Please describe.** While [span events](https://opentelemetry.io/docs/instrumentation/go/manual/#events) can be useful in some circumstances, other times they can be noisy or undesired. Some things like service meshes (Envoy, Istio) and auto-instrumentation (gRPC) attach a lot of span events, and there is currently no way to filter them out of traces. **Describe the solution you'd like** A Span Event Filter Processor would provide the ability to filter out span events based on attributes and names. **Describe alternatives you've considered** Attempting to drop span events from all possible instrumentations is not feasible. Since it's also not possible (yet?) to drop individual spans, dropping the trace entirely or sampling more heavily are other options... but that results in losing more valuable data to filter out unwanted data. **Additional context** There is already a [(now closed) PR ](https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/5716) that I would recommend reviving for this purpose.
index: 1.0
text_combine: New component: Span Event Filter Processor - **Is your feature request related to a problem? Please describe.** While [span events](https://opentelemetry.io/docs/instrumentation/go/manual/#events) can be useful in some circumstances, other times they can be noisy or undesired. Some things like service meshes (Envoy, Istio) and auto-instrumentation (gRPC) attach a lot of span events, and there is currently no way to filter them out of traces. **Describe the solution you'd like** A Span Event Filter Processor would provide the ability to filter out span events based on attributes and names. **Describe alternatives you've considered** Attempting to drop span events from all possible instrumentations is not feasible. Since it's also not possible (yet?) to drop individual spans, dropping the trace entirely or sampling more heavily are other options... but that results in losing more valuable data to filter out unwanted data. **Additional context** There is already a [(now closed) PR ](https://github.com/open-telemetry/opentelemetry-collector-contrib/pull/5716) that I would recommend reviving for this purpose.
label: process
text: new component span event filter processor is your feature request related to a problem please describe while can be useful in some circumstances other times they can be noisy or undesired some things like service meshes envoy istio and auto instrumentation grpc attach a lot of span events and there is currently no way to filter them out of traces describe the solution you d like a span event filter processor would provide the ability to filter out span events based on attributes and names describe alternatives you ve considered attempting to drop span events from all possible instrumentations is not feasible since it s also not possible yet to drop individual spans dropping the trace entirely or sampling more heavily are other options but that results in losing more valuable data to filter out unwanted data additional context there is already a that i would recommend reviving for this purpose
binary_label: 1
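Across the records, `label` and `binary_label` move together: rows labeled "process" carry `binary_label` 1 and rows labeled "non_process" carry 0. A hypothetical helper encoding that inferred mapping (the dataset's own derivation is not shown):

```python
def to_binary_label(label: str) -> int:
    """Map the dataset's two label classes to the binary_label column.

    Mapping inferred from the records above: "process" -> 1, "non_process" -> 0.
    """
    mapping = {"process": 1, "non_process": 0}
    if label not in mapping:
        raise ValueError(f"unexpected label: {label!r}")
    return mapping[label]
```

Raising on unknown strings, rather than silently defaulting to 0, keeps a typo in the `label` column from being counted as a "non_process" example.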
Unnamed: 0: 12,678
id: 15,046,794,927
type: IssuesEvent
created_at: 2021-02-03 07:59:24
repo: GoogleCloudPlatform/fda-mystudies
repo_url: https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
action: closed
title: UI issue: forgot password label uneven in PM
labels: Bug P1 Process: Dev Process: Fixed Process: Tested QA Process: Tested dev
body: Forgot password text in PM is overlapping **To Reproduce** Steps to reproduce the behavior: 1. Launch PM app 2. In sign-in screen check Forgot password label 3. Forgot password label is uneven in the UI **Expected behavior** Forgot password label to be in a single line.
index: 4.0
text_combine: UI issue: forgot password label uneven in PM - Forgot password text in PM is overlapping **To Reproduce** Steps to reproduce the behavior: 1. Launch PM app 2. In sign-in screen check Forgot password label 3. Forgot password label is uneven in the UI **Expected behavior** Forgot password label to be in a single line.
label: process
text: ui issue forgot password label uneven in pm forgot password text in pm is overlapping to reproduce steps to reproduce the behavior launch pm app in sign in screen check forgot password label forgot password label is uneven in the ui expected behavior forgot password label to be in a single line
binary_label: 1
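Comparing each record's `text_combine` with its `text` field suggests the cleaning step lowercases, strips URLs, markup, digits, and punctuation, and collapses whitespace. A hypothetical approximation of that step (the dataset's actual pipeline is not shown, and it evidently keeps some non-ASCII characters, such as emoji, that this sketch drops):

```python
import re

def clean_text(raw: str) -> str:
    """Approximate the title+body -> text normalization seen in the records above."""
    s = raw.lower()
    s = re.sub(r"https?://\S+", " ", s)    # drop bare URLs before punctuation strip
    s = re.sub(r"[^a-z\s]", " ", s)        # drop digits, punctuation, markdown syntax
    return re.sub(r"\s+", " ", s).strip()  # collapse runs of whitespace
```

Stripping URLs before the punctuation pass matters: otherwise `https://redis.io/commands/keys` would leak tokens like "https redis io commands keys" into the cleaned text, which the records above do not contain.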
Unnamed: 0: 23,411
id: 7,327,726,463
type: IssuesEvent
created_at: 2018-03-04 13:41:01
repo: rust-lang/rust
repo_url: https://api.github.com/repos/rust-lang/rust
action: closed
title: rustc_llvm build failure
labels: A-rustbuild
body:
I was trying to build the latest rustc from the git repo @ 4a316e7483f73ba20c0a0d2abd73d3b9da66bf2b via ````./x.py build````. ```` Compiling rustc_llvm v0.0.0 (file:///home/matthias/RUST/rust_build/src/librustc_llvm) error: failed to run custom build command for `rustc_llvm v0.0.0 (file:///home/matthias/RUST/rust_build/src/librustc_llvm)` process didn't exit successfully: `/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0-rustc/release/build/rustc_llvm-50548538c501b828/build-script-build` (exit code: 101) --- stdout cargo:rerun-if-changed=/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/llvm/build/bin/llvm-config cargo:rerun-if-env-changed=LLVM_CONFIG cargo:rustc-cfg=llvm_component="aarch64" cargo:rustc-cfg=llvm_component="arm" cargo:rustc-cfg=llvm_component="asmparser" cargo:rustc-cfg=llvm_component="bitreader" cargo:rustc-cfg=llvm_component="bitwriter" cargo:rustc-cfg=llvm_component="hexagon" cargo:rustc-cfg=llvm_component="instrumentation" cargo:rustc-cfg=llvm_component="interpreter" cargo:rustc-cfg=llvm_component="ipo" cargo:rustc-cfg=llvm_component="linker" cargo:rustc-cfg=llvm_component="lto" cargo:rustc-cfg=llvm_component="mcjit" cargo:rustc-cfg=llvm_component="mips" cargo:rustc-cfg=llvm_component="msp430" cargo:rustc-cfg=llvm_component="nvptx" cargo:rustc-cfg=llvm_component="powerpc" cargo:rustc-cfg=llvm_component="sparc" cargo:rustc-cfg=llvm_component="systemz" cargo:rustc-cfg=llvm_component="x86" cargo:rerun-if-changed-env=LLVM_RUSTLLVM cargo:rerun-if-changed=../rustllvm/RustWrapper.cpp cargo:rerun-if-changed=../rustllvm/Linker.cpp cargo:rerun-if-changed=../rustllvm/rustllvm.h cargo:rerun-if-changed=../rustllvm/llvm-rebuild-trigger cargo:rerun-if-changed=../rustllvm/README cargo:rerun-if-changed=../rustllvm/PassWrapper.cpp cargo:rerun-if-changed=../rustllvm/ArchiveWrapper.cpp cargo:rerun-if-changed=../rustllvm/.editorconfig TARGET = Some("x86_64-unknown-linux-gnu") OPT_LEVEL = Some("2") TARGET = 
Some("x86_64-unknown-linux-gnu") HOST = Some("x86_64-unknown-linux-gnu") TARGET = Some("x86_64-unknown-linux-gnu") TARGET = Some("x86_64-unknown-linux-gnu") HOST = Some("x86_64-unknown-linux-gnu") CXX_x86_64-unknown-linux-gnu = Some("c++") TARGET = Some("x86_64-unknown-linux-gnu") HOST = Some("x86_64-unknown-linux-gnu") CXXFLAGS_x86_64-unknown-linux-gnu = Some("-ffunction-sections -fdata-sections -fPIC -m64") DEBUG = Some("false") running: "c++" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-m64" "-I/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/llvm/build/include" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-fPIC" "-fvisibility-inlines-hidden" "-Werror=date-time" "-std=c++11" "-Wall" "-W" "-Wno-unused-parameter" "-Wwrite-strings" "-Wcast-qual" "-Wno-missing-field-initializers" "-pedantic" "-Wno-long-long" "-Wno-maybe-uninitialized" "-Wdelete-non-virtual-dtor" "-Wno-comment" "-ffunction-sections" "-fdata-sections" "-O3" "-DNDEBUG" "-fno-exceptions" "-fno-rtti" "-D_GNU_SOURCE" "-D__STDC_CONSTANT_MACROS" "-D__STDC_FORMAT_MACROS" "-D__STDC_LIMIT_MACROS" "-DLLVM_COMPONENT_AARCH64" "-DLLVM_COMPONENT_ARM" "-DLLVM_COMPONENT_ASMPARSER" "-DLLVM_COMPONENT_BITREADER" "-DLLVM_COMPONENT_BITWRITER" "-DLLVM_COMPONENT_HEXAGON" "-DLLVM_COMPONENT_INSTRUMENTATION" "-DLLVM_COMPONENT_INTERPRETER" "-DLLVM_COMPONENT_IPO" "-DLLVM_COMPONENT_LINKER" "-DLLVM_COMPONENT_LTO" "-DLLVM_COMPONENT_MCJIT" "-DLLVM_COMPONENT_MIPS" "-DLLVM_COMPONENT_MSP430" "-DLLVM_COMPONENT_NVPTX" "-DLLVM_COMPONENT_POWERPC" "-DLLVM_COMPONENT_SPARC" "-DLLVM_COMPONENT_SYSTEMZ" "-DLLVM_COMPONENT_X86" "-DLLVM_RUSTLLVM" "-o" "/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0-rustc/x86_64-unknown-linux-gnu/release/build/rustc_llvm-c55d82589df89e26/out/../rustllvm/PassWrapper.o" "-c" "../rustllvm/PassWrapper.cpp" cargo:warning=../rustllvm/PassWrapper.cpp:29:10: fatal error: llvm/CodeGen/TargetSubtargetInfo.h: No such 
file or directory cargo:warning= #include "llvm/CodeGen/TargetSubtargetInfo.h" cargo:warning= ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ cargo:warning=compilation terminated. exit code: 1 --- stderr thread 'main' panicked at ' Internal error occurred: Command "c++" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-m64" "-I/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/llvm/build/include" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-fPIC" "-fvisibility-inlines-hidden" "-Werror=date-time" "-std=c++11" "-Wall" "-W" "-Wno-unused-parameter" "-Wwrite-strings" "-Wcast-qual" "-Wno-missing-field-initializers" "-pedantic" "-Wno-long-long" "-Wno-maybe-uninitialized" "-Wdelete-non-virtual-dtor" "-Wno-comment" "-ffunction-sections" "-fdata-sections" "-O3" "-DNDEBUG" "-fno-exceptions" "-fno-rtti" "-D_GNU_SOURCE" "-D__STDC_CONSTANT_MACROS" "-D__STDC_FORMAT_MACROS" "-D__STDC_LIMIT_MACROS" "-DLLVM_COMPONENT_AARCH64" "-DLLVM_COMPONENT_ARM" "-DLLVM_COMPONENT_ASMPARSER" "-DLLVM_COMPONENT_BITREADER" "-DLLVM_COMPONENT_BITWRITER" "-DLLVM_COMPONENT_HEXAGON" "-DLLVM_COMPONENT_INSTRUMENTATION" "-DLLVM_COMPONENT_INTERPRETER" "-DLLVM_COMPONENT_IPO" "-DLLVM_COMPONENT_LINKER" "-DLLVM_COMPONENT_LTO" "-DLLVM_COMPONENT_MCJIT" "-DLLVM_COMPONENT_MIPS" "-DLLVM_COMPONENT_MSP430" "-DLLVM_COMPONENT_NVPTX" "-DLLVM_COMPONENT_POWERPC" "-DLLVM_COMPONENT_SPARC" "-DLLVM_COMPONENT_SYSTEMZ" "-DLLVM_COMPONENT_X86" "-DLLVM_RUSTLLVM" "-o" "/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0-rustc/x86_64-unknown-linux-gnu/release/build/rustc_llvm-c55d82589df89e26/out/../rustllvm/PassWrapper.o" "-c" "../rustllvm/PassWrapper.cpp" with args "c++" did not execute successfully (status code exit code: 1). ', /home/matthias/.cargo/registry/src/github.com-1ecc6299db9ec823/cc-1.0.4/src/lib.rs:1984:5 note: Run with `RUST_BACKTRACE=1` for a backtrace. 
thread 'main' panicked at 'command did not execute successfully: "/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0/bin/cargo" "build" "--target" "x86_64-unknown-linux-gnu" "-j" "4" "--release" "--manifest-path" "/home/matthias/RUST/rust_build/src/librustc_trans/Cargo.toml" "--features" " jemalloc" "--message-format" "json" expected success, got: exit code: 101', bootstrap/compile.rs:1092:9 note: Run with `RUST_BACKTRACE=1` for a backtrace. failed to run: /home/matthias/RUST/rust_build/build/bootstrap/debug/bootstrap build Build completed unsuccessfully in 0:00:04 ````
1.0
rustc_llvm build failure - I was trying to build the latest rustc from the git repo @ 4a316e7483f73ba20c0a0d2abd73d3b9da66bf2b via ````./x.py build````. ```` Compiling rustc_llvm v0.0.0 (file:///home/matthias/RUST/rust_build/src/librustc_llvm) error: failed to run custom build command for `rustc_llvm v0.0.0 (file:///home/matthias/RUST/rust_build/src/librustc_llvm)` process didn't exit successfully: `/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0-rustc/release/build/rustc_llvm-50548538c501b828/build-script-build` (exit code: 101) --- stdout cargo:rerun-if-changed=/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/llvm/build/bin/llvm-config cargo:rerun-if-env-changed=LLVM_CONFIG cargo:rustc-cfg=llvm_component="aarch64" cargo:rustc-cfg=llvm_component="arm" cargo:rustc-cfg=llvm_component="asmparser" cargo:rustc-cfg=llvm_component="bitreader" cargo:rustc-cfg=llvm_component="bitwriter" cargo:rustc-cfg=llvm_component="hexagon" cargo:rustc-cfg=llvm_component="instrumentation" cargo:rustc-cfg=llvm_component="interpreter" cargo:rustc-cfg=llvm_component="ipo" cargo:rustc-cfg=llvm_component="linker" cargo:rustc-cfg=llvm_component="lto" cargo:rustc-cfg=llvm_component="mcjit" cargo:rustc-cfg=llvm_component="mips" cargo:rustc-cfg=llvm_component="msp430" cargo:rustc-cfg=llvm_component="nvptx" cargo:rustc-cfg=llvm_component="powerpc" cargo:rustc-cfg=llvm_component="sparc" cargo:rustc-cfg=llvm_component="systemz" cargo:rustc-cfg=llvm_component="x86" cargo:rerun-if-changed-env=LLVM_RUSTLLVM cargo:rerun-if-changed=../rustllvm/RustWrapper.cpp cargo:rerun-if-changed=../rustllvm/Linker.cpp cargo:rerun-if-changed=../rustllvm/rustllvm.h cargo:rerun-if-changed=../rustllvm/llvm-rebuild-trigger cargo:rerun-if-changed=../rustllvm/README cargo:rerun-if-changed=../rustllvm/PassWrapper.cpp cargo:rerun-if-changed=../rustllvm/ArchiveWrapper.cpp cargo:rerun-if-changed=../rustllvm/.editorconfig TARGET = Some("x86_64-unknown-linux-gnu") OPT_LEVEL = Some("2") TARGET = 
Some("x86_64-unknown-linux-gnu") HOST = Some("x86_64-unknown-linux-gnu") TARGET = Some("x86_64-unknown-linux-gnu") TARGET = Some("x86_64-unknown-linux-gnu") HOST = Some("x86_64-unknown-linux-gnu") CXX_x86_64-unknown-linux-gnu = Some("c++") TARGET = Some("x86_64-unknown-linux-gnu") HOST = Some("x86_64-unknown-linux-gnu") CXXFLAGS_x86_64-unknown-linux-gnu = Some("-ffunction-sections -fdata-sections -fPIC -m64") DEBUG = Some("false") running: "c++" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-m64" "-I/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/llvm/build/include" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-fPIC" "-fvisibility-inlines-hidden" "-Werror=date-time" "-std=c++11" "-Wall" "-W" "-Wno-unused-parameter" "-Wwrite-strings" "-Wcast-qual" "-Wno-missing-field-initializers" "-pedantic" "-Wno-long-long" "-Wno-maybe-uninitialized" "-Wdelete-non-virtual-dtor" "-Wno-comment" "-ffunction-sections" "-fdata-sections" "-O3" "-DNDEBUG" "-fno-exceptions" "-fno-rtti" "-D_GNU_SOURCE" "-D__STDC_CONSTANT_MACROS" "-D__STDC_FORMAT_MACROS" "-D__STDC_LIMIT_MACROS" "-DLLVM_COMPONENT_AARCH64" "-DLLVM_COMPONENT_ARM" "-DLLVM_COMPONENT_ASMPARSER" "-DLLVM_COMPONENT_BITREADER" "-DLLVM_COMPONENT_BITWRITER" "-DLLVM_COMPONENT_HEXAGON" "-DLLVM_COMPONENT_INSTRUMENTATION" "-DLLVM_COMPONENT_INTERPRETER" "-DLLVM_COMPONENT_IPO" "-DLLVM_COMPONENT_LINKER" "-DLLVM_COMPONENT_LTO" "-DLLVM_COMPONENT_MCJIT" "-DLLVM_COMPONENT_MIPS" "-DLLVM_COMPONENT_MSP430" "-DLLVM_COMPONENT_NVPTX" "-DLLVM_COMPONENT_POWERPC" "-DLLVM_COMPONENT_SPARC" "-DLLVM_COMPONENT_SYSTEMZ" "-DLLVM_COMPONENT_X86" "-DLLVM_RUSTLLVM" "-o" "/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0-rustc/x86_64-unknown-linux-gnu/release/build/rustc_llvm-c55d82589df89e26/out/../rustllvm/PassWrapper.o" "-c" "../rustllvm/PassWrapper.cpp" cargo:warning=../rustllvm/PassWrapper.cpp:29:10: fatal error: llvm/CodeGen/TargetSubtargetInfo.h: No such 
file or directory cargo:warning= #include "llvm/CodeGen/TargetSubtargetInfo.h" cargo:warning= ^~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~ cargo:warning=compilation terminated. exit code: 1 --- stderr thread 'main' panicked at ' Internal error occurred: Command "c++" "-O2" "-ffunction-sections" "-fdata-sections" "-fPIC" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-m64" "-I/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/llvm/build/include" "-ffunction-sections" "-fdata-sections" "-fPIC" "-m64" "-fPIC" "-fvisibility-inlines-hidden" "-Werror=date-time" "-std=c++11" "-Wall" "-W" "-Wno-unused-parameter" "-Wwrite-strings" "-Wcast-qual" "-Wno-missing-field-initializers" "-pedantic" "-Wno-long-long" "-Wno-maybe-uninitialized" "-Wdelete-non-virtual-dtor" "-Wno-comment" "-ffunction-sections" "-fdata-sections" "-O3" "-DNDEBUG" "-fno-exceptions" "-fno-rtti" "-D_GNU_SOURCE" "-D__STDC_CONSTANT_MACROS" "-D__STDC_FORMAT_MACROS" "-D__STDC_LIMIT_MACROS" "-DLLVM_COMPONENT_AARCH64" "-DLLVM_COMPONENT_ARM" "-DLLVM_COMPONENT_ASMPARSER" "-DLLVM_COMPONENT_BITREADER" "-DLLVM_COMPONENT_BITWRITER" "-DLLVM_COMPONENT_HEXAGON" "-DLLVM_COMPONENT_INSTRUMENTATION" "-DLLVM_COMPONENT_INTERPRETER" "-DLLVM_COMPONENT_IPO" "-DLLVM_COMPONENT_LINKER" "-DLLVM_COMPONENT_LTO" "-DLLVM_COMPONENT_MCJIT" "-DLLVM_COMPONENT_MIPS" "-DLLVM_COMPONENT_MSP430" "-DLLVM_COMPONENT_NVPTX" "-DLLVM_COMPONENT_POWERPC" "-DLLVM_COMPONENT_SPARC" "-DLLVM_COMPONENT_SYSTEMZ" "-DLLVM_COMPONENT_X86" "-DLLVM_RUSTLLVM" "-o" "/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0-rustc/x86_64-unknown-linux-gnu/release/build/rustc_llvm-c55d82589df89e26/out/../rustllvm/PassWrapper.o" "-c" "../rustllvm/PassWrapper.cpp" with args "c++" did not execute successfully (status code exit code: 1). ', /home/matthias/.cargo/registry/src/github.com-1ecc6299db9ec823/cc-1.0.4/src/lib.rs:1984:5 note: Run with `RUST_BACKTRACE=1` for a backtrace. 
thread 'main' panicked at 'command did not execute successfully: "/home/matthias/RUST/rust_build/build/x86_64-unknown-linux-gnu/stage0/bin/cargo" "build" "--target" "x86_64-unknown-linux-gnu" "-j" "4" "--release" "--manifest-path" "/home/matthias/RUST/rust_build/src/librustc_trans/Cargo.toml" "--features" " jemalloc" "--message-format" "json" expected success, got: exit code: 101', bootstrap/compile.rs:1092:9 note: Run with `RUST_BACKTRACE=1` for a backtrace. failed to run: /home/matthias/RUST/rust_build/build/bootstrap/debug/bootstrap build Build completed unsuccessfully in 0:00:04 ````
non_process
rustc llvm build failure i was trying to build the latest rustc from the git repo via x py build compiling rustc llvm file home matthias rust rust build src librustc llvm error failed to run custom build command for rustc llvm file home matthias rust rust build src librustc llvm process didn t exit successfully home matthias rust rust build build unknown linux gnu rustc release build rustc llvm build script build exit code stdout cargo rerun if changed home matthias rust rust build build unknown linux gnu llvm build bin llvm config cargo rerun if env changed llvm config cargo rustc cfg llvm component cargo rustc cfg llvm component arm cargo rustc cfg llvm component asmparser cargo rustc cfg llvm component bitreader cargo rustc cfg llvm component bitwriter cargo rustc cfg llvm component hexagon cargo rustc cfg llvm component instrumentation cargo rustc cfg llvm component interpreter cargo rustc cfg llvm component ipo cargo rustc cfg llvm component linker cargo rustc cfg llvm component lto cargo rustc cfg llvm component mcjit cargo rustc cfg llvm component mips cargo rustc cfg llvm component cargo rustc cfg llvm component nvptx cargo rustc cfg llvm component powerpc cargo rustc cfg llvm component sparc cargo rustc cfg llvm component systemz cargo rustc cfg llvm component cargo rerun if changed env llvm rustllvm cargo rerun if changed rustllvm rustwrapper cpp cargo rerun if changed rustllvm linker cpp cargo rerun if changed rustllvm rustllvm h cargo rerun if changed rustllvm llvm rebuild trigger cargo rerun if changed rustllvm readme cargo rerun if changed rustllvm passwrapper cpp cargo rerun if changed rustllvm archivewrapper cpp cargo rerun if changed rustllvm editorconfig target some unknown linux gnu opt level some target some unknown linux gnu host some unknown linux gnu target some unknown linux gnu target some unknown linux gnu host some unknown linux gnu cxx unknown linux gnu some c target some unknown linux gnu host some unknown linux gnu cxxflags unknown 
linux gnu some ffunction sections fdata sections fpic debug some false running c ffunction sections fdata sections fpic ffunction sections fdata sections fpic i home matthias rust rust build build unknown linux gnu llvm build include ffunction sections fdata sections fpic fpic fvisibility inlines hidden werror date time std c wall w wno unused parameter wwrite strings wcast qual wno missing field initializers pedantic wno long long wno maybe uninitialized wdelete non virtual dtor wno comment ffunction sections fdata sections dndebug fno exceptions fno rtti d gnu source d stdc constant macros d stdc format macros d stdc limit macros dllvm component dllvm component arm dllvm component asmparser dllvm component bitreader dllvm component bitwriter dllvm component hexagon dllvm component instrumentation dllvm component interpreter dllvm component ipo dllvm component linker dllvm component lto dllvm component mcjit dllvm component mips dllvm component dllvm component nvptx dllvm component powerpc dllvm component sparc dllvm component systemz dllvm component dllvm rustllvm o home matthias rust rust build build unknown linux gnu rustc unknown linux gnu release build rustc llvm out rustllvm passwrapper o c rustllvm passwrapper cpp cargo warning rustllvm passwrapper cpp fatal error llvm codegen targetsubtargetinfo h no such file or directory cargo warning include llvm codegen targetsubtargetinfo h cargo warning cargo warning compilation terminated exit code stderr thread main panicked at internal error occurred command c ffunction sections fdata sections fpic ffunction sections fdata sections fpic i home matthias rust rust build build unknown linux gnu llvm build include ffunction sections fdata sections fpic fpic fvisibility inlines hidden werror date time std c wall w wno unused parameter wwrite strings wcast qual wno missing field initializers pedantic wno long long wno maybe uninitialized wdelete non virtual dtor wno comment ffunction sections fdata sections dndebug fno 
exceptions fno rtti d gnu source d stdc constant macros d stdc format macros d stdc limit macros dllvm component dllvm component arm dllvm component asmparser dllvm component bitreader dllvm component bitwriter dllvm component hexagon dllvm component instrumentation dllvm component interpreter dllvm component ipo dllvm component linker dllvm component lto dllvm component mcjit dllvm component mips dllvm component dllvm component nvptx dllvm component powerpc dllvm component sparc dllvm component systemz dllvm component dllvm rustllvm o home matthias rust rust build build unknown linux gnu rustc unknown linux gnu release build rustc llvm out rustllvm passwrapper o c rustllvm passwrapper cpp with args c did not execute successfully status code exit code home matthias cargo registry src github com cc src lib rs note run with rust backtrace for a backtrace thread main panicked at command did not execute successfully home matthias rust rust build build unknown linux gnu bin cargo build target unknown linux gnu j release manifest path home matthias rust rust build src librustc trans cargo toml features jemalloc message format json expected success got exit code bootstrap compile rs note run with rust backtrace for a backtrace failed to run home matthias rust rust build build bootstrap debug bootstrap build build completed unsuccessfully in
0
21,932
30,446,676,227
IssuesEvent
2023-07-15 19:08:43
h4sh5/pypi-auto-scanner
https://api.github.com/repos/h4sh5/pypi-auto-scanner
opened
pyutils 0.0.1b3 has 2 GuardDog issues
guarddog typosquatting silent-process-execution
https://pypi.org/project/pyutils https://inspector.pypi.io/project/pyutils ```{ "dependency": "pyutils", "version": "0.0.1b3", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: python-utils, pytils", "silent-process-execution": [ { "location": "pyutils-0.0.1b3/src/pyutils/exec_utils.py:204", "code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmppmldpvcg/pyutils" } }```
1.0
pyutils 0.0.1b3 has 2 GuardDog issues - https://pypi.org/project/pyutils https://inspector.pypi.io/project/pyutils ```{ "dependency": "pyutils", "version": "0.0.1b3", "result": { "issues": 2, "errors": {}, "results": { "typosquatting": "This package closely ressembles the following package names, and might be a typosquatting attempt: python-utils, pytils", "silent-process-execution": [ { "location": "pyutils-0.0.1b3/src/pyutils/exec_utils.py:204", "code": " subproc = subprocess.Popen(\n args,\n stdin=subprocess.DEVNULL,\n stdout=subprocess.DEVNULL,\n stderr=subprocess.DEVNULL,\n )", "message": "This package is silently executing an external binary, redirecting stdout, stderr and stdin to /dev/null" } ] }, "path": "/tmp/tmppmldpvcg/pyutils" } }```
process
pyutils has guarddog issues dependency pyutils version result issues errors results typosquatting this package closely ressembles the following package names and might be a typosquatting attempt python utils pytils silent process execution location pyutils src pyutils exec utils py code subproc subprocess popen n args n stdin subprocess devnull n stdout subprocess devnull n stderr subprocess devnull n message this package is silently executing an external binary redirecting stdout stderr and stdin to dev null path tmp tmppmldpvcg pyutils
1
36,606
8,133,597,539
IssuesEvent
2018-08-19 04:26:22
joomla/joomla-cms
https://api.github.com/repos/joomla/joomla-cms
closed
Workflow - States Ordering
J4 Issue New Feature Workflow No Code Attached Yet
### Steps to reproduce the issue With the default install you can change the sort order of the states. Here I changed them to be alphabetical <img width="383" alt="chrome_2018-08-10_14-05-29" src="https://user-images.githubusercontent.com/1296369/43959404-a0c90e32-9ca6-11e8-8969-288fc097516d.png"> However that order is not used in the select lists <img width="219" alt="chrome_2018-08-10_14-05-11" src="https://user-images.githubusercontent.com/1296369/43959445-c5dedbfc-9ca6-11e8-8095-09098ea48600.png">
1.0
Workflow - States Ordering - ### Steps to reproduce the issue With the default install you can change the sort order of the states. Here I changed them to be alphabetical <img width="383" alt="chrome_2018-08-10_14-05-29" src="https://user-images.githubusercontent.com/1296369/43959404-a0c90e32-9ca6-11e8-8969-288fc097516d.png"> However that order is not used in the select lists <img width="219" alt="chrome_2018-08-10_14-05-11" src="https://user-images.githubusercontent.com/1296369/43959445-c5dedbfc-9ca6-11e8-8095-09098ea48600.png">
non_process
workflow states ordering steps to reproduce the issue with the default install you can change the sort order of the states here i changed them to be alphabetical img width alt chrome src however that order is not used in the select lists img width alt chrome src
0
35,073
12,309,218,727
IssuesEvent
2020-05-12 08:36:19
benchabot/gitlabhq
https://api.github.com/repos/benchabot/gitlabhq
opened
WS-2016-0075 (Medium) detected in moment-2.13.0.min.js, moment-2.5.1.min.js
security vulnerability
## WS-2016-0075 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>moment-2.13.0.min.js</b>, <b>moment-2.5.1.min.js</b></p></summary> <p> <details><summary><b>moment-2.13.0.min.js</b></p></summary> <p>Parse, validate, manipulate, and display dates</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.13.0/moment.min.js">https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.13.0/moment.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/gitlabhq/node_modules/chart.js/samples/scales/time/line.html</p> <p>Path to vulnerable library: /gitlabhq/node_modules/chart.js/samples/scales/time/line.html</p> <p> Dependency Hierarchy: - :x: **moment-2.13.0.min.js** (Vulnerable Library) </details> <details><summary><b>moment-2.5.1.min.js</b></p></summary> <p>Parse, validate, manipulate, and display dates</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.5.1/moment.min.js">https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.5.1/moment.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/gitlabhq/node_modules/pikaday/examples/moment.html</p> <p>Path to vulnerable library: /gitlabhq/node_modules/pikaday/examples/moment.html</p> <p> Dependency Hierarchy: - :x: **moment-2.5.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/benchabot/gitlabhq/commit/4df6da43e04f9d7cdd5f6da47e8321a22941530a">4df6da43e04f9d7cdd5f6da47e8321a22941530a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Regular expression denial of service vulnerability in the moment package, by using a specific 40 characters long string in the "format" method. 
<p>Publish Date: 2016-10-24 <p>URL: <a href=https://github.com/moment/moment/commit/663f33e333212b3800b63592cd8e237ac8fabdb9>WS-2016-0075</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Change files</p> <p>Origin: <a href="https://github.com/moment/moment/commit/663f33e333212b3800b63592cd8e237ac8fabdb9">https://github.com/moment/moment/commit/663f33e333212b3800b63592cd8e237ac8fabdb9</a></p> <p>Release Date: 2016-10-24</p> <p>Fix Resolution: Replace or update the following files: month.js, lt.js</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
WS-2016-0075 (Medium) detected in moment-2.13.0.min.js, moment-2.5.1.min.js - ## WS-2016-0075 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Libraries - <b>moment-2.13.0.min.js</b>, <b>moment-2.5.1.min.js</b></p></summary> <p> <details><summary><b>moment-2.13.0.min.js</b></p></summary> <p>Parse, validate, manipulate, and display dates</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.13.0/moment.min.js">https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.13.0/moment.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/gitlabhq/node_modules/chart.js/samples/scales/time/line.html</p> <p>Path to vulnerable library: /gitlabhq/node_modules/chart.js/samples/scales/time/line.html</p> <p> Dependency Hierarchy: - :x: **moment-2.13.0.min.js** (Vulnerable Library) </details> <details><summary><b>moment-2.5.1.min.js</b></p></summary> <p>Parse, validate, manipulate, and display dates</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.5.1/moment.min.js">https://cdnjs.cloudflare.com/ajax/libs/moment.js/2.5.1/moment.min.js</a></p> <p>Path to dependency file: /tmp/ws-scm/gitlabhq/node_modules/pikaday/examples/moment.html</p> <p>Path to vulnerable library: /gitlabhq/node_modules/pikaday/examples/moment.html</p> <p> Dependency Hierarchy: - :x: **moment-2.5.1.min.js** (Vulnerable Library) </details> <p>Found in HEAD commit: <a href="https://github.com/benchabot/gitlabhq/commit/4df6da43e04f9d7cdd5f6da47e8321a22941530a">4df6da43e04f9d7cdd5f6da47e8321a22941530a</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Regular expression denial of service vulnerability in the moment package, by using a specific 40 characters long string in the "format" method. 
<p>Publish Date: 2016-10-24 <p>URL: <a href=https://github.com/moment/moment/commit/663f33e333212b3800b63592cd8e237ac8fabdb9>WS-2016-0075</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.3</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: Low </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Change files</p> <p>Origin: <a href="https://github.com/moment/moment/commit/663f33e333212b3800b63592cd8e237ac8fabdb9">https://github.com/moment/moment/commit/663f33e333212b3800b63592cd8e237ac8fabdb9</a></p> <p>Release Date: 2016-10-24</p> <p>Fix Resolution: Replace or update the following files: month.js, lt.js</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
ws medium detected in moment min js moment min js ws medium severity vulnerability vulnerable libraries moment min js moment min js moment min js parse validate manipulate and display dates library home page a href path to dependency file tmp ws scm gitlabhq node modules chart js samples scales time line html path to vulnerable library gitlabhq node modules chart js samples scales time line html dependency hierarchy x moment min js vulnerable library moment min js parse validate manipulate and display dates library home page a href path to dependency file tmp ws scm gitlabhq node modules pikaday examples moment html path to vulnerable library gitlabhq node modules pikaday examples moment html dependency hierarchy x moment min js vulnerable library found in head commit a href vulnerability details regular expression denial of service vulnerability in the moment package by using a specific characters long string in the format method publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact low for more information on scores click a href suggested fix type change files origin a href release date fix resolution replace or update the following files month js lt js step up your open source security game with whitesource
0
4,745
7,603,472,788
IssuesEvent
2018-04-29 14:55:17
nodejs/node
https://api.github.com/repos/nodejs/node
closed
Memory leak due to process.fork()?
child_process confirmed-bug
* **Version**: v8.5.0 * **Platform**: Linux I am experiencing memory leakage in an app that uses`process.fork()` a lot. These child processes get sent messages via `process.send()` with a `sendHandle` and are terminated later on. I did run into issues with memory management here. Some heap dumps show that even after the child-processes exited, the `ChildProcess`-instances are retained in the master process. I learned that using `subprocess.disconnect()` partly fixes that issue, but one more retainer can be found here: https://github.com/nodejs/node/blob/20259f90927a8b2923a0ad3210f6400d3a29966b/lib/net.js#L1665 How, where and when should this `socketList` be removed from the `_workers`-array?
1.0
Memory leak due to process.fork()? - * **Version**: v8.5.0 * **Platform**: Linux I am experiencing memory leakage in an app that uses`process.fork()` a lot. These child processes get sent messages via `process.send()` with a `sendHandle` and are terminated later on. I did run into issues with memory management here. Some heap dumps show that even after the child-processes exited, the `ChildProcess`-instances are retained in the master process. I learned that using `subprocess.disconnect()` partly fixes that issue, but one more retainer can be found here: https://github.com/nodejs/node/blob/20259f90927a8b2923a0ad3210f6400d3a29966b/lib/net.js#L1665 How, where and when should this `socketList` be removed from the `_workers`-array?
process
memory leak due to process fork version platform linux i am experiencing memory leakage in an app that uses process fork a lot these child processes get sent messages via process send with a sendhandle and are terminated later on i did run into issues with memory management here some heap dumps show that even after the child processes exited the childprocess instances are retained in the master process i learned that using subprocess disconnect partly fixes that issue but one more retainer can be found here how where and when should this socketlist be removed from the workers array
1
109,273
13,757,582,012
IssuesEvent
2020-10-06 21:56:27
omou-org/front-end
https://api.github.com/repos/omou-org/front-end
closed
A027 - Improper label to parent search in add student form
design good first issue
A027 Improper label to Parent Search in Add Student form Bug found by: Daniel Teams link: https://teams.microsoft.com/l/message/19:77d621b48fd54d03ba7155385d33cb54@thread.skype/1596253270314?tenantId=4757b031-c3cc-4b2a-ac2c-a2ef4731ed22&groupId=7f2d402f-bd5b-4f3d-875f-fb78e2889c1a&parentMessageId=1596253270314&teamName=Omou&channelName=Defects&createdTime=1596253270314
1.0
A027 - Improper label to parent search in add student form - A027 Improper label to Parent Search in Add Student form Bug found by: Daniel Teams link: https://teams.microsoft.com/l/message/19:77d621b48fd54d03ba7155385d33cb54@thread.skype/1596253270314?tenantId=4757b031-c3cc-4b2a-ac2c-a2ef4731ed22&groupId=7f2d402f-bd5b-4f3d-875f-fb78e2889c1a&parentMessageId=1596253270314&teamName=Omou&channelName=Defects&createdTime=1596253270314
non_process
improper label to parent search in add student form improper label to parent search in add student form bug found by daniel teams link
0
147,470
5,640,440,105
IssuesEvent
2017-04-06 16:23:28
YaleSTC/vesta
https://api.github.com/repos/YaleSTC/vesta
opened
Include building as a column in results views
complexity: 2 priority: 3 type: enhancement
While we added the building name to the suite name in many draw views in #561, we forgot the results views. This should probably be handled as a separate column in the table for export purposes.
1.0
Include building as a column in results views - While we added the building name to the suite name in many draw views in #561, we forgot the results views. This should probably be handled as a separate column in the table for export purposes.
non_process
include building as a column in results views while we added the building name to the suite name in many draw views in we forgot the results views this should probably be handled as a separate column in the table for export purposes
0
3,527
6,568,988,690
IssuesEvent
2017-09-09 00:46:34
MikaylaFischler/dorm-leds
https://api.github.com/repos/MikaylaFischler/dorm-leds
closed
Button Listener Process
control device process
Code a process to listen for button input when told to listen (enabled) and sleep when not enabled.
1.0
Button Listener Process - Code a process to listen for button input when told to listen (enabled) and sleep when not enabled.
process
button listener process code a process to listen for button input when told to listen enabled and sleep when not enabled
1
350,891
31,932,519,625
IssuesEvent
2023-09-19 08:23:50
unifyai/ivy
https://api.github.com/repos/unifyai/ivy
reopened
Fix elementwise.test_imag
Sub Task Failing Test
| | | |---|---| |jax|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5128573848/jobs/9225384241"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a> |paddle|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a>
1.0
Fix elementwise.test_imag - | | | |---|---| |jax|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a> |numpy|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a> |tensorflow|<a href="https://github.com/unifyai/ivy/actions/runs/5128573848/jobs/9225384241"><img src=https://img.shields.io/badge/-success-success></a> |torch|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a> |paddle|<a href="null"><img src=https://img.shields.io/badge/-failure-red></a>
non_process
fix elementwise test imag jax img src numpy img src tensorflow a href src torch img src paddle img src
0
539,364
15,787,425,289
IssuesEvent
2021-04-01 19:12:25
mroswell/list-N
https://api.github.com/repos/mroswell/list-N
closed
Invalid SQL on array fields
bug priority
<img width="841" alt="Screen Shot 2021-03-20 at 11 33 34 PM" src="https://user-images.githubusercontent.com/192568/111892798-fca9f180-89d4-11eb-872d-fcb35233c268.png"> <img width="1307" alt="Screen Shot 2021-03-20 at 11 36 06 PM" src="https://user-images.githubusercontent.com/192568/111892817-15b2a280-89d5-11eb-9ad9-011ff6dd6d3a.png">
1.0
Invalid SQL on array fields - <img width="841" alt="Screen Shot 2021-03-20 at 11 33 34 PM" src="https://user-images.githubusercontent.com/192568/111892798-fca9f180-89d4-11eb-872d-fcb35233c268.png"> <img width="1307" alt="Screen Shot 2021-03-20 at 11 36 06 PM" src="https://user-images.githubusercontent.com/192568/111892817-15b2a280-89d5-11eb-9ad9-011ff6dd6d3a.png">
non_process
invalid sql on array fields img width alt screen shot at pm src img width alt screen shot at pm src
0
12,247
14,744,114,731
IssuesEvent
2021-01-07 14:51:50
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Missing revenue- issue with Separate Child Account Transactions | Parent:1679
anc-process anp-important ant-bug ant-child/secondary has attachment
In GitLab by @kdjstudios on Jan 3, 2020, 08:43 **Submitted by:** "Leah Mitchell" <leah.mitchell@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/10126353 **Server:** Internal (Both) **Client/Site:** **Account:** **Issue:** We are majorly missing revenue in some parent/child setups. I would like to review on the phone to make it easier but for documentation purposes here are the details. This is the same issue I submitted a ticket for the other day (Ticket Number: 2019-12-23-36658) but the support team claims there isn’t anything wrong but here is the proof. 😊 Santa Rosa - Solano HSS https://answernet.sabilling.com/accounts/5515a971f0cf67dea20006bb Base rate on the parent allows for 1500 min. Sub accounts are set to sum (total) all minutes used and bill at the parent rate. Box is and always has been checked for ‘Use Parent account activities’. The client requested recently that we separate the sub transactions so today we checked the ‘separate child account transactions’ box for this billing cycle. However, now– it is treating it as if there was an allowance of 1500 min on EACH sub. **We have since turned this back off and reverted in order to bill correctly, but I have screen shots below of what it was attempting to do with these settings. Here is the proof: If you check the humanreadable for this sub (for example) Child Welfare Services SC9803, they had 1675 minutes used under 555099 but the draft invoice screen shot below shows that it’s being billed for only the 175 minutes OVER 1500. If we had used this draft invoice- their total new charges would have been $1678 and now that we have reverted and turned this feature back off, their total new charges are correct at $3,470 This is a HUGE difference that we are missing out on other accounts (like Cummings in my previous ticket) Below is also the corrected invoice after we turned off the ‘separate’ feature. 
I am confident this is occurring on other accounts that I have previously provided examples of (Cummins in Billings and HCB in El Paso) ![image](/uploads/0cd5895cb504b7d5b037823b8ca4704d/image.png) ![image](/uploads/189b945dfbbfec96943d10808b1acce3/image.png) ![image](/uploads/23a54fd8340290688860ad62b137ccad/image.png) Corrected invoice with ‘separate child account transactions’ turned back OFF. ![image](/uploads/f044bfe0f13cc4f1c406b7b91c9c7ead/image.png)
1.0
Missing revenue- issue with Separate Child Account Transactions | Parent:1679 - In GitLab by @kdjstudios on Jan 3, 2020, 08:43 **Submitted by:** "Leah Mitchell" <leah.mitchell@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/10126353 **Server:** Internal (Both) **Client/Site:** **Account:** **Issue:** We are majorly missing revenue in some parent/child setups. I would like to review on the phone to make it easier but for documentation purposes here are the details. This is the same issue I submitted a ticket for the other day (Ticket Number: 2019-12-23-36658) but the support team claims there isn’t anything wrong but here is the proof. 😊 Santa Rosa - Solano HSS https://answernet.sabilling.com/accounts/5515a971f0cf67dea20006bb Base rate on the parent allows for 1500 min. Sub accounts are set to sum (total) all minutes used and bill at the parent rate. Box is and always has been checked for ‘Use Parent account activities’. The client requested recently that we separate the sub transactions so today we checked the ‘separate child account transactions’ box for this billing cycle. However, now– it is treating it as if there was an allowance of 1500 min on EACH sub. **We have since turned this back off and reverted in order to bill correctly, but I have screen shots below of what it was attempting to do with these settings. Here is the proof: If you check the humanreadable for this sub (for example) Child Welfare Services SC9803, they had 1675 minutes used under 555099 but the draft invoice screen shot below shows that it’s being billed for only the 175 minutes OVER 1500. 
If we had used this draft invoice- their total new charges would have been $1678 and now that we have reverted and turned this feature back off, their total new charges are correct at $3,470 This is a HUGE difference that we are missing out on other accounts (like Cummings in my previous ticket) Below is also the corrected invoice after we turned off the ‘separate’ feature. I am confident this is occurring on other accounts that I have previously provided examples of (Cummins in Billings and HCB in El Paso) ![image](/uploads/0cd5895cb504b7d5b037823b8ca4704d/image.png) ![image](/uploads/189b945dfbbfec96943d10808b1acce3/image.png) ![image](/uploads/23a54fd8340290688860ad62b137ccad/image.png) Corrected invoice with ‘separate child account transactions’ turned back OFF. ![image](/uploads/f044bfe0f13cc4f1c406b7b91c9c7ead/image.png)
process
missing revenue issue with separate child account transactions parent in gitlab by kdjstudios on jan submitted by leah mitchell helpdesk server internal both client site account issue we are majorly missing revenue in some parent child setups i would like to review on the phone to make it easier but for documentation purposes here are the details this is the same issue i submitted a ticket for the other day ticket number but the support team claims there isn’t anything wrong but here is the proof 😊 santa rosa solano hss base rate on the parent allows for min sub accounts are set to sum total all minutes used and bill at the parent rate box is and always has been checked for ‘use parent account activities’ the client requested recently that we separate the sub transactions so today we checked the ‘separate child account transactions’ box for this billing cycle however now– it is treating it as if there was an allowance of min on each sub we have since turned this back off and reverted in order to bill correctly but i have screen shots below of what it was attempting to do with these settings here is the proof if you check the humanreadable for this sub for example child welfare services they had minutes used under but the draft invoice screen shot below shows that it’s being billed for only the minutes over if we had used this draft invoice their total new charges would have been and now that we have reverted and turned this feature back off their total new charges are correct at this is a huge difference that we are missing out on other accounts like cummings in my previous ticket below is also the corrected invoice after we turned off the ‘separate’ feature i am confident this is occurring on other accounts that i have previously provided examples of cummins in billings and hcb in el paso uploads image png uploads image png uploads image png corrected invoice with ‘separate child account transactions’ turned back off uploads image png
1
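The allowance arithmetic described in the billing record above can be sketched as follows. This is illustrative only: the 1500-minute allowance and the 1675-minute usage come from the ticket, while the second sub-account's 1200 minutes is an assumed figure added to show why the pooled and per-child calculations diverge.

```python
def billed_minutes(used_per_child, allowance, separate):
    """Return billable overage minutes for a parent/child account setup.

    Pooled (separate=False): one shared allowance across all sub-accounts,
    the behavior the ticket says is correct.
    Separate (separate=True): the allowance is applied to EACH sub-account,
    which is the revenue-losing behavior the ticket reports.
    """
    if separate:
        return sum(max(0, used - allowance) for used in used_per_child)
    return max(0, sum(used_per_child) - allowance)


usage = [1675, 1200]  # minutes per sub-account; 1200 is an illustrative value
assert billed_minutes(usage, 1500, separate=False) == 1375  # pooled: correct
assert billed_minutes(usage, 1500, separate=True) == 175    # per-child: the bug
```

With the per-child interpretation, 1375 billable minutes shrink to 175, which mirrors the $3,470-vs-$1,678 gap reported in the ticket.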
19,821
26,210,234,042
IssuesEvent
2023-01-04 05:21:22
AssetRipper/AssetRipper
https://api.github.com/repos/AssetRipper/AssetRipper
closed
Multiple AssetCollection for One Scene
enhancement scenes processing
## Context Currently, `AssetCollection`s can be a scene or a collection of other assets. However, multiple `AssetCollection`s cannot be combined into the same scene. Prior to asset processing, this was not an issue. However, prefab outlining requires this improvement (and others) before its development can continue. ## Justification `SerializedAssetCollection` should continue to be nearly immutable. New assets should not be added to those in order to reduce confusion about which assets are artificially created by AssetRipper. As such, it is necessary to allow a `SerializedAssetCollection` to be combined with one or more `ProcessedAssetCollection`s into a single scene during export. ## Design ```cs public sealed class SceneDefinition { public string Name { get; } public string Path { get; init; } public IReadOnlyList<AssetCollection> Collections { get; } public void AddCollection(AssetCollection collection); } class AssetCollection { public SceneDefinition? Scene { get; internal set; } } ```
1.0
Multiple AssetCollection for One Scene - ## Context Currently, `AssetCollection`s can be a scene or a collection of other assets. However, multiple `AssetCollection`s cannot be combined into the same scene. Prior to asset processing, this was not an issue. However, prefab outlining requires this improvement (and others) before its development can continue. ## Justification `SerializedAssetCollection` should continue to be nearly immutable. New assets should not be added to those in order to reduce confusion about which assets are artificially created by AssetRipper. As such, it is necessary to allow a `SerializedAssetCollection` to be combined with one or more `ProcessedAssetCollection`s into a single scene during export. ## Design ```cs public sealed class SceneDefinition { public string Name { get; } public string Path { get; init; } public IReadOnlyList<AssetCollection> Collections { get; } public void AddCollection(AssetCollection collection); } class AssetCollection { public SceneDefinition? Scene { get; internal set; } } ```
process
multiple assetcollection for one scene context currently assetcollection s can be a scene or a collection of other assets however multiple assetcollection s cannot be combined into the same scene prior to asset processing this was not an issue however prefab outlining requires this improvement and others before its development can continue justification serializedassetcollection should continue to be nearly immutable new assets should not be added to those in order to reduce confusion about which assets are artificially created by assetripper as such it is necessary to allow a serializedassetcollection to be combined with one or more processedassetcollection s into a single scene during export design cs public sealed class scenedefinition public string name get public string path get init public ireadonlylist collections get public void addcollection assetcollection collection class assetcollection public scenedefinition scene get internal set
1
661,464
22,055,621,577
IssuesEvent
2022-05-30 12:37:58
aiidateam/aiida-core
https://api.github.com/repos/aiidateam/aiida-core
closed
verdi config should work on configurations without profiles
type/bug priority/nice-to-have
### Describe the bug On an "empty" configuration (that was created automatically by AiiDA) ```json { "CONFIG_VERSION": { "CURRENT": 9, "OLDEST_COMPATIBLE": 9 }, "profiles": {} } ``` I'm getting the following ``` $ verdi config set warnings.development_version False --global Traceback (most recent call last): ... File "/.../aiida-core/aiida/cmdline/commands/cmd_config.py", line 126, in verdi_config_set config: Config = ctx.obj.config AttributeError: 'NoneType' object has no attribute 'config' ``` ### Steps to reproduce Steps to reproduce the behavior: 1. Take any AiiDA 2 installation 2. Point the `AIIDA_PATH` variable to an empty directory 3. Run `verdi config set warnings.development_version False` ### Expected behavior Setting a global config option should be possible, even on a config that does not yet contain any profile. ### Your environment - Operating system [e.g. Linux]: ubuntu - Python version [e.g. 3.7.1]: 3.9 - aiida-core version [e.g. 1.2.1]: 2.0.1.post0
1.0
verdi config should work on configurations without profiles - ### Describe the bug On an "empty" configuration (that was created automatically by AiiDA) ```json { "CONFIG_VERSION": { "CURRENT": 9, "OLDEST_COMPATIBLE": 9 }, "profiles": {} } ``` I'm getting the following ``` $ verdi config set warnings.development_version False --global Traceback (most recent call last): ... File "/.../aiida-core/aiida/cmdline/commands/cmd_config.py", line 126, in verdi_config_set config: Config = ctx.obj.config AttributeError: 'NoneType' object has no attribute 'config' ``` ### Steps to reproduce Steps to reproduce the behavior: 1. Take any AiiDA 2 installation 2. Point the `AIIDA_PATH` variable to an empty directory 3. Run `verdi config set warnings.development_version False` ### Expected behavior Setting a global config option should be possible, even on a config that does not yet contain any profile. ### Your environment - Operating system [e.g. Linux]: ubuntu - Python version [e.g. 3.7.1]: 3.9 - aiida-core version [e.g. 1.2.1]: 2.0.1.post0
non_process
verdi config should work on configurations without profiles describe the bug on an empty configuration that was created automatically by aiida json config version current oldest compatible profiles i m getting the following verdi config set warnings development version false global traceback most recent call last file aiida core aiida cmdline commands cmd config py line in verdi config set config config ctx obj config attributeerror nonetype object has no attribute config steps to reproduce steps to reproduce the behavior take any aiida installation point the aiida path variable to an empty directory run verdi config set warnings development version false expected behavior setting a global config option should be possible even on a config that does not yet contain any profile your environment operating system ubuntu python version aiida core version
0
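The traceback in the record above (`'NoneType' object has no attribute 'config'`) suggests a defensive pattern: fall back to an empty configuration when no profile context exists yet. The helper names below are hypothetical; aiida-core's actual fix may look different.

```python
def get_config(ctx_obj):
    # Fall back to an empty global config when no profile/context exists yet,
    # instead of dereferencing ctx.obj.config on a None context.
    if ctx_obj is None or getattr(ctx_obj, "config", None) is None:
        return {"profiles": {}, "globals": {}}
    return ctx_obj.config


def set_global_option(config, key, value):
    # Setting a global option must work even on a profile-less config.
    config.setdefault("globals", {})[key] = value
    return config


cfg = set_global_option(get_config(None), "warnings.development_version", False)
assert cfg["globals"]["warnings.development_version"] is False
```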
2,211
5,049,225,899
IssuesEvent
2016-12-20 15:21:03
CERNDocumentServer/cds
https://api.github.com/repos/CERNDocumentServer/cds
opened
Webhooks endpoint to stop/start task of a receiver
avc_processing
Implements endpoints like `/receivers/[id]/events/[id]/tasks/[id]` to be able to start/stop tasks.
1.0
Webhooks endpoint to stop/start task of a receiver - Implements endpoints like `/receivers/[id]/events/[id]/tasks/[id]` to be able to start/stop tasks.
process
webhooks endpoint to stop start task of a receiver implements endpoints like receivers events tasks to be able to start stop tasks
1
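The endpoint shape in the record above (`/receivers/[id]/events/[id]/tasks/[id]`) can be sketched as a small route parser. This is a framework-free illustration; the real service presumably wires the parsed ids to its actual task-control backend, which is not shown in the record.

```python
import re

# Route pattern taken from the issue: /receivers/[id]/events/[id]/tasks/[id]
ROUTE = re.compile(r"^/receivers/(?P<rid>[^/]+)/events/(?P<eid>[^/]+)/tasks/(?P<tid>[^/]+)$")


def dispatch(path, action):
    """Parse the task endpoint and return the operation to perform, or None."""
    match = ROUTE.match(path)
    if not match or action not in ("start", "stop"):
        return None
    return (action, match.group("rid"), match.group("eid"), match.group("tid"))


assert dispatch("/receivers/r1/events/e2/tasks/t3", "stop") == ("stop", "r1", "e2", "t3")
assert dispatch("/receivers/r1/events/e2", "stop") is None
```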
8,152
11,354,737,380
IssuesEvent
2020-01-24 18:20:25
googleapis/java-dlp
https://api.github.com/repos/googleapis/java-dlp
closed
Promote to GA
type: process
Package name: **google-cloud-dlp** Current release: **beta** Proposed release: **GA** ## Instructions Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue. ## Required - [x] 28 days elapsed since last beta release with new API surface - [x] Server API is GA - [x] Package API is stable, and we can commit to backward compatibility - [x] All dependencies are GA ## Optional - [ ] Most common / important scenarios have descriptive samples - [ ] Public manual methods have at least one usage sample each (excluding overloads) - [ ] Per-API README includes a full description of the API - [ ] Per-API README contains at least one “getting started” sample using the most common API scenario - [ ] Manual code has been reviewed by API producer - [ ] Manual code has been reviewed by a DPE responsible for samples - [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
1.0
Promote to GA - Package name: **google-cloud-dlp** Current release: **beta** Proposed release: **GA** ## Instructions Check the lists below, adding tests / documentation as required. Once all the "required" boxes are ticked, please create a release and close this issue. ## Required - [x] 28 days elapsed since last beta release with new API surface - [x] Server API is GA - [x] Package API is stable, and we can commit to backward compatibility - [x] All dependencies are GA ## Optional - [ ] Most common / important scenarios have descriptive samples - [ ] Public manual methods have at least one usage sample each (excluding overloads) - [ ] Per-API README includes a full description of the API - [ ] Per-API README contains at least one “getting started” sample using the most common API scenario - [ ] Manual code has been reviewed by API producer - [ ] Manual code has been reviewed by a DPE responsible for samples - [ ] 'Client Libraries' page is added to the product documentation in 'APIs & Reference' section of the product's documentation on Cloud Site
process
promote to ga package name google cloud dlp current release beta proposed release ga instructions check the lists below adding tests documentation as required once all the required boxes are ticked please create a release and close this issue required days elapsed since last beta release with new api surface server api is ga package api is stable and we can commit to backward compatibility all dependencies are ga optional most common important scenarios have descriptive samples public manual methods have at least one usage sample each excluding overloads per api readme includes a full description of the api per api readme contains at least one “getting started” sample using the most common api scenario manual code has been reviewed by api producer manual code has been reviewed by a dpe responsible for samples client libraries page is added to the product documentation in apis reference section of the product s documentation on cloud site
1
18,171
24,207,814,816
IssuesEvent
2022-09-25 13:36:40
sebastianbergmann/phpunit
https://api.github.com/repos/sebastianbergmann/phpunit
closed
Enable strict interpretation of scalar type declarations in process isolation code templates
type/refactoring feature/process-isolation
69e61bb1463803d5016ea57cb09a021fd50b4a82 enabled the strict interpretation of scalar type declarations in the code templates used by process isolation. This caused a problem with PHPDBG detailed in #3772. We need to investigate which statement(s) in these templates cause a problem when scalar type declarations are interpreted strictly. After fixing {this|these} issue(s) we can then enable the strict interpretation of scalar type declarations in the code templates again.
1.0
Enable strict interpretation of scalar type declarations in process isolation code templates - 69e61bb1463803d5016ea57cb09a021fd50b4a82 enabled the strict interpretation of scalar type declarations in the code templates used by process isolation. This caused a problem with PHPDBG detailed in #3772. We need to investigate which statement(s) in these templates cause a problem when scalar type declarations are interpreted strictly. After fixing {this|these} issue(s) we can then enable the strict interpretation of scalar type declarations in the code templates again.
process
enable strict interpretation of scalar type declarations in process isolation code templates enabled the strict interpretation of scalar type declarations in the code templates used by process isolation this caused a problem with phpdbg detailed in we need to investigate which statement s in these templates cause a problem when scalar type declarations are interpreted strictly after fixing this these issue s we can then enable the strict interpretation of scalar type declarations in the code templates again
1
22,032
30,546,178,921
IssuesEvent
2023-07-20 04:15:18
elastic/beats
https://api.github.com/repos/elastic/beats
closed
[decode_cef] Allow int64 (instead of int32) for bytesIn, bytesOut
enhancement Filebeat :Processors Team:Security-External Integrations
**Describe the enhancement:** The `decode_cef` processor is a fairly strict implementation of the [_Micro Focus Security ArcSight Common Event Format Version 25_]( https://archive.org/download/commoneventformatv25/CommonEventFormatV25.pdf) specification. In this document the CEF specification declares `in` (aka `bytesIn`) and `out` (aka `bytesOut`) as `Integer` types. Our parser could be more permissive and allow these fields to be treated as int64 values. It will be a slightly less strict implementation of the specification, but I think the spec should have originally marked these a `Long` types. https://github.com/elastic/beats/blob/57d649d3a32e3aa73cd903e3421edfe0cbcef67b/x-pack/filebeat/processors/decode_cef/cef/keys.go#L91-L98 **Describe a specific use case for the enhancement or feature:** Network devices with counters will be able to pass values larger than 2 GiB.
1.0
[decode_cef] Allow int64 (instead of int32) for bytesIn, bytesOut - **Describe the enhancement:** The `decode_cef` processor is a fairly strict implementation of the [_Micro Focus Security ArcSight Common Event Format Version 25_]( https://archive.org/download/commoneventformatv25/CommonEventFormatV25.pdf) specification. In this document the CEF specification declares `in` (aka `bytesIn`) and `out` (aka `bytesOut`) as `Integer` types. Our parser could be more permissive and allow these fields to be treated as int64 values. It will be a slightly less strict implementation of the specification, but I think the spec should have originally marked these a `Long` types. https://github.com/elastic/beats/blob/57d649d3a32e3aa73cd903e3421edfe0cbcef67b/x-pack/filebeat/processors/decode_cef/cef/keys.go#L91-L98 **Describe a specific use case for the enhancement or feature:** Network devices with counters will be able to pass values larger than 2 GiB.
process
allow instead of for bytesin bytesout describe the enhancement the decode cef processor is a fairly strict implementation of the specification in this document the cef specification declares in aka bytesin and out aka bytesout as integer types our parser could be more permissive and allow these fields to be treated as values it will be a slightly less strict implementation of the specification but i think the spec should have originally marked these a long types describe a specific use case for the enhancement or feature network devices with counters will be able to pass values larger than gib
1
87,954
25,260,513,481
IssuesEvent
2022-11-15 22:17:10
NVIDIA/spark-rapids
https://api.github.com/repos/NVIDIA/spark-rapids
closed
[BUG] AnsiCastOpSuite 340 failures
bug test build Spark 3.4+
Sub issue of #6987 **Describe the bug** 24 `Java.lang.NoSuchFieldException: ansiEnabled` errors in AnsiCastOpSuite unit tests **Steps/Code to reproduce bug** `mvn -Dbuildver=340 test -pl tests -Dsuites='com.nvidia.spark.rapids.AnsiCastOpSuite'` **Expected behavior** Test should pass cleanly
1.0
[BUG] AnsiCastOpSuite 340 failures - Sub issue of #6987 **Describe the bug** 24 `Java.lang.NoSuchFieldException: ansiEnabled` errors in AnsiCastOpSuite unit tests **Steps/Code to reproduce bug** `mvn -Dbuildver=340 test -pl tests -Dsuites='com.nvidia.spark.rapids.AnsiCastOpSuite'` **Expected behavior** Test should pass cleanly
non_process
ansicastopsuite failures sub issue of describe the bug java lang nosuchfieldexception ansienabled errors in ansicastopsuite unit tests steps code to reproduce bug mvn dbuildver test pl tests dsuites com nvidia spark rapids ansicastopsuite expected behavior test should pass cleanly
0
8,562
3,194,353,349
IssuesEvent
2015-09-30 11:39:17
ComputationalRadiationPhysics/picongpu
https://api.github.com/repos/ComputationalRadiationPhysics/picongpu
closed
Hypnos picongpu.profile.example does not work
documentation
Using the default picongpu profile on release 0.1.1 I got the following error when sourcing: ``` horny74@hypnos5:/bigdata/hplsim/external/horny74/picongpu$ source ~/picongpu.profile This module will set up environment variables for mpfr/3.1.2. This module will set up environment variables for mpc/1.0.1. This module will set up environment variables for gmp/5.1.1. This module will set up environment variables for PSM-Infinipath (QLogic). This module will set up environment variables for mpfr/3.1.2. This module will set up environment variables for mpc/1.0.1. This module will set up environment variables for gmp/5.1.1. This module will set up environment variables for mpfr/3.0.0. This module will set up environment variables for mpc/0.8.2. This module will set up environment variables for gmp/4.3.1. This module will set up environment variables for gcc/4.6.2. gcc/4.6.2(114):ERROR:150: Module 'gcc/4.6.2' conflicts with the currently loaded module(s) 'gcc/4.8.2' gcc/4.6.2(114):ERROR:102: Tcl command execution failed: conflict gcc This module will set up environment variables for cmake/3.0.1. This module will set up environment variables for the Message-Passing Interface environment openmpi/1.6.3. USE THIS MPI VERSION ON KEPLER NODES ONLY!!!! openmpi/1.6.3(39):ERROR:150: Module 'openmpi/1.6.3' conflicts with the currently loaded module(s) 'openmpi/1.8.0' openmpi/1.6.3(39):ERROR:102: Tcl command execution failed: conflict openmpi This module will set up environment variables for boost/1.54.0. This module will set up environment variables for cuda/6.5. This module will set up environment variables for mallocmc/2.0.1. This module will set up environment variables for pngwriter/0.5.4. This module will set up environment variables for hdf5-parallel/1.8.14. libsplash/1.2.3(7):ERROR:151: Module 'libsplash/1.2.3' depends on one of the module(s) 'openmpi/1.6.3' libsplash/1.2.3(7):ERROR:102: Tcl command execution failed: prereq openmpi/1.6.3 ```
1.0
Hypnos picongpu.profile.example does not work - Using the default picongpu profile on release 0.1.1 I got the following error when sourcing: ``` horny74@hypnos5:/bigdata/hplsim/external/horny74/picongpu$ source ~/picongpu.profile This module will set up environment variables for mpfr/3.1.2. This module will set up environment variables for mpc/1.0.1. This module will set up environment variables for gmp/5.1.1. This module will set up environment variables for PSM-Infinipath (QLogic). This module will set up environment variables for mpfr/3.1.2. This module will set up environment variables for mpc/1.0.1. This module will set up environment variables for gmp/5.1.1. This module will set up environment variables for mpfr/3.0.0. This module will set up environment variables for mpc/0.8.2. This module will set up environment variables for gmp/4.3.1. This module will set up environment variables for gcc/4.6.2. gcc/4.6.2(114):ERROR:150: Module 'gcc/4.6.2' conflicts with the currently loaded module(s) 'gcc/4.8.2' gcc/4.6.2(114):ERROR:102: Tcl command execution failed: conflict gcc This module will set up environment variables for cmake/3.0.1. This module will set up environment variables for the Message-Passing Interface environment openmpi/1.6.3. USE THIS MPI VERSION ON KEPLER NODES ONLY!!!! openmpi/1.6.3(39):ERROR:150: Module 'openmpi/1.6.3' conflicts with the currently loaded module(s) 'openmpi/1.8.0' openmpi/1.6.3(39):ERROR:102: Tcl command execution failed: conflict openmpi This module will set up environment variables for boost/1.54.0. This module will set up environment variables for cuda/6.5. This module will set up environment variables for mallocmc/2.0.1. This module will set up environment variables for pngwriter/0.5.4. This module will set up environment variables for hdf5-parallel/1.8.14. 
libsplash/1.2.3(7):ERROR:151: Module 'libsplash/1.2.3' depends on one of the module(s) 'openmpi/1.6.3' libsplash/1.2.3(7):ERROR:102: Tcl command execution failed: prereq openmpi/1.6.3 ```
non_process
hypnos picongpu profile example does not work using the default picongpu profile on release i got the following error when sourcing bigdata hplsim external picongpu source picongpu profile this module will set up environment variables for mpfr this module will set up environment variables for mpc this module will set up environment variables for gmp this module will set up environment variables for psm infinipath qlogic this module will set up environment variables for mpfr this module will set up environment variables for mpc this module will set up environment variables for gmp this module will set up environment variables for mpfr this module will set up environment variables for mpc this module will set up environment variables for gmp this module will set up environment variables for gcc gcc error module gcc conflicts with the currently loaded module s gcc gcc error tcl command execution failed conflict gcc this module will set up environment variables for cmake this module will set up environment variables for the message passing interface environment openmpi use this mpi version on kepler nodes only openmpi error module openmpi conflicts with the currently loaded module s openmpi openmpi error tcl command execution failed conflict openmpi this module will set up environment variables for boost this module will set up environment variables for cuda this module will set up environment variables for mallocmc this module will set up environment variables for pngwriter this module will set up environment variables for parallel libsplash error module libsplash depends on one of the module s openmpi libsplash error tcl command execution failed prereq openmpi
0
65,244
6,953,037,564
IssuesEvent
2017-12-06 19:33:50
pouchdb/pouchdb
https://api.github.com/repos/pouchdb/pouchdb
closed
Multiple $or inside $and not possible
bug has test case wontfix
_From @dolfje on September 8, 2016 16:37_ The code is written to reduce all $and into an array in mergeAndedSelectors. Although there is one use-case where this behaviour is not wanted. With the following query: ``` {$and: [ {$or: [{type: "dog"}, {type: "cat"}]}, {$or: [{owner: "lisa"}, {owner: "bob"}]} ]} ``` ps. There are also problems with userOperatorLosesPrecision and checkFieldsLogicallySound not supporting $and and $or. ps2. It would be awesome if this construction just triggers 2 queries and then merge them. _Copied from original issue: nolanlawson/pouchdb-find#214_
1.0
Multiple $or inside $and not possible - _From @dolfje on September 8, 2016 16:37_ The code is written to reduce all $and into an array in mergeAndedSelectors. Although there is one use-case where this behaviour is not wanted. With the following query: ``` {$and: [ {$or: [{type: "dog"}, {type: "cat"}]}, {$or: [{owner: "lisa"}, {owner: "bob"}]} ]} ``` ps. There are also problems with userOperatorLosesPrecision and checkFieldsLogicallySound not supporting $and and $or. ps2. It would be awesome if this construction just triggers 2 queries and then merge them. _Copied from original issue: nolanlawson/pouchdb-find#214_
non_process
multiple or inside and not possible from dolfje on september the code is written to reduce all and into an array in mergeandedselectors although there is one use case where this behaviour is not wanted with the following query and or or ps there are also problems with useroperatorlosesprecision and checkfieldslogicallysound not supporting and and or it would be awesome if this construction just triggers queries and then merge them copied from original issue nolanlawson pouchdb find
0
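The intended semantics of the failing selector in the record above (two `$or` clauses inside one `$and`) can be shown with a tiny evaluator. This handles only `$and`, `$or`, and direct equality, purely to illustrate what the query should match; it is not how pouchdb-find's index-based planner works.

```python
def matches(doc, sel):
    """Evaluate a minimal subset of a Mango-style selector against a document."""
    if "$and" in sel:
        return all(matches(doc, sub) for sub in sel["$and"])
    if "$or" in sel:
        return any(matches(doc, sub) for sub in sel["$or"])
    # Base case: every field named in the selector must match exactly.
    return all(doc.get(key) == value for key, value in sel.items())


sel = {"$and": [{"$or": [{"type": "dog"}, {"type": "cat"}]},
                {"$or": [{"owner": "lisa"}, {"owner": "bob"}]}]}
assert matches({"type": "dog", "owner": "bob"}, sel)
assert not matches({"type": "dog", "owner": "eve"}, sel)
```

A document must satisfy one branch of each `$or`; flattening the `$and` into a single merged selector loses that pairing, which is the behavior the issue describes.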
182,975
21,678,647,271
IssuesEvent
2022-05-09 02:24:03
turkdevops/thread-loader
https://api.github.com/repos/turkdevops/thread-loader
closed
CVE-2021-23337 (High) detected in lodash-4.17.20.tgz - autoclosed
security vulnerability
## CVE-2021-23337 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.20.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - :x: **lodash-4.17.20.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/turkdevops/thread-loader/commit/751b60aa5b87b55382bde58feeb3dbfbe2433fad">751b60aa5b87b55382bde58feeb3dbfbe2433fad</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function. <p>Publish Date: 2021-02-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23337>CVE-2021-23337</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c">https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c</a></p> <p>Release Date: 2021-02-15</p> <p>Fix Resolution: 4.17.21</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2021-23337 (High) detected in lodash-4.17.20.tgz - autoclosed - ## CVE-2021-23337 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>lodash-4.17.20.tgz</b></p></summary> <p>Lodash modular utilities.</p> <p>Library home page: <a href="https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz">https://registry.npmjs.org/lodash/-/lodash-4.17.20.tgz</a></p> <p>Path to dependency file: /package.json</p> <p>Path to vulnerable library: /node_modules/lodash/package.json</p> <p> Dependency Hierarchy: - :x: **lodash-4.17.20.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/turkdevops/thread-loader/commit/751b60aa5b87b55382bde58feeb3dbfbe2433fad">751b60aa5b87b55382bde58feeb3dbfbe2433fad</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Lodash versions prior to 4.17.21 are vulnerable to Command Injection via the template function. <p>Publish Date: 2021-02-15 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2021-23337>CVE-2021-23337</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.2</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: High - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c">https://github.com/lodash/lodash/commit/3469357cff396a26c363f8c1b5a91dde28ba4b1c</a></p> <p>Release Date: 2021-02-15</p> <p>Fix Resolution: 4.17.21</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in lodash tgz autoclosed cve high severity vulnerability vulnerable library lodash tgz lodash modular utilities library home page a href path to dependency file package json path to vulnerable library node modules lodash package json dependency hierarchy x lodash tgz vulnerable library found in head commit a href found in base branch master vulnerability details lodash versions prior to are vulnerable to command injection via the template function publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required high user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
252
2,672,475,805
IssuesEvent
2015-03-24 14:26:03
GsDevKit/gsDevKitHome
https://api.github.com/repos/GsDevKit/gsDevKitHome
opened
GsUpgrader download stalls
in process
Of course with all of the mucking about that went into fixing the [recently appearing hangs](https://github.com/GsDevKit/GsDevKit/issues/60) and then the [need to include the patch in GsUpgrader](https://github.com/GsDevKit/gsUpgrader/issues/14) ... now I've had a build hang during the download of GsUpgrader ... will wonders never cease!
1.0
GsUpgrader download stalls - Of course with all of the mucking about that went into fixing the [recently appearing hangs](https://github.com/GsDevKit/GsDevKit/issues/60) and then the [need to include the patch in GsUpgrader](https://github.com/GsDevKit/gsUpgrader/issues/14) ... now I've had a build hang during the download of GsUpgrader ... will wonders never cease!
process
gsupgrader download stalls of course with all of the mucking about that went into fixing the and then the now i ve had a build hang during the download of gsupgrader will wonders never cease
1
116,364
14,945,914,466
IssuesEvent
2021-01-26 05:28:41
microsoft/pyright
https://api.github.com/repos/microsoft/pyright
closed
Did not get the correct type of variable (calling pandas.read_csv)
as designed
**Describe the bug** When using `read_csv` function from `pandas`, setting `iterator` to `False` (default behavior) should return a `DataFrame` object, which should have all `DataFrame` methods. However, pyright seems to decide that it returns the `TextFileReader` object, which should only be the case when `iterator` is set to `True` or `chunksize` is not `None`. Since all my remaining code tries to analyze the data assuming the object is `DataFrame`, pyright shows that every line has errors. **To Reproduce** ``` import pandas as pd df = pd.read_csv("any.csv") print(df.head) ``` **Expected behavior** `df` should be a `DataFrame` object and accessing all `DataFrame` methods should not report as errors. **Screenshots or Code** ![image](https://user-images.githubusercontent.com/10877536/105803365-565ce500-5f6b-11eb-8916-211740040b6a.png) ![image](https://user-images.githubusercontent.com/10877536/105803331-3b8a7080-5f6b-11eb-9564-5295f1138e6e.png) **VS Code extension or command-line** Using coc-pyright extension (1.1.104)
1.0
Did not get the correct type of variable (calling pandas.read_csv) - **Describe the bug** When using `read_csv` function from `pandas`, setting `iterator` to `False` (default behavior) should return a `DataFrame` object, which should have all `DataFrame` methods. However, pyright seems to decide that it returns the `TextFileReader` object, which should only be the case when `iterator` is set to `True` or `chunksize` is not `None`. Since all my remaining code tries to analyze the data assuming the object is `DataFrame`, pyright shows that every line has errors. **To Reproduce** ``` import pandas as pd df = pd.read_csv("any.csv") print(df.head) ``` **Expected behavior** `df` should be a `DataFrame` object and accessing all `DataFrame` methods should not report as errors. **Screenshots or Code** ![image](https://user-images.githubusercontent.com/10877536/105803365-565ce500-5f6b-11eb-8916-211740040b6a.png) ![image](https://user-images.githubusercontent.com/10877536/105803331-3b8a7080-5f6b-11eb-9564-5295f1138e6e.png) **VS Code extension or command-line** Using coc-pyright extension (1.1.104)
non_process
did not get the correct type of variable calling pandas read csv describe the bug when using read csv function from pandas setting iterator to false default behavior should return a dataframe object which should have all dataframe methods however pyright seems to decide that it returns the textfilereader object which should only be the case when iterator is set to true or chunksize is not none since all my remaining code tries to analyze the data assuming the object is dataframe pyright shows that every line has errors to reproduce import pandas as pd df pd read csv any csv print df head expected behavior df should be a dataframe object and accessing all dataframe methods should not report as errors screenshots or code vs code extension or command line using coc pyright extension
0
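The pandas record above turns on the `read_csv` overloads: with default arguments it returns a `DataFrame`, while `iterator=True` or a non-`None` `chunksize` returns a `TextFileReader` that yields `DataFrame` chunks. A minimal sketch of the distinction (assuming pandas is installed; the sample CSV is illustrative):

```python
import io

import pandas as pd

csv_data = "a,b\n1,2\n3,4\n"

# Default call: returns a DataFrame, so DataFrame methods like .head are available.
df = pd.read_csv(io.StringIO(csv_data))
print(type(df).__name__)  # DataFrame

# With chunksize set (or iterator=True): returns a TextFileReader,
# which lazily yields DataFrame chunks instead of a single frame.
reader = pd.read_csv(io.StringIO(csv_data), chunksize=1)
print(type(reader).__name__)  # TextFileReader
for chunk in reader:
    assert isinstance(chunk, pd.DataFrame)
```

This is why the type checker's verdict depends on whether it narrows the `iterator`/`chunksize` arguments at the call site.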
8,811
11,912,513,138
IssuesEvent
2020-03-31 10:23:44
elastic/beats
https://api.github.com/repos/elastic/beats
closed
Clarify documentation about indexers and matchers
:Processors Team:Integrations Team:Platforms containers docs
We could clarify `add_kubernetes_metadata` indexers and matchers functionality: https://www.elastic.co/guide/en/beats/filebeat/6.0/add-kubernetes-metadata.html Indexers extract pod metadata from kubernetes events and store it in a map in memory. Matchers process beat events and enrich them with metadata from the in-memory map. We should try to avoid indexing-related words to avoid confusion with ES indexing, as this is totally unrelated.
1.0
Clarify documentation about indexers and matchers - We could clarify `add_kubernetes_metadata` indexers and matchers functionality: https://www.elastic.co/guide/en/beats/filebeat/6.0/add-kubernetes-metadata.html Indexers extract pod metadata from kubernetes events and store it in a map in memory. Matchers process beat events and enrich them with metadata from the in-memory map. We should try to avoid indexing-related words to avoid confusion with ES indexing, as this is totally unrelated.
process
clarify documentation about indexers and matchers we could clarify add kubernetes metadata indexers and matchers functionality indexers extract pod metadata from kubernetes events and store it in a map in memory matchers process beat events and enrich them with metadata from the in memory map we should try to avoid indexing related words to avoid confusion with es indexing as this is totally unrelated
1
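The beats record above describes how an indexer builds the in-memory metadata map and a matcher keys events into it. A sketch of how the two pair up in a `filebeat.yml` processor section, following the shape of the linked docs (treat the exact option names as assumptions against the documented version):

```yaml
processors:
  - add_kubernetes_metadata:
      # Indexer: extracts pod metadata from Kubernetes watch events
      # and stores it in a map keyed by pod name.
      indexers:
        - pod_name:
      # Matcher: pulls a lookup key out of each beat event and uses it
      # to find the matching entry in the in-memory map.
      matchers:
        - fields:
            lookup_fields: ["kubernetes.pod.name"]
```

Events whose `kubernetes.pod.name` field matches a map entry are enriched with that pod's metadata; no Elasticsearch indexing is involved at any point.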
348,162
24,907,782,530
IssuesEvent
2022-10-29 13:31:13
osbc2022/programming-language-timeline-website
https://api.github.com/repos/osbc2022/programming-language-timeline-website
closed
[DOCS] : wrong description in readme
bug documentation help wanted good first issue 🪙 point : 1
### ✏️ Description the `readme.md` has the wrong description of the website; it is of the to-do website, but it should be of the timeline website ### 📸 Screenshots ![image](https://user-images.githubusercontent.com/83657737/198730394-382f950f-4b6d-40a8-90d8-f4d844aa7951.png)
1.0
[DOCS] : wrong description in readme - ### ✏️ Description the `readme.md` has the wrong description of the website; it is of the to-do website, but it should be of the timeline website ### 📸 Screenshots ![image](https://user-images.githubusercontent.com/83657737/198730394-382f950f-4b6d-40a8-90d8-f4d844aa7951.png)
non_process
wrong description in readme ✏️ description the readme md has wrong description of the website it is of the to do website but it should be of timeline website 📸 screenshots
0
21,183
28,151,516,822
IssuesEvent
2023-04-03 02:00:07
lizhihao6/get-daily-arxiv-noti
https://api.github.com/repos/lizhihao6/get-daily-arxiv-noti
opened
New submissions for Mon, 3 Apr 23
event camera white balance isp compression image signal processing image signal process raw raw image events camera color contrast events AWB
## Keyword: events ### Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction - **Authors:** Delin Qu, Yizhen Lao, Zhigang Wang, Dong Wang, Bin Zhao, Xuelong Li - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18125 - **Pdf link:** https://arxiv.org/pdf/2303.18125 - **Abstract** This paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion. Existing methods suffer from two main drawbacks. Firstly, they face challenges in estimating the accurate correction field due to the uniform velocity assumption, leading to significant image correction errors under complex motion. Secondly, the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames. To tackle these challenges, we model the curvilinear trajectory of pixels analytically and propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixel. Besides, to reconstruct high-quality occlusion frames in dynamic scenes, we present a 3D video architecture that effectively Aligns and Aggregates multi-frame context, namely, RSA^2-Net. We evaluate our method across a broad range of cameras and video sequences, demonstrating its significant superiority. Specifically, our method surpasses the state-of-the-arts by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively. 
## Keyword: event camera There is no result ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB ### Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction - **Authors:** Delin Qu, Yizhen Lao, Zhigang Wang, Dong Wang, Bin Zhao, Xuelong Li - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18125 - **Pdf link:** https://arxiv.org/pdf/2303.18125 - **Abstract** This paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion. Existing methods suffer from two main drawbacks. Firstly, they face challenges in estimating the accurate correction field due to the uniform velocity assumption, leading to significant image correction errors under complex motion. Secondly, the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames. To tackle these challenges, we model the curvilinear trajectory of pixels analytically and propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixel. Besides, to reconstruct high-quality occlusion frames in dynamic scenes, we present a 3D video architecture that effectively Aligns and Aggregates multi-frame context, namely, RSA^2-Net. We evaluate our method across a broad range of cameras and video sequences, demonstrating its significant superiority. Specifically, our method surpasses the state-of-the-arts by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively. 
## Keyword: ISP ### DIME-FM: DIstilling Multimodal and Efficient Foundation Models - **Authors:** Ximeng Sun, Pengchuan Zhang, Peizhao Zhang, Hardik Shah, Kate Saenko, Xide Xia - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18232 - **Pdf link:** https://arxiv.org/pdf/2303.18232 - **Abstract** Large Vision-Language Foundation Models (VLFM), such as CLIP, ALIGN and Florence, are trained on large-scale datasets of image-caption pairs and achieve superior transferability and robustness on downstream tasks, but they are difficult to use in many practical applications due to their large size, high latency and fixed architectures. Unfortunately, recent work shows training a small custom VLFM for resource-limited applications is currently very difficult using public and smaller-scale data. In this paper, we introduce a new distillation mechanism (DIME-FM) that allows us to transfer the knowledge contained in large VLFMs to smaller, customized foundation models using a relatively small amount of inexpensive, unpaired images and sentences. We transfer the knowledge from the pre-trained CLIP-ViTL/14 model to a ViT-B/32 model, with only 40M public images and 28.4M unpaired public sentences. The resulting model "Distill-ViT-B/32" rivals the CLIP-ViT-B/32 model pre-trained on its private WiT dataset (400M image-text pairs): Distill-ViT-B/32 achieves similar results in terms of zero-shot and linear-probing performance on both ImageNet and the ELEVATER (20 image classification tasks) benchmarks. It also displays comparable robustness when evaluated on five datasets with natural distribution shifts from ImageNet. 
## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression ### Diff-ID: An Explainable Identity Difference Quantification Framework for DeepFake Detection - **Authors:** Chuer Yu, Xuhong Zhang, Yuxuan Duan, Senbo Yan, Zonghui Wang, Yang Xiang, Shouling Ji, Wenzhi Chen - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18174 - **Pdf link:** https://arxiv.org/pdf/2303.18174 - **Abstract** Despite the fact that DeepFake forgery detection algorithms have achieved impressive performance on known manipulations, they often face disastrous performance degradation when generalized to an unseen manipulation. Some recent works show improvement in generalization but rely on features fragile to image distortions such as compression. To this end, we propose Diff-ID, a concise and effective approach that explains and measures the identity loss induced by facial manipulations. When testing on an image of a specific person, Diff-ID utilizes an authentic image of that person as a reference and aligns them to the same identity-insensitive attribute feature space by applying a face-swapping generator. We then visualize the identity loss between the test and the reference image from the image differences of the aligned pairs, and design a custom metric to quantify the identity loss. The metric is then proved to be effective in distinguishing the forgery images from the real ones. Extensive experiments show that our approach achieves high detection performance on DeepFake images and state-of-the-art generalization ability to unknown forgery methods, while also being robust to image distortions. ## Keyword: RAW ### Whether and When does Endoscopy Domain Pretraining Make Sense? 
- **Authors:** Dominik Batić, Felix Holm, Ege Özsoy, Tobias Czempiel, Nassir Navab - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.17636 - **Pdf link:** https://arxiv.org/pdf/2303.17636 - **Abstract** Automated endoscopy video analysis is a challenging task in medical computer vision, with the primary objective of assisting surgeons during procedures. The difficulty arises from the complexity of surgical scenes and the lack of a sufficient amount of annotated data. In recent years, large-scale pretraining has shown great success in natural language processing and computer vision communities. These approaches reduce the need for annotated data, which is always a concern in the medical domain. However, most works on endoscopic video understanding use models pretrained on natural images, creating a domain gap between pretraining and finetuning. In this work, we investigate the need for endoscopy domain-specific pretraining based on downstream objectives. To this end, we first collect Endo700k, the largest publicly available corpus of endoscopic images, extracted from nine public Minimally Invasive Surgery (MIS) datasets. Endo700k comprises more than 700,000 unannotated raw images. Next, we introduce EndoViT, an endoscopy pretrained Vision Transformer (ViT). Through ablations, we demonstrate that domain-specific pretraining is particularly beneficial for more complex downstream tasks, such as Action Triplet Detection, and less effective and even unnecessary for simpler tasks, such as Surgical Phase Recognition. We will release both our code and pretrained models upon acceptance to facilitate further research in this direction. 
### GlyphDraw: Learning to Draw Chinese Characters in Image Synthesis Models Coherently - **Authors:** Jian Ma, Mingjun Zhao, Chen Chen, Ruichen Wang, Di Niu, Haonan Lu, Xiaodong Lin - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.17870 - **Pdf link:** https://arxiv.org/pdf/2303.17870 - **Abstract** Recent breakthroughs in the field of language-guided image generation have yielded impressive achievements, enabling the creation of high-quality and diverse images based on user instructions. Although the synthesis performance is fascinating, one significant limitation of current image generation models is their insufficient ability to generate coherent text within images, particularly for complex glyph structures like Chinese characters. To address this problem, we introduce GlyphDraw, a general learning framework aiming at endowing image generation models with the capacity to generate images embedded with coherent text. To the best of our knowledge, this is the first work in the field of image synthesis to address the generation of Chinese characters. % we first adopt the OCR technique to collect images with Chinese characters as training samples, and extract the text and locations as auxiliary information. We first sophisticatedly design the image-text dataset's construction strategy, then build our model specifically on a diffusion-based image generator and carefully modify the network structure to allow the model to learn drawing Chinese characters with the help of glyph and position information. Furthermore, we maintain the model's open-domain image synthesis capability by preventing catastrophic forgetting by using a variety of training techniques. Extensive qualitative and quantitative experiments demonstrate that our method not only produces accurate Chinese characters as in prompts, but also naturally blends the generated text into the background. 
Please refer to https://1073521013.github.io/glyph-draw.github.io ### Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction - **Authors:** Delin Qu, Yizhen Lao, Zhigang Wang, Dong Wang, Bin Zhao, Xuelong Li - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18125 - **Pdf link:** https://arxiv.org/pdf/2303.18125 - **Abstract** This paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion. Existing methods suffer from two main drawbacks. Firstly, they face challenges in estimating the accurate correction field due to the uniform velocity assumption, leading to significant image correction errors under complex motion. Secondly, the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames. To tackle these challenges, we model the curvilinear trajectory of pixels analytically and propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixel. Besides, to reconstruct high-quality occlusion frames in dynamic scenes, we present a 3D video architecture that effectively Aligns and Aggregates multi-frame context, namely, RSA^2-Net. We evaluate our method across a broad range of cameras and video sequences, demonstrating its significant superiority. Specifically, our method surpasses the state-of-the-arts by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively. 
### Efficient View Synthesis and 3D-based Multi-Frame Denoising with Multiplane Feature Representations - **Authors:** Thomas Tanay, Aleš Leonardis, Matteo Maggioni - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18139 - **Pdf link:** https://arxiv.org/pdf/2303.18139 - **Abstract** While current multi-frame restoration methods combine information from multiple input images using 2D alignment techniques, recent advances in novel view synthesis are paving the way for a new paradigm relying on volumetric scene representations. In this work, we introduce the first 3D-based multi-frame denoising method that significantly outperforms its 2D-based counterparts with lower computational requirements. Our method extends the multiplane image (MPI) framework for novel view synthesis by introducing a learnable encoder-renderer pair manipulating multiplane representations in feature space. The encoder fuses information across views and operates in a depth-wise manner while the renderer fuses information across depths and operates in a view-wise manner. The two modules are trained end-to-end and learn to separate depths in an unsupervised way, giving rise to Multiplane Feature (MPF) representations. Experiments on the Spaces and Real Forward-Facing datasets as well as on raw burst data validate our approach for view synthesis, multi-frame denoising, and view synthesis under noisy conditions. ## Keyword: raw image ### Whether and When does Endoscopy Domain Pretraining Make Sense? - **Authors:** Dominik Batić, Felix Holm, Ege Özsoy, Tobias Czempiel, Nassir Navab - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.17636 - **Pdf link:** https://arxiv.org/pdf/2303.17636 - **Abstract** Automated endoscopy video analysis is a challenging task in medical computer vision, with the primary objective of assisting surgeons during procedures. 
The difficulty arises from the complexity of surgical scenes and the lack of a sufficient amount of annotated data. In recent years, large-scale pretraining has shown great success in natural language processing and computer vision communities. These approaches reduce the need for annotated data, which is always a concern in the medical domain. However, most works on endoscopic video understanding use models pretrained on natural images, creating a domain gap between pretraining and finetuning. In this work, we investigate the need for endoscopy domain-specific pretraining based on downstream objectives. To this end, we first collect Endo700k, the largest publicly available corpus of endoscopic images, extracted from nine public Minimally Invasive Surgery (MIS) datasets. Endo700k comprises more than 700,000 unannotated raw images. Next, we introduce EndoViT, an endoscopy pretrained Vision Transformer (ViT). Through ablations, we demonstrate that domain-specific pretraining is particularly beneficial for more complex downstream tasks, such as Action Triplet Detection, and less effective and even unnecessary for simpler tasks, such as Surgical Phase Recognition. We will release both our code and pretrained models upon acceptance to facilitate further research in this direction.
2.0
New submissions for Mon, 3 Apr 23 - ## Keyword: events ### Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction - **Authors:** Delin Qu, Yizhen Lao, Zhigang Wang, Dong Wang, Bin Zhao, Xuelong Li - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18125 - **Pdf link:** https://arxiv.org/pdf/2303.18125 - **Abstract** This paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion. Existing methods suffer from two main drawbacks. Firstly, they face challenges in estimating the accurate correction field due to the uniform velocity assumption, leading to significant image correction errors under complex motion. Secondly, the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames. To tackle these challenges, we model the curvilinear trajectory of pixels analytically and propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixel. Besides, to reconstruct high-quality occlusion frames in dynamic scenes, we present a 3D video architecture that effectively Aligns and Aggregates multi-frame context, namely, RSA^2-Net. We evaluate our method across a broad range of cameras and video sequences, demonstrating its significant superiority. Specifically, our method surpasses the state-of-the-arts by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively. 
## Keyword: event camera There is no result ## Keyword: events camera There is no result ## Keyword: white balance There is no result ## Keyword: color contrast There is no result ## Keyword: AWB ### Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction - **Authors:** Delin Qu, Yizhen Lao, Zhigang Wang, Dong Wang, Bin Zhao, Xuelong Li - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18125 - **Pdf link:** https://arxiv.org/pdf/2303.18125 - **Abstract** This paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion. Existing methods suffer from two main drawbacks. Firstly, they face challenges in estimating the accurate correction field due to the uniform velocity assumption, leading to significant image correction errors under complex motion. Secondly, the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames. To tackle these challenges, we model the curvilinear trajectory of pixels analytically and propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixel. Besides, to reconstruct high-quality occlusion frames in dynamic scenes, we present a 3D video architecture that effectively Aligns and Aggregates multi-frame context, namely, RSA^2-Net. We evaluate our method across a broad range of cameras and video sequences, demonstrating its significant superiority. Specifically, our method surpasses the state-of-the-arts by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively. 
## Keyword: ISP ### DIME-FM: DIstilling Multimodal and Efficient Foundation Models - **Authors:** Ximeng Sun, Pengchuan Zhang, Peizhao Zhang, Hardik Shah, Kate Saenko, Xide Xia - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18232 - **Pdf link:** https://arxiv.org/pdf/2303.18232 - **Abstract** Large Vision-Language Foundation Models (VLFM), such as CLIP, ALIGN and Florence, are trained on large-scale datasets of image-caption pairs and achieve superior transferability and robustness on downstream tasks, but they are difficult to use in many practical applications due to their large size, high latency and fixed architectures. Unfortunately, recent work shows training a small custom VLFM for resource-limited applications is currently very difficult using public and smaller-scale data. In this paper, we introduce a new distillation mechanism (DIME-FM) that allows us to transfer the knowledge contained in large VLFMs to smaller, customized foundation models using a relatively small amount of inexpensive, unpaired images and sentences. We transfer the knowledge from the pre-trained CLIP-ViTL/14 model to a ViT-B/32 model, with only 40M public images and 28.4M unpaired public sentences. The resulting model "Distill-ViT-B/32" rivals the CLIP-ViT-B/32 model pre-trained on its private WiT dataset (400M image-text pairs): Distill-ViT-B/32 achieves similar results in terms of zero-shot and linear-probing performance on both ImageNet and the ELEVATER (20 image classification tasks) benchmarks. It also displays comparable robustness when evaluated on five datasets with natural distribution shifts from ImageNet. 
## Keyword: image signal processing There is no result ## Keyword: image signal process There is no result ## Keyword: compression ### Diff-ID: An Explainable Identity Difference Quantification Framework for DeepFake Detection - **Authors:** Chuer Yu, Xuhong Zhang, Yuxuan Duan, Senbo Yan, Zonghui Wang, Yang Xiang, Shouling Ji, Wenzhi Chen - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18174 - **Pdf link:** https://arxiv.org/pdf/2303.18174 - **Abstract** Despite the fact that DeepFake forgery detection algorithms have achieved impressive performance on known manipulations, they often face disastrous performance degradation when generalized to an unseen manipulation. Some recent works show improvement in generalization but rely on features fragile to image distortions such as compression. To this end, we propose Diff-ID, a concise and effective approach that explains and measures the identity loss induced by facial manipulations. When testing on an image of a specific person, Diff-ID utilizes an authentic image of that person as a reference and aligns them to the same identity-insensitive attribute feature space by applying a face-swapping generator. We then visualize the identity loss between the test and the reference image from the image differences of the aligned pairs, and design a custom metric to quantify the identity loss. The metric is then proved to be effective in distinguishing the forgery images from the real ones. Extensive experiments show that our approach achieves high detection performance on DeepFake images and state-of-the-art generalization ability to unknown forgery methods, while also being robust to image distortions. ## Keyword: RAW ### Whether and When does Endoscopy Domain Pretraining Make Sense? 
- **Authors:** Dominik Batić, Felix Holm, Ege Özsoy, Tobias Czempiel, Nassir Navab - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.17636 - **Pdf link:** https://arxiv.org/pdf/2303.17636 - **Abstract** Automated endoscopy video analysis is a challenging task in medical computer vision, with the primary objective of assisting surgeons during procedures. The difficulty arises from the complexity of surgical scenes and the lack of a sufficient amount of annotated data. In recent years, large-scale pretraining has shown great success in natural language processing and computer vision communities. These approaches reduce the need for annotated data, which is always a concern in the medical domain. However, most works on endoscopic video understanding use models pretrained on natural images, creating a domain gap between pretraining and finetuning. In this work, we investigate the need for endoscopy domain-specific pretraining based on downstream objectives. To this end, we first collect Endo700k, the largest publicly available corpus of endoscopic images, extracted from nine public Minimally Invasive Surgery (MIS) datasets. Endo700k comprises more than 700,000 unannotated raw images. Next, we introduce EndoViT, an endoscopy pretrained Vision Transformer (ViT). Through ablations, we demonstrate that domain-specific pretraining is particularly beneficial for more complex downstream tasks, such as Action Triplet Detection, and less effective and even unnecessary for simpler tasks, such as Surgical Phase Recognition. We will release both our code and pretrained models upon acceptance to facilitate further research in this direction. 
### GlyphDraw: Learning to Draw Chinese Characters in Image Synthesis Models Coherently - **Authors:** Jian Ma, Mingjun Zhao, Chen Chen, Ruichen Wang, Di Niu, Haonan Lu, Xiaodong Lin - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.17870 - **Pdf link:** https://arxiv.org/pdf/2303.17870 - **Abstract** Recent breakthroughs in the field of language-guided image generation have yielded impressive achievements, enabling the creation of high-quality and diverse images based on user instructions. Although the synthesis performance is fascinating, one significant limitation of current image generation models is their insufficient ability to generate coherent text within images, particularly for complex glyph structures like Chinese characters. To address this problem, we introduce GlyphDraw, a general learning framework aiming at endowing image generation models with the capacity to generate images embedded with coherent text. To the best of our knowledge, this is the first work in the field of image synthesis to address the generation of Chinese characters. % we first adopt the OCR technique to collect images with Chinese characters as training samples, and extract the text and locations as auxiliary information. We first sophisticatedly design the image-text dataset's construction strategy, then build our model specifically on a diffusion-based image generator and carefully modify the network structure to allow the model to learn drawing Chinese characters with the help of glyph and position information. Furthermore, we maintain the model's open-domain image synthesis capability by preventing catastrophic forgetting by using a variety of training techniques. Extensive qualitative and quantitative experiments demonstrate that our method not only produces accurate Chinese characters as in prompts, but also naturally blends the generated text into the background. 
Please refer to https://1073521013.github.io/glyph-draw.github.io ### Towards Nonlinear-Motion-Aware and Occlusion-Robust Rolling Shutter Correction - **Authors:** Delin Qu, Yizhen Lao, Zhigang Wang, Dong Wang, Bin Zhao, Xuelong Li - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18125 - **Pdf link:** https://arxiv.org/pdf/2303.18125 - **Abstract** This paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion. Existing methods suffer from two main drawbacks. Firstly, they face challenges in estimating the accurate correction field due to the uniform velocity assumption, leading to significant image correction errors under complex motion. Secondly, the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames. To tackle these challenges, we model the curvilinear trajectory of pixels analytically and propose a geometry-based Quadratic Rolling Shutter (QRS) motion solver, which precisely estimates the high-order correction field of individual pixel. Besides, to reconstruct high-quality occlusion frames in dynamic scenes, we present a 3D video architecture that effectively Aligns and Aggregates multi-frame context, namely, RSA^2-Net. We evaluate our method across a broad range of cameras and video sequences, demonstrating its significant superiority. Specifically, our method surpasses the state-of-the-arts by +4.98, +0.77, and +4.33 of PSNR on Carla-RS, Fastec-RS, and BS-RSC datasets, respectively. 
### Efficient View Synthesis and 3D-based Multi-Frame Denoising with Multiplane Feature Representations - **Authors:** Thomas Tanay, Aleš Leonardis, Matteo Maggioni - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.18139 - **Pdf link:** https://arxiv.org/pdf/2303.18139 - **Abstract** While current multi-frame restoration methods combine information from multiple input images using 2D alignment techniques, recent advances in novel view synthesis are paving the way for a new paradigm relying on volumetric scene representations. In this work, we introduce the first 3D-based multi-frame denoising method that significantly outperforms its 2D-based counterparts with lower computational requirements. Our method extends the multiplane image (MPI) framework for novel view synthesis by introducing a learnable encoder-renderer pair manipulating multiplane representations in feature space. The encoder fuses information across views and operates in a depth-wise manner while the renderer fuses information across depths and operates in a view-wise manner. The two modules are trained end-to-end and learn to separate depths in an unsupervised way, giving rise to Multiplane Feature (MPF) representations. Experiments on the Spaces and Real Forward-Facing datasets as well as on raw burst data validate our approach for view synthesis, multi-frame denoising, and view synthesis under noisy conditions. ## Keyword: raw image ### Whether and When does Endoscopy Domain Pretraining Make Sense? - **Authors:** Dominik Batić, Felix Holm, Ege Özsoy, Tobias Czempiel, Nassir Navab - **Subjects:** Computer Vision and Pattern Recognition (cs.CV) - **Arxiv link:** https://arxiv.org/abs/2303.17636 - **Pdf link:** https://arxiv.org/pdf/2303.17636 - **Abstract** Automated endoscopy video analysis is a challenging task in medical computer vision, with the primary objective of assisting surgeons during procedures. 
The difficulty arises from the complexity of surgical scenes and the lack of a sufficient amount of annotated data. In recent years, large-scale pretraining has shown great success in natural language processing and computer vision communities. These approaches reduce the need for annotated data, which is always a concern in the medical domain. However, most works on endoscopic video understanding use models pretrained on natural images, creating a domain gap between pretraining and finetuning. In this work, we investigate the need for endoscopy domain-specific pretraining based on downstream objectives. To this end, we first collect Endo700k, the largest publicly available corpus of endoscopic images, extracted from nine public Minimally Invasive Surgery (MIS) datasets. Endo700k comprises more than 700,000 unannotated raw images. Next, we introduce EndoViT, an endoscopy pretrained Vision Transformer (ViT). Through ablations, we demonstrate that domain-specific pretraining is particularly beneficial for more complex downstream tasks, such as Action Triplet Detection, and less effective and even unnecessary for simpler tasks, such as Surgical Phase Recognition. We will release both our code and pretrained models upon acceptance to facilitate further research in this direction.
process
new submissions for mon apr keyword events towards nonlinear motion aware and occlusion robust rolling shutter correction authors delin qu yizhen lao zhigang wang dong wang bin zhao xuelong li subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract this paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with extreme occlusion existing methods suffer from two main drawbacks firstly they face challenges in estimating the accurate correction field due to the uniform velocity assumption leading to significant image correction errors under complex motion secondly the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames to tackle these challenges we model the curvilinear trajectory of pixels analytically and propose a geometry based quadratic rolling shutter qrs motion solver which precisely estimates the high order correction field of individual pixel besides to reconstruct high quality occlusion frames in dynamic scenes we present a video architecture that effectively aligns and aggregates multi frame context namely rsa net we evaluate our method across a broad range of cameras and video sequences demonstrating its significant superiority specifically our method surpasses the state of the arts by and of psnr on carla rs fastec rs and bs rsc datasets respectively keyword event camera there is no result keyword events camera there is no result keyword white balance there is no result keyword color contrast there is no result keyword awb towards nonlinear motion aware and occlusion robust rolling shutter correction authors delin qu yizhen lao zhigang wang dong wang bin zhao xuelong li subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract this paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic scenes with 
extreme occlusion existing methods suffer from two main drawbacks firstly they face challenges in estimating the accurate correction field due to the uniform velocity assumption leading to significant image correction errors under complex motion secondly the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames to tackle these challenges we model the curvilinear trajectory of pixels analytically and propose a geometry based quadratic rolling shutter qrs motion solver which precisely estimates the high order correction field of individual pixel besides to reconstruct high quality occlusion frames in dynamic scenes we present a video architecture that effectively aligns and aggregates multi frame context namely rsa net we evaluate our method across a broad range of cameras and video sequences demonstrating its significant superiority specifically our method surpasses the state of the arts by and of psnr on carla rs fastec rs and bs rsc datasets respectively keyword isp dime fm distilling multimodal and efficient foundation models authors ximeng sun pengchuan zhang peizhao zhang hardik shah kate saenko xide xia subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract large vision language foundation models vlfm such as clip align and florence are trained on large scale datasets of image caption pairs and achieve superior transferability and robustness on downstream tasks but they are difficult to use in many practical applications due to their large size high latency and fixed architectures unfortunately recent work shows training a small custom vlfm for resource limited applications is currently very difficult using public and smaller scale data in this paper we introduce a new distillation mechanism dime fm that allows us to transfer the knowledge contained in large vlfms to smaller customized foundation models 
using a relatively small amount of inexpensive unpaired images and sentences we transfer the knowledge from the pre trained clip vitl model to a vit b model with only public images and unpaired public sentences the resulting model distill vit b rivals the clip vit b model pre trained on its private wit dataset image text pairs distill vit b achieves similar results in terms of zero shot and linear probing performance on both imagenet and the elevater image classification tasks benchmarks it also displays comparable robustness when evaluated on five datasets with natural distribution shifts from imagenet keyword image signal processing there is no result keyword image signal process there is no result keyword compression diff id an explainable identity difference quantification framework for deepfake detection authors chuer yu xuhong zhang yuxuan duan senbo yan zonghui wang yang xiang shouling ji wenzhi chen subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract despite the fact that deepfake forgery detection algorithms have achieved impressive performance on known manipulations they often face disastrous performance degradation when generalized to an unseen manipulation some recent works show improvement in generalization but rely on features fragile to image distortions such as compression to this end we propose diff id a concise and effective approach that explains and measures the identity loss induced by facial manipulations when testing on an image of a specific person diff id utilizes an authentic image of that person as a reference and aligns them to the same identity insensitive attribute feature space by applying a face swapping generator we then visualize the identity loss between the test and the reference image from the image differences of the aligned pairs and design a custom metric to quantify the identity loss the metric is then proved to be effective in distinguishing the forgery images from the real ones extensive 
experiments show that our approach achieves high detection performance on deepfake images and state of the art generalization ability to unknown forgery methods while also being robust to image distortions keyword raw whether and when does endoscopy domain pretraining make sense authors dominik batić felix holm ege özsoy tobias czempiel nassir navab subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract automated endoscopy video analysis is a challenging task in medical computer vision with the primary objective of assisting surgeons during procedures the difficulty arises from the complexity of surgical scenes and the lack of a sufficient amount of annotated data in recent years large scale pretraining has shown great success in natural language processing and computer vision communities these approaches reduce the need for annotated data which is always a concern in the medical domain however most works on endoscopic video understanding use models pretrained on natural images creating a domain gap between pretraining and finetuning in this work we investigate the need for endoscopy domain specific pretraining based on downstream objectives to this end we first collect the largest publicly available corpus of endoscopic images extracted from nine public minimally invasive surgery mis datasets comprises more than unannotated raw images next we introduce endovit an endoscopy pretrained vision transformer vit through ablations we demonstrate that domain specific pretraining is particularly beneficial for more complex downstream tasks such as action triplet detection and less effective and even unnecessary for simpler tasks such as surgical phase recognition we will release both our code and pretrained models upon acceptance to facilitate further research in this direction glyphdraw learning to draw chinese characters in image synthesis models coherently authors jian ma mingjun zhao chen chen ruichen wang di niu haonan lu xiaodong lin 
subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract recent breakthroughs in the field of language guided image generation have yielded impressive achievements enabling the creation of high quality and diverse images based on user instructions although the synthesis performance is fascinating one significant limitation of current image generation models is their insufficient ability to generate coherent text within images particularly for complex glyph structures like chinese characters to address this problem we introduce glyphdraw a general learning framework aiming at endowing image generation models with the capacity to generate images embedded with coherent text to the best of our knowledge this is the first work in the field of image synthesis to address the generation of chinese characters we first adopt the ocr technique to collect images with chinese characters as training samples and extract the text and locations as auxiliary information we first sophisticatedly design the image text dataset s construction strategy then build our model specifically on a diffusion based image generator and carefully modify the network structure to allow the model to learn drawing chinese characters with the help of glyph and position information furthermore we maintain the model s open domain image synthesis capability by preventing catastrophic forgetting by using a variety of training techniques extensive qualitative and quantitative experiments demonstrate that our method not only produces accurate chinese characters as in prompts but also naturally blends the generated text into the background please refer to towards nonlinear motion aware and occlusion robust rolling shutter correction authors delin qu yizhen lao zhigang wang dong wang bin zhao xuelong li subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract this paper addresses the problem of rolling shutter correction in complex nonlinear and dynamic 
scenes with extreme occlusion existing methods suffer from two main drawbacks firstly they face challenges in estimating the accurate correction field due to the uniform velocity assumption leading to significant image correction errors under complex motion secondly the drastic occlusion in dynamic scenes prevents current solutions from achieving better image quality because of the inherent difficulties in aligning and aggregating multiple frames to tackle these challenges we model the curvilinear trajectory of pixels analytically and propose a geometry based quadratic rolling shutter qrs motion solver which precisely estimates the high order correction field of individual pixel besides to reconstruct high quality occlusion frames in dynamic scenes we present a video architecture that effectively aligns and aggregates multi frame context namely rsa net we evaluate our method across a broad range of cameras and video sequences demonstrating its significant superiority specifically our method surpasses the state of the arts by and of psnr on carla rs fastec rs and bs rsc datasets respectively efficient view synthesis and based multi frame denoising with multiplane feature representations authors thomas tanay aleš leonardis matteo maggioni subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract while current multi frame restoration methods combine information from multiple input images using alignment techniques recent advances in novel view synthesis are paving the way for a new paradigm relying on volumetric scene representations in this work we introduce the first based multi frame denoising method that significantly outperforms its based counterparts with lower computational requirements our method extends the multiplane image mpi framework for novel view synthesis by introducing a learnable encoder renderer pair manipulating multiplane representations in feature space the encoder fuses information across views and operates in a depth 
wise manner while the renderer fuses information across depths and operates in a view wise manner the two modules are trained end to end and learn to separate depths in an unsupervised way giving rise to multiplane feature mpf representations experiments on the spaces and real forward facing datasets as well as on raw burst data validate our approach for view synthesis multi frame denoising and view synthesis under noisy conditions keyword raw image whether and when does endoscopy domain pretraining make sense authors dominik batić felix holm ege özsoy tobias czempiel nassir navab subjects computer vision and pattern recognition cs cv arxiv link pdf link abstract automated endoscopy video analysis is a challenging task in medical computer vision with the primary objective of assisting surgeons during procedures the difficulty arises from the complexity of surgical scenes and the lack of a sufficient amount of annotated data in recent years large scale pretraining has shown great success in natural language processing and computer vision communities these approaches reduce the need for annotated data which is always a concern in the medical domain however most works on endoscopic video understanding use models pretrained on natural images creating a domain gap between pretraining and finetuning in this work we investigate the need for endoscopy domain specific pretraining based on downstream objectives to this end we first collect the largest publicly available corpus of endoscopic images extracted from nine public minimally invasive surgery mis datasets comprises more than unannotated raw images next we introduce endovit an endoscopy pretrained vision transformer vit through ablations we demonstrate that domain specific pretraining is particularly beneficial for more complex downstream tasks such as action triplet detection and less effective and even unnecessary for simpler tasks such as surgical phase recognition we will release both our code and pretrained 
models upon acceptance to facilitate further research in this direction
1
58,411
6,589,329,127
IssuesEvent
2017-09-14 08:30:32
hazelcast/hazelcast
https://api.github.com/repos/hazelcast/hazelcast
closed
InvalidationMemberAddRemoveTest.ensure_nearCachedClient_and_member_data_sync_eventually
Team: Client Type: Test-Failure
```
org.junit.runners.model.TestTimedOutException: test timed out after 300000 milliseconds
	at sun.misc.Unsafe.park(Native Method)
	at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:349)
	at com.hazelcast.spi.impl.AbstractInvocationFuture.get(AbstractInvocationFuture.java:179)
	at com.hazelcast.client.util.ClientDelegatingFuture.get(ClientDelegatingFuture.java:127)
	at com.hazelcast.client.util.ClientDelegatingFuture.get(ClientDelegatingFuture.java:113)
	at com.hazelcast.client.cache.impl.AbstractClientCacheProxy.getInternal(AbstractClientCacheProxy.java:137)
	at com.hazelcast.client.cache.impl.AbstractClientCacheProxy.get(AbstractClientCacheProxy.java:248)
	at com.hazelcast.client.cache.impl.ClientCacheProxy.get(ClientCacheProxy.java:79)
	at com.hazelcast.client.cache.impl.ClientCacheProxy.get(ClientCacheProxy.java:91)
	at com.hazelcast.client.cache.impl.nearcache.invalidation.InvalidationMemberAddRemoveTest$5.run(InvalidationMemberAddRemoveTest.java:182)
	at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:967)
	at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:984)
	at com.hazelcast.client.cache.impl.nearcache.invalidation.InvalidationMemberAddRemoveTest.ensure_nearCachedClient_and_member_data_sync_eventually(InvalidationMemberAddRemoveTest.java:177)
```

https://hazelcast-l337.ci.cloudbees.com/view/Hazelcast/job/Hazelcast-3.x-nightly/1211/com.hazelcast$hazelcast-client/testReport/com.hazelcast.client.cache.impl.nearcache.invalidation/InvalidationMemberAddRemoveTest/ensure_nearCachedClient_and_member_data_sync_eventually/

https://hazelcast-l337.ci.cloudbees.com/view/Hazelcast/job/Hazelcast-3.x-nightly/com.hazelcast$hazelcast-client/1210/testReport/junit/com.hazelcast.client.cache.impl.nearcache.invalidation/InvalidationMemberAddRemoveTest/ensure_nearCachedClient_and_member_data_sync_eventually/
1.0
InvalidationMemberAddRemoveTest.ensure_nearCachedClient_and_member_data_sync_eventually - ``` org.junit.runners.model.TestTimedOutException: test timed out after 300000 milliseconds at sun.misc.Unsafe.park(Native Method) at java.util.concurrent.locks.LockSupport.parkNanos(LockSupport.java:349) at com.hazelcast.spi.impl.AbstractInvocationFuture.get(AbstractInvocationFuture.java:179) at com.hazelcast.client.util.ClientDelegatingFuture.get(ClientDelegatingFuture.java:127) at com.hazelcast.client.util.ClientDelegatingFuture.get(ClientDelegatingFuture.java:113) at com.hazelcast.client.cache.impl.AbstractClientCacheProxy.getInternal(AbstractClientCacheProxy.java:137) at com.hazelcast.client.cache.impl.AbstractClientCacheProxy.get(AbstractClientCacheProxy.java:248) at com.hazelcast.client.cache.impl.ClientCacheProxy.get(ClientCacheProxy.java:79) at com.hazelcast.client.cache.impl.ClientCacheProxy.get(ClientCacheProxy.java:91) at com.hazelcast.client.cache.impl.nearcache.invalidation.InvalidationMemberAddRemoveTest$5.run(InvalidationMemberAddRemoveTest.java:182) at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:967) at com.hazelcast.test.HazelcastTestSupport.assertTrueEventually(HazelcastTestSupport.java:984) at com.hazelcast.client.cache.impl.nearcache.invalidation.InvalidationMemberAddRemoveTest.ensure_nearCachedClient_and_member_data_sync_eventually(InvalidationMemberAddRemoveTest.java:177) ``` https://hazelcast-l337.ci.cloudbees.com/view/Hazelcast/job/Hazelcast-3.x-nightly/1211/com.hazelcast$hazelcast-client/testReport/com.hazelcast.client.cache.impl.nearcache.invalidation/InvalidationMemberAddRemoveTest/ensure_nearCachedClient_and_member_data_sync_eventually/ 
https://hazelcast-l337.ci.cloudbees.com/view/Hazelcast/job/Hazelcast-3.x-nightly/com.hazelcast$hazelcast-client/1210/testReport/junit/com.hazelcast.client.cache.impl.nearcache.invalidation/InvalidationMemberAddRemoveTest/ensure_nearCachedClient_and_member_data_sync_eventually/
non_process
invalidationmemberaddremovetest ensure nearcachedclient and member data sync eventually org junit runners model testtimedoutexception test timed out after milliseconds at sun misc unsafe park native method at java util concurrent locks locksupport parknanos locksupport java at com hazelcast spi impl abstractinvocationfuture get abstractinvocationfuture java at com hazelcast client util clientdelegatingfuture get clientdelegatingfuture java at com hazelcast client util clientdelegatingfuture get clientdelegatingfuture java at com hazelcast client cache impl abstractclientcacheproxy getinternal abstractclientcacheproxy java at com hazelcast client cache impl abstractclientcacheproxy get abstractclientcacheproxy java at com hazelcast client cache impl clientcacheproxy get clientcacheproxy java at com hazelcast client cache impl clientcacheproxy get clientcacheproxy java at com hazelcast client cache impl nearcache invalidation invalidationmemberaddremovetest run invalidationmemberaddremovetest java at com hazelcast test hazelcasttestsupport asserttrueeventually hazelcasttestsupport java at com hazelcast test hazelcasttestsupport asserttrueeventually hazelcasttestsupport java at com hazelcast client cache impl nearcache invalidation invalidationmemberaddremovetest ensure nearcachedclient and member data sync eventually invalidationmemberaddremovetest java
0
16,975
22,338,131,673
IssuesEvent
2022-06-14 20:43:11
DSpace/DSpace
https://api.github.com/repos/DSpace/DSpace
closed
process scheduler does not keep track of special groups
bug medium priority tools: processes help wanted
**Describe the bug**

Whenever a process is scheduled, information about the user scheduling it is tracked. But we don't keep track of special groups, if any, active for the user at the moment of scheduling. In such cases, if a scheduled process requires permission(s) related to a special group, it might fail. To provide a scenario: if a user is granted administrator by Shibboleth, but isn't an administrator according to DSpace systems, that user can schedule a process with the administrator grant; when the process is triggered and permissions are re-checked, we do not have evidence of the previous administrator grant.
1.0
process scheduler does not keep track of special groups - **Describe the bug** Whenever a process is scheduled, information about user scheduling it is tracked. But we don't keep track of special groups, if any, active for the user at the moment of scheduling. In such cases, if scheduled process requires permission(s) related to a special group, it might fail. To provide a scenario if an user is granted as administrator by shibboleth, but he / she isn't administrator according to dspace systems, and with administrator grant can schedule a process, when the process is triggered and permissions are re-checked we do not have evidence of previous administrator grant.
process
process scheduler does not keep track of special groups describe the bug whenever a process is scheduled information about user scheduling it is tracked but we don t keep track of special groups if any active for the user at the moment of scheduling in such cases if scheduled process requires permission s related to a special group it might fail to provide a scenario if an user is granted as administrator by shibboleth but he she isn t administrator according to dspace systems and with administrator grant can schedule a process when the process is triggered and permissions are re checked we do not have evidence of previous administrator grant
1
10,168
13,044,162,722
IssuesEvent
2020-07-29 03:47:35
tikv/tikv
https://api.github.com/repos/tikv/tikv
closed
UCP: Migrate scalar function `ValuesInt` from TiDB
challenge-program-2 component/coprocessor difficulty/easy sig/coprocessor
## Description

Port the scalar function `ValuesInt` from TiDB to coprocessor.

## Score

* 50

## Mentor(s)

* @lonng

## Recommended Skills

* Rust programming

## Learning Materials

Already implemented expressions ported from TiDB:

- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr
- https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr
2.0
UCP: Migrate scalar function `ValuesInt` from TiDB - ## Description Port the scalar function `ValuesInt` from TiDB to coprocessor. ## Score * 50 ## Mentor(s) * @lonng ## Recommended Skills * Rust programming ## Learning Materials Already implemented expressions ported from TiDB - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/rpn_expr) - https://github.com/tikv/tikv/tree/master/components/tidb_query/src/expr)
process
ucp migrate scalar function valuesint from tidb description port the scalar function valuesint from tidb to coprocessor score mentor s lonng recommended skills rust programming learning materials already implemented expressions ported from tidb
1
30,262
24,708,676,702
IssuesEvent
2022-10-19 21:34:00
dotnet/runtime
https://api.github.com/repos/dotnet/runtime
closed
The 3.1 and 5.0 SDKs do not support musl/Alpine Linux anymore
area-Infrastructure-libraries untriaged
### Describe the bug

I am building an app for Alpine 3.15 and/or 3.16, but those versions of Alpine can no longer load the .so files from the linux-musl-x64 folders. I had a look at the runtime.json on those installs and I notice the version of alpine in the json is only 3.14.

### To Reproduce

Just try to use SkiaSharp - or any other NuGet that has a musl-specific binary. See also these issues:

- https://github.com/mono/SkiaSharp/issues/2169
- https://github.com/mono/SkiaSharp/issues/2215

I can work around this by adding/editing the json file to include alpine.3.15 and alpine.3.16.

### Exceptions (if any)

```
System.DllNotFoundException: Unable to load shared library 'libSkiaSharp' or one of its dependencies. In order to help diagnose loading problems, consider setting the LD_DEBUG environment variable: Error loading shared library liblibSkiaSharp: No such file or directory
```

### Further technical details

- Using the dotnet docker image which was Alpine 3.16 and had SDK 3.1.28 installed.
- I also see the Alpine version in the json on GitHub: https://github.com/dotnet/corefx/blob/v3.1.28/pkg/Microsoft.NETCore.Platforms/runtime.json#L80
1.0
The 3.1 and 5.0 SDKs do not support musl/Alpine Linux anymore - ### Describe the bug I am building an app for Alpine 3.15 and/or 3.16, but those versions of Alpine can no longer load the .so files from the linux-musl-x64 folders. I had a look at the runtime.json on those installs and I notice the version of alpine in the json is only 3.14. ### To Reproduce Just try use SkiaSharp - or any other NuGet that has a musl-specific binary. Se also these issues: - https://github.com/mono/SkiaSharp/issues/2169 - https://github.com/mono/SkiaSharp/issues/2215 I can work around this by adding/editing the json file to include alpine.3.15 and alpine.3.16. ### Exceptions (if any) ``` System.DllNotFoundException: Unable to load shared library 'libSkiaSharp' or one of its dependencies. In order to help diagnose loading problems, consider setting the LD_DEBUG environment variable: Error loading shared library liblibSkiaSharp: No such file or directory ``` ### Further technical details - Using the dotnet docker image which was Alpine 3.16 and had SDK 3.1.28 installed. - I also see the Alpine version in the json on GitHub: https://github.com/dotnet/corefx/blob/v3.1.28/pkg/Microsoft.NETCore.Platforms/runtime.json#L80
label: non_process
text:
the and sdks do not support musl alpine linux anymore describe the bug i am building an app for alpine and or but those versions of alpine can no longer load the so files from the linux musl folders i had a look at the runtime json on those installs and i notice the version of alpine in the json is only to reproduce just try use skiasharp or any other nuget that has a musl specific binary se also these issues i can work around this by adding editing the json file to include alpine and alpine exceptions if any system dllnotfoundexception unable to load shared library libskiasharp or one of its dependencies in order to help diagnose loading problems consider setting the ld debug environment variable error loading shared library liblibskiasharp no such file or directory further technical details using the dotnet docker image which was alpine and had sdk installed i also see the alpine version in the json on github
binary_label: 0

Unnamed: 0: 6,590
id: 9,664,234,875
type: IssuesEvent
created_at: 2019-05-21 04:31:03
repo: tc39/proposal-weakrefs
repo_url: https://api.github.com/repos/tc39/proposal-weakrefs
action: opened
title: Advance to stage 3
labels: process
body:
Criteria taken from the TC39 process document minus those from previous stages: - [ ] Complete spec text https://tc39.github.io/proposal-weakrefs/ - [ ] Designated reviewers have signed off on the current spec text - [ ] @syg - [ ] @littledan - [ ] @leobalter - [ ] All ECMAScript editors have signed off on the current spec text - [ ] @ljharb - [ ] @zenparsing
index: 1.0
text_combine:
Advance to stage 3 - Criteria taken from the TC39 process document minus those from previous stages: - [ ] Complete spec text https://tc39.github.io/proposal-weakrefs/ - [ ] Designated reviewers have signed off on the current spec text - [ ] @syg - [ ] @littledan - [ ] @leobalter - [ ] All ECMAScript editors have signed off on the current spec text - [ ] @ljharb - [ ] @zenparsing
label: process
text:
advance to stage criteria taken from the process document minus those from previous stages complete spec text designated reviewers have signed off on the current spec text syg littledan leobalter all ecmascript editors have signed off on the current spec text ljharb zenparsing
binary_label: 1

Unnamed: 0: 227,536
id: 25,081,147,281
type: IssuesEvent
created_at: 2022-11-07 19:25:20
repo: JMD60260/fetchmeaband
repo_url: https://api.github.com/repos/JMD60260/fetchmeaband
action: closed
title: CVE-2019-19919 (High) detected in handlebars-1.3.0.tgz
labels: security vulnerability
body:
## CVE-2019-19919 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-1.3.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz</a></p> <p>Path to dependency file: /public/vendor/owl.carousel/package.json</p> <p>Path to vulnerable library: /public/vendor/owl.carousel/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - assemble-0.4.42.tgz (Root Library) - assemble-handlebars-0.2.6.tgz - :x: **handlebars-1.3.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/JMD60260/fetchmeaband/commit/430b5f2947d45ada69dc047ea870d3c988006344">430b5f2947d45ada69dc047ea870d3c988006344</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Versions of handlebars prior to 4.3.0 are vulnerable to Prototype Pollution leading to Remote Code Execution. Templates may alter an Object's __proto__ and __defineGetter__ properties, which may allow an attacker to execute arbitrary code through crafted payloads. 
<p>Publish Date: 2019-12-20 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19919>CVE-2019-19919</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19919">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19919</a></p> <p>Release Date: 2019-12-20</p> <p>Fix Resolution (handlebars): 3.0.8</p> <p>Direct dependency fix Resolution (assemble): 0.6.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
text_combine:
CVE-2019-19919 (High) detected in handlebars-1.3.0.tgz - ## CVE-2019-19919 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>handlebars-1.3.0.tgz</b></p></summary> <p>Handlebars provides the power necessary to let you build semantic templates effectively with no frustration</p> <p>Library home page: <a href="https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz">https://registry.npmjs.org/handlebars/-/handlebars-1.3.0.tgz</a></p> <p>Path to dependency file: /public/vendor/owl.carousel/package.json</p> <p>Path to vulnerable library: /public/vendor/owl.carousel/node_modules/handlebars/package.json</p> <p> Dependency Hierarchy: - assemble-0.4.42.tgz (Root Library) - assemble-handlebars-0.2.6.tgz - :x: **handlebars-1.3.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/JMD60260/fetchmeaband/commit/430b5f2947d45ada69dc047ea870d3c988006344">430b5f2947d45ada69dc047ea870d3c988006344</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Versions of handlebars prior to 4.3.0 are vulnerable to Prototype Pollution leading to Remote Code Execution. Templates may alter an Object's __proto__ and __defineGetter__ properties, which may allow an attacker to execute arbitrary code through crafted payloads. 
<p>Publish Date: 2019-12-20 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2019-19919>CVE-2019-19919</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>9.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19919">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-19919</a></p> <p>Release Date: 2019-12-20</p> <p>Fix Resolution (handlebars): 3.0.8</p> <p>Direct dependency fix Resolution (assemble): 0.6.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
text:
cve high detected in handlebars tgz cve high severity vulnerability vulnerable library handlebars tgz handlebars provides the power necessary to let you build semantic templates effectively with no frustration library home page a href path to dependency file public vendor owl carousel package json path to vulnerable library public vendor owl carousel node modules handlebars package json dependency hierarchy assemble tgz root library assemble handlebars tgz x handlebars tgz vulnerable library found in head commit a href found in base branch master vulnerability details versions of handlebars prior to are vulnerable to prototype pollution leading to remote code execution templates may alter an object s proto and definegetter properties which may allow an attacker to execute arbitrary code through crafted payloads publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution handlebars direct dependency fix resolution assemble step up your open source security game with mend
binary_label: 0

Unnamed: 0: 13,598
id: 16,175,674,171
type: IssuesEvent
created_at: 2021-05-03 06:13:35
repo: GoogleCloudPlatform/fda-mystudies
repo_url: https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
action: closed
title: Dependency updates
labels: Bug Process: Fixed Process: Tested QA
body:
Copied from Andrew's email. In addition to the one where it created bugs, here are the ones that it didn't create bugs for that also have a patched version available. Let me know if you need the details on the vulnerabilities. I added H/M/L for the ones marked as High severity, Moderate severity, & Low severity. **study-builder/fdahpStudyDesigner/pom.xml** M- Upgrade com.mchange:c3p0 to version 0.9.5.3 or later. H- Upgrade commons-fileupload:commons-fileupload to version 1.3.3 or later. H- Upgrade org.springframework:spring-core to version 4.3.20 or later. (note this one may not be possible. Dependabot reports the latest version that can be upgraded to is 3.2.8, but there are a few vulnerabilities so check and see. participant-manager-datastore/participant-manager-service/pom.xml M- Upgrade com.google.guava:guava to version 24.1.1 or later. H- Upgrade org.apache.poi:poi to version 3.17 or later **oauth-scim-service/pom.xml** M- Upgrade com.google.guava:guava to version 24.1.1 or later. **study-datastore/pom.xml** L- Upgrade junit:junit to version 4.13.1 or later M- Upgrade org.quartz-scheduler:quartz to version 2.3.2 or later M- Upgrade mysql:mysql-connector-java to version 8.0.16 or later H- Upgrade commons-collections:commons-collections to version 3.2.2 or later. **participant-manager/packagelock.json** L- Upgrade ini to version 1.3.6 or later L- Upgrade webpack-subresource-integrity to version 1.5.1 or later M- Upgrade socket.io to version 2.4.0 or later. Also fix the issues raised by [dependabot ](https://github.com/GoogleCloudPlatform/fda-mystudies/issues?q=author%3Aapp%2Fdependabot)
index: 2.0
text_combine:
Dependency updates - Copied from Andrew's email. In addition to the one where it created bugs, here are the ones that it didn't create bugs for that also have a patched version available. Let me know if you need the details on the vulnerabilities. I added H/M/L for the ones marked as High severity, Moderate severity, & Low severity. **study-builder/fdahpStudyDesigner/pom.xml** M- Upgrade com.mchange:c3p0 to version 0.9.5.3 or later. H- Upgrade commons-fileupload:commons-fileupload to version 1.3.3 or later. H- Upgrade org.springframework:spring-core to version 4.3.20 or later. (note this one may not be possible. Dependabot reports the latest version that can be upgraded to is 3.2.8, but there are a few vulnerabilities so check and see. participant-manager-datastore/participant-manager-service/pom.xml M- Upgrade com.google.guava:guava to version 24.1.1 or later. H- Upgrade org.apache.poi:poi to version 3.17 or later **oauth-scim-service/pom.xml** M- Upgrade com.google.guava:guava to version 24.1.1 or later. **study-datastore/pom.xml** L- Upgrade junit:junit to version 4.13.1 or later M- Upgrade org.quartz-scheduler:quartz to version 2.3.2 or later M- Upgrade mysql:mysql-connector-java to version 8.0.16 or later H- Upgrade commons-collections:commons-collections to version 3.2.2 or later. **participant-manager/packagelock.json** L- Upgrade ini to version 1.3.6 or later L- Upgrade webpack-subresource-integrity to version 1.5.1 or later M- Upgrade socket.io to version 2.4.0 or later. Also fix the issues raised by [dependabot ](https://github.com/GoogleCloudPlatform/fda-mystudies/issues?q=author%3Aapp%2Fdependabot)
label: process
text:
dependency updates copied from andrew s email in addition to the one where it created bugs here are the ones that it didn t create bugs for that also have a patched version available let me know if you need the details on the vulnerabilities i added h m l for the ones marked as high severity moderate severity low severity study builder fdahpstudydesigner pom xml m upgrade com mchange to version or later h upgrade commons fileupload commons fileupload to version or later h upgrade org springframework spring core to version or later note this one may not be possible dependabot reports the latest version that can be upgraded to is but there are a few vulnerabilities so check and see participant manager datastore participant manager service pom xml m upgrade com google guava guava to version or later h upgrade org apache poi poi to version or later oauth scim service pom xml m upgrade com google guava guava to version or later study datastore pom xml l upgrade junit junit to version or later m upgrade org quartz scheduler quartz to version or later m upgrade mysql mysql connector java to version or later h upgrade commons collections commons collections to version or later participant manager packagelock json l upgrade ini to version or later l upgrade webpack subresource integrity to version or later m upgrade socket io to version or later also fix the issues raised by
binary_label: 1

Unnamed: 0: 177,835
id: 21,509,191,278
type: IssuesEvent
created_at: 2022-04-28 01:14:24
repo: Rossb0b/Filmographie-Angular
repo_url: https://api.github.com/repos/Rossb0b/Filmographie-Angular
action: closed
title: CVE-2018-11695 (High) detected in node-sass-4.10.0.tgz - autoclosed
labels: security vulnerability
body:
## CVE-2018-11695 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.10.0.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.10.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.10.0.tgz</a></p> <p>Path to dependency file: /Filmographie-Angular/package.json</p> <p>Path to vulnerable library: Filmographie-Angular/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - build-angular-0.8.7.tgz (Root Library) - :x: **node-sass-4.10.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/Rossb0b/Filmographie-Angular/commits/b54842ae48f08f2f51fe5c49b7c7a8ce375748b0">b54842ae48f08f2f51fe5c49b7c7a8ce375748b0</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in LibSass <3.5.3. A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact. 
<p>Publish Date: 2018-06-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11695>CVE-2018-11695</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/sass/libsass/issues/2664">https://github.com/sass/libsass/issues/2664</a></p> <p>Release Date: 2018-06-04</p> <p>Fix Resolution: Libsass:3.5.3, Node-sass:4.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
index: True
text_combine:
CVE-2018-11695 (High) detected in node-sass-4.10.0.tgz - autoclosed - ## CVE-2018-11695 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>node-sass-4.10.0.tgz</b></p></summary> <p>Wrapper around libsass</p> <p>Library home page: <a href="https://registry.npmjs.org/node-sass/-/node-sass-4.10.0.tgz">https://registry.npmjs.org/node-sass/-/node-sass-4.10.0.tgz</a></p> <p>Path to dependency file: /Filmographie-Angular/package.json</p> <p>Path to vulnerable library: Filmographie-Angular/node_modules/node-sass/package.json</p> <p> Dependency Hierarchy: - build-angular-0.8.7.tgz (Root Library) - :x: **node-sass-4.10.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://api.github.com/repos/Rossb0b/Filmographie-Angular/commits/b54842ae48f08f2f51fe5c49b7c7a8ce375748b0">b54842ae48f08f2f51fe5c49b7c7a8ce375748b0</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> An issue was discovered in LibSass <3.5.3. A NULL pointer dereference was found in the function Sass::Expand::operator which could be leveraged by an attacker to cause a denial of service (application crash) or possibly have unspecified other impact. 
<p>Publish Date: 2018-06-04 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2018-11695>CVE-2018-11695</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>8.8</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://github.com/sass/libsass/issues/2664">https://github.com/sass/libsass/issues/2664</a></p> <p>Release Date: 2018-06-04</p> <p>Fix Resolution: Libsass:3.5.3, Node-sass:4.9.0</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
label: non_process
text:
cve high detected in node sass tgz autoclosed cve high severity vulnerability vulnerable library node sass tgz wrapper around libsass library home page a href path to dependency file filmographie angular package json path to vulnerable library filmographie angular node modules node sass package json dependency hierarchy build angular tgz root library x node sass tgz vulnerable library found in head commit a href vulnerability details an issue was discovered in libsass a null pointer dereference was found in the function sass expand operator which could be leveraged by an attacker to cause a denial of service application crash or possibly have unspecified other impact publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution libsass node sass step up your open source security game with whitesource
binary_label: 0

Unnamed: 0: 65,313
id: 14,710,741,748
type: IssuesEvent
created_at: 2021-01-05 05:58:35
repo: SecurityPointer/vulnerable-node
repo_url: https://api.github.com/repos/SecurityPointer/vulnerable-node
action: opened
title: CVE-2015-9251 (Medium) detected in jquery-1.11.1.min.js
labels: security vulnerability
body:
## CVE-2015-9251 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js</a></p> <p>Path to vulnerable library: vulnerable-node/public/js/jquery.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.11.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/SecurityPointer/vulnerable-node/commit/36d6d4785c579830bf5bc228efd5c181896c88cb">36d6d4785c579830bf5bc228efd5c181896c88cb</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v3.0.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.1","isTransitiveDependency":false,"dependencyTree":"jquery:1.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
index: True
text_combine:
CVE-2015-9251 (Medium) detected in jquery-1.11.1.min.js - ## CVE-2015-9251 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>jquery-1.11.1.min.js</b></p></summary> <p>JavaScript library for DOM operations</p> <p>Library home page: <a href="https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js">https://cdnjs.cloudflare.com/ajax/libs/jquery/1.11.1/jquery.min.js</a></p> <p>Path to vulnerable library: vulnerable-node/public/js/jquery.js</p> <p> Dependency Hierarchy: - :x: **jquery-1.11.1.min.js** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/SecurityPointer/vulnerable-node/commit/36d6d4785c579830bf5bc228efd5c181896c88cb">36d6d4785c579830bf5bc228efd5c181896c88cb</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed. <p>Publish Date: 2018-01-18 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251>CVE-2015-9251</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>6.1</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. 
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nvd.nist.gov/vuln/detail/CVE-2015-9251">https://nvd.nist.gov/vuln/detail/CVE-2015-9251</a></p> <p>Release Date: 2018-01-18</p> <p>Fix Resolution: jQuery - v3.0.0</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":false,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"JavaScript","packageName":"jquery","packageVersion":"1.11.1","isTransitiveDependency":false,"dependencyTree":"jquery:1.11.1","isMinimumFixVersionAvailable":true,"minimumFixVersion":"jQuery - v3.0.0"}],"vulnerabilityIdentifier":"CVE-2015-9251","vulnerabilityDetails":"jQuery before 3.0.0 is vulnerable to Cross-site Scripting (XSS) attacks when a cross-domain Ajax request is performed without the dataType option, causing text/javascript responses to be executed.","vulnerabilityUrl":"https://vuln.whitesourcesoftware.com/vulnerability/CVE-2015-9251","cvss3Severity":"medium","cvss3Score":"6.1","cvss3Metrics":{"A":"None","AC":"Low","PR":"None","S":"Changed","C":"Low","UI":"Required","AV":"Network","I":"Low"},"extraData":{}}</REMEDIATE> -->
label: non_process
text:
cve medium detected in jquery min js cve medium severity vulnerability vulnerable library jquery min js javascript library for dom operations library home page a href path to vulnerable library vulnerable node public js jquery js dependency hierarchy x jquery min js vulnerable library found in head commit a href found in base branch master vulnerability details jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution jquery isopenpronvulnerability false ispackagebased true isdefaultbranch true packages vulnerabilityidentifier cve vulnerabilitydetails jquery before is vulnerable to cross site scripting xss attacks when a cross domain ajax request is performed without the datatype option causing text javascript responses to be executed vulnerabilityurl
binary_label: 0

Unnamed: 0: 17,274
id: 23,059,412,746
type: IssuesEvent
created_at: 2022-07-25 08:34:28
repo: Arch666Angel/mods
repo_url: https://api.github.com/repos/Arch666Angel/mods
action: closed
title: Brown Algae Recipe Unlock
labels: Impact: Enhancement Angels Bio Processing
body:
### Suggestion Make recipe Brown Algae Processing unlock later in the tech tree **Reasons:** - Basic Algae processing tech unlocks a recipe that makes both Green Algae and Brown Algae. - No need to unlock two recipes for the same thing at the same time - Makes sense to have the general recipe available first, specialized recipe available a little later - Moves one tech earlier in the tree, hopefully differentiating the techs a bit more - Currently, Advanced algae processing is available to research immediately after Basic algae processing - No additional prerequisites required ![image](https://user-images.githubusercontent.com/59639/173805227-6b24e737-3fd5-4e12-9f48-18a03fe462a4.png) ![image](https://user-images.githubusercontent.com/59639/173804961-0ffe06ce-f6f4-4a33-b222-7685f8bbe940.png) **Changes;** - Remove Brown algae recipe unlock from Basic algae processing - Add Brown Algae recipe unlock to Advanced algae processing - Make Brown algae processing require Algae farm 2 - Remove Water treatment prerequisite from Basic algae processing - Add Water treatment prerequisite to Advanced algae processing - Add Basic algae processing as a prerequisite for Wood processing - Having fewer techs available at very start of the game is good - Makes sense as cellulose fibre is required for wood pellets ### Note: This suggestion had been logged as part of #674 but wasn't really related so have split it out into it's own issue.
index: 1.0
text_combine:
Brown Algae Recipe Unlock - ### Suggestion Make recipe Brown Algae Processing unlock later in the tech tree **Reasons:** - Basic Algae processing tech unlocks a recipe that makes both Green Algae and Brown Algae. - No need to unlock two recipes for the same thing at the same time - Makes sense to have the general recipe available first, specialized recipe available a little later - Moves one tech earlier in the tree, hopefully differentiating the techs a bit more - Currently, Advanced algae processing is available to research immediately after Basic algae processing - No additional prerequisites required ![image](https://user-images.githubusercontent.com/59639/173805227-6b24e737-3fd5-4e12-9f48-18a03fe462a4.png) ![image](https://user-images.githubusercontent.com/59639/173804961-0ffe06ce-f6f4-4a33-b222-7685f8bbe940.png) **Changes;** - Remove Brown algae recipe unlock from Basic algae processing - Add Brown Algae recipe unlock to Advanced algae processing - Make Brown algae processing require Algae farm 2 - Remove Water treatment prerequisite from Basic algae processing - Add Water treatment prerequisite to Advanced algae processing - Add Basic algae processing as a prerequisite for Wood processing - Having fewer techs available at very start of the game is good - Makes sense as cellulose fibre is required for wood pellets ### Note: This suggestion had been logged as part of #674 but wasn't really related so have split it out into it's own issue.
process
brown algae recipe unlock suggestion make recipe brown algae processing unlock later in the tech tree reasons basic algae processing tech unlocks a recipe that makes both green algae and brown algae no need to unlock two recipes for the same thing at the same time makes sense to have the general recipe available first specialized recipe available a little later moves one tech earlier in the tree hopefully differentiating the techs a bit more currently advanced algae processing is available to research immediately after basic algae processing no additional prerequisites required changes remove brown algae recipe unlock from basic algae processing add brown algae recipe unlock to advanced algae processing make brown algae processing require algae farm remove water treatment prerequisite from basic algae processing add water treatment prerequisite to advanced algae processing add basic algae processing as a prerequisite for wood processing having fewer techs available at very start of the game is good makes sense as cellulose fibre is required for wood pellets note this suggestion had been logged as part of but wasn t really related so have split it out into it s own issue
1
14,686
17,798,489,434
IssuesEvent
2021-09-01 03:08:48
lynnandtonic/nestflix.fun
https://api.github.com/repos/lynnandtonic/nestflix.fun
closed
Popular Slut Club
suggested title in process
Please add as much of the following info as you can: Title: Popular Slut Club Type (film/tv show): tv show Film or show in which it appears: Futurama Season 6 Episode 21 Yo Leela Leela Actual footage of the film/show can be seen (yes) ![psc logo futurama](https://user-images.githubusercontent.com/88994668/129626830-83d1fa97-9327-4027-bb1f-6f3a08bae584.png)
1.0
Popular Slut Club - Please add as much of the following info as you can: Title: Popular Slut Club Type (film/tv show): tv show Film or show in which it appears: Futurama Season 6 Episode 21 Yo Leela Leela Actual footage of the film/show can be seen (yes) ![psc logo futurama](https://user-images.githubusercontent.com/88994668/129626830-83d1fa97-9327-4027-bb1f-6f3a08bae584.png)
process
popular slut club please add as much of the following info as you can title popular slut club type film tv show tv show film or show in which it appears futurama season episode yo leela leela actual footage of the film show can be seen yes
1
145,547
13,153,279,447
IssuesEvent
2020-08-10 02:41:31
gsbDBI/facebook_adaptive
https://api.github.com/repos/gsbDBI/facebook_adaptive
closed
IRB application
documentation
The next deadline for Stanford IRB application is 7/1 (5pm PT) for review during the 7/31 meeting. Here is a quick checklist of what we'll need for the application: - [x] 1. IRB certificates - [x] 2. Description of recruitment -- decide how much we're paying respondents ([link](https://docs.google.com/spreadsheets/d/1-lv6Lr67tSr6Ffj1x8prFrjzcDzGOwQ-K3A0Gab-dgc/edit?usp=sharing) to draft budget), target protocol for FB ads - [x] 3. Experimental design -- treatments - [x] 4. Survey instrument submit forms here: https://eprotocol.stanford.edu/ : IRB-57430
1.0
IRB application - The next deadline for Stanford IRB application is 7/1 (5pm PT) for review during the 7/31 meeting. Here is a quick checklist of what we'll need for the application: - [x] 1. IRB certificates - [x] 2. Description of recruitment -- decide how much we're paying respondents ([link](https://docs.google.com/spreadsheets/d/1-lv6Lr67tSr6Ffj1x8prFrjzcDzGOwQ-K3A0Gab-dgc/edit?usp=sharing) to draft budget), target protocol for FB ads - [x] 3. Experimental design -- treatments - [x] 4. Survey instrument submit forms here: https://eprotocol.stanford.edu/ : IRB-57430
non_process
irb application the next deadline for stanford irb application is pt for review during the meeting here is a quick checklist of what we ll need for the application irb certificates description of recruitment decide how much we re paying respondents to draft budget target protocol for fb ads experimental design treatments survey instrument submit forms here irb
0
114,605
9,744,322,059
IssuesEvent
2019-06-03 06:32:53
dotnet/corefx
https://api.github.com/repos/dotnet/corefx
opened
Test failure: System.IO.Pipelines.Tests.ReadAsyncCancellationTests/FlushAsyncCancellationDeadlock
area-System.IO.Pipelines test-run-core
Test `System.IO.Pipelines.Tests.ReadAsyncCancellationTests/FlushAsyncCancellationDeadlock `has filed. Message : ``` Assert.True() Failure Expected: True Actual: False ``` Stack Trace : ``` at System.IO.Pipelines.Tests.ReadAsyncCancellationTests.FlushAsyncCancellationDeadlock() in /_/src/System.IO.Pipelines/tests/ReadAsyncCancellationTests.cs:line 169 ``` Build: -[20190602.20](https://mc.dot.net/#/user/dotnet-bot/pr~2Fdotnet~2Fcorefx~2Frefs~2Fheads~2Fmaster/test~2Ffunctional~2Fcli~2Finnerloop~2F/20190602.20)(Master) Failing configurations: - Ubuntu.1804.Amd64.Open-x64-Release Details: https://mc.dot.net/#/user/dotnet-bot/pr~2Fdotnet~2Fcorefx~2Frefs~2Fheads~2Fmaster/test~2Ffunctional~2Fcli~2Finnerloop~2F/20190602.20/workItem/System.IO.Pipelines.Tests/analysis/xunit/System.IO.Pipelines.Tests.ReadAsyncCancellationTests~2FFlushAsyncCancellationDeadlock
1.0
Test failure: System.IO.Pipelines.Tests.ReadAsyncCancellationTests/FlushAsyncCancellationDeadlock - Test `System.IO.Pipelines.Tests.ReadAsyncCancellationTests/FlushAsyncCancellationDeadlock `has filed. Message : ``` Assert.True() Failure Expected: True Actual: False ``` Stack Trace : ``` at System.IO.Pipelines.Tests.ReadAsyncCancellationTests.FlushAsyncCancellationDeadlock() in /_/src/System.IO.Pipelines/tests/ReadAsyncCancellationTests.cs:line 169 ``` Build: -[20190602.20](https://mc.dot.net/#/user/dotnet-bot/pr~2Fdotnet~2Fcorefx~2Frefs~2Fheads~2Fmaster/test~2Ffunctional~2Fcli~2Finnerloop~2F/20190602.20)(Master) Failing configurations: - Ubuntu.1804.Amd64.Open-x64-Release Details: https://mc.dot.net/#/user/dotnet-bot/pr~2Fdotnet~2Fcorefx~2Frefs~2Fheads~2Fmaster/test~2Ffunctional~2Fcli~2Finnerloop~2F/20190602.20/workItem/System.IO.Pipelines.Tests/analysis/xunit/System.IO.Pipelines.Tests.ReadAsyncCancellationTests~2FFlushAsyncCancellationDeadlock
non_process
test failure system io pipelines tests readasynccancellationtests flushasynccancellationdeadlock test system io pipelines tests readasynccancellationtests flushasynccancellationdeadlock has filed message assert true failure expected true actual false stack trace at system io pipelines tests readasynccancellationtests flushasynccancellationdeadlock in src system io pipelines tests readasynccancellationtests cs line build failing configurations ubuntu open release details
0
16,144
20,405,703,372
IssuesEvent
2022-02-23 04:59:05
fmnas/fmnas-site
https://api.github.com/repos/fmnas/fmnas-site
closed
Periodically check for persisted serialized applications indicating unreported failures
public form processor blocked x-small (<1h)
--- _This issue has been automatically created by [todo-actions](https://github.com/apps/todo-actions) based on a TODO comment found in [public/application/index.php:80](https://github.com/fmnas/fmnas-site/blob/main/public/application/index.php#L80). It will automatically be closed when the TODO comment is removed from the default branch (main)._
1.0
Periodically check for persisted serialized applications indicating unreported failures - --- _This issue has been automatically created by [todo-actions](https://github.com/apps/todo-actions) based on a TODO comment found in [public/application/index.php:80](https://github.com/fmnas/fmnas-site/blob/main/public/application/index.php#L80). It will automatically be closed when the TODO comment is removed from the default branch (main)._
process
periodically check for persisted serialized applications indicating unreported failures this issue has been automatically created by based on a todo comment found in it will automatically be closed when the todo comment is removed from the default branch main
1
76,274
21,320,588,592
IssuesEvent
2022-04-17 02:20:46
nextest-rs/nextest
https://api.github.com/repos/nextest-rs/nextest
closed
Tracking issue for reuse build options
A-reuse-build
cargo-nextest 0.9.10 introduces experimental support for reusing the build across machines and invocations. This issue tracks stabilizing this option: * [x] How does this interact with #82? * [x] Verify option names. * [x] Ensure that the "cargo present on destination" and "cargo not present on destination" scenarios both work well. * [x] Is the current path remapping support sufficient? cc @Guiguiprim
1.0
Tracking issue for reuse build options - cargo-nextest 0.9.10 introduces experimental support for reusing the build across machines and invocations. This issue tracks stabilizing this option: * [x] How does this interact with #82? * [x] Verify option names. * [x] Ensure that the "cargo present on destination" and "cargo not present on destination" scenarios both work well. * [x] Is the current path remapping support sufficient? cc @Guiguiprim
non_process
tracking issue for reuse build options cargo nextest introduces experimental support for reusing the build across machines and invocations this issue tracks stabilizing this option how does this interact with verify option names ensure that the cargo present on destination and cargo not present on destination scenarios both work well is the current path remapping support sufficient cc guiguiprim
0
10,493
13,258,984,110
IssuesEvent
2020-08-20 16:05:43
crim-ca/weaver
https://api.github.com/repos/crim-ca/weaver
reopened
[BUG] Failing CWT install of ESGF-compute API
process/esgf-cwt triage/bug
**Describe the bug** The requirement specified here: https://github.com/crim-ca/weaver/blame/d156e01dc9bffa107aa2d3c2fcb3f720d7963c97/setup.py#L53 Constantly results into following error when installing weaver from fresh (or update): ``` [...] File "/home/francis/DEV/miniconda/envs/weaver-py3/lib/python3.7/site-packages/pkg_resources/__init__.py", line 899, in require needed = self.resolve(parse_requirements(requirements)) File "/home/francis/DEV/miniconda/envs/weaver-py3/lib/python3.7/site-packages/pkg_resources/__init__.py", line 785, in resolve raise DistributionNotFound(req, requirers) pkg_resources.DistributionNotFound: The 'cwt' distribution was not found and is required by weaver Exception ignored in: <module 'threading' from '/home/francis/DEV/miniconda/envs/weaver-py3/lib/python3.7/threading.py'> ``` The requirement doesn't seem to like very much the different `cwt` vs `esgf-compute-api` package installed. Furthermore, looking at https://github.com/ESGF/esgf-compute-api/tree/v2.1.0, the `cwt.process` also contain some `print status` operation which is Python-2. This is only fixed in v2.2.0, but there are many (potentially breaking) changes also. Maybe we should try to request a v2.1.1 with only this statement removed. In any way, this requirement is extremely breaking weaver install, and must be resolved quickly.
1.0
[BUG] Failing CWT install of ESGF-compute API - **Describe the bug** The requirement specified here: https://github.com/crim-ca/weaver/blame/d156e01dc9bffa107aa2d3c2fcb3f720d7963c97/setup.py#L53 Constantly results into following error when installing weaver from fresh (or update): ``` [...] File "/home/francis/DEV/miniconda/envs/weaver-py3/lib/python3.7/site-packages/pkg_resources/__init__.py", line 899, in require needed = self.resolve(parse_requirements(requirements)) File "/home/francis/DEV/miniconda/envs/weaver-py3/lib/python3.7/site-packages/pkg_resources/__init__.py", line 785, in resolve raise DistributionNotFound(req, requirers) pkg_resources.DistributionNotFound: The 'cwt' distribution was not found and is required by weaver Exception ignored in: <module 'threading' from '/home/francis/DEV/miniconda/envs/weaver-py3/lib/python3.7/threading.py'> ``` The requirement doesn't seem to like very much the different `cwt` vs `esgf-compute-api` package installed. Furthermore, looking at https://github.com/ESGF/esgf-compute-api/tree/v2.1.0, the `cwt.process` also contain some `print status` operation which is Python-2. This is only fixed in v2.2.0, but there are many (potentially breaking) changes also. Maybe we should try to request a v2.1.1 with only this statement removed. In any way, this requirement is extremely breaking weaver install, and must be resolved quickly.
process
failing cwt install of esgf compute api describe the bug the requirement specified here constantly results into following error when installing weaver from fresh or update file home francis dev miniconda envs weaver lib site packages pkg resources init py line in require needed self resolve parse requirements requirements file home francis dev miniconda envs weaver lib site packages pkg resources init py line in resolve raise distributionnotfound req requirers pkg resources distributionnotfound the cwt distribution was not found and is required by weaver exception ignored in the requirement doesn t seem to like very much the different cwt vs esgf compute api package installed furthermore looking at the cwt process also contain some print status operation which is python this is only fixed in but there are many potentially breaking changes also maybe we should try to request a with only this statement removed in any way this requirement is extremely breaking weaver install and must be resolved quickly
1
20,970
3,441,841,924
IssuesEvent
2015-12-14 20:06:50
wdg/blacktree-secrets
https://api.github.com/repos/wdg/blacktree-secrets
closed
Secrets Website is down.
auto-migrated Priority-Medium Type-Defect
``` The website is down. Unable to Update Secrets. ``` Original issue reported on code.google.com by `themacin...@gmail.com` on 11 May 2008 at 5:53
1.0
Secrets Website is down. - ``` The website is down. Unable to Update Secrets. ``` Original issue reported on code.google.com by `themacin...@gmail.com` on 11 May 2008 at 5:53
non_process
secrets website is down the website is down unable to update secrets original issue reported on code google com by themacin gmail com on may at
0
175,546
21,313,849,378
IssuesEvent
2022-04-16 01:09:10
Nivaskumark/kernel_v4.1.15
https://api.github.com/repos/Nivaskumark/kernel_v4.1.15
opened
CVE-2017-17712 (High) detected in linuxlinux-4.6
security vulnerability
## CVE-2017-17712 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The raw_sendmsg() function in net/ipv4/raw.c in the Linux kernel through 4.14.6 has a race condition in inet->hdrincl that leads to uninitialized stack pointer usage; this allows a local user to execute code and gain privileges. <p>Publish Date: 2017-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-17712>CVE-2017-17712</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17712">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17712</a></p> <p>Release Date: 2017-12-16</p> <p>Fix Resolution: v4.15-rc4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2017-17712 (High) detected in linuxlinux-4.6 - ## CVE-2017-17712 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>linuxlinux-4.6</b></p></summary> <p> <p>The Linux Kernel</p> <p>Library home page: <a href=https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux>https://mirrors.edge.kernel.org/pub/linux/kernel/v4.x/?wsslib=linux</a></p> <p>Found in base branch: <b>master</b></p></p> </details> </p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Source Files (1)</summary> <p></p> <p> </p> </details> <p></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The raw_sendmsg() function in net/ipv4/raw.c in the Linux kernel through 4.14.6 has a race condition in inet->hdrincl that leads to uninitialized stack pointer usage; this allows a local user to execute code and gain privileges. <p>Publish Date: 2017-12-16 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2017-17712>CVE-2017-17712</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.0</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Local - Attack Complexity: High - Privileges Required: Low - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: High - Integrity Impact: High - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>.
</p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17712">http://web.nvd.nist.gov/view/vuln/detail?vulnId=CVE-2017-17712</a></p> <p>Release Date: 2017-12-16</p> <p>Fix Resolution: v4.15-rc4</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in linuxlinux cve high severity vulnerability vulnerable library linuxlinux the linux kernel library home page a href found in base branch master vulnerable source files vulnerability details the raw sendmsg function in net raw c in the linux kernel through has a race condition in inet hdrincl that leads to uninitialized stack pointer usage this allows a local user to execute code and gain privileges publish date url a href cvss score details base score metrics exploitability metrics attack vector local attack complexity high privileges required low user interaction none scope unchanged impact metrics confidentiality impact high integrity impact high availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
4,665
7,497,305,078
IssuesEvent
2018-04-08 18:26:59
nyu-software-engineering/online-time-tracker
https://api.github.com/repos/nyu-software-engineering/online-time-tracker
closed
Add a section for percentage of time spent in extension
1-sprint backlog 2-in process sprint 3 task
- [ ] Find Percentage of Time spent on each website
1.0
Add a section for percentage of time spent in extension - - [ ] Find Percentage of Time spent on each website
process
add a section for percentage of time spent in extension find percentage of time spent on each website
1
15,657
19,846,894,118
IssuesEvent
2022-01-21 07:47:28
ooi-data/CE06ISSM-RID16-07-NUTNRB000-recovered_host-nutnr_b_dcl_dark_conc_instrument_recovered
https://api.github.com/repos/ooi-data/CE06ISSM-RID16-07-NUTNRB000-recovered_host-nutnr_b_dcl_dark_conc_instrument_recovered
opened
🛑 Processing failed: ValueError
process
## Overview `ValueError` found in `processing_task` task during run ended on 2022-01-21T07:47:27.662849. ## Details Flow name: `CE06ISSM-RID16-07-NUTNRB000-recovered_host-nutnr_b_dcl_dark_conc_instrument_recovered` Task name: `processing_task` Error type: `ValueError` Error message: not enough values to unpack (expected 3, got 0) <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr existing_arr.append(var_data.values) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values return _as_array_or_item(self._data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item data = np.asarray(data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__ x = self.compute() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async raise_exception(exc, tb) File 
"/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter c = np.asarray(c) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__ self._ensure_cached() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 70, in __array__ return self.func(self.array) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 137, in _apply_mask data = np.asarray(data, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__ return array[key.tuple] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__ return self.get_basic_selection(selection, 
fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection return self._get_basic_selection_nd(selection=selection, out=out, File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd return self._get_selection(indexer=indexer, out=out, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection lchunk_coords, lchunk_selection, lout_selection = zip(*indexer) ValueError: not enough values to unpack (expected 3, got 0) ``` </details>
1.0
🛑 Processing failed: ValueError - ## Overview `ValueError` found in `processing_task` task during run ended on 2022-01-21T07:47:27.662849. ## Details Flow name: `CE06ISSM-RID16-07-NUTNRB000-recovered_host-nutnr_b_dcl_dark_conc_instrument_recovered` Task name: `processing_task` Error type: `ValueError` Error message: not enough values to unpack (expected 3, got 0) <details> <summary>Traceback</summary> ``` Traceback (most recent call last): File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/pipeline.py", line 165, in processing final_path = finalize_data_stream( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 84, in finalize_data_stream append_to_zarr(mod_ds, final_store, enc, logger=logger) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/__init__.py", line 357, in append_to_zarr _append_zarr(store, mod_ds) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/ooi_harvester/processor/utils.py", line 187, in _append_zarr existing_arr.append(var_data.values) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 519, in values return _as_array_or_item(self._data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/variable.py", line 259, in _as_array_or_item data = np.asarray(data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 1541, in __array__ x = self.compute() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 288, in compute (result,) = compute(self, traverse=False, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/base.py", line 571, in compute results = schedule(dsk, keys, **kwargs) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/threaded.py", line 79, in get results = get_async( File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 507, in get_async 
raise_exception(exc, tb) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 315, in reraise raise exc File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/local.py", line 220, in execute_task result = _execute_task(task, data) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/core.py", line 119, in _execute_task return func(*(_execute_task(a, cache) for a in args)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/dask/array/core.py", line 116, in getter c = np.asarray(c) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 357, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 551, in __array__ self._ensure_cached() File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 548, in _ensure_cached self.array = NumpyIndexingAdapter(np.asarray(self.array)) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 521, in __array__ return np.asarray(self.array, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 70, in __array__ return self.func(self.array) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/coding/variables.py", line 137, in _apply_mask data = np.asarray(data, dtype=dtype) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/core/indexing.py", line 422, in __array__ return np.asarray(array[self.key], dtype=None) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/xarray/backends/zarr.py", line 73, in __getitem__ return array[key.tuple] File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 673, in __getitem__ return 
self.get_basic_selection(selection, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 798, in get_basic_selection return self._get_basic_selection_nd(selection=selection, out=out, File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 841, in _get_basic_selection_nd return self._get_selection(indexer=indexer, out=out, fields=fields) File "/srv/conda/envs/notebook/lib/python3.9/site-packages/zarr/core.py", line 1135, in _get_selection lchunk_coords, lchunk_selection, lout_selection = zip(*indexer) ValueError: not enough values to unpack (expected 3, got 0) ``` </details>
process
🛑 processing failed valueerror overview valueerror found in processing task task during run ended on details flow name recovered host nutnr b dcl dark conc instrument recovered task name processing task error type valueerror error message not enough values to unpack expected got traceback traceback most recent call last file srv conda envs notebook lib site packages ooi harvester processor pipeline py line in processing final path finalize data stream file srv conda envs notebook lib site packages ooi harvester processor init py line in finalize data stream append to zarr mod ds final store enc logger logger file srv conda envs notebook lib site packages ooi harvester processor init py line in append to zarr append zarr store mod ds file srv conda envs notebook lib site packages ooi harvester processor utils py line in append zarr existing arr append var data values file srv conda envs notebook lib site packages xarray core variable py line in values return as array or item self data file srv conda envs notebook lib site packages xarray core variable py line in as array or item data np asarray data file srv conda envs notebook lib site packages dask array core py line in array x self compute file srv conda envs notebook lib site packages dask base py line in compute result compute self traverse false kwargs file srv conda envs notebook lib site packages dask base py line in compute results schedule dsk keys kwargs file srv conda envs notebook lib site packages dask threaded py line in get results get async file srv conda envs notebook lib site packages dask local py line in get async raise exception exc tb file srv conda envs notebook lib site packages dask local py line in reraise raise exc file srv conda envs notebook lib site packages dask local py line in execute task result execute task task data file srv conda envs notebook lib site packages dask core py line in execute task return func execute task a cache for a in args file srv conda envs notebook lib site 
packages dask array core py line in getter c np asarray c file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array self ensure cached file srv conda envs notebook lib site packages xarray core indexing py line in ensure cached self array numpyindexingadapter np asarray self array file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray self array dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray coding variables py line in array return self func self array file srv conda envs notebook lib site packages xarray coding variables py line in apply mask data np asarray data dtype dtype file srv conda envs notebook lib site packages xarray core indexing py line in array return np asarray array dtype none file srv conda envs notebook lib site packages xarray backends zarr py line in getitem return array file srv conda envs notebook lib site packages zarr core py line in getitem return self get basic selection selection fields fields file srv conda envs notebook lib site packages zarr core py line in get basic selection return self get basic selection nd selection selection out out file srv conda envs notebook lib site packages zarr core py line in get basic selection nd return self get selection indexer indexer out out fields fields file srv conda envs notebook lib site packages zarr core py line in get selection lchunk coords lchunk selection lout selection zip indexer valueerror not enough values to unpack expected got
1
17,211
22,795,111,678
IssuesEvent
2022-07-10 15:47:46
maticnetwork/miden
https://api.github.com/repos/maticnetwork/miden
opened
"Backfill" memory range check requests to op exec row instead of AuxTable memory trace row
enhancement processor
With the current implementation of range check lookups for stack and memory in the `p1` column (implemented in #286), it's possible for requests from the stack and requests from memory to occur in the same row of the execution trace, since the memory requests occur during each row of the memory segment of the Auxiliary Table trace, whereas the stack requests occur when the operation is executed. In order to keep the AIR constraint degrees of the `p1` column below 9, this requires an additional aux trace column `q` of intermediate values which holds the stack lookups. A better approach would be to include the range check requests from memory at the time the `mload` and `mstore` operations are executed on the stack. This would guarantee that there would never be more than one set of range check lookups requested by any operation at any given cycle. We could then remove the extra `q` column. The complication here is that memory information isn't available until the execution trace is being finalized and all memory accesses are known. Thus, we can't include memory range checks at the time that `mload` or `mstore` operations are executed. Instead, when the trace is being finalized, we need to "backfill" the memory lookup information into the decoder columns at the cycle where the operations were executed, and then update the range checker's `AuxTraceBuilder` so that range check requests by memory are performed at the row where the memory operation was executed rather than at the row where it's included in the memory trace. (This means we need to track those cycles, which is currently not done.) This requires a refactor of the current approach to the range checker's `p1` column in the [range checker's `AuxTraceBuilder`](https://github.com/maticnetwork/miden/blob/next/processor/src/range/aux_trace.rs). This will allow us to: 1. remove the `q` column, since all required AIR constraints for `p1` will be 9 degrees (or fewer) 2. 
remove the `cycle_range_checks` BTree and `CycleRangeChecks` struct and use vectors of "hints" and "rows" instead, similar to the `AuxTraceBuilder` pattern in the [stack](https://github.com/maticnetwork/miden/blob/next/processor/src/stack/aux_trace.rs) or [hasher](https://github.com/maticnetwork/miden/blob/next/processor/src/hasher/aux_trace.rs).
1.0
"Backfill" memory range check requests to op exec row instead of AuxTable memory trace row - With the current implementation of range check lookups for stack and memory in the `p1` column (implemented in #286), it's possible for requests from the stack and requests from memory to occur in the same row of the execution trace, since the memory requests occur during each row of the memory segment of the Auxiliary Table trace, whereas the stack requests occur when the operation is executed. In order to keep the AIR constraint degrees of the `p1` column below 9, this requires an additional aux trace column `q` of intermediate values which holds the stack lookups. A better approach would be to include the range check requests from memory at the time the `mload` and `mstore` operations are executed on the stack. This would guarantee that there would never be more than one set of range check lookups requested by any operation at any given cycle. We could then remove the extra `q` column. The complication here is that memory information isn't available until the execution trace is being finalized and all memory accesses are known. Thus, we can't include memory range checks at the time that `mload` or `mstore` operations are executed. Instead, when the trace is being finalized, we need to "backfill" the memory lookup information into the decoder columns at the cycle where the operations were executed, and then update the range checker's `AuxTraceBuilder` so that range check requests by memory are performed at the row where the memory operation was executed rather than at the row where it's included in the memory trace. (This means we need to track those cycles, which is currently not done.) This requires a refactor of the current approach to the range checker's `p1` column in the [range checker's `AuxTraceBuilder`](https://github.com/maticnetwork/miden/blob/next/processor/src/range/aux_trace.rs). This will allow us to: 1. 
remove the `q` column, since all required AIR constraints for `p1` will be 9 degrees (or fewer) 2. remove the `cycle_range_checks` BTree and `CycleRangeChecks` struct and use vectors of "hints" and "rows" instead, similar to the `AuxTraceBuilder` pattern in the [stack](https://github.com/maticnetwork/miden/blob/next/processor/src/stack/aux_trace.rs) or [hasher](https://github.com/maticnetwork/miden/blob/next/processor/src/hasher/aux_trace.rs).
process
backfill memory range check requests to op exec row instead of auxtable memory trace row with the current implementation of range check lookups for stack and memory in the column implemented in it s possible for requests from the stack and requests from memory to occur in the same row of the execution trace since the memory requests occur during each row of the memory segment of the auxiliary table trace whereas the stack requests occur when the operation is executed in order to keep the air constraint degrees of the column below this requires an additional aux trace column q of intermediate values which holds the stack lookups a better approach would be to include the range check requests from memory at the time the mload and mstore operations are executed on the stack this would guarantee that there would never be more than one set of range check lookups requested by any operation at any given cycle we could then remove the extra q column the complication here is that memory information isn t available until the execution trace is being finalized and all memory accesses are known thus we can t include memory range checks at the time that mload or mstore operations are executed instead when the trace is being finalized we need to backfill the memory lookup information into the decoder columns at the cycle where the operations were executed and then update the range checker s auxtracebuilder so that range check requests by memory are performed at the row where the memory operation was executed rather than at the row where it s included in the memory trace this means we need to track those cycles which is currently not done this requires a refactor of the current approach to the range checker s column in the this will allow us to remove the q column since all required air constraints for will be degrees or fewer remove the cycle range checks btree and cyclerangechecks struct and use vectors of hints and rows instead similar to the auxtracebuilder pattern in the or
1
152,487
12,109,968,427
IssuesEvent
2020-04-21 09:38:07
amitdavidson234/test-repo
https://api.github.com/repos/amitdavidson234/test-repo
opened
Test TestIntegrationOnEngine is failing
Test Failure
The test failed here: https://circleci.com/gh/demisto/server/70259#tests/containers/0 The logs are: ``` --- FAIL: TestIntegrationOnEngine (17.18s) engineutil.go:288: Error Trace: engineutil.go:288 engineutil.go:81 engine_test.go:412 Error: Should be true Test: TestIntegrationOnEngine ``` The test failed here: https://circleci.com/gh/demisto/server/70257#tests/containers/0 The logs are: ``` --- FAIL: TestIntegrationOnEngine (20.48s) engineutil.go:288: Error Trace: engineutil.go:288 engineutil.go:81 engine_test.go:412 Error: Should be true Test: TestIntegrationOnEngine ```
1.0
Test TestIntegrationOnEngine is failing - The test failed here: https://circleci.com/gh/demisto/server/70259#tests/containers/0 The logs are: ``` --- FAIL: TestIntegrationOnEngine (17.18s) engineutil.go:288: Error Trace: engineutil.go:288 engineutil.go:81 engine_test.go:412 Error: Should be true Test: TestIntegrationOnEngine ``` The test failed here: https://circleci.com/gh/demisto/server/70257#tests/containers/0 The logs are: ``` --- FAIL: TestIntegrationOnEngine (20.48s) engineutil.go:288: Error Trace: engineutil.go:288 engineutil.go:81 engine_test.go:412 Error: Should be true Test: TestIntegrationOnEngine ```
non_process
test testintegrationonengine is failing the test failed here the logs are fail testintegrationonengine engineutil go error trace engineutil go engineutil go engine test go error should be true test testintegrationonengine the test failed here the logs are fail testintegrationonengine engineutil go error trace engineutil go engineutil go engine test go error should be true test testintegrationonengine
0
231,411
18,765,782,736
IssuesEvent
2021-11-05 23:49:08
akhilaji/api_test
https://api.github.com/repos/akhilaji/api_test
opened
Test: Trying_Github_Integration completed with a status: failed.
TestGold Test Healed
Trying_Github_Integration was started at 2021-11-05 23:46:26.088419 completed at 2021-11-05 23:49:03.328089 with a final status: failed. You can view more detailed results at: https://playground.testgold.dev/interceptor/interceptor/results/v1/test/5d6f73936309bc6b31edcb5236a21231. ['test run retry']
2.0
Test: Trying_Github_Integration completed with a status: failed. - Trying_Github_Integration was started at 2021-11-05 23:46:26.088419 completed at 2021-11-05 23:49:03.328089 with a final status: failed. You can view more detailed results at: https://playground.testgold.dev/interceptor/interceptor/results/v1/test/5d6f73936309bc6b31edcb5236a21231. ['test run retry']
non_process
test trying github integration completed with a status failed trying github integration was started at completed at with a final status failed you can view more detailed results at
0
6,849
9,991,386,047
IssuesEvent
2019-07-11 10:58:21
linnovate/root
https://api.github.com/repos/linnovate/root
closed
meetings fullscreen
2.0.7 Fixed Process bug
go to meetings open new discussion press on full screen click the discussion you've opened go to Deadline and try to selecting a date result;you can not select a date for discussion go to meetings open new discussion press on full screen ![image](https://user-images.githubusercontent.com/47353222/60433633-f591ed00-9c0d-11e9-889d-ae37c8ecf632.png) click the discussion you've opened go to Deadline and try to selecting a date ![image](https://user-images.githubusercontent.com/47353222/60433713-22460480-9c0e-11e9-942b-baa6346edee8.png)
1.0
meetings fullscreen - go to meetings open new discussion press on full screen click the discussion you've opened go to Deadline and try to selecting a date result;you can not select a date for discussion go to meetings open new discussion press on full screen ![image](https://user-images.githubusercontent.com/47353222/60433633-f591ed00-9c0d-11e9-889d-ae37c8ecf632.png) click the discussion you've opened go to Deadline and try to selecting a date ![image](https://user-images.githubusercontent.com/47353222/60433713-22460480-9c0e-11e9-942b-baa6346edee8.png)
process
meetings fullscreen go to meetings open new discussion press on full screen click the discussion you ve opened go to deadline and try to selecting a date result you can not select a date for discussion go to meetings open new discussion press on full screen click the discussion you ve opened go to deadline and try to selecting a date
1
3,529
6,569,091,720
IssuesEvent
2017-09-09 02:12:38
P0cL4bs/WiFi-Pumpkin
https://api.github.com/repos/P0cL4bs/WiFi-Pumpkin
closed
Devices can't connect to AP
in process priority solved
Once I get my AP up and running everything seems fine. However when a device want to connect I always get an error in dhcpserver.py Traceback: File "/root/WiFi-Pumpkin/core/packets/dhcpserver.py, line 420, in run 'host_name': self.leases[client_mac] [12][0] Key error:12 #### Please tell us details about your environment. * Card wireless adapters name (please check if support AP/mode): AWUS036NHA * Version used tool: 0.8.5 * Virtual Machine (no ): * Operating System and version: Kali 0.6 RPI3
1.0
Devices can't connect to AP - Once I get my AP up and running everything seems fine. However when a device want to connect I always get an error in dhcpserver.py Traceback: File "/root/WiFi-Pumpkin/core/packets/dhcpserver.py, line 420, in run 'host_name': self.leases[client_mac] [12][0] Key error:12 #### Please tell us details about your environment. * Card wireless adapters name (please check if support AP/mode): AWUS036NHA * Version used tool: 0.8.5 * Virtual Machine (no ): * Operating System and version: Kali 0.6 RPI3
process
devices can t connect to ap once i get my ap up and running everything seems fine however when a device want to connect i always get an error in dhcpserver py traceback file root wifi pumpkin core packets dhcpserver py line in run host name self leases key error please tell us details about your environment card wireless adapters name please check if support ap mode version used tool virtual machine no operating system and version kali
1
2,631
5,410,219,274
IssuesEvent
2017-03-01 07:55:48
FujiXeroxNZ-Wellington/Indigo
https://api.github.com/repos/FujiXeroxNZ-Wellington/Indigo
opened
date can be entered other than dd-mm-yyyy
0-4-Contract Processing 0-Contract Management Known Issue
expected date is dd-mm-yyyy but the date can be entered as any 2 digit number for date,2 digit number for month and any 4 digit number for year. Need to implement strict date restrictions.
1.0
date can be entered other than dd-mm-yyyy - expected date is dd-mm-yyyy but the date can be entered as any 2 digit number for date,2 digit number for month and any 4 digit number for year. Need to implement strict date restrictions.
process
date can be entered other than dd mm yyyy expected date is dd mm yyyy but the date can be entered as any digit number for date digit number for month and any digit number for year need to implement strict date restrictions
1
3,342
6,475,111,153
IssuesEvent
2017-08-17 19:37:45
thewca/wca-regulations
https://api.github.com/repos/thewca/wca-regulations
closed
Simplify Handling of Borderline-Legal Puzzles
clarity guidelines process puzzles regulations
In the delegates list and elsewhere, many discussions have happened regarding the many cases where a puzzle is neither clearly legal or illegal. Personally, I don't keep track of most of these, and am not willing to. The discussions I have looked at often don't even come to a conclusion. As a result, I have seen puzzle regulations either not strongly enforced, or enforced extremely strictly, depending on a delegates philosophy. There are a number of possible solutions to this problem: 1. Status quo: just allow inconsistent delegate rulings about puzzle legality 2. Maintain an explicit list of (un)acceptable puzzles 3. Maintain an explicit list of (un)acceptable puzzle features 4. Maintain a list of only 'major' rulings on (un)acceptable puzzles 5. Relax the regulations to try and reduce the number of borderline cases 6. A mix of the above, maybe something else. Personally: I least like the first option, as I feel bad for competitors who are on the strict side of this. And although possible, I don't care too much for the second and third options, since I feel bad telling competitors they can't use puzzles for arbitrary/nuanced reasons, and don't want to have to inspect cubes to maintain fairness. The 4th option could possibly reduce the problem, but may still have the problems of points 2-3, and won't solve it completely. So I somewhat prefer the 5th option, specifically a change that would effectively allow 'see-through' cubes, which (as far as I care to think about right now) would legalize most border cases except for BLD. But in any case, I really would just prefer not 1-4. This is sort of a meta-issue including things like #13, #59, #50, #33 (point 5 above also makes debate about logos sort of irrelevant), etc.
1.0
Simplify Handling of Borderline-Legal Puzzles - In the delegates list and elsewhere, many discussions have happened regarding the many cases where a puzzle is neither clearly legal or illegal. Personally, I don't keep track of most of these, and am not willing to. The discussions I have looked at often don't even come to a conclusion. As a result, I have seen puzzle regulations either not strongly enforced, or enforced extremely strictly, depending on a delegates philosophy. There are a number of possible solutions to this problem: 1. Status quo: just allow inconsistent delegate rulings about puzzle legality 2. Maintain an explicit list of (un)acceptable puzzles 3. Maintain an explicit list of (un)acceptable puzzle features 4. Maintain a list of only 'major' rulings on (un)acceptable puzzles 5. Relax the regulations to try and reduce the number of borderline cases 6. A mix of the above, maybe something else. Personally: I least like the first option, as I feel bad for competitors who are on the strict side of this. And although possible, I don't care too much for the second and third options, since I feel bad telling competitors they can't use puzzles for arbitrary/nuanced reasons, and don't want to have to inspect cubes to maintain fairness. The 4th option could possibly reduce the problem, but may still have the problems of points 2-3, and won't solve it completely. So I somewhat prefer the 5th option, specifically a change that would effectively allow 'see-through' cubes, which (as far as I care to think about right now) would legalize most border cases except for BLD. But in any case, I really would just prefer not 1-4. This is sort of a meta-issue including things like #13, #59, #50, #33 (point 5 above also makes debate about logos sort of irrelevant), etc.
process
simplify handling of borderline legal puzzles in the delegates list and elsewhere many discussions have happened regarding the many cases where a puzzle is neither clearly legal or illegal personally i don t keep track of most of these and am not willing to the discussions i have looked at often don t even come to a conclusion as a result i have seen puzzle regulations either not strongly enforced or enforced extremely strictly depending on a delegates philosophy there are a number of possible solutions to this problem status quo just allow inconsistent delegate rulings about puzzle legality maintain an explicit list of un acceptable puzzles maintain an explicit list of un acceptable puzzle features maintain a list of only major rulings on un acceptable puzzles relax the regulations to try and reduce the number of borderline cases a mix of the above maybe something else personally i least like the first option as i feel bad for competitors who are on the strict side of this and although possible i don t care too much for the second and third options since i feel bad telling competitors they can t use puzzles for arbitrary nuanced reasons and don t want to have to inspect cubes to maintain fairness the option could possibly reduce the problem but may still have the problems of points and won t solve it completely so i somewhat prefer the option specifically a change that would effectively allow see through cubes which as far as i care to think about right now would legalize most border cases except for bld but in any case i really would just prefer not this is sort of a meta issue including things like point above also makes debate about logos sort of irrelevant etc
1
10,706
13,501,854,591
IssuesEvent
2020-09-13 05:00:05
amor71/LiuAlgoTrader
https://api.github.com/repos/amor71/LiuAlgoTrader
closed
liquidation of non-updated stocks
in-process
sell stocks before the end of trading session, even if there are no update/trades
1.0
liquidation of non-updated stocks - sell stocks before the end of trading session, even if there are no update/trades
process
liquidation of non updated stocks sell stocks before the end of trading session even if there are no update trades
1
117,394
11,945,953,454
IssuesEvent
2020-04-03 07:08:06
sharadhr/ped
https://api.github.com/repos/sharadhr/ped
opened
UG user interface image does not match actual UI; more explanations using the actual UI could be done
severity.Medium type.DocumentationBug
The image of the user interface of the application appears to be a mock-up, or a proposal, rather than the actual interface: ![image.png](https://raw.githubusercontent.com/sharadhr/ped/master/files/101bcce3-080b-49e3-9b22-e275a6cab333.png) Once again, to *first-time* users, this would be very, very confusing, and detract from the purpose of a user guide in the first place. Secondly, you might want to consider using screenshots from the actual UI, and employ appropriate annotations to help the user along with the UI, because as it is, it is still not clear what the purpose of each tab is, *despite* the naming on top.
1.0
UG user interface image does not match actual UI; more explanations using the actual UI could be done - The image of the user interface of the application appears to be a mock-up, or a proposal, rather than the actual interface: ![image.png](https://raw.githubusercontent.com/sharadhr/ped/master/files/101bcce3-080b-49e3-9b22-e275a6cab333.png) Once again, to *first-time* users, this would be very, very confusing, and detract from the purpose of a user guide in the first place. Secondly, you might want to consider using screenshots from the actual UI, and employ appropriate annotations to help the user along with the UI, because as it is, it is still not clear what the purpose of each tab is, *despite* the naming on top.
non_process
ug user interface image does not match actual ui more explanations using the actual ui could be done the image of the user interface of the application appears to be a mock up or a proposal rather than the actual interface once again to first time users this would be very very confusing and detract from the purpose of a user guide in the first place secondly you might want to consider using screenshots from the actual ui and employ appropriate annotations to help the user along with the ui because as it is it is still not clear what the purpose of each tab is despite the naming on top
0
15,107
18,844,481,585
IssuesEvent
2021-11-11 13:30:24
ethereum/EIPs
https://api.github.com/repos/ethereum/EIPs
closed
Add definitions of SHOULD, MUST, OPTIONAL to EIP1
type: Meta type: EIP1 (Process)
I noticed in #20 that the words SHOULD, MUST and OPTIONAL are used without explanation. They are capitalised, so I assume there is a special meaning attached. No definitions are given in [EIP1 - EIP Purpose and Guidelines](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-1.md). I MUST know that they mean. SHOULD these terms be added to EIP1 or OPTIONALy somewhere else?
1.0
Add definitions of SHOULD, MUST, OPTIONAL to EIP1 - I noticed in #20 that the words SHOULD, MUST and OPTIONAL are used without explanation. They are capitalised, so I assume there is a special meaning attached. No definitions are given in [EIP1 - EIP Purpose and Guidelines](https://github.com/ethereum/EIPs/blob/master/EIPS/eip-1.md). I MUST know that they mean. SHOULD these terms be added to EIP1 or OPTIONALy somewhere else?
process
add definitions of should must optional to i noticed in that the words should must and optional are used without explanation they are capitalised so i assume there is a special meaning attached no definitions are given in i must know that they mean should these terms be added to or optionaly somewhere else
1
621,242
19,581,404,586
IssuesEvent
2022-01-04 21:53:51
rubrikinc/rubrik-sdk-for-powershell
https://api.github.com/repos/rubrikinc/rubrik-sdk-for-powershell
closed
Add -ExistingSnapshotRetention parameter to various Protect-* cmdlets
platform-windows kind-enhancement help-wanted exp-intermediate platform-linux platform-mac priority-p2
The `Protect-RubrikVM` cmdlet provides the ability to modify retention of existing snapshots when using the `-DoNotProtect` parameter by specifing `-ExistingSnapshotRetention` when unprotecting. This is also referenced in [Issue 785](https://github.com/rubrikinc/rubrik-sdk-for-powershell/issues/785) This functionality should also be available on * Protect-RubrikFileset * Protect-RubrikDatabase * Protect-RubrikHyperVVM * Protect-RubrikNutanixVM
1.0
Add -ExistingSnapshotRetention parameter to various Protect-* cmdlets - The `Protect-RubrikVM` cmdlet provides the ability to modify retention of existing snapshots when using the `-DoNotProtect` parameter by specifing `-ExistingSnapshotRetention` when unprotecting. This is also referenced in [Issue 785](https://github.com/rubrikinc/rubrik-sdk-for-powershell/issues/785) This functionality should also be available on * Protect-RubrikFileset * Protect-RubrikDatabase * Protect-RubrikHyperVVM * Protect-RubrikNutanixVM
non_process
add existingsnapshotretention parameter to various protect cmdlets the protect rubrikvm cmdlet provides the ability to modify retention of existing snapshots when using the donotprotect parameter by specifing existingsnapshotretention when unprotecting this is also referenced in this functionality should also be available on protect rubrikfileset protect rubrikdatabase protect rubrikhypervvm protect rubriknutanixvm
0
11,490
14,361,846,166
IssuesEvent
2020-11-30 18:53:34
CERT-Polska/drakvuf-sandbox
https://api.github.com/repos/CERT-Polska/drakvuf-sandbox
opened
Send new task after postprocessing
certpl drakrun/postprocessing priority:medium
It's not possible to listen for an end of postprocessing. Send a task when it has finished doing things.
1.0
Send new task after postprocessing - It's not possible to listen for an end of postprocessing. Send a task when it has finished doing things.
process
send new task after postprocessing it s not possible to listen for an end of postprocessing send a task when it has finished doing things
1
21,022
27,969,907,735
IssuesEvent
2023-03-25 00:17:00
ethereum/EIPs
https://api.github.com/repos/ethereum/EIPs
closed
Updated PR Merge Queue
w-stale enhancement r-process e-consensus
### Proposed Change I suggest that instead of the current "EIP Editors review whichever PRs they want", I suggest the following: - If there are one or more PRs that have contentious issues (changes to CI, the website, and living and final EIPs would count as contentious until proven otherwise by the PR being merged) and have new activity, review them first, in ascending order of least recent unread message. GitHub will give you that list for you (if we make a contentious issue label): TODO - If there are one or more PRs that would need manual merging and have new activity, review them in top-to-down milestone order (new PRs must appear at the bottom). Again, GitHub gives you that list for you: https://github.com/ethereum/EIPs/milestone/1 - Review all PRs that add a new EIP as Withdrawn, Living, or Final and have new activity in *least* recently updated order. (All these items have lists that could work. I could even probably make a small app that aggregates these lists). - Review all PRs that change the status of an *existing* EIP to final, living, or withdrawn and have new activity in *most* recently updated order - Review all PRs that change the status of an *existing* EIP to last call and have new activity in *most* recently updated order - Review all PRs that change the status of an *existing* EIP to review and have new activity in *most* recently updated order - Review all PRs that add a new EIP and have new activity in order of ascending number of changed lines - Review all other PRs that have new activity at random :] (I think I covered all the cases, but this works as a catch-all)
1.0
Updated PR Merge Queue - ### Proposed Change I suggest that instead of the current "EIP Editors review whichever PRs they want", I suggest the following: - If there are one or more PRs that have contentious issues (changes to CI, the website, and living and final EIPs would count as contentious until proven otherwise by the PR being merged) and have new activity, review them first, in ascending order of least recent unread message. GitHub will give you that list for you (if we make a contentious issue label): TODO - If there are one or more PRs that would need manual merging and have new activity, review them in top-to-down milestone order (new PRs must appear at the bottom). Again, GitHub gives you that list for you: https://github.com/ethereum/EIPs/milestone/1 - Review all PRs that add a new EIP as Withdrawn, Living, or Final and have new activity in *least* recently updated order. (All these items have lists that could work. I could even probably make a small app that aggregates these lists). - Review all PRs that change the status of an *existing* EIP to final, living, or withdrawn and have new activity in *most* recently updated order - Review all PRs that change the status of an *existing* EIP to last call and have new activity in *most* recently updated order - Review all PRs that change the status of an *existing* EIP to review and have new activity in *most* recently updated order - Review all PRs that add a new EIP and have new activity in order of ascending number of changed lines - Review all other PRs that have new activity at random :] (I think I covered all the cases, but this works as a catch-all)
process
updated pr merge queue proposed change i suggest that instead of the current eip editors review whichever prs they want i suggest the following if there are one or more prs that have contentious issues changes to ci the website and living and final eips would count as contentious until proven otherwise by the pr being merged and have new activity review them first in ascending order of least recent unread message github will give you that list for you if we make a contentious issue label todo if there are one or more prs that would need manual merging and have new activity review them in top to down milestone order new prs must appear at the bottom again github gives you that list for you review all prs that add a new eip as withdrawn living or final and have new activity in least recently updated order all these items have lists that could work i could even probably make a small app that aggregates these lists review all prs that change the status of an existing eip to final living or withdrawn and have new activity in most recently updated order review all prs that change the status of an existing eip to last call and have new activity in most recently updated order review all prs that change the status of an existing eip to review and have new activity in most recently updated order review all prs that add a new eip and have new activity in order of ascending number of changed lines review all other prs that have new activity at random i think i covered all the cases but this works as a catch all
1
10,658
13,451,716,037
IssuesEvent
2020-09-08 20:44:53
nlpie/mtap
https://api.github.com/repos/nlpie/mtap
opened
Rename the "--config" run option to "--mtap-config"
area/framework/processing kind/enhancement lang/framework-langs
The "--config" option for run_processor / processor_parser is overly general and it would be useful to both increase specificity and to free it up for use by processors themselves.
1.0
Rename the "--config" run option to "--mtap-config" - The "--config" option for run_processor / processor_parser is overly general and it would be useful to both increase specificity and to free it up for use by processors themselves.
process
rename the config run option to mtap config the config option for run processor processor parser is overly general and it would be useful to both increase specificity and to free it up for use by processors themselves
1
359,189
10,661,153,756
IssuesEvent
2019-10-18 11:39:44
webcompat/web-bugs
https://api.github.com/repos/webcompat/web-bugs
closed
forum.worldoftanks.ru - see bug description
ML Correct ML ON browser-firefox engine-gecko priority-important
<!-- @browser: Firefox 71.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:71.0) Gecko/20100101 Firefox/71.0 --> <!-- @reported_with: desktop-reporter --> **URL**: http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu **Browser / Version**: Firefox 71.0 **Operating System**: Windows 10 **Tested Another Browser**: Unknown **Problem type**: Something else **Description**: smallest font size **Steps to Reproduce**: smallest font size does not work for this site [![Screenshot Description](https://webcompat.com/uploads/2019/10/af319735-fd19-4d9e-8530-957fdbdcba17-thumb.jpeg)](https://webcompat.com/uploads/2019/10/af319735-fd19-4d9e-8530-957fdbdcba17.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20191013093601</li><li>channel: nightly</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> <p>Console Messages:</p> <pre> [{'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/games/ru.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/cards/ru.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/realms/ru_wot.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/services/ru_wot.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> 
http://cm-ru.wargaming.net/media/public/config/i18n/en.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/i18n/ru.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/3rd_party/prototype.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '79:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/ipb.js?ipbv=90159526a9cc9790f8805fef06a42237&load=quickpm,hovercard,board .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '80:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/3rd_party/scriptaculous/scriptaculous-cache.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '81:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/lang_cache/master_ru_RU.UTF-8/ipb.lang.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '82:1'}, {'level': 'warn', 'log': ['This page uses the non standard property zoom. 
Consider using calc() in the relevant property values, or using transform along with transform-origin: 0 0.'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}, {'level': 'warn', 'log': [' <script> http://static-cds.gcdn.co/static/client/js/wgcds.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '84:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/auth.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '113:1'}, {'level': 'error', 'log': ['ReferenceError: ipb is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '175:3'}, {'level': 'error', 'log': ['TypeError: document.observe is not a function'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '247:13'}, {'level': 'error', 'log': ['ReferenceError: $ is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '356:10'}, {'level': 'error', 'log': ['ReferenceError: ipb is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '3115:23'}, {'level': 'error', 'log': ['ReferenceError: WGCDS is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '3290:10'}, {'level': 'warn', 'log': [' (http://) . 
, .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}, {'level': 'warn', 'log': [' <script> http://tenor.wargaming.net/assets/clicks/static/tracker.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '47:28'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12136'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12197'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '117:16'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12800'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12868'}, {'level': 'warn', 'log': [' onmozfullscreenchange .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}, {'level': 'warn', 'log': [' onmozfullscreenerror .'], 'uri': 
'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
1.0
forum.worldoftanks.ru - see bug description - <!-- @browser: Firefox 71.0 --> <!-- @ua_header: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:71.0) Gecko/20100101 Firefox/71.0 --> <!-- @reported_with: desktop-reporter --> **URL**: http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu **Browser / Version**: Firefox 71.0 **Operating System**: Windows 10 **Tested Another Browser**: Unknown **Problem type**: Something else **Description**: smallest font size **Steps to Reproduce**: smallest font size does not work for this site [![Screenshot Description](https://webcompat.com/uploads/2019/10/af319735-fd19-4d9e-8530-957fdbdcba17-thumb.jpeg)](https://webcompat.com/uploads/2019/10/af319735-fd19-4d9e-8530-957fdbdcba17.jpeg) <details> <summary>Browser Configuration</summary> <ul> <li>gfx.webrender.all: false</li><li>gfx.webrender.blob-images: true</li><li>gfx.webrender.enabled: false</li><li>image.mem.shared: true</li><li>buildID: 20191013093601</li><li>channel: nightly</li><li>hasTouchScreen: false</li><li>mixed active content blocked: false</li><li>mixed passive content blocked: false</li><li>tracking content blocked: false</li> </ul> <p>Console Messages:</p> <pre> [{'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/games/ru.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/cards/ru.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/realms/ru_wot.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/services/ru_wot.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' 
<script> http://cm-ru.wargaming.net/media/public/config/i18n/en.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cm-ru.wargaming.net/media/public/config/i18n/ru.js?1.1.14 .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/3rd_party/prototype.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '79:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/ipb.js?ipbv=90159526a9cc9790f8805fef06a42237&load=quickpm,hovercard,board .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '80:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/3rd_party/scriptaculous/scriptaculous-cache.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '81:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/lang_cache/master_ru_RU.UTF-8/ipb.lang.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '82:1'}, {'level': 'warn', 'log': ['This page uses the non standard property zoom. 
Consider using calc() in the relevant property values, or using transform along with transform-origin: 0 0.'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}, {'level': 'warn', 'log': [' <script> http://static-cds.gcdn.co/static/client/js/wgcds.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '84:1'}, {'level': 'warn', 'log': [' <script> http://cdn-frm-eu.wargaming.net/4.5/js/auth.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '113:1'}, {'level': 'error', 'log': ['ReferenceError: ipb is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '175:3'}, {'level': 'error', 'log': ['TypeError: document.observe is not a function'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '247:13'}, {'level': 'error', 'log': ['ReferenceError: $ is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '356:10'}, {'level': 'error', 'log': ['ReferenceError: ipb is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '3115:23'}, {'level': 'error', 'log': ['ReferenceError: WGCDS is not defined'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '3290:10'}, {'level': 'warn', 'log': [' (http://) . 
, .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}, {'level': 'warn', 'log': [' <script> http://tenor.wargaming.net/assets/clicks/static/tracker.js .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:1'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '47:28'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12136'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12197'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '117:16'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12800'}, {'level': 'warn', 'log': [' https://cm-ru.wargaming.net/public/shared-frame.html?ts=1571026318593&origin=%2F%2Fforum.worldoftanks.ru&user_id=- , , , .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '1:12868'}, {'level': 'warn', 'log': [' onmozfullscreenchange .'], 'uri': 'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}, {'level': 'warn', 'log': [' onmozfullscreenerror .'], 'uri': 
'http://forum.worldoftanks.ru/?link_place=wotp_link_main-menu', 'pos': '0:0'}] </pre> </details> _From [webcompat.com](https://webcompat.com/) with ❤️_
non_process
forum worldoftanks ru see bug description url browser version firefox operating system windows tested another browser unknown problem type something else description smallest font size steps to reproduce smallest font size does not work for this site browser configuration gfx webrender all false gfx webrender blob images true gfx webrender enabled false image mem shared true buildid channel nightly hastouchscreen false mixed active content blocked false mixed passive content blocked false tracking content blocked false console messages uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level error log uri pos level error log uri pos level error log uri pos level error log uri pos level error log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos level warn log uri pos from with ❤️
0
98,084
20,605,939,843
IssuesEvent
2022-03-07 00:01:18
inventare/django-image-uploader-widget
https://api.github.com/repos/inventare/django-image-uploader-widget
closed
Fix "identical-code" issue in src/ImageUploaderInline.ts
CodeClimate
Identical blocks of code found in 2 locations. Consider refactoring. https://codeclimate.com/github/inventare/django-image-uploader-widget/src/ImageUploaderInline.ts#issue_620e9b9e31c83f0001000077
1.0
Fix "identical-code" issue in src/ImageUploaderInline.ts - Identical blocks of code found in 2 locations. Consider refactoring. https://codeclimate.com/github/inventare/django-image-uploader-widget/src/ImageUploaderInline.ts#issue_620e9b9e31c83f0001000077
non_process
fix identical code issue in src imageuploaderinline ts identical blocks of code found in locations consider refactoring
0
412,962
27,881,680,313
IssuesEvent
2023-03-21 19:56:59
spring-projects/spring-boot
https://api.github.com/repos/spring-projects/spring-boot
closed
Add documentation tip showing how to configure publishRegistry Maven properties from the command line
type: documentation status: forward-port
Forward port of issue #34517 to 3.0.x.
1.0
Add documentation tip showing how to configure publishRegistry Maven properties from the command line - Forward port of issue #34517 to 3.0.x.
non_process
add documentation tip showing how to configure publishregistry maven properties from the command line forward port of issue to x
0
257,648
19,527,155,789
IssuesEvent
2021-12-30 10:02:13
playframework/playframework
https://api.github.com/repos/playframework/playframework
closed
Change from ValidationError to JsonValidationError should be documented in the Play 2.6 migration guide
good first issue topic:documentation
https://github.com/playframework/playframework/pull/6763 requires updating projects to use `JsonValidationError` in Play JSON reads instead of `play.api.data.validation.ValidationError`. This is currently missing from the [Play 2.6 Migration Guide](https://www.playframework.com/documentation/2.6.x/Migration26).
1.0
Change from ValidationError to JsonValidationError should be documented in the Play 2.6 migration guide - https://github.com/playframework/playframework/pull/6763 requires updating projects to use `JsonValidationError` in Play JSON reads instead of `play.api.data.validation.ValidationError`. This is currently missing from the [Play 2.6 Migration Guide](https://www.playframework.com/documentation/2.6.x/Migration26).
non_process
change from validationerror to jsonvalidationerror should be documented in the play migration guide requires updating projects to use jsonvalidationerror in play json reads instead of play api data validation validationerror this is currently missing from the
0
1,339
3,900,497,394
IssuesEvent
2016-04-18 06:27:41
DynareTeam/dynare
https://api.github.com/repos/DynareTeam/dynare
closed
derivation with respect to the parameters
enhancement preprocessor
When modifying <fname>_static.m in order to address problems with auxiliary variables (see #1133), I realized that the creation of <fname>_params_derivs.m and the code for analytical derivatives of the likelihood function assume that the static model is the static version of the <fname>_dynamic, equation by equation. This is not obvious in the preprocessor where the static model is stored and handled in a different tree and it stops me to find an efficient solution to the computation of the steady state of auxiliary variables. For this reason, I will introduce two files for the computation of the derivatives with respect to the parameters: <fname>_params_derivs_static and <fname>_derivs_dynamic I will also change all calling sequences throughout the code
1.0
derivation with respect to the parameters - When modifying <fname>_static.m in order to address problems with auxiliary variables (see #1133), I realized that the creation of <fname>_params_derivs.m and the code for analytical derivatives of the likelihood function assume that the static model is the static version of the <fname>_dynamic, equation by equation. This is not obvious in the preprocessor where the static model is stored and handled in a different tree and it stops me to find an efficient solution to the computation of the steady state of auxiliary variables. For this reason, I will introduce two files for the computation of the derivatives with respect to the parameters: <fname>_params_derivs_static and <fname>_derivs_dynamic I will also change all calling sequences throughout the code
process
derivation with respect to the parameters when modifying static m in order to address problems with auxiliary variables see i realized that the creation of params derivs m and the code for analytical derivatives of the likelihood function assume that the static model is the static version of the dynamic equation by equation this is not obvious in the preprocessor where the static model is stored and handled in a different tree and it stops me to find an efficient solution to the computation of the steady state of auxiliary variables for this reason i will introduce two files for the computation of the derivatives with respect to the parameters params derivs static and derivs dynamic i will also change all calling sequences throughout the code
1
113,734
17,150,891,579
IssuesEvent
2021-07-13 20:26:40
snowdensb/braindump
https://api.github.com/repos/snowdensb/braindump
opened
WS-2018-0069 (High) detected in is-my-json-valid-2.15.0.tgz
security vulnerability
## WS-2018-0069 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>is-my-json-valid-2.15.0.tgz</b></p></summary> <p>A JSONSchema validator that uses code generation to be extremely fast</p> <p>Library home page: <a href="https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz">https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz</a></p> <p>Path to dependency file: braindump/package.json</p> <p>Path to vulnerable library: braindump/node_modules/is-my-json-valid</p> <p> Dependency Hierarchy: - gulp-sass-2.3.2.tgz (Root Library) - node-sass-3.12.1.tgz - request-2.78.0.tgz - har-validator-2.0.6.tgz - :x: **is-my-json-valid-2.15.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/snowdensb/braindump/commit/815ae0afebcf867f02143f3ab9cf88b1d4dacdec">815ae0afebcf867f02143f3ab9cf88b1d4dacdec</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Version of is-my-json-valid before 1.4.1 or 2.17.2 are vulnerable to regular expression denial of service (ReDoS) via the email validation function. 
<p>Publish Date: 2018-02-14 <p>URL: <a href=https://github.com/mafintosh/is-my-json-valid/commit/b3051b277f7caa08cd2edc6f74f50aeda65d2976>WS-2018-0069</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nodesecurity.io/advisories/572">https://nodesecurity.io/advisories/572</a></p> <p>Release Date: 2018-01-24</p> <p>Fix Resolution: 1.4.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"is-my-json-valid","packageVersion":"2.15.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-sass:2.3.2;node-sass:3.12.1;request:2.78.0;har-validator:2.0.6;is-my-json-valid:2.15.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.4.1"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0069","vulnerabilityDetails":"Version of is-my-json-valid before 1.4.1 or 2.17.2 are vulnerable to regular expression denial of service (ReDoS) via the email validation 
function.","vulnerabilityUrl":"https://github.com/mafintosh/is-my-json-valid/commit/b3051b277f7caa08cd2edc6f74f50aeda65d2976","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
True
WS-2018-0069 (High) detected in is-my-json-valid-2.15.0.tgz - ## WS-2018-0069 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>is-my-json-valid-2.15.0.tgz</b></p></summary> <p>A JSONSchema validator that uses code generation to be extremely fast</p> <p>Library home page: <a href="https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz">https://registry.npmjs.org/is-my-json-valid/-/is-my-json-valid-2.15.0.tgz</a></p> <p>Path to dependency file: braindump/package.json</p> <p>Path to vulnerable library: braindump/node_modules/is-my-json-valid</p> <p> Dependency Hierarchy: - gulp-sass-2.3.2.tgz (Root Library) - node-sass-3.12.1.tgz - request-2.78.0.tgz - har-validator-2.0.6.tgz - :x: **is-my-json-valid-2.15.0.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/snowdensb/braindump/commit/815ae0afebcf867f02143f3ab9cf88b1d4dacdec">815ae0afebcf867f02143f3ab9cf88b1d4dacdec</a></p> <p>Found in base branch: <b>master</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> Version of is-my-json-valid before 1.4.1 or 2.17.2 are vulnerable to regular expression denial of service (ReDoS) via the email validation function. 
<p>Publish Date: 2018-02-14 <p>URL: <a href=https://github.com/mafintosh/is-my-json-valid/commit/b3051b277f7caa08cd2edc6f74f50aeda65d2976>WS-2018-0069</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://nodesecurity.io/advisories/572">https://nodesecurity.io/advisories/572</a></p> <p>Release Date: 2018-01-24</p> <p>Fix Resolution: 1.4.1</p> </p> </details> <p></p> <!-- <REMEDIATE>{"isOpenPROnVulnerability":true,"isPackageBased":true,"isDefaultBranch":true,"packages":[{"packageType":"javascript/Node.js","packageName":"is-my-json-valid","packageVersion":"2.15.0","packageFilePaths":["/package.json"],"isTransitiveDependency":true,"dependencyTree":"gulp-sass:2.3.2;node-sass:3.12.1;request:2.78.0;har-validator:2.0.6;is-my-json-valid:2.15.0","isMinimumFixVersionAvailable":true,"minimumFixVersion":"1.4.1"}],"baseBranches":["master"],"vulnerabilityIdentifier":"WS-2018-0069","vulnerabilityDetails":"Version of is-my-json-valid before 1.4.1 or 2.17.2 are vulnerable to regular expression denial of service (ReDoS) via the email validation 
function.","vulnerabilityUrl":"https://github.com/mafintosh/is-my-json-valid/commit/b3051b277f7caa08cd2edc6f74f50aeda65d2976","cvss3Severity":"high","cvss3Score":"7.5","cvss3Metrics":{"A":"High","AC":"Low","PR":"None","S":"Unchanged","C":"None","UI":"None","AV":"Network","I":"None"},"extraData":{}}</REMEDIATE> -->
non_process
ws high detected in is my json valid tgz ws high severity vulnerability vulnerable library is my json valid tgz a jsonschema validator that uses code generation to be extremely fast library home page a href path to dependency file braindump package json path to vulnerable library braindump node modules is my json valid dependency hierarchy gulp sass tgz root library node sass tgz request tgz har validator tgz x is my json valid tgz vulnerable library found in head commit a href found in base branch master vulnerability details version of is my json valid before or are vulnerable to regular expression denial of service redos via the email validation function publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution isopenpronvulnerability true ispackagebased true isdefaultbranch true packages istransitivedependency true dependencytree gulp sass node sass request har validator is my json valid isminimumfixversionavailable true minimumfixversion basebranches vulnerabilityidentifier ws vulnerabilitydetails version of is my json valid before or are vulnerable to regular expression denial of service redos via the email validation function vulnerabilityurl
0
11,912
14,019,157,871
IssuesEvent
2020-10-29 17:47:28
gudmdharalds/vip-go-mu-plugins-test
https://api.github.com/repos/gudmdharalds/vip-go-mu-plugins-test
closed
PHP 7.4 Compatibility issue lib/class-apc-cache-interceptor.php
PHP Compatibility PHP Compatiblity 7.4
Found issue in master: * <b>Warning</b>: Curly brace syntax for accessing array elements and string offsets has been deprecated in PHP 7.4. Found: $group{0} https://github.com/gudmdharalds/vip-go-mu-plugins-test/blob/28a657d751092304567701c0b0128d2d1c6132ff/lib/class-apc-cache-interceptor.php#L171
True
PHP 7.4 Compatibility issue lib/class-apc-cache-interceptor.php - Found issue in master: * <b>Warning</b>: Curly brace syntax for accessing array elements and string offsets has been deprecated in PHP 7.4. Found: $group{0} https://github.com/gudmdharalds/vip-go-mu-plugins-test/blob/28a657d751092304567701c0b0128d2d1c6132ff/lib/class-apc-cache-interceptor.php#L171
non_process
php compatibility issue lib class apc cache interceptor php found issue in master warning curly brace syntax for accessing array elements and string offsets has been deprecated in php found group
0
14,347
17,372,126,760
IssuesEvent
2021-07-30 15:17:49
DevExpress/testcafe-hammerhead
https://api.github.com/repos/DevExpress/testcafe-hammerhead
closed
Regression causes hammerhead to apparently execute arbitrary remote code
AREA: client AREA: server FREQUENCY: level 1 SYSTEM: script processing TYPE: bug
### What is your Scenario? After the recent fix for HTTP2 GOAWAY I upgraded from hammerhead 23.0.0 to 24.4.2. Unfortunately there appears to be a regression when loading the URL https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js via the proxy. ### What is the Current behavior? Loading `https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js` via the proxy (e.g. `http://localhost:50406/S6CRz7GiD!s!utf-8/https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js`) ends up with: ``` TypeError: Cannot read property 'name' of null ``` <!-- Describe the behavior you see and consider invalid. --> ### What is the Expected behavior? Loading the URL https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js ends up with the expected JavaScript text. ### What is your public web site URL? See above, I also attached a screenshot: <img width="2558" alt="Bildschirmfoto 2021-07-29 um 16 08 48" src="https://user-images.githubusercontent.com/3372410/127507551-2aa0c081-b3fb-4ea9-b81b-ebb5875f8c54.png"> ### Steps to Reproduce: ``` test('reproduce', async (t) => { await t.navigateTo('https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js') await t.wait(5000) }) ``` ### Your Environment details: testcafe versions: ``` "testcafe": "1.15.1", "testcafe-reporter-html": "1.4.6", "testcafe-hammerhead": "24.4.2", "testcafe-reporter-spec-time": "4.0.0", "testcafe-browser-tools": "2.0.16" ``` config: ``` { "disableScreenshots": true, "stopOnFirstFail": true, "debugOnFail": true, "speed": 1, "selectorTimeout": 20000, "assertionTimeout": 10000, "pageLoadTimeout": 20000, "pageRequestTimeout": 30000, "retryTestPages": false, "disableMultipleWindows": true, "src": ["tests"], "skipJsErrors": true, "reporter": [ { "name": "spec-time" }, { "name": "xunit", "output": "reports/testcafe-unit-tests.xml" }, { "name": "html", "output": "reports/testcafe-report.html" } ], "browsers": "firefox 
--width 1920 --height 1280", "concurrency": 2, "quarantine": true, "hostname": "localhost", "live": true } ``` * node.js version: v16.5.0 * browser name and version: FireFox 89.0.2 * platform and version: latest macOS
1.0
Regression causes hammerhead to apparently execute arbitrary remote code - ### What is your Scenario? After the recent fix for HTTP2 GOAWAY I upgraded from hammerhead 23.0.0 to 24.4.2. Unfortunately there appears to be a regression when loading the URL https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js via the proxy. ### What is the Current behavior? Loading `https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js` via the proxy (e.g. `http://localhost:50406/S6CRz7GiD!s!utf-8/https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js`) ends up with: ``` TypeError: Cannot read property 'name' of null ``` <!-- Describe the behavior you see and consider invalid. --> ### What is the Expected behavior? Loading the URL https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js ends up with the expected JavaScript text. ### What is your public web site URL? See above, I also attached a screenshot: <img width="2558" alt="Bildschirmfoto 2021-07-29 um 16 08 48" src="https://user-images.githubusercontent.com/3372410/127507551-2aa0c081-b3fb-4ea9-b81b-ebb5875f8c54.png"> ### Steps to Reproduce: ``` test('reproduce', async (t) => { await t.navigateTo('https://cdn.jsdelivr.net/npm/monaco-editor@0.25.2/min/vs/editor/editor.main.js') await t.wait(5000) }) ``` ### Your Environment details: testcafe versions: ``` "testcafe": "1.15.1", "testcafe-reporter-html": "1.4.6", "testcafe-hammerhead": "24.4.2", "testcafe-reporter-spec-time": "4.0.0", "testcafe-browser-tools": "2.0.16" ``` config: ``` { "disableScreenshots": true, "stopOnFirstFail": true, "debugOnFail": true, "speed": 1, "selectorTimeout": 20000, "assertionTimeout": 10000, "pageLoadTimeout": 20000, "pageRequestTimeout": 30000, "retryTestPages": false, "disableMultipleWindows": true, "src": ["tests"], "skipJsErrors": true, "reporter": [ { "name": "spec-time" }, { "name": "xunit", "output": "reports/testcafe-unit-tests.xml" }, { "name": 
"html", "output": "reports/testcafe-report.html" } ], "browsers": "firefox --width 1920 --height 1280", "concurrency": 2, "quarantine": true, "hostname": "localhost", "live": true } ``` * node.js version: v16.5.0 * browser name and version: FireFox 89.0.2 * platform and version: latest macOS
process
regression causes hammerhead to apparently execute arbitrary remote code what is your scenario after the recent fix for goaway i upgraded from hammerhead to unfortunately there appears to be a regression when loading the url via the proxy what is the current behavior loading via the proxy e g ends up with typeerror cannot read property name of null what is the expected behavior loading the url ends up with the expected javascript text what is your public web site url see above i also attached a screenshot img width alt bildschirmfoto um src steps to reproduce test reproduce async t await t navigateto await t wait your environment details testcafe versions testcafe testcafe reporter html testcafe hammerhead testcafe reporter spec time testcafe browser tools config disablescreenshots true stoponfirstfail true debugonfail true speed selectortimeout assertiontimeout pageloadtimeout pagerequesttimeout retrytestpages false disablemultiplewindows true src skipjserrors true reporter name spec time name xunit output reports testcafe unit tests xml name html output reports testcafe report html browsers firefox width height concurrency quarantine true hostname localhost live true node js version browser name and version firefox platform and version latest macos
1
240,545
26,256,386,733
IssuesEvent
2023-01-06 01:22:29
ARUMAIS/playwright
https://api.github.com/repos/ARUMAIS/playwright
opened
CVE-2022-3509 (High) detected in protobuf-java-3.10.0.jar
security vulnerability
## CVE-2022-3509 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>protobuf-java-3.10.0.jar</b></p></summary> <p>Core Protocol Buffers library. Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /packages/playwright-core/src/server/android/driver/app/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.10.0/410b61dd0088aab4caa05739558d43df248958c9/protobuf-java-3.10.0.jar</p> <p> Dependency Hierarchy: - lint-gradle-27.1.0.jar (Root Library) - sdk-common-27.1.0.jar - :x: **protobuf-java-3.10.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/L00163425/playwright/commit/75b1b367dd4a1fb86cc96f5a7a44e354f1ca3a39">75b1b367dd4a1fb86cc96f5a7a44e354f1ca3a39</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing issue similar to CVE-2022-3171, but with textformat in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack. Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields causes objects to be converted back-n-forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above. 
<p>Publish Date: 2022-12-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3509>CVE-2022-3509</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509</a></p> <p>Release Date: 2022-12-12</p> <p>Fix Resolution: com.google.protobuf:protobuf-java:3.16.3,3.19.6,3.20.3,3.21.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2022-3509 (High) detected in protobuf-java-3.10.0.jar - ## CVE-2022-3509 - High Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>protobuf-java-3.10.0.jar</b></p></summary> <p>Core Protocol Buffers library. Protocol Buffers are a way of encoding structured data in an efficient yet extensible format.</p> <p>Library home page: <a href="https://developers.google.com/protocol-buffers/">https://developers.google.com/protocol-buffers/</a></p> <p>Path to dependency file: /packages/playwright-core/src/server/android/driver/app/build.gradle</p> <p>Path to vulnerable library: /home/wss-scanner/.gradle/caches/modules-2/files-2.1/com.google.protobuf/protobuf-java/3.10.0/410b61dd0088aab4caa05739558d43df248958c9/protobuf-java-3.10.0.jar</p> <p> Dependency Hierarchy: - lint-gradle-27.1.0.jar (Root Library) - sdk-common-27.1.0.jar - :x: **protobuf-java-3.10.0.jar** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/L00163425/playwright/commit/75b1b367dd4a1fb86cc96f5a7a44e354f1ca3a39">75b1b367dd4a1fb86cc96f5a7a44e354f1ca3a39</a></p> <p>Found in base branch: <b>main</b></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/high_vul.png' width=19 height=20> Vulnerability Details</summary> <p> A parsing issue similar to CVE-2022-3171, but with textformat in protobuf-java core and lite versions prior to 3.21.7, 3.20.3, 3.19.6 and 3.16.3 can lead to a denial of service attack. Inputs containing multiple instances of non-repeated embedded messages with repeated or unknown fields causes objects to be converted back-n-forth between mutable and immutable forms, resulting in potentially long garbage collection pauses. We recommend updating to the versions mentioned above. 
<p>Publish Date: 2022-12-12 <p>URL: <a href=https://www.mend.io/vulnerability-database/CVE-2022-3509>CVE-2022-3509</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>7.5</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: None - User Interaction: None - Scope: Unchanged - Impact Metrics: - Confidentiality Impact: None - Integrity Impact: None - Availability Impact: High </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2022-3509</a></p> <p>Release Date: 2022-12-12</p> <p>Fix Resolution: com.google.protobuf:protobuf-java:3.16.3,3.19.6,3.20.3,3.21.7</p> </p> </details> <p></p> *** Step up your Open Source Security Game with Mend [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve high detected in protobuf java jar cve high severity vulnerability vulnerable library protobuf java jar core protocol buffers library protocol buffers are a way of encoding structured data in an efficient yet extensible format library home page a href path to dependency file packages playwright core src server android driver app build gradle path to vulnerable library home wss scanner gradle caches modules files com google protobuf protobuf java protobuf java jar dependency hierarchy lint gradle jar root library sdk common jar x protobuf java jar vulnerable library found in head commit a href found in base branch main vulnerability details a parsing issue similar to cve but with textformat in protobuf java core and lite versions prior to and can lead to a denial of service attack inputs containing multiple instances of non repeated embedded messages with repeated or unknown fields causes objects to be converted back n forth between mutable and immutable forms resulting in potentially long garbage collection pauses we recommend updating to the versions mentioned above publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required none user interaction none scope unchanged impact metrics confidentiality impact none integrity impact none availability impact high for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution com google protobuf protobuf java step up your open source security game with mend
0
32,560
13,878,860,652
IssuesEvent
2020-10-17 11:45:45
microsoft/vscode-cpptools
https://api.github.com/repos/microsoft/vscode-cpptools
closed
preview to show multiple line but no \n kept as original code
Feature Request Language Service more votes needed
_From @andyysj on March 9, 2017 3:14_ - VSCode Version: Code 1.10.2 (8076a19fdcab7e1fc1707952d652f0bb6c6db331, 2017-03-08T14:02:52.799Z) - OS Version: Windows_NT ia32 10.0.14393 - Extensions: |Extension|Author|Version| |---|---|---| |vscode-clang|mitaki28|0.2.2| |cpptools|ms-vscode|0.10.2| |vim|vscodevim|0.6.2|; --- Steps to Reproduce: 1. Push mouse on one multiple lines function name, the preview will show the contents 2. Check the contents you will find no \n as original code, all codes keep one line and wrap up. _Copied from original issue: Microsoft/vscode#22267_
1.0
preview to show multiple line but no \n kept as original code - _From @andyysj on March 9, 2017 3:14_ - VSCode Version: Code 1.10.2 (8076a19fdcab7e1fc1707952d652f0bb6c6db331, 2017-03-08T14:02:52.799Z) - OS Version: Windows_NT ia32 10.0.14393 - Extensions: |Extension|Author|Version| |---|---|---| |vscode-clang|mitaki28|0.2.2| |cpptools|ms-vscode|0.10.2| |vim|vscodevim|0.6.2|; --- Steps to Reproduce: 1. Push mouse on one multiple lines function name, the preview will show the contents 2. Check the contents you will find no \n as original code, all codes keep one line and wrap up. _Copied from original issue: Microsoft/vscode#22267_
non_process
preview to show multiple line but no n kept as original code from andyysj on march vscode version code os version windows nt extensions extension author version vscode clang cpptools ms vscode vim vscodevim steps to reproduce push mouse on one multiple lines function name the preview will show the contents check the contents you will find no n as original code all codes keep one line and wrap up copied from original issue microsoft vscode
0
26,033
12,341,943,153
IssuesEvent
2020-05-14 23:14:49
MicrosoftDocs/azure-docs
https://api.github.com/repos/MicrosoftDocs/azure-docs
closed
Please add a note to indicate `azureuser` expires
Pri1 assigned-to-author container-service/svc doc-enhancement triaged
Per https://github.com/Azure/AKS/issues/1170 and https://github.com/MicrosoftDocs/azure-docs/issues/47302, the `azureuser` account on VMs expires after 30 days. So, we tried following the steps here and were scratching our heads as to why our SSH key wasn't being accepted; turned out to be that the public and private keys were deployed correctly but the account was locked. Was able to get in by first resetting the password on `azureuser` to a long, random throwaway password (to unlock the account) and then by re-deploying the SSH public key. Others might not understand that's what's happening here. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3b81a147-db1c-b92f-88d1-6dd0991a72fe * Version Independent ID: f5a2c949-498f-5848-d89c-031d1a757120 * Content: [SSH into Azure Kubernetes Service (AKS) cluster nodes - Azure Kubernetes Service](https://docs.microsoft.com/en-us/azure/aks/ssh) * Content Source: [articles/aks/ssh.md](https://github.com/Microsoft/azure-docs/blob/master/articles/aks/ssh.md) * Service: **container-service** * GitHub Login: @mlearned * Microsoft Alias: **mlearned**
1.0
Please add a note to indicate `azureuser` expires - Per https://github.com/Azure/AKS/issues/1170 and https://github.com/MicrosoftDocs/azure-docs/issues/47302, the `azureuser` account on VMs expires after 30 days. So, we tried following the steps here and were scratching our heads as to why our SSH key wasn't being accepted; turned out to be that the public and private keys were deployed correctly but the account was locked. Was able to get in by first resetting the password on `azureuser` to a long, random throwaway password (to unlock the account) and then by re-deploying the SSH public key. Others might not understand that's what's happening here. --- #### Document Details ⚠ *Do not edit this section. It is required for docs.microsoft.com ➟ GitHub issue linking.* * ID: 3b81a147-db1c-b92f-88d1-6dd0991a72fe * Version Independent ID: f5a2c949-498f-5848-d89c-031d1a757120 * Content: [SSH into Azure Kubernetes Service (AKS) cluster nodes - Azure Kubernetes Service](https://docs.microsoft.com/en-us/azure/aks/ssh) * Content Source: [articles/aks/ssh.md](https://github.com/Microsoft/azure-docs/blob/master/articles/aks/ssh.md) * Service: **container-service** * GitHub Login: @mlearned * Microsoft Alias: **mlearned**
non_process
please add a note to indicate azureuser expires per and the azureuser account on vms expires after days so we tried following the steps here and were scratching our heads as to why our ssh key wasn t being accepted turned out to be that the public and private keys were deployed correctly but the account was locked was able to get in by first resetting the password on azureuser to a long random throwaway password to unlock the account and then by re deploying the ssh public key others might not understand that s what s happening here document details ⚠ do not edit this section it is required for docs microsoft com ➟ github issue linking id version independent id content content source service container service github login mlearned microsoft alias mlearned
0
439,724
30,711,365,165
IssuesEvent
2023-07-27 10:00:41
keptn/lifecycle-toolkit
https://api.github.com/repos/keptn/lifecycle-toolkit
closed
[Docs] make server doesn't kill container
documentation good first issue
When I run `make server` in the `/docs` repo and then exit the command with `Ctrl+C` the hugo pod is left running. This means that restarting the server by running `make server` again (as I must do because live reload isn't working) will **fail** because port `1314` is already allocated. Then I need to `docker ps`, get the container ID and `docker stop 123`. ## The Fix Add the `-i` flag to this line: https://github.com/keptn/lifecycle-toolkit/blob/077f0d5d0a49bc5b1f0e800274343660b8218c65/docs/Makefile#L7 so `-t` becomes `-it`
1.0
[Docs] make server doesn't kill container - When I run `make server` in the `/docs` repo and then exit the command with `Ctrl+C` the hugo pod is left running. This means that restarting the server by running `make server` again (as I must do because live reload isn't working) will **fail** because port `1314` is already allocated. Then I need to `docker ps`, get the container ID and `docker stop 123`. ## The Fix Add the `-i` flag to this line: https://github.com/keptn/lifecycle-toolkit/blob/077f0d5d0a49bc5b1f0e800274343660b8218c65/docs/Makefile#L7 so `-t` becomes `-it`
non_process
make server doesn t kill container when i run make server in the docs repo and then exit the command with ctrl c the hugo pod is left running this means that restarting the server by running make server again as i must do because live reload isn t working will fail because port is already allocated then i need to docker ps get the container id and docker stop the fix add the i flag to this line so t becomes it
0
194
2,597,141,086
IssuesEvent
2015-02-21 03:49:31
opattison/olivermakes
https://api.github.com/repos/opattison/olivermakes
opened
Design documentation model
content process structure visual
Questions: - Public on the web (besides GitHub?) - How does it integrate with the patterns page? This may be a prerequisite for #149.
1.0
Design documentation model - Questions: - Public on the web (besides GitHub?) - How does it integrate with the patterns page? This may be a prerequisite for #149.
process
design documentation model questions public on the web besides github how does it integrate with the patterns page this may be a prerequisite for
1
7,346
10,482,254,138
IssuesEvent
2019-09-24 11:34:41
openopps/openopps-platform
https://api.github.com/repos/openopps/openopps-platform
opened
Application review page: work experience should display newest to oldest
Apply Process Requirements Ready State Dept.
Who: applicants What: reviewing their work experience should display newest to oldest Why: to be in line with the resume and USAJOBS Acceptance Criteria: Student application - review page: When a student applies, the work experience on the experience page should display newest to oldest like USAJOBS
1.0
Application review page: work experience should display newest to oldest - Who: applicants What: reviewing their work experience should display newest to oldest Why: to be in line with the resume and USAJOBS Acceptance Criteria: Student application - review page: When a student applies, the work experience on the experience page should display newest to oldest like USAJOBS
process
application review page work experience should display newest to oldest who applicants what reviewing their work experience should display newest to oldest why to be in line with the resume and usajobs acceptance criteria student application review page when a student applies the work experience on the experience page should display newest to oldest like usajobs
1
8,773
11,890,777,572
IssuesEvent
2020-03-28 19:56:26
qgis/QGIS
https://api.github.com/repos/qgis/QGIS
closed
Processing ignore file extension when exporting results
Bug High Priority Processing Regression
<!-- Bug fixing and feature development is a community responsibility, and not the responsibility of the QGIS project alone. If this bug report or feature request is high-priority for you, we suggest engaging a QGIS developer or support organisation and financially sponsoring a fix Checklist before submitting - [ ] Search through existing issue reports and gis.stackexchange.com to check whether the issue already exists - [ ] Test with a [clean new user profile](https://docs.qgis.org/testing/en/docs/user_manual/introduction/qgis_configuration.html?highlight=profile#working-with-user-profiles). - [ ] Create a light and self-contained sample dataset and project file which demonstrates the issue --> **Describe the bug** When saving the output to a file (no temporary), Processing is ignoring the file extension picked and put `gpkg`. Happens with all the algorithms. ![ezgif com-video-to-gif](https://user-images.githubusercontent.com/2884884/77329309-1eb75800-6d1e-11ea-900e-365c60bdf667.gif) **QGIS and OS versions** QGIS master 3.13 compiled on Linux Sid.
1.0
Processing ignore file extension when exporting results - <!-- Bug fixing and feature development is a community responsibility, and not the responsibility of the QGIS project alone. If this bug report or feature request is high-priority for you, we suggest engaging a QGIS developer or support organisation and financially sponsoring a fix Checklist before submitting - [ ] Search through existing issue reports and gis.stackexchange.com to check whether the issue already exists - [ ] Test with a [clean new user profile](https://docs.qgis.org/testing/en/docs/user_manual/introduction/qgis_configuration.html?highlight=profile#working-with-user-profiles). - [ ] Create a light and self-contained sample dataset and project file which demonstrates the issue --> **Describe the bug** When saving the output to a file (no temporary), Processing is ignoring the file extension picked and put `gpkg`. Happens with all the algorithms. ![ezgif com-video-to-gif](https://user-images.githubusercontent.com/2884884/77329309-1eb75800-6d1e-11ea-900e-365c60bdf667.gif) **QGIS and OS versions** QGIS master 3.13 compiled on Linux Sid.
process
processing ignore file extension when exporting results bug fixing and feature development is a community responsibility and not the responsibility of the qgis project alone if this bug report or feature request is high priority for you we suggest engaging a qgis developer or support organisation and financially sponsoring a fix checklist before submitting search through existing issue reports and gis stackexchange com to check whether the issue already exists test with a create a light and self contained sample dataset and project file which demonstrates the issue describe the bug when saving the output to a file no temporary processing is ignoring the file extension picked and put gpkg happens with all the algorithms qgis and os versions qgis master compiled on linux sid
1
12,010
14,738,367,907
IssuesEvent
2021-01-07 04:33:23
kdjstudios/SABillingGitlab
https://api.github.com/repos/kdjstudios/SABillingGitlab
closed
Stockton Account ST3011 Alpha Towing Central Florica LLC
anc-ops anc-process anp-1.5 ant-bug ant-support has attachment
In GitLab by @kdjstudios on May 15, 2018, 12:23 **Submitted by:** "Martin Villegas" <martin.villegas@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-15-18213/conversation **Server:** Internal **Client/Site:** Stockton **Account:** ST3011 **Issue:** When trying to edit this account, I get the following error: I need to change the resource code for billing purposes. ![image](/uploads/59e8b07413236d277a402d6d9f4153fb/image.png)
1.0
Stockton Account ST3011 Alpha Towing Central Florica LLC - In GitLab by @kdjstudios on May 15, 2018, 12:23 **Submitted by:** "Martin Villegas" <martin.villegas@answernet.com> **Helpdesk:** http://www.servicedesk.answernet.com/profiles/ticket/2018-05-15-18213/conversation **Server:** Internal **Client/Site:** Stockton **Account:** ST3011 **Issue:** When trying to edit this account, I get the following error: I need to change the resource code for billing purposes. ![image](/uploads/59e8b07413236d277a402d6d9f4153fb/image.png)
process
stockton account alpha towing central florica llc in gitlab by kdjstudios on may submitted by martin villegas helpdesk server internal client site stockton account issue when trying to edit this account i get the following error i need to change the resource code for billing purposes uploads image png
1
16,569
21,579,354,011
IssuesEvent
2022-05-02 16:58:24
streamnative/pulsar-flink
https://api.github.com/repos/streamnative/pulsar-flink
closed
[BUG] Enable key hash range | When Enable key hash range is set to true only one reader is created with a partial topic range on one source, this causes a Flink application to miss messages and fail.
type/bug compute/data-processing
My setup is as follows: 1. Flink application with source parallelism of 5 2. Pulsar Flink connector 2.7.1.1 with key hash range enabled 3. One pulsar topic without partitions When the application starts, only one reader with a partial range is created for the topic, and this happens only on one subtask/source, this means that messages that are not within the hash range can't be received , and Flink parallelism is still bounded to number of partitions. The expected behavior is that each of the 5 sources would create a reader with a different hash range, and all the subtasks will consume the whole range in parallel. As you can see in the logs below only source 4 created a reader but the range is partial 14:13:11.648 [Source: EventsInputStream (5/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 4 initially has no topics to read from. 14:13:11.648 [Source: EventsInputStream (1/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 0 initially has no topics to read from. 14:13:11.648 [Source: EventsInputStream (2/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 1 initially has no topics to read from. 14:13:11.648 [Source: AssetsStream (2/2)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 1 initially has no topics to read from. 14:13:11.648 [Source: EventsInputStream (3/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 2 initially has no topics to read from. 14:13:11.648 **[Source: EventsInputStream (4/5)]** INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 3 will start reading 1 topics from initialized positions: {TopicRange{topic=persistent://public/default/inputevents, key-range=SerializableRange{range=[39322, 52428]}}=9223372036854775807:9223372036854775807:-1}
1.0
[BUG] Enable key hash range | When Enable key hash range is set to true only one reader is created with a partial topic range on one source, this causes a Flink application to miss messages and fail. - My setup is as follows: 1. Flink application with source parallelism of 5 2. Pulsar Flink connector 2.7.1.1 with key hash range enabled 3. One pulsar topic without partitions When the application starts, only one reader with a partial range is created for the topic, and this happens only on one subtask/source, this means that messages that are not within the hash range can't be received , and Flink parallelism is still bounded to number of partitions. The expected behavior is that each of the 5 sources would create a reader with a different hash range, and all the subtasks will consume the whole range in parallel. As you can see in the logs below only source 4 created a reader but the range is partial 14:13:11.648 [Source: EventsInputStream (5/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 4 initially has no topics to read from. 14:13:11.648 [Source: EventsInputStream (1/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 0 initially has no topics to read from. 14:13:11.648 [Source: EventsInputStream (2/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 1 initially has no topics to read from. 14:13:11.648 [Source: AssetsStream (2/2)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 1 initially has no topics to read from. 14:13:11.648 [Source: EventsInputStream (3/5)] INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 2 initially has no topics to read from. 14:13:11.648 **[Source: EventsInputStream (4/5)]** INFO [] o.a.f.s.c.pulsar.FlinkPulsarSource [] - Source 3 will start reading 1 topics from initialized positions: {TopicRange{topic=persistent://public/default/inputevents, key-range=SerializableRange{range=[39322, 52428]}}=9223372036854775807:9223372036854775807:-1}
process
enable key hash range when enable key hash range is set to true only one reader is created with a partial topic range on one source this causes a flink application to miss messages and fail my setup is as follows flink application with source parallelism of pulsar flink connector with key hash range enabled one pulsar topic without partitions when the application starts only one reader with a partial range is created for the topic and this happens only on one subtask source this means that messages that are not within the hash range can t be received and flink parallelism is still bounded to number of partitions the expected behavior is that each of the sources would create a reader with a different hash range and all the subtasks will consume the whole range in parallel as you can see in the logs below only source created a reader but the range is partial info o a f s c pulsar flinkpulsarsource source initially has no topics to read from info o a f s c pulsar flinkpulsarsource source initially has no topics to read from info o a f s c pulsar flinkpulsarsource source initially has no topics to read from info o a f s c pulsar flinkpulsarsource source initially has no topics to read from info o a f s c pulsar flinkpulsarsource source initially has no topics to read from info o a f s c pulsar flinkpulsarsource source will start reading topics from initialized positions topicrange topic persistent public default inputevents key range serializablerange range
1
14,474
17,596,590,822
IssuesEvent
2021-08-17 06:22:18
GoogleCloudPlatform/fda-mystudies
https://api.github.com/repos/GoogleCloudPlatform/fda-mystudies
closed
[iOS] Signup page > Opacity issue on top portion of signup page after logout/delete account
Bug iOS UI P3 Process: Fixed Process: Tested QA Process: Tested dev
Steps: 1. Login/Signup into iOS app 2. Click on Menu 3. Signout. Navigated to app overview 4. Click on Signup 5. Observe the top portion of signup page Actual: Opacity issue on top portion of signup page after logout/delete account Expected: UI should be as per design Issue not observed after killing and relaunching the app after step 4 Issue observed for delete account Issue not observed when navigated to signup page freshly Opacity issue post signout: ![IMG_1837](https://user-images.githubusercontent.com/60386291/127106807-cc061a07-4af7-4ecf-9518-df14352b6fe7.PNG) Expected UI: ![Screenshot 2021-07-27 at 12 05 17 PM](https://user-images.githubusercontent.com/60386291/127107218-0c82771c-bd6c-432f-b521-f730fcb91d0d.png)
3.0
[iOS] Signup page > Opacity issue on top portion of signup page after logout/delete account - Steps: 1. Login/Signup into iOS app 2. Click on Menu 3. Signout. Navigated to app overview 4. Click on Signup 5. Observe the top portion of signup page Actual: Opacity issue on top portion of signup page after logout/delete account Expected: UI should be as per design Issue not observed after killing and relaunching the app after step 4 Issue observed for delete account Issue not observed when navigated to signup page freshly Opacity issue post signout: ![IMG_1837](https://user-images.githubusercontent.com/60386291/127106807-cc061a07-4af7-4ecf-9518-df14352b6fe7.PNG) Expected UI: ![Screenshot 2021-07-27 at 12 05 17 PM](https://user-images.githubusercontent.com/60386291/127107218-0c82771c-bd6c-432f-b521-f730fcb91d0d.png)
process
signup page opacity issue on top portion of signup page after logout delete account steps login signup into ios app click on menu signout navigated to app overview click on signup observe the top portion of signup page actual opacity issue on top portion of signup page after logout delete account expected ui should be as per design issue not observed after killing and relaunching the app after step issue observed for delete account issue not observed when navigated to signup page freshly opacity issue post signout expected ui
1
1,344
3,901,907,515
IssuesEvent
2016-04-18 12:54:27
processing/processing
https://api.github.com/repos/processing/processing
closed
numeric literals with underscores not handled consistently
duplicate preprocessor
Processing 3 seems to be able to handle numeric literals with underscores with simple assignment: ```java int literal = 10_000; ``` But when combined with operators, Processing throws 'unexpected token: int' at runtime (but no red wavy line in the editor): ```java int literal = 10_000 + 1; ```
1.0
numeric literals with underscores not handled consistently - Processing 3 seems to be able to handle numeric literals with underscores with simple assignment: ```java int literal = 10_000; ``` But when combined with operators, Processing throws 'unexpected token: int' at runtime (but no red wavy line in the editor): ```java int literal = 10_000 + 1; ```
process
numeric literals with underscores not handled consistently processing seems to be able to handle numeric literals with underscores with simple assignment java int literal but when combined with operators processing throws unexpected token int at runtime but no red wavy line in the editor java int literal
1
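Across these records, the cleaned `text` column appears to be derived from `text_combine` by lowercasing and stripping URLs, digits, and punctuation. The pipeline itself is not shown in this dump, so the following is only a minimal sketch of that cleaning step under those assumed rules (the function name `clean_issue_text` is illustrative):

```python
import re

def clean_issue_text(text_combine: str) -> str:
    """Approximate the `text` column from `text_combine`.

    Assumed steps, inferred from comparing the two columns in the dump:
    lowercase, drop URLs, drop digits, drop punctuation/markup, and
    collapse runs of whitespace.
    """
    t = text_combine.lower()
    t = re.sub(r"https?://\S+", " ", t)    # URLs do not survive into `text`
    t = re.sub(r"[0-9]+", " ", t)          # numbers are stripped
    t = re.sub(r"[^a-z\s]", " ", t)        # punctuation and markup removed
    return re.sub(r"\s+", " ", t).strip()  # collapse whitespace

print(clean_issue_text("Hello, World 123! https://example.com/x"))
```

The real preprocessing evidently differs in small ways (for instance, some records keep bullet characters), so this is a rough reconstruction, not the dataset's actual code.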
42,318
10,962,033,433
IssuesEvent
2019-11-27 16:27:36
zotonic/zotonic
https://api.github.com/repos/zotonic/zotonic
closed
zotonic does not see the site, but it works
defect
`16:58:58.621 [error] gen_server 'z_module_manager$nahodka' terminated with reason: {{badmap,{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}},[{maps,keys,[{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}],[]},{z_module_manager,handle_upgrade,1,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,849}]},{z_module_manager,handle_cast,2,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,630}]},{gen_server,try_dispatch,4,[{file,"gen_server.erl"},{...}]},...]} 16:58:58.621 [error] CRASH REPORT Process 'z_module_manager$nahodka' with 0 neighbours crashed with reason: {{badmap,{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}},[{maps,keys,[{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}],[]},{z_module_manager,handle_upgrade,1,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,849}]},{z_module_manager,handle_cast,2,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,630}]},{gen_server,try_dispatch,4,[{file,"gen_server.erl"},{...}]},...]} 16:58:58.623 [error] Supervisor 'z_site_sup$nahodka' had child z_module_manager started with z_module_manager:start_link([{title,"nahodka"},{enabled,true},{dbschema,"public"},{dbhost,"postgres"},{install_modules,[mod_base,...]},...]) at <0.638.0> exit with reason {{badmap,{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}},[{maps,keys,[{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}],[]},{z_module_manager,handle_upgrade,1,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,849}]},{z_module_manager,handle_cast,2,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,630}]},{gen_server,try_dispatch,4,[{file,"gen_server.erl"},{...}]},...]} in context 
child_terminated`
1.0
zotonic does not see the site, but it works - `16:58:58.621 [error] gen_server 'z_module_manager$nahodka' terminated with reason: {{badmap,{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}},[{maps,keys,[{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}],[]},{z_module_manager,handle_upgrade,1,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,849}]},{z_module_manager,handle_cast,2,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,630}]},{gen_server,try_dispatch,4,[{file,"gen_server.erl"},{...}]},...]} 16:58:58.621 [error] CRASH REPORT Process 'z_module_manager$nahodka' with 0 neighbours crashed with reason: {{badmap,{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}},[{maps,keys,[{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}],[]},{z_module_manager,handle_upgrade,1,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,849}]},{z_module_manager,handle_cast,2,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,630}]},{gen_server,try_dispatch,4,[{file,"gen_server.erl"},{...}]},...]} 16:58:58.623 [error] Supervisor 'z_site_sup$nahodka' had child z_module_manager started with z_module_manager:start_link([{title,"nahodka"},{enabled,true},{dbschema,"public"},{dbhost,"postgres"},{install_modules,[mod_base,...]},...]) at <0.638.0> exit with reason 
{{badmap,{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}},[{maps,keys,[{module_status,nahodka,nahodka,<0.690.0>,restarting,1562224693,1562224693,undefined,0}],[]},{z_module_manager,handle_upgrade,1,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,849}]},{z_module_manager,handle_cast,2,[{file,"/opt/zotonic/apps/zotonic_core/src/support/z_module_manager.erl"},{line,630}]},{gen_server,try_dispatch,4,[{file,"gen_server.erl"},{...}]},...]} in context child_terminated`
non_process
zotonic does not see the site but it works gen server z module manager nahodka terminated with reason badmap module status nahodka nahodka restarting undefined z module manager handle upgrade z module manager handle cast gen server try dispatch crash report process z module manager nahodka with neighbours crashed with reason badmap module status nahodka nahodka restarting undefined z module manager handle upgrade z module manager handle cast gen server try dispatch supervisor z site sup nahodka had child z module manager started with z module manager start link at exit with reason badmap module status nahodka nahodka restarting undefined z module manager handle upgrade z module manager handle cast gen server try dispatch in context child terminated
0
31,344
14,937,465,156
IssuesEvent
2021-01-25 14:40:16
flutter/flutter
https://api.github.com/repos/flutter/flutter
closed
Flutter for Web large build size
created via performance template
<!-- Thank you for using Flutter! If you are looking for support, please check out our documentation or consider asking a question on Stack Overflow: * https://flutter.dev/ * https://api.flutter.dev/ * https://stackoverflow.com/questions/tagged/flutter?sort=frequent If you have found a performance problem, then fill out the template below. Please read our guide to filing a bug first: https://flutter.dev/docs/resources/bug-reports --> ## Details <!-- 1. Please tell us exactly how to reproduce the problem you are running into, and how you measured the performance. 2. Please attach a small application (ideally just one main.dart file) that reproduces the problem. You could use https://gist.github.com/ for this. 3. Switch flutter to master channel and run this app on a physical device using profile or release mode. Verify that the performance issue can be reproduced there. The bleeding edge master channel is encouraged here because Flutter is constantly fixing bugs and improving its performance. Your problem in an older Flutter version may have already been solved in the master channel. --> 1. Flutter web production build generated using flutter build web --release is generating >1 MB main bundle file. This is manifolds higher than general web builds. ![image](https://user-images.githubusercontent.com/15185453/105578902-66c56200-5da9-11eb-88e3-d6fc98888534.png) 2. --release tag actually does nothing same sized builds are getting generated. As flutter web is in beta, wanted to know if this is actively worked on, as we are evaluating flutter for web for our org. Or if there are other build optimization techniques we can use **Target Platform:** Web **Target OS version/browser:** chrome **Devices:** Web ## Logs <details> <summary>Logs</summary> Flutter analyze ``` Analyzing buildsize... No issues found! 
(ran in 1.4s) ``` Flutter Doctor Output - ``` [✓] Flutter (Channel beta, 1.25.0-8.3.pre, on Microsoft Windows [Version 10.0.18363.1316], locale en-US) • Flutter version 1.25.0-8.3.pre at D:\Arnab\flutter-git\flutter • Framework revision 5d36f2e7f5 (9 days ago), 2021-01-14 15:57:49 -0800 • Engine revision 7a8f8ca02c • Dart version 2.12.0 (build 2.12.0-133.7.beta) [!] Android toolchain - develop for Android devices (Android SDK version 29.0.2) • Android SDK at C:\Users\arnab\AppData\Local\Android\Sdk • Platform android-30, build-tools 29.0.2 • ANDROID_HOME = C:\Users\arnab\AppData\Local\Android\Sdk • ANDROID_SDK_ROOT = C:\Users\arnab\AppData\Local\Android\Sdk • Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java • Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01) ! Some Android licenses not accepted. To resolve this, run: flutter doctor --android-licenses [✓] Chrome - develop for the web • Chrome at C:\Program Files (x86)\Google\Chrome\Application\chrome.exe [✓] Android Studio (version 4.0) • Android Studio at C:\Program Files\Android\Android Studio • Flutter plugin version 48.0.2 • Dart plugin version 193.7361 • Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01) [✓] VS Code (version 1.52.1) • VS Code at C:\Users\arnab\AppData\Local\Programs\Microsoft VS Code • Flutter extension version 3.18.1 [✓] Connected device (2 available) • Chrome (web) • chrome • web-javascript • Google Chrome 87.0.4280.141 • Edge (web) • edge • web-javascript • Microsoft Edge 87.0.664.75 ``` </details>
True
Flutter for Web large build size - <!-- Thank you for using Flutter! If you are looking for support, please check out our documentation or consider asking a question on Stack Overflow: * https://flutter.dev/ * https://api.flutter.dev/ * https://stackoverflow.com/questions/tagged/flutter?sort=frequent If you have found a performance problem, then fill out the template below. Please read our guide to filing a bug first: https://flutter.dev/docs/resources/bug-reports --> ## Details <!-- 1. Please tell us exactly how to reproduce the problem you are running into, and how you measured the performance. 2. Please attach a small application (ideally just one main.dart file) that reproduces the problem. You could use https://gist.github.com/ for this. 3. Switch flutter to master channel and run this app on a physical device using profile or release mode. Verify that the performance issue can be reproduced there. The bleeding edge master channel is encouraged here because Flutter is constantly fixing bugs and improving its performance. Your problem in an older Flutter version may have already been solved in the master channel. --> 1. Flutter web production build generated using flutter build web --release is generating >1 MB main bundle file. This is manifolds higher than general web builds. ![image](https://user-images.githubusercontent.com/15185453/105578902-66c56200-5da9-11eb-88e3-d6fc98888534.png) 2. --release tag actually does nothing same sized builds are getting generated. As flutter web is in beta, wanted to know if this is actively worked on, as we are evaluating flutter for web for our org. Or if there are other build optimization techniques we can use **Target Platform:** Web **Target OS version/browser:** chrome **Devices:** Web ## Logs <details> <summary>Logs</summary> Flutter analyze ``` Analyzing buildsize... No issues found! 
(ran in 1.4s) ``` Flutter Doctor Output - ``` [✓] Flutter (Channel beta, 1.25.0-8.3.pre, on Microsoft Windows [Version 10.0.18363.1316], locale en-US) • Flutter version 1.25.0-8.3.pre at D:\Arnab\flutter-git\flutter • Framework revision 5d36f2e7f5 (9 days ago), 2021-01-14 15:57:49 -0800 • Engine revision 7a8f8ca02c • Dart version 2.12.0 (build 2.12.0-133.7.beta) [!] Android toolchain - develop for Android devices (Android SDK version 29.0.2) • Android SDK at C:\Users\arnab\AppData\Local\Android\Sdk • Platform android-30, build-tools 29.0.2 • ANDROID_HOME = C:\Users\arnab\AppData\Local\Android\Sdk • ANDROID_SDK_ROOT = C:\Users\arnab\AppData\Local\Android\Sdk • Java binary at: C:\Program Files\Android\Android Studio\jre\bin\java • Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01) ! Some Android licenses not accepted. To resolve this, run: flutter doctor --android-licenses [✓] Chrome - develop for the web • Chrome at C:\Program Files (x86)\Google\Chrome\Application\chrome.exe [✓] Android Studio (version 4.0) • Android Studio at C:\Program Files\Android\Android Studio • Flutter plugin version 48.0.2 • Dart plugin version 193.7361 • Java version OpenJDK Runtime Environment (build 1.8.0_242-release-1644-b01) [✓] VS Code (version 1.52.1) • VS Code at C:\Users\arnab\AppData\Local\Programs\Microsoft VS Code • Flutter extension version 3.18.1 [✓] Connected device (2 available) • Chrome (web) • chrome • web-javascript • Google Chrome 87.0.4280.141 • Edge (web) • edge • web-javascript • Microsoft Edge 87.0.664.75 ``` </details>
non_process
flutter for web large build size thank you for using flutter if you are looking for support please check out our documentation or consider asking a question on stack overflow if you have found a performance problem then fill out the template below please read our guide to filing a bug first details please tell us exactly how to reproduce the problem you are running into and how you measured the performance please attach a small application ideally just one main dart file that reproduces the problem you could use for this switch flutter to master channel and run this app on a physical device using profile or release mode verify that the performance issue can be reproduced there the bleeding edge master channel is encouraged here because flutter is constantly fixing bugs and improving its performance your problem in an older flutter version may have already been solved in the master channel flutter web production build generated using flutter build web release is generating mb main bundle file this is manifolds higher than general web builds release tag actually does nothing same sized builds are getting generated as flutter web is in beta wanted to know if this is actively worked on as we are evaluating flutter for web for our org or if there are other build optimization techniques we can use target platform web target os version browser chrome devices web logs logs flutter analyze analyzing buildsize no issues found ran in flutter doctor output flutter channel beta pre on microsoft windows locale en us • flutter version pre at d arnab flutter git flutter • framework revision days ago • engine revision • dart version build beta android toolchain develop for android devices android sdk version • android sdk at c users arnab appdata local android sdk • platform android build tools • android home c users arnab appdata local android sdk • android sdk root c users arnab appdata local android sdk • java binary at c program files android android studio jre bin java • java 
version openjdk runtime environment build release some android licenses not accepted to resolve this run flutter doctor android licenses chrome develop for the web • chrome at c program files google chrome application chrome exe android studio version • android studio at c program files android android studio • flutter plugin version • dart plugin version • java version openjdk runtime environment build release vs code version • vs code at c users arnab appdata local programs microsoft vs code • flutter extension version connected device available • chrome web • chrome • web javascript • google chrome • edge web • edge • web javascript • microsoft edge
0
5,189
7,968,453,461
IssuesEvent
2018-07-16 03:06:38
rubberduck-vba/Rubberduck
https://api.github.com/repos/rubberduck-vba/Rubberduck
closed
Reference resolver doesn't resolve properties defined using the MIDL "short form"
bug feature-reference-explorer parse-tree-processing
The MSDN [DispInterface](https://msdn.microsoft.com/en-us/library/windows/desktop/aa366802(v=vs.85).aspx) documentation states: > The dispinterface statement defines a set of properties and methods... > > - **property-list** > (Syntax 1) An optional list of properties supported by the object, declared in the form of variables. > This is the **short form** for declaring the property functions in the methods list. > - **method-list** > (Syntax 1) A list comprising a function prototype for each method and property in the dispinterface [where each property methods can be marked with `[propget]`, `[propput]` or `[propputref]`]. > **You can declare properties in either the properties or methods lists. Declaring properties in the properties list does not indicate the type of access the property supports (that is, get, put, or putref).** RD is seemingly/currently unable to resolve the members defined in the short-form Properties syntax, or at least isn't able to resolve usages of those properties to the actual property.
1.0
Reference resolver doesn't resolve properties defined using the MIDL "short form" - The MSDN [DispInterface](https://msdn.microsoft.com/en-us/library/windows/desktop/aa366802(v=vs.85).aspx) documentation states: > The dispinterface statement defines a set of properties and methods... > > - **property-list** > (Syntax 1) An optional list of properties supported by the object, declared in the form of variables. > This is the **short form** for declaring the property functions in the methods list. > - **method-list** > (Syntax 1) A list comprising a function prototype for each method and property in the dispinterface [where each property methods can be marked with `[propget]`, `[propput]` or `[propputref]`]. > **You can declare properties in either the properties or methods lists. Declaring properties in the properties list does not indicate the type of access the property supports (that is, get, put, or putref).** RD is seemingly/currently unable to resolve the members defined in the short-form Properties syntax, or at least isn't able to resolve usages of those properties to the actual property.
process
reference resolver doesn t resolve properties defined using the midl short form the msdn documentation states the dispinterface statement defines a set of properties and methods property list syntax an optional list of properties supported by the object declared in the form of variables this is the short form for declaring the property functions in the methods list method list syntax a list comprising a function prototype for each method and property in the dispinterface or you can declare properties in either the properties or methods lists declaring properties in the properties list does not indicate the type of access the property supports that is get put or putref rd is seemingly currently unable to resolve the members defined in the short form properties syntax or at least isn t able to resolve usages of those properties to the actual property
1
2,106
4,940,233,986
IssuesEvent
2016-11-29 16:20:10
pelias/pelias
https://api.github.com/repos/pelias/pelias
closed
sort by house number
enhancement low priority processed question
Currently, if our index contains multiple addresses for a given street and a user searches for another address on that street, but with a house number that we _don't have_, the API returns the aforementioned addresses in no particular order. A [query for `125 Dean Street`](https://pelias.mapzen.com/search?bbox=40.84887829610045,-73.81301879882812,40.60274481122281,-74.14775848388672&input=125+dean+street&lat=40.7259&lon=-73.9804&size=10&zoom=12), for instance, currently returns: ``` javascript { "geometry": { "coordinates": [ -73.9651769, 40.679767500000004 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin1_abbr": "NY", "admin2": "Kings", "alpha3": "USA", "id": "1177268453", "layer": "osmnode", "local_admin": "Brooklyn", "locality": "New York", "name": "Dean Street", "neighborhood": "Adelphi", "text": "Dean Street, Brooklyn, NY", "type": "osmnode" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.946418, 40.67681 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", "alpha3": "USA", "id": "2f06da64b4bd49b384f36eeafe360c5f", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "1285 Dean Street", "neighborhood": "Crown Heights", "text": "1285 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.947874, 40.676901 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", "alpha3": "USA", "id": "876d784d18bf4d10b4de9150424a8314", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "1247 Dean Street", "neighborhood": "Crown Heights", "text": "1247 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.942037, 40.676304 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", 
"alpha3": "USA", "id": "7b2e80f860274fcc80f4c0e7cf68ae3e", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "1406 Dean Street", "neighborhood": "Crown Heights", "text": "1406 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.988436, 40.686613 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", "alpha3": "USA", "id": "e345a6cffa4d4560a8f402f36e077b7d", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "122 Dean Street", "neighborhood": "Boerum Hill", "text": "122 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, ... ``` Sorting by house number would be extremely valuable. Could be doable with the `address` objects that we now add to our ES documents.
1.0
sort by house number - Currently, if our index contains multiple addresses for a given street and a user searches for another address on that street, but with a house number that we _don't have_, the API returns the aforementioned addresses in no particular order. A [query for `125 Dean Street`](https://pelias.mapzen.com/search?bbox=40.84887829610045,-73.81301879882812,40.60274481122281,-74.14775848388672&input=125+dean+street&lat=40.7259&lon=-73.9804&size=10&zoom=12), for instance, currently returns: ``` javascript { "geometry": { "coordinates": [ -73.9651769, 40.679767500000004 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin1_abbr": "NY", "admin2": "Kings", "alpha3": "USA", "id": "1177268453", "layer": "osmnode", "local_admin": "Brooklyn", "locality": "New York", "name": "Dean Street", "neighborhood": "Adelphi", "text": "Dean Street, Brooklyn, NY", "type": "osmnode" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.946418, 40.67681 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", "alpha3": "USA", "id": "2f06da64b4bd49b384f36eeafe360c5f", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "1285 Dean Street", "neighborhood": "Crown Heights", "text": "1285 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.947874, 40.676901 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", "alpha3": "USA", "id": "876d784d18bf4d10b4de9150424a8314", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "1247 Dean Street", "neighborhood": "Crown Heights", "text": "1247 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.942037, 40.676304 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", 
"admin2": "Kings", "alpha3": "USA", "id": "7b2e80f860274fcc80f4c0e7cf68ae3e", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "1406 Dean Street", "neighborhood": "Crown Heights", "text": "1406 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, { "geometry": { "coordinates": [ -73.988436, 40.686613 ], "type": "Point" }, "properties": { "admin0": "United States", "admin1": "New York", "admin2": "Kings", "alpha3": "USA", "id": "e345a6cffa4d4560a8f402f36e077b7d", "layer": "openaddresses", "local_admin": "Brooklyn", "locality": "New York", "name": "122 Dean Street", "neighborhood": "Boerum Hill", "text": "122 Dean Street, Brooklyn, New York", "type": "openaddresses" }, "type": "Feature" }, ... ``` Sorting by house number would be extremely valuable. Could be doable with the `address` objects that we now add to our ES documents.
process
sort by house number currently if our index contains multiple addresses for a given street and a user searches for another address on that street but with a house number that we don t have the api returns the aforementioned addresses in no particular order a for instance currently returns javascript geometry coordinates type point properties united states new york abbr ny kings usa id layer osmnode local admin brooklyn locality new york name dean street neighborhood adelphi text dean street brooklyn ny type osmnode type feature geometry coordinates type point properties united states new york kings usa id layer openaddresses local admin brooklyn locality new york name dean street neighborhood crown heights text dean street brooklyn new york type openaddresses type feature geometry coordinates type point properties united states new york kings usa id layer openaddresses local admin brooklyn locality new york name dean street neighborhood crown heights text dean street brooklyn new york type openaddresses type feature geometry coordinates type point properties united states new york kings usa id layer openaddresses local admin brooklyn locality new york name dean street neighborhood crown heights text dean street brooklyn new york type openaddresses type feature geometry coordinates type point properties united states new york kings usa id layer openaddresses local admin brooklyn locality new york name dean street neighborhood boerum hill text dean street brooklyn new york type openaddresses type feature sorting by house number would be extremely valuable could be doable with the address objects that we now add to our es documents
1
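The pelias record above asks for results ordered by house number when the exact queried number is missing from the index. One way that ordering could be sketched — using the GeoJSON shape from the issue body, with illustrative field access rather than pelias's actual implementation — is to rank candidates by the distance of their house number from the queried one:

```python
import re

def sort_by_house_number(features, queried_number):
    """Order address results by how far their house number is from the
    queried number; results without a leading number sort last.
    The `properties.name` access mirrors the GeoJSON in the issue body."""
    def key(feature):
        name = feature["properties"]["name"]
        m = re.match(r"(\d+)\s", name)
        if not m:
            return (1, 0)  # no house number: push to the end
        return (0, abs(int(m.group(1)) - queried_number))
    return sorted(features, key=key)

results = [
    {"properties": {"name": "1285 Dean Street"}},
    {"properties": {"name": "Dean Street"}},
    {"properties": {"name": "122 Dean Street"}},
]
print([f["properties"]["name"] for f in sort_by_house_number(results, 125)])
```

For the `125 Dean Street` query from the issue, this would rank `122 Dean Street` ahead of `1285 Dean Street`, which is the behavior the report asks for; a production version would presumably use the structured `address` objects mentioned at the end of the issue rather than parsing display names.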
39,971
6,792,916,645
IssuesEvent
2017-11-01 03:44:27
Evenedric/stuff
https://api.github.com/repos/Evenedric/stuff
opened
Antenna pattern with dielectrics
documentation enhancement
We are assuming that we have free space (epsilon=1) at the ABC. If that is not true then k2,k needs to be adjusted. Document this limitation in the meantime.
1.0
Antenna pattern with dielectrics - We are assuming that we have free space (epsilon=1) at the ABC. If that is not true then k2,k needs to be adjusted. Document this limitation in the meantime.
non_process
antenna pattern with dielectrics we are assuming that we have free space epsilon at the abc if that is not true then k needs to be adjusted document this limitation in the meantime
0
32,823
13,933,092,554
IssuesEvent
2020-10-22 08:14:02
wellcomecollection/platform
https://api.github.com/repos/wellcomecollection/platform
opened
Remove the FileContext intermediate class
📦 Storage service 🚑 Health
Some leftover cleanup from https://github.com/wellcomecollection/platform/issues/4853 Currently the file finder sends an instance of `FileContext`, which gets turned into an `IndexedFile` in the file indexer. We can probably get away without the intermediate class, and send an IndexedFile immediately from the FileIndexer.
1.0
Remove the FileContext intermediate class - Some leftover cleanup from https://github.com/wellcomecollection/platform/issues/4853 Currently the file finder sends an instance of `FileContext`, which gets turned into an `IndexedFile` in the file indexer. We can probably get away without the intermediate class, and send an IndexedFile immediately from the FileIndexer.
non_process
remove the filecontext intermediate class some leftover cleanup from currently the file finder sends an instance of filecontext which gets turned into an indexedfile in the file indexer we can probably get away without the intermediate class and send an indexedfile immediately from the fileindexer
0
17,838
23,776,591,114
IssuesEvent
2022-09-01 21:39:55
Azure/azure-sdk-tools
https://api.github.com/repos/Azure/azure-sdk-tools
opened
API Review Step: Cadl
Engagement Experience WS: Process Tools & Automation
The purpose of this Epic is to define the gaps on the API Review process inside the Azure SDK Release process affected by Cadl. Some general parts on the step that need to be accounted for: - APIView should support Cadl in some sort. Issue looking into that is https://github.com/Azure/azure-sdk-tools/issues/1208 - Information communicated to API Stewardship around Cadl state and breaking changes - How will the approval take place? With a label on the PR like today? With a button on APIView? - Follow up with teams?
1.0
API Review Step: Cadl - The purpose of this Epic is to define the gaps on the API Review process inside the Azure SDK Release process affected by Cadl. Some general parts on the step that need to be accounted for: - APIView should support Cadl in some sort. Issue looking into that is https://github.com/Azure/azure-sdk-tools/issues/1208 - Information communicated to API Stewardship around Cadl state and breaking changes - How will the approval take place? With a label on the PR like today? With a button on APIView? - Follow up with teams?
process
api review step cadl the purpose of this epic is to define the gaps on the api review process inside the azure sdk release process affected by cadl some general parts on the step that need to be accounted for apiview should support cadl in some sort issue looking into that is information communicated to api stewardship around cadl state and breaking changes how will the approval take place with a label on the pr like today with a button on apiview follow up with teams
1
20,910
27,751,798,644
IssuesEvent
2023-03-15 21:24:10
openxla/community
https://api.github.com/repos/openxla/community
closed
Archive or delete old triton repository
process
First triton fork was made in error and should be archived/deleted per GitHub policy
1.0
Archive or delete old triton repository - First triton fork was made in error and should be archived/deleted per GitHub policy
process
archive or delete old triton repository first triton fork was made in error and should be archived deleted per github policy
1
82,127
3,603,331,474
IssuesEvent
2016-02-03 18:40:10
InWithForward/sharetribe
https://api.github.com/repos/InWithForward/sharetribe
reopened
"Coming soon" mode for an experience
high priority needs estimation
We currently have experiences listed that require a Kudoz staff person to accompany a kudoer because the experience isn't ready to be done independently yet. These are reasons why a experience would be in coming soon mode: 1. A taster has not gone on it yet 2. The criminal record check for the host hasn't been submitted or returned 3. The experience location hasn't had a safety check. We want to be able to flag these experiences and have that label display in "admin > view experiences" so we can be alerted and deploy a team member to go on the experience to. Suggested solution: Create 3 new "yes/no" admin-only fields in the experience listing [I didn't do this yet because of #79 bug] - Taster complete? [yes/no] - Crim record back? [yes/no] - Location Safety check done? [yes/no] If any are "no" then the "coming soon" label displays on the listing (see screenshot) <img width="1091" alt="screen shot 2016-01-31 at 3 12 05 pm" src="https://cloud.githubusercontent.com/assets/14365107/12705898/e6204ce8-c82d-11e5-9724-34f262892964.png"> We manually deploy a team member for these experiences.
1.0
"Coming soon" mode for an experience - We currently have experiences listed that require a Kudoz staff person to accompany a kudoer because the experience isn't ready to be done independently yet. These are reasons why a experience would be in coming soon mode: 1. A taster has not gone on it yet 2. The criminal record check for the host hasn't been submitted or returned 3. The experience location hasn't had a safety check. We want to be able to flag these experiences and have that label display in "admin > view experiences" so we can be alerted and deploy a team member to go on the experience to. Suggested solution: Create 3 new "yes/no" admin-only fields in the experience listing [I didn't do this yet because of #79 bug] - Taster complete? [yes/no] - Crim record back? [yes/no] - Location Safety check done? [yes/no] If any are "no" then the "coming soon" label displays on the listing (see screenshot) <img width="1091" alt="screen shot 2016-01-31 at 3 12 05 pm" src="https://cloud.githubusercontent.com/assets/14365107/12705898/e6204ce8-c82d-11e5-9724-34f262892964.png"> We manually deploy a team member for these experiences.
non_process
coming soon mode for an experience we currently have experiences listed that require a kudoz staff person to accompany a kudoer because the experience isn t ready to be done independently yet these are reasons why a experience would be in coming soon mode a taster has not gone on it yet the criminal record check for the host hasn t been submitted or returned the experience location hasn t had a safety check we want to be able to flag these experiences and have that label display in admin view experiences so we can be alerted and deploy a team member to go on the experience to suggested solution create new yes no admin only fields in the experience listing taster complete crim record back location safety check done if any are no then the coming soon label displays on the listing see screenshot img width alt screen shot at pm src we manually deploy a team member for these experiences
0
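The suggested solution in the record above reduces to a simple rule: the "coming soon" label displays whenever any of the three admin-only yes/no checks is still "no". A minimal sketch of that rule (hypothetical helper, not the actual Sharetribe code):

```python
def coming_soon(taster_complete: bool, crim_record_back: bool, safety_check_done: bool) -> bool:
    """Return True when the "coming soon" label should display.

    Sketch of the rule from the issue above: the label shows if ANY of the
    three admin-only checks is still "no" (False).
    """
    return not (taster_complete and crim_record_back and safety_check_done)

print(coming_soon(True, True, False))  # → True (label displays)
print(coming_soon(True, True, True))   # → False (experience fully cleared)
```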
3,405
6,520,364,340
IssuesEvent
2017-08-28 16:12:05
w3c/w3process
https://api.github.com/repos/w3c/w3process
closed
Definition of obsolete
Editorial improvements Process2018Candidate
The first sentence of the definition of an obsolete specification is difficult to read: >"An Obsolete Recommendation is a specification that W3C does not believe has sufficient market relevance to continue recommending that the community implement it, but does not consider that there are fundamental problems that require the Recommendation be Rescinded."> Could it be simplified? Perhaps: >"An obsolete Recommendation is a specification that the W3C believes no longer has enough market relevance to continue recommending it for implementation, but does not have fundamental problems that would require it to be rescinded.">
1.0
Definition of obsolete - The first sentence of the definition of an obsolete specification is difficult to read: >"An Obsolete Recommendation is a specification that W3C does not believe has sufficient market relevance to continue recommending that the community implement it, but does not consider that there are fundamental problems that require the Recommendation be Rescinded."> Could it be simplified? Perhaps: >"An obsolete Recommendation is a specification that the W3C believes no longer has enough market relevance to continue recommending it for implementation, but does not have fundamental problems that would require it to be rescinded.">
process
definition of obsolete the first sentence of the definition of an obsolete specification is difficult to read an obsolete recommendation is a specification that does not believe has sufficient market relevance to continue recommending that the community implement it but does not consider that there are fundamental problems that require the recommendation be rescinded could it be simplified perhaps an obsolete recommendation is a specification that the believes no longer has enough market relevance to continue recommending it for implementation but does not have fundamental problems that would require it to be rescinded
1
41,620
12,832,423,991
IssuesEvent
2020-07-07 07:38:52
rvvergara/restaurant-site
https://api.github.com/repos/rvvergara/restaurant-site
closed
CVE-2019-16769 (Medium) detected in serialize-javascript-1.6.1.tgz
security vulnerability
## CVE-2019-16769 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>serialize-javascript-1.6.1.tgz</b></p></summary> <p>Serialize JavaScript to a superset of JSON that includes regular expressions and functions.</p> <p>Library home page: <a href="https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz">https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/restaurant-site/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/restaurant-site/node_modules/serialize-javascript/package.json</p> <p> Dependency Hierarchy: - webpack-4.29.3.tgz (Root Library) - terser-webpack-plugin-1.2.2.tgz - :x: **serialize-javascript-1.6.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/rvvergara/restaurant-site/commit/5c1dfff8fef13cfe8c7155cdf24e7c4209277e90">5c1dfff8fef13cfe8c7155cdf24e7c4209277e90</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The serialize-javascript npm package before version 2.1.1 is vulnerable to Cross-site Scripting (XSS). It does not properly mitigate against unsafe characters in serialized regular expressions. This vulnerability is not affected on Node.js environment since Node.js's implementation of RegExp.prototype.toString() backslash-escapes all forward slashes in regular expressions. If serialized data of regular expression objects are used in an environment other than Node.js, it is affected by this vulnerability. 
<p>Publish Date: 2019-12-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16769>CVE-2019-16769</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769</a></p> <p>Release Date: 2019-12-05</p> <p>Fix Resolution: v2.1.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
True
CVE-2019-16769 (Medium) detected in serialize-javascript-1.6.1.tgz - ## CVE-2019-16769 - Medium Severity Vulnerability <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/vulnerability_details.png' width=19 height=20> Vulnerable Library - <b>serialize-javascript-1.6.1.tgz</b></p></summary> <p>Serialize JavaScript to a superset of JSON that includes regular expressions and functions.</p> <p>Library home page: <a href="https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz">https://registry.npmjs.org/serialize-javascript/-/serialize-javascript-1.6.1.tgz</a></p> <p>Path to dependency file: /tmp/ws-scm/restaurant-site/package.json</p> <p>Path to vulnerable library: /tmp/ws-scm/restaurant-site/node_modules/serialize-javascript/package.json</p> <p> Dependency Hierarchy: - webpack-4.29.3.tgz (Root Library) - terser-webpack-plugin-1.2.2.tgz - :x: **serialize-javascript-1.6.1.tgz** (Vulnerable Library) <p>Found in HEAD commit: <a href="https://github.com/rvvergara/restaurant-site/commit/5c1dfff8fef13cfe8c7155cdf24e7c4209277e90">5c1dfff8fef13cfe8c7155cdf24e7c4209277e90</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/medium_vul.png' width=19 height=20> Vulnerability Details</summary> <p> The serialize-javascript npm package before version 2.1.1 is vulnerable to Cross-site Scripting (XSS). It does not properly mitigate against unsafe characters in serialized regular expressions. This vulnerability is not affected on Node.js environment since Node.js's implementation of RegExp.prototype.toString() backslash-escapes all forward slashes in regular expressions. If serialized data of regular expression objects are used in an environment other than Node.js, it is affected by this vulnerability. 
<p>Publish Date: 2019-12-05 <p>URL: <a href=https://vuln.whitesourcesoftware.com/vulnerability/CVE-2019-16769>CVE-2019-16769</a></p> </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/cvss3.png' width=19 height=20> CVSS 3 Score Details (<b>5.4</b>)</summary> <p> Base Score Metrics: - Exploitability Metrics: - Attack Vector: Network - Attack Complexity: Low - Privileges Required: Low - User Interaction: Required - Scope: Changed - Impact Metrics: - Confidentiality Impact: Low - Integrity Impact: Low - Availability Impact: None </p> For more information on CVSS3 Scores, click <a href="https://www.first.org/cvss/calculator/3.0">here</a>. </p> </details> <p></p> <details><summary><img src='https://whitesource-resources.whitesourcesoftware.com/suggested_fix.png' width=19 height=20> Suggested Fix</summary> <p> <p>Type: Upgrade version</p> <p>Origin: <a href="https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769">https://cve.mitre.org/cgi-bin/cvename.cgi?name=CVE-2019-16769</a></p> <p>Release Date: 2019-12-05</p> <p>Fix Resolution: v2.1.1</p> </p> </details> <p></p> *** Step up your Open Source Security Game with WhiteSource [here](https://www.whitesourcesoftware.com/full_solution_bolt_github)
non_process
cve medium detected in serialize javascript tgz cve medium severity vulnerability vulnerable library serialize javascript tgz serialize javascript to a superset of json that includes regular expressions and functions library home page a href path to dependency file tmp ws scm restaurant site package json path to vulnerable library tmp ws scm restaurant site node modules serialize javascript package json dependency hierarchy webpack tgz root library terser webpack plugin tgz x serialize javascript tgz vulnerable library found in head commit a href vulnerability details the serialize javascript npm package before version is vulnerable to cross site scripting xss it does not properly mitigate against unsafe characters in serialized regular expressions this vulnerability is not affected on node js environment since node js s implementation of regexp prototype tostring backslash escapes all forward slashes in regular expressions if serialized data of regular expression objects are used in an environment other than node js it is affected by this vulnerability publish date url a href cvss score details base score metrics exploitability metrics attack vector network attack complexity low privileges required low user interaction required scope changed impact metrics confidentiality impact low integrity impact low availability impact none for more information on scores click a href suggested fix type upgrade version origin a href release date fix resolution step up your open source security game with whitesource
0
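The CVE record above notes that the fix backslash-escapes forward slashes in serialized regular expressions, so a payload like `</script>` cannot terminate a surrounding `<script>` block. A minimal illustration of that escaping idea in Python (illustrative only — not the actual serialize-javascript fix):

```python
def serialize_regex_source(source: str) -> str:
    """Escape forward slashes in a regex source string before embedding it
    in markup, mirroring the mitigation described in the CVE record above."""
    return source.replace("/", "\\/")

payload = "</script><script>alert(1)</script>"
escaped = serialize_regex_source(payload)
print(escaped)  # → <\/script><script>alert(1)<\/script>
# The literal closing tag no longer appears in the escaped output:
print("</script>" in escaped)  # → False
```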
13,669
16,388,870,711
IssuesEvent
2021-05-17 13:54:22
Bedrohung-der-Bienen/Transformationsfelder-Digitalisierung
https://api.github.com/repos/Bedrohung-der-Bienen/Transformationsfelder-Digitalisierung
closed
Auf der Registrierungsseite sollte ein registrierungsbutton sein.
backend datenbank frontend javascript register process ui element
# Szenario: Benutzer soll sich durch ein registrierungsbutton registrieren können - **Gegeben** Der Benutzer ist auf der Startseite angelangt - **Wenn** sich der Benutzer anmelden möchte - **Dann** klickt er auf den Login Tab in der Navigation - **Und** es öffnet sich eine Anmeldeseite und dort wählt er Registrieren aus - **Und** der Benutzer gibt seine Daten ein - **Und** klickt auf dem registrierungsbutton , um sich zu registrieren Der Benutzer muss die Möglichkeit haben sich registrieren zu können, um Kommentare hinzuzufügen, eine Pflanze bewerten zu können und seine Favoritenliste einsehen zu können mit dem registrierungsbutton werden die Eingabedaten in der Datenbank eingetragen. ----- **Als** Benutzer, **möchte ich** durch einen Button Registrieren können, **damit** ich mich anmelden kann. **Szenario 1:** Der Benutzer klickt auf den Login Tab, um sich anzumelden, dabei hat er die Möglichkeit sich zu registrieren. Er gibt alle Daten an und wird mit dem Klicken des Buttons registriert.
1.0
Auf der Registrierungsseite sollte ein registrierungsbutton sein. - # Szenario: Benutzer soll sich durch ein registrierungsbutton registrieren können - **Gegeben** Der Benutzer ist auf der Startseite angelangt - **Wenn** sich der Benutzer anmelden möchte - **Dann** klickt er auf den Login Tab in der Navigation - **Und** es öffnet sich eine Anmeldeseite und dort wählt er Registrieren aus - **Und** der Benutzer gibt seine Daten ein - **Und** klickt auf dem registrierungsbutton , um sich zu registrieren Der Benutzer muss die Möglichkeit haben sich registrieren zu können, um Kommentare hinzuzufügen, eine Pflanze bewerten zu können und seine Favoritenliste einsehen zu können mit dem registrierungsbutton werden die Eingabedaten in der Datenbank eingetragen. ----- **Als** Benutzer, **möchte ich** durch einen Button Registrieren können, **damit** ich mich anmelden kann. **Szenario 1:** Der Benutzer klickt auf den Login Tab, um sich anzumelden, dabei hat er die Möglichkeit sich zu registrieren. Er gibt alle Daten an und wird mit dem Klicken des Buttons registriert.
process
auf der registrierungsseite sollte ein registrierungsbutton sein szenario benutzer soll sich durch ein registrierungsbutton registrieren können gegeben der benutzer ist auf der startseite angelangt wenn sich der benutzer anmelden möchte dann klickt er auf den login tab in der navigation und es öffnet sich eine anmeldeseite und dort wählt er registrieren aus und der benutzer gibt seine daten ein und klickt auf dem registrierungsbutton um sich zu registrieren der benutzer muss die möglichkeit haben sich registrieren zu können um kommentare hinzuzufügen eine pflanze bewerten zu können und seine favoritenliste einsehen zu können mit dem registrierungsbutton werden die eingabedaten in der datenbank eingetragen als benutzer möchte ich durch einen button registrieren können damit ich mich anmelden kann szenario der benutzer klickt auf den login tab um sich anzumelden dabei hat er die möglichkeit sich zu registrieren er gibt alle daten an und wird mit dem klicken des buttons registriert
1
10,541
13,312,322,665
IssuesEvent
2020-08-26 09:34:33
pingcap/tidb
https://api.github.com/repos/pingcap/tidb
closed
Aggregate Function: group_concat with order by produces wrong result
component/coprocessor severity/critical type/bug
## Bug Report Preview ```` mysql> select group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) from t1; +--------------------------------------------------------------+ | group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) | +--------------------------------------------------------------+ | c,a,b | +--------------------------------------------------------------+ 1 row in set (0.00 sec) ```` ### 1. Minimal reproduce step (Required) ```` create table t1 (a int, a1 varchar(10)); create table t2 (a0 int); insert into t1 values (0,"a"),(0,"b"),(1,"c"); insert into t2 values (1),(2),(3); -- incorrect select group_concat(a1 order by (t1.a IN (select a0 from t2))) from t1; -- incorrect select group_concat(a1 order by a in (1,2,3) desc) from t1; -- correct select a1 from t1 order by 1 desc; ```` ### 2. What did you expect to see? (Required) ```` mysql> select group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) from t1; +--------------------------------------------------------------+ | group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) | +--------------------------------------------------------------+ | c,b,a | +-------------------------- mysql> select group_concat(a1 order by a in (1,2,3) desc) from t1; +---------------------------------------------+ | group_concat(a1 order by a in (1,2,3) desc) | +---------------------------------------------+ | c,b,a | +---------------------------------------------+ 1 row in set (0.00 sec) ```` ### 3. 
What did you see instead (Required) ```` mysql> select group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) from t1; +--------------------------------------------------------------+ | group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) | +--------------------------------------------------------------+ | c,a,b | +--------------------------------------------------------------+ 1 row in set (0.00 sec) mysql> select group_concat(a1 order by a in (1,2,3) desc) from t1; +---------------------------------------------+ | group_concat(a1 order by a in (1,2,3) desc) | +---------------------------------------------+ | c,a,b | +---------------------------------------------+ 1 row in set (0.00 sec) ```` ### 4. Affected version (Required) ```` commit 402fd2a247fff00ebe696a7d7672bfb2008435e4 (HEAD -> master, origin/master, origin/HEAD) Author: lysu <sulifx@gmail.com> Date: Tue Jun 30 19:31:37 2020 +0800 ```` ### 5. Root Cause Analysis <!-- should be filled by the investigator before it's closed -->
1.0
Aggregate Function: group_concat with order by produces wrong result - ## Bug Report Preview ```` mysql> select group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) from t1; +--------------------------------------------------------------+ | group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) | +--------------------------------------------------------------+ | c,a,b | +--------------------------------------------------------------+ 1 row in set (0.00 sec) ```` ### 1. Minimal reproduce step (Required) ```` create table t1 (a int, a1 varchar(10)); create table t2 (a0 int); insert into t1 values (0,"a"),(0,"b"),(1,"c"); insert into t2 values (1),(2),(3); -- incorrect select group_concat(a1 order by (t1.a IN (select a0 from t2))) from t1; -- incorrect select group_concat(a1 order by a in (1,2,3) desc) from t1; -- correct select a1 from t1 order by 1 desc; ```` ### 2. What did you expect to see? (Required) ```` mysql> select group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) from t1; +--------------------------------------------------------------+ | group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) | +--------------------------------------------------------------+ | c,b,a | +-------------------------- mysql> select group_concat(a1 order by a in (1,2,3) desc) from t1; +---------------------------------------------+ | group_concat(a1 order by a in (1,2,3) desc) | +---------------------------------------------+ | c,b,a | +---------------------------------------------+ 1 row in set (0.00 sec) ```` ### 3. 
What did you see instead (Required) ```` mysql> select group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) from t1; +--------------------------------------------------------------+ | group_concat(a1 order by (t1.a IN (select a0 from t2)) desc) | +--------------------------------------------------------------+ | c,a,b | +--------------------------------------------------------------+ 1 row in set (0.00 sec) mysql> select group_concat(a1 order by a in (1,2,3) desc) from t1; +---------------------------------------------+ | group_concat(a1 order by a in (1,2,3) desc) | +---------------------------------------------+ | c,a,b | +---------------------------------------------+ 1 row in set (0.00 sec) ```` ### 4. Affected version (Required) ```` commit 402fd2a247fff00ebe696a7d7672bfb2008435e4 (HEAD -> master, origin/master, origin/HEAD) Author: lysu <sulifx@gmail.com> Date: Tue Jun 30 19:31:37 2020 +0800 ```` ### 5. Root Cause Analysis <!-- should be filled by the investigator before it's closed -->
process
aggregate function group concat with order by produces wrong result bug report preview mysql select group concat order by a in select from desc from group concat order by a in select from desc c a b row in set sec minimal reproduce step required create table a int varchar create table int insert into values a b c insert into values incorrect select group concat order by a in select from from incorrect select group concat order by a in desc from correct select from order by desc what did you expect to see required mysql select group concat order by a in select from desc from group concat order by a in select from desc c b a mysql select group concat order by a in desc from group concat order by a in desc c b a row in set sec what did you see instead required mysql select group concat order by a in select from desc from group concat order by a in select from desc c a b row in set sec mysql select group concat order by a in desc from group concat order by a in desc c a b row in set sec affected version required commit head master origin master origin head author lysu date tue jun root cause analysis
1
10,958
13,761,833,470
IssuesEvent
2020-10-07 08:17:49
aiidateam/aiida-core
https://api.github.com/repos/aiidateam/aiida-core
closed
Add possibility to remove inputs from a builder with `delattr`
topic/calc-jobs topic/engine topic/processes topic/workflows type/feature request
Currently, once you have a `builder`, you might want to remove an optional output (e.g. if you get a builder from a `get_builder_restart` and want to drop an optional input). Currently, this is achievable by deleting an item using dictionary syntax: ```python del builder['input_name'] ``` However, the following syntax does not work: ```python del builder.input_name ``` I suggest to support also this syntax, since it is already possible (and 'suggested') to use builders with attribute access for setting data, thanks to the TAB-completion support (this can just proxy to the existing deletion function for dictionary items).
1.0
Add possibility to remove inputs from a builder with `delattr` - Currently, once you have a `builder`, you might want to remove an optional output (e.g. if you get a builder from a `get_builder_restart` and want to drop an optional input). Currently, this is achievable by deleting an item using dictionary syntax: ```python del builder['input_name'] ``` However, the following syntax does not work: ```python del builder.input_name ``` I suggest to support also this syntax, since it is already possible (and 'suggested') to use builders with attribute access for setting data, thanks to the TAB-completion support (this can just proxy to the existing deletion function for dictionary items).
process
add possibility to remove inputs from a builder with delattr currently once you have a builder you might want to remove an optional output e g if you get a builder from a get builder restart and want to drop an optional input currently this is achievable by deleting an item using dictionary syntax python del builder however the following syntax does not work python del builder input name i suggest to support also this syntax since it is already possible and suggested to use builders with attribute access for setting data thanks to the tab completion support this can just proxy to the existing deletion function for dictionary items
1
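The aiida-core record above asks for `del builder.input_name` to proxy to the already-supported `del builder['input_name']`. A minimal sketch of that pattern (a hypothetical `Builder` class, not the actual aiida-core implementation):

```python
class Builder:
    """Toy builder supporting both dictionary- and attribute-style deletion.

    Sketch of the feature requested above: ``__delattr__`` proxies to the
    existing dictionary-style deletion (``__delitem__``).
    """

    def __init__(self):
        self._data = {}

    def __setattr__(self, name, value):
        if name.startswith('_'):
            super().__setattr__(name, value)
        else:
            self._data[name] = value

    def __getattr__(self, name):
        # Only called when normal attribute lookup fails.
        try:
            return self._data[name]
        except KeyError:
            raise AttributeError(name)

    def __delitem__(self, name):
        del self._data[name]

    def __delattr__(self, name):
        # Proxy attribute deletion to the dictionary-style deletion.
        self.__delitem__(name)

builder = Builder()
builder.input_name = 42
del builder.input_name          # now works like: del builder['input_name']
print('input_name' in builder._data)  # → False
```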
13,174
15,597,045,464
IssuesEvent
2021-03-18 16:29:38
ESMValGroup/ESMValCore
https://api.github.com/repos/ESMValGroup/ESMValCore
closed
New preprocessor mask_multimodel
enhancement preprocessor
**Is your feature request related to a problem? Please describe.** I want to propagate the mask over several datasets such that when an element is masked in one dataset, it is also masked in all other datasets. **Would you be able to help out?** Yes, working on it. Initially I thought `mask_fillvalues` would do the job, but it doesn't. Also Github kindly points me to other issues, and just now I realize this has been discussed before here https://github.com/ESMValGroup/ESMValCore/issues/116
1.0
New preprocessor mask_multimodel - **Is your feature request related to a problem? Please describe.** I want to propagate the mask over several datasets such that when an element is masked in one dataset, it is also masked in all other datasets. **Would you be able to help out?** Yes, working on it. Initially I thought `mask_fillvalues` would do the job, but it doesn't. Also Github kindly points me to other issues, and just now I realize this has been discussed before here https://github.com/ESMValGroup/ESMValCore/issues/116
process
new preprocessor mask multimodel is your feature request related to a problem please describe i want to propagate the mask over several datasets such that when an element is masked in one dataset it is also masked in all other datasets would you be able to help out yes working on it initially i thought mask fillvalues would do the job but it doesn t also github kindly points me to other issues and just now i realize this has been discussed before here
1
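The mask-propagation behaviour requested in the record above — an element masked in one dataset becomes masked in every dataset — can be sketched in plain Python, using `None` to stand in for a masked value (hypothetical helper, not the actual ESMValCore preprocessor):

```python
def mask_multimodel(datasets):
    """Propagate masks across equally-shaped 1-D datasets.

    Sketch of the behaviour described above: an element that is masked
    (None) in ANY dataset becomes masked in ALL datasets.
    """
    # Combined mask: True where any dataset is masked at that position.
    combined = [any(ds[i] is None for ds in datasets)
                for i in range(len(datasets[0]))]
    return [[None if masked else value
             for value, masked in zip(ds, combined)]
            for ds in datasets]

a = [1, None, 3]
b = [4, 5, None]
print(mask_multimodel([a, b]))  # → [[1, None, None], [4, None, None]]
```

In a real implementation this would operate on masked arrays rather than `None`-holed lists, combining the individual masks with a logical OR before re-applying the result to every dataset.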
258,989
22,362,667,464
IssuesEvent
2022-06-15 22:26:21
microsoft/AzureStorageExplorer
https://api.github.com/repos/microsoft/AzureStorageExplorer
closed
No warning 'At least one column should be selected' displays in the 'Column Options' dialog when no column is selected
:heavy_check_mark: merged 🧪 testing :gear: tables :beetle: regression
**Storage Explorer Version:** 1.25.0-dev **Build Number:** 20220608.1 **Branch:** main **Platform/OS:** Windows 10/Linux Ubuntu 20.04/MacOS Monterey 12.4 (Apple M1 Pro) **Architecture:** ia32\x64 **How Found:** From running test cases **Regression From:** Previous release (1.24.2) ## Steps to Reproduce ## 1. Expand one storage account -> Tables. 2. Create a table -> Click 'Column Options' -> Uncheck all available columns. 3. Check whether a warning 'At least one column should be selected' displays. ## Expected Experience ## A warning 'At least one column should be selected' displays. ![image](https://user-images.githubusercontent.com/87792676/172551542-1c089e31-b061-40c3-8664-2baa4dcfc940.png) ## Actual Experience ## No warning 'At least one column should be selected' displays. ![image](https://user-images.githubusercontent.com/87792676/172551682-7534843f-c9de-469a-a7b3-c6fef3dc1d4d.png)
1.0
No warning 'At least one column should be selected' displays in the 'Column Options' dialog when no column is selected - **Storage Explorer Version:** 1.25.0-dev **Build Number:** 20220608.1 **Branch:** main **Platform/OS:** Windows 10/Linux Ubuntu 20.04/MacOS Monterey 12.4 (Apple M1 Pro) **Architecture:** ia32\x64 **How Found:** From running test cases **Regression From:** Previous release (1.24.2) ## Steps to Reproduce ## 1. Expand one storage account -> Tables. 2. Create a table -> Click 'Column Options' -> Uncheck all available columns. 3. Check whether a warning 'At least one column should be selected' displays. ## Expected Experience ## A warning 'At least one column should be selected' displays. ![image](https://user-images.githubusercontent.com/87792676/172551542-1c089e31-b061-40c3-8664-2baa4dcfc940.png) ## Actual Experience ## No warning 'At least one column should be selected' displays. ![image](https://user-images.githubusercontent.com/87792676/172551682-7534843f-c9de-469a-a7b3-c6fef3dc1d4d.png)
non_process
no warning at least one column should be selected displays in the column options dialog when no column is selected storage explorer version dev build number branch main platform os windows linux ubuntu macos monterey apple pro architecture how found from running test cases regression from previous release steps to reproduce expand one storage account tables create a table click column options uncheck all available columns check whether a warning at least one column should be selected displays expected experience a warning at least one column should be selected displays actual experience no warning at least one column should be selected displays
0